Your manager would like to prioritize topic availability over consistency. Which setting do you need to change to enable that?
A. compression.type
B. unclean.leader.election.enable
C. min.insync.replicas
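Enabling unclean leader election trades consistency for availability: an out-of-sync replica may be elected leader when no in-sync replica is available, which keeps the partition writable but risks losing records. A minimal sketch of the broker-level setting (it can also be overridden per topic):

```properties
# server.properties (or a topic-level override):
# allow an out-of-sync replica to become leader,
# favoring availability over consistency (possible data loss)
unclean.leader.election.enable=true
```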
A producer application on a developer's machine was able to send messages to a Kafka topic. After the producer application was copied to another developer's machine, the producer can connect to Kafka but is unable to produce to the same topic because of an authorization issue. What is the likely cause?
A. Broker configuration needs to be changed to allow a different producer
B. You cannot copy a producer application from one machine to another
C. The Kafka ACL does not allow another machine IP
D. The Kafka Broker needs to be rebooted
Which client protocols are supported by the Schema Registry? (select two)
A. HTTP
B. HTTPS
C. JDBC
D. Websocket
E. SASL
Which is an optional field in an Avro record?
A. doc
B. name
C. namespace
D. fields
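For context: in an Avro record schema, `doc` is an optional free-text documentation attribute, whereas `name`, `type`, and `fields` are required. A minimal illustrative schema (field names are made up for the example):

```json
{
  "type": "record",
  "name": "User",
  "doc": "Optional documentation string; the schema is valid without it",
  "fields": [
    {"name": "id", "type": "long"},
    {"name": "email", "type": ["null", "string"], "default": null}
  ]
}
```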
Which of the following errors are retriable from a producer perspective? (select two)
A. MESSAGE_TOO_LARGE
B. INVALID_REQUIRED_ACKS
C. NOT_ENOUGH_REPLICAS
D. NOT_LEADER_FOR_PARTITION
E. TOPIC_AUTHORIZATION_FAILED
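Retriable errors such as NOT_ENOUGH_REPLICAS and NOT_LEADER_FOR_PARTITION are transient broker-side conditions that the producer retries automatically when retries are enabled; errors like MESSAGE_TOO_LARGE or TOPIC_AUTHORIZATION_FAILED fail immediately because retrying cannot fix them. A sketch of the relevant producer settings (values are illustrative defaults, not prescriptions):

```properties
# Transient errors (e.g. NOT_LEADER_FOR_PARTITION, NOT_ENOUGH_REPLICAS)
# are retried automatically until delivery.timeout.ms expires
retries=2147483647
retry.backoff.ms=100
delivery.timeout.ms=120000
```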
Select the Kafka Streams joins that are always windowed joins.
A. KStream-KStream join
B. KTable-KTable join
C. KStream-GlobalKTable
D. KStream-KTable join
A consumer wants to read messages from partitions 0 and 1 of a topic topic1. The code snippet is shown below.

consumer.subscribe(Arrays.asList("topic1"));
List<TopicPartition> pc = new ArrayList<>();
pc.add(new TopicPartition("topic1", 0));
pc.add(new TopicPartition("topic1", 1));
consumer.assign(pc);
A. This works fine. subscribe() will subscribe to the topic and assign() will assign partitions to the consumer.
B. Throws IllegalStateException
In Kafka Streams, with what value are internal topics prefixed?
A. tasks-
B. application.id
C. group.id
D. kafka-streams-
What Java library is KSQL based on?
A. Kafka Streams
B. REST Proxy
C. Schema Registry
D. Kafka Connect
We would like to be in an at-most-once consuming scenario. Which offset commit strategy would you recommend?
A. Commit the offsets on disk, after processing the data
B. Do not commit any offsets and read from beginning
C. Commit the offsets in Kafka, after processing the data
D. Commit the offsets in Kafka, before processing the data
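Committing offsets before processing gives at-most-once semantics: if the consumer crashes after the commit but before processing finishes, the committed-but-unprocessed records are never re-delivered. One common sketch of this setup relies on auto-commit (values illustrative):

```properties
# At-most-once sketch: offsets are committed on a timer, independent of
# whether processing of the polled records has completed, so a crash
# mid-processing can drop messages rather than reprocess them
enable.auto.commit=true
auto.commit.interval.ms=5000
```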