Confluent CCDAK Exam (page: 5)
Confluent Certified Developer for Apache Kafka Certification Examination
Updated on: 25-Dec-2025

Viewing Page 5 of 31

A consumer wants to read messages from a specific partition of a topic. How can this be achieved?

  A. Call subscribe(String topic, int partition) passing the topic and partition number as the arguments
  B. Call assign() passing a Collection of TopicPartitions as the argument
  C. Call subscribe() passing TopicPartition as the argument

Answer(s): B

Explanation:

assign() is used for manual assignment of partitions to a consumer; when it is used, subscribe() must not be called on the same consumer. assign() takes a Collection of TopicPartition objects as its argument.


Reference:

https://kafka.apache.org/23/javadoc/org/apache/kafka/clients/consumer/KafkaConsumer.html#assign-java.util.Collection-
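As an illustration, a minimal sketch of manual assignment with the Java consumer client (the topic name, partition number, broker address, and group id are assumptions for the example):

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

public class ManualAssignExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("group.id", "demo-group");              // assumed group id
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Manually assign partition 0 of "my-topic"; do NOT also call subscribe().
            consumer.assign(Collections.singletonList(new TopicPartition("my-topic", 0)));

            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("partition=%d offset=%d value=%s%n",
                        record.partition(), record.offset(), record.value());
            }
        }
    }
}
```

With assign(), the consumer bypasses the group coordinator's partition rebalancing: it reads exactly the partitions it was given.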



If I want to have extremely high confidence that leaders and replicas have my data, which settings should I use?

  A. acks=all, replication factor=2, min.insync.replicas=1
  B. acks=1, replication factor=3, min.insync.replicas=2
  C. acks=all, replication factor=3, min.insync.replicas=2
  D. acks=all, replication factor=3, min.insync.replicas=1

Answer(s): C

Explanation:

acks=all means the leader will wait for all in-sync replicas to acknowledge the record before confirming the write. min.insync.replicas specifies the minimum number of replicas that must be in sync for the partition to remain available for writes. With replication factor 3 and min.insync.replicas=2, every acknowledged write exists on at least two brokers, and the partition stays writable even if one broker fails.
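The trade-off behind the options can be sketched as a small helper (the function name is hypothetical; the arithmetic follows directly from the settings above):

```java
public class DurabilityMath {
    // With acks=all, a write is acknowledged only once min.insync.replicas
    // replicas have it, so the number of broker failures a partition can
    // tolerate while staying writable is RF - min.insync.replicas.
    static int toleratedFailures(int replicationFactor, int minInsyncReplicas) {
        return replicationFactor - minInsyncReplicas;
    }

    public static void main(String[] args) {
        System.out.println(toleratedFailures(3, 2)); // option C: 1 failure tolerated, 2 confirmed copies
        System.out.println(toleratedFailures(2, 1)); // option A: 1 failure tolerated, but only 1 confirmed copy
    }
}
```

Option C is the only one combining acks=all, three replicas, and at least two confirmed in-sync copies per write.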



Where are the dynamic configurations for a topic stored?

  A. In ZooKeeper
  B. In an internal Kafka topic topic_configurations
  C. In server.properties
  D. On the Kafka broker file system

Answer(s): A

Explanation:

Dynamic topic configurations are maintained in ZooKeeper, under the /config/topics/<topic-name> znode; server.properties holds only static broker-wide defaults.
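A sketch of setting a dynamic topic config from Java with AdminClient (the broker address and topic name are assumptions); the broker persists the override in ZooKeeper rather than in any local file:

```java
import java.util.Collection;
import java.util.Collections;
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AlterConfigOp;
import org.apache.kafka.clients.admin.ConfigEntry;
import org.apache.kafka.common.config.ConfigResource;

public class DynamicTopicConfigExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address

        try (AdminClient admin = AdminClient.create(props)) {
            ConfigResource topic = new ConfigResource(ConfigResource.Type.TOPIC, "my-topic");
            AlterConfigOp setRetention = new AlterConfigOp(
                    new ConfigEntry("retention.ms", "86400000"), AlterConfigOp.OpType.SET);

            // The broker stores this override in ZooKeeper (/config/topics/my-topic),
            // not in server.properties or on the broker's local file system.
            Map<ConfigResource, Collection<AlterConfigOp>> updates =
                    Collections.singletonMap(topic, Collections.singletonList(setRetention));
            admin.incrementalAlterConfigs(updates).all().get();
        }
    }
}
```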



You are using a JDBC source connector to copy data from two tables to two Kafka topics. There is one connector created with max.tasks equal to 2, deployed on a cluster of 3 workers. How many tasks are launched?

  A. 6
  B. 1
  C. 2
  D. 3

Answer(s): C

Explanation:

There are two tables, so the connector launches at most one task per table: 2 tasks. The configured task limit also allows 2, and the number of workers does not change the task count, only where those tasks run.
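The rule can be sketched with a hypothetical helper: the JDBC source connector creates at most one task per table, capped by the configured task limit (tasks.max in Kafka Connect, written max.tasks in the question):

```java
public class ConnectorTaskCount {
    // A JDBC source connector parallelizes by table, so the number of tasks
    // is the smaller of the table count and the configured task limit.
    // The number of Connect workers never changes this count.
    static int taskCount(int numTables, int maxTasks) {
        return Math.min(numTables, maxTasks);
    }

    public static void main(String[] args) {
        System.out.println(taskCount(2, 2)); // prints 2: one task per table
    }
}
```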



If I want to send binary data through the REST proxy to topic "test_binary", it needs to be base64 encoded. A consumer connecting directly into the Kafka topic "test_binary" will receive:

  A. binary data
  B. avro data
  C. json data
  D. base64 encoded data, and it will need to decode it

Answer(s): A

Explanation:

On the producer side, after receiving the base64-encoded payload, the REST Proxy decodes it into bytes and sends those bytes to Kafka. A consumer reading directly from the Kafka topic therefore receives raw binary data, with no base64 decoding required.
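A small sketch of the encoding round trip (the message content is hypothetical, and java.util.Base64 stands in for the proxy's decode step):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class RestProxyBase64Example {
    // The REST Proxy's binary embedded format requires the HTTP producer to
    // send base64 text; the proxy decodes it and writes raw bytes to Kafka.
    static byte[] whatKafkaStores(String base64Payload) {
        return Base64.getDecoder().decode(base64Payload);
    }

    public static void main(String[] args) {
        // What an HTTP client would put in the REST Proxy request body.
        String encoded = Base64.getEncoder()
                .encodeToString("hello".getBytes(StandardCharsets.UTF_8));
        System.out.println(encoded); // prints "aGVsbG8="

        // A consumer reading the topic directly sees the original bytes, not base64.
        String seenByConsumer = new String(whatKafkaStores(encoded), StandardCharsets.UTF_8);
        System.out.println(seenByConsumer); // prints "hello"
    }
}
```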





