14.2 Integration Testing with Embedded Kafka

Explore the benefits and techniques of using an embedded Kafka broker for integration testing, providing a controlled environment for end-to-end testing of Kafka applications.

Integration testing is a crucial step in the software development lifecycle, especially for systems built on distributed architectures like Apache Kafka. Embedded Kafka provides a lightweight and efficient way to perform integration testing by simulating a Kafka environment within your test suite. This approach allows developers to test Kafka applications end-to-end without the overhead of managing an external Kafka cluster.

Benefits of Using Embedded Kafka

Embedded Kafka offers several advantages for integration testing:

  • Isolation: Tests run in a controlled environment, minimizing interference from external systems.
  • Speed: Embedded Kafka is lightweight, reducing the time required to start and stop Kafka instances.
  • Simplicity: No need to configure and manage a separate Kafka cluster for testing purposes.
  • Consistency: Ensures consistent test results by providing a predictable Kafka environment.

Setting Up Embedded Kafka

To set up Embedded Kafka, you can use libraries such as kafka-junit, the embedded-kafka DSL for Scala, or the EmbeddedKafka support in the spring-kafka-test package. These libraries provide utilities to start and stop Kafka brokers within your test suite.
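For a Maven project, adding the Spring testing support is a single test-scoped dependency. (The version below is illustrative; align it with the spring-kafka version your project already uses.)

```xml
<!-- Illustrative version; keep it in sync with your spring-kafka version -->
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka-test</artifactId>
    <version>3.1.4</version>
    <scope>test</scope>
</dependency>
```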

Java Example

Here’s how you can set up Embedded Kafka in a Java test using spring-kafka-test:

```java
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.junit.jupiter.api.Test;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.test.EmbeddedKafkaBroker;
import org.springframework.kafka.test.context.EmbeddedKafka;
import org.springframework.kafka.test.utils.KafkaTestUtils;

import java.util.Map;

import static org.assertj.core.api.Assertions.assertThat;

@EmbeddedKafka(partitions = 1, topics = { "test-topic" })
public class EmbeddedKafkaTest {

    @Test
    public void testKafkaProducerAndConsumer(EmbeddedKafkaBroker embeddedKafka) {
        // Producer side: build a KafkaTemplate wired to the embedded broker.
        Map<String, Object> producerProps = KafkaTestUtils.producerProps(embeddedKafka);
        DefaultKafkaProducerFactory<Integer, String> producerFactory =
                new DefaultKafkaProducerFactory<>(producerProps);
        KafkaTemplate<Integer, String> template = new KafkaTemplate<>(producerFactory);

        template.send(new ProducerRecord<>("test-topic", 1, "test-message"));

        // Consumer side: create a consumer and subscribe it to the embedded topic.
        Map<String, Object> consumerProps =
                KafkaTestUtils.consumerProps("test-group", "true", embeddedKafka);
        Consumer<Integer, String> consumer =
                new DefaultKafkaConsumerFactory<Integer, String>(consumerProps).createConsumer();
        embeddedKafka.consumeFromAnEmbeddedTopic(consumer, "test-topic");

        ConsumerRecord<Integer, String> record =
                KafkaTestUtils.getSingleRecord(consumer, "test-topic");
        assertThat(record.value()).isEqualTo("test-message");

        consumer.close();
    }
}
```

Scala Example

Using Scala and the embedded-kafka library (the net.manub DSL, now published under io.github.embeddedkafka), you can set up Embedded Kafka as follows:

```scala
import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
import org.scalatest.flatspec.AnyFlatSpec
import org.scalatest.matchers.should.Matchers

class EmbeddedKafkaSpec extends AnyFlatSpec with Matchers with EmbeddedKafka {

  "A Kafka producer" should "send messages to a topic" in {
    implicit val config: EmbeddedKafkaConfig =
      EmbeddedKafkaConfig(kafkaPort = 6001, zooKeeperPort = 6000)

    withRunningKafka {
      val message = "test-message"
      val topic = "test-topic"

      publishToKafka(topic, message)

      consumeFirstStringMessageFrom(topic) should be(message)
    }
  }
}
```

Kotlin Example

For Kotlin, you can use spring-kafka-test similarly to Java:

```kotlin
import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.kafka.clients.producer.ProducerRecord
import org.junit.jupiter.api.Test
import org.springframework.kafka.core.DefaultKafkaConsumerFactory
import org.springframework.kafka.core.DefaultKafkaProducerFactory
import org.springframework.kafka.core.KafkaTemplate
import org.springframework.kafka.test.EmbeddedKafkaBroker
import org.springframework.kafka.test.context.EmbeddedKafka
import org.springframework.kafka.test.utils.KafkaTestUtils
import kotlin.test.assertEquals

@EmbeddedKafka(partitions = 1, topics = ["test-topic"])
class EmbeddedKafkaTest {

    @Test
    fun `test Kafka producer and consumer`(embeddedKafka: EmbeddedKafkaBroker) {
        // Producer side: build a KafkaTemplate wired to the embedded broker.
        val producerProps = KafkaTestUtils.producerProps(embeddedKafka)
        val template = KafkaTemplate(DefaultKafkaProducerFactory<Int, String>(producerProps))

        template.send(ProducerRecord("test-topic", 1, "test-message"))

        // Consumer side: create a consumer and subscribe it to the embedded topic.
        val consumerProps = KafkaTestUtils.consumerProps("test-group", "true", embeddedKafka)
        val consumer = DefaultKafkaConsumerFactory<Int, String>(consumerProps).createConsumer()
        embeddedKafka.consumeFromAnEmbeddedTopic(consumer, "test-topic")

        val record: ConsumerRecord<Int, String> = KafkaTestUtils.getSingleRecord(consumer, "test-topic")
        assertEquals("test-message", record.value())

        consumer.close()
    }
}
```

Clojure Example

In Clojure, there is no widely maintained embedded-Kafka test library; the sketch below shows the shape such a test might take using clj-kafka-style namespaces (the `test-utils`, `send-message`, and `consume-message` names are illustrative rather than a documented API):

```clojure
(ns embedded-kafka-test
  (:require [clj-kafka.test-utils :as test-utils]   ; illustrative namespaces
            [clj-kafka.producer :as producer]
            [clj-kafka.consumer :as consumer]
            [clojure.test :refer :all]))

(deftest test-kafka-producer-and-consumer
  (test-utils/with-embedded-kafka
    (let [topic "test-topic"
          message "test-message"]
      (producer/send-message {:topic topic :value message})
      (let [record (consumer/consume-message {:topic topic})]
        (is (= message (:value record)))))))
```

Managing Topics and Data Within the Test Lifecycle

When using Embedded Kafka, it’s essential to manage topics and data effectively to ensure tests are isolated and repeatable. Here are some best practices:

  • Topic Creation: Create topics dynamically within the test setup to avoid conflicts with other tests.
  • Data Management: Clear data after each test to prevent state leakage. Use utilities provided by the testing library to reset topics.
  • Test Isolation: Use unique topic names per test or test suite to ensure isolation.
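As a minimal sketch of the isolation point above, a helper that derives a collision-free topic name per test keeps parallel tests from sharing state (the helper name is hypothetical; with spring-kafka-test, such a topic can then be created at runtime via `EmbeddedKafkaBroker.addTopics`):

```java
import java.util.UUID;

public class TopicNames {
    // Hypothetical helper: derive a unique topic name per test run so that
    // two tests (or two runs of the same test) never read each other's data.
    static String uniqueTopic(String prefix) {
        return prefix + "-" + UUID.randomUUID();
    }

    public static void main(String[] args) {
        String a = uniqueTopic("orders-test");
        String b = uniqueTopic("orders-test");
        System.out.println(a.startsWith("orders-test-")); // prefix preserved
        System.out.println(a.equals(b));                  // names never collide
    }
}
```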

Best Practices for Cleaning Up and Avoiding State Leakage

To maintain a clean test environment, follow these best practices:

  • Tear Down: Ensure that the Kafka broker is stopped after each test to release resources.
  • Data Cleanup: Delete or reset topics after tests to prevent data from affecting subsequent tests.
  • Resource Management: Use try-finally blocks or equivalent constructs to guarantee cleanup even if tests fail.
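The resource-management advice can be sketched in plain Java: a real `KafkaConsumer` implements `AutoCloseable`, so try-with-resources gives the same guarantee as a try-finally block. The `StubConsumer` below is a stand-in for illustration, not a Kafka API:

```java
public class CleanupDemo {
    // Stand-in for a real KafkaConsumer, which also implements AutoCloseable.
    static class StubConsumer implements AutoCloseable {
        @Override
        public void close() {
            System.out.println("consumer closed");
        }
    }

    public static void main(String[] args) {
        try {
            // try-with-resources closes the consumer even when the body throws,
            // so a failing assertion cannot leak the broker connection.
            try (StubConsumer consumer = new StubConsumer()) {
                throw new AssertionError("simulated test failure");
            }
        } catch (AssertionError e) {
            System.out.println("test failed: " + e.getMessage());
        }
    }
}
```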

Visualizing Embedded Kafka Setup

To better understand the setup and flow of an Embedded Kafka test, consider the following diagram:

```mermaid
graph TD;
    A["Start Test"] --> B["Initialize Embedded Kafka"];
    B --> C["Create Topics"];
    C --> D["Produce Messages"];
    D --> E["Consume Messages"];
    E --> F["Assert Results"];
    F --> G["Cleanup"];
    G --> H["End Test"];
```

Diagram Description: This flowchart illustrates the lifecycle of an Embedded Kafka test, from initialization to cleanup.


By following these guidelines and examples, you can effectively use Embedded Kafka for integration testing, ensuring that your Kafka applications are robust and reliable.


Revised on Thursday, April 23, 2026