Integration Testing Strategies in Clojure

How to structure modern Clojure integration tests with fixtures, real dependencies, containerized services, and standard runners.

Integration testing in Clojure is where the language’s simplicity either pays off or gets lost in system sprawl. The goal is not to test everything end to end on every run. The goal is to prove that your important seams actually work together:

  • HTTP handlers and the application system
  • application code and the real database
  • event producers and consumers
  • code and external services or service substitutes

The modern testing baseline is still clojure.test plus fixtures. Runners such as Kaocha add ergonomics, watch mode, profiling, and reporting, but the core lifecycle model remains the same: start the part of the system you need, run real calls through it, and shut it down cleanly.

What Counts As An Integration Test

An integration test is not just “a slow unit test.” It verifies a boundary:

  • process boundary
  • network boundary
  • storage boundary
  • serialization boundary
  • dependency lifecycle boundary

That means a good integration suite is driven by system risk. If your project’s biggest failures come from SQL behavior, migrations, and transactions, database integration tests deserve more attention than browser automation. If your failures come from message ordering and retries, the important tests are around queues and consumers.

Start With clojure.test Fixtures

You do not need a special framework to structure integration tests. A :once fixture is often enough to manage a whole-system lifecycle:

(ns my-app.integration.http-test
  (:require [clojure.test :refer [deftest is use-fixtures]])
  (:import [java.net URI]
           [java.net.http HttpClient HttpRequest
            HttpRequest$BodyPublishers HttpResponse$BodyHandlers]))

;; start-test-system is assumed to boot the app under test and return
;; {:port <listening port> :stop <zero-arg shutdown fn>}.

(defonce state (atom nil))

(defn with-system [f]
  (let [{:keys [port stop]} (start-test-system)]
    (reset! state {:port port :stop stop})
    (try
      (f)
      (finally
        (stop)))))

(use-fixtures :once with-system)

(deftest create-user-round-trip
  (let [client   (HttpClient/newHttpClient)
        request  (-> (HttpRequest/newBuilder
                       (URI/create (str "http://localhost:" (:port @state) "/users/42")))
                     (.header "content-type" "application/json")
                     (.PUT (HttpRequest$BodyPublishers/ofString "{\"name\":\"Ada\"}"))
                     (.build))
        response (.send client request (HttpResponse$BodyHandlers/ofString))]
    (is (= 200 (.statusCode response)))))

The important design point is not the specific HTTP client. It is the lifecycle:

  • boot the real system boundary
  • hit it the way production code would
  • shut it down deterministically

Use The Same Database Engine When Behavior Matters

Older integration-testing advice often jumps to H2 or another in-memory database for everything. That can be useful for some tests, but it is not a safe substitute when production behavior depends on:

  • PostgreSQL transaction semantics
  • database-specific types
  • locking behavior
  • indexes and query planner behavior
  • generated columns or JSON operators

If those behaviors matter, the strongest test uses the same engine family you run in production. That is where container-based testing earns its keep.

Containers Are Often The Right Boundary

The Testcontainers project remains a practical way to boot lightweight, throwaway infrastructure for tests. In Clojure, teams often use it through Java interop or Clojure wrappers, depending on the repo’s conventions.

That is often a better strategy than hand-maintaining “almost production” shared test servers. Containers make setup slower than a pure mock, but the resulting tests are far more trustworthy when you care about real behavior.

Use container-backed tests when you need confidence in:

  • migrations
  • message broker wiring
  • object storage behavior
  • service startup order
  • SQL compatibility
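As a sketch of the interop approach, a :once fixture can boot a throwaway PostgreSQL container directly through the Testcontainers Java API. This assumes the org.testcontainers/postgresql artifact is on the test classpath; the atom name and image tag are illustrative:

```clojure
(ns my-app.integration.db-fixture
  (:require [clojure.test :refer [use-fixtures]])
  (:import [org.testcontainers.containers PostgreSQLContainer]))

;; Holds the running container so tests can read its JDBC coordinates.
(defonce pg (atom nil))

(defn with-postgres
  "Starts a disposable PostgreSQL container before this namespace's
  tests run, and stops it afterwards."
  [f]
  (let [container (doto (PostgreSQLContainer. "postgres:16")
                    (.start))]
    (reset! pg container)
    (try
      (f)
      (finally
        (.stop container)))))

(use-fixtures :once with-postgres)

;; Tests can then open a real connection from:
;;   (.getJdbcUrl @pg) (.getUsername @pg) (.getPassword @pg)
```

Because the container is created and stopped inside the fixture, each test run gets a clean engine that matches production's SQL dialect, at the cost of a few seconds of startup.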

Prefer Fakes For Unreliable Or Expensive External Services

Not every dependency should be real in every test. External payment providers, third-party APIs, and remote SaaS services are usually better represented by:

  • stubs
  • mock servers
  • contract tests
  • recorded fixtures where appropriate

The rule is simple:

  • use the real dependency when its behavior is central to your risk
  • use a substitute when the external system is too expensive, unstable, or outside your control

That choice is architectural, not ideological.
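One lightweight way to build such a substitute, without adding any dependency, is the JDK's built-in com.sun.net.httpserver. The endpoint path and canned response below are illustrative stand-ins for a payment provider:

```clojure
(ns my-app.integration.fake-payments
  (:import [com.sun.net.httpserver HttpServer HttpHandler]
           [java.net InetSocketAddress]))

(defn start-fake-payment-provider
  "Boots a local HTTP server that always approves charges.
  Returns {:port p :stop fn} so a fixture can tear it down."
  []
  (let [server (HttpServer/create (InetSocketAddress. 0) 0)
        body   (.getBytes "{\"status\":\"approved\"}")]
    (.createContext server "/charge"
                    (reify HttpHandler
                      (handle [_ exchange]
                        (.sendResponseHeaders exchange 200 (alength body))
                        (with-open [out (.getResponseBody exchange)]
                          (.write out body)))))
    (.start server)
    {:port (.getPort (.getAddress server))
     :stop #(.stop server 0)}))
```

Point the system under test at the returned port and the suite stays fast and deterministic, while the real provider is covered separately by contract tests.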

Keep Test Data Reset Explicit

Integration tests become flaky when the cleanup story is fuzzy. Strong suites are explicit about:

  • schema setup
  • seed data
  • per-test rollback or truncation
  • time sources
  • queues and offsets
  • blob/object cleanup

If a test can only pass when the world happens to be in the right state, it is not trustworthy enough to gate releases.
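One explicit reset strategy, sketched here with next.jdbc (assuming it is on the classpath and that an outer fixture binds *ds* to a real datasource), is a per-test transaction that always rolls back:

```clojure
(ns my-app.integration.tx-fixture
  (:require [clojure.test :refer [use-fixtures]]
            [next.jdbc :as jdbc]))

;; *ds* is assumed to be bound by an outer :once fixture, e.g. one
;; backed by a containerized PostgreSQL instance.
(def ^:dynamic *ds* nil)
(def ^:dynamic *tx* nil)

(defn with-rollback
  "Runs each test inside a transaction that is always rolled back,
  so no test leaves rows behind for the next one."
  [f]
  (jdbc/with-transaction [tx *ds* {:rollback-only true}]
    (binding [*tx* tx]
      (f))))

(use-fixtures :each with-rollback)
```

Tests then issue their queries through *tx*, and the database returns to its seeded state after every test without any truncation bookkeeping. Rollback does not cover effects that escape the transaction (queues, blobs, external calls), which still need their own explicit cleanup.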

Runners And Reporting

Plain clojure.test is enough to define the tests. Runners such as Kaocha improve the operating model:

  • watch mode
  • fail-fast mode
  • randomization
  • test filtering
  • JUnit XML plugins
  • profiling and slow-test visibility

That is useful because integration tests are expensive enough that how you run them matters almost as much as how you write them.
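With Kaocha, that operating model is usually expressed in tests.edn by splitting slow integration tests into their own suite so they only run when requested. The paths below are illustrative:

```clojure
;; tests.edn
#kaocha/v1
{:tests [{:id         :unit
          :test-paths ["test/unit"]}
         {:id         :integration
          :test-paths ["test/integration"]}]}

;; Run only the fast suite during development:
;;   bin/kaocha unit
;; Run everything, stopping at the first failure:
;;   bin/kaocha --fail-fast
```

Separating the suites keeps the fast feedback loop fast while still letting CI gate merges on the full integration run.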

A Better Integration-Test Workflow

    flowchart LR
        A["Select Real Boundary"] --> B["Boot Fixture or Container"]
        B --> C["Run Real Calls Through System"]
        C --> D["Assert Response, State, and Side Effects"]
        D --> E["Tear Down Cleanly"]

The important thing to notice is that the test is centered on a boundary, not on a helper library.

Common Anti-Patterns

  • treating H2 as a universal substitute for a production relational database
  • asserting only HTTP status codes and not state changes or outputs
  • sharing one mutable test environment across unrelated tests
  • hiding lifecycle complexity in magical helpers nobody understands
  • overmocking the exact dependency you most need confidence in

Key Takeaways

  • Integration tests verify seams, not just functions.
  • clojure.test fixtures remain the foundation of most Clojure integration suites.
  • Use the same database engine when SQL behavior matters.
  • Containers are often the right choice for real infrastructure boundaries.
  • Choose substitutes only when real dependencies are too costly or unstable for the test’s purpose.

Revised on Thursday, April 23, 2026