Learn how lazy sequences defer work in Clojure, when that improves composition and memory use, and which realization traps create bugs or confusion.
Lazy sequence: A sequence whose elements are produced only when a consumer asks for them, instead of all at once up front.
Laziness is one of Clojure’s most powerful ideas because it lets you describe pipelines separately from the moment values are realized. That makes large-data processing, streaming-style composition, and even infinite conceptual collections practical.
But laziness is not automatically “better.” It changes when work happens, which means it also changes where memory, side effects, and debugging surprises show up.
A lazy sequence lets you express a potentially large pipeline without immediately materializing the whole result.
```clojure
(def expensive-results
  (->> (range)
       (map #(* % %))
       (filter odd?)))

(take 5 expensive-results)
;; => (1 9 25 49 81)
```
This pipeline can conceptually continue forever because take is the real consumer. The laziness is useful only because something downstream limits or consumes the sequence responsibly.
Many sequence operations already return lazy results:
- map
- filter
- remove
- repeat
- repeatedly
- iterate
- range (in its unbounded form)

That means you are using laziness even when you did not write lazy-seq directly.
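You can confirm that these operations defer work by using realized?, which reports whether a lazy seq has computed its first element yet. A small sketch:

```clojure
;; map returns immediately; no squaring has happened yet.
(def squares (map #(* % %) (range 1 6)))

(realized? squares)  ;; => false (nothing computed yet)
(first squares)      ;; => 1 (forces realization)
(realized? squares)  ;; => true
```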
Use lazy-seq when you are defining a recursive or custom producer and you want element production deferred.
```clojure
(defn countdown [n]
  (lazy-seq
    (when-not (neg? n)
      (cons n (countdown (dec n))))))

(take 4 (countdown 10))
;; => (10 9 8 7)
```
The important part is not the recursion by itself. The important part is that recursive work stays suspended until the consumer actually asks for more elements.
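To see that suspension directly, here is a hypothetical noisy-countdown variant with a println in the producer. Because hand-written lazy-seq producers are unchunked, asking for one element runs the body exactly once:

```clojure
(defn noisy-countdown [n]
  (lazy-seq
    (when-not (neg? n)
      (println "producing" n)
      (cons n (noisy-countdown (dec n))))))

;; Defining the sequence prints nothing:
(def c (noisy-countdown 3))

;; Asking for one element prints exactly one line:
(first c)  ;; prints "producing 3", returns 3
```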
When people say “infinite data structure” in Clojure, they usually mean a sequence that has no conceptual end, not one that is fully stored.
```clojure
(take 8 (iterate inc 0))
;; => (0 1 2 3 4 5 6 7)

(take 6 (repeat :ok))
;; => (:ok :ok :ok :ok :ok :ok)
```
Those are only practical because the consumer sets a limit. Without that limit, you are no longer modeling a clever abstraction. You are just asking the program to keep working forever.
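A minimal illustration of that rule: a bounded consumer terminates even over an infinite source, while an unbounded one never returns.

```clojure
;; Safe: take and drop keep the demand finite.
(take 3 (drop 100 (iterate inc 0)))
;; => (100 101 102)

;; Unsafe: this would try to realize every element and never return.
;; (count (iterate inc 0))
```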
Lazy values become concrete when a consumer forces them. Common forcing operations include:
- doall
- dorun
- into
- reduce
- count

```clojure
(def rows
  (map #(do (println "fetch" %) {:id %})
       (range 3)))

(take 1 rows)
;; realizes only as much as the consumer demands
;; (though chunking may realize a few extra elements)

(doall rows)
;; forces the whole sequence
```
This is why laziness changes debugging and side-effect behavior. If the pipeline contains effects, you must know who is forcing it and when.
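The choice of forcing operation matters too: doall forces and returns the sequence, while dorun forces purely for the side effects and returns nil. A small sketch (note that lazy seqs cache their elements, so the effects run only on the first traversal):

```clojure
(def effects
  (map #(do (println "effect" %) %) [1 2 3]))

(dorun effects)  ;; prints effect 1, effect 2, effect 3; returns nil
(doall effects)  ;; returns (1 2 3); already realized, so nothing reprints
```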
Lazy pipelines are best when the work is pure and compositional. If you hide logging, HTTP requests, or writes inside a lazy pipeline, the evaluation point can become much harder to reason about.
Prefer clear consumers for effectful work:
```clojure
(doseq [path ["a.txt" "b.txt" "c.txt"]]
  (println "processing" path))
```
Use lazy sequences to describe value production. Use doseq, run!, explicit reduction, or other deliberate consumers when the main purpose is to perform effects.
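For example, run! applies a function to every element eagerly and returns nil, which makes the intent to perform effects explicit at the call site:

```clojure
;; Eager, effect-oriented consumer: no lazy value escapes.
(run! #(println "processing" %) ["a.txt" "b.txt" "c.txt"])
;; returns nil
```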
Laziness saves memory only when already-consumed portions can be discarded. If you keep a reference to the head while also walking the tail, you can accidentally retain much more than you expect.
```clojure
;; A def like this holds the head of the sequence for as long
;; as the var exists, so nothing consumed can be reclaimed.
(def all-lines
  (line-seq (java.io.BufferedReader.
              (java.io.StringReader. "a\nb\nc"))))
```
The lesson is not “never use lazy sequences.” The lesson is that lazy sequences are still values with lifetimes. If a consumer retains the wrong reference, the garbage collector cannot reclaim the already-traversed part.
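A sketch of the difference, using hypothetical sum-retained and sum-streaming functions over a mapped (and therefore cached) lazy sequence:

```clojure
(def big-seq (map inc (range 10000000)))

(defn sum-retained []
  ;; The var big-seq keeps the head, so every element reduce
  ;; realizes stays reachable; memory grows with the walk.
  (reduce + big-seq))

(defn sum-streaming []
  ;; Creating the sequence locally lets the already-consumed
  ;; portion be garbage-collected while reduce proceeds.
  (reduce + (map inc (range 10000000))))
```

Both return the same sum; only their memory behavior differs.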
Some lazy sequence operations are chunked, which means elements are realized in small batches rather than one at a time. That improves throughput but can surprise you if you assumed perfectly one-by-one evaluation.
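You can observe chunking directly by counting how many elements get realized when only one is requested. The counter atom here is just instrumentation:

```clojure
(def counter (atom 0))

;; range is chunked: asking for one element of the mapped
;; sequence realizes a whole 32-element chunk, not one element.
(first (map (fn [x] (swap! counter inc) x) (range 100)))

@counter  ;; => 32, not 1
```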
In practice, the useful rule is: if exact one-element-at-a-time streaming behavior is critical, reach for a different abstraction, such as reducers, transducers with an explicit consumer, or I/O APIs designed for streaming.
Lazy sequences are especially good when:

- the work is pure and the pipeline is compositional
- the data is large or conceptually unbounded, and a consumer sets a clear limit
- you want to describe a computation separately from when it runs

They are weaker when:

- the pipeline hides side effects, so the evaluation point becomes unclear
- a retained head reference can defeat the expected memory savings
- exact one-at-a-time evaluation matters, since chunking realizes batches
A team builds a lazy sequence that wraps API calls inside map, then passes that value through several layers before anyone realizes it. During debugging, the same endpoint seems to be called from surprising places.
What went wrong?
The stronger answer is that laziness moved the execution point away from the place where the pipeline was defined. The issue is not merely “lazy sequences are bad.” The issue is hiding side effects inside a deferred pipeline without a clear, deliberate consumer.
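One hedged sketch of a fix, with fetch-user standing in for the team's real API call (an assumption, not from the original): force the work eagerly at the definition site, so the calls happen where the pipeline is written rather than wherever a downstream consumer happens to walk it.

```clojure
;; Hypothetical stand-in for the real API call.
(defn fetch-user [id]
  {:id id})

;; mapv is eager: every fetch happens right here, at a known point,
;; and downstream code receives a plain realized vector.
(def users (mapv fetch-user [1 2 3]))
;; => [{:id 1} {:id 2} {:id 3}]
```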