Node.js 25.9: The stream/iter API Finally Lands as Experimental

lschvn · 5 min read

Node.js 25.9.0 dropped on April 1st with a batch of quality-of-life additions, several of which have been in the making for over a year. The headline features are the new experimental stream/iter module and the --max-heap-size CLI flag, but there's more worth knowing about.

stream/iter: Async Iteration for Streams

The new experimental stream/iter module, imported as node:experimental/streams/iter (to be promoted to stable in a future release), adds two functions:

  • stream.iter(readable): returns an async iterator that yields chunks from a Readable stream
  • stream.consume(readable): creates a writable stream that drains a readable, useful for piping patterns

The practical effect is that you can now use for await...of directly over any Node.js Readable stream:

import { iter } from 'node:experimental/streams/iter';
import { createReadStream } from 'node:fs';

for await (const chunk of iter(createReadStream('file.txt'))) {
  console.log(chunk.length); // handle each chunk; avoid naming a handler `process`, which shadows Node's global
}

This replaces the Readable.from() workaround that many developers used to bridge streams and async iterables. Readable.from() was designed to create a stream from an iterable; using it as a stream consumer was always a hack. The new API makes the intent explicit and avoids the double-buffering overhead of the old pattern.

The consume() function is oriented toward transforming streams:

import { consume } from 'node:experimental/streams/iter';
import { createReadStream, createWriteStream } from 'node:fs';
import { finished } from 'node:stream/promises';

const writable = createWriteStream('output.txt');
consume(createReadStream('input.txt')).pipe(writable);
await finished(writable); // pipe() returns the destination, not a promise

James M Snell, who implemented the feature, also added benchmarks in the same PR; the API is designed to have minimal overhead compared to manual stream consumption.

--max-heap-size: Hard Memory Bounds

Node.js processes have always been subject to V8's heap limits, but setting them required generation-specific flags like --max-old-space-size (often passed via the NODE_OPTIONS environment variable) or programmatic APIs. --max-heap-size is a straightforward CLI flag that caps the whole heap in megabytes:

node --max-heap-size=512 server.js

Unlike --max-old-space-size, which controls only the old generation, --max-heap-size applies to V8's total heap, including the new generation and code space. This makes it more predictable for containerized workloads where you want a hard memory ceiling that the orchestrator can rely on.

The flag was contributed by tannal and had been in discussion for several years before landing.

AsyncLocalStorage Gets Using Scopes

AsyncLocalStorage has been a staple for request-scoped context in web frameworks since Node.js 16. The new addition is support for using scopes, based on ECMAScript's Explicit Resource Management stage 3 proposal (the Symbol.dispose pattern).

The using keyword calls a Symbol.dispose method when a block exits, whether normally or via an error. With the new API, you can bind an AsyncLocalStorage instance to a scope:

import { AsyncLocalStorage } from 'node:async_hooks';

const storage = new AsyncLocalStorage();

{
  using scope = storage.enable();
  storage.run({ requestId: 'abc123' }, () => {
    // storage.getStore() returns { requestId: 'abc123' }
  });
}
// storage is automatically cleared when scope exits

This eliminates the need for explicit try/finally cleanup in many patterns. In high-throughput servers that create many short-lived storage instances, using scopes prevent the set of active AsyncLocalStorage stores from growing without bound.

Crypto: TurboSHAKE and KangarooTwelve

The crypto module gains two new hash functions via WebCrypto integration:

  • TurboSHAKE: variable-length output (an extendable-output function), suitable for streaming and tree hashing applications
  • KangarooTwelve: a fast SHA-3 derivative with a 128-bit security level, designed as a faster alternative to SHA-256 for everyday use cases

These are available through the standard WebCrypto SubtleCrypto.digest() interface under their respective algorithm names.

Other Notable Changes

  • Test runner mocking consolidated: MockModuleOptions.defaultExport and MockModuleOptions.namedExports merged into a single MockModuleOptions.exports option, with an automated codemod available
  • npm upgraded to 11.12.1: includes the latest npm features and security fixes
  • SEA code cache for ESM: Single Executable Applications now support code caching for ESM entry points, improving startup time for bundled Node.js applications
  • module.register() deprecated: the legacy module registration API is now formally deprecated (DEP0205)

Node.js 25 is a Current (non-LTS) release line; Node.js 24 will become the next LTS candidate later this year. Most of these features should be backported to LTS lines as they stabilize.

Related articles

JetStream 3: The Benchmark That Actually Reflects How Modern Web Apps Run

WebKit, Google, and Mozilla just released JetStream 3, the first major overhaul of the benchmark suite since 2019. It drops microbenchmarks in favor of realistic workloads, rewrites WebAssembly scoring, and introduces Dart, Kotlin, and Rust compiled to Wasm.
State of TypeScript 2026: GitHub's #1 Language, Project Corsa, and the Supply Chain Reckoning

A look back at the major events that reshaped TypeScript's position in the JavaScript ecosystem, from surpassing JavaScript on GitHub to npm supply chain compromises and the Go-based compiler rewrite targeting 10x faster builds.
Google's JSIR: An MLIR-Based Intermediate Representation for JavaScript Analysis

Google has open sourced JSIR, a next-generation JavaScript analysis tool built on MLIR. It supports both high-level dataflow analysis and lossless source-to-source transformation, used internally for Hermes bytecode decompilation and AI-powered JavaScript deobfuscation.
