Async iterators in Node.js: the cleanest pattern for paginated data
April 4, 2026 · NodeJS · 6 min read


I was paginating through a database table with a manual while loop, tracking cursor state, and calling next() over and over. Then I found async generators and rewrote the same logic in a way that looks exactly like iterating over an array. The complexity disappeared and the resulting code was composable in ways the loop wasn't. Here is the pattern.
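For contrast, the manual version I started with looked roughly like this. It's a reconstruction, not my original code: the `Page` shape and `fetchPage` are hypothetical stand-ins, but the tangle of cursor state, loop condition, and per-item work all living in one function is the point.

```typescript
// Hypothetical sketch of the manual loop: cursor tracking, page fetching,
// and item processing are all interleaved in one place.
type Page = { items: { id: string }[]; nextCursor: string | null };

async function processAll(
  fetchPage: (cursor: string | null) => Promise<Page>,
): Promise<string[]> {
  let cursor: string | null = null;
  let done = false;
  const seen: string[] = [];
  while (!done) {
    const page = await fetchPage(cursor);
    for (const item of page.items) {
      seen.push(item.id); // stand-in for the real per-item processing
    }
    cursor = page.nextCursor;
    done = cursor === null; // loop bookkeeping the caller has to get right
  }
  return seen;
}
```

None of this is reusable: every new paginated source means writing the loop again.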

Async generator for paginated database queries

typescript
import { Pool, QueryResultRow } from 'pg';

async function* paginateQuery<T extends QueryResultRow>(
  pool: Pool,
  query: string,
  params: unknown[] = [],
  pageSize = 1000,
): AsyncGenerator<T[]> {
  let offset = 0;

  while (true) {
    const result = await pool.query<T>(
      `${query} LIMIT $${params.length + 1} OFFSET $${params.length + 2}`,
      [...params, pageSize, offset]
    );
    
    if (result.rows.length === 0) break;
    
    yield result.rows;
    
    if (result.rows.length < pageSize) break; // Last page
    offset += pageSize;
  }
}

// Usage — looks like any other iteration
for await (const batch of paginateQuery(pool, 'SELECT * FROM users WHERE active = $1', [true])) {
  await processUserBatch(batch);
  console.log(`Processed ${batch.length} users`);
}

Cursor-based pagination

typescript
async function* paginateByCursor<T extends { id: string }>(
  fetchPage: (cursor: string | null, limit: number) => Promise<T[]>,
  pageSize = 100,
): AsyncGenerator<T> {
  let cursor: string | null = null;
  
  while (true) {
    const items = await fetchPage(cursor, pageSize);
    
    for (const item of items) {
      yield item; // Yield one item at a time
    }
    
    if (items.length < pageSize) break;
    cursor = items[items.length - 1].id;
  }
}

// External API pagination
async function fetchUsersPage(cursor: string | null, limit: number) {
  const url = new URL('https://api.example.com/users');
  url.searchParams.set('limit', String(limit));
  if (cursor) url.searchParams.set('after', cursor);
  const resp = await fetch(url.toString());
  return (await resp.json()).users;
}

// Process each user individually without loading all into memory
for await (const user of paginateByCursor(fetchUsersPage, 100)) {
  await syncUserToDatabase(user);
}

Composing async iterables

typescript
// Generic utilities for async iterables
async function* map<T, U>(
  source: AsyncIterable<T>,
  fn: (item: T) => Promise<U> | U,
): AsyncGenerator<U> {
  for await (const item of source) {
    yield await fn(item);
  }
}

async function* filter<T>(
  source: AsyncIterable<T>,
  predicate: (item: T) => Promise<boolean> | boolean,
): AsyncGenerator<T> {
  for await (const item of source) {
    if (await predicate(item)) yield item;
  }
}

async function* take<T>(source: AsyncIterable<T>, n: number): AsyncGenerator<T> {
  let count = 0;
  for await (const item of source) {
    yield item;
    if (++count >= n) break;
  }
}

// Compose: get first 100 active users' email addresses
const allUsers = paginateByCursor(fetchUsersPage, 500);
const activeUsers = filter(allUsers, user => user.active);
const emails = map(activeUsers, user => user.email);
const first100 = take(emails, 100);

const emailList: string[] = [];
for await (const email of first100) {
  emailList.push(email);
}

Collecting results from an async generator

typescript
// Collect all items (use carefully — loads everything into memory)
async function collect<T>(source: AsyncIterable<T>): Promise<T[]> {
  const results: T[] = [];
  for await (const item of source) {
    results.push(item);
  }
  return results;
}

// Batch items for bulk operations
async function* batched<T>(
  source: AsyncIterable<T>,
  batchSize: number,
): AsyncGenerator<T[]> {
  let batch: T[] = [];
  for await (const item of source) {
    batch.push(item);
    if (batch.length >= batchSize) {
      yield batch;
      batch = [];
    }
  }
  if (batch.length > 0) yield batch;
}

// Process in batches of 50
for await (const batch of batched(paginateByCursor(fetchUsersPage), 50)) {
  await db.bulkInsert(batch); // Insert 50 at a time
}

The elegance of async iterables is that they are lazy by default. No data is fetched until you iterate. You can compose map, filter, take, and batch operations without them executing until the final for-await-of. This is the same benefit as Python generators or Java streams — deferred execution means you only process what you need, and memory usage stays constant regardless of the total dataset size.
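You can see the laziness directly with a self-contained sketch. The infinite `numbers` generator here is a stand-in for a paginated source, and the counter shows that composing the pipeline runs nothing; only iteration does, and only as far as `take` needs.

```typescript
// Nothing executes until for-await-of starts, and take() stops the
// pipeline early, so later "pages" are never produced.
let pagesFetched = 0;

async function* numbers(): AsyncGenerator<number> {
  for (let page = 0; ; page++) {
    pagesFetched++; // side effect so we can observe how far we got
    for (let i = 0; i < 10; i++) yield page * 10 + i;
  }
}

// Same take helper as above, repeated so this snippet stands alone.
async function* take<T>(source: AsyncIterable<T>, n: number): AsyncGenerator<T> {
  let count = 0;
  for await (const item of source) {
    yield item;
    if (++count >= n) break;
  }
}

async function demo() {
  const pipeline = take(numbers(), 5); // nothing has run yet: pagesFetched === 0
  const out: number[] = [];
  for await (const x of pipeline) out.push(x);
  return { out, pagesFetched }; // out is [0..4]; only the first page was produced
}
```

Breaking out of a `for await...of` (as `take` does) also calls the source generator's `return()`, so upstream generators get a chance to clean up.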
