Streams made easy. Commonly used stream patterns for the Web Streams API and Node.js Streams. Modular, composable, and lightweight.
Works with both Web Streams API and Node.js Streams. Write once, run anywhere — server, browser, or edge.
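For context, the standard Web Streams primitives this interop builds on can be seen in a minimal sketch that uses no datastream code at all, only APIs available in modern browsers, Node.js 18+, and edge runtimes:

```javascript
// Plain Web Streams API: a source, an upper-casing transform, and a sink.
const source = new ReadableStream({
  start (controller) {
    controller.enqueue('hello')
    controller.enqueue('world')
    controller.close()
  }
})

// A transform that upper-cases each chunk as it passes through.
const upperCase = new TransformStream({
  transform (chunk, controller) {
    controller.enqueue(chunk.toUpperCase())
  }
})

const results = []
const sink = new WritableStream({
  write (chunk) {
    results.push(chunk)
  }
})

await source.pipeThrough(upperCase).pipeTo(sink)
// results now holds ['HELLO', 'WORLD']
```

Because the same primitives exist across runtimes, a pipeline written this way runs unchanged on server, browser, or edge.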
Install only what you need. Each stream type is a separate package, keeping your bundle size minimal.
Chain readable, transform, and writable streams together with a simple pipeline() call. Build complex data flows from simple parts.
CSV parsing, compression, character encoding, validation, base64, cryptographic digests — all the patterns you commonly need, ready to use.
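As a point of reference, the compression pattern these helpers package up can be sketched with Node.js built-ins alone. This is a hedged illustration of the underlying stream plumbing, not datastream's implementation:

```javascript
// Round-trip a string through gzip and gunzip using only Node.js built-ins.
import { pipeline } from 'node:stream/promises'
import { Readable, Writable } from 'node:stream'
import { createGzip, createGunzip } from 'node:zlib'

const chunks = []
await pipeline(
  Readable.from(['hello datastream']),
  createGzip(),    // compress
  createGunzip(),  // decompress again, proving the round trip
  new Writable({
    write (chunk, encoding, callback) {
      chunks.push(chunk)
      callback()
    }
  })
)

const roundTripped = Buffer.concat(chunks).toString()
// roundTripped === 'hello datastream'
```

The datastream helpers wrap this kind of boilerplate behind a single factory call so stages compose cleanly in a pipeline.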
Stream data to and from DynamoDB, S3, SQS, SNS, and Lambda with purpose-built stream adapters.
Zero-bloat design. Each package has minimal dependencies and a small install size, perfect for serverless and edge environments.
Build a complete data pipeline in just a few lines:
import { pipeline, createReadableStream } from '@datastream/core'
import { csvParseStream } from '@datastream/csv'
import { validateStream } from '@datastream/validate'
import { gzipCompressStream } from '@datastream/compress'
const streams = [
  createReadableStream(csvData),
  csvParseStream({ header: true }),
  validateStream(schema),
  gzipCompressStream()
]
await pipeline(streams)
Explore the available stream packages.
Pipeline, stream utilities, and factory functions.
DynamoDB, S3, SQS, SNS, and Lambda streams.
Parse and format CSV data.
Brotli, gzip, deflate, and zstd compression.
Character encoding detection, decoding, and encoding.
Schema validation for stream data.
Object manipulation, batching, pivoting, and filtering.
String streams, splitting, replacing, and counting.
Install datastream in seconds and start building stream pipelines today.
Read the docs