# Adapters
Pidgey supports multiple storage backends through adapters. Choose the one that fits your needs now; you can switch later without touching your job code.
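Jobs themselves are defined once and run against whichever adapter is configured. As a rough sketch (the `defineJob` helper and handler signature below are illustrative assumptions, not the documented API; see the API Reference for the real job-definition API):

```ts
// Sketch only: 'defineJob' and its signature are assumed for illustration.
// The point is that the handler contains no adapter-specific code.
import { defineJob } from '@pidgeyjs/core';

export const sendWelcomeEmail = defineJob(
  'send-welcome-email',
  async (payload: { userId: string }) => {
    // ...look up the user and send the email; swapping SQLite for
    // Postgres or Redis changes nothing here, only the config file.
  }
);
```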
## Quick Start: Which Adapter to Choose
- New project / local development → SQLite
- Production with Postgres already in use → PostgreSQL
- High throughput / low latency → Redis
## SQLite
Zero-setup storage for development and small-scale production.
SQLite runs in-process with no external dependencies. Perfect for local development, testing, and single-server deployments.
### When to Use
- Local development & testing
- CI/CD pipelines
- Embedded applications
- Low-traffic production (<200 jobs/sec)
- Single-worker deployments
### Configuration
```ts
import { defineConfig } from '@pidgeyjs/core';

export default defineConfig({
  adapter: 'sqlite',
  filename: './pidgey.db', // Persistent storage
});
```

### In-memory Mode
```ts
export default defineConfig({
  adapter: 'sqlite',
  filename: ':memory:', // No persistence
});
```

In-memory databases are destroyed when the process exits and cannot be shared between processes. Use `:memory:` only for unit tests or single-process apps.
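One common pattern is to switch to `:memory:` only while tests run; the `NODE_ENV` check below is just one way to wire that up:

```ts
import { defineConfig } from '@pidgeyjs/core';

// File-backed storage normally, a throwaway in-memory database under test.
export default defineConfig({
  adapter: 'sqlite',
  filename: process.env.NODE_ENV === 'test' ? ':memory:' : './pidgey.db',
});
```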
### Limitations
- Single worker process (no distribution)
- Lower throughput (~100-200 jobs/sec)
- Polling-based (not event-driven)
## PostgreSQL
Production-ready storage using your existing database.
No separate infrastructure needed—jobs live alongside your app data.
### When to Use
- Production apps
- Multi-worker deployments
- ACID guarantees & durability
- Already using Postgres
- Moderate throughput requirements
### Configuration
```ts
import { defineConfig } from '@pidgeyjs/core';

export default defineConfig({
  adapter: 'postgres',
  connection: process.env.DATABASE_URL,
});
```

### Migrations

```bash
npx pidgey migrate
```

This creates the `_pidgey_jobs` table in your database.
### Performance
- ~180-200 jobs/sec per worker
- Multiple concurrent workers supported
- Polling interval: configurable (default 100ms); a tuning sketch follows this list
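The option name for tuning the interval is not shown here; as a sketch, assuming the adapter exposes it as a top-level `pollInterval` setting in milliseconds:

```ts
import { defineConfig } from '@pidgeyjs/core';

export default defineConfig({
  adapter: 'postgres',
  connection: process.env.DATABASE_URL,
  // 'pollInterval' is an assumed option name for illustration only —
  // check the adapter's documented options. Larger values poll the
  // jobs table less often, trading pickup latency for lower DB load.
  pollInterval: 250,
});
```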
### Limitations
- Higher latency than Redis (~50ms)
- Polling-based
- Requires migrations
## Redis
High-throughput, event-driven queues powered by BullMQ.
### When to Use
- High throughput (>1,000 jobs/sec)
- Millisecond-level latency
- Distributed workers across servers
- Complex queue topologies (priorities, rate limiting)
- Existing Redis infrastructure
### Configuration
```ts
import { defineConfig } from '@pidgeyjs/core';

export default defineConfig({
  adapter: 'redis',
  options: {
    host: process.env.REDIS_HOST,
    port: Number(process.env.REDIS_PORT || 6379),
    password: process.env.REDIS_PASSWORD,
  },
});
```

### Performance
- 10,000+ jobs/sec per worker
- Sub-5ms latency
- Event-driven (no polling)
- Scales horizontally across many workers
### Limitations
- Requires Redis infrastructure
- Additional maintenance overhead
## Adapter Comparison (Quick Overview)
| Feature | SQLite | PostgreSQL | Redis |
|---|---|---|---|
| Setup | ⭐ None | ⭐⭐ Migrations | ⭐⭐⭐ Redis required |
| Throughput | ~200/sec | ~180/sec | 10,000+/sec |
| Latency | <10ms | ~50ms | <5ms |
| Concurrent Workers | 1 | Many | Many |
| Persistence | File or memory | Durable | Durable |
| Best For | Dev & testing | Production apps | High-scale & bursty |
Redis is roughly 50× faster than the database adapters (10,000+ jobs/sec versus ~200) thanks to in-memory storage and event-driven processing. Actual throughput depends on job complexity and worker concurrency.
## Environment-based Configuration
Switch adapters based on NODE_ENV or an explicit environment variable:
```ts
import { defineConfig } from '@pidgeyjs/core';

const adapter = process.env.JOB_ADAPTER || 'sqlite';

export default defineConfig(
  adapter === 'redis'
    ? {
        adapter: 'redis',
        options: { host: process.env.REDIS_HOST!, port: Number(process.env.REDIS_PORT || 6379) },
      }
    : adapter === 'postgres'
      ? { adapter: 'postgres', connection: process.env.DATABASE_URL! }
      : { adapter: 'sqlite', filename: './dev.db' }
);
```

```bash
# Development
JOB_ADAPTER=sqlite npm run dev

# Production with Postgres
JOB_ADAPTER=postgres DATABASE_URL=postgres://... npm start

# High throughput with Redis
JOB_ADAPTER=redis REDIS_HOST=redis.internal npm start
```

## Migration Guide
### SQLite → PostgreSQL
1. Install the Postgres adapter: `npm install @pidgeyjs/postgres`
2. Update your config and connection (see the sketch below).
3. Run migrations: `npx pidgey migrate`
4. Deploy. Jobs work unchanged.
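The config change in step 2 is the same adapter swap shown earlier; for example, assuming the connection string lives in `DATABASE_URL`:

```ts
import { defineConfig } from '@pidgeyjs/core';

// Before: SQLite for local development
// export default defineConfig({ adapter: 'sqlite', filename: './pidgey.db' });

// After: PostgreSQL, reusing the app's existing database
export default defineConfig({
  adapter: 'postgres',
  connection: process.env.DATABASE_URL,
});
```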
### PostgreSQL → Redis
1. Install the Redis adapter: `npm install @pidgeyjs/redis`
2. Update your config to use Redis.
3. Deploy. Jobs migrate automatically.
No code changes required for jobs or handlers.
## Next Steps
- Getting Started — Set up Pidgey
- API Reference — Full API documentation
- Worker Configuration — Configure job processing