TrueSpec

Background Jobs with BullMQ

Queue setup, workers, retries, job scheduling, and monitoring dashboard

What You’ll Build

This guide shows how to offload slow tasks (email sending, image processing, third-party API calls) to background job queues with BullMQ. Jobs are persisted in Redis, so they survive server restarts. It covers job creation, worker processing, automatic retries with exponential backoff, scheduled/recurring jobs, and a monitoring dashboard.

Use Cases & Problems Solved

  • Keep HTTP handlers fast by moving slow work (emails, image processing, external API calls) off the request path
  • Survive crashes and restarts: jobs are persisted in Redis and retried automatically on failure
  • Run delayed and recurring work (reminders, digests, nightly cleanup) without a separate cron setup

Prerequisites

  • Node.js 18+
  • Redis server running

Step-by-Step Implementation

Install BullMQ

Add BullMQ to your project (it needs a running Redis server; see Prerequisites):

npm install bullmq

Define queue and add jobs

Create a queue, set sensible retry defaults, and enqueue jobs from your application code (for example, an API handler):

const { Queue } = require('bullmq');

const emailQueue = new Queue('email', {
  connection: { host: 'localhost', port: 6379 },
  defaultJobOptions: {
    attempts: 3,
    backoff: { type: 'exponential', delay: 1000 },
    removeOnComplete: 100,
    removeOnFail: 500,
  },
});

// Add a job (from your API handler)
await emailQueue.add('welcome-email', {
  to: 'user@example.com',
  subject: 'Welcome!',
  template: 'welcome',
});

// Schedule a job for later
await emailQueue.add('reminder', { userId: 123 }, {
  delay: 24 * 60 * 60 * 1000, // 24 hours from now
});
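The delay option covers one-off scheduling. For recurring work, BullMQ supports repeatable jobs via the repeat option with a cron pattern. A minimal sketch using the emailQueue from above (the 'daily-digest' job name and the cron schedule are illustrative, not from this guide):

```javascript
// Recurring job: registers a repeatable job that BullMQ re-enqueues on schedule.
await emailQueue.add('daily-digest', { type: 'digest' }, {
  repeat: { pattern: '0 2 * * *' }, // every day at 02:00
});
```

Repeatable jobs survive restarts like any other job, since the repeat definition is stored in Redis alongside the queue.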

Create a worker to process jobs

Workers run in a separate process and pull jobs off the queue. The concurrency option controls how many jobs a single worker processes in parallel:

const { Worker } = require('bullmq');

const worker = new Worker('email', async (job) => {
  console.log(`Processing ${job.name} for ${job.data.to}`);
  await sendEmail(job.data);
  return { sent: true };
}, {
  connection: { host: 'localhost', port: 6379 },
  concurrency: 5,
});

worker.on('completed', (job) => console.log(`Job ${job.id} completed`));
// Note: job can be undefined in the 'failed' handler (e.g. if its data was lost)
worker.on('failed', (job, err) => console.error(`Job ${job?.id} failed: ${err.message}`));
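With the queue defaults above (attempts: 3, exponential backoff, delay: 1000), a failed job is retried with waits that roughly double each time. A small sketch of that spacing (the backoffDelay helper is ours, written to mirror BullMQ's built-in exponential strategy):

```javascript
// Exponential backoff spacing: baseDelay * 2^(attempt - 1).
// With delay = 1000 and attempts = 3, retries wait ~1s, ~2s, then ~4s.
function backoffDelay(baseDelayMs, attempt) {
  return baseDelayMs * 2 ** (attempt - 1);
}

console.log([1, 2, 3].map((n) => backoffDelay(1000, n))); // [ 1000, 2000, 4000 ]
```

After the final attempt fails, the job moves to the failed set (kept up to the removeOnFail limit), where it can be inspected or manually retried.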

⚠️ Don’t Do This

❌ Doing slow work inside the HTTP request handler

app.post('/signup', async (req, res) => {
  const user = await createUser(req.body);
  await sendWelcomeEmail(user); // 2-3 seconds!
  await generateAvatar(user);   // 5 seconds!
  res.json(user); // User waits 8+ seconds for signup
});

✅ Queue background work and respond immediately

app.post('/signup', async (req, res) => {
  const user = await createUser(req.body);
  await emailQueue.add('welcome', { userId: user.id });
  await avatarQueue.add('generate', { userId: user.id });
  res.json(user); // Response in ~100ms!
});
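Add a monitoring dashboard

The monitoring dashboard mentioned in the summary can be added with bull-board, a community web UI for BullMQ queues. A sketch assuming the emailQueue from the queue setup step and an Express server (the /admin/queues path and port 3001 are arbitrary choices):

```javascript
// npm install @bull-board/api @bull-board/express
const express = require('express');
const { createBullBoard } = require('@bull-board/api');
const { BullMQAdapter } = require('@bull-board/api/bullMQAdapter');
const { ExpressAdapter } = require('@bull-board/express');

const serverAdapter = new ExpressAdapter();
serverAdapter.setBasePath('/admin/queues');

createBullBoard({
  queues: [new BullMQAdapter(emailQueue)], // emailQueue from the queue setup step
  serverAdapter,
});

const app = express();
app.use('/admin/queues', serverAdapter.getRouter());
app.listen(3001); // dashboard at http://localhost:3001/admin/queues
```

In production, put this behind authentication; the dashboard can retry, promote, and delete jobs.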

Testing

Add these tests to verify jobs are enqueued with the options you expect (they assume a local Redis on the default port and use a throwaway queue name):

// __tests__/queue.test.ts
import { describe, it, expect, afterAll } from 'vitest';
import { Queue } from 'bullmq';

const queue = new Queue('email-test', {
  connection: { host: 'localhost', port: 6379 },
});

describe('Background Jobs with BullMQ', () => {
  it('enqueues a job with its payload', async () => {
    const job = await queue.add('welcome-email', { to: 'user@example.com' });
    expect(job.id).toBeDefined();
    expect(job.data.to).toBe('user@example.com');
  });

  it('applies retry options', async () => {
    const job = await queue.add('reminder', { userId: 123 }, { attempts: 3 });
    expect(job.opts.attempts).toBe(3);
  });

  it('schedules a delayed job', async () => {
    const job = await queue.add('later', {}, { delay: 60_000 });
    expect(await job.getState()).toBe('delayed');
  });

  afterAll(async () => {
    await queue.obliterate({ force: true }); // wipe the test queue from Redis
    await queue.close();
  });
});

Verification

# Start your worker in a separate terminal:
node worker.js

# Add a job from your API:
curl -X POST http://localhost:3000/signup ...

# Check worker terminal for processing output
# Check Redis for job status:
redis-cli KEYS 'bull:email:*'
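You can also inspect queue state programmatically. A quick sketch (the check-queue.js filename is ours), assuming the same local Redis:

```javascript
// check-queue.js – print job counts per state for the 'email' queue.
const { Queue } = require('bullmq');

(async () => {
  const queue = new Queue('email', { connection: { host: 'localhost', port: 6379 } });
  const counts = await queue.getJobCounts('waiting', 'active', 'delayed', 'completed', 'failed');
  console.log(counts);
  await queue.close();
})();
```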

Related Specs

  • API Rate Limiting Strategies (API & Backend, Intermediate): Token bucket, sliding window, Redis-backed, and per-user limits
  • Secure Webhook Handler (API & Backend, Advanced): Signature verification, idempotency, retry handling, and queue processing
  • WebSocket Chat with Socket.io (API & Backend, Intermediate): Rooms, namespaces, auth middleware, reconnection, and scaling