Background Jobs in Node.js with Redis


Last updated December 04, 2024

Table of Contents

  • Getting Started
  • Application Overview

It’s important for a web application to serve end-user requests as fast as possible. A good rule of thumb is to avoid web requests that run longer than 500 milliseconds. If you find that your app has requests that take 1, 2, or more seconds to complete, consider using a background job instead.

For more information about this architectural pattern, read Worker Dynos, Background Jobs and Queueing.

This consideration is even more important for Node.js servers where computationally expensive requests can block the Event Loop. This block prevents the server from responding to any new requests until the computation is complete. Separating this computation into a different process keeps your web server light and responsive.
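
To make the blocking concrete, here's a deliberately CPU-bound function (not part of the sample app) that would stall a Node.js server for every other client while it runs:

```javascript
// A naive recursive Fibonacci: pure CPU work with no await points,
// so the Event Loop can't service any other request while it runs.
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

// Inside an Express handler, a call like fib(40) blocks every concurrent
// request until it returns; moving it to a worker process avoids that:
// app.get('/slow', (req, res) => res.json({ value: fib(40) }));
```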

This article demonstrates worker queues with a sample Node.js application using Bull to manage the queue of background jobs.


This article assumes that you have Redis (for local development) and the Heroku CLI installed.

You can’t complete this tutorial with a Fir-generation app.

Using dynos to complete this tutorial counts towards your usage. We recommend using our low-cost plans to complete this tutorial. Eligible students can apply for platform credits through our new Heroku for GitHub Students program.

Getting Started

Complete these steps to clone and deploy this application to your Heroku account.

Via Dashboard

  1. Click Deploy
  2. Under the Resources tab in the Dashboard, scale the worker process to at least 1 dyno so that jobs are processed.
  3. Open your app in the browser to kick off new jobs and watch them complete.

Via CLI

$ git clone git@github.com:heroku-examples/node-workers-example.git
$ cd node-workers-example
$ heroku create
$ heroku addons:create heroku-redis
$ git push heroku main
$ heroku ps:scale worker=1
$ heroku open

Application Overview

The application consists of two processes.

  • web — An Express server that serves the frontend assets, accepts new background jobs, and reports on the status of existing jobs.
  • worker — A small Node.js process that executes incoming jobs.

You can scale these processes independently based on specific application needs. For more information about Heroku’s process model, read the Process Model article.

The web process serves the index.html and client.js files, which implement a simplified example of a frontend interface that kicks off new jobs and checks in on them.
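
The check-in loop can be sketched as a small polling helper. The `/job/:id` route shape and the injected `fetchJson` helper are assumptions for illustration; the sample app's frontend may differ in detail:

```javascript
// Polls a job-status endpoint until the job reaches a terminal state.
// `fetchJson` is an injected helper (e.g. wrapping fetch) so the loop
// stays easy to test; the route shape is hypothetical.
async function pollJob(fetchJson, id, { intervalMs = 1000, maxAttempts = 60 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const { state } = await fetchJson(`/job/${id}`);
    if (state === 'completed' || state === 'failed') {
      return state; // terminal state: stop polling
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`job ${id} still running after ${maxAttempts} checks`);
}
```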


Web Process

server.js is a tiny Express server. The important points to note are connecting to the Redis server and setting up the named work queue.

let Queue = require('bull');

let REDIS_URL = process.env.REDIS_URL || 'redis://127.0.0.1:6379';

let workQueue = new Queue('work', REDIS_URL);

Another important note is kicking off the job when a POST request comes in.

app.post('/job', async (req, res) => {
  let job = await workQueue.add();
  res.json({ id: job.id });
});

Typically, you don’t give clients direct access to kick off background jobs this way; this simplified example does so for demonstration purposes.
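
For the frontend to check in on a job, the web process also needs a status lookup. Here's a minimal sketch using Bull's `getJob` and `getState` APIs; the helper name and route are illustrative, not the sample app's documented interface:

```javascript
// Looks up a job on the queue and summarizes its state for the client.
// Returns null when the job id is unknown.
async function jobStatus(workQueue, id) {
  const job = await workQueue.getJob(id);
  if (job === null) return null;
  const state = await job.getState(); // e.g. 'waiting', 'active', 'completed'
  return { id: job.id, state };
}

// Wired into server.js it might look like:
// app.get('/job/:id', async (req, res) => {
//   const status = await jobStatus(workQueue, req.params.id);
//   status ? res.json(status) : res.status(404).end();
// });
```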

Worker Process

worker.js spins up a cluster of worker processes using throng. In this example, the job sleeps for a bit before resolving, but it’s a good starting point for writing your own workers.

Two concurrency concepts are important to understand. The first is the number of workers.

let workers = process.env.WEB_CONCURRENCY || 2;

Each worker is a standalone Node.js process with an independent Event Loop. On Heroku dynos, a default value is set for you in the WEB_CONCURRENCY environment variable. This value scales based on the amount of memory on the dyno, but sometimes you must tune it for your specific application. To learn more, read Optimizing Node.js Application Concurrency.

The second concurrency concept is the maximum number of jobs each worker processes at a time.

let maxJobsPerWorker = 50;

workQueue.process(maxJobsPerWorker, async (job) => {
  // ...
});

Each worker picks jobs off the Redis queue and processes them. This setting controls how many jobs each worker attempts to process concurrently. You likely need to tune it for your application: if each job mostly waits on network responses, such as an external API or service, set it higher. If each job is CPU-intensive, consider setting it as low as 1 and spinning up more worker processes instead.
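
A processor function passed to `workQueue.process` might look like the following sketch. The sleep stands in for real work (the sample app's job also just sleeps), and `job.progress()` is Bull's progress-reporting call:

```javascript
// Simulates a long-running job, reporting progress as it goes.
async function processJob(job) {
  let progress = 0;
  while (progress < 100) {
    // Stand-in for real work (a network call, a chunk of computation, etc.)
    await new Promise((resolve) => setTimeout(resolve, 10));
    progress += 20;
    await job.progress(progress); // report progress back to the queue
  }
  return { value: 'done' };
}

// Registered on the queue: workQueue.process(maxJobsPerWorker, processJob);
```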

Client Web App

client.js implements a tiny web frontend so that you can kick off jobs and watch them process. For production use, Bull recommends several official UIs for monitoring the state of your job queue.

What you learned in this article is a small example of Bull’s capabilities. It has many more features, such as:

  • Priority queues
  • Rate limiting
  • Scheduled jobs
  • Retries

For more information on using these features, see the Bull documentation.

Keep reading

  • Working with Node.js
