Apache Kafka on Heroku Add-on Migration


Last updated December 13, 2022

Table of Contents

  • When to Migrate Between Kafka Add-ons?
  • How Do I Handle the Migration?
  • How Do I Migrate Between Add-ons While in a Maintenance Window?
  • How Do I Migrate Between Add-ons Without Entering a Maintenance Window?

Scaling up or down between plan levels of Apache Kafka on Heroku is normally seamless and performed in-place. However, there are a few circumstances when actual data migration is required. This document provides an overview of those conditions and the applicable processes.

When to Migrate Between Kafka Add-ons?

There are three cases where migrating between Kafka add-ons is necessary:

  • You have a multi-tenant Kafka (Kafka Basic) add-on and you want to start using a dedicated Kafka add-on.
  • You have a dedicated Kafka add-on and you want to start using a multi-tenant Kafka (Kafka Basic) add-on.
  • You have a Beta multi-tenant Kafka (Kafka Basic) add-on and the cluster that hosts the add-on is reaching end-of-life.

How Do I Handle the Migration?

In many scenarios, your application can enter a maintenance window and migrate to a new add-on without any code changes. We recommend this approach whenever possible because it greatly reduces the complexity of the migration.

If your application can't enter a maintenance window, you must instead double-write to both sets of topics and cut over from the old add-on to the new one after the new add-on has received writes for longer than your retention time.

How Do I Migrate Between Add-ons While in a Maintenance Window?

The high-level steps for migrating during a maintenance window are:

  1. Provision the new add-on with all relevant topics and consumer groups.
  2. Enter your maintenance window.
  3. Stop your Kafka producers.
  4. Ensure your Kafka consumers are fully caught up.
  5. Switch over to the new add-on.
  6. Start your Kafka producers and consumers.
  7. Exit your maintenance window.
The following commands walk through these steps for an example app named mackerel, migrating from an existing add-on (kafka-symmetrical-26061) to a new one (named kafka-parallel-2019 when it's created):

# Step 1: provision the new add-on and create its topics and consumer groups
$ heroku addons:create heroku-kafka:basic-0 --as NEW_KAFKA -a mackerel
$ heroku kafka:topics:create my-topic-name NEW_KAFKA -a mackerel
$ heroku kafka:consumer-groups:create my-group-name NEW_KAFKA -a mackerel

# Steps 2-4: enter the maintenance window, stop producers, confirm that
# consumers are fully caught up, then stop them
$ heroku maintenance:on -a mackerel
$ heroku ps:scale producer=0 -a mackerel
$ heroku ps:scale consumer=0 -a mackerel

# Step 5: switch over, keeping the old add-on attached as OLD_KAFKA and
# attaching the new add-on (kafka-parallel-2019) as KAFKA
$ heroku addons:attach kafka-symmetrical-26061 --as OLD_KAFKA -a mackerel
$ heroku addons:attach kafka-parallel-2019 --as KAFKA -a mackerel

# Steps 6-7: restart producers and consumers, then exit the maintenance window
$ heroku ps:scale producer=1 consumer=1 -a mackerel
$ heroku maintenance:off -a mackerel

# Once the migration is verified, destroy the old add-on
$ heroku addons:destroy kafka-symmetrical-26061 -a mackerel

How Do I Migrate Between Add-ons Without Entering a Maintenance Window?

The high-level steps for migrating without entering a maintenance window are:

  1. Prepare your app for double-write.
  2. Provision the new Kafka add-on with all relevant topics and consumer groups.
  3. Double-write to both the old and the new add-ons.
  4. Wait for the new add-on to contain the same historical data as the old add-on.
  5. Stop producing to the old add-on.
  6. Destroy the old add-on.

These steps are described in greater detail in this section.

Step 1: Prepare Your App for Double-Write

Your app must support two sets of Kafka config vars (one for each add-on).

This example uses KAFKA_URL, KAFKA_CLIENT_CERT, KAFKA_CLIENT_CERT_KEY, and KAFKA_TRUSTED_CERT for the old Kafka add-on before double-writing begins, and it uses them for the new Kafka add-on after double-writing begins.

This example uses OLD_KAFKA_URL, OLD_KAFKA_CLIENT_CERT, OLD_KAFKA_CLIENT_CERT_KEY, and OLD_KAFKA_TRUSTED_CERT for the old Kafka add-on after double-writing begins. This set of config vars exists only while double-writing is taking place.

Two additional config vars are required, which tell producers and consumers where to write to and read from:

  • PRODUCER_ADDON_NAMES is used by producers to discover which add-ons to write to.
  • CONSUMER_ADDON_NAME is used by consumers to discover which add-on to read from.

You must add support to your app for:

  • Producing to all add-ons specified in PRODUCER_ADDON_NAMES
  • Consuming from the add-on specified in CONSUMER_ADDON_NAME

Consumers must handle duplicate messages idempotently. For more information, see the article on robust usage of Apache Kafka on Heroku.
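
To make the double-write support concrete, here is a minimal sketch in Python, assuming the kafka-python client. The topic name, consumer group, publish helper, and temp-file certificate handling are illustrative placeholders rather than part of Heroku's configuration, and error handling (plus the topic prefix used by multi-tenant Basic plans) is omitted.

import os
import tempfile

from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python


def kafka_settings(addon_name):
    """Build client settings from <NAME>_URL, _CLIENT_CERT, _CLIENT_CERT_KEY, and _TRUSTED_CERT."""
    def pem_path(suffix):
        # kafka-python expects certificate files, so write each PEM config var to a temp file
        f = tempfile.NamedTemporaryFile(mode="w", suffix=".pem", delete=False)
        f.write(os.environ[f"{addon_name}_{suffix}"])
        f.close()
        return f.name

    brokers = [u.replace("kafka+ssl://", "")
               for u in os.environ[f"{addon_name}_URL"].split(",")]
    return {
        "bootstrap_servers": brokers,
        "security_protocol": "SSL",
        "ssl_cafile": pem_path("TRUSTED_CERT"),
        "ssl_certfile": pem_path("CLIENT_CERT"),
        "ssl_keyfile": pem_path("CLIENT_CERT_KEY"),
        "ssl_check_hostname": False,  # typically needed with Heroku Kafka broker certs
    }


# Producers write to every add-on listed in PRODUCER_ADDON_NAMES, for example "OLD_KAFKA,KAFKA".
producers = [KafkaProducer(**kafka_settings(name))
             for name in os.environ["PRODUCER_ADDON_NAMES"].split(",")]

# Consumers read only from the add-on named in CONSUMER_ADDON_NAME.
consumer = KafkaConsumer("my-topic-name",
                         group_id="my-group-name",
                         **kafka_settings(os.environ["CONSUMER_ADDON_NAME"]))


def publish(message: bytes):
    # Double-write: send the same message to both the old and the new cluster
    for producer in producers:
        producer.send("my-topic-name", message)

With this structure, moving producers or consumers later in the migration is purely a config var change; no further code changes are needed.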

Step 2: Provision the New Add-on

Attach your existing Kafka add-on under the name OLD_KAFKA, then provision the new add-on as KAFKA:

$ heroku addons:attach kafka-symmetrical-26061 --as OLD_KAFKA -a mackerel
$ heroku addons:create heroku-kafka:basic-0 --as KAFKA -a mackerel

Step 3: Create Topics and Consumer Groups on the New Add-on

Get a list of topics and consumer groups from your old add-on:

$ heroku kafka:topics OLD_KAFKA -a mackerel
$ heroku kafka:consumer-groups OLD_KAFKA -a mackerel

Now, you can create those topics and consumer groups on your new add-on:

$ heroku kafka:topics:create my-topic-name KAFKA -a mackerel
$ heroku kafka:consumer-groups:create my-group-name KAFKA -a mackerel

Step 4: Double-Write to the Old and New Add-ons

Your app must produce to both sets of topics and consume from the old add-on’s topics while the new add-on’s topics fill with data:

$ heroku config:set PRODUCER_ADDON_NAMES=OLD_KAFKA,KAFKA -a mackerel
$ heroku config:set CONSUMER_ADDON_NAME=OLD_KAFKA -a mackerel

Step 5: Wait for the New Add-on to Contain Enough Historical Data

After the new add-on has been receiving writes for longer than your retention time, both add-ons contain the same retained data, and you can switch your consumers from the old add-on to the new one:

$ heroku config:set CONSUMER_ADDON_NAME=KAFKA -a mackerel
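
If you're not sure what your retention window is, the Heroku Kafka CLI plugin can report per-topic details. For example (assuming the kafka:topics:info command is available in your plugin version):

$ heroku kafka:topics:info my-topic-name OLD_KAFKA -a mackerel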

Step 6: Stop Producing to the Old Add-on

When you’re comfortable consuming from the new add-on, you can stop producing to the old add-on:

$ heroku config:set PRODUCER_ADDON_NAMES=KAFKA -a mackerel

Step 7: Destroy the Old Add-on

Because your app is no longer producing to or consuming from the old add-on, it's safe to destroy it:

$ heroku addons:destroy OLD_KAFKA -a mackerel

Keep reading

  • Apache Kafka on Heroku
