Queue Lanes

One Service Shouldn't Clog the Whole Queue

When multiple services share a single queue and one of them sends a burst of 500 requests, it can occupy every available slot and starve everything else out. Queue lanes give each traffic class its own isolated segment of the queue — a burst from one doesn't starve the others.

Queue Starvation in Practice

You have three services — a user API, a nightly batch job, and an event processor — all calling the same downstream resource. The batch job fires at midnight with 1,000 requests, and every available slot fills up. The user API and event processor are now waiting behind 1,000 batch requests. This is queue starvation: one workload's burst blocking all others, even when those others have higher-value work to do.
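The starvation math is easy to see with a toy model of a shared FIFO queue. Everything here (job shape, drain rate) is invented for illustration and is not part of the SDK:

```typescript
// Toy model: one shared FIFO queue drained at a fixed rate.
type Job = { service: string; enqueuedAt: number };

const queue: Job[] = [];

// The batch job fires first and enqueues 1,000 requests at t=0.
for (let i = 0; i < 1000; i++) {
  queue.push({ service: "batch", enqueuedAt: 0 });
}

// A single user API request arrives one tick later.
queue.push({ service: "user-api", enqueuedAt: 1 });

// With 10 slots draining per tick, find when the user request is served.
const slotsPerTick = 10;
const position = queue.findIndex((j) => j.service === "user-api");
const ticksToServe = Math.ceil((position + 1) / slotsPerTick);
console.log(ticksToServe); // → 101: the user request waits behind all 1,000 batch jobs
```

With lanes, the user request would only compete against other user-api traffic, so its wait time would not depend on the size of the batch burst at all.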

Lanes Keep Traffic Classes Isolated

Each lane gets a segment of queue capacity. A burst in the batch lane doesn't reach into the user-api lane's capacity. You get strict isolation without needing separate resources per service — a single resource, segmented by lane.

```typescript
import { ratequeue } from "@ratequeue/sdk";

// User API requests — in the "user-api" lane
await ratequeue.acquire(
  "downstream-api",
  { lane: "user-api", priority: 100, apiKey: process.env.RATEQUEUE_API_KEY! },
  async () => { await callDownstreamApi(request); }
);

// Batch job requests — isolated in their own lane
await ratequeue.acquire(
  "downstream-api",
  { lane: "batch", priority: 1, apiKey: process.env.RATEQUEUE_API_KEY! },
  async () => { await processRecord(record); }
);

// Event processor — in its own lane
await ratequeue.acquire(
  "downstream-api",
  { lane: "events", priority: 50, apiKey: process.env.RATEQUEUE_API_KEY! },
  async () => { await handleEvent(event); }
);
```

Allocation Strategies

Two strategies are available, configured per resource in the dashboard — switching between them requires no code changes.

Strict allocation

Each lane gets a fixed portion of capacity. Simple and predictable — each traffic class has a guaranteed slice regardless of the others.

Strict within lane

Lanes remain isolated from each other, while requests within a lane are served in strict priority order. Capacity that an idle lane isn't using is made available to busier lanes.
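The lending of idle capacity can be sketched as a two-pass allocator. This is a simplified model of the behavior described above, with invented names and numbers, not the service's actual scheduler:

```typescript
// Hypothetical work-conserving allocator: lanes keep their base share,
// but slots an idle lane isn't using are lent to lanes with unmet demand.
function allocate(
  baseSlots: Record<string, number>,
  demand: Record<string, number>
): Record<string, number> {
  const grant: Record<string, number> = {};
  let spare = 0;
  // First pass: each lane takes up to its base share; the rest goes spare.
  for (const lane of Object.keys(baseSlots)) {
    grant[lane] = Math.min(baseSlots[lane], demand[lane] ?? 0);
    spare += baseSlots[lane] - grant[lane];
  }
  // Second pass: lend spare slots to lanes that still have demand.
  for (const lane of Object.keys(baseSlots)) {
    const unmet = (demand[lane] ?? 0) - grant[lane];
    const lent = Math.min(unmet, spare);
    grant[lane] += lent;
    spare -= lent;
  }
  return grant;
}

// user-api is idle right now, so batch can borrow its slots.
const grants = allocate(
  { "user-api": 50, "batch": 20 },
  { "user-api": 0, "batch": 60 }
);
console.log(grants); // → batch gets its 20 plus 40 borrowed, for 60 total
```

The moment user-api demand returns, its base share is granted first, so borrowing never turns into starvation.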

Lanes + Reserved Capacity

Lanes and reserved capacity work together. A lane can have its own reserved capacity block — guaranteeing that even a specific traffic class within a lane always has headroom, regardless of how busy the overall resource is. This is the combination that provides hard isolation for critical workloads.
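One way to picture the combination is a per-resource config where a lane carries its own reserved block. The shape below is a hypothetical sketch of the dashboard concept, not a real SDK call or config format:

```typescript
// Hypothetical resource config: the user-api lane reserves 10 slots
// that no other lane can ever borrow, on top of its normal share.
type LaneConfig = { name: string; share: number; reservedSlots?: number };

const resourceConfig = {
  resource: "downstream-api",
  totalSlots: 100,
  lanes: [
    { name: "user-api", share: 0.5, reservedSlots: 10 },
    { name: "events", share: 0.3 },
    { name: "batch", share: 0.2 },
  ] as LaneConfig[],
};

// Even if batch saturates everything else, user-api can always admit
// work immediately up to its reserved slots.
const reserved = resourceConfig.lanes
  .reduce((sum, l) => sum + (l.reservedSlots ?? 0), 0);
console.log(reserved); // → 10 slots held back for user-api at all times
```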

Isolate your traffic classes before they collide

Lanes and reserved capacity are available on the paid plan. Start free to validate the integration.