---
title: "Cloudflare Queues - Managed Message Queue Service"
description: "A managed message queue service for reliable, asynchronous message delivery between Cloudflare Workers or to services built anywhere."
url: "https://www.cloudflare.com/products/queues"
---

# Queues

> Cloudflare Queues enables reliable, asynchronous message delivery between Cloudflare Workers — or to services built anywhere.

## Key Features

- Reliable message delivery
- Batching and delays
- Automatic retries
- Dead-letter queues
- Pull-based consumers
- Event subscriptions
- Built-in observability
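
Most of these features are configured declaratively rather than in code. As a hedged sketch (the queue names `my-queue` and `my-dlq` are placeholders), a producer binding plus a consumer's batching, retry, and dead-letter settings might look like this in `wrangler.toml`:

```toml
# Producer binding: lets a Worker send to the queue as env.MY_QUEUE
[[queues.producers]]
queue = "my-queue"
binding = "MY_QUEUE"

# Consumer settings: batching, retries, and a dead-letter queue
[[queues.consumers]]
queue = "my-queue"
max_batch_size = 10          # deliver up to 10 messages per batch...
max_batch_timeout = 5        # ...or after 5 seconds, whichever comes first
max_retries = 3              # retry a failed message up to 3 times
dead_letter_queue = "my-dlq" # then route it here instead of dropping it
```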

## Benefits

### Asynchronous Message Processing

Queues lets you offload work from the request path, so users get an immediate response while the work is handled reliably and asynchronously in the background.

### Built for Workers Platform

Seamlessly integrated with Cloudflare Workers for easy setup and management within your existing workflow.

### Event Subscriptions

Use Queues to subscribe to events from products such as Workers KV, R2, Workers AI, and Vectorize, then get notified and take action programmatically in a Worker.
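
At the consuming end, an event subscription is handled like any other queue message. The exact payload shape depends on the source product, so the fields below (`type`, `object.key`) are illustrative assumptions rather than a documented schema:

```typescript
// Hypothetical event shape -- check each product's docs for the actual schema
interface StorageEvent {
  type: string;              // e.g. an object-created event (assumed field)
  object?: { key: string };  // e.g. the affected R2 object (assumed field)
}

// Pure routing helper, kept separate so the decision logic is easy to test
export function describeEvent(event: StorageEvent): string {
  if (event.type === 'object-create' && event.object) {
    return `new object: ${event.object.key}`;
  }
  return `unhandled event: ${event.type}`;
}

// Consumer Worker wired to the subscription's queue
export default {
  async queue(batch: { messages: { body: StorageEvent; ack(): void }[] }) {
    for (const message of batch.messages) {
      console.log(describeEvent(message.body));
      message.ack();
    }
  }
};
```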

## Use Cases

### ETL Pipelines

Reliably buffer large volumes of data for your ETL pipelines without overwhelming your databases or data warehouses. Ensure no data is lost during traffic spikes and process information at a manageable, consistent pace.

### Asynchronous User-Lifecycle Tasks

Offload time-consuming user-lifecycle tasks like sending welcome emails or processing profile pictures to a background worker. Keep your application's user interface fast and responsive by handling these jobs asynchronously.
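
One hedged sketch of this pattern, assuming a `LIFECYCLE_QUEUE` binding and illustrative job fields: Queues producers accept a per-message `delaySeconds` option, so a follow-up task can be enqueued now but delivered later.

```typescript
// Hypothetical job shape for user-lifecycle work
interface LifecycleJob {
  task: 'welcome_email' | 'resize_avatar';
  userId: string;
}

// Pure helper: build the batch entries, including one delayed follow-up job
export function buildSignupJobs(userId: string): { body: LifecycleJob; delaySeconds?: number }[] {
  return [
    { body: { task: 'welcome_email', userId } },
    // Deliver the avatar-processing job 60 seconds later (delaySeconds is per-message)
    { body: { task: 'resize_avatar', userId }, delaySeconds: 60 }
  ];
}

// Producer Worker: enqueue the jobs and return immediately
export default {
  async fetch(request: { json(): Promise<unknown> }, env: { LIFECYCLE_QUEUE: { sendBatch(msgs: object[]): Promise<void> } }) {
    const { userId } = await request.json() as { userId: string };
    await env.LIFECYCLE_QUEUE.sendBatch(buildSignupJobs(userId));
    return new Response('Signup tasks queued', { status: 202 });
  }
};
```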

### Web Crawlers

Build distributed and resilient web crawlers by using Queues to manage the list of URLs to be fetched and processed. Easily scale your crawling infrastructure by adding more consumers and automatically retry failed jobs to ensure comprehensive data collection.

## Code Examples

### Basic Queue Operations

Send messages to a queue from one Worker (the producer) and process them asynchronously in another (the consumer).

```typescript
// Producer Worker - Send messages to queue
export default {
  async fetch(request, env) {
    const { searchParams } = new URL(request.url);
    const message = searchParams.get('message');
    
    if (!message) {
      return new Response('Missing message parameter', { status: 400 });
    }

    // Send message to queue
    await env.MY_QUEUE.send({
      message: message,
      timestamp: new Date().toISOString(),
      userId: request.headers.get('X-User-ID')
    });

    return new Response('Message sent to queue', { status: 200 });
  }
};

// Consumer Worker - Process messages from queue
export default {
  async queue(batch, env, ctx) {
    for (const message of batch.messages) {
      try {
        // Process the message
        console.log('Processing message:', message.body);
        
        // Simulate some work
        await new Promise(resolve => setTimeout(resolve, 1000));
        
        // Acknowledge the message
        message.ack();
      } catch (error) {
        console.error('Failed to process message:', error);
        message.retry();
      }
    }
  }
};
```

### ETL Pipeline with Queues

Build reliable ETL pipelines by buffering data through queues.

```typescript
// Data ingestion worker
export default {
  async fetch(request, env) {
    const data = await request.json();
    
    // Send data to ETL queue for processing
    await env.ETL_QUEUE.send({
      type: 'data_ingestion',
      payload: data,
      timestamp: new Date().toISOString(),
      source: 'api'
    });

    return new Response('Data queued for processing', { status: 200 });
  }
};

// ETL processing worker
export default {
  async queue(batch, env, ctx) {
    for (const message of batch.messages) {
      try {
        const { type, payload } = message.body;
        
        if (type === 'data_ingestion') {
          // Transform the data
          const transformedData = await transformData(payload);
          
          // Store in data warehouse
          await env.DATA_WAREHOUSE.prepare(
            'INSERT INTO processed_data (data, processed_at) VALUES (?, ?)'
          ).bind(JSON.stringify(transformedData), new Date().toISOString()).run();
        }
        
        message.ack();
      } catch (error) {
        console.error('ETL processing failed:', error);
        message.retry();
      }
    }
  }
};

async function transformData(data) {
  // Your data transformation logic here
  return {
    ...data,
    processed: true,
    transformedAt: new Date().toISOString()
  };
}
```
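
Since the consumer already receives messages in batches, the row-by-row insert above can be collapsed into a single D1 `batch()` call, which executes the statements in one round trip. A sketch under the same assumptions as the example above (same `DATA_WAREHOUSE` binding and `processed_data` table); the `Env` typing here is a minimal structural stand-in, not the generated Workers types:

```typescript
// Minimal structural typing for the D1 binding used in the ETL example above
interface Env {
  DATA_WAREHOUSE: {
    prepare(sql: string): { bind(...values: unknown[]): unknown };
    batch(statements: unknown[]): Promise<unknown>;
  };
}

// Pure helper: turn queue message payloads into (data, processed_at) rows
export function toRows(bodies: { payload: unknown }[]): [string, string][] {
  const now = new Date().toISOString();
  return bodies.map((b): [string, string] => [JSON.stringify(b.payload), now]);
}

// Consumer Worker: insert the whole batch in one D1 round trip
export default {
  async queue(batch: { messages: { body: { payload: unknown }; ack(): void }[] }, env: Env) {
    const sql = 'INSERT INTO processed_data (data, processed_at) VALUES (?, ?)';
    const rows = toRows(batch.messages.map((m) => m.body));
    await env.DATA_WAREHOUSE.batch(
      rows.map(([data, at]) => env.DATA_WAREHOUSE.prepare(sql).bind(data, at))
    );
    for (const message of batch.messages) message.ack();
  }
};
```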

### Web Crawler with Queues

Build distributed web crawlers using queues to manage URL processing.

```typescript
// URL discovery worker
export default {
  async fetch(request, env) {
    const { searchParams } = new URL(request.url);
    const startUrl = searchParams.get('url');
    
    if (!startUrl) {
      return new Response('Missing URL parameter', { status: 400 });
    }

    // Add initial URL to crawl queue
    await env.CRAWL_QUEUE.send({
      url: startUrl,
      depth: 0,
      maxDepth: 3
    });

    return new Response('Crawling started', { status: 200 });
  }
};

// Crawler worker
export default {
  async queue(batch, env, ctx) {
    for (const message of batch.messages) {
      try {
        const { url, depth, maxDepth } = message.body;
        
        // Fetch the page
        const response = await fetch(url);
        const html = await response.text();
        
        // Extract links
        const links = extractLinks(html, url);
        
        // Process the page content
        await processPage(url, html);
        
        // Add new links to queue if within depth limit
        if (depth < maxDepth) {
          for (const link of links) {
            await env.CRAWL_QUEUE.send({
              url: link,
              depth: depth + 1,
              maxDepth: maxDepth
            });
          }
        }
        
        message.ack();
      } catch (error) {
        console.error('Crawling failed:', error);
        message.retry();
      }
    }
  }
};

function extractLinks(html, baseUrl) {
  // Simple regex-based extraction; a production crawler should use an HTML parser
  const linkRegex = /<a[^>]+href=["']([^"']+)["'][^>]*>/gi;
  const links = [];
  let match;
  
  while ((match = linkRegex.exec(html)) !== null) {
    try {
      const absoluteUrl = new URL(match[1], baseUrl);
      // Only enqueue http(s) links; skip mailto:, javascript:, etc.
      if (absoluteUrl.protocol === 'http:' || absoluteUrl.protocol === 'https:') {
        links.push(absoluteUrl.toString());
      }
    } catch {
      // Ignore hrefs that are not valid URLs
    }
  }
  
  return links;
}

async function processPage(url, html) {
  // Your page processing logic here
  console.log(`Processed page: ${url}`);
}
```
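
As written, the crawler above will re-enqueue pages it has already visited whenever two pages link to each other. One hedged way to add deduplication is a visited set in Workers KV; the `visited` binding name and the one-day TTL below are assumptions for illustration. Normalize each URL, skip it if its KV key already exists, and record it before enqueueing:

```typescript
// Normalize URLs so trivially different forms dedupe to a single key
export function normalizeUrl(raw: string): string {
  const u = new URL(raw);
  u.hash = ''; // #fragments never change the fetched page
  return u.toString();
}

// Hypothetical KV-backed visited check, called before CRAWL_QUEUE.send()
export async function shouldCrawl(
  url: string,
  visited: { get(k: string): Promise<string | null>; put(k: string, v: string, o?: object): Promise<void> }
): Promise<boolean> {
  const key = normalizeUrl(url);
  if (await visited.get(key) !== null) return false;      // already seen
  await visited.put(key, '1', { expirationTtl: 86400 });  // remember for a day
  return true;
}
```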

## Resources

- [Full Documentation](https://developers.cloudflare.com/queues): Complete technical documentation
- [Get Started](https://dash.cloudflare.com/sign-up): Sign up and start building
- [Pricing](/plans.md): See pricing details

## Related Products

- [Artifacts](/products/artifacts.md): Git-native versioned storage
- [Cache Reserve](/products/cache-reserve.md): Persistent caching for static content
- [D1](/products/d1.md): Serverless SQL
- [Data Platform](/products/data-platform.md): Ingest, Catalog & Query

---

*This is a markdown version of [https://www.cloudflare.com/products/queues](https://www.cloudflare.com/products/queues) for AI/LLM consumption.*
