Integration · Intermediate · 10 min

Batch Processing Guide

Process large datasets efficiently

Learn how to use batch endpoints for bulk operations, optimize chunk sizes, and handle partial failures in large data processing workflows.

01

When to Use Batch Processing

Batch processing is ideal when you need to process hundreds or thousands of records. Instead of issuing one request per record, you send items in bulk, which cuts per-request overhead, reduces rate-limit pressure, and improves overall throughput.

02

Optimal Chunk Sizes

Different endpoints have different optimal chunk sizes. Generally, 100-500 items per batch provides the best balance of throughput and reliability.

typescript
class="text-pink-400">const CHUNK_SIZE = class="text-amber-400">100;

class="text-pink-400">async class="text-pink-400">function processBatch(items: string[]) {
  class="text-pink-400">const chunks = [];
  class="text-pink-400">for (class="text-pink-400">let i = class="text-amber-400">0; i < items.length; i += CHUNK_SIZE) {
    chunks.push(items.slice(i, i + CHUNK_SIZE));
  }

  class="text-pink-400">return Promise.all(
    chunks.map(chunk =>
      fetch(class="text-emerald-class="text-amber-400">400">'/api/batch/verify', {
        method: class="text-emerald-class="text-amber-400">400">'POST',
        body: JSON.stringify({ items: chunk })
      })
    )
  );
}
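
For example, processing 1,000 record IDs with this helper issues ten parallel requests of 100 items each (the record-ID format here is made up for illustration):

typescript
// 1,000 hypothetical record IDs -> ten parallel requests of CHUNK_SIZE items
const ids = Array.from({ length: 1000 }, (_, i) => `record-${i}`);

processBatch(ids).then(responses => {
  console.log(`${responses.length} batch requests completed`);
});
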
03

Handling Partial Failures

Batch responses include success and failure arrays. Implement retry logic for failed items while storing successful results.
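
A minimal retry sketch, assuming the endpoint responds with `succeeded` and `failed` arrays; the exact response schema may differ, so check the batch endpoint's documentation:

typescript
// Assumed response shape -- verify against the actual batch API schema
interface BatchResult {
  succeeded: string[];
  failed: string[];
}

async function processWithRetries(items: string[], maxAttempts = 3) {
  const results: string[] = [];
  let pending = items;

  for (let attempt = 1; attempt <= maxAttempts && pending.length > 0; attempt++) {
    const res = await fetch('/api/batch/verify', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ items: pending })
    });
    const { succeeded, failed }: BatchResult = await res.json();

    results.push(...succeeded); // store successful results immediately
    pending = failed;           // retry only the failed items

    if (pending.length > 0) {
      // Exponential backoff before the next attempt: 2s, 4s, 8s, ...
      await new Promise(r => setTimeout(r, 2 ** attempt * 1000));
    }
  }

  return { results, unrecovered: pending };
}
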

04

Progress Tracking

For very large batches, use webhook notifications to track progress without polling.
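
A minimal Node receiver sketch for such notifications; the event payload shape ({ batchId, processed, total, status }) and the webhook path are assumptions for illustration, not a documented schema:

typescript
import { createServer } from 'node:http';

// Assumed progress-event payload -- confirm the real webhook schema
interface BatchProgressEvent {
  batchId: string;
  processed: number;
  total: number;
  status: 'in_progress' | 'completed' | 'failed';
}

createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/webhooks/batch-progress') {
    let body = '';
    req.on('data', chunk => (body += chunk));
    req.on('end', () => {
      const event: BatchProgressEvent = JSON.parse(body);
      console.log(
        `Batch ${event.batchId}: ${event.processed}/${event.total} (${event.status})`
      );
      res.writeHead(200).end();
    });
  } else {
    res.writeHead(404).end();
  }
}).listen(3000);
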

Ready to implement?

Explore our API documentation to start building with Olympus.