
Batch-AI – A TypeScript SDK for Batch AI Calls Across Providers


batch-ai


A unified TypeScript SDK for making batch AI requests across different model providers. Process thousands of prompts efficiently using official batch APIs from OpenAI and Anthropic.

Inspired by the Vercel AI SDK, this library provides a unified interface for batch processing across AI providers. Just as Vercel's SDK lets developers switch between LLM providers without changing application code, batch-ai lets you handle large-scale batch processing with the same simplicity: write once, run with any supported provider.

Features

  • πŸš€ Unified Interface: Single API for multiple AI providers
  • πŸ”’ Type Safety: Full TypeScript support with Zod schema validation
  • πŸ“¦ Provider Support:
    • OpenAI (gpt-4o, etc.)
    • Anthropic (Claude 3.5 Sonnet, etc.)
    • Coming Soon:
      • Google (Gemini)
      • xAI (Grok)
      • Want another provider? Open an issue!
  • πŸ› οΈ Batch Operations:
    • createObjectBatch: Generate structured outputs (JSON) from prompts
    • Coming Soon:
      • generateTextBatch: Generate free-form text responses
      • Want to speed up text batch development? Open an issue!
  • ⚑ Performance: Process thousands of prompts efficiently
  • πŸ” Error Handling: Robust error handling with detailed error types

Installation

npm install batch-ai
# or
yarn add batch-ai
# or
pnpm add batch-ai

Quick Start

API Key Configuration

You can configure your API keys in one of two ways:

  1. Environment Variables (Recommended):
# .env
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-...
  2. Explicit Configuration:
const model = openai('gpt-4o', {
  apiKey: 'sk-...', // Your OpenAI API key
});

// or
const model = anthropic('claude-3-5-sonnet-20241022', {
  apiKey: 'sk-...', // Your Anthropic API key
});

Basic Usage

import { z } from 'zod';
import { openai, createObjectBatch, getObjectBatch } from 'batch-ai';

// Define your output schema using Zod
const responseSchema = z.object({
  sentiment: z.enum(['positive', 'negative', 'neutral']),
  confidence: z.number().min(0).max(1),
});

// Initialize the model
const model = openai('gpt-4o', {
  apiKey: process.env.OPENAI_API_KEY, // Optional if set in environment
});

// Prepare your batch requests
const requests = [
  {
    customId: 'review-1',
    input: 'I absolutely love this product! Best purchase ever.',
  },
  {
    customId: 'review-2',
    input: 'This is terrible, would not recommend.',
  },
];

// Create a new batch
const { batchId } = await createObjectBatch({
  model,
  requests,
  outputSchema: responseSchema,
});

// Later, retrieve the batch results
const { batch, results } = await getObjectBatch({
  model,
  batchId,
});

// Check batch status
if (batch.status === 'completed' && results) {
  console.log('Results:', results);
  // [
  //   {
  //     customId: 'review-1',
  //     output: { sentiment: 'positive', confidence: 0.98 }
  //   },
  //   {
  //     customId: 'review-2',
  //     output: { sentiment: 'negative', confidence: 0.95 }
  //   }
  // ]
}
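Batch jobs are asynchronous, so in practice getObjectBatch is called in a polling loop until the batch reaches a terminal status. A minimal polling sketch, assuming the status values listed under BatchStatus below; the pollUntilDone helper and its parameters are illustrative, not part of batch-ai:

```typescript
// Illustrative polling helper: repeatedly checks a batch's status until it
// reaches a terminal state or the attempt budget is exhausted.
type BatchStatus =
  | 'validating'
  | 'in_progress'
  | 'completed'
  | 'failed'
  | 'expired'
  | 'cancelling'
  | 'cancelled';

const TERMINAL_STATES: BatchStatus[] = ['completed', 'failed', 'expired', 'cancelled'];

async function pollUntilDone(
  checkStatus: () => Promise<BatchStatus>, // e.g. wrap getObjectBatch and return batch.status
  intervalMs = 5_000,
  maxAttempts = 60,
): Promise<BatchStatus> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await checkStatus();
    if (TERMINAL_STATES.includes(status)) return status;
    // Wait before the next status check.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Batch did not finish within the polling budget');
}
```

Batch APIs can take minutes to hours to complete, so the interval and attempt budget should be tuned to your workload.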

Supported Providers

OpenAI

import { openai } from 'batch-ai';

const model = openai('gpt-4o', {
  apiKey: process.env.OPENAI_API_KEY,
});

Anthropic

import { anthropic } from 'batch-ai';

const model = anthropic('claude-3-5-sonnet-20241022', {
  apiKey: process.env.ANTHROPIC_API_KEY,
});

API Reference

Factory Functions

openai(modelId: OpenAIModel, config?: LanguageModelConfig)

Creates an OpenAI language model instance.

interface LanguageModelConfig {
  apiKey?: string;
}

anthropic(modelId: AnthropicModel, config?: LanguageModelConfig)

Creates an Anthropic language model instance.

Batch Operations

createObjectBatch

Creates a new batch of requests.

interface CreateObjectBatchParams {
  model: LanguageModel;
  requests: BatchRequest<string>[];
  outputSchema: z.ZodSchema<unknown>;
}

interface CreateObjectBatchResponse {
  batchId: string;
}

getObjectBatch

Retrieves batch status and results.

interface GetObjectBatchParams {
  model: LanguageModel;
  batchId: string;
}

// Returns (TOutput is inferred from the batch's outputSchema):
// Promise<{
//   batch: Batch;
//   results?: BatchResponse<TOutput>[];
// }>

Types

BatchStatus

type BatchStatus =
  | 'validating'
  | 'in_progress'
  | 'completed'
  | 'failed'
  | 'expired'
  | 'cancelling'
  | 'cancelled';

BatchResponse<T>

interface BatchResponse<T> {
  customId: string;
  output?: T;
  error?: {
    code: string;
    message: string;
  };
  usage?: {
    promptTokens: number;
    completionTokens: number;
    totalTokens: number;
  };
}
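Because each request in a batch can succeed or fail independently, results are usually split on the presence of the error field. A small sketch of that pattern; partitionResults is an illustrative helper, not part of the SDK:

```typescript
// Shape of a single batch result, mirroring the BatchResponse<T> type above.
interface BatchResponse<T> {
  customId: string;
  output?: T;
  error?: { code: string; message: string };
}

// Illustrative helper: split batch results into successes and failures
// based on whether an error is present on each entry.
function partitionResults<T>(results: BatchResponse<T>[]): {
  succeeded: BatchResponse<T>[];
  failed: BatchResponse<T>[];
} {
  const succeeded: BatchResponse<T>[] = [];
  const failed: BatchResponse<T>[] = [];
  for (const result of results) {
    (result.error ? failed : succeeded).push(result);
  }
  return { succeeded, failed };
}
```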

Error Handling

The SDK throws typed BatchError instances:

class BatchError extends Error {
  constructor(message: string, public code: string, public batchId?: string);
}

Common error codes:

  • batch_creation_failed: Failed to create a new batch
  • batch_retrieval_failed: Failed to retrieve batch status
  • results_not_ready: Batch results are not yet available
  • results_retrieval_failed: Failed to retrieve batch results
  • batch_cancellation_failed: Failed to cancel batch
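These codes are typically matched in a catch block, since results_not_ready signals a retry while the other codes are genuine failures. A sketch of that distinction, with BatchError restated locally so the example is self-contained; the shouldRetry helper is illustrative, not part of batch-ai:

```typescript
// BatchError as declared above, restated so this sketch compiles on its own.
class BatchError extends Error {
  constructor(
    message: string,
    public code: string,
    public batchId?: string,
  ) {
    super(message);
    this.name = 'BatchError';
  }
}

// Illustrative handler: 'results_not_ready' means the batch is still running
// and the caller should poll again later; other codes are surfaced as errors.
function shouldRetry(err: unknown): boolean {
  return err instanceof BatchError && err.code === 'results_not_ready';
}
```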

Contributing

Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.