Resources
The OcrivaClient exposes five resource objects. Each resource groups related API methods and handles serialization, authentication, and error mapping for you.
| Resource | Property | Description |
|---|---|---|
| Upload | client.upload | Upload files for OCR processing |
| Templates | client.templates | Create and manage extraction templates |
| Projects | client.projects | Query accessible projects |
| Processing History | client.processingHistory | Access OCR results and statistics |
| Batch | client.batch | Upload and track multiple files at once |
Upload
Upload a File
Upload a single file for OCR processing. The SDK accepts a browser File, a Blob, or a Uint8Array (including a Node.js Buffer, which is a Uint8Array subclass).
```typescript
async file(
  file: File | Blob | Uint8Array,
  params: UploadFileParams,
): Promise<UploadResponse>
```
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| file | File \| Blob \| Uint8Array | Yes | The binary file to upload |
| params.fileName | string | Yes | Display name of the file |
| params.fileType | FileType | Yes | Broad category: 'pdf', 'image', or 'document' |
| params.fileSize | string | Yes | File size in bytes as a string |
| params.mimeType | string | Yes | MIME type, e.g. 'application/pdf' |
| params.projectId | string | Yes | Project to associate this upload with |
| params.templateId | string | No | Template to apply during processing |
| params.batchId | string | No | Add this file to an existing batch |
| params.uploadType | string | No | Optional upload type hint |
| params.metadata | Record<string, unknown> | No | Custom key-value metadata |
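The required fields mirror properties already present on a browser File object, so they can be derived mechanically. A hypothetical convenience helper — not part of the SDK, and the MIME-to-fileType mapping below is an assumption:

```typescript
// Hypothetical helper (not part of the SDK): derive the required upload
// params from any File-like object. The MIME-to-fileType mapping is an
// assumption, not documented behavior.
function paramsFromFile(
  file: { name: string; size: number; type: string },
  projectId: string,
) {
  const fileType =
    file.type === 'application/pdf'
      ? 'pdf'
      : file.type.startsWith('image/')
        ? 'image'
        : 'document';
  return {
    fileName: file.name,
    fileType,
    fileSize: String(file.size), // the API expects bytes as a string
    mimeType: file.type,
    projectId,
  };
}
```

With this, `client.upload.file(file, paramsFromFile(file, 'proj_abc123'))` covers the required fields; optional fields such as templateId can be spread in afterwards.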
Examples
Upload a File object from a browser `<input>`:
```typescript
import { OcrivaClient } from '@ocriva/sdk';

const client = new OcrivaClient({ apiKey: process.env.OCRIVA_API_KEY! });

async function handleFileInput(file: File) {
  const result = await client.upload.file(file, {
    fileName: file.name,
    fileType: 'pdf',
    fileSize: String(file.size),
    mimeType: file.type,
    projectId: 'proj_abc123',
  });
  console.log('Upload ID:', result.id);
  console.log('Status:', result.status);
}
```
Upload a Buffer in Node.js:
```typescript
import { readFileSync } from 'node:fs';

const buffer = readFileSync('./receipt.jpg');

const result = await client.upload.file(buffer, {
  fileName: 'receipt.jpg',
  fileType: 'image',
  fileSize: String(buffer.byteLength),
  mimeType: 'image/jpeg',
  projectId: 'proj_abc123',
  templateId: 'tmpl_xyz789',
});
```
Upload a Blob with custom metadata:
```typescript
const response = await fetch('https://example.com/document.pdf');
const blob = await response.blob();

const result = await client.upload.file(blob, {
  fileName: 'document.pdf',
  fileType: 'pdf',
  fileSize: String(blob.size),
  mimeType: 'application/pdf',
  projectId: 'proj_abc123',
  metadata: { source: 'crm', customerId: 'cust_999' },
});
```
Response shape (UploadResponse)
```typescript
{
  id: string;
  fileName: string;
  fileUrl: string;
  publicUrl: string;
  filePath: string;
  fileSize: string;
  mimeType: string;
  status: 'uploaded' | 'processing' | 'completed' | 'failed';
  templateId?: string;
  projectId: string;
  uploadedAt: string;
  userId: string;
  organizationId?: string;
  bucketName?: string;
}
```
Templates
List Templates
List all templates accessible to the current token.
```typescript
async list(params?: ListTemplatesParams): Promise<Template[]>
```
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| params.projectId | string | No | Filter by project ID |
| params.page | number | No | Page number (1-based) |
| params.limit | number | No | Items per page |
```typescript
// List all templates
const templates = await client.templates.list();

// Filter by project with pagination
const page1 = await client.templates.list({
  projectId: 'proj_abc123',
  page: 1,
  limit: 20,
});
console.log(`Found ${page1.length} templates`);
```
Create a Template
Create a new extraction template.
```typescript
async create(params: CreateTemplateParams): Promise<Template>
```
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| params.name | string | Yes | Template name |
| params.description | string | Yes | Template description |
| params.templateSchema | Record<string, unknown> | Yes | JSON schema defining fields to extract |
| params.projectId | string | Yes | Project this template belongs to |
| params.instructions | string | No | Custom AI assistant instructions |
```typescript
const template = await client.templates.create({
  name: 'Invoice Template',
  description: 'Extracts line items, totals, and vendor information from invoices',
  projectId: 'proj_abc123',
  templateSchema: {
    type: 'object',
    properties: {
      invoiceNumber: { type: 'string', description: 'Invoice number' },
      vendorName: { type: 'string', description: 'Vendor or supplier name' },
      totalAmount: { type: 'number', description: 'Total invoice amount' },
      dueDate: { type: 'string', description: 'Payment due date (YYYY-MM-DD)' },
      lineItems: {
        type: 'array',
        items: {
          type: 'object',
          properties: {
            description: { type: 'string' },
            quantity: { type: 'number' },
            unitPrice: { type: 'number' },
          },
        },
      },
    },
  },
  instructions: 'Extract all monetary values as numbers without currency symbols.',
});
console.log('Template ID:', template.templateId);
```
Update a Template
Update an existing template. All fields except projectId are optional.
```typescript
async update(id: string, params: UpdateTemplateParams): Promise<Template>
```
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| id | string | Yes | Template ID to update |
| params.projectId | string | Yes | Project ID (used to scope the update) |
| params.name | string | No | New template name |
| params.description | string | No | New description |
| params.templateSchema | Record<string, unknown> | No | Updated JSON schema |
| params.instructions | string | No | Updated AI instructions |
```typescript
// Partial update — only change the instructions
const updated = await client.templates.update('tmpl_xyz789', {
  projectId: 'proj_abc123',
  instructions: 'Always return amounts in USD. If currency is not USD, convert it.',
});
```
Delete a Template
Soft-delete a template. The template becomes inactive and is no longer available for new uploads.
```typescript
async delete(id: string): Promise<void>
```
```typescript
await client.templates.delete('tmpl_xyz789');
// Returns void — no response body
```
Upload a Data File
Attach a data file (JSON, Markdown, or plain text) to a template. The AI uses the file as additional context during extraction.
```typescript
async uploadFile(templateId: string, file: File | Blob): Promise<Template>
```
```typescript
import { readFileSync } from 'node:fs';

// Attach a reference guide as a Markdown file
const mdContent = readFileSync('./field-guide.md');
const blob = new Blob([mdContent], { type: 'text/markdown' });

const updatedTemplate = await client.templates.uploadFile('tmpl_xyz789', blob);
console.log('Template ID:', updatedTemplate.templateId);
```
Delete a Data File
Remove an attached data file from a template.
```typescript
async deleteFile(templateId: string, fileId: string): Promise<Template>
```
```typescript
const updatedTemplate = await client.templates.deleteFile(
  'tmpl_xyz789',
  'file_abc456',
);
```
Projects
List Projects
List projects accessible to the current token.
- Project token: returns the single project the token is bound to.
- Organization token: returns all projects in the organization.
```typescript
async list(params?: ListProjectsParams): Promise<Project[]>
```
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| params.projectId | string | No | Filter to a specific project (for organization tokens) |
```typescript
// List all accessible projects
const projects = await client.projects.list();

for (const project of projects) {
  console.log(`${project.id}: ${project.name} (active: ${project.isActive})`);
}

// Scope to one project using an organization token
const filtered = await client.projects.list({ projectId: 'proj_abc123' });
```
Response shape (Project)
```typescript
{
  id: string;
  name: string;
  description?: string;
  organizationId: string;
  isActive: boolean;
  createdAt: string;
  updatedAt: string;
}
```
Processing History
List Processing History
List processing history records with optional filtering and pagination.
```typescript
async list(
  params?: ListProcessingHistoryParams,
): Promise<PaginatedResponse<ProcessingHistory>>
```
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| params.projectId | string | No | Project ID (required for organization tokens) |
| params.status | ProcessingStatus | No | Filter by status |
| params.templateId | string | No | Filter by template |
| params.priority | ProcessingPriority | No | Filter by priority |
| params.search | string | No | Search by filename or template name |
| params.fromDate | string | No | Start date filter (YYYY-MM-DD) |
| params.toDate | string | No | End date filter (YYYY-MM-DD) |
| params.batchId | string | No | Filter by batch ID |
| params.page | number | No | Page number (1-based) |
| params.limit | number | No | Items per page |
```typescript
// List completed records for a project
const result = await client.processingHistory.list({
  projectId: 'proj_abc123',
  status: 'completed',
  page: 1,
  limit: 50,
});

console.log(`Page ${result.pagination.page} of ${result.pagination.pages}`);
console.log(`Total records: ${result.pagination.total}`);

for (const record of result.data) {
  console.log(`${record.id}: ${record.fileName} — ${record.status}`);
}
```
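The pagination fields shown above can drive a loop that collects every record. A minimal sketch — the helper is not part of the SDK and assumes only the data / pagination shape returned by list:

```typescript
// Hypothetical helper (not part of the SDK): walk every page of a
// paginated list endpoint and collect the records into one array.
interface Page<T> {
  data: T[];
  pagination: { page: number; pages: number; total: number };
}

async function listAllPages<T>(
  fetchPage: (page: number) => Promise<Page<T>>,
): Promise<T[]> {
  const all: T[] = [];
  let page = 1;
  let pages = 1;
  do {
    const result = await fetchPage(page);
    all.push(...result.data);
    pages = result.pagination.pages; // total page count from the response
    page += 1;
  } while (page <= pages);
  return all;
}
```

For example: `await listAllPages((page) => client.processingHistory.list({ projectId: 'proj_abc123', page, limit: 100 }))`.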
```typescript
// Filter by date range
const recent = await client.processingHistory.list({
  projectId: 'proj_abc123',
  fromDate: '2026-03-01',
  toDate: '2026-03-31',
});

// Filter by template
const byTemplate = await client.processingHistory.list({
  projectId: 'proj_abc123',
  templateId: 'tmpl_xyz789',
  status: 'failed',
});
```
Get a Record
Retrieve a single processing history record by ID.
```typescript
async get(id: string, projectId?: string): Promise<ProcessingHistory>
```
```typescript
const record = await client.processingHistory.get(
  'hist_abc123',
  'proj_abc123', // required for organization tokens
);
console.log('Status:', record.status);
console.log('Result:', record.processingResult);
```
Export a Result
Download a processing result as a Blob in the specified format.
```typescript
async getResult(id: string, params?: ExportResultParams): Promise<Blob>
```
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| id | string | Yes | Processing history record ID |
| params.format | string | No | Output format: 'json', 'text', 'xml', 'html', 'csv', 'pdf', 'docx' |
| params.projectId | string | No | Project ID (required for organization tokens) |
```typescript
import { writeFileSync } from 'node:fs';

// Download as JSON and parse
const blob = await client.processingHistory.getResult('hist_abc123', {
  format: 'json',
  projectId: 'proj_abc123',
});
const data = JSON.parse(await blob.text());
console.log(data);

// Download as PDF and save to disk (Node.js)
const pdfBlob = await client.processingHistory.getResult('hist_abc123', {
  format: 'pdf',
  projectId: 'proj_abc123',
});
const arrayBuffer = await pdfBlob.arrayBuffer();
writeFileSync('./result.pdf', Buffer.from(arrayBuffer));

// Download as CSV
const csvBlob = await client.processingHistory.getResult('hist_abc123', {
  format: 'csv',
  projectId: 'proj_abc123',
});
const csv = await csvBlob.text();
console.log(csv);
```
Get Statistics
Get aggregated processing statistics for a project.
```typescript
async stats(params: { projectId: string }): Promise<ProcessingHistoryStats>
```
```typescript
const stats = await client.processingHistory.stats({
  projectId: 'proj_abc123',
});

console.log('Total:', stats.total);
console.log('Pending:', stats.pending);
console.log('Processing:', stats.processing);
console.log('Completed:', stats.completed);
console.log('Failed:', stats.failed);
```
Response shape (ProcessingHistoryStats)
```typescript
{
  total: number;
  pending: number;
  processing: number;
  completed: number;
  failed: number;
}
```
Batch
Use the batch resource to upload and track multiple files as a single job. Each file is processed independently; the batch record tracks overall progress and lets you export all results at once.
Upload a Batch
Upload multiple files as a single batch. Supports up to 50 files per call.
```typescript
async upload(
  files: (File | Blob)[],
  params: BatchUploadParams,
): Promise<Batch>
```
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| files | (File \| Blob)[] | Yes | Array of files to upload (max 50) |
| params.projectId | string | Yes | Project to associate this batch with |
| params.templateId | string | No | Template to apply to all files |
| params.name | string | No | Batch name (auto-generated if omitted) |
| params.metadata | Record<string, unknown> | No | Custom key-value metadata |
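Because a single call accepts at most 50 files, larger sets must be split across several batches. A small sketch of that chunking — a hypothetical helper, not part of the SDK:

```typescript
// Hypothetical helper (not part of the SDK): split an oversized file
// array into chunks that respect the documented 50-file-per-call limit.
function chunkFiles<T>(files: T[], size = 50): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < files.length; i += size) {
    chunks.push(files.slice(i, i + size));
  }
  return chunks;
}
```

Each chunk can then be uploaded in its own call, e.g. `for (const chunk of chunkFiles(allFiles)) { await client.batch.upload(chunk, { projectId: 'proj_abc123' }); }`.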
```typescript
import { readFileSync } from 'node:fs';

const files = ['invoice-01.pdf', 'invoice-02.pdf', 'invoice-03.pdf'].map(
  (name) => {
    const buffer = readFileSync(`./${name}`);
    return new File([buffer], name, { type: 'application/pdf' });
  },
);

const batch = await client.batch.upload(files, {
  projectId: 'proj_abc123',
  templateId: 'tmpl_xyz789',
  name: 'March Invoices',
  metadata: { month: 'march', year: '2026' },
});

console.log('Batch ID:', batch.id);
console.log('Total files:', batch.totalFiles);
console.log('Status:', batch.status);
```
List Batches
List batches with optional filtering and pagination.
```typescript
async list(params?: ListBatchParams): Promise<PaginatedResponse<Batch>>
```
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| params.projectId | string | No | Project ID (required for organization tokens) |
| params.status | BatchStatus | No | Filter by status |
| params.page | number | No | Page number (1-based) |
| params.limit | number | No | Items per page |
```typescript
const result = await client.batch.list({
  projectId: 'proj_abc123',
  status: 'completed',
  page: 1,
  limit: 10,
});

console.log(`Total batches: ${result.pagination.total}`);
for (const batch of result.data) {
  console.log(`${batch.id}: ${batch.name} — ${batch.progress}% complete`);
}
```
Get Batch Details
Retrieve a single batch by ID, including current progress.
```typescript
async get(batchId: string): Promise<Batch>
```
```typescript
const batch = await client.batch.get('batch_abc123');

console.log('Status:', batch.status);
console.log('Progress:', batch.progress, '%');
console.log(`${batch.completedFiles} / ${batch.totalFiles} files done`);
console.log('Failed:', batch.failedFiles);
```
Response shape (Batch)
```typescript
{
  id: string;
  name: string;
  projectId: string;
  organizationId: string;
  userId: string;
  templateId?: string;
  templateName?: string;
  totalFiles: number;
  completedFiles: number;
  failedFiles: number;
  status: BatchStatus; // 'uploading' | 'processing' | 'completed' | 'partially_failed' | 'failed'
  uploadIds: string[];
  processingHistoryIds: string[];
  progress: number; // 0–100
  metadata?: Record<string, unknown>;
  createdAt: string;
  updatedAt: string;
}
```
Export Batch Results
Export all completed processing results for a batch as a single file.
```typescript
async exportResults(batchId: string, params: ExportBatchParams): Promise<Blob>
```
Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| batchId | string | Yes | Batch ID |
| params.format | string | Yes | Export format: 'json', 'csv', 'text', 'xml', 'pdf', 'docx' |
```typescript
import { writeFileSync } from 'node:fs';

// Export as JSON and parse
const blob = await client.batch.exportResults('batch_abc123', {
  format: 'json',
});
const results = JSON.parse(await blob.text());
console.log(`Exported ${results.length} results`);

// Export as CSV and save (Node.js)
const csvBlob = await client.batch.exportResults('batch_abc123', {
  format: 'csv',
});
writeFileSync('./batch-results.csv', Buffer.from(await csvBlob.arrayBuffer()));
```
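Putting the batch methods together: upload, poll get until the batch reaches a terminal status, then export. The polling helper below is a sketch, not part of the SDK; it assumes only the status and progress fields documented in the Batch response shape above, and the interval/attempt limits are arbitrary.

```typescript
// Hypothetical polling helper (not part of the SDK). `BatchLike` captures
// only the method used here; pass the real client.batch resource in practice.
interface BatchLike {
  get(batchId: string): Promise<{ status: string; progress: number }>;
}

// Terminal statuses from the documented BatchStatus union.
const TERMINAL = new Set(['completed', 'partially_failed', 'failed']);

async function waitForBatch(
  batch: BatchLike,
  batchId: string,
  intervalMs = 5000,
  maxAttempts = 60,
): Promise<string> {
  for (let i = 0; i < maxAttempts; i++) {
    const { status, progress } = await batch.get(batchId);
    console.log(`Batch ${batchId}: ${status} (${progress}%)`);
    if (TERMINAL.has(status)) return status;
    await new Promise<void>((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Batch ${batchId} did not finish after ${maxAttempts} polls`);
}
```

A typical flow would then be: `const status = await waitForBatch(client.batch, batch.id);` followed by `client.batch.exportResults(batch.id, { format: 'json' })` when the status is 'completed'.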