Complete method reference for upload, templates, projects, processing history, and batch resources.


Published: 4/1/2026

Resources

The OcrivaClient exposes five resource objects. Each resource groups related API methods and handles serialization, authentication, and error mapping for you.

| Resource | Property | Description |
| --- | --- | --- |
| Upload | `client.upload` | Upload files for OCR processing |
| Templates | `client.templates` | Create and manage extraction templates |
| Projects | `client.projects` | Query accessible projects |
| Processing History | `client.processingHistory` | Access OCR results and statistics |
| Batch | `client.batch` | Upload and track multiple files at once |

Upload

Upload a File

Upload a single file for OCR processing. The SDK accepts a browser File, a Blob, or a Node.js Buffer / Uint8Array.

async file(
  file: File | Blob | Uint8Array,
  params: UploadFileParams,
): Promise<UploadResponse>

Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `file` | `File \| Blob \| Uint8Array` | Yes | The binary file to upload |
| `params.fileName` | `string` | Yes | Display name of the file |
| `params.fileType` | `FileType` | Yes | Broad category: `'pdf'`, `'image'`, or `'document'` |
| `params.fileSize` | `string` | Yes | File size in bytes as a string |
| `params.mimeType` | `string` | Yes | MIME type, e.g. `'application/pdf'` |
| `params.projectId` | `string` | Yes | Project to associate this upload with |
| `params.templateId` | `string` | No | Template to apply during processing |
| `params.batchId` | `string` | No | Add this file to an existing batch |
| `params.uploadType` | `string` | No | Optional upload type hint |
| `params.metadata` | `Record<string, unknown>` | No | Custom key-value metadata |

Examples

Upload a File object from a browser `<input>`:

import { OcrivaClient } from '@ocriva/sdk';
 
const client = new OcrivaClient({ apiKey: process.env.OCRIVA_API_KEY! });
 
async function handleFileInput(file: File) {
  const result = await client.upload.file(file, {
    fileName: file.name,
    fileType: 'pdf',
    fileSize: String(file.size),
    mimeType: file.type,
    projectId: 'proj_abc123',
  });
 
  console.log('Upload ID:', result.id);
  console.log('Status:', result.status);
}
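Most of the required params can be derived mechanically from the browser `File` itself. A minimal sketch of that derivation; note that `inferFileType` and `buildUploadParams` are hypothetical helpers, not part of the SDK, and the MIME-to-category mapping is an assumption:

```typescript
type FileType = 'pdf' | 'image' | 'document';

// Assumed mapping from MIME type to the SDK's broad FileType category.
function inferFileType(mimeType: string): FileType {
  if (mimeType === 'application/pdf') return 'pdf';
  if (mimeType.startsWith('image/')) return 'image';
  return 'document';
}

// Build the required upload params from any File-like object.
function buildUploadParams(
  file: { name: string; size: number; type: string },
  projectId: string,
) {
  return {
    fileName: file.name,
    fileType: inferFileType(file.type),
    fileSize: String(file.size), // the API expects bytes as a string
    mimeType: file.type,
    projectId,
  };
}
```

With a helper like this, the call above reduces to `client.upload.file(file, buildUploadParams(file, 'proj_abc123'))`.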

Upload a Buffer in Node.js:

import { readFileSync } from 'node:fs';
 
const buffer = readFileSync('./receipt.jpg');
 
const result = await client.upload.file(buffer, {
  fileName: 'receipt.jpg',
  fileType: 'image',
  fileSize: String(buffer.byteLength),
  mimeType: 'image/jpeg',
  projectId: 'proj_abc123',
  templateId: 'tmpl_xyz789',
});

Upload a Blob with custom metadata:

const response = await fetch('https://example.com/document.pdf');
const blob = await response.blob();
 
const result = await client.upload.file(blob, {
  fileName: 'document.pdf',
  fileType: 'pdf',
  fileSize: String(blob.size),
  mimeType: 'application/pdf',
  projectId: 'proj_abc123',
  metadata: { source: 'crm', customerId: 'cust_999' },
});

Response shape (UploadResponse)

{
  id: string;
  fileName: string;
  fileUrl: string;
  publicUrl: string;
  filePath: string;
  fileSize: string;
  mimeType: string;
  status: 'uploaded' | 'processing' | 'completed' | 'failed';
  templateId?: string;
  projectId: string;
  uploadedAt: string;
  userId: string;
  organizationId?: string;
  bucketName?: string;
}
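Because `status` is a four-value string union, an exhaustive `switch` is a convenient way to react to it; the compiler will then flag any status value added in a future SDK version. A sketch (the messages are illustrative, not SDK output):

```typescript
type UploadStatus = 'uploaded' | 'processing' | 'completed' | 'failed';

// Exhaustive switch over the UploadResponse status union. The `never`
// assignment in the default branch makes missed cases a compile error.
function describeStatus(status: UploadStatus): string {
  switch (status) {
    case 'uploaded':
      return 'Queued for OCR';
    case 'processing':
      return 'OCR in progress';
    case 'completed':
      return 'Results ready';
    case 'failed':
      return 'Processing failed';
    default: {
      const unreachable: never = status;
      return unreachable;
    }
  }
}
```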

Templates

List Templates

List all templates accessible to the current token.

async list(params?: ListTemplatesParams): Promise<Template[]>

Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `params.projectId` | `string` | No | Filter by project ID |
| `params.page` | `number` | No | Page number (1-based) |
| `params.limit` | `number` | No | Items per page |

// List all templates
const templates = await client.templates.list();
 
// Filter by project with pagination
const page1 = await client.templates.list({
  projectId: 'proj_abc123',
  page: 1,
  limit: 20,
});
 
console.log(`Found ${page1.length} templates`);
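`list()` returns a plain array with no pagination metadata, so one way to walk every page is to keep requesting until a page comes back shorter than `limit`. A sketch of that pattern; the generic `Lister` shape is an assumption, but any of the SDK's `page`/`limit` list methods fits it:

```typescript
type Lister<T> = (params: { page: number; limit: number }) => Promise<T[]>;

// Fetch every page: a page shorter than `limit` signals the end.
async function listAll<T>(list: Lister<T>, limit = 50): Promise<T[]> {
  const all: T[] = [];
  for (let page = 1; ; page++) {
    const batch = await list({ page, limit });
    all.push(...batch);
    if (batch.length < limit) break;
  }
  return all;
}
```

For example: `listAll((p) => client.templates.list({ projectId: 'proj_abc123', ...p }))`. When the total count is an exact multiple of `limit`, the loop makes one extra request that returns an empty page, then stops.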

Create a Template

Create a new extraction template.

async create(params: CreateTemplateParams): Promise<Template>

Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `params.name` | `string` | Yes | Template name |
| `params.description` | `string` | Yes | Template description |
| `params.templateSchema` | `Record<string, unknown>` | Yes | JSON schema defining fields to extract |
| `params.projectId` | `string` | Yes | Project this template belongs to |
| `params.instructions` | `string` | No | Custom AI assistant instructions |

const template = await client.templates.create({
  name: 'Invoice Template',
  description: 'Extracts line items, totals, and vendor information from invoices',
  projectId: 'proj_abc123',
  templateSchema: {
    type: 'object',
    properties: {
      invoiceNumber: { type: 'string', description: 'Invoice number' },
      vendorName: { type: 'string', description: 'Vendor or supplier name' },
      totalAmount: { type: 'number', description: 'Total invoice amount' },
      dueDate: { type: 'string', description: 'Payment due date (YYYY-MM-DD)' },
      lineItems: {
        type: 'array',
        items: {
          type: 'object',
          properties: {
            description: { type: 'string' },
            quantity: { type: 'number' },
            unitPrice: { type: 'number' },
          },
        },
      },
    },
  },
  instructions: 'Extract all monetary values as numbers without currency symbols.',
});
 
console.log('Template ID:', template.templateId);
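For flat templates, the `templateSchema` boilerplate can be generated from a list of field specs instead of written by hand. A small sketch; `FieldSpec` and `buildTemplateSchema` are hypothetical, not SDK types:

```typescript
interface FieldSpec {
  name: string;
  type: 'string' | 'number' | 'boolean';
  description: string;
}

// Turn a flat field list into the JSON-schema object create() expects.
function buildTemplateSchema(fields: FieldSpec[]) {
  return {
    type: 'object',
    properties: Object.fromEntries(
      fields.map((f) => [f.name, { type: f.type, description: f.description }]),
    ),
  };
}
```

The result can be passed directly as `templateSchema` in `client.templates.create()`; nested structures like the `lineItems` array above still need to be spelled out explicitly.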

Update a Template

Update an existing template. All fields except projectId are optional.

async update(id: string, params: UpdateTemplateParams): Promise<Template>

Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `id` | `string` | Yes | Template ID to update |
| `params.projectId` | `string` | Yes | Project ID (used to scope the update) |
| `params.name` | `string` | No | New template name |
| `params.description` | `string` | No | New description |
| `params.templateSchema` | `Record<string, unknown>` | No | Updated JSON schema |
| `params.instructions` | `string` | No | Updated AI instructions |

// Partial update — only change the instructions
const updated = await client.templates.update('tmpl_xyz789', {
  projectId: 'proj_abc123',
  instructions: 'Always return amounts in USD. If currency is not USD, convert it.',
});

Delete a Template

Soft-delete a template. The template becomes inactive and is no longer available for new uploads.

async delete(id: string): Promise<void>

await client.templates.delete('tmpl_xyz789');
// Returns void; no response body

Upload a Data File

Attach a data file (JSON, Markdown, or plain text) to a template. The AI uses the file as additional context during extraction.

async uploadFile(templateId: string, file: File | Blob): Promise<Template>

import { readFileSync } from 'node:fs';
 
// Attach a reference schema as a Markdown file
const mdContent = readFileSync('./field-guide.md');
const blob = new Blob([mdContent], { type: 'text/markdown' });
 
const updatedTemplate = await client.templates.uploadFile('tmpl_xyz789', blob);
console.log('Template ID:', updatedTemplate.templateId);

Delete a Data File

Remove an attached data file from a template.

async deleteFile(templateId: string, fileId: string): Promise<Template>

const updatedTemplate = await client.templates.deleteFile(
  'tmpl_xyz789',
  'file_abc456',
);

Projects

List Projects

List projects accessible to the current token.

  • Project token: returns the single project the token is bound to.
  • Organization token: returns all projects in the organization.

async list(params?: ListProjectsParams): Promise<Project[]>

Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `params.projectId` | `string` | No | Filter to a specific project (for organization tokens) |

// List all accessible projects
const projects = await client.projects.list();
 
for (const project of projects) {
  console.log(`${project.id}: ${project.name} (active: ${project.isActive})`);
}
 
// Scope to one project using an organization token
const filtered = await client.projects.list({ projectId: 'proj_abc123' });

Response shape (Project)

{
  id: string;
  name: string;
  description?: string;
  organizationId: string;
  isActive: boolean;
  createdAt: string;
  updatedAt: string;
}

Processing History

List Processing History

List processing history records with optional filtering and pagination.

async list(
  params?: ListProcessingHistoryParams,
): Promise<PaginatedResponse<ProcessingHistory>>

Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `params.projectId` | `string` | No | Project ID (required for organization tokens) |
| `params.status` | `ProcessingStatus` | No | Filter by status |
| `params.templateId` | `string` | No | Filter by template |
| `params.priority` | `ProcessingPriority` | No | Filter by priority |
| `params.search` | `string` | No | Search by filename or template name |
| `params.fromDate` | `string` | No | Start date filter (YYYY-MM-DD) |
| `params.toDate` | `string` | No | End date filter (YYYY-MM-DD) |
| `params.batchId` | `string` | No | Filter by batch ID |
| `params.page` | `number` | No | Page number (1-based) |
| `params.limit` | `number` | No | Items per page |

// List completed records for a project
const result = await client.processingHistory.list({
  projectId: 'proj_abc123',
  status: 'completed',
  page: 1,
  limit: 50,
});
 
console.log(`Page ${result.pagination.page} of ${result.pagination.pages}`);
console.log(`Total records: ${result.pagination.total}`);
 
for (const record of result.data) {
  console.log(`${record.id}: ${record.fileName} — ${record.status}`);
}
 
// Filter by date range
const recent = await client.processingHistory.list({
  projectId: 'proj_abc123',
  fromDate: '2026-03-01',
  toDate: '2026-03-31',
});
 
// Filter by template
const byTemplate = await client.processingHistory.list({
  projectId: 'proj_abc123',
  templateId: 'tmpl_xyz789',
  status: 'failed',
});
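`fromDate` and `toDate` take `YYYY-MM-DD` strings, so a small helper that expands a calendar month into that pair keeps date filters consistent. A sketch (`monthRange` is not part of the SDK):

```typescript
// Build { fromDate, toDate } covering one calendar month.
// `month` is 1-based; day 0 of the *next* month is this month's last day.
function monthRange(year: number, month: number) {
  const pad = (n: number) => String(n).padStart(2, '0');
  const lastDay = new Date(Date.UTC(year, month, 0)).getUTCDate();
  return {
    fromDate: `${year}-${pad(month)}-01`,
    toDate: `${year}-${pad(month)}-${pad(lastDay)}`,
  };
}
```

The date-range example above then becomes `client.processingHistory.list({ projectId: 'proj_abc123', ...monthRange(2026, 3) })`.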

Get a Record

Retrieve a single processing history record by ID.

async get(id: string, projectId?: string): Promise<ProcessingHistory>

const record = await client.processingHistory.get(
  'hist_abc123',
  'proj_abc123', // required for organization tokens
);
 
console.log('Status:', record.status);
console.log('Result:', record.processingResult);

Export a Result

Download a processing result as a Blob in the specified format.

async getResult(id: string, params?: ExportResultParams): Promise<Blob>

Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `id` | `string` | Yes | Processing history record ID |
| `params.format` | `string` | No | Output format: `'json'`, `'text'`, `'xml'`, `'html'`, `'csv'`, `'pdf'`, `'docx'` |
| `params.projectId` | `string` | No | Project ID (required for organization tokens) |

import { writeFileSync } from 'node:fs';
 
// Download as JSON and parse
const blob = await client.processingHistory.getResult('hist_abc123', {
  format: 'json',
  projectId: 'proj_abc123',
});
const data = JSON.parse(await blob.text());
console.log(data);
 
// Download as PDF and save to disk (Node.js)
const pdfBlob = await client.processingHistory.getResult('hist_abc123', {
  format: 'pdf',
  projectId: 'proj_abc123',
});
const arrayBuffer = await pdfBlob.arrayBuffer();
writeFileSync('./result.pdf', Buffer.from(arrayBuffer));
 
// Download as CSV
const csvBlob = await client.processingHistory.getResult('hist_abc123', {
  format: 'csv',
  projectId: 'proj_abc123',
});
const csv = await csvBlob.text();
console.log(csv);

Get Statistics

Get aggregated processing statistics for a project.

async stats(params: { projectId: string }): Promise<ProcessingHistoryStats>

const stats = await client.processingHistory.stats({
  projectId: 'proj_abc123',
});
 
console.log('Total:', stats.total);
console.log('Pending:', stats.pending);
console.log('Processing:', stats.processing);
console.log('Completed:', stats.completed);
console.log('Failed:', stats.failed);

Response shape (ProcessingHistoryStats)

{
  total: number;
  pending: number;
  processing: number;
  completed: number;
  failed: number;
}
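The five counters make simple derived metrics easy to compute client-side, for example the share of finished records that succeeded. A sketch (`completionRate` is not an SDK method):

```typescript
interface ProcessingHistoryStats {
  total: number;
  pending: number;
  processing: number;
  completed: number;
  failed: number;
}

// Percentage of finished records (completed + failed) that completed.
// Returns 0 when nothing has finished yet, to avoid dividing by zero.
function completionRate(stats: ProcessingHistoryStats): number {
  const finished = stats.completed + stats.failed;
  return finished === 0 ? 0 : Math.round((stats.completed / finished) * 100);
}
```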

Batch

Use the batch resource to upload and track multiple files as a single job. Each file is processed independently; the batch record tracks overall progress and lets you export all results at once.

Upload a Batch

Upload multiple files as a single batch. Supports up to 50 files per call.

async upload(
  files: (File | Blob)[],
  params: BatchUploadParams,
): Promise<Batch>

Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `files` | `(File \| Blob)[]` | Yes | Array of files to upload (max 50) |
| `params.projectId` | `string` | Yes | Project to associate this batch with |
| `params.templateId` | `string` | No | Template to apply to all files |
| `params.name` | `string` | No | Batch name (auto-generated if omitted) |
| `params.metadata` | `Record<string, unknown>` | No | Custom key-value metadata |

import { readFileSync } from 'node:fs';
 
const files = ['invoice-01.pdf', 'invoice-02.pdf', 'invoice-03.pdf'].map(
  (name) => {
    const buffer = readFileSync(`./${name}`);
    return new File([buffer], name, { type: 'application/pdf' });
  },
);
 
const batch = await client.batch.upload(files, {
  projectId: 'proj_abc123',
  templateId: 'tmpl_xyz789',
  name: 'March Invoices',
  metadata: { month: 'march', year: '2026' },
});
 
console.log('Batch ID:', batch.id);
console.log('Total files:', batch.totalFiles);
console.log('Status:', batch.status);

List Batches

List batches with optional filtering and pagination.

async list(params?: ListBatchParams): Promise<PaginatedResponse<Batch>>

Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `params.projectId` | `string` | No | Project ID (required for organization tokens) |
| `params.status` | `BatchStatus` | No | Filter by status |
| `params.page` | `number` | No | Page number (1-based) |
| `params.limit` | `number` | No | Items per page |

const result = await client.batch.list({
  projectId: 'proj_abc123',
  status: 'completed',
  page: 1,
  limit: 10,
});
 
console.log(`Total batches: ${result.pagination.total}`);
for (const batch of result.data) {
  console.log(`${batch.id}: ${batch.name} — ${batch.progress}% complete`);
}

Get Batch Details

Retrieve a single batch by ID, including current progress.

async get(batchId: string): Promise<Batch>

const batch = await client.batch.get('batch_abc123');
 
console.log('Status:', batch.status);
console.log('Progress:', batch.progress, '%');
console.log(`${batch.completedFiles} / ${batch.totalFiles} files done`);
console.log('Failed:', batch.failedFiles);
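Status and progress only change server-side, so callers typically poll `get()` until the batch reaches a terminal status. A generic sketch of that loop; the interval and attempt limits are illustrative, and `pollUntil` is not part of the SDK:

```typescript
// Poll `get` until `done` returns true, waiting `intervalMs` between
// attempts; throws if the predicate never passes.
async function pollUntil<T>(
  get: () => Promise<T>,
  done: (value: T) => boolean,
  intervalMs = 2000,
  maxAttempts = 30,
): Promise<T> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const value = await get();
    if (done(value)) return value;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`Not done after ${maxAttempts} attempts`);
}
```

For a batch, the terminal check could be `pollUntil(() => client.batch.get('batch_abc123'), (b) => b.status !== 'uploading' && b.status !== 'processing')`.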

Response shape (Batch)

{
  id: string;
  name: string;
  projectId: string;
  organizationId: string;
  userId: string;
  templateId?: string;
  templateName?: string;
  totalFiles: number;
  completedFiles: number;
  failedFiles: number;
  status: BatchStatus; // 'uploading' | 'processing' | 'completed' | 'partially_failed' | 'failed'
  uploadIds: string[];
  processingHistoryIds: string[];
  progress: number; // 0–100
  metadata?: Record<string, unknown>;
  createdAt: string;
  updatedAt: string;
}
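`progress` is reported by the server, but the counter fields let you derive a similar figure client-side, e.g. when aggregating several batches. A sketch; the formula (finished files over total) is an assumption for illustration, not the server's documented definition:

```typescript
// Derive a 0-100 progress figure from the batch counters:
// files that finished (completed or failed) over the total.
function derivedProgress(batch: {
  totalFiles: number;
  completedFiles: number;
  failedFiles: number;
}): number {
  if (batch.totalFiles === 0) return 0;
  return Math.round(
    ((batch.completedFiles + batch.failedFiles) / batch.totalFiles) * 100,
  );
}
```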

Export Batch Results

Export all completed processing results for a batch as a single file.

async exportResults(batchId: string, params: ExportBatchParams): Promise<Blob>

Parameters

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| `batchId` | `string` | Yes | Batch ID |
| `params.format` | `string` | Yes | Export format: `'json'`, `'csv'`, `'text'`, `'xml'`, `'pdf'`, `'docx'` |

import { writeFileSync } from 'node:fs';
 
// Export as JSON and parse
const blob = await client.batch.exportResults('batch_abc123', {
  format: 'json',
});
const results = JSON.parse(await blob.text());
console.log(`Exported ${results.length} results`);
 
// Export as CSV and save (Node.js)
const csvBlob = await client.batch.exportResults('batch_abc123', {
  format: 'csv',
});
writeFileSync('./batch-results.csv', Buffer.from(await csvBlob.arrayBuffer()));
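Large exports can fail transiently (network hiccups, timeouts), so it is common to wrap `exportResults` in a small retry loop with exponential backoff. A generic sketch; the delays and attempt count are illustrative, and no built-in SDK retry is documented here:

```typescript
// Retry an async operation with exponential backoff, rethrowing the
// last error once all attempts are exhausted.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}
```

Usage: `withRetry(() => client.batch.exportResults('batch_abc123', { format: 'json' }))`.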