API Documentation
Integrate Agents for Data's file conversion and data processing capabilities into your applications.
Authentication
All API requests require authentication using an API key. Include your API key in the X-API-Key
header:
X-API-Key: tbl_live_xxxxxxxxxxxxx
Getting an API Key
- Sign in to your TabLab workspace
- Navigate to Settings → API Keys
- Click "Generate New API Key"
- Configure permissions (e.g., `datasets:*:write` for dataset creation)
- Copy the key immediately (it won't be shown again)
API Key Permissions
API keys support scoped permissions to limit access:
- `datasets:*:read` - Read access to all datasets in the workspace
- `datasets:*:write` - Write access (create, update, delete datasets)
- `datasets:dataset-id:read` - Read access to a specific dataset
- `datasets:dataset-id:write` - Write access to a specific dataset
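A scope of this shape can be modeled as a small matcher. The sketch below is illustrative only (the server's actual matching rules may differ); it assumes a scope is a `datasets:<id-or-*>:<read|write>` triple and treats read and write as independent grants:

```python
def scope_allows(scope: str, dataset_id: str, action: str) -> bool:
    """Return True if an API-key scope grants `action` on `dataset_id`.

    Illustrative only: assumes scopes have the form
    `datasets:<dataset-id-or-*>:<read|write>`.
    """
    resource, target, granted = scope.split(":")
    return resource == "datasets" and target in ("*", dataset_id) and granted == action

# A wildcard write scope covers any dataset:
print(scope_allows("datasets:*:write", "abc-123", "write"))       # True
# A dataset-specific read scope does not grant write access:
print(scope_allows("datasets:abc-123:read", "abc-123", "write"))  # False
```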
Base URL
All API endpoints use the base URL: https://api.agentsfordata.com/public/v1
Rate Limits
API endpoints have different rate limits based on your plan:
- Free Plan: 100 requests/hour
- Pro Plan: 1,000 requests/hour
- Business Plan: 10,000 requests/hour
Rate limit information is included in response headers:
- `X-RateLimit-Limit` - Maximum requests allowed in the window
- `X-RateLimit-Remaining` - Requests remaining in the current window
- `X-RateLimit-Reset` - Time when the rate limit resets (Unix timestamp)
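When `X-RateLimit-Remaining` reaches zero, a client can sleep until the reset time. A minimal sketch using the header names above (the helper name is this guide's, not part of the API):

```python
import time

def seconds_until_reset(headers: dict) -> float:
    """Seconds to wait before the next request, based on rate-limit headers."""
    remaining = int(headers.get("X-RateLimit-Remaining", 1))
    if remaining > 0:
        return 0.0  # Budget left; no need to wait.
    reset_at = int(headers["X-RateLimit-Reset"])  # Unix timestamp.
    return max(0.0, reset_at - time.time())

# Example: no requests left, window resets 30 seconds from now.
wait = seconds_until_reset({
    "X-RateLimit-Remaining": "0",
    "X-RateLimit-Reset": str(int(time.time()) + 30),
})
```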
Dataset Creation Workflow
Creating datasets through the API follows a simple 3-step process:
- Generate Pre-Signed URLs - Request signed URLs for uploading your data files
- Upload Files - Upload your CSV/Parquet/JSON files directly to the signed URLs
- Create Dataset - Create the dataset with references to the uploaded files
Quick Start Example
# Step 1: Generate pre-signed upload URL
curl -X POST https://api.agentsfordata.com/public/v1/storage/signed-urls \
-H "X-API-Key: tbl_live_your_api_key" \
-H "Content-Type: application/json" \
-d '{
"workspace_id": "your-workspace-id",
"files": [{"filename": "customers.csv", "content_type": "text/csv"}]
}'
# Step 2: Upload file to pre-signed URL (use URL from step 1 response)
curl -X PUT "https://storage.googleapis.com/..." \
-H "Content-Type: text/csv" \
--upload-file customers.csv
# Step 3: Create dataset with uploaded file
curl -X POST https://api.agentsfordata.com/public/v1/datasets \
-H "X-API-Key: tbl_live_your_api_key" \
-H "Content-Type: application/json" \
-d '{
"dataset_name": "customer_data",
"description": "Customer demographic data",
"visibility": "private",
"tags": ["customers"],
"workspace_id": "your-workspace-id",
"tables": [{
"table_name": "customers",
"signed_url": "https://storage.googleapis.com/...",
"description": "Customer records"
}]
}'
Dataset Management Endpoints
1. Generate Pre-Signed Upload URLs
Generate temporary pre-signed URLs for uploading data files to cloud storage.
POST https://api.agentsfordata.com/public/v1/storage/signed-urls
Request Body
{
"workspace_id": "550e8400-e29b-41d4-a716-446655440000",
"files": [
{
"filename": "customers.csv",
"content_type": "text/csv"
},
{
"filename": "orders.csv",
"content_type": "text/csv"
}
]
}
Response
{
"urls": [
{
"filename": "customers.csv",
"signed_url": "https://storage.googleapis.com/bucket/temp/abc123.csv?X-Goog-Signature=...",
"expires_at": "2024-01-15T12:00:00Z"
},
{
"filename": "orders.csv",
"signed_url": "https://storage.googleapis.com/bucket/temp/def456.csv?X-Goog-Signature=...",
"expires_at": "2024-01-15T12:00:00Z"
}
]
}
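Before uploading, a client typically turns the `urls` array above into a filename → URL lookup. A small sketch using the sample response (the `url_for` helper is ours):

```python
# Shape taken from the sample response above.
sample_response = {
    "urls": [
        {"filename": "customers.csv",
         "signed_url": "https://storage.googleapis.com/bucket/temp/abc123.csv?X-Goog-Signature=...",
         "expires_at": "2024-01-15T12:00:00Z"},
        {"filename": "orders.csv",
         "signed_url": "https://storage.googleapis.com/bucket/temp/def456.csv?X-Goog-Signature=...",
         "expires_at": "2024-01-15T12:00:00Z"},
    ]
}

def url_for(response: dict, filename: str) -> str:
    """Look up the signed upload URL for one of the requested files."""
    by_name = {entry["filename"]: entry["signed_url"] for entry in response["urls"]}
    return by_name[filename]
```

Remember that each URL is temporary; check `expires_at` before reusing a stored response.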
2. Create Dataset with Tables
Create a new dataset and ingest data from uploaded files.
POST https://api.agentsfordata.com/public/v1/datasets
Request Body
{
"dataset_name": "sales_analytics_2024",
"description": "Quarterly sales data for analysis",
"visibility": "private",
"tags": ["sales", "q4", "analytics"],
"workspace_id": "550e8400-e29b-41d4-a716-446655440000",
"tables": [
{
"table_name": "customers",
"signed_url": "https://storage.googleapis.com/bucket/temp/customers.csv?signature=...",
"description": "Customer demographic data"
},
{
"table_name": "orders",
"signed_url": "https://storage.googleapis.com/bucket/temp/orders.csv?signature=...",
"description": "Order transaction history"
}
]
}
Response
{
"success": true,
"data": {
"dataset": {
"id": "dataset-uuid-123",
"name": "sales_analytics_2024",
"workspace_id": "550e8400-e29b-41d4-a716-446655440000",
"visibility": "private",
"created_at": "2024-01-15T10:30:00Z"
},
"summary": {
"total": 2,
"successful": 2,
"failed": 0
},
"tables": [
{
"table_name": "customers",
"status": "success",
"rows_ingested": 1500,
"schema": [
{"name": "customer_id", "type": "INTEGER"},
{"name": "name", "type": "VARCHAR"},
{"name": "email", "type": "VARCHAR"}
]
},
{
"table_name": "orders",
"status": "success",
"rows_ingested": 5200,
"schema": [
{"name": "order_id", "type": "INTEGER"},
{"name": "customer_id", "type": "INTEGER"},
{"name": "amount", "type": "DECIMAL"}
]
}
]
}
}
3. Execute SQL Query
Execute a SQL query against datasets in your workspace. Use fully qualified table names in the form `workspace.dataset.table`.
POST https://api.agentsfordata.com/public/v1/query
Request Body
{
"sql": "SELECT customer_id, name, email FROM acme_corp.sales_analytics_2024.customers LIMIT 10",
"timeout_seconds": 30
}
Response
{
"success": true,
"data": {
"columns": ["customer_id", "name", "email"],
"rows": [
{"customer_id": 1, "name": "John Doe", "email": "john@example.com"},
{"customer_id": 2, "name": "Jane Smith", "email": "jane@example.com"}
],
"count": 2,
"execution_time": "15ms"
}
}
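In Python, the query call above might look like the following sketch (the `run_query` helper and its error handling are this guide's own, built on the request/response schemas shown):

```python
import requests

API_BASE = "https://api.agentsfordata.com/public/v1"

def build_query_payload(sql: str, timeout_seconds: int = 30) -> dict:
    """Request body for POST /query, matching the schema above."""
    return {"sql": sql, "timeout_seconds": timeout_seconds}

def run_query(api_key: str, sql: str, timeout_seconds: int = 30) -> list:
    """Execute a SQL query and return the result rows."""
    response = requests.post(
        f"{API_BASE}/query",
        headers={"X-API-Key": api_key, "Content-Type": "application/json"},
        json=build_query_payload(sql, timeout_seconds),
    )
    response.raise_for_status()
    body = response.json()
    if not body["success"]:
        raise RuntimeError(body["error"]["message"])
    return body["data"]["rows"]
```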
4. List Datasets in Workspace
Retrieve all datasets in a workspace that you have access to.
GET https://api.agentsfordata.com/public/v1/workspaces/{workspaceId}/datasets
Query Parameters
- `expand=tables` - Include table schemas in response
Response
{
"success": true,
"data": {
"datasets": [
{
"id": "dataset-uuid-123",
"name": "sales_analytics_2024",
"workspace_id": "550e8400-e29b-41d4-a716-446655440000",
"visibility": "private",
"created_at": "2024-01-15T10:30:00Z",
"tags": ["sales", "q4"]
}
],
"count": 1
}
}
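The listing URL, with its optional `expand` parameter, can be assembled like this (a sketch; the helper name is ours):

```python
from urllib.parse import urlencode

API_BASE = "https://api.agentsfordata.com/public/v1"

def datasets_url(workspace_id: str, expand_tables: bool = False) -> str:
    """Build the GET URL for listing a workspace's datasets."""
    url = f"{API_BASE}/workspaces/{workspace_id}/datasets"
    if expand_tables:
        # expand=tables includes table schemas in the response.
        url += "?" + urlencode({"expand": "tables"})
    return url
```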
5. Get Dataset Details
Retrieve details about a specific dataset.
GET https://api.agentsfordata.com/public/v1/datasets/{datasetId}
6. Update Dataset Metadata
Update dataset name, description, visibility, or tags.
PATCH https://api.agentsfordata.com/public/v1/datasets/{datasetId}
Request Body
{
"name": "sales_analytics_2024_updated",
"short_description": "Updated Q4 2024 sales data",
"visibility": "public",
"tags": ["sales", "q4", "analytics", "updated"]
}
7. Delete Dataset
Permanently delete a dataset and all its data.
DELETE https://api.agentsfordata.com/public/v1/datasets/{datasetId}
File Conversion Endpoints
1. Get Upload URLs
Generate signed URLs for uploading source files and downloading converted results.
POST https://api.agentsfordata.com/public/v1/convert/upload-urls
Request Body
{
"source_format": "csv",
"destination_format": "json",
"filename": "my-data" // optional
}
Response
{
"success": true,
"data": {
"source": {
"upload_url": "https://storage.example.com/upload/...",
"file_path": "2025-01-15/abc123/source-my-data.csv",
"content_type": "text/csv"
},
"destination": {
"download_url": "https://storage.example.com/download/...",
"file_path": "2025-01-15/abc123/my-data.json",
"content_type": "application/json"
}
},
"meta": {
"request_id": "req_1234567890",
"timestamp": "2025-01-15T10:30:00Z",
"version": "v1"
}
}
2. Upload File
After getting the upload URL, upload your source file directly to the signed URL using a PUT request:
PUT {source.upload_url}
Content-Type: {source.content_type}
[file contents]
3. Convert File
Convert the uploaded file to the desired format.
POST https://api.agentsfordata.com/public/v1/convert
Request Body
{
"source_url": "2025-01-15/abc123/source-my-data.csv",
"source_format": "csv",
"destination_format": "json",
"filename": "my-data" // optional
}
Response
{
"success": true,
"data": {
"download_url": "https://storage.example.com/download/...",
"file_path": "2025-01-15/abc123/my-data.json",
"expires_at": "2025-01-15T15:30:00Z"
},
"meta": {
"request_id": "req_1234567890",
"timestamp": "2025-01-15T10:30:00Z",
"version": "v1"
}
}
4. List Supported Formats
Get a list of all supported file formats and their conversion capabilities.
GET https://api.agentsfordata.com/public/v1/convert/formats
Response
{
"success": true,
"data": {
"formats": [
{
"format": "csv",
"display_name": "CSV",
"mime_type": "text/csv",
"can_read": true,
"can_write": true
},
{
"format": "json",
"display_name": "JSON",
"mime_type": "application/json",
"can_read": true,
"can_write": true
}
// ... more formats
]
},
"meta": {
"request_id": "req_1234567890",
"timestamp": "2025-01-15T10:30:00Z",
"version": "v1"
}
}
Error Handling
All errors follow a consistent format:
{
"success": false,
"error": {
"code": "ERROR_CODE",
"message": "Human-readable error message",
"details": {} // optional additional context
},
"meta": {
"request_id": "req_1234567890",
"timestamp": "2025-01-15T10:30:00Z",
"version": "v1"
}
}
Common Error Codes
| Status Code | Error Code | Description |
|---|---|---|
| 400 | INVALID_REQUEST | The request body is malformed or missing required fields |
| 400 | UNSUPPORTED_FORMAT | The specified format is not supported |
| 401 | UNAUTHORIZED | Invalid or missing API key |
| 403 | FORBIDDEN | API key lacks required permissions |
| 429 | RATE_LIMITED | Rate limit exceeded for this endpoint |
| 500 | INTERNAL_ERROR | An unexpected error occurred |
| 503 | SERVICE_UNAVAILABLE | Service temporarily unavailable |
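Of the codes above, 429 and 503 are transient and worth retrying with exponential backoff. A minimal retry wrapper (the helper and its parameters are this guide's own, not part of the API):

```python
import time

RETRYABLE_STATUS = {429, 503}  # Rate limited / temporarily unavailable.

def retry_with_backoff(call, max_attempts: int = 5, base_delay: float = 1.0):
    """Invoke `call()` (returning a (status_code, body) pair), retrying
    retryable statuses with exponential backoff: 1s, 2s, 4s, ..."""
    for attempt in range(max_attempts):
        status, body = call()
        if status not in RETRYABLE_STATUS:
            return status, body
        if attempt < max_attempts - 1:
            time.sleep(base_delay * (2 ** attempt))
    return status, body  # Still failing after the final attempt.
```

In production you would also honor the `X-RateLimit-Reset` header when it is present rather than relying on the fixed schedule.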
Code Examples
Creating a Dataset with cURL
#!/bin/bash
API_KEY="tbl_live_your_api_key"
WORKSPACE_ID="your-workspace-id"
API_BASE="https://api.agentsfordata.com/public/v1"
# Step 1: Generate pre-signed upload URL
echo "Step 1: Generating pre-signed upload URL..."
SIGNED_URL_RESPONSE=$(curl -s -X POST "$API_BASE/storage/signed-urls" \
-H "X-API-Key: $API_KEY" \
-H "Content-Type: application/json" \
-d "{
\"workspace_id\": \"$WORKSPACE_ID\",
\"files\": [{
\"filename\": \"customers.csv\",
\"content_type\": \"text/csv\"
}]
}")
SIGNED_URL=$(echo "$SIGNED_URL_RESPONSE" | jq -r '.urls[0].signed_url')
echo "Generated signed URL: $SIGNED_URL"
# Step 2: Upload file to signed URL
echo "Step 2: Uploading file..."
curl -X PUT "$SIGNED_URL" \
-H "Content-Type: text/csv" \
--upload-file customers.csv
echo "File uploaded successfully"
# Step 3: Create dataset with uploaded file
echo "Step 3: Creating dataset..."
DATASET_RESPONSE=$(curl -s -X POST "$API_BASE/datasets" \
-H "X-API-Key: $API_KEY" \
-H "Content-Type: application/json" \
-d "{
\"dataset_name\": \"customer_data\",
\"description\": \"Customer demographic data\",
\"visibility\": \"private\",
\"tags\": [\"customers\", \"demographics\"],
\"workspace_id\": \"$WORKSPACE_ID\",
\"tables\": [{
\"table_name\": \"customers\",
\"signed_url\": \"$SIGNED_URL\",
\"description\": \"Customer records\"
}]
}")
DATASET_ID=$(echo "$DATASET_RESPONSE" | jq -r '.data.dataset.id')
echo "Dataset created successfully with ID: $DATASET_ID"
# Step 4: Query the dataset
echo "Step 4: Querying the dataset..."
curl -s -X POST "$API_BASE/query" \
-H "X-API-Key: $API_KEY" \
-H "Content-Type: application/json" \
-d "{
\"sql\": \"SELECT * FROM workspace.customer_data.customers LIMIT 5\",
\"timeout_seconds\": 30
}" | jq
Creating a Dataset with JavaScript (Node.js)
const fs = require('fs').promises;
const API_KEY = 'tbl_live_your_api_key';
const WORKSPACE_ID = 'your-workspace-id';
const API_BASE = 'https://api.agentsfordata.com/public/v1';
async function createDataset(filePath, datasetName, tableName) {
try {
// Step 1: Generate pre-signed upload URL
console.log('Step 1: Generating pre-signed upload URL...');
const urlsResponse = await fetch(`${API_BASE}/storage/signed-urls`, {
method: 'POST',
headers: {
'X-API-Key': API_KEY,
'Content-Type': 'application/json'
},
body: JSON.stringify({
workspace_id: WORKSPACE_ID,
files: [{
filename: tableName + '.csv',
content_type: 'text/csv'
}]
})
});
const urlsData = await urlsResponse.json();
const signedUrl = urlsData.urls[0].signed_url;
console.log('Generated signed URL:', signedUrl);
// Step 2: Upload file to signed URL
console.log('Step 2: Uploading file...');
const fileBuffer = await fs.readFile(filePath);
await fetch(signedUrl, {
method: 'PUT',
headers: {
'Content-Type': 'text/csv'
},
body: fileBuffer
});
console.log('File uploaded successfully');
// Step 3: Create dataset with uploaded file
console.log('Step 3: Creating dataset...');
const datasetResponse = await fetch(`${API_BASE}/datasets`, {
method: 'POST',
headers: {
'X-API-Key': API_KEY,
'Content-Type': 'application/json'
},
body: JSON.stringify({
dataset_name: datasetName,
description: 'Created via API',
visibility: 'private',
tags: ['api-created'],
workspace_id: WORKSPACE_ID,
tables: [{
table_name: tableName,
signed_url: signedUrl,
description: 'Data uploaded via API'
}]
})
});
const datasetData = await datasetResponse.json();
if (!datasetData.success) {
throw new Error(`Dataset creation failed: ${datasetData.error?.message}`);
}
console.log('Dataset created successfully:', datasetData.data.dataset.id);
console.log('Tables:', datasetData.data.summary);
return datasetData.data.dataset;
} catch (error) {
console.error('Error creating dataset:', error);
throw error;
}
}
// Usage
createDataset('./customers.csv', 'customer_data', 'customers')
.then(dataset => {
console.log('\nDataset ready!');
console.log('Dataset ID:', dataset.id);
})
.catch(error => console.error('Failed:', error));
Creating a Dataset with Python
import requests
import json
API_KEY = 'tbl_live_your_api_key'
WORKSPACE_ID = 'your-workspace-id'
API_BASE = 'https://api.agentsfordata.com/public/v1'
def create_dataset(file_path: str, dataset_name: str, table_name: str) -> dict:
"""
Create a dataset by uploading a file and creating dataset metadata.
Args:
file_path: Path to the CSV file to upload
dataset_name: Name for the dataset
table_name: Name for the table
Returns:
Dataset information dictionary
"""
headers = {
'X-API-Key': API_KEY,
'Content-Type': 'application/json'
}
# Step 1: Generate pre-signed upload URL
print('Step 1: Generating pre-signed upload URL...')
urls_response = requests.post(
f'{API_BASE}/storage/signed-urls',
headers=headers,
json={
'workspace_id': WORKSPACE_ID,
'files': [{
'filename': f'{table_name}.csv',
'content_type': 'text/csv'
}]
}
)
urls_response.raise_for_status()
urls_data = urls_response.json()
signed_url = urls_data['urls'][0]['signed_url']
print(f'Generated signed URL: {signed_url}')
# Step 2: Upload file to signed URL
print('Step 2: Uploading file...')
with open(file_path, 'rb') as f:
upload_response = requests.put(
signed_url,
headers={'Content-Type': 'text/csv'},
data=f.read()
)
upload_response.raise_for_status()
print('File uploaded successfully')
# Step 3: Create dataset with uploaded file
print('Step 3: Creating dataset...')
dataset_response = requests.post(
f'{API_BASE}/datasets',
headers=headers,
json={
'dataset_name': dataset_name,
'description': 'Created via API',
'visibility': 'private',
'tags': ['api-created'],
'workspace_id': WORKSPACE_ID,
'tables': [{
'table_name': table_name,
'signed_url': signed_url,
'description': 'Data uploaded via API'
}]
}
)
dataset_response.raise_for_status()
dataset_data = dataset_response.json()
if not dataset_data.get('success'):
raise Exception(f"Dataset creation failed: {dataset_data.get('error', {}).get('message')}")
print(f"Dataset created successfully: {dataset_data['data']['dataset']['id']}")
print(f"Tables summary: {dataset_data['data']['summary']}")
return dataset_data['data']['dataset']
# Usage
if __name__ == '__main__':
try:
dataset = create_dataset(
file_path='./customers.csv',
dataset_name='customer_data',
table_name='customers'
)
print('\nDataset ready!')
print(f'Dataset ID: {dataset["id"]}')
print(f'Workspace ID: {dataset["workspace_id"]}')
except Exception as e:
print(f'Failed to create dataset: {e}')
File Conversion with cURL
# Step 1: Get upload URLs
curl -X POST https://api.agentsfordata.com/public/v1/convert/upload-urls \
-H "X-API-Key: tbl_live_your_api_key" \
-H "Content-Type: application/json" \
-d '{
"source_format": "csv",
"destination_format": "json",
"filename": "my-data"
}'
# Step 2: Upload your file (response from step 1 provides upload_url)
curl -X PUT "https://storage.example.com/upload/..." \
-H "Content-Type: text/csv" \
--data-binary @input.csv
# Step 3: Convert the file
curl -X POST https://api.agentsfordata.com/public/v1/convert \
-H "X-API-Key: tbl_live_your_api_key" \
-H "Content-Type: application/json" \
-d '{
"source_url": "2025-01-15/abc123/source-my-data.csv",
"source_format": "csv",
"destination_format": "json",
"filename": "my-data"
}'
# Step 4: Download converted file (response from step 3 provides download_url)
curl -o output.json "https://storage.example.com/download/..."
File Conversion with JavaScript (Node.js)
const fs = require('fs');

const API_KEY = 'tbl_live_your_api_key';
const BASE_URL = 'https://api.agentsfordata.com/public/v1';
async function convertFile(inputFile, sourceFormat, destinationFormat) {
try {
// Step 1: Get upload URLs
const urlsResponse = await fetch(`${BASE_URL}/convert/upload-urls`, {
method: 'POST',
headers: {
'X-API-Key': API_KEY,
'Content-Type': 'application/json'
},
body: JSON.stringify({
source_format: sourceFormat,
destination_format: destinationFormat,
filename: 'my-data'
})
});
const urlsData = await urlsResponse.json();
if (!urlsData.success) {
throw new Error(urlsData.error.message);
}
const { source, destination } = urlsData.data;
// Step 2: Upload file
const fileBuffer = await fs.promises.readFile(inputFile);
await fetch(source.upload_url, {
method: 'PUT',
headers: {
'Content-Type': source.content_type
},
body: fileBuffer
});
// Step 3: Convert file
const convertResponse = await fetch(`${BASE_URL}/convert`, {
method: 'POST',
headers: {
'X-API-Key': API_KEY,
'Content-Type': 'application/json'
},
body: JSON.stringify({
source_url: source.file_path,
source_format: sourceFormat,
destination_format: destinationFormat,
filename: 'my-data'
})
});
const convertData = await convertResponse.json();
if (!convertData.success) {
throw new Error(convertData.error.message);
}
// Step 4: Download converted file
const downloadResponse = await fetch(convertData.data.download_url);
const convertedFile = await downloadResponse.arrayBuffer();
return Buffer.from(convertedFile);
} catch (error) {
console.error('Conversion error:', error);
throw error;
}
}
// Usage
convertFile('input.csv', 'csv', 'json')
.then(data => console.log('Conversion successful'))
.catch(error => console.error('Conversion failed:', error));
File Conversion with Python
import requests
from typing import BinaryIO
API_KEY = 'tbl_live_your_api_key'
BASE_URL = 'https://api.agentsfordata.com/public/v1'
def convert_file(
input_file: BinaryIO,
source_format: str,
destination_format: str,
filename: str = 'my-data'
) -> bytes:
"""
Convert a file using the Agents for Data API.
Args:
input_file: File object to convert
source_format: Source file format (e.g., 'csv')
destination_format: Destination format (e.g., 'json')
filename: Optional filename for the conversion
Returns:
Converted file contents as bytes
"""
headers = {
'X-API-Key': API_KEY,
'Content-Type': 'application/json'
}
# Step 1: Get upload URLs
urls_response = requests.post(
f'{BASE_URL}/convert/upload-urls',
headers=headers,
json={
'source_format': source_format,
'destination_format': destination_format,
'filename': filename
}
)
urls_response.raise_for_status()
urls_data = urls_response.json()
if not urls_data['success']:
raise Exception(urls_data['error']['message'])
source = urls_data['data']['source']
destination = urls_data['data']['destination']
# Step 2: Upload file
upload_response = requests.put(
source['upload_url'],
headers={'Content-Type': source['content_type']},
data=input_file.read()
)
upload_response.raise_for_status()
# Step 3: Convert file
convert_response = requests.post(
f'{BASE_URL}/convert',
headers=headers,
json={
'source_url': source['file_path'],
'source_format': source_format,
'destination_format': destination_format,
'filename': filename
}
)
convert_response.raise_for_status()
convert_data = convert_response.json()
if not convert_data['success']:
raise Exception(convert_data['error']['message'])
# Step 4: Download converted file
download_response = requests.get(convert_data['data']['download_url'])
download_response.raise_for_status()
return download_response.content
# Usage
if __name__ == '__main__':
with open('input.csv', 'rb') as f:
converted_data = convert_file(f, 'csv', 'json')
with open('output.json', 'wb') as f:
f.write(converted_data)
print('Conversion successful!')
Supported File Formats
The API supports conversion between the following formats:
- CSV (Comma-Separated Values) - `text/csv`
- JSON (JavaScript Object Notation) - `application/json`
- Excel (.xlsx) - `application/vnd.openxmlformats-officedocument.spreadsheetml.sheet`
- Parquet - `application/vnd.apache.parquet`
- TSV (Tab-Separated Values) - `text/tab-separated-values`
You can convert between any combination of these formats. Use the List Supported Formats endpoint to get the current list of supported formats and their capabilities.
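When setting `Content-Type` on uploads, it can help to keep the pairs above as a lookup table. The MIME values below come from the list above; the format keys are illustrative (treat the List Supported Formats endpoint as authoritative):

```python
# Format keys are this guide's guesses; MIME types are from the list above.
MIME_TYPES = {
    "csv": "text/csv",
    "json": "application/json",
    "xlsx": "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
    "parquet": "application/vnd.apache.parquet",
    "tsv": "text/tab-separated-values",
}

def content_type_for(fmt: str) -> str:
    """MIME type to send in Content-Type when uploading a file of this format."""
    return MIME_TYPES[fmt]
```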
Best Practices
Error Handling
- Always check the `success` field in the response before processing data
- Implement exponential backoff for retrying failed requests
- Use the `request_id` from error responses when contacting support
Rate Limiting
- Monitor rate limit headers in responses
- Implement queuing for batch operations
- Cache the results of the formats endpoint to avoid unnecessary calls
File Size Limits
- Maximum file size: 50 MB per file
- For larger files, consider splitting them or contacting support for enterprise options
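A client can enforce the 50 MB limit locally before wasting an upload. A sketch (the helper name is ours):

```python
import os

MAX_FILE_BYTES = 50 * 1024 * 1024  # 50 MB, per the limit above.

def check_file_size(path: str) -> int:
    """Return the file's size in bytes, raising if it exceeds the API limit."""
    size = os.path.getsize(path)
    if size > MAX_FILE_BYTES:
        raise ValueError(f"{path} is {size} bytes; the limit is {MAX_FILE_BYTES}")
    return size
```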
Security
- Never expose your API key in client-side code or public repositories
- Rotate API keys regularly
- Use environment variables to store API keys
- Create separate API keys for different applications or environments
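Reading the key from an environment variable keeps it out of source code and version control. A sketch (the variable name `AGENTS_FOR_DATA_API_KEY` is an example, not a convention the API requires):

```python
import os

def load_api_key(env_var: str = "AGENTS_FOR_DATA_API_KEY") -> str:
    """Fetch the API key from the environment, failing loudly if unset."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"Set {env_var} before calling the API")
    return key
```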
Support
Need help? Here are some resources:
- API Status: Check our status page for any ongoing issues
- Support Email: support@agentsfordata.com
- Documentation: Full OpenAPI specification available at `/api/public/v1/openapi.json`
Changelog
Version 1.0 (Current)
- Initial release of the Public API
- File conversion endpoints
- Support for CSV, JSON, Excel, Parquet, and TSV formats