Authentication & onboarding
- Access is limited to Enterprise tier customers—ask your Hatch Account Manager to enable the Bulk Export API for your organization.
- A Manager must create a reporting API key in Settings → API Keys.
- Send the key in an Authorization: Bearer <api-key> header on every request.
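For example, a minimal Python sketch that attaches the key to every call through a shared session (the requests library and the key value are assumptions, not part of a Hatch SDK):

```python
import requests

# Reporting API key created in the Hatch dashboard (placeholder value).
API_KEY = "<api-key>"

# Reuse one session so the Authorization header is sent on every request.
session = requests.Session()
session.headers.update({"Authorization": f"Bearer {API_KEY}"})
```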
Rate limits & constraints
- Data is updated nightly, so exports contain data only through the previous day.
- Up to 1 export every 5 minutes per organization.
- Maximum 100 exports per organization per day.
- A single export's date range may not span more than 180 days, so longer backfills must be split into multiple requests (see the sketch after this list).
- Maximum payload size: 100 MB uncompressed. Larger requests return 413 Payload Too Large.
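Because a single export cannot span more than 180 days, a longer backfill has to be broken into consecutive windows. A sketch of that chunking in plain Python (no Hatch-specific assumptions):

```python
from datetime import date, timedelta

def date_windows(start: date, end: date, max_days: int = 180):
    """Yield (start, end) pairs that each span at most max_days days."""
    cursor = start
    while cursor <= end:
        window_end = min(cursor + timedelta(days=max_days - 1), end)
        yield cursor, window_end
        cursor = window_end + timedelta(days=1)

# Example: a full-year backfill becomes three export windows.
for window_start, window_end in date_windows(date(2024, 1, 1), date(2024, 12, 31)):
    print(window_start.isoformat(), window_end.isoformat())
```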
GET /export-data
Method & Path: GET https://bulk-export-api.usehatchapp.com/api/v1/export-data
Request a downloadable ZIP archive containing newline-delimited JSON for either the contact_history or conversation_items dataset.
Exports default to newline-delimited JSON; to receive a CSV version instead, add output_format=csv to the query string.
Prefer to explore interactively? Use the Swagger UI linked above to send test requests, toggle the output format, and view sample payloads.
Query parameters
| Name | Required | Description |
|---|---|---|
| data_source | Yes | Dataset to export. Allowed values: contact_history, conversation_items. |
| start_date | Yes | Inclusive start date in YYYY-MM-DD. The earliest supported date depends on your contracted data retention window. |
| end_date | Yes | Inclusive end date in YYYY-MM-DD. Must be on or after start_date and within 180 days of it. |
| output_format | No | Defaults to json. Set to csv to receive a CSV file (still packaged inside the ZIP). |
Example request
curl -L "https://bulk-export-api.usehatchapp.com/api/v1/export-data?data_source=contact_history&start_date=2025-10-30&end_date=2025-11-04&output_format=csv" \
-H "Authorization: Bearer <api-key>" \
-o contact_history_2025-10-30_2025-11-04.zip
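The same request sketched in Python, reusing the session with the Authorization header from the authentication example above (the requests library is an assumption):

```python
params = {
    "data_source": "contact_history",
    "start_date": "2025-10-30",
    "end_date": "2025-11-04",
    "output_format": "csv",
}

# Stream the binary ZIP straight to disk rather than buffering it in memory.
resp = session.get(
    "https://bulk-export-api.usehatchapp.com/api/v1/export-data",
    params=params,
    stream=True,
)
resp.raise_for_status()

with open("contact_history_2025-10-30_2025-11-04.zip", "wb") as fh:
    for chunk in resp.iter_content(chunk_size=1 << 20):
        fh.write(chunk)
```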
Response
- Status: 200 OK
- Headers: Content-Type: application/zip
- Body: Binary ZIP archive containing one data file: .json for the default JSONL export, or .csv when output_format=csv.
- File names: {data_source}_{start_date}_{end_date}.zip → {data_source}_{start_date}_{end_date}.json (or .csv).
Example archive contents:
conversation_items_2024-02-01_2024-02-07.zip
└── conversation_items_2024-02-01_2024-02-07.json
{"conversation_id": "c_123", "actor_type": "contact", "direction": "inbound", "..."}
{"conversation_id": "c_124", "actor_type": "hatch", "direction": "outbound", "..."}
Status codes
| Status | Meaning |
|---|---|
| 200 | Export succeeded and returned records. |
| 204 | Request succeeded, but the specified range did not contain any rows. |
| 400 | Invalid query parameters (e.g., bad date format, unsupported dataset). |
| 401 | Missing or invalid API key. |
| 413 | Result exceeded the 100 MB size limit. Narrow the date range. |
| 429 | Rate limit exceeded (1 export every 5 minutes / 100 exports per day). |
| 500 | Unexpected server error. Retry with backoff. |
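One way a client might react to each status when calling GET /export-data, as a rough sketch (should_retry is a hypothetical helper; resp is a requests.Response):

```python
import time

def should_retry(resp, attempt: int) -> bool:
    """Decide whether to retry a GET /export-data call based on its status."""
    if resp.status_code in (200, 204):
        return False  # success (or empty range): nothing to retry
    if resp.status_code == 429:
        time.sleep(300)  # wait out the 5-minute spacing rule before retrying
        return True
    if resp.status_code == 500:
        time.sleep(min(60 * 2 ** attempt, 900))  # exponential backoff, capped
        return True
    if resp.status_code == 413:
        raise ValueError("Export exceeded 100 MB: narrow the date range and retry.")
    resp.raise_for_status()  # 400 / 401: fix the request or the API key
    return False
```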
Operational guidance
- Use the schema overview to inspect columns before mapping the export into your warehouse.
- Poll GET /export-data on a schedule that respects both the 5-minute spacing rule and the daily cap.
- Monitor 413 responses; if they occur frequently, reduce the date window (for example, export weekly instead of monthly).
- Store exported ZIP archives in an object store (S3, GCS, Azure Blob) before processing so you can re-run failed ETL jobs without hitting rate limits.
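For example, landing the raw archive in S3 before any transformation (boto3 and the bucket name are assumptions; the same idea applies to GCS or Azure Blob):

```python
import boto3  # assumption: AWS SDK for Python is installed and credentials are configured

s3 = boto3.client("s3")

# Hypothetical bucket and key layout; adjust to your own storage conventions.
local_zip = "contact_history_2025-10-30_2025-11-04.zip"
s3.upload_file(local_zip, "my-hatch-exports", f"hatch/bulk-export/{local_zip}")
```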
Next steps
- Review the Bulk Export API integration guide for more detail on the datasets available for export.
- Ask your Hatch Account Team to enable the Bulk Export API for your organization.
- Check for API Keys under Integrations → API Keys
- Schedule exports at a cadence that aligns with the published limits (one every 5 minutes, 100 per day).