Data Observability

Get comprehensive insights into your data pipelines with seamless integration with Spark, Airflow, dbt, and Flink. Visualize lineage, track dependencies, and monitor execution. Learn more about Observability →

Lake

Run SQL queries, upload datasets, and collaborate on both public and private data. Every operation automatically captures lineage metadata for complete observability. Learn more about the Lake →

Getting Started

The easiest way to get started is to send a POST request to our ingestion endpoint using cURL. Note: replace [OLEANDER-API-KEY] below with your API key.
curl -X POST https://oleander.dev/api/v1/lineage \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer [OLEANDER-API-KEY]' \
  -d '{
    "eventType": "START",
    "eventTime": "2024-12-25T05:45:18.249Z",
    "run": {
      "runId": "e05c7916-6cbb-470e-b13b-25af0a757eac"
    },
    "job": {
      "namespace": "my-namespace",
      "name": "my-job"
    },
    "inputs": [{
      "namespace": "my-namespace",
      "name": "my-input"
    }],
    "producer": "https://github.com/OpenLineage/OpenLineage/blob/v1-0-0/client",
    "schemaURL": "https://openlineage.io/spec/1-0-5/OpenLineage.json#/definitions/RunEvent"
  }'
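The same event can be constructed and sent from Python using only the standard library. This is a minimal sketch mirroring the cURL request above; the endpoint URL, API key placeholder, and field values are taken directly from that example, and a fresh run ID and timestamp are generated rather than hard-coded.

```python
import json
import urllib.request
from datetime import datetime, timezone
from uuid import uuid4

# Build the same OpenLineage RunEvent payload as the cURL example above.
event = {
    "eventType": "START",
    "eventTime": datetime.now(timezone.utc).isoformat(),
    "run": {"runId": str(uuid4())},
    "job": {"namespace": "my-namespace", "name": "my-job"},
    "inputs": [{"namespace": "my-namespace", "name": "my-input"}],
    "producer": "https://github.com/OpenLineage/OpenLineage/blob/v1-0-0/client",
    "schemaURL": "https://openlineage.io/spec/1-0-5/OpenLineage.json#/definitions/RunEvent",
}

request = urllib.request.Request(
    "https://oleander.dev/api/v1/lineage",
    data=json.dumps(event).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Replace with your actual API key.
        "Authorization": "Bearer [OLEANDER-API-KEY]",
    },
    method="POST",
)

# Uncomment to send the event:
# with urllib.request.urlopen(request) as response:
#     print(response.status)
```

In practice you would generate one runId per pipeline run and reuse it for the matching COMPLETE (or FAIL) event, so the start and end of a run can be correlated.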
While you can manually construct and send OpenLineage events, we recommend using the client libraries or integrations:
  1. Use Client Libraries - Check out the Python client or Java client
  2. Set up Integrations - Explore our integrations for Airflow, Spark, and more
  3. Explore the Lake - Start running SQL queries and uploading datasets
  4. Chat with AI Assistants - Set up Ollie and Lea for automated incident investigation and compliance monitoring
Ready to get started? Check out our resources or dive into OpenLineage integration.