Airflow Post-hook
Pre-work Before Running the Pipeline
Create an Airflow Connection for Chicory API
Purpose: This connection securely stores your Chicory API credentials, allowing Airflow tasks to authenticate with the Chicory platform without exposing tokens in your DAG code. Your pipeline will use this connection to send cost analysis requests to the agent.
In Airflow UI → Admin → Connections → +
Fill in the following:
Conn Id: chicory_api
Conn Type: HTTP
Host: https://app.chicory.ai/api/v1
Extra (JSON): { "Authorization": "Bearer YOUR_CHICORY_API_TOKEN" }
Save. This will let tasks authenticate without hardcoding the token.
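Inside a task, the connection's Extra JSON becomes the request headers. A minimal sketch of that step, assuming the Extra shape configured above; in the DAG the JSON would come from `BaseHook.get_connection("chicory_api").extra_dejson`, but here it is parsed directly so the example stays self-contained:

```python
import json

def chicory_headers(extra_json: str) -> dict:
    # Parse the connection's Extra field into HTTP headers.
    headers = json.loads(extra_json)
    # Add a content type for JSON API calls if one isn't set.
    headers.setdefault("Content-Type", "application/json")
    return headers
```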

Set Chicory Agent ID as Airflow Variable
Purpose: This variable identifies which specific Chicory agent should handle cost analysis for your pipeline. Each agent is configured with specific tools and context for your infrastructure, ensuring the right agent processes your cost analysis requests with access to your BigQuery, dbt, and dashboard tools.
airflow variables set CHICORY_AGENT_ID "YOUR_CHICORY_AGENT_ID"

Airflow Post-hook
Add a task at the end of your DAG to call the agent after a successful pipeline run.
Purpose: The post-hook ensures cost analysis happens automatically after every successful pipeline run, providing consistent cost tracking without manual intervention.
What's in the DAG Code
The airflow_dag.py file contains a complete example showing:
Standard Pipeline Tasks:
start_task: Pipeline initialization
trigger_dbt_job: Executes dbt transformations via the dbt Cloud API, using Airflow Variables for the dbt account/job IDs and authentication tokens
Chicory Integration Function (run_chicory_agent):
Retrieves connection details from the Airflow connection you configured
Constructs a detailed prompt with pipeline context (run_id, DAG name)
Makes API call to trigger the Chicory agent asynchronously
Polls for completion and returns results via XCom
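The trigger-and-poll flow above can be sketched as follows. The endpoint paths (`/agents/{id}/runs`) and response fields (`status`, `result`) are assumptions, not the documented Chicory API; the HTTP transport is injected so the polling logic stands on its own. In the real DAG, the return value is what gets pushed to XCom:

```python
import time

def run_chicory_agent(base_url, headers, agent_id, run_id, dag_id,
                      http_post, http_get, poll_interval=10, max_polls=30):
    # Build a prompt carrying pipeline context (run_id, DAG name).
    prompt = (
        f"Analyze warehouse costs for DAG '{dag_id}', run '{run_id}'. "
        "Summarize BigQuery spend for the models materialized in this run."
    )
    # Trigger the agent asynchronously.
    run = http_post(f"{base_url}/agents/{agent_id}/runs",
                    headers=headers, body={"prompt": prompt})
    # Poll until the agent reports completion, then return the result.
    for _ in range(max_polls):
        status = http_get(f"{base_url}/agents/{agent_id}/runs/{run['id']}",
                          headers=headers)
        if status.get("status") == "completed":
            return status.get("result")
        time.sleep(poll_interval)
    raise TimeoutError("Chicory agent did not complete in time")
```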
Task Dependencies:
start_task >> trigger_dbt_job >> chicory_posthook >> end_task
Ensures cost analysis only runs after successful dbt execution
Maintains clear pipeline lineage for debugging
Agent Task
Sample user prompt/context
Alternative Orchestration Methods
1. Event-Driven Approaches
Instead of Airflow post-hooks, consider these reactive methods:
Cloud Functions/Lambda Triggers: Automatically trigger on BigQuery job completion
Pub/Sub Messaging: Real-time cost tracking with message queues
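As an illustration of the event-driven route, here is a sketch of a Pub/Sub-triggered handler (e.g. a Cloud Function) reacting to BigQuery job-completion audit log entries routed through a log sink. The field paths follow Cloud Logging's `jobCompletedEvent` audit format but should be verified against your sink's actual payload; the Chicory call itself is left as a stub:

```python
import base64
import json

def handle_bq_job_event(pubsub_message: dict) -> dict:
    # Pub/Sub delivers the log entry base64-encoded in "data".
    entry = json.loads(base64.b64decode(pubsub_message["data"]))
    job = (entry.get("protoPayload", {})
                .get("serviceData", {})
                .get("jobCompletedEvent", {})
                .get("job", {}))
    stats = job.get("jobStatistics", {})
    context = {
        "job_id": job.get("jobName", {}).get("jobId"),
        "bytes_billed": stats.get("totalBilledBytes"),
    }
    # Here you would POST `context` to the Chicory agent for analysis.
    return context
```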
2. Scheduled Approaches
For batch cost analysis:
BigQuery Scheduled Queries: Native cost aggregation without external orchestration
Cloud Scheduler: Simple cron-based cost analysis triggers
3. dbt Integration
Embed cost tracking in your transformation layer:
dbt Post-hooks: Add cost tracking directly to model runs
dbt Exposures: Track cost lineage with downstream dependencies