# DAG generation · Intelligent pipeline builder

Generate pipelines, then run them in Airflow.
Use `POST /pipeline-builder/pipelines/generate` or the CLI. After generation, open Airflow to trigger runs. The manifest below refreshes on each page load.

1. **Generate**: create DAGs from your pipeline definitions.
2. **Deploy**: artifacts land under `AI_Pipelines/` and `dags/`.
3. **Operate**: reload the scheduler and monitor runs in the Airflow UI.
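The three-step flow starts with a POST to the generate endpoint. The sketch below only builds the request with the standard library; the payload shape and the default host are illustrative assumptions, and the real body schema is defined by the Pipeline Builder API:

```python
import json
from urllib import request

# Assumed default host; override via NEXT_PUBLIC_ADMIN_API_URL / Settings.
API_BASE = "http://127.0.0.1:8090"


def build_generate_request(pipeline_def: dict) -> request.Request:
    """Build (but do not send) the POST for /pipeline-builder/pipelines/generate.

    The JSON body here is a placeholder; consult the Pipeline Builder README
    for the actual payload fields.
    """
    body = json.dumps(pipeline_def).encode("utf-8")
    return request.Request(
        f"{API_BASE}/pipeline-builder/pipelines/generate",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# To actually generate (requires the server to be running):
# with request.urlopen(build_generate_request({"name": "example"})) as resp:
#     print(resp.read().decode())
```

Separating request construction from sending keeps the helper usable in scripts and dry runs alike.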
## Generated pipelines

Files live under `intelligent_pipeline_builder_2/AI_Pipelines/`. The Run action triggers the corresponding Airflow DAG on the server.
| File | pipeline_id | dag_id | Actions |
|---|---|---|---|

No pipeline files found yet. Generate them from opportunities or with the form below.
Endpoints: `GET /pipeline-builder/pipelines` · `DELETE /pipeline-builder/pipelines/(unknown)` · `POST /pipeline-builder/pipelines/(unknown)/trigger`
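A hypothetical helper for deriving the per-pipeline endpoints listed above. It assumes the path segment rendered as `(unknown)` is the pipeline's `pipeline_id`; verify this against the API's route definitions before relying on it:

```python
API_BASE = "http://127.0.0.1:8090"  # assumed default; see Settings


def pipeline_endpoints(pipeline_id: str) -> dict:
    """Map action names to (HTTP method, URL) pairs for one pipeline.

    The {pipeline_id} placement mirrors the endpoint list in this page and is
    an assumption, not a documented contract.
    """
    base = f"{API_BASE}/pipeline-builder/pipelines"
    return {
        "list": ("GET", base),
        "delete": ("DELETE", f"{base}/{pipeline_id}"),
        "trigger": ("POST", f"{base}/{pipeline_id}/trigger"),
    }
```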
## Run pipeline builder (from pipeline_opportunities.csv)

Generates pipelines sequentially and streams the log from the server.
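A client consuming the streamed log has to reassemble network chunks into complete lines before displaying them. This sketch assumes plain line-delimited text on the wire; the endpoint's actual framing (SSE, for instance) may differ:

```python
from typing import Iterable, Iterator


def iter_log_lines(chunks: Iterable[bytes]) -> Iterator[str]:
    """Yield complete log lines from a chunked byte stream.

    Chunks may split a line anywhere, so partial lines are buffered until
    their terminating newline arrives; a trailing unterminated line is
    flushed at end of stream.
    """
    buffer = b""
    for chunk in chunks:
        buffer += chunk
        while b"\n" in buffer:
            line, buffer = buffer.split(b"\n", 1)
            yield line.decode("utf-8", errors="replace")
    if buffer:
        yield buffer.decode("utf-8", errors="replace")
```

The same generator works over `resp.iter_content()`-style iterables or a raw socket read loop.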
## Generate Airflow pipeline

`POST /pipelines/generate` writes under `intelligent_pipeline_builder_2/AI_Pipelines/` on the server. Set `NEXT_PUBLIC_ADMIN_API_URL` (or use Settings) if the API is not on `127.0.0.1:8090`.
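A script talking to the API can resolve its base URL the same way: read `NEXT_PUBLIC_ADMIN_API_URL` and fall back to the default host. Reusing the frontend's environment variable in server-side scripts is a convenience assumption, not a documented contract:

```python
import os


def resolve_api_base(default: str = "http://127.0.0.1:8090") -> str:
    """Return the admin API base URL without a trailing slash.

    Prefers NEXT_PUBLIC_ADMIN_API_URL when set, mirroring the frontend's
    configuration; otherwise falls back to the documented default.
    """
    return os.environ.get("NEXT_PUBLIC_ADMIN_API_URL", default).rstrip("/")
```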
## API: generate pipelines

From the repository root (server must be running):

```shell
cd /path/to/kinefinai/intelligent_pipeline_builder_2
./scripts/run_api.sh
# Then POST /pipelines/generate (see the Pipeline Builder README for a curl example).
```
## Manifest snapshot

Read from disk on each page load. Until the first generation runs, the manifest file does not exist and reading it fails with:

`ENOENT: no such file or directory, open '/app/intelligent_pipeline_builder_2/AI_Pipelines/_manifest.json'`
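Because the manifest is read from disk and may not exist yet, a reader should treat a missing file as an empty state rather than an error. The empty-dict fallback below is an assumption about the manifest's shape; the real schema is defined by the builder:

```python
import json
from pathlib import Path

MANIFEST = Path("intelligent_pipeline_builder_2/AI_Pipelines/_manifest.json")


def load_manifest(path: Path = MANIFEST) -> dict:
    """Read the manifest, treating a missing file as 'no pipelines yet'.

    An ENOENT/FileNotFoundError is expected on a fresh checkout, before the
    first generation has written _manifest.json.
    """
    try:
        return json.loads(path.read_text(encoding="utf-8"))
    except FileNotFoundError:
        return {}
```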
## Output locations

Where files land:

- `intelligent_pipeline_builder_2/AI_Pipelines/` (generated pipelines)
- `intelligent_pipeline_builder_2/airflow/dags/` (framework DAG)
- `intelligent_pipeline_builder_2/AI_Pipelines/_manifest.json` (manifest)
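To enumerate generated pipeline files in these locations, a simple glob over `AI_Pipelines/` suffices. The assumption that pipelines are top-level `.py` modules should be checked against the builder's actual output layout:

```python
from pathlib import Path

PIPELINES_DIR = Path("intelligent_pipeline_builder_2/AI_Pipelines")


def list_pipeline_files(root: Path = PIPELINES_DIR) -> list:
    """List generated pipeline file names, sorted for stable display.

    The *.py glob naturally excludes _manifest.json; nested layouts would
    need a recursive glob instead.
    """
    return sorted(p.name for p in root.glob("*.py"))
```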