Microsoft Fabric Updates Blog

Apache Airflow Job File Management APIs

Introducing the Apache Airflow Job File Management APIs—a significant advancement in workflow orchestration, designed to make processes more efficient, secure, and developer-friendly. These APIs give you full programmatic control over job files in your Apache Airflow environments, enabling seamless automation and integration across your data workflows.

The File Management APIs provide endpoints to:

  • Upload and manage DAG files: Easily add new DAGs or update existing ones without manual intervention.
  • List and retrieve files: Get a complete view of your job files for auditing and troubleshooting.
  • Secure file operations: Built-in support for authentication and role-based access ensures enterprise-grade security.

These capabilities align with the broader enhancements in Apache Airflow Jobs, including service-oriented architecture, granular task execution, and improved observability.

What’s New?

The new Apache Airflow Job File Management APIs support:

  • Get Apache Airflow Job File: Returns the specified job file from Apache Airflow by path.
  • Create/Update Apache Airflow Job File: Creates or updates an Apache Airflow Job file.
  • Delete Apache Airflow Job File: Deletes the specified Apache Airflow Job file.
  • List Apache Airflow Job Files: Lists the files from the specified Apache Airflow Job.
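As a rough sketch of how these endpoints could be scripted, the example below builds the request URL and upload payload for the Create/Update operation. The base URL, route segments, payload field names, and token acquisition are illustrative assumptions—consult the official Fabric REST API reference for the exact contract.

```python
"""Sketch of driving the Apache Airflow Job File Management APIs.

The base URL, route segments, and payload fields below are assumptions
for illustration only -- check the Fabric REST API reference for the
real contract before using this in anger.
"""
import base64
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"  # assumed base URL


def file_url(workspace_id: str, job_id: str, path: str = "") -> str:
    """Build the (assumed) URL for one job file, or the file list when path is empty."""
    url = f"{FABRIC_API}/workspaces/{workspace_id}/apacheAirflowJobs/{job_id}/files"
    return f"{url}/{path}" if path else url


def upload_payload(dag_source: str) -> dict:
    """Wrap DAG source as a base64 payload (the field name is an assumption)."""
    return {"content": base64.b64encode(dag_source.encode()).decode()}


def put_dag(token: str, workspace_id: str, job_id: str,
            path: str, dag_source: str) -> None:
    """Create or update a DAG file via the Create/Update endpoint."""
    req = urllib.request.Request(
        file_url(workspace_id, job_id, path),
        data=json.dumps(upload_payload(dag_source)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        resp.read()
```

The same URL helper serves Get (GET), Delete (DELETE), and List (GET without a path), so a thin wrapper per verb covers all four operations.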

Key Use Cases

Here are some practical scenarios where these APIs shine:

  1. Dynamic Workflow Updates: Enable real-time updates to DAGs without restarting your Airflow environment. This is particularly useful for data engineering teams managing frequent schema changes.
  2. Compliance and Audit: Retrieve historical versions of DAG files for compliance checks or debugging. The APIs make it easy to track changes and maintain audit trails.
  3. Error Recovery: Combine file management with custom error handling and retry mechanisms. If a DAG fails due to a configuration issue, roll back to a previous version programmatically and resume operations without downtime.

Getting Started

To start using the File Management APIs, check out our documentation on API capabilities for Fabric Data Factory’s Apache Airflow Job.

The new File Management APIs are more than just a convenience — they’re a foundation for scalable, secure, and automated workflow orchestration.
