Microsoft Fabric Updates Blog

Apache Airflow Job File Management APIs

Introducing the Apache Airflow Job File Management APIs: a significant advancement in workflow orchestration, designed to make processes more efficient, secure, and developer-friendly. These APIs give you full programmatic control over job files in your Apache Airflow environments, enabling seamless automation and integration across your data workflows.

The File Management APIs provide endpoints to:

  • Upload and manage DAG files: Easily add new DAGs or update existing ones without manual intervention.
  • List and retrieve files: Get a complete view of your job files for auditing and troubleshooting.
  • Secure file operations: Built-in support for authentication and role-based access ensures enterprise-grade security.

These capabilities align with the broader enhancements in Apache Airflow Jobs, including service-oriented architecture, granular task execution, and improved observability.

What’s New?

The new Apache Airflow Job File Management APIs support:

  • Get Apache Airflow Job File: Returns the specified job file, identified by its path.
  • Create/Update Apache Airflow Job File: Creates or updates an Apache Airflow Job file.
  • Delete Apache Airflow Job File: Deletes the specified Apache Airflow Job file.
  • List Apache Airflow Job Files: Lists the files from the specified Apache Airflow Job.
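As a rough sketch, the four operations map naturally onto HTTP verbs against a file path under the job. The helper below builds such requests with Python's standard library; the exact route (`/workspaces/{workspaceId}/ApacheAirflowJobs/{jobId}/files/{path}`), payload shape, and header set are assumptions for illustration only, so check the official REST reference for the real contract.

```python
import json
import urllib.request

BASE = "https://api.fabric.microsoft.com/v1"  # Fabric REST base URL

def file_request(token, workspace_id, job_id, path, method="GET", content=None):
    """Build an HTTP request for an Apache Airflow Job file operation.

    method: "GET" (Get), "PUT" (Create/Update), "DELETE" (Delete).
    The route below is assumed for illustration; consult the official
    Fabric REST API reference for the exact endpoint and payload.
    """
    url = f"{BASE}/workspaces/{workspace_id}/ApacheAirflowJobs/{job_id}/files/{path}"
    headers = {"Authorization": f"Bearer {token}"}
    data = None
    if content is not None:
        # Create/Update sends the file content in the request body.
        headers["Content-Type"] = "application/json"
        data = json.dumps({"content": content}).encode()
    return urllib.request.Request(url, data=data, headers=headers, method=method)

# Example: fetch a DAG file, then upload a new revision of it.
get_req = file_request("<token>", "ws-123", "job-456", "dags/etl.py")
put_req = file_request("<token>", "ws-123", "job-456", "dags/etl.py",
                       method="PUT", content="# updated DAG source")
```

The caller would send each request with `urllib.request.urlopen(...)` (or any HTTP client) using a valid Microsoft Entra token.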

Key Use Cases

Here are some practical scenarios where these APIs shine:

  1. Dynamic Workflow Updates: Enable real-time updates to DAGs without restarting your Airflow environment. This is particularly useful for data engineering teams managing frequent schema changes.
  2. Compliance and Audit: Retrieve historical versions of DAG files for compliance checks or debugging. The APIs make it easy to track changes and maintain audit trails.
  3. Error Recovery: Combine file management with custom error handling and retry mechanisms. If a DAG fails due to a configuration issue, roll back to a previous version programmatically and resume operations without downtime.
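One way to sketch the error-recovery scenario is to keep a local history of each DAG's source as you upload it, then re-upload the previous revision through the Create/Update endpoint when a deployment fails. The helpers below only manage that local history; the function names and history structure are illustrative, not part of the APIs.

```python
def record_version(history, dag_path, source):
    """Append the latest uploaded source for a DAG to a local history map."""
    history.setdefault(dag_path, []).append(source)

def previous_version(history, dag_path):
    """Return the next-most-recent source for a DAG, or None when there
    is no earlier revision to roll back to."""
    versions = history.get(dag_path, [])
    return versions[-2] if len(versions) >= 2 else None
```

On failure, the source returned by `previous_version` would be sent back through the Create/Update Apache Airflow Job File call to restore the last known-good DAG.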

Getting Started

To start using the File Management APIs, check out our documentation on API capabilities for Fabric Data Factory’s Apache Airflow Job.

The new File Management APIs are more than just a convenience — they’re a foundation for scalable, secure, and automated workflow orchestration.
