Microsoft Fabric Updates Blog

Apache Airflow Job File Management APIs

Introducing the Apache Airflow Job File Management APIs: a significant advancement in workflow orchestration, designed to make processes more efficient, secure, and developer-friendly. These APIs give you full programmatic control over job files in your Apache Airflow environments, enabling seamless automation and integration across your data workflows.

The File Management APIs provide endpoints to:

  • Upload and manage DAG files: Easily add new DAGs or update existing ones without manual intervention.
  • List and retrieve files: Get a complete view of your job files for auditing and troubleshooting.
  • Secure file operations: Built-in support for authentication and role-based access ensures enterprise-grade security.
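As an illustration of the first bullet, a CI/CD step could walk a local DAG folder and upload every file through the Create/Update operation, with each request carrying a bearer token for authenticated access. This is a minimal sketch, assuming a files endpoint URL of the shape shown, a PUT verb, and a `dags/` prefix on the remote path; none of these are taken from the documented Fabric routes, so check the API reference before using them.

```python
import pathlib
import urllib.request

def build_upload_requests(dags_dir: str, files_url: str, token: str):
    """Yield one Create/Update request per local DAG file.

    ASSUMPTIONS (for illustration only): `files_url` is the job's files
    endpoint, Create/Update uses PUT, and remote paths live under `dags/`.
    """
    root = pathlib.Path(dags_dir)
    for dag in sorted(root.rglob("*.py")):
        # Mirror the local layout on the remote side.
        rel = dag.relative_to(root).as_posix()
        yield urllib.request.Request(
            f"{files_url}/dags/{rel}",
            data=dag.read_bytes(),
            headers={"Authorization": f"Bearer {token}"},
            method="PUT",  # assumed verb for Create/Update
        )
```

Each yielded request can then be sent with `urllib.request.urlopen`, so a scheduled pipeline can sync DAGs with no manual intervention.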

These capabilities align with the broader enhancements in Apache Airflow Jobs, including service-oriented architecture, granular task execution, and improved observability.

What’s New?

The new Apache Airflow Job File Management APIs support:

  • Get Apache Airflow Job File: Returns a job file from Apache Airflow by its path.
  • Create/Update Apache Airflow Job File: Creates or updates an Apache Airflow Job file.
  • Delete Apache Airflow Job File: Deletes the specified Apache Airflow Job file.
  • List Apache Airflow Job Files: Lists the files from the specified Apache Airflow Job.
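The four operations above map naturally onto a thin HTTP client. The sketch below is illustrative, not authoritative: the base URL, the `workspaces/{ws}/apacheAirflowJobs/{job}/files` route pattern, and the HTTP verbs are all assumptions, so consult the Fabric REST API reference for the documented routes.

```python
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"  # assumed base URL

class AirflowJobFiles:
    """Minimal sketch of a client for the File Management operations.

    NOTE: the route pattern and verbs below are illustrative assumptions,
    not the documented Fabric API surface.
    """

    def __init__(self, token: str, workspace_id: str, job_id: str):
        self.headers = {"Authorization": f"Bearer {token}"}
        self.base = (f"{FABRIC_API}/workspaces/{workspace_id}"
                     f"/apacheAirflowJobs/{job_id}/files")

    def _request(self, method: str, url: str, body: bytes = None):
        # Build (but do not send) the request; callers pass it to urlopen.
        return urllib.request.Request(
            url, data=body, headers=self.headers, method=method)

    def get_file(self, path: str):
        # Get Apache Airflow Job File: returns the file at `path`.
        return self._request("GET", f"{self.base}/{path}")

    def create_or_update_file(self, path: str, content: bytes):
        # Create/Update: upload new content for the file at `path`.
        return self._request("PUT", f"{self.base}/{path}", content)

    def delete_file(self, path: str):
        # Delete the specified job file.
        return self._request("DELETE", f"{self.base}/{path}")

    def list_files(self):
        # List all files in the job.
        return self._request("GET", self.base)
```

Sending a call is then e.g. `urllib.request.urlopen(client.list_files())`; production code would add retries and error handling around that.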

Key Use Cases

Here are some practical scenarios where these APIs shine:

  1. Dynamic Workflow Updates: Enable real-time updates to DAGs without restarting your Airflow environment. This is particularly useful for data engineering teams managing frequent schema changes.
  2. Compliance and Audit: Retrieve historical versions of DAG files for compliance checks or debugging. The APIs make it easy to track changes and maintain audit trails.
  3. Error Recovery: Combine file management with custom error handling and retry mechanisms. If a DAG fails due to a configuration issue, roll back to a previous version programmatically and resume operations without downtime.
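The error-recovery scenario can be sketched as a small rollback helper built on the Create/Update operation. The endpoint shape and the PUT verb are assumptions for illustration; the known-good file content might come from git history (for example `git show HEAD~1:dags/etl.py`) or any other version store.

```python
import urllib.request

def build_rollback_request(files_url: str, token: str, dag_path: str,
                           previous_content: bytes) -> urllib.request.Request:
    """Build a Create/Update request that restores a prior DAG version.

    ASSUMPTIONS: `files_url` is the job's files endpoint and Create/Update
    uses PUT; `previous_content` is a known-good file body, e.g. recovered
    with `git show HEAD~1:<dag_path>`.
    """
    return urllib.request.Request(
        f"{files_url}/{dag_path}",
        data=previous_content,
        headers={"Authorization": f"Bearer {token}"},
        method="PUT",  # assumed verb for Create/Update
    )

# On a DAG failure caused by a bad config change, roll back and resume:
#   req = build_rollback_request(files_url, token, "dags/etl.py", good_bytes)
#   urllib.request.urlopen(req)  # no environment restart required
```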

Getting Started

To start using the File Management APIs, check out our documentation on API capabilities for Fabric Data Factory’s Apache Airflow Job.

The new File Management APIs are more than just a convenience: they're a foundation for scalable, secure, and automated workflow orchestration.


February 17, 2026, by Penny Zhou
