Microsoft Fabric Updates Blog

Microsoft Fabric July 2023 Update

Welcome to the July 2023 update. We have features in Core, Synapse, Data Factory, Data Activator, Community, and Power BI. Contents: Core (Help Pane, Monitoring Hub improvements), Admin (Microsoft Public Preview Default Sharing On by Default Effective July 5th, 2023), OneLake (OneLake file explorer update with support for switching organizational accounts, Item sharing), Synapse Data … Continue reading “Microsoft Fabric July 2023 Update”

Wondering how to incrementally amass data in your data destination? This is how!

In many scenarios, you only want to pull new data from your sources and append it to your data destination to report over. With Dataflows Gen2, which now supports data destinations, you can set up your own pattern to load new data, replace some old data, and keep your reports up to … Continue reading “Wondering how to incrementally amass data in your data destination? This is how!”
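The full post builds this pattern inside Dataflows Gen2 itself. As a rough illustration of the same watermark idea, here is a minimal Python sketch (not Power Query M, which Dataflows Gen2 actually uses) that appends only rows newer than the latest value already present in the destination; the names source_rows, destination_rows, and the modified column are hypothetical stand-ins, not taken from the post.

```python
from datetime import datetime

# Hypothetical in-memory stand-ins for a source system and a data destination;
# a real Dataflow would read these through connectors, not Python lists.
source_rows = [
    {"id": 1, "modified": datetime(2023, 7, 1), "amount": 10},
    {"id": 2, "modified": datetime(2023, 7, 8), "amount": 25},
    {"id": 3, "modified": datetime(2023, 7, 15), "amount": 40},
]
destination_rows = [
    {"id": 1, "modified": datetime(2023, 7, 1), "amount": 10},
]

# Watermark: the latest "modified" value already present in the destination.
watermark = max(row["modified"] for row in destination_rows)

# Incremental load: keep only rows newer than the watermark, then append them.
new_rows = [row for row in source_rows if row["modified"] > watermark]
destination_rows.extend(new_rows)

print(f"Appended {len(new_rows)} rows; destination now holds {len(destination_rows)} rows.")
```

Conceptually, a Dataflow version of this would filter the source query on a stored watermark, append the result to the data destination, and refresh the watermark after each load.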

Microsoft Fabric Data Factory Webinar Series – August 2023

Are you interested in learning more about Data Factory, the cloud-based data integration service that allows you to create data-driven workflows in Microsoft Fabric? If so, you are invited to join our webinars, where we will show you how to use Data Factory to transform and orchestrate your data in various scenarios. Each webinar will … Continue reading “Microsoft Fabric Data Factory Webinar Series – August 2023”

Data Pipeline Performance Improvements Part 2: Creating an Array of JSONs

Welcome back to Part 2 of this 3-part series on optimizing Data Pipelines for historical loads. In the first two parts, we introduce two technical patterns. Then in Part 3, we will bring everything together, covering an end-to-end design pattern. To recap, in Part 1 we covered how to parse a time interval (dd.hh:mm:ss) … Continue reading “Data Pipeline Performance Improvements Part 2: Creating an Array of JSONs”
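Part 2's pattern, building an array of JSON objects that describe the logical partitions a pipeline can iterate over, can be sketched outside the pipeline expression language. The sketch below is Python rather than Data Pipeline expressions: it splits a date range into fixed-size windows and emits one JSON string per window. The function name and the windowStart/windowEnd fields are illustrative assumptions, not names from the post.

```python
import json
from datetime import datetime, timedelta

def build_partition_array(start: datetime, end: datetime, window: timedelta) -> list:
    """Split [start, end) into fixed-size windows and return one JSON string per window."""
    partitions = []
    window_start = start
    while window_start < end:
        window_end = min(window_start + window, end)
        partitions.append(json.dumps({
            "windowStart": window_start.isoformat(),
            "windowEnd": window_end.isoformat(),
        }))
        window_start = window_end
    return partitions

# Example: one week of history split into daily windows.
for item in build_partition_array(datetime(2023, 7, 1), datetime(2023, 7, 8), timedelta(days=1)):
    print(item)
```

Each JSON item could then drive its own Copy activity, which is the kind of parallelization over logical partitions the series describes.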

Data Pipeline Performance Improvements Part 1: How to convert a time interval (dd.hh:mm:ss) into seconds

Series Overview: Welcome to this short series, where we’ll discuss the technical methods used to improve Data Pipeline Copy activity performance through parallelization by logically partitioning any source. Often, we see solutions leveraging a single Copy activity to move large volumes of data. While this works well, you might face a scenario where you … Continue reading “Data Pipeline Performance Improvements Part 1: How to convert a time interval (dd.hh:mm:ss) into seconds”
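The conversion named in Part 1's title is plain arithmetic: days, hours, minutes, and seconds each scale to seconds and are summed. The series itself works inside Data Pipelines; the Python sketch below only illustrates that arithmetic, and the function name is an illustrative assumption.

```python
def interval_to_seconds(interval: str) -> int:
    """Convert a 'dd.hh:mm:ss' time interval into a total number of seconds."""
    days, time_part = interval.split(".", 1)
    hours, minutes, seconds = (int(part) for part in time_part.split(":"))
    return int(days) * 86400 + hours * 3600 + minutes * 60 + seconds

# 1 day, 2 hours, 30 minutes, 15 seconds -> 95415 seconds
print(interval_to_seconds("1.02:30:15"))
```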