
Edit the Destination Table Column Type when Copying Data to Lakehouse Table, Data Warehouse and SQL Data Stores 

To improve the flexibility of copying data in Fabric Data Factory, we are excited to announce that you can now edit destination table column types when copying data!

Supported scenarios

This new feature lets you edit the column data types for a new or auto-created destination table when your data destination is one of the following data stores:

  • Lakehouse Table
  • Data Warehouse
  • SQL data stores:
      • Azure SQL Database
      • Azure Synapse Analytics
      • Azure SQL Database Managed Instance
      • SQL Server

How to edit the destination table column type 

You can edit the destination column type through the Copy assistant or the Copy activity in your data pipeline.

Column mapping with the Copy assistant

When you specify a Lakehouse Table, Data Warehouse, or SQL data store as your destination and select Load to new table on the Connect to data destination page, the column mapping is shown automatically. You can then edit the destination column types to fit your scenario.

When copying data to a Lakehouse Table, for example, the source may have a column called PersonID of int type. You can now change it to a string type when mapping it to the destination column (a quick way to verify the result is sketched below).

Note that editing the destination type is not currently supported when your source column is of decimal type.

Column mapping when your destination is Lakehouse Table with the Copy assistant.
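
If you want to double-check that the type edit took effect, one option is to inspect the new table's schema from a Fabric notebook attached to the Lakehouse. This is only a minimal sketch; the table name PersonTable is a hypothetical stand-in for whatever destination table your copy created.

```python
# Minimal schema check for the newly created Lakehouse table.
# "PersonTable" is a hypothetical table name; replace it with your own.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = spark.read.table("PersonTable")
df.printSchema()  # PersonID should now be reported as string rather than int
```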

Let’s take a look at another example, this time copying data to a Data Warehouse or SQL data store. Here, the source has an id column of int type, and we can change it to a float type when mapping it to the destination column (see the verification sketch below).

Column mapping when your destination is Data Warehouse/SQL data stores with the Copy assistant. 
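
As with the Lakehouse example, you can verify the resulting column type on the Data Warehouse or SQL side by querying INFORMATION_SCHEMA after the copy completes. The sketch below is an illustration only; the connection string and table name are placeholders for your own environment.

```python
# Check the data types of the destination columns after the copy completes.
# The connection string and table name below are placeholders.
import pyodbc

conn_str = "<your ODBC connection string>"

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    cursor.execute(
        "SELECT COLUMN_NAME, DATA_TYPE "
        "FROM INFORMATION_SCHEMA.COLUMNS "
        "WHERE TABLE_NAME = ?",
        "MyDestinationTable",
    )
    for column_name, data_type in cursor.fetchall():
        print(column_name, data_type)  # id should now report a floating-point type
```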

 

Column mapping with the Copy activity 

In the data pipeline Copy activity, select the connections and configure the tables for your source and destination.

Note that the destination must be a new table or use the Auto create table option.

Then, go to the Mapping tab and select Import schemas.  

Then specify your destination table column types; you can follow the same steps as in the Copy assistant to map the columns.
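
Under the hood, the mapping you configure on the Mapping tab is saved in the Copy activity's JSON definition. As a rough sketch only, assuming it follows the familiar Azure Data Factory TabularTranslator shape (the exact property names may differ in Fabric), the PersonID example above would look roughly like this, shown here as a Python dict for readability:

```python
# Rough sketch of an edited column mapping, mirroring the TabularTranslator
# shape used by Azure Data Factory copy activities. Treat the exact property
# names as an assumption, not the authoritative Fabric pipeline schema.
import json

translator = {
    "type": "TabularTranslator",
    "mappings": [
        {
            "source": {"name": "PersonID", "type": "Int32"},
            "sink": {"name": "PersonID", "type": "String"},  # edited destination type
        }
    ],
}

print(json.dumps(translator, indent=2))
```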

 

For more information about Lakehouse destination column mapping, see: https://learn.microsoft.com/en-us/fabric/data-factory/connector-lakehouse-copy-activity#mapping

For more information about Data Warehouse and SQL data store destination column mapping, see the corresponding connector documentation.

Have any questions or feedback? Leave a comment below!
