Microsoft Fabric Updates Blog

Edit the Destination Table Column Type when Copying Data to Lakehouse Table, Data Warehouse and SQL Data Stores 

To improve the flexibility of copying data in Fabric Data Factory, we are excited to announce that you can now edit destination table column types when copying data! 

Supported scenarios

This new feature allows you to edit the data types of the columns for a new or auto-created destination table when your data destination is one of the following data stores:

  • Lakehouse Table 
  • Data Warehouse 

SQL data stores: 

  • Azure SQL Database 
  • Azure Synapse Analytics 
  • Azure SQL Database Managed Instance 
  • SQL Server 

How to edit the destination table column type 

You can edit the destination column type through the Copy assistant or the Copy activity in your data pipeline.  

Column mapping with the Copy assistant

When you specify Lakehouse Table/Data Warehouse/SQL data stores as your destination and select Load to new table on the Connect to data destination page, the column mapping appears automatically. You can then edit the destination column types to fit your scenario. 

For example, when copying data to a Lakehouse Table, the source may contain a column called PersonID of type int. You can now change it to a string type when mapping it to the destination column.

Note that editing the destination type is not currently supported when the source column is of decimal type.  

Column mapping when your destination is Lakehouse Table with the Copy assistant.
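Behind the assistant's mapping grid, the pipeline stores a column mapping similar to Azure Data Factory's tabular translator. The sketch below, a Python dict mirroring that JSON, illustrates the PersonID example above; the exact shape and type names Fabric emits may differ, so treat them as assumptions.

```python
# A minimal sketch of the column mapping behind the Copy assistant's grid when
# PersonID (an int in the source) is retyped to string in the Lakehouse destination.
# The shape mirrors Azure Data Factory's TabularTranslator mapping entries; the exact
# JSON Fabric emits may differ, so treat the property and type names as assumptions.
person_id_mapping = {
    "source": {"name": "PersonID", "type": "Int32"},   # type detected from the source
    "sink":   {"name": "PersonID", "type": "String"},  # destination type edited in the assistant
}
```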

Let’s take a look at another example, this time copying data to Data Warehouse/SQL data stores. Here, the source has an id column of type int, and we can change it to a float type when mapping to the destination column. 

Column mapping when your destination is Data Warehouse/SQL data stores with the Copy assistant. 
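The same pattern applies here; only the sink type changes. A hedged sketch of the single mapping entry for the id column:

```python
# Same pattern as the Lakehouse example above; only the sink type changes.
# "Double" is used here as the interim name for a float destination column
# (an assumption about how Fabric labels it); the warehouse column itself
# would be created as a float type.
id_mapping = {
    "source": {"name": "id", "type": "Int32"},
    "sink":   {"name": "id", "type": "Double"},
}
```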

 

Column mapping with the Copy activity 

In the Copy activity of your data pipeline, select the connections and configure the tables for your source and destination.

Note that the destination must use a new table or the Auto create table option.

Then, go to the Mapping tab and select Import schemas.  

Specify your destination table column types, following the same steps as in the Copy assistant to map the columns.
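For reference, the edits you make on the Mapping tab end up in the Copy activity's translator definition. Below is a hedged sketch of what that section of the pipeline JSON can look like, expressed as a Python dict and based on Azure Data Factory's TabularTranslator schema; the exact property names Fabric uses may differ.

```python
import json

# Sketch of a Copy activity "translator" block after importing schemas and
# editing destination column types on the Mapping tab. Based on the ADF
# TabularTranslator schema; treat the property names as assumptions for Fabric.
translator = {
    "type": "TabularTranslator",
    "mappings": [
        {"source": {"name": "PersonID", "type": "Int32"},
         "sink":   {"name": "PersonID", "type": "String"}},
        {"source": {"name": "id", "type": "Int32"},
         "sink":   {"name": "id", "type": "Double"}},
    ],
    # Type conversion settings control how values are coerced into the edited sink types.
    "typeConversion": True,
    "typeConversionSettings": {
        "allowDataTruncation": True,
        "treatBooleanAsNumber": False,
    },
}

print(json.dumps(translator, indent=2))  # compare against the pipeline's JSON view
```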

 

For more information about Lakehouse destination column mapping, see https://learn.microsoft.com/en-us/fabric/data-factory/connector-lakehouse-copy-activity#mapping 

For more information about Data Warehouse/SQL data stores destination column mapping, see the corresponding connector documentation on Microsoft Learn. 

Have any questions or feedback? Leave a comment below!
