Microsoft Fabric Updates Blog

Edit the Destination Table Column Type when Copying Data to Lakehouse Table, Data Warehouse and SQL Data Stores 

To improve flexibility when copying data in Fabric Data Factory, we are excited to announce that you can now edit destination table column types when copying data! 

Supported scenarios

This new feature allows you to edit the data type of each column in a new or auto-created destination table when your data destination is one of the following data stores:

  • Lakehouse Table 
  • Data Warehouse 

SQL data stores: 

  • Azure SQL Database 
  • Azure Synapse Analytics 
  • Azure SQL Managed Instance 
  • SQL Server 

How to edit the destination table column type 

You can edit the destination column type through the Copy assistant or the Copy activity in your data pipeline.  

Column mapping with the Copy assistant

When you specify a Lakehouse Table, Data Warehouse, or SQL data store as your destination and select Load to new table on the Connect to data destination page, the column mapping is shown automatically. You can then edit the destination column types to suit your scenario. 

For example, when copying data to a Lakehouse Table, the source may contain a column called PersonID of type int. You can now change it to a string type when mapping it to the destination column.

Note that editing the destination type is not currently supported when the source column is of decimal type.  
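Behind the UI, a column mapping like this is stored in the Copy activity's JSON definition. The following is only a rough sketch, assuming the TabularTranslator mapping format that Data Factory pipelines use; the column name mirrors the example above and the exact type strings may differ from what the service emits:

```json
{
  "translator": {
    "type": "TabularTranslator",
    "mappings": [
      {
        "source": { "name": "PersonID", "type": "Int32" },
        "sink":   { "name": "PersonID", "type": "String" }
      }
    ]
  }
}
```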

Column mapping when your destination is Lakehouse Table with the Copy assistant.

Let’s look at another example, this time copying data to a Data Warehouse or SQL data store. Here, the source has an id column of type int, and we can change it to a float type when mapping to the destination column. 

Column mapping when your destination is Data Warehouse/SQL data stores with the Copy assistant. 


Column mapping with the Copy activity 

In the data pipeline Copy activity, select the connection and configure the table in your source and destination.

Note that the destination must be set to load to a new table or to auto-create the table.

Then, go to the Mapping tab and select Import schemas.  

Specify your destination table column types; you can follow the same steps as in the Copy assistant to map the columns.
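For reference, the mapping you configure on this tab ends up in the Copy activity's JSON definition. The snippet below is a hedged sketch, assuming the TabularTranslator format and its typeConversion settings from Data Factory; the id column and float conversion mirror the earlier example, and the property values shown are illustrative:

```json
{
  "translator": {
    "type": "TabularTranslator",
    "mappings": [
      {
        "source": { "name": "id", "type": "Int32" },
        "sink":   { "name": "id", "type": "Single" }
      }
    ],
    "typeConversion": true,
    "typeConversionSettings": {
      "allowDataTruncation": true
    }
  }
}
```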


For more information about Lakehouse destination column mapping, read https://learn.microsoft.com/en-us/fabric/data-factory/connector-lakehouse-copy-activity#mapping 

For more information about Data Warehouse/SQL data stores destination column mapping, see the corresponding connector documentation. 

Have any questions or feedback? Leave a comment below!
