
Edit the Destination Table Column Type when Copying Data to Lakehouse Table, Data Warehouse and SQL Data Stores 

To improve the flexibility of copying data in Fabric Data Factory, we are excited to announce that you can now edit destination table column types when copying data! 

Supported scenarios

This new feature allows you to edit the data type of each column for a new or auto-created destination table when your data destination is one of the following data stores:

  • Lakehouse Table 
  • Data Warehouse 
  • SQL data stores: 
      • Azure SQL Database 
      • Azure Synapse Analytics 
      • Azure SQL Database Managed Instance 
      • SQL Server 

How to edit the destination table column type 

You can edit the destination column type through the Copy assistant or the Copy activity in your data pipeline.  

Column mapping with the Copy assistant

When you specify Lakehouse Table, Data Warehouse, or a SQL data store as your destination and select Load to new table on the Connect to data destination page, the column mapping is shown automatically. You can edit the destination column type based on your scenario. 

When copying data to a Lakehouse Table, for example, there may be a column called PersonID in the source that is an int type. You can now change it to a string type when mapping it to the destination column.

Note that editing the destination type is not currently supported when your source column is of decimal type.  

Column mapping when your destination is Lakehouse Table with the Copy assistant.

Let’s take a look at another example, this time copying data to Data Warehouse/SQL data stores. Here, we have an id column in our source that is an int type, and we can change it to a float type when mapping it to the destination column. 

Column mapping when your destination is Data Warehouse/SQL data stores with the Copy assistant. 
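Under the hood, the Copy assistant generates a Copy activity, and the column types you edit end up in that activity's column mapping. The short Python sketch below is only an illustration of what such a mapping could look like: it follows the Azure Data Factory-style TabularTranslator layout, and property names such as physicalType are assumptions rather than confirmed Fabric output.

    import json

    # Hedged illustration only: a column mapping in the style of the
    # TabularTranslator used by Copy activities. Property names such as
    # "physicalType" are assumptions and may differ from what Fabric emits.
    mapping = {
        "type": "TabularTranslator",
        "mappings": [
            {
                # The source column PersonID arrives as an integer...
                "source": {"name": "PersonID", "type": "Int32"},
                # ...and is written to the destination as a string,
                # matching the Lakehouse example above.
                "sink": {"name": "PersonID", "physicalType": "string"},
            }
        ],
    }

    print(json.dumps(mapping, indent=2))

The Data Warehouse/SQL data stores example works the same way, with the sink side declaring float instead of string.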

 

Column mapping with the Copy activity 

In the data pipeline Copy activity, select the connection and configure the table in your source and destination.

Note that you should use a new table or the Auto create table option in the destination.

Then, go to the Mapping tab and select Import schemas.  

Then specify your destination table column types. You can follow the same steps as in the Copy assistant to map the columns.
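For orientation, the sketch below shows roughly where that mapping sits inside a Copy activity definition, next to a sink that auto-creates the destination table. Treat it as a hedged sketch only: the activity name is hypothetical, and properties such as tableOption and the SqlDWSink type string are assumptions borrowed from the Azure Data Factory copy activity format rather than confirmed Fabric output.

    import json

    # Hedged sketch of a Copy activity fragment. The name "CopyToWarehouse"
    # and the sink/translator property names are assumptions for illustration,
    # not confirmed Fabric output.
    copy_activity = {
        "name": "CopyToWarehouse",  # hypothetical activity name
        "type": "Copy",
        "typeProperties": {
            "sink": {
                "type": "SqlDWSink",          # assumed sink type for a Warehouse destination
                "tableOption": "autoCreate",  # ask the activity to auto-create the table
            },
            "translator": {
                "type": "TabularTranslator",
                "mappings": [
                    {
                        # The id column is read as an integer and written as a float,
                        # matching the Data Warehouse/SQL data stores example above.
                        "source": {"name": "id", "type": "Int32"},
                        "sink": {"name": "id", "physicalType": "float"},
                    }
                ],
            },
        },
    }

    print(json.dumps(copy_activity, indent=2))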

 

For more information about Lakehouse destination column mapping, read https://learn.microsoft.com/en-us/fabric/data-factory/connector-lakehouse-copy-activity#mapping 

For more information about Data Warehouse/SQL data stores destination column mapping, read: 

Have any questions or feedback? Leave a comment below!

Related blog posts


June 24, 2024 by Justin Barry

When we talk about Microsoft Fabric workspace collaboration, a common scenario is developers and their teams using a shared workspace environment, which means they have access to “live items”. A change made directly within a workspace would override and affect all other developers or users utilizing that workspace. This is where git becomes increasingly important … Continue reading “Microsoft Fabric Lifecycle Management: Getting started with development in isolation using a Private Workspace”

June 21, 2024 by Marc Bushong

Developing ETLs/ELTs can be a complex process when you add in business logic, large amounts of data, and the high volume of table data that needs to be moved from source to target. This is especially true in analytical workloads involving relational data when there is a need to either fully reload a table or incrementally update a table. Traditionally this is easily completed in a flavor of SQL (or name your favorite relational database). But a question is, how can we execute a mature, dynamic, and scalable ETL/ELT utilizing T-SQL with Microsoft Fabric? The answer is with Fabric Pipelines and Data Warehouse.