
Alaahady
Advocate I

Copy Data to Fabric Warehouse – Handling nvarchar(max) and datetimeoffset(2) Columns

Hi Fabric Community,

I'm using the Copy Data activity in Microsoft Fabric to load data from an on-premises SQL Server into a Fabric Warehouse. I’ve run into a couple of issues and would appreciate any guidance:

  1. nvarchar(max) Column Issue
    One of the columns in my source table is comment, defined as nvarchar(max). Some rows contain more than 8000 characters, which causes the copy job to fail.
    Is there a recommended way to truncate the data to 8000 characters during the copy process to avoid this error?

  2. datetimeoffset(2) Conversion Issue
    Another column is of type datetimeoffset(2). Fabric automatically converts it to varchar, which works — but if I try to change the mapping to datetime2, I get an error about incompatible data types.
    Is there a proper way to handle datetimeoffset fields when copying to Fabric Warehouse?

Interestingly, if I exclude these two columns, the incremental refresh works successfully. So I suspect these columns are causing the issue.

Any advice or best practices would be greatly appreciated!

Thanks in advance.

1 ACCEPTED SOLUTION

Hi @Alaahady 

Just checking in to see if you had a chance to review the previous response. If you still need assistance, please let us know and we’ll be glad to help.


10 REPLIES
Alaahady
Advocate I

Hi,

It is working now, thank you all for your help!

v-aatheeque
Community Support

Hi @Alaahady 


We wanted to follow up to check if you’ve had an opportunity to review the previous responses. If you require further assistance, please don’t hesitate to let us know.


tayloramy
Community Champion

Hi @Alaahady

 

In your source query, throw in a left() call around that column to truncate it at 8000 characters. 

 

As for not seeing the upsert option, I think it only exists for lakehouses as the destination. 

 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.

 

tayloramy
Community Champion

Hi @Alaahady 

 

What's going on and why: Fabric Warehouse doesn't persist nvarchar or datetimeoffset columns. Instead, Warehouse uses UTF-8 varchar (no nvarchar at rest), and datetimeoffset is not a supported persisted type. On very long strings, you'll also hit the 8,000-char boundary unless you land them as varchar(max) in Warehouse (currently preview). This is why your copy works when those two columns are excluded. See Fabric Warehouse data types.

Quick solution

  1. Truncate the long nvarchar(max) during the copy (safe 8,000 chars):
    Use a source SQL query in your Copy activity and pre-shape the column:
SELECT
  ...,
  LEFT(comment, 8000) AS comment    -- trims to 8000
FROM dbo.YourSourceTable;

Pre-create the target column as varchar(8000) in the Warehouse.

If you actually need more than 8,000 characters, create the target column as varchar(max) (preview, up to ~1 MB per cell) and keep your source query as comment (no LEFT). Note that in the SQL analytics endpoint it still appears as varchar(8000), but the Warehouse stores up to the preview limit. Docs: Data types (varchar(max) preview note) and Fabric blog: Working with large data types.
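One subtlety worth noting (an inference from the UTF-8 collation mentioned above, not an error the Copy activity spells out): in a UTF-8 warehouse, varchar(8000) holds 8,000 bytes, while T-SQL LEFT counts characters, so character-based truncation can still overflow the column when the text contains non-ASCII characters. A minimal Python illustration of the difference:

```python
# Illustration only (not part of the Copy activity): varchar(8000) under a
# UTF-8 collation limits BYTES, while LEFT(comment, 8000) limits CHARACTERS.

def utf8_byte_len(s: str) -> int:
    """Length of a string when encoded as UTF-8 (what varchar counts)."""
    return len(s.encode("utf-8"))

def truncate_utf8_bytes(s: str, max_bytes: int) -> str:
    """Trim a string so its UTF-8 encoding fits in max_bytes,
    without splitting a multi-byte character."""
    encoded = s.encode("utf-8")[:max_bytes]
    return encoded.decode("utf-8", errors="ignore")

ascii_comment = "x" * 8000   # 8,000 chars -> 8,000 bytes: fits
arabic_comment = "م" * 8000  # 8,000 chars -> 16,000 bytes: overflows

print(utf8_byte_len(ascii_comment))                              # 8000
print(utf8_byte_len(arabic_comment))                             # 16000
print(utf8_byte_len(truncate_utf8_bytes(arabic_comment, 8000)))  # 8000
```

So if the comment column contains non-ASCII text, trimming with LEFT to fewer than 8,000 characters (or sizing the target column generously) may be needed.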

  2. Convert datetimeoffset(2) to datetime2(2) in the source query:
    Best practice is to normalize to UTC, then cast:
SELECT
  ...,
  CAST(SWITCHOFFSET(your_datetimeoffset_col, '+00:00') AS datetime2(2)) AS your_dt_utc
FROM dbo.YourSourceTable;

Alternatively, use AT TIME ZONE if you need to convert from a known local zone before storing as UTC:

SELECT
  CAST(
    (your_local_dt AT TIME ZONE 'Pacific Standard Time') AT TIME ZONE 'UTC'
    AS datetime2(2)
  ) AS your_dt_utc;

Warehouse guidance: use datetime2 instead of datetimeoffset. Docs: Warehouse data types, SWITCHOFFSET, AT TIME ZONE.
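For intuition, here is a small Python sketch of the same normalize-to-UTC-then-drop-the-offset step (illustrative only; the actual conversion happens in the T-SQL above):

```python
# Python equivalent of SWITCHOFFSET(..., '+00:00') followed by
# CAST(... AS datetime2): same instant expressed in UTC, offset removed.
from datetime import datetime, timezone, timedelta

# A value like datetimeoffset '2024-05-01 10:30:00 +03:00'
src = datetime(2024, 5, 1, 10, 30, 0, tzinfo=timezone(timedelta(hours=3)))

# SWITCHOFFSET(..., '+00:00'): same point in time, shown in UTC
utc = src.astimezone(timezone.utc)

# CAST(... AS datetime2(2)): keep the wall-clock value, drop the offset
naive_utc = utc.replace(tzinfo=None)

print(naive_utc)  # 2024-05-01 07:30:00
```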




Hi @tayloramy 

Thank you for your message. I agree this is a good solution, but I don't think Copy Job supports a custom query; it just gives you the list of tables in the SQL Server, and I select a table and columns. It doesn't even give me the option to use CAST.

Hi @Alaahady

That's correct, you would need to use a pipeline for this to have more control.  

I don't think the pipeline has an incremental option. I have an ID column, and I need to insert new records whose ID doesn't already exist and update existing records where record_modify_date is greater than the max existing record_modify_date.

Hi @Alaahady

 

Copy activities do have an upsert option: 

(screenshot: upsert option in the Copy activity settings)

 

If your source table also has a modified datetime column, I'd recommend pre-filtering to records that were modified since the last pipeline run as well for better performance. 
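The watermark-plus-upsert pattern described above can be sketched in plain Python (an in-memory illustration with made-up column names record_id and record_modify_date, not Fabric pipeline code):

```python
# Sketch of incremental upsert: compute a watermark from the target,
# pre-filter the source to rows modified after it, then insert-or-update
# keyed on the ID column. A dict stands in for the warehouse table.

def incremental_upsert(target: dict, source_rows: list) -> dict:
    """Insert rows whose id is new; update rows modified after the watermark."""
    watermark = max(
        (row["record_modify_date"] for row in target.values()),
        default=None,
    )
    for row in source_rows:
        # Pre-filter: only rows modified since the last load
        if watermark is not None and row["record_modify_date"] <= watermark:
            continue
        target[row["record_id"]] = row  # insert or overwrite (upsert)
    return target

target = {1: {"record_id": 1, "record_modify_date": 10, "val": "old"}}
source = [
    {"record_id": 1, "record_modify_date": 12, "val": "new"},    # update
    {"record_id": 2, "record_modify_date": 11, "val": "fresh"},  # insert
    {"record_id": 3, "record_modify_date": 9,  "val": "stale"},  # skipped
]
result = incremental_upsert(target, source)
```

In the real pipeline, the watermark lookup is the MAX() query against the warehouse and the pre-filter is a WHERE clause in the source query, which keeps the copied volume small.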

 


Hi @tayloramy 

Thank you for your suggestion; it helped me move one step forward. I've successfully built a pipeline that sets the variable max_date = MAX(rec_mod_dtz) from the target table in the warehouse. The variable was set correctly, and I added a Copy Data activity to retrieve the data using a query against the SQL source server, casting the columns as you recommended:

  ,LEFT(comment, 8000) AS comment    -- trims to 8000
  ,CAST(SWITCHOFFSET([rec_mod_dtz], '+00:00') AS datetime2(2)) AS rec_mod_dtz


 

However, I couldn't locate the Upsert option. I proceeded with the Copy activity, though I'm unsure whether it will replace existing records or append new ones.

Currently, I'm encountering the following error:

ErrorCode=DWCopyCommandOperationFailed
Message=String or binary data would be truncated while reading column of type 'VARCHAR(8000)'
 
