Hello,
I am using a data pipeline to copy CSV files from Azure Blob Storage to a Fabric warehouse using a wildcard file path. I have already created the warehouse tables with the structure the files are expected to have, but when I load the files through the pipeline, it fails with an error I cannot explain: it says the ModifiedDate column does not match the external data type.
The strange part is that the same file loads successfully when I use an explicit file path instead of a wildcard file path.
Success scenario - loading the file with an explicit file path; the data loads into the warehouse table successfully.
Failure scenario - using a wildcard file path; the load into the target warehouse table fails.
Error Message - Message=SQL DW Copy Command operation failed with error 'Column 'ModifiedDate' of type 'DATETIME2' is not compatible with external data type 'Parquet physical type: BYTE_ARRAY, logical type: UTF8', please try with 'VARCHAR(8000)'
Does anyone know why this happens? It works with a single file path but fails with a wildcard file path. I also checked the target warehouse table: the column type is DATETIME2(3). And the error's suggestion is confusing: VARCHAR(8000) is a string type, so how could that be compatible with a date column when DATETIME2 is not? Has anyone faced this issue?
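The error text itself narrows this down: the copy runs through the warehouse COPY command over staged Parquet data, and in the failing run the ModifiedDate values arrive as Parquet strings (physical type BYTE_ARRAY, logical type UTF8) rather than timestamps, so the implicit conversion to DATETIME2 is rejected; the VARCHAR(8000) hint is just the engine suggesting the column be landed as text. Below is a minimal sketch of the kind of wildcard source settings involved, assuming a DelimitedText source over Azure Blob Storage; the folder and file names are placeholders, not the actual values from this pipeline:

```json
{
  "source": {
    "type": "DelimitedTextSource",
    "storeSettings": {
      "type": "AzureBlobStorageReadSettings",
      "recursive": true,
      "wildcardFolderPath": "sales",
      "wildcardFileName": "*.csv"
    },
    "formatSettings": {
      "type": "DelimitedTextReadSettings"
    }
  }
}
```

A pattern like `*.csv` matches every CSV under the folder, so a single stray file with a different layout (for example, one whose ModifiedDate values are not parseable as dates, or one with no header row) can be enough to make the staged column a string for the whole load, even though each intended file loads fine when addressed by its explicit path.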
Hi @pavannarani ,
Please follow the example described in the official documentation to check if the settings are correct:
Copy and transform data in Azure Blob Storage - Azure Data Factory & Azure Synapse | Microsoft Learn
Hope it helps!
Best regards,
Community Support Team_ Scott Chang
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
Thank you so much. I went through the documentation you shared: the wildcard characters have to be passed correctly for the data to be fetched and loaded into the target system. I retested with the required folders and the specified characters, and it worked for me.
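For anyone hitting the same issue: per the linked documentation, in wildcardFolderPath and wildcardFileName `*` matches zero or more characters and `?` matches zero or one character (use `^` to escape when a folder or file name contains a literal wildcard character). A narrowed pattern along these lines, again with placeholder names, restricts the copy to only the intended files:

```json
{
  "storeSettings": {
    "type": "AzureBlobStorageReadSettings",
    "recursive": false,
    "wildcardFolderPath": "sales/2024",
    "wildcardFileName": "orders_*.csv"
  }
}
```

Scoping both the folder and the file-name pattern this way keeps files with a different schema out of the staged data, which avoids the DATETIME2-versus-string mismatch described above.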