Hello Community,
I have a requirement where all my raw data is in a Fabric lakehouse as CSV files, and I want to load all of this CSV data into a warehouse automatically. I tried a data pipeline, but it isn't working; the lakehouse seems to treat every file as parquet. So I thought of using Dataflow Gen2 instead, where I can make transformations like changing the ID columns to integer, and I added the warehouse as the destination.
I created the table structures prior to the above step, but when I run or refresh the Dataflow Gen2, it fails with the error below.
Error Code: TridentLakeClientUserException, Error Details: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Failed to insert a table., InnerException: Unable to create a table on the Lakehouse SQL catalog due to metadata refresh failure, for Lakehouse Id: f350ab35-644d-46d7-a4d8-d1b6ea8b1ae4 and Batch Id: f350ab35-644d-46d7-a4d8-d1b6ea8b1ae4@d42fd039-ddd1-4991-9080-a798617703d2$2025-01-28T11:03:24.1484081Z@82505453-61a6-4c40-a189-4fab7811ef6e. underlying error code: 'TridentLakeClientUserException', error: [error=[code=TridentLakeClientUserException,pbi.error=[code=TridentLakeClientUserException,parameters=[ErrorMessage=Internal error PBIServiceException.,HttpStatusCode=403],details={},exceptionCulprit=1]]], Underlying error: Unable to create a table on the Lakehouse SQL catalog due to metadata refresh failure, for Lakehouse Id: f350ab35-644d-46d7-a4d8-d1b6ea8b1ae4 and Batch Id: f350ab35-644d-46d7-a4d8-d1b6ea8b1ae4@d42fd039-ddd1-4991-9080-a798617703d2$2025-01-28T11:03:24.1484081Z@82505453-61a6-4c40-a189-4fab7811ef6e. underlying error code: 'TridentLakeClientUserException',
Has anyone faced this issue? Any help will be really appreciated.
Thanks in advance.
Hello @pavannarani
I would prefer data pipelines for this scenario, but we can discuss that later.
The error you’re experiencing is related to metadata refresh failure when attempting to create a table in the Lakehouse SQL catalog. This is a known issue that several users have reported.
The error suggests a problem with metadata synchronization between the Lakehouse and the SQL analytics endpoint. This can sometimes occur due to performance issues or delays in the synchronization process. A few things to try:
1. Use the 'Refresh Metadata' button on the SQL analytics endpoint.
2. Try disabling the "Enable staging" option in your Dataflow Gen2 settings.
3. Try creating a new workspace and recreating your dataflow there.
Could you also tell us what issues you are facing with the pipeline?
Please accept this answer if it is helpful.
@nilendraFabric Thanks for the response.
I tried a pipeline initially, but it is failing to load the data into the warehouse. As I understand it, the lakehouse treats files as parquet by default, so the copy fails with a varchar(8000) datatype error: it expects every column in the destination table to be varchar instead of the actual datatypes.
I will try the approach of using Dataflow Gen2 to load into the lakehouse instead of the warehouse and see how it goes.
Is there any way of loading directly into the warehouse? Please let me know.
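One common workaround for that varchar(8000) mismatch, if you do want to copy the CSV straight into the warehouse, is to land the raw text in a varchar staging table and cast afterwards in T-SQL. A minimal sketch, with hypothetical table and column names:

```sql
-- Typed target table (hypothetical schema).
CREATE TABLE dbo.Customer
(
    CustomerID   int,
    CustomerName varchar(200),
    SignupDate   date
);

-- Staging table matching what the Copy activity infers from the CSV:
-- every column lands as text.
CREATE TABLE dbo.Customer_stg
(
    CustomerID   varchar(8000),
    CustomerName varchar(8000),
    SignupDate   varchar(8000)
);

-- After the Copy activity loads the CSV into the staging table,
-- convert the rows into the typed table.
INSERT INTO dbo.Customer (CustomerID, CustomerName, SignupDate)
SELECT
    CAST(CustomerID   AS int),
    CAST(CustomerName AS varchar(200)),
    CAST(SignupDate   AS date)
FROM dbo.Customer_stg;
```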
Hi @pavannarani,
Thank you for reaching out in Microsoft Community Forum.
Please follow the steps below to load data into the warehouse:
1. Explicitly map and convert the column datatypes (e.g., varchar(8000)) to match the warehouse schema during transformation.
2. Use dataflows within the pipeline to handle the datatype conversions and schema mapping when moving data from the lakehouse to the warehouse.
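For step 1, a sketch only, with hypothetical names: if the lakehouse tables and the warehouse sit in the same workspace, the explicit conversion can also be written as a cross-database query from the warehouse against the lakehouse's SQL analytics endpoint, assuming cross-database querying is available there:

```sql
-- Hypothetical names; MyLakehouse is the lakehouse's SQL analytics
-- endpoint, referenced with three-part naming from the warehouse.
INSERT INTO dbo.Customer (CustomerID, CustomerName, SignupDate)
SELECT
    CAST(c.CustomerID   AS int),            -- explicit datatype mapping
    CAST(c.CustomerName AS varchar(200)),
    CAST(c.SignupDate   AS date)
FROM MyLakehouse.dbo.customer AS c;
```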
If you found this post helpful, please consider marking it as "Accept as Solution" so other members can find it more easily.
Regards,
Pavan.
@v-pbandela-msft Yes, thanks for the response.
Here is how I achieved this:
1. Lakehouse - Imported the raw data into the bronze layer: the CSV files go into a Bronze folder in the lakehouse Files section.
2. Warehouse - As a first step, created all the table structures in the Fabric warehouse with the appropriate datatypes.
3. Dataflow Gen2 - Created a connection to the source lakehouse holding the CSV files and, for each file, changed the column datatypes to match the structure created in the warehouse.
4. Refresh - Once done for all CSV files, published and refreshed the Dataflow Gen2, which loads all the CSV files into lakehouse tables as the silver layer.
5. Data pipeline - Created the data pipeline as below:
Dataflow activity - Refreshes the Dataflow Gen2 automatically every time for all CSV files and loads them into the lakehouse tables, so whenever the CSV files get updated with the latest data, the same is reflected in the lakehouse tables.
Lookup activity - I maintain the metadata of source (CSV) and target (warehouse table) names in one config table in the warehouse, and read that table here (see the sketch after this list).
ForEach activity - Uses the output of the previous Lookup activity, taking only the table names from the response.
Copy activity - Inside the ForEach activity, this copy activity uses the lakehouse tables as the source and the warehouse as the target. Make sure the source lakehouse table name and the target warehouse table name are the same; then it is easier to automate the process for each iteration. If not, you may have to customize the expression accordingly.
Notice that we already loaded the data into the lakehouse tables with the required structure, so we can configure all of this with no issues. I ran the pipeline, and it worked as expected.
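For reference, a sketch of what the config table behind the Lookup activity could look like; the table and column names here are hypothetical, not the exact ones used:

```sql
-- Config table in the warehouse driving the pipeline.
CREATE TABLE dbo.LoadConfig
(
    SourceTable varchar(128),   -- lakehouse (silver) table name
    TargetTable varchar(128)    -- warehouse table name
);

INSERT INTO dbo.LoadConfig
VALUES ('customer', 'customer'),
       ('orders',   'orders');

-- Query run by the Lookup activity:
SELECT SourceTable, TargetTable
FROM dbo.LoadConfig;

-- The ForEach activity then iterates over the Lookup output
-- (e.g. @activity('LookupConfig').output.value), and the Copy activity
-- reads @item().SourceTable / @item().TargetTable for its source and
-- sink table names.
```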
Thanks