I am working with a Dataflow Gen2 that imports CSV data into a data warehouse by creating a new table.
The dataflow publishes without issues, but it always fails on refresh with:
"104100 Couldn't refresh the entity because of an internal error"
I was initially getting an error related to data types, but since all of the columns are now set to text, those errors have stopped appearing.
This error message is pretty vague, and there doesn't seem to be any real solution to it. Is there any kind of workaround I can use to get the data into the data warehouse?
I got the same error code, but in my case it happens when using only one DFg2 to ingest data from an on-prem DB via the Power BI data gateway. I have a support ticket open already, and MS support is working on it.
There is a workaround: chain two DFg2s together. With that setup the error does not appear, and the data gets ingested as intended into a lakehouse I created specifically for this.
Thanks, this solved my issue. I'm guessing it has to do with the source and destination within the same DFg2 needing to use the same gateway. That seems kind of silly, since if you are transferring to a warehouse or similar destination, it obviously isn't always going to be on-prem.
Exactly, especially if you're migrating from an on-prem relational DBMS or a relational WH to a lakehouse in the Fabric cloud, which should really be done from a pipeline, not a dataflow. My understanding is that this capability is coming in 24Q1, is called OPGW (On-Prem GateWay), and will allow bulk copy from an on-prem source into a Fabric persistence layer. Can't wait.
Hello @ChrisM2091
Thanks for using the Fabric community.
Are you using a data gateway? If yes, do the gateway logs give you any more info?
I am in agreement; this error message is not at all useful.
Thanks
Himanshu