I am working with a Dataflow Gen2 that imports CSV data into a data warehouse by inserting a new table.
The dataflow publishes without issues, but it always fails on refresh with:
"104100 Couldn't refresh the entity because of an internal error"
I was initially getting errors related to data types, but since all of the columns are now set to text, those errors have stopped appearing.
This error message is pretty vague, and there doesn't seem to be any real solution for it. Is there any kind of workaround I can use to get the data into the data warehouse?
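For context, this is roughly what the query looks like after forcing every column to text (a minimal M sketch; the file path and CSV options are placeholders):

```
let
    // Hypothetical file path, delimiter, and encoding for illustration
    Source = Csv.Document(
        File.Contents("C:\data\import.csv"),
        [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.Csv]
    ),
    PromotedHeaders = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Force every column to text so no type conversion can fail at refresh time
    AllText = Table.TransformColumnTypes(
        PromotedHeaders,
        List.Transform(Table.ColumnNames(PromotedHeaders), each {_, type text})
    )
in
    AllText
```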
I got the same error code, but in my case it happens when using only one DFg2 to ingest data from an on-prem DB via the Power BI data gateway. I have a support ticket open already, and MS support is working on it.
There is a workaround: chain two DFg2s together. With that setup the error does not appear and the data gets ingested as intended into a lakehouse I created specifically for this (see the sketch below).
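For anyone who wants to reproduce the workaround, here is a minimal sketch of the two chained queries. The server, database, table names, and workspace/lakehouse IDs are all placeholders, and the Lakehouse navigation follows the M that Fabric typically generates, so treat it as an illustration rather than exact code.

```
// Dataflow Gen2 #1: pulls from the on-prem DB through the gateway.
// Its output destination (configured in the UI, not in M) is a staging lakehouse.
let
    Source = Sql.Database("onprem-sql01", "SalesDb"),  // placeholder server/database
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data]
in
    Orders
```

```
// Dataflow Gen2 #2: reads the staged table back out of the lakehouse.
// Its output destination is the final warehouse. IDs below are placeholders.
let
    Source = Lakehouse.Contents(null),
    Workspace = Source{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
    Staging = Workspace{[lakehouseId = "11111111-1111-1111-1111-111111111111"]}[Data],
    Orders = Staging{[Id = "Orders", ItemKind = "Table"]}[Data]
in
    Orders
```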
Thanks, this solved my issue. I'm guessing it has to do with the requirement that the source and the destination within the same DFg2 use the same gateway. That seems kind of silly, since if you are transferring to a warehouse or something, the destination obviously isn't always going to be on-prem.
Exactly, especially if you're migrating from an on-prem relational DBMS or a relational warehouse to a lakehouse in the Fabric cloud, which should really be done from a pipeline, not a dataflow. My understanding is that this capability, called OPGW (On-Prem GateWay), is coming in 24Q1 and will allow bulk copy from an on-prem source into a Fabric persistence layer. Can't wait.
Hello @ChrisM2091,
Thanks for using the Fabric community.
Are you using a data gateway? If yes, do the gateway logs give you any more info?
I agree that the error message itself is not at all useful.
Thanks,
Himanshu
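(For reference: the gateway logs can be exported from the on-premises data gateway app under Diagnostics > Export logs. By default they are written under the gateway service account's %LOCALAPPDATA%\Microsoft\On-premises data gateway folder, though the exact path can vary by installation.)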