
ChrisM2091
Frequent Visitor

104100 Couldn't refresh the entity because of an internal error

I am working with a Dataflow Gen2 that imports CSV data into a data warehouse by creating a new table.

The dataflow publishes without issues but it always fails on refresh with:
"104100 Couldn't refresh the entity because of an internal error"

I was initially getting an error related to data types, but since all of the columns are now set to text, those errors have stopped appearing.
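For reference, setting every column to text can be done in one step in Power Query M rather than per column. This is a minimal sketch; "Source" stands in for whatever your CSV source step is called:

```
// Sketch: coerce every column of the CSV source to text.
// "Source" is a placeholder for the preceding step in your query.
SetAllText = Table.TransformColumnTypes(
    Source,
    List.Transform(Table.ColumnNames(Source), each {_, type text})
)
```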

This error message is pretty vague, and there doesn't seem to be any real solution to it. Is there any kind of workaround I can use to get the data into the data warehouse?


1 ACCEPTED SOLUTION
Element115
Power Participant

I got the same error code, but in my case it happens when using only one DFg2 to ingest data from an on-prem DB via the Power BI data gateway. I have a support ticket open, and MS support is working on it.

 

There is a workaround: chain two DFg2s together. The error then does not appear, and the data gets ingested as intended into a lakehouse I created specifically for this.
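To illustrate the chained pattern: the first DFg2 lands the raw data in a staging lakehouse, and the second DFg2 reads that staged table and sets the warehouse as its destination. A rough sketch of the second dataflow's query is below; the workspace, lakehouse, and table names, and the navigation field names, are illustrative and may differ in your tenant:

```
// Sketch of the second DFg2's query: read the table the first DFg2 staged
// in the lakehouse, then configure the warehouse as the data destination
// in the dataflow UI. All names here are placeholders.
let
    Source = Lakehouse.Contents(null),
    Workspace = Source{[workspaceName = "MyWorkspace"]}[Data],
    Staging = Workspace{[lakehouseName = "StagingLakehouse"]}[Data],
    StagedTable = Staging{[Id = "StagedTable", ItemKind = "Table"]}[Data]
in
    StagedTable
```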


4 REPLIES

Thanks, this solved my issue. I'm guessing it has to do with needing both the source and destination within the same DFg2 to use the same gateway. That seems kind of silly, since if you are transferring to a warehouse or similar, the destination is obviously not always going to be on-prem.

Exactly, especially if you're migrating from an on-prem relational DBMS or a relational WH to a lakehouse in the Fabric cloud, which should be done from a pipeline, not a dataflow. My understanding is that this capability, called OPGW (On-Prem GateWay), is coming in 24Q1 and will allow bulk copy from an on-prem source into a Fabric persistence layer. Can't wait.

HimanshuS-msft
Community Support

Hello @ChrisM2091,
Thanks for using the Fabric community.
Are you using a data gateway? If yes, do the gateway logs give you any more info?

I am in agreement that this error message is not at all useful.

Thanks,
Himanshu
