ChrisM2091
Frequent Visitor

104100 Couldn't refresh the entity because of an internal error

I am working with a Dataflow Gen2 that imports CSV data into a data warehouse by inserting a new table.

The dataflow publishes without issues, but it always fails on refresh with:
"104100 Couldn't refresh the entity because of an internal error"

I was initially getting an error related to data types, but now that all of the columns are set to text, those errors have stopped appearing.

This error message is pretty vague, and there doesn't seem to be any real solution to it. Just wondering if there is any kind of workaround I can use to get the data into the data warehouse.
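For context, setting every column to text in a Dataflow Gen2 query can be done in a single Power Query M step. The sketch below is only an illustration of that approach; the file path, delimiter, and encoding are hypothetical placeholders, not details from the post above.

let
    // Load the raw CSV (path, delimiter, and encoding are placeholders)
    Source = Csv.Document(File.Contents("C:\data\import.csv"), [Delimiter = ",", Encoding = 65001]),
    // Promote the first row to column headers
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Force every column to type text so no type conversion happens on refresh
    AllText = Table.TransformColumnTypes(
        Promoted,
        List.Transform(Table.ColumnNames(Promoted), each {_, type text})
    )
in
    AllText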


1 ACCEPTED SOLUTION
Element115
Power Participant

I got the same error code, but in my case it happens when using only one Dataflow Gen2 (DFg2) to ingest data from an on-prem DB via the Power BI data gateway. I already have a support ticket open and MS support is working on it.

 

There is a workaround: chain two DFg2s together. With that setup the error does not appear and the data gets ingested as intended into a lakehouse I created specifically for this.
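To illustrate the chained setup (server, database, table, and GUIDs below are hypothetical placeholders): the first DFg2 only reads from the on-prem source through the gateway and stages the data into a lakehouse via its output destination, and the second DFg2 reads that staged table with the warehouse or lakehouse as its output destination. Output destinations are configured in the dataflow UI, not in M.

// Dataflow Gen2 #1: read from the on-prem SQL Server through the gateway.
// Its output destination (the staging lakehouse) is set in the UI.
let
    Source = Sql.Database("onprem-sql01", "SalesDB"),            // placeholder server/database
    Staged = Source{[Schema = "dbo", Item = "Orders"]}[Data]     // placeholder table
in
    Staged

// Dataflow Gen2 #2: read the staged table back from the lakehouse.
// The navigation steps are normally generated by the Lakehouse connector,
// so the record field names here are indicative only.
let
    Source = Lakehouse.Contents(null),
    Workspace = Source{[workspaceId = "<workspace-guid>"]}[Data],
    Staging = Workspace{[lakehouseId = "<lakehouse-guid>"]}[Data],
    Orders = Staging{[Id = "Orders", ItemKind = "Table"]}[Data]
in
    Orders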


4 REPLIES

ChrisM2091
Frequent Visitor

Thanks, this solved my issue. I'm guessing it has to do with both the source and the destination within the same DFg2 needing to use the same gateway. That seems kind of silly, since if you are transferring to a warehouse or something, the destination isn't always going to be on-prem.

Element115
Power Participant

Exactly, especially if you're migrating from an on-prem relational DBMS or a relational warehouse to a lakehouse in the Fabric cloud, which should really be done from a pipeline, not a dataflow. My understanding is that this capability, called OPGW (On-Prem GateWay), is coming in 24Q1 and will allow bulk copy from an on-prem source into a Fabric persistence layer. Can't wait.

HimanshuS-msft
Community Support

Hello @ChrisM2091 
Thanks for using the Fabric community.
Are you using a data gateway? If yes, do the gateway logs give you any more info?

I am in agreement that this error message is not at all useful.

Thanks
Himanshu
