Sanchana
Frequent Visitor

Dataflow Gen2: Couldn't refresh the entity because of an issue with the mashup document

I am trying to insert data into a Lakehouse using Dataflow Gen2. Upon publishing the dataflow, this is the error I am getting:

Error Code: 104100, Error Details: Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataFormat.Error: Failed to insert a table., InnerException: Invalid cell value '#{0}'., Underlying error: Invalid cell value '#N/A'. Details: Reason = DataFormat.Error;Message = Invalid cell value '#N/A'.;Message.Format = Invalid cell value '#{0}'.;Message.Parameters = {"#N/A"}

 

Please do let me know how to resolve this issue.

1 ACCEPTED SOLUTION
StephenBatich
Frequent Visitor

Disabling the staging tables resolved the issue for me. Right-click the query and uncheck 'Enable staging':

 

[Screenshot: StephenBatich_0-1706804565030.png]

 


21 REPLIES

Hi @Sanchana 

 

Just following up on the last response to see if you have found a resolution yet.
If you have, please share it with the community, as it can be helpful to others.
Otherwise, please reply with more details and we will try to help.


Thanks.

The solution I listed above with the screenshot resolved MY problem.  I consider it a solution, but will lean on the channel users to tell me if it resolved their issues.

frithjof_v
Community Champion

If you click Refresh preview inside the Dataflow Gen2, does the data preview seem to match the columns and data types of the Lakehouse table?

 

Do you find some invalid values/errors in some columns in the data preview?

 

For example, do some cells have the value #N/A but the column type is Number or Date?
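
If such values are the cause, they can be cleaned in the query itself before the type-conversion step, which is what otherwise fails on insert. Here is a minimal M sketch; the table and column names are made up for illustration:

let
    // Hypothetical sample: "Amount" arrives as text and contains Excel's "#N/A"
    Source = #table(
        type table [Id = Int64.Type, Amount = text],
        {{1, "10.5"}, {2, "#N/A"}, {3, "7.25"}}),
    // Replace the literal "#N/A" strings with null BEFORE changing the type,
    // so the conversion to number no longer produces error cells
    Cleaned = Table.ReplaceValue(Source, "#N/A", null, Replacer.ReplaceValue, {"Amount"}),
    Typed = Table.TransformColumnTypes(Cleaned, {{"Amount", type number}})
in
    Typed

If the column has already been typed and the bad cells show as errors rather than as "#N/A" text, Table.ReplaceErrorValues(Typed, {{"Amount", null}}) does the equivalent cleanup.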

Hi,

Thank you for your patience. 

By implementing your second and third suggestions I was able to resolve the #N/A issue. After resolving it, however, I am getting the following error, even though I installed the 64-bit version of the Access Database Engine.

[Screenshot: Screen Shot 2024-01-25 at 1.58.46 PM.png]

Your input would be greatly appreciated.

 

Regards,

Sanchana

I don't understand one thing.  Why is there suddenly an Access DB involved in all this?  I thought you said: 

 

source = Excel files in OneDrive

destination = Fabric lakehouse

 

Why now are you mentioning the Access DB?

To connect to a legacy workbook (such as .xls or .xlsb), the Access Database Engine OLEDB (or ACE) provider is required:

 

https://learn.microsoft.com/en-us/power-query/connectors/excel#prerequisites

 

ACE can't be installed in cloud service environments. So if you're seeing this error in a cloud host (such as Power Query Online), you'll need to use a gateway that has ACE installed to connect to the legacy Excel files.

 

 

(I have no experience with that, I just copy pasted the information from the webpage)
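
To make the prerequisite concrete: ACE only comes into play when the query opens a legacy-format workbook with the Excel connector, roughly like the sketch below (the file path is a made-up example). Reading an .xlsx with the same function does not need ACE:

let
    // Hypothetical path: opening a legacy .xls goes through the ACE provider,
    // so in Power Query Online this query must run via a gateway machine
    // that has the 64-bit Access Database Engine installed
    Source = Excel.Workbook(File.Contents("C:\Data\legacy.xls"), null, true),
    // Standard navigation step generated by the connector
    Sheet1 = Source{[Item = "Sheet1", Kind = "Sheet"]}[Data]
in
    Sheet1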

Aahh I see... that reminds me of something. If you have a data gateway installed on your local machine, then the Access DB that sits in the back, and that needs to be able to serve data outside the LAN (local area network), needs to have its TCP/IP named pipes configured (or at least that's what I vaguely remember from what we had to do with SQL Server; I am not a DBA, so this is not really my cup of tea). The firewall also needs to be configured like so:

 

 

Solution: Set new firewall rules on the server running the gateway
The firewall rules on the gateway server and/or the customer's proxy servers need to be updated to allow outbound traffic from the gateway server to the following:
Protocol: TCP
Endpoint: *.datawarehouse.pbidedicated.windows.net
Port: 1433

 

Perhaps this will fix the issue.

 

source:  On-premises data gateway considerations for data destinations in Dataflow Gen2 - Microsoft Fabric | ...

Re the firewall rule, I just came across a post that said that the endpoint has changed to 

 

Endpoint: *.datawarehouse.fabric.microsoft.com

 

Hi, I'm glad that the #N/A issue got resolved!

 

I don't know about the error message you are getting now. I haven't seen that error message. Hopefully someone else can assist.

 

Maybe a restart of the computer would help, if the error is related to some software install.

 

By the way, I found this article which describes the issue you are getting: https://learn.microsoft.com/en-us/power-query/connectors/excel

 

It seems that the error is because your file is an older file type (such as .xls or .xlsb).

 

If you try connecting Dataflow Gen2 to another Excel file of type .xlsx, or to a .csv file using the CSV connector, then I think you will not get that error.
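
As an illustration of the CSV route, a minimal M sketch looks like this (the file path and options are made-up examples):

let
    // Hypothetical file: Csv.Document avoids the ACE/legacy-Excel prerequisite entirely
    Source = Csv.Document(
        File.Contents("C:\Data\export.csv"),
        [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.Csv]),
    // First row of the file contains the column names
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true])
in
    Promoted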

Remove the staging - that fixed it for me:

 

[Screenshot: StephenBatich_1-1706804776783.png]

 

@StephenBatich How do you publish though if staging is disabled?  The Publish button disappears when staging is disabled.

I am not sure.  My data source was SQL instead of XLSX; that might have been the difference, but the error message was identical. Staging may be a requirement of file import vs SQL.  My destination was a Lakehouse in a premium-capacity Power BI environment in Fabric.

Something doesn't make sense.  I also ingest from an on-prem SQL Server.  But when I disable staging on a query, the Publish button is not available.  This is why I had to create a dummy query, i.e. an empty query, on which I enable staging; then I can publish without having any of the other queries add data to the default system warehouse/lakehouse that is automatically created by Fabric.  

 

The data source in this case should be irrelevant in terms of whether or not the Publish button becomes available.  Are you sure the Publish button is available in your GUI once you have disabled staging for all queries?

It is there, and works - but I am still getting errors on the Scheduled Refresh.  Very strange. Evidence below:

[Screenshot: StephenBatich_0-1706894352839.png]

 

There was a problem refreshing your dataflow

Your Dataflow Gen2 dataflow couldn’t be refreshed because there was a problem with one or more entities, or because dataflow capabilities were unavailable.

The next refresh for “Dataflow Gen2” is scheduled for February 3, 2024 9:00 UTC.

 

I need to rephrase all this for potential readers landing here in the future. In order for the first DFg2 to feed into the second DFg2 in the chain, the first DFg2 query needs to have Enable staging = ON. That is, if staging is not enabled, the Publish button does not appear IF a data destination has not been set. On the other hand, if you set a data destination, then you can turn off staging and the Publish button will still be visible. Fabric gets the data from the on-prem DB and stores it in the automatically provisioned lakehouse, DataflowStagingLakehouse.

 

I take it back.  If there is no data destination set, the Publish button will not appear.  In order to ingest data from an on-prem DB, the only workaround I know of atm is to use 2 DFg2 chained together, i.e. the first one connects to the DB but does not write to a data destination. You still need to publish this first DFg2, and therefore whatever final query it contains needs to be staging enabled.  

Then you use this first DFg2 as the source for the second DFg2, which will write to a data destination, a lakehouse in my case. This way the refresh error goes away.  All queries have staging disabled so as not to duplicate the data in the default system lakehouse that's created in the background.  However, by doing so, I just discovered that the tables of the model are not all available when you use Power BI Desktop to connect to this DFg2 when starting the report creation. So this means you must either enable staging for all queries whose output needs to be in the semantic model, or split the DFg2 and put each query into its own DFg2 to persist to a respective lakehouse table, or keep all queries inside the same DFg2 but assign a lakehouse as the data destination for each, which will persist them to your user-created lakehouse (and not the default one, if you have disabled staging).

Connect Power BI Desktop to these tables by choosing 'Get data --> Power BI semantic models' from a local instance of the Power Query Editor, which will create a local DF and semantic model that you then publish with the report.

 

Let me ask you re the refresh failure:  Are you ingesting from an on-prem DB using only one DFg2?

I am ingesting from local on-prem SQL, using the gateway, to a Lakehouse.  One table is replace and the other is append.  I am just standing this up as a test, as we are migrating to D365 FnO and I'm trying to figure out a good data warehousing location for reporting across our local SQL and a cloud environment combined into a single dataset.

I will try the DFg2 to DFg2 idea to see if that will fix the scheduled refresh issue. Thanks for the help. 

Here is the MS documentation explaining how to do it. Look for the section titled 'Workaround: Split dataflow in a separate ingest and load dataflow' on this page:

 

On-premises data gateway considerations for data destinations in Dataflow Gen2 - Microsoft Fabric | ...
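
For anyone curious what the second dataflow's source query looks like once it points at the first one, the Dataflows connector generates navigation steps roughly like this sketch (the workspace ID, dataflow ID, and table name are placeholders):

let
    // Connect to the dataflows environment
    Source = PowerPlatform.Dataflows(null),
    // Drill into the workspace that holds the first (ingest) DFg2
    Workspaces = Source{[Id = "Workspaces"]}[Data],
    Workspace = Workspaces{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
    // Drill into the ingest dataflow and pick the staged table
    Dataflow = Workspace{[dataflowId = "00000000-0000-0000-0000-000000000000"]}[Data],
    MyTable = Dataflow{[entity = "MyTable", version = ""]}[Data]
in
    MyTable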

 

Element115
Power Participant

What settings did you choose in the Data destination settings of this DF?

Can you show the M code?

Is your source an on-prem DB?

 

 
