HamidBee
Power Participant

Issue with Delta Table Overwrite in Lakehouse Once Connected to Power BI Report

Hello everyone,

 

I'm currently facing a challenge with a lakehouse setup in Microsoft Fabric and I'm hoping to get some insights or solutions from this knowledgeable community.

 

Context: I've created a lakehouse and within it, I set up a folder populated with Excel files. Using a Spark notebook, I perform PySpark transformations on these files. The primary operation is to append these files and then overwrite an existing Delta table. This process has been working flawlessly until a certain point.
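A minimal sketch of that kind of notebook cell (the sample data and the table name are hypothetical placeholders; in the real notebook the frames would come from `pd.read_excel` on the Files folder, since Spark has no built-in Excel reader):

```python
import pandas as pd

# Hypothetical samples standing in for two Excel files; in a Fabric notebook
# these would come from pd.read_excel(path) for each file in the folder.
file_a = pd.DataFrame({"id": [1, 2], "amount": [10.0, 20.0]})
file_b = pd.DataFrame({"id": [3], "amount": [30.0]})

# Append the files together into one frame.
combined = pd.concat([file_a, file_b], ignore_index=True)
print(len(combined))  # 3 rows

# In a Fabric notebook (where `spark` is predefined), the combined frame
# would then overwrite the existing Delta table:
#
#   df = spark.createDataFrame(combined)
#   (df.write
#      .format("delta")
#      .mode("overwrite")                  # replace the table contents
#      .option("overwriteSchema", "true")  # tolerate schema drift from new files
#      .saveAsTable("sales"))              # hypothetical table name
```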

 

Issue: The problem arises after I create a Power BI report. The process is as follows:

 

  1. I create a Power BI report by selecting 'Get Data' -> 'SQL Server', then inputting the SQL endpoint, database name, and a SQL query for DirectQuery to fetch data from the Delta table.
  2. I successfully create a visualization in Power BI and publish this report to my workspace.

However, after this step, whenever I add new Excel files to the designated folder and run my notebook script, the script executes successfully but fails to overwrite the Delta table. This issue only occurs once the Delta table is connected to a Power BI report or has a dependency on a Power BI report.

 

Attempts to Resolve:

  • I've ensured that the script runs successfully without any errors.
  • The issue only manifests after linking the Delta table with a Power BI report.

Seeking Help:

  • Has anyone encountered a similar issue with Delta tables in Lakehouse, especially in relation to Power BI dependencies?
  • Are there any known limitations or considerations when linking Delta tables to Power BI that might be causing this issue?
  • Could this be a permissions or locking issue caused by Power BI's connection to the Delta table?

Any insights, suggestions, or guidance would be immensely appreciated. I'm looking to understand the root cause of this issue and find a viable solution to ensure continuous data flow and updating in my lakehouse environment.

Thank you in advance for your time and help!

1 ACCEPTED SOLUTION
HamidBee
Power Participant

I turned on Direct Lake:

[screenshot: HamidBee_0-1704153373803.png]

and this fixed my issue. Strangely, this option was greyed out at first, and I was only able to switch it on once I had created a Power BI report built on the lakehouse data and published it to the workspace.

 


8 REPLIES
Hi @HamidBee, glad that worked out for you. If my post above helped, would you please mark it as a solution? Good luck with your ongoing progress!

Scott

Hi @HamidBee 

 

Glad that your query got resolved. Please continue using Fabric Community for any help regarding your queries.

HamidBee
Power Participant

From my investigation, the data only actually populates in the table when I click:

[screenshot: HamidBee_0-1703031798732.png]

and then hit the refresh button. This is so strange. Someone please help!

 

Hi @HamidBee 

 

Thanks for using Fabric Community.

 

Since you are converting the files and loading them into a Lakehouse, have you created a custom semantic model that they can reference to create the report and use Direct Lake mode?

 

Thanks

Hi. I'm not quite sure what you mean by "creating a custom semantic model that they can reference to create the report and use Direct Lake mode". What I did was obtain the SQL endpoint for the Lakehouse and connect to it directly from Power BI Desktop by typing a T-SQL query. Let me explain:

1. I went to the Lakehouse settings and copied the endpoint.

2. I opened Power BI Desktop, went to Get Data, and chose SQL Server database.

3. I pasted the endpoint and then entered the database name along with the T-SQL query. I chose DirectQuery as the connectivity mode.

Is this not the correct way? Thanks.
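For reference, the same connection the steps above set up in Power BI can be exercised from Python to confirm the endpoint serves fresh data (the server, database, and table names below are hypothetical placeholders; the real endpoint comes from the Lakehouse's SQL analytics endpoint settings):

```python
# Hypothetical values -- substitute your own Lakehouse SQL endpoint details.
endpoint = "xxxx.datawarehouse.fabric.microsoft.com"
database = "MyLakehouse"
query = "SELECT * FROM dbo.sales"  # hypothetical table

# ODBC connection string targeting the same endpoint the Power BI
# 'SQL Server' connector uses, with Entra ID interactive sign-in.
conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    f"Server={endpoint};Database={database};"
    "Authentication=ActiveDirectoryInteractive;Encrypt=yes;"
)
print(conn_str)

# With pyodbc installed, the rows Power BI fetches via DirectQuery can be
# pulled directly, which helps separate endpoint staleness from report issues:
#
#   import pyodbc
#   with pyodbc.connect(conn_str) as conn:
#       rows = conn.cursor().execute(query).fetchall()
```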

Doh, sorry, just saw that you were creating the semantic model in PBI desktop vs. in the service. Right now you can't author PBI desktop files that use direct lake mode. Instead, build the semantic model in the service, using the instructions found here: Prepare data for analysis and reporting - Training | Microsoft Learn

 

Hope this helps!

Scott

Hi @HamidBee, you'll definitely want to use Direct Lake mode instead of querying the tables using SQL. If you google Direct Lake and semantic model you should find plenty of help on how to do this (if not, ask and we'll try to assist).

 

Having said that - the way you're doing things "should" work. One question - when you say the data isn't changing - do you mean in the Power BI report? It seems like the data looks fine when you query the SQL endpoint. My guess is maybe (???) the semantic model isn't seeing the latest and greatest data due to the following setting?

 

[screenshot: Scott_Powell_0-1703104019852.png]

 

 

Hope this helps,

Scott
