Hello everyone,
I'm currently facing a challenge with a lakehouse setup in Microsoft Fabric and I'm hoping to get some insights or solutions from this knowledgeable community.
Context: I've created a lakehouse and within it, I set up a folder populated with Excel files. Using a Spark notebook, I perform PySpark transformations on these files. The primary operation is to append these files and then overwrite an existing Delta table. This process has been working flawlessly until a certain point.
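For reference, the notebook step looks roughly like the sketch below. It is a minimal sketch only, assuming the Excel files are read with pandas (Spark has no built-in Excel reader, so pandas plus openpyxl is a common route in Fabric notebooks); the folder path and the "sales_data" table name are placeholders, not my actual names:

```python
import os
import pandas as pd
from pyspark.sql import SparkSession

# Fabric notebooks predefine `spark`; getOrCreate() makes the sketch self-contained.
spark = SparkSession.builder.getOrCreate()

# Placeholder path: the Files area of the default lakehouse as mounted in a Fabric notebook.
folder = "/lakehouse/default/Files/excel_uploads"

# Read every Excel file in the folder with pandas (requires openpyxl) and append them.
frames = [
    pd.read_excel(os.path.join(folder, name))
    for name in os.listdir(folder)
    if name.endswith(".xlsx")
]
combined = pd.concat(frames, ignore_index=True)

# Convert to a Spark DataFrame and overwrite the existing Delta table.
spark_df = spark.createDataFrame(combined)
spark_df.write.format("delta").mode("overwrite").saveAsTable("sales_data")
```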
Issue: The problem arises after I create a Power BI report on top of the Delta table. Once that report exists, whenever I add new Excel files to the designated folder and rerun my notebook script, the script executes successfully but the Delta table no longer appears to be overwritten. This issue only occurs once the Delta table is connected to a Power BI report or has a dependency on one.
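To rule out a silent write failure, I can check the Delta table's transaction history right after the write; a successful overwrite should show up as a new table version. The table name below is the same placeholder as in the sketch above:

```python
# Inspect the Delta transaction log: a successful overwrite appears as a new
# version whose "operation" reflects the overwrite/write.
spark.sql("DESCRIBE HISTORY sales_data") \
    .select("version", "timestamp", "operation") \
    .show(5, truncate=False)

# Row count as a quick sanity check that the new files actually landed.
print(spark.table("sales_data").count())
```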
Attempts to Resolve:
Seeking Help:
Any insights, suggestions, or guidance would be immensely appreciated. I'm looking to understand the root cause of this issue and find a viable solution to ensure continuous data flow and updating in my lakehouse environment.
Thank you in advance for your time and help!
I turned on Direct Lake, and this fixed my issue. Strangely, the option was greyed out at first, and I was only able to switch it on once I had created a Power BI report built on the lakehouse data and published it to the workspace.
Hi @HamidBee, glad that worked out for you. If my post above helped, would you please mark it as a solution? Good luck with your ongoing progress!
Scott
Hi @HamidBee
Glad that your query got resolved. Please continue using Fabric Community for any help regarding your queries.
From my investigation, the data only actually populates in the table when I manually hit the refresh button. This is so strange. Someone please help!
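In case it helps narrow this down: if the refresh in question is the semantic model refresh, it can also be triggered programmatically. Below is a rough sketch using the Power BI REST API refresh endpoint; the workspace and dataset IDs are placeholders, and acquiring the Azure AD access token (for example via MSAL) is assumed rather than shown:

```python
import requests

# Placeholders: substitute the real workspace (group) and semantic model (dataset) IDs.
group_id = "<workspace-id>"
dataset_id = "<dataset-id>"
access_token = "<aad-access-token>"  # token acquisition (e.g. via MSAL) assumed, not shown

# Queue an on-demand refresh of the semantic model.
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/datasets/{dataset_id}/refreshes",
    headers={"Authorization": f"Bearer {access_token}"},
)
resp.raise_for_status()  # the API returns 202 Accepted once the refresh is queued
```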
Hi @HamidBee
Thanks for using Fabric Community.
Since you are converting the files and loading them into a Lakehouse, have you created a custom semantic model that the reports can reference, so that they use Direct Lake mode?
Thanks
Hi. I'm not quite sure what you mean by "creating a custom semantic model that they can reference to create the report and use Direct Lake mode". What I did was obtain the SQL endpoint for the Lakehouse and connect to it directly from Power BI Desktop by typing a T-SQL query. Let me explain:
1. I went to the Lakehouse settings and copied the endpoint.
2. I opened Power BI Desktop and went to Get Data, SQL database.
3. I pasted the endpoint, entered the database name along with the T-SQL query, and chose DirectQuery as the connectivity mode (a rough Python equivalent of this connection is sketched below).
Is this not the correct way? Thanks.
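For anyone reading along, the equivalent connection from a Python script would look roughly like this. It is a sketch only: the server string, database name, and table name are placeholders, and it assumes pyodbc with the Microsoft ODBC Driver 18 and Azure AD interactive sign-in:

```python
import pyodbc

# Placeholders: paste the SQL endpoint from the Lakehouse settings and your database name.
server = "<your-lakehouse-sql-endpoint>.datawarehouse.fabric.microsoft.com"
database = "<your-lakehouse-name>"

# Azure AD interactive sign-in via ODBC Driver 18; this opens a browser prompt.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    f"Server={server};Database={database};"
    "Authentication=ActiveDirectoryInteractive;Encrypt=yes;"
)

# The same kind of T-SQL used in Power BI Desktop; the table name is a placeholder.
for row in conn.execute("SELECT TOP 10 * FROM sales_data"):
    print(row)

conn.close()
```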
Doh, sorry, I just saw that you were creating the semantic model in Power BI Desktop rather than in the service. Right now you can't author Power BI Desktop files that use Direct Lake mode. Instead, build the semantic model in the service, using the instructions found here: Prepare data for analysis and reporting - Training | Microsoft Learn
Hope this helps!
Scott
Hi @HamidBee, you'll definitely want to use Direct Lake mode instead of querying the tables using SQL. If you google Direct Lake and semantic model you should find plenty of help on how to do this (if not, ask and we'll try to assist).
Having said that, the way you're doing things "should" work. One question: when you say the data isn't changing, do you mean in the Power BI report? It seems like the data looks fine when you query the SQL endpoint. My guess is maybe (???) the semantic model isn't seeing the latest and greatest data because of one of its refresh settings?
Hope this helps,
Scott