Hello everyone,
I’m experiencing an issue with a table in my Microsoft Fabric Lakehouse and I’m hoping someone can assist me.
Problem Description:
I have a table in my Lakehouse that shows data correctly when queried via the SQL endpoint or using dataflows. However, when I try to access the same table from a notebook, the DataFrame appears empty.
I was informed that this table was deleted and then reloaded with different data, but it’s strange that I cannot see the data in the notebook while it appears in other methods.
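For reference, this is roughly how I am reading the table in the notebook (the table name here is just a placeholder):

# Read the Lakehouse table into a Spark DataFrame (placeholder table name)
df = spark.read.table("my_lakehouse_table")
print(df.count())  # returns 0, even though the SQL endpoint shows rows
df.show()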
Steps I’ve Tried:
1. Refreshed the connections in the notebook and cleared the local cache.
2. Verified that the table schema is consistent across all access points.
3. Waited a few minutes and tried accessing the data again.
4. Checked the Delta table configuration to ensure it is set up correctly.
Any help or suggestions would be greatly appreciated. Thanks in advance!
Hi there,
It sounds like you're encountering a synchronization issue between your Microsoft Fabric Lakehouse table and the notebook interface. This kind of issue can occur due to inconsistencies in metadata or caching between the Delta table and the notebook's runtime.
Here are a few steps you could try to resolve the issue:
Use the SQL endpoint to check the table's metadata and schema:
DESCRIBE DETAIL your_table_name;
Compare the schema against what your notebook expects. If the table was deleted and reloaded, the metadata may not have updated properly.
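On the notebook side, something like this (table name is a placeholder) shows the schema your Spark session actually resolves, so you can compare it against the DESCRIBE DETAIL output:

# Print the schema that the notebook's Spark session resolves for the table
spark.read.table("your_table_name").printSchema()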
You can refresh the table metadata in your notebook, for example:
# Refresh the cached metadata for the table in the current Spark session
spark.catalog.refreshTable("your_table_name")
If your notebook is caching old metadata, try clearing the Spark cache:
spark.catalog.clearCache()
Confirm that the notebook is pointing to the correct Delta table path. If the table was reloaded, the path might have been updated, but your notebook could still be referencing the old location:
df = spark.read.format("delta").load("path_to_your_table")
df.show()
Look into the notebook's logs for errors related to access permissions or inconsistencies when reading the table.
If none of the above work, you might need to rebuild the table. This can involve dropping and re-creating it from the Delta table storage:
DROP TABLE IF EXISTS your_table_name;
CREATE TABLE your_table_name USING DELTA LOCATION 'path_to_your_table';
Hopefully, one of these steps helps resolve the issue. Let us know how it goes or if you encounter any specific errors!
Best regards,
Jaseph
Hi
I have a similar error: I have the correct number of rows in my SQL endpoint, but when I access the data from a notebook, there are only 698 rows in the table. When I go to the file view in the Lakehouse, there are only two files plus _delta_log, even though the table has been updated more times than that. I am only experiencing this error with one table; using the same code (with a different table name), I can access all the rows. The same error persists across different workspaces for the same table.
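For reference, this is a rough sketch of how I compare what the notebook reads against the table's Delta history (the path below is just a placeholder for the actual table location):

from delta.tables import DeltaTable

# Placeholder path to the affected Lakehouse table
table_path = "Tables/your_table_name"

# List the versions recorded in the Delta transaction log
dt = DeltaTable.forPath(spark, table_path)
dt.history().select("version", "timestamp", "operation", "operationMetrics").show(truncate=False)

# Count the rows Spark reads for the current version
print(spark.read.format("delta").load(table_path).count())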
It seems the tables have different names:
REPORTS_DailyReport
REPORT_DailyReport
Is that the reason for the different row count?
Possibly related topic: https://community.fabric.microsoft.com/t5/Data-Engineering/Duplicated-rows-between-notebook-and-SQL-...
In that case (and also in another thread: https://community.fabric.microsoft.com/t5/Data-Engineering/Duplicated-Rows-In-Tables-Built-By-Notebo...), the issue was not that the Notebook returned blank; rather, there was a discrepancy between the data displayed in the Notebook (which gave unexpected results) and the SQL Analytics Endpoint (which gave the expected results).
Do you get any error message when creating the dataframe (e.g. table missing, table name not found, or something like that)?
Or does it just return 0 rows?
Did you try querying the table using %%sql in the Notebook?
%%sql
SELECT * FROM tableName;
or
%%sql
DESCRIBE HISTORY tableName;
Hi,
Thank you for the links and your suggestion.
I do not receive any error messages when creating the DataFrame. There is no message indicating that the table is missing or that the table name was not found; it simply returns 0 rows.
I tried querying the table using %%sql in the Notebook with the following commands:
%%sql
SELECT * FROM tableName;
%%sql
DESCRIBE HISTORY tableName;
Both commands executed without errors, but the SELECT also returned 0 rows. The DESCRIBE HISTORY command showed a history consistent with recent modifications.
Hi @Alexis2024 ,
Can you provide some screenshots showing exactly how you are accessing the data in the different ways? Everything works fine in my tests.
Best Regards,
Adamk Kong
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Sure, here are some screenshots. You can see that the Notebook returns zero rows, but when I view the data, there are more than 1000 records.
From the pictures, it looks like the table is a shortcut table.
I wouldn't know if that is relevant to the issue you are experiencing.
Do you know what the source of the shortcut table is (i.e. the target of the shortcut)?
Anyway, you say that you see the data when you connect from Dataflow Gen2, so it seems quite clear that the data is actually there. It's just that the Notebook doesn't find the data - that is the issue here.
Have you tried refreshing the preview inside the Dataflow Gen2, or refreshing the entire Dataflow Gen2, just to double-check that the data is actually in the Lakehouse table?
Or have you clicked refresh in the user interface everywhere in the Lakehouse (at the Lakehouse level, table folder level, and table level) and in the SQL Analytics Endpoint? There are quite a few possible places to click refresh, which can be a bit confusing.
Yes, the table is indeed a shortcut table. I have verified that the source table contains the expected data.
I have also created a notebook reading directly from the source table, with the same result.
Just to confirm: the notebook doesn't find the data in the source table either?
However, other user interfaces do find the data in the source table.
So the issue seems to be at the source table: the notebook is not able to find the data, but other user interfaces are able to see it.
That’s correct, the Notebook doesn’t show the data either. However, when using the SQL endpoint or Dataflow, the data is displayed correctly.
Could you try to use a command like this in the Notebook:
%%sql
SELECT * FROM tableName VERSION AS OF 1
(You can also replace 1 with another number. Use DESCRIBE HISTORY to find out which versions of the table exist, and then try querying some of the most recent ones.)
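If you prefer PySpark over %%sql, a rough equivalent of the time-travel read looks like this (the path and version number are placeholders):

# Read a specific historical version of the Delta table (placeholder path and version)
df_v1 = spark.read.format("delta").option("versionAsOf", 1).load("path_to_your_table")
print(df_v1.count())
df_v1.show()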