My Lakehouse default dataset is stale and not refreshing. I have removed and re-added tables. I tried the refresh button from the SQL endpoint view. I have "updated" the default dataset. Nothing has worked and I don't see any of the traditional refresh options on the dataset settings.
The current data is actually available via the SQL endpoint; it's just not making it to the dataset or the reports. If I add a measure, it will show up in the dataset. I am also able to create a new dataset off of the same SQL endpoint, and it has the current data.
Since it is the default dataset, I am unable to delete it as well.
Any other suggestions or is the dataset just corrupted?
Thanks
I am able to create a new dataset (not default) from the lakehouse and it seems to work correctly.
Hi @Backroads4Me ,
Glad you are able to refresh the new dataset created from the Lakehouse.
As for the original issue you mentioned, we do not have access to monitor those details.
So I request you to raise a support ticket, as our engineering team will be able to help you better.
Please go ahead and raise a support ticket to reach our support team:
Hi @Backroads4Me ,
Did you get a chance to create a support ticket for the above query?
If yes, can you please share the details of the support ticket, as it would help us track it further.
Thank you.
Hi @Backroads4Me ,
We haven’t heard from you on the last response and were just checking back to see if you could provide the details requested above.
No. The SQL endpoint is updated, but the dataset is unusable. Existing reports no longer return any data at all. The dataset's refreshed-data timestamp shows that it's several weeks old. I am able to create a new report off of the dataset, but when I try to use the data in a visual I get this error:
Couldn't load the data for this visual
Unexpected parquet exception occurred. Class: 'ParquetStatusException' Status: 'IOError' Message: 'Encountered Azure error while accessing lake file, StatusCode = 404, ErrorCode = , Reason = Not Found'
Please try again later or contact support. If you contact support, please provide these details.
Activity ID: 6917e4bc-b1f3-47b8-93ff-8e8f4ecbea11
Request ID: 119ee5e1-99fb-0770-5b1d-134aea752a31
Correlation ID: 45064130-8f50-a055-ef04-56237af2629e
Time: Tue Oct 17 2023 11:41:14 GMT-0400 (Eastern Daylight Time)
Service version: 13.0.21778.91
Client version: 2310.2.16218-train
Cluster URI: https://wabi-us-north-central-g-primary-redirect.analysis.windows.net/
My understanding is that you still need to refresh (reframe) the scope of the Direct Lake dataset in order for it to be aware of which delta tables to scope in its model. So my guess would be that since you (for some reason) are not able to refresh the default dataset, it is actually referring to delta files (parquet files) that no longer exist on the data lake (perhaps you have run a vacuum, or otherwise removed files, since the last dataset refresh).
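To illustrate the kind of mismatch described above, here is a hypothetical sketch that walks a delta table's transaction log and reports files the log still considers live but that are missing from storage. It assumes a locally accessible copy of the table folder (`table_path` is a placeholder), and only handles the basic `add`/`remove` actions of the delta log format:

```python
import json
import pathlib

def missing_referenced_files(table_path: str) -> set[str]:
    """Return data files the delta log still references but that are gone.

    A vacuum (or manual deletion) that outpaces the dataset's awareness
    of the log would surface here as non-empty output.
    """
    table = pathlib.Path(table_path)
    referenced: set[str] = set()
    removed: set[str] = set()
    # Each commit file holds one JSON action per line
    for log_file in sorted((table / "_delta_log").glob("*.json")):
        for line in log_file.read_text().splitlines():
            action = json.loads(line)
            if "add" in action:
                referenced.add(action["add"]["path"])
            elif "remove" in action:
                removed.add(action["remove"]["path"])
    live = referenced - removed
    # Anything "live" per the log but absent on disk is a 404 waiting to happen
    return {p for p in live if not (table / p).exists()}
```

A non-empty result would be consistent with the 404 "lake file not found" error in the visual.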
Have you tried executing a refresh using the REST API? It might give you better insight into why it isn't working (via the error response):
https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/refresh-dataset
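As a minimal sketch of that REST call, the snippet below builds and sends a POST to the documented `Datasets - Refresh Dataset` endpoint using only the standard library. The `group_id`, `dataset_id`, and `token` values are placeholders you must supply yourself (the token needs a scope such as `Dataset.ReadWrite.All`):

```python
import json
import urllib.error
import urllib.request

def build_refresh_request(group_id: str, dataset_id: str, token: str) -> urllib.request.Request:
    """Build the POST request for the Refresh Dataset REST call."""
    url = ("https://api.powerbi.com/v1.0/myorg/"
           f"groups/{group_id}/datasets/{dataset_id}/refreshes")
    # notifyOption is optional; MailOnFailure emails you if the refresh fails
    body = json.dumps({"notifyOption": "MailOnFailure"}).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

def trigger_refresh(group_id: str, dataset_id: str, token: str) -> int:
    req = build_refresh_request(group_id, dataset_id, token)
    try:
        with urllib.request.urlopen(req) as resp:
            # 202 Accepted means the refresh was queued successfully
            return resp.status
    except urllib.error.HTTPError as err:
        # The error body is the useful part here: it often says *why*
        # the refresh is being rejected
        print(err.code, err.read().decode())
        raise
```

The point of doing it this way is the `HTTPError` body: a refresh that silently fails in the UI will usually return a descriptive error payload through the API.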
This blog sheds some light on direct lake datasets, and how refresh is supposed to work:
Hi @Backroads4Me ,
Thanks for using Fabric Community and reporting this.
I have reached the internal team for help regarding this. I will update once I hear back from them.
Appreciate your patience.
I have been encountering similar behavior.
Just last night I was trying to enter single data points into a delta-parquet table in a lakehouse. I enter the data point, no error occurs, but the table reports no entries in the UI. I try again, refresh the table, but no entries. I try a few more times scratching my head, still no entries, so I walk away. I come back 20 minutes later, refresh the table and five duplicate entries are reported.
I then try deleting the table, verify it was deleted, recreate it, add no entries and check its contents. The same five entries are still there.
I've repeated those steps and seen similar behavior multiple times. It seems like there's some kind of caching or state management issue happening, but this kind of unpredictable data state behavior makes Fabric untrustworthy.
Hi @Backroads4Me ,
Apologies for the delay in response. Are you able to refresh the dataset now?
Did your issue get resolved?
Please let me know if you have any further queries.