Hello,
I am following this tutorial:
https://www.sqlbi.com/blog/marco/2025/05/13/direct-lake-vs-import-vs-direct-lakeimport-fabric-semant...
and these steps:
Below is the high‑level flow I followed in the video. Adapt the data sources and naming to your environment.
Import for the dimension tables and DQ over AS for the fact table, which is correct.
While refreshing the model in Power BI Desktop I am getting:
But how do I get into Power Query for those Import tables in Power BI Desktop?
Hi @jaryszek ,
The error shown in the screenshot you shared previously could be caused by the privacy level settings configured in Power BI Desktop / the service.
You might want to check this similar post about the same issue:
Solved: Power BI Scheduler Refresh Fail : Collection was m... - Microsoft Fabric Community
Hope this helps!
If this still doesn't resolve your query, kindly post the error details you get when you perform the refresh so that we can assist you better.
Thank you for using Microsoft Community Forum
Hello!
I don't think you actually ended up with a Direct Lake + Import model. From the screenshot I can see that the fact table shows DQ over AS, not Direct Lake. When the model is DQ over AS (or when you edited a published model via XMLA), Power Query isn't available in Desktop for those tables, so you can't transform the data there.
You can do your transformations in Power Query before you publish.
After you publish, you should find two data source credentials for your model in the Service:
One for OneLake / Direct Lake (OAuth)
One for the SQL endpoint (Import dims)
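If it helps, below is a minimal sketch of what an Import-mode dimension query against the Lakehouse SQL analytics endpoint can look like in Power Query (M). The endpoint address, lakehouse, table and column names are placeholders for illustration, not values from your tenant:

```
let
    // SQL analytics endpoint of the Lakehouse (copy it from the Lakehouse settings page)
    Source = Sql.Database(
        "xxxxxxxx.datawarehouse.fabric.microsoft.com",   // placeholder endpoint address
        "MyLakehouse"                                     // placeholder lakehouse name
    ),
    // Navigate to the dimension table exposed through the SQL endpoint
    DimDate = Source{[Schema = "dbo", Item = "DimDate"]}[Data],
    // Regular Power Query transformations go here, e.g. keeping only the needed columns
    Result = Table.SelectColumns(DimDate, {"DateKey", "Date", "Year"})
in
    Result
```

Queries like this are the ones you can open and edit in Power Query in Desktop; the DQ over AS / Direct Lake tables will not show editable steps there.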
Thanks,
I did this like you suggested and it is not working.
When you go to the OneLake catalog:
1) Choose the lakehouse.
2) Connect to OneLake.
3) You will get your fact table inside.
But you cannot then go back to OneLake and choose Connect to SQL endpoint. There is only the option to connect to the OneLake tables once again.
This is why SQLBI created the workaround for it using Import tables.
So your answer does not work. You cannot do this in one semantic model.
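For context, this is how I understand the SQLBI workaround at the model level (just a sketch with placeholder names based on my reading of the article, not my actual model): the Direct Lake tables are bound to a shared expression that points at the SQL analytics endpoint, roughly like this in the Direct Lake over SQL endpoint flavor:

```
// Assumed shape of the shared expression used by the Direct Lake tables;
// the endpoint and lakehouse names are placeholders.
let
    database = Sql.Database(
        "xxxxxxxx.datawarehouse.fabric.microsoft.com",  // placeholder SQL analytics endpoint
        "MyLakehouse"                                    // placeholder lakehouse name
    )
in
    database
```

The fact table is attached to that expression as a Direct Lake partition (no editable Power Query steps), while each Import dimension is a separate M query like the one shown above. But this composition cannot be created through the OneLake catalog dialogs alone.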
Best,
Jacek