jaryszek
Impactful Individual

How to create a composite model with OneLake Flavour?

Hello,

I am following this tutorial:

https://www.sqlbi.com/blog/marco/2025/05/13/direct-lake-vs-import-vs-direct-lakeimport-fabric-semant... 

and these steps:

7 · Step‑by‑step demo recap

Below is the high‑level flow I followed in the video. Adapt the data sources and naming to your environment.

  1. Direct Lake stub
    • Connect to OneLake → pick Sales only → publish.
    • Add core measures (they reference only Sales).
  2. Bring in Import dimensions
    • Open the Import-only model built via the SQL endpoint.
    • Copy Product, Customer, Date, Store tables into the Direct Lake model with Tabular Editor.
    • Save changes.
  3. Credential mapping
    • In the service, create a connection (“Contoso–DL”) with OAuth credentials.
    • Map the SQL endpoint to that connection.
    • Wait until credentials propagate.
  4. First full refresh. Data is imported for dimensions, no waiting for the Sales table.
  5. Create regular relationships between Sales and the dimension keys. Save + quick metadata refresh (see the script sketch after this list).
  6. Test query – a DAX matrix using Product[Price Range] (calc column) plus Sales Amount aggregated from the fact.
    • Server timings show one Storage Engine query—no boundary crossing.
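
For step 5, the relationship creation can also be scripted. A minimal Tabular Editor C# sketch, assuming demo-style key column names (ProductKey, CustomerKey, DateKey, StoreKey) that may differ in your model:

```csharp
// Tabular Editor C# script: create regular relationships between the
// Direct Lake fact table and the Import dimensions (step 5 above).
// Table and key column names are assumptions - adapt them to your model.
var keys = new Dictionary<string, string>
{
    { "Product",  "ProductKey"  },
    { "Customer", "CustomerKey" },
    { "Date",     "DateKey"     },
    { "Store",    "StoreKey"    }
};

var sales = Model.Tables["Sales"];
foreach (var dim in keys)
{
    var rel = Model.AddRelationship();
    rel.FromColumn = sales.Columns[dim.Value];                 // many side (fact)
    rel.ToColumn   = Model.Tables[dim.Key].Columns[dim.Value]; // one side (dimension)
}
```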

I refreshed the model using Tabular Editor and now have everything in one composite model, but I still cannot use Power Query on the dimension tables...

Why? What am I missing?

In Tabular Editor I can see that the Import mode is correct:
[screenshot: jaryszek_0-1756811430650.png]

       

The dimension tables show Import, and the fact table shows DQ over AS, which is correct.
While refreshing the model in Power BI Desktop I am getting:

[screenshot: jaryszek_0-1756811704132.png]


But how do I get into Power Query for those Import tables in Power BI Desktop?


Has anybody tried it?
      Best,
      Jacek

1 ACCEPTED SOLUTION
v-sdhruv
Community Support

Hi @jaryszek ,

The error in the screenshot you shared previously could be caused by the privacy settings configured in Power BI Desktop or the service.
You might want to check a similar post about the same issue:
Solved: Power BI Scheduler Refresh Fail : Collection was m... - Microsoft Fabric Community
Hope this helps!
If this still doesn't resolve your issue, kindly post the error details you get when you perform the refresh so that we can assist you better.

Thank you for using Microsoft Community Forum


3 REPLIES

AmiraBedh
Super User

Hello!

I don't think you actually ended up with a Direct Lake + Import model. From the screenshot I can see that the fact table shows DQ over AS, not Direct Lake. When the model is DQ over AS (or when you have edited a published model via XMLA), Power Query isn't available in Desktop for those tables, which is why you can't transform the data.

You need to build the Direct Lake stub correctly: under Get data, go to the OneLake data hub, then Lakehouse, and select only your fact table.

The fact table's storage mode should be Direct Lake in Model view (if it says DQ over AS, you picked the autogenerated dataset by mistake).
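
If you are not sure which mode each table ended up in, a quick Tabular Editor script can tell you. A minimal sketch that prints every table's partition mode (a real Direct Lake table reports DirectLake, an Import table reports Import):

```csharp
// Tabular Editor C# script: list each table's storage mode so you can
// tell Direct Lake / Import tables apart from a DQ-over-AS model.
foreach (var t in Model.Tables)
{
    var modes = string.Join(", ", t.Partitions.Select(p => p.Mode.ToString()));
    Info(t.Name + " -> " + (modes.Length == 0 ? "(no local partitions)" : modes));
}
```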

Then, for the dimensions, under Get data choose SQL Server and point it to the Lakehouse SQL endpoint (or your dimension source), and don't forget that the data connectivity mode should be Import.

You can then do your transformations in Power Query.

After you publish, in the Service you should find two entries under the data source credentials for your model:

  • One for OneLake / Direct Lake (OAuth)

  • One for the SQL endpoint (Import dims)


Proud to be a Power BI Super User!

Microsoft Community : https://docs.microsoft.com/en-us/users/AmiraBedhiafi
Linkedin : https://www.linkedin.com/in/amira-bedhiafi/
StackOverflow : https://stackoverflow.com/users/9517769/amira-bedhiafi
C-Sharp Corner : https://www.c-sharpcorner.com/members/amira-bedhiafi
Power BI Community : https://community.powerbi.com/t5/user/viewprofilepage/user-id/332696
jaryszek
Impactful Individual

Thanks,

I did this like you suggested and it is not working.

When you go to the OneLake catalog:
1) Choose the lakehouse.
2) Connect to OneLake:

 

[screenshot: jaryszek_1-1756885522292.png]

 

3) You will get your fact table inside.

But you cannot then go to OneLake once again and choose Connect to SQL endpoint. The only option is to connect to OneLake tables again.
This is why SQLBI created the workaround of copying in Import tables.

So your answer does not work. You cannot do this in one semantic model.
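
For completeness: the SQLBI workaround (giving the dimension tables Import-mode M partitions inside the Direct Lake model) can be scripted in Tabular Editor. A rough sketch only; the SQL endpoint host, database, and table names are placeholders, and it assumes the AddMPartition helper of Tabular Editor's scripting API:

```csharp
// Tabular Editor C# script: turn the Product dimension into an Import
// table inside the Direct Lake model (the SQLBI workaround).
// The endpoint host, database, and table names are placeholders.
var product = Model.Tables["Product"];

// Remove the existing partition(s) first.
foreach (var p in product.Partitions.ToList()) p.Delete();

var mExpression =
    "let\n" +
    "    Source = Sql.Database(\"<sql-endpoint>.datawarehouse.fabric.microsoft.com\", \"MyLakehouse\"),\n" +
    "    dbo_Product = Source{[Schema=\"dbo\",Item=\"Product\"]}[Data]\n" +
    "in\n" +
    "    dbo_Product";

var partition = product.AddMPartition("Product", mExpression);
partition.Mode = ModeType.Import;
```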

Best,
Jacek

 
