imsarah
Regular Visitor

Bringing data (semantic models) from multiple reports in various workspaces to Fabric Lakehouse

hello

We have multiple reports published in different workspaces that are not Fabric-enabled. We need to get data from all of these reports and from other data sources (e.g. SharePoint, Excel) and load it into Fabric (e.g. into a Lakehouse). The underlying data in the reports gets updated frequently, but the schemas stay the same.

 

How can this be achieved? I tried to connect to the semantic models from Dataflow Gen2, a data pipeline, and a Lakehouse, with no luck.

 

I even enabled Fabric on one of these workspaces and turned on "OneLake integration" for the semantic model, but I still can't connect to it through the Lakehouse or Dataflow Gen2.

 

Any help would be appreciated!

1 ACCEPTED SOLUTION
andrewsommer
Super User

Power BI semantic models are analytical artifacts, not raw data stores. They don’t expose their data in a tabular/raw form via standard connectors like Lakehouse or Dataflows Gen2.

 

Enabling Fabric and OneLake Integration does let the model write to OneLake if explicitly configured, but semantic models cannot be queried like data tables from within Fabric-native components such as Lakehouses or Pipelines.
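
If you do go the OneLake integration route, the exported import-mode tables land in the model's OneLake folder as Delta files and can be read like any other Delta data. A minimal sketch from a Fabric notebook, assuming the default export layout of [model-name].SemanticModel/Tables/... (the bracketed names are placeholders, and the exact folder layout should be verified in the OneLake file explorer):

python
from notebookutils import mssparkutils

# Assumed OneLake location of the exported model tables (placeholders in brackets)
model_root = ("abfss://[workspace-name]@onelake.dfs.fabric.microsoft.com/"
              "[model-name].SemanticModel/Tables")

# List the Delta table folders that OneLake integration has written
for item in mssparkutils.fs.ls(model_root):
    print(item.name)

# Read one exported table as ordinary Delta data
df = spark.read.format("delta").load(f"{model_root}/[table-name]")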

 

Dataflow Gen2 does not natively support connecting to semantic models (even in the same workspace).

 

There is no built-in connector from Fabric Lakehouse to Power BI semantic models, despite OneLake integration.

 

Your transition to Fabric is the right time to rebuild your semantic models: land the source data (SharePoint, Excel, and the other feeds) in the Lakehouse first, then recreate the models on top of those tables instead of trying to extract data back out of the existing models.

 

Please mark this post as the solution if it helps you. Kudos are appreciated.


5 REPLIES
burakkaragoz
Community Champion

Hi

Based on your forum post, here is a solution for bringing data from multiple non-Fabric-enabled reports into a Fabric Lakehouse.

Connecting Multiple Semantic Models to Fabric Lakehouse

When bringing semantic models from different workspaces into Fabric Lakehouse, you need a systematic approach since direct connections to semantic models aren't well-supported across workspace boundaries. Here's a comprehensive solution:

Option 1: Use Direct Lake Mode with Power BI Datasets

  1. Enable Direct Lake connectivity for your Power BI datasets:
    • For each source workspace, upgrade to Fabric capacity
    • Enable OneLake integration for each semantic model
    • This creates a Direct Lake endpoint you can access
  2. Access via Spark:
    • In your Lakehouse, use Spark to read directly from the OneLake endpoints
    • Example code:
       
      python
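      # Read one of the Delta tables that OneLake integration exported for this model
      # (bracketed values are placeholders; check the exact folder layout in the OneLake file explorer)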
      df = spark.read.format("delta").load("abfss://[workspace-id]@onelake.dfs.fabric.microsoft.com/[dataset-id]/Tables/[table-name]")

Option 2: Use Dataflows Gen2 as Intermediary

Since you mentioned having trouble with Dataflow Gen2, here's how to make it work:

  1. Create a Dataflow Gen2 in your destination workspace
  2. Use Power Query to connect to each semantic model:
    • Add new source → Power BI dataset
    • Select the workspace and dataset
    • Select required tables
  3. Schedule refreshes to keep data current
  4. Reference the dataflow in your Lakehouse
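
To make step 4 concrete: if the dataflow's data destination is set to your Lakehouse, the landed table can be read from a notebook like any other Lakehouse table. A minimal sketch, assuming the dataflow writes a table named "sales" (a placeholder) to the default lakehouse attached to the notebook:

    python
    # Read the table that the Dataflow Gen2 destination landed in the attached lakehouse
    df = spark.read.table("sales")  # "sales" is a placeholder table name
    df.show(5)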

Option 3: Export Data Using APIs

  1. Use Power BI REST APIs to export data programmatically (see the Python sketch after this list):
    • Create a pipeline in Fabric Data Factory
    • Use the Power BI connector to extract data
    • Load directly to your Lakehouse
  2. Sample pipeline configuration:
    • Source: Power BI Dataset connector
    • Query: DAX or direct table reference
    • Sink: Lakehouse
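
For illustration, here is a minimal Python sketch of the REST route from step 1, using the datasets executeQueries endpoint. It assumes you already have an Azure AD access token with read access to the dataset; all bracketed values are placeholders:

    python
    import re
    import requests

    # Placeholders: supply your own IDs and a valid Azure AD access token
    workspace_id = "[workspace-id]"
    dataset_id = "[dataset-id]"
    access_token = "[aad-access-token]"

    # executeQueries runs a DAX query against the semantic model and returns rows as JSON
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
           f"/datasets/{dataset_id}/executeQueries")
    body = {"queries": [{"query": "EVALUATE 'TableName'"}]}  # TableName is a placeholder

    resp = requests.post(url, json=body,
                         headers={"Authorization": f"Bearer {access_token}"})
    resp.raise_for_status()
    rows = resp.json()["results"][0]["tables"][0]["rows"]

    # Column keys come back as 'TableName[Column]'; strip the wrapper before saving
    df = spark.createDataFrame(rows)
    for c in df.columns:
        df = df.withColumnRenamed(c, re.sub(r".*\[|\]", "", c))

    # Land the result in the Lakehouse (assumes a default lakehouse is attached)
    df.write.mode("overwrite").saveAsTable("[tablename]")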

Option 4: Use XMLA Endpoints (Most Reliable)

  1. Enable XMLA endpoints in Power BI Admin settings
  2. Connect via Spark:
     
    python
    # Install the pyodbc package first (in a Fabric notebook, e.g. %pip install pyodbc)
    import pyodbc

    # Direct connection to the workspace's XMLA endpoint
    conn = pyodbc.connect('Driver={ODBC Driver 17 for SQL Server};'
                          'Server=powerbi://api.powerbi.com/v1.0/[tenant]/[workspace];'
                          'Database=[dataset];'
                          'Trusted_Connection=yes;')

    # Execute the query and load the result into a Spark DataFrame
    query = "SELECT * FROM [TableName]"
    df = spark.read.format("jdbc").options(
        url="jdbc:sqlserver://[xmla-endpoint]",
        dbtable=f"({query}) as tmp",
        user="[username]",
        password="[password]"
    ).load()

    # Save to the lakehouse as a Delta table
    df.write.format("delta").mode("overwrite").save("/lakehouse/default/[tablename]")

The XMLA endpoint approach is likely your best option if you're experiencing connectivity issues with the other methods. It provides a direct SQL-like connection to your semantic models regardless of workspace boundaries.


Hi @imsarah ,

We haven’t heard back from you regarding our previous response and wanted to check if your issue has been resolved.

If it has, please consider clicking “Accept Answer” and “Yes” if you found the response helpful.
If you still have any questions or need further assistance, feel free to let us know — we're happy to help!

Thank you!

 

Hi @imsarah ,

If the response from the community member addressed your query, please mark it as the accepted answer and click Yes if you found it helpful.

Should you have any further questions, feel free to reach out.
Thank you for being a part of the Microsoft Fabric Community Forum!

