
vsahin
Helper I

Importing Microsoft Fabric Notebook Data into Power BI

Hello everyone,

 

I created a lakehouse in Fabric. I can access the files I uploaded via the Get data button, run my Python notebooks, and transform the data. But I cannot access these files from Power BI, so I can't automate my transform-and-reporting process.

Right now I have to upload data into the lakehouse, transform it in a notebook (Python), download the transformed data to my desktop, and upload it into my workspace.

 

When I click New Power BI Dataset, a pop-up appears but it is empty. That's why I'm doing all these manual steps.

 

Also, I currently can't add data that is already in my workspace into the lakehouse; I'm uploading it manually from my desktop.

 

My goal is to access my datasets, transform them in a notebook (Python), and use them for my Power BI reports.

 

Thanks 

 

fabric.png

1 ACCEPTED SOLUTION

I found the solution and want to share it step by step.

 

In Fabric, open your lakehouse and click Get data > Upload files.

Choose your Excel file.

 

vsahin_0-1691698617076.png

 

Once you upload your Excel file, you will see it under Files.

 

vsahin_1-1691698729357.png

 

 

At the top, click Open notebook and choose New notebook.

 

vsahin_2-1691698783431.png

 

Paste the code below into your notebook.

 

# 1. Load the Excel file into the lakehouse via Get data > Upload files.
# 2. Read the Excel file into a pandas dataframe; to get the path, open the
#    "..." menu next to the file under Files and copy the ABFS path.
import pandas as pd

df = pd.read_excel('abfss://ecf5a729-f8d9-417f-ae28-8ed1e9473ce3@onelake.dfs.fabric.microsoft.com/027295a6-aee0-4e87-a2dc-0f89fd816a27/Files/Test.xlsx')
# display(df)

# 3. Save the dataframe as parquet; pick the target path the same way.
#    pandas needs a parquet engine for this; if it is missing, run: %pip install pyarrow
df.to_parquet('abfss://ecf5a729-f8d9-417f-ae28-8ed1e9473ce3@onelake.dfs.fabric.microsoft.com/027295a6-aee0-4e87-a2dc-0f89fd816a27/Files/Test.parquet')

# 4. Read the parquet file back with Spark.
dfp = spark.read.parquet('abfss://ecf5a729-f8d9-417f-ae28-8ed1e9473ce3@onelake.dfs.fabric.microsoft.com/027295a6-aee0-4e87-a2dc-0f89fd816a27/Files/Test.parquet')

# 5. Write it as a Delta table. Then, in the lakehouse view (top right), switch
#    from Lakehouse to SQL endpoint to start querying the table with SQL.
dfp.write.mode("overwrite").format("delta").saveAsTable("N_G_Table")

 

vsahin_3-1691699064692.png

 

 

As explained and shown above, you need to change the paths to match your own workspace.

Once you have updated the paths, run the code.

 

Refresh your Tables folder as shown below.

vsahin_4-1691699205368.png

 

 

You should see the table below in your lakehouse.

 

vsahin_5-1691699263443.png

 

 

Open Power BI Desktop. You can choose Lakehouses from Get data, but that gave me an error at first; in that case, click OneLake data hub on the Home tab instead.

 

vsahin_6-1691699351760.png

 

Choose your lakehouse and click Connect.

 

vsahin_7-1691699482424.png

 

I created a basic chart to confirm the data has been loaded into Power BI.

vsahin_8-1691699688179.png

 

 

 


4 REPLIES 4
ibarrau
Super User

Hi. When you create a new dataset in Fabric, it only reads Tables. It looks like you only have Files. Think of Tables as the real lakehouse (like a metastore). Files can help you build a regular lake with layers, but at the end of the day, the place to store the data-model tables is Tables.

You can simply read a file into a Spark frame and run the following to store it as a table:

spark_frame.write.format("delta").mode("overwrite").save("Tables/[TableName]")

That will store it under Tables, and you will be able to pick it up with a new Power BI dataset.

Another thing: remember that "My Workspace" is only for you. Don't use it for things you want to collaborate on later; you can create shared workspaces for that 🙂

I hope that helps,


If this post helps, then please consider Accept it as the solution to help the other members find it more quickly.

Happy to help!

LaDataWeb Blog

 

Thanks,

I'm just running tests before I move to a shared workspace.

 

The xlsx files already contained tables. I'm not sure how to create Tables in the lakehouse. I copied the code you provided but it didn't work; I made some name changes and it still didn't work. I'm not sure which names I should change in the code you provided.

 

 

vsahin_0-1689691755564.png

 

Which files do you want to get from Power BI? Are they just a few xlsx files? I thought we were talking about many files in a lake.

I sent you an example. Of course it won't work unless you change the variables.

Let's see:

 

spark_frame.write.format("delta").mode("overwrite").save("Tables/[TableName]")

 

spark_frame is a variable: a frame I loaded from a file in the lake. You need to read your xlsx with PySpark to get your own frame. Store it in a variable, then run the line with your variable's name: your_variable.write....

Also note that the line ends with [TableName]. Replace that with your table name, e.g. "Sales".

If this Spark approach feels like too much, there are other options. If you just want to move an Excel file from your local machine to a table in the lakehouse, you can use Dataflow Gen2, which uses only Power Query to store it under Tables so you can use it later without much coding. I suggested notebooks because your message sounded like you manage and write them.

I hope this makes sense 🙂



