I want to create views in my Fabric warehouse that are sourced from lakehouse files (Parquet/JSON).
I want to do something similar to this but I'm not sure exactly how to write the query.
Hi @joakimfenno
I think you can try using an external table to load the Parquet data into the Lakehouse.
You can use the Spark API in your notebook to read the Parquet files and load them into the Lakehouse. For example:
# Read the Parquet files from their source location (e.g. an abfss:// path or a Files/ path)
df = spark.read.parquet("location to read from")
# Optional: save the DataFrame as Parquet files to the Files section of the default lakehouse
parquet_table_name = "my_parquet_folder"  # placeholder target folder name
df.write.mode("overwrite").format("parquet").save("Files/" + parquet_table_name)
You can view the link below for more details:
Load data into your lakehouse with a notebook - Microsoft Fabric | Microsoft Learn
Hope this helps you.
Regards,
Nono Chen
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @joakimfenno
Can you tell me if your problem has been solved? If yes, please accept the reply above as the solution.
Regards,
Nono Chen
Hi @joakimfenno
To create a view over a lakehouse file (Parquet/JSON) in the Fabric warehouse, you can follow these steps:
Confirm the file location. Make sure you have the URL for the Parquet or JSON file, for example an https://onelake... URL.
Connect to the Fabric warehouse. Connect to the warehouse using your database client or SQL tool.
Write the SQL query. Write a SQL query based on the file type to create the view. For example:
CREATE VIEW view_name AS
SELECT column1, column2, ...
FROM table_name
WHERE condition;
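To make that a bit more concrete: as far as I know, once the Parquet/JSON data has been landed as a table in the lakehouse, a warehouse in the same workspace can reference that table with three-part naming (lakehouse.schema.table) and wrap it in a view. A minimal sketch, where MyLakehouse, dbo.sales, and the column names are placeholders:
-- Placeholder names: MyLakehouse is the lakehouse, dbo.sales is a table exposed
-- by its SQL analytics endpoint, and the view is created in the warehouse.
CREATE VIEW dbo.vw_sales_from_lakehouse
AS
SELECT
    order_id,
    order_date,
    amount
FROM MyLakehouse.dbo.sales
WHERE order_date >= '2024-01-01';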
Query the SQL analytics endpoint or Warehouse - Microsoft Fabric | Microsoft Learn
Create reports - Microsoft Fabric | Microsoft Learn
Regards,
Nono Chen
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Thank you, but those links do not provide any information about how to query lakehouse files from the SQL endpoint.
Hi @joakimfenno
I misunderstood your question. Please allow me to answer again.
The documentation states:
What is the SQL analytics endpoint for a lakehouse? - Microsoft Fabric | Microsoft Learn
When saving data in the Lakehouse using capabilities such as Load to Tables, or the methods described in Options to get data into the Fabric Lakehouse, all data is saved in Delta format.
Lakehouse and Delta tables - Microsoft Fabric | Microsoft Learn
Write the SQL query as follows:
CREATE VIEW view_name AS
SELECT *
FROM table_name;
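As a slightly more concrete, hypothetical example: if the Parquet data was loaded as a Delta table named dbo.raw_orders via Load to Tables, the SQL analytics endpoint exposes it and you can layer a view on top, for example:
-- Placeholder names: dbo.raw_orders is a Delta table created with Load to Tables;
-- the view is created in the lakehouse's SQL analytics endpoint (or in the warehouse).
CREATE VIEW dbo.vw_orders_clean
AS
SELECT
    order_id,
    CAST(order_date AS date) AS order_date,
    amount
FROM dbo.raw_orders
WHERE amount IS NOT NULL;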
Regards,
Nono Chen
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
thanks!
It might be that I have to load the files into Delta tables before processing them into the next layer.
Could external tables be an option?