I have a notebook with Spark SQL queries, but I do not have a Lakehouse attached. When I try to run a Spark SQL query, I see the error "Spark SQL queries are only possible in the context of a lakehouse. Please attach a lakehouse to proceed."
Are there options to run this without attaching the Lakehouse?
Hi @Jy
In my experience, you can use Spark SQL without attaching a Lakehouse to the Notebook.
I did it by using the abfss path of my data to create a DataFrame.
Then I used this DataFrame as the source to create a temporary view.
Then I could run Spark SQL against the temporary view (see the sketch below).
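For reference, here is a minimal PySpark sketch of that approach. The workspace, lakehouse, table, and view names below are placeholders I made up, so swap in your own OneLake abfss path.

```python
# Minimal sketch: query a Delta table with Spark SQL, no Lakehouse attached.
# The workspace, lakehouse, and table names below are placeholders.
# In a Fabric notebook the SparkSession is already available as `spark`.

abfss_path = (
    "abfss://<workspace_name>@onelake.dfs.fabric.microsoft.com/"
    "<lakehouse_name>.Lakehouse/Tables/sales"
)

# Read the Delta table directly from OneLake via its abfss path.
df = spark.read.format("delta").load(abfss_path)

# Register the DataFrame as a temporary view.
df.createOrReplaceTempView("sales_vw")

# Spark SQL now resolves the temporary view without an attached Lakehouse.
spark.sql("SELECT COUNT(*) AS row_count FROM sales_vw").show()
```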
Hi @Jy,
Running Spark SQL queries in Fabric does require attaching a Lakehouse.
Unfortunately, there is no other option to run Spark SQL queries in Fabric notebooks without attaching a Lakehouse. The Lakehouse provides the necessary context and storage for executing these queries.
If you want to run Spark SQL queries, please attach a Lakehouse.
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!