When running Spark SQL in a Fabric notebook without a Lakehouse attached, the current error message is overly technical and not very helpful:
AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Spark SQL queries are only possible in the context of a lakehouse. Please attach a lakehouse to proceed.)
This could be improved with a more user-friendly message, such as:
"It looks like your notebook isn't connected to a Lakehouse. Please attach one using the Lakehouse dropdown at the top of the notebook to use Spark SQL features."
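Until the platform message improves, a notebook can translate the error itself. Below is a minimal sketch; the helper name `translate_error` and the substring it matches are illustrative assumptions, not part of any Fabric or Spark API. In practice you would wrap your `spark.sql(...)` call in a `try`/`except AnalysisException` and pass the exception text through this function.

```python
# Friendlier wording for the "no Lakehouse attached" failure.
FRIENDLY_MESSAGE = (
    "It looks like your notebook isn't connected to a Lakehouse. "
    "Please attach one using the Lakehouse dropdown at the top of the "
    "notebook to use Spark SQL features."
)

def translate_error(message: str) -> str:
    """Return a friendlier message when the error indicates a missing
    Lakehouse; otherwise pass the original message through unchanged.
    The matched substring comes from the error text shown above."""
    if "only possible in the context of a lakehouse" in message.lower():
        return FRIENDLY_MESSAGE
    return message
```

A caller might use it like `except AnalysisException as e: raise RuntimeError(translate_error(str(e)))`, so end users see the guidance instead of the raw Hive metastore trace.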
Bonus Feature Request:
It would be incredibly helpful to have a pre-run validation feature, similar to Power Automate's Flow Checker, that scans the notebook for common issues such as a missing Lakehouse attachment before any cell is executed.
This would help users catch issues before execution and improve the overall user experience.
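One way such a check could work today is by probing the Spark session with a harmless query before running the rest of the notebook. This is a hypothetical sketch, not a Fabric feature: `lakehouse_attached` and its `run_sql` parameter are names invented here for illustration. The probe is injected as a callable (e.g. `spark.sql`) so the logic stays testable outside a Spark session.

```python
def lakehouse_attached(run_sql) -> bool:
    """Best-effort pre-run check: returns True if a trivial Spark SQL
    statement succeeds, False if it raises (e.g. because no Lakehouse
    is attached). `run_sql` is any callable that executes a SQL string,
    such as spark.sql in a Fabric notebook."""
    try:
        run_sql("SHOW TABLES")
        return True
    except Exception:
        return False
```

In a notebook you would call `lakehouse_attached(spark.sql)` in the first cell and print the friendly guidance when it returns `False`, instead of letting later cells fail with the raw MetaException.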