I suspect I know the answer to this, but I'll ask anyway in case anyone has a good idea.
I want to use Fabric to do my data processing, then store the data and give external applications access to it.
I know I can use the SQL Server ODBC driver to query the data, but sometimes the data comes in formats that are supported in Spark but not by the ODBC driver. Is it possible to get access to the Spark context from outside of Fabric, so an application can run queries on the cluster the same way a notebook does? For context, a rough sketch of the ODBC path I'm using today is below.
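This is only a minimal sketch of the current ODBC approach; the server, database, and table names are placeholders, and the authentication method will vary by environment:

```python
# Sketch: querying a Fabric lakehouse's SQL analytics endpoint from an external
# app with pyodbc. Server, database, and table names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-sql-analytics-endpoint>.datawarehouse.fabric.microsoft.com;"  # placeholder
    "Database=<your-lakehouse>;"                                                # placeholder
    "Authentication=ActiveDirectoryInteractive;"  # auth method depends on your setup
    "Encrypt=yes;"
)

# Hypothetical table name, for illustration only.
for row in conn.execute("SELECT TOP 10 * FROM dbo.my_table"):
    print(row)
```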
Hi @Anonymous ,
My follow-up is just to ask whether the problem has been solved.
If so, could you accept the correct answer as a solution, or share your own solution, to help other members find it faster?
Thank you very much for your cooperation!
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution so other members can find it more quickly.
If I have misunderstood your needs, or you still have problems, please feel free to let us know. Thanks a lot!
Hi @Anonymous ,
In Fabric, you can do this with the Apache Livy API.
The setup, configuration, and usage of Livy are described in detail in the documentation, which you can refer to here:
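As an illustration, a minimal sketch of calling the Livy API from an external Python application might look like the following. The endpoint format, token scope, and workspace/lakehouse IDs are assumptions and placeholders, so please check the official documentation for the exact values in your tenant:

```python
# Sketch: submitting Spark code to Fabric from an external app via the Livy API.
# Endpoint shape, token scope, and IDs below are assumptions/placeholders.
import time
import requests
from azure.identity import InteractiveBrowserCredential

WORKSPACE_ID = "<workspace-id>"   # placeholder
LAKEHOUSE_ID = "<lakehouse-id>"   # placeholder

# Acquire a Microsoft Entra token for the Fabric API (scope is an assumption).
token = InteractiveBrowserCredential().get_token(
    "https://api.fabric.microsoft.com/.default"
).token
headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

# Assumed format of the lakehouse-scoped Livy endpoint.
livy_base = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/lakehouses/{LAKEHOUSE_ID}/livyapi/versions/2023-12-01"
)

# 1. Start an interactive PySpark session on the cluster.
session = requests.post(f"{livy_base}/sessions", headers=headers,
                        json={"kind": "pyspark"}).json()
session_url = f"{livy_base}/sessions/{session['id']}"

# 2. Wait for the session to become idle, then submit a statement.
while requests.get(session_url, headers=headers).json()["state"] != "idle":
    time.sleep(5)

stmt = requests.post(f"{session_url}/statements", headers=headers,
                     json={"code": "spark.sql('SELECT 1').show()"}).json()

# 3. Poll the statement until it finishes and print its output.
stmt_url = f"{session_url}/statements/{stmt['id']}"
while True:
    result = requests.get(stmt_url, headers=headers).json()
    if result["state"] in ("available", "error"):
        print(result.get("output"))
        break
    time.sleep(5)

# Clean up the session when done.
requests.delete(session_url, headers=headers)
```

The session/statement calls follow the standard Livy REST protocol, so the same pattern also works for batch jobs if you do not need an interactive session.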
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution so other members can find it more quickly.
If I have misunderstood your needs, or you still have problems, please feel free to let us know. Thanks a lot!