Hello Fabric Community,
I'm new to using Fabric and am exploring its capabilities. I understand that we can load data from Snowflake into a Fabric notebook via a Lakehouse.
I'm curious if it's possible to use a Python script in a Fabric notebook to directly load data from Snowflake into the notebook without involving a data lakehouse. In addition to direct data import from Snowflake, I'm also interested in whether data can be pushed back to Snowflake.
If direct data import/export to and from Snowflake isn't feasible with a Python script in a Fabric notebook, is it possible to accomplish this using Dataflow or Data Pipeline?
I would appreciate any suggestions and guidance.
Hi @Binyi_Zhang ,
Thanks for using Fabric Community.
I know we can read and write Snowflake data in PySpark; the approach is documented here: Read and write data from Snowflake - Azure Databricks | Microsoft Learn
Could you please try the same steps in your Fabric notebook?
I hope this information helps. If you have any further queries please do let us know.
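For reference, here is a minimal sketch of what those steps could look like in a Fabric PySpark notebook. It assumes the Snowflake Spark connector is available on the Spark pool, and all connection values (account URL, user, password, database, schema, warehouse, table names) are placeholders you would replace with your own:

```python
# Minimal sketch: read from and write back to Snowflake using the
# Snowflake Spark connector. `spark` is the session that Fabric
# notebooks provide automatically; all option values are placeholders.

sf_options = {
    "sfUrl": "<account_identifier>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfWarehouse": "<warehouse>",
}

# Load a Snowflake table into a Spark DataFrame
# (use .option("query", "<SQL>") instead of "dbtable" to run a query).
df = (
    spark.read.format("snowflake")
    .options(**sf_options)
    .option("dbtable", "MY_SOURCE_TABLE")
    .load()
)

df.show(5)

# Push a DataFrame back to a Snowflake table
(
    df.write.format("snowflake")
    .options(**sf_options)
    .option("dbtable", "MY_TARGET_TABLE")
    .mode("overwrite")
    .save()
)
```

If the Spark connector is not available in your environment, the pure-Python snowflake-connector-python package (installable in the notebook with %pip install) is another option for smaller result sets, and a Dataflow Gen2 or Data Pipeline with the Snowflake connector can copy data without any notebook code.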
Hi, apologies for my late reply.
Please allow me some time to go through the link and give it a try.
Thank you for your support.
Hi @Binyi_Zhang ,
We haven't heard back from you on the last response and were just checking in to see if your query was answered.
If not, please reply with more details and we will try to help.
Thanks
Hello @Binyi_Zhang ,
We haven't heard back from you on the last response and were just checking in to see if you have found a resolution yet.
If you have found a resolution, please share it with the community, as it can be helpful to others.
Otherwise, please reply with more details and we will try to help.
Thanks