Just started playing with Fabric and trying to get an idea of where it might sit within our organisation.
We have a traditional Azure SQL Database currently, and aren't sure whether to make the move just yet. In the meantime I'm trying to get a feel for what we might use Fabric for in the short term, or whether we might make the move more gradually.
One of the things I'm trying to understand is where Notebooks and data science functions fit in. I understand the use of Notebooks for ETL/ingestion etc., but I'm wondering more generally, if Notebooks are used for data science, what the options are for saving the outputs. I think I've figured out, for example, that I can save an output as a table in a Lakehouse.
Solved! Go to Solution.
Hi @GlassShark1
Thanks for using Fabric Community.
Fabric is a powerful platform that can be used for a variety of tasks, including data engineering, data science, and machine learning. Fabric Notebooks are a primary tool for developing Apache Spark jobs and machine learning experiments. They provide a web-based interactive surface used by data scientists and data engineers to write code, benefiting from rich visualizations and Markdown text.
How to use notebooks - Microsoft Fabric | Microsoft Learn
Regarding your questions about saving outputs:
Saving output as a table in a Lakehouse:
You’re correct that you can save an output as a table in a Lakehouse using the delta format. If you want the data in a different format, such as CSV, you can indeed change "delta" to "csv" in your code. Note, however, that when saving as CSV, Spark always writes to a folder, because it uses partitions to read and write files; the result is a folder of CSV part-files under the Lakehouse's Files area.
python - Write to a CSV file using Microsoft Fabric - Stack Overflow
Load data into your lakehouse with a notebook - Microsoft Fabric | Microsoft Learn
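If you want a single CSV file rather than Spark's folder of part-files, one common workaround is to collect the data to the driver and write it with plain Python or pandas. Below is a minimal standard-library sketch; the collect step and the `/lakehouse/default/Files` mount path are assumptions about a typical Fabric notebook, and a temp directory stands in for the Files area here:

```python
import csv
import os
import tempfile

# Hypothetical rows standing in for a notebook result set; in a Fabric
# notebook you might obtain them with spark_df.collect() or
# spark_df.toPandas() (assuming the result fits in driver memory).
rows = [["id", "name"], ["1", "alice"], ["2", "bob"]]

# A plain-Python (or pandas) write produces ONE file, unlike Spark's
# df.write.csv(), which creates a folder of part-files. In a Fabric
# notebook the attached Lakehouse's Files area is typically mounted at
# /lakehouse/default/Files; a temp directory stands in for it here.
out_path = os.path.join(tempfile.mkdtemp(), "output.csv")
with open(out_path, "w", newline="") as f:
    csv.writer(f).writerows(rows)
```

Keep in mind this only works when the result is small enough to fit in the driver's memory; for large outputs, Spark's partitioned folder write is the right tool.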
Writing the output back as a SQL view or table in Azure Database:
It’s possible to write the output back to a SQL view or table in your traditional Azure SQL Database. You can use the copy data activity in Fabric to achieve this.
https://learn.microsoft.com/en-us/fabric/data-factory/copy-data-activity
The supported destinations for the copy data activity are listed in the documentation linked above.
Downloading or saving outputs such as CSVs to local directories:
If you want to download the CSV files or any other files stored in the Lakehouse, you can use OneLake file explorer.
Access Fabric data locally with OneLake file explorer - Microsoft Fabric | Microsoft Learn
Download the latest version of OneLake file explorer and sync the Lakehouse.
You can also refer to this thread:
Solved: save data in excel - Microsoft Fabric Community
Hope this helps. Please let me know if you have any further questions.
Hi @GlassShark1
We haven't heard from you since the last response and just wanted to check whether your query has been resolved. If not, please reply with more details and we will try to help.
Thanks
Thanks for the quick and comprehensive reply.
I've accepted this as the solution, as I'm sure this is the case. However, I'm waiting on our org to decide whether they want to implement copy data back to Azure, and I'd need admin approval to download OneLake file explorer to get the data out of Fabric.
In the meantime, is there anything like SSMS where you could do a Select * query on the table and copy the entire table and paste?
Thanks again
Hi @GlassShark1
Glad that I could help in resolving the query. Can you please explain the ask further? I did not understand this:
In the meantime, is there anything like SSMS where you could do a Select * query on the table and copy the entire table and paste?
Thanks