Hi,
I'm able to read the files and folders within SharePoint using the Graph API. How can I write those files into my lakehouse? Can someone please help?
Thanks, I was able to download the file directly to the lakehouse using a Fabric notebook, without converting it into a dataframe. #solved
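For anyone looking for the same thing, here is a minimal sketch of that direct-download approach, assuming you already have a Graph API access token and the drive/item IDs (ACCESS_TOKEN, DRIVE_ID, ITEM_ID and the target file name below are placeholders, not values from this thread):

```python
import requests

# Placeholders -- replace with your own token and SharePoint identifiers
ACCESS_TOKEN = "<graph-api-access-token>"
DRIVE_ID = "<sharepoint-drive-id>"
ITEM_ID = "<file-item-id>"

# Graph API endpoint that returns the raw file content
url = f"https://graph.microsoft.com/v1.0/drives/{DRIVE_ID}/items/{ITEM_ID}/content"
resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()

# With a default lakehouse attached to the notebook, its Files section is
# mounted at /lakehouse/default/Files, so the bytes can be written straight there
with open("/lakehouse/default/Files/myfile.xlsx", "wb") as f:
    f.write(resp.content)
```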
Hi. You could probably do this from a Fabric notebook: get the data, put it into a Spark dataframe, and then store it in the lakehouse as CSV, Parquet, or Delta. When working in a Fabric notebook you can attach a lakehouse to make reading and writing easy. You can check here for examples: https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-notebook-load-data
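As a rough sketch of that route (the file path and table name are just examples, assuming the file has already been landed in the attached lakehouse's Files area):

```python
# In a Fabric notebook, `spark` is already available and relative paths
# like "Files/..." resolve against the attached default lakehouse.
df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("Files/sharepoint_export.csv"))

# Save it to the Tables section as a managed Delta table
df.write.mode("overwrite").format("delta").saveAsTable("sharepoint_data")
```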
On the other hand, it would be much easier to use a Dataflow Gen2 item in Fabric: connect to SharePoint, pick the file or files, and configure the lakehouse as the destination. The UI is very friendly.
I hope that helps,
Happy to help!