Hi,
I'm able to read the files and folders within SharePoint using the Graph API. How can I write those files into my lakehouse? Can someone please help?
Thanks, I was able to download the file directly into the lakehouse using a Fabric notebook without converting it into a dataframe. #solved
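For reference, a minimal sketch of that direct-download approach, assuming you already have a Graph access token and know the drive and item IDs (the token acquisition, `drive_id`, `item_id`, and file name below are placeholders, not values from this thread):

```python
# Minimal sketch: download a SharePoint file via the Graph API and save it
# straight into the attached lakehouse's Files area (no dataframe needed).
import requests

access_token = "<your-graph-access-token>"   # assumption: obtained elsewhere (e.g. via MSAL)
drive_id = "<sharepoint-drive-id>"           # assumption: from a prior Graph call
item_id = "<file-item-id>"                   # assumption: from a prior Graph call

url = f"https://graph.microsoft.com/v1.0/drives/{drive_id}/items/{item_id}/content"
resp = requests.get(url, headers={"Authorization": f"Bearer {access_token}"})
resp.raise_for_status()

# With a lakehouse attached to the notebook, its Files section is mounted at
# /lakehouse/default/Files, so a plain binary write lands in the lakehouse.
with open("/lakehouse/default/Files/my_sharepoint_file.xlsx", "wb") as f:
    f.write(resp.content)
```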
Hi. You could probably do it from a Fabric notebook: get the data, put it into a Spark dataframe, and then store it in the lakehouse as CSV, Parquet, or Delta. When working in a Fabric notebook you can attach a lakehouse to make it easy to read and write. See the sketch below, and check here for examples: https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-notebook-load-data
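A short sketch of that dataframe route, assuming the file has already landed in the attached lakehouse's Files area (the file name and table name are placeholders; `spark` is the session that Fabric notebooks provide automatically):

```python
# Load a CSV from the attached lakehouse's Files area into a Spark dataframe,
# then save it back to the lakehouse as a Delta table.
df = (
    spark.read.format("csv")
    .option("header", "true")
    .load("Files/my_sharepoint_file.csv")   # relative Files/ path resolves to the attached lakehouse
)

df.write.mode("overwrite").format("delta").saveAsTable("my_sharepoint_table")
```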
On the other hand, it would be much easier to use a Dataflow Gen2 item in Fabric to connect to SharePoint, pick the file or files, and configure the lakehouse as the destination. The UI is very friendly.
I hope that helps,
Happy to help!