DiKi-I
Post Partisan

Download SharePoint files to lakehouse

Hi,
I'm able to read the files and folders within SharePoint using the Graph API. How can I write those files into my lakehouse? Can someone please help?

import requests

# Replace with your actual values
tenant_id = ""
client_id = ""
client_secret = ""
sharepoint_url = "xxxx.sharepoint.com"  # hostname only, no trailing slash
site_name = "ADFPOC"

# Step 1: Get OAuth2 token
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
token_data = {
    "grant_type": "client_credentials",
    "client_id": client_id,
    "client_secret": client_secret,
    "resource": "https://graph.microsoft.com"
}
response = requests.post(token_url, data=token_data)
response.raise_for_status()
access_token = response.json()["access_token"]

# Step 2: Test SharePoint Site ID lookup
headers = {
    "Authorization": f"Bearer {access_token}"
}
site_info_url = f"https://graph.microsoft.com/v1.0/sites/{sharepoint_url}:/sites/{site_name}?$select=id"

response = requests.get(site_info_url, headers=headers)
response.raise_for_status()
site_id = response.json()["id"]

print("Successfully connected to SharePoint.")
print(f"Site ID: {site_id}")
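A possible continuation (a minimal sketch, not confirmed by the thread): with the site ID in hand, you can request a drive item's raw content from Graph and write the bytes into the attached lakehouse's Files area, which Fabric notebooks mount locally at `/lakehouse/default/Files`. The item path and helper names below are hypothetical; it assumes `site_id` and `headers` from the snippet above.

```python
import os
import requests

def graph_item_content_url(site_id: str, item_path: str) -> str:
    """Build the Graph v1.0 URL that returns a drive item's raw bytes."""
    return (f"https://graph.microsoft.com/v1.0/sites/{site_id}"
            f"/drive/root:/{item_path}:/content")

def download_to_lakehouse(site_id: str, item_path: str, headers: dict,
                          files_root: str = "/lakehouse/default/Files") -> str:
    """Stream a SharePoint file into the attached lakehouse's Files area."""
    resp = requests.get(graph_item_content_url(site_id, item_path),
                        headers=headers, stream=True)
    resp.raise_for_status()
    target = os.path.join(files_root, os.path.basename(item_path))
    with open(target, "wb") as f:
        # Write in 1 MiB chunks so large files don't sit fully in memory.
        for chunk in resp.iter_content(chunk_size=1 << 20):
            f.write(chunk)
    return target

# Usage, continuing from the snippet above (path is hypothetical):
# download_to_lakehouse(site_id, "Shared Documents/report.xlsx", headers)
```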


------------------------------------------------------------
1 ACCEPTED SOLUTION
DiKi-I
Post Partisan

Thanks, I was able to download the file directly to the lakehouse using a Fabric notebook, without converting it into a dataframe. #solved


2 REPLIES
ibarrau
Super User

Hi. You could probably do it from a Fabric notebook: get the data, put it in a Spark dataframe, and then store it in the lakehouse as CSV, Parquet, or Delta. When working in a Fabric notebook you can attach a lakehouse to make it easy to read and write. You can check here for examples: https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-notebook-load-data
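The dataframe route described above can be sketched as follows (a minimal sketch assuming a Fabric notebook with a lakehouse attached, where `spark` is the session the notebook provides; the file path, table name, and helper names are hypothetical):

```python
def reader_format_for(path: str) -> str:
    """Pick a Spark reader format from the file extension (csv fallback)."""
    ext = path.rsplit(".", 1)[-1].lower()
    return {"csv": "csv", "json": "json", "parquet": "parquet"}.get(ext, "csv")

def land_as_delta(spark, file_path: str, table_name: str) -> None:
    """Read a file from the lakehouse Files area and save it as a Delta table."""
    fmt = reader_format_for(file_path)
    df = (spark.read.format(fmt)
          .option("header", "true")  # only meaningful for csv; ignored otherwise
          .load(file_path))
    df.write.format("delta").mode("overwrite").saveAsTable(table_name)

# Usage in a notebook cell (names are hypothetical):
# land_as_delta(spark, "Files/report.csv", "sharepoint_report")
```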

On the other hand, it would be much easier to use a Dataflow Gen2 item in Fabric to connect to SharePoint, pick the file or files, and configure the lakehouse destination. The UI is very friendly.

I hope that helps,


If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Happy to help!

LaDataWeb Blog
