
DiKi-I
Post Patron

Download SharePoint files to lakehouse

Hi,
I'm able to read the files and folders within SharePoint using the Graph API. How can I write those files into my lakehouse? Can someone please help?

import requests

# Replace with your actual values
tenant_id = ""
client_id = ""
client_secret = ""
sharepoint_url = "xxxx.sharepoint.com"  # host only -- a trailing slash breaks the site lookup URL below
site_name = "ADFPOC"

# Step 1: Get OAuth2 token
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
token_data = {
    "grant_type": "client_credentials",
    "client_id": client_id,
    "client_secret": client_secret,
    "resource": "https://graph.microsoft.com"
}
response = requests.post(token_url, data=token_data)
response.raise_for_status()
access_token = response.json()["access_token"]

# Step 2: Test SharePoint Site ID lookup
headers = {
    "Authorization": f"Bearer {access_token}"
}
site_info_url = f"https://graph.microsoft.com/v1.0/sites/{sharepoint_url}:/sites/{site_name}?$select=id"

response = requests.get(site_info_url, headers=headers)
response.raise_for_status()
site_id = response.json()["id"]

print("Successfully connected to SharePoint.")
print(f"Site ID: {site_id}")
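Once you have the site ID, the next step is to enumerate the files. A hedged sketch continuing the snippet above, assuming the files live in the site's default document library (the `/sites/{site-id}/drive/root/children` Graph endpoint); the helper names `build_children_url` and `list_children` are mine:

```python
# Step 3 (sketch): list files in the site's default document library.

def build_children_url(site_id: str, folder_path: str = "") -> str:
    """Graph URL that lists the children of a folder (drive root if empty)."""
    base = f"https://graph.microsoft.com/v1.0/sites/{site_id}/drive"
    if folder_path:
        return f"{base}/root:/{folder_path}:/children"
    return f"{base}/root/children"

def list_children(session_headers: dict, site_id: str, folder_path: str = ""):
    """Return the 'value' array of drive items (files and folders)."""
    import requests  # same library the snippet above already uses
    resp = requests.get(build_children_url(site_id, folder_path),
                        headers=session_headers)
    resp.raise_for_status()
    return resp.json().get("value", [])
```

With the `headers` and `site_id` from the code above, `list_children(headers, site_id)` returns one dict per item; file items carry a `name` and a pre-authenticated `@microsoft.graph.downloadUrl`, while folders carry a `folder` facet instead.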


------------------------------------------------------------
1 ACCEPTED SOLUTION
DiKi-I
Post Patron

Thanks, I was able to download the file directly to the lakehouse using a Fabric notebook, without converting it into a dataframe. #solved
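A hedged sketch of that approach, assuming the notebook has a default lakehouse attached (Fabric mounts it locally, so files land under `/lakehouse/default/Files`) and that `item` is one file entry from a Graph `/drive/root/children` response; the helper names are mine:

```python
import os

# Mount point of the attached default lakehouse in a Fabric notebook.
LAKEHOUSE_FILES = "/lakehouse/default/Files"

def lakehouse_target(file_name: str, subfolder: str = "sharepoint") -> str:
    """Local path in the attached lakehouse where the file will be written."""
    return os.path.join(LAKEHOUSE_FILES, subfolder, file_name)

def download_to_lakehouse(item: dict, subfolder: str = "sharepoint") -> str:
    """Stream a Graph drive item's content straight into lakehouse Files,
    with no dataframe in between. `item` must be a file, not a folder."""
    import requests
    target = lakehouse_target(item["name"], subfolder)
    os.makedirs(os.path.dirname(target), exist_ok=True)
    # '@microsoft.graph.downloadUrl' is a short-lived, pre-authenticated URL,
    # so no Authorization header is needed for this request.
    with requests.get(item["@microsoft.graph.downloadUrl"], stream=True) as resp:
        resp.raise_for_status()
        with open(target, "wb") as out:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                out.write(chunk)
    return target
```

Anything written under the mount shows up in the lakehouse Files area, so no explicit upload call is needed.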


2 REPLIES

ibarrau
Super User

Hi. You could probably do it from a Fabric notebook: get the data, put it into a Spark dataframe, and then store it in the lakehouse as CSV, Parquet, or Delta. When working in a Fabric notebook you can attach a lakehouse to make it easy to read and write. You can check here for examples: https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-notebook-load-data
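A minimal sketch of that route for a CSV file, assuming the `spark` session that Fabric notebooks provide; the helper names are mine, and `write_as_delta` only runs inside Fabric/Spark:

```python
import csv
import io

def csv_bytes_to_rows(data: bytes, encoding: str = "utf-8"):
    """Parse raw CSV bytes (e.g. a file downloaded from SharePoint)
    into a header list and a list of row tuples."""
    reader = csv.reader(io.StringIO(data.decode(encoding)))
    rows = list(reader)
    return rows[0], [tuple(r) for r in rows[1:]]

def write_as_delta(data: bytes, table_name: str):
    """Build a Spark dataframe from the parsed rows and save it as a
    Delta table in the attached lakehouse. In a Fabric notebook the
    `spark` session is predefined."""
    header, rows = csv_bytes_to_rows(data)
    df = spark.createDataFrame(rows, schema=header)  # noqa: F821 -- Fabric provides `spark`
    df.write.format("delta").mode("overwrite").saveAsTable(table_name)
```

All columns come out as strings with this sketch; in practice you would cast them, or let `spark.read.csv` infer the schema after landing the raw file in Files first.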

On the other hand, it would be much easier to use a Dataflow Gen2 item in Fabric to connect to SharePoint, pick the file or files, and configure the lakehouse destination. The UI is very friendly.

I hope that helps,


If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Happy to help!

LaDataWeb Blog
