JatinSaini
Regular Visitor

[URGENT] Get Data from SharePoint Folder Dynamically

Hi Developers,

I am fairly new to the Fabric platform and was recently working on a new requirement; I hope you can help me solve it:

My users are going to add files on a monthly basis to a SharePoint folder, following this naming convention:
FileName_Jan, FileName_Feb, FileName_Mar, and so on.

For the first month there is only a single file, 'FileName_Jan'. I want to create an end-to-end process where Fabric gets the data (either via Dataflow Gen2 or a Data Pipeline), cleans and transforms it, and then loads it into the Warehouse.

The catch is that the data ingestion has to be dynamic every time, since the file name changes. I hope this gives you an idea of what I am looking for.

Thanks in advance.

1 ACCEPTED SOLUTION
NandanHegde
Super User

There are two ways:

Assuming you use a data pipeline, you can use the SharePoint REST API to copy the file from SharePoint into a lakehouse staging zone:

https://learn.microsoft.com/en-us/azure/data-factory/connector-sharepoint-online-list?tabs=data-fact...

Note: Though the link is for ADF/Synapse, the data pipeline concept remains the same.

You can parameterize the file name as per your convenience.
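The file-name parameterization above can be sketched as follows. This is a minimal illustration, not the exact pipeline expression syntax: the site URL, folder path, and `FileName` prefix are hypothetical placeholders, and authentication is omitted.

```python
from datetime import datetime
from typing import Optional

def monthly_file_name(prefix: str = "FileName", when: Optional[datetime] = None) -> str:
    """Build the month-suffixed file name, e.g. 'FileName_Jan' for January."""
    when = when or datetime.utcnow()
    return f"{prefix}_{when.strftime('%b')}"

def sharepoint_file_url(site_url: str, folder: str, file_name: str) -> str:
    """SharePoint REST endpoint that returns a file's raw bytes ($value)."""
    return (
        f"{site_url}/_api/web/GetFolderByServerRelativeUrl('{folder}')"
        f"/Files('{file_name}')/$value"
    )

# Example: resolve this month's file and the URL a pipeline Web/Copy
# activity (or a notebook) would call to stage it into the lakehouse.
name = monthly_file_name()
url = sharepoint_file_url(
    "https://contoso.sharepoint.com/sites/Finance",   # placeholder site
    "Shared Documents/MonthlyUploads",                 # placeholder folder
    name,
)
```

In an actual pipeline you would compute the month suffix with a pipeline expression and pass it into the copy activity as a parameter; the Python above just shows the string construction.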

 

If you want a dataflow instead, please follow the link below as a reference:

https://community.fabric.microsoft.com/t5/Desktop/Load-Query-Based-on-Dynamic-File-Name-SharePoint/t...




----------------------------------------------------------------------------------------------
Nandan Hegde (MSFT Data MVP)
LinkedIn Profile : www.linkedin.com/in/nandan-hegde-4a195a66
GitHUB Profile : https://github.com/NandanHegde15
Twitter Profile : @nandan_hegde15
MSFT MVP Profile : https://mvp.microsoft.com/en-US/MVP/profile/8977819f-95fb-ed11-8f6d-000d3a560942
Topmate : https://topmate.io/nandan_hegde
Blog :https://datasharkx.wordpress.com


2 REPLIES 2
v-pnaroju-msft
Community Support

Hi @JatinSaini,

We appreciate your inquiry through the Microsoft Fabric Community Forum.

 

Please follow the steps mentioned below to dynamically ingest, cleanse, transform, and load data from SharePoint into a Data Warehouse using Microsoft Fabric:

  1. Store all the files in a specific SharePoint folder following the naming convention FileName_Month.

  2. Open Dataflow Gen2 in Microsoft Fabric and connect to the SharePoint folder by providing all the necessary details, and select the folder where the files are stored.

  3. To dynamically handle the changing file names, use Power Query M code to parameterise the file path based on the current month:
    a) Add a custom column in Power Query to generate the file path dynamically.
    b) Use expressions like DateTime.ToText(DateTime.LocalNow(), "MMM") to fetch the current month and construct the file name dynamically. Please refer to the sample code below:

    let
        CurrentMonth = DateTime.ToText(DateTime.LocalNow(), "MMM"),
        FileName = "FileName_" & CurrentMonth
    in
        FileName

  4. Use the Power Query editor to clean and transform the data as required. Save and close the dataflow.

  5. Create a new Data Pipeline in Microsoft Fabric to orchestrate the data ingestion process. Add an activity to ingest data from the Dataflow Gen2 created in the previous steps.

  6. Add transformation activities to perform any additional processing required on the data before loading it into the Data Warehouse.

  7. Add an activity in the Data Pipeline to load the cleaned and transformed data into the Data Warehouse. Use the COPY statement for efficient, high-throughput data ingestion.

  8. Schedule the Data Pipeline to run on a monthly basis or as per the file addition schedule. Set up triggers within the pipeline to ensure it executes automatically whenever new files are added to the SharePoint folder.
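The trigger check in step 8 can be sketched as below. This assumes the SharePoint folder listing is requested with the header `Accept: application/json;odata=nometadata` (so the response body has a top-level `value` array); the folder contents shown are purely illustrative.

```python
from typing import Optional

def find_monthly_file(folder_listing: dict, month_abbrev: str,
                      prefix: str = "FileName") -> Optional[str]:
    """Given the JSON body of a SharePoint REST folder listing
    (GET .../GetFolderByServerRelativeUrl('<folder>')/Files), return the
    name of the file for the given month, or None if it has not been
    uploaded yet — in which case the pipeline run can be skipped."""
    target = f"{prefix}_{month_abbrev}"
    for entry in folder_listing.get("value", []):
        if entry.get("Name", "").startswith(target):
            return entry["Name"]
    return None

# Illustrative listing, shaped like the REST response:
listing = {"value": [{"Name": "FileName_Jan.xlsx"},
                     {"Name": "FileName_Feb.xlsx"}]}
```

A pipeline could run this kind of check on a schedule (e.g. via a Web activity plus an If Condition, or a notebook) and only proceed with ingestion when the current month's file is present.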

If you find this response helpful, kindly mark it as the accepted solution and provide kudos to assist other members with similar queries.

 

Best regards,
Pavan

