aiwithchirag
Frequent Visitor

How to access SharePoint list attachments in Fabric

Hi,
 

If I have a SharePoint list with multiple attachments on different (or the same) row items, how can I access them using Fabric? For example, by creating a Dataflow or something similar, so that I can store them in a Lakehouse.

frithjof_v
Community Champion

What is the file format of the attachments? (I.e. is it csv, json, excel, image files, word, etc.?)

 

I guess a Data pipeline, Notebook, or Dataflow Gen2 could be relevant, depending on the file format of the SharePoint attachments, and whether you want to store them as tables or as files in the Lakehouse.

 

Is this a one-time job, or something you want to run automatically, e.g. every day?

(Will the number of rows and number of attachments increase?)

 

I would look for SharePoint list connector or SharePoint folder connector.

Or I would try the REST API or a Web activity.

 

Depending on the tool you wish/need to use, I guess there are different ways of accessing the attachments (browsing, or using a folder path or file path).

@frithjof_v These will be financial files in Excel or CSV format. There can be multiple files within each row item of the list, and this needs to be refreshed daily. The data within an attachment can change, and an attachment itself can be replaced as per the requirement. Is it really possible to read list items in Fabric? If yes, please guide me on how.


 

Unfortunately, I don't have the time to test it now. Hopefully someone else can help.

 

However, my main questions, to hopefully guide toward a solution, would be:

 

What will be the final goal (reason why you want to get the files into Fabric Lakehouse)? Are you planning to make one (or more) Power BI reports based on the file content?

 

Do you want to combine multiple SharePoint attachment files into one table in the Fabric Lakehouse?

Or do you want to save each file as a separate table in the Lakehouse?

 

Or do you want to save each file as a separate file in the Lakehouse? (Purely copy the files from SharePoint to Fabric.)

 

Do all the files have similar content inside (identical column names, etc.)?

Or does each file have a different layout inside?

 

 

Depending on the tool (dataflow gen2, data pipeline, notebook) I guess there can be different ways to access attachment files. 

However the choice of method and tool depends on what you want to achieve as the final product.

 

E.g. with the Dataflow Gen2 tool, I think you could use one of the SharePoint connectors (list or folder):

https://learn.microsoft.com/en-us/fabric/data-factory/dataflow-support#s

 

By using the SharePoint Online list connector, I think it should be possible to navigate to find the attachments. You could maybe also create a custom function to find the attachments for each list item, if necessary.

You could probably also use the SharePoint Folder connector to open the folder which contains all the attachments. I'm not sure what the exact path to the attachments folder is, but you would need to find it; it is probably something like the site URL combined with .../Lists/<ListName>/Attachments/

 

https://learn.microsoft.com/en-us/fabric/data-factory/connector-sharepoint-folder

https://learn.microsoft.com/en-us/power-query/connectors/sharepoint-folder#determine-the-site-url

 

https://learn.microsoft.com/en-us/power-query/connectors/folder

https://learn.microsoft.com/en-us/power-query/combine-files-overview

 

For now, I just want to read this attachment (.csv or Excel) and store it in the Lakehouse:

 


I'm not able to do it through Dataflow Gen2. Any ideas on this?

 

I did a test on my side, and it seems to be more complicated than I thought to get the attachments.

 

However I was able to get the attachment file by using the Web API connector. The URL was like this:

https://<domain>.sharepoint.com/sites/<sitename>/Lists/<listname>/Attachments/<itemnumber>/<filename>

 

I found this part: /sites/<sitename>/Lists/<listname>/Attachments/<itemnumber>/<filename>

by using the SharePoint Online List connector in Dataflow Gen2.

First, I opened the SharePoint Online List connector and inserted https://<domain>.sharepoint.com/sites/<sitename>/ as the Site URL.

To find this part (https://<domain>.sharepoint.com/sites/<sitename>/), see the instructions on this page: Power Query SharePoint Online list connector - Power Query | Microsoft Learn


 

 

Then, in the Dataflow Gen2, I navigated to the relevant SharePoint Online list.

 

Then, I chose to expand the AttachmentFiles column.


 

After expanding the AttachmentFiles column, I could access the ServerRelativeUrl column, which contains this information for each attachment: 

/sites/<sitename>/Lists/<listname>/Attachments/<itemnumber>/<filename>

 

Then I used the Web API connector to get the attachment file content. 

I used this information as the URL in the Web API connector:

https://<domain>.sharepoint.com/sites/<sitename>/Lists/<listname>/Attachments/<itemnumber>/<filename>  

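For reference, the same AttachmentFiles information can also be pulled programmatically, which becomes useful if you later move to a Notebook. Below is a minimal, untested sketch (Python, using the requests library) that lists every attachment's ServerRelativeUrl via the SharePoint REST API. The site URL and the list name ("Financials") are placeholders, and an access token for the SharePoint resource is assumed to be available (for example from the service principal discussed later in this thread):

```python
# Untested sketch: enumerate all list item attachments via the SharePoint REST API.
# Placeholders: site_url, list_name, access_token.
import requests

site_url = "https://<domain>.sharepoint.com/sites/<sitename>"   # placeholder
list_name = "Financials"                                        # placeholder list name
access_token = "<access token for the SharePoint resource>"     # placeholder

headers = {
    "Authorization": f"Bearer {access_token}",
    "Accept": "application/json;odata=nometadata",
}

# $expand=AttachmentFiles returns FileName and ServerRelativeUrl for each attachment,
# i.e. the same columns that appear when expanding AttachmentFiles in Dataflow Gen2.
# (Paging via 'odata.nextLink' is not handled in this sketch.)
resp = requests.get(
    f"{site_url}/_api/web/lists/getbytitle('{list_name}')/items"
    "?$select=Id,Title,AttachmentFiles&$expand=AttachmentFiles",
    headers=headers,
)
resp.raise_for_status()

attachments = []
for item in resp.json()["value"]:
    for att in item.get("AttachmentFiles", []):
        attachments.append((item["Id"], att["FileName"], att["ServerRelativeUrl"]))

for row in attachments:
    # e.g. (1, 'report.xlsx', '/sites/<sitename>/Lists/<listname>/Attachments/1/report.xlsx')
    print(row)
```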

 

Thanks for the response!

I was able to read the file by manually providing the ServerRelativeUrl using the Web API connector.

But now the real challenge is: how can I automate this? There will be a lot of files within each list item, and I can't copy the URL for each attachment manually. Is there any way around this?

(I want to keep all the files separately.)

If you want to keep the files separate, and the number of attachments may increase dynamically, I think you will need to use a Data pipeline or a Notebook in order to loop through the dynamic list of attachments. (If the number of attachments may grow over time, and you want the attachments kept as separate files, I don't think Dataflow Gen2 can handle that.)

 

We can find the dynamic list of attachments by using the SharePoint Online List connector which we have already used before.

 

The next step is to loop through the list of attachments, get the contents of each file, and store each one as a file copy or as a table in the Lakehouse.

Unfortunately I don't have the time to test it now.
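For illustration, an untested sketch of that loop could look like the following, reusing the attachments list and headers from the earlier REST sketch. It assumes a default Lakehouse is attached to the notebook, so its Files area is mounted at /lakehouse/default/Files, and it creates one subfolder per list item:

```python
# Untested sketch: download each attachment and save it under a per-item folder
# in the Lakehouse Files area. Assumes `attachments` and `headers` from the earlier sketch.
import os
from urllib.parse import quote

import requests

sharepoint_root = "https://<domain>.sharepoint.com"                   # placeholder
lakehouse_files = "/lakehouse/default/Files/sharepoint_attachments"   # assumes a default Lakehouse is attached

for item_id, file_name, server_relative_url in attachments:
    # One folder per list item.
    target_dir = os.path.join(lakehouse_files, f"item_{item_id}")
    os.makedirs(target_dir, exist_ok=True)

    # Plain authenticated GET of the attachment URL -- the same URL pattern
    # the Web API connector used earlier in this thread. quote() handles
    # spaces and special characters in file names.
    file_resp = requests.get(sharepoint_root + quote(server_relative_url), headers=headers)
    file_resp.raise_for_status()

    with open(os.path.join(target_dir, file_name), "wb") as f:
        f.write(file_resp.content)
```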

 

Will the contents of the attachments have the same layout in each file (same column names, same worksheet name inside the workbook, etc.)?

Or will each file have a different layout inside?

Thanks again for clarifying and responding!

 

The file content may change and the column names may change; it's basically not fixed.

 

In a list like 'Financials', I can have, let's say, some financial reports which need to be approved or something like that, but there can be other files too, so the format is not fixed. The basic idea is to fetch all the files attached to each list item and store them separately somewhere.

To answer your question: yes, sometimes the files will have a different layout and sometimes it can be the same.

ACCEPTED SOLUTION

I see. If I understand correctly, you basically just want to store a copy of the files somewhere.

 

I guess you can use Notebook or Data Pipeline to copy the files into a files folder in a Lakehouse.

 

Or you could automatically create a folder for each list item and insert the files which are attached to that item.

 

I don't know the code for doing that, and unfortunately I don't have the time to test that now, but it sounds like something which is doable.

 

---

 

It seems you may need to create an app registration (service principal) in Azure, give this service principal access to the SharePoint list, and then use the service principal to authenticate to the SharePoint list attachments from either a Notebook or a Data pipeline.
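As a rough illustration of that authentication step, here is a minimal, untested sketch using MSAL for Python. All IDs and paths are placeholders; in practice the certificate and key should come from something like Azure Key Vault rather than a local file. Note that SharePoint Online app-only REST calls generally require certificate credentials; a plain client secret is usually rejected by SharePoint:

```python
# Untested sketch: acquire an app-only token for SharePoint with a service principal.
# Placeholders: tenant_id, client_id, domain, certificate thumbprint and key path.
import msal

tenant_id = "<tenant-id>"                                  # placeholder
client_id = "<app-registration-client-id>"                 # placeholder
sharepoint_resource = "https://<domain>.sharepoint.com"    # placeholder

app = msal.ConfidentialClientApplication(
    client_id,
    authority=f"https://login.microsoftonline.com/{tenant_id}",
    client_credential={
        "private_key": open("/path/to/private_key.pem").read(),  # placeholder; prefer Key Vault
        "thumbprint": "<certificate-thumbprint>",                 # placeholder
    },
)

result = app.acquire_token_for_client(scopes=[f"{sharepoint_resource}/.default"])
if "access_token" not in result:
    raise RuntimeError(result.get("error_description"))

access_token = result["access_token"]   # use as the Bearer token in the other sketches
```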

 

Perhaps this documentation will help:

Working with folders and files with REST | Microsoft Learn

 

I am guessing you would use the GET method to get each attachment file from SharePoint via the API, and then have the Notebook or Data pipeline save the file in a folder in the Fabric Lakehouse.
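For a single file, an untested sketch of that GET-and-save step could look like this. It uses the GetFileByServerRelativeUrl endpoint from the linked REST documentation and reuses the site_url, headers, and mounted-Lakehouse assumptions from the earlier sketches; the server-relative path and output file name are placeholders:

```python
# Untested sketch: fetch one attachment with GetFileByServerRelativeUrl and save it
# into the Lakehouse Files area. Placeholders: server_relative_url and the output file name.
import requests

server_relative_url = "/sites/<sitename>/Lists/<listname>/Attachments/<itemnumber>/<filename>"

resp = requests.get(
    f"{site_url}/_api/web/GetFileByServerRelativeUrl('{server_relative_url}')/$value",
    headers=headers,
)
resp.raise_for_status()

with open("/lakehouse/default/Files/sharepoint_attachments/<filename>", "wb") as f:
    f.write(resp.content)
```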

 

---

 

If a low-code approach is preferred, I am thinking Power Automate could be an alternative way of extracting the attachments from the list:

How to store all attachments from a SharePoint list to one folder? - Microsoft Q&A

Save Attachments from Existing SharePoint List to Document Library and Create Individual folders - M...

 

But to my knowledge there are no Fabric connectors in Power Automate. So if the aim is to store the files in Fabric, I think it would need to use some API: Solved: Automate file movement lakehouse > onderive - Microsoft Fabric Community

If going ahead with Power Automate, maybe the easiest option is to store the attachment files in another folder in SharePoint which is "visible" in the SharePoint document hierarchy (as opposed to the list attachment folders, which seem to be "hidden" in SharePoint).
