nrowey
Helper I

Guidance needed: best way to get data from large CSV files in a folder into Premium Per User

I have large 15-20 GB CSV files in folders on SharePoint. When I try to refresh and combine these large .csv files in a dataflow, I get an out-of-memory error.
Is a dataflow the best way to approach this, and will I be able to do an incremental refresh on files in a SharePoint folder? I can get a date field with a bit of manipulation. Do I need to purchase more storage for this from Microsoft? If so, what kind, and how?
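On the "date field with a bit of manipulation" point, one common approach is to derive the date from each file's name rather than from its contents. A minimal sketch, assuming file names carry a date like `sales_2024-01-15.csv` (that naming pattern is my assumption, not something stated in this thread):

```python
# Hypothetical sketch: pull a date out of a file name for use as an
# incremental-refresh filter column. Assumes names embed YYYY-MM-DD;
# adjust the regex to match your actual naming convention.
import re
from datetime import date

def date_from_filename(name: str):
    """Return the first YYYY-MM-DD date found in `name`, or None."""
    m = re.search(r"(\d{4})-(\d{2})-(\d{2})", name)
    if m is None:
        return None
    return date(int(m.group(1)), int(m.group(2)), int(m.group(3)))

print(date_from_filename("sales_2024-01-15.csv"))  # 2024-01-15
```

The same filename-parsing idea can be reproduced in Power Query M with `Text.Middle`/`Date.FromText` on the folder connector's Name column.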

Only the first few files in each folder are this large; future files will be 5 to 20 MB.

Thanks for the help...

 

1 ACCEPTED SOLUTION
christinepayton
Super User

Your best bet is to load the CSVs into a real database like SQL and connect to that. I don't think you will get good performance with 30GB of CSVs in SharePoint no matter what you do (if it'd work at all). 
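To illustrate the "load the CSVs into a real database" suggestion, here is a minimal sketch of streaming a CSV into a SQL table in chunks, so a 15-20 GB file never has to fit in memory at once. SQLite stands in for whatever engine you actually use (SQL Server, Azure SQL, etc.), and the table/file names are made up for the demo:

```python
# Hedged sketch (not from the thread): chunked CSV-to-SQL load.
# pandas reads `chunk_rows` rows at a time and appends each chunk,
# keeping memory use flat regardless of file size.
import sqlite3
import pandas as pd

def load_csv_to_sql(csv_path: str, conn, table: str, chunk_rows: int = 100_000) -> None:
    """Append csv_path into `table`, reading chunk_rows rows per pass."""
    for chunk in pd.read_csv(csv_path, chunksize=chunk_rows):
        chunk.to_sql(table, conn, if_exists="append", index=False)

# Tiny generated demo file; point csv_path at your real export instead.
pd.DataFrame({"order_date": ["2024-01-01"] * 5, "amount": range(5)}).to_csv("demo.csv", index=False)
conn = sqlite3.connect(":memory:")
load_csv_to_sql("demo.csv", conn, "sales", chunk_rows=2)
print(conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0])  # 5
```

Once the data is in a database, Power BI can connect directly and incremental refresh can fold the date filter down to the source instead of re-reading whole files from SharePoint.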

