nadyukhat
Regular Visitor

Working with a large data set is an issue in Power BI

Hi,
I need help from anyone who has faced the same issue, or from MS experts. I am wondering how to import a large data entity through an OData feed.
I filtered the columns down to only 3 of them, but the number of rows is close to a million (around 1 GB). I set a timeout of 8 hours ([0,8,0,0]).
I tried different approaches: merge, disabling load, FilterToggle, etc. No luck.
How can I optimize the query to minimize the import and make it quicker?
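
For reference, a long timeout like the eight-hour one mentioned above would typically be set in the query's `OData.Feed` options record in Power Query M. This is only a sketch; the service URL and column names are placeholders, not the poster's actual feed:

```
let
    // Hypothetical OData endpoint; replace with the actual feed URL
    Source = OData.Feed(
        "https://example.com/data/ODataService",
        null,
        [Timeout = #duration(0, 8, 0, 0)]  // #duration(days, hours, minutes, seconds)
    ),
    // Select only the columns actually needed, to shrink the payload
    Filtered = Table.SelectColumns(Source, {"Column1", "Column2", "Column3"})
in
    Filtered
```

Note that column selection against an OData source may fold back to the service as a `$select` clause, which reduces transfer size but not the row count.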

DirectQuery does not work for me; company access to the SQL server is locked down.
I would appreciate any help.

P.S. Just to clarify: when the database was smaller, the report worked properly and refreshed automatically.

1 ACCEPTED SOLUTION
GilbertQ
Super User

Hi @nadyukhat

 

Since you mention that the data set refreshed fine when it was smaller, it would appear that the timeout is indeed the issue now that the volume is so large.

 

The only workaround I can currently think of is to persist the data in files, so that you do not have to re-query it all via OData (which, as I understand it, can be quite slow).

 

You could possibly persist the data using Excel files with the built-in Query Editor.

Or use something like SSIS and store the data in a database of some sort.
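
One way to sketch the file-based approach in Power Query M: keep the historical rows in a local CSV/text archive and pull only the recent rows over OData, then combine the two. The file path, feed URL, and the `ModifiedDate` column are hypothetical placeholders; both tables must have matching column names and types for `Table.Combine` to work:

```
let
    // Historical data persisted locally as a CSV/text file (hypothetical path)
    Archive = Csv.Document(
        File.Contents("C:\Data\archive.csv"),
        [Delimiter = ",", Encoding = 65001]
    ),
    ArchiveTable = Table.PromoteHeaders(Archive),

    // Only recent rows still come over OData, keeping each refresh small
    Recent = OData.Feed("https://example.com/data/ODataService"),
    RecentRows = Table.SelectRows(Recent, each [ModifiedDate] >= #date(2024, 1, 1)),

    // Append archive + recent; columns must align between the two tables
    Combined = Table.Combine({ArchiveTable, RecentRows})
in
    Combined
```

The archive file only needs to be regenerated occasionally, so the scheduled refresh no longer re-downloads the full million rows through OData.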





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!







Power BI Blog


2 REPLIES 2

Thank you for your valuable comment. It works for me (I mean storing the old data in txt files)! So grateful!
