March 31 - April 2, 2025, in Las Vegas, Nevada. Use code MSCUST for a $150 discount! Early bird discount ends December 31.
Hi community,
I'm facing an issue. I have a dataset of about 200+ million rows that grows by millions of rows daily, and I can't import the data either in Desktop to create a dataset there, or in the Service to create a Dataflow or Datamart.
I'm using an ODBC connection to an Amazon Redshift database.
On Desktop the data never loads, or, when trying to transform the data, the evaluation never ends in Power Query. The same happens in the Service: when trying to create a Dataflow or Datamart, the evaluation gets cancelled in Power Query when it reaches the 10-minute limit. Then, when I click Save, I get a "PQO Evaluation Failed" error.
Are there any suggestions on how to deal with this issue?
Any ideas on how to import a large dataset with an ODBC connection?
There are several strategies you can try to deal with large datasets in Power BI:
1. Use a DirectQuery connection instead of importing the data. DirectQuery connections query the data directly from the database, rather than importing it into Power BI. This can be more efficient when working with large datasets, as it reduces the amount of data that needs to be transferred and stored in Power BI. However, it can also be slower, as queries are executed in real time and may require more resources from the database.
2. Use data sampling to reduce the amount of data that is imported or queried. Data sampling lets you select a random subset of the data rather than the entire dataset. This can be useful for testing and prototyping, as it allows you to work with a smaller, more manageable dataset.
3. Use incremental refresh to update only the new or modified rows in the dataset. Incremental refresh lets you set up a schedule to refresh only the rows that have been added or modified since the last refresh, rather than the entire dataset. This can be more efficient for large datasets that grow over time, as it reduces the amount of data that needs to be processed and stored in Power BI.
4. Use data modeling techniques to optimize the data for querying and analysis. Techniques such as indexing, partitioning, and columnstore indexes can improve query performance on large datasets by optimizing the way the data is stored and accessed.
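For incremental refresh specifically, the query has to be filtered by the reserved RangeStart and RangeEnd datetime parameters so the service can generate partitions. A minimal Power Query (M) sketch, assuming a hypothetical ODBC DSN named "Redshift" and a sales table with an order_ts timestamp column:

```m
let
    // Connect through the ODBC driver; "Redshift" is an assumed DSN name
    Source = Odbc.Query("dsn=Redshift", "select * from public.sales"),
    // RangeStart/RangeEnd are the reserved datetime parameters Power BI
    // uses to define and refresh partitions incrementally
    Filtered = Table.SelectRows(
        Source,
        each [order_ts] >= RangeStart and [order_ts] < RangeEnd
    )
in
    Filtered
```

Note that after a hand-written native query, later steps generally do not fold back to the source, so the filter may only be applied after the data is downloaded; navigating to the table with Odbc.DataSource instead can allow the filter to fold into the generated SQL.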
Thanks for your time @MAwwad. I have tried all the steps you mention, but this is not what I need.
1. DirectQuery is not an option over ODBC.
2, 3, 4: Not my issue. I want to import and load all of my data first; incremental refresh comes after that.
Hi, did you find a solution to this problem? I am also facing the same issue while importing huge datasets into Power BI at the initial load.
Please let me know your findings.
I created a filtered .pbix file that contains only a snapshot of the data, set up incremental refresh on it, and published and refreshed it. That created the partitions.
Then you need to use external tools such as SQL Server Management Studio or ALM Toolkit (connecting through the XMLA endpoint) to remove the filtering and refresh the partitions.
Check this video:
https://youtu.be/5AWt6ijJG94
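Refreshing the partitions from SQL Server Management Studio can also be scripted as a TMSL refresh command executed against the workspace's XMLA endpoint (a Premium feature). A minimal sketch, where the dataset and table names are placeholders:

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MyLargeDataset",
        "table": "Sales"
      }
    ]
  }
}
```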
One other way is to create a parameter to filter the data and change it or remove it from the dataset settings on Power BI service.
Can you explain this in detail, please?
One other way is to create a parameter to filter the data and change it or remove it from the dataset settings on Power BI service.
From where to remove the filter in Power BI Service?
If you create a date parameter in Power Query and filter your data according to it, then once the report is published you will be able to see the parameter under the dataset's settings in the Power BI service and change its value.
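A minimal Power Query (M) sketch of that approach, assuming a hypothetical Date-type parameter named LoadFrom (created via Manage Parameters) and a sales table with an order_date column:

```m
let
    // "Redshift" is an assumed ODBC DSN name
    Source = Odbc.Query("dsn=Redshift", "select * from public.sales"),
    // Only rows on or after the LoadFrom parameter value are loaded;
    // moving the parameter back (or removing this step) loads more history
    Filtered = Table.SelectRows(Source, each [order_date] >= LoadFrom)
in
    Filtered
```

After publishing, the LoadFrom value can be changed from the dataset's settings in the service without editing the .pbix file.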
Thanks for your reply. We use a Professional license, so this will not work in our case.
Anyways thanks for your prompt response.