Hi everyone,
I have a use case in which my client typically generates 2.5 million rows of data per day. I am trying to meet this requirement using the Power BI REST API and a push dataset. However, while doing a POC, I stumbled upon these limitations of the Power BI REST API:
https://docs.microsoft.com/en-us/power-bi/developer/automation/api-rest-api-limitations
According to this, I can only add a maximum of 5 million rows to a table (with the "none" data retention policy), and in addition, a dataset is limited to a maximum of 75 tables.
In such a scenario, the limit will be reached quickly for my use case. Apart from this approach, is there any way I can fulfil this use case? Also, do the same limitations apply to Power BI Premium, or only to Pro users?
I have dug deep into this, and any help would be highly appreciated.
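For context, here is a minimal sketch of what pushing rows to a push dataset looks like, assuming the dataset and table already exist; the dataset ID, table name, token, and column names below are placeholders, not real values. The push API caps each POST at 10,000 rows, so at this volume the rows must be chunked:

```python
import json
import urllib.request

# Placeholders -- substitute real values from your tenant.
DATASET_ID = "<dataset-id>"
TABLE_NAME = "Telemetry"
ACCESS_TOKEN = "<aad-access-token>"

def chunk_rows(rows, size=10000):
    """Split rows into chunks; the push API caps each POST at 10,000 rows."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

def push_rows(rows):
    """POST each chunk to the push-dataset rows endpoint."""
    url = (f"https://api.powerbi.com/v1.0/myorg/datasets/"
           f"{DATASET_ID}/tables/{TABLE_NAME}/rows")
    for chunk in chunk_rows(rows):
        req = urllib.request.Request(
            url,
            data=json.dumps({"rows": chunk}).encode("utf-8"),
            headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                     "Content-Type": "application/json"},
            method="POST")
        urllib.request.urlopen(req)  # raises on HTTP errors

if __name__ == "__main__":
    sample = [{"ts": i, "value": i * 2} for i in range(25000)]
    print(len(chunk_rows(sample)))  # 25,000 rows -> 3 POSTs
```

Note that chunking only works around the per-request cap; the 5-million-row table cap still applies regardless of how the rows are batched.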
Hey @cte_crest ,
As @GilbertQ already mentioned, use Azure Event Hubs to capture the events. Then you can use Azure Stream Analytics to process them.
Azure Stream Analytics (https://azure.microsoft.com/en-us/services/stream-analytics/) can feed more than one data sink from a single event stream.
Hopefully, this provides some additional ideas.
Personally, I'm very satisfied using these components to monitor "fast" data while also providing deep analytical capabilities over the complete data set over time.
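To make the multi-sink idea concrete, here is a toy, SDK-free sketch of the fan-out that a Stream Analytics job performs: one event stream feeding both a "hot" aggregate (the kind of thing a live Power BI tile would read) and a "cold" archive of the full history. The sink names and event shape are illustrative only; in Stream Analytics this is declared in the job's query, not hand-written code.

```python
from collections import defaultdict

def fan_out(events, sinks):
    """Deliver every event to every sink -- one input, many outputs,
    mirroring a Stream Analytics job with multiple output clauses."""
    for event in events:
        for sink in sinks:
            sink(event)

# "Hot path" sink: keep a running aggregate per device (live dashboard).
hot_totals = defaultdict(float)
def hot_sink(event):
    hot_totals[event["device"]] += event["value"]

# "Cold path" sink: append the raw event (long-term store, e.g. blob/SQL).
cold_archive = []
def cold_sink(event):
    cold_archive.append(event)

events = [
    {"device": "a", "value": 1.5},
    {"device": "b", "value": 2.0},
    {"device": "a", "value": 0.5},
]
fan_out(events, [hot_sink, cold_sink])
print(hot_totals["a"], len(cold_archive))  # 2.0 3
```

The point of the pattern is that the dashboard only ever sees the small hot aggregate, while the complete row history lands somewhere cheap and unbounded.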
Hi @cte_crest ,
You can provision an Azure Analysis Services instance and build a model in it to reduce costs. You can then use a live connection from Power BI to the model without incurring large overhead.
I am really thankful for your suggestion.
So, given the volume of data, does that mean that instead of storing the data in Power BI itself (through a push dataset or a connector), it would be more favourable to store it in Azure Analysis Services?
Thank you very much for your reply.
So, after the data is stored in the database, would the best way to consume it be through a live connection?
Or through importing the required data and implementing an incremental refresh policy?
Also, I am not yet accustomed to the different products offered by Azure. Given this use case of a large dataset, which product lets me use the full capabilities of Power BI?
Hi @GilbertQ!
Thanks for suggesting Azure Event Hubs, I will surely have a look at that.
As for the other question, we are monitoring an environment that generates data in high volume. The end goal is to create reports based on this data, and we are thinking of leveraging Power BI for that.
Also, just to confirm from the answer, I assume that even in Power BI Premium, we wouldn't be able to push more than 5 million rows into a single table.
Also, would there be any possibility of storing the data in Power BI itself, using any of the available methods? Or is it preferable to store the data somewhere else, given this use case?
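To put a number on why storing the data in Power BI itself runs out of road so fast, here is the back-of-the-envelope arithmetic, assuming the documented 5-million-row-per-table cap under the "none" retention policy and the ingest rate stated above:

```python
import math

ROWS_PER_DAY = 2_500_000          # ingest rate stated in the question
MAX_ROWS_NONE_POLICY = 5_000_000  # push-dataset table cap, "none" retention

days_until_cap = math.ceil(MAX_ROWS_NONE_POLICY / ROWS_PER_DAY)
print(days_until_cap)  # the table fills in just 2 days
```

Two days of headroom is why the replies above all steer toward an external store (Event Hubs plus a database or Analysis Services) with Power BI on top, rather than pushing the raw rows into a dataset.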