cte_crest
Helper I

How to store a large amount of data through a push dataset in Power BI?

Hi everyone, 

 

I have a use case in which my client typically generates 2.5 million rows of data per day. I am trying to fulfil this use case using the Power BI REST API and a push dataset. However, while doing a POC, I stumbled upon these limitations of the Power BI API:

https://docs.microsoft.com/en-us/power-bi/developer/automation/api-rest-api-limitations

 

According to this, I can add a maximum of 5 million rows to a table (with the "None" data retention policy), and in addition, the maximum number of tables in a single dataset is limited to 75.
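For context, pushing rows against those limits looks roughly like this (a minimal Python sketch, assuming an already-acquired Azure AD access token and an existing push dataset; `DATASET_ID` and `TABLE` are placeholders, and the 10,000-rows-per-POST batch size reflects the documented REST API limit, but treat the details as illustrative rather than a complete implementation):

```python
# Sketch: push rows to a Power BI push dataset in batches.
# DATASET_ID and TABLE are placeholders; a real solution also needs
# token acquisition (e.g. via MSAL) and retry/throttling handling.
import requests

API = "https://api.powerbi.com/v1.0/myorg"
DATASET_ID = "<your-dataset-id>"  # placeholder
TABLE = "Events"                  # placeholder table name

def chunk(rows, size=10_000):
    """Split rows into batches; a single POST accepts at most 10,000 rows."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def push_rows(token, rows):
    """POST each batch to the .../tables/{table}/rows endpoint."""
    url = f"{API}/datasets/{DATASET_ID}/tables/{TABLE}/rows"
    headers = {"Authorization": f"Bearer {token}"}
    for batch in chunk(rows):
        resp = requests.post(url, headers=headers, json={"rows": batch})
        resp.raise_for_status()

# Note the arithmetic behind the question: at 2.5M rows/day, the 5M-row
# table cap (retention policy "None") is reached after only two days,
# so rows must be expired or offloaded elsewhere.
```
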

 

Now, in such a scenario, the limit will be reached quickly for my use case. So, apart from this, is there any way I can fulfil this use case? Also, do the same limitations apply to Power BI Premium, or do they only apply to Pro users?

 

I am stuck on this, and any help would be highly appreciated.

2 ACCEPTED SOLUTIONS

GilbertQ
Super User

Hi there

Your data will always have to be stored somewhere after it is generated.

You could store it in Azure Blob Storage (which can serve as a data lake), then take it from there and use Azure Analysis Services to report on the data.

I would suggest storing a large amount of data like that in a database. It will make it a lot easier to query and aggregate too.




Did I answer your question? Mark my post as a solution!

Proud to be a Super User!

Power BI Blog

TomMartens
Super User

Hey @cte_crest ,

 

As @GilbertQ already mentioned, use Azure Event Hubs to "capture" the events. Then you can use Azure Stream Analytics to process these events.

Azure Stream Analytics (https://azure.microsoft.com/en-us/services/stream-analytics/) is able to feed more than one data sink from a single event stream:

  • a live/streaming dataset in Power BI that represents the "hot" data, meaning what's going on right now
  • a second Stream Analytics query that pushes the event data into, e.g., a SQL Server database on Azure; this allows batch processing and can be used to create a more sophisticated data model
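The fan-out described above can be sketched conceptually in Python (this stands in for the two Stream Analytics queries; the sink functions are placeholders, not real Azure SDK calls):

```python
# Conceptual sketch of the hot/cold fan-out that Stream Analytics performs:
# each incoming event is delivered to BOTH sinks from the same stream.

def route_event(event, sinks):
    """Deliver one event to every registered sink (hot + cold path)."""
    return [sink(event) for sink in sinks]

def hot_sink(event):
    # e.g. push to a Power BI streaming dataset for "right now" dashboards
    return ("powerbi_streaming", event["id"])

def cold_sink(event):
    # e.g. insert into an Azure SQL database for batch processing/modeling
    return ("azure_sql", event["id"])

results = route_event({"id": 42, "temp": 21.5}, [hot_sink, cold_sink])
```

In the real architecture, each "sink" is a separate output of the Stream Analytics job reading the same Event Hubs input, so the hot dashboard and the durable store never compete with each other.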

Hopefully, this provides some additional ideas.

Personally, I'm very satisfied using these components to monitor "fast" data while also providing deep analytical capabilities on the complete data over time.



Did I answer your question? Mark my post as a solution, this will help others!

Proud to be a Super User!
I accept Kudos 😉
Hamburg, Germany


7 REPLIES
v-eachen-msft
Community Support

Hi @cte_crest ,

 

You can register an instance of Azure Analysis Services and build a model in it to reduce costs. You can then make a real-time connection from Power BI to the model without incurring large overhead. 

 

Community Support Team _ Eads
If this post helps, then please consider Accept it as the solution to help the other members find it.

@v-eachen-msft 

 

I am really thankful for your suggestion.

So, does that mean that, given the volume of data, instead of storing the data in Power BI itself (through a push dataset or a connector), it would be preferable to store the data in Azure Analysis Services?


@GilbertQ @v-eachen-msft 

 

Thank you very much for your reply. 

 

So, after the data is stored in the database, would the best way to use the data be through a live connection?

Or through importing the required data and implementing an incremental refresh policy?

I am also not familiar with the different products offered by Azure. So, given this use case with a large dataset, which product allows us to use the maximum capabilities of Power BI?

GilbertQ
Super User

Hi there

For that volume of data I would be looking at Azure Event Hubs (I know about it, but am not an expert on it).

Also, why would you need to see all 2.5 million rows? They would go by so fast that you would not notice them.






Hi @GilbertQ!

Thanks for suggesting Azure Event Hubs; I will surely have a look at that.

 

As for the other question: we are monitoring an environment which generates data in high volume. The end goal is to create reports based on this data, and we are planning to leverage Power BI for that.

 

Also, just to confirm from the answer: I assume that even in Power BI Premium, we wouldn't be able to push more than 5 million rows into a single table.

 

Also, would there be any possibility of storing the data in Power BI itself, using any of the available methods? Or is it preferable to store the data somewhere else, given this use case?
