We have been using Stream Analytics with the Power BI connector to stream real-time data to Power BI for a little over a year. We stream contact center data and then run DAX formulas against it to calculate various daily totals that we cannot compute in Stream Analytics itself.
Since the last week of October, when our dataset hits the row limit for the streaming dataset, instead of purging old rows it now clears the dataset entirely. This happens on both of our datasets. We have since recreated one of the dataset connections, and the issue still occurs. I have the same solution deployed for another customer, and those datasets are not behaving like this.
Did anything change with PushDatasets and how the data retention policy is applied?
@Anonymous
I have the same problem as you; three weeks have passed and I have not found a solution.
When my streaming dataset reaches 225-230k rows, it drops back to 215-220k rows, deleting the first 10k rows I sent. I have two other streaming datasets with more than 2 million rows, and they work perfectly.
To rule out hitting the Pro account data limit, I created a new test account and tried another streaming dataset there, and I had the same problem.
Could this be a problem introduced by the October update? Or does my dataset need some configuration that I don't know about?
The drop you're seeing (from ~225k back down to ~215k) is the default behavior for push streaming datasets when FIFO is the default retention mode.
When Stream Analytics creates a dataset, it sets the default retention policy to basicFIFO, which enforces the 200k-row limit. If you create a push streaming dataset via the REST API, it does not default to FIFO; that lets you push well past 250k rows, but you then hit the API's data streaming restrictions and your real-time dashboards will start to slow down.
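To make the distinction above concrete, here is a minimal sketch of the Power BI REST API call that creates a push dataset, where the `defaultRetentionPolicy` query parameter selects between `basicFIFO` (keep the newest ~200k rows, purge the oldest) and `none` (no FIFO purge). The table schema and dataset name are made-up examples, and a real call would also need an Azure AD bearer token; this sketch only builds the request, it does not send it.

```python
import json

API_URL = "https://api.powerbi.com/v1.0/myorg/datasets"

def build_create_request(dataset_name: str, use_fifo: bool):
    """Return (url, body) for the dataset-creation call; no network I/O here.

    defaultRetentionPolicy=basicFIFO -> oldest rows purged past ~200k
    defaultRetentionPolicy=none     -> no FIFO purge (other push limits apply)
    """
    policy = "basicFIFO" if use_fifo else "none"
    url = f"{API_URL}?defaultRetentionPolicy={policy}"
    body = {
        "name": dataset_name,
        "defaultMode": "PushStreaming",
        "tables": [
            {
                # Hypothetical schema standing in for the contact center data
                "name": "ContactCenter",
                "columns": [
                    {"name": "timestamp", "dataType": "DateTime"},
                    {"name": "callsHandled", "dataType": "Int64"},
                ],
            }
        ],
    }
    return url, body

url, body = build_create_request("ContactCenterRT", use_fifo=True)
print(url)
print(json.dumps(body, indent=2))
```

Sending this body as a POST (with an `Authorization: Bearer <token>` header) is what Stream Analytics does on your behalf, which is why the SA-created dataset always ends up with the basicFIFO policy.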
@Anonymous
I created a new dataset via the API and now it works, thanks!
But I don't understand why the other streaming datasets work and this one doesn't. All of the datasets were created using the same method.
Hi there
Are you storing the historical data?
If so, I recall a limit of 5M rows.
Here are more details: https://docs.microsoft.com/en-us/power-bi/connect-data/service-real-time-streaming#using-azure-strea...
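For reference, rows are added to a push dataset with a POST to the dataset's `rows` endpoint. With retention set to `none`, a push dataset can accumulate rows until the overall cap (the ~5M-row limit mentioned above) is reached; with basicFIFO, the oldest rows are purged past ~200k instead. A minimal sketch of the request, assuming a placeholder dataset GUID and the hypothetical `ContactCenter` table name (this only builds the URL and payload, it does not send them):

```python
import json

DATASET_ID = "<dataset-guid>"  # placeholder; use your dataset's actual GUID
TABLE_NAME = "ContactCenter"   # hypothetical table name

ROWS_URL = (
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}"
    f"/tables/{TABLE_NAME}/rows"
)

def build_rows_payload(rows):
    """Wrap row dicts in the {'rows': [...]} envelope the push API expects."""
    return {"rows": list(rows)}

payload = build_rows_payload(
    [{"timestamp": "2021-11-01T09:00:00Z", "callsHandled": 12}]
)
print(ROWS_URL)
print(json.dumps(payload))
```

The retention policy chosen at creation time (previous reply) decides what happens once these pushes approach the row limit.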
Hi Gilbert,
We are storing historical data, but the limit before data is purged is between 200k and 225k. We deleted the dataset again and let SA recreate it. Since doing that, when the count approaches 220k it runs as it did before: it starts to purge the older records.
Not sure why the issue started at random, but we have another dataset that we have not fixed yet, so we are going to use that one to troubleshoot.
Jeremy