Hi,
I'm using Power BI online with a professional account. I've read some excellent articles about handling Power BI datasets through an API, which is awesome! 🙂
I need to insert data into a daily temporary dataset so a dashboard can show (truly) real-time figures (e.g. the number of visits in the last 24 hours). However, I'm wondering about the differences between using the API directly versus using Event Hub + Stream Analytics to insert the data into the dataset in real time.
I have several questions:
1/ How is data uploaded to a dataset through the API stored? Is it kept in an Azure SQL Database?
2/ What about the persistence of this data? Is there a scheduled drop, or is the data kept indefinitely?
3/ Are there any limits on the API? I hope the limits raised in the following post have been resolved (cf. http://community.powerbi.com/t5/Service/Limits-when-using-Power-BI-REST-API-and-or-Azure-Stream/m-p/...
Thanks ! 🙂
Romain.
As far as I understand, Stream Analytics just uses the same API you would use yourself if you were updating the data directly via the REST API.
There's some information here regarding the retention policies; I think the standard retention keeps the most recent 200,000 rows:
https://powerbi.microsoft.com/en-us/blog/automatic-retention-policy-for-real-time-data/
Hope that helps
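To make the mechanics concrete, here is a minimal sketch of the two REST calls involved: creating a push dataset with the `basicFIFO` retention policy (the one described in the blog post above, which keeps the newest 200,000 rows), and pushing rows into one of its tables. The dataset name, table name, column layout, and token placeholder are all assumptions for illustration; the endpoints themselves are the documented Power BI REST API ones. The sketch only builds the requests so you can plug in your own auth flow.

```python
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"
ACCESS_TOKEN = "<azure-ad-bearer-token>"  # placeholder: obtain via your Azure AD auth flow


def _json_post(url, payload):
    """Build an authenticated JSON POST request (not yet sent)."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": "Bearer " + ACCESS_TOKEN,
                 "Content-Type": "application/json"},
        method="POST",
    )


def build_create_dataset_request():
    """Create a push dataset; basicFIFO retains only the newest 200,000 rows."""
    url = API + "/datasets?defaultRetentionPolicy=basicFIFO"
    payload = {
        "name": "DailyVisits",  # hypothetical dataset name
        "tables": [{
            "name": "Visits",   # hypothetical table name
            "columns": [
                {"name": "timestamp", "dataType": "DateTime"},
                {"name": "visits", "dataType": "Int64"},
            ],
        }],
    }
    return _json_post(url, payload)


def build_push_rows_request(dataset_id, rows):
    """Append rows to the table; Stream Analytics ultimately hits this same endpoint."""
    url = f"{API}/datasets/{dataset_id}/tables/Visits/rows"
    return _json_post(url, {"rows": rows})


req = build_push_rows_request(
    "<dataset-id>",
    [{"timestamp": "2017-05-01T12:00:00Z", "visits": 42}],
)
# Send with: urllib.request.urlopen(req)  -- the service answers HTTP 200 on success.
```

Once the dashboard tile is pinned from this dataset, each successful push shows up in (near) real time, which covers the "visits in the last 24h" scenario without Event Hub in between.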