Hi,
I am trying to understand how the dataset caches the data pushed through the REST calls. One of the videos (https://www.youtube.com/watch?v=PpiUsSCXFhM) says the data won't be imported, but will be cached.
Isn't it possible to generate a report that uses data across a date range, say a year or more, with more than 200,000 rows? One alternative is to specify no retention policy: a dataset with retention policy 'none' stores up to 5,000,000 rows per table. Beyond that limit, isn't it possible to back up the data?
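For reference, the retention policy is chosen when the push dataset is created, via the REST API's `defaultRetentionPolicy` query parameter. Below is a minimal sketch of building that create-dataset request; the token placeholder, table name, and columns are hypothetical and would be replaced with your own.

```python
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"
TOKEN = "<your Azure AD access token>"  # hypothetical placeholder

def create_push_dataset(name, retention="None"):
    """Build the request that creates a push dataset.

    retention: "None" keeps rows up to the 5,000,000-per-table cap;
               "basicFIFO" starts evicting oldest rows past 200,000.
    """
    url = f"{API}/datasets?defaultRetentionPolicy={retention}"
    body = {
        "name": name,
        "defaultMode": "Push",
        "tables": [{
            "name": "Sales",  # hypothetical table
            "columns": [
                {"name": "Date", "dataType": "DateTime"},
                {"name": "Amount", "dataType": "Double"},
            ],
        }],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Send with urllib.request.urlopen(req) once a real token is set.
req = create_push_dataset("SalesHistory", retention="None")
print(req.full_url)
```

The request is only built here, not sent, so the sketch runs without credentials.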
As per the basicFIFO policy (https://msdn.microsoft.com/en-us/library/mt186545.aspx), once a table reaches 200,000 rows, the next 10,000 rows are inserted only after the oldest 10,000 rows are removed. Is it possible to back up the deleted rows?
What if I want to generate a report that includes the data from all 210,000 or more rows?
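Since the service does not expose rows after basicFIFO evicts them, one workable approach is to archive each batch yourself before (or while) pushing it, so the full history survives locally. A minimal sketch, assuming a local SQLite file as the archive and the standard push-rows endpoint; the dataset ID, table name, and token are placeholders.

```python
import json
import sqlite3
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"
TOKEN = "<your Azure AD access token>"  # hypothetical placeholder

def archive_and_push(dataset_id, table, rows, db_path="archive.db"):
    """Write rows to a local archive first, then build the push request.

    The archive keeps the history that basicFIFO would later evict,
    so a report over a year or more can be rebuilt from it.
    """
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS rows (payload TEXT)")
    con.executemany("INSERT INTO rows VALUES (?)",
                    [(json.dumps(r),) for r in rows])
    con.commit()
    con.close()

    url = f"{API}/datasets/{dataset_id}/tables/{table}/rows"
    return urllib.request.Request(
        url,
        data=json.dumps({"rows": rows}).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Call urllib.request.urlopen(...) on the returned request to push.
```

With this in place, the dataset still only shows the most recent 200,000 rows, but the archive holds everything ever pushed.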