Microsoft Fabric Community Conference 2025, March 31 - April 2, Las Vegas, Nevada. Use code FABINSIDER for a $400 discount.
Hi,
I am trying to fetch data from Log Analytics into Power BI using the Log Analytics REST API.
The API has a limit of 500,000 rows per query.
To handle large datasets, I plan to call the REST API multiple times from Power BI, each call covering a different time range.
To make this dynamic, I created a function (PIFunc) that calls the REST API against the Log Analytics workspace for a given time range.
A loop then calls this function iteratively for the different time ranges and finally appends the results.
let
    list1 = List.Generate(
        () => [Counter = 90, table2 = null],
        each [Counter] >= 0,
        each [table2 = PIFunc([Counter], [Counter] - 30), Counter = [Counter] - 30],
        each [table2]
    ),
    // skip first null value
    finalList = List.Skip(list1),
    Result = Table.FromList(finalList, Splitter.SplitByNothing(), null, null),
    #"Expanded Column1" = Table.ExpandTableColumn(
        Result,
        "Column1",
        {"BackupItemUniqueId", "ProtectedContainerUniqueIdData", "AsOnDateTime"},
        {"BackupItemUniqueId", "ProtectedContainerUniqueIdData", "AsOnDateTime"}
    )
in
    #"Expanded Column1"
The code above calls PIFunc for successive time ranges (e.g., 90 to 60 days ago, 60 to 30 days ago, and so on):
PIFunc(90, 60), PIFunc(60, 30), PIFunc(30, 0).
The resulting tables are collected in a list, which is then converted into a single table.
My objective is to avoid any heavy operation in Power BI (in terms of memory, because of the 1 GB dataset limit). That is why I push all summarization and aggregation down to Log Analytics.
I would like to know whether this approach leads to high memory consumption. Does it depend on the size of the intermediate tables?
If so, what is the right way to write a loop that fetches small datasets iteratively and then combines them into a single table in Power BI?
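A common alternative to List.Generate for this pattern is to build the list of time-range boundaries first, call the function once per range, and append the partial tables with Table.Combine. A minimal sketch, assuming (as in the code above) that PIFunc(startDaysAgo, endDaysAgo) returns a table and that the window runs from 90 days ago to now in 30-day steps:

```m
let
    // build the range boundaries: [Start = 90, End = 60], [Start = 60, End = 30], [Start = 30, End = 0]
    ranges = List.Transform({0..2}, each [Start = 90 - _ * 30, End = 90 - (_ + 1) * 30]),
    // one REST API call per time range
    tables = List.Transform(ranges, each PIFunc([Start], [End])),
    // append all partial results into a single table
    Combined = Table.Combine(tables)
in
    Combined
```

Table.Combine appends tables by matching column names, so no separate expand step is needed when every PIFunc call returns the same columns.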
Solved! Go to Solution.
@V55 ,
>> My objective is to avoid any heavy operation in Power BI (in terms of memory, because of the 1 GB dataset limit)
There is no data volume limitation on loading data, for either DirectQuery or Import. You only hit the dataset limitation when you publish a .pbix file larger than 1 GB to the Power BI service.
When Power Query runs the above query, it temporarily uses more memory, but once the data has been imported into the dataset, Power Query releases that memory. The memory used by Power Query also does not affect the size of the dataset.
Generally, to keep the .pbix file small and improve report performance in Power BI, please follow the guide in this article to load data and optimize the report.
Regards,
Lydia