We have an embedded Power BI solution in a .NET web app that lets users visualize their data on demand. We process the data and hold it in an in-memory DataTable within the app. We used to push this data to a Power BI push dataset, but we wanted extended functionality such as localization of metadata and measures in the dataset, so we switched to XMLA endpoint-based Tabular Object Model (TOM) datasets. To load data into the dataset we have been hardcoding the JSON rows as strings inside M queries and creating partitions from them. This has been bad for performance: the data refresh is slow, and the refresh time grows with the number of partitions we push.

I was hoping to get some advice on how to load data into TOM datasets so that it is available quickly, i.e. with data loading performance closer to what we had when pushing rows to a push dataset via the Power BI REST API.
M expression being used in partitions:
let
    Source = Text.ToBinary("{json}"),
    ParsedJson = Json.Document(Source),
    JsonToTable = Table.FromRecords(ParsedJson)
in
    JsonToTable
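For context, the partitions are created from the app roughly like the sketch below (simplified and illustrative only: it assumes Newtonsoft.Json for serialization and the Microsoft.AnalysisServices.Tabular (TOM) client library, and the method and variable names are not our real ones):

// Illustrative sketch of the current approach: serialize the in-memory DataTable
// to JSON, embed it in an M string literal, and load it through a new partition.
using System.Data;
using Microsoft.AnalysisServices.Tabular;
using Newtonsoft.Json;

static void PushDataTableAsPartition(Model model, string tableName, string partitionName, DataTable data)
{
    // Escape double quotes by doubling them so the JSON survives inside the M string literal.
    string json = JsonConvert.SerializeObject(data).Replace("\"", "\"\"");

    string mExpression =
        "let Source = Text.ToBinary(\"" + json + "\"), " +
        "ParsedJson = Json.Document(Source), " +
        "JsonToTable = Table.FromRecords(ParsedJson) in JsonToTable";

    Table table = model.Tables.Find(tableName);
    var partition = new Partition
    {
        Name = partitionName,
        Source = new MPartitionSource { Expression = mExpression }
    };
    table.Partitions.Add(partition);

    // One round trip to create the partition, another to refresh it.
    model.SaveChanges();
    partition.RequestRefresh(RefreshType.Full);
    model.SaveChanges();
}

The refresh in that last step is what gets slower as we add partitions.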
It's great that you've migrated to using XMLA endpoint-based Tabular Object Model (TOM) datasets for extended functionality in your Power BI solution. However, the performance issues you're experiencing with slow data refresh times can be addressed. Here are some tips to improve data loading performance:
Batch Data Loading: Load rows in fewer, larger batches rather than many small partitions; every extra partition adds its own refresh overhead.
Optimize Data Transformation: Keep the M expression in each partition as simple as possible and do the heavy shaping in your app or at the source, so the refresh only has to parse and load.
Data Compression: The VertiPaq engine compresses low-cardinality columns best, so reduce precision where you can, split datetime columns, and drop columns the reports never use.
Data Modeling: Keep the model lean, use appropriate data types, and avoid wide tables full of unused columns.
Aggregation Tables: If the visuals only need summarized data, load pre-aggregated tables instead of full detail to cut the volume that has to be refreshed.
Use Partitions Efficiently: Refresh only the partitions whose data actually changed instead of reprocessing the whole table (see the TOM sketch after this list).
Parallel Data Loading: Batch multiple partition refreshes into a single SaveChanges call with a higher MaxParallelism so they load concurrently (also shown in the sketch below).
Incremental Loading: Partition by date or another key so each refresh only brings in new or changed rows rather than reloading everything.
Hardware and Resources: Make sure the capacity backing the workspace has enough memory and CPU headroom; refreshes that queue or spill will be slow no matter how the code is written.
Monitoring and Profiling: Use the refresh history, XMLA/Profiler traces, or Log Analytics to see where the time actually goes before optimizing further.
Consider Automation: Script partition creation and refresh (for example with TOM or TMSL) so batching, parallelism, and retention are applied consistently.
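To make the partition and parallelism points concrete, here is a minimal TOM sketch (the workspace, dataset, table, and partition names are placeholders, authentication is omitted, and it assumes the Microsoft.AnalysisServices.Tabular client library):

// Minimal sketch: refresh only the changed partitions, batched into one SaveChanges
// call so the engine can process them in parallel. Names below are placeholders.
using Microsoft.AnalysisServices.Tabular;

var server = new Server();
server.Connect("powerbi://api.powerbi.com/v1.0/myorg/YourWorkspace"); // XMLA endpoint; add credentials as appropriate

Database database = server.Databases.GetByName("YourDataset");
Model model = database.Model;
Table table = model.Tables.Find("YourTable");

// Mark only the partitions whose data changed, instead of reprocessing the whole table.
foreach (string partitionName in new[] { "2024-01", "2024-02" })
{
    Partition partition = table.Partitions.Find(partitionName);
    partition?.RequestRefresh(RefreshType.Full); // reload data and recalculate dependents for this partition only
}

// One round trip executes all pending refreshes, with up to 6 running concurrently.
model.SaveChanges(new SaveOptions { MaxParallelism = 6 });

server.Disconnect();

Batching the RequestRefresh calls into a single SaveChanges is what lets the service parallelize them; issuing one SaveChanges per partition serializes the work and multiplies the round trips.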
By implementing these best practices, you should be able to significantly improve the data loading performance of your Tabular Object Model datasets in Power BI. It's important to continuously monitor and optimize your data loading process as your data and usage patterns evolve.