We have an embedded Power BI solution in a .NET web app that lets users visualize their data on demand. We process the data and hold it in an in-memory data table within the app. We used to push this data to a Power BI push dataset, but we wanted extended functionality, such as localization of metadata and measures in the dataset, so we switched to XMLA endpoint-based Tabular Object Model (TOM) datasets. To load data into the dataset, we have been hardcoding the JSON rows as strings in M queries and creating partitions from them. This has been bad for performance: the data refresh is slow, and refresh time grows with the number of partitions we push.
I was hoping to get some advice on how to load data into TOM datasets so that it becomes available quickly and loads faster, similar to pushing rows to a push dataset with the Power BI REST API.
M expression being used in partitions:
```
let
    Source = Text.ToBinary("{json}"),
    ParsedJson = Json.Document(Source),
    JsonToTable = Table.FromRecords(ParsedJson)
in
    JsonToTable
```
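Roughly, each push creates a new partition with an M expression like the one above and then processes it through TOM. The sketch below (C#, Microsoft.AnalysisServices.Tabular) is only illustrative: the workspace endpoint, dataset name, table name, and JSON payload are placeholders, and authentication is omitted.

```csharp
using Microsoft.AnalysisServices.Tabular;

class PartitionLoader
{
    static void Main()
    {
        using var server = new Server();
        // XMLA read/write endpoint of the workspace (placeholder; credentials omitted).
        server.Connect("powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace");

        Database db = server.Databases.GetByName("SalesDataset");   // placeholder name
        Table table = db.Model.Tables["SalesTable"];                 // placeholder name

        // JSON rows serialized from the in-memory data table (placeholder content).
        string json = "[{\"Id\":1,\"Amount\":10.5}]";

        var partition = new Partition
        {
            Name = $"Push_{System.DateTime.UtcNow:yyyyMMddHHmmss}",
            Source = new MPartitionSource
            {
                // Same pattern as the M expression above: the JSON string is
                // embedded directly in the partition's M query (quotes doubled for M).
                Expression =
                    "let Source = Text.ToBinary(\"" + json.Replace("\"", "\"\"") + "\"), "
                    + "ParsedJson = Json.Document(Source), "
                    + "JsonToTable = Table.FromRecords(ParsedJson) in JsonToTable"
            }
        };
        table.Partitions.Add(partition);

        // Process only the newly added partition, then commit the metadata change
        // and execute the queued refresh.
        partition.RequestRefresh(RefreshType.Full);
        db.Model.SaveChanges();
    }
}
```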
Why would you not just use incremental refresh in Power BI and refresh the same partition multiple times? Each refresh would then import only the additional data.
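One way to sketch that idea with TOM: keep a single fixed partition, overwrite its M expression with only the newly arrived rows, and request an Add-type refresh, which appends the query result to the data already loaded instead of reprocessing every partition. The partition name "AppendPartition" and the helper below are assumptions for illustration, not code from the original post.

```csharp
using Microsoft.AnalysisServices.Tabular;

static class IncrementalLoader
{
    // Re-points one fixed partition at the newly arrived rows and appends them.
    public static void AppendRows(Model model, string tableName, string newRowsJson)
    {
        Partition partition = model.Tables[tableName].Partitions["AppendPartition"];

        // Embed only the new rows in the partition's M query (quotes doubled for M).
        ((MPartitionSource)partition.Source).Expression =
            "let Source = Text.ToBinary(\"" + newRowsJson.Replace("\"", "\"\"") + "\"), "
            + "ParsedJson = Json.Document(Source), "
            + "JsonToTable = Table.FromRecords(ParsedJson) in JsonToTable";

        // RefreshType.Add appends the query result to the partition's existing data,
        // so earlier loads are not re-imported.
        partition.RequestRefresh(RefreshType.Add);
        model.SaveChanges();
    }
}
```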