Hi.
This is my first post here, and I couldn't find this issue anywhere.
I have implemented a PBI Embedded solution for our external customers. It serves about 100k users.
It is a solution in a composite model with RLS. The main PBI report uses only aggregated data in an imported model. Navigation here is good and responsive even though the data size is pretty huge.
My problem starts when the users are supposed to drill through to a report page that runs DirectQuery against a detailed Azure SQL DB containing about 1 billion rows.
Running this report through Desktop and the Power BI service, it performs super fast and returns the detailed report super fast. The underlying DB is tuned and indexed with the correct indexes. Then I switch over and run this as a "customer" through our customer portal.
First of all, Embedded does change the SQL sent to the DB compared to the two other ways of running this report. I guess this has to do with query folding not being done through Embedded.
So instead of running the queries I have tuned for, it splits up and runs a few extra queries. OK, I'll tune for those too, and by themselves I manage to get them to perform well.
Now I hit the actual problem. For some weird reason, the embedded solution generates an extra script that produces a matrix of all possible combinations of 3 of the columns (attributes) for the transaction. And it doesn't apply any limitation in the WHERE clause other than what is specified in the RLS. So behind the scenes it returns several million combinations, even though I only want 1 of those combinations. The drillthrough already has these limitations, and it works fine in Desktop and the service; the WHERE clause is complete for those two. But Embedded builds this really massive list first, then runs a version of the 2 original queries: 1 query for the sum field, which is fast, and one for the actual transaction list, which can return anything from 1 to about 150k rows. This is also fast. Then it limits the one really heavy matrix result to match the transaction list and returns the table I want. This table can then be exported or checked for specific events, depending on user demand.
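Roughly, the shapes of the statements look like this (table and column names are made up for illustration; the real ones differ):

-- Shape of the tuned query that Desktop and the service send:
-- one statement, filtered by both RLS and the drillthrough selection
SELECT t.TransactionId, t.TransactionDate, t.Amount
FROM dbo.Transactions AS t
WHERE t.MerchantId = 12345            -- RLS limit
  AND t.SettlementSessionId = 678;    -- drillthrough limit

-- Shape of the extra query Embedded adds: every distinct combination
-- of three attribute columns, limited only by RLS
SELECT DISTINCT t.MerchantId, t.TransactionType, t.SettlementSessionId
FROM dbo.Transactions AS t
WHERE t.MerchantId = 12345;           -- only the RLS limit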
How can I get Embedded to skip the matrix-generating step, which ruins the whole solution? Without it, this would all perform almost instantly, even for huge lists.
I have read that if I can get the drillthrough filters applied to the visual, it limits the query, but I am only able to get the drillthrough filters applied to the page.
Has anyone experienced this and found a solution? I need to get the response time within an acceptable duration, and I feel it is doable, but I can't find a way to affect the DirectQuery script.
Hi @KjetilAB ,
Thank you for reaching out to Microsoft Fabric Community Forum.
Since the report works fine in Power BI Desktop and the Power BI service, but the embedded version generates extra queries, including a heavy matrix of all combinations that slows down performance, you can try the workarounds below, which may help improve it.
Optimization guide for Power BI - Power BI | Microsoft Learn
Monitor report performance in Power BI - Power BI | Microsoft Learn
Best practices for faster performance in Power BI embedded analytics - Power BI | Microsoft Learn
Troubleshoot DirectQuery models in Power BI Desktop - Power BI | Microsoft Learn
Use DirectQuery in Power BI Desktop - Power BI | Microsoft Learn
Similar issues:
Matrix Performance issue in Direct query - Microsoft Fabric Community
Solved: How drill through works with direct query - Microsoft Fabric Community
If this post helps, please consider accepting it as the solution to help other members find it more quickly, and don't forget to give a "Kudos" – I'd truly appreciate it!
Regards,
B Manikanteswara Reddy
Hi. Thanks for providing some tips, but I have tested most of these.
So unless I am able to remove the step that generates the "matrix" result for data that is not relevant, I'm not sure how to solve this.
Hi @KjetilAB ,
Yes, you're right.
While drillthrough filters are automatically applied at the page level, you can try this workaround, like below:
-- Disconnected table holding only the rows for the drillthrough customer
Filtered Transactions =
FILTER (
    'TransactionTable',
    'TransactionTable'[CustomerID] = SELECTEDVALUE ( 'DrillthroughParameter'[CustomerID] )
)
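One caveat with this pattern: if Filtered Transactions is created as a calculated table, it is evaluated when the model is processed, not at query time, so SELECTEDVALUE will not react to a selection made at runtime. The runtime equivalent is a measure that returns 1 for matching rows, applied as a visual-level filter.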
As a more architectural approach, consider breaking the drillthrough page out into a separate report and linking to it through a URL with query string filters. This avoids embedding the filters in the page model and lets the separate report start clean, filtering visuals directly. You can preload only what's needed, improving performance drastically for DirectQuery scenarios with large datasets.
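For reference, a query string filter on a report URL looks roughly like this (the workspace/report IDs and field names below are placeholders):

https://app.powerbi.com/groups/<workspaceId>/reports/<reportId>/ReportSection?filter=Transactions/MerchantId eq '12345'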
Since this behaviour is specific to Embedded, and you're observing material differences in SQL generation compared to the service and Desktop, this may be a case where Microsoft support should be looped in, especially if the query structure generated by Embedded disregards visual/page filter scope. They can assist with telemetry traces and possibly flag it for engineering.
https://learn.microsoft.com/en-us/power-bi/support/create-support-ticket
If this post helps, please consider accepting it as the solution to help other members find it more quickly, and don't forget to give a "Kudos" – I'd truly appreciate it!
Regards,
B Manikanteswara Reddy
Hi @KjetilAB ,
We wanted to kindly follow up to check whether the provided solution worked for your issue, or let us know if you need any further assistance.
If our response addressed your issue, please mark it as the accepted solution and click Yes if you found it helpful.
Please don't forget to give a "Kudos"!
Regards,
B Manikanteswara Reddy
Hi @KjetilAB ,
As we haven't heard back from you, we wanted to kindly follow up to check whether the provided solution worked for your issue, or let us know if you need any further assistance.
If our response addressed your issue, please mark it as the accepted solution and click Yes if you found it helpful.
Please don't forget to give a "Kudos"!
Regards,
B Manikanteswara Reddy
Hi @v-bmanikante
Sorry for not answering sooner, but I have been trying the different tips you provided.
Sorry to say, none of them have helped me.
The tip about a disconnected (calculated) table: I was not able to use this, since the calculated table is computed at processing time, while the parameters are set on the fly. Unless I did it massively wrong, it cannot pick up the runtime selection.
I was able to create a version of the Filtered Transactions measure and put it into the visual-level filter. The problem is that when run through the embedded frame, the DirectQuery SQL still doesn't limit the result. It still generates the massive matrix result, which is really the thing that slows this down.
I am now trying to slim the model down to a bare minimum, to see if that helps with compression, and splitting the date and time fields. I will post feedback here when I find out what works, because I haven't given up yet.
The biggest problem is that you tune the database for the report when you create it and run it through Desktop or the Power BI service, but when run through the frame as Embedded it provides totally different SQL, and it produces more SQL statements.
Hi @KjetilAB ,
Thank you so much for the detailed follow-up; we really appreciate you taking the time to try out the suggestions despite your busy schedule.
You're absolutely on the right track by slimming down the model and experimenting with separating the date and time fields. These optimizations often help with performance in embedding scenarios.
Please do keep us posted on your progress; your findings could be incredibly valuable for others facing similar issues. And if you manage to find a working solution or workaround, we'd be happy to help you refine it further.
Looking forward to hearing back from you!
Please don't forget to give a "Kudos"!
Regards,
B Manikanteswara Reddy
Hi.
This was a bit of a headache to get working as it should. I have now managed to find a solution that works with an acceptable response time.
Case:
Get decent performance from a Power BI Embedded solution, embedded through a frame in our company portal, for about 100k external users.
The PBI report uses a hybrid model: aggregated data and some dimensions in an imported dataset. It contains 2 tables of about 15 million rows, and a few dimensions like date, merchant, transaction type, etc.
The problem was that from the aggregated data, the users should be able to drill down to detailed transaction level. This drillthrough goes to a DirectQuery page against an Azure SQL DB containing about 1 billion rows of fairly detailed information. From there, the users should be able to download the details to a file for import into their reconciliation systems.
Problem:
I was not able to get the Embedded report to acceptable performance levels. Everything was nice and very responsive through Power BI Desktop and the Power BI service. It was easy to get it to work through those, since the scripts sent to the DB were simple: PBI did what was expected with query folding and sent a limited number of SQL statements down to the DB. The problem was when it was run through the frame in our company portal. This report also uses Row-Level Security, to add another level of complexity.
There were several issues here. First, it did not send the same queries as it did through the service and Desktop. It sent several queries and returned an insane amount of data that wasn't needed in the report.
I found that the main reason for the generation of unnecessary queries was that in the list on the DirectQuery page, I had used a few dimensional fields. This causes the engine to create several queries to build the list. It created a massive matrix of all possible combinations of the dimension data, which it pulled from the DB, even though it only needed 1 combination, which was already defined in the drillthrough filters. It used this matrix to filter against the filters set in the report, but this killed the performance, and it was really hard to tune the indexes correctly due to the several combinations of scripts sent; but the in-model work was what killed performance.
Solution:
I ended up trimming the data in the DirectQuery source. This is always a good idea, but I also had to add a few fields there.
The solution was to force PBI to send only 1 query and actually respect the filters you have set on the drillthrough. I managed to change the DirectQuery page to ONLY use fields from the DQ source. By doing this, I was able to get PBI to send only 1 query, even when run through Embedded. Now I was able to tune the indexes perfectly. The users can now search through detailed transactions, produce lists connected to certain settlement sessions, or produce a list of all transactions for a given month. Merchants with only a few thousand rows get an instant response, and merchants with around 100k transactions get a response in 2-3 seconds.
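With a single, predictable query, it also becomes straightforward to design a covering index for it. A rough sketch of the kind of index that fits this pattern (the table, column, and index names are made up for illustration; yours will differ):

-- Hypothetical covering index matching the one drillthrough query
CREATE NONCLUSTERED INDEX IX_Transactions_Merchant_Session
ON dbo.Transactions (MerchantId, SettlementSessionId, TransactionDate)
INCLUDE (Amount, TransactionType, Reference);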
So, to solve this issue: make sure to only use data from one data source on the DirectQuery page, and especially do not mix in fields from the imported model.
@v-bmanikante suggested splitting into 2 reports. This would in theory have fixed the problem, since the DQ report would then probably only have had data from that data source. But I was reluctant to do this, since the report had already passed a PEN test, and I didn't want to hire a new PEN test team for the implementation of a second report. I am also not sure the URL query string filter would have passed the PEN test.
I hope this gives others with similar issues a few hints on what to look for in their solution.
Regards,
Kjetil Blomsøy