I found this post that is similar to my issue: https://community.fabric.microsoft.com/t5/Issues/1000000-row-limit-error-when-viewing-report-using-R...1
I have a rather large semantic model with 20 dimensions and an extremely large detail fact table. Whether or not I use aggregations, when a query hits the DirectQuery fact table it returns records just fine.
However, if I am using RLS, a DirectQuery visual that pulls an ID field from the fact table together with a field from a connected dimension (or a measure) returns the million-row limit error.
If I remove RLS and just view the data with no filters at all, it returns just fine.
My RLS is on most dimensions and is simply: [ID] = INT ( USERPRINCIPALNAME () )
I am having a similar issue. Were you able to resolve it?
Sadly, not in the way you are probably hoping.
I found out that Power BI pulls ALL of the data into your model and then applies RLS, instead of sending the RLS-filtered query to the source and only pulling what is needed.
I tried several methods to get it to push the request down to the server first, but nothing worked.
What we ended up doing is probably a little more complex, but it works great for us.
We turned the model into an import model. Then we scripted the model out over XMLA so we had the full model script (log in with SQL Server Management Studio, connect to the Analysis Services database, right-click and Script As...).
We took that script and put variables in places like the name/ID/parameters.
Then we used the XMLA endpoint and C# to code a method that lets us split our model out between clients however we see fit: we can put one large client in their own model and five small ones in a shared model that has RLS. We used the REST API scripts extensively for this.
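For illustration, here is a minimal sketch of what the deployment side can look like using the Tabular Object Model (TOM) against the workspace XMLA endpoint. The endpoint, template file, placeholder tokens, and client values are made up for the example, and authentication details are omitted; treat it as a rough outline rather than our production code.

// Minimal sketch: push a parameterized model script to a workspace via the XMLA endpoint.
// The endpoint, file name, and {{placeholders}} are illustrative only.
using System;
using System.IO;
using Microsoft.AnalysisServices;                      // XmlaResultCollection, XmlaResult, XmlaMessage
using TabularServer = Microsoft.AnalysisServices.Tabular.Server;

class ClientModelDeployer
{
    static void Main()
    {
        // Hypothetical workspace XMLA endpoint and scripted model template.
        const string endpoint = "powerbi://api.powerbi.com/v1.0/myorg/ClientWorkspace";
        string template = File.ReadAllText("model-template.tmsl.json");

        // Substitute the variables we put into the scripted model (name/ID/parameters).
        string script = template
            .Replace("{{DatabaseName}}", "Sales_Client01")
            .Replace("{{ClientId}}", "12345");

        var server = new TabularServer();
        server.Connect(endpoint);                      // supply credentials / service principal as needed

        // Run the create-or-replace script and surface any server messages.
        XmlaResultCollection results = server.Execute(script);
        foreach (XmlaResult result in results)
            foreach (XmlaMessage message in result.Messages)
                Console.WriteLine(message.Description);

        server.Disconnect();
    }
}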
It ended up being a lot more work than what I originally tried to do, but it is working like a charm.
Good luck, and if you do find an answer please respond back... it would be nice to know for the future!
Hi @Don-Bot - When using Row-Level Security (RLS) in Power BI with a large semantic model and a detailed fact table, you can hit performance issues or errors like the one-million-row limit error.
This is due to the additional filters and complexity that RLS introduces into the queries sent to the source.
You can check whether the filter ID = int(userprincipalname) can be optimized, for example by resolving the allowed ID from the Users table in a single place (written here as a valid DAX filter expression):
[ID] IN
    CALCULATETABLE (
        VALUES ( Users[ID] ),
        Users[Email] = USERPRINCIPALNAME ()
    )
Since the issue does not occur without RLS, also compare the performance of the same query with and without RLS applied. That can help identify which part of the RLS logic is causing the problem.
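One practical way to do that comparison is to run the same DAX query against the workspace XMLA endpoint twice with ADOMD.NET: once normally and once with the Roles / EffectiveUserName connection-string properties applied, then compare row counts and timings. The sketch below makes a few assumptions: the workspace, dataset, fact table, role name, and test user are placeholders, and impersonating a user this way requires admin permissions on the dataset.

// Minimal sketch: time the same DAX query with and without an RLS role applied.
// Workspace, dataset, table, role, and user names are placeholders.
using System;
using System.Diagnostics;
using Microsoft.AnalysisServices.AdomdClient;          // AdomdConnection, AdomdCommand

class RlsTimingCheck
{
    const string BaseConnection =
        "Data Source=powerbi://api.powerbi.com/v1.0/myorg/ClientWorkspace;" +
        "Initial Catalog=SalesModel;";

    // Hypothetical fact table name; the RLS role filters it when applied.
    const string Query = "EVALUATE ROW ( \"FactRows\", COUNTROWS ( 'Fact' ) )";

    static void Run(string connectionString, string label)
    {
        using var connection = new AdomdConnection(connectionString);
        connection.Open();
        using var command = new AdomdCommand(Query, connection);

        var timer = Stopwatch.StartNew();
        int rows = 0;
        using (var reader = command.ExecuteReader())
            while (reader.Read())
                rows++;
        timer.Stop();

        Console.WriteLine($"{label}: {rows} rows in {timer.ElapsedMilliseconds} ms");
    }

    static void Main()
    {
        Run(BaseConnection, "Without RLS role");
        Run(BaseConnection + "Roles=RLS_Role;EffectiveUserName=someone@contoso.com;", "With RLS role");
    }
}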
Also check the other usual criteria, such as model optimization and column cardinality.
Proud to be a Super User!
Hi @rajendraongole1, stupid question for you:
How do I check the performance of my RLS queries, especially since they error out and I can't get the DAX from them to use in DAX Studio?
Also, what is the query/measure you posted meant for?