Hello,
We are trying to read from a Redshift view into Power BI. The view has 40 columns, 20 of which are PII; those columns are encrypted in the base table, and the DB team applied masking policies to each of the 20 columns to decrypt the data.
When we read the view in Power BI Desktop using the Redshift connector, the preview takes around 15 minutes to load. After Close & Apply, if we use DirectQuery, it takes forever to load the columns into a simple table visual. The data load is very slow, and as more columns are added, the visual fails with an "unexpected error".
I also tried adding a single-select slicer first and then the table, but even with the slicer filter already applied, adding columns to the table visual still takes a long time and eventually fails.
Data volume: 10.5 million rows. If the decryption logic is taken into account, that is 10.5 million rows × 20 masked columns = 210 million decrypted values per Power BI read.
What I want to know is: is there a better way to do this? Has anyone read decrypted data from Redshift into Power BI at this volume? Appreciate any help. TIA
Hi @kumar9999,
Currently I haven't found a better way to handle this scenario.
Perhaps you can export some of the cold data to a separate, temporary data source and then pull from both sources; that reduces the amount of data that has to be decrypted and improves performance.
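As a minimal sketch of the idea (all object names such as analytics.sales_view, reporting.sales_hot, and order_date are hypothetical, and materializing decrypted PII into a plain table would need sign-off from your DB team, since it bypasses the masking policies at query time):

```sql
-- Sketch only: table, view, and column names are placeholders.

-- Materialize recent "hot" rows into a narrow table so the masking
-- policies run once at load time instead of on every report query.
CREATE TABLE reporting.sales_hot AS
SELECT order_id, order_date, customer_name, amount  -- only the columns the report needs
FROM analytics.sales_view                           -- the masked 40-column view
WHERE order_date >= DATEADD(month, -3, CURRENT_DATE);

-- Optionally unload older "cold" rows to S3 as a separate archive source.
UNLOAD ('SELECT * FROM analytics.sales_view
         WHERE order_date < DATEADD(month, -3, CURRENT_DATE)')
TO 's3://my-bucket/sales-archive/'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftUnloadRole'
FORMAT AS PARQUET;
```

Power BI can then read the small hot table quickly and reach the archive through a separate, slower path.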
Regards,
Xiaoxin Sheng
Hi @kumar9999,
First of all, it's important to limit your data at the backend, that is, in your Redshift queries. Make sure you apply the necessary filters in the queries themselves to reduce the amount of data being retrieved.
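For example, instead of pointing Power BI at the full 40-column masked view, the DB team could expose a narrower, report-specific view, so Redshift only decrypts the masked columns the report actually uses. A sketch, with hypothetical view and column names:

```sql
-- Sketch only: analytics.sales_view and the column names are placeholders.
CREATE VIEW reporting.sales_report_v AS
SELECT
    order_id,
    order_date,
    customer_name,  -- e.g. one masked PII column instead of all 20
    amount
FROM analytics.sales_view
WHERE order_date >= DATEADD(year, -1, CURRENT_DATE);  -- row filter at the source
```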
Additionally, check the Redshift cluster configuration (node type and number of nodes), since the data retrieval rate depends on the provisioned capacity. Scaling up the cluster can improve performance.
In the Query Editor, keep things simple and avoid complex logic. Apply basic filters only, such as a date filter that limits the data to the month level, and enable Power BI's Query Reduction options to cut down the number of queries sent to the source.
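If your version of the Amazon Redshift connector supports providing a SQL statement in its advanced options, you can also push the month-level filter down explicitly. A sketch, reusing the hypothetical view from above:

```sql
-- Sketch only: assumes the hypothetical reporting.sales_report_v view.
SELECT order_id, order_date, customer_name, amount
FROM reporting.sales_report_v
WHERE order_date >= DATE_TRUNC('month', CURRENT_DATE);  -- current month only
```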
Use the Performance Analyzer to measure how long each visual's queries take and to confirm that your filters are actually improving performance.
There are also adjustments you can make from the report configuration side.
I hope these adjustments help!
Thanks.