
kumar9999
Frequent Visitor

Data read from AWS Redshift (encrypted → decrypted) into Power BI takes too much time and fails to load

Hello,

 

We are trying to read from a Redshift view into Power BI. The view has 40 columns, 20 of which are PII. Those 20 columns are encrypted in the base table, and the DB team had to apply masking policies to each of them to decrypt the data.

 

When we try to read it from Power BI Desktop using the Redshift connector, the preview takes around 15 minutes to load. After Close & Apply with DirectQuery, it takes forever to load the columns into a simple table visual. The data load is very slow, and as more columns are added, the visual fails with an "unexpected error".

 

I also tried adding a slicer first (limited to single select) and then adding the table. Once I start adding columns to the table visual, which is already filtered by the slicer, it still takes a long time and fails.

 

Data volume: 10.5 million rows. If the decryption logic is taken into account, that is 10.5 million rows × 20 columns = 210 million decrypted values for Power BI to read.
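A quick sanity check on that estimate, assuming one decryption per PII cell value (the exact figure is 10.5 million × 20 = 210 million):

```python
# Back-of-envelope estimate of decrypted cell values Power BI must read.
rows = 10_500_000        # rows in the Redshift view
pii_columns = 20         # masked/decrypted PII columns
decrypted_cells = rows * pii_columns
print(f"{decrypted_cells:,} decrypted cell values")  # prints "210,000,000 decrypted cell values"
```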

 

What I want to know is: is there a better way to do this? Has anyone read decrypted data from Redshift into Power BI at these data volumes? Appreciate any help. TIA

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @kumar9999,

Currently I haven't found a better way to handle this scenario.
Perhaps you can export and migrate some of the cold data to a temporary data source; you can then get data from these two sources to reduce the amount of decrypted data and improve performance.
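A minimal sketch of that hot/cold split, assuming the cutoff is date-based (the row shape, `event_date` field, and 12-month cutoff are all hypothetical):

```python
from datetime import date, timedelta

def split_hot_cold(rows, cutoff):
    """Partition rows: 'hot' stays in the live (decrypted) source,
    'cold' is exported to a cheaper temporary source."""
    hot = [r for r in rows if r["event_date"] >= cutoff]
    cold = [r for r in rows if r["event_date"] < cutoff]
    return hot, cold

# Hypothetical cutoff: anything older than ~12 months is "cold".
cutoff = date.today() - timedelta(days=365)
sample = [
    {"id": 1, "event_date": date.today()},
    {"id": 2, "event_date": date.today() - timedelta(days=400)},
]
hot, cold = split_hot_cold(sample, cutoff)
```

Only the hot rows would then hit the masked (decrypting) view, shrinking the amount of data Power BI decrypts per query.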

Regards,

Xiaoxin Sheng


2 REPLIES

Hi @kumar9999,

First of all, it's important to limit your data at the backend—meaning in your Redshift queries. Make sure you apply the necessary filters in the queries themselves to reduce the amount of data being retrieved.
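One way to push the filter to the backend is to give the Redshift connector a SQL statement that projects only the needed columns and filters rows at the source, so decryption runs on fewer values. A sketch of building such a query (the view and column names below are hypothetical placeholders):

```python
def build_filtered_query(view, columns, date_column, start_date):
    """Build a Redshift SELECT that projects only the needed columns and
    filters rows at the source, so decryption runs on fewer values."""
    col_list = ", ".join(columns)
    return (
        f"SELECT {col_list} FROM {view} "
        f"WHERE {date_column} >= DATE '{start_date}'"
    )

# All names below are made-up placeholders for illustration.
sql = build_filtered_query(
    view="analytics.v_customer_pii",
    columns=["customer_id", "email", "signup_date"],
    date_column="signup_date",
    start_date="2024-01-01",
)
```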

Additionally, check the Redshift cluster configuration you are using (node type and node count), as the data retrieval rate also depends on it. Scaling up the cluster can improve performance.

In the Query Editor, keep things simple and avoid complex logic. Apply basic filters only, including setting the date filter to limit data up to the month level. You can also utilize the Query Reduction option to optimize query execution.

Use the Performance Analyzer to verify that your filters are actually reducing query times.

From the Report Configuration, try the following steps:

  • Increase the cache and memory limits.
  • Increase parallel processing to speed up data processing.
  • Uncheck Auto Date and Time.
  • Disable background data previews to reduce unnecessary loading time.

I hope these adjustments help!

Thanks.

