Anonymous
Not applicable

SSAS Import Removing Duplicates

Hello everyone,

 

I am currently importing our SSAS model into Power BI, as we need to do some development that is outside the scope of a live connection. Each table has been imported individually, and I've connected them in the data model.

 

We are only using a couple of columns from the fact table, as it is huge. The problem is that many of the rows are now identical, and Power Query is automatically removing the duplicates. I can prove this because when I also import the PK, the SUM across the rows increases to the correct value. Unfortunately, as expected, importing the PK increases the file size by 10x.

 

Has anyone experienced this before? Is there a workaround? I don't believe Power Query is designed to automatically remove duplicates, so I'm a bit stuck.

 

Cheers!

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hey @v-kkf-msft,

 

Thanks for the response. I tried this, but it did not work either.

 

I ended up bringing the data in with a SQL query, which returned all the rows, duplicates included. It also loaded significantly faster, so I'll stick with that approach.
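For reference, a minimal sketch of that approach in Power Query M, assuming the fact table lives in a SQL Server source; the server, database, table, and column names below are placeholders, not the actual ones from this thread:

```
let
    // A native SQL query returns every row from the source table,
    // so duplicate rows survive without having to import the PK
    Source = Sql.Database(
        "myserver",      // placeholder server name
        "MyWarehouse",   // placeholder database name
        [Query = "SELECT ColumnA, ColumnB, Amount FROM dbo.FactSales"]
    )
in
    Source
```

Because the query bypasses the SSAS model entirely, the engine never sees a column-reduced table to collapse, which also explains the faster load.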

 

Thanks!


2 REPLIES
v-kkf-msft
Community Support

Hi @Anonymous ,

 

Have you tried importing the PK column into Power BI at the same time, and then deleting that PK column afterwards?

 

[Screenshots: vkkfmsft_0-1638761307556.png, vkkfmsft_1-1638761353418.png]
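In Power Query M, that suggestion might look roughly like the following; the server, model, table, and column names are placeholders, and the DAX query is one of several ways to pull the columns from a tabular model:

```
let
    // Query the SSAS tabular model in import mode, including the PK
    // so each row is distinct when it arrives (placeholder names)
    Source = AnalysisServices.Database(
        "myserver", "MyModel",
        [Query = "EVALUATE SELECTCOLUMNS(FactSales, ""PK"", FactSales[PK], ""Amount"", FactSales[Amount])"]
    ),
    // Then drop the PK as the final step before load
    Result = Table.RemoveColumns(Source, {"PK"})
in
    Result
```

Whether the duplicate rows survive once the PK column is removed is exactly what this suggestion is testing; the original poster reports below that it did not work in their case.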

 

If the problem is still not resolved, please provide detailed error information or the result you expect. Looking forward to your reply.

Best Regards,
Winniz

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

 

 

