MP_123
Microsoft Employee

Filter data on import from SSAS

Hi,

 

I have a problem: my cube is very, very big, so I can't import all the data into PBI (I don't want to connect live, because I want to be able to add measures, formatting, and so on).

How can I filter the data before importing it? I know I can edit the M code, but that happens after the import, right? So I can edit the query, but it still runs against the whole dataset...

 

Does anyone have an idea to solve my problem? Thank you very much!

9 REPLIES
Anonymous
Not applicable

As it stands, this procedure is not really correct; it's flawed, yes.

 

You have to import first and then filter the data out. Or, as I've noticed:

 

You can import a small amount of data, filter it, and then add the remaining columns or dimensions, continuing to filter as you go.

 

It's messy, but effective.

 

On another note.

PBI team: please provide pre-filters for this? 🙂

MP_123
Microsoft Employee

@Anonymous

Thanks!

 

Do you mean `Table.SelectRows`?

Because it doesn't work for me...

 

Anonymous
Not applicable

1) Get Data → Analysis Services

2) Enter your server name, etc.

3) Assuming you're importing data (not a live connection): select your time dimension and simply load all of it.

4) Edit Queries and filter the year, for example to the current year.

5) On the right, in Query Settings under Applied Steps, click the gear icon next to "Added Items" and simply add the dimensions you need, loading them each time. Filter as much as you can at each step as well.
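For reference, the filter in step 4) ends up as a `Table.SelectRows` step in the generated M query, roughly like this (a minimal sketch; the server, database, cube, dimension, and measure names below are placeholders, not the OP's actual model):

```m
let
    // Placeholder server/database names — substitute your own
    Source = AnalysisServices.Database("MyServer", "MyDatabase"),
    MyCube = Source{[Id="Adventure Works"]}[Data],
    // Add the time dimension and one measure (step 3)
    Added = Cube.Transform(MyCube,
        {
            {Cube.AddAndExpandDimensionColumn, "[Date]",
                {"[Date].[Calendar Year].[Calendar Year]"}, {"Calendar Year"}},
            {Cube.AddMeasureColumn, "Sales Amount", "[Measures].[Sales Amount]"}
        }),
    // Step 4: keep only the current year before adding more
    // dimensions via the gear icon (step 5)
    Filtered = Table.SelectRows(Added, each [Calendar Year] = "CY 2024")
in
    Filtered
```

The exact `Cube.*` steps Power BI Desktop generates will differ depending on your cube, but the pattern of "expand a little, then filter" is the same.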

 

For me this works better than loading everything in one go, which either takes far too long or doesn't work at all.

 

IMO, this is something we need to be able to pre-filter before loading.

 

Hope this helps

 

Hi @Anonymous

 

I've just seen your comment on the OP's issue and thought this looked like a great solution. However, when I try it, the filters are removed when adding fields, so it has no impact. The reason is that when you click the 'cog' you're effectively going back a step, to the pre-filter view.

 

Am I missing something?

 

Thank you

Chris

Hi

 

I wondered if you found a solution to your problem?

 

I have the same issue... I need to filter data from the SSAS cube at the source, as there are far too many rows to pull into PBI.

MP_123
Microsoft Employee
Microsoft Employee

@Anonymous

Thank you for your fast answer.

I wanted to filter the data because of memory limits, but after filtering I still get a memory error:

 

DataSource.Error: AnalysisServices: The Auto Exist operation is exceeding the permitted memory limit and is being canceled.
Details:
DataSourceKind=AnalysisServices

 

Do you have an idea how I can bring the data into PBI?

Vicky_Song
Impactful Individual

Another thing you need to take into consideration is that there is a 250 MB limit per dataset imported into Power BI. So, as other community members have figured out, you need to define an MDX or DAX query so that the imported dataset is no bigger than 250 MB.
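As a sketch of such a pre-filtering MDX query (the cube, measure, and dimension names below are Adventure Works style placeholders, not the OP's model):

```mdx
SELECT
    { [Measures].[Sales Amount] } ON COLUMNS,
    NON EMPTY [Product].[Category].MEMBERS ON ROWS
FROM [Adventure Works]
-- The WHERE slicer restricts the result to one year
-- before anything reaches Power BI
WHERE ( [Date].[Calendar Year].&[2024] )
```

In Get Data → SQL Server Analysis Services, a query like this can be pasted into the optional "MDX or DAX query" box, so only the filtered result set is ever imported.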

This is a bit counterintuitive, but in order to reduce the number of rows/columns returned from your server/cube, you have to write the filter steps in M. In many cases the query will fold back to the server ("query folding" is the technical term for letting the server do the pre-selection).
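Such a filter step might look like this (a sketch of one step inside an existing query; `ExpandedDate` and the `[Calendar Year]` column are placeholders for whatever your previous steps produced):

```m
// Keep only the years you need. Placed directly after the expand step,
// a simple filter like this has the best chance of folding back to the server.
Filtered = Table.SelectRows(ExpandedDate,
    each List.Contains({"CY 2023", "CY 2024"}, [Calendar Year]))
```
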

 

In addition to that, M is partially lazily evaluated, so even in "normal" queries the commands aren't executed in the order you read the code, but once they are needed for the result (and when the engine thinks it's best 🙂 ).

 

However, there are limitations that you can read about here.

If you want to pass filter tables to your SSAS model, you should have a look here (this also includes the syntax for passing simple filter arguments into your MDX statements).

More useful info about querying SSAS cubes from Chris Webb here.

Imke Feldmann (The BIccountant)

If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!

How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI- learning resources here -- Performance Tipps for M-queries

Anonymous
Not applicable

Do you have a high number of columns? Try to limit them as much as you can.

 

Now I'm thinking you need to write your own MDX query for this, to get exactly the data you want.

 

I've never run into memory limits working with cubes myself.

 

I'm all out of answers, really... Good luck @MP_123
