TotalAnonymous
Helper II

Error Querying from Analysis Services Source

Dear Community, 

I hope you're doing well. 
I'm currently trying to pull data from Analysis Services using Import mode with a DAX native query. However, I'm facing the issue below. May I know what this error means? Have you experienced it before?

[Screenshot: error message "This query uses more memory than the configured limit" (40,960 MB)]



I'd be glad to hear your input/suggestions and would really appreciate it. Thank you.

1 ACCEPTED SOLUTION

Hi @TotalAnonymous , Thank you for reaching out to the Microsoft Community Forum.

 

Yes, it is optimized enough for your scenario. With a small dataset, performance should be efficient if you use a unique key in “Manage Relationships” and pre-filter the SharePoint data in Power Query to remove unused rows. Test this with SELECTCOLUMNS(TOPN(1000, 'Product Master'), "Product ID", 'Product Master'[Product ID], "Name", 'Product Master'[Name]) and monitor refresh times. If latency occurs, importing the SharePoint table or enabling incremental refresh on Analysis Services with RangeStart/RangeEnd (admin action) can further optimize performance.
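For reference, here is that test written out as a complete DAX statement (a minimal sketch; the "Product ID" and "Name" column names are assumed from the snippet above, so substitute your actual SharePoint Product Master column names):

EVALUATE
SELECTCOLUMNS(
    TOPN(1000, 'Product Master'),
    "Product ID", 'Product Master'[Product ID],
    "Name", 'Product Master'[Name]
)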

 

“Using model measures” means utilizing pre-defined calculations in your Analysis Services model, like [Net Secondary Quantity - Invoiced (EA)], which you can find in the Power BI Fields pane. Instead of writing new DAX, drag these measures into visuals to leverage server-side processing. This avoids cross-source DAX issues with SharePoint, ensuring efficiency. If needed, ask your admin to add measures like TotalQuantityAggregated := CALCULATE(SUM('Secondary Sales Invoices'[Net Secondary Quantity - Invoiced (EA)]), ALL('Date')).
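To make that concrete, a sketch like the one below (reusing the 'Date'[Date] column and the model measure already named in this thread) groups by date and lets the Analysis Services engine evaluate the measure server-side, with no new calculation logic written on your side:

EVALUATE
SUMMARIZECOLUMNS(
    'Date'[Date],
    "Total Quantity (EA)", [Net Secondary Quantity - Invoiced (EA)]
)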

 

If this helped solve the issue, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details, always happy to help.
Thank you.


13 REPLIES
v-hashadapu
Community Support

Hi @TotalAnonymous , Please let us know if your issue is solved. If it is, consider marking the answer that helped 'Accept as Solution', so others with similar queries can find it easily. If not, please share the details.
Thank you.


v-hashadapu
Community Support

Hi @TotalAnonymous , Thank you for reaching out to the Microsoft Community Forum.

 

The error “This query uses more memory than the configured limit” (40,960 MB) indicates that your DAX query exceeds the per-query memory cap during the Power Query import, likely due to high data volume and complex calculations.

 

Please try below:

Replace the date filter in SUMMARIZECOLUMNS with the filter below, then test and expand the range gradually if successful.
FILTER('Date', 'Date'[Date] >= DATE(YEAR(TODAY()), MONTH(TODAY()) - 3, 1))
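For context, a minimal sketch of how that narrower filter would slot into the SUMMARIZECOLUMNS query shared later in this thread (same table, column and measure names; keep only the grouping columns you actually need):

EVALUATE
SUMMARIZECOLUMNS(
    'Date'[Date],
    'Product'[Product SKU Code],
    FILTER('Date', 'Date'[Date] >= DATE(YEAR(TODAY()), MONTH(TODAY()) - 3, 1)),
    "Total Quantity (EA)", [Net Secondary Quantity - Invoiced (EA)]
)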

 

Ask your admin to create an aggregated measure in the model (adjust filters as needed), for example:
TotalQuantityAggregated :=
CALCULATE(
    SUM('Secondary Sales Invoices'[Net Secondary Quantity - Invoiced (EA)])
        + SUM('Secondary Sales Invoices'[Quantity Sellout Return Eaches]),
    ALL('Date')
)

 

Process one MRA at a time: run separate queries and combine the results in Power BI (e.g., with Append Queries in Power Query). Example filter:
FILTER('Management Responsibility Area', 'Management Responsibility Area'[MRA Code] = "IID0201")
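As a sketch, one of those per-MRA queries might look like the following (grouping columns and measure taken from your original query; run it once per MRA code and append the results):

EVALUATE
SUMMARIZECOLUMNS(
    'Date'[Date],
    'Customer'[Distributor Code],
    'Product'[Product SKU Code],
    FILTER('Management Responsibility Area', 'Management Responsibility Area'[MRA Code] = "IID0201"),
    "Total Quantity (EA)", [Net Secondary Quantity - Invoiced (EA)]
)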

 

Run this to isolate the issue:
EVALUATE
TOPN(
    1000,
    SELECTCOLUMNS(
        'Secondary Sales Invoices',
        "Product SKU Code", 'Secondary Sales Invoices'[Product SKU Code],
        "Invoice Date", 'Secondary Sales Invoices'[Invoice Date]
    )
)

 

Start with the 3-month date range test. If it works, scale up while monitoring memory. If issues persist, collaborate with your admin to pre-aggregate data. If needed, request your admin to raise the memory limit.

 

If this helped solve the issue, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details, always happy to help.
Thank you.

Hi, @v-hashadapu , 

Thanks for your detailed explanation. Please allow me to check and get back to you ASAP 🙂

Hi @TotalAnonymous , Happy to help. Please take your time and check if it works. If it doesn't work, please share the details. Thank you.

Thanks @v-hashadapu. Please wait...

I'm just curious: this DAX query is like a native query for Analysis Services, am I correct? However, this kind of issue is really challenging. I'm thinking of switching to Live Connection mode, but since I have another source from SharePoint that I want to add, I'm afraid it will affect performance due to the connection mode switching (Live Connection -> DirectQuery) 🙂

Hi @TotalAnonymous , Thank you for reaching out to the Microsoft Community Forum.

 

When you use a native DAX query in Power Query against Analysis Services, you're submitting a direct command to the SSAS engine, like a native SQL query, bypassing the usual metadata layer and returning a fixed result set defined by your DAX. So yes, you’re correct, it’s effectively a "native query" for SSAS. In Import mode, Power BI must fully load the entire result into memory during refresh with no lazy loading, which is why your query hits the 40,960 MB memory limit, likely due to high cardinality from SUMMARIZECOLUMNS and complex measures (e.g., [Net Secondary Quantity - Invoiced (EA)] + SUM(...)) across large tables like 'Secondary Sales Invoices'.

 

Switching to Live Connection mode keeps query execution on the Analysis Services server, potentially bypassing the memory limit since Power BI doesn't load data upfront. Test this first by switching to Live Connection and monitoring performance with a small query such as:
EVALUATE
SELECTCOLUMNS(
    TOPN(1000, 'Secondary Sales Invoices'),
    "Product SKU Code", 'Secondary Sales Invoices'[Product SKU Code],
    "Invoice Date", 'Secondary Sales Invoices'[Invoice Date]
)

 

Adding SharePoint data complicates things, as Live Connection works only with Analysis Services, requiring DirectQuery or Import mode for SharePoint; mixing modes is supported via composite models, but DirectQuery may slow queries, especially with large or unstructured SharePoint data, causing latency when merging results. Start with a small, filtered SharePoint dataset in DirectQuery, link it to Analysis Services via “Manage Relationships,” and optimize by pre-filtering SharePoint data and using model measures directly to avoid complex cross-source DAX.
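To illustrate the end state: once the SharePoint table is related to the Analysis Services Product Master, a visual (or a query like the sketch below) can group an existing model measure by a SharePoint column. The table name 'Product Master SharePoint' and the column [Category] are hypothetical placeholders; substitute whichever of your four SharePoint columns you need:

EVALUATE
SUMMARIZECOLUMNS(
    'Product Master SharePoint'[Category], // hypothetical SharePoint mapping column
    "Total Quantity (EA)", [Net Secondary Quantity - Invoiced (EA)]
)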

 

If this helped solve the issue, please consider marking it 'Accept as Solution' so others with similar queries may find it more easily. If not, please share the details, always happy to help.
Thank you.

Hi @v-hashadapu , 

Thanks for your detailed explanation :). The SharePoint dataset that I want to incorporate with Analysis Services is a Product Master/Mapping that has a different structure/columns from the Product Master in Analysis Services. The SharePoint Product Master has 4 columns.

If I change the mode to DirectQuery and link this Product Master to the Analysis Services table (Product Master SharePoint to Product Master Analysis Services, 1:1 relationship), is it optimized enough? Also, could you please explain more about "using model measures", which you mentioned?

Thank you

Thanks for your answer! I will mark it as the solution first. There is another option, which is using a virtual machine when opening Power Query. Our admin said that instead of opening Power Query on my desktop, I should try opening it in a virtual machine. I'm trying it right now. Hopefully it can solve the issue.

Hi @TotalAnonymous , Thanks for the update. Hope it works for you.


TotalAnonymous
Helper II

Hi, @Cookistador 

Thanks for your feedback. Really appreciate it. 
In this case, I use the DAX query as follows:

EVALUATE
SELECTCOLUMNS(
    SUMMARIZECOLUMNS(
        'Date'[Date],
        'Customer'[Distributor Code],
        'Store Invoice'[Channel Local],
        'Product'[Product SKU Code],
        FILTER(
            'Date',
            'Date'[Year] IN {YEAR(TODAY()), YEAR(TODAY()) - 1}
        ),
        FILTER(
            'Management Responsibility Area',
            'Management Responsibility Area'[MRA Code] IN {"IID0201", "IID0301"}
        ),
        FILTER(
            'Secondary Sales Invoices',
            'Secondary Sales Invoices'[Invoice Status] IN {"I - Confirmed", "S - Invoiced", "Unknown"}
        ),
        "Total Quantity (EA)", [Net Secondary Quantity - Invoiced (EA)] + SUM('Secondary Sales Invoices'[Quantity Sellout Return Eaches]),
        "Discount",SUM('Secondary Sales Invoices'[Amount Sellout Discount]) + SUM('Secondary Sales Invoices'[Amount Return Discount]),
        "IMS", [Net Secondary Amount - Invoiced] + [Total Returns Amount]
    ),
    "Date", [Date],
    "Distributor Code",[Distributor Code],
    "Channel", [Channel Local],
    "Product SKU Code", [Product SKU Code],
    "Total Quantity (EA)", [Total Quantity (EA)],
    "Discount", [Discount],
    "IMS", [IMS]
)

In this case, I use the SELECTCOLUMNS function and only choose 7 columns. All the columns are relevant to this one fact table. The issue happens in Power Query, not in the visuals.

Cookistador
Super User

As mentioned, what you are trying to import is too heavy.

Just to be sure that the issue is coming from this side, can you try to import only one table?

If it works, can you try to reduce the size?

1) Are you sure that all tables are required?

2) Only include the necessary columns in your visuals. Avoid dragging in entire tables if you only need a few columns.

3) Do you need all the data, or can you apply some slicers/filters in the query?

4) Do you have any aggregation tables in your SSAS? If not, creating some will improve performance (see the sketch after this list).
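A minimal sketch of what such an aggregation could look like if defined as a DAX calculated table (the column and measure names are taken from the query above; the table name 'Sales Aggregated' is just a placeholder, and in SSAS the admin would normally build this in the model itself):

Sales Aggregated =
SUMMARIZECOLUMNS(
    'Date'[Date],
    'Product'[Product SKU Code],
    "Total Quantity (EA)", [Net Secondary Quantity - Invoiced (EA)] + SUM('Secondary Sales Invoices'[Quantity Sellout Return Eaches])
)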
