Hit Reply and let us know what you think of DirectQuery for Power BI datasets and Azure Analysis Services. To learn more about this feature, please visit this blog post or our documentation.
Here are some areas that we'd like to hear about in particular:
Thanks and we look forward to hearing your feedback!
- The Power BI Modeling Team
Thanks for the feedback. This is something we're planning to change: only 'read' permissions will be required when we make this feature generally available.
We are still waiting on infrastructure changes to roll out in datacenters across the globe before we can make this available to Pro.
Yes! It's weird to grant change privileges to someone who would only be reading the dataset outside the original report, whether in a Power BI thin report or in Excel.
Getting this error in the Service after days of developing in Desktop with no issue or warning during local refreshes. Very discouraging, and I wouldn't recommend that anyone develop in Desktop using this feature. I now have to import the datasets I am using instead. Going forward I've learned not to use any preview features if I want to manage expectations well. Preview? No thanks.
I understand that this is annoying. We don't actively stop you from doing this in Desktop because we don't know whether you are going to publish to the Service. So instead of "breaking" Desktop for those who never publish to the Service, we allow it, and it then breaks when you publish to the Service. If we warned about everything that won't work in the Service, that would be a very jarring experience and would get in the way a lot, while it might not even be relevant to you. Please see https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-directquery-datasets-azure-analysis-s... for the list of limitations in this preview; your exact scenario is listed there as well.
It would be great if there were an officially sanctioned, clear approach (or more than one) to managing large datasets (30 GB+) with DirectQuery (or without it). This matters especially because DirectQuery currently seems to be the best way to deal with huge datasets in Power BI. I haven't been able to load huge tables into dataflows, datamarts, or blob storage without timeout errors, etc. I'm fully in the Microsoft ecosystem, so I connect to Azure SQL Database using DirectQuery. The tables in the database have columnstore indexes and I'm using automatic aggregations. I'm pretty sure that's all (or mostly all) I can do, but I'm not entirely certain, because it's a little bit of witchcraft. I would love a video or post on this topic: dealing with huge datasets, from SQL Server et al. to Power BI.
This thread isn't related to large models (import or DirectQuery). It's about connecting to Analysis Services in DirectQuery mode rather than Live Connection mode.
It looks like there are a number of posts under the desktop section of the site that may be helpful:
As a consultant myself working with large models (primarily against Snowflake), my advice is to import the data into the model in the first place. With large data this requires implementing incremental refresh. (Note that a P1 capacity has a model size limit of 25 GB; see Capacity and SKUs in Power BI embedded analytics - Power BI | Microsoft Docs.)
If you can't scale up the capacity, or reduce the data size through optimisation and rationalisation, then utilise DirectQuery with aggregations: either auto-aggregations (which may be helpful depending on how repetitive the queries are) or manual aggregations.
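For reference, incremental refresh in Power BI is driven by two reserved datetime parameters, RangeStart and RangeEnd, which Power BI substitutes at refresh time to partition the table. A minimal Power Query (M) sketch of the required filter step; the server, database, table, and date column names here are placeholders, not anything from this thread:

```
let
    // Placeholder source: an Azure SQL database with a large fact table
    Source = Sql.Database("myserver.database.windows.net", "SalesDb"),
    FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Filter on the reserved RangeStart/RangeEnd parameters so Power BI
    // can generate partitions; use >= on one boundary and < on the other
    // so rows exactly on a boundary are not loaded twice.
    Filtered = Table.SelectRows(
        FactSales,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Filtered
```

With a filter like this in place, the incremental refresh policy itself is configured on the table in Power BI Desktop before publishing.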
There is plenty of material on these topics if you Bing search. GuyInACube has some good material as a starting point:
Thanks for adding this feature. Really useful; however we are having some issues at my company.
As a tenant admin, connecting to a PBI dataset (switched to DirectQuery) + an Excel file is working fine.
But my colleague (not a tenant admin, though a workspace admin) is getting the error
"Your credentials could not be authenticated: 'Credentials are missing.'" although everything is set up correctly in the source (we tried several kinds of credentials). He also enabled the feature in Power BI Desktop.
I checked all the tenant admin settings (mentioned here) too.
Do you have any idea what could be causing the problem?
Thanks for your help
Sounds like a permissions problem or credentials not being configured for the source. Have you checked the source settings in the Service?
I'm having trouble developing the composite models.
I connected to a Power BI service dataset and created a model in DirectQuery mode.
I then enabled the DirectQuery for Power BI datasets and AS option.
Despite this, it doesn't give me the option to connect to any other file.
Also, I went to External Tools and tried to open Tabular Editor, and it gives me an error: The remote server returned an error: (401) Unauthorized.
Am I doing something wrong?
Do I need to enable anything else for composite models to work?
The report is published with a free account; I do not have a Pro or Premium license.
Also, it is published on my workspace.
From the documentation: "Using DirectQuery on datasets from 'My workspace' isn't currently supported." See https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-directquery-datasets-azure-analysis-s...
This feature now supports SQL Server Analysis Services 2022. It's great!
Is it possible to support SQL Server Analysis Services 2019 in the future?
Thank you for the reply.
But it's a pity, because SSAS 2019 is the latest official version for now.
On-premises users will have to wait until at least next year to use this functionality, and there is also a migration to do.
Anyway, it's a very useful functionality. Thank you for the work.