Hi,
Currently evaluating a dashboard solution. The data source is an on-premises SQL Server. I'm trying to work out whether to use DirectQuery or import mode for the solution, the pros and cons, etc.
For me, import will give the best performance as long as the dataset doesn't exceed the 1 GB limit (this is a small dataset anyway). What is your experience with DirectQuery performance?
In general, the options are: import mode for all sources, DirectQuery for SQL Server sources, or a live connection for SSAS sources. Factors will include the size of the dataset, etc. Any best practices or guidelines around these options would be great.
Hi @Anonymous
There are multiple factors to consider.
Frequency: with import you are limited to 8 scheduled refreshes per day, and in my experience you can't depend on the exact times. A refresh runs in an approximate window as close to the scheduled time as the service load allows. If you need to refresh more often, you'll need to go for a live connection.
Data size: there is a 1 GB file limit for import, and even below that, consider how often you want to bring the data in.
Live connection limitations: if you go for a live connection against the source system, it can have a negative impact on that system, and performance on aggregated data can be better if it is pre-aggregated.
A live connection against an SSAS model works quite well, and that's why we use it at work. However, you need the Enterprise edition of SQL Server behind it. Although SSAS Multidimensional is supported in SQL Server Standard, Power BI issues DAX queries against it, and that is not supported in Standard until SQL Server 2016.
Import from SSAS works a little strangely, because you may need to recreate logic that already exists in your SSAS model. For example, if you import a percentage measure, it is computed at the detailed level, and if you want the correct percentage at higher levels, you'll need to recalculate it as a measure in your model (see the sketch below).
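A minimal DAX sketch of that point, assuming hypothetical Sales[Profit] and Sales[Amount] columns: defining the ratio as a measure means it is recomputed at whatever level a visual aggregates to, instead of summing pre-calculated row-level percentages.

```dax
-- Hypothetical column names, not from the original model.
-- As a measure, the ratio is re-evaluated for each aggregation level
-- rather than being imported as a fixed percentage column and summed.
Margin % =
DIVIDE ( SUM ( Sales[Profit] ), SUM ( Sales[Amount] ) )
```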
Security is another concern. If you import data into multiple files from the same data source, you'll need to recreate the role logic every single time. In comparison, with a live connection you set it up once on the data source side.
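As a rough illustration of that duplication (hypothetical table and region value): a row-level security role is just a DAX filter expression on a table, and with import it has to be redefined in every PBIX file.

```dax
-- Hypothetical RLS filter on a Sales table.
-- With import, this role expression must be recreated in each PBIX file;
-- with a live connection to SSAS, the role is defined once on the server.
Sales[Region] = "West"
```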
My personal preference is:
Use import against SQL Server, pre-calculating data for better query performance.
Use a live connection against SSAS so you don't duplicate, at the report level, logic that already exists in your on-premises data model.
Jiri
Makes sense to me: import from SQL, live connection for SSAS.
Refresh frequency is not an issue. Most DW systems refresh once a day anyway.
By the way, we use SQL Server 2014 Enterprise, so SSAS can be an option.
So would you prefer importing from the SQL source, or building an SSAS cube and then using a live connection to the cube? Unless size is an issue, I don't see a case for building SSAS between Power BI and SQL Server.
@Anonymous Another thing to throw into this discussion is the need for backup/recovery/change control for a corporate solution. If you have the option to build and use Tabular, I would suggest you go that route for several reasons:
1) Your BISM layer isn't stuck in Desktop files.
2) You only need to maintain one model. (If you build in Desktop, you have to maintain "x" copies depending on how many reports you created. For example, with 5 PBIX files in your project, if you realize you need to add a column or calculation, you have to update it in 5 places rather than 1.)
3) The code/model is secured and backed up on the SSAS instance and in Visual Studio.
4) Check-in/check-out policies with TFS.
5) Processing doesn't seem to be a major concern for you, but having a dedicated SSAS instance to handle the load is nice.
6) Always up to date without having to manage refresh schedules in Power BI.
Live connection to Tabular is my preferred method for corporate solutions for all of the above reasons.
Thanks for the advice. Another factor that can be thrown into the equation is whether it is formal, governed reporting or ad-hoc analysis. It seems that for formal reporting the preference is to have the SSAS layer, but if the business is after quick, cheap, ad-hoc analysis (which happens more and more in the BI space), import mode with Power BI offers the most cost-effective solution.
@Anonymous
That sums it up well. Just to add: for ad-hoc analytical requests you can also build the model in Power Pivot and publish it to the Power BI service.
Jiri
@Anonymous In all actuality, one of the things I love about Power BI is that these lines are blurred, in the sense that you don't have to silo yourself permanently into any one solution. You can use an ad-hoc solution while a longer-term solution is built for certain types of "formal" reports (if it's worth it). And the code you develop in the BISM layer can be reused at a later date.
Every business area or department will have its own best use case; in all of them I've been extremely satisfied with performance.
@Anonymous
For instance, we have a requirement for a 15-minute refresh delay, and that can't be achieved with import. If that's not your case, good for you 🙂 and you can use import as well.
I personally prefer an intermediate SSAS layer, where you define all the calculations and logic once and reuse them across multiple reports via a live connection. If you did it with import and a calculation later changed, you would need to update all the files. That doesn't happen with a live connection.
But both approaches have advantages and limitations, so pick the one that best suits your current case.
Jiri
@Anonymous Below are a few simple rules I've followed.
1. Import - choose this mechanism if the dataset is relatively small and there are no issues with data sensitivity.
2. DirectQuery - use this if the dataset is large or the data is private and sensitive. I work in healthcare, so I sometimes deal with data that is very sensitive and has a legal requirement to stay on premises; in that case I would use DirectQuery.
Performance-wise, import will be quicker for a small dataset since the data is cached, but if the dataset is large and you've used import, you may see performance issues plus longer data refresh times. DirectQuery takes away the need for a data refresh, but there may be some delay in getting data for visuals, because queries are sent to the on-premises data source. For example, when you apply a filter, queries are sent to the on-premises source.
In either case it depends on your specific requirements.
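To illustrate the trade-off described above (hypothetical table and column names): the measure definition is the same in both modes, but where it is evaluated differs.

```dax
-- Hypothetical measure; the definition is identical in import and DirectQuery.
-- Import: evaluated against the in-memory cache, refreshed on a schedule.
-- DirectQuery: each visual or slicer interaction generates a query against
-- the on-premises SQL Server at report time, which is where the delay comes from.
Total Claims = SUM ( Claims[Amount] )
```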
@AnkitPatira How do you find the experience of dealing with the limitations of DirectQuery, such as limited measure support? E.g., I found today that DirectQuery does not support the SAMEPERIODLASTYEAR function. I guess this also applies to other time-intelligence expressions. Without them, Power BI is much less powerful. I've always thought that to fully utilise Power BI's power, import mode should be the preferred option.
The situation could be a bit different if you have SSAS as a live-connection source, where you can implement the time-intelligence logic within SSAS.
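For reference, a sketch of the kind of time-intelligence measure being discussed, assuming a hypothetical [Total Sales] measure and a 'Date' table; per the post above, at the time of this thread it worked in import mode (or inside an SSAS model consumed via live connection) but not over DirectQuery.

```dax
-- Hypothetical measure and Date table.
-- Works in import mode or in an SSAS model; the poster found
-- SAMEPERIODLASTYEAR unsupported over a DirectQuery connection.
Sales LY =
CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
```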