We've had a data warehouse for a number of years, which includes a sales data cube for reporting. This solution served a few SSRS reports but never saw any self-service action in its entire lifetime. We've recently had our eyes opened to the wealth of BI-capable tools centered around Excel and now Power BI. Our SSAS solution rarely changed, as we don't have in-house talent for SSIS and MDX tools.
Using Excel and Power BI has made us amazingly productive at developing BI reports that are blazingly fast. We find ourselves reaching around the data cube solution, grabbing data from raw SQL tables instead, and we feel pretty good about the results.
At what point, or for what reason, would we spend the money to continually enhance our SSAS solution as we find additional data needs? Or is Power BI a good final solution, or a modeling tool on the way there?
Any and all thoughts appreciated.
I'll give you my perspective for what it's worth. Remember that SSAS comes from a time before in-memory, column-store data modeling, so it was originally considered "fast" because it did pre-aggregation, etc. Today, people still cling to it for other reasons, mainly centered around the broad category of "governance": things like master data management, security, a single source of truth, simplified data marts, etc. For example, if there are calculations that must occur on the data, you can ensure they are done in a central location, so you know they are always correct, versus having 20 different people pulling the raw data and all of them doing the calculation differently.
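To make that "central calculation" point concrete, here is a minimal sketch of what it looks like in an SSAS Tabular or Power BI semantic model: measures defined once, in DAX, that every Excel pivot or Power BI report then reuses. The Sales table and its Quantity, UnitPrice, and UnitCost columns are hypothetical stand-ins for whatever your cube actually exposes.

```dax
-- Hypothetical Sales table with Quantity, UnitPrice, and UnitCost columns.
-- Defining these measures once in the model means every report that
-- connects to it gets the same numbers.

Total Sales Amount =
SUMX ( Sales, Sales[Quantity] * Sales[UnitPrice] )

Total Cost =
SUMX ( Sales, Sales[Quantity] * Sales[UnitCost] )

Gross Margin % =
DIVIDE ( [Total Sales Amount] - [Total Cost], [Total Sales Amount] )
```

If those same figures were instead computed in 20 separate workbooks pulling raw SQL tables, each author would have to reimplement the margin logic themselves; the model-level definition is what keeps the answers consistent.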