We have an on-prem multidimensional cube that updates daily. We are investigating whether we should import a subset of this data into a Fabric Data Warehouse so that it can be reported on using Power BI. The data in the warehouse would be historical by financial year/period, and it would be loaded every period.
What are our options for loading the data? I found that I cannot select SSAS Multidimensional as a data source when setting up my Gen2 Dataflow. I hear that the ability to load tabular data from an on-prem SSAS instance is coming, but I don't see anything about loading multidimensional data. Is this on the roadmap in the near future? If not, what are my options? I found that I can pull the data using OPENQUERY in a stored procedure, but I'm interested in whether there are better options out there. Thanks for your help.
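For reference, the OPENQUERY approach mentioned above looks roughly like this. This is a sketch only: the linked server name, cube name, and dimension/measure names are placeholders, and it assumes a linked server has already been configured against the on-prem SSAS instance using the MSOLAP provider.

```sql
-- Sketch: pull cube data through a linked server with an MDX pass-through query.
-- [SSAS_MD], [SalesCube], and the measure/hierarchy names are placeholders.
SELECT *
FROM OPENQUERY([SSAS_MD],
    'SELECT
        { [Measures].[Sales Amount] } ON COLUMNS,
        [Date].[Fiscal Period].[Fiscal Period].MEMBERS ON ROWS
     FROM [SalesCube]');
```

Note that the column names OPENQUERY returns from MDX are the fully qualified member captions, so you would typically alias them in the stored procedure before landing the rows in the warehouse.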
SSAS Multidimensional is not well suited for bulk data extraction. The MDX queries required for bulk extraction tend to be extremely expensive and slow.

Instead, reuse the queries you defined in the multidimensional model's data source views to pull the data into Fabric with a Copy task or a Gen2 Dataflow. Both the Copy task and Gen2 Dataflows support all the data sources that Multidimensional does.
Hi @ProtoPrimate
Thanks for using Fabric Community.
At this time, we are reaching out to the internal team to get some help on this. We will update you once we hear back from them.
Thanks