I have an Analysis Services cube that has a DataTable, a CalenderTable, and measures built on the DataTable.
Currently these tables and measures are used in a Power BI "Line and stacked column chart" and all is good. The chart shows month-wise incident_counts for the last 12 months (the month name comes from the CalenderTable and is used on the X-axis; incident_counts comes from the DataTable and is used on the column Y-axis). The line Y-axis shows the value of the 12MonthRIR measure for those months. This measure internally uses a few other measures and columns.
Now I have a new requirement to pull the data from the above cube into a Fabric Lakehouse using Dataflow Gen2, and I did that, pulling only the required columns (DateId, ReportedData, IncidentId) plus the 12MonthRIR measure. I have not included the other columns and measures used inside the 12MonthRIR measure.
After the data has been pulled, if I do the same aggregation by month, the incident_counts matches the Power BI report, but the 12MonthRIR from the measure column does not.
What could be wrong here?
Is it a feasible scenario to pull measures into a table along with other columns, to avoid doing the same calculation again?
Is it required to pull all the other dependent measures and columns too?
At the moment I don't have an Analysis Services environment to build a sample and try this; I only have read access to the cube for the above pull.
Thanks,
Prabhat
@prabhatnath
Assuming you connected to your cube in live connection / DirectQuery mode, your results will be correct in Power BI, and you don't need to pull the dependent columns into Power BI, because Power BI sends a query whenever you open the visual.
However, if you want to pull this into the Lakehouse using Dataflow Gen2, then adding the required fields along with the measure might lead to errors if you are not aggregating the measure correctly.
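To illustrate why re-aggregating an exported measure column can diverge, here is a small sketch. It assumes 12MonthRIR is a ratio-style (non-additive) measure, roughly incidents × 200,000 / hours worked; the formula and every number below are made-up assumptions, not taken from this thread.

```python
# Hypothetical illustration (not from the thread): assume 12MonthRIR is a
# ratio-style measure, e.g. incidents * 200000 / hours_worked. The numbers
# and the formula here are made up to show the behaviour.

months = [
    {"month": "Jan", "incidents": 4, "hours": 50000},
    {"month": "Feb", "incidents": 1, "hours": 40000},
    {"month": "Mar", "incidents": 7, "hours": 90000},
]

def rir(incidents, hours):
    # Ratio-style measure: non-additive across rows.
    return incidents * 200000 / hours

# Correct: evaluate the measure over the aggregated base columns,
# which is what the cube does at the visual's grain.
rir_over_total = rir(sum(m["incidents"] for m in months),
                     sum(m["hours"] for m in months))

# Wrong: re-aggregate (here, average) the already-evaluated per-row values,
# which is what happens if the exported measure column is aggregated again.
avg_of_row_rir = sum(rir(m["incidents"], m["hours"]) for m in months) / len(months)

print(round(rir_over_total, 2))  # 13.33
print(round(avg_of_row_rir, 2))  # 12.19 -- does not match
```

This is why the additive incident_counts survives the round trip but the ratio measure does not: summing counts commutes with aggregation, while dividing first and aggregating afterwards does not.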
I think there are two possible solutions:
1. Pull all the dependent columns and tables into the Lakehouse and rewrite the measure there.
2. If the first option doesn't work for you, then your table aggregation should be at the same level (grain) as the aggregation your visual requires.
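A minimal sketch of what the second option means in practice (column names and values below are assumptions, not from the thread): land the measure already evaluated at the visual's month grain, then read each stored value as-is instead of aggregating it again.

```python
# Hypothetical month-grain rows as they might land in the Lakehouse table
# (column names and values are assumptions, not from the thread).
rows = [
    {"month": "2024-01", "incident_count": 4, "rir_12m": 1.8},
    {"month": "2024-02", "incident_count": 1, "rir_12m": 1.7},
]

# Additive column: safe to aggregate further (this is why incident_counts
# still matches the report after the export).
total_incidents = sum(r["incident_count"] for r in rows)

# Non-additive measure: read each month's stored value as-is; summing or
# averaging rir_12m across months would not reproduce the cube's result.
rir_by_month = {r["month"]: r["rir_12m"] for r in rows}

print(total_incidents)          # 5
print(rir_by_month["2024-01"])  # 1.8
```

The design choice here is that the export grain must equal the reporting grain: any further aggregation of the measure column has to be avoided, because the measure's logic lives in the cube, not in the landed table.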
If the post helps, please give it a thumbs up.
If it solves your issue, please accept it as the solution to help other members find it more quickly.
Tharun
Thanks Tharun for the reply.
The Power BI report has no issues; it is live connected and all good.
The issue is when I load the same data into the Lakehouse.
Your suggested option 1 is not feasible, as the team wants to avoid doing the calculation in multiple stages, since the logic may change.
I will try to pull the dependent measures and columns and see if that helps.
Thanks,
Prabhat