Hi all,
I have a report where I have a bar graph visual. The X-axis is Year and Y-axis is value. It is driven from TableA in my model:
TableA
Reference | Value | Date |
aaa | 80 | 15/04/2022 |
aaa | 20 | 24/07/2022 |
aaa | 46 | 20/02/2020 |
aaa | 84 | 02/09/2020 |
aaa | 235 | 01/04/2018 |
aaa | 69 | 01/02/2018 |
aaa | 25 | 01/02/2017 |
aaa | 24 | 21/10/2014 |
bbb | 75 | 30/12/2021 |
bbb | 95 | 28/11/2020 |
bbb | 11 | 20/07/2018 |
bbb | 66 | 16/07/2017 |
bbb | 84 | 12/05/2016 |
bbb | 64 | 01/02/2015 |
bbb | 90 | 08/06/2014 |
bbb | 36 | 02/04/2013 |
As things stand today, the date is December 2022. I want to display the SUM of values split by year. The problem with my current bar chart is that each year's sum starts from zero. Previous years also contain gaps: where no value is declared against a reference in a year, the last known value is not carried over into that year. So if I look at 2023's total on 1st January 2023, 2023 shows zero, as it's a new year and no values dated within it exist yet.
What I need is for values to carry through the years. Take Reference aaa in TableA above: its last value in 2018 was 235, on 01/04/2018. I would want that 235 to be part of the amount summed for 2018 and also for 2019, but not for 2020, because in that year there are two reviews of the value and the latest one, 84, takes priority for 2020.
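To illustrate my reading of that carry-forward rule, reference aaa alone should contribute per year roughly as follows:
Year | aaa contribution |
2014 | 24 |
2015 | 24 (carried over) |
2016 | 24 (carried over) |
2017 | 25 |
2018 | 235 (latest of the two 2018 values) |
2019 | 235 (carried over) |
2020 | 84 (latest of the two 2020 values) |
2021 | 84 (carried over) |
2022 | 20 (latest of the two 2022 values) |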
There's also a cross check needed against TableB to see whether the reference was actually purchased or sold. If it has been sold and the sale date was, say, in 2020, then you would not want to carry the value of that reference into 2021:
TableB
Reference | Purchase | Transaction Date |
aaa | Purchase | 01/01/2013 |
bbb | Purchase | 01/04/2013 |
ccc | Sale | 28/07/2022 |
ddd | Sale | 22/06/2022 |
How do I achieve this? Grateful for any help.
Hi Julian,
This is just an addition concerning the question of whether there can be a performant, completely measure-based approach. So, no action required.
As we went back and forth on a hands-on solution, it remained unclear whether a sufficiently performant approach based on measures alone is possible. I came across “semi-additive” measures in the “DAX Patterns” book, and this gives part of the answer as to how this is feasible.
To complete the topic and to provide a comprehensive description of the approach, I will give a full description below. The approach with pre-calculation of exchange rates and amounts in calculated columns is still completely OK. It consumes space for the calculated columns, but at an affordable cost for our discussed scenario.
Still, by describing the approach we can see better what is going on. And it is simply a community contribution. So, let’s start:
Fully Measure Based Semi-Additive Approach for Currency Conversion:
Data Model:
We generalize and simplify the model to the essence of the problem. We have tables as follows:
The rest of the original model, like the sales recognition with value zero, is not relevant for the performance. In the simulation we have randomly generated test data.
A word of warning: as all figures are random numbers, a refresh of the model will change them all, and if we had e.g. 5 transactions before, there might be none afterwards, as the number of transactions per entity is random, too.
The model relationships are as per the screenshot below. The entity table and the date table filter the fact table “Transactions”. The only difference to our work is that the transaction table already has the original currency as a column. The element we had been missing is an exchange rate ID, a composite key of date and currency, since there should be exactly one exchange rate per currency and day. With that, the transaction table carries the exchange rate ID of the original currency for the valuation date, and this is how we define the relationship between the exchange rate table and the transaction table.
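As a sketch, such a composite key could be built as a calculated column in the exchange rate table (the column names TIME_PERIOD and CURRENCY are assumed from the ECB feed):
ExchangeRateID =
// one key per currency and day, e.g. "2022-06-30-DKK"
COMBINEVALUES (
    "-",
    FORMAT ( ExchangeRates[TIME_PERIOD], "yyyy-MM-dd" ),
    ExchangeRates[CURRENCY]
)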
So, this is a standard star schema for the problem:
Now measures are defined as follows:
OriginalExchangeRate =
CALCULATE (
SUM ( ExchangeRates[OBS_VALUE] ),
CROSSFILTER ( 'Transaction'[OriginalCurrencyExchangeID], ExchangeRates[ExchangeRateID], BOTH )
)
Here, we use the fact that the exchange rate ID is unique: with the existing relationship between the transaction table and the exchange rate table, CROSSFILTER gives us exactly one record from the exchange rate table. This yields the exchange rate. It uses the existing relationship and should be quicker than the LASTNONBLANK approach discussed before.
EuroAmount =
SUMX ( 'Transaction', 'Transaction'[Amount] / [OriginalExchangeRate] )
With the above exchange rate, the Euro amount can be calculated as the base value.
TargetExchangeRate =
VAR thisTransactionDate =
FIRSTNONBLANK ( 'Transaction'[TransactionDate], 0 )
VAR exchangeRateID =
COMBINEVALUES (
"-",
FORMAT ( thisTransactionDate, "yyyy-MM-dd" ),
[Selected Currency]
)
VAR exchangeRate =
LOOKUPVALUE (
ExchangeRates[OBS_VALUE],
ExchangeRates[ExchangeRateID], exchangeRateID
)
RETURN
    exchangeRate
The target exchange rate depends on the selected currency.
ValuationLastDateDAXPattern =
VAR maxDate =
MAX ( DateTable[Date] )
VAR maxValuationDates =
CALCULATETABLE (
ADDCOLUMNS (
SUMMARIZE ( 'Transaction', Entity[Entity] ),
"MaxValuationDate", CALCULATE ( MAX ( 'Transaction'[TransactionDate] ) )
),
DateTable[Date] <= maxDate
)
VAR maxValuationDatesWithLineage =
TREATAS ( maxValuationDates, Entity[Entity], DateTable[Date] )
VAR result =
CALCULATE ( [TargetAmount], maxValuationDatesWithLineage )
RETURN
    result
Above is the part where the “DAX Patterns” knowledge is used: as before, the max date of the date filter is taken (the max date in the quarter etc.). SUMMARIZE gets all entities and their relevant transaction dates. TREATAS substitutes the SUMMARIZE result into the filter tables Entity and DateTable. The target amount (the amount converted into the selected currency) is then calculated with the substituted filter.
As you can see in the relevant visual, this works and is quicker than our hands-on approach with measures only.
We now have a tradeoff for the calculated column approach. The exchange rate conversion is precalculated, but the transaction table size is multiplied by the number of currencies (25 in the specific case).
The measure for the calculated column approach is now adjusted to match the DAX Pattern approach above:
ValuationLastDateDAXPatternCC =
VAR maxDate =
MAX ( DateTable[Date] )
VAR maxValuationDates =
CALCULATETABLE (
ADDCOLUMNS (
SUMMARIZE ( 'TransactionsExtended', Entity[Entity] ),
"MaxValuationDate", CALCULATE ( MAX ( 'TransactionsExtended'[TransactionDate] ) )
),
DateTable[Date] <= maxDate
)
VAR maxValuationDatesWithLineage =
TREATAS ( maxValuationDates, Entity[Entity], DateTable[Date] )
VAR result =
CALCULATE ( [TargetAmountCC], maxValuationDatesWithLineage )
RETURN
result
Now, with the performance analyzer, we see that the measure-only approach takes about 600-700 ms for a currency change, the calculated column approach 1100-1200 ms.
Without proof, I would say there is a lower threshold for the number of currencies below which the calculated column approach is quicker. So, it is a matter of projecting the expected data volume.
As I said, no rigorous proof, but it sheds some light on the situation. What have we won?
The pbix file: CurencyConversionPerformance.pbix
OK, this is where I stand in terms of understanding. As I said, no action to be taken. It is just a clarification for the record.
Best regards
Christian
Hi Christian,
Apologies for taking a little time to get back to you on this.
Fabulous. Thank you for this input. It is a really great approach, and I had overlooked the date-currency combination as a potential unique key for the join, which of course can prove much more efficient. Would this key still offer matches in scenarios where, in the unlikely event that valuations occur over the weekend, the last FX rates of Friday need to be taken and extended to Saturday and Sunday?
I like both approaches. I think for the time being I will adopt the first strategy, the calculated column approach. Although this could take twice as long to compute, as you say it would only be a problem if the volume of data and currency options were likely to increase significantly. It is not a huge issue right now. In all likelihood, we would be talking of about 1000 extra valuations each year. Some assets would have several valuation updates a year, some only one. The number of assets wouldn't increase drastically either - most likely in the tens each year. And finally, new currencies are very unlikely to be added, so the ones we have would be staying.
To recap, the issue with the infinity numbers skewing everything in my charts was due to the Euro not existing before a certain date, so your solution should be perfectly fine; there is simply no data before a certain point in time to convert to and from EUR. I said that I would need to manually obtain a Finance reference for FX rate conversion for this period and merge/append it to the ECB_FX_RATES table using PQ, if this works. That might take some time to obtain, so I will cut out any date preceding the establishment of the EUR in the meantime.
Will come back to you very shortly once I filter that out and re-run. I will use the last solution you provided for the calculated column approach.
Hi Julian,
I would take the original exchange rates and transform them into exactly what I need for the calculation:
- create an exchange rate for every date and currency in the relevant period, defined by the min and max transaction date
- try to get an exchange rate by key for these combinations (the key being the date-currency combination as above); if not available, take the next best one before the transaction date
ExchangeRates =
CROSSJOIN (
SELECTCOLUMNS (
FILTER (
DateTable,
DateTable[Date] <= MAX ( 'Asset Valuation'[Valuation Date] )
&& DateTable[Date] >= MIN ( 'Asset Valuation'[Valuation Date] )
),
"Date", DateTable[Date]
),
DISTINCT (
SELECTCOLUMNS (
ECB_FX_RATES_ORIGINAL,
"Currency", ECB_FX_RATES_ORIGINAL[CURRENCY]
)
)
)
ExchangeRate =
VAR thisDate = ExchangeRates[Date]
VAR thisCurrency = ExchangeRates[Currency]
VAR thisKey = ExchangeRates[ExchangeRateKey]
VAR maxDate =
CALCULATE (
MAX ( ECB_FX_RATES_ORIGINAL[TIME_PERIOD] ),
FILTER (
ECB_FX_RATES_ORIGINAL,
ECB_FX_RATES_ORIGINAL[TIME_PERIOD] <= thisDate
&& ECB_FX_RATES_ORIGINAL[CURRENCY] = thisCurrency
&& ISBLANK ( ECB_FX_RATES_ORIGINAL[OBS_VALUE] ) = FALSE ()
)
)
VAR tryGetValueFromKey =
CALCULATE (
FIRSTNONBLANK ( ECB_FX_RATES_ORIGINAL[OBS_VALUE], 0 ),
FILTER (
ECB_FX_RATES_ORIGINAL,
ECB_FX_RATES_ORIGINAL[ExchangeRateKey] = thisKey
)
)
VAR getlastBestValue =
CALCULATE (
FIRSTNONBLANK ( ECB_FX_RATES_ORIGINAL[OBS_VALUE], 0 ),
FILTER (
ECB_FX_RATES_ORIGINAL,
ECB_FX_RATES_ORIGINAL[CURRENCY] = thisCurrency
&& ECB_FX_RATES_ORIGINAL[TIME_PERIOD] = maxDate
)
)
RETURN
IF (
ISBLANK ( tryGetValueFromKey ) = FALSE (),
tryGetValueFromKey,
getlastBestValue
)
You can use LASTNONBLANKVALUE instead of the explicit maxDate approach. I read somewhere that this is not necessarily quicker, so it is just a matter of taste. With variables as above, I find the code more transparent.
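For completeness, a sketch of that LASTNONBLANKVALUE variant, reusing the thisCurrency and thisDate variables from the column above (untested):
VAR getlastBestValueAlternative =
    // rate at the last date on or before thisDate where OBS_VALUE is not blank
    CALCULATE (
        LASTNONBLANKVALUE (
            ECB_FX_RATES_ORIGINAL[TIME_PERIOD],
            MAX ( ECB_FX_RATES_ORIGINAL[OBS_VALUE] )
        ),
        FILTER (
            ECB_FX_RATES_ORIGINAL,
            ECB_FX_RATES_ORIGINAL[CURRENCY] = thisCurrency
                && ECB_FX_RATES_ORIGINAL[TIME_PERIOD] <= thisDate
        )
    )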
Above, the lookup for the rate to be taken is hopefully lazily evaluated; again, I read somewhere that this is the case, but have not tested it.
So, if we can get an exchange rate by key, we take it, as this is the quickest way to get it. If not, we take the next best one.
As discussed before, your original exchange rates have to satisfy the condition that for every transaction date a candidate for the rate can be found.
With this we can now establish a relationship between the transaction table and the new derived exchange rate table, and we can be sure that for every key in the transaction table there is an exchange rate.
Best regards
Christian
Hi,
I now have modified the approach. I assume the following business rules/definitions, just to be clear:
- one reference number can occur an arbitrary number of times in the base table, because the transaction date signifies the update date
- updates can be a change from "Purchase" to "Sale" or vice versa, or an amount change
- the rule is: a "Purchase" appears in the same year, but is carried over to the next year as well
- there is a date picker where a date is selected; a snapshot in the past is then calculated, disregarding everything after this date
- the aggregation is by year: the target is the total sum of amounts, with carry over, for the relevant years
- there is already a consolidated table that has as a transaction:
transaction date, amount, [purchase or sale], reference number
I did the following steps:
a) create a date table: with 2 helper measures for the start date and the max date
DateRange =
VAR a = [StartDate]
VAR b = [MaxDate]
VAR tmp =
CALENDAR ( a, b )
VAR result =
ADDCOLUMNS ( tmp, "Year", YEAR ( [Date] ) )
RETURN
result
MaxDate = MAX(History[TransactionDate]) + 365
StartDate = DATE ( 2018, 1, 1 )
So, we have a dynamic range of dates from the base table and the corresponding year. The dates can then be used in a slicer visual to select the picked date.
Years =
DISTINCT ( SELECTCOLUMNS ( ALL ( DateRange ), "Year", DateRange[Year] ) )
This gives the range of years that are relevant, from the start date up to the year after the as-is date, as the carry-over year.
b) Measures for in-year amount and next year amount from carry over
YearAmount =
VAR thisYear =
SELECTEDVALUE ( Years[Year] )
VAR transactionsBeforePickerDate =
FILTER ( History, History[TransactionDate] <= [PickedDate] )
VAR extendedHistory =
ADDCOLUMNS (
transactionsBeforePickerDate,
"MaxDateForTransaction",
VAR thisReference = [Reference]
RETURN
CALCULATE (
MAX ( History[TransactionDate] ),
FILTER ( transactionsBeforePickerDate, [Reference] = thisReference )
),
"Year", YEAR ( [TransactionDate] )
)
VAR currentTransactionsInTheYear =
FILTER (
extendedHistory,
[TransactionDate] = [MaxDateForTransaction]
&& [Year] = thisYear
)
RETURN
CALCULATE ( SUM ( History[Amount] ), currentTransactionsInTheYear )
PickedDate =
SELECTEDVALUE ( DateRange[Date] )
The measure will pick up the relevant year from the Years table in the table visual. PickedDate is the selected date from the slicer, wrapped in a measure. Then only the transactions before PickedDate are taken, and MaxDateForTransaction is created to later filter the latest transaction for each reference number in the relevant year.
CarryOverAmount =
VAR thisYear =
SELECTEDVALUE ( Years[Year] )
VAR transactionsBeforePickerDate =
FILTER ( History, History[TransactionDate] <= [PickedDate] )
VAR extendedHistory =
ADDCOLUMNS (
transactionsBeforePickerDate,
"MaxDateForTransaction",
VAR thisReference = [Reference]
RETURN
CALCULATE (
MAX ( History[TransactionDate] ),
FILTER ( transactionsBeforePickerDate, [Reference] = thisReference )
),
"Year", YEAR ( [TransactionDate] )
)
VAR currentTransactionsInTheYear =
FILTER (
extendedHistory,
[TransactionDate] = [MaxDateForTransaction]
&& [Year] = thisYear - 1
&& [PurchaseOrSale] = "Purchase"
)
RETURN
CALCULATE ( SUM ( History[Amount] ), currentTransactionsInTheYear )
Same thing, but now the "Purchase" transactions of the previous year are carried over.
You can now pull the year from Years and the two measures into a table visual.
It goes without saying that this is not extensively tested. It should in any case give you the flavor of one way to approach this.
If someone has a smarter way to do this, I'd be curious.
Hope this helps.
Best regards
Christian
@scee07 Happy New Year, and thank you greatly for the above response at the end of last year. I was just about to take vacation, so I have not been able to try applying it until now!
I think it might be the right solution, or certainly nearly there. But I am struggling to apply the first section creating the calendar. I have a slicer linked to my built-in date table, so I am wondering whether I need to bother at all with assigning the min and max date selections.
I thought it might be a good idea to send a link to my pbix here - I am unable to upload it on the forum.
It might help if I recap a little, to make sure the solution you provided would work or whether something is missing, but also to better illustrate what I was describing for others in a similar situation.
What I have on my report canvas currently is the following:
You will see the date picker already at the top (I hide the start date so the user can't change it). It is essentially linked to the [Date] column of the DateTable that I create in PQ. This provides the window of time to be viewed in the bar chart below it, and also filters out dates from the other tables so that their data is excluded from the calculations.
In the attachment pbix, there are four tables in the model at work:
1. DateTable - as described above
2. Asset Transaction - this holds the acquisition value of the asset: a 'Purchase' row contains the very first record of its value. If there is a stamp saying 'Sale', it marks the final year in which you can count the last valuation of the asset. Values for the asset should not run forward of this date.
Asset Reference | Transaction Type | Transaction Date | Net Transaction Price |
AAA100 | Purchase | 12 July 2018 | 5175000 |
AAA103 | Purchase | 22 November 2013 | 37400000 |
AAA107 | Purchase | 20 May 2011 | 8750000 |
AAA130 | Purchase | 17 December 2002 | 3800000 |
AAA130 | Sale | 19 October 2021 | 6000000 |
AAA107 | Sale | 10 March 2021 | 10000000 |
3. Asset Valuation
This lists subsequent reviews of the asset value through time. These happen at irregular intervals; some years are empty.
Even if a year is missing in the Valuation Date column, the DAX calculation should carry forward the last known valuation into any gap years and count it as the value for those years.
Asset Reference | Valuation Date | Asset Value |
AAA100 | 30-Sep-18 | 5400000 |
AAA100 | 31-Dec-18 | 5400000 |
AAA100 | 31-Dec-19 | 5400000 |
AAA100 | 31-Mar-20 | 5250000 |
AAA100 | 30-Jun-19 | 5400000 |
AAA100 | 31-Mar-19 | 5400000 |
AAA100 | 30-Sep-19 | 5400000 |
AAA103 | 30-Sep-18 | 43900000 |
AAA103 | 31-Dec-18 | 43900000 |
AAA103 | 31-Dec-19 | 43900000 |
AAA103 | 31-Mar-20 | 43900000 |
AAA103 | 30-Jun-19 | 43900000 |
4. Building
This lists the asset references and provides attributes for them that can be used in the bar chart to split the total values by categories.
Asset Reference | Risk Profile |
AAA100 | A |
AAA103 | D |
AAA107 | A |
AAA130 | D |
It is worth noting that if a 'Purchase' of an asset has been made in the Asset Transaction table but there is no record of a review of the valuation in the Asset Valuation table, then the Asset Transaction value should be used. That's an extra point I came to realise was absent before but needs to be part of the calculation.
As you can see in the bar chart visualisation, there are some years with complete gaps, because the last known value from the Asset Valuation table is not currently carried over into later years for the asset. I think my DAX code is inaccurate:
ValueLastDate =
CALCULATE (
    SUM ( 'Asset Valuation'[Asset Value] ),
    LASTDATE ( 'Asset Valuation'[Valuation Date] )
)
Does your solution still work in this scenario? I was lost with the DAX calculated table you provided. I'm not sure if I need it, and I was also confused about the reference to Years[Year] and whether I need both snippets of your code, b and c?
Sorry for the confusion. This level of complexity is a hard one for me to grasp! I am still open to the idea of adding columns to existing tables if that would break down the complexity and work with the dynamic date picker the user can select in the report, as shown above.
Hi Julian,
Happy New Year to you too. Good to have the file. I will have time during the day (German time) to look at this and come back to you. I remember that the table in the DAX expression comes from the requirement to have a snapshot in the past; this means that you could ask, e.g., what the situation was at the beginning of last year. Never mind, I will look at the pbix and see what is going on. I'll come back to you. Best regards
Christian
Hi Christian @scee07
Many thanks for taking a look again.
I think I said this before, but I believe a value labelled 'Sale' should not be included in any figures. So we would take the first recorded value in the Asset Transaction table when it is marked 'Purchase', and that would be the value for that year (unless there is an update to the value in the Asset Valuation table, in which case we take the last one provided in that year). We would not take the last value for an asset to be the one marked 'Sale' in the Asset Transaction table, even if available.
This might be needed in future though.
Hi Julian,
I will now try to modify the approach and provide a link to the workbook for you to look at below.
Assets are buildings, that gives me better context.
Firstly, the transaction and the valuation table should be married:
AllTransactions =
UNION (
SELECTCOLUMNS (
'Asset Transaction',
"Asset Reference", 'Asset Transaction'[Asset Reference],
"TransactionDate", 'Asset Transaction'[Transaction Date],
"Type", 'Asset Transaction'[Transaction Type],
"Amount", 'Asset Transaction'[Net Transaction Price]
),
SELECTCOLUMNS (
'Asset Valuation',
"Asset Reference", 'Asset Valuation'[Asset Reference],
"TransactionDate", 'Asset Valuation'[Valuation Date],
"Type", "Validation",
"Amount", 'Asset Valuation'[Asset Value]
)
)
Now the full history, from purchase through valuations to a potential sale, is in one table.
I think I have understood that the date filter is only for defining the window that is shown in the visual. If this is the case, then we can try a brute force approach with no very complicated measures. I will come to the limitations later on.
BruteForceCrossJoin =
CROSSJOIN (
SELECTCOLUMNS ( DateTable, "Date", DateTable[Date] ),
DISTINCT (
SELECTCOLUMNS (
'Asset Transaction',
"Asset Reference", 'Asset Transaction'[Asset Reference]
)
)
)
These are all combinations of Date and Asset. This addresses the problem that the visual does not carry valuations over to the following years when there are no valuation dates. Now, the limitations: if you have thousands of assets this table presumably gets too big. One thing you can do in any case is restrict the date table, with a new Date table that filters out dates after TODAY(). Let's assume for the moment that this works, as the number of different assets is sufficiently small.
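A minimal sketch of that restriction, assuming the existing DateTable:
RestrictedDates =
// calculated table: drop future dates so the cross join stays smaller
FILTER ( DateTable, DateTable[Date] <= TODAY () )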
Now we can do a calculated column:
CurrentValue =
VAR thisAsset = BruteForceCrossJoin[Asset Reference]
VAR thisDate = BruteForceCrossJoin[Date]
VAR salesDate =
CALCULATE (
FIRSTNONBLANK ( AllTransactions[TransactionDate], 0 ),
FILTER (
AllTransactions,
AllTransactions[TransactionDate] <= thisDate
&& AllTransactions[Asset Reference] = thisAsset
&& AllTransactions[Type] = "Sale"
)
)
VAR lastValuationDate =
CALCULATE (
MAX ( AllTransactions[TransactionDate] ),
FILTER (
AllTransactions,
AllTransactions[TransactionDate] <= thisDate
&& AllTransactions[Asset Reference] = thisAsset
&& AllTransactions[Type] <> "Sale"
)
)
VAR lastValuationAmount =
CALCULATE (
FIRSTNONBLANK ( AllTransactions[Amount], 0 ),
FILTER (
AllTransactions,
AllTransactions[TransactionDate] = lastValuationDate
&& AllTransactions[Asset Reference] = thisAsset
)
)
RETURN
IF ( ISBLANK ( salesDate ), lastValuationAmount, 0 )
So, now all dates have a value for all assets. If an asset is sold, the value is 0. Concerning the remark above, I am not sure I understand the business context: if an asset is sold, its future value is zero, but would you still like to see it in the history? Please clarify.
In the model there are only 3 tables now: Date, BruteForceCrossJoin and Building. The relationships are Date to BruteForceCrossJoin via the dates, and BruteForceCrossJoin to Building via the asset reference.
Now to the visual: I think the Year column of the date table should be selected directly, as the drill down of the date is essentially meaningless for this visual. Then we only want to show the last valuation of the year, summed over assets and risk class. Therefore we need a measure as the value for the visual:
LatestValue =
VAR selectedYear =
SELECTEDVALUE ( DateTable[Year] )
VAR maxDateInYear =
CALCULATE (
MAX ( DateTable[Date] ),
FILTER ( DateTable, DateTable[Year] = selectedYear )
)
VAR selectedMaxDate =
    SELECTEDVALUE ( DateTable[Date] )
// VAR maxDate = IF ( selectedMaxDate < maxDateInYear, selectedMaxDate, maxDateInYear )
RETURN
CALCULATE (
SUM ( BruteForceCrossJoin[CurrentValue] ),
FILTER ( BruteForceCrossJoin, BruteForceCrossJoin[Date] = maxDateInYear )
)
With a little helper filter for the assets, you can see that the amount disappears once sold:
Here is the OneDrive link to the workbook.
If this is roughly the requirement, then the calculated column approach works in principle.
Don't hesitate to reply: we will do this until we have both understood what is required.
Best regards
Christian
Thank you so much again Christian @scee07
I am testing now and it really does seem to work. The values are moving across the years where there is no update for a year and are being treated as part of the total for that 'blank' year.
And I really like the approach of merging the Transaction and Valuation tables. This makes a whole lot more sense and simplifies the calculations, essentially adding a balance sheet by date for all assets.
Responses to your comments:
I think I have understood that the date filter is only for defining the window that is shown in the visual.
It acts essentially as an As Of Date: a way of going back in time to see how things looked if I had run the report at that time, as if it were today's date.
Concerning the remark above, I am not sure I understand the business context: if an asset is sold, its future value is zero, but would you still like to see it in the history? Please clarify.
I needed to get a bit more clarity on this. The change is: I no longer need to use the 'Purchase' amount in the Asset Transaction table. Nor should we use the value if it is marked 'Sale'.
If an amount is not in the Asset Valuation table, we should just ignore the asset, even if there is a 'Purchase' record of it in the Asset Transaction table. This is because it should not be treated as part of the portfolio until its first and subsequent valuations; it is not validated with any value until there is a log in the Asset Valuation table. In fact, the only thing we now need from the Transaction table is whether the asset has been sold. We need that date to put a stop on the asset value being included, depending on the date the user selects in the picker.
So if an asset had a value of 300,000 on 20th Jan 2013 but was sold on 25th Jan 2013: if the date the user picked is 21st Jan 2013, the asset is still part of the sum total for the year 2013. But if I pick 26th Jan 2013, the asset value is no longer part of the total displayed for the year 2013.
So, basically, we only count values as and when they are listed in the Valuation table, until the date marked sold in the Transaction table.
About your question on the number of assets: I am currently working with around a thousand. This will not increase quickly over time, but you never know. It's more likely to increase than decrease.
In regards to the granularity of the data: the solution might be a problem if it doesn't maintain data at a daily level. I am implementing a second drop-down selection that will convert all displayed values to a currency of the user's choice (about 8 available). So values and their related dates in the Asset Valuation table will be looked up and converted from their native currency (available against the asset in the Building table) to the selected currency. I bring in daily historic conversion rate data from the ECB for this, but it means that each individual value has to be converted at the rate of the date the valuation occurred; it can't use the conversion rate of the date you are running the report.
Given that we now no longer need the Asset Transaction table so much, perhaps merging the tables is no longer needed?
Will it help if I provide the currency conversion in the pbix?
Yes, please send what you have.
I will have time tomorrow to continue with this. A few points upfront to check my understanding:
- the transaction table is still needed as we need the potential sales dates of the assets?
- if we assume that we can keep the brute force approach for the moment, I think that a date selection will achieve exactly what we want, because if a sales date is after the upper filter date, it will not appear in the sequence of valuations for the asset?
- yes, the currencies will multiply the table size by 8 (if there are 8 currencies).
- if the table size becomes a problem, the approach would be to get a working brute force solution first and then optimize it.
Best regards
Christian
Thank you.
- the transaction table is still needed as we need the potential sales dates of the assets?
That is correct. The date will need to be referenced to understand whether a previous Asset Valuation amount should be included in the portfolio total up to the date the user has selected in the date picker; or, if the asset has been marked as sold at a date before the selected date, that amount must not be included in the portfolio total, because you would be counting asset values that are no longer owned.
- if we assume that we can keep the brute force approach for the moment, I think that a date selection will achieve exactly what we want, because if a sales date is after the upper filter date, it will not appear in the sequence of valuations for the asset?
Yes, I think this is correct.
Attached here, I have updated your pbix to include everything you had added where we left off, and also the currency conversion slicer and the logic for it to work. I'm open to this being written more efficiently, but for now it does the job.
I have created a Measure table to store some of the added measures instead of assigning them to existing tables (I find it more organised this way).
In the Measure table:
- Selected Currency - returns the user selected currency
- ValueLastDate - returns the last value for an asset based on the LASTDATE DAX function
- ValueLastDate SUM - ensures totals are displayed properly when shown as a table
- ValueLastDate_CurrencyChanged - takes the currency, the selected currency, and the LASTDATE of the asset valuation, then finds the nearest currency conversion rate from the imported ECB table (or the closest prior day for which there is conversion data) and performs the conversion. As there are only conversions to EUR, the conversion sometimes takes two steps if you are converting to anything other than EUR; for example, AUD to DKK has to be converted into and then out of EUR.
- ValueLastDate_CurrencyChanged SUM - the same as ValueLastDate SUM, but it uses the converted currency amount to summarise.
- I added a currency column to the Building table so that the above can reference the native currency of the asset for converting to/from
- I added a live feed for the ECB rates - ECB_FX_RATES table in the model
- In the DateTable I needed to add a couple of columns, ECBRateDate and isinECBTable, to assist with aligning the conversion dates
- Added a DistinctECBRates DAX table which gives DISTINCT rows for the available ECB_FX_RATES dates so they can be cross-referenced
You can see in the updated visuals I added on the Currency Conversion tab that this sort of works, and you can drill down to quarters etc. The values also convert to the chosen currency if selected. But there is nothing for 2021/22/23, despite not all assets being sold.
I hope this makes sense.
Hi Julian,
with the currencies added, I got rid of the brute force table with all the combinations, as it will not scale with more currencies and buildings. This required a few changes, outlined below:
a) restrict the date table: dates before the very first valuation are useless, as are dates after today, since the figures will not change. The same goes for days before all exchange rates are available.
The restricted date table makes use of the following measures:
RestrictedDateTable =
FILTER (
DateTable,
DateTable[Date] <= TODAY ()
&& DateTable[Date] >= [FirstValuationDate]
&& DateTable[Date] >= [FirstCompleteCurrencyDate]
)
FirstValuationDate =
CALCULATE (
MINX ( ALL ( 'Asset Valuation' ), 'Asset Valuation'[Valuation Date] )
)
FirstCompleteCurrencyDate =
VAR minDates =
CALCULATETABLE (
SUMMARIZE (
FILTER (
ECB_FX_RATES,
TRIM ( ECB_FX_RATES[OBS_VALUE] ) <> ""
|| ISBLANK ( ECB_FX_RATES[OBS_VALUE] ) = FALSE ()
),
ECB_FX_RATES[CURRENCY],
"MinDate", MIN ( ECB_FX_RATES[TIME_PERIOD] )
)
)
VAR test =
CALCULATE ( MAXX ( minDates, [MinDate] ) )
RETURN
test
The AllTransactions table gets the currency and the latest available exchange rate to Euro before the transaction date:
Currency =
RELATED ( Building[Currency] )
LatestConversionRate =
VAR thisCurrency = AllTransactions[Currency]
VAR thisDate = AllTransactions[TransactionDate]
RETURN
IF (
thisCurrency = "EUR",
1,
CALCULATE (
LASTNONBLANK ( ECB_FX_RATES[OBS_VALUE], 0 ),
FILTER (
ECB_FX_RATES,
ECB_FX_RATES[TIME_PERIOD] <= thisDate
&& ECB_FX_RATES[CURRENCY] = thisCurrency
)
)
)
EuroValue =
IF (
AllTransactions[LatestConversionRate] <> BLANK (),
AllTransactions[Amount] / AllTransactions[LatestConversionRate]
)
Now the measures are more complicated. The first panel converts everything into Euro values:
LatestValidationValueInEuro =
VAR thisAsset =
SELECTEDVALUE ( Building[Asset Reference] )
VAR maxDate = [Max Selected Date]
VAR lastValidationDate =
CALCULATE (
MAX ( AllTransactions[TransactionDate] ),
FILTER (
ALL ( AllTransactions ),
AllTransactions[Asset Reference] = thisAsset
&& AllTransactions[TransactionDate] <= maxDate
&& AllTransactions[Type] = "Validation"
)
)
VAR validationValue =
CALCULATE (
FIRSTNONBLANK ( AllTransactions[EuroValue], 0 ),
FILTER (
ALL ( AllTransactions ),
AllTransactions[Asset Reference] = thisAsset
&& AllTransactions[TransactionDate] = lastValidationDate
)
)
VAR globalMaxValidationDate =
CALCULATE (
MAX ( AllTransactions[TransactionDate] ),
FILTER (
ALL ( AllTransactions ),
AllTransactions[Asset Reference] = thisAsset
&& AllTransactions[Type] = "Validation"
)
)
VAR globalLastValue =
CALCULATE (
FIRSTNONBLANK ( LatestValuations[EuroAmount], 0 ),
FILTER ( LatestValuations, LatestValuations[Asset Reference] = thisAsset )
)
VAR wasSold =
COUNTROWS (
FILTER (
ALL ( AllTransactions ),
AllTransactions[TransactionDate] <= maxDate
&& AllTransactions[Type] = "Sale"
&& AllTransactions[Asset Reference] = thisAsset
)
) > 0
VAR result =
IF (
wasSold,
BLANK (),
IF ( ISBLANK ( validationValue ), [GlobalLastValidationValue], validationValue )
)
RETURN
result
LatestValidationValueInEuroWrapped =
VAR tab =
SUMMARIZE (
Building,
Building[Asset Reference],
"measure", [LatestValidationValueInEuro]
)
RETURN
SUMX ( tab, [measure] )
We can discuss next week in case this is unclear. In principle, the [Max Selected Date] measure does the work: if you apply this measure to the date hierarchy, the last date of the period is picked (the period being year, quarter, ...).
Then there is a filter context with the max date and the asset under consideration, and the right value is calculated. The wrapper makes this value summable.
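For reference, a minimal sketch of what [Max Selected Date] boils down to (the actual measure is in the file; this is my shorthand):
Max Selected Date =
MAX ( DateTable[Date] ) // the last date visible in the current filter context, e.g. the end of the year or quarter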
Then the currency conversion is easily achieved with a little extension:
LatestValidationInCurrency =
VAR thisCurrency = [Selected Currency]
VAR thisAsset =
SELECTEDVALUE ( Building[Asset Reference] )
VAR maxDate = [Max Selected Date] // find the most recent exchange rate before maxDate
VAR maxCurrencyDate =
CALCULATE (
MAX ( ECB_FX_RATES[TIME_PERIOD] ),
FILTER ( ECB_FX_RATES, ECB_FX_RATES[TIME_PERIOD] <= maxDate )
)
VAR exchangeRate =
CALCULATE (
FIRSTNONBLANK ( ECB_FX_RATES[OBS_VALUE], 0 ),
FILTER (
ECB_FX_RATES,
ECB_FX_RATES[CURRENCY] = thisCurrency
&& ECB_FX_RATES[TIME_PERIOD] = maxCurrencyDate
)
)
VAR lastValidationDate =
CALCULATE (
MAX ( AllTransactions[TransactionDate] ),
FILTER (
ALL ( AllTransactions ),
AllTransactions[Asset Reference] = thisAsset
&& AllTransactions[TransactionDate] <= maxDate
&& AllTransactions[Type] = "Validation"
)
)
VAR validationValue =
CALCULATE (
FIRSTNONBLANK ( AllTransactions[EuroValue], 0 ),
FILTER (
ALL ( AllTransactions ),
AllTransactions[Asset Reference] = thisAsset
&& AllTransactions[TransactionDate] = lastValidationDate
)
)
VAR globalMaxValidationDate =
CALCULATE (
MAX ( AllTransactions[TransactionDate] ),
FILTER (
ALL ( AllTransactions ),
AllTransactions[Asset Reference] = thisAsset
&& AllTransactions[Type] = "Validation"
)
)
VAR globalLastValue =
CALCULATE (
FIRSTNONBLANK ( LatestValuations[EuroAmount], 0 ),
FILTER ( LatestValuations, LatestValuations[Asset Reference] = thisAsset )
)
VAR wasSold =
COUNTROWS (
FILTER (
ALL ( AllTransactions ),
AllTransactions[TransactionDate] <= maxDate
&& AllTransactions[Type] = "Sale"
&& AllTransactions[Asset Reference] = thisAsset
)
) > 0
VAR result =
IF (
wasSold,
BLANK (),
IF (
ISBLANK ( validationValue ),
[GlobalLastValidationValue] * exchangeRate,
validationValue * exchangeRate
)
)
RETURN
result
LastValidationinCurrencyWrapped =
VAR tab =
SUMMARIZE (
Building,
Building[Asset Reference],
"measure", [LatestValidationInCurrency]
)
RETURN
SUMX ( tab, [measure] )
It goes without saying that this is not sufficiently tested. Here is the link:
Assets_130123.pbix
We'll talk next week. If you let me, I would also like to understand how the exchange rates are aligned with the accounting guidelines in this specific case.
Best regards
Christian
Hi Christian, hoping you had a good weekend.
As ever, thank you for your time on this - it turned complicated, and for that, apologies.
Anyway, I've been testing your attached solution. Although I have not dissected your code line by line, I have analysed the output and conversion rates, comparing the values I get after conversion with the ones the report gives me.
Essentially, the conversion must work in the following way if, for example, I wanted to exchange a DKK amount for a selected currency of AUD:
As of Date = 2022-06-30:
From the ECB_FX_RATES table, which provides all the currency conversion rates we need to EUR:
<Cube time="2022-06-30">
<Cube currency="DKK" rate="7.4392"/>
<Cube currency="AUD" rate="1.5099"/>
</Cube>
[EUR -> EUR] * [EUR -> DKK]
[1/1.0000] * [7.4392] = 7.4392 (this one is simple as the rate matches that from the ECB extraction)
[AUD -> EUR] * [EUR -> DKK]
[1/1.5099] * [7.4392] = 4.9269
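As a DAX sketch of this cross-rate arithmetic, with hardcoded example rates (the variable names are mine, not from the model):
CrossConvertedAmount =
VAR nativeRate = 1.5099 // ECB rate EUR -> AUD, the native currency
VAR targetRate = 7.4392 // ECB rate EUR -> DKK, the selected currency
VAR amount = 100 // amount in the native currency (AUD)
RETURN
    amount / nativeRate * targetRate // 100 AUD -> 66.23 EUR -> 492.69 DKK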
In the latest file you sent, I looked at both tabs. For the bar graph visual I changed the Legend to contain Asset Reference rather than Risk Profile, just to make it easier to separate the values at asset level and check their value change.
For the Euro tab, I'm assuming these are conversions all to Euro from the native currency?
I looked at AAA100 - this is in Euros anyway, so the value should be the same as in the table. It is definitely displaying as it should, with the latest valuation of the year used for that year and carried over to later years when there is no valuation update in them.
I then looked at AAA103 - the native currency of this asset is DKK. In 2018, the last valuation of the year was on 31/12/2018 for 43,900,000 DKK. DKK to EUR on 31/12/2018 was 7.4674.
Using the calc logic above, this would be:
43,900,000 / 7.4674 (use / instead of *, as you are switching EUR -> DKK around to DKK -> EUR).
This gives a result of 5,878,886.50 (rounded to 2dp). However, the value I'm getting in the bar chart is 5,875,503.57.
AAA107's last value for 2018 is 11,500,000 GBP, valued on 31/12/2018. The FX rate to EUR on 31/12/2018 is 0.89453. 11,500,000 / 0.89453 = 12,855,913.20. Displayed is 7,959,984.94.
I think this was the accountancy detail you were after, to make it clearer how this is calculated. My original exchange rate DAX logic used LOOKUPVALUE, which did the job, but I was never really happy about it as I believe it can be an expensive function to use.
I hope the above sheds some light on the conversion logic. There are basically two methods, depending on the scenario: if the asset's native currency is EUR anyway, OR the user has selected EUR as the conversion currency in the drop-down, then you only need to find and use one rate to EUR. If neither the asset's native currency nor the selected currency is EUR, then two exchange rate lookups need to be made and used as part of the conversion.
Hoping this makes sense.
Hi Julian,
I indeed found a mistake in my calculated column for the conversion to Euro in the AllTransactions table. This seems to fix AAA103.
For AAA107, I found something that scares me :-):
I have 2 valuations per day in the original table, and the valuations are different.
Please let me know if this is a bug, or if I am missing something.
Basically, with this input the difference can be explained: after sorting, the lower value is picked. Or is there a rule that if there are 2 valuations on the same day, the bigger one is taken?
Best regards
Christian
Hi Christian,
Yes, well spotted. In actual fact this is a mistake in my data recreation: in the Asset Valuation table there will never normally be two valuations in a day. But I guess it is not impossible. If it should ever occur, the latest one should be picked, but to be honest I think the chances of this ever happening are virtually nil.
It is a similar story in the ECB_FX_RATES table. You will never get two FX rates for a single day, as it is a daily report. Not every day is provided either - weekends are normally missing. In those cases, should an asset valuation occur on a weekend day with no FX rate in the ECB_FX_RATES table, then the last FX rate nearest to the valuation date should be used.
For example
If a valuation occurred on Sunday 15th January 2023, then the FX rate for Friday 13th January 2023 would be used.
Hi Julian,
I have now included 2 test panels, for Euro and DKK, to check the values. Now I do not find any differences.
As we have been going back and forth, I think a short summary is in order:
The core measure is LatestValidationInCurrency:
LatestValidationInCurrency =
VAR thisCurrency = [Selected Currency]
VAR thisAsset =
SELECTEDVALUE ( Building[Asset Reference] )
VAR maxDate = [Max Selected Date]
VAR lastValidationDate =
CALCULATE (
MAX ( AllTransactions[TransactionDate] ),
FILTER (
ALL ( AllTransactions ),
AllTransactions[Asset Reference] = thisAsset
&& AllTransactions[TransactionDate] <= maxDate
&& AllTransactions[Type] = "Validation"
)
)
VAR maxDateForExchangeRate =
CALCULATE (
MAX ( ECB_FX_RATES[TIME_PERIOD] ),
FILTER (
ECB_FX_RATES,
ECB_FX_RATES[TIME_PERIOD] <= lastValidationDate
&& ECB_FX_RATES[CURRENCY] = thisCurrency
)
)
VAR exchangeRate =
CALCULATE (
FIRSTNONBLANK ( ECB_FX_RATES[OBS_VALUE], 0 ),
FILTER (
ECB_FX_RATES,
ECB_FX_RATES[CURRENCY] = thisCurrency
&& ECB_FX_RATES[TIME_PERIOD] = maxDateForExchangeRate
)
)
VAR validationValue =
CALCULATE (
FIRSTNONBLANK ( AllTransactions[EuroValue], 0 ),
FILTER (
ALL ( AllTransactions ),
AllTransactions[Asset Reference] = thisAsset
&& AllTransactions[TransactionDate] = lastValidationDate
)
)
VAR globalMaxValidationDate =
CALCULATE (
MAX ( AllTransactions[TransactionDate] ),
FILTER (
ALL ( AllTransactions ),
AllTransactions[Asset Reference] = thisAsset
&& AllTransactions[Type] = "Validation"
)
)
VAR globalLastValue =
CALCULATE (
FIRSTNONBLANK ( LatestValuations[EuroAmount], 0 ),
FILTER ( LatestValuations, LatestValuations[Asset Reference] = thisAsset )
)
VAR wasSold =
COUNTROWS (
FILTER (
ALL ( AllTransactions ),
AllTransactions[TransactionDate] <= maxDate
&& AllTransactions[Type] = "Sale"
&& AllTransactions[Asset Reference] = thisAsset
)
) > 0
VAR result =
IF (
wasSold,
0,
IF (
ISBLANK ( validationValue ),
[GlobalLastValidationValue] * exchangeRate,
validationValue * exchangeRate
)
)
RETURN
result
The core feature is that measure [Max Selected Date] gives the maximum date of the filter context. The measure above then calculates the last valuation of the asset in the filter context.
If there was a sales transaction for the asset, then the asset valuation is zero. The wrapper makes this measure summable.
The logic assumes one currency value per day and one valuation per day. I have changed the test data to guarantee this.
I think from a business standpoint this makes sense: if you have multiple exchange rates per day, you can take the average of the rates or whatever is appropriate. I don't think there is accounting with real-time exchange rates, as every valuation per currency would create a new accounting record. The accounting rate would then be something else, and not necessarily what is used for this report.
The same goes for the valuations. The maximum would be several valuations per year.
The testing is done by creating a calculated column in the transactions table that calculates the valuation in DKK with the last available exchange rate. In the test panel this value is compared with the above measure value.
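A sketch of what such a test column might look like, assuming the table and column names used above (the actual column is in the file):
TestValueInDKK =
VAR thisDate = AllTransactions[TransactionDate]
VAR lastRateDate =
    // last day on or before the transaction date with a DKK rate
    CALCULATE (
        MAX ( ECB_FX_RATES[TIME_PERIOD] ),
        FILTER (
            ECB_FX_RATES,
            ECB_FX_RATES[TIME_PERIOD] <= thisDate
                && ECB_FX_RATES[CURRENCY] = "DKK"
        )
    )
VAR dkkRate =
    CALCULATE (
        FIRSTNONBLANK ( ECB_FX_RATES[OBS_VALUE], 0 ),
        FILTER (
            ECB_FX_RATES,
            ECB_FX_RATES[CURRENCY] = "DKK"
                && ECB_FX_RATES[TIME_PERIOD] = lastRateDate
        )
    )
RETURN
    // Euro base value times the last available EUR -> DKK rate
    AllTransactions[EuroValue] * dkkRate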
This is where we stand. I don't know if there is a shorter way. The issue is that the end of the period in the filter context is just the input for a calculation to get the last valuation per asset, and this date is dynamic, depending on the filter context.
Let me know if there are more issues. We do this until we are both happy.
Here is the file:
Assets_180123.pbix
Best regards
Christian
Hi Christian,
I have tested your latest solution fairly extensively - I copied the original data tables in the data model into an Excel workbook to perform calculations and check alignment with the output on the Currency Conversion By Risk Profile tab of the report. All works perfectly. I also tested asset reference AAA107, which is sold on 10-Mar-21: I adjusted the date picker to 09-Mar-21 and then 11-Mar-21 to check that once the asset is sold it is no longer counted in the total. It works flawlessly. I cannot thank you enough.
The only remaining issue is that if I select EUR in the Currency drop-down, no numbers pull through into the bar chart. In this case, where EUR is the chosen currency, only one step of conversion needs to take place rather than a two-stage lookup acquiring two exchange rates. This is already happening on the Euro By Asset tab: it has a bar graph where all the values shown are conversions to EUR. I checked, and they are all correct.
Is it possible to merge the logic behind the output on the Currency Conversion By Risk Profile tab with that on the Euro By Asset tab, so that when EUR is selected it does not show blank?
Again, thank you for all your time on this.
Hi Julian,
Yes, I forgot this case when testing.
Here is the file: Assets_180123_2.0.pbix
It is just a matter of saying in the main measure: if thisCurrency is "EUR", then set the variable exchangeRate to 1. The reason this was blank is that EUR has no exchange rate (of 1) against itself in the ECB table, which is fine.
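For the record, a sketch of the adjusted variable in LatestValidationInCurrency (the rest of the measure is unchanged from the version above):
VAR exchangeRate =
    IF (
        thisCurrency = "EUR",
        1, // EUR has no rate against itself in the ECB table, so short-circuit here
        CALCULATE (
            FIRSTNONBLANK ( ECB_FX_RATES[OBS_VALUE], 0 ),
            FILTER (
                ECB_FX_RATES,
                ECB_FX_RATES[CURRENCY] = thisCurrency
                    && ECB_FX_RATES[TIME_PERIOD] = maxDateForExchangeRate
            )
        )
    )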
Best regards
Christian
Hi Christian,
Wonderful. It works and the EUR selection now brings back correct results in the bar graph.
I'll bring your solution into my actual dataset now and join it up with the rest.
I will take out any measures or tables not referenced or needed in the transfer, but I think it is quite tidy and everything in your solution is needed and referenced, so it should stay. Correct me if I am wrong.
As I create other new reports that also need this currency conversion logic, I will need to ensure that all these measures and tables are in the new report to apply the conversions. In these cases, any monetary value that needs to be converted will need to 'pass' both the value and the date of the value to the solution you provided. Is it best to pass this to the measure [LastValidationinCurrencyWrapped] or [LatestValidationInCurrency]?
To separate the currency conversion logic from standard measures, I might create a folder in the Measure table to separate these measures from the rest - keeping things tidy.
So grateful for all your time on this. Hope the challenge has brought something for you too.
Yes, I think copying and adjusting it is the way to go. If you use the same conversion table, then perhaps the entity changes from asset to something else with a value and a date. To my knowledge you cannot write custom functions in DAX, so you have to copy; it is more like a pattern that can be applied. I am sure there are some out-of-the-box functions that do some of the work we have done, but unless you understand exactly what those functions do, it is for my taste better to do it from scratch. There might be performance considerations, but that is more for tables with millions of records, not a couple of thousand.
Best regards
Christian