Hi,
I'm getting familiar with the VertiPaq engine and I'm checking table sizes, column sizes, encoding types, etc. for a Power BI report to see how I can improve it. I connected my report to VertiPaq Analyzer and noticed that not all decimal columns are Value encoded; some of them have the encoding type set to Hash. I compared two columns, one Value encoded and one Hash encoded, and could not find any difference between them. I didn't apply any formatting to the column: it comes as decimal from OData, and VertiPaq also sees it as decimal (DBTYPE_R8). I don't know if I'm missing any configuration; any advice would be really helpful.
Thanks in advance!
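As background to the question above, a simplified way to picture the two encodings (this is an illustration, not the actual VertiPaq implementation) is that value encoding stores each value more or less directly, possibly rebased to a smaller integer range, while hash encoding builds a dictionary of the distinct values and stores only small integer indices into it:

```csharp
using System;
using System.Linq;

// Simplified sketch of hash (dictionary) encoding, not the real engine:
// keep one copy of each distinct value, then store one index per row.
var values = new double[] { 1.5, 2.5, 1.5, 3.5, 2.5, 1.5 };

var dictionary = values.Distinct().ToArray();                              // the distinct values
var indices = values.Select(v => Array.IndexOf(dictionary, v)).ToArray();  // one index per row

// Value encoding, by contrast, would store the values themselves.
Console.WriteLine(string.Join(",", indices));
```

A column with many distinct values (e.g. decimals with 6 digits, as it turns out later in this thread) produces a large dictionary, which is one reason the engine's choice of encoding can differ between seemingly similar columns.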
Hi @teodorajuduc ,
There is a similar thread where the user found that with an Int64 column of 225,000 rows, VertiPaq still uses value encoding, but at roughly 240,000+ rows it switches to dictionary encoding. Maybe the same logic applies to decimal columns.
And he gave a solution:
you can give a hint to the VertiPaq engine that it should use Value instead of Hash. The hint can be set using Tabular Editor on a PBIT file. You can also write simple C# code which applies the transformation to all columns that support Value encoding, and so on. I will write a blog post about that; the topic is waiting in a queue
For example, you can use the following snippet in Tabular Editor:
// For all Int64 columns, set the encoding hint to Value
foreach(var column in Model.Tables.SelectMany(t => t.Columns))
{
    if(column.DataType == DataType.Int64)
        column.EncodingHint = EncodingHintType.Value;
}
https://community.powerbi.com/t5/Desktop/Vertipaq-Engine-VALUE-vs-HASH/td-p/690874
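Since the column in question is a Decimal Number (DBTYPE_R8, i.e. Double) rather than Int64, a variant of the same Tabular Editor script targeting Double columns might look like this (a sketch adapted from the snippet above; whether the engine actually honors the hint still depends on the data):

```csharp
// For all Double (Decimal Number) columns, hint Value encoding
foreach(var column in Model.Tables.SelectMany(t => t.Columns))
{
    if(column.DataType == DataType.Double)
        column.EncodingHint = EncodingHintType.Value;
}
```

Note that EncodingHint is only a hint: the engine can still fall back to Hash encoding if it decides Value encoding is not feasible for the column's contents.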
Best Regards,
Xue Ding
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Thanks @v-xuding-msft for your answer. I followed all the steps from the provided thread and also tried to isolate one particular table by loading it separately into a new report, but it didn't work: the column is still Hash encoded even though the table has only 48,640 rows. Just as a test, I changed the column data type from decimal to whole number, and after refreshing the report, VertiPaq showed the column as Value encoded. Of course, this is not a solution because I'm losing values, but I wanted to double-check whether the issue is related to the number of values in the table or to the decimal data type. It looks like the decimal column type is the likely cause, and I don't know if this is a bug in VertiPaq, since other decimal columns correctly have the encoding type set to Value.
Edit message:
I tried multiple scenarios and found one that seems to work if I have more than 448,000 rows in the table. The problem is related to the number of decimal digits in the column; in my case the values imported via OData have 6 digits.
It looks like if I combine Tabular Editor (where I set the Encoding Hint to Value) with Power Query Editor (right-click the decimal column -> Transform -> Round -> Round... -> set Decimal Places to 1), VertiPaq uses Value encoding.
Unfortunately, this configuration does not work if you set Decimal Places to more than 1, even if the table has only 200,000 rows.
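For reference, the Power Query rounding step described above corresponds to an M transformation along these lines (the Source step and the "Amount" column name are placeholders for your own query):

```
// Round the decimal column to 1 decimal place so its cardinality drops
#"Rounded Off" = Table.TransformColumns(Source, {{"Amount", each Number.Round(_, 1), type number}})
```

This is what the Transform -> Round -> Round... UI action generates; rounding reduces the number of distinct values in the column, which is presumably why the Value encoding hint then takes effect.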