I think I have nearly got it, I just can't quite work out the last bit; any help appreciated.
I am trying to increment the range so that it shows increments of 0.01. The PAG table is the source list and the second table is the output.
What I ideally need is to get 0.01, 0.02 ... 1.21, 1.22, 1.23 and so on.
let
Source = PAG,
#"Changed column type" = Table.TransformColumnTypes(Source, {{"Start Range", type number}, {"Finish Range", type number}}),
List = Table.AddColumn(#"Changed column type", "Key Stage 1 Average PAG", each List.Numbers([Start Range], Number.Round(([Finish Range]-[Start Range])*10+0.1),0.1)),
#"Expanded Key Stage 1 Average PAG" = Table.ExpandListColumn(List, "Key Stage 1 Average PAG")
in
#"Expanded Key Stage 1 Average PAG"
PAG Table
Group | Start Range | Finish Range | Reading(PAG) | Writing(PAG) | Maths(PAG) |
---|---|---|---|---|---|
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 |
2 | 1.5000001 | 1.7499999 | 67.75 | 67.11 | 67.84 |
3 | 1.7500001 | 1.999999 | 70.6 | 69.33 | 70.8 |
4 | 2.0000001 | 2.999999 | 73.65 | 72.27 | 74.13 |
5 | 3 | 3.249999 | 80.32 | 77.85 | 80.49 |
6 | 3.25 | 3.499999 | 82.08 | 79.41 | 82.23 |
7 | 3.5 | 3.749999 | 85.74 | 82.46 | 85.37 |
8 | 3.75 | 3.999999 | 88.53 | 84.51 | 88.05 |
9 | 4 | 4 | 90.51 | 86.66 | 89.38 |
10 | 4.000001 | 4.999999 | 92.24 | 87.86 | 92.18 |
11 | 5 | 5 | 94.22 | 89.77 | 93.38 |
12 | 5.000001 | 5.999999 | 95.58 | 90.33 | 95.1 |
13 | 6 | 6 | 96.78 | 93.01 | 95.81 |
14 | 6.000001 | 6.999999 | 100.85 | 96.35 | 97.86 |
15 | 7 | 7 | 101.04 | 97.37 | 100.93 |
16 | 7.000001 | 7.999999 | 103.29 | 98.23 | 102.84 |
17 | 8 | 8 | 105.38 | 102.02 | 104.61 |
18 | 8.000001 | 9.499999 | 109.5 | 104.72 | 107.85 |
19 | 9.5 | 9.999999 | 111.29 | 105.5 | 110.84 |
20 | 10 | 10 | 113.29 | 108.88 | 112.15 |
The output doesn't increment by 0.01:
PAG Group | Start Range | Finish Range | Reading(PAG) | Writing(PAG) | Maths(PAG) | Key Stage 1 Average PAG |
---|---|---|---|---|---|---|
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 0.000001 |
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 0.100001 |
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 0.200001 |
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 0.300001 |
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 0.400001 |
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 0.500001 |
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 0.600001 |
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 0.700001 |
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 0.800001 |
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 0.900001 |
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 1.000001 |
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 1.100001 |
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 1.200001 |
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 1.300001 |
1 | 0.000001 | 1.499999 | 62.84 | 62.56 | 63.05 | 1.400001 |
2 | 1.5 | 1.75 | 67.75 | 67.11 | 67.84 | 1.5 |
2 | 1.5 | 1.75 | 67.75 | 67.11 | 67.84 | 1.6 |
2 | 1.5 | 1.75 | 67.75 | 67.11 | 67.84 | 1.7 |
3 | 1.75 | 1.999999 | 70.6 | 69.33 | 70.8 | 1.75 |
3 | 1.75 | 1.999999 | 70.6 | 69.33 | 70.8 | 1.85 |
Through to:
PAG Group | Start Range | Finish Range | Reading(PAG) | Writing(PAG) | Maths(PAG) | Key Stage 1 Average PAG |
---|---|---|---|---|---|---|
18 | 8.000001 | 9.499999 | 109.5 | 104.72 | 107.85 | 8.200001 |
18 | 8.000001 | 9.499999 | 109.5 | 104.72 | 107.85 | 8.300001 |
18 | 8.000001 | 9.499999 | 109.5 | 104.72 | 107.85 | 8.400001 |
18 | 8.000001 | 9.499999 | 109.5 | 104.72 | 107.85 | 8.500001 |
18 | 8.000001 | 9.499999 | 109.5 | 104.72 | 107.85 | 8.600001 |
18 | 8.000001 | 9.499999 | 109.5 | 104.72 | 107.85 | 8.700001 |
18 | 8.000001 | 9.499999 | 109.5 | 104.72 | 107.85 | 8.800001 |
18 | 8.000001 | 9.499999 | 109.5 | 104.72 | 107.85 | 8.900001 |
18 | 8.000001 | 9.499999 | 109.5 | 104.72 | 107.85 | 9.000001 |
18 | 8.000001 | 9.499999 | 109.5 | 104.72 | 107.85 | 9.100001 |
18 | 8.000001 | 9.499999 | 109.5 | 104.72 | 107.85 | 9.200001 |
18 | 8.000001 | 9.499999 | 109.5 | 104.72 | 107.85 | 9.300001 |
18 | 8.000001 | 9.499999 | 109.5 | 104.72 | 107.85 | 9.400001 |
19 | 9.5 | 9.999999 | 111.29 | 105.5 | 110.84 | 9.5 |
19 | 9.5 | 9.999999 | 111.29 | 105.5 | 110.84 | 9.6 |
19 | 9.5 | 9.999999 | 111.29 | 105.5 | 110.84 | 9.7 |
19 | 9.5 | 9.999999 | 111.29 | 105.5 | 110.84 | 9.8 |
19 | 9.5 | 9.999999 | 111.29 | 105.5 | 110.84 | 9.9 |
20 | 10 | 10 | 113.29 | 108.88 | 112.15 | null |
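(I assume the null on the last row appears because Group 20 has Start Range = Finish Range = 10, so my count argument comes out as Number.Round((10 - 10) * 10 + 0.1) = 0 and there is nothing to expand.)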
Hi, @CEllis
Perhaps you can use the following M expression:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("TVJBEsQgCPtLzzsMqAi+ZWf//42FYGs7YxFNQhS/30uuz8XE+eVUaKz8Yjob+aioM2Mn1uv3+V4NQH2R7LCMTCuKVAyVZPVC6mGtm2RMqLCo90odlCzfyhwo7UXpNLOONWqWcZB0kHI1VTq1scHO1NO0hRmtfCyAZwHBOUf3KOqJXzSk8lbiBmDB7ZHXmBdszMq7Ae4FLPzj3Z00Hfoglcr3zebu2GPx3p00oRq3U9cinJDTtHGkVziFlTjprFw2KZG6xxrUWmlaHmn1RzuX9WjrS1tJvYyhTZFLcTKbe6xJ5iUJgUD5huE1Hel5pIW5OhPsjgj/YKHLewgL8aj9bpu4qjWS1e3I21s+eowjeDYyF9r9LCVlfA9hzYsoAOOd86C5/QN1CqzzYIQXHkWirWh4aqChMLbXy5JIWcqSoArflhpj/flJv6FODncSbQ313x8=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Group = _t, #"Start Range" = _t, #"Finish Range" = _t, #"Reading(PAG)" = _t, #"Writing(PAG)" = _t, #"Maths(PAG)" = _t]),
#"Changed column type" = Table.TransformColumnTypes(Source, {{"Start Range", type number}, {"Finish Range", type number}}),
List = Table.AddColumn(#"Changed column type", "Key Stage 1 Average PAG", each List.Numbers(Number.Round([Start Range],2)+0.01, Number.Round(([Finish Range]-[Start Range]),2)*100,0.01)),
#"Expanded Key Stage 1 Average PAG" = Table.ExpandListColumn(List, "Key Stage 1 Average PAG")
in
#"Expanded Key Stage 1 Average PAG"
Here is my preview:
If this does not help, please provide more details about your desired output and a pbix file without private information (or some sample data).
Best Regards
Yongkang Hua
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.