Let me try this again. I posted this once but it was deleted as spam.
Hello!
I have been racking my brain over this and have not been able to figure it out. I may be overthinking it. 
As an example of what I am looking for, I have a table of data that includes a list of orders, with order number, vendor, and color (either Blue or Red). I would like to pull a random sample of the data in a certain way. If the vendor's volume is between 1 and 20, I would like to pull 2 records. If the volume is between 21 and 40, I would like to pull 4 records. If the volume is over 40, I would like to pull 6 records.
Going further, if the number pulled is 2, I would like them to be 1 Blue record and 1 Red record. If the number pulled is 4, I would like 1 Blue record and 3 Red records. If the number pulled is 6, I would like 2 Blue records and 4 Red records.
I am hoping to get them all in a table or two. So if the volumes are as follows:
| Vendor | Total Volume | Random Pull Total | Random Blue Amount | Random Red Amount |
| --- | --- | --- | --- | --- |
| Bob | 36 | 4 | 1 | 3 | 
| Jane | 50 | 6 | 2 | 4 | 
| Joe | 28 | 4 | 1 | 3 | 
| Sam | 9 | 2 | 1 | 1 | 
Then, if I select Joe, I will get 4 random orders, with 1 being a random Blue and 3 being random Reds:
| Vendor | Order # | Color |
| --- | --- | --- |
| Joe | 84 | Blue | 
| Joe | 44 | Red | 
| Joe | 15 | Red | 
| Joe | 114 | Red | 
If I need to have two tables, one that pulls the random blues and one that pulls the random reds, this is fine. But I just cannot figure it out!
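In other words, the tier rule can be written as a tiny function. This is just a sketch: I am treating a volume of exactly 20 as the first tier, and using the 2 Blue + 4 Red split shown for Jane in the table above.

```python
def pull_counts(volume):
    """Map a vendor's order volume to (blue, red) sample sizes.

    Tiers: volume <= 20 -> 2 records (1 Blue, 1 Red)
           volume <= 40 -> 4 records (1 Blue, 3 Red)
           volume  > 40 -> 6 records (2 Blue, 4 Red)
    """
    if volume <= 20:
        return (1, 1)
    elif volume <= 40:
        return (1, 3)
    return (2, 4)
```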
I tried to provide a OneDrive link to some sample data but my post was marked as spam and deleted, so to be safe, I will not post any data. Sorry.
I appreciate any help!
Thank you!
Solved! Go to Solution.
@Anonymous
How many rows are we talking about?
Perhaps something like Python would be more suitable for this
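To illustrate the Python route: a rough pandas sketch of the tiered pull, assuming the `Vendor` and `Color` column names from the example. The draw size is capped so a vendor with fewer rows of a color than its quota does not raise an error.

```python
import pandas as pd

def stratified_pull(df: pd.DataFrame) -> pd.DataFrame:
    """Draw a random tiered sample per vendor:
    volume <= 20 -> 1 Blue + 1 Red, <= 40 -> 1 Blue + 3 Red, else 2 Blue + 4 Red."""
    pulls = []
    for vendor, grp in df.groupby("Vendor"):
        vol = len(grp)  # volume = number of orders for this vendor
        blue_n, red_n = (1, 1) if vol <= 20 else (1, 3) if vol <= 40 else (2, 4)
        for color, n in (("Blue", blue_n), ("Red", red_n)):
            pool = grp[grp["Color"] == color]
            # cap n so vendors with too few rows of a color do not raise
            pulls.append(pool.sample(n=min(n, len(pool))))
    return pd.concat(pulls, ignore_index=True)
```

Every rerun produces a new draw; pass `random_state=` to `sample` if you need a reproducible pull.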
Please accept the solution when done and consider giving a thumbs up if posts are helpful. Contact me privately for support with any larger-scale BI needs, tutoring, etc.
@AlB, the table with the vendors, order numbers, colors, etc. currently has 1,979,640 rows and counting. 😕 There is also a date column, though, and I really only need the orders from *this year*, which would currently reduce the rows to 349,483. Can we add some sort of date filter to that M code?
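For what it's worth, the kind of date filter meant here is just a row filter applied before the sampling. A minimal pandas sketch, assuming a hypothetical `Date` column and a hard-coded year:

```python
import pandas as pd

orders = pd.DataFrame({
    "Order #": [1, 2, 3],
    "Date": pd.to_datetime(["2024-12-31", "2025-02-01", "2025-06-15"]),
})

# keep only this year's orders before doing any sampling
this_year = orders[orders["Date"].dt.year == 2025]
```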
@Anonymous
Can you share the actual data? Send it over private message if necessary.
You have to share the URL to the file hosted elsewhere: Dropbox, Onedrive... or just upload the file to a site like tinyupload.com (no sign-up required).
I am not able to upload or share any actual data; it is very proprietary, and the amount of data makes it too hard to redact/spoof the proprietary information. Your solution did not work for me, I am sorry; there is too much data for this to work. Thank you for your help, though.
@Anonymous
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("bZWxUgMxDET/5eoUli3bUktJCeVNijCkA1Lx/5Cb4Uiyr7liT5K13pW1rosth+X5cv79Pn18n5fjYV3qFTp9XbGX8/sGtev/y9sN4juy53UtNf5L7dgELPbUv/opJ1qBRDMCqyY3inPt2Lq0YkPLTaVvSsJSr7ISjUoyKItKLKrDIZ0ChzZdJyQrkZqa2ohIAyKNDKWOamCppmq0AdXIUy2gFbWVEw03PcSr9ucN4lx6dhgNBzV8anvAwsFWnWh0UKOrrTqw6E71yFYdiHSwVVdbdbDVICIDiIxKgeqr4QrRUzUAUz1GKLOR0MksGjhN+U6YjtnkqiZMx+yAwXhMGo8JRCboEaRHmDQY6qug5yrg0Q3yVeizG2CrCMolRbIou6T9kVXYpfoqQZEkIklbMEmTDD0FZt1K0Tu0YpJsBcxlBXdhoWVYwGFWwGJWQBorQemkjRlRMrWZ0XK3BueYvsO2bffX0+cdNnbsJnkSGASmltx2/CNmkLzt+MfAdo8dfwA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"Order #" = _t, Vendor = _t, Color = _t]),
    #"Added Index" = Table.AddIndexColumn(Source, "Index", 0, 1, Int64.Type),
    #"Added Custom" = Table.AddColumn(#"Added Index", "RandNum_", each Number.Random(), type number),
    #"Sorted Rows" = Table.Sort(Table.Buffer(#"Added Custom"),{{"Vendor", Order.Ascending}, {"Color", Order.Ascending}, {"RandNum_", Order.Ascending}}),
    #"Grouped Rows0" = Table.Group(#"Sorted Rows", {"Vendor"}, {{"Tot volume", each List.Count([#"Order #"])}}),
    #"Grouped Rows1" = Table.Group(Table.Buffer(#"Sorted Rows"), {"Vendor", "Color"}, {{"Temp", each _}}),
    #"Merged Queries" = Table.NestedJoin(#"Grouped Rows1", {"Vendor"}, #"Grouped Rows0", {"Vendor"}, "Tot Volume", JoinKind.LeftOuter),
    #"Expanded Tot Volume" = Table.ExpandTableColumn(#"Merged Queries", "Tot Volume", {"Tot volume"}, {"Tot volume"}),
    #"Added Custom1" = Table.AddColumn(#"Expanded Tot Volume", "Sample", each
                           // volume <= 20: take 1 per color (1 Blue + 1 Red = 2 records)
                           if [Tot volume] <= 20 then Table.FirstN([Temp],1)
                           // volume <= 40: 1 Blue + 3 Red = 4 records
                           else if [Tot volume] <= 40 then (if [Color] = "Blue" then Table.FirstN([Temp],1) else Table.FirstN([Temp],3))
                           // volume > 40: 2 Blue + 4 Red = 6 records
                           else (if [Color] = "Blue" then Table.FirstN([Temp],2) else Table.FirstN([Temp],4))),
    #"Expanded Sample" = Table.ExpandTableColumn(#"Added Custom1", "Sample", {"Order #"}, {"Order #"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded Sample",{"Temp", "Tot volume"}),
    #"Sorted Rows1" = Table.Sort(#"Removed Columns",{{"Vendor", Order.Ascending}, {"Color", Order.Ascending}})
in
    #"Sorted Rows1"
Thanks @AlB, I have given this a try; however, it has been evaluating all day. I am actually working with a huge amount of data (the sample I gave was an example), so if it ever loads I will be able to see if it works! Thanks!
@Anonymous
So how do we translate this: "If the vendor's volume is between 1 - 20, I would like to pull 2 records. If the volume is 20 - 40, I would like to pull 4 records. If the volume is over 40, I would like to pull 6 records."
to the data you have now? What is the volume then, the number of rows?
Yes @AlB, the number of rows where they are listed as the vendor. I generally do a distinct count of the order number and then apply a visual filter on the vendor, though I could write a measure for this with a filter as well. Thank you!
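That per-vendor volume (a distinct count of order numbers) can be computed up front. A minimal pandas sketch, with column names assumed from the example:

```python
import pandas as pd

orders = pd.DataFrame({
    "Order #": [1, 2, 3, 3, 4],          # note the duplicated order 3
    "Vendor":  ["Joe", "Jane", "Bob", "Bob", "Joe"],
})

# "Total Volume" per vendor = distinct count of order numbers
volume = orders.groupby("Vendor")["Order #"].nunique()
```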
Hello @AlB !
I realize now that, since I had to remove my sample data, some of this may not have made sense, and I am not sure your suggestion works with the type of data I have. The table that contains my data is a CSV like this (sorry it is a little long, but I cannot attach or link to it):
| Order # | Vendor | Color |
| --- | --- | --- |
| 1 | Joe | Blue | 
| 2 | Jane | Red | 
| 3 | Bob | Red | 
| 4 | Bob | Blue | 
| 5 | Joe | Blue | 
| 6 | Jane | Blue | 
| 7 | Jane | Blue | 
| 8 | Joe | Red | 
| 9 | Bob | Red | 
| 10 | Jane | Blue | 
| 11 | Jane | Blue | 
| 12 | Bob | Red | 
| 13 | Jane | Blue | 
| 14 | Joe | Blue | 
| 15 | Joe | Red | 
| 16 | Bob | Red | 
| 17 | Bob | Blue | 
| 18 | Joe | Red | 
| 19 | Jane | Red | 
| 20 | Jane | Blue | 
| 21 | Joe | Blue | 
| 22 | Bob | Red | 
| 23 | Jane | Blue | 
| 24 | Jane | Red | 
| 25 | Jane | Blue | 
| 26 | Bob | Blue | 
| 27 | Jane | Red | 
| 28 | Joe | Red | 
| 29 | Bob | Blue | 
| 30 | Jane | Blue | 
| 31 | Joe | Blue | 
| 32 | Jane | Red | 
| 33 | Bob | Red | 
| 34 | Bob | Blue | 
| 35 | Joe | Red | 
| 36 | Jane | Red | 
| 37 | Jane | Blue | 
| 38 | Joe | Blue | 
| 39 | Bob | Red | 
| 40 | Jane | Blue | 
| 41 | Jane | Red | 
| 42 | Bob | Blue | 
| 43 | Jane | Red | 
| 44 | Joe | Red | 
| 45 | Joe | Blue | 
| 46 | Bob | Blue | 
| 47 | Bob | Red | 
| 48 | Joe | Blue | 
| 49 | Jane | Red | 
| 50 | Jane | Blue | 
| 51 | Joe | Blue | 
| 52 | Bob | Red | 
| 53 | Jane | Red | 
| 54 | Jane | Blue | 
| 55 | Jane | Blue | 
| 56 | Bob | Blue | 
| 57 | Jane | Red | 
| 58 | Joe | Red | 
| 59 | Bob | Blue | 
| 60 | Jane | Blue | 
| 61 | Joe | Blue | 
| 62 | Jane | Blue | 
| 63 | Bob | Red | 
| 64 | Bob | Red | 
| 65 | Joe | Blue | 
| 66 | Joe | Blue | 
| 67 | Bob | Red | 
| 68 | Jane | Red | 
| 69 | Jane | Blue | 
| 70 | Jane | Red | 
| 71 | Bob | Blue | 
| 72 | Jane | Red | 
| 73 | Joe | Red | 
| 74 | Bob | Blue | 
| 75 | Bob | Blue | 
| 76 | Jane | Red | 
| 77 | Jane | Blue | 
| 78 | Jane | Red | 
| 79 | Bob | Blue | 
| 80 | Jane | Blue | 
| 81 | Joe | Red | 
| 82 | Bob | Red | 
| 83 | Jane | Blue | 
| 84 | Joe | Blue | 
| 85 | Jane | Blue | 
| 86 | Bob | Red | 
| 87 | Jane | Red | 
| 88 | Jane | Blue | 
| 89 | Jane | Blue | 
| 90 | Bob | Blue | 
| 91 | Jane | Blue | 
| 92 | Joe | Red | 
| 93 | Bob | Red | 
| 94 | Bob | Blue | 
| 95 | Jane | Blue | 
| 96 | Jane | Blue | 
| 97 | Jane | Blue | 
| 98 | Bob | Red | 
| 99 | Jane | Red | 
| 100 | Joe | Blue | 
| 101 | Bob | Red | 
| 102 | Jane | Red | 
| 103 | Jane | Blue | 
| 104 | Joe | Blue | 
| 105 | Bob | Blue | 
| 106 | Jane | Red | 
| 107 | Jane | Red | 
| 108 | Bob | Blue | 
| 109 | Jane | Blue | 
| 110 | Joe | Blue | 
| 111 | Joe | Red | 
| 112 | Bob | Red | 
| 113 | Bob | Blue | 
| 114 | Joe | Red | 
| 115 | Sam | Red | 
| 116 | Sam | Blue | 
| 117 | Sam | Blue | 
| 118 | Sam | Blue | 
| 119 | Sam | Red | 
| 120 | Sam | Red | 
| 121 | Sam | Blue | 
| 122 | Sam | Red | 
| 123 | Sam | Red | 
There is no volume column. I tried to make what you sent work but it does not look like I am able to. Any suggestions now that I have copied and pasted some sample data?
Thank you so much!
Hi @Anonymous
Place the following M code in a blank query to see the steps of a possible solution. Every time you refresh, the random sample will be updated.
let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("ddJNC4MwDAbg/9Kzh6W1rV497jLYjiLDYW/7AGH/f03JnOmSQ6H0IXljcBzN8LqZxpzWJa3XQ75BPsP9nczUcESw+ZzT8mf47hTD93bf9Dg/06YtRX4rGfoqkmGgTLlvpFSxtMs3r2Gfb0HtC7ijqNUCfkq3L77Mj5/Wa2LoqokZtjSx3NfXIzMNNLGskSYWYxF6DRHKOiS15V8CLdailH3IjAJOZRTYljV9AA==", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Vendor = _t, OrderNum = _t, Volume = _t, Color = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Vendor", type text}, {"Volume", Int64.Type}, {"Color", type text}}),
    #"Added Index" = Table.AddIndexColumn(#"Changed Type", "Index", 0, 1, Int64.Type),
    #"Added Custom" = Table.AddColumn(#"Added Index", "RandNum_", each Number.Random(), type number),
    #"Sorted Rows" = Table.Sort(Table.Buffer(#"Added Custom"),{{"Vendor", Order.Ascending}, {"Color", Order.Ascending}, {"RandNum_", Order.Ascending}}),
    #"Grouped Rows0" = Table.Group(#"Sorted Rows", {"Vendor"}, {{"Tot volume", each List.Sum([Volume]), type nullable number}}),
    #"Grouped Rows1" = Table.Group(Table.Buffer(#"Sorted Rows"), {"Vendor", "Color"}, {{"Temp", each _}}),
    #"Merged Queries" = Table.NestedJoin(#"Grouped Rows1", {"Vendor"}, #"Grouped Rows0", {"Vendor"}, "Tot Volume", JoinKind.LeftOuter),
    #"Expanded Tot Volume" = Table.ExpandTableColumn(#"Merged Queries", "Tot Volume", {"Tot volume"}, {"Tot volume"}),
    #"Added Custom1" = Table.AddColumn(#"Expanded Tot Volume", "Sample", each
                           // volume <= 20: take 1 per color (1 Blue + 1 Red = 2 records)
                           if [Tot volume] <= 20 then Table.FirstN([Temp],1)
                           // volume <= 40: 1 Blue + 3 Red = 4 records
                           else if [Tot volume] <= 40 then (if [Color] = "Blue" then Table.FirstN([Temp],1) else Table.FirstN([Temp],3))
                           // volume > 40: 2 Blue + 4 Red = 6 records
                           else (if [Color] = "Blue" then Table.FirstN([Temp],2) else Table.FirstN([Temp],4))),
    #"Expanded Sample" = Table.ExpandTableColumn(#"Added Custom1", "Sample", {"OrderNum", "Volume"}, {"OrderNum", "Volume"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded Sample",{"Temp", "Tot volume"})
in
    #"Removed Columns"