I am attempting to use Power Query to return some metrics from Azure Monitor. Specifically, the function below retrieves the autoscale settings for Cosmos DB collections, and it works fine as long as at least one collection has autoscale enabled.
(ResourceUri as text) =>
let
    timeSpanUnencoded = "PT10M",
    // Query the Azure Monitor metrics API for AutoscaleMaxThroughput across all databases and collections of the account.
    Source = Json.Document(Web.Contents(
        "https://management.azure.com/" &
        ResourceUri & "/providers/microsoft.insights/metrics?api-version=2018-01-01&%24filter=CollectionName%20eq%20%27*%27%20and%20DatabaseName%20eq%20%27*%27&interval=PT5M&metricnames=AutoscaleMaxThroughput&top=100000",
        [Query=[timespan=timeSpanUnencoded]])),
    value = Source[value],
    value1 = value{0},
    timeseries = value1[timeseries],
    // Turn the timeseries list into a table and expand the per-collection metadata and data points.
    #"Converted to Table" = Table.FromList(timeseries, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"metadatavalues", "data"}, {"Column1.metadatavalues", "Column1.data"}),
    #"Added Index" = Table.AddIndexColumn(#"Expanded Column1", "CollectionId", 0, 1, Int64.Type),
    #"Expanded Column1.metadatavalues" = Table.ExpandListColumn(#"Added Index", "Column1.metadatavalues"),
    #"Expanded Column1.metadatavalues1" = Table.ExpandRecordColumn(#"Expanded Column1.metadatavalues", "Column1.metadatavalues", {"name", "value"}, {"Column1.metadatavalues.name", "metadata_value"}),
    #"Expanded Column1.metadatavalues.name" = Table.ExpandRecordColumn(#"Expanded Column1.metadatavalues1", "Column1.metadatavalues.name", {"value"}, {"metadata_descriptor"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded Column1.metadatavalues.name",{{"metadata_descriptor", type text}, {"metadata_value", type text}}),
    #"Expanded Column1.data" = Table.ExpandListColumn(#"Changed Type", "Column1.data"),
    #"Expanded Column1.data1" = Table.ExpandRecordColumn(#"Expanded Column1.data", "Column1.data", {"timeStamp", "maximum"}, {"timeStamp", "AutoscaleMaxThroughput"}),
    #"Changed Type1" = Table.TransformColumnTypes(#"Expanded Column1.data1",{{"AutoscaleMaxThroughput", type number}, {"timeStamp", type datetimezone}}),
    // Split the metadata descriptor/value pairs into collectionname and databasename columns.
    #"Added Custom" = Table.AddColumn(#"Changed Type1", "collectionname", each if [metadata_descriptor] = "collectionname" then [metadata_value] else ""),
    #"Added Custom1" = Table.AddColumn(#"Added Custom", "databasename", each if [metadata_descriptor] = "databasename" then [metadata_value] else ""),
    #"Removed Columns" = Table.RemoveColumns(#"Added Custom1",{"metadata_descriptor", "metadata_value"}),
    // Collapse back to one row per collection, keeping the metadata values and the highest observed AutoscaleMaxThroughput.
    #"Grouped Rows" = Table.Group(#"Removed Columns", {"CollectionId"}, {{"CollectionName", each List.Max([collectionname]), type text}, {"DatabaseName", each List.Max([databasename]), type text}, {"AutoscaleMaxThroughput", each List.Max([AutoscaleMaxThroughput]), type nullable number}}),
    #"Removed Columns1" = Table.RemoveColumns(#"Grouped Rows",{"CollectionId"}),
    // Derive a lowercase key from the collection and database names, and extract the account name from the resource URI.
    #"Inserted Merged Column" = Table.AddColumn(#"Removed Columns1", "db_collection_to_lower", each Text.Lower(Text.Combine({[CollectionName], [DatabaseName]}, ":"), "en-GB"), type text),
    #"Added Custom2" = Table.AddColumn(#"Inserted Merged Column", "AccountName", each ResourceUri),
    #"Extracted Text After Delimiter" = Table.TransformColumns(#"Added Custom2", {{"AccountName", each Text.AfterDelimiter(_, "/", {0, RelativePosition.FromEnd}), type text}})
in
    #"Extracted Text After Delimiter"
The problem I am having with it is when I run it against accounts that do not currently have autoscale enabled on any collection. In that eventuality the timeseries array in the response is empty and the query errors.
What is the best way of solving this problem?
My desired result would be that if there is an empty timeseries, the overall query does not error but instead returns an empty result set with the same final columns as in the happy path.
A sample of how the JSON looks in the happy path is:
{
  "cost": 9,
  "timespan": "2021-09-29T11:28:54Z/2021-09-29T11:38:54Z",
  "interval": "PT5M",
  "value": [
    {
      "id": "redacted/Microsoft.Insights/metrics/AutoscaleMaxThroughput",
      "type": "Microsoft.Insights/metrics",
      "name": {
        "value": "AutoscaleMaxThroughput",
        "localizedValue": "Autoscale Max Throughput"
      },
      "displayDescription": "Autoscale Max Throughput",
      "unit": "Count",
      "timeseries": [
        {
          "metadatavalues": [
            {
              "name": {
                "value": "collectionname",
                "localizedValue": "collectionname"
              },
              "value": "MyCollectionName"
            },
            {
              "name": {
                "value": "databasename",
                "localizedValue": "databasename"
              },
              "value": "MyDatabaseName"
            }
          ],
          "data": [
            {
              "timeStamp": "2021-09-29T11:28:00Z",
              "maximum": 200000.0
            },
            {
              "timeStamp": "2021-09-29T11:33:00Z",
              "maximum": 200000.0
            }
          ]
        }
      ],
      "errorCode": "Success"
    }
  ],
  "namespace": "Microsoft.DocumentDB/databaseAccounts",
  "resourceregion": "westeurope"
}
Hello - here is one way you can handle this.
SCRIPT OPTION #1
let
    // This can be replaced with Record.FieldNames, etc. to generate the list dynamically.
    ColumnNames = {"CollectionName", "DatabaseName", "AutoscaleMaxThroughput", "db_collection_to_lower", "AccountName"},

    //------------------------------------------------------------------------------------------------
    // Sample List Values
    // Toggle between the result when the list is empty vs. when it is not empty by
    // commenting one of the ListValues and uncommenting the other.
    //------------------------------------------------------------------------------------------------
    // Sample value for an empty list.
    ListValue = {},
    // Sample value for a list that is not empty.
    /*
    ListValue = {
        [
            CollectionName = "abc",
            DatabaseName = "def",
            AutoscaleMaxThroughput = "ghi",
            db_collection_to_lower = "jkl",
            AccountName = "mno"
        ]
    },
    */

    // Define a list of null records with field names that match that of the populated records.
    EmptyListHandler = { Record.FromList ( List.Repeat ( { null }, List.Count ( ColumnNames ) ), ColumnNames ) },

    // Define the list to be converted based on the contents. If the ListValue is empty, then the Empty List Handler is returned.
    ListToConvert = if ListValue = {} then EmptyListHandler else ListValue,

    // Convert to table on the ListToConvert
    ListToTable = Table.FromList(ListToConvert, Splitter.SplitByNothing(), null, null, ExtraValues.Error)
in
    ListToTable
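On the first comment in the script (generating the column list dynamically): if a sample record from a populated run is available, Record.FieldNames can produce the same list without hard-coding it. A small illustration of that idea; the SampleRecord value here is made up purely for demonstration:
let
    // Hypothetical record with the same shape as a populated row.
    SampleRecord = [CollectionName = "abc", DatabaseName = "def", AutoscaleMaxThroughput = 200000,
                    db_collection_to_lower = "abc:def", AccountName = "myaccount"],
    // Equivalent to the hard-coded ColumnNames list in the script above.
    ColumnNames = Record.FieldNames(SampleRecord)
in
    ColumnNames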
If this post helps to answer your questions, please consider marking it as a solution so others can find it more quickly when faced with a similar challenge.
Proud to be a Microsoft Fabric Super User
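A variation on the same guard, if an empty result set (as asked for above) is preferred over a single row of nulls, is to test the timeseries list before the expansion steps and short-circuit to an empty table with the final column names. This is only a minimal, self-contained sketch of the idea, not the original query: the timeseries value and the single downstream step are stand-ins for the corresponding steps in the function above.
let
    // Final happy-path column layout (hard-coded here, as in Option 1).
    ColumnNames = {"CollectionName", "DatabaseName", "AutoscaleMaxThroughput", "db_collection_to_lower", "AccountName"},
    // Stand-in for value1[timeseries]; it is empty when no collection has autoscale enabled.
    timeseries = {},
    // Stand-in for the existing downstream steps (only the first one is shown here).
    #"Converted to Table" = Table.FromList(timeseries, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    // Because let steps are evaluated lazily, the downstream steps never run (and so never error)
    // when the timeseries list is empty.
    Result =
        if List.IsEmpty(timeseries) then
            #table(ColumnNames, {})   // empty table with the expected columns
        else
            #"Converted to Table"     // in the real function this would be the final step, #"Extracted Text After Delimiter"
in
    Result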
Option 1 worked fine for my scenario, thanks.
It gives me a row with null values, but this doesn't cause any errors and I can filter that row out.
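For reference, that placeholder row can be dropped with a single filter step. A minimal sketch, assuming CollectionName is null only on the placeholder row; the Source step here is a stand-in for the output of the previous query step:
let
    // Stand-in for the output of the previous step (one placeholder row of nulls).
    Source = #table(
        {"CollectionName", "DatabaseName", "AutoscaleMaxThroughput", "db_collection_to_lower", "AccountName"},
        {{null, null, null, null, null}}),
    // Keep only rows that describe a real collection.
    #"Removed Null Rows" = Table.SelectRows(Source, each [CollectionName] <> null)
in
    #"Removed Null Rows"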
Awesome!