OK, so I followed these instructions to set up continuous export of my Application Insights data:
https://docs.microsoft.com/en-us/azure/application-insights/app-insights-export-telemetry
Then I followed the instructions in Power BI (Get Data -> Azure -> Azure Blob Storage) and got this query:
= AzureStorage.Blobs("<blobContainerStorageUrl>")
where <blobContainerStorageUrl> is the URL of my blob storage container.
It comes back with data, but the rows are just the filenames of the blobs in the container. I found a post explaining how to grab the data contained within the files (https://community.powerbi.com/t5/Desktop/azure-blob/td-p/92058), but since the file type is ".blob", Power BI can't parse the contents properly, even though it's all JSON (that's just how Application Insights writes it into Blob Storage). I feel I'm pretty close; I just need the data parsed out properly.
In other words, how does one combine the Continuous Export from Azure Application Insights with Power BI?
A related post I saw mentions Stream Analytics, which is a middle step I'm hoping I can skip: https://community.powerbi.com/t5/Service/Data-not-appearing-in-PowerBI/td-p/24648
Hi @ochavez,
If you're looking to explore your data in Power BI, you can do that without using Continuous Export. See: Feed Power BI from Application Insights.
In your scenario, when you get data using the Azure Blob Storage data source, what do the results look like in Query Editor? Can you share a screenshot?
Best Regards,
Qiuyun Yu
Thanks for the link, but I have gone down that path; sorry, I should have mentioned it. Basically there are two solutions there, and neither works for me. The adapter (as far as I could tell) only picks up certain data from the resource; it doesn't pick up customEvents, which is what I'm looking for.
The other solution (going to Application Insights and getting the Power BI query) works great, but the whole point of this endeavor is to query historical data, since Application Insights only retains it for 90 days (which is also why the adapter solution doesn't work out). That query would effectively be doing the same thing as querying it from AI Analytics myself (unless I'm missing something and the query saves the data to my desktop, but nothing clearly states that).
The solution that got me closest was Continuous Export (which stores the telemetry in Blob Storage, retaining it beyond the 90-day limit) and then pointing Power BI at that storage. What I get back is a table of filenames for the blob files that Continuous Export pushes out, similar to this person: https://community.powerbi.com/t5/Desktop/azure-blob/td-p/92058. The big difference is that their data is in CSV files while mine is in ".blob" files (which are really just JSON files). Power BI doesn't know how to handle them and tries to parse them with a single delimiter, but JSON obviously takes more than one delimiter to parse. If the files had a ".json" extension, I think Power BI would be smart enough to handle them (I've seen that somewhere), but like I said, they are ".blob" files. When I tried with no delimiter, I got back the contents of the files row by row, but with only one column: the JSON represented as a string.
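To make it concrete, the kind of step I've been experimenting with to parse those JSON strings looks roughly like this (just a sketch; "Column1" is whatever Power BI names that single text column, and #"Previous Step" stands in for the step that produced it):

#"Parsed JSON" = Table.TransformColumns(#"Previous Step", {{"Column1", Json.Document}})

That should turn each row's JSON string into a record I can expand, but I haven't gotten the surrounding steps right yet.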
Hi @ochavez, I am working through the exact same issue as you.
I'm using Azure Application Insights to collect CustomEvents from an application/website. I have a requirement to retain and report on data for more than the 90 days of retention provided by AAI. I'd like to avoid the complexity of introducing Stream Analytics.
AAI is capturing my events, and I've successfully configured continuous export to blob storage. Power BI can connect to the blob storage, but so far I've failed to find the right combination of data parsing and manipulation in Power BI that makes the data useful for writing reports against.
Please let me know if you have made any progress, or found a creative solution.
Hi @ochavez,
As this issue requires sufficient Azure experience, I would suggest you create a support ticket to get dedicated help from a Microsoft engineer.
Best Regards,
Qiuyun Yu
I took another crack at it and I think I figured it out. It first required tweaking some of the transform steps Power BI applies to the blob binary data, which were stripping out important characters in the JSON-serialized strings. After that, I had to parse the strings as JSON and expand the data from there.
I'm surprised that, with Continuous Export from Application Insights and Power BI both being Microsoft products, there isn't already a blog post about making this integration work seamlessly (or is that reserved for the BI pros?). So if anybody needs it, here is the advanced query; just replace "<blob_storage_container_url>" with the URL of the blob container holding the blobs exported from Application Insights. From there it's just a matter of expanding the custom.dimensions column to grab all the properties you track; these are unique to whatever your custom event data contains.
let
    Source = AzureStorage.Blobs("<blob_storage_container_url>"),
    // read each blob's raw bytes as lines of text (code page 1252) so no JSON characters get stripped
    #"Invoke Custom Function1" = Table.AddColumn(Source, "JsonTransform", each Lines.FromBinary([Content], null, null, 1252)),
    #"Renamed Columns1" = Table.RenameColumns(#"Invoke Custom Function1", {"Name", "Source.Name"}),
    #"Removed Other Columns1" = Table.SelectColumns(#"Renamed Columns1", {"Source.Name", "JsonTransform"}),
    // one row per line, then parse each line as a JSON document
    #"ExpandedJsonTransform" = Table.ExpandListColumn(#"Removed Other Columns1", "JsonTransform"),
    #"Parsed JSON" = Table.TransformColumns(#"ExpandedJsonTransform", {{"JsonTransform", Json.Document}}),
    // expand the standard Application Insights export structure
    #"Expanded Transform File from Query1" = Table.ExpandRecordColumn(#"Parsed JSON", "JsonTransform", {"event", "internal", "context"}, {"event", "internal", "context"}),
    #"Expanded event" = Table.ExpandListColumn(#"Expanded Transform File from Query1", "event"),
    #"Expanded event1" = Table.ExpandRecordColumn(#"Expanded event", "event", {"name", "count"}, {"event.name", "event.count"}),
    #"Expanded context" = Table.ExpandRecordColumn(#"Expanded event1", "context", {"application", "data", "device", "user", "session", "operation", "location", "custom"}, {"application", "data", "device", "user", "session", "operation", "location", "custom"}),
    #"Expanded custom" = Table.ExpandRecordColumn(#"Expanded context", "custom", {"dimensions"}, {"custom.dimensions"})
in
    #"Expanded custom"
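For example, if your custom events track dimensions named "PageName" and "UserRole" (hypothetical names here; substitute whatever your events actually contain), the final expansion would look something like the step below. If your dimensions come through as a list of records instead of a single record, add a Table.ExpandListColumn step on "custom.dimensions" first.

#"Expanded dimensions" = Table.ExpandRecordColumn(#"Expanded custom", "custom.dimensions", {"PageName", "UserRole"}, {"custom.PageName", "custom.UserRole"})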
Hi
I am experiencing the same problem as listed above when I try to import Application Insights data that is continuously exported to an Azure Storage account as blobs.
When I import the blob data into Power BI using the query you posted, the Invoke Custom Function step fails.
Below is my query:
let
Source = AzureStorage.Blobs("XXX"),
XXX1= Source{[Name="XXX"]}[Data],
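// the Invoke Custom Function step below is the one that fails; note it references Source rather than the XXX1 step defined above, which may be part of the problem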
#"Invoke Custom Function1" = Table.AddColumn(Source, "JsonTransform", each Lines.FromBinary([Content],null,null,1252)),
#"Renamed Columns1" = Table.RenameColumns(#"Invoke Custom Function1", {"Name", "Source.Name"}),
#"Removed Other Columns1" = Table.SelectColumns(#"Renamed Columns1", {"Source.Name", "JsonTransform"}),
#"ExpandedJsonTransform" = Table.ExpandListColumn(#"Removed Other Columns1", "JsonTransform"),
#"Parsed JSON" = Table.TransformColumns(#"ExpandedJsonTransform",{{"JsonTransform", Json.Document}}),
#"Expanded Transform File from Query1" = Table.ExpandRecordColumn(#"Parsed JSON", "JsonTransform", {"event", "internal", "context"}, {"event", "internal", "context"}),
#"Expanded event" = Table.ExpandListColumn(#"Expanded Transform File from Query1", "event"),
#"Expanded event1" = Table.ExpandRecordColumn(#"Expanded event", "event", {"name", "count"}, {"event.name", "event.count"}),
#"Expanded context" = Table.ExpandRecordColumn(#"Expanded event1", "context", {"application", "data", "device", "user", "session", "operation", "location", "custom"}, {"application", "data", "device", "user", "session", "operation", "location", "custom"}),
#"Expanded custom" = Table.ExpandRecordColumn(#"Expanded context", "custom", {"dimensions"}, {"custom.dimensions"})
in
#"Expanded custom"
Please help!
Best Regards,
sherin
Hi,
I am experiencing the exact same problem after importing Azure Storage blob data (continuously exported from Application Insights) into Power BI. I followed your recommendation and pasted the query you listed above into the Advanced Editor with my own storage account details, but the query didn't work.
When I tried to combine all the blob files and transform them using a comma delimiter, I ended up with the query below:
let
Source = AzureStorage.Blobs("XXX"),
XXX1 = Source{[Name="XXXX"]}[Data],
#"Removed Other Columns" = Table.SelectColumns(XXX1,{"Content"}),
#"Invoke Custom Function1" = Table.AddColumn(#"Removed Other Columns", "Transform File from XXX (3)", each #"Transform File from XXX (3)"([Content])),
#"Removed Other Columns1" = Table.SelectColumns(#"Invoke Custom Function1", {"Transform File from XXX (3)"}),
#"Expanded Table Column1" = Table.ExpandTableColumn(#"Removed Other Columns1", "Transform File from XXX (3)", Table.ColumnNames(#"Transform File from XXX (3)"(#"Sample File (3)"))),
#"Changed Type" = Table.TransformColumnTypes(#"Expanded Table Column1",{{"Column1", type text}, {"Column2", type text}, {"Column3", type text}, {"Column4", type text}, {"Column5", type text}, {"Column6", type text}, {"Column7", type text}, {"Column8", type text}, {"Column9", type text}, {"Column10", type text}, {"Column11", type text}, {"Column12", type text}, {"Column13", type text}, {"Column14", type text}, {"Column15", type text}, {"Column16", type text}, {"Column17", type text}, {"Column18", type text}, {"Column19", type text}, {"Column20", type text}, {"Column21", type text}, {"Column22", type text}, {"Column23", type text}, {"Column24", type text}, {"Column25", type text}, {"Column26", type text}, {"Column27", type text}, {"Column28", type text}, {"Column29", type text}, {"Column30", type text}, {"Column31", type text}, {"Column32", type text}, {"Column33", type text}, {"Column34", type text}, {"Column35", type text}, {"Column36", type text}, {"Column37", type text}, {"Column38", type text}, {"Column39", type text}, {"Column40", type text}, {"Column41", type text}, {"Column42", type text}, {"Column43", type text}, {"Column44", type text}, {"Column45", type text}, {"Column46", type text}, {"Column47", type text}, {"Column48", type text}, {"Column49", type text}, {"Column50", type text}, {"Column51", type text}, {"Column52", type text}, {"Column53", type text}, {"Column54", type text}, {"Column55", type text}, {"Column56", type text}, {"Column57", type text}, {"Column58", type text}, {"Column59", type text}, {"Column60", type text}, {"Column61", type text}, {"Column62", type text}, {"Column63", type text}, {"Column64", type text}, {"Column65", type text}, {"Column66", type text}, {"Column67", type text}, {"Column68", type text}, {"Column69", type text}})
in
#"Changed Type"
Please help!