I am loading data into Azure Data Lake from an IoT Stream Analytics job as JSON line elements, as described in the guide
https://azure.microsoft.com/en-us/documentation/articles/iot-hub-csharp-csharp-getstarted/
I would then like to load this data into Power BI Desktop using the data lake connector, described in the guide
https://azure.microsoft.com/da-dk/documentation/articles/data-lake-store-power-bi/
This works well with CSV files and true JSON files, but I can't get the JSON line data from the Data Lake transformed into a valid data source in Power BI.
I have successfully used Binary.Combine as suggested in this blog post:
but this is not a solution that my users will be able to use.
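For reference, the Binary.Combine workaround looks roughly like the following M sketch. The account URL, folder path, and field names (deviceId, temperature) are placeholders, not from the original thread:

```m
// Sketch only: combine all JSON-line files in a Data Lake Store folder
// into one binary, split it into lines, and parse each line as JSON.
let
    // Placeholder account/folder path
    Source = DataLake.Contents("adl://myaccount.azuredatalakestore.net/streamout"),
    // Concatenate the contents of every file in the folder
    Combined = Binary.Combine(Source[Content]),
    // One JSON fragment per line
    Lines = Lines.FromBinary(Combined),
    Parsed = List.Transform(Lines, Json.Document),
    AsTable = Table.FromList(Parsed, Splitter.SplitByNothing(), {"Record"}),
    // Placeholder field names; expand whichever fields your records contain
    Expanded = Table.ExpandRecordColumn(AsTable, "Record", {"deviceId", "temperature"})
in
    Expanded
```

As noted above, this requires hand-editing the query in the Advanced Editor, which is what makes it impractical for end users.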
Any ideas?
I am seeing the same issue. I navigated to the JSON file within a folder inside Data Lake Store. Once I click on the file, it automatically tries to import it as CSV. I removed the automatically created Import CSV and Changed Type steps; then, on the Navigation step, I could select Import JSON. I then get the following error. The JSON file was created by a copy pipeline from an on-premises SQL Server table to Data Lake Store as JSON using Data Factory.
I ran into the same thing, but got it to work. My scenario is taking data from Stream Analytics in Azure to a Data Lake Store as JSON fragments, in folders named by date and by hour. In the query editor I navigated to the folder that contains the JSON files and selected "Combine Files", then I added a Transform to parse the data as JSON. After that, sample enough of the data to get all the column names (we have multiple record types, which results in a sparsely populated matrix). Once the data looks the way you want, close and apply the changes.
Something to try in your query on the JSON fragments: when editing the query, go up to the directory that contains all the files and click the link to combine them, then add a Transform to parse the dataset as JSON. After that, you need to sample the headers to get the column names/types for the JSON data.
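The steps above can be sketched in M roughly as follows. In practice the "Combine Files" UI generates these steps for you; the folder URL and field names (deviceId, reading, time) here are placeholders:

```m
// Sketch only: navigate to the folder, parse each file's lines as
// individual JSON records, then expand the records into columns.
let
    // Placeholder account/folder path (e.g. a date/hour folder)
    Source = DataLake.Contents("adl://myaccount.azuredatalakestore.net/streamout/2017/01"),
    // One row per file; turn each file into a list of parsed JSON records
    Parsed = Table.AddColumn(Source, "Records",
        each List.Transform(Lines.FromBinary([Content]), Json.Document)),
    // One row per record
    ExpandedRows = Table.ExpandListColumn(Parsed, "Records"),
    // Placeholder field names; with mixed record types, sample enough
    // rows first so every column name is discovered
    Final = Table.ExpandRecordColumn(ExpandedRows, "Records", {"deviceId", "reading", "time"})
in
    Final
```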
Hi @theflyingpanda,
I tried to reproduce your issue. However, after I stream data from an IoT Stream Analytics job into Azure Data Lake following the instructions in this similar article, I can successfully connect to Azure Data Lake from Power BI Desktop. You can view the following screenshots for more details.
In your scenario, how about storing the IoT Hub data in Table Storage and connecting to Azure Table Storage from Power BI Desktop?
Thanks,
Lydia Zhang
Correct - storing the data in an Azure Table is one way, CSV another, and SQL Server a third. But I need a simple way of accessing the JSON line data in Power BI without writing code 😐
Bjørn
Hi @theflyingpanda,
When I load data into Azure Data Lake from an IoT Stream Analytics job as JSON line elements and connect to Azure Data Lake from Power BI Desktop, I don't need to write any code. Did you customize the IoT Stream Analytics job following the first link?
Thanks,
Lydia Zhang