theflyingpanda
Frequent Visitor

JSON in Azure Data Lake as a data source in Power BI

I am loading data into Azure Data Lake from an IoT Stream Analytics job as JSON line elements, as described in this guide:

https://azure.microsoft.com/en-us/documentation/articles/iot-hub-csharp-csharp-getstarted/

 

I would then like to load this data into Power BI Desktop using the Data Lake connector, described in this guide:

https://azure.microsoft.com/da-dk/documentation/articles/data-lake-store-power-bi/

 

This works well with CSV files and proper JSON files, but I can’t get the JSON line data from the Data Lake transformed into a valid data source in Power BI.

 

I have successfully used Binary.Combine as suggested in this blog post:

https://community.powerbi.com/t5/Integrations-with-Files-and/Change-JSON-Blob-data-during-import/td-...
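
Roughly, the query ends up looking like the sketch below (the account name and folder path are placeholders for my setup, I'm assuming the DataLake.Contents connector, and the blog post's exact steps differ a little):

let
    // Placeholder account/folder - point this at the folder the Stream Analytics job writes to
    Source = DataLake.Contents("adl://myaccount.azuredatalakestore.net/streamoutput"),
    // Merge the raw bytes of all the JSON-line files into a single binary
    Combined = Binary.Combine(Source[Content]),
    // Split into one text line per event and parse each line as a JSON record
    AllLines = Lines.FromBinary(Combined),
    Records = List.Transform(List.Select(AllLines, each Text.Trim(_) <> ""), Json.Document),
    // Assumes every event has the same fields; otherwise pass column names
    // and MissingField.UseNull to Table.FromRecords
    Result = Table.FromRecords(Records)
in
    Result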

 

but this is not a solution that my users will be able to use.

 

Any ideas?

6 REPLIES
datanerd
Regular Visitor

I am seeing the same issue. I navigated to the JSON file within a folder inside Data Lake Store. Once I click on the file, it automatically tries importing it as a CSV. I removed the Imported CSV and Changed Type steps it automatically creates. Then, on the Navigation step, I can select to import JSON, but I then get the following error. The JSON file was created by a copy pipeline from an on-premises SQL Server table to Data Lake Store as JSON using Data Factory.

 

[screenshot: powerbi import data lake json.png]

I ran into the same thing, but got it to work. My scenario is taking data from Stream Analytics in Azure to a Data Lake Store as JSON fragments, written into folders named by date and hour. In the Query Editor I navigated to the folder that contains the JSON files and selected "Combine Files", then added a Transform step to parse the data as JSON. After that, sample enough of the data to pick up all of the column names (we have multiple record types, which results in a sparsely populated matrix). Once the data looks the way you want, close and apply the changes.

Something to try in your query on the JSON fragments: when editing the query, go up to the directory that holds all the files and click the link to combine them, then add a Transform step to parse the dataset as JSON. After that you need to sample the headers to get the column names/types for the JSON data.
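
As a rough M sketch of the above (the account name, folder path, and the DataLake.Contents call are placeholders/assumptions for my setup), taking the union of field names across the parsed records is what makes the sparse columns show up:

let
    // Placeholder path - Stream Analytics writes the fragments into date/hour folders
    Source = DataLake.Contents("adl://myaccount.azuredatalakestore.net/iotoutput/2016/09"),
    Combined = Binary.Combine(Source[Content]),
    // One JSON fragment per line; drop empty lines and parse each into a record
    Records = List.Transform(
        List.Select(Lines.FromBinary(Combined), each Text.Trim(_) <> ""),
        Json.Document
    ),
    // Union of field names across all records, so every record type
    // contributes its columns even when most rows are null for them
    ColumnNames = List.Distinct(List.Combine(List.Transform(Records, Record.FieldNames))),
    Result = Table.FromRecords(Records, ColumnNames, MissingField.UseNull)
in
    Result

From there you can set the column types and close & apply as usual.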

Anonymous
Not applicable

Hi @theflyingpanda,

I tried to reproduce your issue. However, after streaming data from an IoT Stream Analytics job into Azure Data Lake following the instructions in this similar article, I can successfully connect to Azure Data Lake from Power BI Desktop. You can view the following screenshots for more details.
[screenshots: 1.PNG, 2.PNG]


In your scenario, how about storing the IoT Hub data in Table Storage and connecting to Azure Table Storage from Power BI Desktop?

Thanks,
Lydia Zhang

Correct - storing the data in an Azure Table is one way, CSV another, and SQL Server a third. But I need a simple way of accessing the JSON line data in Power BI without writing code 😐

 

Bjørn

Anonymous
Not applicable

Hi @theflyingpanda,

When I load data into Azure Data Lake from an IoT Stream Analytics job as JSON line elements and connect to Azure Data Lake from Power BI Desktop, I don't need to write any code. Did you customize the IoT Stream Analytics job following the first link?

Thanks,
Lydia Zhang
