Hi guys,
I created a dataflow whose data source is a set of JSON files. Each file has a nested structure that goes quite a few levels deep.
When I transformed the data in the Power BI service and expanded all the columns, I counted 24 columns in the end.
I realized some columns were missing.
So I saved the data source on my local laptop and got the data in Power BI Desktop instead.
After expanding all the columns there, I found 52 columns.
May I ask whether there is any limit on the number of levels of nested documents that can be expanded in the Power BI service?
I have also googled around, and some people say that if the data source has fields that are null in the first 1000 rows, those columns will disappear in the Power BI service. Is that true?
Many thanks.
J
I have done a test.
First I placed the data source on my local laptop.
A. In the Power BI service, create a dataflow > select the folder > get the JSON files and expand all the columns.
B. In Power BI Desktop, get data from the folder > get the JSON files and expand all the columns.
Then I compared the M query of A with that of B.
Interestingly, I found that some columns were missing in A: when a nested document contains more than two nested documents, the third one was dropped, so the attributes inside that missing document could not be expanded any further.
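For reference, the pattern in both A and B looks roughly like this in M (the folder path and field names below are placeholders for illustration, not the actual source):

```
// Sketch of the folder -> JSON pipeline (path and field names are hypothetical)
let
    Source = Folder.Files("C:\data\json"),                       // local folder with the JSON files
    Parsed = Table.AddColumn(Source, "Json", each Json.Document([Content])),
    // Expand the top level of each document; Desktop offered the full set of
    // columns here, while the same expansion built in the service missed some
    Expanded = Table.ExpandRecordColumn(Parsed, "Json",
        {"id", "details"}, {"Json.id", "Json.details"})
in
    Expanded
```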
As a result, I developed my best practice for dealing with JSON files.
It does not matter where the data source is, but:
1. Always transform the data in Power BI Desktop; it is much faster and more accurate. Very important: always check "Use original column name as prefix". (I usually uncheck this box because it makes the field names very long, but I got errors when pasting the M query into the Power BI service. So I started over, checking the box every time I expanded columns, and renamed all the attributes at the end. After that, the M query pasted into the Power BI service without errors.)
2. Then copy the M query from the Advanced Editor.
3. In the Power BI service, create a dataflow and select the method you want to use to connect to the data source.
4. Once connected, open the Advanced Editor and paste the M query in the right place.
5. Save and close 🙂
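As a sketch of step 1, expanding with the prefix option checked produces an expansion step like the one below (the record and field names here are made up for illustration):

```
// Minimal sketch: expanding a record column with "Use original column name as prefix" checked
let
    Source = Table.FromRecords({
        [id = 1, details = [address = "A St", phone = "123"]]
    }),
    // With the prefix option checked, the new columns are named
    // "details.address" and "details.phone" instead of bare "address"/"phone"
    Expanded = Table.ExpandRecordColumn(Source, "details",
        {"address", "phone"}, {"details.address", "details.phone"})
in
    Expanded
```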
@ibarrau Thanks for the hints you provided here; they helped me finally work this out.
Thanks to the community here for being so responsive and professional.
Hi. This is strange; they should return the same number of columns. Please try this: copy the Power Query code from the Advanced Editor in Power BI Desktop into the dataflow, or just right-click the table in Transform Data (the query editor) to copy it, then right-click and paste it into the dataflow's query editor.
Let's check whether dataflows return something different for the exact same code.
Regards,
Happy to help!
Good idea, I will try this soon.