gwhiteman
Frequent Visitor

Importing multiple JSON files into PBI (Data Source = Admin API GetDatasourcesAsAdmin)

I am using the Power BI Admin API "GetDatasourcesAsAdmin" to extract connection details for the semantic models we manage. I have a list of models (by datasetId) that I loop through, exporting the information to individual JSON files. My plan is to import a folder full of JSON files into Power BI; however, I have found that the JSON files are not consistent, so when importing them I receive one of the following errors:

 

1) Where the M code for “Transform Sample File” is:

 

let
    Source = Json.Document(Parameter1),
    #"Converted to Table" = Table.FromRecords({Source}),
    #"Expanded connectionDetails" = Table.ExpandRecordColumn(#"Converted to Table", "connectionDetails", {"server", "database"}, {"connectionDetails.server", "connectionDetails.database"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded connectionDetails",{{"datasourceType", type text}, {"connectionDetails.server", type text}, {"connectionDetails.database", type text}, {"datasourceId", type text}, {"gatewayId", type text}})
in
    #"Changed Type"

 

ERROR DETAILS

An error occurred in the ‘Transform File’ query. Expression.Error: The column 'datasourceId' of the table wasn't found.

Details:

    datasourceId

 

2) Where the M code for “Transform Sample File” is:

let
    Source = (Parameter1) => let
        Source = Json.Document(Parameter1),
        #"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
        #"Expanded Column1" = Table.ExpandRecordColumn(#"Converted to Table", "Column1", {"datasourceType", "connectionDetails"}, {"datasourceType", "connectionDetails"}),
        #"Expanded connectionDetails" = Table.ExpandRecordColumn(#"Expanded Column1", "connectionDetails", {"server", "database", "url"}, {"connectionDetails.server", "connectionDetails.database", "connectionDetails.url"}),
        #"Changed Type" = Table.TransformColumnTypes(#"Expanded connectionDetails",{{"datasourceType", type text}, {"connectionDetails.server", type text}, {"connectionDetails.database", type text}, {"connectionDetails.url", type text}})
    in
        #"Changed Type"
in
    Source

 

ERROR DETAILS

An error occurred in the ‘Transform File’ query. Expression.Error: We cannot convert a value of type Record to type List.

Details:

    Value=[Record]

    Type=[Type]

 

 

 

I am looping through the following PowerShell commands to create the JSON files.

 

    $dataSetId = $ListOfDataSets[$vloop]
    $localFileName = "DataSetSources_"
    $consolidatedFile = "dataset_details_" + $dataSetId +".json"
    $result = (Invoke-PowerBIRestMethod -Url $urlToRetrieveData -Method GET) | ConvertFrom-Json
    $dataSource = $result.value
    $dataSource | ConvertTo-JSON | Out-File -FilePath $consolidatedFile
    Set-AzStorageBlobContent -Force -File $consolidatedFile -Container "srvce-rprt-ds" -Blob $consolidatedFile -Context $StorageAccountContext
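A likely cause of the inconsistency is on the export side: when `$result.value` contains only one datasource, the PowerShell pipeline unrolls the one-element array, so `ConvertTo-Json` serializes a bare object rather than an array. A commonly suggested workaround is to pass the array whole, e.g. `ConvertTo-Json -InputObject @($dataSource)`. The shape difference itself can be reproduced in Python as a quick illustration (the record below is a made-up sample, not API output):

```python
import json

# A single datasource record, shaped like one entry of the API's "value" array
record = {"datasourceType": "Oracle",
          "connectionDetails": {"server": "MARKETING"}}

# Serializing the bare record drops the square brackets...
as_object = json.dumps(record)

# ...while serializing a one-element list keeps them, so every
# exported file would share the same list-of-records shape.
as_array = json.dumps([record])

print(as_object[0])  # {
print(as_array[0])   # [
```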
  

 

The difference in the JSON files appears to be that when a semantic model contains one connection it is formatted like this:

{
    "datasourceType": "Oracle",
    "connectionDetails": {
        "server": "MARKETING"
    },
    "datasourceId": "xxxxxx-xxxxxxxx-xxxxxxx-xxxxxxx",
    "gatewayId": "xxxxxx-xxxxxxxx-xxxxxxx-xxxxxxx"
}

 

However, when there are multiple connections the JSON file is formatted like this (extra square brackets):

[
    {
        "datasourceType": "AnalysisServices",
        "connectionDetails": {
            "server": "asazure://australia.asazure.windows.net/prod01",
            "database": "Sales"
        },
        "datasourceId": "xxxxxx-xxxxxxxx-xxxxxxx-xxxxxxx",
        "gatewayId": "xxxxxx-xxxxxxxx-xxxxxxx-xxxxxxx"
    },
    {
        "datasourceType": "Oracle",
        "connectionDetails": {
            "server": "MARKETING"
        },
        "datasourceId": "xxxxxx-xxxxxxxx-xxxxxxx-xxxxxxx",
        "gatewayId": "xxxxxx-xxxxxxxx-xxxxxxx-xxxxxxx"
    }
]
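The two shapes above can be handled uniformly by wrapping a lone record in a list before flattening. As an illustration only (field names taken from the JSON samples above), a Python sketch of that idea:

```python
import json

def flatten_datasources(json_text):
    """Parse one exported file; accept either a single record or a
    list of records, then flatten connectionDetails into plain columns."""
    parsed = json.loads(json_text)
    # Normalize: a bare record becomes a one-element list
    records = [parsed] if isinstance(parsed, dict) else parsed
    rows = []
    for rec in records:
        details = rec.get("connectionDetails", {})
        rows.append({
            "datasourceType": rec.get("datasourceType"),
            "connectionDetails.server": details.get("server"),
            "connectionDetails.database": details.get("database"),
            "datasourceId": rec.get("datasourceId"),
            "gatewayId": rec.get("gatewayId"),
        })
    return rows

single = '{"datasourceType": "Oracle", "connectionDetails": {"server": "MARKETING"}}'
multi = ('[{"datasourceType": "AnalysisServices", '
         '"connectionDetails": {"server": "asazure://host", "database": "Sales"}}, '
         '{"datasourceType": "Oracle", "connectionDetails": {"server": "MARKETING"}}]')

print(len(flatten_datasources(single)))  # 1
print(len(flatten_datasources(multi)))   # 2
```

This is the same normalization the Power Query engine needs to perform; the fields missing from a record simply come back empty.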

 

 

 

  

The frustrating thing is that when I use the API admin portal (Admin - Datasets GetDatasourcesAsAdmin - REST API (Power BI REST APIs) | Microsoft Learn), the format of the JSON is:

 

{
  "@odata.context": "https://wabi-australia-redirect.analysis.windows.net/v1.0/myorg/admin/$metadata#datasources",
  "value": [
    {
      "datasourceType": "AnalysisServices",
      "connectionDetails": {
        "server": "asazure://australia.asazure.windows.net/prod01",
        "database": "Sales"
      },
      "datasourceId": "xxxxxx-xxxxxxxx-xxxxxxx-xxxxxxx",
      "gatewayId": "xxxxxx-xxxxxxxx-xxxxxxx-xxxxxxx"
    }
  ]
}

 

Or

 

{
  "@odata.context": "https://wabi-australia-redirect.analysis.windows.net/v1.0/myorg/admin/$metadata#datasources",
  "value": [
    {
      "datasourceType": "AnalysisServices",
      "connectionDetails": {
        "server": "asazure://australiaeast.asazure.windows.net/prod01",
        "database": "Sales"
      },
      "datasourceId": "xxxxxx-xxxxxxxx-xxxxxxx-xxxxxxx",
      "gatewayId": "xxxxxx-xxxxxxxx-xxxxxxx-xxxxxxx"
    },
    {
      "datasourceType": "Oracle",
      "connectionDetails": {
        "server": "Marketing"
      },
      "datasourceId": "xxxxxx-xxxxxxxx-xxxxxxx-xxxxxxx",
      "gatewayId": "xxxxxx-xxxxxxxx-xxxxxxx-xxxxxxx"
    }
  ]
}

 

 

I realise this is pretty specific to our use case; however, any guidance would be appreciated.

 

1 ACCEPTED SOLUTION

Here is a way to check for record vs. list as part of your parsing. As stated previously, you can simply wrap the records you find into a list and then treat all outputs the same.

 

 

let
    Source = Folder.Files("<local file location with your json examples saved>"),

    //parse the json; if a single record, wrap it in a list so every file yields a list of records
    ParseJson = 
    List.Transform( 
        Source[Content] , //binaries are in "Content" column
        each let 
            json = Json.Document( _ ), 
            valTypeIsRecord = Type.Is( Value.Type( json ), type record ) 
        in
            if valTypeIsRecord //if it's a record
            then { json } //wrap in a list
            else json //otherwise leave as is
    ),

    //list of lists of records --> list of records
    CombineRecords = List.Combine( ParseJson ), 

    //uniform transformation - Table.FromRecords is designed for a list of records
    ToTable = 
    Table.FromRecords( 
        CombineRecords, 
        //provide exact types here so we don't have to later
        type table [
            datasourceType=text,
            connectionDetails=[server=text,database=text],
            datasourceId=text,
            gatewayId=text
        ] 
    ),

    ExpandConnectionDetails = 
    Table.ExpandRecordColumn(
        ToTable, 
        "connectionDetails", 
        {"server", "database"}, 
        {"connectionDetails.server", "connectionDetails.database"}
    )
in
    ExpandConnectionDetails

 

 



7 REPLIES
v-prasare
Community Support

@gwhiteman, as we haven't heard back from you, we wanted to kindly follow up to check whether the solution provided by our community members worked for your issue. Let us know if you need any further assistance here.

 

 

Thanks,

Prashanth Are

MS Fabric community support

 

If this post helps, then please consider accepting it as the solution to help other members find it more quickly, and give kudos if it helped you resolve your query.


v-prasare
Community Support

@Akash_Varuna Thanks for your prompt response.

Thanks,

Prashanth Are

MS Fabric community support

Akash_Varuna
Solution Sage

Hi @gwhiteman, I think the issue arises from inconsistent JSON structures: single connections are serialized as objects, while multiple connections are serialized as arrays. You can preprocess the JSON files to ensure all entries are wrapped as arrays, or adjust the Power Query logic in Power BI to dynamically handle both records and lists. You could also use the Power BI REST API to handle this.

If this post helped, please give kudos and accept it as a solution.

Thanks in advance!

Hi Akash,
Thanks for your feedback. I came to the same conclusion, but I was not sure how to get around it neatly. I have managed to work around it using two different Power Queries, filtering out the errors, then merging the results into a single table. It is not the cleanest solution, but it works as far as I can tell. I am also keen to understand how I can "use the Power BI REST API to handle this", as I am not the best with PowerShell commands.


Hi @gwhiteman, could you try these steps please?

 

 
