
Rajendran
New Member

Since dynamic data sources aren't refreshed in the Power BI service

Hi All,

I am trying to connect to our own application's API tables using multiple queries that need a bearer token for authentication.

Everything works perfectly in Power BI Desktop: I can refresh the data and build the report. The problems start when I publish to the Power BI service.

 

Working fine: if I don't use any dynamic data source, I can schedule and refresh without any issue.

Not working: some tables need a parameter such as "IDs" passed into the query, and these cause multiple issues.

 

Issue 1: the data source settings show the error "Some data sources may not be listed because of hand-authored queries".

Issue 2: refresh fails with:
Cluster URI: WABI-WEST-EUROPE-B-PRIMARY-redirect.analysis.windows.net
Activity ID: ac909264-7244-457e-a19a-e8c8c767bf3c
Request ID: 76e92208-6a0c-fd63-f1c1-a88f3bc6b426
Time: 2024-12-04 05:03:47Z

Issue 3: I am unable to select a gateway or schedule a refresh, even for the working data sources.

Below is one of the dynamic queries I am using.

let
Source = Json.Document(Web.Contents("https://login1.us.net/publicApi/v7-preview/tests?orderBy=name&direction=asc&count=1000&include=connectionResources", [Timeout=#duration(0, 0, 3, 0), Headers=[#"Content-Type"="application/json", Accept="application/json", Authorization="Bearer DQFq0Jss41"]])),
#"Converted to Table" = Table.FromRecords({Source}),
#"Expanded items" = Table.ExpandListColumn(#"Converted to Table", "items"),
#"Removed Columns1" = Table.RemoveColumns(#"Expanded items",{"totalCount", "offset"}),
#"Expanded items1" = Table.ExpandRecordColumn(#"Removed Columns1", "items", {"type", "scheduleType", "scheduleIntervalInMinutes", "numberOfSessions", "rampUpDurationInMinutes", "testDurationInMinutes", "rampDownDurationInMinutes", "state", "enableCustomScreenshots", "repeatCount", "isRepeatEnabled", "isEnabled", "isEmailEnabled", "emailRecipient", "includeSuccessfulApplications", "restartOnComplete", "euxEnabled", "euxWorkFolders", "sessionMetricsEnabled", "sessionMetricScheduleRate", "sessionMetricGroupKey", "alertConfigurations", "activeTimeSlots", "thresholds", "id", "environmentKey", "name", "description", "created", "connectionResources", "workload", "logonTimeTrackingProcess", "engineStartTimeout", "applicationDebugModeEnabled"}, {"items.type", "items.scheduleType", "items.scheduleIntervalInMinutes", "items.numberOfSessions", "items.rampUpDurationInMinutes", "items.testDurationInMinutes", "items.rampDownDurationInMinutes", "items.state", "items.enableCustomScreenshots", "items.repeatCount", "items.isRepeatEnabled", "items.isEnabled", "items.isEmailEnabled", "items.emailRecipient", "items.includeSuccessfulApplications", "items.restartOnComplete", "items.euxEnabled", "items.euxWorkFolders", "items.sessionMetricsEnabled", "items.sessionMetricScheduleRate", "items.sessionMetricGroupKey", "items.alertConfigurations", "items.activeTimeSlots", "items.thresholds", "items.id", "items.environmentKey", "items.name", "items.description", "items.created", "items.connectionResources", "items.workload", "items.logonTimeTrackingProcess", "items.engineStartTimeout", "items.applicationDebugModeEnabled"}),
#"Removed Columns" = Table.RemoveColumns(#"Expanded items1",{"items.numberOfSessions", "items.rampUpDurationInMinutes", "items.testDurationInMinutes", "items.rampDownDurationInMinutes", "items.state", "items.enableCustomScreenshots", "items.repeatCount", "items.isRepeatEnabled", "items.isEnabled", "items.isEmailEnabled", "items.emailRecipient", "items.includeSuccessfulApplications", "items.restartOnComplete", "items.euxEnabled", "items.euxWorkFolders", "items.sessionMetricsEnabled", "items.sessionMetricScheduleRate", "items.sessionMetricGroupKey", "items.alertConfigurations", "items.activeTimeSlots", "items.thresholds"}),
#"Removed Errors" = Table.RemoveRowsWithErrors(#"Removed Columns", {"items.description", "items.created", "items.connectionResources", "items.workload", "items.logonTimeTrackingProcess", "items.engineStartTimeout", "items.applicationDebugModeEnabled"}),
#"Removed Columns2" = Table.RemoveColumns(#"Removed Errors",{"items.scheduleIntervalInMinutes", "items.environmentKey", "items.description", "items.created", "items.connectionResources", "items.workload", "items.logonTimeTrackingProcess", "items.engineStartTimeout", "items.applicationDebugModeEnabled", "items.scheduleType", "items.type"}),
url = Table.AddColumn(#"Removed Columns2", "URL", each "https://login1.us.net/publicApi/v7-preview/test-runs/"& [items.id] &"/events?direction=desc&count=200&offset=0&from=" & From_Date & "&to=" & To_Date & "&include=properties",type text),
Item = Table.AddColumn(url, "List", each Json.Document(Web.Contents([URL],[Headers = [#"Authorization" = "Bearer DQFq0Jss41kqH11jPg"] ]))[items]),
#"Expanded List" = Table.ExpandListColumn(Item, "List"),
#"Expanded List1" = Table.ExpandRecordColumn(#"Expanded List", "List", {"id", "eventType", "timestamp", "title", "testId", "testRunId", "userSessionId", "applicationId", "properties"}, {"List.id", "List.eventType", "List.timestamp", "List.title", "List.testId", "List.testRunId", "List.userSessionId", "List.applicationId", "List.properties"}),
#"Expanded List.properties" = Table.ExpandListColumn(#"Expanded List1", "List.properties"),
#"Expanded List.properties1" = Table.ExpandRecordColumn(#"Expanded List.properties", "List.properties", {"propertyId", "value"}, {"List.properties.propertyId", "List.properties.value"}),
#"Filtered Rows2" = Table.SelectRows(#"Expanded List.properties1", each ([List.properties.propertyId] <> null)),
Custom1 = Table.Pivot(#"Filtered Rows2", List.Distinct(#"Filtered Rows2"[List.properties.propertyId]), "List.properties.propertyId", "List.properties.value"),
#"Removed Columns4" = Table.RemoveColumns(Custom1,{"Locale", "RemotingProtocol", "Resolution", "ScaleFactor", "TargetOS", "ConnectorTarget", "Screenshot", "LogFile", "URL"}),
#"Changed Type" = Table.TransformColumnTypes(#"Removed Columns4",{{"List.timestamp", type datetime}}),
#"Merged Queries" = Table.NestedJoin(#"Changed Type", {"List.applicationId"}, API_LE_Application, {"Column1.id"}, "API_LE_Application", JoinKind.LeftOuter),
#"Expanded API_LE_Application" = Table.ExpandTableColumn(#"Merged Queries", "API_LE_Application", {"Application_name"}, {"API_LE_Application.Application_name"}),
#"Merged Queries1" = Table.NestedJoin(#"Expanded API_LE_Application", {"List.testId"}, API_LE_Statistic, {"Column1.testId"}, "API_LE_Statistic", JoinKind.LeftOuter),
#"Expanded API_LE_Statistic" = Table.ExpandTableColumn(#"Merged Queries1", "API_LE_Statistic", {"Column1.testName"}, {"API_LE_Statistic.Column1.testName"}),
Custom2 = Table.AddColumn(#"Expanded API_LE_Statistic", "App_status", each if Text.Contains([List.title], "successfull") then "Successful" else if Text.Contains([List.title], "Success") then "Successful" else if Text.Contains([List.title], "error") then "Failed" else if Text.Contains([List.title], "DeletionError") then "Failed" else if Text.Contains([List.title], "CreationError") then "Failed" else if Text.Contains([List.title], "successful") then "Successful" else if Text.Contains([List.title], "Failure") then "Successful" else if Text.Contains([List.title], "failed") then "Failed" else if Text.Contains([List.title], "Failed") then "Failed" else if Text.Contains([List.title], "page has been loaded") then "Successful" else if Text.Contains([List.title], "unable") then "Failed" else if [List.eventType] = "applicationFailure" then "Failed" else "Failed")
in
Custom2

4 REPLIES
Anonymous
Not applicable

Hi @SacheeTh ,

 

This error usually occurs when you connect to a data source through a dynamically generated URL. Please check the privacy levels in your data source settings: go to File > Options and settings > Data source settings and set the privacy level to Organizational or Public as appropriate. After changing the setting in Power BI Desktop, republish to the Power BI service.

In addition, if your data source requires a gateway, make sure it is properly configured, running, and on the latest version.

 

Best Regards,

Stephen Tao

 

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

 

 

 

Rajendran
New Member

Thanks for the additional information. My main challenge is passing dynamic parameters; any working example of a query that passes a dynamic parameter would be very helpful.

 

In the query below I pass the parameter "items.id" to build 50 different API URLs, one per ID, and this is what causes the issue.

 

url = Table.AddColumn(#"Removed Columns2", "URL", each "https://login1.us.net/publicApi/v7-preview/test-runs/"& [items.id] &"/events?direction=desc&count=10000&offset=0&from=" & From_Date & "&to=" & To_Date & "&include=properties",type text),
Item = Table.AddColumn(url, "List", each Json.Document(Web.Contents([URL],[Headers = [#"Authorization" = "Bearer DQFq0Jss41k"] ]))[items]),

Since I'm new to Power BI, could you please provide a sample and explain the following?

  • Gateway requirement: for dynamic data sources, always use a properly configured gateway.
  • Parameters: replace dynamically constructed URLs with parameterized components.
I understand the challenge of passing dynamic parameters to create multiple API URLs, as this is a common limitation in Power BI Service, mainly due to security concerns with dynamic data sources. Below, I'll walk you through a working solution to help you tackle this challenge effectively.

The problem (as I see it):

  • You are dynamically constructing URLs to fetch data from an API for each unique items.id.
  • This works fine in Power BI Desktop but not in the Power BI Service, as the dynamic generation of URLs results in gateway and refresh issues.

The solution:

We need to parameterize the API call and handle the dynamic IDs in a way that allows Power BI Service to recognize them as valid. Here’s how you can achieve that:

  1. Define Parameters for API Call:

    • Use Power Query parameters for the From_Date, To_Date, and ID values.
    • Set these parameters in Power BI so they can be used effectively without generating fully dynamic URLs directly in the query.
  2. Use Helper Queries and Functions:

    • Break down the dynamic API calls into manageable parts using helper queries and custom functions.
    • This way, the Power BI Service can handle these functions through the gateway.

Here is a step-by-step solution (adapted from a blog post and pasted here as is):

Step 1: Create Parameters

  1. Go to Power Query Editor.
  2. Create three parameters: BaseUrl, From_Date, and To_Date (and, if you keep the token in a parameter, a fourth such as YourTokenParameter).

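Behind the UI, each Power Query parameter is just a query tagged with parameter metadata. A minimal sketch of what the definitions look like in M (the values are placeholders, and each definition lives in its own query):

    // Sketch only: each definition below is a separate query
    // (created via Manage Parameters > New Parameter).
    BaseUrl = "https://login1.us.net/publicApi/v7-preview/test-runs/"
        meta [IsParameterQuery = true, Type = "Text", IsParameterQueryRequired = true]
    From_Date = "2024-11-01T00:00:00Z"
        meta [IsParameterQuery = true, Type = "Text", IsParameterQueryRequired = true]
    To_Date = "2024-12-01T00:00:00Z"
        meta [IsParameterQuery = true, Type = "Text", IsParameterQueryRequired = true]

Keeping the dates as text parameters avoids an extra conversion when they are concatenated into the URL.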
Step 2: Create a Function for API Call

  1. Create a new blank query, and call it FetchData.

  2. Paste the following code to create a function that will take id as an input and return data for that id:

    (id as text) =>
    let
        // Construct the URL using the parameters and the input ID
        ApiUrl = BaseUrl & id & "/events?direction=desc&count=10000&offset=0&from=" & Text.From(From_Date) & "&to=" & Text.From(To_Date) & "&include=properties",
    
        // Fetch data from the API
        Source = Json.Document(
            Web.Contents(ApiUrl, [Headers=[Authorization="Bearer " & YourTokenParameter]])
        ),
    
        // Extract items from the response
        Items = Source[items]
    in
        Items
  3. This function takes an ID (id) and creates the appropriate URL, then calls the API.

Step 3: Use the Function to Fetch Data for Each ID

  1. Go back to your main table where you have the list of IDs (items.id).

  2. Add a new column to apply the function to each row:

    • Go to Add Column in Power Query.
    • Click on Invoke Custom Function.
    • Use the FetchData function you created earlier, passing in items.id as the parameter.

    This will create the API calls for each row while leveraging a function, which is easier for Power BI to validate.
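Equivalently, the Invoke Custom Function step generates M along these lines (a sketch, reusing the step and column names from the original query):

    // Apply FetchData to each row's ID, then expand the returned list of records.
    Invoked = Table.AddColumn(#"Removed Columns2", "List", each FetchData([items.id])),
    #"Expanded List" = Table.ExpandListColumn(Invoked, "List")

Because the URL construction now lives inside a named function, the per-row call sites stay simple and uniform.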

Step 4: Configure the Gateway

  • Install and Configure the Gateway:
    • Install the On-premises Data Gateway if it's not already installed.
    • Go to Power BI Service and navigate to Settings > Manage Gateways.
    • Add your data source to the gateway and connect it to your API with the correct authentication method.
  • Set Up Scheduled Refresh:
    • After publishing the report, configure the data source under your dataset.
    • Select the appropriate gateway connection.
    • Enter credentials for your data source so that the refresh process can authenticate properly.

Important Notes:

  • Bearer Token Refresh: If your bearer token expires often, consider automating token retrieval, for example by using Power Automate to store the latest token in a secure parameter.
  • Static Parts of API Call: Parameterize the static parts of the URL. This allows Power BI Service to recognize the structure without being blocked by fully dynamic queries.

Example Summary:

  1. Parameterize: Define parts of your URL (like From_Date, To_Date, and BaseUrl) as parameters.
  2. Use Functions: Create a custom function (FetchData) to generate the dynamic parts of the URL and get data.
  3. Gateway: Ensure the gateway is installed and the data source is registered for proper authentication and refresh.

By using parameters and custom functions, you're creating a more static structure that Power BI Service can manage, thus solving the dynamic source refresh issues.

That is my casual suggestion for this. I had the same issue with a SharePoint list; thankfully OAuth2 got me covered.

SacheeTh
Resolver II

The issue you're encountering is common when working with dynamically generated data sources in Power BI that require parameters and authentication, such as bearer tokens. Let's go through the challenges and possible solutions:

Key Issues:

  1. Dynamic Data Sources Error: The error "some data sources may not be listed because of hand-authored queries" occurs because Power BI Service (where reports are published) doesn’t always support dynamic data sources due to the security concerns involved in executing untrusted dynamic URLs or credentials.

  2. Unable to Select Gateway or Schedule Refresh: Power BI Service requires a data gateway to refresh on-premises data sources or authenticated APIs, especially when dynamic URLs are being used. Without a configured gateway, refreshes won't be allowed.

  3. Bearer Token Expiry: Bearer tokens usually expire after a short duration (e.g., 1 hour). If your token is hardcoded, the refresh fails once it expires.

Possible Solutions:

  1. Use Power BI Data Gateway and Parameterize API Calls:

    • Data Gateway: Ensure that you are using an on-premises data gateway to handle authenticated requests. Power BI Service cannot directly access local files or some APIs without a data gateway.
    • Parameterization: Instead of using hand-authored queries that dynamically generate the URLs, try parameterizing your data source in Power Query to make the data gateway recognize the data source as a valid query.

    Example:

    • Define the base URL and parameters in the Power Query Editor.
    • Use Power BI parameters (e.g., From_Date, To_Date, and ID) that can be managed in the Service.
    let
        BaseUrl = "https://login1.us.net/publicApi/v7-preview/tests?orderBy=name&direction=asc&count=1000&include=connectionResources",
        AuthToken = "Bearer " & YourTokenParameter,
        Source = Json.Document(Web.Contents(BaseUrl, [Timeout=#duration(0, 0, 3, 0), Headers=[#"Content-Type"="application/json", Accept="application/json", Authorization=AuthToken]])),
        ...
    in
        Custom2

    By using parameters, you make the query more transparent to Power BI Service, allowing it to validate and refresh the data properly.

  2. Use OAuth2 and Azure Key Vault or Power Automate for Token Refresh: If your API uses OAuth2, consider using a service like Azure Key Vault to manage and store secrets. Alternatively, you could automate token generation using Power Automate. Here's how:

    • Azure Key Vault: Store the bearer token and use it in Power BI securely.
    • Power Automate: Automate the process of fetching a new bearer token periodically and store it in a parameterized data source that Power BI can read.
  3. Gateway Configuration:

    • Ensure that the data gateway is up and running and configured correctly.
    • The gateway must be associated with the dataset in Power BI Service.
    • Add your custom connector (if applicable) to the gateway to enable API connectivity.
  4. Consider Splitting the Query: Split your query into multiple smaller queries that do not use dynamic URLs and combine them in Power BI. This approach can help avoid the dynamic source error and simplify refresh processes.

  5. Modify API Calls to Avoid Dynamic URLs: Try to limit the dynamic portions of your query that are causing issues with the refresh. For example:

    • Use the Query option of Web.Contents to set query-string parameters instead of concatenating them into the URL.
    let
        BaseUrl = "https://login1.us.net/publicApi/v7-preview/tests",
        QueryParameters = [
            orderBy = "name",
            direction = "asc",
            count = "1000",
            include = "connectionResources"
        ],
        Source = Json.Document(Web.Contents(BaseUrl, [Query=QueryParameters, Headers=[#"Authorization"="Bearer " & YourToken]])),
        ...
    in
        Custom2
  6. Test Connection on Service with Static Data: Before deploying the dynamic version, test your connection with static data to verify whether the problem is a gateway, configuration, or authentication issue. Once confirmed, add the parameters.

  7. Alternative Solution: Dataflow: Move your data transformation logic to a Power BI Dataflow. Dataflows run server-side, and this sometimes makes it easier to handle API calls and manage token renewal as compared to doing everything directly in Power BI Desktop.
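If the API exposes an OAuth2 client-credentials endpoint, the token renewal mentioned in point 2 can also be sketched directly in M, so a fresh token is fetched on every refresh. The endpoint path and the ClientId/ClientSecret parameters below are hypothetical; adjust them to your API:

    // Hypothetical endpoint and credential parameters; adjust to your API.
    GetToken = () =>
        let
            Response = Json.Document(
                Web.Contents(
                    "https://login1.us.net",
                    [
                        RelativePath = "oauth/token",
                        Headers = [#"Content-Type" = "application/x-www-form-urlencoded"],
                        // Supplying Content makes Web.Contents issue a POST.
                        Content = Text.ToBinary(
                            "grant_type=client_credentials&client_id=" & ClientId
                            & "&client_secret=" & ClientSecret)
                    ]
                )
            )
        in
            Response[access_token]

The returned access_token can then be used wherever the hardcoded bearer token appears today.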

Key Takeaways:

  • Gateway Requirement: For dynamic data sources, always use a properly configured gateway.
  • Parameters: Replace dynamically constructed URLs with parameterized components.
  • Token Renewal: Automate token renewal using Azure Key Vault or Power Automate.
  • Service Testing: Deploy a static version to test configuration and gateway issues before going fully dynamic.
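To make the "Parameters" takeaway concrete for the per-ID calls in the original query: the service's refresh analysis inspects only the static first argument of Web.Contents, so a common workaround is to keep that argument constant and move the variable pieces into the RelativePath and Query options. A sketch, assuming the From_Date, To_Date, and YourTokenParameter parameters exist:

    // Sketch: the first argument stays static so the service can identify
    // the data source; the ID and dates vary only in RelativePath/Query.
    FetchRunEvents = (id as text) =>
        Json.Document(
            Web.Contents(
                "https://login1.us.net",
                [
                    RelativePath = "publicApi/v7-preview/test-runs/" & id & "/events",
                    Query = [
                        direction = "desc",
                        count = "200",
                        offset = "0",
                        from = From_Date,
                        to = To_Date,
                        include = "properties"
                    ],
                    Headers = [Authorization = "Bearer " & YourTokenParameter]
                ]
            )
        )[items]

With this shape, the service registers a single data source (https://login1.us.net) regardless of how many IDs are iterated.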

This approach should help you troubleshoot the issues with the Power BI service and API connections. Let me know if this helps; this is a learning experience for me too.
