
ahmedshalabyy12
Resolver II

Refresh fails with full data but succeeds with 2 weeks

Dear all,

I tried refreshing with only 2 weeks of data and it worked (just to test that the gateway and everything else are running fine).
After that, I set up incremental refresh to pull the whole history, but I always receive the error below.
So why did it work with 2 weeks of data but not with 3 years?

Data source error: {"error":{"code":"DM_GWPipeline_Gateway_InvalidConnectionCredentials","pbi.error":{"code":"DM_GWPipeline_Gateway_InvalidConnectionCredentials","parameters":{},"details":[{"code":"DM_ErrorDetailNameCode_UnderlyingErrorCode","detail":{"type":1,"value":"-2147467259"}},{"code":"DM_ErrorDetailNameCode_UnderlyingErrorMessage","detail":{"type":1,"value":"[Microsoft][ThriftExtension] (14) Unexpected response from server during a HTTP connection: Unauthorized/Forbidden error response returned, but no token expired message received."}},{"code":"DM_ErrorDetailNameCode_UnderlyingHResult","detail":{"type":1,"value":"-2147467259"}},{"code":"Microsoft.Data.Mashup.CredentialError.DataSourceKind","detail":{"type":1,"value":"Databricks"}},{"code":"Microsoft.Data.Mashup.CredentialError.DataSourcePath","detail":{"type":1,"value":"{\"host\":\"adb-4001480147023167.7.azuredatabricks.net\",\"httpPath\":\"\\/sql\\/1.0\\/warehouses\\/707c521bc160ab69\"}"}},{"code":"Microsoft.Data.Mashup.CredentialError.Reason","detail":{"type":1,"value":"AccessUnauthorized"}},{"code":"Microsoft.Data.Mashup.MashupSecurityException.DataSources","detail":{"type":1,"value":"[{\"kind\":\"Databricks\",\"path\":\"{\\\"host\\\":\\\"adb-4001480147023167.7.azuredatabricks.net\\\",\\\"httpPath\\\":\\\"\\\\/sql\\\\/1.0\\\\/warehouses\\\\/707c521bc160ab69\\\"}\"}]"}},{"code":"Microsoft.Data.Mashup.MashupSecurityException.Reason","detail":{"type":1,"value":"AccessUnauthorized"}}],"exceptionCulprit":1}}}


6 REPLIES
v-pnaroju-msft
Community Support

Thank you, @Deku, for your response.

Hi ahmedshalabyy12,

We appreciate your question on the Microsoft Fabric Community Forum.

From what I understand, the refresh worked with only 2 weeks of data because the data volume and the number of queries sent to Azure Databricks were small. When you enabled incremental refresh over 3 years of data, Power BI started sending many parallel queries to Databricks (one per partition or date range), which increased the authentication load on the gateway. If the credentials or tokens are not handled correctly for each of those queries, you get the error:
DM_GWPipeline_Gateway_InvalidConnectionCredentials (AccessUnauthorized).

The error can also occur if a large pull (such as 3 years of history) runs long enough for the token to expire, or if the Databricks SQL warehouse times out or rejects the long-running query. That is why the smaller 2-week refresh succeeds while the full dataset fails.

Please follow these steps, which may help fix the issue:

  1. If you are using OAuth, long refreshes can fail because the token expires mid-refresh. It is better to authenticate with a Service Principal or a long-lived Personal Access Token (PAT). Re-enter the credentials in Power BI Service > Settings > Data source credentials (a quick way to test the PAT outside the gateway is sketched after this list).
  2. Optimize the SQL warehouse for large queries. Make sure the Databricks SQL warehouse has a large enough cluster (use a higher tier if needed), longer query timeouts, auto-stop disabled during the refresh window, and enough concurrency.
  3. Adjust the incremental refresh settings under Modeling > Incremental refresh in Power BI Desktop. Store 3 years of data but refresh only the most recent month; this reduces the load and avoids pulling all the data at once.
  4. Make sure the on-premises data gateway cluster is healthy, and try restarting the gateway.
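
To check the credentials and the warehouse independently of the gateway, you can run a quick test with the databricks-sql-connector Python package against the same host and HTTP path that appear in the error message. This is only a minimal sketch: the PAT value is a placeholder, and you would run it from a machine that can reach the warehouse.

# Minimal connectivity check against the Databricks SQL warehouse.
# Requires: pip install databricks-sql-connector
# Host and http_path are taken from the error message in the original post;
# the access token below is a placeholder PAT.
from databricks import sql

HOST = "adb-4001480147023167.7.azuredatabricks.net"
HTTP_PATH = "/sql/1.0/warehouses/707c521bc160ab69"
ACCESS_TOKEN = "dapiXXXXXXXXXXXXXXXX"  # placeholder personal access token

with sql.connect(server_hostname=HOST,
                 http_path=HTTP_PATH,
                 access_token=ACCESS_TOKEN) as connection:
    with connection.cursor() as cursor:
        # If this succeeds, the PAT and warehouse are fine, and the problem is
        # more likely token expiry or load during the long refresh.
        cursor.execute("SELECT 1")
        print(cursor.fetchall())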

You can see the troubleshooting steps for this error in the screenshot attached below:

vpnarojumsft_0-1751003228280.png

Also, here are some helpful links:
Connect Power BI to Azure Databricks - Azure Databricks | Microsoft Learn
Create a SQL warehouse - Azure Databricks | Microsoft Learn

If you find this response useful, please mark it as the accepted solution and give kudos. This will help other community members with similar questions.

If you have any more questions, please feel free to ask the Microsoft Fabric community.

Thank you.

That is also another option: getting a token with a long lifetime.
Thank you.

You can also use the incremental refresh policy: define the policy but don't refresh in the service. Once you have deployed to the service, apply the policy manually with Tabular Editor (which creates the partitions), then refresh each partition one by one with SSMS to reduce the load, as sketched below.
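
If you would rather script the partition-by-partition refresh instead of clicking through SSMS, the Power BI enhanced refresh REST API can refresh a single named partition per call. Below is a minimal sketch, assuming hypothetical workspace and dataset IDs, a hypothetical table and partition names, and an Azure AD access token obtained separately (for example via MSAL or a service principal):

# Refresh one partition at a time through the Power BI enhanced refresh API.
# All IDs, names, and the token below are placeholders / assumptions.
import time
import requests

WORKSPACE_ID = "<workspace-guid>"       # hypothetical
DATASET_ID = "<dataset-guid>"           # hypothetical
TOKEN = "<azure-ad-access-token>"       # acquired separately
PARTITIONS = ["2023", "2024", "2025"]   # hypothetical partition names

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
       f"/datasets/{DATASET_ID}/refreshes")
headers = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

for partition in PARTITIONS:
    body = {
        "type": "full",
        "commitMode": "transactional",
        "maxParallelism": 1,  # keep the number of concurrent queries low
        "objects": [{"table": "FactSales", "partition": partition}],  # hypothetical table
    }
    response = requests.post(url, headers=headers, json=body)
    response.raise_for_status()
    print(f"Submitted refresh for partition {partition}")
    time.sleep(60)  # crude pacing; in practice, poll the refresh status before continuing

Keeping maxParallelism at 1 and pacing the calls reproduces the one-partition-at-a-time idea and limits how many queries hit Databricks at once.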


Did I answer your question?
Please help by clicking the thumbs up button and mark my post as a solution!

I thought about this solution as well: opening SSMS and refreshing the fact tables one by one. This might work.

Deku
Super User

The error is related to authorisation. Are you sure you still have permission to read the underlying data on Databricks?


Did I answer your question?
Please help by clicking the thumbs up button and mark my post as a solution!

Yes, I have refreshed this dataset with only 2 weeks of data, about 50M rows.

Now the dataset is probably around 1B rows, and I keep receiving this error; it has happened 4 times so far.
