pedro4lmeida05
New Member

Databricks connector - dataflow Databricks.Query() (M language) stopped working

Hi Community 

For some time now I have had several dataflows getting data from Databricks using this method:

let
    Source = Databricks.Query(
        "<host>",
        "httppath",
        [Catalog = null, Database = null, EnableAutomaticProxyDiscovery = null]
    ),
    part1 = "SELECT
                ...",
    myquery = Source(part1)
in 
    myquery

Since this week, all of a sudden, the method no longer works.

And I get this error message. 

(screenshot of the error: Expression.Error: The provided options are not valid.)

 

Is anyone else having this issue?

1 ACCEPTED SOLUTION
burakkaragoz
Community Champion

Hi @pedro4lmeida05 ,

 

Sorry to hear you’re running into this issue. Since the method was working previously and suddenly stopped, here’s a step-by-step troubleshooting guide that addresses both recent platform changes and common root causes for the "Dataflow-Databricks.Query()" failure with the error:
Expression.Error: The provided options are not valid.

1. Double-check your credentials and tokens

  • If you are using a Personal Access Token (PAT) for Databricks, make sure it hasn’t expired. PATs often have a limited lifespan. Try generating a new token from your Databricks workspace and update your dataflow credentials with the new token.
  • Verify that the authentication method you use in your query and dataflow settings matches what is expected and supported.

2. Permissions

  • Ensure that the credentials you’re using have sufficient permissions. Ideally, the user or service principal should have Can Use or Can Manage permissions on both the Databricks cluster and the relevant databases/tables.

3. Review changes in Dataflow Gen2 and connector updates

  • If your workspace has migrated to or is using Dataflow Gen2, make sure your dataflow and connector version support Azure Databricks. Some features or behaviors may have changed compared to Gen1.
  • Consider recreating your dataflow using the most recent connector options available in Dataflow Gen2.

4. Query options and parameters

  • The error message references invalid options. Review your query, especially the options part. Sometimes connector updates or back-end changes introduce stricter validation for parameters.
  • Try removing or toggling the EnableAutomaticProxyDiscovery option in your query. For example, set it explicitly to true or false, or remove it entirely, and see if the behavior changes (see the sketch below).
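
For example, here is a minimal sketch built on the query from the original post (host, HTTP path and SQL kept as placeholders): the only change is passing an explicit boolean for EnableAutomaticProxyDiscovery instead of null, so you can test whether that field is what trips the option validation. Dropping the field from the record entirely is the other variant worth trying.

let
    // Same call as in the original post, but EnableAutomaticProxyDiscovery is set
    // explicitly to false (also try true, or delete the field from the record).
    Source = Databricks.Query(
        "<host>",
        "httppath",
        [Catalog = null, Database = null, EnableAutomaticProxyDiscovery = false]
    ),
    part1 = "SELECT
                ...",
    myquery = Source(part1)
in
    myquery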

5. Test the query in Power BI Desktop

  • You mentioned the query works in Power BI Desktop. Sometimes the M engine versions differ between Desktop and Dataflows, leading to subtle incompatibilities. Try simplifying the query or removing optional parameters to see if it runs in the Dataflow web editor (a stripped-down example follows below).
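
As a sketch of the stripped-down test (assuming the options record really is an optional third argument of Databricks.Query, which is exactly what this checks), call the connector with no options record and a trivial query such as SELECT 1. If this runs in the web editor while the full version does not, the options record is the culprit.

let
    // Minimal call: no options record at all, placeholders as in the original post,
    // and a trivial test query so the SQL itself cannot be the problem.
    Source = Databricks.Query("<host>", "httppath"),
    test = Source("SELECT 1")
in
    test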

6. Recent Service Updates

  • Review the recent Power BI / Fabric release notes and the known issues page; a service-side update to the Databricks connector can tighten how query options are validated, which would explain a query that suddenly stops working without any change on your side.
7. Consider alternative approaches

  • If you continue to have issues, consider using Azure Data Factory or migrating your flows to Dataflow Gen2, which is being prioritized for new features and compatibility.

Summary checklist:

  • Refresh or update your Databricks token
  • Verify permissions on Databricks and the target data
  • Re-check query options for deprecated or invalid parameters
  • Try disabling or editing EnableAutomaticProxyDiscovery
  • Test in both Desktop and Service for compatibility
  • Review recent documentation and release notes

If you try these and the problem persists, please share the exact M code after your latest changes and specify whether you’re using Dataflow Gen1 or Gen2. That will help us dig deeper!

Let us know how it goes. Happy to help further if you need more targeted troubleshooting.

Best,
Burak


8 REPLIES

v-tsaipranay
Community Support

Hi @pedro4lmeida05 ,

 

Could you please confirm whether you have resolved the issue? If so, you are welcome to share your workaround and mark it as a solution so that other users can benefit as well. This will help other community members with similar problems solve them faster.

If we don’t hear back, we’ll go ahead and close this thread. Should you need further assistance in the future, we encourage you to reach out via the Microsoft Fabric Community Forum and create a new thread. We’ll be happy to help.

 

Thank you.

jmf3
Regular Visitor

I've been having this issue as well. I only noticed it because I wanted to edit one of my dataflows that is using this method. The dataflow has been refreshing fine, but if I go in to edit it, it shows me the same error you're getting, so effectively I can't make any changes to it. Seems very strange, especially since the exact same query works fine, if I copy-paste it into Power BI Desktop.

 

Have you had any luck troubleshooting this?

v-tsaipranay
Community Support

Hi @pedro4lmeida05  ,

Thank you for reaching out to the Microsoft Fabric Community. Also, thank you @DP700_Pro for your input.

To resolve the issue, please check if your Personal Access Token (PAT) for connecting with Databricks is still valid. If it has expired, regenerate the token from the Databricks workspace and update it in your dataflow credentials. Additionally, ensure you have the necessary permissions (Can Use or Can Manage) on the Databricks cluster and read access to the catalog and database.

It is also recommended to migrate to Dataflow Gen2 in Microsoft Fabric, as it supports Azure Databricks connectors.

Please refer to the following documentation for a better understanding:

Azure Databricks personal access token authentication - Azure Databricks | Microsoft Learn

Azure Databricks connector overview - Microsoft Fabric | Microsoft Learn

If the issue persists, try disabling the EnableAutomaticProxyDiscovery option in your query if it’s set to true.

 

I hope my suggestions give you a good idea of where to start. If you need any further assistance, feel free to reach out.

If this post helps, please give us Kudos and consider accepting it as a solution to help other members find it more quickly.

 

Thank you. 

Hi @pedro4lmeida05 ,

I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.


Thank you.

 

Hello @pedro4lmeida05  ,

 

I wanted to follow up on our previous suggestions regarding the issue. We would love to hear back from you to ensure we can assist you further.

If my response has addressed your query, please accept it as a solution and give a ‘Kudos’ so other members can easily find it. Please let us know if there’s anything else we can do to help.

 

Thank you.

Hi @pedro4lmeida05 ,

 

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems solve them faster.

 

Thank you.

DP700_Pro
Frequent Visitor

The credentials used to establish the connection may have expired or be incorrect. Verify that the credentials are correct and that the authentication method is the right one; you can regenerate the personal access token if necessary. Insufficient workspace permissions are another possibility: if you only have Viewer permissions in the workspace containing the dataflow, request higher permissions in the workspace to get full access to the dataflows. There is also limited connector compatibility for certain features: currently, Data Factory in Microsoft Fabric does not support using Azure Databricks in data pipelines. Consider using Dataflow Gen2 to connect to Azure Databricks, since that functionality is supported.
