Hi, I have some reports that use dataflows. Those dataflows currently use Amazon Redshift as the source, and I want to switch to Databricks instead. How can I change that within the dataflow? I've downloaded the JSON file for the dataflow, so do I need to copy that content and amend it, or how else can it be done? Please guide.
Solved! Go to Solution.
Hi @bhavyamalik1,
You can change the data source in your dataflow, but it requires recreating the connection. Here are three ways to approach it:
Option 1: Edit directly in Power BI Service (Recommended)
Option 2: Use the JSON file (Advanced)
Option 3: Recreate the dataflow (Safest)
Best regards!
PS: If you find this post helpful, consider leaving kudos or marking it as the solution.
Hi @bhavyamalik1,
Thank you for reaching out to Microsoft Fabric Community.
Thank you @rohit1991, @Nabha-Ahmed and @Mauro89 for the prompt response.
As we haven't heard back from you, we wanted to kindly follow up to check whether the solutions provided by the community members worked for your issue. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
Hi @bhavyamalik1,
We wanted to kindly follow up to check whether the solutions provided by the community members worked for your issue. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
I hope you are doing well!
You don’t need to modify the exported JSON file. Dataflow definitions aren’t designed to be edited directly, and manual changes may break the dataflow.
The correct way is to update the connection through Power Query Online:
1. Open the Power BI Service
2. Navigate to the workspace that contains the dataflow
3. Select Edit dataflow
4. In Power Query Online, open each query that currently uses Amazon Redshift
5. Update the Source step to use the Azure Databricks connector instead
6. Provide the Databricks connection details (server, HTTP path, authentication)
7. Save the dataflow and validate the refresh
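For illustration, here is a minimal sketch of what that Source step swap can look like in the Advanced Editor. The server, host name, HTTP path, and database name below are placeholders, not real values; `AmazonRedshift.Database` and `Databricks.Catalogs` are the built-in Power Query connector functions:

```powerquery-m
let
    // Old step (Amazon Redshift), shown as a comment for comparison;
    // server and database names are placeholders:
    // Source = AmazonRedshift.Database("my-cluster.example.redshift.amazonaws.com:5439", "sales_db"),

    // New step (Azure Databricks); placeholder workspace host and SQL warehouse HTTP path:
    Source = Databricks.Catalogs(
        "adb-1234567890123456.7.azuredatabricks.net",
        "/sql/1.0/warehouses/abc123def456",
        null
    )
in
    Source
```

After saving, Power Query Online will prompt for Databricks credentials the first time the new connection is used.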
If the schema between Redshift and Databricks is different, it’s often cleaner to:
Create a new query using Databricks as the source
Reapply the existing transformation steps
Replace the old query once validated
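The pattern above can be sketched as a new query pointing at Databricks, with the old transformation steps pasted back in after the navigation steps. All catalog, schema, table, and column names here are hypothetical placeholders:

```powerquery-m
let
    // Placeholder Databricks workspace host and SQL warehouse HTTP path
    Source = Databricks.Catalogs(
        "adb-1234567890123456.7.azuredatabricks.net",
        "/sql/1.0/warehouses/abc123def456",
        null
    ),
    // Navigate to the table that matches the old Redshift query (names are placeholders)
    Catalog = Source{[Name = "main", Kind = "Database"]}[Data],
    Schema  = Catalog{[Name = "sales", Kind = "Schema"]}[Data],
    Orders  = Schema{[Name = "orders", Kind = "Table"]}[Data],
    // Transformation steps copied over from the original Redshift query
    #"Filtered Rows" = Table.SelectRows(Orders, each [order_date] >= #date(2024, 1, 1)),
    #"Renamed Columns" = Table.RenameColumns(#"Filtered Rows", {{"order_id", "OrderId"}})
in
    #"Renamed Columns"
```

Once the new query returns the same columns and types as the old one, swapping it in should not require changes to downstream queries or reports.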
Editing the JSON file is not supported and is not recommended.
This approach keeps the dataflow stable, supported, and easy to maintain.
If this helps, mark it as the solution and leave a kudo to help others.
I'm not getting Databricks at the top; it still shows Redshift.
What about Option 3? Connecting to Databricks and moving the remaining Power Query code into the new dataflow?
Best regards!
The data is reflecting in the dataflow, but not in Power BI.
Have you actually run the dataflow successfully? And if you use Import mode in Power BI, have you refreshed the data?
Hi @bhavyamalik1,
You should not edit the exported dataflow JSON. Changing sources via JSON is unsupported and can break the dataflow. The correct approach is to edit the dataflow in Power BI Service (Power Query Online), update the Source step from Amazon Redshift to Databricks in the Advanced Editor, then update credentials and refresh. As long as the schema matches, downstream transformations and reports will continue to work.