Currently, our project is using data from Amazon Athena. I connected to it in Desktop and was about to set up a gateway for it. However, the client didn't want a gateway. They requested that we migrate to Amazon Redshift and use dataflows to avoid having to deal with one.
My question now is, how should I go about the migration? Will I still need to connect to Redshift in Desktop? Or will I be able to create the dataflow for Redshift and replace the dataset on the published report in the Service? Thanks in advance, everyone! Any other useful resources on Redshift and dataflows would be much appreciated.
Hi. It sounds like the migration is just a source change, from Athena to Redshift. It's not necessary to swap the dataset for a dataflow: Redshift won't ask for a gateway in either case. You can modify your current PBI Desktop file to get data from Redshift instead of Athena. Once published, it won't ask for a gateway, only for credentials.
Then, if you think a dataflow is necessary or there is a requirement for one, go ahead and connect from there.
In both scenarios you have to modify the Desktop file: in the first, connecting to Redshift directly; in the second, to the Power BI dataflow. Then you can publish and replace the report in the Power BI Service.
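In case it helps, the source swap in Desktop usually comes down to editing the first step of each query in the Power Query editor. A minimal sketch, assuming a cluster endpoint, database name, schema, and table name that are all placeholders here (yours will differ, and the navigation step may look slightly different in your model):

```powerquery-m
// Replace the Athena source step with the Redshift connector,
// keeping the final step name the same so downstream steps
// and report visuals keep working without changes.
let
    // Placeholder endpoint and database - substitute your own.
    Source = AmazonRedshift.Database(
        "my-cluster.abc123.us-east-1.redshift.amazonaws.com:5439",
        "dev"
    ),
    // Navigate to the table the report used to pull from Athena.
    sales = Source{[Name = "sales", Schema = "public"]}[Data]
in
    sales
```

If the column names and types coming out of Redshift match what Athena returned, the rest of the applied steps should run unchanged; otherwise you'll need to adjust the steps that reference renamed columns.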
I hope that helps,
Happy to help!
Hello @ibarrau ! Thanks for your reply. So if I'm understanding it correctly, if I connect to the dataset from Redshift on Desktop then publish to Service, I should be able to schedule refreshes/have a live connection to the data without using Dataflows?
Just to be clear: a live connection is a concept we didn't talk about. Don't use those words unless you know you are referring to that specific way of connecting. If you connect to Redshift with Desktop and then publish to the Service, you should be able to schedule refreshes of the imported data without a data gateway and without a dataflow. You will be able to just edit the credentials from the Service and pick the hours of the day.
I hope that makes sense.
Happy to help!