Anonymous
Not applicable

Using Dataflows and Amazon Redshift

Currently, our project is using data from Amazon Athena. I connected to it on Desktop, and was about to set up a Gateway for it. However, the client didn't want to set up a Gateway. They requested that we migrate to Amazon Redshift and use Dataflows to avoid having to deal with a Gateway.

 

My question now is: how should I go about this "migration"? Will I still need to connect to Redshift in Desktop, or can I create the Dataflow for Redshift and replace the dataset behind the published report in the Service? Thanks in advance, everyone! Any other resources on Redshift and Dataflows would be greatly appreciated.

3 REPLIES
ibarrau
Super User

Hi. It sounds like the migration is about changing the source from Athena to Redshift. It's not necessary to swap the dataset for a Dataflow: Redshift won't ask for a gateway in either case. You can modify your current PBI Desktop file to get the data from Redshift instead of Athena. Once published, it won't ask for a gateway; it will only ask for credentials.
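
For reference only (not from the original reply), a rough sketch of what the re-pointed source step could look like in Power Query (M). The cluster endpoint, database, schema, and table names below are placeholders, and Desktop's Navigator generates the exact navigation steps when you pick the table in the UI:

let
    // Amazon Redshift connector: the server endpoint (host:port) and database
    // name are placeholders; replace them with your cluster's values.
    Source = AmazonRedshift.Database(
        "my-cluster.abc123.us-east-1.redshift.amazonaws.com:5439",
        "dev"
    ),
    // Navigation into a schema and table; Desktop's Navigator generates these
    // steps automatically, so the record keys here are illustrative only.
    PublicSchema = Source{[Name = "public"]}[Data],
    Sales = PublicSchema{[Name = "sales"]}[Data]
in
    Sales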

Then, if you think a Dataflow is necessary here or there is a requirement for one, go ahead and connect from there.

In both scenarios you have to modify the Desktop file: in the first, connecting directly to Redshift; in the second, connecting to the Power BI Dataflow. Then you can publish and replace the report in the Power BI Service.
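
If the Dataflow route is chosen instead, the Desktop source step points at the dataflow entity rather than at Redshift directly. A minimal sketch, assuming a recent Desktop build (older builds generate PowerBI.Dataflows rather than PowerPlatform.Dataflows); the record key shown is illustrative, since the Navigator builds the workspace -> dataflow -> entity navigation for you when you pick them in the UI:

let
    // Dataflows connector exposed under Get Data; takes no meaningful argument.
    Source = PowerPlatform.Dataflows(null),
    // Top-level navigation table; Desktop's Navigator drills down from here
    // into the workspace, dataflow, and entity you select.
    Workspaces = Source{[Id = "Workspaces"]}[Data]
in
    Workspaces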

I hope that helps,


If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

Happy to help!

LaDataWeb Blog

Anonymous
Not applicable

Hello @ibarrau! Thanks for your reply. So, if I'm understanding correctly: if I connect to the Redshift data in Desktop and then publish to the Service, I should be able to schedule refreshes / have a live connection to the data without using Dataflows?

Just to be clear: "live connection" is a concept we didn't talk about. Don't use that term unless you know you are talking about that specific way of connecting. If you connect to Redshift with Desktop and then publish to the Service, you should be able to schedule refreshes of the imported data without a data gateway and without a dataflow. You will be able to just edit the credentials from the Service and pick the hours of the day.

I hope that makes sense.


If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

Happy to help!

LaDataWeb Blog
