Hi All,
I am trying to set up Gen2 Dataflows to build pipelines from Snowflake, Amazon Redshift, MySQL, etc., but I keep running into errors like
An exception occurred: DataSource.Error: An error happened while reading data from the provider.
For Redshift, I believe it's an ODBC error, which is odd because I didn't think an ODBC driver would be needed for a cloud service. The same thing happened when I tried to set up a gateway for my scheduled refresh; it kept giving timeout errors.
A little background:
1. I have a Pro license with Premium features, and the workspace I am working in is a Premium workspace.
2. My boss has the same access, but she has no problem using Dataflows and the gateway with the same data sources.
3. I am sure my user ID and password are correct, since I connected through Power BI Desktop with the same credentials and it works.
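Not from the original post, but one quick way to confirm the credentials independently of Power BI is to connect to each source directly with its standard Python client. This is only a sketch; the account, host, and database names are placeholders you would replace with your own.

```python
# Direct credential check against each source, bypassing Power BI entirely.
# Account names, hostnames, and database names below are placeholders.
import snowflake.connector   # pip install snowflake-connector-python
import psycopg2              # pip install psycopg2-binary (Redshift speaks the Postgres protocol)
import mysql.connector       # pip install mysql-connector-python

USER, PASSWORD = "my_user", "my_password"

# Snowflake: account identifier such as "xy12345.us-east-1" (placeholder)
with snowflake.connector.connect(user=USER, password=PASSWORD, account="xy12345.us-east-1") as conn:
    print("Snowflake OK:", conn.cursor().execute("SELECT CURRENT_VERSION()").fetchone())

# Amazon Redshift (placeholder cluster endpoint and database)
with psycopg2.connect(host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",
                      port=5439, dbname="dev", user=USER, password=PASSWORD) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT version()")
        print("Redshift OK:", cur.fetchone()[0])

# MySQL (placeholder host and database)
conn = mysql.connector.connect(host="mysql.example.com", user=USER, password=PASSWORD, database="mydb")
print("MySQL OK:", conn.is_connected())
conn.close()
```

If these all succeed from your own machine but the Dataflow still fails, the problem is most likely environmental (network, gateway, or tenant permissions) rather than the credentials.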
Hi @quentinyang,
Checking in to see if your issue has been resolved. Let us know if you still need any assistance.
Thank you.
Hi @quentinyang,
As suggested by @Vinodh247, check the environment permissions, as they might be causing the issue.
Thank you.
The issue is likely environment/permission related rather than connector specific.
Can you check the following?
Gateway setup - Ensure you are using the same gateway cluster and connection configuration as your boss. Verify your gateway is online and you have admin or user rights to it.
Network/firewall - Confirm outbound access to the Snowflake/Redshift/MySQL endpoints is open from your network or the gateway machine. Timeout errors often indicate blocked ports or VPC routing issues (see the connectivity sketch after this list).
Authentication scope - Even if credentials are correct, Dataflow Gen2 connections use service principal or user delegation tokens. Ask your Fabric admin to confirm you have the same Fabric tenant and data source permissions.
Dataflow region alignment - Verify your workspace, gateway, and data source are in the same region.
ODBC confusion - Fabric Gen2 uses native connectors, but under the hood some sources still require ODBC on the gateway host. Ensure the Redshift ODBC driver is installed on that gateway server.
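Not part of the original reply, but a small probe like the following can test the network/firewall and ODBC points above from the gateway machine (or from your own network when no gateway is involved). The endpoints are placeholders; substitute your actual hosts.

```python
import socket
import pyodbc  # pip install pyodbc; used here only to list installed ODBC drivers

# Placeholder endpoints - replace with your actual hosts.
ENDPOINTS = {
    "Snowflake":       ("myaccount.snowflakecomputing.com", 443),
    "Amazon Redshift": ("my-cluster.abc123.us-east-1.redshift.amazonaws.com", 5439),
    "MySQL":           ("mysql.example.com", 3306),
}

def can_reach(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, (host, port) in ENDPOINTS.items():
    status = "reachable" if can_reach(host, port) else "BLOCKED or timing out"
    print(f"{name:16} {host}:{port} -> {status}")

# List ODBC drivers installed on this machine; on a gateway host you would
# expect an entry like "Amazon Redshift ODBC Driver (x64)".
print("\nInstalled ODBC drivers:")
for driver in pyodbc.drivers():
    print(" -", driver)
```

A blocked port here would explain the timeout errors, and a missing Redshift entry in the driver list would explain the ODBC error.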
If all these match your boss's setup, recreate the connection under Manage connections and gateways and retry the dataflow.
Hi Vinodh!
Thanks for the reply. Sorry, I just realized my post was confusing. We actually didn't use any gateway, only a Cloud Connection.