Hello Fabricators,
I'm trying to set up a MySQL connection to load data into a Fabric Lakehouse, using an On-Premises Data Gateway.
I want to copy the data with a copy task.
It seems to work with Dataflows Gen2, but I am receiving an error in a Data Pipeline (which I prefer over Dataflows Gen2).
> Unable to connect to the mysql server, errorMsg: Access denied for user ''@'localhost' (using password: NO) Access denied for user ''@'localhost' (using password: NO) Activity ID: da147145-9079-4684-ba23-d2bd7ed9d97d.
The connection shows it's online.
Is there anything I am missing?
Best, Stefan
Hi @stefanberreiter ,
I think you can try to fix the reported error in the following ways:
1. Ensure that the MySQL user credentials are correctly configured in your Data Pipeline settings. The error message shows an empty user (''@'localhost'), which indicates that the username and password are not reaching the server.
2. Verify that your connection string includes the correct username and password. It should look something like this:
Server=myServerAddress;Database=myDataBase;User Id=myUsername;Password=myPassword;
3. Make sure that your On-Premises Data Gateway can communicate with the MySQL server. Check your firewall settings to ensure that the necessary port (3306 by default) is open and that the gateway is allowed to access the MySQL server.
4. The root cause of this error is most likely still the MySQL credentials, so I would also test the credentials directly against the MySQL server from the gateway machine, as in the sketch after this list.
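If it helps, here is a minimal sketch (assuming Python and the mysql-connector-python package are available on the gateway machine) to check whether the same credentials work outside of Fabric. The server, database, user, and password values are placeholders; replace them with the values you entered in the Fabric connection.

```python
# Minimal connectivity check to run on the machine hosting the
# On-Premises Data Gateway. All connection values below are placeholders.
import mysql.connector

try:
    conn = mysql.connector.connect(
        host="myServerAddress",  # same server your Fabric connection points to
        port=3306,               # default MySQL port; adjust if yours differs
        database="myDataBase",
        user="myUsername",
        password="myPassword",
    )
    cursor = conn.cursor()
    cursor.execute("SELECT CURRENT_USER()")
    # Prints the account MySQL actually authenticated you as.
    print("Connected as:", cursor.fetchone()[0])
    cursor.close()
    conn.close()
except mysql.connector.Error as err:
    print("Connection failed:", err)
```

If this script connects fine but the pipeline still fails, the problem is more likely in how the pipeline passes the credentials through the gateway connection than in MySQL itself.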
Best Regards
Yilong Zhou
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
Thanks for responding.
It seems I set up the connection exactly the way I did for the Dataflow Gen2 (which works), so it's not a credential "input" problem.
Also, the first step of the setup process in Data Pipelines works (where I believe the credentials are superficially validated), but it then throws the error when retrieving the potential tables to ingest.
I'm curious, where do you validate the connection string in Data Pipelines? It should at least contain **** for the username and password, right?
I had the same issue but with the online version (I didn't use the on-premises gateway). Did you resolve the issue in any way?