We are trying to connect to AWS RDS MySQL using the MySQL CDC connector in Microsoft Fabric Real-Time Intelligence. Since the AWS RDS MySQL instance is on a private network, we established the connection through the On-Premises Data Gateway.
We are able to perform a full load using Azure Data Factory without any connection issues. However, when attempting to create a new Eventstream for incremental sync, we encounter the following error.
In the connection settings, we specified the username as sys_fabric.
Failed
Failed to create event stream source. Error: "details": "Property 'username's value '' is invalid" Artifact ID: 9e40d2d5-efec-40ab-8665-b1441876c322 Action: createEventStreamSource Type: AzureMySql
Here the type shows as AzureMySql, even though we are connecting to AWS RDS MySQL, not Azure MySQL; the connector appears to default to Azure MySQL.
We tried the same setup in the test environment, where the source is public and not private. There, we were able to create a new Eventstream without any issues.
We are unable to determine why it is showing the username as invalid.
Hi @JayanthiP ,
Thank you for posting in the Microsoft Fabric Community.
The issue appears to be related to how Eventstream detects the data source and handles credentials in your private network setup. Since it is incorrectly identifying the source as Azure MySQL instead of AWS RDS MySQL, ensure that MySQL CDC is selected during setup. The error "Property 'username's value '' is invalid" suggests the username is missing or not being passed through correctly. Try manually re-entering sys_fabric, and verify the Azure Key Vault credentials if you are using them.
Since full loads work in ADF but incremental sync via Eventstream fails, confirm that the On-Premises Data Gateway is set up for real-time ingestion and the correct gateway cluster is selected. Given that it works in a public network but not in a private one, check firewall settings and ensure port 3306 is open for AWS RDS MySQL.
I hope this will resolve your issue, if you need any further assistance, feel free to reach out.
If this post helps, please give us Kudos and consider accepting it as a solution to help other members find it more quickly.
Thank you.
We don't have an option to select AWS RDS MySQL; it is a common connector, MySQL CDC. We created the gateway connection separately, and the connection was successful. As I mentioned, with the same gateway connection we can perform a full load; we are only unable to create an Eventstream for incremental sync.
Hi @JayanthiP ,
Thank you for the clarification. Since MySQL CDC is a unified connector for both Azure MySQL and AWS RDS MySQL, the issue likely stems from how Eventstream handles credentials through the On-Premises Data Gateway.
While ADF full load works, Eventstream requires real-time ingestion, which may not be configured correctly in the gateway. We recommend manually re-entering the sys_fabric username in the Eventstream configuration and ensuring that the gateway is correctly mapped to the MySQL CDC data source in Microsoft Fabric > Manage Gateways.
Additionally, verify that port 3306 is open and that AWS RDS security groups allow inbound connections from the gateway. If the issue persists, reviewing the Fabric Gateway and Eventstream logs may provide more insights.
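Before digging further into CDC configuration, a quick TCP probe run from the gateway machine can confirm whether port 3306 on the RDS endpoint is reachable at all. A minimal sketch (the RDS hostname below is a placeholder, not a value from this thread):

```python
import socket

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical RDS endpoint; run this from the gateway machine:
# print(can_reach("my-rds-instance.xxxxx.us-east-1.rds.amazonaws.com", 3306))
```

If this returns False from the gateway host, the problem is at the network layer (security groups, NACLs, or routing) rather than in Fabric itself.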
If this post helps, please give us Kudos and consider accepting it as a solution to help other members find it more quickly.
Thank you.
Hi @JayanthiP ,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.
The MS Fabric Real-Time Intelligence MySQL CDC connector only supports publicly accessible MySQL sources. In our case, the source MySQL database is private, so we are looking for an alternative solution to achieve real-time sync from AWS RDS MySQL to MS Fabric OneLake.
Hi @JayanthiP ,
Thank you for the clarification. You are correct: Microsoft Fabric's MySQL CDC connector for Eventstream currently only supports publicly accessible MySQL instances and does not support private network sources, even when using the On-Premises Data Gateway. This is a documented limitation (ref).
As an alternative to achieve real-time sync from your private AWS RDS MySQL to Microsoft Fabric OneLake, you can deploy Debezium within your private network to capture CDC changes and stream them to Apache Kafka or Confluent Kafka.
Fabric supports Kafka as a real-time source in Eventstream (Apache Kafka, Confluent Kafka), allowing you to ingest these events into OneLake.
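The Debezium step above is typically registered with Kafka Connect as a connector configuration. A sketch of what that registration payload could look like for a MySQL source (all hostnames, credentials, database and topic names below are placeholders, not values from this thread):

```json
{
  "name": "rds-mysql-cdc",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "database.hostname": "my-rds-instance.xxxxx.us-east-1.rds.amazonaws.com",
    "database.port": "3306",
    "database.user": "sys_fabric",
    "database.password": "********",
    "database.server.id": "184054",
    "topic.prefix": "rds",
    "database.include.list": "appdb",
    "table.include.list": "appdb.orders",
    "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
    "schema.history.internal.kafka.topic": "schema-changes.appdb"
  }
}
```

Note that Debezium's MySQL connector reads the binary log, so the RDS instance must have binary logging enabled with `binlog_format=ROW` (configurable via the RDS parameter group) for this approach to work.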
If near real-time is sufficient, another option is to use Azure Data Factory to run high-frequency incremental loads based on a timestamp column.
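The timestamp-based incremental pattern can be sketched as follows; in ADF this query would typically feed the source of a Copy activity, with the watermark stored in a control table or pipeline variable (the table and column names here are hypothetical):

```python
from datetime import datetime, timezone

def build_incremental_query(table: str, ts_column: str, last_watermark: datetime) -> str:
    """Build a MySQL query that fetches only rows changed since the last run."""
    return (
        f"SELECT * FROM {table} "
        f"WHERE {ts_column} > '{last_watermark.strftime('%Y-%m-%d %H:%M:%S')}' "
        f"ORDER BY {ts_column}"
    )

# Example: last successful load finished at 2026-01-01 00:00:00 UTC
wm = datetime(2026, 1, 1, tzinfo=timezone.utc)
print(build_incremental_query("orders", "updated_at", wm))
```

After each successful run, the pipeline would update the stored watermark to the maximum `updated_at` value it just copied, so the next run picks up only newer rows.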
If this post helps, please give us Kudos and consider accepting it as a solution to help other members find it more quickly.
Thank you.
Hi @JayanthiP ,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will be helpful for other community members who have similar problems to solve it faster.
Thank you.