I am sending Avro-serialized messages from Kafka to Azure Event Hubs using SchemaRegistryApacheAvroSerializer. The schema is stored in Azure Schema Registry, and the producer sends events successfully.
public EventHubProducerService(
        @Value("${EVENTHUB_CONNECTION_STRING}") String connectionString,
        @Value("${eventhub.name}") String eventHubName,
        @Value("${schema-registry.endpoint}") String schemaRegistryEndpoint,
        @Value("${schema-registry.group}") String schemaRegistryGroup) {
    tokenCredential = new DefaultAzureCredentialBuilder().build();
    this.producerClient = new EventHubClientBuilder()
            .connectionString(connectionString, eventHubName)
            .buildProducerClient();
    SchemaRegistryAsyncClient schemaRegistryClient = new SchemaRegistryClientBuilder()
            .credential(tokenCredential)
            .fullyQualifiedNamespace(schemaRegistryEndpoint)
            .buildAsyncClient();
    this.schemaRegistryApacheAvroSerializer = new SchemaRegistryApacheAvroSerializerBuilder()
            .schemaRegistryClient(schemaRegistryClient)
            .schemaGroup(schemaRegistryGroup)
            .autoRegisterSchemas(true)
            .avroSpecificReader(true)
            .buildSerializer();
}
public void sendMessage(AgreementLifecycleDomainSourceType message) {
    EventData eventData = schemaRegistryApacheAvroSerializer.serialize(
            message, TypeReference.createInstance(EventData.class)
    );
    SendOptions sendOptions = new SendOptions().setPartitionId("1");
    producerClient.send(Collections.singletonList(eventData), sendOptions);
}

When I connect this Event Hub to a Microsoft Fabric Eventstream and try to preview the data, I get this error:
Source 'EventHubInputAdapter' had occurrences of kind 'InputDeserializerError.InvalidData'.
Invalid Avro Format.

Any insights would be really helpful!
I just had a conversation with the Fabric Support team and they said Eventstream does not support reading from schema registry as of now.
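For context on why that matters: the SchemaRegistryApacheAvroSerializer does not embed the Avro schema in the event body. It writes the raw binary-encoded datum and records only the schema ID in the event's content type, in the form `avro/binary+<schema-id>`. A consumer that cannot reach the Schema Registry (which, per the support team, includes Eventstream today) therefore has no way to recover the writer schema. A minimal stdlib-only sketch of parsing that content type (the schema ID below is a made-up example, not a real registry ID):

```java
public class ContentTypeInspector {
    // Extracts the schema ID from a Schema Registry Avro content type,
    // e.g. "avro/binary+<schema-id>". The body itself carries no schema.
    static String schemaIdOf(String contentType) {
        int plus = contentType.indexOf('+');
        if (plus < 0 || !contentType.startsWith("avro/binary")) {
            throw new IllegalArgumentException(
                    "Not a Schema Registry Avro payload: " + contentType);
        }
        return contentType.substring(plus + 1);
    }

    public static void main(String[] args) {
        // Hypothetical schema ID for illustration only.
        System.out.println(schemaIdOf("avro/binary+f1a2b3c4d5e6"));
        // prints: f1a2b3c4d5e6
    }
}
```

So even a byte-perfect Avro payload looks like "invalid Avro" to a reader that expects the schema to travel with the data.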
Hello @hamam69
The error arises because Fabric cannot reconcile the producer's schema with its own expectations: Fabric Eventstream expects the Avro schema used for deserialization to exactly match the schema used during serialization.
Ensure the schema used by your producer (`AgreementLifecycleDomainSourceType`) matches the schema Fabric expects. Use a schema validation tool or the Azure portal to confirm.
Fabric uses the full format `namespace.servicebus.windows.net`, not just the namespace name.
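As a purely illustrative helper (the class and method names here are mine, not part of the Azure SDK), normalizing a bare namespace name to the fully qualified public-cloud form could look like:

```java
public class NamespaceNormalizer {
    // Hypothetical helper: appends the Azure public-cloud suffix when the
    // value is a bare namespace name rather than a fully qualified host.
    static String fullyQualified(String namespace) {
        return namespace.contains(".")
                ? namespace
                : namespace + ".servicebus.windows.net";
    }

    public static void main(String[] args) {
        System.out.println(fullyQualified("myns"));
        // prints: myns.servicebus.windows.net
        System.out.println(fullyQualified("myns.servicebus.windows.net"));
        // prints: myns.servicebus.windows.net (already fully qualified)
    }
}
```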
Thanks @nilendraFabric for your response! I'd like to add a few points.
The schema used during serialization is the same one being sent to Schema Registry. I also tried deserializing the sent event data in my own code, and it deserializes just fine.
public void sendMessage(AgreementLifecycleDomainSourceType message) {
    EventData eventData = schemaRegistryApacheAvroSerializer.serialize(
            message, TypeReference.createInstance(EventData.class));
    System.out.println("Sent message to Event Hub: " + eventData.getBodyAsString());
    SendOptions sendOptions = new SendOptions().setPartitionId("1");
    List<EventData> serializedMessages = Collections.singletonList(eventData);
    producerClient.send(serializedMessages, sendOptions);
    // Round-trip check: deserialize the same EventData we just sent.
    for (EventData data : serializedMessages) {
        AgreementLifecycleDomainSourceType out = schemaRegistryApacheAvroSerializer.deserialize(
                data, TypeReference.createInstance(AgreementLifecycleDomainSourceType.class));
        System.out.println("Deserialized data: " + out);
    }
}
And yes, I'm using the full format with `.servicebus.windows.net`.
Any idea where else it could have gone wrong?
Try bypassing Schema Registry.
Embed the schema: serialize the data with the Avro schema embedded (not using the registry) so the payloads are self-contained.
Something like this:
// Assumes `schema` (org.apache.avro.Schema) and `message` are in scope.
DatumWriter<AgreementLifecycleDomainSourceType> writer = new SpecificDatumWriter<>(schema);
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
Encoder encoder = EncoderFactory.get().binaryEncoder(outputStream, null);
writer.write(message, encoder);
encoder.flush();
EventData eventData = new EventData(outputStream.toByteArray());
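One caveat on the snippet above: `binaryEncoder` emits only the raw datum, so this by itself still does not embed the schema. In Avro, the truly self-describing format is the Object Container File (written with `DataFileWriter`), which starts with the 4-byte magic `Obj` 0x01 and carries the writer schema in its metadata. A stdlib-only sketch of that distinction (the sample byte arrays are fabricated for illustration):

```java
public class AvroHeaderCheck {
    // An Avro Object Container File begins with the magic bytes "Obj" 0x01;
    // raw binary-encoded datums have no header at all.
    static final byte[] MAGIC = {'O', 'b', 'j', 1};

    static boolean looksLikeContainerFile(byte[] payload) {
        if (payload.length < MAGIC.length) return false;
        for (int i = 0; i < MAGIC.length; i++) {
            if (payload[i] != MAGIC[i]) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        byte[] containerHeader = {'O', 'b', 'j', 1, 0x02}; // start of a container file
        byte[] rawDatum = {0x02, 0x0A};                    // raw datum, no header
        System.out.println(looksLikeContainerFile(containerHeader)); // prints: true
        System.out.println(looksLikeContainerFile(rawDatum));        // prints: false
    }
}
```

Whether Eventstream accepts the container-file format is something I'd verify against the current Fabric documentation; the point here is only that binary encoding alone is not "schema embedded".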
I've tried this before and kept getting the same error. But why do we need to bypass the Schema Registry? Eventstream should handle the deserialization, right? The data sent to the Event Hub is Avro binary encoded, and the schema ID matches the schema. I can't figure out where it went wrong!
It was just to check whether it works with the bypass.
Hi nilendraFabric,
Any progress on this? My team faced the same issue and couldn't deserialize the message in Eventstream either (our publishing code follows the same sample at Azure Schema Registry Apache Avro client library for .NET - Azure for .NET Developers | Microsoft Le...).