Hi
We have a Python script that uses Kafka to send some simple JSON data to an Eventstream in Fabric for testing.
See screenshot of code here:
We can see data activity in the EventStream:
When trying to preview the JSON data, we get the following error message:
Data preview "KafkaJSON": ["Source 'EventHubInputAdapter' had 1 occurrences of kind 'InputDeserializerError.InvalidData' between processing times '2024-06-27T12:47:06.6200231Z' and '2024-06-27T12:47:06.6200231Z'. Json input stream should either be an array of objects or line separated objects. Found token type: String"]
When previewing as CSV, it looks like the "-characters in the original strings have been replaced with \
Do we need to escape the "-characters in some way when sending messages to the Eventstream?
Hi @joakimfenno ,
Ensure Proper JSON Formatting: Verify that the JSON data being sent from your Python script is correctly formatted. Each message should be a valid JSON object. If sending multiple objects, ensure they are either wrapped in an array (for example, [{...}, {...}]) or separated by new lines, with one object per line.
Validate JSON Before Sending: Use a JSON validation tool to ensure that the JSON data is valid before sending it to the EventStream. This can help catch any formatting issues upfront.
Review Kafka Producer Configuration: Ensure that the Kafka producer in your Python script is configured to send data as a string that represents a valid JSON object or objects. You might need to serialize your data to a JSON string if you are not already doing so; a minimal sketch is included below.
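For reference, here is a minimal sketch of a producer that serializes each event exactly once. It assumes the kafka-python package and an Eventstream custom endpoint reached over the Kafka protocol; the bootstrap server, topic name, and connection string are placeholders you would replace with your own values.

import json
from kafka import KafkaProducer

BOOTSTRAP_SERVER = "<your-eventstream-endpoint>:9093"  # placeholder
TOPIC = "<your-eventstream-topic>"                     # placeholder
CONNECTION_STRING = "<your-connection-string>"         # placeholder

producer = KafkaProducer(
    bootstrap_servers=BOOTSTRAP_SERVER,
    security_protocol="SASL_SSL",
    sasl_mechanism="PLAIN",
    sasl_plain_username="$ConnectionString",
    sasl_plain_password=CONNECTION_STRING,
    # Serialize the dict to JSON exactly once, here in the value_serializer.
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {"deviceId": 1, "temperature": 21.5}  # a plain dict, not a pre-encoded string

# Pass the dict itself; do not call json.dumps() on it again before send().
# Double-encoding turns the payload into a JSON string ("{\"deviceId\": ...}"),
# which is what produces "Found token type: String" in the preview.
producer.send(TOPIC, value=event)
producer.flush()

If the payload is serialized twice (or the quotes are escaped by hand), the Eventstream receives a JSON string rather than a JSON object, which would also explain the \"-characters you see in the CSV preview. With a plain dict serialized once, no manual escaping of the "-characters should be needed.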
Best Regards,
Neeko Tang
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.