
joakimfenno
Helper V

Fabric Eventstream Kafka JSON Issues

Hi

 

We have a Python script that uses Kafka to send some simple JSON data to an Eventstream in Fabric, for testing.

 

See a screenshot of the code here:

[Screenshot: joakimfenno_0-1719569511375.png]

 

 

We can see data activity in the EventStream:

 

[Screenshot: joakimfenno_0-1719569709945.png]

 

 

When trying to preview the JSON data, we get the following error message:

Data preview "KafkaJSON": ["Source 'EventHubInputAdapter' had 1 occurrences of kind 'InputDeserializerError.InvalidData' between processing times '2024-06-27T12:47:06.6200231Z' and '2024-06-27T12:47:06.6200231Z'. Json input stream should either be an array of objects or line separated objects. Found token type: String"]

 

When previewing as CSV, it looks like the "-characters in the original strings have been replaced with \

[Screenshot: joakimfenno_2-1719569511383.png]

 

 

Do we need to escape the "-characters in some way when sending messages to the Eventstream?

 

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @joakimfenno ,

 

  1. Ensure Proper JSON Formatting: Verify that the JSON data sent from your Python script is correctly formatted. Each message should be a valid JSON object. If sending multiple objects, either wrap them in an array (e.g. `[{...}, {...}]`) or separate them with newlines, one object per line.

  2. Validate JSON Before Sending: Use a JSON validation tool to ensure that the JSON data is valid before sending it to the EventStream. This can help catch any formatting issues upfront.

  3. Review Kafka Producer Configuration: Ensure that your Kafka producer in the Python script is configured to send data as a string that represents a valid JSON object or objects. You might need to serialize your data to a JSON string if not already doing so.

  4. Escape Special Characters: In Python, use the `json.dumps()` function to serialize your data into valid JSON. It automatically escapes special characters such as double quotes (see: json.dumps() in Python - GeeksforGeeks).
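Putting steps 1-4 together, here is a minimal sketch. This is an illustration, not the code from the screenshot: the kafka-python producer settings, endpoint, and topic name are assumptions, and the double-serialization shown at the end is one common cause of this exact error, not a confirmed diagnosis of the original script.

```python
import json

# The serialization step that matters: the Kafka message value should be the
# UTF-8 bytes of a JSON object, not a quoted/escaped JSON string (the latter
# is what produces "Found token type: String" in the Eventstream preview).
def to_json_bytes(record: dict) -> bytes:
    return json.dumps(record).encode("utf-8")

# Illustrative producer wiring (kafka-python); the endpoint and topic below
# are placeholders, not values from this thread:
#
# from kafka import KafkaProducer
# producer = KafkaProducer(
#     bootstrap_servers="<eventstream-endpoint>:9093",
#     security_protocol="SASL_SSL",
#     sasl_mechanism="PLAIN",
#     value_serializer=to_json_bytes,   # serialize exactly once, here
# )
# producer.send("<topic>", {"id": 1, "text": 'a string with "quotes"'})

# A common cause of the error above is serializing twice: the second
# json.dumps() wraps the object in an outer JSON string, so the stream
# sees a String token and the quotes show up escaped as \".
once = json.dumps({"id": 1})                # a JSON object: {"id": 1}
twice = json.dumps(json.dumps({"id": 1}))   # a JSON string: "{\"id\": 1}"
```

If the CSV preview shows your quotes turned into \", check whether the data is being serialized both in a `value_serializer` and again before `send()`.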

 

Best Regards,

Neeko Tang

If this post helps, please consider accepting it as the solution to help other members find it more quickly.


