ollieftw
New Member

Streaming dataset

Hello,

I have tried pushing data to a streaming dataset via the API from a Databricks notebook, and it works well. However, a message appears saying this feature will be retired soon.

 

Question: how do I use real-time intelligence in Fabric to recreate the same functionality as my streaming dataset?

7 REPLIES
v-ssriganesh
Community Support

Hello @ollieftw,
Thank you for reaching out to the Microsoft Fabric Forum Community.

Power BI streaming datasets are being deprecated, but you can replicate and enhance this functionality using Real-Time Intelligence in Microsoft Fabric, specifically with Eventstreams and Real-Time Dashboards. Here’s how you can transition:

  • In the Real-Time Hub, create an Eventstream and use its Apache Kafka endpoint as the target for your Databricks notebook. Update your notebook to push data to this endpoint using a Kafka producer library.
  • Route the Eventstream data to an Eventhouse for scalable storage and querying, or to a Lakehouse for Delta table storage.
  • Create a Real-Time Dashboard in Fabric, connect it to the Eventhouse, and use KQL queries to visualize live data, much like your Power BI streaming-dataset dashboard.
  • Use Fabric Activator to set up real-time alerts for specific data patterns.

For a detailed guide, check out the Real-Time Intelligence tutorial. If you’re using a Custom App to push data, ensure your Fabric capacity is at least F4 to support Eventstreams.
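As a rough sketch of the first step, a Databricks cell could push JSON events to the Eventstream's Kafka endpoint with the kafka-python library. The endpoint, topic, credentials, and row shape below are placeholders, not values from this thread:

```python
# pip install kafka-python  (assumed; not part of the standard library)
import json
import time

def make_event(temperature, humidity):
    """Build a JSON payload shaped like a streaming-dataset row."""
    return json.dumps({
        "ts": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "temperature": temperature,
        "humidity": humidity,
    }).encode("utf-8")

def send_events(rows, bootstrap, topic, username, password):
    """Push each row to the Eventstream's Kafka endpoint."""
    # Imported lazily so make_event stays usable without the library.
    from kafka import KafkaProducer
    producer = KafkaProducer(
        bootstrap_servers=bootstrap,
        security_protocol="SASL_SSL",   # Fabric endpoints require TLS
        sasl_mechanism="PLAIN",
        sasl_plain_username=username,
        sasl_plain_password=password,
    )
    for row in rows:
        producer.send(topic, make_event(**row))
    producer.flush()  # block until all queued events are delivered
```

Calling `send_events([{"temperature": 21.5, "humidity": 40}], "your-endpoint:9093", "your-topic", "user", "secret")` from the notebook would then replace the old REST push; all connection values must come from your own Eventstream or Kafka administrator.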

 

If this information is helpful, please “Accept as solution” and give a "kudos" to assist other community members in resolving similar issues more efficiently.
Thank you.

Hello!

 

I am trying to set up the Apache Kafka connection you mentioned. Here is a screenshot; what do I put as the values?

[Screenshot: ollieftw_0-1750691602236.png]

 

Hello @ollieftw,

Thanks for sharing the screenshot and your progress on setting up the Apache Kafka connection.

Let's fill in the values for the configuration settings based on your setup with the Eventstream.

  • Connection: Since no existing connection is found, select "New connection" to create one. You'll need the Kafka endpoint details from your Databricks environment or Kafka cluster; these are typically provided by your Kafka administrator or the Fabric Eventstream settings.
  • Topic: Enter the topic your Databricks notebook is pushing data to. This must match the topic configured in your notebook's Kafka producer.
  • Consumer group: Use a unique identifier, such as eg-data-processing-group, so the Eventstream can process the data stream effectively.
  • Auto offset reset: Leave it as "LATEST" to start consuming from the latest messages, which is suitable for a fresh setup. Switch to "EARLIEST" if you need historical data.
  • Security protocol: Set it to "SASL_PLAINTEXT" initially. If your Kafka cluster requires encryption, switch to "SASL_SSL" and provide additional security credentials (e.g., username, password, or certificate) per your Kafka setup.

For the exact Kafka endpoint and security details, check with your Databricks or Kafka administrator, as these are specific to your environment. Once filled, click "Next" to review and connect. If you encounter any errors, ensure the topic exists and the connection details align with your Databricks notebook configuration.
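To summarize, the settings above might map onto a single configuration like the sketch below. Every value is a placeholder to be replaced with details from your own environment; the key names are descriptive, not an official schema:

```python
# A sketch of the connection settings discussed above.
# Substitute real values from your Kafka administrator or Eventstream.
eventstream_kafka_settings = {
    "bootstrap_servers": "your-kafka-endpoint:9093",  # from your Kafka admin
    "topic": "databricks-telemetry",       # must match your notebook's producer
    "consumer_group": "eg-data-processing-group",  # unique per Eventstream
    "auto_offset_reset": "LATEST",         # "EARLIEST" to replay historical data
    "security_protocol": "SASL_PLAINTEXT", # "SASL_SSL" if encryption is required
}
```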

 

I trust this addresses your needs. If it does, please “Accept as solution” and give it a "kudos"  to help others find it easily.
Thank you.

Hello!

 

I got stuck on the first item: I cannot find any Kafka details in the Eventstream settings; see the screenshot.

[Screenshot: ollieftw_0-1750749402188.png]

 

I then checked my Databricks environment, but there is no Kafka to be seen anywhere in there.

They have Partner Connect for a lot of services, but I can't see any Kafka details:

[Screenshot: ollieftw_1-1750749523918.png]

 

Here is the list of Databricks technology partners:

https://www.databricks.com/company/partners/technology

 

How do I proceed?

 

Thanks

Oliver 

Hello @ollieftw,
Thank you for the update and for sharing the screenshots.

It seems there’s some confusion regarding the Kafka setup, which is understandable since the integration between Microsoft Fabric’s Eventstream and Databricks requires specific configuration. Please consider the points below:

  • The Eventstream settings (first screenshot) don’t display Kafka details because the Kafka endpoint is only generated after you fully configure the Eventstream with a data source. Since you haven’t connected a Kafka source yet, the details aren’t visible.
  • The Databricks environment (second screenshot) doesn’t show Kafka natively because Kafka isn’t a built-in service in Databricks Partner Connect by default. Databricks supports Kafka integration, but you need to set it up manually or connect to an existing Kafka cluster (e.g., Confluent Cloud, Azure Event Hubs with Kafka compatibility, or a self-managed Kafka instance).
  • Check if your organization has an existing Kafka cluster. The Kafka endpoint and topic details would come from this cluster, not directly from Databricks or Fabric.

 

I trust this information proves useful. If it does, kindly “Accept as solution” and give it a "kudos" to help others locate it easily.
Thank you.

Hello!

 

We don't have a Kafka cluster, unfortunately.

 

With a streaming dataset, no Kafka cluster is needed; I can just use the API details to push data to the model. Why has Microsoft made it so difficult to replicate this easy and simple functionality in Fabric?

 

What other methods in Fabric can I use to achieve the above?

 

Thanks.

Oliver 

Hello @ollieftw,
Thanks for your input.

The best option to replicate your API push functionality without a Kafka cluster is the Custom App (custom endpoint) source for an Eventstream in Fabric. It provides an API-like endpoint (via a connection string) that your Databricks notebook can push data to directly, much like your streaming dataset, without needing any external infrastructure. You can find official guidance in the Microsoft Fabric documentation here: Add a custom endpoint or custom app source to an eventstream - Microsoft Fabric | Microsoft Learn
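If the custom endpoint exposes an Event Hubs-compatible connection string (one of the protocols Eventstream custom endpoints support), a minimal push from a Databricks notebook might look like the sketch below. The azure-eventhub package, the connection string, and the hub name are assumptions, not values confirmed in this thread:

```python
# pip install azure-eventhub  (assumed; not part of the standard library)
import json

def to_payloads(rows):
    """Serialize rows as UTF-8 JSON, like the old streaming-dataset push."""
    return [json.dumps(row).encode("utf-8") for row in rows]

def push_rows(connection_string, eventhub_name, rows):
    """Send one batch of rows to the Eventstream's custom endpoint."""
    # Imported lazily so to_payloads stays usable without the SDK.
    from azure.eventhub import EventHubProducerClient, EventData
    producer = EventHubProducerClient.from_connection_string(
        conn_str=connection_string, eventhub_name=eventhub_name)
    with producer:
        batch = producer.create_batch()
        for payload in to_payloads(rows):
            batch.add(EventData(payload))
        producer.send_batch(batch)
```

The connection string and hub name are shown on the custom endpoint's "Event Hub" tab in the Eventstream editor, so no separate Kafka cluster is required.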

If this post helps, please give us ‘Kudos’ and consider accepting it as a solution to help other members find it more quickly.
Thank you.
