Hello,
I have tried the streaming dataset, using the API to push data to the dataset from a Databricks notebook, and this works well. However, a message appears saying this feature will be retired soon.
Question: how do I use Real-Time Intelligence in Fabric to recreate the same functionality as my streaming dataset?
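For reference, the push from the notebook currently looks roughly like this (the push URL and the row fields are placeholders for my actual dataset):

```python
# Current approach: push rows to the Power BI streaming dataset's REST endpoint.
# The push URL and the row fields below are placeholders for my actual dataset.
import datetime
import requests

PUSH_URL = "https://api.powerbi.com/beta/<workspace-id>/datasets/<dataset-id>/rows?key=<api-key>"

rows = [
    {
        "timestamp": datetime.datetime.utcnow().isoformat(),
        "sensor": "line-1",
        "value": 42.0,
    }
]

response = requests.post(PUSH_URL, json=rows)
response.raise_for_status()
```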
Hello @ollieftw,
Thank you for reaching out to the Microsoft Fabric Forum Community.
Power BI streaming datasets are being deprecated, but you can replicate and enhance this functionality with Real-Time Intelligence in Microsoft Fabric, specifically using Eventstreams and Real-Time Dashboards. At a high level, the transition looks like this: create an Eventstream, add a source that receives your events (for example, a Custom App/custom endpoint for pushed data, or an Apache Kafka source if you already have a Kafka cluster), route the events to an Eventhouse (KQL database) destination, and then build a Real-Time Dashboard on top of that data.
For a detailed guide, check out the Real-Time Intelligence tutorial. If you’re using a Custom App to push data, ensure your Fabric capacity is at least F4 to support Eventstreams.
If this information is helpful, please “Accept as solution” and give a "kudos" to assist other community members in resolving similar issues more efficiently.
Thank you.
Hello!
I am trying to set up the Apache Kafka source you mentioned. Here is a screenshot; what do I put in as the values?
Hello @ollieftw,
Thanks for sharing the screenshot and your progress on setting up the Apache Kafka connection.
Let's fill in the values for the configuration settings based on your setup with the Eventstream.
For the exact Kafka endpoint and security details, check with your Databricks or Kafka administrator, as these are specific to your environment. Once filled, click "Next" to review and connect. If you encounter any errors, ensure the topic exists and the connection details align with your Databricks notebook configuration.
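For orientation, these are the same connection values any Kafka client would use. If your administrator provides them, a producer writing events to that topic from your Databricks notebook might look roughly like the sketch below (every connection value is a placeholder, and it assumes the confluent-kafka package is installed on the cluster):

```python
# Sketch: produce events to the Kafka topic that the Eventstream's
# Apache Kafka source reads from. Every connection value is a placeholder;
# replace them with the details from your Kafka administrator.
# Assumes the confluent-kafka package is installed on the cluster.
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "<broker-host>:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "<username>",
    "sasl.password": "<password>",
})

event = {"timestamp": "2025-01-01T00:00:00Z", "sensor": "line-1", "value": 42.0}
producer.produce("<topic-name>", value=json.dumps(event))
producer.flush()
```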
I trust this addresses your needs. If it does, please “Accept as solution” and give it a "kudos" to help others find it easily.
Thank you.
Hello!
I got stuck on the first item: I cannot find any Kafka details in the Eventstream Settings, see screenshot.
I then checked my Databricks environment, but there is no Kafka to be seen anywhere in there.
They have Partner Connect for a lot of services, but I can't see any Kafka details:
Here is a list of the Databricks connect partners:
https://www.databricks.com/company/partners/technology
How do I proceed?
Thanks
Oliver
Hello @ollieftw,
Thank you for the update and for sharing the screenshots.
It seems there’s some confusion regarding the Kafka setup, which is understandable since the integration between Microsoft Fabric’s Eventstream and Databricks requires specific configuration. To clarify: the Apache Kafka source in an Eventstream is for connecting to a Kafka cluster that you already run; neither Fabric nor Databricks provides a Kafka cluster for you, which is why you cannot find any Kafka details in Databricks. If you do not have a Kafka cluster, a different source type, such as the Custom App/custom endpoint, is the better fit for pushing data.
I trust this information proves useful. If it does, kindly “Accept as solution” and give it a "kudos" to help others locate it easily.
Thank you.
Hello!
We don't have a Kafka cluster, unfortunately.
With the streaming dataset, one does not need any Kafka cluster; I can just use the API details to push data to the model. Why has Microsoft made it so difficult for me to replicate this simple functionality in Fabric?
What other methods in Fabric can I use to achieve the above?
Thanks.
Oliver
Hello @ollieftw,
Thanks for your input.
The best option for replicating your API push functionality without a Kafka cluster is the Custom App (custom endpoint) source on an Eventstream in Fabric, because it provides an API-like endpoint (via a connection string) that your Databricks notebook can push data to directly, much like your streaming dataset, without needing any external infrastructure. You can find official guidance in the Microsoft Fabric documentation here: Add a custom endpoint or custom app source to an eventstream - Microsoft Fabric | Microsoft Learn
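To make this concrete, pushing from your Databricks notebook to that custom endpoint could look roughly like the sketch below. The connection string and event hub name come from the source’s Details pane in the Eventstream; all values shown are placeholders, and the sketch assumes the azure-eventhub package is installed:

```python
# Sketch: send events from a Databricks notebook to an Eventstream
# custom endpoint (Custom App) source over its Event Hub-compatible endpoint.
# The connection string and entity name below are placeholders taken from
# the source's Details pane in Fabric.
import json
from azure.eventhub import EventHubProducerClient, EventData

CONNECTION_STR = "<event-hub-compatible-connection-string>"
EVENT_HUB_NAME = "<event-hub-name-from-details-pane>"

producer = EventHubProducerClient.from_connection_string(
    conn_str=CONNECTION_STR, eventhub_name=EVENT_HUB_NAME
)

event = {"timestamp": "2025-01-01T00:00:00Z", "sensor": "line-1", "value": 42.0}

with producer:
    batch = producer.create_batch()
    batch.add(EventData(json.dumps(event)))
    producer.send_batch(batch)
```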
If this post helps, please give it ‘Kudos’ and consider accepting it as a solution to help other members find it more quickly.
Thank you.