LOSt_210
Frequent Visitor

Kafka -> ? -> Power BI

Hello,

I have a use case for which I'd like solution suggestions. The requirements are:

  1. Retrieve data from different Kafka topics, where authorization is required.
  2. Store the Kafka data so that I can use historical data in the report.
  3. Have a scalable solution that can handle 200,000+ data points from Kafka each day in the future.
  4. Easily access the data to build Power BI reports.

I have read a bit about Azure Event Hubs and Azure Stream Analytics. Could this be a viable approach? Does anyone have experience or best practices for handling data from Kafka?

1 ACCEPTED SOLUTION
V-yubandi-msft
Community Support

Hi @LOSt_210 ,

1. Event Hubs exposes a Kafka-compatible endpoint, so your existing Kafka producers can publish to it with only configuration changes; authentication runs over SASL, which covers the authorization requirement. A minimal producer configuration is sketched after this list.

2. Use the Capture feature to land the raw stream in Azure Data Lake Storage Gen2 or Azure Blob Storage; Capture writes Avro files on a configurable schedule, giving you the historical record. For processing, use Azure Stream Analytics and store the results in Azure SQL Database or Azure Cosmos DB. A sketch of reading a captured file back appears below.

3. This comfortably handles 200,000+ events/day: that volume averages only about 2.3 events per second (200,000 / 86,400), while a single Throughput Unit supports up to 1 MB/s or 1,000 events/s of ingress. Start with a Standard tier namespace and 1 to 2 Throughput Units, and enable Auto-Inflate for dynamic scaling. Use 1 to 3 Streaming Units and 4 to 8 partitions for parallel processing. Azure Data Lake scales efficiently; date-based partitioning (e.g. year/month/day folders) keeps retrieval fast.

 

4. Send processed summaries from Stream Analytics to a Power BI streaming dataset for live dashboards; a sketch of the dataset's push API appears at the end.
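On point 1: because the Kafka endpoint speaks the standard Kafka protocol, pointing an existing producer at Event Hubs is mostly a configuration change. A minimal sketch using confluent-kafka; the namespace, event hub name, and connection string are placeholders:

```python
from confluent_kafka import Producer

# Event Hubs' Kafka endpoint listens on port 9093 and authenticates over
# SASL_SSL with the literal username "$ConnectionString" and the namespace
# connection string as the password.
conf = {
    "bootstrap.servers": "<namespace>.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "$ConnectionString",
    "sasl.password": "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<name>;SharedAccessKey=<key>",
}

producer = Producer(conf)
# The Kafka topic name maps to the event hub name inside the namespace.
producer.produce("my-event-hub", key=b"device-1", value=b'{"reading": 42}')
producer.flush()
```

The same settings work on the consumer side, so tools like MirrorMaker can also bridge an existing Kafka cluster into Event Hubs without touching the producers.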

For historical reporting, Power BI connects directly to Azure Data Lake Storage and Azure SQL Database.
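On point 2: Capture drops Avro files into the storage account on the window you configure, and those files are what the reporting layer reads for history. A quick way to verify what lands there is to read one file back; a minimal sketch assuming the azure-storage-blob and fastavro packages, with a hypothetical container and blob path:

```python
import io

import fastavro
from azure.storage.blob import BlobServiceClient

# Hypothetical storage account and path; Capture names files using a
# configurable {Namespace}/{EventHub}/{PartitionId}/{Y}/{M}/{D}/... pattern.
service = BlobServiceClient.from_connection_string("<storage-connection-string>")
blob = service.get_blob_client(
    container="capture",
    blob="mynamespace/my-event-hub/0/2025/01/01/00/00/00.avro",
)

# Capture's Avro schema wraps each event: the payload is in "Body",
# alongside metadata such as "EnqueuedTimeUtc" and "SequenceNumber".
buf = io.BytesIO(blob.download_blob().readall())
for record in fastavro.reader(buf):
    print(record["EnqueuedTimeUtc"], record["Body"])
```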

Azure Data Factory can format and clean raw data before reporting.
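On point 4: Stream Analytics has a native Power BI output, so no code is needed for the live path. If you want to exercise the streaming dataset before wiring up the job, it also exposes a push URL that accepts POSTed rows. A minimal sketch; the workspace ID, dataset ID, key, and row schema are placeholders:

```python
import json
from datetime import datetime, timezone

import requests

# Push URL copied from the streaming dataset's "API info" page in Power BI.
PUSH_URL = (
    "https://api.powerbi.com/beta/<workspace-id>/datasets/"
    "<dataset-id>/rows?key=<api-key>"
)

# Rows must match the schema declared when the streaming dataset was created.
rows = [{
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "topic": "orders",
    "eventCount": 42,
}]

response = requests.post(
    PUSH_URL,
    data=json.dumps(rows),
    headers={"Content-Type": "application/json"},
)
response.raise_for_status()
```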

 


If my response solved your query, please mark it as the Accepted solution to help others find it easily.

And if my answer was helpful, I'd really appreciate a 'Kudos'.


2 REPLIES
LOSt_210
Frequent Visitor

Great answer - thank you!

