Hello,
I have a case for which I need solution suggestions. In short, I need to ingest data from Kafka (roughly 200,000 events per day), store and process it in Azure, and surface the results in Power BI dashboards.
I have read a bit about Azure Event Hubs and Azure Stream Analytics. Could these be a viable approach? Does anyone have experience with best practices for handling data from Kafka?
Hi @LOSt_210 ,
1. Azure Event Hubs exposes a Kafka-compatible endpoint, so your existing Kafka producers can send data to it with only a configuration change (bootstrap server and SASL settings), no code rewrite (see the Python sketch after this list).
2. Use the Event Hubs Capture feature to archive the raw Kafka data to Azure Data Lake Storage Gen2 or Azure Blob Storage. For real-time processing, run an Azure Stream Analytics job and write its results to Azure SQL Database or Azure Cosmos DB.
3. Sizing: 200,000 events/day is a modest load. Start with a Standard tier namespace, 1 to 2 Throughput Units, and Auto-Inflate enabled so capacity scales with bursts. For the Stream Analytics job, 1 to 3 Streaming Units and an event hub with 4 to 8 partitions give good parallelism. Partition the captured files in the data lake by date so downstream queries read only the folders they need.
4. Send processed summaries from Stream Analytics to a Power BI streaming dataset for live dashboards (a sketch of pushing rows to such a dataset follows after the notes below).
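To illustrate point 1, here is a minimal sketch of pointing an existing Kafka producer at the Event Hubs Kafka endpoint using the confluent-kafka Python client. The namespace, event hub name, connection string, and payload fields below are placeholders, not values from your environment.

```python
import json
from confluent_kafka import Producer

# Event Hubs exposes its Kafka endpoint on port 9093 of the namespace.
# Placeholders: replace with your own namespace, event hub, and connection string.
EVENT_HUBS_NAMESPACE = "your-namespace"          # assumption: your Event Hubs namespace
EVENT_HUB_NAME = "kafka-events"                  # assumption: the event hub acting as the Kafka topic
CONNECTION_STRING = "Endpoint=sb://...;..."      # assumption: namespace-level connection string

producer = Producer({
    "bootstrap.servers": f"{EVENT_HUBS_NAMESPACE}.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    # Against the Kafka endpoint, the username is the literal string "$ConnectionString"
    # and the password is the Event Hubs connection string.
    "sasl.username": "$ConnectionString",
    "sasl.password": CONNECTION_STRING,
})

def delivery_report(err, msg):
    # Called once per message to confirm delivery or surface an error.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} partition {msg.partition()}")

event = {"device_id": "sensor-01", "value": 42}  # hypothetical payload
producer.produce(EVENT_HUB_NAME, value=json.dumps(event), callback=delivery_report)
producer.flush()
```

The only change from a plain Kafka setup is the SASL configuration; the produce/flush calls stay the same, which is what makes the migration low-effort.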
For historical reporting, Power BI can also connect directly to Azure Data Lake Storage and Azure SQL Database.
Azure Data Factory can clean and shape the raw data before it reaches your reports.
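For point 4, the usual route is the native Power BI output of Stream Analytics, but you can verify the streaming dataset and its dashboard tiles independently by posting rows to the dataset's Push URL (shown in Power BI when you create an API streaming dataset). A minimal sketch, assuming a hypothetical dataset with event_time and event_count columns:

```python
from datetime import datetime, timezone
import requests

# Placeholder: copy the Push URL from the streaming dataset's "API info" in Power BI.
PUSH_URL = "https://api.powerbi.com/beta/<workspace-id>/datasets/<dataset-id>/rows?key=<key>"

# Column names must match the schema you defined for the streaming dataset;
# event_time / event_count here are hypothetical.
rows = [{
    "event_time": datetime.now(timezone.utc).isoformat(),
    "event_count": 125,
}]

response = requests.post(PUSH_URL, json=rows, timeout=10)
response.raise_for_status()  # 200 OK means the rows reached the dataset
print("Pushed", len(rows), "row(s) to the streaming dataset")
```

In production the Stream Analytics job would push to the same dataset for you; this is only a quick way to confirm the live dashboard path works end to end.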
If my response solved your query, please mark it as the Accepted solution to help others find it easily.
And if my answer was helpful, I'd really appreciate a 'Kudos'.
Great answer, thank you!