Hi All,
I have a C# application that fires queries when a user interacts with the application. These queries return 1 or 2 rows out of thousands.
These queries currently run against an Azure SQL Database, and we are in the process of re-pointing them to the Fabric LH SQL Endpoint (the data comes from D365 through Link to Fabric). When comparing performance, I see that Azure SQL takes 2-3 seconds whenever a query is fired, while the Fabric LH SQL Endpoint takes 15-20 seconds to return the same data.
Azure SQL is at 3000 DTUs and the Fabric LH is running on an F32 capacity. I have already updated STATISTICS on the table in the LH. Any ideas on how to improve query performance in the Fabric LH? Since no query plan is available, we are also unable to see where exactly the time is spent.
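For reference, this is roughly how the application fires such a query (a simplified sketch only: the endpoint name, table, and column names are placeholders, and it assumes Microsoft.Data.SqlClient with Entra ID authentication). Re-pointing from Azure SQL to the Fabric LH SQL Endpoint is essentially just a connection string change:

```csharp
using System;
using Microsoft.Data.SqlClient;

// Placeholder connection string - the real endpoint name, database, and auth mode differ.
// The Fabric SQL analytics endpoint speaks TDS, so re-pointing from Azure SQL Database
// is essentially just a change of the Server= value.
const string connectionString =
    "Server=tcp:<workspace-endpoint>.datawarehouse.fabric.microsoft.com,1433;" +
    "Database=MyLakehouse;Authentication=Active Directory Default;Encrypt=True;";

// Typical point query: a filter from the UI plus a couple of joins,
// returning only 1 or 2 rows out of thousands. Table and column names are made up.
const string sql = @"
    SELECT  o.OrderId, o.OrderDate, c.CustomerName
    FROM    dbo.Orders    AS o
    JOIN    dbo.Customers AS c ON c.CustomerId = o.CustomerId
    WHERE   o.OrderId = @orderId;";

await using var conn = new SqlConnection(connectionString);
await conn.OpenAsync();

await using var cmd = new SqlCommand(sql, conn);
cmd.Parameters.AddWithValue("@orderId", 12345);

await using var reader = await cmd.ExecuteReaderAsync();
while (await reader.ReadAsync())
{
    Console.WriteLine($"{reader.GetInt32(0)} | {reader.GetDateTime(1)} | {reader.GetString(2)}");
}
```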
Solved! Go to Solution.
Hi All,
We solved this problem using the following approach: we run the stored procedures every 5 minutes with a few parameters so that the data gets cached. Because of this, the stored procedure performs much better the next time it is called from the external application, thanks to the cache.
We checked the utilization in the Capacity Metrics app and verified that this did not cause over-utilization.
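A minimal sketch of that warm-up job (the connection string, stored procedure name, and parameter values below are placeholders; we schedule it every 5 minutes, and any scheduler - a pipeline, a timer-triggered function, or a plain console job - works the same way):

```csharp
using System;
using System.Data;
using System.Threading;
using Microsoft.Data.SqlClient;

// Cache-warming job: re-run the stored procedure every 5 minutes with a few
// representative parameter values so the SQL endpoint keeps serving from warm cache.
// Connection string, procedure name, and parameter values are placeholders.
const string connectionString =
    "Server=tcp:<workspace-endpoint>.datawarehouse.fabric.microsoft.com,1433;" +
    "Database=MyLakehouse;Authentication=Active Directory Default;Encrypt=True;";

int[] warmUpIds = { 1001, 2002, 3003 };   // a few typical parameter values

using var timer = new PeriodicTimer(TimeSpan.FromMinutes(5));
do
{
    foreach (var id in warmUpIds)
    {
        await using var conn = new SqlConnection(connectionString);
        await conn.OpenAsync();

        await using var cmd = new SqlCommand("dbo.usp_GetOrderDetails", conn)
        {
            CommandType = CommandType.StoredProcedure
        };
        cmd.Parameters.AddWithValue("@orderId", id);

        // Results are discarded - the point is only to execute the procedure
        // server-side so its data stays cached.
        await cmd.ExecuteNonQueryAsync();
        Console.WriteLine($"Warmed @orderId = {id} at {DateTime.UtcNow:O}");
    }
} while (await timer.WaitForNextTickAsync());
```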
In general, a Lakehouse or Warehouse is not optimized for point queries. They use columnar storage (Delta Parquet tables) and are optimized for aggregations.
Have you tested/considered Fabric SQL Database?
(Fabric SQL Database is a preview feature, so not meant for production yet).
Hi @frithjof_v,
Yes, aggregation queries usually run much faster, but 20 seconds still seems bad. In Snowflake we have the Search Optimization Service, which speeds up point queries, so I am trying to see if something similar is available here.
We did think about Fabric SQL DB, but it is not yet ready for production! And again, we might have to replicate the tables manually to the database, which means additional complexity!
Hopefully, someone else can chime in with suggestions for optimizing SQL Analytics Endpoint.
I agree, 20 seconds is not nice...
Have the underlying delta tables been optimized in some way? E.g. have you run OPTIMIZE on the delta tables?
Is the SQL query you're using a complex query?
Out of curiosity, have you tried a Warehouse instead of a Lakehouse?
Here are some relevant links for SQL Analytics Endpoint and Warehouse performance:
https://learn.microsoft.com/en-us/fabric/data-warehouse/sql-analytics-endpoint-performance
https://learn.microsoft.com/en-us/fabric/data-warehouse/guidelines-warehouse-performance
I'm wondering if it's possible to keep the data in the SQL Analytics Endpoint "warm" by regularly querying the tables (I'm not sure whether the data would be kept in the warm cache, or whether that only works if the exact same query is run each time).
Hi @frithjof_v,
Thanks for your quick response!
These tables are replicated using D365 Link to Fabric, and they are OPTIMIZEd internally by that process (so there are no small files).
We did not try a Warehouse because of the extra copy process that would involve. Theoretically, LH and WH operate almost the same way (Delta Lake storage and STATISTICS); maybe the engine in the WH is different!
The SQL query is not that complex. It has a filter and 2-3 joins on tables.
Do you have many Lakehouses and Warehouses in the same Workspace?
Having fewer SQL Analytics Endpoints and Warehouses in a Workspace is beneficial for performance, according to these docs: https://learn.microsoft.com/en-us/fabric/data-warehouse/sql-analytics-endpoint-performance
Optimally, for performance, there should be just one Warehouse or one SQL Analytics Endpoint per workspace.
Hi @frithjof_v,
Yes, we have just one LH and one WH (for storing data that has been transformed using complex SQL queries).
If we don't have any other option, the best approach for us would be to materialize the data that the SQL query produces and see if that at least brings the query time down. But the SQL query is very simple: a filter is passed from the application to the query, which reads that record from one table and joins it with 2 more tables.
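If we go down that route, the materialization could be as simple as periodically rebuilding a pre-joined table in the WH and pointing the application's query at it. Below is only a sketch under assumptions: all object names are made up, and it assumes the WH can read the LH tables in the same workspace via three-part names and supports CTAS.

```csharp
using System;
using Microsoft.Data.SqlClient;

// Periodic materialization sketch: rebuild a pre-joined table in the Warehouse so
// the application's point query touches a single narrow table.
// All names are placeholders; the T-SQL is illustrative only.
const string warehouseConnectionString =
    "Server=tcp:<workspace-endpoint>.datawarehouse.fabric.microsoft.com,1433;" +
    "Database=MyWarehouse;Authentication=Active Directory Default;Encrypt=True;";

const string materializeSql = @"
    DROP TABLE IF EXISTS dbo.OrderDetails_Materialized;

    CREATE TABLE dbo.OrderDetails_Materialized AS
    SELECT  o.OrderId, o.OrderDate, c.CustomerName, s.StatusName
    FROM    MyLakehouse.dbo.Orders      AS o
    JOIN    MyLakehouse.dbo.Customers   AS c ON c.CustomerId = o.CustomerId
    JOIN    MyLakehouse.dbo.OrderStatus AS s ON s.StatusId   = o.StatusId;";

await using var conn = new SqlConnection(warehouseConnectionString);
await conn.OpenAsync();

await using var cmd = new SqlCommand(materializeSql, conn) { CommandTimeout = 600 };
await cmd.ExecuteNonQueryAsync();

// The application would then run:
//   SELECT ... FROM dbo.OrderDetails_Materialized WHERE OrderId = @orderId;
Console.WriteLine($"Materialized at {DateTime.UtcNow:O}");
```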
Hi @govindarajan_d,
Materializing the data is a good approach to improve query performance. May I inquire if this issue has been resolved?
If it has been resolved, please mark the helpful reply or share your solution and accept it as the solution. This will be beneficial for other community members facing similar issues.
Regards,
Vinay Pabbu
Hi @govindarajan_d,
As we haven't heard back from you, we wanted to kindly follow up and check whether the solution provided worked for you. Let us know if you need any further assistance.
If our response addressed your issue, please mark it as the accepted solution and click Yes if you found it helpful.
Regards,
Vinay Pabbu