JuanMahecha
Helper I

Update policy with real-time tables and external tables (shortcuts) in a KQL DB

Hello, good day.
I have the following problem:
I have a KQL DB in an Eventhouse with a bronze table that receives real-time data from a sensor. I also have 3 shortcuts that bring in 3 master tables from a lakehouse. I want to create the gold table directly (without going through silver) from this real-time bronze table and the three external tables. The problem is that when I try to create the update policy, it tells me that I can't use these external tables. What options do I have that don't involve creating a new table to store the data from the external tables?
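For context, this is roughly the shape of what I tried; a minimal sketch, where all table, column, and function names are illustrative:

// Function intended as the update policy query: enrich each bronze row
// with the three master tables exposed as shortcuts (external tables).
.create-or-alter function GoldExpand() {
    Bronze
    | join kind=leftouter external_table('MasterA') on DeviceId
    | join kind=leftouter external_table('MasterB') on SiteId
    | join kind=leftouter external_table('MasterC') on SensorType
}

// Attaching the policy is the step that fails, because the policy
// query is not allowed to reference external (shortcut) tables:
.alter table Gold policy update
@'[{"IsEnabled": true, "Source": "Bronze", "Query": "GoldExpand()", "IsTransactional": false}]'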
Thanks 😄

1 ACCEPTED SOLUTION

Hi @JuanMahecha ,

Thank you for reaching out to the Microsoft Fabric Community Forum.

Could you try the points below and let me know if you face any issues?

 

  • Use real-time ingestion tools such as Azure Stream Analytics or Azure Databricks to reduce the delay between loading and displaying data.

  • Use Delta Lake for your bronze tables to allow faster updates and more efficient data management.

  • Turn on Delta caching to speed up reads of the latest data.

  • Set frequent checkpoints in Delta Lake so the latest data can be accessed faster.

  • Increase the refresh frequency from daily to hourly or more often to keep data up to date.

  • Partition your data (e.g., by date) and use indexing to make loading and queries faster.

  • Use Azure Data Factory to automate data loading and run it more frequently for near-real-time updates.

  • Set up Azure Event Grid or Event Hubs to trigger data updates as soon as new data is available.

  • Make sure your system has enough resources to handle data-loading jobs without slowing down.

  • Use Azure Monitor or Databricks dashboards to track and fix anything that slows down your data pipeline.

Regards,

Chaithanya.

 


7 REPLIES
v-kathullac
Community Support

Hi @JuanMahecha ,

 

As we haven’t heard back from you, we wanted to kindly follow up and check whether the provided solution resolved your issue. Let us know if you need any further assistance.
If our response addressed your question, please mark it as the accepted solution and click Yes if you found it helpful.


Regards,

Chaithanya.

v-kathullac
Community Support

Hi @JuanMahecha , 

Thank you for reaching out to the Microsoft Fabric Community Forum.

 

You can't use external (shortcut) tables in update policies or materialized views in Eventhouse.

 

Alternate solution:

Use a scheduled Fabric notebook or ADF pipeline to:
1) Join the Bronze table with the external (shortcut) master tables.
2) Write the result into your Gold table.
3) Schedule it to run every few minutes.
This simulates real-time updates and bypasses the limitation on external tables.
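
If the scheduled job can issue KQL commands against the Eventhouse (for example from a pipeline activity or a notebook), one possible shape for the incremental step is query-based ingestion. A minimal sketch, with illustrative names, assuming the IngestionTime policy is enabled on Bronze and a 5-minute schedule matching the lookback window:

// Append the latest enriched bronze rows to Gold. Unlike an update
// policy query, a plain query may read external (shortcut) tables.
.set-or-append Gold <|
    Bronze
    | where ingestion_time() > ago(5m)   // only rows ingested since the last run
    | join kind=leftouter external_table('MasterA') on DeviceId
    | join kind=leftouter external_table('MasterB') on SiteId
    | join kind=leftouter external_table('MasterC') on SensorType

In practice the lookback window should be at least as long as the schedule interval, and overlapping runs may need deduplication on the gold side.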

 

If you think this external (shortcut) table limitation is a bug, I recommend submitting your detailed feedback and ideas through Microsoft's official feedback channels, such as Microsoft Fabric Ideas.

 

Regards,

Chaithanya.

 

This does not work: when reading the bronze table from the lakehouse, it will not show the data in real time, or even the most recently loaded data, because the information takes time to land in OneLake. Until the data has been refreshed there, I cannot read the latest rows from a notebook, or even through a shortcut to that bronze table in the lakehouse.

I would need to make the data land in OneLake faster, but I don't know how.



JuanMahecha
Helper I

I tried to use a materialized view, but it tells me that for security reasons the use of external tables is not allowed, so that does not work for me either.
I am using a function that does the join between the bronze table and the master tables, but I need its result to be stored in a table that updates automatically whenever new data arrives. That is what materialized views are for, but they do not let me use external tables, and neither do update policies.
So I don't know what to do :c

datacoffee
Super User

Have you tried materialized views?
They work more or less the same way, though they require an "aggregate" function in the query.

 

You can use external tables in these, and if you don't want to aggregate the data, you can always use a count() over <list all columns>.
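
A minimal sketch of that trick, with illustrative names; grouping by every column keeps each distinct row while satisfying the aggregate requirement (note that earlier in the thread, referencing external tables inside a materialized view was reported as blocked):

// Materialized view that effectively materializes the distinct bronze rows.
.create materialized-view GoldMV on table Bronze
{
    Bronze
    | summarize count() by DeviceId, Timestamp, Value
}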

 

You can also consider using a function (the equivalent of a stored procedure in SQL). Functions can also help you optimize the result set at query time, as they are very dynamic and can do a lot of cool stuff for you. They also work well with Power BI, since the connector sees them and exposes their parameters (if any).
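
A minimal sketch of such a parameterized function, with illustrative names; at query time the joins against the shortcut tables are allowed, and the Power BI Kusto connector would surface the lookback parameter:

// Parameterized gold view; call as GoldView(1h).
.create-or-alter function with (docstring = 'Gold rows enriched with master data')
GoldView(lookback: timespan) {
    Bronze
    | where ingestion_time() > ago(lookback)
    | join kind=leftouter external_table('MasterA') on DeviceId
}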

 

I hope you find a solution that meets your needs.


If this reply helps with your problem, please consider hitting the Accept button...
----------------
Blog: https://dcode.bi
KQL ref guide: https://aka.bi/kql
LinkedIn: https://aka.bi/follow
