Scenario:
I used shortcuts in the lakehouse to bring data from the ADLS storage account to the lakehouse files folder, and then I converted those files into Delta tables.
Question:
When the data in the ADLS storage account is refreshed with the latest data, how can the tables be updated without manually reloading the files into the Delta tables in the Lakehouse?
Hi,
I assume the shortcuts you create from the ADLS storage account land in the Files section of your Lakehouse. Those files are always up to date with the source (ADLS storage). To get an updated Delta table, you need to copy data from the Files section to the Tables section. This can be done with a Copy data activity in a data pipeline, with Dataflow Gen2, or with a notebook where you write your own (Python) script.
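The notebook option above can be sketched as follows. This is a minimal, hypothetical example for a Fabric notebook; the folder, file pattern, and table names are assumptions, and `spark` refers to the session that Fabric notebooks provide automatically.

```python
# Hypothetical Fabric notebook sketch: reload a managed Delta table from a
# shortcut in the Files section. All paths and names are illustrative.

def shortcut_path(folder: str, file_pattern: str) -> str:
    """Build the relative Files path for a lakehouse shortcut (pure helper)."""
    return f"Files/{folder}/{file_pattern}"

def refresh_table(spark, folder: str, file_pattern: str, table_name: str) -> None:
    """Overwrite a managed Delta table with the latest shortcut contents."""
    df = (spark.read
               .format("csv")            # adjust to parquet/json as needed
               .option("header", "true")
               .load(shortcut_path(folder, file_pattern)))
    (df.write
       .format("delta")
       .mode("overwrite")                # full reload; use MERGE for increments
       .saveAsTable(table_name))

# In a Fabric notebook, the `spark` session is predefined:
# refresh_table(spark, "sales_shortcut", "*.csv", "sales")
```

Scheduling this notebook (or calling it from a pipeline) gives you the periodic refresh without manual loading.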
Additionally, you could use Data Activator to run the pipeline or notebook automatically once a new file lands in ADLS (see Trigger Fabric items - Microsoft Fabric | Microsoft Learn). Whether a suitable trigger exists depends on how your data changes in ADLS.
I don't want to move data from ADLS to OneLake. Is there any option to access the data without moving it from one place to another?
Another option is to create an external (unmanaged) table in the Tables section of the Fabric Lakehouse, based on the file shortcut in the Files section. The external table is always up to date, without the need to copy data.
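An unmanaged table like this can be created from a notebook with a `CREATE TABLE ... USING ... LOCATION` statement over the shortcut path. A minimal sketch, with assumed table and path names:

```python
# Hypothetical sketch: register an unmanaged (external) table over a
# shortcut in the Files section. Table name and path are assumptions.

def external_table_ddl(table_name: str, files_path: str, fmt: str = "delta") -> str:
    """Compose the CREATE TABLE DDL for an unmanaged table (pure helper)."""
    return (f"CREATE TABLE IF NOT EXISTS {table_name} "
            f"USING {fmt} LOCATION '{files_path}'")

# In a Fabric notebook:
# spark.sql(external_table_ddl("sales_external", "Files/sales_shortcut"))
```

Because the table only points at the files, queries through Spark always see the current data; no copy step is involved.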
However, this external table won't work with the SQL analytics endpoint or Direct Lake, as far as I know.
I think the options provided by @FabianSchut are generally your best options in this case.
If your data in ADLS is in Delta Table format, then you can create the shortcut directly in the Table section of the Lakehouse, without copying data.
Please note: if you will use Direct Lake with the table, there is an advantage to creating the Delta table using Fabric (notebooks, data pipelines, Dataflow Gen2): the Delta table will then be V-Ordered. V-Ordering is a performance booster for Direct Lake.
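In a Fabric notebook, V-Order can be controlled through a Spark session setting when writing the table. A small sketch; the config key follows the Fabric documentation, but verify it against your runtime version before relying on it:

```python
# Hypothetical sketch: enable V-Order for writes in a Fabric notebook
# session. Config key taken from the Fabric docs; confirm for your runtime.

VORDER_CONF = ("spark.sql.parquet.vorder.enabled", "true")

# In a Fabric notebook:
# spark.conf.set(*VORDER_CONF)
# df.write.format("delta").mode("overwrite").saveAsTable("sales")
```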
Could you share what will be your use cases for this particular data inside Fabric?
Will you use it for Power BI (Direct Lake or Import mode), T-SQL queries, data science/ML, building a data model, etc.?
If you want to create a shortcut directly from ADLS to the Tables section of your Lakehouse, your data must already be in Delta format in ADLS. If your source data is not in Delta format, you need to convert it first. You can still use a pipeline for that, with ADLS as both source and destination, and then create a shortcut in the Lakehouse to the resulting Delta files.
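The conversion step can also be done with a notebook instead of a pipeline. A minimal sketch, assuming CSV source files; the storage account, containers, and paths are made-up names for illustration:

```python
# Hypothetical sketch: convert non-Delta (CSV) files in ADLS to Delta
# format, writing back to ADLS so a Tables-section shortcut can point at
# the Delta output. Account, containers, and paths are assumptions.

def abfss_uri(container: str, account: str, path: str) -> str:
    """Build an abfss:// URI for an ADLS Gen2 path (pure helper)."""
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path}"

def convert_to_delta(spark, src: str, dst: str) -> None:
    """Read non-Delta files and rewrite them as a Delta table in ADLS."""
    df = spark.read.option("header", "true").csv(src)
    df.write.format("delta").mode("overwrite").save(dst)

# In a Fabric notebook:
# convert_to_delta(spark,
#                  abfss_uri("raw", "mystorageacct", "sales/csv"),
#                  abfss_uri("curated", "mystorageacct", "sales/delta"))
```

After the Delta files exist in ADLS, a shortcut created in the Tables section of the Lakehouse will surface them as a table without any further copying.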