Dear Microsoft,
In a project for a customer I have the following setup: I extract data from an API, clean it, and save it to a Lakehouse. The tables in this Lakehouse are shared externally with the customer in another tenant. My user has rights on both tenants. In the customer's tenant I built a T-SQL notebook that copies the data from the external tables in a LandingLH to the customer's Warehouse.
The whole process runs every day, but the new data only ends up in my Lakehouse and in the shared Lakehouse of the customer's workspace, not in the customer's Warehouse. When I trigger the T-SQL notebook manually to copy from the LH to the WH, it works fine, but the scheduled T-SQL notebook doesn't copy the new data.
I thought the problem might be that the LH is not refreshed when the notebook tries to read the data, so I added a "SELECT * FROM ..." before starting the copy process to refresh it, but that didn't help.
I don't know what to try next, or why the scheduled notebook can't get the new data from the LH to the WH when a manual run works.
Could you help me?
Thanks,
Marius
Hello @marius1106 ,
I wanted to follow up on our previous suggestions regarding the issue. We would love to hear back from you to ensure we can assist you further.
If my response has addressed your query, please accept it as a solution and give a ‘Kudos’ so other members can easily find it. Please let us know if there’s anything else we can do to help.
Thank you.
Hi @marius1106 ,
Thank you for reaching out to the Microsoft Fabric Community.
Based on your scenario, where the scheduled T-SQL notebook doesn't pull the latest data from the shared Lakehouse into your customer's Warehouse even though manual execution works, the issue likely stems from Lakehouse SQL endpoint sync delays, differences in how scheduled and manual notebook runs execute, or a timing misalignment between the data refresh and the notebook run.
Since you’ve already tried using SELECT * FROM ... to force a refresh without success, I recommend verifying that the notebook runs under the same identity and permissions for both manual and scheduled executions.
Additionally, ensure that cross-tenant data sharing permissions support scheduled access specifically. If permissions aren’t the issue, the problem may stem from a delay in the Lakehouse’s T-SQL endpoint sync.
In that case, a workaround could involve introducing a short delay or scheduling the notebook slightly later to ensure data is fully refreshed. I also suggest enabling detailed logging within the notebook to capture any errors or unusual behavior that occurs during scheduled runs but not manual ones.
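To make the identity and freshness checks above concrete, here is a minimal T-SQL sketch you could place at the top of the scheduled notebook. All object and column names ([LandingLH].[dbo].[SourceTable], [LoadDate]) are placeholders for your own schema, and the 24-hour window is an assumed freshness threshold, not a Fabric default:

```sql
-- Log which identity this run executes under, so manual and
-- scheduled executions can be compared in the run output.
SELECT SUSER_SNAME() AS executing_identity;

-- Hypothetical freshness guard: fail fast if the landing table
-- has no rows newer than the expected refresh window, instead of
-- silently copying stale data.
IF NOT EXISTS (
    SELECT 1
    FROM [LandingLH].[dbo].[SourceTable]
    WHERE [LoadDate] >= DATEADD(HOUR, -24, SYSUTCDATETIME())
)
BEGIN
    THROW 50001, 'Landing table has no rows from the last 24 hours; aborting copy.', 1;
END;
```

If the scheduled run fails this guard while a manual run passes it, that points to an endpoint sync or timing issue rather than a permissions problem.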
I hope these suggestions give you a good starting point. If you need any further assistance, feel free to reach out.
If this post helps, please give it Kudos and consider accepting it as a solution so other members can find it more quickly.
Thank you.
Thanks for your answer! I evaluated the aspects you mentioned:
1) I created the notebook, so when I schedule it I think it runs under my user.
2) "cross-tenant data sharing permissions support scheduled access": could you show me where I can find this permission in the Admin portal? I couldn't find it.
3) Lakehouse delay: my LH is filled 4 hours before I copy the data from it, so I don't think that's the problem.
And how can I see detailed logs for my notebook? When I click "View detail" in Monitoring, I can only see the duration and the Job ID.
Hi @marius1106 ,
To troubleshoot your scheduled T-SQL notebook not pulling updated data while manual runs work, start by checking the cross-tenant sharing permissions. In the Admin portal, under Tenant settings → Export and sharing settings, ensure "External data sharing (preview)" is enabled in your tenant and "Users can accept external data shares (preview)" is enabled in the customer's tenant. Microsoft's documentation on external data sharing covers this setup.
For troubleshooting, enable detailed logging by connecting the workspace to Azure Log Analytics (see the Fabric Monday guide) or by adding PRINT statements in the notebook to track progress. To ensure the latest data is read before copying, run a preliminary query such as SELECT * FROM [External_Lakehouse_Table] OPTION (LABEL = 'ForceRefresh'); the label makes that run easy to identify in monitoring.
If scheduled runs still fail, try re-authenticating the notebook to renew credentials. For further guidance, check Microsoft’s notebook monitoring docs.
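Putting the logging and copy advice together, a hedged sketch of the copy step might look like the following. The object names ([LandingLH].[dbo].[SourceTable], [CustomerWH].[dbo].[TargetTable]) and the [LoadDate] watermark column are placeholders for your own schema, and the incremental pattern is an assumption about how your load is keyed:

```sql
DECLARE @rows INT;

-- Timestamp the start of the run so scheduled executions leave a trace.
PRINT CONCAT('Copy started at ', CONVERT(varchar(30), SYSUTCDATETIME(), 126));

-- Placeholder incremental copy: insert only rows newer than the
-- latest watermark already present in the warehouse target table.
INSERT INTO [CustomerWH].[dbo].[TargetTable]
SELECT s.*
FROM [LandingLH].[dbo].[SourceTable] AS s
WHERE s.[LoadDate] > (
    SELECT COALESCE(MAX([LoadDate]), '1900-01-01')
    FROM [CustomerWH].[dbo].[TargetTable]
);

-- Record how many rows the scheduled run actually moved.
SET @rows = @@ROWCOUNT;
PRINT CONCAT('Rows copied: ', @rows);
```

If the scheduled run prints "Rows copied: 0" while a manual run copies rows, that confirms the scheduled run is seeing a stale view of the Lakehouse rather than failing outright.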
If this post helps, please give it Kudos and consider accepting it as a solution so other members can find it more quickly.
Thank you.
Hi @marius1106 ,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.