I've set up a pipeline that pulls data from a lakehouse (with shortcuts to tables in another workspace) into a data warehouse via two stored procedures.
It succeeds when I run the pipeline manually, or when I schedule it for the near future (5-15 minutes from the current time). However, every overnight run reports success and runs for the expected duration, but does not actually appear to modify the data warehouse (I'm checking via the max timestamp in the source vs. the destination tables).
Has anyone else observed anything similar, or does anyone know an appropriate workaround for the time being? I will file a bug as soon as I find the appropriate place to do so (I'm relatively new to Fabric and these forums).
More info - I've tried having the data pipeline use scripts instead of stored procedures, adding a step that runs the same timestamp check in the warehouse (to rule out a sync issue where my manual test queries were forcing an endpoint refresh), and deleting and recreating the pipeline. I've also tried increasing the time between the ingest in the source workspace and the pipeline run, but still saw the same behavior (the original gap between the two was 2 hours; I increased it to 4 hours to test whether that was the issue).
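For reference, the timestamp check I keep mentioning is essentially the query below. The table and column names are placeholders rather than my real ones; it just compares the newest row on each side, with the lakehouse table referenced by three-part name from the warehouse:

```sql
-- Placeholder names: compare the latest timestamp in the lakehouse source
-- against the latest timestamp in the warehouse destination.
SELECT
    (SELECT MAX(LoadTimestamp) FROM MyLakehouse.dbo.SourceTable) AS SourceMax,
    (SELECT MAX(LoadTimestamp) FROM dbo.DestinationTable)        AS WarehouseMax;
```

After the overnight runs, SourceMax keeps advancing while WarehouseMax stays stuck at the previous day's value, even though the pipeline reports success.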
Support got back to me quickly; we found a workaround that involves running the pipeline twice (the second time, it succeeds). It sounds like I'm not the only person who has observed this bug, and the product team is working on a fix.
I think I have exactly the same problem. I am pulling from a SharePoint folder into a lakehouse, then using a stored proc to copy into a staging table. The pipeline runs successfully each day, but no data gets copied to the staging table. If I run it a second time it seems to work OK. I suspected it was a timing issue - i.e. when the stored proc runs it is somehow pulling the old data - so I am trying a wait between the two tasks. Very frustrating, as logically it should work!
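For what it's worth, the copy step in my stored proc is roughly the sketch below - all object names are placeholders, and it only inserts rows newer than what staging already holds:

```sql
-- Hypothetical sketch of the copy step (placeholder names):
-- insert only rows newer than the latest timestamp already in staging.
CREATE OR ALTER PROCEDURE dbo.LoadStaging
AS
BEGIN
    INSERT INTO dbo.Staging (Id, Payload, LoadTimestamp)
    SELECT  s.Id, s.Payload, s.LoadTimestamp
    FROM    MyLakehouse.dbo.SourceTable AS s
    WHERE   s.LoadTimestamp >
            (SELECT COALESCE(MAX(LoadTimestamp), '1900-01-01') FROM dbo.Staging);
END;
```

On the failing runs the proc finds no "new" rows, which would fit the theory that it is reading stale data from the lakehouse.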
I am afraid it's been a long time since logic left the building. 😉 The product on the backend is too complex, and there's a long string of bugs plaguing its use. That being said, does this happen with a Trial license or a paid SKU? When I was using Trial, I noticed there were lots of sync delays across artifacts, and one needs to give it time to propagate changes. But on a paid SKU, I'd think that for the money, Microsoft would make the effort to provide a feedback loop that is at least as close to real-time as possible. But who knows.
If I were you, I'd open a support ticket. That's what I do when I run into such issues. I came across so many bugs I ended up opening more than two dozen tickets over three months or so. So go forth, Padawan, and may the Force be with you! 😊
Thanks for responding (and the sympathy!)
I did put an arbitrary 60-second delay between the tasks and it worked fine this morning 🙂 So maybe it is a timing issue. I have thought about some kind of check function, but there could be a danger of infinite loops - and a nice cash generator for MS!
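If I do end up adding a check, I'd cap the retries so it can never loop forever - roughly the sketch below, with placeholder names. (I'm assuming WAITFOR DELAY is usable on the warehouse endpoint; if it isn't, the same retry cap can live in the pipeline's Until activity instead.)

```sql
-- Hypothetical bounded poll (placeholder names): wait for the source to show
-- newer data than staging already has, but give up after a fixed number of tries.
DECLARE @tries    INT = 0,
        @maxTries INT = 10;
DECLARE @destMax  DATETIME2 =
        (SELECT COALESCE(MAX(LoadTimestamp), '1900-01-01') FROM dbo.Staging);

WHILE @tries < @maxTries
  AND (SELECT MAX(LoadTimestamp) FROM MyLakehouse.dbo.SourceTable) <= @destMax
BEGIN
    SET @tries += 1;
    WAITFOR DELAY '00:01:00';   -- check again in one minute
END;
```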
Are you using any pipeline variables to filter the rows you want to ingest from the source? And if so, do you use only one such variable but have two different Set variable activities modifying it, in conjunction with a conditional If activity?
No variables used, no IF activity; the procedure is actually runnable as a Script, but I put it as a Procedure because that is where the other developers I am working with would think to look for it.
Looks like a bug to me. I usually log support tickets for Fabric here: https://admin.powerplatform.microsoft.com/