Hi everyone,
I'm setting up a data pipeline in Microsoft Fabric, where I need to copy an Excel file into a Lakehouse folder every month. The file name should be dynamic to reflect the current year and month (e.g., MonthlyReport_202310.xlsx for October 2023).
I've tried using expressions like:
concat('MonthlyReport_', formatDateTime(utcnow(), 'yyyyMM'), '.xlsx')
in the File name field, but it’s creating a literal folder with the expression text instead of evaluating it dynamically. I also tried defining the concatenated file name in a pipeline parameter and referencing it in the destination path, but I’m still running into issues.
Has anyone successfully configured a dynamic file name in a Copy Activity for Lakehouse? Any tips on how to make this work properly would be greatly appreciated.
Thanks for any help.
Hi @HamidBee ,
There is an alternative:
New and changed files can be copied from one lakehouse to another incrementally. The Copy activity's "Filter by last modified" setting determines which files to copy.
Once this is configured, Data Factory scans all files in the source store, applies the "Filter by last modified" condition, and then copies only the files that are new or have been updated since the last run to the target store.
For more information, please refer to the Data Factory documentation on copying new and changed files by last modified date.
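To make the idea concrete (this is just my sketch, not part of the original reply; the exact field labels in the Copy activity's source options may differ slightly), a monthly run that should only pick up files modified during the current month could set the filter's start and end times with dynamic content such as:
Filter by last modified, start time (UTC): @startOfMonth(utcnow())
Filter by last modified, end time (UTC): @utcnow()
Both startOfMonth() and utcnow() are standard pipeline expression functions, so the window moves forward automatically each month.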
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
I think you are close, but instead of using concat() directly in the Copy activity, create a pipeline variable (for example, sourceFileName), then add a Set Variable activity where you define its value using Add dynamic content and enter @concat('MonthlyReport_', utcNow('yyyyMM'), '.xlsx'). You can then reference that variable in the Copy activity's destination file name.
See my learning notes attached.
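To spell the steps out (my sketch of the idea; the variable name sourceFileName is just an example), the Set Variable activity's value, entered through Add dynamic content, would be:
@concat('MonthlyReport_', utcNow('yyyyMM'), '.xlsx')
and the destination file name in the Copy activity would then reference the variable, again through Add dynamic content:
@variables('sourceFileName')
Entering both via Add dynamic content is what makes the pipeline evaluate them; typing an expression straight into the File name field is treated as literal text, which matches the behaviour you saw. For October 2023 the variable evaluates to MonthlyReport_202310.xlsx.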
Hi @HamidBee ,
Just following up to ask whether the problem has been solved.
If it has, could you accept the correct answer as the solution, or share your own solution, to help other members find it faster?
Thank you very much for your cooperation!
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!