Hi everyone,
We're looking to reduce the storage burden on our Salesforce instance by moving archived data from the Task and Event objects into OneLake and making it accessible via Power BI.
We've tried various permission models on the Salesforce side, but we can't get visibility of archived data using Dataflow Gen2. I then tried the Copy assistant in Data pipelines and enabled the 'Include deleted objects' option in the source settings of the copy activity. We can now see some archived data, but still only a fraction of it.
Has anyone tried this before? Are there other ways to move archived data out of Salesforce?
Thanks.
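One workaround worth testing (a sketch under assumptions, not a confirmed fix): pull the archived rows directly through Salesforce's queryAll REST resource, which returns archived and soft-deleted records that the plain query resource omits, then land the result in OneLake afterwards. The API version and auth details below are illustrative placeholders:

```python
# Sketch (assumption): fetch archived Task/Event rows via the queryAll REST
# resource. Archived activity rows carry IsArchived = true and are only
# returned through queryAll, not the regular /query resource.
import requests


def archived_soql(sobject: str, fields: list[str]) -> str:
    """Build a SOQL statement that targets archived rows only."""
    return (
        f"SELECT {', '.join(fields)} FROM {sobject} "
        "WHERE IsArchived = true"
    )


def fetch_archived(instance_url: str, token: str, soql: str) -> list[dict]:
    """Page through /services/data/vXX.X/queryAll until all rows are read."""
    url = f"{instance_url}/services/data/v59.0/queryAll"  # version is a placeholder
    headers = {"Authorization": f"Bearer {token}"}
    records, params = [], {"q": soql}
    while url:
        resp = requests.get(url, headers=headers, params=params)
        resp.raise_for_status()
        body = resp.json()
        records.extend(body["records"])
        # nextRecordsUrl is a relative path; absent/ignored once done is true
        url = instance_url + body["nextRecordsUrl"] if not body["done"] else None
        params = None
    return records
```

A scheduled Fabric notebook running something like this could then write the rows into the Lakehouse, sidestepping whatever filtering the connector applies.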
Hi @alirng ,
Firstly, I think you can go to Dataflow Gen2 and search for Salesforce.
I also think you can set up a Lakehouse. You can look at this document: OneLake Architecture (microsoft.com)
That way you can also see the database you want.
Best Regards
Yilong Zhou
If this post helps, then please consider Accepting it as the solution to help other members find it more quickly.
Hi,
Thanks for your message. I know how to get data from Salesforce objects using a dataflow; however, we're trying to get the data from archived rows within an object in Salesforce.
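For anyone landing here later, once the archived rows have been retrieved (for example via the queryAll REST resource), landing them in the Lakehouse from a Fabric notebook can be sketched as below. This is a minimal sketch, assuming the default Lakehouse is attached to the notebook (mounted at /lakehouse/default); the file name is a placeholder:

```python
# Sketch: flatten Salesforce REST query results and write them as Parquet
# into the Lakehouse Files area, where Power BI can reach them.
import pandas as pd


def flatten(records: list[dict]) -> pd.DataFrame:
    """Drop the per-record 'attributes' metadata the Salesforce REST API
    returns and build a flat table from the remaining fields."""
    return pd.DataFrame(
        [{k: v for k, v in r.items() if k != "attributes"} for r in records]
    )


def land_records(records: list[dict], path: str) -> int:
    """Write the flattened rows as Parquet; returns the row count."""
    df = flatten(records)
    df.to_parquet(path, index=False)
    return len(df)


# Example usage inside a Fabric notebook (default Lakehouse mount path):
# land_records(records, "/lakehouse/default/Files/archived_tasks.parquet")
```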