I intend to automate the extraction of data from Dataverse to a lakehouse using pipelines and a Copy data activity.
Users require many Dataverse tables, and rather than create a Copy data activity for each of the hundreds of tables, I want to drive the process from a metadata table.
The metadata table has columns SourceTable and DestTable.
The pipeline iterates through each row in this metadata table and copies from source to destination.
So far there have been a number of blockers:
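The metadata-driven pattern described above can be sketched in plain Python. This is only an illustration of the control flow, not Fabric pipeline code: the table names and the `copy_table` helper are hypothetical placeholders for the real Copy data activity.

```python
# Minimal sketch of a metadata-driven copy loop (hypothetical names).
# Each row of the metadata table names a source Dataverse table and its
# lakehouse destination; the loop issues one copy per row instead of
# hundreds of hand-built Copy data activities.

metadata = [
    {"SourceTable": "account", "DestTable": "dbo.account"},
    {"SourceTable": "contact", "DestTable": "dbo.contact"},
]

def copy_table(source: str, dest: str) -> str:
    """Placeholder for the real copy activity; returns a job description."""
    return f"copy {source} -> {dest}"

jobs = [copy_table(row["SourceTable"], row["DestTable"]) for row in metadata]
for job in jobs:
    print(job)
```

In a Fabric pipeline the same shape is a Lookup activity over the metadata table feeding a ForEach activity that parameterises a single Copy data activity.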
Hi @wilson_smyth ,
Are you attempting to do this in Azure Synapse or in Fabric? Assuming Fabric, you can leverage Link to Fabric in Dataverse, which automatically loads Dataverse tables into a lakehouse in Fabric. This automates ingestion into Fabric without the need to set up any pipelines or copy activities. If you still need to reference the data from a separate lakehouse, you can leverage cross-warehouse querying in Fabric, removing the need to load the data an additional time.
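Cross-warehouse querying in Fabric uses three-part naming (`item.schema.table`) so one warehouse or SQL analytics endpoint can read another item's tables without copying them. A small helper that builds such a query might look like this; the lakehouse and table names are hypothetical examples, not values from this thread:

```python
def cross_warehouse_query(item: str, schema: str, table: str,
                          columns: str = "*") -> str:
    """Build a T-SQL statement using Fabric's three-part naming so a
    query in one item can read a table owned by another item."""
    return f"SELECT {columns} FROM [{item}].[{schema}].[{table}]"

# Hypothetical example: read the Dataverse 'account' table that
# Link to Fabric landed in a lakehouse named 'DataverseLakehouse'.
print(cross_warehouse_query("DataverseLakehouse", "dbo", "account"))
```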
Here is some official Microsoft documentation around this:
If this helped, please mark it as the solution so others can benefit too. And if you found it useful, kudos are always appreciated.
Thanks,
Samson
Connect with me on LinkedIn
Check out my Blog
Going to the European Microsoft Fabric Community Conference? Check out my Session