wilson_smyth
Post Patron

Copy data - if table does not exist in lakehouse, create it. Is this possible?

I intend to automate the extraction of data from Dataverse to a lakehouse using pipelines and the Copy data activity.
Users require a lot of Dataverse tables, and rather than have a Copy data activity for each of the hundreds of tables, I wanted to drive this from a metadata table.

The table has columns for SourceTable and DestTable.
The pipeline will iterate through each row in this metadata table and copy from source to destination.
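
For illustration, the metadata table driving that loop might be created in a Fabric notebook along these lines (a rough sketch only; the table name, column names, and row values are placeholders):

```python
# Sketch: a minimal metadata table for the pipeline's ForEach activity to iterate over.
# All names here are illustrative placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

rows = [
    ("account", "account"),          # Dataverse table -> lakehouse table
    ("contact", "contact"),
    ("opportunity", "opportunity"),
]

(spark.createDataFrame(rows, ["SourceTable", "DestTable"])
     .write.mode("overwrite")
     .saveAsTable("copy_metadata"))  # "copy_metadata" is a placeholder name
```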

So far there have been a number of blockers:

jennratten
Super User

Hello @wilson_smyth - I agree with @SamsonTruong - it would be best to avoid replication by adding a shortcut to Dataverse in Fabric.  Have you considered this? 

However, to answer the technical question, given you are trying to do this for other sources as well, the metadata approach with a data pipeline is recommended. In my pipelines, within the ForEach activity (one iteration per row of the metadata table), I have a notebook that performs the necessary actions. Within the notebook, I include a statement that first checks whether or not the table exists.
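
For example, the existence check inside the notebook can look roughly like the sketch below (PySpark; the table names are placeholders for whatever the ForEach activity passes in as parameters):

```python
# Sketch of the per-table notebook logic: create the destination table on the
# first run, otherwise append to it. Parameter values are illustrative
# placeholders that would normally come from the pipeline's ForEach activity.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

source_table = "dataverse_account"   # placeholder: passed in from the pipeline
dest_table = "account"               # placeholder: passed in from the pipeline

df = spark.read.table(source_table)  # or whatever read suits the source

if spark.catalog.tableExists(dest_table):
    # Destination already exists in the lakehouse: load into it.
    df.write.mode("append").saveAsTable(dest_table)
else:
    # First run: saveAsTable creates the Delta table from the incoming schema.
    df.write.saveAsTable(dest_table)
```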

If this post helps to answer your questions, please consider marking it as a solution so others can find it more quickly when faced with a similar challenge.

Proud to be a Microsoft Fabric Super User

SamsonTruong
Impactful Individual

Hi @wilson_smyth ,

Are you attempting to do this in Azure Synapse or Fabric? Assuming Fabric, you can leverage Link to Fabric in Dataverse, which automatically loads Dataverse tables into a lakehouse in Fabric. This automates the ingestion into Fabric without the need to set up any pipelines or copy activities. If you still need to reference the data in a separate lakehouse, you can use cross-warehouse querying in Fabric, removing the need to load the data an additional time.
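
To make that last point concrete, reading the Dataverse-linked tables from a notebook attached to another lakehouse could look something like this sketch (it assumes both lakehouses are in the same workspace and visible to the notebook; "DataverseLakehouse" and "account" are placeholder names, not real ones from your environment):

```python
# Sketch: query a table that lives in the Dataverse-linked lakehouse from a
# notebook attached to a different lakehouse, instead of copying the data again.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Qualify the table with the lakehouse name so Spark resolves it in the
# other lakehouse's catalog (placeholder names throughout).
df = spark.sql("SELECT accountid, name FROM DataverseLakehouse.account")
df.show(5)
```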


Here is some official Microsoft documentation around this:


If this helped, please mark it as the solution so others can benefit too. And if you found it useful, kudos are always appreciated.


Thanks,

Samson

 

Connect with me on LinkedIn

Check out my Blog

Going to the European Microsoft Fabric Community Conference? Check out my Session

Hi @wilson_smyth 

I wanted to follow up to see if the suggestions shared earlier by @SamsonTruong helped clarify your question about extracting data from Dataverse to a lakehouse using pipelines and the Copy data activity.

If this resolved your issue, please consider marking the response as accepted, as it may help others with similar questions.

 

We’re happy to help explore other options if needed.

Thank you for being part of the community!

Hi @wilson_smyth 

I wanted to follow up on the suggestion shared earlier by @SamsonTruong about using the Link to Fabric feature in Dataverse to automate loading tables into your lakehouse.

Could you please let us know if this approach helped resolve your issue, or if you’re still facing any challenges?

We’re here and happy to help with any further questions you might have!
