Hi Folks,
I need to send tables from my Fabric Lakehouse to a Dataverse environment (reverse integration). Is there any way to do this?
I know this is an out-of-the-box question, but can anyone help here?
Thanks in advance.
Hello @Ananth_Bhirappa,
We haven't heard from you since the last response and were just checking back to see if you have a resolution yet.
If you have found a resolution, please share it with the community, as it can be helpful to others.
If you have any questions relating to the current thread, please let us know and we will try our best to help you.
If you have a question on a different issue, we request that you open a new thread.
My method is to use a Dataverse dataflow: create a dataflow, select Azure SQL Database as the data source, enter the necessary information (the lakehouse's SQL analytics endpoint connection string and the lakehouse name), and configure the connection settings. You can then access and retrieve data from the lakehouse.
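If it helps, here is a minimal Python sketch for sanity-checking that same connection string outside the dataflow, assuming pyodbc and ODBC Driver 18 for SQL Server are installed; the server, lakehouse, and table names are placeholders:

```python
import pyodbc

# Placeholders: copy the SQL analytics endpoint connection string from the
# lakehouse settings in Fabric; the lakehouse name acts as the database name.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your-lakehouse-name>;"
    "Authentication=ActiveDirectoryInteractive;"  # signs in with your Entra ID account
    "Encrypt=yes;"
)

# Preview a few rows from one of the lakehouse tables
for row in conn.execute("SELECT TOP 5 * FROM dbo.MyTable").fetchall():
    print(row)
```

If this connects and returns rows, the same server and database values should work in the dataflow's Azure SQL Database connector.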
Hi,
The steps you forwarded worked.
Thank you.
Alternatively, rather than using Copy Data in a pipeline, you could instead create a Dataverse "virtual table" that points at the Fabric tables (no data is actually copied). Doco here
In the Power Apps maker portal (make.powerapps.com), open your solution and choose "New->Table->Table from external data"
Then you will be able to choose Fabric as a data source:
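Once created, the virtual table behaves like any other Dataverse table. As a rough sketch of reading it back through the Dataverse Web API (assuming the msal and requests packages; the org URL, tenant ID, client ID, and entity set name are all placeholders):

```python
import msal
import requests

ORG_URL = "https://<your-org>.crm.dynamics.com"  # placeholder Dataverse org URL

# Interactive Entra ID sign-in; client and tenant IDs are placeholders
app = msal.PublicClientApplication(
    "<app-client-id>",
    authority="https://login.microsoftonline.com/<tenant-id>",
)
token = app.acquire_token_interactive(scopes=[f"{ORG_URL}/.default"])

# Query the virtual table's entity set (the plural logical name)
resp = requests.get(
    f"{ORG_URL}/api/data/v9.2/<entity_set_name>?$top=5",
    headers={
        "Authorization": f"Bearer {token['access_token']}",
        "Accept": "application/json",
    },
)
resp.raise_for_status()
for record in resp.json()["value"]:
    print(record)
```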
I followed these steps, but I am facing this issue. I am not sure what is causing this error. Could you please advise if you have any idea regarding this?
Hey @Sohan014, I am getting the same error as you - presumably there are issues on the MS side preventing this from working any more 😞
When would I choose Data Pipelines over Virtual Tables? Are pipelines only needed to update existing Dataverse tables? If I am using a virtual table, I don't need a pipeline, correct?
Hi @Ananth_Bhirappa ,
Thanks for using Fabric Community.
As I understand it, you want to send tables from your Fabric Lakehouse to Dataverse.
You can use the Copy activity in Fabric Data Factory.
Source - Lakehouse:
Destination - Dataverse:
After configuration, run the pipeline.
You can also refer to this - Set up your Dataverse connection - Microsoft Fabric | Microsoft Learn
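As a side note, once the pipeline is configured you can also trigger it programmatically. Here is a minimal sketch using the Fabric REST API's run-on-demand job endpoint (msal and requests assumed; the workspace, pipeline, client, and tenant IDs are all placeholders):

```python
import msal
import requests

WORKSPACE_ID = "<workspace-guid>"     # placeholder: Fabric workspace ID
PIPELINE_ID = "<pipeline-item-guid>"  # placeholder: data pipeline item ID

# Interactive Entra ID sign-in; client and tenant IDs are placeholders
app = msal.PublicClientApplication(
    "<app-client-id>",
    authority="https://login.microsoftonline.com/<tenant-id>",
)
token = app.acquire_token_interactive(
    scopes=["https://api.fabric.microsoft.com/.default"]
)

# Run the pipeline on demand; a 202 response carries a Location header
# pointing at the job instance, which you can poll for status.
resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{PIPELINE_ID}/jobs/instances?jobType=Pipeline",
    headers={"Authorization": f"Bearer {token['access_token']}"},
)
resp.raise_for_status()
print("Job accepted, poll:", resp.headers.get("Location"))
```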
Hope this is helpful. Please let me know in case of further queries.
Hello @Ananth_Bhirappa,
We haven't heard from you since the last response and were just checking back to see if you have a resolution yet.
If you have found a resolution, please share it with the community, as it can be helpful to others.
Otherwise, please respond back with more details and we will try to help.