Hi Folks,
I need to send tables from my Fabric Lakehouse to a Dataverse environment (reverse integration). Is there any way to do this?
I know this is a bit of an out-of-the-box question, but can anyone help here?
Thanks in advance.
Hello @Ananth_Bhirappa ,
We haven't heard from you since the last response and wanted to check whether you have found a resolution yet.
If you have, please share it with the community, as it can be helpful to others.
If you have any questions relating to the current thread, please let us know and we will do our best to help you.
If you have a question about a different issue, we request that you open a new thread.
My method is to use a Dataverse dataflow: create a dataflow, select Azure SQL Database as the data source, enter the necessary information (the Lakehouse's SQL connection string, i.e. its SQL analytics endpoint, plus the lakehouse name), and configure the connection settings. The dataflow can then read the data from the lakehouse.
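For anyone who wants to double-check the connection details before building the dataflow, here is a minimal sketch (assuming pyodbc and the ODBC Driver 18 for SQL Server are installed; the endpoint and lakehouse names are placeholders for your own workspace) that connects to the same SQL analytics endpoint the Azure SQL Database connector will use:

```python
# Minimal sketch: verify the Lakehouse SQL analytics endpoint details that the
# Dataverse dataflow's Azure SQL Database connector will need.
# Assumptions: pyodbc + ODBC Driver 18 for SQL Server; placeholder names below.
import pyodbc

SERVER = "<your-sql-analytics-endpoint>.datawarehouse.fabric.microsoft.com"  # placeholder
DATABASE = "<your-lakehouse-name>"  # placeholder

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    f"Server={SERVER};"
    f"Database={DATABASE};"
    "Authentication=ActiveDirectoryInteractive;"  # sign in with your Entra ID account
    "Encrypt=yes;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    # List the lakehouse tables the dataflow will be able to see.
    cursor.execute("SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES")
    for schema, table in cursor.fetchall():
        print(f"{schema}.{table}")
```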
Hi,
The steps you shared worked.
Thank you.
Alternatively, rather than using a Copy data activity in a pipeline, you could create a Dataverse "virtual table" that points at the Fabric tables (no data is actually copied). Documentation here
In the Power Apps maker portal (make.powerapps.com), open your solution and choose New -> Table -> Table from external data.
Then you will be able to choose Fabric as a data source.
I followed these steps, but I am facing this issue. I am not sure why it is causing this error. Could you please advise if you have any idea what might be wrong?
Hey @Sohan014, I am getting the same error as you - presumably there are issues on the MS side preventing this from working any more 😞
When would I choose Data Pipelines over Virtual Tables? Are pipelines only needed to update existing dataverse tables? If I am using a virtual table I don't need a pipeline, correct?
Hi @Ananth_Bhirappa ,
Thanks for using Fabric Community.
As I understand it, you want to send tables from a Fabric Lakehouse to Dataverse.
You can use the Copy activity in a Fabric Data Factory pipeline.
Source: Lakehouse
Destination: Dataverse
After configuring the source and destination, run the pipeline.
You can also refer to this: Set up your Dataverse connection - Microsoft Fabric | Microsoft Learn
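If you would rather script the same Lakehouse-to-Dataverse push instead of configuring a pipeline, here is a rough sketch using the Dataverse Web API. It is not the Copy activity itself (that is set up in the Fabric UI), and the org URL, app registration, entity set, and column names are all hypothetical placeholders you would replace with your own:

```python
# Hedged sketch: push rows to Dataverse through the Dataverse Web API.
# Assumptions: an Entra ID app registration with Dataverse permissions, and the
# msal + requests packages. All names below are placeholders.
import msal
import requests

TENANT_ID = "<tenant-id>"                              # placeholder
CLIENT_ID = "<app-client-id>"                          # placeholder
CLIENT_SECRET = "<app-client-secret>"                  # placeholder
DATAVERSE_URL = "https://<your-org>.crm.dynamics.com"  # placeholder
ENTITY_SET = "new_lakehouserows"                       # hypothetical Dataverse table

# Acquire a token for the Dataverse environment (client credentials flow).
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=[f"{DATAVERSE_URL}/.default"])["access_token"]

headers = {
    "Authorization": f"Bearer {token}",
    "Content-Type": "application/json",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
}

# Rows would normally be read from the Lakehouse (for example via its SQL
# analytics endpoint, as in the earlier sketch); hard-coded here to stay self-contained.
rows = [{"new_name": "Example", "new_amount": 42}]

for row in rows:
    resp = requests.post(f"{DATAVERSE_URL}/api/data/v9.2/{ENTITY_SET}", json=row, headers=headers)
    resp.raise_for_status()
```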
Hope this is helpful. Please let me know in case of further queries.
Hello @Ananth_Bhirappa ,
We haven't heard from you since the last response and wanted to check whether you have found a resolution yet.
If you have, please share it with the community, as it can be helpful to others.
Otherwise, please reply with more details and we will try to help.