Hi
We are exporting data from D365 to a Gen2 data lake, and from there I want to ingest it into an Azure SQL database. I tried using the "Copy Dataverse data into Azure SQL using Synapse Link" template to achieve this; however, the files in the data lake's storage account are parquet files, and the template only seems to work for CSVs.
What is the most efficient way to ingest the parquet files into Azure SQL? Is there an established pattern for the D365 data structure? Can I take advantage of the Common Data Model?
Cheers
Alex
Hello @alexp01482
I would suggest this approach:
Data exported from Dynamics 365 to Azure Data Lake Gen2 typically adheres to the Common Data Model (CDM), so you can use the CDM metadata that accompanies the export to interpret the entity schemas when loading the files into Azure SQL.
If this is helpful, please accept the answer and give kudos.