IDoData
Frequent Visitor

Updates to the DP-700 exam

In the current version of Microsoft's DP-700 exam, there are several existing topics:

 

  • Create and configure deployment pipelines
  • Ingest data by using pipelines

 

In the new version, starting in a few days' time, there is a new topic:

Ingest data by using continuous integration from OneLake

 

What is this new topic about? Presumably it is something different from the other two topics, but I can't find anything about it that distinguishes it from them.

1 ACCEPTED SOLUTION
IDoData
Frequent Visitor

FYI - I think I've worked it out, for anyone else wondering.

 

I think it's to do with subscribing to OneLake events (File/Folder create/modify/delete), and then running an action based on that event.
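To make the event-driven idea concrete, here is a minimal sketch of reacting to a OneLake file event. The payload shape and the `Microsoft.Fabric.OneLake.*` event type names are assumptions for illustration (OneLake events are surfaced in CloudEvents style via the Fabric Real-Time hub); check the current Fabric docs for the exact schema.

```python
import json

# Hypothetical CloudEvents-style payload for a OneLake file event.
# The "type" and "subject" values below are illustrative assumptions.
SAMPLE_EVENT = json.dumps({
    "type": "Microsoft.Fabric.OneLake.FileCreated",
    "subject": "/workspaces/Sales/lakehouses/Raw/Files/orders/2026-03.csv",
    "time": "2026-03-01T09:30:00Z",
})

# File-change event types that should kick off an ingestion run.
TRIGGER_TYPES = {
    "Microsoft.Fabric.OneLake.FileCreated",
    "Microsoft.Fabric.OneLake.FileRenamed",
    "Microsoft.Fabric.OneLake.FileDeleted",
}

def should_ingest(raw_event: str) -> bool:
    """Return True if the event is a file change we react to."""
    event = json.loads(raw_event)
    return event.get("type") in TRIGGER_TYPES

if should_ingest(SAMPLE_EVENT):
    print("trigger ingestion for", json.loads(SAMPLE_EVENT)["subject"])
```

In a real setup you wouldn't write this dispatch yourself: a Fabric eventstream or Activator rule subscribes to the OneLake events and runs the pipeline or notebook for you. The point is only that the trigger is the file change, not a schedule or a manual run.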


4 REPLIES
v-aatheeque
Community Support

Hi @IDoData 

Following up to confirm whether the earlier responses addressed your query. If not, please share your questions and we'll assist further.

Rufyda
Super User

Pipelines = you orchestrate ingestion
CI from OneLake = ingestion runs automatically when data changes

This topic tests understanding of event-driven + CI/CD-based ingestion, not manual pipeline design.

If this response was helpful, please accept it as a solution and give kudos to support other community members.
Regards,
Rufyda Rahma | MIE

deborshi_nag
Resident Rockstar

Hi @IDoData 

 

The wording here is a bit of a brain teaser ... 

  • "Ingest data by using pipelines" is your classic, hands-on approach: think building Data Factory or pipeline-based solutions where you map out the data journey yourself using Fabric pipeline tools.
  • "Ingest data by using continuous integration from OneLake" is a whole different flavor. This one's about letting your Git/CI process call the shots, automatically kicking off ingestion whenever something changes in OneLake. No manual button-mashing required!

So, what does this look like in action?

  • A Git repo triggers pipelines or notebooks whenever new files pop up in OneLake.
  • CI/CD tools like Azure DevOps or GitHub Actions swoop in to deploy and ingest data straight into OneLake.
  • Ingestion happens automatically when files change in OneLake, so you don’t have to orchestrate every step yourself.
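As a sketch of the CI/CD side, here is how an Azure DevOps or GitHub Actions job might prepare a call to start a Fabric pipeline run. The URL shape follows the public Fabric "run on demand item job" REST API, but treat the exact endpoint, and all IDs below, as assumptions to verify against the current docs; the code only builds the request rather than sending it.

```python
from urllib.parse import quote

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_pipeline_run_request(workspace_id: str, pipeline_id: str, token: str):
    """Build URL and headers for an on-demand Fabric pipeline run.

    Endpoint shape is an assumption based on the Fabric
    'Job Scheduler - Run On Demand Item Job' REST API.
    """
    url = (
        f"{FABRIC_API}/workspaces/{quote(workspace_id)}"
        f"/items/{quote(pipeline_id)}/jobs/instances?jobType=Pipeline"
    )
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    return url, headers

# Example with placeholder IDs; a CI job would read these from
# pipeline variables/secrets and POST the request with its HTTP client.
url, headers = build_pipeline_run_request("ws-123", "pl-456", "<token>")
print(url)
```

In practice the CI workflow would authenticate with a service principal, deploy the updated items to the workspace, and then POST this request so ingestion runs as part of the same automated flow.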

Hope this helps - if you found this guidance useful, please leave a Kudos or accept it as a Solution.
