Data ingestion from Oracle EBS
Hi Team,
What is the ideal way to ingest data from Oracle EBS into MS Fabric? Should we use Dataflow Gen2, a Data Pipeline Copy activity, or a Notebook? Kindly advise.
Thanks,
Pallavi
Hi @pallavi_r ,
First of all, if you're trying to strike a balance between ease of use and functionality, Dataflow Gen2 is a great choice.
If you're building large-scale, repeatable workflows, Data Pipeline Copy activities are a good fit.
For highly customized and complex scenarios, notebooks can provide maximum flexibility and control.
Dataflow Gen2:
1. Best for: Code-free data preparation, cleaning, and transformation.
2. Advantages: Easy to use with a graphical interface, supports complex transformations, and integrates well with other Fabric services.
3. Limitations: May not be the best choice for very large datasets or highly complex ETL processes.
Data Pipeline Copy Activity:
1. Best for: Robust, repeatable data ingestion workflows.
2. Advantages: Supports large volumes of data, can be scheduled, and offers a low-code experience. Ideal for full ETL processes.
3. Limitations: Currently, it may have limitations with on-premises Oracle databases.
Notebook:
1. Best for: Custom, code-rich data ingestion and transformation.
2. Advantages: Highly flexible, supports complex logic and custom scripts, and can handle large datasets efficiently (see the sketch after this list).
3. Limitations: Requires coding knowledge and may be more complex to set up and maintain.
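If you go the notebook route, here is a minimal sketch of what the ingestion could look like, assuming the Oracle JDBC driver is available to the Fabric Spark session; the host, service name, credentials, and table names are placeholders to replace with your own:

```python
# Minimal sketch: pull one Oracle EBS table into a Lakehouse Delta table.
# All connection details and table names below are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//ebs-db-host:1521/EBSPROD")  # hypothetical EBS database
    .option("dbtable", "AP.AP_INVOICES_ALL")       # example EBS table
    .option("user", "fabric_reader")               # read-only database account
    .option("password", "********")                # use a secret store, not a literal
    .option("driver", "oracle.jdbc.OracleDriver")
    .option("fetchsize", 10000)                    # larger fetches speed up bulk reads
    .load()
)

# Land the raw rows in the Lakehouse attached to the notebook.
df.write.mode("overwrite").saveAsTable("bronze_ap_invoices_all")
```

The same pattern works for any table or view the database account is allowed to read.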
Best Regards
Yilong Zhou
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @v-yilong-msft ,
Thanks for your reply.
I have a few more queries. Please bear with me in answering these.
1. There is no direct connector to Oracle EBS, and we can only connect to the Oracle database. Is that so?
2. Can I also achieve mass ingestion/a one-time full load with a notebook or the data pipeline Copy activity?
3. For scheduled incremental refresh, can I use metadata-driven ingestion with a data pipeline and notebook?
Thanks,
Pallavi
Hi @pallavi_r ,
To your first question, I don't think many integration tools have a direct connector specifically for Oracle E-Business Suite (EBS). Typically, you need to connect to the underlying Oracle database.
For your second question, yes: you can use a notebook or a data pipeline Copy activity to achieve a bulk import or a complete one-time load. Copy activities are particularly well suited to large-scale, repeatable workflows and can efficiently move large amounts of data. Notebooks offer greater flexibility and control and suit complex scenarios, but require coding knowledge.
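As a rough illustration of the notebook side of a one-time full load, you can parallelize the Oracle read by partitioning on a numeric key. The column and bounds below are illustrative; take them from profiling your own EBS tables:

```python
# Hedged sketch: bulk load split into concurrent JDBC reads.
# Connection details, table, and partition bounds are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//ebs-db-host:1521/EBSPROD")  # placeholder
    .option("dbtable", "ONT.OE_ORDER_LINES_ALL")                    # example EBS table
    .option("user", "fabric_reader")
    .option("password", "********")
    .option("driver", "oracle.jdbc.OracleDriver")
    # Split the read into 16 parallel queries over ranges of LINE_ID.
    .option("partitionColumn", "LINE_ID")
    .option("lowerBound", 1)
    .option("upperBound", 50_000_000)
    .option("numPartitions", 16)
    .load()
)

df.write.mode("overwrite").saveAsTable("bronze_oe_order_lines_all")
```

Spark issues one range query per partition, so the bounds only control how the key space is split; rows outside the bounds still land in the first and last partitions.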
For your third question, yes, you can use metadata-driven ingestion with data pipelines and notebooks for scheduled incremental refreshes. A control table of metadata drives which tables are loaded and from which watermark, so the incremental process stays dynamic and easy to automate. This works well for large datasets and for keeping data up to date.
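To make the metadata-driven idea concrete, here is a hedged sketch of one incremental pass. It assumes a Lakehouse control table, here called ingest_watermarks with columns source_table, watermark_col, and last_value (a 'YYYY-MM-DD HH24:MI:SS' string), plus an audit column such as LAST_UPDATE_DATE on each EBS table; every name in it is illustrative:

```python
# Hedged sketch of one metadata-driven incremental refresh pass.
# The control table, connection details, and naming are all assumed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

JDBC_OPTS = {
    "url": "jdbc:oracle:thin:@//ebs-db-host:1521/EBSPROD",  # placeholder
    "user": "fabric_reader",
    "password": "********",                                 # use a secret store
    "driver": "oracle.jdbc.OracleDriver",
}

for row in spark.table("ingest_watermarks").collect():
    # Push the filter down to Oracle so only new or changed rows travel.
    query = (
        f"SELECT * FROM {row.source_table} "
        f"WHERE {row.watermark_col} > "
        f"TO_TIMESTAMP('{row.last_value}', 'YYYY-MM-DD HH24:MI:SS')"
    )
    changes = (
        spark.read.format("jdbc")
        .options(**JDBC_OPTS)
        .option("dbtable", f"({query}) src")  # wrap the query as an inline view
        .load()
    )

    # Append to the bronze table; production code would MERGE to deduplicate.
    target = "bronze_" + row.source_table.split(".")[-1].lower()
    changes.write.mode("append").saveAsTable(target)

    # Advance the watermark to the newest value just ingested.
    new_max = changes.agg(F.max(row.watermark_col)).first()[0]
    if new_max is not None:
        spark.sql(
            f"UPDATE ingest_watermarks SET last_value = "
            f"'{new_max.strftime('%Y-%m-%d %H:%M:%S')}' "
            f"WHERE source_table = '{row.source_table}'"
        )
```

A data pipeline can then run this notebook on a schedule, so the timed refresh and the metadata-driven logic stay in one place.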
Best Regards
Yilong Zhou
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
