pallavi_r
Resolver III

Data ingestion from Oracle EBS

Hi Team,

 

What is the ideal way to ingest data from Oracle EBS into MS Fabric? Should we use Dataflow Gen2, a Data Pipeline Copy activity, or a Notebook? Kindly advise.

 

Thanks,

Pallavi

v-yilong-msft
Community Support

Hi @pallavi_r ,

First of all, if you're trying to strike a balance between ease of use and functionality, Dataflow Gen2 is a great option.

If you're working on large-scale, repeatable workflows, the Data Pipeline Copy activity is a great choice.

For highly customized and complex scenarios, notebooks can provide maximum flexibility and control.

 

Dataflow Gen 2:

1. Best for: Code-free data preparation, cleaning, and transformation.

2. Advantages: Easy to use with a graphical interface, supports complex transformations, and integrates well with other Fabric services.

3. Limitations: May not be the best choice for very large datasets or highly complex ETL processes.

 

Data Pipeline Copy Activity:

1. Best for: Robust, repeatable data ingestion workflows.

2. Advantages: Supports large volumes of data, can be scheduled, and offers a low-code experience. Ideal for full ETL processes.

3. Limitations: Currently, it may have limitations with on-premises Oracle databases (connecting to an on-premises source typically requires an on-premises data gateway).

 

Notebook:

1. Best for: Custom, code-rich data ingestion and transformation.

2. Advantages: Highly flexible, supports complex logic and custom scripts, and can handle large datasets efficiently (see the sketch after this list).

3. Limitations: Requires coding knowledge and may be more complex to set up and maintain.
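
If you go the notebook route, here is a minimal PySpark sketch of reading one Oracle EBS table over JDBC and landing it as a Delta table in the Lakehouse. The connection string, credentials, and table names are placeholders, and it assumes the Oracle JDBC driver is available to the Spark session and that the database is reachable from Fabric:

# Minimal sketch: pull one Oracle EBS table over JDBC and land it as a Delta table
# in the attached Lakehouse. All connection details and names below are placeholders.
jdbc_url = "jdbc:oracle:thin:@//ebs-db-host:1521/EBSPROD"   # hypothetical host/service name

df = (spark.read.format("jdbc")
      .option("url", jdbc_url)
      .option("dbtable", "AP.AP_INVOICES_ALL")              # example EBS table
      .option("user", "fabric_reader")                      # placeholder credentials
      .option("password", "********")
      .option("driver", "oracle.jdbc.OracleDriver")
      .option("fetchsize", 10000)                           # larger fetch size helps bulk reads
      .load())

# Write to the default Lakehouse as a Delta table (overwrite = one-time full load)
df.write.mode("overwrite").format("delta").saveAsTable("bronze_ap_invoices_all")

For very large tables you can also set the partitionColumn, lowerBound, upperBound, and numPartitions JDBC options so Spark reads from Oracle in parallel.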

 

 

 

Best Regards

Yilong Zhou

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Hi @v-yilong-msft ,

 

Thanks for your reply.

I have a few more queries. Please bear with me in answering these.

1. There is no direct connector to Oracle EBS; we can only connect to the underlying Oracle database. Is that so?

2. Can I also achieve mass ingestion/a one-time full load with a notebook or a Data Pipeline Copy activity?

3. For scheduled incremental refresh, can I use metadata-driven ingestion using a data pipeline and a notebook?

Thanks,

Pallavi

Hi @pallavi_r ,

To your first question, I don't think many integration tools have a direct connector specifically for Oracle E-Business Suite (EBS). Typically, you need to connect to the underlying Oracle database.

 

For your second question, yes, you can use a notebook or a Data Pipeline Copy activity to achieve a bulk import or a complete one-time load. The Copy activity is particularly well suited to large-scale, repeatable workflows and can efficiently process large amounts of data. Notebooks offer greater flexibility and control and are particularly suited to complex scenarios, but require coding knowledge.

 

For your third question, yes, you can use metadata-driven ingestion with data pipelines and notebooks for scheduled incremental refreshes. This approach lets you manage and automate the incremental loading process through metadata, enabling dynamic and flexible data handling, and it works well for large datasets that need to stay up to date.
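
As a rough illustration of the metadata-driven pattern in a notebook (the control-table layout, column names, and watermark handling below are assumptions for the sketch, not a prescribed design):

# Sketch of a metadata-driven incremental load: a small control table lists each source
# table, its watermark column, and the last value loaded; each scheduled run pulls only
# newer rows and then advances the watermark. Layout and names are illustrative only.
from pyspark.sql import functions as F

jdbc_url = "jdbc:oracle:thin:@//ebs-db-host:1521/EBSPROD"   # placeholder, as in the earlier sketch

control = spark.table("ingestion_control")   # assumed columns: source_table, watermark_column, last_value

for row in control.collect():
    # Push the incremental filter down to Oracle as a subquery
    src_query = (f"(SELECT * FROM {row.source_table} WHERE {row.watermark_column} > "
                 f"TO_DATE('{row.last_value}', 'YYYY-MM-DD HH24:MI:SS')) src")

    increment = (spark.read.format("jdbc")
                 .option("url", jdbc_url)
                 .option("dbtable", src_query)
                 .option("user", "fabric_reader")            # placeholder credentials
                 .option("password", "********")
                 .option("driver", "oracle.jdbc.OracleDriver")
                 .load())

    target = "bronze_" + row.source_table.split(".")[-1].lower()
    increment.write.mode("append").format("delta").saveAsTable(target)

    # Advance the watermark only if new rows arrived (timestamp formatting kept simple here)
    new_max = increment.agg(F.max(row.watermark_column)).first()[0]
    if new_max is not None:
        spark.sql(f"UPDATE ingestion_control SET last_value = '{new_max}' "
                  f"WHERE source_table = '{row.source_table}'")

A data pipeline can then run this notebook on a schedule, or a Lookup + ForEach pattern in the pipeline itself can drive Copy activities from the same control table.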

 

 

 

Best Regards

Yilong Zhou

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
