CeeVee33
Advocate I

SAP ECC Data Ingestion using Pipeline into Data Warehouse

Hi 

 

I'm a newbie with SAP. I'm trying to bring SAP ECC data from on-premises into a Fabric Data Warehouse using a pipeline. I have the gateway set up correctly, and it passes the connection test. I've tried both calculation view data and table data.

 

First I got an error suggesting incorrect syntax near ",". I assumed that was because I was having the pipeline create a new table.

 

Then I created the table manually and ran the pipeline. Now the error is "INSERT is not a supported statement type."

 

So does that mean I cannot use SAP as a source for ingestion? Do I need to use another connector, like ABAP?

 

Thanks

1 ACCEPTED SOLUTION
v-csrikanth
Community Support

Hi @CeeVee33

Here are the responses to your follow-up questions:

Q) With COPY INTO and MERGE, can I use a pipeline for it? If not, then what should be used?
A) Pipelines in Fabric do not natively support COPY INTO or MERGE directly.
     However, you can use Notebook (PySpark or T-SQL) activities within the pipeline to execute these statements.
     Another option is to use a stored procedure in the warehouse that performs the COPY INTO or MERGE, and call that stored procedure from the pipeline, as sketched below.
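For illustration, a minimal sketch of the stored-procedure route, assuming a pipeline Copy activity first lands the SAP extract in a staging table. All object and column names here (dbo.stg_SapOrders, dbo.SapOrders, OrderId, and so on) are hypothetical placeholders, not from this thread; MERGE is used because it is named above, and if it is not available in your warehouse, an UPDATE plus INSERT pair achieves the same upsert:

```sql
-- Hypothetical warehouse stored procedure (all names are placeholders).
-- A pipeline Stored Procedure activity can call this after a Copy
-- activity has landed the SAP extract in dbo.stg_SapOrders.
CREATE PROCEDURE dbo.usp_MergeSapOrders
AS
BEGIN
    -- Upsert staged rows into the target table by business key.
    MERGE dbo.SapOrders AS tgt
    USING dbo.stg_SapOrders AS src
        ON tgt.OrderId = src.OrderId
    WHEN MATCHED THEN
        UPDATE SET tgt.Amount       = src.Amount,
                   tgt.LastModified = src.LastModified
    WHEN NOT MATCHED THEN
        INSERT (OrderId, Amount, LastModified)
        VALUES (src.OrderId, src.Amount, src.LastModified);

    -- Empty the staging table so the next pipeline run starts clean.
    TRUNCATE TABLE dbo.stg_SapOrders;
END;
```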

Q) Is a Data Lake the same as a Lakehouse?
A) A data lake is a storage layer that holds raw and processed data.
     A Lakehouse is a structured approach on top of the data lake that integrates data warehousing features (such as tables and SQL queries) with the flexibility of a data lake.

Q) Can I use Dataflow Gen2? If so, how do I configure incremental load in Power Query?
A) Yes, you can use Dataflow Gen2 to ingest data into the Lakehouse or Warehouse.
     For incremental refresh in Power Query, follow these steps (the sketch after this list shows the kind of query the filter should fold to):
     Define a date/time filter: in Power Query, add a filter on a date column to select only new or modified rows.
     Enable incremental refresh:
      In the Dataflow settings, go to Incremental Refresh & Real-Time Data.
      Define the range (e.g., load data from the last X days).
      Choose Detect Data Changes (if a LastModified column exists).
      Publish and ensure your destination supports incremental loading.
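For reference, the date filter in that first step should fold down to a source-side query along these lines. This is a sketch only; dbo.SapOrders and LastModified are hypothetical placeholder names, and the 7-day window stands in for whatever range you configure:

```sql
-- Sketch of the query an incremental-refresh date filter effectively
-- folds to: only rows modified inside the refresh window are read.
SELECT *
FROM dbo.SapOrders
WHERE LastModified >= DATEADD(day, -7, CAST(GETDATE() AS date));
```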
Develop ABAP reports within SAP ECC to extract data into a staging area such as Azure Blob Storage or Azure Data Lake.
From there, use COPY INTO to efficiently load the data into Fabric Data Warehouse.
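A minimal sketch of that COPY INTO step, assuming CSV extracts staged in Blob Storage; the storage account, container, table name, and SAS token are placeholders, not values from this thread:

```sql
-- Hypothetical bulk load of staged extract files into the warehouse.
-- The URL, table name, and SAS secret are placeholders.
COPY INTO dbo.stg_SapOrders
FROM 'https://mystagingaccount.blob.core.windows.net/sap-extracts/orders/*.csv'
WITH (
    FILE_TYPE  = 'CSV',
    FIRSTROW   = 2,   -- skip the header row
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>')
);
```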


If the above information is helpful, please give us Kudos and mark the response as the accepted solution.

Best Regards,

Community Support Team _ C Srikanth.




5 REPLIES 5
That is great. I've tried the notebook route and it works.

 

I have also tried the Copy Assistant, provided it with base tables and calculation views to ingest data, and it worked.

 

I did not try ABAP reports. I don't have enough front-end access yet.

 

But thank you so much for the assistance. Rockstar!!

v-csrikanth
Community Support

Hi @CeeVee33 
Thank you for being part of the Microsoft Fabric Community.

  • Yes, SAP ECC can be used as a data source, but direct INSERT operations are not supported.
  • Error: incorrect syntax near ",". This typically occurs when the pipeline attempts to create a new table and the generated SQL syntax is incompatible with Fabric's table structure; creating the target table manually avoids it (see the sketch after this list).
  • Error: "INSERT is not a supported statement type". Fabric Data Warehouse does not support direct INSERT statements for loading data into Delta tables. Instead, it requires bulk-loading methods such as COPY INTO or MERGE.
  • If you require more control over data extraction, you can develop ABAP reports in SAP ECC to extract the required data into a staging area such as Azure Blob Storage or Azure Data Lake.
    Note for reference: Data Factory documentation in Microsoft Fabric
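To illustrate the manual-table route for that first error, here is a hedged sketch of a target table defined with types Fabric Warehouse accepts. The table and columns are hypothetical placeholders; the point is that auto-create can emit DDL with types or options the warehouse rejects, which surfaces as the "incorrect syntax" error:

```sql
-- Hypothetical target table, created manually before the pipeline runs.
-- Fabric Warehouse supports a subset of T-SQL types; datetime2 precision
-- tops out at 6, for example, so map source columns accordingly.
CREATE TABLE dbo.SapOrders
(
    OrderId      int            NOT NULL,
    Amount       decimal(18, 2) NULL,
    LastModified datetime2(6)   NULL
);
```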

If the above information is helpful, please give us Kudos and mark the response as the accepted solution.

Best Regards,

Community Support Team _ C Srikanth.

 

 

 

Hey @v-csrikanth - thank you very much for responding.

 

I must confess, I'm new to the Fabric and Azure world too. I've used Power BI in the past, but that's about it.

 

A few follow-up questions:

With COPY INTO and MERGE, can I use a pipeline for it? If not, then what should be used?

The Data Lake that you refer to, is it the same as a Lakehouse?

Can I use Dataflow Gen2? If so, I can't seem to figure out how to add a query for incremental load in Power Query.

 

Links to relevant sites would also be helpful, if this is too big a list of questions.

 

Thanks

CeeVee33
Advocate I

@amitchandak - I saw one of your posts saying you've recently been working with SAP. Any clue on this issue?
