Hi
I'm a newbie to SAP. I'm trying to bring SAP ECC data from on-premises into a Fabric Data Warehouse using a Pipeline. I have the gateway set up correctly and it is passing the connection test. I've tried both calculation view data and table data.
First I got an error suggesting incorrect syntax near ",". I assumed that was because I was letting the pipeline create a new table.
Then I created the table manually and ran the pipeline. Now the error is "INSERT is not a supported statement type."
So does that mean I cannot use SAP as a source for ingestion? Do I need to use another connector, like ABAP?
Thanks
Hi @CeeVee33
Here are the responses to your follow-up questions:
Q) With copy into and merge, can I use Pipeline for it? If not, then what should be used?
A) Pipelines in Fabric do not natively support COPY INTO or MERGE directly.
However, you can use Notebook (PySpark or T-SQL) activities within the Pipeline to execute these statements.
Another option is to use a stored procedure in the Warehouse that performs the COPY INTO or MERGE, and call that stored procedure from the Pipeline, as in the sketch below.
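As a rough sketch (not the only approach), a Warehouse stored procedure along these lines could be invoked from a Stored procedure activity in the Pipeline. The table names, columns, and storage path are hypothetical, the SAP extract is assumed to already be staged as Parquet files in Azure storage, and you should confirm that MERGE is available in your Fabric Warehouse before relying on it (otherwise the upsert can be expressed as an UPDATE followed by an INSERT).

-- Hypothetical example: bulk-load staged SAP data, then upsert it into the target table.
CREATE PROCEDURE dbo.usp_LoadSapMaterial
AS
BEGIN
    -- Load the staged Parquet files into a staging table (add a CREDENTIAL clause if the storage is not public).
    COPY INTO stg.SapMaterial
    FROM 'https://<storageaccount>.blob.core.windows.net/sap-staging/material/*.parquet'
    WITH (FILE_TYPE = 'PARQUET');

    -- Upsert the staging rows into the target table.
    MERGE dbo.SapMaterial AS tgt
    USING stg.SapMaterial AS src
        ON tgt.MaterialId = src.MaterialId
    WHEN MATCHED THEN
        UPDATE SET tgt.Description = src.Description,
                   tgt.LastModified = src.LastModified
    WHEN NOT MATCHED THEN
        INSERT (MaterialId, Description, LastModified)
        VALUES (src.MaterialId, src.Description, src.LastModified);
END;

In the Pipeline, a Copy activity would land the SAP extract in the staging location first, and the Stored procedure activity would then run dbo.usp_LoadSapMaterial.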
Q) Is Data Lake the same as Lakehouse?
A) A data lake is a storage layer that holds raw and processed data.
A Lakehouse is a structured layer on top of the data lake that integrates data warehousing features (like tables and SQL queries) with the flexibility of a data lake.
Q) Can I use Dataflow Gen2? If so, how do I configure incremental load in Power Query?
A) Yes, you can use Dataflow Gen2 to ingest data into the Lakehouse or Warehouse.
For incremental refresh in Power Query, follow these steps:
Define Date/Time Filter: In Power Query, add a filter on a date column to select only new or modified rows.
Enable Incremental Refresh:
In the Dataflow settings, go to Incremental Refresh & Real-Time Data.
Define the Range (e.g., load data from the last X days).
Choose Detect Data Changes (if a LastModified column exists).
Publish and ensure your destination supports incremental loading.
Alternatively, develop ABAP reports within SAP ECC to extract data into a staging area such as Azure Blob Storage or Azure Data Lake Storage.
From there, use COPY INTO to load the data efficiently into the Fabric Data Warehouse; an example is shown below.
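For illustration only, a COPY INTO statement of this shape would load a CSV extract from Blob Storage into a Warehouse table; the storage path, table name, and SAS token are placeholders to replace with your own, and the exact WITH options should be checked against the Fabric Warehouse COPY documentation.

-- Hypothetical example: load a CSV extract of SAP sales orders from Blob Storage.
COPY INTO dbo.SapSalesOrders
FROM 'https://<storageaccount>.blob.core.windows.net/sap-staging/vbak/*.csv'
WITH (
    FILE_TYPE = 'CSV',
    FIELDTERMINATOR = ',',
    FIRSTROW = 2,   -- skip the header row
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>')
);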
If the above information is helpful, please give us Kudos and mark the response as the accepted solution.
Best Regards,
Community Support Team _ C Srikanth.
That is great. I've tried the notebook route and it works.
I have also tried Copy Assistant and provided it base tables and calculation views to ingest data and it worked.
I did not try ABAP reports; I don't have enough front-end access yet.
But thank you so much for the assistance. Rockstar!!
Hi @CeeVee33
Thank you for being part of the Microsoft Fabric Community.
If the above information is helpful, please give us Kudos and mark the response as the accepted solution.
Best Regards,
Community Support Team _ C Srikanth.
Hey @v-csrikanth - thank you very much for responding.
I must confess, I'm new to the Fabric and Azure world too. I've used Power BI in the past, but that's about it.
A few follow-up questions:
With COPY INTO and MERGE, can I use a Pipeline for it? If not, then what should be used?
The Data Lake you refer to, is it the same as a Lakehouse?
Can I use Dataflow Gen2? If so, I can't seem to figure out how to add a query for incremental load in Power Query.
Any links to relevant documentation would also be helpful, if this is a big list of questions.
Thanks
@amitchandak - I saw one of your posts that says you've recently been working with SAP. Any clue on this issue?