
GonzaloB
Helper I

Fabric pipeline failing - Failure happened on 'Source' side. ErrorCode=UserErrorUnclassifiedError,'T

Hi everyone,

 

Data analyst here, experienced with Power BI Desktop but with no pipeline experience.

I'm going through the Fabric trial and testing it by connecting to our Postgres replica DB, hoping to query that database multiple times a day and offer users reports that refresh using a Direct Lake connection.

 

I managed to set up a few tables. However, when working on the most critical one, account invoices and sales, I get this error, which I think relates to the source data being modified while I was querying it.

 

Failure happened on 'Source' side. ErrorCode=UserErrorUnclassifiedError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Odbc Operation Failed.,Source=Microsoft.DataTransfer.ClientLibrary.Odbc.OdbcConnector,''Type=System.Data.Odbc.OdbcException,Message=ERROR [40001] [Microsoft][ODBC PostgreSQL Wire Protocol driver][PostgreSQL]ERROR: VERROR; canceling statement due to conflict with recovery(Detail User query might have needed to see row versions that must be removed.; File postgres.c; Line 3143; Routine ProcessInterrupts; ),Source=mspsql27.dll,' 

 

Can anyone help me understand how to fix this and get the invoice data into the lakehouse?
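For context, the SQLSTATE in the message (40001) is PostgreSQL's query-conflict error on a hot standby: the replica cancelled the read because replaying WAL from the primary needed to remove row versions the query was still using. If whoever administers the replica can change its configuration, the usual mitigations look roughly like this (a sketch only; the values are illustrative, not recommendations):

```
# postgresql.conf on the replica
# Give standby queries up to 5 minutes before WAL replay cancels them
max_standby_streaming_delay = 300s

# Or: ask the primary not to vacuum away row versions the standby still needs
# (trade-off: can cause table bloat on the primary if standby queries run long)
hot_standby_feedback = on
```

Both settings trade replica/primary freshness for query survivability, so they need sign-off from whoever owns the replication setup.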

 

On a different note, how can I modify the selected columns to import after the pipeline has been created? Is there a way to modify those columns and data types, or do I need to recreate the pipeline altogether?

 

Thanks!

 

EDIT: Just to clarify in case there is a better way to do this. 

I'm trying to query a live Postgres DB. I was hoping to set up the pipeline, have the data update automatically, and curate a dataset in the service for power users (which for now will also be me).

We have PPU licenses and a Fabric trial.

I thought the process would be: pipeline from Postgres to lakehouse, then a dataflow from lakehouse to lakehouse for the ETL, and publish the dataset.

 

Should I be doing something differently?

 

Thanks for any assistance.

1 REPLY
Anonymous

Yes, this is related to PostgreSQL; see the Stack Overflow thread "PostgreSQL ERROR: canceling statement due to conflict with recovery".

You should ask in a PostgreSQL forum to find an answer.
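Since the conflict is transient (it clears once the standby finishes applying the conflicting WAL), a retried read usually succeeds. If you stage the extract yourself, for example from a Fabric notebook, rather than relying on the pipeline copy activity, a minimal retry wrapper could look like this. `RecoveryConflictError` is a hypothetical stand-in for whatever your driver raises for SQLSTATE 40001 (psycopg2, for instance, maps that state to `psycopg2.errors.SerializationFailure`):

```python
import time


class RecoveryConflictError(Exception):
    """Stand-in for the driver's SQLSTATE 40001 error (hypothetical name)."""


def run_with_retry(query_fn, retries=3, backoff_s=1.0):
    """Call query_fn(), retrying when the standby cancels it mid-read."""
    for attempt in range(retries):
        try:
            return query_fn()
        except RecoveryConflictError:
            if attempt == retries - 1:
                raise  # retries exhausted; surface the conflict
            time.sleep(backoff_s * (attempt + 1))  # linear backoff before retrying


# Usage sketch: wrap the actual read, whatever driver it uses, e.g.
# rows = run_with_retry(lambda: fetch_invoices(conn))
```

This only papers over the symptom; the replica-side settings (or reading from the primary) are the durable fix.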
