Microsoft Fabric Community Conference 2025, March 31 - April 2, Las Vegas, Nevada. Use code FABINSIDER for a $400 discount.
Hi everyone,
Data analyst here, experienced with Power BI Desktop but not with pipelines.
I'm going through the Fabric trial and testing it by connecting to our Postgres replica DB. I'm hoping to query that database multiple times a day and provide users with reports that refresh using a Direct Lake connection.
I managed to set up a few tables. However, when working on the most critical one (account invoices and sales), I get the error below, which I think relates to the source data being modified while I was querying it.
Failure happened on 'Source' side. ErrorCode=UserErrorUnclassifiedError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Odbc Operation Failed.,Source=Microsoft.DataTransfer.ClientLibrary.Odbc.OdbcConnector,''Type=System.Data.Odbc.OdbcException,Message=ERROR [40001] [Microsoft][ODBC PostgreSQL Wire Protocol driver][PostgreSQL]ERROR: VERROR; canceling statement due to conflict with recovery(Detail User query might have needed to see row versions that must be removed.; File postgres.c; Line 3143; Routine ProcessInterrupts; ),Source=mspsql27.dll,'
Can anyone help me understand how to fix this and get the invoice data into the lakehouse?
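For context on what usually causes this: the replica cancels long-running queries when they would block replay of changes streaming in from the primary. If you (or your DBA) control the replica's configuration, the two standard knobs are sketched below; this is generic PostgreSQL standby tuning, not Fabric-specific advice, and both settings have trade-offs (replication lag on the standby, or table bloat on the primary).

```ini
# postgresql.conf on the read replica (standby) - a sketch, assuming you
# can change the replica's configuration.

# Option 1: allow queries to delay WAL replay for up to 15 minutes
# before being cancelled (increases replication lag during long reads).
max_standby_streaming_delay = 900s

# Option 2: have the standby report which row versions its queries still
# need, so vacuum on the primary keeps them (can cause bloat on the primary).
hot_standby_feedback = on
```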
On a different note, how can I modify the selected columns to import after the pipeline has been created? Is there a way to change those columns and data types, or do I need to recreate the pipeline altogether?
Thanks!
EDIT: Just to clarify in case there is a better way to do this.
I'm trying to query a live Postgres DB. I was hoping to set up the pipeline, have the data update automatically, and curate a dataset in the service for power users (which for now will also be me).
We have PPU licenses and the Fabric trial.
I thought the process would be: pipeline from Postgres to lakehouse, dataflow from lakehouse to lakehouse for ETL, then publish the dataset.
Should I be doing something differently ?
Thanks for any assistance.
Yes, this is a PostgreSQL issue: see "PostgreSQL ERROR: canceling statement due to conflict with recovery" on Stack Overflow.
A PostgreSQL forum may be a better place to find an answer.
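Since the error carries SQLSTATE 40001 and is transient (the query only fails when it happens to collide with WAL replay on the replica), one client-side mitigation is simply to retry. The Fabric copy activity has its own retry setting that serves the same purpose; the sketch below is a generic Python illustration of the pattern, with a stand-in exception class since the real driver exception type (and its `sqlstate` attribute name) depends on which driver you use.

```python
import time

# SQLSTATEs worth retrying; 40001 is what the replica raises for
# "canceling statement due to conflict with recovery".
RETRYABLE_SQLSTATES = {"40001"}


class QueryConflict(Exception):
    """Stand-in for a driver exception; real drivers expose the SQLSTATE
    under names like .sqlstate or .pgcode."""

    def __init__(self, sqlstate):
        super().__init__(f"SQLSTATE {sqlstate}")
        self.sqlstate = sqlstate


def run_with_retry(query_fn, retries=3, backoff_s=1.0):
    """Call query_fn(); on a retryable conflict, wait with exponential
    backoff and try again, up to `retries` extra attempts."""
    for attempt in range(retries + 1):
        try:
            return query_fn()
        except QueryConflict as exc:
            if exc.sqlstate not in RETRYABLE_SQLSTATES or attempt == retries:
                raise
            time.sleep(backoff_s * (2 ** attempt))
```

A re-run often succeeds because the conflicting vacuum/replay activity has moved on by the next attempt; retries plus shorter queries (e.g. incremental loads instead of full-table copies) usually make the error rare in practice.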