Hi,
I'm just starting to play with the copy activity in data pipelines to move data from on-premises SQL servers.
If I configure the destination to be a CSV file in the Files section of the lakehouse, the copy is very quick.
However, if I set it to write to a table, or to a Parquet file in the Files section, it just hangs in progress. The details suggest it is reading the data fine but failing to write the actual files. The screenshot below was still hanging after 16 minutes, while the CSV version completed in seconds.
Gen2 dataflows are now working fine, so I'm assuming our firewall is OK, unless data pipelines need something new opened.
Thanks in advance,
Ben
I am now regularly ingesting 20+ tables from on-prem SQL to the lakehouse using a copy activity, staging the output as Parquet files in the lakehouse Files area and then using a notebook to write to the target with a Delta merge for each table. This has been working smoothly so far.
On the other hand, a Gen2 dataflow is slower than a notebook, so it is the less performant option.
If you are trying to copy directly to the table, I would strongly advise against it: best practice dictates bringing the daily ingestion into a staging layer (Files, in this case) first and writing to the destination (Delta tables) afterwards, as in the sketch below.
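A minimal sketch of that staging-plus-merge pattern, assuming a Fabric PySpark notebook attached to the lakehouse; the table name, staging path, and "Id" key column below are hypothetical placeholders, not details from the original posts:

```python
# Minimal sketch, assuming a Fabric PySpark notebook with a default
# lakehouse attached, where `spark` is the notebook's built-in session.
# Table name, staging path, and the 'Id' key column are hypothetical.
from delta.tables import DeltaTable

table_name = "dbo_customers"                   # hypothetical target Delta table
staging_path = f"Files/staging/{table_name}"   # where the copy activity lands Parquet

# Read the Parquet files staged by the copy activity.
source_df = spark.read.parquet(staging_path)

if spark.catalog.tableExists(table_name):
    # Upsert into the existing Delta table on an assumed key column.
    target = DeltaTable.forName(spark, table_name)
    (target.alias("t")
        .merge(source_df.alias("s"), "t.Id = s.Id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())
else:
    # First load: create the Delta table from the staged data.
    source_df.write.format("delta").saveAsTable(table_name)
```

Running one merge like this per staged table keeps the pipeline's copy activity doing only the fast file write, with the Delta work handled by Spark.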
Hi @bcdobbs
Apologies for the inconvenience.
If you have opened a support ticket, a reference to the ticket number would be greatly appreciated. This will allow us to track the progress of your request and ensure you receive the most efficient support possible.
Thank you.
Playing more with writing to the Files area: if I change the setting on the Parquet write to not use V-Order, it starts working.
I'm guessing there is some interaction between the compute and the on-prem gateway that's not working. It doesn't help with writing to the Tables area, though.
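For anyone hitting the same behavior from a Spark notebook rather than the copy activity UI, a hedged sketch of the session-level equivalent. The spark.sql.parquet.vorder.enabled config name is the one documented for Fabric Spark runtimes (verify against your runtime version), and the paths are hypothetical:

```python
# Sketch only: disabling V-Order at the Spark session level in a Fabric
# notebook, analogous to unticking V-Order in the copy activity's Parquet
# destination settings. Verify the config name for your runtime version.
spark.conf.set("spark.sql.parquet.vorder.enabled", "false")

# Subsequent writes produce plain Parquet without the V-Order optimization,
# trading read-side performance for faster writes.
df = spark.read.parquet("Files/staging/example")            # hypothetical path
df.write.mode("overwrite").parquet("Files/out/example_no_vorder")
```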
Hi @bcdobbs
Thanks for using Microsoft Fabric Community.
I tried to repro the scenario and didn't face any delay in copying the data using the pipeline copy activity, for each of the following destinations:
Destination data type: CSV file
Destination data type: Table
Destination data type: Parquet file
Please do let us know if you have any further queries. We will try to help.
Thank you.
Hi @bcdobbs
Unless you have a very specific firewall configuration that strictly differentiates between Gen2 Dataflows and data pipelines, you likely don't need to make any changes to your existing firewall rules.
For additional information, please refer to: Adjust communication settings for the on-premises data gateway | Microsoft Learn
There might be an intermittent issue; could you please follow the steps below, which might help:
Temporary Glitch: Clearing cookies and caches can sometimes resolve temporary glitches within the application that might be causing the issue.
Corrupted Data: In rare cases, corrupted data stored in the browser's cache related to Microsoft Fabric might be causing the issue. Clearing the cache removes this potentially problematic data.
Hard Refresh: A hard refresh bypasses the cached version of the webpage and forces the browser to download the latest version from the server. Press Ctrl+Shift+R (Windows) or Cmd+Shift+R (Mac).
Try Microsoft Edge: If you are currently using Chrome or Firefox, try switching to Microsoft Edge to see if the issue persists. Edge is the native browser for Microsoft products and might have better compatibility with Microsoft Fabric.
If the issue still persists, please do let us know. Glad to help.
Thank you.
Hi @bcdobbs
We haven't heard from you since the last response and were just checking back to see if you have a resolution yet. If you do have a resolution, please share it with the community, as it can be helpful to others.
Otherwise, we will respond with more details and try to help.
Thank you.