Hi
I need to open this again and I hope to get answers...
In my Dataflow Gen2 I want to copy 8 tables from an external source into a Fabric Warehouse. The flow outputs all columns as text, and the target tables contain only text columns. The yellow-marked tables have approx. 80,000 records; the green ones fewer than 1,000.
After 8 hours, the Dataflow fails and suggests reducing the complexity.
If 80,000 records are too complex for a Dataflow Gen2, is there an alternative option?
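For example, would a plain notebook copy be a viable route? A minimal PySpark sketch of what I have in mind, assuming the source is reachable over JDBC (`spark` is the session a Fabric notebook provides; the URL, credentials, and table names below are just placeholders):

```python
# Sketch: copy several source tables into Lakehouse Delta tables from a
# notebook instead of Dataflow Gen2. All connection details are placeholders.
jdbc_url = "jdbc:sqlserver://my-source-server:1433;databaseName=SourceDb"
tables = ["dbo.Customers", "dbo.Orders"]  # hypothetical source tables

for table in tables:
    df = (
        spark.read.format("jdbc")
        .option("url", jdbc_url)
        .option("dbtable", table)
        .option("user", "my_user")          # placeholder credential
        .option("password", "my_password")  # placeholder credential
        .load()
    )
    # Land each table in the Lakehouse as a Delta table, overwritten on rerun
    target_name = table.split(".")[-1].lower()
    df.write.mode("overwrite").format("delta").saveAsTable(target_name)
```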
Thanks
Christian
Hi, thanks for your feedback @miguel - I created a support ticket but have been waiting for 2 days now and have not received a reply...
@v-cboorla-msft
I ran the huge flow and reduced the number of records to 11 million - that worked. Therefore I doubt that this issue is related to firewalls or a proxy.
Please share your case number with us here and ping us if you don't hear from a support engineer by the end of the week.
Update from my side:
It is possible to load the data into the Lakehouse. At least for these small to medium files it worked.
I am still running another flow that is supposed to write 50 million records. So far, that one has not succeeded in loading to the Lakehouse...
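If the single 50-million-row pull is the problem, the next thing I may try is a partitioned JDBC read from a notebook, so the transfer runs in parallel slices instead of one long stream; a sketch, assuming the table has a numeric key column (the column name, bounds, and connection details are placeholders):

```python
# Sketch: partitioned JDBC read for a large table, so the 50M rows are
# fetched in parallel slices rather than one long-running stream.
df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)              # same placeholder URL as above
    .option("dbtable", "dbo.BigTable")    # hypothetical large table
    .option("partitionColumn", "id")      # assumed numeric key column
    .option("lowerBound", "1")
    .option("upperBound", "50000000")
    .option("numPartitions", "16")        # 16 parallel slices
    .option("user", "my_user")            # placeholder credential
    .option("password", "my_password")    # placeholder credential
    .load()
)
df.write.mode("overwrite").format("delta").saveAsTable("bigtable")
```

With `numPartitions` set, Spark issues one query per slice of the id range, which tends to be more resilient for very large tables than a single connection.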
Hi @webchris
Welcome to Fabric Community and thanks for posting your question here.
As I understand it, you are trying to load data from an external source into a Lakehouse using Dataflow Gen2. The documentation below might help you.
Please refer to this link: Link1
Hope this helps. Please let me know in case of any queries.
Hi Christian,
Please reach out to our support team so they can investigate more thoroughly why some of these refreshes fail.