Hi
I need to open this again and I hope to get answers...
In my Dataflow Gen2 I want to copy 8 tables from an external source into a Fabric Warehouse. The flow outputs all columns as text, and the target tables contain only text columns. The tables marked yellow (in my screenshot) have approx. 80,000 records; the green ones have fewer than 1,000.
After 8 hours, the Dataflow fails and suggests reducing the complexity.
If 80,000 records are too complex for a Dataflow Gen2, is there an alternative option (e.g. a notebook, as in the sketch below)?
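(A minimal sketch of what I mean by a notebook alternative; it assumes a SQL Server source, and the connection details, table names, and column handling are placeholders, not my real setup.)

```python
# Minimal sketch of a Fabric notebook load, assuming a SQL Server source.
# All connection details and table names below are placeholders.
from pyspark.sql.functions import col

# `spark` is the session predefined in a Fabric notebook.
jdbc_url = "jdbc:sqlserver://<host>:1433;databaseName=<db>"  # placeholder
props = {
    "user": "<user>",          # placeholder credentials
    "password": "<password>",
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

tables = ["dbo.Table1", "dbo.Table2"]  # hypothetical names for the 8 tables

for t in tables:
    df = spark.read.jdbc(url=jdbc_url, table=t, properties=props)
    # Cast every column to string to match the all-text target tables.
    df = df.select([col(c).cast("string").alias(c) for c in df.columns])
    # Write to the Lakehouse as a Delta table; a pipeline Copy activity
    # (or a cross-database query) can then move it into the Warehouse.
    df.write.mode("overwrite").saveAsTable(t.split(".")[-1])
```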
Thanks
Christian
Hi, thanks for your feedback @miguel. I created a support ticket two days ago but have not received a reply yet...
@v-cboorla-msft
I ran the huge flow with the number of records reduced to 11 million; that worked. Therefore I doubt that this issue is related to firewalls or a proxy!?
Please share your case number with us here and ping us if you don't hear from a support engineer by the end of the week.
Update from my side:
It is possible to load the data into the Lakehouse; at least for these small to medium tables it worked.
I am still running another flow that is supposed to write 50 million records. So far, that one has not succeeded in loading to the Lakehouse...
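In case it helps anyone reading along, this is roughly the partitioned-read pattern I am trying for the big table (a sketch only; the table name, the numeric id column, and the bounds are assumptions, not my actual schema):

```python
# Sketch of a partitioned JDBC read for the ~50 million row table,
# so Spark opens several parallel connections instead of one.
# Table name, id column, and bounds are placeholders/assumptions.
df = spark.read.jdbc(
    url=jdbc_url,              # same placeholder URL as in the earlier sketch
    table="dbo.BigTable",      # hypothetical large source table
    column="id",               # numeric column to partition on (assumption)
    lowerBound=1,
    upperBound=50_000_000,
    numPartitions=16,          # 16 parallel reads from the source
    properties=props,
)
df.write.mode("overwrite").saveAsTable("BigTable")
```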
Hi @webchris
Welcome to Fabric Community and thanks for posting your question here.
As I understand it, you are trying to load data from an external source into a Lakehouse using Dataflow Gen2. The documentation below might help.
Please refer to this link: Link1
Hope this helps. Please let me know in case of any queries.
Hi Christian,
Please reach out to our support team so they can do a more thorough investigation into why some of these refreshes fail.