Hello everyone,
I'm running some tests with Microsoft Fabric to retrieve data from Oracle and load it into a Lakehouse. I purchased an F2 capacity for testing purposes.
Essentially, I created a Dataflow Gen2 to fetch a sales table from Oracle. The table has 200 million records and is over 120 GB in size.
In my pipeline, when I set a filter to retrieve only 1 day of data from the table, the pipeline works fine. However, when I increase the filter to retrieve 1 month of data, it throws an error.
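For context, the dataflow query is roughly along these lines (the server, schema, table, and column names below are placeholders, not the real source); the date filter is the only thing I change between the 1-day and 1-month tests:

```
let
    // Placeholder connection string and object names, for illustration only.
    Source = Oracle.Database("myoracle:1521/ORCLPDB", [HierarchicalNavigation = true]),
    SalesTable = Source{[Schema = "SALES", Item = "SALES_FACT"]}[Data],
    // Filter on the date column so the predicate can fold back to Oracle
    // as a WHERE clause instead of pulling all rows through the gateway.
    Filtered = Table.SelectRows(
        SalesTable,
        each [SALE_DATE] >= #datetime(2024, 3, 1, 0, 0, 0)
            and [SALE_DATE] < #datetime(2024, 4, 1, 0, 0, 0)
    )
in
    Filtered
```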
The errors I receive are:
- There was a problem refreshing the dataflow. Please review the error message(s) below, fix the problem, and try again. (Request ID: d4f7561f-b5ad-4fe1-a3ac-4ed534a59292).
- Error Code: Challenge Error, Error Details: Data source credentials are missing or invalid. Please update the connection credentials in settings, and try again. (Request ID: d4f7561f-b5ad-4fe1-a3ac-4ed534a59292).
The gateway I'm using is the On-premises data gateway, version 3000.178.9 (June 2023).
F2 is likely far too small for that volume of data. Back in the pre-Fabric days (i.e., the present), you would have needed at least a P4 SKU.
It might be that your Oracle source has its own timeout enforcement.
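If that's the case, one thing worth trying (just a sketch, and it assumes the default command timeout is what's being hit) is raising CommandTimeout on the Oracle connector in the dataflow:

```
let
    // CommandTimeout is a standard option on the Oracle connector; the 2-hour
    // value and the server/schema/table names here are only illustrative.
    Source = Oracle.Database(
        "myoracle:1521/ORCLPDB",
        [HierarchicalNavigation = true, CommandTimeout = #duration(0, 2, 0, 0)]
    ),
    SalesTable = Source{[Schema = "SALES", Item = "SALES_FACT"]}[Data]
in
    SalesTable
```

That said, the error you're seeing is reported as a credentials problem, so this will only help if a source-side timeout is being surfaced as a misleading challenge error.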
I upgraded to an F16 capacity and tested a 6 GB table with 17 million rows, but I received the same error.
Note: I updated the gateway to the latest version (3000.182.4).
In the legacy (Premium) days, that size would have required at least a P1 (the equivalent of an F64).