Hello,
I used Dataflow Gen2 to ingest data from an on-premises SQL Server into our existing Data Lake environment, and the import completed without issues.
Subsequently, I attempted to configure incremental refresh on the query. However, I encountered the following system message:
Incremental Refresh
⚠️ Incremental refresh cannot be configured when:
• The query uses a default destination
• The query targets an unsupported data destination
Please update the data destination to one of the supported options: Fabric Warehouse, Fabric SQL, or Azure SQL, before enabling incremental refresh.
Given this, I would like to confirm whether Lakehouse supports incremental refresh functionality or if an alternative destination is required.
Lakehouse doesn’t support incremental refresh directly. To enable it, redirect your query to Fabric Warehouse, Fabric SQL, or Azure SQL—those destinations are compatible.
Lakehouse (default Dataflow destination) does not support incremental refresh. To enable it, you must change the destination to a supported option: Fabric Warehouse, Fabric SQL, or Azure SQL. Once switched, incremental refresh can be configured.
I have implemented a solution to ingest bulk + CDC (incremental) data from on-premises SQL Server into OneLake in Delta Parquet format, where the delta tables are automatically registered in the Lakehouse. You can use the delta tables in the Lakehouse just as you would any other table, using SQL, Spark, etc. If you need guidance, you may reach out to me at info@702analytics.com
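As an illustration, once the delta tables are registered in a Lakehouse, you can query them from a Fabric notebook with Spark. The sketch below assumes the notebook is attached to the Lakehouse; the Lakehouse and table names are hypothetical placeholders:

```python
# Minimal sketch for a Fabric notebook attached to the Lakehouse.
# Lakehouse/table names are placeholders; adjust them to your environment.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # already provided in Fabric notebooks

# Read a registered delta table as a DataFrame
df = spark.read.table("my_lakehouse.fact_case_detail")
df.printSchema()

# Or query it with Spark SQL like any other table
spark.sql("""
    SELECT COUNT(*) AS row_count
    FROM my_lakehouse.fact_case_detail
""").show()
```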
Hi @Alaahady,
Incremental refresh isn’t supported when the Dataflow Gen2 output goes straight to a Lakehouse. That’s why you see the “default destination” message. Right now, the supported destinations for incremental refresh are:
• Fabric Warehouse
• Fabric SQL
• Azure SQL
So if you need incremental refresh, you’ll need to land the data in one of those targets instead of directly in the Lakehouse.
Workarounds:
If this helps, consider marking the post as the solution.
Best regards!
Hi @Mauro89,
Thank you for the clarification—it was very helpful and saved me valuable time.
I attempted to create another Dataflow Gen2 with the Warehouse as the destination, and the initial connection was successful. However, after publishing, I encountered the following error:
Error: There was a problem refreshing the dataflow: "Something went wrong, please try again later. If the error persists, please contact support."
Error Code: ActionUserFailure
Request ID: e0245b18-3f27-4168-873a-a0282024b0fa
Additionally, the fact_case_detail_WriteToDataDestination step failed with the following message:
Error: Data source credentials are missing or invalid. Please update the connection credentials in settings, and try again.
Error Code: Challenge Error
Request ID: e0245b18-3f27-4168-873a-a0282024b0fa
Hi @Alaahady,
that's good to hear!
The new error messages may be a bit tricky.
For the first, maybe give it a bit of time and refresh your browser or clear the cache. This message sometimes disappears as quickly as it arrived 😅
For the second, go and double-check your credentials, ideally through another interface, to make sure the connection works in general.
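If it helps, one quick way to test the source credentials outside the Dataflow is a small script run from a machine that can reach the server (for example the gateway host). This is just a sketch; the driver, server, database, and login values are placeholders you would replace:

```python
# Quick connectivity/credential check against the on-premises SQL Server source.
# All connection values below are placeholders; run from a host that can reach the server.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=your-sql-server;"
    "DATABASE=your_database;"
    "UID=your_user;"
    "PWD=your_password;"
    "TrustServerCertificate=yes;"
)

try:
    with pyodbc.connect(conn_str, timeout=10) as conn:
        row = conn.cursor().execute("SELECT 1").fetchone()
        print("Connection OK:", row[0] == 1)
except pyodbc.Error as exc:
    print("Connection failed:", exc)
```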
Best regards!