We really want to test Fabric, but we need to bring in on-prem data.
Key items:
- The current on-premises data gateway is installed
- Our servers run SQL Server 2019
- Dataflows Gen 2 will load the data, but will not write to a warehouse or lakehouse; it fails with this error: WriteToDatabaseTableFrom_TransformForOutputToDatabaseTableFrom_tablename
- In addition to the WriteToDatabase error, I've also seen refreshes time out at the one-hour limit: Solved: Datasource crednetials missing or Invalid AFTER da... - Microsoft Fabric Community
So, here are my questions:
- Will it not write to a warehouse or lakehouse because the source is SQL Server 2019? Do we need SQL Server 2022?
- I've seen a couple of videos where this works, so I should be able to do it too, right?
- Is pipeline functionality simply not ready yet (I think it's on the roadmap for Q1 2024)?
Can anyone point me in the right direction? My gut says we need SQL Server 2022, but I honestly don't know.
Any help will be appreciated.
Hi @jcampbell474 ,
Thanks for using Fabric Community.
To answer your questions:
1) It is not the case that Dataflows Gen 2 cannot write to a warehouse or lakehouse because the source is SQL Server 2019. Dataflows Gen 2 can work with SQL Server 2019, SQL Server 2022, and Azure Synapse Analytics.
2) Yes, you should be able to write to a warehouse or lakehouse from Dataflows Gen 2, regardless of whether you are using SQL Server 2019 or SQL Server 2022.
3) You are correct that pipeline functionality is not yet ready in Dataflows Gen 2. Currently we do not have an ETA; stay tuned for more updates.
As for the error you are experiencing, this might require a deeper investigation from our engineering team into your workspace. I would request that you go ahead and raise a support ticket with Microsoft at this link: https://support.fabric.microsoft.com/en-US/support/.
Also, once you have opened the support ticket, please share the support case # here.
Please let us know if you have any further queries.
Thanks
Hi @jcampbell474 ,
We haven't heard from you since the last response and wanted to check whether your query has been resolved. Please let us know if you have any further queries.
Hi @jcampbell474 ,
Following up to check whether you have had a chance to create a support ticket for the issue you have been facing. If so, please share the ticket details here, as it will help us track the request.
Thank you.
Thank you for the follow-up. I have not utilized support; I want to exhaust every other option first.
We reviewed info here: Adjust communication settings for the on-premises data gateway | Microsoft Learn
Most of the "solutions" are to add *.datawarehouse.pbidedicated.windows.net to the Windows firewall, or in our case the Fortinet. We can do that via a custom URL category on the Palo Altos, but we are otherwise very limited.
The hostname has a randomly generated "host" section (<random hostname>.datawarehouse.pbidedicated.windows.net). It feels like there is no limit to the number of these that can exist. Is there a limit, and is one created for each workspace? One thought is that we could build a list of the FQDNs that are needed, as long as the IPs don't change often. We could possibly create some sort of automated import of new ones... hoping MS has a better solution; just brainstorming other things we could do. There is a published list of IPs (subnets, actually), but it's a lot.
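To explore the "build a list of FQDNs" idea, a small script along these lines could resolve each known endpoint to its current IPv4 addresses, so a firewall allow-list could be regenerated on a schedule and compared against the previous run. This is only a sketch of the brainstormed approach, not a Microsoft-recommended solution; the endpoint name shown is a hypothetical placeholder, and the real randomly generated prefixes would have to come from your own gateway traffic or error details.

```python
import socket

def resolve_fqdns(hostnames):
    """Resolve each FQDN to its current set of IPv4 addresses.

    Returns a dict mapping hostname -> sorted list of IP strings.
    Hostnames that fail to resolve map to an empty list, so a scheduled
    run can flag endpoints that have disappeared or changed.
    """
    results = {}
    for host in hostnames:
        try:
            infos = socket.getaddrinfo(
                host, 1433,  # 1433 assumed: the SQL endpoint port
                family=socket.AF_INET, type=socket.SOCK_STREAM,
            )
            results[host] = sorted({info[4][0] for info in infos})
        except socket.gaierror:
            results[host] = []  # not resolvable right now
    return results

if __name__ == "__main__":
    # Placeholder name -- substitute the real randomly generated
    # prefixes observed in your gateway logs or refresh errors.
    endpoints = ["abc123.datawarehouse.pbidedicated.windows.net"]
    for host, ips in resolve_fqdns(endpoints).items():
        print(host, "->", ", ".join(ips) if ips else "(unresolvable)")
```

Diffing the output between runs would show how often the underlying IPs actually change, which is the open question behind the "automated import" idea.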
Hi @jcampbell474 ,
Thanks for the update. Regarding your question about a limit on the number of hosts, I have reached out to the internal team for help. I will update you once I hear back from them.
Appreciate your patience.
Thank you for the reply and clarification.
Unfortunately, I have opened many Fabric-related tickets and did not get the expected results, despite providing examples, traces, etc.
Your clarification, as well as additional clarification on Reddit, helped me determine that it may be something entirely different from my initial suspicion.
Error: WriteToDatabaseTableFrom_TransformForOutputToDatabaseTableFrom_tablename
Engine: -
Errors: Mashup Exception Data Source Error: Couldn't refresh the entity because of an issue with the mashup document.
MashupException.Error: Microsoft SQL: A connection was successfully established with the server, but then an error occurred during the pre-login handshake. (provider: TCP Provider, error: 0 - The semaphore timeout period has expired.)
Details:
- Reason = DataSource.Error
- DataSourceKind = Lakehouse
- DataSourcePath = Lakehouse
- Message = A connection was successfully established with the server, but then an error occurred during the pre-login handshake. (provider: TCP Provider, error: 0 - The semaphore timeout period has expired.)
- ErrorCode = -2146232060
- Number = 121
- Class = 20
GatewayObjectId: Obfuscated
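A "semaphore timeout" during the pre-login handshake typically means TCP packets to the endpoint are being silently dropped (as a firewall DENY rule would do) rather than actively refused. Assuming the gateway host can run Python, a bare TCP probe like the sketch below can help distinguish the two cases; the hostname is a hypothetical placeholder and port 1433 is assumed as the SQL endpoint port.

```python
import socket

def probe(host, port, timeout=5.0):
    """Attempt a bare TCP connect and classify the outcome.

    'open'    -> handshake completed (the network path is fine)
    'timeout' -> packets silently dropped, typical of a firewall drop
    'refused' -> host reachable, but nothing listening on that port
    'no-dns'  -> the name did not resolve at all
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "open"
    except socket.timeout:
        return "timeout"
    except ConnectionRefusedError:
        return "refused"
    except socket.gaierror:
        return "no-dns"

if __name__ == "__main__":
    # Hypothetical endpoint -- substitute the actual FQDN surfaced
    # in your refresh error or gateway logs.
    print(probe("abc123.datawarehouse.pbidedicated.windows.net", 1433))
```

If the probe reports "timeout" from the gateway host but "open" from a machine outside the firewall, that would point at the Palo Alto/Fortinet rules rather than at SQL Server 2019 or Dataflows Gen 2 itself.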