Hi all, dumb question - in a Dataflow Gen2, if I don't choose a destination, where exactly does the data get written? The job runs as long as I would expect it to if it were actually writing data, but I'm not sure where the data actually goes.
As a follow-up question - if it's like a Gen1 dataflow and essentially writes to ADLS Gen2, is there a way for me to then reference this data from another dataflow (without having landed it in a Lakehouse or Warehouse)?
Thanks!
Scott
The data is staged in a Lakehouse - the staging Lakehouse is currently visible in the workspace, but access will be limited in the future, as it is an implementation detail.
You can reference the staged data by using the Power Platform Dataflows connector - this works the same way as it did in Gen1.
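For example, the query a new dataflow generates with that connector looks roughly like this - the GUIDs and entity name below are placeholders, and the exact navigation steps the connector writes for you may differ slightly in your tenant:

let
    // Power Platform Dataflows connector - runs in the service, cloud to cloud
    Source = PowerPlatform.Dataflows(null),
    // Navigate to the workspace that contains the Gen2 dataflow (placeholder GUID)
    Workspaces = Source{[Id = "Workspaces"]}[Data],
    Workspace = Workspaces{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
    // Navigate to the dataflow itself (placeholder GUID)
    Dataflow = Workspace{[dataflowId = "11111111-1111-1111-1111-111111111111"]}[Data],
    // Pick the staged entity/table by name (placeholder name)
    StagedEntity = Dataflow{[entity = "MyStagedTable", version = ""]}[Data]
in
    StagedEntity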
Awesome - thank you! One issue I'm currently fighting is that our on-prem gateways don't allow communication with the Fabric service on port 1433, so anything that writes from the on-prem gateway directly to a lakehouse or warehouse fails. If the data lands in staging, I might be able to "temporarily" write a job that finishes the move to the final destination - without having to involve the gateway server at all.
I understand this will break as soon as the staging lakehouse/warehouse is hidden, but I'm really just trying to find a way to land data until the firewall changes go in to allow port 1433.
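In case it helps anyone else, the interim job I have in mind is just a cloud-only dataflow query that reads the staging Lakehouse directly while it's still visible. A rough sketch only - the GUIDs and table name are placeholders, and the navigation steps are just what the Lakehouse connector typically generates, so yours may differ:

let
    // Fabric Lakehouse connector - runs in the service, so the on-prem gateway is not involved
    Source = Lakehouse.Contents(null),
    // Workspace and staging lakehouse are identified by ID (placeholder GUIDs)
    Workspace = Source{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
    StagingLakehouse = Workspace{[lakehouseId = "11111111-1111-1111-1111-111111111111"]}[Data],
    // Pick up the staged table (placeholder name); a Lakehouse/Warehouse data destination
    // configured on this query then handles the write to the final location
    StagedTable = StagingLakehouse{[Id = "MyStagedTable", ItemKind = "Table"]}[Data]
in
    StagedTable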
Thanks very much for the info!
Scott