My dataflows are erroring when I point them at a table in a schema in my lakehouse, and I have no idea why.
The best error description I can get is "Job instance failed without detail error" - super helpful.
Here's what I've done: I'm looking to enrich my data with some data from the Office for National Statistics (ONS). So I've pulled some data from the API via a notebook, collecting the JSON. Then I'm using a dataflow to flatten out the JSON, because that makes it easy to drill into the results. If I link to my lakehouse and let it default to the dbo schema, everything's fine. But when I move the table to an ons schema and then repoint the dataflow to use an existing table, pointing it at that table, it fails with the above error.
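As an aside, the notebook/flatten step described above can be sketched in plain Python. Everything here is illustrative: the payload is a made-up stand-in for an ONS response (not the real API shape), and `fetch_json`/`flatten` are generic helpers, not the actual notebook code.

```python
import json
from urllib.request import urlopen


def fetch_json(url: str) -> dict:
    """Pull a JSON payload from an API endpoint (e.g. an ONS dataset URL)."""
    with urlopen(url) as resp:
        return json.load(resp)


def flatten(obj, prefix=""):
    """Recursively flatten nested dicts/lists into a single-level dict,
    joining keys with dots, e.g. {'a': {'b': 1}} -> {'a.b': 1}."""
    flat = {}
    if isinstance(obj, dict):
        for key, value in obj.items():
            flat.update(flatten(value, f"{prefix}{key}."))
    elif isinstance(obj, list):
        for i, value in enumerate(obj):
            flat.update(flatten(value, f"{prefix}{i}."))
    else:
        flat[prefix.rstrip(".")] = obj
    return flat


if __name__ == "__main__":
    # Hypothetical nested payload standing in for an ONS API response.
    payload = {"dataset": {"id": "cpih01", "observations": [{"value": 1.2}]}}
    print(flatten(payload))
    # {'dataset.id': 'cpih01', 'dataset.observations.0.value': 1.2}
```

In the actual pipeline the dataflow does this flattening, of course; this is just the same idea spelled out so it's clear what the dataflow stage is expected to produce before it writes to the lakehouse table.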
Everything looks fine here (in the selection of the existing table it can see ons.dataset, and the mappings work fine, as this is the table it created and originally put into dbo; I've just moved it).
Anyone have any ideas why this might be failing? Is a schema destination just not supported yet from Dataflow Gen2?
Thanks
Ah, I see what you're saying.
I tried replicating your test. Upon testing it further, I think your initial assertion is correct that the schema destination is not supported yet.
In my example, I have a Bronze schema. I started by having the dataflow create a new table. The destination UI doesn't give me anywhere to input a schema. I can see my schema'd tables with {schema}.{tablename} notation, so I tried entering that and got the same error as you.
As a workaround, I tried creating the table by hand and pointing my dataflow at the existing table. I was able to map the columns from my source to the destination table with the schema. However, upon saving and running, the dataflow still fails.
Based on the UI only giving you a place to enter a table name, I don't think you can reference a non-dbo schema yet: the dataflow still expects a bare table name, not schema plus table, and assumes the schema to be dbo.
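To make that suspected behavior concrete, here's a toy sketch (pure illustration, not Fabric's actual code): if the destination field is treated as a bare table name with the schema hard-wired to dbo, then "ons.dataset" gets looked up as a table literally named "ons.dataset" inside dbo, which doesn't exist, so the job fails.

```python
def naive_destination(name: str) -> tuple[str, str]:
    """Presumed current behavior: the whole input is the table name
    and the schema is hard-wired to dbo, so 'ons.dataset' is looked
    up as a table literally named 'ons.dataset' inside dbo."""
    return "dbo", name


def schema_aware_destination(name: str) -> tuple[str, str]:
    """What schema support would look like: split on the first dot,
    falling back to dbo when no schema is given."""
    if "." in name:
        schema, table = name.split(".", 1)
        return schema, table
    return "dbo", name


print(naive_destination("ons.dataset"))         # ('dbo', 'ons.dataset') -> no such table, job fails
print(schema_aware_destination("ons.dataset"))  # ('ons', 'dataset')
print(schema_aware_destination("dateDim"))      # ('dbo', 'dateDim')
```

Both helper names here are hypothetical; the point is just that until the destination UI parses (or separately captures) a schema, everything resolves into dbo.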
I searched for other reports of similar issues. I didn't find any other conversations about Dataflow Gen2 writing to a lakehouse, but I did find a Warehouse forum question and a Reddit thread where someone had a similar experience trying to write to a non-dbo schema in a warehouse. Based on the response in the Fabric forum, schema support is on the roadmap but hasn't been addressed yet:
Solved: Re: Dataflow Gen2 writing to Warehouse - how to ch... - Microsoft Fabric Community
Dataflows Gen2 - How to specify a schema when writing to a data warehouse : r/MicrosoftFabric
My guess is that if non-dbo warehouse schemas aren't supported by Dataflow Gen2, non-dbo lakehouse schemas likely aren't supported yet either.
Hopefully this helps. If you have follow-up questions, please ask away. If not, be sure to mark the solution so other users can learn from your question!
Did you shift the target from dbo to ons within the existing dataflow?
Depending on the complexity of your workflow, I'd try creating a brand new dataflow and see if the same error occurs. Occasionally I've gotten weird errors like that with a dataflow and had to recreate the item to resolve it.
I did.
I built the dataflow with a connection to the lakehouse, letting it create the table (which it automatically puts in dbo), then hopped over to the lakehouse, dragged the table from dbo into ons, went back to the dataflow, removed the lakehouse connection, and reconnected, changing the destination to 'existing table' and selecting the moved version of the table (ons.dataset).
As you can see above, I also have a 'util' schema: I created a master calendar there via a dataflow and stepped through the same process, with the same result. If I try to re-run that dataflow into util.dateDim, I get the same error.
Is there another way to set a schema'd destination in a dataflow that I'm missing? Letting the dataflow create the table directly in the target schema also results in an error.