Why does Dataflow Gen2 always have these inexplicable problems? Can we really use it with confidence?
In the Power Query editor, the column is a decimal number.
In the Lakehouse destination's column mapping settings, it is also a decimal number.
But after writing to the Lakehouse, it is no longer a decimal number.
In other Dataflow Gen2 dataflows, some columns are correctly mapped to decimal numbers. There is no pattern; what should we do?
The following is another Dataflow Gen2 that behaves normally: the column is the expected decimal number.
Hi @yjh
Thanks for using Fabric Community. Our community thrives on open communication and collaboration, and for that to happen effectively, we encourage everyone to use English for their questions and discussions. This ensures everyone can understand and participate, regardless of their native language.
Would you be comfortable rephrasing your question in English? We will definitely try to help.
Thanks for understanding.
I have switched to English; please excuse my poor English.
Thank you!
Could you share some sample queries that would help us replicate this issue on our end?
I haven't been able to replicate it on my own with the sample queries that I use.
I simply fetch the data from the local database, keep some columns, and that's it.
I cannot replicate this within my environment.
If you can replicate this with any public data source or some sample data that you can share with us, please go ahead and do so.
If you are unable to provide this info, please raise a support ticket so an engineer can take a closer look at your specific scenario and determine what could be happening.
Thank you! We've identified the issue and have a fix for it that will be available in January of next year.
Meanwhile, is there a workaround for this issue? Decimals are pretty important for any calculations. Is this an issue with the dataflow? Are there other tools we could use in lieu of the dataflow to load the data into Lakehouse tables, such as a PySpark notebook or a pipeline Copy data activity?