March 31 - April 2, 2025, in Las Vegas, Nevada. Use code MSCUST for a $150 discount! Early bird discount ends December 31.
Why does Dataflow Gen2 always have inexplicable problems? Can we really use it with confidence?
In the Power Query editor, the column is a decimal number.
In the destination's Lakehouse column mapping settings, it is also a decimal number.
But after writing to the Lakehouse, it is no longer a decimal number.
In other Dataflow Gen2s, some columns are correctly mapped to decimal numbers. There is no pattern; what should we do?
The following is another Dataflow Gen2 that behaves normally and produces the desired decimal number.
Thank you! We've identified the issue and have a fix for it that will be available in January of next year.
Hi @yjh
Thanks for using Fabric Community. Our community thrives on open communication and collaboration, and for that to happen effectively, we encourage everyone to use English for their questions and discussions. This ensures everyone can understand and participate, regardless of their native language.
Would you be comfortable rephrasing your question in English? We will definitely try to help.
Thanks for understanding.
I have switched to English; please excuse my poor English.
Thank you!
Could you share some sample queries that would help us replicate this issue on our end?
I haven't been able to replicate it myself with the sample queries that I use.
I simply fetch the data from the local database, keep some columns, and that's it.
I cannot replicate this within my environment.
If you can replicate this with any public data source or some sample data that you can share with us, please go ahead and do so.
If you are unable to provide this info, please raise a support ticket so an engineer can take a closer look at your specific scenario and determine what could be happening.
Meanwhile, is there a workaround for this issue? Decimals are pretty important for any calculations. Is this an issue with the Dataflow itself? Are there other tools we could use in lieu of a Dataflow to load the data into Lakehouse tables, such as a PySpark notebook or a pipeline Copy data activity?
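One possible interim approach, since a PySpark notebook was mentioned as an alternative: cast the affected column to an explicit decimal type before writing to the Lakehouse Delta table. This is only a sketch, not a confirmed fix from the product team; the table names, column name, and precision/scale below are placeholders, and `spark` is the session Fabric notebooks provide automatically.

```python
from pyspark.sql import functions as F
from pyspark.sql.types import DecimalType

# "staging_table", "target_table", and "Amount" are placeholder names;
# `spark` is assumed to be the notebook's built-in SparkSession.
df = spark.read.table("staging_table")

# Re-assert the decimal type explicitly; precision 18, scale 2 is an assumption
# and should match your source column's definition.
df = df.withColumn("Amount", F.col("Amount").cast(DecimalType(18, 2)))

# Write the result to the Lakehouse as a Delta table with the corrected schema.
df.write.mode("overwrite").format("delta").saveAsTable("target_table")
```

Because the cast happens on the Spark side, the resulting Delta table's schema should carry the decimal type regardless of how the upstream type was inferred.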