I have a pipeline in Fabric, and the source has 2-3 fields with datetime as the data type. In the pipeline I am using a Copy activity to copy data from the source to a lakehouse table, where those fields have the timestamp data type. The pipeline executes successfully, but when I run a SELECT query through Spark SQL in a notebook, it returns a datetime inconsistency error. Please help me fix this issue.
Hi @Kanika123
Is there any update? Have you resolved this issue? If any of the answers provided were helpful, please consider accepting them as a solution. If you have found other solutions, we would greatly appreciate it if you could share them with us. Thank you!
Best Regards,
Jing
I am assuming you are able to view the data from the Lakehouse explorer and that you only face this error when you run the query in Spark SQL, is that right?
Are you able to run a T-SQL query on this table from the SQL analytics endpoint?
If the answer to both of the above questions is yes, then I am guessing it's a type conversion problem. Please check out this blog.
Was there any unusual datetime value like '31-12-9999'?
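If you're not sure, a quick check like the one below could flag such rows. I'm assuming a table named my_lakehouse_table and a timestamp column named order_date here; replace them with your actual names.

```sql
-- Look for sentinel or out-of-range timestamps that often cause
-- datetime conversion errors (table/column names are placeholders)
SELECT order_date, COUNT(*) AS row_count
FROM my_lakehouse_table
WHERE order_date >= TIMESTAMP '9999-01-01 00:00:00'
   OR order_date <  TIMESTAMP '1900-01-01 00:00:00'
GROUP BY order_date
ORDER BY order_date;
```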
I would suggest performing manual type casting in Spark SQL, along the lines of the sketch below.
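Something like this (same placeholder names as above) would be a starting point: cast explicitly and neutralise the extreme values before using the column.

```sql
-- Sketch of manual casting in Spark SQL; adjust names and the cutoff
-- to whatever makes sense for your data
SELECT
    CASE
        WHEN order_date >= TIMESTAMP '9999-01-01 00:00:00' THEN NULL  -- or cap to a safe value
        ELSE CAST(order_date AS TIMESTAMP)
    END AS order_date_clean,
    CAST(order_date AS DATE) AS order_date_only                       -- if only the date part matters
FROM my_lakehouse_table;
```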
BTW, in a Fabric lakehouse the timestamp data type is exposed as datetime2:
https://learn.microsoft.com/en-us/fabric/data-warehouse/data-types
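You can also confirm what Spark actually sees for those columns with a quick DESCRIBE in the notebook (table name is a placeholder):

```sql
-- Shows each column's name and Spark-side data type for the lakehouse table
DESCRIBE TABLE my_lakehouse_table;
```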
Need a Power BI Consultation? Hire me on Upwork
Connect on LinkedIn