In the table view of the Lakehouse, date-type columns with no dates (NULL values) appear as "Invalid Date".
The same table in the SQL endpoint displays them properly.
Hi @MathieuSGA , Thank you for reaching out to the Microsoft Community Forum.
Your data is fine. The "Invalid Date" you see in the Lakehouse Table view is just a UI rendering quirk that occurs when the column is a date/timestamp and the value is actually NULL. That is why PySpark and the SQL endpoint correctly show NULL for the same rows. Nothing is wrong in PostgreSQL or your Parquet files. If you want the Lakehouse view to look clean, you can cast the column to string or replace nulls with blanks, but for reporting and analysis you should rely on the SQL endpoint or Power BI, which already handle NULL properly.
Similar observation here.
I may have applied changes to the schema, but only after doing a proper table drop and cleaning all tables from the lakehouse.
Could this have anything to do with information that might still be available in the 'Files' directory?
I was doing this before without noticing any similar behavior, though...
Hi @need_a_name , hope you are doing great. May we know if your issue is solved or if you are still experiencing difficulties? Please share the details, as it will help the community, especially others with similar issues.
You might want to check the data in that column using PySpark, because the values are probably not stored in a valid date format. It is typical for the different views to render the same data differently; the SQL endpoint, for example, shows NULL when a value is invalid.
Unfortunately I was unable to replicate the "Invalid Date" error.
Hope this helps. If so, please give a Kudos 👍 and mark as Accepted Solution ✔️.
As suggested, I checked the data using PySpark.
Shown below is the result I could see for one of the rows where I have the issue
I checked the source system (a PostgreSQL table) for issues with the data. Using the above ordernumber as the reference, I confirmed that the value in that column is null.
But in the Table view in the Lakehouse, it appears as "Invalid Date" for all rows where the value is null.