omarecd
Helper I

The timestamp is lost; it never reaches the data warehouse.

Hello people.

I am working with a time series and moving the data from a table with three columns; one of them is the timestamp, in a column called 'at'.

 

omarecd_0-1738154691033.png

 

The data is being sent successfully to my warehouse:

omarecd_1-1738154789058.png

 

The problem is that when I go and have a look, the warehouse does not contain the column 'at'; the other two are fine, but for some reason that timestamp column is missing ... 😞

 

omarecd_2-1738154961742.png

 

Any idea why ?

 

Thanks in advance and greetings from Belgium 🇧🇪

 

Omar C.

 

 

 

 

1 ACCEPTED SOLUTION
v-pnaroju-msft
Community Support

Hi @omarecd,

Thank you @nilendraFabric and @FabianSchut for the response.

We sincerely appreciate your inquiry on the Microsoft Fabric Community Forum.

The most probable reason for this issue is that the data warehouse is unable to automatically identify the data type of the 'at' column. This could be due to:

  • Incorrect interpretation of the timestamp format in the source data.
  • Presence of null values in the source column.
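To see how those two causes defeat automatic type detection, here is a minimal stdlib-Python sketch (the sample values and expected format are hypothetical, for illustration only):

```python
from datetime import datetime

# Hypothetical sample of an 'at' column: one clean value, a null,
# and a value in a different (ISO, tz-aware) format.
rows = ["01/28/2025 10:15:00", None, "2025-01-28T10:15:00+01:00"]
expected_fmt = "%m/%d/%Y %H:%M:%S"

def try_parse(value, fmt):
    """Return a datetime if the value matches fmt, else None."""
    if value is None:
        return None
    try:
        return datetime.strptime(value, fmt)
    except ValueError:
        return None

parsed = [try_parse(v, expected_fmt) for v in rows]
# Only the first value parses; a type-inference pass that sees this many
# failures may fall back to a generic type and skip the column entirely.
print(parsed)
```

Only the first value survives; nulls and a mismatched format leave gaps that push the inferred type toward a generic one.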

Kindly follow the steps below to resolve the issue, along with the necessary screenshots:

  1. Load the data from a CSV file into the warehouse using a dataflow.
  2. Open a dataflow, select the CSV file as the source, and ensure that the date in the CSV file is formatted as MM/DD/YYYY.
    vpnarojumsft_1-1738223042745.png
  3. After loading the data into the dataflow, check the datatype of the fields and ensure that they align with the type of data they hold.
  4. Ensure that the datatype of the 'at' column is set to Date/Time. If the required datatype is not present, perform the necessary transformations on the 'at' column. Convert the Date/Time/Zone data type to Date/Time to prevent errors.
    vpnarojumsft_2-1738223112799.png

    vpnarojumsft_3-1738223174843.png

     

  5. Set the destination as "Warehouse" in the dataflow and allow the creation of a new table.
  6. Map the source fields accurately to the corresponding warehouse table fields along with their respective datatypes.
  7. Publish the dataflow and verify the details in the warehouse.
  8. Refresh and confirm that the data from the source CSV file is successfully available in the warehouse table.
    vpnarojumsft_4-1738223271948.png
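Steps 3 to 6 amount to verifying that every source field carries a concrete type before it is mapped to the warehouse table. A minimal sketch of that check (the column names and type labels are illustrative, not Fabric's actual metadata API):

```python
# Illustrative source schema: 'at' was inferred as the generic 'Any' type.
source_schema = {"sensor_id": "Int64", "value": "Double", "at": "Any"}

# Types the destination mapping can handle directly (illustrative list).
known_types = {"Int64", "Double", "Text", "DateTime"}

# Columns with an unresolved type are the ones at risk of being skipped
# when the warehouse table is auto-created.
at_risk = [col for col, typ in source_schema.items() if typ not in known_types]
print(at_risk)
```

Any column that lands in `at_risk` needs an explicit type transformation (step 4) before the destination mapping in step 6 will pick it up.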

     

If you find this response helpful, kindly mark it as the accepted solution and provide kudos. This will assist other community members facing similar queries.

 

Thank you.


8 REPLIES
v-pnaroju-msft
Community Support

Hi omarecd,

We are following up to see if your query has been resolved. Should you have identified a solution, we kindly request you to share it with the community to assist others facing similar issues.

If our response was helpful, please mark it as the accepted solution and provide kudos, as this helps the broader community.

Thank you.

v-pnaroju-msft
Community Support

Hi omarecd,

We wanted to check in regarding your query, as we have not heard back from you. If you have resolved the issue, sharing the solution with the community would be greatly appreciated and could help others encountering similar challenges.

If you found our response useful, kindly mark it as the accepted solution and provide kudos to guide other members.

Thank you.

v-pnaroju-msft
Community Support

Hi omarecd,

We have not received a response from you regarding the query and were following up to check if you have found a resolution. If you have identified a solution, we kindly request you to share it with the community, as it may be helpful to others facing a similar issue.

 

If you find the response helpful, please mark it as the accepted solution and provide kudos, as this will help other members with similar queries.

Thank you.


FabianSchut
Super User

Hi, can you check in the data destination whether the column 'at' is selected? You can find the documentation here: https://learn.microsoft.com/en-us/fabric/data-factory/dataflow-gen2-data-destinations-and-managed-se....

 

I've noticed that if a column has an Any type, it will not automatically be selected when the table is first created. There may be other column types with the same result. The fix for the Any type was to convert it to a known type, such as double or text, so that it is automatically added to the destination table.
Can you try converting your column to a known type for your destination, dropping the table once, and reconfiguring your destination in the dataflow?
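A quick sketch of why a column ends up as Any, and what the explicit coercion looks like (the values are made up; Power Query's actual inference is more involved):

```python
# A column whose values disagree in type gets the generic 'Any' type.
mixed_column = [1, "2025-01-28 10:15:00", None, 3.5]

types_seen = {type(v).__name__ for v in mixed_column if v is not None}
inferred = types_seen.pop() if len(types_seen) == 1 else "Any"
print(inferred)

# Explicitly coercing every value to text gives the column a known type,
# so the destination step can include it when creating the table.
as_text = [None if v is None else str(v) for v in mixed_column]
print(as_text)
```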

nilendraFabric
Community Champion

@omarecd, could you please share the timestamp data type in the source, with some example values?

nilendraFabric
Community Champion

hi @omarecd 

 

Could you please check the format in both the source and the warehouse?

 

The timestamp column in your source might have a different format or data type than the destination table expects. For example, if the column is in `DateTimeZone` format but the destination expects `DateTime`, it might cause issues without throwing explicit errors.
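The `DateTimeZone`-to-`DateTime` conversion boils down to normalizing the offset away. A stdlib-Python sketch of the idea (the sample value is hypothetical):

```python
from datetime import datetime, timezone

# A tz-aware source value (DateTimeZone in Power Query terms).
src = datetime(2025, 1, 29, 13, 30, tzinfo=timezone.utc)

# Normalize to UTC, then drop the offset to get a naive DateTime,
# which is what a plain warehouse datetime column expects.
naive = src.astimezone(timezone.utc).replace(tzinfo=None)
print(naive.isoformat())
```

In Power Query the equivalent is a transformation such as converting the column to UTC and then changing its type to Date/Time.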

 

See if the discussion below helps:

 


https://community.fabric.microsoft.com/t5/Power-Query/DataFormat-Error-We-couldn-t-parse-the-input-p...

Hello @nilendraFabric, thanks for your valuable feedback.

Your comment makes a lot of sense, yes. The thing is that I did not configure anything at all 🙂 I just sent the data to the warehouse and ... the structure of the warehouse table was created automatically. My question is why two columns were created but not 'at'.

How could I create/add the 'at' column manually? (Even though I think I shouldn't have to do that manually.)
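For reference, adding the column by hand would be a one-line T-SQL `ALTER TABLE`. A hedged sketch that only builds the statement (the table name is hypothetical; 'at' is bracket-quoted in case it clashes with T-SQL syntax such as `AT TIME ZONE`):

```python
# Hypothetical table name; adjust schema/table to match your warehouse.
table, column = "dbo.timeseries", "at"
ddl = f"ALTER TABLE {table} ADD [{column}] datetime2(6) NULL;"
print(ddl)
```

Even with the column added manually, the dataflow's destination mapping would still need to be reconfigured to write into it, so fixing the type in the dataflow remains the cleaner route.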

 

Maybe @v-priyankata , do you have an idea ? 🤞

Thanks in advance.

Omar C.
