I'm creating a data aggregator/report generator. My data collectors will output to a .csv file, so my ultimate goal is to dump the outputs to a specific location, open my report generator, hit data refresh, and have the only manual work in the report be documenting the collection process. I have to build this to flex from five instruments recording to two collectors, up to five collectors.
The headache I'm running into is that the data collectors record data every second, but when I export to .csv the timestamp drops the seconds portion. Is there a way to use a formula inside Power Query to look at the minute and fake in a second value, or some other approach I'm overlooking?
Another aspect I'm trying to work through is that the instrument that runs for the least amount of time controls the length of the study.
Thanks all for the posts. My original post came at the end of a Friday spent trying to troubleshoot why my report generator was crashing, when I had started the day expecting a smooth process and being able to finish writing and validating the SOP this report would be part of. After taking the weekend to get a clear head and a fresh perspective, and digesting some information from a vendor, I've got everything working correctly this morning.
The core problem was that the .csv files had a timestamp format of m/d/yyyy h:mm, and that was the information Power Query was digesting, so the :ss field was populated with 00 when the column was changed to the datetime type. Once I modified the .csv export to include the full timestamp with seconds, everything functioned as expected.
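For anyone hitting the same thing, here is a minimal illustration (made-up values, assumed en-US culture) of why the seconds read as 00: a text timestamp without seconds still parses to a valid datetime, and the missing :ss simply defaults to zero.
let
    // Text without seconds: Power Query fills :ss with 00 on conversion
    WithoutSeconds = DateTime.FromText("7/18/2025 9:05", [Culture = "en-US"]),
    // Text with seconds: the full value survives the conversion
    WithSeconds = DateTime.FromText("7/18/2025 9:05:37", [Culture = "en-US"])
in
    [Before = WithoutSeconds, After = WithSeconds]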
Please show an example of the raw CSV data and the M code you are having an issue with.
Make sure you also show (a screenshot of) the step where you first see the timestamp coming in, i.e. right after the call to Csv.Document().
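For reference, a generated query usually starts along the lines of the sketch below (file path and options are placeholders); the Source step is the one to inspect, because if the seconds are already gone in that raw text, the CSV itself never contained them.
let
    // Raw text exactly as it sits in the file - check the timestamp column here first
    Source = Csv.Document(
        File.Contents("C:\Data\collector1.csv"),
        [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.None]
    ),
    // First row becomes the column headers
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true])
in
    Promoted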
Hi @JwEvans
You’re right. Power Query can’t recover missing seconds unless there's a clear pattern, like consistent sampling. If each row represents one second and the data is sorted, you could use an index to rebuild the timeline, but it only works if no data is missing.
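Something along these lines, as a rough sketch only: it assumes the rows are already sorted, exactly one reading per second, and no gaps, and the names "Readings", "Timestamp", and "Value" are placeholders for your own query and columns.
let
    // Number the rows within each minute-truncated timestamp (0, 1, 2, ...)
    Grouped = Table.Group(
        Readings,
        {"Timestamp"},
        {{"Rows", each Table.AddIndexColumn(_, "SecondInMinute", 0, 1, Int64.Type), type table}}
    ),
    Expanded = Table.ExpandTableColumn(Grouped, "Rows", {"Value", "SecondInMinute"}),
    // Add the within-minute index back as seconds
    Rebuilt = Table.AddColumn(
        Expanded,
        "TimestampWithSeconds",
        each [Timestamp] + #duration(0, 0, 0, [SecondInMinute]),
        type datetime
    )
in
    Rebuilt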
For trimming by the shortest-running instrument, try filtering all datasets to the earliest end time. That should keep things aligned across collectors.
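A rough sketch of that, assuming each collector lands in its own query ("Collector1" through "Collector3" and "Timestamp" are placeholder names):
let
    Tables = {Collector1, Collector2, Collector3},
    // Last reading of each collector, then the earliest of those end times
    EndTimes = List.Transform(Tables, each List.Max(Table.Column(_, "Timestamp"))),
    StudyEnd = List.Min(EndTimes),
    // Keep only the rows every collector has data for
    Trimmed = List.Transform(Tables, each Table.SelectRows(_, (row) => row[Timestamp] <= StudyEnd))
in
    Trimmed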
but when I export to .csv the timestamp drops the second portion.
That should not happen. Is the timestamp in UTC, and is it in ISO-8601 format?
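As a small illustration (made-up value): an ISO-8601 UTC timestamp round-trips with its seconds intact, which is why the export format matters so much here.
let
    // Seconds and the UTC offset survive the conversion from ISO-8601 text
    Parsed = DateTimeZone.FromText("2025-07-18T09:05:37Z")
in
    Parsed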