Hello,
I have a large dataset of call-center calls, grouped by a unique identifier (ID).
Each row has a single ID and (among many other columns) multiple columns holding the timestamp of each call attempt in dd-mm-yyyy hh:mm:ss format.
e.g.: ID | call 1 type | call 1 timestamp | call 1 outcome | call 2 type | call 2 timestamp | call 2 outcome | ... | call 13 type | call 13 timestamp | call 13 outcome | ...
I would like to visualize the distribution of the best time to call, in 20/30-minute intervals, based on the outcome, and be able to slice it by call type.
Any suggestions on how to proceed?
Many of the threads I have found use a dataset with one timestamp per row, but I need my source file to stay in the format it is in now.
I hope some of you can help a complete noob 🙂
Hi @marzio ,
Per ID, are there multiple timestamps in the same row?
I am not sure what result you want. Could you please share your sample data and a screenshot of the desired output for further analysis? You can also upload a sample .pbix to OneDrive and post the link here. Please mask sensitive data before uploading.
Please read this post to get your answer quickly: How to Get Your Question Answered Quickly.
Best Regards,
Amy
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi Amy,
thank you for your answer.
Yes, there are multiple timestamps per row. You can see a small redacted sample of the data below.
The source is a data dump with more than 200 columns that I would rather not upload.
The data starts from 2018 and runs onwards.
Every set of columns starting with FC1, FC2, ..., SC1, ... is a different call attempt with its own outcome. In total there are at most 13 timestamps per row (3 FC attempts and 10 SC attempts).
I would like to visualise the best time to attempt a call based on a successful status, using predefined time intervals (20-30 minutes). I want to be able to see all FC attempts together, but also to slice by FC1, FC2, FC3 or by time period.
The output would be something like this, with the green line showing successful attempts and the orange line unsuccessful ones.
I have tried to define the steps needed to model the data, but I have not been able to find resources for all of them in Power BI. The second step is the most troublesome.
- Create a time-intervals table (http://datacornering.com/how-to-generate-date-and-time-interval-list-using-power-query/) (see the rough sketch after this list)
- Link the timestamps in the source data to the time-intervals table
(does this require splitting the timestamps into different rows? Do I have to add extra columns to the source file?)
- Count the number of timestamps in every interval
- Add the data to a chart/graph to visualize it in the best way
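For the first step, here is a minimal sketch of the kind of query the linked article describes, as far as I understand it. The 30-minute length, the query name and the column name IntervalStart are just examples:

```
let
    // Bucket length: 30 minutes here; use #duration(0, 0, 20, 0) for 20-minute buckets
    IntervalLength = #duration(0, 0, 30, 0),
    // 48 buckets of 30 minutes cover a whole day
    BucketCount = 24 * 60 / 30,
    Times = List.Times(#time(0, 0, 0), BucketCount, IntervalLength),
    AsTable = Table.FromList(Times, Splitter.SplitByNothing(), {"IntervalStart"}),
    TimeIntervals = Table.TransformColumnTypes(AsTable, {{"IntervalStart", type time}})
in
    TimeIntervals
```

For the second step, my guess is that once the timestamps are on separate rows, a custom column that floors the time of day to the start of its bucket, e.g. #time(Time.Hour([Call timestamp]), Number.IntegerDivide(Time.Minute([Call timestamp]), 30) * 30, 0), could then be related to IntervalStart, but I am not sure this is the right approach.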
I hope you can help me.
Marzio
The full extract also contains SC4 through SC10 blocks (each with _STATUS, _UNSUCCESSFUL_STATUS, _ATTEMPTED_BY, _ATTEMPTED_DATE, _CALLBACK_DATE and _COMMENTS columns), but those are empty in this sample, so only the columns up to SC3 are shown below.

| ID | UPLOADED_DATE | FC1_STATUS | FC1_UNSUCCESSFUL_STATUS | FC1_ATTEMPTED_DATE | FC1_CALLBACK_DATE | FC2_STATUS | FC2_UNSUCCESSFUL_STATUS | FC2_ATTEMPTED_DATE | FC2_CALLBACK_DATE | FC3_STATUS | FC3_UNSUCCESSFUL_STATUS | FC3_ATTEMPTED_DATE | FC3_CALLBACK_DATE | SC1_STATUS | SC1_UNSUCCESSFUL_STATUS | SC1_ATTEMPTED_DATE | SC1_CALLBACK_DATE | SC2_STATUS | SC2_UNSUCCESSFUL_STATUS | SC2_ATTEMPTED_DATE | SC2_CALLBACK_DATE | SC3_STATUS | SC3_UNSUCCESSFUL_STATUS | SC3_ATTEMPTED_DATE | SC3_CALLBACK_DATE |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| A5572197 | 07-08-19 01:06:00 | Successful |  | 8-7-2019 9:08 |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| A5418573 | 09-07-19 03:30:00 | Successful |  | 7-9-2019 9:59 |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| A5418574 | 09-07-19 03:30:00 | Successful |  | 7-9-2019 9:57 |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| A5418575 | 09-07-19 03:30:00 | Successful |  | 7-9-2019 9:57 |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| A5553728 | 02-08-19 05:53:00 | Successful |  | 8-2-2019 10:40 |  | Successful |  | 8-5-2019 14:43 |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |  |
| A5553726 | 02-08-19 05:53:00 | UnSuccessful | Ringing | 8-2-2019 11:27 |  | UnSuccessful | Ringing | 8-2-2019 12:59 |  | UnSuccessful | Ringing | 8-5-2019 9:56 |  |  |  |  |  |  |  |  |  |  |  |  |  |
| A5519141 | 26-07-19 05:52:00 | UnSuccessful | Callback | 7-26-2019 10:01 | 7-26-2019 11:01 | UnSuccessful | Ringing | 7-26-2019 11:02 |  | Successful |  | 7-26-2019 12:09 |  | UnSuccessful | Voicemail | 8-1-2019 12:18 |  | UnSuccessful | Callback | 8-1-2019 14:05 | 8-5-2019 10:00 | UnSuccessful | No call for this time | 8-7-2019 11:05 |  |
Hi @marzio ,
You may need to unpivot the call type, call timestamp and call outcome columns into separate attribute/value columns.
For example, open the Query Editor, select the columns call 1 timestamp through call 13 timestamp at the same time, then choose the "Unpivot Columns" option. You can rename the resulting "Value" column to "call timestamp". Don't forget to click the "Close & Apply" button. Then you can create a relationship between the source table and the time-intervals table on the date field.
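In M, the generated steps will look roughly like the sketch below. Here "CallData" stands for your source query, and only a few timestamp columns are listed; extend the list with all of the call timestamp columns in your data:

```
let
    // "CallData" is a placeholder for the name of your source query
    Source = CallData,
    // Unpivot only the call timestamp columns; every other column stays on the row
    Unpivoted = Table.Unpivot(
        Source,
        {"call 1 timestamp", "call 2 timestamp", "call 3 timestamp"},
        "Attribute",
        "call timestamp"
    ),
    // Make sure the new column is typed as date/time
    Typed = Table.TransformColumnTypes(Unpivoted, {{"call timestamp", type datetime}})
in
    Typed
```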
Best Regards,
Amy
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi Amy,
The unpivot solution is going in the right direction, but applying it to the data source multiplies every row (one row per unpivoted column), and that causes many issues with the rest of the data. All the existing charts and calculations were showing wrong data, so I had to backtrack.
I have tried to SUMMARIZE the columns I need into a separate table, but I cannot UNPIVOT that table because SUMMARIZE is a DAX function.
The step I am missing is creating a new table in Power Query based on a selection of columns from the source,
so that I can UNPIVOT those columns and follow the rest of your suggestion.
Any ideas for that step?
I got lost in other threads without finding a working solution.
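Is something like the sketch below the right idea? I would reference the source query, keep only the columns I need with Table.SelectColumns, and then unpivot in that new query so the original table keeps one row per ID. The query and column names are just examples from my extract:

```
let
    // Reference the existing source query ("CallData" is just a placeholder name)
    Source = CallData,
    // Keep only the columns needed for the call-time analysis
    // ("ID" stands for whatever the identifier column is called)
    Selected = Table.SelectColumns(
        Source,
        {"ID",
         "FC1_STATUS", "FC1_ATTEMPTED_DATE",
         "FC2_STATUS", "FC2_ATTEMPTED_DATE",
         "FC3_STATUS", "FC3_ATTEMPTED_DATE"}
    ),
    // Unpivot the timestamp columns in this new query only,
    // leaving the original source table untouched
    Unpivoted = Table.Unpivot(
        Selected,
        {"FC1_ATTEMPTED_DATE", "FC2_ATTEMPTED_DATE", "FC3_ATTEMPTED_DATE"},
        "Attempt",
        "Call timestamp"
    )
in
    Unpivoted
```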
Marzio