The prototype lives: first steps
Some time ago we wrote here and here on this blog that we are working on a service that minimizes the wrap-up time after calls. Based on insights from several user interviews and ideas from internal workshops, we have now defined a set of features for our first prototype.
The prototype should:
- Transcribe and then display the content of calls
- Automatically create notes of important statements in the dialog
- Create appointments in Google Calendar during calls
- Show addresses that are mentioned during a call
The software extracts all the necessary information from the spoken dialog. We are currently experimenting and testing mostly with colleagues within our team, and sometimes with other sipgate employees. And of course we hope that in a few weeks the first external testers will be able to play around with the prototype.
Transcriptions and automated notes
As mentioned above, our assumptions about useful functions and augmentations mostly came from user interviews we recently conducted. In the interviews we asked which tasks take up the most time in the wrap-up of a call. A frequently mentioned activity was summarizing the call content, focusing on the most important points and details of the conversation.
The experiments we have carried out in this context so far have already produced quite good results for English. With the help of NLU (Natural Language Understanding), the majority of statements within the dialog are interpreted and displayed as text after the call. Visually, the statements are arranged so that each one can be attributed to the respective speaker.
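To make the speaker attribution concrete, here is a minimal sketch of how such a transcript could be modeled and rendered. The names (`Utterance`, `render_transcript`) and the structure are our own illustration, not CLINQ's actual data model:

```python
from dataclasses import dataclass

@dataclass
class Utterance:
    speaker: str  # e.g. "Agent" or "Caller"
    text: str     # transcribed statement produced by the NLU pipeline

def render_transcript(utterances):
    """Render each statement under its speaker, in dialog order."""
    return "\n".join(f"{u.speaker}: {u.text}" for u in utterances)

call = [
    Utterance("Caller", "Can we meet next Tuesday at 10?"),
    Utterance("Agent", "Sure, I'll send you an invitation."),
]
print(render_transcript(call))
```

In a real interface the speaker labels would of course map to styled chat bubbles rather than text prefixes, but the underlying structure is the same.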
The quality of the transcriptions depends not only on the machine learning models, but also on the quality of the telephone connection. Our current conclusion about the state of the transcriptions: the most important statements of the dialog are often correctly recognized. No more and no less. For conversations in German, for example, the results currently do not look quite as good. That seems normal, though, because there is not yet enough training data to drive improvement.
What already works quite well is the automatic highlighting of keywords within the transcription during a call. As a next step, we'll try to display the most important information as an 'Action Card' and then store and show it as a note in the CLINQ interface. It is not yet clear whether we will define the keywords for the notes ourselves or whether users will be able to enter their own keywords. Or both.
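As a rough sketch of the idea, keyword highlighting and note creation could look like this. The keyword list and the `Action Card` shape are hypothetical examples, precisely because it is still open who defines the keywords:

```python
import re

# Hypothetical keyword list -- it is still undecided whether these will be
# predefined by us, configurable per user, or both.
KEYWORDS = ["appointment", "invoice", "address"]

def highlight(text, keywords=KEYWORDS):
    """Wrap recognized keywords in markers so the UI can highlight them."""
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, keywords)) + r")\b", re.IGNORECASE
    )
    return pattern.sub(lambda m: f"**{m.group(1)}**", text)

def action_card(text, keywords=KEYWORDS):
    """Build a note-like 'Action Card' for statements that contain a keyword."""
    hits = [k for k in keywords
            if re.search(rf"\b{re.escape(k)}\b", text, re.IGNORECASE)]
    return {"note": text, "matched_keywords": hits} if hits else None

print(highlight("Please send the invoice to my new address."))
```

Statements without any keyword simply produce no card, so the interface only fills up with notes that are likely to matter.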
Create appointments automatically
The purpose of this function is to propose the creation of appointments for dates that are mentioned during a call. The agent then sees an 'Action Card' with the detected date and can, if desired, have the appointment created automatically by clicking on it. The current requirement is that the agent, in our case the tester, uses Google Calendar.
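A minimal sketch of what accepting such an 'Action Card' could produce: the dictionary below follows the shape of an event body for the Google Calendar API's `events.insert` method, while the function name, the start time handling, and the default duration are our own assumptions (in the prototype, the start time would come from the NLU step):

```python
from datetime import datetime, timedelta

def propose_event(title: str, start: datetime, duration_min: int = 30) -> dict:
    """Turn a recognized date mention into a Google-Calendar-style event body.

    The start time is passed in explicitly here to keep the sketch
    self-contained; in the prototype it would be extracted from the dialog.
    """
    end = start + timedelta(minutes=duration_min)
    return {
        "summary": title,
        "start": {"dateTime": start.isoformat(), "timeZone": "Europe/Berlin"},
        "end": {"dateTime": end.isoformat(), "timeZone": "Europe/Berlin"},
    }

card = propose_event("Follow-up call", datetime(2019, 5, 7, 10, 0))
```

Clicking the card would then send this body to the tester's Google Calendar, which is why Google Calendar is currently a requirement.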
Show addresses in Google Maps
Just as with creating appointments, the objective here is to analyze the spoken language and recognize the important keywords, in this case addresses. If the information is recognized correctly, the agent sees the addresses mentioned in the conversation in the CLINQ interface in the form of a Google Map. A recent idea is to show, instead of the map, for example the distance to the address, the travel time, and a suitable mode of transport.
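To illustrate the flow, here is a deliberately naive sketch: a toy pattern for German-style street addresses (real recognition would come from the NLU step, not a regex) feeding into a Google Maps search URL, which follows the documented Maps URLs format:

```python
import re
from urllib.parse import quote_plus

# Toy pattern for German-style street addresses like "Musterstraße 12".
# Purely illustrative -- actual address recognition happens in the NLU step.
ADDRESS_RE = re.compile(r"\b([A-ZÄÖÜ][a-zäöüß]+(?:straße|weg|allee|platz)\s+\d+)\b")

def find_addresses(text):
    """Return address-like phrases mentioned in a piece of dialog."""
    return ADDRESS_RE.findall(text)

def maps_url(address):
    """Build a Google Maps search URL the interface could link to or embed."""
    return "https://www.google.com/maps/search/?api=1&query=" + quote_plus(address)

# "We are located at Musterstraße 12 in Düsseldorf."
addresses = find_addresses("Wir sitzen in der Musterstraße 12 in Düsseldorf.")
```

The same recognized address could just as well be passed to a routing service to show distance and travel time instead of the map.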
To be honest, we are clearly still at the beginning of developing this service. But it's already fun to experiment with the prototype and try out new things. We are looking forward to letting external testers experiment with the service in the near future. I cannot predict when that will be, but I will announce it here as soon as it happens. Stay tuned!