How to configure a Voice Bot with DialogFlow in Dialogue Studio
With this scenario you will integrate your Dialogflow (Google) Natural Language Understanding (NLU) with the Anywhere365 Dialogue Studio. This means that you can integrate a smart assistant into your IVR (Interactive Voice Response) that can handle questions, such as case information, and perform intelligent routing based on the user's input. Anywhere365 Dialogue Studio contains specific nodes that can connect with Dialogflow (please see the node explanations under Components for more details).
Using a smart assistant in your IVR adds an extra layer that can gather information before the call is handed to your agent. The assistant can route your users to specific skills based on their voice input.
This IVR will be able to route users to the following skills:
Support (based on ticket information)
Insurance, with three sub-skills: ‘Car’, ‘Bike’ and ‘Household’
Create a new Case
DialogFlow can be configured at https://dialogflow.com/ .
Intents are basically the goal of the user's input: what does the user want to achieve, such as ‘I need help with insuring my new car’ or ‘I got a question about my current ticket’. Create a Dialogflow agent with at least two or three specific intents; start off with three intents that are not too similar. Per intent you need around 15-20 training phrases (more is better, but too many can make your assistant too specialized in one intent). Each intent should have around the same amount of training phrases.
Entities are specific topics, such as the type of insurance. You can define synonyms that the assistant can then recognize (instead of ‘car’ you could introduce car brands). The content of the entity does not determine the intent, but the presence of an entity will. Don’t forget to highlight the entities in the training phrases if Dialogflow doesn’t do it automatically.
To configure this scenario, you will start off with the node ‘Incoming conversation’. This will catch the incoming call and send it into the Anywhere365 Dialogue Studio flow. Right after this the Transcriptor needs to start; this node will transcribe the speech of the customer to text. This is needed for Dialogflow to detect the user's intent.
Before the introduction of the Dialogflow node, save the transcribed input into the payload. To do this, place a ‘Change’ node after the Transcriptor. This node lets you set and save a value for later use. In this scenario you will set msg.payload to msg.payload.Transcriptor.Transcript. Now the user's input is saved for later use.
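The effect of this ‘Change’ node can be sketched as a small function on the message object (the message shape below follows the Transcriptor output described above; it is illustrative, not the node's actual implementation):

```javascript
// Sketch of what the 'Change' node does in this step: copy the transcribed
// text from the Transcriptor output into msg.payload for the next nodes.
function saveTranscript(msg) {
  msg.payload = msg.payload.Transcriptor.Transcript;
  return msg;
}

// Example message as it might leave the Transcriptor node (hypothetical content).
const msg = { payload: { Transcriptor: { Transcript: "I need help with insuring my new car" } } };
console.log(saveTranscript(msg).payload); // "I need help with insuring my new car"
```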
Next is the Dialogflow node, the node that connects your IVR to the smart assistant. This node needs to be configured, but Anywhere365 Dialogue Studio presents the required information to the creator. To configure this node correctly you need a working Dialogflow assistant and access to the Google Cloud console; this process is explained in the introduction of this scenario. Also set the identifier to ‘msg.session.id’. This will make sure the user stays in the same session and does not start over after one run through Dialogflow.
Now whenever a user says something in the IVR, this message is sent to Dialogflow, which detects the intent and entities and, if needed, starts further dialogue to obtain the information needed to route the user to the correct Skill or handle their questions.
To get the intent or entities from Dialogflow you introduce a debug node. This basic node will simply show the output of a node. As Dialogflow saves different intents in different paths, it is useful to call your UCC (Unified Contact Center) and let Dialogflow detect your intent (or use low coding to create an input loop). Locate the intent or entities in the debug window: entities are stored under msg.payload.parameters.fields and intents under msg.payload.intent; the rest of the path is determined by your Dialogflow setup.
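Reading those paths can be sketched as follows. The field names `displayName` and `stringValue` follow the shape of a Dialogflow v2 detectIntent result, and `insuranceType` is a hypothetical entity name; verify both against your own debug output:

```javascript
// Sketch: pull the detected intent and one entity out of a Dialogflow-style
// message. Paths below mirror msg.payload.intent and msg.payload.parameters.fields.
function readResult(msg) {
  const intent = msg.payload.intent.displayName;
  const fields = msg.payload.parameters.fields || {};
  // 'insuranceType' is a hypothetical entity name for this scenario.
  const insuranceType = fields.insuranceType ? fields.insuranceType.stringValue : null;
  return { intent, insuranceType };
}

// Example payload as it might appear in the debug window (illustrative).
const result = readResult({
  payload: {
    intent: { displayName: "insurance.request" },
    parameters: { fields: { insuranceType: { stringValue: "car" } } },
  },
});
console.log(result.intent, result.insuranceType); // insurance.request car
```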
With this information you can introduce a ‘Switch’ node, which lets you route the flow based on, for example, intents or entities. This way you can route the dialogue to nodes that check your SQL database, API or other sources (see scenario ‘Pull data from a source’). In this scenario you will check whether the intent of the user is to talk about a Support Ticket or something else. Now your flow has two routes to go to.
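The branching done by this ‘Switch’ node amounts to the following (intent names are illustrative; use the names defined in your own Dialogflow agent):

```javascript
// Sketch of the 'Switch' step: choose an output branch from the detected intent.
function routeByIntent(intentName) {
  switch (intentName) {
    case "support.ticket":
      return "supportFlow";  // top route: ticket lookup
    default:
      return "otherIntents"; // bottom route: continue the dialogue
  }
}
```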
The top flow follows the route of the support ticket intent. Here you can introduce nodes that check your company's APIs, your SQL database and other sources to see if the ticket number (an entity) can be found. The user is then given the information; the easiest low-code way is to use Switch nodes. This way you can check whether the content of the returned payload matches your needs and introduce flow directions from there. You can also introduce Function nodes that you can code to handle requests within one node. This is a little more advanced, and in no way necessary, but it can save you steps. Coding these Function nodes is not covered in this documentation.
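As a minimal sketch of such a Function node, the snippet below assumes an earlier node has stored a ticket-lookup result on the message (the `ticket` shape and wording are hypothetical) and decides what the IVR should say next:

```javascript
// Sketch: one Function node replacing a chain of Switch nodes.
// Assumes msg.payload.ticket is { id, status } when found, or null when not.
function handleTicketLookup(msg) {
  const ticket = msg.payload.ticket;
  if (!ticket) {
    // No match in the backend: ask the user to repeat the ticket number.
    msg.say = "I could not find that ticket number. Could you repeat it?";
  } else {
    msg.say = `Your ticket ${ticket.id} is currently ${ticket.status}.`;
  }
  return msg;
}
```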
Based on the flow route, the IVR assistant is triggered to say pre-coded lines. You can introduce Dialogflow answers based on their intents. This allows your IVR to answer different variants of the same question (different formats).
As you can see, the flow stops after the Switch and Say nodes. This is because the Say nodes should trigger the user to answer, after which the flow starts again at the Transcriptor node.
The other-intents flow is followed by a ‘Say’ node. This node triggers the IVR to say the Dialogflow output, to get the missing information (such as the type of insurance in this scenario), or to handle follow-up intents.
After the ‘Say’ node you introduce another Switch node, which checks whether the intent the user provided marks the end of the conversation (this can be configured in Dialogflow). If so, routing starts based on the intent and entities gathered by your smart assistant. To guide the call to the queue, use the ‘Enqueue’ node and type the skill name into the details. You can also introduce a ‘Disconnect’ node if the user chooses not to connect to an agent.
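The final skill selection for the ‘Enqueue’ node can be sketched like this. Intent names, entity values and skill names are illustrative and must match your own Dialogflow agent and UCC configuration:

```javascript
// Sketch: map the gathered intent (and, for insurance, the entity) to a Skill
// name for the 'Enqueue' node. Returns null when no skill matches, in which
// case the flow could go to a 'Disconnect' node instead.
function chooseSkill(intent, insuranceType) {
  if (intent === "support.ticket") return "Support";
  if (intent === "case.create") return "CreateCase";
  if (intent === "insurance.request") {
    // The insurance entity decides between the three insurance skills.
    const skills = { car: "Car", bike: "Bike", household: "Household" };
    return skills[insuranceType] || "Insurance";
  }
  return null;
}
```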
In this scenario the user is routed based on intent, but for the insurance intent the entity determines which specific insurance skill the user is routed to. The user can disconnect if they do not want to connect to an agent, for example when the information provided by the IVR assistant was sufficient.