How to connect DialogFlow in Dialogue Studio

Introduction

With this scenario you will integrate your DialogFlow (Google) Natural Language Understanding (NLU) with the Anywhere365 Dialogue Studio. This means that you can integrate a smart assistant into your IVR (Interactive Voice Response) that can handle questions, such as case information, and perform intelligent routing based on the user's input. Anywhere365 Dialogue Studio contains specific nodes that can connect with DialogFlow (please see the nodes explanation under Components for more details).

Using a smart assistant in your IVR adds an extra layer that can gather information. The assistant can route your users to specific skills based on their voice input or provide a self-service experience.

 

Preparation

This scenario will make use of Google DialogFlow.

DialogFlow can be configured at https://dialogflow.com/ .

Intents are basically the goal of the user's input: what does the user want to achieve? Examples are 'I need help with insuring my new car' or 'I have a question about my current ticket'. Create a DialogFlow agent with at least two or three specific intents; start off with three intents that are not too similar. Each intent needs around 15-20 training phrases (more is better, but too many can make your assistant too specialized in one intent). Each intent should have around the same number of training phrases.

Entities are specific topics, such as the type of insurance. You can define synonyms that the assistant can then recognize (instead of 'car' you could introduce car brands). The content of the entity does not determine the intent, but the presence of an entity will. Don't forget to highlight the entities in the training phrases if this doesn't happen automatically.
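The role of synonyms can be illustrated with a small sketch. The entity definition and function below are purely illustrative (they are not the Dialogflow API): several synonyms resolve to one entity value, and it is the presence of that value, not the specific synonym, that matters for routing.

```javascript
// Hypothetical entity definition: every synonym on the right resolves
// to the entity value on the left.
const insuranceEntity = {
  car: ["car", "bmw", "volkswagen", "vehicle"],
  home: ["home", "house", "apartment"],
};

// Return the entity value whose synonym list matches a word in the phrase,
// or null when no entity is present.
function matchEntity(phrase, entity) {
  const words = phrase.toLowerCase().split(/\s+/);
  for (const [value, synonyms] of Object.entries(entity)) {
    if (words.some((w) => synonyms.includes(w))) return value;
  }
  return null;
}

console.log(matchEntity("I need help insuring my new BMW", insuranceEntity)); // car
```

Dialogflow does this matching (and much more) for you once the synonyms are defined on the entity.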

 

Prerequisites

For this scenario the following points must be completed first:

 

Configuration

Bot greeting

Our first step is to initiate the bot, greet the customer and wait for their response.

We are going to start with an "Incoming conversation" node. This node listens to both audio/video and chat. It is connected to our server using SignalR.

Steps:

  1. Drag and drop an Incoming Call Node

  2. Open Node

    1. Select / Configure server

    2. Filter on:

      1. All = If you want both a Voice- and Chat-bot

      2. Audio/Video = If you want a Voice-bot

      3. Chat = If you want a Chat-bot

 

Next, we greet the customer and ask what their question is. This can be done with a "Say" node. Here you can type the text, which will be converted to speech using a speech engine (Learn More).

Steps:

  1. Drag and drop a Say Node

  2. Open Node

    1. Enter text you want to play

  3. Connect end of the Incoming Call node with begin of the Say Node

 

The next step depends on which kind of bot you want to create.

 

Interaction - Voice- and Chat-bot

Our next step is to receive the question from the customer. Each channel has its own way: for a Voice-bot we need to transcribe what the customer is saying, for a Chat-bot we need to retrieve a chat message.

Because we want to handle both channels in one flow, we need to make a Switch.

Steps:

  1. Drag and drop a Switch Node

  2. Open Node

    1. Change Property to msg.session.type

    2. Add an option for the chat channel

      == Chat

    3. Add the option otherwise for audio/video

  3. Connect end of the Say node to begin of the Switch Node

You can now continue with both the interactions below. Connect option 1 to the Chat-bot interaction, and connect the otherwise option to the Voice-bot interaction.
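The Switch node's decision can be sketched as a small function. This is only an illustration of the routing logic, assuming chat sessions report type "Chat" (as in the filter options above) and everything else is treated as audio/video.

```javascript
// Sketch of the Switch node: route on msg.session.type.
// "Chat" goes to option 1 (the chat branch); anything else falls
// through to "otherwise" (the voice branch).
function routeBySessionType(msg) {
  return msg.session.type === "Chat" ? "chat" : "otherwise";
}

console.log(routeBySessionType({ session: { type: "Chat" } }));       // chat
console.log(routeBySessionType({ session: { type: "AudioVideo" } })); // otherwise
```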

 

Interaction - Chat-bot

Now for the Chat-bot we want to give the customer the option to send their message. This can be done with a Wait for Chat Node.

Steps:

  1. Drag and drop a Wait for Chat node

  2. Connect end of the Say node (or option 1 of the switch node) with begin of the Wait for Chat node

 

Interaction - Voice-bot

For the Voice-bot we want to listen to the customer. This can be done with the Transcriptor Node.

Steps:

  1. Drag and drop a Transcriptor node

  2. Open Node

    1. Optionally change the language of the node

  3. Connect the end of the Say node (or option otherwise of the switch node) with begin of the Transcriptor node.

 

By default the Transcriptor node will keep listening to the customer and restarting the flow. This is useful when integrating with a natural language understanding API that can hold a conversation (for example DialogFlow). In this case we need to stop after the first output. This can be done by stopping the transcriptor with a stop payload.

Steps:

  1. Drag and drop Change node

  2. Open Node

    1. Create rule

      1. Set: msg.payload.stop

        to: true (Boolean)

  3. Connect "events with transcriptor" end with begin of the Change node

  4. Connect end of the Change node with begin of the Transcriptor node

 

Finally we need to store the output in an object we can use later. This can be done with a Change node.

Steps:

  1. Drag and drop a Change node

  2. Open Node

    1. Create rule

      1. Set: msg.payload

        to: msg.payload.transcriptor.transcript

  3. Connect "events with transcriptor" end with begin of the Change node

 

Configure DialogFlow

Next is the DialogFlow node, the node that connects your IVR to the smart assistant. This node needs to be configured, but the Anywhere365 Dialogue Studio presents the information to the creator. To configure this node correctly you need a working DialogFlow assistant and access to the Google Cloud console. Also set the identifier to 'msg.session.id'; this will make sure the user stays in the same session and does not start over after one run through DialogFlow.

Steps:

  • Drag and drop a DialogFlow node

  • Open Node:

    • Agent = Select your DialogFlow configuration

      Note If you haven't got a configuration yet, you can add a new configuration.

      Login to your Google Dialogflow administration.

      Get the "Service Account" credentials for your desired Agent.

      Download the JSON file and copy/paste the content in the field.

    • Session = msg.session.id

    • Language = Select the language of the bot.

  • Connect the output of the previous node(s) (Voice-bot = Change node / Chat-bot = Wait for Chat node) to the begin of the DialogFlow node.

Now whenever a user says something in the IVR, the message is sent to DialogFlow, which detects the intent and entities and, if needed, continues the dialogue to obtain the information needed to route the user to the correct Skill or handle their questions.
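As background, the shape of the request the DialogFlow node effectively sends can be sketched as below. The field names follow Dialogflow's v2 detectIntent API; the helper function is our own illustration, not part of Dialogue Studio.

```javascript
// Sketch of a Dialogflow detectIntent request body: the session id keeps
// the caller in one conversation, the text is the customer's utterance.
function buildDetectIntentRequest(msg, languageCode) {
  return {
    session: msg.session.id, // same id per caller -> same Dialogflow session
    queryInput: {
      text: { text: msg.payload, languageCode },
    },
  };
}

const req = buildDetectIntentRequest(
  { session: { id: "abc-123" }, payload: "I lost my ticket" },
  "en-US"
);
console.log(req.queryInput.text.text); // I lost my ticket
```

This is why the Session field must be set to msg.session.id: a new session id on every message would make Dialogflow forget the conversation so far.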

 

Handle response

Now that we have our response, we need to handle it appropriately. First we need to store the output in an object we can use later. This can be done with a Change node.

Steps:

  1. Drag and drop a Change node

  2. Open Node

    1. Create 2 rules

      1. Set: msg.completed

        to: msg.payload.diagnosticInfo.fields.end_conversation.boolValue

      2. Set: msg.intent

        to: msg.payload.intent.displayName

      Note Additionally if you have parameters you can find them in:

      msg.payload.parameters.fields.[parametername].stringValue

  3. Connect end of the DialogFlow node with begin of the Change node
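The two Change-node rules can be sketched as one function over the DialogFlow response payload. We use optional chaining as an extra safeguard, on the assumption that `diagnosticInfo.fields.end_conversation` may be absent when the conversation is not ending; the function name is illustrative.

```javascript
// Equivalent of the two Change node rules: copy the completed flag and
// the detected intent name out of the Dialogflow response payload.
function handleResponse(msg) {
  msg.completed =
    msg.payload.diagnosticInfo?.fields?.end_conversation?.boolValue ?? false;
  msg.intent = msg.payload.intent?.displayName ?? "";
  return msg;
}

const m = handleResponse({
  payload: {
    diagnosticInfo: { fields: { end_conversation: { boolValue: true } } },
    intent: { displayName: "ticket_status" },
  },
});
console.log(m.intent, m.completed); // ticket_status true
```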

 

With this information you can introduce a ‘Switch’ node, which lets you route the flow based on e.g. intents, parameters or entities. This way you can route the dialogue to nodes that check your SQL (Learn More), API or other sources. In this example we will switch on intent.

Steps:

  1. Drag and drop a Switch node

  2. Open node

    1. Change Property to msg.intent

    2. Add an option per intent, for example:

      == ticket_status

    3. Add the option otherwise, in case DialogFlow didn't detect the intent.

  3. Connect end of the Change node to begin of the Switch node
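The intent switch can be sketched as follows; "ticket_status" is just the example intent from the steps above, and the branch labels are ours.

```javascript
// Sketch of the Switch node on msg.intent: one branch per known intent,
// plus "otherwise" for anything Dialogflow did not map to our intents.
function routeByIntent(msg) {
  switch (msg.intent) {
    case "ticket_status":
      return "ticket_status"; // e.g. look up the ticket in SQL or an API
    default:
      return "otherwise"; // intent not detected or not handled
  }
}

console.log(routeByIntent({ intent: "ticket_status" })); // ticket_status
console.log(routeByIntent({ intent: "unknown" }));       // otherwise
```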

 

You can now continue with your flow, based on the intent of the customer. Below are some scenarios you can encounter when using DialogFlow.

 

(Optional) Play DialogFlow Response

In DialogFlow you can also write responses when an intent has been detected, or a prompt when a parameter is missing. These responses can be used by Dialogue Studio; this is done using a Say node.

Steps:

  1. Drag and drop a Say node

  2. Open node

    1. Change to expression

    2. Use the following expression:

      payload.fulfillmentMessages[0].text.text[0]

  3. Connect end of previous node to begin of the Say node
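The Say-node expression can be sketched as a function. Optional chaining is our own safeguard here, on the assumption that an intent without configured responses has no fulfillment messages in the payload.

```javascript
// Equivalent of the Say node expression:
// payload.fulfillmentMessages[0].text.text[0]
function fulfillmentText(payload) {
  return payload.fulfillmentMessages?.[0]?.text?.text?.[0] ?? "";
}

const payload = {
  fulfillmentMessages: [{ text: { text: ["Your ticket is still open."] } }],
};
console.log(fulfillmentText(payload)); // Your ticket is still open.
```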

 

(Optional) Handle end of DialogFlow conversation

In DialogFlow you can set whether a conversation is completed. This tells you if you need to send another message to DialogFlow or if you can continue with your own flow. When handling the response we stored the completed state in msg.completed. Now we need to check the value; this can be done with a Switch node.

Steps:

  1. Drag and drop a Switch node

  2. Open node

    1. Change Property to msg.completed

    2. Add an option to check if completed

      is true

    3. Add the option otherwise, in case DialogFlow isn't finished yet.

  3. Connect end of the previous node to begin of the Switch node

If not completed you can continue with "Send another response"

 

(Optional) Handle missing parameter

In DialogFlow you can configure that some parameters are required. For example, when you want to schedule a meeting, you need to know the date and time.

An intent with missing parameters will be marked as incomplete (msg.completed is false). To verify our parameter we need to check the value; this can be done with a Switch node.

Steps:

  1. Drag and drop a Switch node

  2. Open node

    1. Change Property to msg.payload.parameters.fields.[parametername].stringValue

    2. Add two options to check if the parameter is filled

      is empty

      is not empty

  3. Connect end of the previous node to begin of the Switch node

If missing, you can continue with "Send another response"
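The parameter check can be sketched as a function; the "is empty" / "is not empty" options in the Switch node correspond to the two outcomes below. The function name and the example parameter "date" are ours.

```javascript
// Sketch of the Switch node check on
// msg.payload.parameters.fields.[parametername].stringValue
function parameterFilled(msg, parameterName) {
  const field = msg.payload.parameters.fields[parameterName];
  return Boolean(field && field.stringValue); // true == "is not empty"
}

const msg = {
  payload: { parameters: { fields: { date: { stringValue: "tomorrow" } } } },
};
console.log(parameterFilled(msg, "date")); // true
```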

 

(Optional) Handle unknown intent

When DialogFlow cannot detect the intent, it will select the "Default Fallback Intent". This also contains a response, which can be sent to the customer. This will prompt the customer to send another response. By default this will be marked as an incomplete conversation (msg.completed is false).

During fallback, you can continue with "Send another response"

 

(Optional) Send another response

Depending on what kind of bot you are using, some configuration may be required if you want to send another response.

When making a Voice-bot, you don't need to do anything. The Transcriptor node (added in Interaction - Voice-bot) keeps listening and sends a new output to DialogFlow. (Unless you have stopped your transcriptor, in which case you need to start it again.)

When making a Chat-bot, you need to make sure you are listening for a new chat message. The easiest way to do this is to loop back to the beginning. You can always connect a line all the way back to the beginning, but to keep your flow easy to read we recommend using Link nodes.

Steps:

  1. Drag and drop a Link in node, near the Wait for Chat node.

  2. Open the node

    1. Set the name to "New message dialogue flow"

  3. Connect the end of this Link in node with begin of Wait for Chat node

Steps:

  1. Drag and drop a Link out node, near the "End conversation" Switch node

  2. Open the node

    1. Select the "New message dialogflow"

  3. Connect the Otherwise option of the "End conversation" Switch node with begin of this Link out node.