Creating channels – The Conversation API
What is the Conversation API
Any developer can integrate their own channels with eva. A user could benefit from a chat on the company’s website using a custom interface, or even custom response templates, such as graphs, masked inputs or interaction with other elements of the webpage.
Companies normally add virtual agent platforms to their existing apps, or use one of their internal channels to release a virtual agent for their employees. With eva, this can be done by consuming the Conversation API described below.
Conversation service
The conversation service is used to execute a conversation. Each call to this service is a message from the user that the virtual agent must process in order to understand and answer the user.
URL Parameters

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| sessionCode | String | No | Conversation ID. The first call to eva’s conversation service must not have this parameter. After the first call, this parameter is required to continue the conversation. It is returned in the service’s response. |
Request headers

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| PROJECT | String | Yes | The virtual agent name. Same name as the virtual agent created in the Cockpit. |
| CHANNEL | String | Yes | The channel name. The channel must be created in the virtual agent above through the Cockpit. |
| API-KEY | String | Yes | API key for client identification. The environment administrator must provide this value. |
| OS | String | Yes | User operating system. Example: for web chat, it might be Windows; for a mobile app, iOS. |
| OS-VERSION | String | No | Version of the operating system above. |
| BROWSER | String | No | User’s browser, when using one. |
| BROWSER-VERSION | String | No | Version of the browser above. |
| USER-REF | String | Yes | Identifies the user by a technical value, depending on the channel. Examples: for web chat, the user IP address; for IVR, the phone number; for Messenger, Facebook’s user ID. |
| BUSINESS-KEY | String | No | Identifies the user at a business level, if the channel has information about the user. Examples: in a private section of a webpage that requires logging in, the business key might be the user login; a user document number; a client number. |
| LOCALE | String | Yes | Virtual agent’s language: `<language>-<COUNTRY>`. This must be the same as configured in the Cockpit. Examples: en-US, es-ES, pt-BR. |
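As a sketch, the header map for a first web-chat call could be assembled like this (all values are illustrative; the project, channel and API key must come from your own Cockpit configuration, and the helper name is not part of the API):

```python
def build_headers(project, channel, api_key, user_ref,
                  os_name, locale, business_key=None):
    """Assemble the request-header map described in the table above."""
    headers = {
        "PROJECT": project,
        "CHANNEL": channel,
        "API-KEY": api_key,
        "OS": os_name,
        "USER-REF": user_ref,
        "LOCALE": locale,
    }
    if business_key is not None:  # BUSINESS-KEY is optional
        headers["BUSINESS-KEY"] = business_key
    return headers

# Example: a web chat identifying the user by IP address.
headers = build_headers("MYBOT", "Web chat", "01a86ff7-...", "10.0.0.1",
                        "Windows", "en-US")
```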
Request body

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| text | String | No | Text input by the user, or a transcription from audio. Either this value or a code (below) must be provided. |
| code | String | No | On the first call, the code `%EVA_WELCOME_MSG` can be sent to execute the Welcome flow created in the Cockpit. This code might also be used to locate a specific answer (see Loading answers in this chapter). Either this value or a text (above) must be provided. |
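Since exactly one of `text` or `code` must be provided, a client can validate the body before sending it. A minimal sketch (the helper name is illustrative, not part of the API):

```python
def build_body(text=None, code=None, context=None):
    """Build a conversation request body holding exactly one of text/code."""
    if (text is None) == (code is None):
        raise ValueError("Provide either 'text' or 'code', not both or neither")
    body = {"text": text} if text is not None else {"code": code}
    if context:
        body["context"] = context
    return body

# First call: run the Welcome flow, optionally seeding the context.
first_call = build_body(code="%EVA_WELCOME_MSG", context={"user": 25237})
# Subsequent call: plain user text.
follow_up = build_body(text="How much do I have in my account?")
```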
Response body

| Name | Type | Description |
| --- | --- | --- |
| text | String | The same text sent in the request or, if no text was provided, the code. |
| sessionCode | String | Conversation identifier, generated in the first request. This value must be sent in the URL of the following calls, as explained in URL Parameters in this chapter. |
| intent | String | Name of the intent returned by the NLP for the user message. |
| confidence | Double | Confidence score for the intent above, from 0 to 1. |
User Input

| Name | Type | Description |
| --- | --- | --- |
| type | String | Same type as selected by the editor in the Cockpit through the input cell modal. |
| callToAction | String | For chatbots, text for the input field placeholder for the next message. |
| pattern | String | When the selected type is ‘Custom’, this field contains the pattern filled in by the editor in the Cockpit. |
Answer

| Name | Type | Description |
| --- | --- | --- |
| content | String or JSON Array | Depends on the type of the answer. If it is a Carousel, this field contains a JSON Array with each card of the carousel. For a file answer, it contains a URL and a filename. For other types, it is a String with the content filled in by the editor in the Cockpit. |
| buttons | Button[] | Button list configured for the answer; these buttons are shown inside the response card. |
| quickReply | Button[] | Button list configured for the answer; these buttons are shown as a carousel above the user input. |
| description | String | Answer’s description. This information is inserted by the editor in the Cockpit and is for organization purposes. It isn’t mandatory. |
| type | String | Card template selected for the answer. Types include: TEXT_OPTIONS (when the channel is ALL – the default response for any channel), TEXT, IMAGE, AUDIO, VIDEO, FILE, CAROUSEL, CUSTOM. |
| interactionId | String | UUID representing the current interaction. This value can be used for answer like/dislike (thumbs up and down). |
| evaluable | Boolean | true – this answer must show a thumbs up / thumbs down (like / dislike) option to the user; false – otherwise. See Likable service. |
Button

| Name | Type | Description |
| --- | --- | --- |
| name | String | Text of the button, shown to the user and sent back as text on the next call if the button is clicked (depending on the type). |
| type | String | Possible values: URL – the button opens a browser page; FLOW – the button is an action in the conversation; in this case, when it is clicked, another API call must be made using the name of the button as the text. |
| action | String | If the type is URL, this field contains the URL that the browser will open. |
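A channel front-end can dispatch on the button type as described above. A minimal sketch, where `open_url` and `send_text` are hypothetical callbacks supplied by the front-end (they are not part of the API):

```python
def handle_button_click(button, open_url, send_text):
    """Dispatch a clicked button according to its type."""
    if button["type"] == "URL":
        open_url(button["action"])   # open a browser page
    elif button["type"] == "FLOW":
        send_text(button["name"])    # next conversation call uses the button text
    else:
        raise ValueError(f"Unknown button type: {button['type']}")

# Example: a FLOW button feeds its name back into the conversation.
sent = []
handle_button_click({"type": "FLOW", "name": "check balance"},
                    open_url=lambda url: None, send_text=sent.append)
```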
Carousel card

| Name | Type | Description |
| --- | --- | --- |
| imageUrl | String | URL for the image on the card |
| title | String | Title of the card |
| subTitle | String | Subtitle of the card |
| buttons | Button[] | Buttons for the card |
Sample requests
The request below is an example of a first call to the conversation service, requesting the execution of the welcome flow. It also adds a variable to the context, although this is not necessary.

```json
{
  "code": "%EVA_WELCOME_MSG",
  "context": {
    "user": 25237
  }
}
```

Another possible request, for subsequent user messages:

```json
{
  "text": "How much do I have in my account?",
  "context": {
    "user": 25237,
    "foo": "bar"
  }
}
```
Sample response
The following JSON is an example of a response to the request above.

```json
{
  "text": "How much do I have in my account?",
  "sessionCode": "555c03b6-d22f-431d-b82e-2b33aff5719d",
  "intent": "BALANCE",
  "confidence": 0.88,
  "answers": [
    {
      "quickReply": [],
      "interactionId": "2f300826-629f-495c-b925-3d6131946934",
      "buttons": [],
      "description": "",
      "type": "TEXT_OPTIONS",
      "content": "You have 10 points in your wallet."
    }
  ],
  "context": {
    "user": 25237,
    "foo": "bar"
  }
}
```
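A front-end typically keeps the `sessionCode` for the next call and walks the `answers` array, rendering each answer by its type. A minimal sketch over the sample response above (the rendering helper is illustrative):

```python
import json

sample = json.loads("""{
  "text": "How much do I have in my account?",
  "sessionCode": "555c03b6-d22f-431d-b82e-2b33aff5719d",
  "intent": "BALANCE",
  "confidence": 0.88,
  "answers": [{"quickReply": [], "interactionId": "2f300826-629f-495c-b925-3d6131946934",
               "buttons": [], "description": "", "type": "TEXT_OPTIONS",
               "content": "You have 10 points in your wallet."}],
  "context": {"user": 25237, "foo": "bar"}
}""")

def render_answers(response):
    """Return displayable text per answer; carousels would need their own renderer."""
    texts = []
    for answer in response["answers"]:
        if answer["type"] == "CAROUSEL":
            # content is a JSON Array of cards in this case
            texts.append(f"[carousel with {len(answer['content'])} cards]")
        else:
            texts.append(answer["content"])
    return texts

session_code = sample["sessionCode"]  # reuse in the URL of the next call
```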
Loading answers
When you want to avoid NLP calls, eva offers a front-end pre-processing option that bypasses cognitive processing. The CODE practice ties a specific code to a specific answer and obliges eva to deliver that answer.
In eva, a call to the Conversation API with “code”: “%EVA_WELCOME_MSG” loads the welcome flow. When this code appears, eva is obliged to load the welcome flow. The extension of this behavior to any other answer is what is called the CODE practice.
When you register an answer, its name will also be its “code”: eva will deliver that specific answer when faced with that code. If the answer is transactional, the transaction is done before the answer is delivered. If the answer is not found, the “code” content is sent to the NLP so it can be interpreted.
When the eva API receives both a “code” and a “text”, the code is processed and the text is not (unless the text is used by a transactional component). If an answer with the same name as the “code” content is not found, the “text” content is sent to the NLP. This also happens in the middle of a flow: if a code is sent in the middle of a flow, the flow is stopped to run the code.
So, eva’s loading priority is: code → answer → NLP → fallback.
Every code interaction is registered in the User Interactions table.
This is useful when you want to build a clickable menu with preset options and each option is a code. For example, a simple menu with options such as “check balance”, “check opening times” and “ask for a refund”.
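The loading priority can be sketched as a resolution function. In the sketch below, `find_answer`, `ask_nlp` and `fallback` are hypothetical callbacks standing in for eva's answer registry, NLP engine and fallback flow:

```python
def resolve(code, text, find_answer, ask_nlp, fallback):
    """Sketch of eva's loading priority: code -> answer -> NLP -> fallback."""
    if code is not None:
        answer = find_answer(code)  # is an answer registered under this code?
        if answer is not None:
            return answer
        # Answer not found: the text (if provided) or the code content goes to the NLP.
        query = text if text is not None else code
        return ask_nlp(query) or fallback()
    return ask_nlp(text) or fallback()

# Example: a clickable menu where each option is a registered code.
answers = {"check balance": "You have 10 points in your wallet."}
result = resolve("check balance", None, answers.get,
                 ask_nlp=lambda t: None, fallback=lambda: "Sorry, I didn't get that.")
```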
Likable service
The likable service is used when an answer is configured to be evaluable. When this option is enabled, the answer should give the user a thumbs up / thumbs down option (like / dislike) in the chat.

When the user likes or dislikes an answer, this service must be called.
Request body

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| evaluation | Boolean | Yes | true – the user liked the answer (thumbs up); false – the user disliked the answer (thumbs down). |
| interactionId | String | Yes | The answer’s interactionId; it must be the same one received through the conversation service. |
Response body
The likable service will return an HTTP status 200 with a “Success” string.
Sample request
```json
{
  "evaluation": true,
  "interactionId": "7cf85a1e-244b-4c75-bc8d-bb188911c724"
}
```
Sample Response
"Success"
Satisfaction service
The satisfaction service is used to register the user’s evaluation of a conversation session.
URL parameters

| Name | Type | Required | Description |
| --- | --- | --- | --- |
Request headers

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| API-KEY | String | Yes | API key for client identification. The environment administrator must provide this value. |
| LOCAL | String | Yes | Bot’s language: `<language>-<COUNTRY>`. This must be the same as configured in the Cockpit. Examples: en-US, es-ES, pt-BR. |
Request body

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| evaluation | Short number | Yes | This number represents how the user graded the bot. A number from 1 to 10 is recommended, but other systems can be used (e.g. 5 stars). |
| answered | Short number | Yes | Considering that a user interacts with a bot to have a question/problem answered: 1 – the user’s problem was solved; 0 – it was not. |
| userComments | String | No | User comments about the session. |
| expireSession | Boolean | Yes | true – the session should be expired; false – the session must not be expired because the conversation might still continue. |
Response body
The satisfaction service will return an HTTP status 200 with a “Success” string.
Sample request
```json
{
  "evaluation": 5,
  "answered": 1,
  "userComments": "You answered where to find the information, but why didn't you give the info in the chat?",
  "expireSession": true
}
```
Sample response
"Success"
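A client can validate the satisfaction payload before sending it. A minimal sketch (the helper name is illustrative, and the 1–10 bound follows the recommendation above rather than a hard API rule):

```python
def build_satisfaction(evaluation, answered, expire_session, user_comments=None):
    """Validate and build a satisfaction request body."""
    if not 1 <= evaluation <= 10:
        raise ValueError("evaluation is recommended to be between 1 and 10")
    if answered not in (0, 1):
        raise ValueError("answered must be 1 (problem solved) or 0 (not solved)")
    body = {
        "evaluation": evaluation,
        "answered": answered,
        "expireSession": bool(expire_session),
    }
    if user_comments:  # optional field
        body["userComments"] = user_comments
    return body

payload = build_satisfaction(5, 1, True, "Helpful, thanks!")
```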
Integrating Existing Channels
How to integrate other messaging platforms and smart assistants, such as Facebook Messenger, Whatsapp and Google Assistant
Besides custom-developed front-ends, such as web chats, mobile chats and IVR, eva can integrate with other messaging platforms and smart assistants, such as Facebook Messenger. This connection requires a connector between eva and the channel, translating the Conversation API into the form of communication used by the channel.
Some of these connectors exist out of the box in eva and are explained below.
Infobip Whatsapp connector
To configure an Infobip integration, the first step is to add some configuration parameters through the Cockpit. Access the Settings screen and go to the Parameters section.
Add the following parameters:
1. whatsapp.broker.info – this parameter contains the Conversation API configuration for the connector.
The value must be a JSON with these attributes:
| Name | Type | Required | Description |
| --- | --- | --- | --- |
| baseUrl | String | Yes | Base URL of eva’s Conversation API. |
| apiKey | String | Yes | API key for client identification (see API-KEY in the Conversation Service request headers). |
| project | String | Yes | Name of the virtual agent to be used. |
| channel | String | Yes | Name of the channel as created in the Cockpit. |
| os | String | Yes | See the operating system header in the Conversation Service request. |
| locale | String | Yes | Locale configured for the virtual agent, e.g. es-ES. |
Example:

```json
{
  "baseUrl": "<EVA_CONVERSATION_API_BASE_URL>",
  "apiKey": "01a86ff7-e7eb-4482-82ef-d5a3cd120b19",
  "project": "MYBOT",
  "channel": "WhatsApp Help Desk",
  "os": "Infobip connector",
  "locale": "es-ES"
}
```
2. whatsapp.infobip.info – this parameter configures the Infobip end of the integration. It is also a JSON value.
| Name | Type | Required | Description |
| --- | --- | --- | --- |
| omniUrl | String | Yes | This URL is provided by Infobip. Log in to portal.infobip.com and access https://dev.infobip.com/#programmable-communications/omni-failover/list-all-omni-failover-scenarios. Copy the URL after the GET and remove the /scenarios at the end. |
| user | String | Yes | Your Infobip portal user. |
| password | String | Yes | Password for the user above. |
| whatsappNumber | String | Yes | The Whatsapp number given by Infobip. |
| channel | String | Yes | Fixed value: “WHATSAPP”. |
| keyword | String | Yes | The keyword used by eva to represent a channel. |
Example:

```json
{
  "omniUrl": "https://h38h8.api.infobip.com/omni/1",
  "user": "MyUser",
  "password": "Password1234",
  "whatsappNumber": "447494163530",
  "channel": "WHATSAPP"
}
```
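A quick sanity check of both parameter values can catch missing keys before saving them in the Cockpit. A sketch, where the required-key sets follow the tables above:

```python
import json

# Required keys per Cockpit parameter, as listed in the tables above.
REQUIRED = {
    "whatsapp.broker.info": {"baseUrl", "apiKey", "project", "channel", "os", "locale"},
    "whatsapp.infobip.info": {"omniUrl", "user", "password",
                              "whatsappNumber", "channel", "keyword"},
}

def missing_keys(parameter_name, raw_json):
    """Return the set of required keys absent from a parameter's JSON value."""
    return REQUIRED[parameter_name] - set(json.loads(raw_json))

# The sample value above omits "keyword", so the check flags it.
missing = missing_keys("whatsapp.infobip.info",
                       '{"omniUrl": "https://h38h8.api.infobip.com/omni/1", '
                       '"user": "MyUser", "password": "Password1234", '
                       '"whatsappNumber": "447494163530", "channel": "WHATSAPP"}')
```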
The last step for this configuration is to go to the Infobip portal and, in the number configuration, paste your Infobip Connector URL in the URL field.
Important:
The Infobip Connector URL is provided by your eva installation administrator.
Google Assistant connector
This section shows how to integrate with Google Assistant, which can be used in Google Home as well.
Link: Google Actions
1. Create your project in Google Actions
2. Action invocation
Give your action a name for users to use it. For example, if your action’s name is “eva car”, the user will use it by saying “ok google, open eva car”.
It is recommended to put simple words and test the pronunciation.
3. Configuring the action
Download and install the gactions CLI and execute the following command.

```shell
$ gactions init
```

This generates a file named action.json. Open this file and change the conversation name, name and URL.

```json
{
  "actions": [
    {
      "description": "Default Welcome Intent",
      "name": "MAIN",
      "fulfillment": {
        "conversationName": "<INSERT YOUR CONVERSATION NAME HERE>"
      },
      "intent": {
        "name": "actions.intent.MAIN",
        "trigger": {
          "queryPatterns": [
            "talk to <INSERT YOUR NAME HERE>"
          ]
        }
      }
    }
  ],
  "conversations": {
    "<INSERT YOUR CONVERSATION NAME HERE>": {
      "name": "<INSERT YOUR CONVERSATION NAME HERE>",
      "url": "<INSERT YOUR URL HERE>"
    }
  },
  "locale": "en"
}
```
The conversation name and the name are based on your action.
The URL is made of two parts:
- The endpoint – this part is provided by your eva installation administrator and will end in /conversations.
- The channel ID – the last part is the ID of the channel that is to be used by Google Assistant.
Your URL should look like this: https://ga-mydemo-eva.bot/conversations/53
4. Update your action
Go to the Settings tab and copy the Project ID. This information is needed to update the action with the JSON file altered in the previous step.

Update the project with the command below.

```shell
$ gactions update --action_package action.json --project <Project ID>
```
This command will provide a URL for accreditation. Access this URL and copy the code to your terminal. This works as a confirmation of your identity.
5. Test it!
Execute the command below to test your integration in the terminal.

```shell
$ gactions test --action_package action.json --project <Project ID>
```
With all steps executed, your integration is done and you can test your virtual agent. Follow the Google documentation to test it and publish.
Facebook Messenger connector
To configure the Messenger connector, you must access the Facebook for Developers console and have access to eva’s MySQL database. Then, execute the following steps.
1. Create your Facebook App
In the Facebook for Developers console, create a new App or use an existing one of your choice.
2. Add Messenger configuration
In the App Home, click on the Set Up button in the Messenger box.
If this box does not appear to you, click on the plus button beside the PRODUCTS label in the left menu.
3. Get your Facebook Page ID and Name
In Facebook, access the page in which you want to enable the messenger chat and copy the Page ID and Page Name. This information will be used in the next step.
The Page ID can be found in the URL after you access your page. For example:
https://www.facebook.com/MyFacebookPage-105498651278284/?modal=admin_todo_tour&ref=admin_to_do_step_controller
The Page ID above is 105498651278284.
The Page Name is the same that appears on the screen.
4. Select the channel and configure a token in eva
Access eva’s database and fetch the channel ID. To do that, first check your virtual agent ID through the URL in the Cockpit: after selecting the virtual agent, the number between the slashes, after /home/, is its ID.
For example, if the URL shows:
https://mybot.eva.bot/home/12/dialogmanager/workspace
the virtual agent ID in this case is 12.
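Extracting that ID can be sketched with a regular expression (the URL is the example above; the helper name is illustrative):

```python
import re

def virtual_agent_id(cockpit_url):
    """Pull the virtual agent ID out of a Cockpit URL (the number after /home/)."""
    match = re.search(r"/home/(\d+)/", cockpit_url)
    if match is None:
        raise ValueError("No /home/<id>/ segment found in URL")
    return int(match.group(1))

agent_id = virtual_agent_id("https://mybot.eva.bot/home/12/dialogmanager/workspace")
```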
With this ID, run the following command on the MySQL database:

```sql
SELECT id, name FROM channel WHERE removed = 0 AND botId = <BOT_ID>;
```

This will return a list of channels. Find the one to integrate with Facebook Messenger and copy its ID; it will be used in the next SQL command.
Choose a security token. This can be any sentence to be used on both eva and Facebook to check that your Facebook Developer App and eva instance are yours. For example, a token could be simply “eva-facebook-security-token”.
Now, execute the following command, replacing the values:

```sql
insert into facebook_configuration (pageId, pageName, hubToken, pageAccessToken, channelId)
values ('<PAGE_ID>', '<PAGE_NAME>', '<TOKEN>', '', <CHANNEL_ID>);
```
5. Configure Webhook
On Facebook for Developers console, go to the Messenger > Settings menu (if you clicked on the “Set Up” button before, you should be on this page already) and click on the “Add Callback URL” in the Webhooks box.
In the modal that opens, the Callback URL is provided by your eva’s installation administrator. It must be a URL ending in /fb/webhook.
Example URL: https://facebook-myeva.eva.bot/fb/webhook
In the Verify Token field, put the same token created before and click on the Verify and Save button.
This step configures how Facebook will call eva.
6. Select a Facebook Page
In the Webhooks box, click the Add or Remove Pages button and select which page you want to use. You will have to give Facebook for Developers permission to access your account for this step to work.
A table will appear with the columns Pages and Webhooks. Click on the Add Subscriptions button and select the messages and messages_postbacks options.
Save it.
On the Access Tokens box, click the Generate Token button for your page and copy the token; it will be used to update the facebook_configuration table.
7. Configure the page token in eva
The final step for configuring the integration is to update the database with the page token. Execute the command below in your MySQL database, replacing the page token and page ID.

```sql
update facebook_configuration set pageAccessToken = '<PAGE_TOKEN>' where pageId = '<PAGE_ID>';
```
8. Test it!
To test your virtual agent, enable the chat button in the Facebook Page by clicking the “Add button” on the top right and selecting the “Send message” option. You are now ready to test your virtual agent.
Important:
This manual shows how to integrate a virtual agent with Facebook in development mode. This will not make it available to all Facebook users. To set a virtual agent to production mode, you still have to follow Facebook compliance rules and submit it for evaluation.