Other NLP and LLM Connectors


Last updated 8 months ago


In this chapter, you'll learn how to connect the virtual agent to an external engine.

The NTT DATA proprietary Natural Language Processing (NLP) engine comes integrated by default.

Syntphony Conversational AI allows you to use different NLP engines:

  1. IBM Watson Assistant

  2. Google Dialogflow Essentials

  3. Microsoft Luis

  4. Amazon Lex

  5. OpenAI for LLM models

To use any of these engines, follow the step-by-step instructions below.

Click on Change Model to open the window with the other options.

NLP

IBM Watson Assistant

Watson is a service package offered by IBM. It includes question-answering software that applies natural language processing, information retrieval, knowledge representation, automated reasoning, and machine learning technologies to answer questions posed in natural language.

1) Go to https://login.ibm.com/

2) Log in with your IBMid

3) Click on "skills" in the upper left corner

4) Then click “create skill” to create a virtual agent on Watson

5) If you have existing skills, select one, then click on the menu in the upper right corner of the selected skill card.

6) Click on “view API details”

7) If you are using a newer account, copy the values after Assistant URL and API Key and insert them in the cockpit. Remember to switch to the newer version in Syntphony Conversational AI.

8) If you are using an older account, copy the links and codes after v1 Workspace URL, Username and Password and insert them on Syntphony CAI. Remember to switch to the older version in Syntphony CAI.
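As a quick sanity check on the newer-account credentials, note that IBM Cloud API keys are sent over HTTP Basic authentication with the literal username apikey. A minimal Python sketch (the key value below is a placeholder) that builds the Authorization header you would send to the Assistant URL:

```python
import base64

def watson_auth_header(api_key: str) -> str:
    """Build the HTTP Basic auth header for an IBM Cloud API key.

    IBM Cloud services accept the literal username 'apikey' with the
    key itself as the password.
    """
    credentials = f"apikey:{api_key}".encode()
    return "Basic " + base64.b64encode(credentials).decode()

# Placeholder key, for illustration only:
print(watson_auth_header("my-ibm-cloud-api-key"))
```

Sending a GET request with this header to the Assistant URL is an easy way to confirm the key is valid before pasting it into the cockpit.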

Google Dialogflow Essentials

Google Dialogflow is a human-computer interaction framework that works on natural language.

1) Go to https://dialogflow.com/

2) Then click on “go to console”.

3) Click on settings on the upper left corner (the cogwheel icon - see image).

4) Click the link right after “Project ID”.

5) You will be taken to a page in the Google Cloud Platform.

6) Once in the Google Cloud Platform, click on the link below “e-mail”.

Important: Remember to change your agent's permissions, or else your intents won't work.

7) Go to IAM on the upper left corner of the menu (as shown in the image below).

8) Once there, click on the edit icon (pencil) on the right of the agent named as Dialogflow Integrations (see image below).

9) Now, select “Dialogflow” and then “Dialogflow API Admin” (as shown in the image below).

10) Once you have changed your agent's permissions, go to “service accounts” and then click on the menu on the right of the agent you want to use.

Important: If you don’t have a Service Account, click on “Create Service Account” and create one

11) Click on “create key” and select JSON.

12) Save the JSON file on your computer.

There is a new option to configure Dialogflow multi-region agents.

13) (Optional) If you want to use a Dialogflow agent from a specific region, you need to add a new parameter called region to the JSON file. This parameter must contain the official region identifier described in this table:

| Country grouping | Geographic location | Region ID |
| --- | --- | --- |
| Europe | Belgium | europe-west1 |
| Europe | London | europe-west2 |
| Asia-Pacific | Sydney | australia-southeast1 |
| Asia-Pacific | Tokyo | asia-northeast1 |
| Global | Dialogflow delivery is global, data at rest is within the US | global |

If this parameter does not exist when creating the bot in Syntphony CAI, the global region will continue to be used by default as it has been to date.

Example Dialogflow metadata JSON with the new “region” parameter added as the last field (JSON does not allow comments, so the parameter is simply appended):

{
  "type": "service_account",
  "project_id": "projectId",
  "private_key_id": "d8313783b67e14489ef0ea8b2fafd2b23c62c507",
  "private_key": "-----BEGIN PRIVATE KEY-----CRIPTED_KEY-----END PRIVATE KEY-----\n",
  "client_email": "email@email.iam.gserviceaccount.com",
  "client_id": "1234",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/...",
  "region": "australia-southeast1"
}

14) Upload this file when creating a Dialogflow virtual agent in the cockpit to complete the integration.
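Step 13 can also be scripted. A minimal Python sketch (the metadata dict is a truncated placeholder; a real file has the full set of fields) that adds the region parameter to the downloaded service-account metadata:

```python
import json

def add_dialogflow_region(metadata: dict, region_id: str) -> dict:
    """Return a copy of the Dialogflow service-account metadata with the
    'region' parameter set to one of the official region identifiers."""
    updated = dict(metadata)
    updated["region"] = region_id  # e.g. "europe-west2" for London
    return updated

# Minimal placeholder metadata for illustration:
metadata = {"type": "service_account", "project_id": "projectId"}
updated = add_dialogflow_region(metadata, "australia-southeast1")
print(json.dumps(updated, indent=2))
```

Write the updated dict back to the JSON file, then upload that file in the cockpit as described above.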

Microsoft Luis

Language Understanding (LUIS) is a cloud-based API service that applies custom machine-learning intelligence to a user's conversational, natural language text to predict overall meaning, and pull out relevant, detailed information.

To integrate LUIS with Syntphony Conversational AI, you must have an active Azure account with the required resources already created.

1) Go to luis.ai

2) Login with your Microsoft account.

3) Create an app or click on an existing one.

4) Click on “manage”.

5) Then click on “Azure Resources” at the left.

6) Copy the example query, located at the bottom of the screen.

7) Then, click on authoring resource and copy the primary key.

8) Paste the Example Query on the URL prediction field and the primary key on the authoring key field.

Using system entities in Luis

Syntphony Conversational AI supports Luis version 2. When using the datetimeV2 system entity in Luis, you can use subcategories such as:

  • date

  • time

  • datetime

  • daterange

  • timerange

  • datetimerange

Those subcategories should be added after a dot (.). For example, if you are using the date subcategory, the entity name should be builtin.datetimeV2.date, where builtin.datetimeV2 is the system entity name and date is the subcategory.

For further information, check Microsoft's documentation on the subtypes of datetimeV2: https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-reference-prebuilt-datetimev2?tabs=1-3%2C2-1%2C3-1%2C4-1%2C5-1%2C6-1#subtypes-of-datetimev2
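The naming rule can be expressed as a small helper. A Python sketch (the function is ours, not part of Luis or Syntphony CAI, and is only meant to illustrate the composition):

```python
# Subcategories of the datetimeV2 system entity supported in Luis v2
VALID_SUBCATEGORIES = {"date", "time", "datetime", "daterange", "timerange", "datetimerange"}

def datetimev2_entity_name(subcategory: str) -> str:
    """Compose the full entity name: system entity + '.' + subcategory."""
    if subcategory not in VALID_SUBCATEGORIES:
        raise ValueError(f"Unknown datetimeV2 subcategory: {subcategory}")
    return f"builtin.datetimeV2.{subcategory}"

print(datetimev2_entity_name("date"))  # builtin.datetimeV2.date
```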

Amazon Lex

You will be asked to provide some information on your request, as listed below:

1. Create a new user

AWS User and Password

  • Log in and access IAM in the AWS menu

  • Create a new user by clicking on Users on the Access Management menu

  • Fill in the required information (tip: try naming it with something obvious, such as syntphony-user).

  • Then, click on "Next: Permissions", select "Attach existing policies directly", and enable the "AmazonLexFullAccess" policy.

  • Finish creating the user and download the CSV file.

This file contains all the data required to integrate Syntphony Conversational AI to your AWS account.

2. Go back to Amazon Lex page

Access the Services menu to go back to the Amazon Lex page

Name

Then, proceed to the side menu to access the virtual agent you want to integrate. Click on the name to open the "Bot details" card and copy the ID.

Alias

On the same side menu, choose "Implementation" and then "Aliases". Select the alias you want to use, then find the value in the "ID" field within "Details".

Region

There are two ways of finding the region. One way is to click on the top bar and check the selected region (as seen below).

The other way is through the URL, for example: “https://us-east-1.console.aws.amazon.com/lexv2/home?region=us-east-1#bot/YZ24GFVCSX”. Note that it shows the region us-east-1.
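If you prefer to script it, the region can be parsed from the console URL with the standard library alone. A minimal Python sketch:

```python
from urllib.parse import urlparse, parse_qs

def region_from_console_url(url: str) -> str:
    """Extract the AWS region, preferring the 'region' query parameter
    and falling back to the first label of the hostname."""
    parsed = urlparse(url)
    query_region = parse_qs(parsed.query).get("region")
    if query_region:
        return query_region[0]
    return parsed.hostname.split(".")[0]

url = "https://us-east-1.console.aws.amazon.com/lexv2/home?region=us-east-1#bot/YZ24GFVCSX"
print(region_from_console_url(url))  # us-east-1
```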

Version

Now select "Draft Version" and find the field "Version".

This is the information required to integrate Amazon Lex.

LLM

OpenAI

You'll be asked to provide the following information:

Endpoint

The current OpenAI Endpoint is always the same: https://api.openai.com.

-> Read the OpenAI documentation to learn more about endpoints.

API Key

1) Access https://platform.openai.com/docs/overview and click on the lock icon, corresponding to the API Keys.

2) When you're on the API Keys page, click on Create New Secret Key:

3) Enter a name that represents the key and click on Create Secret Key.

4) After the key is created and before you click Done, remember to save it somewhere right away. ⚠️ It is only possible to view the key at the time of creation.

Deployment Name

After filling out the Endpoint and API Key fields, the system will load the available model options.

Refer to the OpenAI documentation to learn about the models: https://platform.openai.com/docs/models

Tokens Limit

A token is roughly 3-5 characters long, but its exact length may vary. The token count usually consists of the sum of both the system prompt and the user input.

You can set the token limit when creating the virtual agent or on the Parameters page. The outcome may depend on the availability of the chosen generative service and the token limit defined. If you're using Azure OpenAI by Syntphony CAI, the limit is set at 4000 tokens.
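The 3-5 characters heuristic gives a rough way to check whether a prompt fits within the limit before sending it. A back-of-the-envelope Python sketch (dividing by 4 characters per token is our assumption within the stated range; real tokenizers will differ):

```python
def estimate_tokens(system_prompt: str, user_input: str, chars_per_token: int = 4) -> int:
    """Rough token estimate: total characters divided by ~4 chars/token,
    rounded up. Leave a safety margin for real tokenizer behavior."""
    total_chars = len(system_prompt) + len(user_input)
    return -(-total_chars // chars_per_token)  # ceiling division

TOKEN_LIMIT = 4000  # limit when using Azure OpenAI by Syntphony CAI

estimate = estimate_tokens("You are a helpful virtual agent.", "What are your opening hours?")
print(estimate, estimate <= TOKEN_LIMIT)
```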

Azure OpenAI

You will be asked to provide the following information:

API Key and Endpoint

1) Once you're on the Azure portal, select the OpenAI instance (the one marked with the OpenAI symbol); in this example, it's the eva-dev-openai-keys.

2) It'll direct you to your main OpenAI instance page. On the side menu, select the option Keys and Endpoint.

3) On this page, you can view the Keys and Endpoint values that will be used on the Syntphony Conversational AI cockpit screen.

Deployment Name

After filling out the Endpoint and API Key fields, the system will load the available model options.

Refer to the Azure OpenAI documentation to learn about the models: see “Azure OpenAI Service models” on Microsoft Learn.
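For reference, Azure OpenAI requests are addressed per deployment: the endpoint, the deployment name, and an api-version query parameter are combined into the request URL, while the API key travels in the api-key header. A minimal Python sketch (the resource name, deployment name, and api-version value below are placeholders, not values from your account):

```python
def azure_chat_completions_url(endpoint: str, deployment: str, api_version: str = "2024-02-01") -> str:
    """Build the chat-completions URL for an Azure OpenAI deployment.
    The API key is not part of the URL; send it in the 'api-key' header."""
    base = endpoint.rstrip("/")
    return f"{base}/openai/deployments/{deployment}/chat/completions?api-version={api_version}"

print(azure_chat_completions_url("https://my-resource.openai.azure.com", "my-gpt-deployment"))
```

This shows why the cockpit asks for all three values separately: the Endpoint and Deployment Name shape the URL, and the API Key authenticates the call.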
