Rephrase Answer
The Rephrase Answer feature enables real-time rephrasing of the virtual agent's answers during conversations with end-users. By leveraging generative AI capabilities, it delivers context-sensitive answers that make the conversation more dynamic.
This feature optimizes user experience and engagement by providing more natural and empathetic interactions.
By default, the answers provided by the virtual agent remain static, adhering to the original text exactly as authored. When the rephrasing toggle switch is active, a request is made to the LLM at runtime, producing a varied version of the text.
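As a rough illustration of this flow, the sketch below returns the authored answer unchanged when the toggle is off and otherwise sends it to an LLM for rephrasing. The OpenAI Python SDK, the model name, and the prompt wording are assumptions made for the example, not the platform's actual implementation.

```python
# Illustrative sketch only: the platform's internal rephrase call is not public.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

def deliver_answer(static_answer: str, rephrasing_enabled: bool) -> str:
    """Return the static answer, or a rephrased variant when the toggle is on."""
    if not rephrasing_enabled:
        return static_answer  # default behaviour: the answer stays as authored
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; the real model is chosen by the platform
        temperature=0.7,      # recommended default creativity value
        messages=[
            {"role": "system",
             "content": "Rephrase the following answer, keeping its meaning and language."},
            {"role": "user", "content": static_answer},
        ],
    )
    return response.choices[0].message.content

print(deliver_answer("Your order has been shipped.", rephrasing_enabled=True))
```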
Enabling this feature may result in additional costs for each new request. You can enable it in the Extensions section.
Once enabled in the Extensions section, you can activate the option at a granular level, in each Answer cell. The answers are reformulated at runtime in the virtual agent's primary language.
Adjusts the model's creativity, controlling text variation. Lower values (close to 0) produce more common and predictable results, while higher values (close to 1) yield more diverse vocabulary. The recommended default value is 0.7.
Considers the conversation context by passing previous user messages to the generative AI. This setting defines how many previous user inputs influence the tone and content of the rephrased answer. You can configure a value from 0 to 5 for the number of previous inputs used as context. The recommended default value is 2.
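To make the two settings above concrete, the hypothetical sketch below maps the creativity value to the request temperature and includes the last N user inputs as context. The prompt, model name, and SDK usage are assumptions for illustration only.

```python
# Hypothetical sketch: creativity maps to temperature, context to the number of
# previous user inputs sent along with the answer. Model and prompt are assumptions.
from openai import OpenAI

client = OpenAI()

def rephrase(static_answer: str, user_history: list[str],
             creativity: float = 0.7, context_size: int = 2) -> str:
    # Keep only the last `context_size` user inputs (0-5) as conversational context.
    context = user_history[-context_size:] if context_size > 0 else []
    messages = [{"role": "system",
                 "content": "Rephrase the answer naturally, keeping its meaning."}]
    messages += [{"role": "user", "content": text} for text in context]
    messages.append({"role": "user", "content": f"Answer to rephrase: {static_answer}"})
    response = client.chat.completions.create(
        model="gpt-4o-mini",    # assumed model
        temperature=creativity,  # close to 0 = predictable wording, close to 1 = more varied
        messages=messages,
    )
    return response.choices[0].message.content

history = ["Hi!", "I still haven't received my order.", "Can you check it?"]
print(rephrase("Your order is on its way.", history, creativity=0.7, context_size=2))
```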
Restricts specific words or expressions from being included in rephrased answers.
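One possible way to apply such a restriction, shown purely as an assumption about how it could work, is to tell the model which words to avoid and fall back to the static answer if any of them still appear in the rephrased text:

```python
# Assumed approach: instruct the model to avoid the restricted terms and
# double-check the result before delivering it.
RESTRICTED = {"cheap", "guarantee"}  # example restricted words, not platform defaults

def is_clean(rephrased: str) -> bool:
    """Return True when none of the restricted words appear in the rephrased text."""
    lowered = rephrased.lower()
    return not any(word in lowered for word in RESTRICTED)

instruction = ("Rephrase the answer naturally. Never use these words: "
               + ", ".join(sorted(RESTRICTED)))

candidate = "We can't guarantee a delivery date yet."  # stand-in for the LLM output
static_answer = "We will confirm the delivery date soon."
final_answer = candidate if is_clean(candidate) else static_answer
print(final_answer)
```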
If the request to OpenAI exceeds the timeout, the system delivers the static answer, even when the rephrasing option is enabled. You can set the timeout between 1 and 10 seconds, with a recommended default of 4 seconds. This parameter can be configured in the Parameters section.
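A minimal sketch of this fallback behaviour, assuming the OpenAI Python SDK and its client-level timeout; the model and prompt are again placeholders rather than the platform's actual implementation:

```python
# Illustrative fallback, assuming the OpenAI Python SDK; the 4-second timeout
# mirrors the recommended default described above.
import openai
from openai import OpenAI

client = OpenAI(timeout=4.0)  # seconds; configurable between 1 and 10 in the platform

def answer_with_fallback(static_answer: str) -> str:
    try:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model
            temperature=0.7,
            messages=[
                {"role": "system", "content": "Rephrase the answer, keeping its meaning."},
                {"role": "user", "content": static_answer},
            ],
        )
        return response.choices[0].message.content
    except openai.APITimeoutError:
        # The request took too long: deliver the original, static answer instead.
        return static_answer

print(answer_with_fallback("Your ticket has been created."))
```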