Create an LLM Usage to enable advanced features that use Large Language Models (LLMs) to help you maintain your chatbot content, such as generating training phrase suggestions or handover transcript summaries for your live agents. The LLM Usage defines which LLM Connector (the provider and model) to use, and the prompt to send to generate the results.
The content generated by the LLM is never visible to your chatbot users and won't impact their experience.
LLM Usages are divided into scopes for the different features they support. To use these features:
- Make sure you've created an LLM Connector for the LLM provider you want to use.
- Create an LLM Usage for the scope you want to use.
- Set that LLM Usage as the primary for that scope.
Changes to most LLM Usage scopes are applied to your chatbot immediately, but some must be published before they take effect in your live chatbot.
Each usage is preconfigured with a default prompt for the scope, making it ready to use out of the box. You can view or customise this prompt from the LLM Usage page. Each LLM Usage is chatbot-wide, using the same prompt for the whole chatbot, but some features let you add additional prompt instructions per use, such as to customise transcript summaries.
Prompt parameters
LLM Usage prompts contain prompt parameters that look like ${this}. Before sending the prompt to the LLM, the chatbot replaces the prompt parameters with content specific to the situation, such as the user's conversation transcript (for a generated transcript summary) or the bot message content (for a suggested training phrase). Each LLM Usage has its own set of prompt parameters. When customising LLM Usage prompts, make sure you don't delete the parameters.
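As an illustration, the substitution described above works like template interpolation. The sketch below is a hypothetical example (the platform's internals are not documented here), using the `${additionalContext}` and `${transcript}` parameters from the Handover Transcript Summary scope and Python's standard `string.Template`, which happens to use the same `${this}` syntax:

```python
from string import Template

# Hypothetical sketch of how a chatbot might fill prompt parameters
# before sending the prompt to the LLM. The prompt text here is
# illustrative, not the platform's actual default prompt.
prompt_template = Template(
    "Summarise the following conversation for a live agent.\n"
    "${additionalContext}\n"
    "Transcript:\n"
    "${transcript}"
)

# Each parameter is replaced with content specific to the situation,
# such as the user's conversation transcript.
filled_prompt = prompt_template.substitute(
    additionalContext="Focus on the user's unresolved billing question.",
    transcript="User: I was charged twice.\nBot: Sorry to hear that.",
)
print(filled_prompt)
```

This is also why deleting a parameter from a customised prompt is a problem: with nothing to substitute, the situation-specific content (the transcript, the bot message, and so on) never reaches the LLM.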
Create an LLM Usage for handover transcript summaries
Changes to LLM Usages for handover transcript summaries must be published before they take effect in your live chatbot. The generated summary will not be visible to your chatbot users.
Make sure you have created an LLM Connector for the provider and model you want to use.
- Select your team, and the chatbot you want to edit.
- Open the Manage section of the left navigation and click LLM Usages.
- Click + LLM Usage.
- In the Usage Scope, select Handover Transcript Summary.
- Type a distinctive name for the usage.
- Click Create.
- In the new usage screen, select the LLM Connector to use.
If you can't see the LLM Connector you want to use, check the LLM Connector model's capabilities, listed under the model name on the LLM Connector page. Handover transcript summaries require a model capable of text generation and text summarisation.
- Click Save.
- If you want to edit the default prompt used to generate the summary, type any edits into the Prompt area and click Save.
Make sure you don't remove the prompt parameters: ${additionalContext} and ${transcript}.
- In the list of LLM Usages on the left, click the menu next to the usage you just created.
- Click Set as Primary for Handover Transcript Summary.
Set an LLM Usage as primary
Set an LLM Usage as primary to designate it as the one the chatbot should use for its scope.
- Select your team, and the chatbot you want to edit.
- Open the Manage section of the left navigation and click LLM Usages.
- In the list of LLM Usages on the left, click the menu next to the usage you want to set as the primary.
- Click Set as Primary.