A Guide to Syncing, Linking, Marrying, and Integrating Chatbot Content with the Content Management System
As technology evolves and the content landscape becomes more crowded, users are gravitating toward chatbots as the next form of search and personalized customer service. Recognizing this is no passing trend, businesses across all industries are jumping to meet these expectations.
According to new research from Juniper Research, chatbots have already saved banking, healthcare, social, e-commerce, and retail organizations $20 million this year alone, and those savings are expected to reach $8 billion per year by 2022, making accessibility and versatility more important than ever for content producers.
In part one of our chatbot series, we talked about the increasing demand for chatbots as a new content channel, the critical mistakes many make in a rush to meet demand, and the value of single-sourcing structured content from your existing CMS to build a custom chatbot. Here in part two, we’ll walk you through the specifics.
Less Chat, More Bot
The first step to building a chatbot is to develop a true understanding of what its function will be. This will determine what kind of chatbot to build, and how best to build it.
At [A], we are especially interested in how structured content is fundamentally changing the way users demand information. Analysts forecast that in the next five to ten years, users will largely prefer to receive a tailored “answer” on demand instead of digging through extraneous information across multiple webpages.
Long-form content has a time and a place, but most information consumers start with a directed search.
We’re preparing for that change by leveraging the chatbot’s automated, conversational interface to educate content strategists and stakeholders about content engineering, just as we do today with traditional articles on our site.
With that in mind, we determined that the most efficient way to accomplish this goal was by building a simple information retrieval chatbot with API.AI and our native CMS. Key chatbot components for this model include intents, responses, and entities.
Intents and Responses
At [A], when modeling for interactive content, we define an intent as the question or request that a user inputs, and response as the chatbot’s answer or action to the user’s input.
This is how it looks to enter intents and responses in API.AI.
*NOTE: In API.AI, responses are considered part of the intent.
We can link these two sections—User Says and Response—to our CMS. We do this by treating intents and responses as content elements that live within a content type.
If you aren’t familiar with content modeling principles and content types, you might want to push pause on this advanced course and review the basics.[1]
We want our subject matter experts to address a single domain of knowledge, or topic, at a time. By associating intents and responses with an Article Type or another similar content type, we give authors the ability to focus question-and-answer variants within the associated topical container. Articles are handy containers: we already use the Article Type to house social promotion text and mobile variations, and now, interactive variations for chatbots.
For this bot, we create an intent content element and label it “Primary Question”. This element titles the Intent and serves as the starting point for the Intent’s User Says entries.
There are, however, an almost endless number of ways to ask the same thing. To address this, we create a second content element for the intent and name it “Alternate Questions”. If a user asks any one of these question variants, it triggers the same response from the chatbot.
Fortunately, API.AI also helps tackle this linguistic challenge by automatically incorporating machine learning into your bot. This way, an exact match of your intent is not always necessary. The question, “What is it that a content engineer does?”—even if it is not directly entered into your intent—may still be recognized and trigger the appropriate response.
To do this, we set the Match Mode to Hybrid within API.AI; the platform then takes a broader, more intelligent view of the user’s questions when determining intent.
Pure machine learning is still a work in progress. At [A], when we used the ML only setting, we found that responses were unreliable for similar intents, such as “What is a content engineer?” and “What is content engineering?” Making the matches hybrid applies basic fuzzy recognition of terms using the platform’s machine learning capabilities, while still making sure the bot follows the rules we provide it.
For more on API.AI’s machine learning, click here.
From there, we map our new Primary Question and Alternate Questions elements to the User Says section of API.AI. In our CMS, we built a custom control that does the initial push of data to API.AI, creating a new Intent and storing the Intent’s identifier back in the CMS. Editors simply click a button on this control to create a new Intent from within the CMS. And because the Intent ID is stored in the CMS, we can use it to keep the Intent up to date whenever questions are modified or added.
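As a rough illustration of that push, here is a minimal sketch in Python against API.AI’s legacy v1 REST interface (the /intents endpoint with a developer access token), assuming the v1 API returns the new Intent’s ID in its response. The function and element names are our own illustration, not the actual custom control, and the payload fields should be verified against the API.AI documentation.

```python
import requests

APIAI_BASE = "https://api.api.ai/v1"
DEVELOPER_TOKEN = "YOUR-DEVELOPER-ACCESS-TOKEN"  # from the agent's settings page

def create_intent(primary_question, alternate_questions):
    """Push an article's Primary Question and Alternate Questions to API.AI
    as a new Intent's User Says entries, and return the new Intent ID so the
    CMS can store it alongside the article."""
    payload = {
        "name": primary_question,  # the Primary Question titles the Intent
        "userSays": [
            {"data": [{"text": q}]}
            for q in [primary_question] + alternate_questions
        ],
        # the Answer/Alternate Answers are added to this same payload;
        # see the Responses sketch below
    }
    resp = requests.post(
        f"{APIAI_BASE}/intents",
        headers={"Authorization": f"Bearer {DEVELOPER_TOKEN}"},
        json=payload,
    )
    resp.raise_for_status()
    return resp.json()["id"]  # store this Intent ID with the article in the CMS
```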
Once a user’s input matches an intent, the chatbot generates a response. The response can also be mapped as a content element; in our content model, we call this element “Answer”.
To make the chatbot more natural and varied, we may also choose to create an “Alternate Answers” content element, which lets the chatbot rotate or randomize the response it gives for a single intent.
From there we map our new Answer element to the Response section of API.AI.
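Continuing the sketch above, the Answer and any Alternate Answers would be added to the same intent payload as response messages. When several speech variants are supplied, API.AI picks among them, which is what gives the bot its variability. The field nesting here follows the legacy v1 intent JSON as we understand it and should be checked against the docs.

```python
def build_responses(answer, alternate_answers=None):
    """Map the Answer (and optional Alternate Answers) elements to the
    Response section of an API.AI intent payload."""
    variants = [answer] + list(alternate_answers or [])
    return [
        {
            "messages": [
                # type 0 is a plain text response; when several speech variants
                # are provided, API.AI chooses one when the intent is triggered
                {"type": 0, "speech": variants}
            ]
        }
    ]

# merged into the create_intent payload above:
# payload["responses"] = build_responses(answer, alternate_answers)
```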
Entities
Entities are parameters we can pull out of a conversation. They are especially important for task-fulfillment chatbots. API.AI gives the example of a chatbot designed to take orders for a pizza restaurant. To comprehend an order, the bot needs entities such as @crust, @sauce, @size, @topping, and @type. The entity @size might have the sub-entities small, medium, and large.
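To make that concrete, an entity definition in API.AI’s legacy v1 format is essentially a name plus a list of entries with synonyms. The values below are a hypothetical sketch of the @size entity, not taken from API.AI’s example.

```python
# Hypothetical @size entity for the pizza-ordering example
size_entity = {
    "name": "size",
    "entries": [
        {"value": "small",  "synonyms": ["small", "personal"]},
        {"value": "medium", "synonyms": ["medium", "regular"]},
        {"value": "large",  "synonyms": ["large", "family size"]},
    ],
}
```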
For our education-based bot, we chose to use entities a little differently. We use them primarily to enhance our fallback response.
A fallback response is how your chatbot responds when it does not recognize a user’s intent.
We know that building an informational bot to teach content engineering is quite ambitious. There are conceivably hundreds or even thousands of questions that a user might have on the topic. This means entering hundreds, or even thousands of intents. This will take time. However, these thousands of questions can generally be grouped into a smaller number of key subtopics that make up the larger field of content engineering.
So, for now, we only need one entity group: @CE Topics. Its sub-entities include the key content engineering subtopics that [A] covers, such as personalization, metadata, schema, microdata, and content management systems.
The chatbot might not have an exact answer to the question, “What is personalization?” if neither that question nor anything similar is entered as an intent. However, since personalization is entered as an entity, it understands that the user wants to know something about personalization. In these cases, our chatbot offers relevant articles on personalization from simplea.com that might provide the information they’re looking for.
Here, we find another way to take advantage of existing structures from within our content management system. We already tag our articles by content engineering subtopic. These categories, or “taxonomy nodes”, can be mapped from our CMS directly into the entity section of API.AI.
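A minimal sketch of that mapping, again assuming the legacy v1 /entities endpoint, a hypothetical helper that reads the taxonomy nodes out of the CMS, and an entity named CETopics (your agent’s actual entity name may differ):

```python
import requests

APIAI_BASE = "https://api.api.ai/v1"
DEVELOPER_TOKEN = "YOUR-DEVELOPER-ACCESS-TOKEN"

def sync_ce_topics(taxonomy_nodes):
    """Create or refresh the CE Topics entity from the CMS taxonomy nodes,
    e.g. ["personalization", "metadata", "schema", "microdata"]."""
    payload = {
        "name": "CETopics",
        "entries": [
            {"value": node, "synonyms": [node]} for node in taxonomy_nodes
        ],
    }
    resp = requests.post(
        f"{APIAI_BASE}/entities",
        headers={"Authorization": f"Bearer {DEVELOPER_TOKEN}"},
        json=payload,
    )
    resp.raise_for_status()
```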
Bringing It All Together
At this point, we have planned for or created the necessary entry points in our CMS for our chatbot. Now, we have to link it to API.AI.
Anytime your chatbot retrieves information from a source other than API.AI, this is considered a “fulfillment task”.
For example, to create a chatbot that reports the weather, a fulfillment task would need to be created to pull that information from an external weather service.
Since all of our information is coming from another source—our CMS—every interaction with the chatbot is a fulfillment task.
As mentioned earlier, we created a custom control within the CMS to create the initial Intent using the data from the Article Type (Primary Question, Alternate Questions, and Answer). We leverage a connection to API.AI’s web service to accomplish this. Once a new Intent is created, the service returns an ID, which we store with the article data.
This ID is used to keep the Intent current: as new content is added to an Article, the corresponding Intent within API.AI is updated. This is accomplished by extending the CMS’s behavior for when an article is edited.
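For illustration, the update step might look like a PUT against the stored Intent ID whenever the article is saved. The save hook itself depends entirely on the CMS, so only the API.AI side is sketched here, reusing the payload shape from the creation sketch above.

```python
import requests

def update_intent(intent_id, payload, token="YOUR-DEVELOPER-ACCESS-TOKEN"):
    """Refresh an existing Intent (called from the CMS's article save/edit hook)."""
    resp = requests.put(
        f"https://api.api.ai/v1/intents/{intent_id}",
        headers={"Authorization": f"Bearer {token}"},
        json=payload,  # same shape as the creation payload, with updated questions and answers
    )
    resp.raise_for_status()
```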
Additionally, we have a webhook running to accept requests from API.AI and send back Responses for the given Intent. This is where the logic lives to decide what to return based on a visitor’s request to API.AI. If we can match an Intent ID from API.AI, we respond with the Answer from the Article. Otherwise, a CETopic Entity should be provided, and we respond with the top three articles that fall under that taxonomy node.
A webhook is a web service endpoint that listens for requests; API.AI calls it with the matched Intent and any entity data, and uses whatever the webhook returns to build the chatbot’s reply.
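As a very rough sketch of that decision logic, assuming API.AI’s v1 webhook request and response format, a Flask endpoint, a parameter named CETopics on the fallback intent, and hypothetical CMS lookup helpers (get_answer_for_intent, get_top_articles_for_topic), it might look like this:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def get_answer_for_intent(intent_id):
    # Hypothetical: query the CMS for the article whose stored Intent ID matches.
    return None

def get_top_articles_for_topic(topic, limit=3):
    # Hypothetical: query the CMS for the top articles tagged with this taxonomy node.
    return []

@app.route("/apiai-webhook", methods=["POST"])
def fulfill():
    """Fulfillment endpoint called by API.AI for each matched intent."""
    req = request.get_json(force=True)
    result = req.get("result", {})
    intent_id = result.get("metadata", {}).get("intentId")
    topic = result.get("parameters", {}).get("CETopics")

    # If the Intent ID maps to an article in the CMS, return its Answer.
    answer = get_answer_for_intent(intent_id)
    if answer:
        speech = answer
    elif topic:
        # Fallback: offer the top three articles tagged with the matched CE Topic.
        articles = get_top_articles_for_topic(topic, limit=3)
        speech = "Here are some articles that might help: " + ", ".join(articles)
    else:
        speech = "I'm still learning. Could you rephrase that?"

    # v1 webhook responses return speech/displayText for API.AI to relay to the user.
    return jsonify({"speech": speech, "displayText": speech, "source": "cms"})
```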
For a more detailed description of how to implement webhooks and fulfillment tasks on API.AI, click here.
Deploying Your Chatbot
API.AI easily integrates with many of the most popular user touchpoints like Twitter, Kik, Slack, Facebook Messenger, Google Assistant, Skype, Cortana, and Alexa.
To get started, simply navigate to the Integrations tab on your agent’s console and click the platform you’d like your bot to live in.
Each platform will have a different set of instructions needed to complete integration.
To learn how to deploy your bot on a specific platform with API.AI, click here.
Build a Bot to Fit the Need
Organizations are unique. Customers are unique. Transactions are customized. And so bots are already becoming specialized based on their function. What will your customers benefit most from experiencing?
- Task fulfillment bot
- Voice assistant
- Entertainment bot
- Customer service bot
- Educational bot
The way that we approach our bots—the degree of complexity, the model, the level of content, the functionalities, the sophistication of the AI, even the personality—will all greatly depend on the type of bot we are creating.
The key components and strategies listed in this article will not be the most efficient way to approach every bot. Here are some questions you might want to think about before going down the road of chatbots:
- Who will use the chatbot? Employees, members, potential members, etc.
- How will a chatbot help the user?
- Is the data that we want to use with the chatbot structured properly in our CMS?
- Will the visitor ask questions that can be parameterized?
- Will the chatbot provide a function to the user, such as ordering an item?
Based on how those questions are answered, some additional items will need to be considered:
- What features are needed within the chatbot framework?
- Does extending the existing content structure make sense or should a different content type be used?
- If entities are used, how will those be mapped from the CMS?
- What is the best way to handle fulfillment of tasks?
Or, perhaps, if your chatbot is a simple, content-light, service-based bot, you might not need to link to an external CMS at all. Single-sourcing doesn’t solve every problem.
Start With the Content Model
At [A] we like to say, “Everything starts with the content model.” That statement is true for chatbots, too. Think of this as a challenge to consider how intelligent content principles can be applied at the outset of every content project, chatbot or otherwise. Start with the content structure. Model the structure. We need a smart structure to move content from place to place, including from the CMS to the chatbot.
If you need help determining the best way to approach a chatbot project, download [A]’s 80+ page slide deck Engineering Content for Bots, AI, and Marketing Automation and the companion Resource Guide: Engineering Content for Bots, AI, and Marketing Automation.
Technology evolves quickly. Investing the time to future-proof content and build it to scale will help you COPE with these changes and prepare you for whatever is next on the horizon.
Do you have a plan to use a chatbot with your CMS? Have you experienced chatbots on other sites? How was your experience? Did it influence your opinion on how chatbots should be implemented? We would love to hear your opinions and experience in the comments section below.
You can learn more about multichannel marketing, content engineering, personalization, chatbots, artificial intelligence (AI), and digital maturity at simplea.com or on Twitter by following @mrcruce and @simpleateam.