In an earlier blog post, I introduced Bot Framework Adaptive Dialogue. I ran through how this cool new feature of the Bot Framework paves the way for developers to build fully dynamic, database-driven chatbots (or chatbots driven by any other storage mechanism, for that matter). One thing I mentioned towards the end of that blog post was an idea for a feature:
Visual Interface – Create a visual designer that lets you drag and drop widgets onto an ASP.NET canvas to model the conversations you need to create. Granted, this will take a little time, but being able to drag and drop these components in a canvas of sorts would be a great user experience!
In this post, I’m happy to announce the Bot Framework Team has shipped a product that does exactly that. It’s called Composer.
What is Bot Framework Composer?
Bot Framework Composer is an open-source tool based on the Bot Framework SDK. It runs as a web application on your local machine and, at the time of writing, is in public preview.
Composer lets you build conversational logic using a drag and drop interface. The development experience reminds me a little of Windows Workflow Foundation. Some of the main features include, but are not limited to:
- A visual editing canvas for conversation flows
- In-context editing for language understanding (NLU)
- Tools to train, test and manage language understanding (NLU) and QnA components
- Language generation and templating system to support basic string manipulation
- A ready-to-use bot runtime executable
I’ve been using Composer for the last few months now. There were the expected teething problems with some of the earlier releases, but I’ve got to the point where I’ve built an end-to-end chatbot with the tool.
Whilst I’ve been using Composer, I’ve found the main benefits to be:
- Low code chatbot development (it creates JSON files under the hood)
- You can quickly model conversational logic by dragging and dropping new dialogues, activities and other widgets onto the designer canvas
- You can integrate Composer generated JSON with Bot Framework Adaptive Dialogues. This is POWERFUL and my favourite feature. I may do a separate blog post on this topic.
- Integration with 3rd party systems such as web services is easy – thereby letting you consume more complex business logic
- Lets you hand off conversational logic development to less technical business users
Those are a handful of the benefits I’ve found so far, no doubt you’ll find your own.
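To give a flavour of what “under the hood” means, here’s a trimmed-down sketch of the kind of JSON Composer writes to a .dialog file. The `$kind` names below come from the Adaptive Dialogue declarative schema; your own output files will contain more detail, so treat this as an illustrative shape rather than an exact dump:

```json
{
  "$kind": "Microsoft.AdaptiveDialog",
  "triggers": [
    {
      "$kind": "Microsoft.OnConversationUpdateActivity",
      "actions": [
        {
          "$kind": "Microsoft.SendActivity",
          "activity": "Hi! How can I help you today?"
        }
      ]
    }
  ]
}
```

Every widget you drag onto the canvas ends up as one of these `$kind` nodes, which is what makes the output so easy to store, diff and load dynamically.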
Who should use Composer? Why might you want to use it?
So, who should use Composer? In my opinion, the tool can be used by different types of user across multiple disciplines, such as:
- Developers
- Business Analysts
- Business Users
The biggest draw of Composer is that it frees up the developer from creating a lot of the boilerplate conversational logic (dialogues).
For example, if you’re a developer and find yourself with a collection of dialogues to implement, you can install Composer then let the Business Analyst model these in the designer canvas. As the Business Analyst creates the conversational logic, Composer will generate JSON under the hood.
A developer can then take the output JSON and, with a little bit of code, integrate these JSON files to generate conversations at run-time. This is an approach that I’ve been using for the past few weeks and it works well.
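As a rough illustration of working with that output, here’s a small TypeScript sketch that parses one Composer-generated .dialog file and lists the triggers it declares. The field names (`$kind`, `triggers`) reflect the JSON Composer emits for Adaptive Dialogues, but inspect your own files to confirm the shape; the helper name is mine, not part of any SDK:

```typescript
// Hypothetical helper: summarize a Composer-generated ".dialog" file.
// Assumption: the file is a "Microsoft.AdaptiveDialog" root holding a
// "triggers" array, which is the shape Composer produced at time of writing.

interface TriggerNode {
  $kind: string;
  actions?: unknown[];
}

interface DialogFile {
  $kind: string;
  triggers?: TriggerNode[];
}

// Return the trigger kinds declared in one .dialog file's JSON text.
function listTriggerKinds(json: string): string[] {
  const dialog = JSON.parse(json) as DialogFile;
  return (dialog.triggers ?? []).map((t) => t.$kind);
}

// A trimmed-down example of the JSON Composer writes to disk.
const sample = JSON.stringify({
  $kind: "Microsoft.AdaptiveDialog",
  triggers: [
    { $kind: "Microsoft.OnConversationUpdateActivity", actions: [] },
    { $kind: "Microsoft.OnIntent", actions: [] },
  ],
});

console.log(listTriggerKinds(sample));
// ["Microsoft.OnConversationUpdateActivity", "Microsoft.OnIntent"]
```

In a real bot you’d read these files from the Composer project folder and hand them to the Adaptive Dialogue runtime rather than parsing them by hand, but the same JSON structure is what travels between the two.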
The developer is then freed up to work on more complex tasks, such as training the natural language model (LUIS) or building out further integrations.
Tip: A good practice I’ve found is to create Web APIs that encapsulate the more complex logic your chatbot needs. You can then use the Http Request Activity in Composer to invoke the Web API’s business logic.
You can also use data returned by the Web API in Composer by reading it into a variable on the designer canvas.
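In the generated JSON, the Http Request Activity becomes a node along these lines. The property names (`method`, `url`, `resultProperty`) match what Composer emitted in my projects, but double-check against your own output; the URL and variable names are made up for the example:

```json
{
  "$kind": "Microsoft.HttpRequest",
  "method": "GET",
  "url": "https://example.com/api/employees/${user.employeeId}",
  "resultProperty": "dialog.apiResult"
}
```

Whatever the Web API returns lands in the variable named by `resultProperty`, and you can then reference it from later widgets on the canvas.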
Building Chatbots with Composer
Building chatbots and conversational logic in Composer involves selecting widgets from a menu and simply adding them to nodes on the designer canvas. After adding a widget, you set the widget’s properties from another window then repeat the process. If you’re familiar with creating flowcharts in PowerPoint, you’ll be able to get to grips with Composer quickly.
You have everything you need to build a fully functioning chatbot. Some of the main widgets include, but are not limited to:
- Custom Events – you can raise events at any point in the conversation and trap/listen for these elsewhere.
- Condition Logic – IF THEN ELSE, SWITCH statement type widgets to help you model rules
- Dialogues – these contain the conversational logic and can contain 1 or more widgets
- Loops – for looping through data
- Prompts – to ask and prompt the user for input (text, numbers, datetime etc.)
- Property Manipulators – handy for initializing and setting variables
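For example, dropping a text Prompt onto the canvas and pointing it at a variable produces a node roughly like the following in the underlying JSON (a sketch, assuming the `Microsoft.TextInput` shape I’ve seen Composer generate; `user.name` is just an example property):

```json
{
  "$kind": "Microsoft.TextInput",
  "property": "user.name",
  "prompt": "What's your name?"
}
```

The user’s reply is stored in `user.name`, which the Property Manipulator and Condition Logic widgets can then read and act on.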
You also have other options such as setting how you want the Recognizer to identify an Intent from user-supplied Utterances. Intents in Composer can be identified by using Regular Expressions or by hooking Composer up to a full-blown LUIS Application (the Language Understanding service).
Tip: If you’re just getting started with Composer, don’t worry about setting up and building a LUIS Application. Use the Regular Expression Recognizer to handle the initial identification of Intents. It’s simple and will get you up to speed quickly.
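As a sketch of what the Regular Expression Recognizer looks like in the generated JSON (the `Microsoft.RegexRecognizer` shape with an `intents` array is what I’ve seen Composer produce; the intent names and patterns here are just examples):

```json
{
  "$kind": "Microsoft.RegexRecognizer",
  "intents": [
    { "intent": "HRContact", "pattern": "(call|contact).*(hr|human resources)" },
    { "intent": "Cancel", "pattern": "cancel|quit" }
  ]
}
```

When an Utterance matches one of the patterns, the corresponding Intent fires and any OnIntent trigger you’ve wired up on the canvas handles it.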
Natural Language Understanding Integration
Natural Language Understanding is one of the most important aspects of chatbot development as it helps the bot understand what the human is saying.
In the Bot Framework world, this is generally implemented by using the cloud service LUIS. Using LUIS, you can create language models that can:
- process the human’s Utterance (the message sent to the chatbot, e.g. I want to call my HR rep)
- infer the underlying Intent in the Utterance, e.g. “HRContact”
- parse out the existence of any Entities (e.g. “call”)
(image source: Microsoft Docs)
Normally, you’d log in to the online LUIS Dashboard to do this. Fortunately, you don’t need to leave the designer canvas in Composer: you can author the natural language model directly from Composer. This is done by creating .LU files. You can see a sample of this here:
(image source: Microsoft Bot Framework Team)
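The .LU format itself is refreshingly simple: a `#` heading names an Intent and the `-` lines beneath it are sample Utterances. A minimal sketch, reusing the HR example from earlier (the extra utterances are my own illustrations):

```lu
# HRContact
- I want to call my HR rep
- can you put me through to human resources
- I need to speak to someone in HR
```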
After you’ve completed the first cut of your language model, you simply publish this to the LUIS service for your chatbot to use.
I’ve just touched the surface of Composer, Natural Language Understanding and .LU files. Language Understanding files deserve their own blog post as you can do quite a lot with them.
Find out more about LU files here.
QnA Maker Integration
I haven’t used QnA Maker for a while. During the last few months of chatbot development, I haven’t had to integrate the service with Composer yet, but the option is there.
If you haven’t heard about QnA Maker, it’s a service that lets you ingest data from multiple sources such as FAQs, websites, PDFs and other static content. The service will then auto-generate a dictionary of “questions and answers” that can be queried, with the relevant answer returned. It’s ideal if you need to quickly seed a chatbot with domain knowledge.
Like everything else in Composer, you just select the widget from the toolbox and drop it onto the canvas then set the relevant properties:
Note: You need to provision the QnA Maker service prior to using it.
Tip: A good practice is to implement the QnA Maker as a kind of “last line of defence” for use cases where your LUIS model is unable to infer the Intent from an Utterance.
Find out more about QnA Maker here.
Extensibility and Community Support
Composer is being developed in the open, so you’re free to download the code, experiment and play with it, or make suggestions and feature requests.
It has extensibility points built in. I haven’t played with that side of things yet, but one thing I’m interested in doing is saving the output JSON to a backend database as opposed to the file system.
Testing your Composer Chatbot
If you’re already developing chatbots using the Bot Framework, you’ll be familiar with the Emulator tool. You use the same emulator to test Composer generated chatbots. You click Restart Bot to load your Composer bot then click Test in Emulator:
The Bot Framework Emulator will be invoked and will connect to your Composer-built chatbot. You’re then free to quickly interact with and test your chatbot’s logic.
Tip 1: To help debug your bot, I suggest adding trace event widgets at the entry points to dialogues and at key branches of conversational logic. You could even send messages to the user. This gives you a quick, at-a-glance idea of where your error is occurring (you don’t get a breakpoint like in Visual Studio!).
Tip 2: I’ve found that whilst making ad-hoc changes to the underlying .dialog files, the emulator doesn’t always pick those up. If you’ve made direct changes to these files, restart the Bot Application and your Emulator, then retry.
Wrapping Up
In this blog post, I’ve introduced Bot Framework Composer. We’ve seen some of the benefits of this tool, who might want to use it, and how Composer’s drag and drop interface makes it easy to quickly build conversational logic with little to no code.
We’ve touched on some of the integrations that can be added through the designer canvas, such as your own custom Web APIs, and how natural language services such as LUIS can also be integrated.
We’ve also seen how you can quickly test your Composer chatbot by using the Bot Framework Emulator.
One thing that would be great is if Composer could be run as a standalone application/executable on my laptop (like the Emulator tool), rather than having to install Yarn and Node, then spin it up from the command prompt.
Another important point to mention is that some of this functionality is in Preview Mode, so be mindful of that.
Are you building chatbots?
Have you checked out Bot Framework Adaptive Dialog?
Got any questions or feedback?
Drop a message below or reach out on social