Building a Guitar Chord Tutor for Actions on Google: Part One

Joe Birch
10 min read · Sep 13, 2017

--

Actions on Google is a fascinating platform. Not only does it give us a conversational tool for creating experiences that make certain tasks easier and more convenient for the user, it also lets us build applications that are easily accessible to a wide range of users with different needs.

Recently I’ve been playing a lot with Actions, so I wanted to create a simple application that gives a quick run-through of what an Action is made up of, as a way of getting started with creating your own. This article is meant to be a higher-level intro to building for the platform, just to demonstrate the tools and approaches used for creating an Action. It will be accompanied by follow-up parts where I build on the app, along with some deeper dives into the platform itself.

For now in part one, we’re going to look at how we can create a simple text/voice Action that tells us how to play guitar chords when requested.

To build my Action I made use of Api.ai. This tool was acquired by Google last year and gives us a way of creating conversational platforms without needing to handle natural language processing ourselves. You aren’t required to use Api.ai when building for the Actions platform, but it will definitely make the process easier in some cases. The Action that I’ve built is a simple guitar chord tutor called Fret: the user can simply ask the Google Assistant how to play a chord and Fret will respond with the fret numbers and the state of each guitar string (open, muted or played) so that the user can quickly and easily learn how to play it.

You can view the code for the project here (there’s not much to it!). I’ve also uploaded the .zip file for the api.ai project so that you can import it and explore the project.

Defining our Entities

An entity can be seen as a parameter value that is grabbed from user input and used in our queries: this could be something such as a vegetable in a recipe, a colour whose hex code we want to know, or, in our case, the chord that we want to learn.

I added the entities to my project first as I feel they are a core part of the development process; you could see them as your business rules, so I felt it was important to have these outlined first.

You can add entities to your assistant application by using the navigation menu on the left-hand side and selecting the Entities option, just below the Intents section.

Within Fret, we have a chord entity which defines the different chords which the assistant supports. For this example, we’re just going to add a small collection of guitar chords that are supported.

The first column identifies the reference value: this is the value that will be used as the parameter value when use of the entity is detected. The second column defines the synonyms: the different patterns that the user can input for the assistant to detect the desired entity. So, for the entity value of the chord G, the user can input either G, the chord G or G chord.
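For reference, here’s roughly what the chord entity looks like if you peek inside the api.ai export from the .zip file linked above. I’ve trimmed it down to just the G entry, and the real export carries some extra metadata, but the shape is along these lines:

{
  "name": "chord",
  "entries": [
    {
      "value": "G",
      "synonyms": ["G", "the chord G", "G chord"]
    }
  ]
}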

Default Intents

When we first create our action, we’re provided with some default intents which should have already been created for us. One of these is the Default Welcome Intent, which is fired when the action is first invoked by the user. You don’t have to use this intent, but for this example it provides a nice way to kick off the experience for our action.

For now, the main thing to take note of here is the Text response. If you select the Actions on Google tab, you’ll notice that the intent is configured to return this as the first response for the action. In the case of the welcome intent this means it will be returned before any input is given by the user, so it’s a good place to provide a welcome message.

You’ll also notice that the WELCOME event has been assigned to the intent. Events can be used to trigger intents instead of user queries, and there are a bunch of different events available for use. The WELCOME event is triggered when the action is first invoked, so we make use of it here.

There is also a Default Fallback Intent created for your project; this just provides a default response to your users if things don’t quite go as planned when they interact with your action.

Creating our own Intents

Now that we have our entity defined, we’re going to go ahead and create the mechanism used for mapping the user’s input to the actions to be performed by our application; this mechanism is known as an intent. For this part of the development of Fret, aside from the welcome and fallback intents, we’re going to add just one intent to our application: the learn.chord intent.

Within our intent, we’re going to define some conversational triggers that will cause our intent to be fired. Our aim here is to tell the user how to play a guitar chord, so we want to listen for terms that will lead to this point in the conversation. Here are some of the terms that I have defined for the intent to listen for:

Now whenever our user inputs any of these terms, our intent will be triggered. So, for example, I could say “Hey, I want to learn how to play a chord” or “Would you be able to tell me how to play a chord” — when this happens, we need a way to grab some more information from the user about the chord they want to learn.

This is where our Entity comes into play. Our Entity is a piece of data (known as a parameter) that we need to get from the user, so we’re going to add it here as a required parameter for when our Action is in progress.

When you create a parameter you need to give it a:

  • Parameter name — This is simply a unique reference for the parameter
  • Entity — This is the entity which the parameter represents. This can either be one defined by the Api.AI system or one that we have defined ourselves
  • Value — This is the value assigned to the parameter; in this case $chord will be the resolved value that has been extracted from the user’s input.

You can also add a prompt for a parameter. When the intent is triggered without the required parameter, one of the prompt variants will be used to ask the user for it. The nice thing here is that you can add variants of the prompt to make conversations with your tool feel more natural.
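In Fret’s case that works out as something along the lines of: a parameter named chord, backed by the @chord entity we defined earlier, a value of $chord, and a prompt such as “Which chord would you like to learn?” (the exact prompt wording is whatever you choose to enter).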

The last part of setting up our intent involves ticking the Use webhook checkbox. If your conversation doesn’t require any form of external operation (such as API requests) then you won’t need to do this. However, ticking it means that the response for our intent will be handled by the webhook that we will be providing. In the case of Fret, this will be the task of fetching the details of a chord based on the one requested by the user.
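To give a rough idea of what the webhook ends up receiving, api.ai POSTs a JSON payload describing what it matched. Stripped right down, it looks something along these lines:

{
  "result": {
    "action": "learn.chord",
    "parameters": {
      "chord": "G"
    }
  }
}

We won’t need to pick this apart by hand, as the client library used in our function takes care of the parsing for us.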

But first, we need to create our function that will handle the creation of our chord to feed back to the user — this is also the point where we will receive our webhook URL.

Creating our Function

Before creating the code for our function, you need to carry out several steps in order to set up Firebase Functions. To do this, I suggest you follow the guide here; it’s pretty straightforward to follow and gives you all the steps needed to use the service.

Once you’ve done the above, you should have a generated functions directory that contains an index.js file. This is the file that we’re going to use to define the function for retrieving the chord. Let’s take a look inside the index.js for Fret and see what’s going on:
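Before diving into the individual pieces, it’s worth noting that the snippets below rely on a small amount of setup at the top of index.js. Roughly, and assuming the module and constant names from my project (the strings file simply holds the response text and chord data), it looks like this:

const functions = require('firebase-functions');
const ApiAiApp = require('actions-on-google').ApiAiApp;
const strings = require('./strings');

// The name of the parameter that we defined on the learn.chord intent
const CHORD_ARGUMENT = 'chord';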

  • The key part of the file is the entry point for requests that come from api.ai; you’ll find this in the file here. This essentially receives any incoming requests, allowing us to handle them accordingly. Here we make use of the handleRequest() function from the Actions SDK, which allows us to easily handle intents within our code.
exports.fret = functions.https.onRequest((request, response) => {
    // Exported so that the Firebase CLI deploys this as the fret function
    const app = new ApiAiApp({ request, response });
    app.handleRequest(actionMap);
});
  • We provide the handleRequest() method from the Actions SDK with a map of the actions that we can handle, along with the function used to handle each action. If we’re only dealing with a single action then it’s possible to pass in the function instead of a map, but the map gives us room to easily build on our application in the future.
const actionMap = new Map();
actionMap.set(Actions.LEARN_CHORD, learnChord);
  • To keep things tidy, I’ve defined my actions in a constant declaration within the file, here. You’ll notice that this action matches the name of the action inside our intent within api.ai.
const Actions = {
    LEARN_CHORD: 'learn.chord',
};
  • Finally, this is where the magic (kind of) happens. This is the function that is called when the LEARN_CHORD action is invoked — it essentially uses a couple of helper functions to build a chord string and return it.
const learnChord = app => {
    const chords = strings.chords;
    // Grab the chord parameter that api.ai extracted from the user's input
    const input = app.getArgument(CHORD_ARGUMENT);
    const chord = chords[input];

    if (chord !== undefined) {
        // ask() keeps the microphone open so the user can request another chord
        return app.ask(buildString(chord) + '. ' +
            strings.general.whatNext);
    }
    // tell() closes the conversation when we don't recognise the chord
    return app.tell(strings.error.chordNotFound);
};

What happens here is that we begin by fetching the chords from our strings file, followed by retrieving the chord entity input by the user, using the getArgument() method from the Actions SDK and passing in our CHORD_ARGUMENT reference.

Next, we try to fetch a chord from our chords reference using this input from the user. If there is a chord, then we use our helper functions to build the chord string, followed by the ask() method from the Actions SDK to give the response back to the user. We use the ask() method as it keeps the microphone input open, allowing our user to request another chord.

There is also the ability to use the tell() method, which simply outputs our content and closes the microphone input. You can see that we use the tell() method in error cases to let the user know that the chord doesn’t exist.
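To give an idea of what those helpers are doing, here’s a rough sketch of how the chord data and the buildString() function could look. The actual strings file in the repo structures things in its own way, but the idea is the same: each chord is a set of guitar strings with a state (open, muted or played) and a fret number, and buildString() turns that into a sentence for the Assistant to speak.

// Hypothetical chord data — the real strings file may differ
const chords = {
    G: [
        { string: 'low E', state: 'played', fret: 3 },
        { string: 'A', state: 'played', fret: 2 },
        { string: 'D', state: 'open' },
        { string: 'G', state: 'open' },
        { string: 'B', state: 'open' },
        { string: 'high E', state: 'played', fret: 3 },
    ],
};

// Turns a chord definition into a spoken description
const buildString = chord => chord.map(guitarString => {
    if (guitarString.state === 'open') {
        return 'play the ' + guitarString.string + ' string open';
    }
    if (guitarString.state === 'muted') {
        return 'mute the ' + guitarString.string + ' string';
    }
    return 'play the ' + guitarString.string + ' string at fret ' + guitarString.fret;
}).join(', ');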

Now that this has been created, we can update our function by executing the following command:

firebase deploy --only functions

This will upload our function content to Firebase so that it can be accessed by our action within api.ai. At this point we’re also given the URL for our fulfilment webhook, which needs to be added to our api.ai project. To do so, just navigate to the Fulfilment navigation item and you’ll be presented with the webhook form where you can enter the details for the webhook.
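The URL you need here is the one that the Firebase CLI prints out once the deploy finishes. Assuming the default region and that the function is exported as fret, it takes the usual Cloud Functions shape of https://us-central1-<your-project-id>.cloudfunctions.net/fret.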

Simulating our Action

We’re now ready to test out our Action with the simulator. For this, we’re going to head on over to the Integrations tab and check that the Actions on Google integration is switched on.

If this is the case, then we can go ahead and click on it. At this point, we’re going to be shown a dialog which allows us to update our action draft.

If the Welcome Intent isn’t already set here, then you’ll need to set this — you can see here that I’ve set it to my Default Welcome Intent.

Next, we need to go ahead and add any of our triggering intents to the list of intents that we want to be available for our Action. At this point I only have the one intent, so I’ve added this to the list — you can see here that it is the learn.chord intent.

Next, we can select the Update Draft button and we’ll be shown the option to visit the console — this is where we can try out our Action.

All you need to do is navigate to the simulator navigation item and you’ll be able to test out the action — here’s a clip of my testing Fret in the simulator:

Conclusion

This was the first Action that I’ve built and the first time that I’ve used Api.ai. Fret feels like a really simple project and I’m interested to see how I can build on it to make use of more of the Actions SDK. Stay tuned for the following articles if you wish to follow along and see where this goes next!


Written by Joe Birch

Android @ Buffer, Google Developer Expert for Android. Passionate about mobile development and learning. www.joebirch.co
