Seeing the potential of Google Assistant as it was presented at Google I/O '18, as well as its integration into Volvo Cars' infotainment system, I could not resist trying it out. I reused one of my older projects, VasttraPi, and interfaced it with Google Assistant. If you want to read more about the project, I would be delighted if you took a look at the related article on my blog: Custom actions for Google Assistant.
In this Instructable we will go over the steps necessary to launch your own Google Assistant Action, which will communicate with your server and talk back to you, listing upcoming departures. Keep in mind we are not going to use the SDK, since we are running this entirely within Google Assistant and not on our own device. Overall it is going to be kept simple. Your Action will be usable on devices that support Google Assistant, such as phones, tablets, home assistants and wearables, and you can even share it as you would an app on the Play Store!
One of the main advantages of creating something like this over using services such as IFTTT is that you can craft custom responses on your server, which will then be read out by Google Assistant. With IFTTT, the communication is, as far as I know, one-way.
The following topics will be covered:
- Creating your first Action project
- Creating your custom Intent
- Creating your custom entity
- Setting up a webhook to fulfill your action
- Setting up a simple REST server, written in Python with Flask, which will provide the responses or "fulfillment"
- Testing your Action project
- Releasing your Action project
Here you will not find many details or theory about the various Google Assistant functions and concepts. If you are interested in them, I strongly suggest you follow the official tutorial or watch this video.
Step 1: Set Your Google Account Permissions
Before we begin we need to ensure Google Assistant has the appropriate permissions.
- Go to Activity Controls
- Make sure the following are enabled:
- Web & App Activity
- Device Information
- Voice & Audio Activity
Step 2: Create Your Action
- Go to Actions Console and click on "Add/import project"
- Let's call this Actions project "local-traffic-planner".
- Click on "Create project".
- On the next page, do not choose a category and click on "Skip".
- You should be in the main Actions Console page now.
Step 3: Action Invocation
Determine how you want to trigger Google Assistant to start your action.
- Click on "Decide how your Action is invoked" under "Quick setup".
- Call it "My local traffic planner" and click on "Save".
- Click on the "UPDATE SAMPLE INVOCATIONS" prompt that popped up after clicking save.
- You can also find this option under "Directory Information".
- Add "Ask My local traffic planner" as a new invocation and click on "Save".
Step 4: Add Your First Action
- Click on "Actions" on the left-hand bar.
- Click on "ADD YOUR FIRST ACTION".
- Select "Custom intent" and then click on "Build".
- You will be taken to the Dialogflow page which is where you will implement the main logic.
Step 5: Training Phrases for Your Intent
- Choose your time zone and click "Create".
- On the next page, leave the existing intents as they are and click on "CREATE INTENT".
- Give the intent a reasonable name, e.g. "departures".
- Go to "Training phrases" and click on "Add training phrases".
- Use the following phrases to train your model so it can interpret what you are telling it:
I am at home right now
For the time being I am home
I am at our apartment at the moment
I am sitting at home
Currently I am near work
I am at work
I am at the office
Step 6: Entities
Now that you have specified more or less what to expect as input, we need to define which parts of that input are of interest to our business logic, so they can be extracted and handed over to our server. In our case, we want to know whether the user is at home or at work, so we can respond with the departures from the corresponding station. Let's see how we can do that.
- If you double-click on one or more words of the training phrases, you will get a list of predefined entities. You can read more about each one of them here. The most suitable predefined one would be @sys.location; however, I think it is best and easiest if we create our own entity, which we will call @current-location.
- Click on the "Entities" option on the left side.
- Click on "CREATE ENTITY".
- Set the name to "current-location" and define two reference values along with their synonyms:
- home, house, apartment, crib
- work, office, Aptiv, code mines (lol just kidding)
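Conceptually, an entity with synonyms behaves like a lookup table that maps any recognized synonym to its reference value. A rough Python sketch of that idea (purely illustrative; this is not how Dialogflow implements it internally):

```python
# Illustrative only: a Dialogflow entity is essentially a synonym map
# from the word the user said to a canonical reference value.
CURRENT_LOCATION = {
    "home": "home", "house": "home", "apartment": "home", "crib": "home",
    "work": "work", "office": "work", "Aptiv": "work",
}

def resolve_location(word):
    """Return the reference value for a recognized synonym, or None."""
    return CURRENT_LOCATION.get(word)
```

This is why your server only ever needs to handle the two reference values, "home" and "work", no matter which synonym the user actually spoke.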
Step 7: Action and Parameters
Time to give some meaning to the keywords inside your training phrases.
- Click on "Intents" and then navigate to your custom intent, e.g. "departures" if you have been following my naming suggestions.
- Scroll down to "Training phrases".
- Double click on the words that indicate your current location and choose the @current-location tag from the pop up window.
- Scroll down to "Actions and parameters" and click on "manage".
- If everything was done correctly, you will see your new entity being listed there.
- Check the "Required" box, which makes a new column, "Prompts", appear. A prompt is what the user hears if their input does not match what is expected.
- Click on "Define prompts" and insert something like "I did not understand your location. Where are you at the moment?".
Step 8: Fulfillment
Now it is time to hook your web service to the Google Assistant Action. Your hook will be called when this specific intent is triggered and should produce the fulfillment of this action. Before that, we also want to set our intent to conclude the action after being fulfilled.
- Go to "Responses" and click on "ADD RESPONSE".
- Do not add any responses, just enable "Set this intent as end of conversation".
- Scroll down to "Fulfillment", click on "ENABLE FULFILLMENT" and then turn on "Enable webhook call for this intent".
- Click "Save" and then go to the "Fulfillment" option on the left-hand side.
- Enable the "Webhook" option and insert the URL your web server listens on.
- Whenever the intent is triggered, it will send a POST request to your website with the body of the request containing a JSON object with the current location of the user.
- Click Save.
- Now we are ready to create our web service, but before that, let's make sure that our Action welcomes us in a proper manner.
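For orientation, here is a hedged sketch of what that exchange looks like on the wire. The field names below assume the Dialogflow v2 webhook format (matched parameters under queryResult.parameters in the request, fulfillmentText in the response); double-check which API version your agent uses:

```python
# Hedged sketch of a Dialogflow v2 webhook exchange; exact field names
# depend on the API version your agent was created with.
sample_request = {
    "queryResult": {
        "queryText": "I am at work",
        "parameters": {"current-location": "work"},
        "intent": {"displayName": "departures"},
    }
}

def extract_location(body):
    # Pull out the entity value defined in "Actions and parameters".
    return body["queryResult"]["parameters"].get("current-location")

# The webhook replies with the text Google Assistant should read back.
sample_response = {"fulfillmentText": "The next departure from work is in 5 minutes."}
```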
Step 9: Welcome Intent
In order to customize the user experience, we should create an appropriate greeting for whenever our action is triggered.
- Go to "Intents" and then click on "Default Welcome Intent".
- Scroll down to "Responses", remove the existing ones and insert how you want your action to welcome you once it is initiated.
- Click "Save".
Step 10: Your Python Web Service
Let's make a quick and dirty Python server using Flask. No screenshots for this step, but it should be pretty straightforward.
- Open a new tab and create an account on pythonanywhere.com
- Verify your email.
- Set up your web application by clicking on "Open web tab".
- Click on "Add a new web app" and select "Flask" as your Python web framework.
- Select Python 3.6 and click "Next".
- Choose the path you want your "flask_app.py" to reside in. I placed it directly inside my home folder, as "/home/your-username/flask_app.py".
- Go back to the main page by clicking on the Python logo on the upper left corner.
- Under files, click on "flask_app.py" to start editing it.
- When the web text editor opens, paste the following code and click "Save".
The overall idea is that, depending on the parsed JSON coming from Google Assistant, our server will perform an action (e.g. read or write) and report back a response, or "fulfillment", which is read out to the user.
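As a minimal sketch of such a server, "flask_app.py" could look like the following. The stubbed departure strings and the Dialogflow v2 field names (queryResult.parameters, fulfillmentText) are assumptions; swap in your real departure data source:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

# Stubbed departure data; in a real setup you would query a traffic API here.
DEPARTURES = {
    "home": "The next bus from your home station leaves in 7 minutes.",
    "work": "The next bus from your work station leaves in 12 minutes.",
}

@app.route("/departures", methods=["POST"])
def departures():
    body = request.get_json(silent=True, force=True)
    # Dialogflow v2 places the matched entities under queryResult.parameters.
    location = body["queryResult"]["parameters"].get("current-location", "")
    reply = DEPARTURES.get(location, "Sorry, I do not know that location.")
    # "fulfillmentText" is what Google Assistant reads back to the user.
    return jsonify({"fulfillmentText": reply})
```

Note that on PythonAnywhere you do not call app.run() yourself; the hosting environment serves the `app` object for you.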
- Click on "Open web tab" again and then on the green "Reload button".
- By now you should have your own Python web server running at "https://your-username.pythonanywhere.com/departures".
Step 11: Test Your Action
OK, you are pretty much done at this point. Let's test the whole stack and get this "Hello world" example working!
- Click on "Integrations" from the left-hand side bar.
- Click on "Integration Settings" under the Google Assistant option.
- Under "Implicit invocation", add the name of your intent, e.g. "departures", so it can be triggered directly by saying something like "Hey Google, talk to my local traffic planner about departures from home".
- Enable "Auto-preview" changes.
- Click on "Test" which will open a new page.
- Type "Talk to My local traffic planner".
- Your action should be invoked and should greet you with one of the previously set welcome intent responses.
- Then type "I am at work". Your Python server should be contacted and the response will be read by Google Assistant.
Cool isn't it? Now imagine what you can do interacting with sensors, actuators and other APIs through your Google Assistant.
Step 12: Release Your Action
After you are done testing your Action and it is in a good state, it is time to share the love with the world or, if that does not make sense, with your friends and family.
- Go back to your Actions console and select your local traffic planner action.
- Under "Get ready for deployment" click on "Enter information required for listing your Action in the Actions directory".
- Scroll up and click on "Save".
- Click on the "Release" option from the left-hand side bar.
- Here you can choose what state your Action is in. If you do not want to hear the response "Let's get the test version of My local traffic planner", you have to make a full-fledged public release. However, that requires a review by Google and will not be covered in this tutorial.
Instead, you can still share this with up to 20 people by choosing an Alpha release and adding them as Alpha testers.
- Add any alpha testers by either sending them a link or adding their emails.
- Click on "SUBMIT FOR ALPHA", tick the boxes, click "SUBMIT" and you are done!
Now your Action is live and can be accessed by you and your friends. Have fun!
If you are interested in the code I used, take a look at the project on GitHub.