Actions on the Google Assistant in 5 minutes | Google I/O 2018



[MUSIC PLAYING]

SPEAKER 1: Hi, everyone. This is our overview session, so if you're totally new to all of this, hopefully it'll pique your interest. Actions on Google is the developer platform for the Google Assistant. Users are already coming to the Assistant to perform over 1 million actions, and they're performing those actions across a huge range of devices and contexts, from voice-enabled speakers like the Google Home to phones, Android Auto, TVs, Chromebooks, and more, bringing the number of devices where you can reach users to over 500 million. Actions are now available in 25-plus locales, with more coming this year.

It still takes some work and technical know-how to get up and running, and that's why we provide templates to get you started, no programming knowledge required. You can create fun and engaging conversational actions like Personality Quiz, Trivia, and Flash Cards. All you have to do is come to the Actions console, follow a step-by-step wizard to get started with the template, and then fill out a Google Sheets spreadsheet with the content. Of course, I know most of you are developers, so that's never going to be satisfying enough.
SPEAKER 2: Google gives me an amazing tool for natural language understanding called Dialogflow. It turns unstructured natural language input into a structured representation that my endpoint can consume. Dialogflow sits in the middle, between the Assistant and my server, and it does that for me with hybrid machine-learning-powered intent matching and entity recognition.

The way Dialogflow intents work is that you train them to listen for phrases the user might speak in each turn of the dialogue; each semantically similar set of phrases is called an intent. With the power of machine learning, when the user says anything similar to the phrases you've provided, at any point in the conversation, Dialogflow matches that intent.

But how am I going to account for the variation in the way people refer to each programming language? Programming languages have nicknames. Well, Dialogflow lets you designate synonyms for each canonical entity value.
Now the user is going to respond by asking about some specific language. Let's say they say, "Tell me about Kotlin." This time, the Google Assistant hands off directly to my action, because the user has already entered the dialogue, and it handles the speech-to-text transcription. Dialogflow matches the "which language" intent I described earlier, but this time, instead of responding itself, it reaches out to my webhook hosted on Cloud Functions for Firebase, which dynamically pulls information about that language and presents it to the user.
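The talk doesn't show the fulfillment code itself, but a minimal sketch of such a webhook might look like the following, assuming the `actions-on-google` client library on Cloud Functions for Firebase, a Dialogflow intent named `which language`, and a `language` parameter backed by the entity above; the facts map is made up for illustration.

```typescript
// Minimal fulfillment sketch: handles the "which language" intent and
// responds with information looked up dynamically at request time.
import * as functions from 'firebase-functions';
import { dialogflow } from 'actions-on-google';

const app = dialogflow();

// Hypothetical data source; a real action might query an API or database here.
const languageFacts: Record<string, string> = {
  Kotlin: 'Kotlin is a statically typed language that targets the JVM and Android.',
  Python: 'Python is a dynamically typed language known for its readability.',
};

app.intent('which language', (conv, params) => {
  const language = params.language as string; // filled in by the entity match
  const fact = languageFacts[language];
  if (fact) {
    conv.ask(`${fact} Would you like to hear about another language?`);
  } else {
    conv.ask(`I don't know much about ${language} yet. Want to try another language?`);
  }
});

// Expose the Dialogflow app as an HTTPS Cloud Function.
export const fulfillment = functions.https.onRequest(app);
```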
We're also announcing something really cool that you've been asking for for a long time: you can now create alpha and beta versions of your actions and share them with a limited audience for feedback and iteration.
SPEAKER 3: Let's talk about how built-in intents can also help you get discovered. As you can see here, there are a ton of languages, dialects, cultural differences, and slang, which all add up to a million different ways to say the exact same thing. Built-in intents provide a common taxonomy for actions across all of our Google surfaces. In this scenario, for example, if somebody says, "Hey Google, I want to start meditating," we can suggest a few different actions that we know will service that need and are indeed meditation apps. Here are some of the built-in intents we're working on this year; the ones that are starred will soon be in developer preview.
The last tool I'll talk about today for discovery is called Action Links, which is an awesome tool to help you promote your experience. You'll see here in the Headspace example that we, Google, will actually promote these experiences in our own Assistant directory, where a user can deep link directly from the directory into your action on the device of their choice. Action Links are really, really easy to implement: you simply generate a URL and then share it across all of your channels.
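The talk doesn't show what that URL looks like, and the Actions console generates the exact link for you, but as a rough, assumed illustration, an Action Link is a plain URL that identifies your action and can optionally name an intent and parameters to invoke, along these lines (the UID, intent, and parameter are placeholders):

```
https://assistant.google.com/services/invoke/uid/<your-action-uid>?intent=Meditate&param.duration=10
```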
We will soon open up the ability for app developers to sell digital goods on the Assistant. This will work across both phone and voice surfaces, and we're thinking of three key use cases as we launch this: in-app digital goods, subscription services, and, lastly, digital media purchases.
What we know is that people are creatures of habit, so we've identified some of these clusters where the Assistant can actually provide extra value. This new feature, Routines, lets users combine multiple actions into a single command.

Daily updates let Google share a notification and pull an update from your action at a time of day the user specifies. For example, you could get transportation or news alerts at the same time every morning, if Google understands that about you.
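The talk doesn't go into the wiring, but with the `actions-on-google` library a daily-update registration might be sketched roughly like this; the intent names (`setup_daily_update`, `daily_update_finished`, `tell_commute_update`) and the assumption that the setup intent is attached to the `actions_intent_CONFIGURE_UPDATES` event are illustrative, not from the talk.

```typescript
// Sketch: when the user opts in, ask the Assistant to trigger the
// hypothetical "tell_commute_update" intent once a day.
import { dialogflow, RegisterUpdate } from 'actions-on-google';

const app = dialogflow();

// Dialogflow intent attached to the actions_intent_CONFIGURE_UPDATES event.
app.intent('setup_daily_update', (conv) => {
  conv.ask(new RegisterUpdate({
    intent: 'tell_commute_update', // the intent the Assistant will run each day
    arguments: [{ name: 'route', textValue: 'home-to-work' }], // optional payload
    frequency: 'DAILY',
  }));
});

// Follow-up intent: confirm whether the registration actually succeeded.
app.intent('daily_update_finished', (conv) => {
  const registered = conv.arguments.get('REGISTER_UPDATE');
  if (registered && registered.status === 'OK') {
    conv.close('Great, I will send your commute update every morning.');
  } else {
    conv.close('Okay, no daily updates for now.');
  }
});
```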
Push notifications are really exciting because they can help you do things in real time, things like getting stock alerts from your favorite news publisher. The value to users is that it gives them access to relevant information that is immediate, content-rich, and specific to them.
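Again as a hedged sketch of how this looks in fulfillment code (not shown in the talk), asking the user for permission to push a hypothetical `stock_alert` intent might look like this with the `actions-on-google` library; the `setup_push` and `finish_push_setup` intent names, and the assumption that the latter is attached to the `actions_intent_PERMISSION` event, are illustrative, and actually delivering the notification later requires a separate server-side call to the Actions API (`conversations:send`) that is omitted here.

```typescript
// Sketch: request push-notification permission for the hypothetical
// "stock_alert" intent, then capture the user ID needed to send pushes later.
import { dialogflow, UpdatePermission } from 'actions-on-google';

const app = dialogflow();

app.intent('setup_push', (conv) => {
  conv.ask(new UpdatePermission({ intent: 'stock_alert' }));
});

// Dialogflow intent attached to the actions_intent_PERMISSION event.
app.intent('finish_push_setup', (conv) => {
  if (conv.arguments.get('PERMISSION')) {
    // Persist this ID; it is required when calling the Actions API later
    // to deliver the push notification for the "stock_alert" intent.
    const updatesUserId = conv.arguments.get('UPDATES_USER_ID');
    console.log('Store for later pushes:', updatesUserId);
    conv.close('Okay, I will ping you when there is a stock alert.');
  } else {
    conv.close('No problem, I will not send notifications.');
  }
});
```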
What's really valuable to brands, and to some of you folks here as developers, is that this can really help you enhance your utility as a partner to your users.

So thank you, everyone, so much for having us. Enjoy the rest of your week. [APPLAUSE] [MUSIC PLAYING]

 
