5th June 2018

Kicking off my collection of writing on WWDC 2018, I’m going to talk about Siri Shortcuts, and the Shortcuts app. As soon as I saw it, I knew it would be one of my favourites from the whole event.

The announcement was received by most people as "now we know what the Workflow team have been up to", and I'm not complaining: I posted the same thing. It's probably the best outcome the Workflow acquisition could have had, because the app is now completely tied into the OS. It may have a different name, but it will always be Workflow.

The features announced were really about how the OS interacts with shortcuts, and how Siri becomes more intelligent because of them. Not the voice Siri, but the computational Siri that can understand you and suggest things.

It will, of course, require developers to expose the different user actions in their apps, which will allow Siri to analyse their usage, suggest them later on, and maybe also let users build with them in the Shortcuts app.

There were some intriguing demos of the suggested actions, which is not something I often say, because demos are usually based around unrealistic or ideal situations. But Siri lives on your phone: it knows about you, your habits, and the environment around you. That's why it can suggest you turn on Do Not Disturb when you go to the cinema.

They also showed an example of a regularly occurring event, such as buying a coffee in the morning. Maybe not everyone buys a coffee from an app on their phone every day, but I use the Starbucks app every time I go, and that could easily be at least three times a week. So if Siri learned (or just used Maps to find) the location of Starbucks and recognised it was associated with that action, that would be very helpful! It's certainly something I feel Siri would be capable of, and not one of the usual Siri features that are nice to think about but never used.

It does get more advanced though, and that's with the Shortcuts (Workflow) app. I think of it as being similar to Scenes in HomeKit, where you can say a phrase such as "Good morning", and Siri performs a bunch of tasks to set you up for the day. Maybe it sort of encompasses the automation side of HomeKit?

I’ve already been playing around with the iOS 12 beta, and while I’ve already been suggested some actions, like enabling an alarm, messaging my girlfriend, and even adding a new to-do in Things, we don’t have the Shortcuts app yet. That will come in a later update via the App Store, so I will definitely have to write more about it in the future. But from the keynote, it looks like they’ve added the Apple style to Workflow, which will definitely make it feel easier to use for general users.

One of my questions, though, is how well suited this is to a general user. I will be very keen to see whether it becomes a widely adopted feature, and even if the Shortcuts app with custom actions doesn’t, I can see the Siri suggestions being a big hit.

Read more of my coverage of WWDC here.

1st March 2018

Matt Birchler has a pretty common request for Siri, and that is the ability to schedule requests.

While all of these assistants can turn things on, turn them off, move things up and down, and such, they can only do those things now. I can turn on the lights now. I can open the garage door now.

It makes so much sense for this to be supported. Sure, you might be able to schedule actions inside an app. But if voice is an official method of input, you should be able to do everything with it.

There’s not even a particularly high barrier to creating a delay/schedule system. The simplest method I can think of is that when a voice assistant hears a request with an associated time, it stores that exact request (even as plain text) along with the date/time. The system then sets its own reminder, and at that time it simply performs the request automatically and deletes it from the queue.
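To show how little machinery that idea actually needs, here is a minimal sketch of the store-and-replay queue in Python. Everything here is hypothetical: `RequestScheduler`, `schedule`, and `run_due` are names I've made up, and a real assistant would use a system timer rather than polling, but the core logic (keep the plain-text request with a due time, replay and delete it when the time arrives) is exactly what the paragraph above describes.

```python
import heapq
import time


class RequestScheduler:
    """Stores plain-text requests with a due time and replays them later.

    A min-heap keeps the soonest request at the front, so checking
    whether anything is due is a single comparison.
    """

    def __init__(self, perform):
        self._queue = []          # min-heap of (due_time, request_text)
        self._perform = perform   # callback that re-runs a stored request

    def schedule(self, request_text, due_time):
        """Remember the exact request text alongside its due timestamp."""
        heapq.heappush(self._queue, (due_time, request_text))

    def run_due(self, now=None):
        """Perform and remove every request whose time has arrived."""
        now = time.time() if now is None else now
        performed = []
        while self._queue and self._queue[0][0] <= now:
            _, request = heapq.heappop(self._queue)
            self._perform(request)   # hand the stored text back to the assistant
            performed.append(request)
        return performed
```

For example, "open the garage door at 8pm" would become `schedule("open the garage door", due_time)`, and a periodic (or timer-driven) call to `run_due()` would feed the stored text back through the normal request pipeline at the right moment.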

1st September 2016

In a new article released by Apple today, “Developers Preview New Ways to Make Payments, Send Messages and More Using Siri”, they preview how a few payment providers have created a Siri experience for their service/app.

Developers are previewing new Siri integrations for payments, messaging apps and more, creating new experiences using “Hey Siri” to simplify everyday tasks using just your voice. Siri can already help you send an iMessage to a friend, but with the introduction of SiriKit for developers, messaging apps can now tap into the power of Siri. You can use your voice to do things you couldn’t do before, like ask Siri to send a secure payment without ever opening an app.

I’m really happy to see Monzo (formerly Mondo) on the list; they’ve proven to be a really forward-thinking (and now an official bank) service!

There are a few more previews in the article as well, which aren’t for payments, but show some future third-party integration with Siri.

With all of these integrations coming to Siri, it may finally become useful.