Yesterday was iOS 12 release day, and it was a pretty good day overall, so I thought I’d share some of my personal highlights.
Of course, iOS 12 and the new Shortcuts app were released, so it was good to see everyone discover the new features that some of us developers and beta users have been playing around with for a while.
I released version 1.3 of Text Case which includes support for Siri Shortcuts. It was covered by John Voorhees at MacStories, which was great! And it was met with a good response from quite a lot of people. It’s by far my favourite app I’ve developed, and it’s been super fun seeing where I can integrate the functionality of Text Case throughout the system.
Then, of course, there’s the annual event for nearly everyone in the Apple community, Federico Viticci’s iOS review! He puts so much work into it, and you can always see that when you read it. This year it’s packed with some great Siri Shortcuts information and even more if you’re a Club MacStories member (which I am!).
There were also a bunch of other apps that released updates with support for Siri Shortcuts, which was great to see from both a developer and a user perspective. My favourites so far are PCalc, Bear, Ulysses, Citymapper, Overcast, CARROT Weather, and Things. And now that password managers can integrate properly into iOS, I’m also going to look at using 1Password, LastPass, or even one of the many others that have been updated.
The best part of the day was probably Twitter. While it hasn’t received a lot of praise recently, the Apple community is a major reason why I can’t see myself leaving it. Everyone was happy about the updates, there were a bunch of conversations about Shortcuts, and generally everyone was having a good time! I even made a little snarky remark about Mastodon users. Phil Schiller even shared some unfortunate news about the Squirrel from the 4S introduction event.
Now I just need to wait until my XS arrives on Friday, so I can try out the parts of iOS that I’ve been missing. Like Animoji, Memoji, Portrait Lighting, the new Depth Control, and even FaceID!
…if you want the best Google software, iOS is really the place to be.
That sounds crazy, and maybe for some people it is, but as someone who relies heavily on Google’s software in both my personal and professional life, iOS has been a great platform for getting everything done that I need to do. Not only that, but a shocking number of Google apps are updated first on iOS, or are totally exclusive to iOS for months before coming to Android. And with new apps like Files and updates to Siri intents, Google’s apps can interact more closely with iOS than they ever could before.
I can’t say I’m well versed in the Android ecosystem, but I am aware of it. I pay attention to Google I/O announcements, and of course, there’s an Android developer at work so I have at least some perspective.
The only issue, or at least the biggest one I can determine, is the obvious level of fragmentation. This used to be the argument about app design and quality, back when the iPhone came in just one size and Android already had loads of variety.
The fragmentation that I think causes these problems comes from the multiple Android vendors and mobile networks, which introduce needless bottlenecks to the whole platform. Whether it’s a small update that gets ignored by certain manufacturers, or a major release that takes extra time for a company like Samsung (just picking one at random) to layer their software on top before shipping it to consumers, I just don’t think the wide variety of Android phones combines into a stable ecosystem.
That’s a whole lot different with iOS, though, because there’s less device variety, a higher percentage of users are on the latest version of the OS, and the App Store is a widely known success. I think this is why Google do so well: they can leave the foundation work to Apple, which leaves them with just the software. And I can admit they make pretty good software.
Theo Strauss, writing about Lyft’s new implementation of the search bar, and why it’s best placed at the bottom:
Although we don’t think about it too often, a search bar all the way at the top of the screen is hard to reach. Especially for users who have smaller hands or users who have less flexible hands, reaching up is annoying, mostly because the top of the screen is far away from where their fingers sit.
If you visualize most apps, the main content is in the middle or lower-mid area. Tab bars for navigation, posts on social media, and keyboards on messaging platforms are all examples of important pieces of experiences sitting in a more reachable position.
I feel exactly the same. The ability to search within an app, and the main navigational controls, should be the most accessible parts of the interface.
In a world where we use tools such as a mouse or a laptop trackpad to direct a cursor around a screen, a classic vertical layout, with all navigation at the top and the content filling the rest of the space, is probably fine.
However, nowadays we interact with the content on our displays directly, so it needs to be designed with a human hand in mind, not a cursor.
You can already see Apple pushing developers and designers towards this bottom-up approach, with the “pull up” drawer-like component containing a search bar and results in the Maps app. This is the approach I feel needs to be standardised going forward, but it isn’t the only one: the Music app also follows the idea of having controls at the bottom, with the now-playing indicator sitting there.
I do see this becoming a trend very soon, and I suspect that in a few months quite a lot of apps will be using a sheet similar to the one in Apple Maps. The only drawback is that Apple don’t provide a standard implementation of this bottom sheet, so developers either have to implement it manually or adopt a third-party library.
I’ve been experimenting with it at work, and I’ve found one library to be very useful: PullUpController by Mario Iannotta. It provides a simple one-liner to add any view as the bottom sheet, handles the sticky points and inner scrolling views, and you can also extend it to your needs.
Hopefully Apple can share their implementation and more developers can make use of this new interface style.
With all the nostalgia of the early App Store and iOS SDK days, Frederik Riedel tweeted about his experience developing iRedstone:
After he tweeted that, other developers started quoting it, and sharing their experiences. Frederik has compiled a great collection of them over on his blog.
It hasn’t been long since the release of Text Case, but I’ve already had some great suggestions, so I decided to add them in!
So here it goes.
Five extra formats:

- URL Decoded
- Capitalise All Words
- Camel Case
- Snake Case
- Hashtags
One format has been “fixed”, and that is Capitalise. It now does the obvious and also capitalises the first letter after a period.
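To give a sense of what a few of these formats actually do, here’s a rough Python sketch. Text Case is an iOS app, so this is purely illustrative and not its actual implementation; the behaviour shown (including capitalising the first letter after a period) just follows the descriptions above.

```python
import re

# Illustrative sketches of a few Text Case formats.
# Not the app's real code -- just the described behaviour.

def snake_case(text: str) -> str:
    # Lowercase words joined by underscores: "Hello World" -> "hello_world"
    return "_".join(text.lower().split())

def camel_case(text: str) -> str:
    # First word lowercased, subsequent words capitalised, spaces removed.
    words = text.split()
    if not words:
        return ""
    return words[0].lower() + "".join(w.capitalize() for w in words[1:])

def capitalise(text: str) -> str:
    # Capitalise the first letter of the text, and the first letter
    # after a period -- the "fixed" behaviour described above.
    return re.sub(r"(^|\.\s+)([a-z])",
                  lambda m: m.group(1) + m.group(2).upper(), text)

print(capitalise("one sentence. another one."))  # One sentence. Another one.
```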
You can now choose which formats you want to enable by navigating to the Settings page and flipping the switches. This allows for a more customised interface, as I imagine some people won’t want all 12 formats to show if they aren’t needed.
I still have two things I want to work on. One is the ability for the action extension to be able to replace the original selected text with the new converted value. The other is a pretty great idea that I can’t share until I figure out how exactly I’m going to implement it. But it will be an advanced feature.
I’d also like to say thank you to everyone that has already downloaded Text Case, and I plan to keep adding useful updates!
If you haven’t already, you can download Text Case on the App Store.
Very unsurprisingly, iOS 12 brings better notifications support. There aren’t too many changes, but they are certainly most welcome.
The big one is grouped notifications. It’s probably the notification feature I’ve wanted the most, and Android used to make me constantly jealous of it.
I’ve not quite worked out the requirements for them to group together, because I’ve seen iMessage conversations automatically group, but other apps group after 4 or so individual notifications.
There are three options for grouping your notifications: automatic, by app, and, of course, none. The interesting one is automatic grouping, because apps can actually help the OS work out which notifications should be grouped together by providing different identifiers. I’m not going to go too much into the technical side, but you’ll notice that Messages will group the messages from each conversation together, while keeping the conversations themselves separate on your lock screen.
There’s going to be more to find out about grouping though, as I’m sure there are different qualifiers that will change the way the system handles them.
As for the actions you can take on notifications, you can now control how any app’s notifications are configured, right from the lock screen. All you need to do is swipe right-to-left on a notification and tap Manage. Then you’ll find three different options (depending on the current settings):
- Deliver Quietly/Prominently (The opposite of what is currently set)
- Turn Off
- Settings (This takes you straight to the app’s notification settings, so you can fine-tune everything.)
These are all welcome options, and I particularly like Deliver Quietly for apps that I want information from but don’t care that much about. The underlying settings have always been available, but they’ve always been a hassle to get to, and the Quietly/Prominently options make them simple and clear enough for everybody to understand.
Do Not Disturb
Something else related to notifications is Do Not Disturb, which also received a few improvements.
It’s actually been split into two different levels of not disturbing you, differentiated by the Bedtime Mode option. Normally, Do Not Disturb just means it won’t notify you, but if you turn on Bedtime Mode, it will keep your screen completely free of distractions until the time period is over or you turn it off. Something that makes a lot of sense.
It also benefits from “Siri” (the intelligence in your device, rather than the voice), because it’s something else it can suggest for you, triggered by a time, location, or event. I’ve already seen this a few times, once where it suggested I turn it on, but only until an event in my calendar was over. Pretty clever.
These are some fantastic improvements to how notifications work in iOS, but there’s still one more thing I’d like from Android: the ability to mute specific notification categories. Apps can already define categories, as that’s part of how iOS groups notifications, but Android users can choose to mute specific categories from an app, making it an even more personalised system. However, that’s very much a want, and not a need.
Read more of my coverage of WWDC here.
If you read my recent piece on refining how I use my devices to maximise their value, you’ll know that the one major thing I wanted to help with that was more insight into how I use them. Screen Time is that thing.
In the most Apple way, the data is accompanied by pretty graphs, and there’s quite a bit of information available. You can see the apps that have taken up your screen time, how many notifications you receive from each app, how often you pick up your phone, and even what your longest session was.
I haven’t received one yet, of course, but Screen Time will also give you a weekly activity summary, which would be a good time to reflect on how the week went and then take measures to ensure you use your devices in the ideal way.
If you want to be more strict with yourself, there are some settings you can play around with to ensure you know when to stop looking at your phone.
Downtime is a period of time during which you can’t open any applications that aren’t in your Allowed list, ideal for setting a strict bedtime. Then you have App Limits, where you set an amount of time that you’re allowed to spend on a specific app or category, and these can even be specific to each day of the week. Finally, there are a bunch more restrictions you can put on yourself, but these apply more to parents who want to stop their children from accessing certain content, or just ensure they don’t sit on Minecraft all day (what I used to do).
I’m super happy with this feature, and I can’t wait to see my first weekly report. Although I imagine this week’s will be completely skewed, as I’m using my device more than usual to try to find any cool new things in the beta.
Read more of my coverage of WWDC here.
Kicking off my collection of writing on WWDC 2018, I’m going to talk about Siri Shortcuts, and the Shortcuts app. As soon as I saw it, I knew it would be one of my favourites from the whole event.
The announcement was received by most people as “now we know what the Workflow team have been up to”, and I’m not complaining; I posted the same thing. It’s probably the best way the Workflow acquisition could have gone, because now it’s completely tied into the OS. It may have a different name, but it will always be Workflow.
The features announced were really about how the OS interacts with these shortcuts, and how Siri is more intelligent because of them. Not the voice Siri, but the computational Siri that can understand you and suggest things.
It will, of course, require developers to expose the different user actions in their apps, which will allow Siri to analyse their usage, suggest them later on, and maybe also let users build with them in the Shortcuts app.
There were some intriguing demos of the suggested actions, which is not something I often say, because demos are usually based around unrealistic or ideal situations. But because Siri lives in your phone, it knows about you, what you’re like, and the environment around you, which is why it can suggest you turn on Do Not Disturb when you go to the cinema.
They also showed an example of a regularly occurring event, such as buying a coffee in the morning. Maybe not everyone buys a coffee from an app on their phone every day, but I use the Starbucks app every time I go, and that could easily be at least three times a week. So if Siri learned (or just used Maps to find) the location of the Starbucks and recognised it was associated with that action, that would be very helpful! It certainly feels like something it would be capable of, and not the usual kind of Siri feature that’s nice to think about but never used.
It does get more advanced though, and that’s with the Shortcuts (Workflow) app. I think of it as being similar to Scenes in HomeKit, where you can say a phrase such as “Good morning” and Siri performs a bunch of tasks to set you up for the day. Maybe it sort of encompasses the automation side of HomeKit?
I’ve already been playing around on the iOS 12 beta, and while I’ve been suggested some actions, like enabling an alarm, messaging my girlfriend, and even adding a new to-do in Things, we don’t have the Shortcuts app yet. That will come in a later update via the App Store, so I will definitely have to write more about it in the future. But from the keynote, it looks like they’ve added the Apple style to Workflow, which will definitely make it feel easier to use for general users.
One of my questions, though, is how well suited this is to a general user. I will be very keen to see if it’s a widely adopted feature, and even if the Shortcuts app with custom actions isn’t, I can see the Siri suggestions being a big hit.
Read more of my coverage of WWDC here.
After watching the Keynote, I was thoroughly impressed. While there still isn’t a dark mode for iOS, I can imagine it coming soon. And there are a lot of cool things that were announced.
While watching the event, I took a note of the top 4 for each OS, excluding tvOS, because who cares?
So here they are:
- Siri Shortcuts
- Screen Time
- Automatic Workout Detection
- Walkie Talkie
- Interactive Notifications
- Dark Mode
- Dynamic Desktop
- Mac App Store
I plan on doing some writing about the new features, but in more of an opinionated way, rather than a simple informative guide. You’ll find these with the WWDC 18 tag.
With iOS 12’s imminent announcement, I thought I’d prepare myself for a new way of using my devices.
For months now, I’ve been trying to refine my use of the devices, apps, and services that I use. But I think a different approach is needed, and I hope that future OS updates will help me along the way.
The method I’ve been using for a while is quite a harsh one: I disabled notifications, and everything associated with them, for nearly all applications, along with getting rid of some apps and services that I don’t think provide any value.
But while I think this has been a step in the right direction, I don’t think it’s a particularly accurate way to achieve my goal of adapting my devices to my needs, and for it to provide me with the most value as possible.
That’s why I’ve now done a complete reversal and turned on all the notifications, and possible distractions on my iPhone. In the short term, I’m hoping this will let me find out where I don’t need to be spending my time and also see if there is any value to them. I mean, I know notifications can be valuable, but I want the right balance. And by turning them all off, I’m potentially missing out.
So tonight, I’ve already gone through a few apps to disable types of notifications, and in some cases, just deleted the app entirely. For example, I have an app for a restaurant I go to maybe once every two months, but they send at least one offer notification every single day.
What I’m hoping for in the next iOS update are pretty minor things, with the ability to group notifications having the highest priority. I can’t even begin to list the types of apps that would benefit from this, because it’s probably all of them. I also think there can be improvements made to the way notifications are visualised, because even grouped, it’s still just a list.
Then there’s priority: not all pieces of information are equally useful, and even if they are, you might not need to know about them right now. iMessages are more important than likes on an Instagram post, and work emails are certainly not relevant outside of work hours, or maybe even away from a work location. So there’s a lot of work that can be done here, involving sorting, filtering, and queueing/snoozing.
If all of these issues are “resolved”, then I think the way devices are experienced, and even used, will change quite a lot.
There’s also one more tool that would help focus your device usage on a bigger scale, and that would be a way to monitor and visualise your usage, or habits, system-wide. Of course, you can kind of track this already using the battery analytics that show you the screen time for each app, but I want something better, and more in my face, because more insight can only be a good thing.
This is, of course, a long-term goal, and maybe more of a process. But I plan to write about my journey of focusing my usage of devices, and in general, refining my life to maximise value.
I have a few more ideas that I want to try soon, so you’ll find those here on the blog as well.