WWDC 2018 Updates

We are very excited to be attending WWDC 2018 in San Jose this week. We will be keeping you posted on all things #WWDC18. Don't miss out!


Day 4 - Machine Learning and Customer Focus

It was the final day at NextDoor, with some great sessions ranging from test-driven development and developer tools to Unity and ARKit.

From WWDC I spent more time on machine learning sessions and continued experimenting with Create ML, this time with text classification. I also started working with the more advanced (yet not significantly less approachable) Turi Create, Apple's open source Python toolset for creating Core ML models. Turi Create supports a wider range of machine learning tasks, including object detection, image similarity, style transfer, recommendations and clustering. I was impressed by how I could generate an object detection model locally on my Mac just as easily as I could create a simple text classifier in Create ML. Training times are significantly longer, but being able to do this on a single MacBook at all is ridiculously impressive.
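
For a flavour of how little ceremony is involved, here's a minimal sketch of a Create ML text classifier; the data file and column names are assumptions of mine:

```swift
import Foundation
import CreateML

// A labelled data set with "text" and "label" columns; the file name
// and column names here are placeholders for illustration.
let data = try MLDataTable(contentsOf: URL(fileURLWithPath: "reviews.json"))
let (trainingData, testingData) = data.randomSplit(by: 0.8, seed: 5)

// Train the classifier; Create ML handles tokenisation and feature
// extraction for us.
let classifier = try MLTextClassifier(trainingData: trainingData,
                                      textColumn: "text",
                                      labelColumn: "label")

// How well did training go?
print("Training accuracy: \(100 * (1 - classifier.trainingMetrics.classificationError))%")

// Export a .mlmodel for use with Core ML in an app.
try classifier.write(to: URL(fileURLWithPath: "SentimentClassifier.mlmodel"))
```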

The focus of Apple's ML tools on the tasks to perform, rather than on the specific ML algorithms, is what I appreciate most about them: we can think first and foremost in terms of the feature we want to provide to the user and the data we need to enable it.

This user-centric approach is common on Apple's platforms and is something that resonates with us at Marino Software. One of the very first statements by Tim Cook at the keynote on Monday was that "[Apple aims] to put the customer at the centre of everything". This was about more than just providing new user features, as Apple proved with the very first technical demo, which showed performance enhancements on older devices, and with the first topic in the State of the Union (the developers' real keynote) being all about security and privacy. Performance, security, and privacy are all pillars of good customer app experiences.

Of course there were other new features too, like Siri Shortcuts, and how they are implemented reveals more about Apple's focus: doing more on device, whereas competitors are satisfied to offload work to the cloud. Core ML likewise runs locally on the device, with Apple's end-to-end control of the platform (from the chips, to low-level frameworks like Metal, to high-level SDK features) all working together to provide highly performant new features without compromising user data.

It has been a great week at WWDC; tomorrow I'll be getting ready to head home equipped with a tonne of new tools and knowledge, all focused on providing better customer experiences. There are more sessions left to cover though: the ones on the new OS logging features, on the changes to Auto Layout that bring the advertised performance boosts to our apps, and design-focused sessions like 'Designing Fluid Interfaces' will keep me busy on the long journey home.

Day 3 - Training Day

There were some great sessions at NextDoor discussing architectural design patterns and philosophies that I really enjoyed and learned a lot from. Besides the conference sessions though, once again I wanted to take a deeper dive into one of the main topics from the keynote and start writing some code.

Today I investigated Create ML, Apple's new framework for training machine learning models natively on the Mac, with Swift. The introduction of Core ML last year meant we could take existing models, convert them to the Core ML format (.mlmodel), incorporate them into our apps and start using them to make predictions about images and natural language (e.g. classifying the type of flower in a picture or the sentiment of a sentence).
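
As a rough illustration of that prediction side, here's a sketch using Vision with a hypothetical FlowerClassifier model, standing in for the class Xcode generates when you add an .mlmodel to a project:

```swift
import CoreML
import Vision

// FlowerClassifier is hypothetical: it stands in for the Swift class
// Xcode generates automatically for an .mlmodel added to the project.
func classifyFlower(in image: CGImage) throws {
    let model = try VNCoreMLModel(for: FlowerClassifier().model)

    let request = VNCoreMLRequest(model: model) { request, _ in
        // The top observation carries the predicted label and confidence.
        guard let best = (request.results as? [VNClassificationObservation])?.first else { return }
        print("Looks like a \(best.identifier) (confidence: \(best.confidence))")
    }

    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```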

The promise of Create ML is that we can now just as easily create our own models, in just a few lines of Swift, quickly and efficiently, right on a Mac with no cloud services required.
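
Here's a minimal sketch of that training code; the CSV path, column names and metadata below are placeholders of mine:

```swift
import CreateML
import Foundation

// Load the 2017 property price register export; the file path and the
// "price" target column are assumptions for illustration.
let data = try MLDataTable(contentsOf: URL(fileURLWithPath: "PPR-2017.csv"))

// MLRegressor tries the supported algorithms (linear, decision tree,
// boosted tree, random forest) and keeps the best performer.
let regressor = try MLRegressor(trainingData: data, targetColumn: "price")

// Export a .mlmodel ready to drop into an app and use with Core ML.
let metadata = MLModelMetadata(author: "Marino Software",
                               shortDescription: "Predicts Irish property prices")
try regressor.write(to: URL(fileURLWithPath: "PropertyPricer.mlmodel"),
                    metadata: metadata)
```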

And it certainly lived up to the promise: in just a handful of lines of Swift like those sketched above, I created a machine learning model from tabular data (which is supported in addition to images and text) using the entire Irish property price register from 2017 in CSV format. It took less than a tenth of a second to process the 51,639 properties on my MacBook Pro, and now we have a .mlmodel file that can be added to an app and used with Core ML to make predictions.

One powerful feature to note here is that, while a number of algorithms are supported and you can choose and configure them depending on your dataset, you can also use the MLRegressor class as I did here and let the system run all the algorithms and choose the best one (in this case it chose boosted tree regression).

Also, check out the colour scheme in that Swift playground: a couple of days ago I was sticking to my white background and dark text, but Mojave has sold me on dark mode!

Day 2 - Investigating Siri Shortcuts

Today I took in sessions from NextDoor, AltConf and WWDC itself, learning about advanced techniques for Xcode build configuration, debugging, testing and training machine learning models.

But what I was most excited about was getting to dig into the new Siri capabilities that were announced yesterday and start writing code and building features with them.

Yesterday, my response to the Siri announcement was that it was "more like defining a macro rather than Siri understanding the real meaning of your words" but I think Apple undersold how powerful the new features are. Sure, you can donate (that's the language used by the SDK) a simple action to the system, described in an NSUserActivity, and have the user provide a specific phrase to invoke it, but you can also define completely custom Intents to use in shortcuts - complete with different parameter combinations and response templates.
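
The NSUserActivity route really is simple; here's a rough sketch, where the activity type, titles and phrase are all invented placeholders:

```swift
import Intents
import UIKit

// A sketch of donating a simple activity-based shortcut; the activity
// type string, title and suggested phrase are invented placeholders.
func donateOrderCoffeeShortcut(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.app.order-coffee")
    activity.title = "Order my usual coffee"

    // New in iOS 12: opt the activity into Siri predictions and give it
    // a stable identifier so donations can be deleted later if needed.
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true
    activity.persistentIdentifier = NSUserActivityPersistentIdentifier("order-coffee")
    activity.suggestedInvocationPhrase = "Coffee time"

    // Making it the view controller's current activity donates it.
    viewController.userActivity = activity
}
```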

Very quickly, I was able to add a 'check balance' Siri shortcut to one of our apps, where the user can define which account to check. 

This can be invoked by the user at any time using a custom phrase, and it will also be suggested to the user by Siri, including on the Siri watch face on Apple Watch, even though this particular app doesn't have a watch app.
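
Donating a custom intent like that looks something like the sketch below, where CheckBalanceIntent stands in for the class Xcode generates from the intent definition file, and the account parameter is hypothetical:

```swift
import Intents

// CheckBalanceIntent is hypothetical: it stands in for the class Xcode
// generates from a custom intent definition (.intentdefinition) file.
func donateCheckBalanceShortcut(accountName: String) {
    let intent = CheckBalanceIntent()
    intent.account = accountName
    intent.suggestedInvocationPhrase = "What's my balance?"

    // Donating an interaction tells the system the user performed this
    // action, so Siri can suggest it later, including on the watch face.
    INInteraction(intent: intent, response: nil).donate { error in
        if let error = error {
            print("Donation failed: \(error)")
        }
    }
}
```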

Right now, it doesn't look like these custom intents can be used in all the same ways the system-provided ones can (i.e. they're just for shortcuts).

But it's easy to imagine that in future releases they'll gain more capabilities and we'll be able to provide the system with enough information about the intent for the user to be able to perform the actions without needing to use a specific phrase.

The custom intent definition is reminiscent of an Alexa skill definition, the difference being that with Alexa we can provide a large number of sample invocations, which gives the user more flexibility in how they invoke your skill.

Day 1 - Stocks App, Seriously?

While I’m getting ready to dive into the real keynote from a developer’s perspective, the Platforms State of the Union, here’s my quick take on this morning’s keynote:

No Device Left Behind

iOS 12 will support all the devices currently supported by iOS 11, and Apple say they’re specifically focusing on performance improvements for those old devices. Great news for members of the real world who don’t get the highest end new iPhone every single year. 

This, along with Apple again showing how quick they are to get everyone to update, should help us drop older OS versions sooner as we won’t fear cutting off users of popular devices.

ARKit 2.0

This saw my first wishlist item ticked off with 3D object detection, but even more exciting is the multi-user shared experience support that was demoed. That could be a game changer for AR.

Siri

There had to be big news about Siri and thankfully there was. Shortcuts for Siri are more like defining a macro than Siri understanding the real meaning of your words, which was SiriKit’s original pitch, but they will enable us to bring Siri to almost any app.

If we take the example of a mobile network self care app, we can now create a shortcut to say ‘check my balance’ or ‘top up my credit’. Surfacing those shortcuts on the Siri watch face will be especially powerful I think.

The Shortcuts app for chaining together multiple actions from different apps looks like what we’ll be getting out of the Workflow acquisition. It looks promising and will definitely be one to dig into more a bit later.

Managing Your Screen Time

Surfacing data about how much you’re using your phone and empowering you to take control of it is something we at Marino Software have helped pioneer with the Vero app, and it’s great to see Google at I/O, and now Apple, bring system-level controls for this.

watchOS

New abilities for interactivity in notifications look great. More and more I’m coming to think that notifications are the watch apps we want, not actual watch apps. 

I like the idea of getting a low balance reminder and being able to select an amount and top up, all from a notification on your watch; that’s the sort of thing we’ll be able to build on watchOS 5.
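
As a sketch of how that could hang together (the identifiers and amounts are invented), the iPhone app registers a notification category whose actions offer top-up amounts, and watchOS 5 can then surface those actions right on the notification:

```swift
import UserNotifications

// A sketch of the low-balance scenario: a notification category whose
// actions let the user pick a top-up amount straight from the
// notification. Identifiers and amounts are invented for illustration.
func registerTopUpNotificationCategory() {
    let topUp10 = UNNotificationAction(identifier: "top-up-10",
                                       title: "Top up €10",
                                       options: [.authenticationRequired])
    let topUp20 = UNNotificationAction(identifier: "top-up-20",
                                       title: "Top up €20",
                                       options: [.authenticationRequired])

    let lowBalance = UNNotificationCategory(identifier: "low-balance",
                                            actions: [topUp10, topUp20],
                                            intentIdentifiers: [],
                                            options: [])

    UNUserNotificationCenter.current().setNotificationCategories([lowBalance])
}
```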

WebKit was announced for watchOS, which may seem like sort of a strange one (who wants to read a web page on their wrist?), but there have been times when I’d have loved to follow a link or have a quick read of something, and it was a pain to take out the phone, so I think this is a win.

Could we have WebKit on tvOS next please?

Another wishlist item that got checked off was better audio support; I’m looking forward to improved apps from Overcast and Castro.

Apple TV

Speaking of tvOS, there was a whole section on it … but I can’t remember them showing off anything by way of new developer features. I’m hoping for more in the state of the union for sure.

macOS

I couldn’t believe the Stocks app was getting so much air time in the keynote and was worried at first that they were really struggling for filler, but in the end it all made sense. They’d ported these apps from iPhone to iPad and the Mac using the much rumoured but, even after today, not terribly well understood new cross-platform project.

I strongly believe that apps on different platforms should have different UIs and I really don’t want to see a bunch of iOS apps showing up on Mac, so I’ll need to see more detail on this.

It will probably be next year’s WWDC before I’m sold on it being a good thing.

Dark mode on Mojave looks great. I’m generally not a fan of dark modes and I like my white background and dark text in Xcode, but I know a lot of people will love this, and it looks like they’ve done a good job on it.

Being Next Door

I’ve watched all this from next door, at The Next Door conference: a much more relaxed atmosphere, with much less (no) queueing; it was much cheaper, the lunch was much nicer, and it was easier to get talking to new people. It’s been great so far and I’m looking forward to the rest of the week here.

I’m really glad I’m here, but I’d still rather be at WWDC!

