Top 10 Development Picks from Google I/O 2018

May 21, 2018

From emails that pretty much write themselves to digital wellbeing initiatives, the recent Google I/O keynote had something for everyone.

As developers, however, we’re most excited by the bits that make OUR working lives simpler and more productive – and, in turn, better for our customers. Here’s our top ten:

  • Android Things 1.0: officially launched ahead of the conference, Android Things is Google’s very own IoT platform, promising to help developers prototype new ideas for a variety of uses. The platform is being built into speakers and smart displays from major manufacturers, so it’ll be very interesting to see how our two main areas of interest – hardware and embedded technology – can be used to benefit Coderus customers.
  • Jetpack: this set of handy tools and components consolidates the existing support library and adds new tools for app navigation, background job scheduling, and more.
  • Material Theming: an innovative system that looks to overcome a heap of everyday pain points. It’ll help us build quality apps more quickly, streamlining things like uploading, modifying and applying styles within the Material guidelines while maintaining brand identity.
  • Adaptive Battery: another smart move from the Android team – on-device machine learning monitors how you use your apps and restricts background activity for those you haven’t used in a while. Google says the feature delivers around a 30 per cent reduction in CPU wake-ups for apps.
  • Google Duplex: the demo conversation between a real person and Google’s AI-powered assistant had our jaws dropping. It takes machine learning to the next level.
  • Android App Bundle: this new publishing format lets Google Play serve each device only the code and resources it needs, shrinking download size and making your app more appealing to install. While it’s playing catch-up with iOS app thinning (of sorts), it’s great to see smaller footprints on the device.
  • Google Lens: makes use of AI with real-time recognition of multiple objects, plus search. Pan your camera across your desk, for example, and Google Lens will identify the books on it and surface their reviews – and that’s just for starters.
  • Actions: based on usage history, Actions link to app content from other parts of the OS, helping users jump straight to what they need. Actions are also context-aware, so if you plug in a pair of headphones, Actions might suggest a podcast app.
  • Navigation: the improvements coming to Google Maps navigation will definitely ease the UX – and maybe make the journey more interesting than the destination.
  • Slices: allow you to embed parts of your application into other apps or the OS itself. These interactive snippets let users see content of interest without having to open the full app, saving taps and time. Slices are currently limited to surfacing in Google Search results, but there are lots of possibilities once that’s extended.
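
Of the items above, the app bundle is one you can try straight away. With a recent Android Gradle plugin, opting in is mostly a packaging change; the sketch below shows the standard `bundle` block in a module-level build.gradle, which controls how Google Play is allowed to split the generated APKs (the flag values here are illustrative – all three default to enabled):

```groovy
android {
    // Controls which configuration splits Google Play may use when
    // generating per-device APKs from the uploaded bundle.
    bundle {
        language {
            // Ship only the languages on the user's device;
            // set to false to include all languages in every APK.
            enableSplit = true
        }
        density {
            enableSplit = true   // split resources by screen density
        }
        abi {
            enableSplit = true   // split native libraries by CPU architecture
        }
    }
}
```

Running `./gradlew bundleRelease` then produces an `.aab` file instead of an APK, and Google’s `bundletool` utility can build the device-specific APK set locally for testing before you upload to the Play Console.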

What’s next for the development community? Make sure to book your place at our next Apple Live Stream event to find out! Before the Live Stream, there will be ample opportunity for networking and the chance to meet the team as we demo our own innovations.

Event: Apple Worldwide Developers Conference
When: 4th June 2018, 16:30 – 20:00
Where: Ipswich Waterfront Innovation Centre

