Will your enterprise app dev teams visit Apple's Dynamic Island?

Apple has successfully turned the part of the iPhone display everyone ignored into the place to be seen. This is an opportunity to build engagement — will you exploit it?


When you think about it, Apple’s newly introduced Dynamic Island feature is likely to become a critical interaction point between apps and users of the iPhone 14 Pro and Pro Max phones that support it. It looks like a great opportunity for B2C and B2B app experiences that can build digital connection with end users.

What is Dynamic Island?

Dynamic Island turns Apple’s boring notch, behind which the Face ID camera sits, into an excitingly interactive area that lets users monitor important app activity (such as delivery service ETA, music playback, and sports scores) and take control of apps.

It’s possible to hold two apps active in Dynamic Island while working in another app. (For many, that will come down to doom-scrolling Twitter while waiting for pizza delivery. Let’s not judge.)

In typical Apple style, the Dynamic Island presents itself in a range of different shapes with fluid animations. The information it presents should be seen as glanceable data — you won’t be reading books or engaging in more sophisticated tasks in Apple’s Dynamic Island. You have other apps for that.

Apple will enable the feature, which depends on new Live Activities frameworks, in a later release of iOS 16. “Once ActivityKit is available in an update to iOS 16, you'll be able to implement Live Activities and interact with them in Dynamic Island,” Apple told developers.

Dynamic Island isn’t universally available. It will only be supported on Apple’s most expensive iPhone models and exploits the pill-shaped area that supplants the notch.

To achieve this, Apple redesigned its TrueDepth (selfie) camera, making it smaller. It has also put a proximity sensor behind the display. What’s important is that if you are looking at your phone, this small interactive area is likely to grab your attention, as it is also where your Face ID sensor sits.

How does Dynamic Island work?

The feature relies on the OLED screen, which can switch off individual pixels, turning them black. This enabled Apple to design a user interface for the area that changes shape to reflect what it is showing. The space is then used to provide small nuggets of information, such as those described above or data relevant to your app.

The Live Activity frameworks Apple made available to developers are designed to give you active insights into app-related information. That means ride sharing and delivery, sports scores, timers, flight or transit times and more.

What Apple said about Dynamic Island

Apple wheeled out Alan Dye, vice president for human interface design, to explain the feature, which turns a section of the display many saw as dead space into a section not to ignore. With customary modesty, Apple said Dynamic Island “blurs the line between hardware and software.”

It explained that the feature does so by sharing real-time alerts in different shapes that reflect what kind of activity is taking place. It’s not a replacement for Notifications, which still exist; instead, it is designed to support ongoing background activities.

In a statement, the company also said:

“Without impeding content on the screen, the Dynamic Island maintains an active state to allow users easier access to controls with a simple tap-and-hold. Ongoing background activities like Maps, Music, or a timer remain visible and interactive, and third-party apps in iOS 16 that provide information like sports scores and ridesharing with Live Activities can take advantage of the Dynamic Island.”

In other words, it works to combine notifications, alerts, and activities in one highly interactive space.

What sort of activities does it support?

Apple described several uses of Dynamic Island during its presentation. Different uses present themselves in different shapes, reflecting the information they're attempting to present.

To support multiple apps, the island divides, presenting itself as a small pill shape to the left with a circle on the right. These can then display different information from different apps. When tapped, the chosen item gets larger so you can interact.

The company showed off a few ways people could use the feature. It is important to stress that Apple has also created frameworks to enable developers to introduce support in their apps. In other words, the examples Apple demonstrated are not the only ways in which the feature may be used.

  • Apple Pay payment confirmations – in this case, the Dynamic Island presents itself as a square shape to confirm payment has taken place.
  • Maps can use the feature to show directions.
  • Music playback controls, song titles and so on. When you are playing music but quit the music app you’ll see a small cover art image appear on the left side of the Island with a live music waveform on the right.
  • Tracking rides or deliveries.
  • Timers.
  • Sports scores.
  • Battery life, charging status, and privacy indicators.
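In WidgetKit terms, those different presentations map to the compact, minimal, and expanded layouts an app declares for the island. The sketch below is a minimal, hypothetical example for a delivery-tracking app; the `DeliveryAttributes` type, its fields, and the SF Symbols used are all assumptions, not part of any real app.

```swift
import ActivityKit
import WidgetKit
import SwiftUI

// Hypothetical attributes for a delivery-tracking Live Activity.
struct DeliveryAttributes: ActivityAttributes {
    public struct ContentState: Codable, Hashable {
        var etaMinutes: Int   // dynamic data that updates over time
    }
    var orderNumber: String   // static data, fixed when the activity starts
}

struct DeliveryLiveActivity: Widget {
    var body: some WidgetConfiguration {
        ActivityConfiguration(for: DeliveryAttributes.self) { context in
            // Lock Screen / banner presentation.
            Text("Order \(context.attributes.orderNumber): \(context.state.etaMinutes) min")
        } dynamicIsland: { context in
            DynamicIsland {
                // Expanded presentation, shown on tap-and-hold.
                DynamicIslandExpandedRegion(.center) {
                    Text("Arriving in \(context.state.etaMinutes) min")
                }
            } compactLeading: {
                Image(systemName: "shippingbox")
            } compactTrailing: {
                Text("\(context.state.etaMinutes)m")
            } minimal: {
                // The small circle shown when two apps share the island.
                Image(systemName: "shippingbox")
            }
        }
    }
}
```

The `minimal` view is what appears in the detached circle when the island divides to show two apps at once.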

What Apple has also done is create a section of display real estate that promises to deliver the highest casual engagement available anywhere on the device. I think this may be an opportunity for enterprises that can think creatively about how to use this space to create highly useful experiences with their iOS apps.

It's also important to note that the Dynamic Island is shown on a device with an always-on display, which means you can leave your phone on a desk in a meeting and still glance over to keep up with sports scores, for instance.

Live Activities and ActivityKit

Apple’s frameworks to provide these interactions are called Live Activities and ActivityKit, used in conjunction with WidgetKit and SwiftUI. “Live Activities display and update an app’s most current data on the iPhone Lock Screen. This allows people to see live information they care about the most at a glance,” Apple explained.

While not available when iOS 16 ships, they will be introduced in an update later in the year, the company said. Live Activities will exploit WidgetKit and SwiftUI to display information, while ActivityKit will handle the life cycle of that data — a Live Activity can only be active for up to eight hours, but can remain on the Lock Screen for four more hours after that.

Live Activities aren’t apps in themselves: They work in a sandbox and don’t enjoy unique network access, which means the information they present is provided from within the relevant app.
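Concretely, an app starts a Live Activity from its own process and then pushes state updates to it through ActivityKit. The sketch below assumes a hypothetical timer app; the `TimerAttributes` type and its fields are illustrative inventions, and a real app would also handle the case where the user has disabled Live Activities.

```swift
import ActivityKit

// Hypothetical attributes; a real app defines these to match its own data.
struct TimerAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var secondsRemaining: Int
    }
    var timerName: String
}

// Start a Live Activity from inside the app; request(_:) throws
// if Live Activities aren't available or the user has disabled them.
func startTimerActivity() throws -> Activity<TimerAttributes> {
    let attributes = TimerAttributes(timerName: "Pizza")
    let initialState = TimerAttributes.ContentState(secondsRemaining: 900)
    return try Activity.request(
        attributes: attributes,
        contentState: initialState
    )
}

// Push fresh data to the Dynamic Island and Lock Screen as state changes.
func update(_ activity: Activity<TimerAttributes>,
            secondsRemaining: Int) async {
    let newState = TimerAttributes.ContentState(secondsRemaining: secondsRemaining)
    await activity.update(using: newState)
}
```

Because the Live Activity itself has no network access, all updates have to flow through calls like this from the host app (or via remote push notifications the app opts into).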

[Also read: 10 lesser-known iOS 16 features to make work easier]

It is going to be important to ensure the experience offered by any app supporting Dynamic Island is sufficiently meaningful that people remain engaged with the information provided.

I don’t really believe Dynamic Island is meant to be a space for marketing messages, no matter how subtle — it’s an active space for active information that users want and need. Defining where that relationship is in terms of your own company interaction is going to be a task that’s unique to every business.

Developers can find out more about Live Activities in Apple’s developer documentation.

What comes next?

9to5Mac is already wondering how this feature might work on an iPad. It’s also relatively easy to imagine this as the kind of tech that may once have been destined for the Touch Bar on a Mac.

But, for me, Dynamic Island also feels like the kind of rich content, immersive interactive experience space you can imagine being made visible within AR-based user interfaces. That potential may be the biggest reason for enterprise developers to explore Apple’s new Live Activities frameworks as it’s plausible these will form part of how the company will enable app interactivity in AR space, particularly within wearable experiences.

Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.

Copyright © 2022 IDG Communications, Inc.
