Voice First

Build a voice-first UX to record body temperature with Apple Health

By Nicholas Gracilla  ·   March 9, 2021  ·  7 minute read

When designing a voice-first experience, developers must create guard rails around the many unexpected edge cases.

(Not interested in voice-first software development? Skip the development story and download the Temperature Shortcut app right away.)

It’s winter 2021 — the depths of flu season and the beginnings of the global pandemic recovery. Battling a cold, I figured I should record my temperature somewhere, so I’d have my temperature history should I need to talk to a doctor. Apple Health comes to mind: a secure, private, centralized data store for health information from the vast app ecosystem of fitness, wellness, and sleep-focused apps. Apple Health includes everything from mindful meditation minutes to workouts, estimated VO2 max, and heart rate recovery to menstrual cycle tracking. Apple’s fanatical attention to security and privacy puts users in control — users grant apps fine-grained access on a metric-by-metric basis. Moreover, the Health app’s reporting and graphing are top-notch: it’s easy to see data trends over time.

Add temperature data to Apple Health by hand

The Health app is not merely a data storage and reporting system. Although designed to receive data from other apps, users can also add data to it directly. Browse > Body Measurements > Body Temperature provides historical graphs, helpful information from the Mayo Clinic, app access, and a link at the top, Add Data.

That’s a significant improvement over yet another sticky note, soon to be lost or discarded. While the Health app’s privacy and security architecture is enormously valuable, something about this screen is uninspiring. After taking my temperature, I don’t want to stop what I’m doing, tap through multiple menus, and manually type into a form; that’s for sure. It should be as easy as talking with Siri on the phone or Apple Watch. The entire user experience should be voice-driven. It’s 2021, after all! Weren’t we promised jet packs, flying cars, and shrewd AIs?


Shortcuts: APIs for the masses

I suspect Shortcuts are among the most powerful yet most ignored features of iOS. Shortcuts allow regular users to write simple, visual scripts that connect apps and system services to get things done. They can be triggered by an event or condition, run by the user directly, or invoked by the user through Siri.

Behind the scenes, Shortcuts are no less than an Application Programming Interface (API) into the feature sets of native apps and the iOS service architecture. Developers can expose their apps’ best features and enable their customers to use them in innovative ways.

For example, a user can run a Shortcut that checks the current location (geolocation services); when at work, determines the expected drive time home (Maps API); alerts the user (notification services); and sends a text message to a spouse (Messages services), using a randomly customized set of message templates (Notes API).
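As a rough illustration, here is that workflow sketched in Python. Every function name here is a hypothetical stand-in for a Shortcuts card, not a real iOS API:

```python
import random

# Each function below is a hypothetical stand-in for a Shortcuts card.

def current_location() -> str:
    """Stand-in for the Get Current Location card."""
    return "work"

def drive_time_home() -> int:
    """Stand-in for the Get Travel Time card; returns minutes."""
    return 35

def pick_template() -> str:
    """Stand-in for pulling a random message template from Notes."""
    return random.choice([
        "Leaving work now. Home in about {mins} minutes.",
        "Heading out! See you in roughly {mins} minutes.",
    ])

def run():
    """Wire the cards together, as the Shortcut would."""
    if current_location() == "work":
        mins = drive_time_home()
        # The Send Message card would deliver this; here we just return it.
        return pick_template().format(mins=mins)
    return None
```

Swapping driving for transit or walking would simply mean configuring a different card in place of the travel-time stand-in.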

Users create Shortcuts from simple, visual cards and configure them to suit their needs: instead of driving, perhaps I use public transportation, or I bike or walk. As long as the feature exists within iOS services or an app, and as long as it has an API for use with Shortcuts, users can make flexible workflows that suit their needs.

Shortcuts and Apple Health

Apple has built the Health app with a rich API available to Shortcuts: one can log workouts and sleep schedules, find recorded health data like weight, heart rate, or headphone audio levels (among dozens of options), and log new health data. It’s this last feature that’s the foundation of this Temperature data project. I realized I could use Shortcuts to record the data instead of menuing through the Health app and typing into a form!

Iteration 1: add data to Health from Shortcuts

To start, we’ll use Health’s Log Health Sample action. We’ll configure it to record Body Temperature based on what the user provides, and to record the date and time, too:
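In outline (card names and layout approximated from the Shortcuts editor), the single-card workflow looks like this:

```
Log Health Sample
    Type:  Body Temperature
    Value: Ask Each Time
    Date:  Current Date
```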

This workflow is a start. We can save the Shortcut to the home screen as an app icon. But when we tap it, we still get a form — this time from Shortcuts:

Although the input value says “Ask Each Time,” in Shortcuts this means typing text. And while it works — temperature data is saved to Health with a date and time stamp — it’s not what we wanted. Let’s look more closely at data entry in Shortcuts and Siri.

Iteration 2: add data to Health with Siri

There are two methods of collecting voice input with Shortcuts: Dictate Text and Ask For Input.

Dictate Text (from Documents services) transcribes what the user says into text and passes it along to Shortcuts’ next card. It is explicitly voice-focused.

Ask for Input (from Scripting services) is more sophisticated: it can accept inputs and handle them as different types — from text to dates, times, URLs, or numbers. It can also parse input in various ways. When launched from the home screen or Shortcuts, it collects input via a form. When launched from Siri via voice, it will transcribe what the user says.
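Side by side, and with a prompt wording of my own invention, the two actions look roughly like this:

```
Dictate Text
    → transcribed text, passed to the next card

Ask for Input
    Prompt: "What's your temperature?"
    Input Type: Number
    → typed value (Home Screen / Shortcuts) or transcribed value (Siri)
```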

There are always many approaches to solving a problem. I’ve noticed that Dictate Text over Siri often prompts, unasked, “What text?” which interrupts the voice-first experience. Let’s work with Ask for Input instead.

Iteration 3: creating guard rails for voice-driven experiences

This is starting to work. But when developing a voice-driven experience, we need to plan for the unexpected. What if Siri hears a child in the room who shouts “Unicorn!” right at the temperature prompt?

Well, we can set the input type to Number in the scripting card, which causes an error if the input isn’t a number. We can handle that gracefully by having the Shortcut fail with a reasonable spoken message, like “Sorry, I didn’t understand that.”

What if Siri misunderstands a mumbled number as 50 or 200? Well, we can set a conditional range around the voice input for expected values — say, 95 to 105.

And, as voice-first developers, should we be so confident in the voice-to-text translation? A lot can go wrong, from a noisy room to a heavy accent or a momentary WiFi outage. Since Shortcuts records this data in Health, we should probably tell the user what we think we heard and allow her to confirm before saving.
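Pulling the three guard rails together, here is the logic as a minimal Python sketch. The function names and messages are mine; in the Shortcut itself, these are an Ask for Input card with type Number, an If card for the range check, and a confirmation prompt before Log Health Sample:

```python
# A minimal sketch of the guard rails described above. The function names
# and messages are mine, not Shortcuts actions.

LOW, HIGH = 95.0, 105.0  # expected range for a human body temperature (F)

def parse_reading(spoken: str):
    """Mimic Ask for Input with type Number: reject non-numeric input."""
    try:
        return float(spoken)
    except ValueError:
        return None

def handle(spoken: str) -> str:
    """Run the three guard rails: type check, range check, confirmation."""
    value = parse_reading(spoken)
    if value is None:  # "Unicorn!" lands here
        return "Sorry, I didn't understand that."
    if not LOW <= value <= HIGH:  # a mumbled 50 or 200 lands here
        return f"{value} doesn't sound like a body temperature. Please try again."
    # Confirm before the irreversible step: writing to Health.
    return f"I heard {value} degrees. Should I save that to Health?"
```

Only the last branch goes on to save data; everything else fails softly, with a spoken explanation.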

When developing voice-first experiences, plan for the unexpected. Set guardrails around what reasonable inputs should be, and consider confirming them with the user, too.

It’s here, in the guard rails, where the majority of complexity — and value — occurs. We might, for example, prompt the user to consult a doctor if a temperature is dangerously high or low.

Caveats

Shortcuts run in many environments

Running Siri Shortcuts requires iOS 12 or later on iPhone, iPod touch, HomePod, or Apple Watch Series 3 or later, and iOS 12 or iPadOS or later on iPad. Shortcuts that require an app to open might not work on HomePod and Apple Watch. (https://support.apple.com/en-us/HT209055)

I don’t have an iPod touch or HomePod, but I quickly discovered that my Shortcut syncs to my iPad, which doesn’t have access to HealthKit data, and so it fails in that environment. Consider the broad range of environments in which a Shortcut might run:

  1. In the Shortcuts app, directly, by touching its icon.
  2. From the iOS Home Screen, by saving the Shortcut to the home screen.
  3. From Siri on an unlocked iPhone, by asking it to run the shortcut by name.
  4. From Siri on a locked iPhone, by asking it to run the shortcut by name.
  5. From Siri on Apple Watch, by asking it to run the shortcut by name.

For a production-quality application, we need to test each environment. Ideally, in the iPad case, the Shortcut would recognize its requirements and tell the user why it can’t run.
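One way to do that, sketched loosely around Shortcuts’ Get Device Details action (the alert wording is mine):

```
Get Device Details (Device Model)
If Device Model is iPad
    Show Alert: "This Shortcut logs to Health, which isn't available on iPad."
    Stop the shortcut
Otherwise
    … record the temperature as usual …
End If
```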

Bugs and operating systems

In iOS 14.3, there was a bug when running this Shortcut on the phone via Siri: Dictate Text, instead of using the prompt, would say, “What text?” This didn’t happen on Apple Watch, however. Apple fixed the “What text?” bug in iOS 14.4, but their famously opaque update logs don’t mention it explicitly, and the fix seemed to introduce the same problem on Apple Watch. This is a good news / bad news caveat: the platform is evolving rapidly, but there’s no reliable developer documentation around it.

Get the Shortcut

Download the Apple Shortcut from iCloud. When you install and run it, you’ll need to grant it access to your Health data so it can record your temperature.



Comments? Questions?

Feel free to send us a note; we'll get right back to you.