How Apple Applied Nielsen’s Heuristics to Improve the Emoji Experience

I recently brushed up on Nielsen’s heuristics with The Design Review’s podcast Nielsen’s Usability Heuristics: Part 3 and felt years of repressed frustration bubble to the surface. Why? Finding the right emoji on my iPhone is frustrating. Very. Frustrating. I know exactly which emoji I want to use, but I struggle to remember which icon it lives under in the emoji keyboard. The Design Review’s podcast reminded me that finding emoji is frustrating because the keyboard design violates Nielsen’s principle of recognition rather than recall.

Luckily, in doing research for this blog post, I learned that Apple has already released improvements to alleviate this frustration. I hadn’t reaped the benefits because the predictive emoji feature doesn’t work with SwiftKey, my primary keyboard. Also, engaging the emoji replacement feature requires a series of distinct taps that I, and likely many others, never knew about.

This post gives an overview of recognition rather than recall, then applies the principle to the three stages in the evolution of the emoji experience in iOS.

Nielsen’s Heuristic Principle: Recognition Rather Than Recall

Back in the early 1990s, Jakob Nielsen shared his ten heuristic principles in the book Usability Engineering. The ten principles are widely upheld as doctrine in UX. A Google search for “Nielsen recognition rather than recall” returns about 372,000 results, and a Google Scholar search returns about 46,400. Nielsen defines the principle of recognition rather than recall as follows:

Minimize the user’s memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.  – From nngroup.com

In other words, visual cues should make the workflow easy and obvious. If the interface itself does not provide obvious visual cues, then instructions should be easily accessible. If the user has to remember something in order to successfully perform the task, and the instructions aren’t easy to access, the experience is frustrating because it does not embrace the principle. Below I describe how the three stages of emoji evolution in iOS move away from recall and toward recognition to make for a better experience.

Stage 1: Which icon is the emoji under?

This stage is the frustrating purgatory that I have been unknowingly stuck in because iOS emoji replacement isn’t obvious and predictive emoji only works with the iOS keyboard. In this exasperating realm, I know exactly what emoji I want. In order to find the emoji, I either have to remember which icon the emoji lives under, or page through the growing number of sections to find it. Some of the placements make sense, while others do not.

For example, the smiley face icon lets the user recognize that it is the place to find other smiley faces. However, other sections of the menu rely on recall. For instance, I struggle to understand the logic behind which emojis live under the lightbulb icon and therefore struggle to recall what lives in that section. What common thread runs through a gift box emoji and a hole in the ground emoji? And why is a diamond under the lightbulb icon, but a diamond ring is under the smiley face icon?

Each time I struggle to locate the perfect emoji that conveys more meaning than mere text, I start out shocked, become saddened, and then transition to frustration and anger. It just so happens that the first page of the smiley face menu captures my experiential journey down a single column.

[Image: emoji_recall_1]

Stage 2: Emoji replacement

In a movement away from recall and toward recognition, iOS 10 includes emoji replacement. Each time the feature is activated, the keyboard identifies words that can be replaced by emojis. Once a replaceable word is tapped, the keyboard provides emoji options. This feature makes it easy for the user to recognize which emoji to select. They no longer need to recall under which icon the desired emoji lives in the emoji keyboard.
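
To make the mechanics concrete, here is a minimal sketch in Swift of the general idea: a lookup table of candidate emoji and a function that flags replaceable words in a message. The table, the names, and the matching logic are my own illustrative assumptions, not Apple’s implementation.

// Illustrative sketch only: a small lookup table mapping words to candidate emoji.
// (Hypothetical data; the real mapping behind emoji replacement is not public.)
let emojiCandidates: [String: [String]] = [
    "pizza": ["🍕"],
    "love": ["❤️", "😍"],
    "dog": ["🐶"]
]

// Find every word in a message that could be replaced, along with its emoji options.
func replaceableWords(in message: String) -> [(word: String, options: [String])] {
    message
        .lowercased()
        .split(whereSeparator: { !$0.isLetter })
        .compactMap { word -> (word: String, options: [String])? in
            guard let options = emojiCandidates[String(word)] else { return nil }
            return (word: String(word), options: options)
        }
}

// Tapping "love" or "pizza" in "I love pizza" would then offer the matching emoji.
print(replaceableWords(in: "I love pizza"))

The point of the interaction is that the user only has to recognize the right option from a short list presented in place, rather than recall where that emoji lives in the emoji keyboard.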

Although emoji replacement utilizes recognition, the user needs to be aware of, and recall, the 5-tap process needed to activate it. The iOS and emoji keyboards themselves do not offer clues about the existence of emoji replacement. Therefore, the user must learn about the feature outside of the interface. Once the user is aware of emoji replacement, they can find Apple’s clean and simple instructions. However, the instructions are not accessible within the iOS keyboards. This goes against the portion of the principle that states “instructions for use of the system should be visible or easily retrievable whenever appropriate.” A search for iOS emoji replacement reveals many how-to guides on the topic, suggesting it’s not easy to figure out or recall the steps. Simple mentions of emoji replacement, such as this one in Wired, do not offer enough information for the user to learn how to operate the feature.

[Image: emoji_recall_2]

Stage 3: Predictive emoji

The newest iOS 10 emoji feature, predictive emoji, exemplifies recognition rather than recall and gives clear visual cues to users without any additional input. Predictive emoji uses the iOS keyboard to recommend emojis based on the word that has been typed. Recognition is evident in this feature because the emoji appears between the keyboard and the message log every time a replaceable word is typed. Under this arrangement, options are even available to users who don’t know what emojis are, if such an iPhone user exists. The user doesn’t need to recall how to activate the feature, what emojis exist, or which icon they live under in the emoji keyboard. A user would only need to resort to recall if they want something other than the suggested emoji.
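
Sketched the same way, and again as an assumption about the general idea rather than Apple’s actual code, a predictive suggestion only needs to look at the last word typed, so the option becomes visible without any activation step.

// Illustrative sketch only: suggest emoji for the most recent word the user has typed.
// (Hypothetical lookup table; not Apple's implementation.)
let suggestionTable: [String: [String]] = [
    "birthday": ["🎂", "🎉"],
    "coffee": ["☕️"],
    "rain": ["🌧️", "☔️"]
]

func predictedEmoji(for typedText: String) -> [String] {
    // Take the last word typed and look it up; otherwise suggest nothing.
    guard let lastWord = typedText
        .lowercased()
        .split(whereSeparator: { !$0.isLetter })
        .last
    else { return [] }
    return suggestionTable[String(lastWord)] ?? []
}

// Typing "happy birthday" would surface 🎂 and 🎉 above the keyboard as you type.
print(predictedEmoji(for: "happy birthday"))

Because the suggestion is recomputed on every keystroke and shown in the predictive bar, the user never has to remember that the feature exists, which is exactly what recognition rather than recall asks for.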

[Image: emoji_recall_3]

Over time, iOS updates have taken the process of selecting the perfect emoji from frustrating to seamless. Apple incrementally applied Nielsen’s heuristic of recognition rather than recall to the user experience, and now the iOS keyboard recommends emojis as you type. Users no longer need to remember which icon their desired emoji lives under, or how to activate a feature that recommends emojis. The next great step iOS could take to improve the emoji experience is to let users add their own emojis, like in Slack or HipChat.
