Usability Assignment: Barnes & Noble Nook Simple Touch – Reading Interface (Good Design)

The Barnes & Noble Nook Simple Touch (e-reader) is both my new favorite electronic device and an exemplar of good design. Although it was my first e-reader, it was designed in such an understandable way that I was able to figure out how to use it immediately with only an occasional reference to the instruction manual.

The Barnes & Noble Nook Simple Touch’s ebook interface. Image via barnesandnoble.com 

The Nook combines the conceptual models of print books and touch-screen electronics, using common and expected design techniques to help the user operate the e-reader. To wake the Nook, the user presses a button located on the bottom center of the device, matching the home-button placement of other handheld electronic devices. The user expects the home button to be there, and it is raised like a button, an example of affordances. In addition to its central, lower placement, the button is shaped like the letter “n” in the Nook logo, which semantically designates it as the home key. The device also matches the conceptual model of a print book: it is designed in portrait orientation and its pages turn right to left.

Using the touch screen to read has a small gulf of execution. The steps needed to read an ebook are simple and clearly mapped, with easy-to-remember gestures for operating the touch interface. Single touch actions turn pages and call up menus, and a single action is all that is needed to reverse a mistake. To read an ebook, the user can use either the touch interface or the buttons on the side of the device. To turn a page using the touch interface, the user just lightly taps the edge of the screen: the right side advances the page, the left side goes back. This utilizes natural mapping to help the user operate the device: the action to turn the page is located where the edge of the page would be in a physical book. Buttons and touch gestures each correspond to only one specific action. For example, only tapping the left side of the screen turns a page backward, which is both a physical and a semantic constraint, since pages are turned from right to left.
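The tap-zone mapping described above can be sketched roughly as follows. This is a hypothetical illustration of the idea, not actual Nook software; the screen width and the size of the "edge" zones are made-up values.

```python
# Hypothetical sketch of the Nook's tap-zone mapping (illustration only,
# not Nook firmware). Tapping the right edge advances the page and the
# left edge goes back, mirroring where a physical page edge would be.

SCREEN_WIDTH = 600  # assumed screen width in pixels


def page_action_for_tap(x):
    """Map a tap's horizontal position to an interface action."""
    edge = SCREEN_WIDTH * 0.2  # assume the outer 20% counts as the "edge"
    if x <= edge:
        return "previous_page"   # left edge: turn back
    if x >= SCREEN_WIDTH - edge:
        return "next_page"       # right edge: advance
    return "open_menu"           # center taps call up menus instead
```

Because each zone maps to exactly one action, a mistaken tap is reversed by simply tapping the opposite edge.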

The hardware complements the digital interface. Two buttons on either side of the device turn pages forward and back for situations when using the touch screen is difficult or impossible (e.g. when wearing gloves); the physical buttons compensate when the touch interface fails. On each side, the top button advances a page and the bottom button goes back. Their position on the edge of the device makes this an easy, natural fix rather than a tricky work-around. The buttons afford pressing: they have a raised edge to distinguish them from the non-button parts of the device, and the user needs to press down firmly, which prevents accidental page turns. There is a small learning curve with the page-turning buttons. Initially, I thought both buttons on the right side went forward and both on the left went back, and a few times I accidentally went backward when I meant to go forward. However, it was easy to tell when I made an error (besides recognizing the previous page, the page number is listed at the bottom of the screen, so I could tell whether I had moved forward or back) and just as easy to correct (push the other button). Now I appreciate this feature because if I have to hold the device in my left hand I can still advance the pages. This is also a good feature for left-handed users.

To access the table of contents, notes, or display settings, the user just taps the center of the screen. Constraining this control to the center of the screen makes it less likely to be activated accidentally by a user trying to turn pages. Pressing down on a word brings up an additional menu for annotations, such as highlighting, adding notes, sharing, and looking up the word in the dictionary. This follows the model of other touch interfaces, like iOS 7, where holding down a word brings up options to edit and share the text. By using similar gestures (a sort of common language of touch interfaces), the user does not need to learn a whole new set of interactions and can quickly navigate the new device. Mistakes are easy to correct: if the user mistakenly calls up the table of contents by touching the middle of the screen, tapping anywhere outside the toolbar returns to the ebook.
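The gesture logic in this paragraph can be summarized in a small dispatch sketch. Again, this is a hypothetical illustration under my own assumptions about gesture names and states, not the Nook's actual implementation.

```python
# Hypothetical sketch of the Nook's gesture dispatch (illustration only).
# A tap in the center opens the reading toolbar; pressing and holding on
# a word opens the annotation menu; while a toolbar is open, tapping
# outside it dismisses it and returns to the ebook.


def handle_gesture(kind, target, toolbar_open=False):
    """Return the interface action for a touch gesture.

    kind: "tap" or "hold"; target: "center", "edge", or "word".
    """
    if toolbar_open:
        return "close_toolbar"      # tapping outside the toolbar dismisses it
    if kind == "hold" and target == "word":
        return "annotation_menu"    # highlight, note, share, dictionary
    if kind == "tap" and target == "center":
        return "open_toolbar"       # table of contents, notes, display
    return "turn_page"              # edge taps fall through to page turns
```

Keeping every recovery path to a single gesture is what makes mistakes on the Nook so cheap to undo.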

The Nook has excellent visibility. Menus are easy to understand because they use icons to make the options clear, e.g. an icon of a magnifying glass for the Find tool, or pictures showing the different options for line spacing and page margins. The unlock screen includes a text explanation as well as a lock icon and an animated arrow demonstrating the slide-to-unlock action. By using both icons and text in its menus, the interface directs the user to the correct choices.

Options for changing the text display on the Nook. The information is well-mapped by using icons of the potential changes to the display.

The interface also provides good mapping and feedback. When the user changes the text display, the screen shows the changes live. Changing the font size from “small” to “medium” shows the result immediately, an example of clear mapping and immediate feedback.

The Nook exemplifies understandability and has clearly been designed with the user in mind. Even for someone who has never used an e-reader, the design of the touch interface and menus makes it easy for the user to figure out what she needs to do to read, take notes, or navigate throughout an ebook.

NB: This was originally posted to Norman Doors, a blog for my usability class. The post can be found here: http://normandoors.wordpress.com/2014/02/06/barnes-noble-nook-simple-touch-reading-interface-good-design/