Blind Touch-Screen Launcher

This is an attempt to sketch out my concept for a blind/speech-recognition UI for mobile phones.

Prior Art

Blind Navigation

Current user interfaces for hands-free or accessible operation rely heavily on reading out the button labels and links available to the user.  This is highly inefficient, as it requires serializing all of the commands on the screen into a linear list.

Slide Rule, a research prototype from the University of Washington, reduced the time required by grouping options alphabetically.  Users could then scroll down the list to the correct letter, greatly reducing the number of list items traversed before reaching the target application or function.

Filtered/Incremental Search (think iTunes)

The above approaches are necessary for exploring unknown functionality.  However, neither lends itself well to habituation, and neither scales to a large application-name space.  Older phones, with their limited screen real estate and navigation options, used filtered or incremental search to quickly find a contact.  iTunes does the same, letting the user quickly narrow the list of songs in a playlist.
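The filtering described above can be sketched in a few lines.  This is a minimal illustration, not code from the prototype; the function name and the case-insensitive prefix rule are my assumptions.

```python
# Minimal sketch of incremental (filtered) search, as in iTunes or an
# old phone's contact list: each typed character narrows the list.
# filter_commands() and its case-insensitive prefix rule are assumed
# here for illustration.

def filter_commands(commands, typed):
    """Return the commands whose names start with the typed prefix."""
    prefix = typed.lower()
    return [c for c in commands if c.lower().startswith(prefix)]

contacts = ["Alice", "Allen", "Bob"]
print(filter_commands(contacts, "al"))   # narrows to Alice and Allen
print(filter_commands(contacts, "ali"))  # narrows to Alice alone
```

Each additional character only ever shrinks the candidate list, which is what makes the technique fast once the user knows roughly what they are typing toward.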

Unistroke

[Illustration of unistroke alphabet]

Older resistive touchscreens required the use of a stylus to input print-like characters.  Palm's original Graffiti writing system used a unistroke alphabet to simplify the recognition task.  This also streamlined input by reducing each alphanumeric character to a single stroke.  While a good adaptation at the time, throughput was slow compared to T9 or multi-touch keyboards, and the stylus was cumbersome.
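To make the single-stroke idea concrete, here is one simple way such a recognizer can work: quantize the stroke into compass directions and look the collapsed sequence up in a template table.  This is not Palm's actual Graffiti algorithm (which was proprietary); the direction quantization and the template entries are my assumptions for illustration.

```python
# Illustrative unistroke recognizer: turn a stroke's points into a
# sequence of compass moves, collapse consecutive repeats, and match
# the pattern against a small template table.  Templates here are
# hypothetical, not Graffiti's real alphabet.

def directions(points):
    """Quantize a stroke into a string of N/S/E/W moves."""
    dirs = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) > abs(dy):
            d = "E" if dx > 0 else "W"
        else:
            d = "S" if dy > 0 else "N"  # screen y grows downward
        if not dirs or dirs[-1] != d:
            dirs.append(d)  # collapse consecutive repeats
    return "".join(dirs)

# Hypothetical templates: e.g. down-then-right reads as "L".
TEMPLATES = {"SE": "L"}

def recognize(points):
    return TEMPLATES.get(directions(points))

# A stroke straight down, then right, recognizes as "L".
stroke = [(0, 0), (0, 2), (0, 4), (2, 4), (4, 4)]
print(recognize(stroke))  # L
```

A single-stroke alphabet keeps this lookup trivial, which is exactly the simplification Graffiti was after on the hardware of its day.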


I believe that a combination of these older technologies would create a launcher system that comes close to the speed of visual interfaces, especially after habituation.  Furthermore, such a system adapts well to voice-controlled, hands-free car navigation.

I am proposing a launcher using filtered command lists.  The functionality would be something of a restricted subset of launchers such as Spotlight, Quicksilver, Gnome Do, Enso, and Ubiquity.  Upon tapping the “home” hardware key, users would enter a command launch mode whereby they could use a Slide Rule or VoiceOver system to explore the list of available commands/apps.  After the user becomes accustomed to the commands available, they can then use the unistroke system to quickly filter the list to a single command/app.


To illustrate the concept and demonstrate the gain in efficiency, I will walk through the process of launching the music application.

An alpha-level prototype of the unistroke interface is available online: one version with instructions for use in a desktop browser, and another full-screen version that runs on the Firefox OS emulator.  At the moment, both require a data connection and neither implements the full alphabet.

  • a-z alphanumeric characters entered via unistrokes/braille = bold character = A, B, C,… , Z
  • “Home” hardware button = H
  • spoken word = “quoted text”
  • gestures = italics
  • State/application/namespace = [namespace name]
  • List of available items or commands in current [namespace]
    • Alice
    • Allen
    • Bob
  • Available applications or commands after typing  A, L
    • Alice
    • Allen
  • Available applications or commands after typing  A, L, I
    • Alice

Music Example

H -> [home namespace]
  1. calculator
  2. calendar
  3. contacts
  4. email
  5. facebook
  6. maps
  7. money scanner
  8. music
  9. notes
  10. podcasts
  11. text message
  12. weather


H -> [home], M “hm”,
  1. maps
  2. money scanner
  3. music
  4. text message


H -> [home], M “hm”, U “music” *

  1. music


H -> [home], M “hm”, U “music” * , open gesture -> [music app namespace]

  1. album
  2. artist
  3. genius
  4. genre
  5. radio
  6. shuffle
  7. song name

*Note that there is no need to speak the final letter's name, since the only remaining item is “music”

Summary of steps required to open music application:

iOS/Android Visual: 2
H -> [home], tap music icon -> [music app]

Filtered Commands: 4
H -> [home], M “hm”, U “music”, open gesture -> [music app namespace]

Slide Rule/VoiceOver: 10
Home {hardware button} -> {scroll} C, E, F, G, M -> Maps, Money Scanner, Music, open gesture -> [music app namespace]


I have demonstrated how mixing input modalities can create a compromise interface supporting single-handed, blind, and voice-driven use of a smart phone.  The filtered-command approach reduces the number of discrete steps required to open an application from 10 to 4, close to the speed of navigating the “native” visual interface.
