Universal Remote Console -
Prototyping for the Alternate Interface Access Standard
Gottfried Zimmermann, Gregg Vanderheiden, Al Gilman
Zimmermann, G., Vanderheiden, G., & Gilman, A. (2003). Universal remote console - prototyping for the alternate interface access standard. Universal Access. Theoretical Perspectives, Practice, and Experience. 7th ERCIM International Workshop on User Interfaces for all. Revised Papers, 24-25 Oct. 2002, 524-31.
Abstract. A Universal Remote Console is a device that can be used to operate any compatible service or device. The Universal Remote Console (URC) renders a user interface for the target service or device in a way that accommodates the user’s needs and preferences. The V2 technical committee of the InterNational Committee for Information Technology Standards (INCITS) is currently developing a standard for “Alternative User Interface Access” that includes URCs. This paper describes preliminary design aspects of the standard under development, in particular an XML-based language that is used to communicate an abstract user interface definition for the target service or device to a URC. Prototypical implementations of URCs developed at the Trace Center serve as the basis for experimental research for the standard under development, and will be demonstrated at the workshop.
1 Introduction
Today the majority of network-based services and electronic devices, such as home security systems, thermostats, copy machines, or public kiosks, are operable only through a single built-in user interface (UI). This may change in the near future, as wireless technologies are already invading our environments at home, at work, in public places, and on the move. By employing network-based technologies, people could, in the future, remotely control connected services and electronic devices from anywhere, using a wide variety of remote console devices, such as cell phones, PDAs, car radios, wrist watches, and wearable and other computers. To do this, however, a means must be found to accommodate the diverse user interface needs of access devices today and tomorrow, without having to implement a separate UI for each of them. For example, a wrist watch cannot accommodate the same graphically rich UI as a desktop computer, and a UI written for a handheld computer will be different from an audio-based UI of a car radio.
In a similar manner, people with different types of disabilities find it difficult or impossible to directly use electronic devices and services, because the device’s/service’s user interface cannot accommodate the special needs of certain user groups (such as users with visual, hearing, or mobility impairments). People with disabilities therefore have to rely on service and device implementations that are specifically designed for them (or implementations whose user interfaces have been re-implemented for special needs). However, relying on special implementations rather than being able to use standard services and devices has several disadvantages. First, special implementations of services and devices may not be available at all, or may not be provided in the same variety as their mainstream counterparts. This means that people with disabilities often do not have the same range of choices, and often must put up with services and devices implementing older technologies. Second, it is expensive for those who have special needs, because special service implementations are only used by a small fraction of the population, and special devices are produced in small numbers. Third, this approach does not work for services and devices used in public places (e.g., ATMs and information kiosks), as it is not reasonable to provide a multitude of alternate implementations, each offering the same service but tailored for a different user group.
1.1 V2 Universal Remote Console Specification
For many services and devices, these problems could be addressed through the development of a standard that allows a separate device to be used as an alternative UI (an alternate “console”). People could then use their personal choice of devices, appropriate to their preferences or abilities, to control the various services and devices in their environments at home, at work, and in the community.
In order to define a standard for flexible and replaceable user interfaces for electronic services and devices, INCITS has established a technical committee called V2 (http://v2.incits.org/). Members of this committee include representatives from user groups, industry, government, and academia. V2 is currently working on a “Universal Remote Console” (URC) specification as part of a to-be-developed “Alternative User Interface Access” standard. The scope of the URC specification covers the full range of requirements and technologies needed to discover services and devices (targets), to provide secure communication, and to allow the user to explore a target’s functions and control it from any “personal” device. This paper focuses on the UI-related issues of the specification.
The URC approach is simple yet powerful: a person carries their own “personal” device, which can act as a remote console to other services and devices (targets). This personal (URC) device is tailored to their specific needs. It may employ graphical user interfaces, voice interfaces, braille-based interfaces, switch-based input methods, etc., or any combination of these. Common examples of the remote console would be a mobile device like a PDA or cell phone, a home or office computer, or ultimately an inconspicuous wearable computer. For people with disabilities, though, it could also be a special adaptive device such as a note-taker (a pioneering form of PDA) or a laptop with a braille display.
This personal remote console is a “universal” remote console because it lets the user control any target that supports the standard, from the thermostat at home to the public kiosk in the town hall. The target transmits an abstract (modality-independent) user interface to the URC, which provides the particular input and output mechanisms appropriate for its user. Target UIs on URC devices may be presented in a variety of modalities, including visual, audio, and tactile, and any combination of these. Input from the user may be accepted via keyboard, mouse, stylus, speech recognition, braille keys, switches, etc.
Under this URC approach, the target manufacturer is not responsible for devising different UIs for many types of access devices and users. Instead, the target manufacturer need only supply the “user interface needs” of their product in the standard form. The user brings their own appropriate UI binding with them.
2 Related Work
The idea of “abstract UI descriptions” goes back to the concept of model-based User Interface Management Systems (UIMS), which emphasize the separation of application logic, dialog control, and presentation components according to the Seeheim model. Myers uses the notion of interactors to provide a high-level interface for user input in a graphical user environment.
Earlier work at the Trace Center in the area of universal remote controllers includes input emulation efforts and the development of the infrared-based Universal Remote Console Communication (URCC) protocol.
The Total Access System uses a personal information appliance (called “accessor”) to provide alternative ways of performing keyboard, mouse, and/or monitor functions on a computer system.
UIML aims to provide a target-platform-independent, XML-based user interface description language that can be automatically rendered on a diversity of user interface platforms. Cross-platform vocabularies for UIML have yet to be defined. However, work in this area seems to be in progress for UIML2.
The Pittsburgh Pebbles PDA Project introduces the “Personal Universal Controller” (PUC), which resembles the “Universal Remote Console” in many aspects, in particular the notion of state variables and commands describing an abstract user interface that can be downloaded from a device or service.
XWeb aims to support an abstract notion of interaction that is independent of a particular set of interactive techniques, by harnessing existing Web technologies.
XForms (http://www.w3.org/MarkUp/Forms/) defines a set of abstract controls for UIs. However, these controls are designed to be embedded in a host container (e.g., an HTML document), which may or may not be accessible. Depending on the technology a particular XForms implementation is built upon, it may or may not support two-way synchronization of state information between server and client, which would be needed for implementing highly interactive UIs on remote consoles.
Sun’s Jini technology transmits UI code from a target device to a remote control. However, this technology requires that the target provide different UI implementations for different classes of remote console devices.
3 URC Design Concepts
A key part of the Universal Remote Console (URC) scenario is the definition of a language to convey a UI description from a service or device (target) to the URC. This language, with the code name “Alternate Abstract Interface Markup Language” (AAIML), must be sufficiently abstract (in terms of modality independence) that a particular URC device can render the provided UI in its own way, taking advantage of the specific interaction techniques the URC device is capable of. The URC specification is still under development, with the current working draft employing basic components such as the ones depicted in fig. 1.
Fig. 1. Basic components of the URC specification under development, shown in a layered presentation. The arrows indicate usage dependencies (arrows originate from the calling component and point to the called component)
3.1 Abstract UI Description Language
By rendering the abstract UI description, the URC translates the elements of an abstract UI into the concrete UI elements available on its specific platform. For example, a PDA could render the UI description using GUI elements (visual) for output, and stylus pointing as well as handwriting recognition for input; a car radio would render the same UI description auditorily, with sound and synthetic speech for output and speech recognition for input; and a braille note-taker would use its braille output and input capabilities to render the very same UI description tactilely. In every case, the URC device allows access to all functions of the target, each in its own way.
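To make the idea of an abstract UI description concrete, a hypothetical AAIML fragment for a simple thermostat target might look like the sketch below. All element and attribute names here are invented for illustration only; the actual AAIML syntax is still being defined by the V2 committee:

```xml
<!-- Hypothetical AAIML sketch; element names are illustrative, not normative -->
<aaiml target="thermostat">
  <group label="Temperature">
    <!-- An abstract numeric-range interactor: slider on a GUI,
         spoken number prompt on a voice URC -->
    <interactor type="numeric-range" id="setpoint"
                label="Target temperature" min="50" max="90" unit="F"/>
    <!-- An abstract string-selection interactor: radio buttons on a GUI,
         voice menu on a car radio -->
    <interactor type="string-selection" id="mode" label="Mode">
      <option>Heat</option>
      <option>Cool</option>
      <option>Off</option>
    </interactor>
  </group>
</aaiml>
```

Note that nothing in the fragment prescribes widgets, layout, or modality; those decisions are left entirely to the rendering URC device.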
3.2 Why the URC Needs an Abstract UI Description Language
One might argue that we could accommodate the same variety of remote UIs without an abstract UI language, solely by providing separate UI implementations for each remote console device, and that this approach would yield better UIs by taking advantage of the implementation techniques specific to each remote console platform.
However, this approach has the drawback of shifting the responsibility for providing remote console-specific UIs to the manufacturer of the target, which puts a large load on the manufacturer of the target service or device. It is likely that the target manufacturer would support only the most common remote console devices. It would be very difficult to accommodate the variety of special devices used by people with disabilities, leaving those users unable to access the target services and devices. The manufacturer may also be unable or unwilling to retroactively support future technologies not yet released at the time of target manufacture.
3.3 Abstract UI Elements
The goal of AAIML is to provide a set of abstract UI elements (called “interactors”), each with a distinct semantic or function, but without restricting the way the URC renders them. XML is a language for building languages, and thus provides an excellent base for prototyping this application.
AAIML defines a variety of interactors for input and output operations. On the URC, an interactor is mapped to a concrete widget (or combination of widgets) available on the URC platform. For example, the interactor “string-selection” may be rendered as an array of radio buttons on a GUI, and as a voice menu on a voice-based UI platform. It is up to the URC device to decide what mapping patterns to use, and the target does not need any knowledge of this mapping.
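On the URC side, this mapping can be sketched as a simple lookup from abstract interactor type to a platform-specific widget. The class, platform, and widget names below are invented for illustration and are not part of the URC specification:

```java
import java.util.Map;

// Sketch of a URC-side interactor-to-widget mapping (illustrative only).
// Widget names are hypothetical; a real URC would instantiate actual controls.
public class InteractorMapper {
    // Mapping table for a GUI-based URC such as a PDA
    private static final Map<String, String> GUI_WIDGETS = Map.of(
        "string-selection", "radio-button-group",
        "numeric-range", "slider",
        "command", "push-button");

    // Mapping table for a voice-based URC such as a car radio
    private static final Map<String, String> VOICE_WIDGETS = Map.of(
        "string-selection", "spoken-menu",
        "numeric-range", "spoken-number-prompt",
        "command", "voice-command");

    /** Returns the concrete widget a platform uses for an abstract interactor. */
    public static String render(String platform, String interactorType) {
        Map<String, String> table =
            platform.equals("voice") ? VOICE_WIDGETS : GUI_WIDGETS;
        return table.getOrDefault(interactorType, "text-field"); // generic fallback
    }
}
```

The same abstract “string-selection” interactor thus becomes a radio-button group on a GUI URC and a spoken menu on a voice URC, without the target knowing about either rendering.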
3.4 Layered Help Texts and Feature Classes
AAIML allows help texts to be attached to the entire UI (describing general concepts related to the target, its controls, and the information it presents), to groups of interactors (explaining concepts specific to the group and/or how to use it), to individual interactors, and to options contained in interactors such as string-selection. In each case the help text may be layered, starting with a short hint and providing more in-depth explanations at the user’s request.
In order to allow for the generation of simplified UIs on a URC device, individual interactors are assigned to different feature classes. The class of “basic” interactors consists of controls that cover the simpler functions of the target; “general” interactors include commonly used UI elements; and “full” refers to the complete UI.
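The effect of feature classes can be sketched as a simple visibility test: a URC configured for a “basic” UI keeps only basic interactors, while a “full” UI keeps everything. The ordering basic &lt; general &lt; full and the class name below are assumptions for illustration:

```java
// Sketch of feature-class filtering on the URC (illustrative only).
public class FeatureClass {
    // Assumed ordering of feature classes, from simplest to complete UI
    private static final java.util.List<String> LEVELS =
        java.util.List.of("basic", "general", "full");

    /** True if an interactor of the given class appears in a UI of the requested level. */
    public static boolean visible(String interactorClass, String requestedLevel) {
        return LEVELS.indexOf(interactorClass) <= LEVELS.indexOf(requestedLevel);
    }
}
```

A novice user could thus request the “basic” UI of a VCR and see only play, stop, and channel controls, while the same target would expose timer programming in its “full” UI.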
4 Prototypical Implementations
The Trace Center is supporting the development of the URC specification by hosting a URC prototyping project. This project aims to pursue experimental research by implementing an array of URC scenarios, varying both the remote console device and the target. Prototypes take advantage of powerful existing technologies, like Bluetooth, 802.11b, Universal Plug and Play (UPnP), and Jini/Java.
Goals of the prototyping project include the support for the V2 committee by:
- Writing code and building physical prototypes necessary to explore the proposed ideas for the standard.
- Identifying potential problems in the emerging standard, and proposing solutions to the V2 technical committee.
- Soliciting early user feedback on real-life implementations.
- Providing proof-of-concept implementations for the standard.
4.1 Prototypes Developed and in Development
As outlined above, the URC specification should not depend on any specific implementation platform, whether for the URC, the networking environment, or the target. Therefore the prototyping project at Trace strives to develop as many implementations as possible, varying all three dimensions of the solution space: URC device type, networking environment technology, and target.
All prototypes are implemented mainly in Java, and take advantage of third-party tools for middleware and wireless communication. As demonstration targets we use a TV simulation and a video player running on a PC, as well as a small fan and a desk lamp. A gateway to a real VCR is currently in development.
At present, five different URC implementations have been developed, and two additional ones are in development:
- A Swing URC running on a Linux-based Compaq iPAQ, showing a graphical rendition of the target’s UI (see fig. 2).
- A Windows CE-based BrailleNote providing braille-based access to a target.
- A “talking” iPAQ implementation with large-scale text output and a speech synthesizer, which can be used by persons with visual impairments.
- An augmentative and alternative communication (AAC) device (Pathfinder from Prentke Romich) for user input, optionally involving head tracking or scanning.
- A Swing-based graphical URC that runs as an applet in any Web browser, connecting to a “configurable target” applet (for demonstration purposes only).
The following prototypical implementations are in development:
- A graphical URC running on a PocketPC-based iPAQ.
- A simple “open text” URC, which will allow a user to type or speak any text in order to control a target. For example, “Tune to the Weather Channel” would switch to the appropriate channel on the remotely controlled TV. Verbal feedback on status changes of the target will be provided to the user.
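The core of such an open-text URC can be sketched as a keyword match from a free-form utterance to a target command. The class name and channel table below are hypothetical, invented purely to illustrate the idea:

```java
import java.util.Map;

// Sketch of open-text command matching for a TV target (illustrative only).
public class OpenTextUrc {
    // Hypothetical channel table; names and numbers are invented
    private static final Map<String, Integer> CHANNELS = Map.of(
        "weather channel", 48,
        "news", 5);

    /** Maps free-form text like "Tune to the Weather Channel" to a channel number,
     *  or -1 if no channel name is recognized. */
    public static int parseChannel(String utterance) {
        String text = utterance.toLowerCase();
        for (Map.Entry<String, Integer> e : CHANNELS.entrySet()) {
            if (text.contains(e.getKey())) {
                return e.getValue();
            }
        }
        return -1; // no match; the URC would ask the user to rephrase
    }
}
```

A real implementation would of course need richer natural-language handling and would derive the command vocabulary from the target’s abstract UI description rather than a hard-coded table.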
Fig. 2. A prototypical URC implementation on a Linux-based iPAQ handheld computer, remotely controlling a TV, a small fan, a desk lamp, and a video player
5 Future Work
The V2 technical committee is currently exploring options to harmonize AIAP-URC with W3C’s XForms technology for next-generation Web forms. The underlying technologies of networked devices and Web services are already merging. In the future, users of V2-compliant URC devices can expect to use the same access mechanisms for electronic devices and Web services.
At present, user interfaces as provided by AAIML rely on text-only elements and their ability to be represented in any modality. However, the ability to provide richer user interfaces by adding presentation-specific elements, such as icons and sounds, is also being explored. The challenge is to prevent use of these features in a way that would render the UI inaccessible to some users.
Another area of exploration deals with modality-specific presentation streams. For some applications, there is a need to provide modality-specific information streams from the target to the URC, and vice versa. For example, a user might want to initiate a link to a performance of a violin concerto and then use the URC to listen to it. V2 will address the need for the delivery of modality-specific information, but at the same time make sure that access to modality-specific content does not compromise accessibility to devices and services in general.
Another area V2 is working on is the specification of discovery mechanisms for target services and devices, including the kind of information from targets that is needed by the URC and the user prior to actually controlling the target.
In order to vet the evolving standard and technologies used to implement the standard, more prototypical systems need to be built. This should ideally involve manufacturers of targets and Assistive Technology devices.
6 Conclusion
The emerging V2 standard addresses the alternate interface connection needs of people with and without disabilities. The URC specification will enable people to use their personal URC device to control electronic devices and Web-based services in their environment. The standards work is continuing.
Acknowledgments
This work was partly funded by the National Institute on Disability and Rehabilitation Research (NIDRR), US Department of Education, under grants H133E980008 and H133E990006, and by the National Science Foundation (NSF) via the Partnership for Advanced Computational Infrastructure (PACI). Opinions expressed are those of the authors and not the funding agencies or V2.
References
- Hayes, P. J., Szekely, P. A., & Lerner, R. A. (1985). Design alternatives for user interface management systems based on experience with COUSIN. Proceedings of the CHI '85 Conference on Human Factors in Computing Systems, April 1985.
- Savidis, A., & Stephanidis, C. (1995). Developing dual user interfaces for integrating blind and sighted users: the HOMER UIMS. Conference Proceedings on Human Factors in Computing Systems, May 1995.
- Pfaff, G. E. (1985). User Interface Management Systems. Berlin: Springer-Verlag.
- Myers, B. A. (1990). A new model for handling input. ACM Transactions on Information Systems (TOIS), 8(3), July 1990.
- Vanderheiden, G. (1981). Practical Applications of Microcomputers to Aid the Handicapped. Computer, IEEE Computer Society, January 1981.
- Vanderheiden, G. C. (1998). Universal remote console communication protocol (URCC). Proceedings of the 1998 TIDE Conference, Helsinki, Finland: Stakes.
- Scott, N. G., & Gingras, I. (2001). The Total Access System. CHI 2001 Extended Abstracts, pp. 13-14.
- Abrams, M., Phanouriou, C., Batongbacal, A. L., Williams, S., & Shuster, J. E. (1999). UIML: An Appliance-Independent XML User Interface Language. WWW8 Conference, May 1999, Toronto, Canada.
- Nichols, J., Myers, B. A., Higgins, M., Hughes, J., Harris, T. K., Rosenfeld, R., & Pignol, M. (2002). Generating Remote Control Interfaces for Complex Appliances. CHI Letters: ACM Symposium on User Interface Software and Technology (UIST '02), 27-30 Oct. 2002, Paris, France. http://www-2.cs.cmu.edu/~pebbles/papers/PebblesPUCuist.pdf
- Olsen, D. R., Jr., et al. (2000). Cross-modal Interaction Using XWeb. Proceedings of UIST '00, San Diego, CA, pp. 191-200. http://icie.cs.byu.edu/ICE/LabPapers/CrossModalXwebInteraction.pdf
- Beard, M., & Korn, P. (2001). What I Need is What I Get: Downloadable User Interfaces via Jini and Java. CHI 2001 Extended Abstracts, pp. 15-16.