The Universal Control Hub: An Open Platform for Remote User Interfaces in the Digital Home

Gottfried Zimmermann
Access Technologies Group
Wilhelm-Blos-Str. 8, 72793 Pfullingen, Germany
gzimmermann@acm.org
http://www.accesstechnologiesgroup.com

Gregg Vanderheiden
Trace R&D Center, University of Wisconsin-Madison,
1550 Engineering Dr., 2107 Engineering Centers Bldg,
Madison, WI 53706-1609, USA

This article has been published as a book chapter in: Springer LNCS, Volume 4551/2007. Human-Computer Interaction. Interaction Platforms and Techniques. Pages: 1040-1049. ISSN: 0302-9743 (Print) 1611-3349 (Online). DOI: 10.1007/978-3-540-73107-8. ISBN: 978-3-540-73106-1. Springer Berlin / Heidelberg, 2007.


Abstract

This paper describes the application of an international user interface standard to the digital home through a gateway-based approach: the Universal Control Hub. ISO/IEC FDIS 24752 specifies the Universal Remote Console framework, promoting a "user interface socket" and pluggable user interfaces. The Universal Control Hub implements this standard in a way that accommodates existing controlled devices and controller devices. It retrieves user interfaces for the discovered target devices from resource servers, which hold resources registered by multiple parties for any target device. This creates an open platform for user interfaces in the digital home that decouples the user interface from the device, an approach that is expected to lead to more usable and more accessible user interfaces.

Keywords

Remote user interfaces, task-based user interfaces, digital home, usability, accessibility, Universal Control Hub, Universal Remote Console.

1 Introduction

It is widely acknowledged that complex user interfaces are an impediment to the proliferation of the digital home. A recent CEA study [1] found that "ease of use" is the third most important aspect for home theater owners, only narrowly topped by "video quality" and "sound quality". Also, "ease of using remotes" is identified as the second most important aspect with which these owners are not satisfied, as indicated by a low ratio of the percentage of satisfied consumers to the percentage of those rating the attribute as important.

The consumer electronics industry and manufacturers of home appliances need to recognize that solving the usability problem is in their own interest. Philips Electronics North America's CEO Zeven reports that only 13 percent of Americans believe technology products in general are easy to use [2]. That is why experts have suggested having new products tested by older persons as a litmus test for ease of use [3]. The answer to the question "Can grandma figure out how to use it?" is a good indicator of a product's usability in general.

The usability problem has a significant impact on sales numbers. Two out of three Americans have lost interest in a technology product because it seemed too complex to set up or operate [2]. In addition, the manufacturer's profit on actual sales is reduced by a large number of returned products. Den Ouden [4] found that half of all 'malfunctioning products' returned to stores by consumers are in full working order; the customers simply could not figure out how to operate them.

What is a huge problem if unsolved can be a tremendous opportunity if adequate solutions are found. Bias and Mayhew [5] estimate that companies that excel in usability can improve their return on such investment more than 10,000 times. This number should be reason enough for product manufacturers to increase their efforts in solving the "complexity crisis".

What constitutes an "impediment" for some consumers can be an insurmountable barrier for others. Older people and people with disabilities are often cut off from technological advancements in the home because they are not able to operate modern products. In fact, these user groups would probably benefit the most from recent digital home products if the existing usability and accessibility problems were overcome.

This paper introduces the "Universal Control Hub" architecture as an open user interface platform for the digital home that can facilitate solutions that address current usability and accessibility problems.

Note: In this paper, the term "target" is used for devices and services that are remotely controlled, and the term "controller" for devices used to control targets. A controller runs a "remote user interface", exposed by the target, for the purpose of controlling it remotely.

2 State of the Art

For currently available remote user interfaces for the digital home, we identify the following problems:

Problem 1: Remote user interfaces are tied to specific target devices

Traditionally, remote user interfaces, running on a controller, are dedicated to one particular target.

Infrared-based remote controls shipped with and bound to specific target devices are prevalent in today's homes. This results in a large number of remote controls in the digital home, which becomes a real usability problem as the number of target devices in the average home grows.

Devices such as network routers and digital media players can be controlled through software dedicated to that target device. The control software may run inside a Web server on the target device (allowing control from any Web browser), or it may have to be installed on a platform-specific controller such as a PC. In either case, each product again has its own user interface, except that this time it is software-based rather than hardware-based.

UPnP Remote User Interface [6] builds upon the Web server model for remote control, and specifies a protocol for controllers (called "remote user interface clients") to discover a set of user interfaces exposed by a target ("remote user interface server").

Similarly, CEA-2027-B [7] lets a target device (called "logical unit") convey a remote user interface in HTML (called "control frame") to a TV acting as controller (including its remote control).

Because of this 1-to-1 relationship between remote user interfaces and targets, users are typically stuck with the user interface provided by the device manufacturer. Often, this does not meet the needs and preferences of a particular user.

Problem 2: Remote user interfaces are controller device dependent

"Universal remote controls" have some success, but are still bound to a particular controller device. Nevertheless, the ability to be programmed or to download infrared codes, based on a user's configuration, eliminates their dependency on specific targets. Some newer models connect to a "master controller" via wireless radio frequency technology, allowing users to control targets even when these are located in a different room. Still, users have to use the user interface that comes with the controller which has been designed by its manufacturer. These remotes are careful to always expose the full functionality of the devices resulting in interfaces that are often even more complex than the individual remotes provided with the products.

Some very high-end universal remote controls are made to be programmed by professional user interface designers, the custom-home installers, using proprietary tools and protocols. Designing a user interface for one of these controllers is a cumbersome process requiring skills specific to the platform employed. This is a niche market, offering products that are out of reach for most people because of the expense of such custom-installation homes. Still, it shows that there is a market for custom-made user interfaces, albeit one hampered by cumbersome programming and installation, and by high costs.

Outside of custom-installation homes, users are left with a very limited choice of controller devices and user interfaces that they can employ to control a particular target device. Oftentimes, users will find the "default" controller unintuitive to use. For some users who have special needs, it may not even be accessible. Speech-based or tactile user interfaces are typically not provided by manufacturers at all.

Abstract user interface specifications have been discussed as a potential remedy for this problem. The idea is that an abstract user interface can be rendered on any device, using whatever input and output modalities are present.

The User Interface Markup Language (UIML) standard [8], developed by OASIS, claims to be such an abstract language, but it does not live up to this promise: it does not include a generic vocabulary of abstract widgets, although this has been attempted outside the standardization effort for an older version of UIML. Moreover, UIML does not provide a clean separation between content and presentation (see problem 5 below).

XForms [9], a technology for next-generation Web forms, defines a useful set of abstract user interface components (called "XForms form controls"), but its model is based on the typical request-response pattern of Web forms. It cannot be easily applied to digital home control.

Other work on abstract user interfaces includes the Pebbles project [10] at CMU and the SUPPLE project [11] at the University of Washington. Although Pebbles' "Personal Universal Controller" language has been demonstrated to generate visual and auditory user interfaces, it lacks the flexibility of external resources and third-party user interface components. SUPPLE has developed a technology for automatic generation of user interfaces, but is constrained to visual user interfaces.

CEA-2014 [12], which builds upon UPnP Remote User Interface [6], could provide a partial solution to this problem. It facilitates a match-making process by which an arbitrary controller device can pick a user interface protocol that is supported by the controller platform and suits the user's preferences. Support for a specific user interface protocol, dubbed "CE-HTML" (an XHTML profile in Web 2.0 style), is mandatory for user interface servers, so that interoperability is at least guaranteed for controllers supporting this protocol. However, CE-HTML requires a Web browser with JavaScript support, and is focused on visual user interfaces only.

Problem 3: Remote user interfaces are device-oriented rather than task-oriented

This is a consequence of problem 1 (user interfaces tied to target devices). Since every manufacturer binds users to the user interface that they deliver for their targets, and since there are typically multiple targets from different manufacturers in the home, users end up "juggling" multiple user interfaces, one per target device. Each of these user interfaces reflects the functionality of a single target, and does not directly offer cross-target functionality that would be required for more intuitive and task-oriented user interfaces.

For example, the (simple) task of watching a DVD movie on a typical home theater system involves at least three target devices that the user needs to set up in the right way for interoperation: the TV screen must be switched on and set to accept input from the DVD player; the same holds for the receiver/equalizer; and the DVD player must be switched on and instructed to play the DVD.
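
To make the coordination burden concrete, the following sketch (in Python, with invented placeholder classes; it is not taken from any cited standard or product) spells out the per-device steps that a user, or a task-oriented controller acting on the user's behalf, would have to carry out just to start a movie:

    # Illustrative sketch only: TvScreen, Receiver and DvdPlayer are invented
    # placeholders for three separately controlled targets.

    class TvScreen:
        def power_on(self): print("TV: power on")
        def select_input(self, source): print("TV: input set to " + source)

    class Receiver:
        def power_on(self): print("Receiver: power on")
        def select_input(self, source): print("Receiver: input set to " + source)

    class DvdPlayer:
        def power_on(self): print("DVD player: power on")
        def play(self): print("DVD player: play")

    def watch_dvd(tv, receiver, dvd):
        """The single user-level task 'watch a DVD' expands into six device-level steps."""
        tv.power_on()
        tv.select_input("HDMI 1")
        receiver.power_on()
        receiver.select_input("DVD")
        dvd.power_on()
        dvd.play()

    watch_dvd(TvScreen(), Receiver(), DvdPlayer())

Today, each of these steps has to be performed manually, in the right order, through three different user interfaces.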

This situation is also prohibitive for natural language user interfaces that would allow users to interact with targets in terms of what they want to do rather than how to do it. The fact that users have to operate multiple user interfaces (one per device involved) to achieve a goal is one of the biggest impediments for easy-to-use interfaces in the digital home. This problem becomes worse the more devices are required for a particular task.

An evolving standard on task model representation, CEA-2018, currently being developed by the CEA working group R7 WG12 [13], may help to solve this problem for future generations of devices in the digital home.

Problem 4: Some new devices are network-enabled, but there is no uniform networking platform for the digital home

Network-enabled devices are emerging on the market. These can be controlled remotely by means of standardized networking protocols. Such home networking protocols include, among others: Universal Plug and Play (UPnP) [14], CECED Home Appliances Interoperating Network (CHAIN) [15], HDMI Consumer Electronics Control channel (CEC) [16], HANA [17], and LonWorks [18].

Networked devices allow a single controller device to control all target devices that use a particular network protocol. For example, the Siemens serve@Home product line [19], which is built upon the CHAIN standard, includes a touch-screen controller device (called "info display") that can be used to view status information and remotely control all serve@Home products in a home. Another example is the Rudeo Play & Control software [20], which turns a Pocket PC based PDA into a remote control for any UPnP Media Player in the home network.

However, there is no networking platform today that would encompass the full range of devices in the digital home. Therefore users are still required to cope with multiple controller devices and user interfaces, one for each platform that is being used by a target device in their home. They cannot use the controller device and user interface of their choice, such as PDAs, cell phones or assistive technology devices.

Problem 5: No clean separation between user interface code and networking code

Since existing tools encourage user interface code that is interspersed with networking code, writing user interfaces requires knowledge of protocol stacks for device-to-device communication. Typically, connectivity and network-oriented programmers end up writing user interface code themselves.

The missing separation between user interface code and connectivity code is a major cause of poorly designed user interfaces, because no user interface experts are involved in their design. Consequently, unsatisfied customers frequently complain about non-intuitive user interfaces, and manufacturers suffer from high return rates on the retail market.

3 Requirements

To overcome the above problems, a framework for remote user interfaces in the digital home is needed that satisfies the following requirements. Note that there is generally no one-to-one correspondence between requirements and problems; rather, the set of requirements as a whole is designed to address all the problems described above.

Requirement 1: Users can choose the controller device and user interface that suits their needs and preferences

Users should be able to use the controller device and user interface technology they are most familiar with and that suits their needs. For example, one user may want to run a Flash application on a PDA, a second may favor a Web 2.0 page in their PC's browser, a third may prefer a very simple remote control with visual feedback on the TV screen, and a fourth may want to use a Braille-based handheld computer.

Users should be able to use one controller and one user interface of their choice for all targets. Even if they transition to a new target device (e.g. because the old target broke) they should still be able to use the same, familiar controller and user interface.

Requirement 2: Third parties can provide user interfaces and user interface components for any target

Obviously, there is no "one-size-fits-all" approach to usable and accessible user interfaces. Open competition in user interfaces will result in more usable and more personally accessible user interfaces over time. Users will be able to pick the user interface that best matches their needs and preferences.

We need an open market for remote user interfaces in which any party, including manufacturers, 3rd party organizations, user groups, and users, can create a user interface for any target. These user interfaces and user interface components should be globally available for any consumer who wishes to use them.

Requirement 3: Facilitate cross-device user interfaces

Many tasks in the digital home require multiple target devices (cf. the DVD movie playing example above). In general, the targets will come from multiple manufacturers. Users want to control these from one user interface, with the devices being transparent to them, or at least allowing for seamless user operations.

Task-based user interfaces that communicate with the user in terms of "what to do" rather than "how to do it" are seen as potential remedy for the user interface complexity problem in consumer electronics. Cross-target user interfaces (rather than strictly target-specific ones) are a prerequisite for task-based user interfaces.

Requirement 4: Middle layer between user interface and target functionality

There needs to be a middle layer, a standardized "connection point", between the functionality of a target (as given by the target device manufacturer) and the user interface created by any party. This is called "user interface socket" in the Universal Remote Console framework [21][22].

Typically, the manufacturer would specify the user interface socket for their target. The user interface socket specifies a target's functionality in terms of abstract user interface elements not bound to any particular device. A user interface socket reflects a target's state at any time, and allows for triggering commands and state changes on the target. If a user interface socket is available for a target, a user interface designer can easily bind their user interface to it, without needing to write code for the specific networking and connectivity platform employed.

Regarding task-based user interfaces, the user interface socket can serve as an "atomic task layer". A task model may specify tasks and their sub-tasks as a multi-level hierarchy, with the sub-tasks of the lowest level being bound to elements of the user interface socket. This makes the task model independent of the underlying networking platform.
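
As an informal illustration of the socket idea, the following sketch (Python; the class and method names are our own and are not taken from ISO/IEC 24752) models a socket as a set of named state variables and commands that a pluggable user interface can read, change and invoke without any knowledge of the underlying networking platform:

    # Minimal sketch of a user interface socket; all names are invented for
    # illustration and do not reflect the actual ISO/IEC 24752 interfaces.

    class InMemoryAdapter:
        """Stands in for the network-specific code (UPnP, CHAIN, ...) behind a target."""
        def __init__(self, state): self.state = dict(state)
        def read(self, variable): return self.state[variable]
        def write(self, variable, value): self.state[variable] = value
        def execute(self, command, args): print("executing", command, args)

    class UISocket:
        """Abstract connection point between a target and its pluggable user interfaces."""
        def __init__(self, adapter):
            self.adapter = adapter
            self.observers = []
        def get(self, variable): return self.adapter.read(variable)
        def set(self, variable, value):
            self.adapter.write(variable, value)
            for notify in self.observers:      # reflect state changes to the UI
                notify(variable, value)
        def invoke(self, command, **args): self.adapter.execute(command, args)
        def subscribe(self, callback): self.observers.append(callback)

    # A pluggable user interface only ever talks to the socket:
    socket = UISocket(InMemoryAdapter({"power": "off", "volume": 10}))
    socket.subscribe(lambda var, val: print(var, "changed to", val))
    socket.set("power", "on")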

4 The Universal Control Hub

The Universal Control Hub (UCH) is a gateway-based architecture for implementing the Universal Remote Console (URC) framework [21][22] in the digital home. It reaps the benefits of the URC framework without requiring URC-compliant targets and controllers. The UCH architecture was originally proposed for UPnP based home networks [23], but is applicable to any other networking platform, and to combinations of them.

In the UCH architecture, the control hub is the gateway between controllers and targets that would otherwise not be able to talk to each other. It places few requirements on the targets and controllers in the system, since it handles most of the URC-related technologies itself. In fact, all of the URC framework's components run inside the control hub. In short, the Universal Control Hub exposes pluggable user interfaces to controllers on behalf of targets, based on the URC framework.


Figure 1: Universal Control Hub architecture (from [23] with minor modifications). The control hub (in the middle) connects to different controllers (on the left) through a variety of user interface protocols (sample protocols shown as gray boxes on the left side of the control hub). The control hub connects to targets (on the right) through network-specific code (here, as an example, the UPnP 1.0 architecture and standardized DCPs). The control hub downloads pluggable user interfaces from resource servers on the Internet (at the top).

Controllers need to discover the control hub and thus gain access to the connected targets in the home. CEA-2014 [12] may be used for that purpose. However, this requires that a UPnP control point be available on the controllers. Alternatively, users can establish the connection to the control hub manually once and store it for subsequent uses (e.g. as a bookmark in a Web browser).

The remainder of this section describes how various features of the Universal Control Hub architecture meet the requirements described above.

Feature 1: Control hub between targets and controllers

The hub approach solves the interoperability problem between targets and controllers: it bridges between targets, with their control languages, and controllers, with their user interface languages and protocols.

The hub spans multiple targets. This facilitates user interfaces that seamlessly integrate multiple targets and that are structured by tasks rather than by individual targets.
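
The following structural sketch (Python; the class and method names are invented for illustration and do not mirror any particular UCH implementation) indicates how the hub keeps one network-specific adapter per discovered target and one user interface protocol module per supported controller type, with sockets as the only interface between the two sides:

    # Structural sketch of a control hub, following the layering of Figure 1;
    # all type and method names are placeholders invented for this example.

    class ControlHub:
        def __init__(self):
            self.target_adapters = {}   # target id -> network-specific adapter (e.g. UPnP)
            self.sockets = {}           # target id -> user interface socket
            self.ui_modules = []        # one module per UI protocol (DHTML, Flash, VoiceXML, ...)

        def add_target(self, target_id, adapter, socket):
            """Called when a target is discovered on one of the home networks."""
            self.target_adapters[target_id] = adapter
            self.sockets[target_id] = socket

        def register_ui_module(self, module):
            """Each module serves one user interface protocol to controllers."""
            module.bind(self.sockets)   # modules see sockets, never the raw target protocols
            self.ui_modules.append(module)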

Feature 2: Standard-based user interface socket

The UCH is based on the Universal Remote Console framework defined in ISO/IEC 24752 [22]. This framework allows third parties to provide pluggable user interfaces for targets of other manufacturers, based on a user interface socket description for a target device. This facilitates an open market for remote user interfaces in the digital home.

The user interface socket (short "socket") is the connection point between a target and a remote user interface. It reduces the interconnection problem from an n×m problem to an n+m problem (with n = number of controllers and m = number of targets): for example, 5 controllers and 8 targets would require 40 direct pairings, but only 13 socket-based adaptations. Note that this calculation is somewhat simplistic, since it assumes that there is always exactly one user interface per target.

The user interface socket description abstracts from the platform-specific details of networking and connectivity protocols, so that a pluggable user interface does not need to deal with them. The user interface socket description can also contain pre- and postconditions as typically used in task models. These dependencies can be used by sophisticated pluggable user interfaces, such as intelligent agents, to generate more usable user interfaces.

Feature 3: Variety of user interface protocols

The control hub provides a variety of user interface protocols (such as DHTML over HTTP, Flash, VoiceXML, etc.) for controllers. Each controller can pick the most suitable protocol among those it supports.

Users can choose their controller and their favorite user interface as long as there is a matching pluggable user interface that runs inside the control hub. The pluggable user interface may be created by the manufacturer of a target device, the manufacturer of a controller, or by a third party.

This provides "user interface forward-compatibility" for targets: one can control the same targets with conventional controllers today, and tomorrow with intelligent agents and natural-language-based controllers, once these are mature and available to everybody.

If there is no appropriate pluggable user interface available for a specific combination of controller and targets, a controller can fall back to the URC-HTTP protocol which is mandatory for the hub. The URC-HTTP protocol is the basis for a "functional user interface". It provides direct access to the user interface socket and thus allows for generating a user interface on the fly for any modality.
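
From a controller's point of view, such a fallback could look roughly like the following sketch (Python; the hub address, URL path and JSON response layout are assumptions made for this illustration and are not the message formats defined by the URC-HTTP protocol): the controller fetches the socket state over HTTP and renders it in whatever modality it supports.

    # Illustrative controller-side fallback to a socket-level HTTP interface.
    # The endpoint path and response format are invented for this sketch and
    # do not reproduce the actual URC-HTTP protocol messages.

    import json
    from urllib.request import urlopen

    HUB = "http://192.168.1.10:8080"    # assumed address of the control hub

    def render_functional_ui(target_id):
        """Fetch a target's socket state and render it generically (here: as text)."""
        with urlopen(HUB + "/targets/" + target_id + "/socket") as resp:
            socket_state = json.load(resp)          # e.g. {"power": "on", "volume": 12}
        for variable, value in socket_state.items():
            print(variable + ":", value)            # a voice UI could speak this instead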

Feature 4: Globally available resource servers

The control hub connects to global resource servers on the Internet to retrieve various user interfaces for the target devices that it discovers. Third parties can create their own user interfaces for any controllers and any targets and register them with any resource server.

Thus a user interface can be updated on the fly (i.e. by having it downloaded to a control hub) without requiring access to, or replacement of, target devices or controller devices. Note that controller and target devices do not need to have Internet access. It is sufficient that the control hub has an Internet connection (which may be temporary) to access resource servers and download the appropriate pluggable user interfaces for the controller and target devices present in the home.
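
A minimal sketch of this retrieval step is given below (Python; the server address, query parameter and response fields are placeholders chosen for illustration, not the actual resource server interface of the URC framework): given the socket of a newly discovered target, the hub asks a resource server for matching pluggable user interfaces and keeps those it can serve to local controllers.

    # Sketch of a hub fetching pluggable user interfaces from a resource server.
    # The URL, query parameter and response fields are invented for illustration.

    import json
    from urllib.request import urlopen

    RESOURCE_SERVER = "https://resources.example.org"   # placeholder server address

    def fetch_pluggable_uis(socket_uri, supported_protocols):
        """Return descriptions of registered UIs for this socket that the hub can host."""
        query = RESOURCE_SERVER + "/search?socket=" + socket_uri
        with urlopen(query) as resp:
            candidates = json.load(resp)                 # list of UI resource descriptions
        return [ui for ui in candidates
                if ui.get("protocol") in supported_protocols]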

5 Application of the Universal Control Hub Architecture

An implementation of the Universal Control Hub architecture has been developed at the Trace R&D Center, University of Wisconsin-Madison, USA. A recent prototype demonstrates how a variety of mobile devices can be used to remotely control a UPnP-based entertainment system.

The Universal Control Hub is also being applied in the "i2home" project [24], a project funded under the European Union's IST research programme. The i2home consortium consists of organizations and companies from Germany, Sweden, the Czech Republic, Spain and Portugal.

Organizations interested in the promotion and implementation of the URC standards have formed the URC Consortium (URCC) that provides more information on current projects and tools [25].

Acknowledgment

This work was partially funded by the US Dept of Education, NIDRR, under Grant H133E030012 (RERC on IT Access); and by the EU 6th Framework Program under grant FP6-033502 (i2home). The opinions herein are those of the authors and not necessarily those of the funding agencies.

References

[1] Consumer Electronics Association (2006): Home Theater Opportunities - CEA Market Research Report, Sep. 2006. http://www.ebrain.org/crs/crs_arch.asp?crscode=CRS290.

[2] Zeven, P. (2006). Do people need the gizmos we're selling? CNET News, Dec. 18, 2006, http://news.com.com/Do+people+need+the+gizmos+were+selling/2010-1041_3-6144335.html.

[3] Ogg, E. (2006). Can Grandma figure out how to use it? CNET News, Oct 13, 2006. http://news.com.com/The+key+to+gadget+buyers+hearts+Simplicity/2100-1041_3-6125477.html

[4] Den Ouden, E. (2006): Developments of a Design Analysis Model for Consumer Complaints: revealing a new class of quality failures. Ph.D. thesis, Technische Universiteit Eindhoven.

[5] Bias, R.G., Mayhew, D.J. (2005): Cost-Justifying Usability: An Update for the Internet Age. Morgan Kaufmann; 2nd edition (April 4, 2005).

[6] UPnP Remote UI Client and Server V 1.0. UPnP Forum. http://www.upnp.org/standardizeddcps/remoteui.asp.

[7] CEA-2027-B. A User Interface Specification for Home Networks Using Web-based Protocols. Consumer Electronics Association, 2006. http://ce.org/Standards/2502.asp.

[8] OASIS User Interface Markup Language (UIML) Technical Committee. http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=uiml

[9] XForms 1.0 (Second Edition). W3C Recommendation 14 March 2006. http://www.w3.org/TR/2006/REC-xforms-20060314/.

[10] The Pittsburgh Pebbles PDA Project. Human-Computer Interaction Institute, Carnegie Mellon University. http://www.pebbles.hcii.cmu.edu/.

[11] SUPPLE: Automatic Generation of Personalizable User Interfaces. Department of Computer Science and Engineering, University of Washington. http://www.cs.washington.edu/ai/supple/.

[12] CEA-2014. Web-based Protocol and Framework for Remote User Interface on UPnP Networks and the Internet (Web4CE). Consumer Electronics Association, 2006. http://www.ce.org/Standards/browseByCommittee_2757.asp.

[13] CEA R7 WG12, Task-Based User Interface. http://www.ce.org/Standards/CommitteeDetails.aspx?Id=000011032608.

[14] Universal Plug and Play (UPnP) Forum. http://www.upnp.org/.

[15] CECED Home Appliances Interoperating Network (CHAIN). European Committee of Domestic Equipment Manufacturers (CECED). http://www.ceced.org/IFEDE/easnet.dll/ExecReq/WPItem?eas:dat_im=010113&eas:display=CHAIN.

[16] High-Definition Multimedia Interface (HDMI). http://www.hdmi.org.

[17] High-Definition Audio-Video Network Alliance. http://www.hanaalliance.org/.

[18] The LonWorks Platform: Technology Overview. Echelon. http://www.echelon.com/developers/lonworks/.

[19] Siemens serve@Home. http://www.servehome.com/.

[20] Rudeo Play & Control for UPnP Devices and Windows Media Player. http://www.rudeo.com/playctrl.htm.

[21] ANSI/INCITS 289-2005, 290-2005, 291-2005, 292-2005 and 293-2005. Information Technology - Protocol to Facilitate Operation of Information and Electronic Products through Remote and Alternative Interfaces and Intelligent Agents. ANSI, 2005. http://myurc.org/obtain-copies.php.

[22] ISO/IEC FCD 24752. Information technology - User interfaces - Universal remote console - 5 parts. International Organization for Standardization (ISO), 2006. http://www.iso.org/iso/en/CombinedQueryResult.CombinedQueryResult?queryString=24752.

[23] Zimmermann, G.; Vanderheiden, G.; Rich, C. (2006). Universal Control Hub & Task-Based User Interfaces. URC Consortium, 2006. http://myurc.org/publications/2006-Univ-Ctrl-Hub.php.

[24] Intuitive Interaction for Everyone with Home Appliances based on Industry Standards (i2home). http://www.i2home.org/.

[25] URC Consortium. http://myurc.org/.