Interactive Computing Everywhere

 goals The ICE project aims to push human-computer interaction into all of the situations where people work, live, and play.
lab director Dan R. Olsen Jr.
lab guide Research Guidance


This is an internal (not visible outside of BYU) project. It is an architecture that implements many of the ideas described below.



This is an internal project. Its goal is to develop new models for interactive television.
View a video demo of the Time Warp Sports Project.



This is an internal project. It involves the development of computing for education.

Nomadic Multiscale Interaction

Computing is steadily decreasing in size and cost. Most cellphones have more computing and communication power than some mainframes of 30 years ago. The limiting factor on such small computing devices is their ability to interact. Fingers will not get smaller and eyes will not focus more sharply. We envision an interaction model where the user carries minimal interactive capability and exploits displays and input devices as they are encountered in the environment. This requires applications that can function effectively across a variety of display and input scales.


See and Touch

Much of future interaction will involve communication and cooperation with other people and other devices. This requires a secure mechanism for establishing such connections that ordinary people understand. Many security breaches occur because users do not understand the implications of the connections they make. This project exploits the simple gesture of touching as a means for establishing secure communications.
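One way the touch idea can be made concrete is a commit-then-reveal check of co-presence. The sketch below is an illustrative assumption, not the lab's actual protocol: it supposes both devices sense the same touch event (the reading format and names here are invented), commit to their local readings before revealing them, and only treat the connection as trusted when the readings match.

```python
# Hedged sketch (NOT the ICE lab's protocol): two devices that both sense the
# same physical touch treat the shared sensor reading as evidence of
# co-presence before opening a channel. Names and the reading format are
# illustrative assumptions.
import hashlib
import hmac
import os

def commitment(touch_reading: bytes, nonce: bytes) -> bytes:
    """Commit to the locally sensed touch event without revealing it."""
    return hmac.new(nonce, touch_reading, hashlib.sha256).digest()

# Both devices sense the same touch (hypothetical timestamp + pressure).
touch = b"t=1712345678901;p=0.83"

nonce_a, nonce_b = os.urandom(16), os.urandom(16)

# Each side first sends its commitment, then reveals its nonce, so neither
# side can adapt its claimed "reading" after seeing the other's.
commit_a = commitment(touch, nonce_a)
commit_b = commitment(touch, nonce_b)

# After the nonces are revealed, each side recomputes and checks the other's
# commitment against its own sensed reading.
if hmac.compare_digest(commit_a, commitment(touch, nonce_a)) and \
   hmac.compare_digest(commit_b, commitment(touch, nonce_b)):
    print("co-presence confirmed; safe to establish the connection")
```

The point of the commit-then-reveal ordering is that an eavesdropper who did not sense the touch has nothing to commit to, which is what makes the physical gesture meaningful as a security act.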


Cooperative Interaction

Most interactive applications support a single user with limited cooperation among applications. We want interactive architectures where multiple people and multiple applications can dynamically share information. We want to blur the boundaries between applications and exploit information and people in the "interactive neighborhood".


Interactive Machine Learning

With massive amounts of computation and massive amounts of information, users will need "leverage" to take advantage of it all. Direct manipulation simply will not scale to "internet sized" data stores. We believe that machine learning can provide the necessary leverage. However, new interactive techniques are required to train algorithms and provide feedback in ways that everyday users can understand and exploit.
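The core loop described above can be sketched in a few lines. This is a toy illustration, not the lab's actual system: a deliberately simple nearest-centroid classifier (chosen here only because it retrains instantly) absorbs each user correction immediately, so the user sees the effect of every labeled example.

```python
# Toy sketch of an interactive machine learning loop (an assumption for
# illustration, not the ICE lab's implementation). The user labels examples,
# inspects predictions, and corrects mistakes; each correction is folded into
# the model right away.
from statistics import mean

class CentroidClassifier:
    """Predicts the label whose class centroid is nearest the input."""
    def __init__(self):
        self.examples = {}  # label -> list of feature vectors

    def add_example(self, features, label):
        self.examples.setdefault(label, []).append(features)

    def predict(self, features):
        def sq_dist_to_centroid(label):
            pts = self.examples[label]
            centroid = [mean(p[i] for p in pts) for i in range(len(features))]
            return sum((a - b) ** 2 for a, b in zip(features, centroid))
        return min(self.examples, key=sq_dist_to_centroid)

# The "user" labels two pixels (hypothetical RGB features) ...
clf = CentroidClassifier()
clf.add_example([255, 200, 180], "skin")
clf.add_example([30, 90, 40], "background")

# ... then inspects a prediction and corrects it if it is wrong.
# The retrained model is available for the very next query.
guess = clf.predict([40, 100, 50])
if guess != "background":
    clf.add_example([40, 100, 50], "background")

print(clf.predict([35, 95, 45]))  # prints: background
```

The design point is the tight train-inspect-correct cycle: because feedback is immediate, users who know nothing about learning algorithms can still steer the model by example.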

papers Olsen, D. R., Taufer, T., Fails, J. A.: “ScreenCrayons: Annotating Anything”, UIST '04, ACM (2004). .PDF
  Fails, J. A., Olsen, D. R.: “A Design Tool for Camera-based Interaction”, Human Factors in Computing Systems, CHI '03, ACM (2003)
  Fails, J. A., Olsen, D. R.: “Interactive Machine Learning”, Intelligent User Interfaces, IUI '03, ACM (2003). .PDF
  Olsen, D. R., Peachey, J. R.: “Query by Critique: Spoken Language Access to Large Lists”, User Interface Software and Technology, UIST '02, ACM (2002). .PDF
  Fails, J. A., Olsen, D. R.: “Light Widgets: Interacting in Every-day Spaces”, Intelligent User Interfaces, IUI '02, ACM (2002). .PDF
  Rosenfeld, R., Olsen, D. R., Rudnicky, A.: “Universal Speech Interfaces”, interactions, ACM (2001). .PDF
  Olsen, D. R., Nielsen, S. T., Parslow, D.: “Join and Capture: A Model for Nomadic Interaction”, UIST '01, ACM (2001). .PDF
  Olsen, D. R., Nielsen, T.: “Laser Pointer Interaction”, Human Factors in Computing Systems, CHI '01, ACM (April 2001). .PDF
  Olsen, D. R., Nielsen, T., Jefferies, S., Moyes, W., Fredrickson, P.: “Cross-modal Interaction in XWeb”, UIST '00, ACM (2000). .PDF
  Olsen, D. R.: “Interacting in Chaos”, interactions, ACM (1999)