Steven K. Feiner


450 CS Building
Mail Code 0401

Tel: (212) 939-7083

Steven K. Feiner and his lab explore how computers can assist people in performing skilled tasks at work and in play, individually and collaboratively. Feiner’s lab has been doing research on Augmented Reality (AR), Virtual Reality (VR), and wearable computing for over 25 years, designing and evaluating novel 3D interaction and visualization techniques, creating the first outdoor mobile AR system using a see-through head-worn display, and pioneering experimental applications of AR to fields such as tourism, journalism, maintenance, construction, and healthcare. Their work on 2D desktop and handheld user interfaces has developed novel approaches to effectively communicate information in a broad range of domains, from helping hospital patients understand their health records to assisting mobile users in map-based navigation.

Research Interests

Human–computer interaction, augmented reality, virtual reality, 3D user interfaces, mobile and wearable computing, automated design of graphics and multimedia, computer games, visualization, health.

Research Areas

Feiner and his team examine the ways in which people accomplish tasks using current technology to determine how to create systems and user interfaces that can improve their performance. Whether working with 2D phones and tablets or 3D see-through head-worn displays, they design and implement interaction techniques that allow users to manipulate information more effectively, along with visualization techniques that present that information more understandably. They evaluate their work through user studies that compare how participants perform with their techniques versus existing ones, quantifying improvements in metrics such as time, error rate, and subjective response.

Feiner’s lab has developed view management systems that automatically lay out virtual objects, such as labels, in VR and AR user interfaces. These systems rely on efficient algorithms that dynamically reposition and resize virtual objects to avoid unwanted visual relationships, such as occlusions, as the user’s viewpoint moves and the world changes. The lab’s research on collaborative task assistance has created and tested new ways in which a remote expert can demonstrate to a distant technician through AR how to perform 3D manual tasks. The lab has also shown how to combat VR sickness, which some users experience when wearing VR head-worn displays, by subtly modifying the user’s field of view in ways that are effective yet imperceptible to many people.

In addition to working with individual displays by themselves, Feiner has investigated what he calls hybrid user interfaces, which combine different kinds of displays and interaction devices to benefit from their complementary strengths. For example, one hybrid user interface created by his lab supports urban visualization. It uses a large, publicly visible, horizontal multi-touch display to present and interact with a map showing building footprints; personal head-tracked head-worn AR displays that overlay 3D building models extending from those footprints; and personal tracked handheld phones that present detailed data about selected buildings. All of these displays are aligned in a common 3D coordinate system and share maps and building models, together with georeferenced data obtained from online sources.


Feiner received an AB in music in 1973 and a PhD in computer science in 1987, both from Brown University. He was elected to the ACM CHI Academy in 2011, received the ACM UIST Lasting Impact Award in 2010, and received the IEEE VGTC Virtual Reality Career Award in 2014.

Positions

  • Professor, Department of Computer Science, Columbia University, New York, NY 
  • Associate Professor, Department of Computer Science, Columbia University, New York, NY
  • Assistant Professor, Department of Computer Science, Columbia University, New York, NY

Professional Memberships

  • Association for Computing Machinery (ACM)
  • Institute of Electrical and Electronics Engineers (IEEE)
  • IEEE Computer Society

Honors & Awards

  • IEEE VGTC Virtual Reality Career Award, 2014
  • ACM CHI Academy, 2011
  • ACM UIST Lasting Impact Award, 2010

Selected Publications

  • Mengu Sukan, Carmine Elvezio, Steven Feiner, and Barbara Tversky, “Providing assistance for orienting 3D objects using monocular eyewear,” Proc. SUI 2016 (ACM Symp. on Spatial User Interaction), Tokyo, Japan, October 15−16, 2016, 89−98.
  • Daniel Miau and Steven Feiner, “Personalized compass: A compact visualization for direction and location,” Proc. CHI 2016, San Jose, CA, May 7−12, 2016, 5114−5125.
  • Ajoy Fernandes and Steven Feiner, “Combating VR sickness through subtle dynamic field-of-view modification,” Proc. IEEE 3DUI 2016 (IEEE Symp. on 3D User Interfaces), Greenville, SC, March 19−20, 2016, 201−210.
  • Janet Woollen, Jennifer Prey, Lauren Wilcox, Alexander Sackeim, Susan Restaino, Syed Raza, Suzanne Bakken, Steven Feiner, George Hripcsak, and David Vawdrey, “Patient experiences using an inpatient personal health record,” Applied Clinical Informatics, 7(2), 2016, 446−460.
  • Lauren Wilcox, Janet Woollen, Jennifer Prey, Susan Restaino, Suzanne Bakken, Steven Feiner, Alexander Sackeim, and David Vawdrey, “Interactive tools for inpatient medication tracking: A multi-phase study with cardiothoracic surgery patients,” J. Am. Med. Inform. Assoc., 23(1), 2016, 144−158. doi:10.1093/jamia/ocv160.
  • Shashi Shekhar, Steven Feiner, and Walid Aref, “Spatial computing,” Commun. ACM, 59(1), January 2016, 72−81.
  • Ohan Oda, Carmine Elvezio, Mengu Sukan, Steven Feiner, and Barbara Tversky, “Virtual replicas for remote assistance in virtual and augmented reality,” Proc. UIST 2015 (ACM Symp. on User Interface Software and Technology), Charlotte, NC, November 8−11, 2015, 405−415.
  • Mengu Sukan, Carmine Elvezio, Ohan Oda, Steven Feiner, and Barbara Tversky, “ParaFrustum: Visualization techniques for guiding a user to a constrained set of viewing positions and orientations,” Proc. UIST 2014 (ACM Symp. on User Interface Software and Technology), Honolulu, HI, October 5−8, 2014, 331−340.
  • Taejin Ha, Steven Feiner, and Woontack Woo, “WeARHand: Head-worn, RGB-D camera-based, bare-hand user interface with visually enhanced depth perception,” Proc. ISMAR 2014 (IEEE Int. Symp. on Mixed and Augmented Reality), Munich, Germany, September 10−12, 2014, 219−228.
  • John Hughes, Andries van Dam, Morgan McGuire, David Sklar, James Foley, Steven Feiner, and Kurt Akeley, Computer Graphics: Principles and Practice, Third Edition, Addison-Wesley, Upper Saddle River, NJ, 2014.