March 21, 2014
The Society for Information Display (SID), in collaboration with IHS Technology, today unveiled the preliminary program for its Touch Gesture Motion (TGM) event, to be held during Display Week 2014 (San Diego Convention Center, June 1-6). The TGM conference on Wednesday, June 4, will continue SID's tradition of providing the most up-to-date information and expert insight in the essential area of display input technology.
Direct touch technologies have become commonplace, but new breakthroughs in sensing technologies that recognize gestures, motion, faces, and emotion are enabling product designers to bring exciting new capabilities to human-machine interaction. A little more than 10 years ago, the display interaction shown in "Minority Report" seemed futuristic, but the capability of machines to use multiple methods to anticipate human expectations has arrived. The TGM event will give attendees an opportunity to consider some of the latest developments in the field and how they will affect the display marketplace. The event will educate, inform, and entertain, with the goal of spurring the novel ideas that are likely to change the way we interact with display devices in the future.
The conference's mid-morning keynote will be delivered by Marian Bartlett, founder and lead scientist of Emotient. Her keynote will be complemented by presentations from other companies and organizations at the forefront of this sector, including Ayotle, CEA, EyeTech Digital, Handscape, Intel, nLight, Ocular, Quantum Interface, Qualcomm, Sensor Platforms, Shindler Perspective, Tobii, The Eye Tribe, and Xensr. This year's session topics include:
- Session 1, Sensor Technologies: Situational Intelligence
Devices are becoming increasingly intelligent, capable of measuring where we are, what we are doing, how we are feeling, what we are watching, and what information we are seeking. More and more, sensor technologies anticipate our needs and identify solutions to problems before we have even articulated them. This session features speakers who will identify the sensors that are enabling contextual awareness and situational intelligence.
- Session 2, Eyetracking and Facial Recognition: Enabling Direct Interactivity
One of the primary ways in which we as humans communicate is through non-verbal cues, such as using our eyes and facial expressions to react to events around us. There is increasing focus on eye tracking and facial recognition (and even emotion recognition based on eye and facial reactions) as technologies that enable enhanced interactivity with display devices. This session features some of the industry leaders in this enabling communications discipline.
- Session 3, Gesture and Motion-based Interactivity: Touchless Interaction
Gesture and motion-based interactivity continues to expand to new areas and is no longer limited to the sort of sensing that serves to displace a TV remote control. This session seeks to expand upon previous years' TGM events by covering some of the latest developments in the field of gesture and motion-based interactivity.
- Session 4, Touch Industry Analysis: Beyond Projected Capacitive Touch
Developments in direct touch interactivity continue to expand rapidly. The dominant touch technology, projected capacitive, is still seeing considerable innovation as well as challenges from other technologies. This session looks to identify some of the improvements to projected capacitive touch and some of its potential challengers.