Multi-touch
In computing, multi-touch refers to the ability of a surface (a trackpad or touchscreen) to recognize the presence of more than one[1][2] or more than two[3] points of contact with the surface. This plural-point awareness is often used to implement advanced functionality such as pinch to zoom or to activate certain subroutines attached to predefined gestures.
The two definitions of the term result from the rapid development of this field, which led many companies to use "multi-touch" as a marketing term for older technology known as gesture-enhanced single-touch (or by several other names used by other companies and researchers).[4][5] Several other similar or related terms attempt to differentiate between devices that can exactly determine, or only approximate, the location of different points of contact, and to further differentiate between the various technological capabilities,[5] but they are often used as synonyms in marketing.
History of multi-touch
The use of touchscreen technology to control electronic devices pre-dates multi-touch technology and the personal computer. Early synthesizer and electronic instrument builders like Hugh Le Caine and Bob Moog experimented with using touch-sensitive capacitance sensors to control the sounds made by their instruments.[6] IBM began building the first touch screens in the late 1960s, and, in 1972, Control Data released the PLATO IV computer, a terminal used for educational purposes that employed single-touch points in a 16x16 array as its user interface.
One of the early implementations of mutual capacitance touchscreen technology was developed at CERN in 1977,[7][8] based on capacitance touch screen technology developed in 1972 by Danish electronics engineer Bent Stumpe. This technology was used to develop a new type of human–machine interface (HMI) for the control room of the Super Proton Synchrotron particle accelerator.
In a handwritten note dated 11 March 1972, Stumpe presented his proposed solution – a capacitive touch screen with a fixed number of programmable buttons presented on a display. The screen was to consist of a set of capacitors etched into a film of copper on a sheet of glass, each capacitor being constructed so that a nearby flat conductor, such as the surface of a finger, would increase the capacitance by a significant amount. The capacitors were to consist of fine lines etched in copper on a sheet of glass – fine enough (80 μm) and sufficiently far apart (80 μm) to be invisible (CERN Courier April 1974 p117). In the final device, a simple lacquer coating prevented the fingers from actually touching the capacitors.
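The principle Stumpe described is that a nearby finger raises the capacitance of the capacitor it approaches, so a controller can detect touches by comparing each cell's reading against a no-touch baseline. The following Python snippet is a purely illustrative sketch of that idea (not CERN's or any vendor's actual firmware); the grid size, threshold value, and function name are assumptions chosen for the example.

```python
import numpy as np

def find_touches(readings, baseline, threshold=5.0):
    """Return the (row, col) grid cells whose capacitance rose noticeably above baseline.

    readings, baseline: 2-D arrays of raw capacitance values, one entry per grid cell.
    threshold: the minimum increase (in raw units) that counts as a touch.
    The values used here are illustrative; a real controller calibrates them at runtime.
    """
    delta = readings - baseline               # a nearby finger raises the reading
    touched = np.argwhere(delta > threshold)  # cells whose increase exceeds the threshold
    return [(int(r), int(c)) for r, c in touched]

# Example: a 16x16 grid with two simulated fingers.
baseline = np.full((16, 16), 100.0)
readings = baseline.copy()
readings[3, 4] += 12.0    # finger near cell (3, 4)
readings[10, 11] += 9.0   # second finger near cell (10, 11)
print(find_touches(readings, baseline))  # -> [(3, 4), (10, 11)]
```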
Multi-touch technology began in 1982, when the University of Toronto's Input Research Group developed the first human-input multi-touch system.[9] The system used a frosted-glass panel with a camera placed behind the glass. When a finger or several fingers pressed on the glass, the camera would detect the action as one or more black spots on an otherwise white background, allowing it to be registered as an input. Since the size of a dot was dependent on pressure (how hard the person was pressing on the glass), the system was somewhat pressure-sensitive as well.[6]
In 1983, Bell Labs at Murray Hill published a comprehensive discussion of touch-screen based interfaces.[10]
In 1983, the video-based Videoplace/Video Desk system of Myron Krueger was influential in the development of multi-touch gestures such as pinch-to-zoom.[11][12]
In 1984, Bell Labs engineered a touch screen that could change images with more than one hand. In 1985, the University of Toronto group including Bill Buxton developed a multi-touch tablet that used capacitance rather than bulky camera-based optical sensing systems.[6]
Sears et al. (1990)[13] reviewed the academic research of the time on single- and multi-touch touchscreen human–computer interaction. They described single-touch gestures such as rotating knobs, swiping the screen to activate a switch (or a U-shaped gesture for a toggle switch), and touchscreen keyboards, including a study showing that users could type at 25 words per minute on a touchscreen keyboard compared with 58 words per minute on a standard keyboard, with multi-touch hypothesized to improve the data-entry rate. They also described multi-touch gestures such as selecting a range of a line, connecting objects, and a "tap-click" gesture to select while maintaining location with another finger.
An advance occurred in 1991, when Pierre Wellner published a paper on his multi-touch "Digital Desk", which supported multi-finger and pinching motions.[14][15]
Various companies expanded upon these inventions in the beginning of the twenty-first century. Between 1999 and 2005, the company FingerWorks developed various multi-touch technologies, including TouchStream keyboards and the iGesture Pad; several studies of this technology were published in the early 2000s by Alan Hedge, professor of human factors and ergonomics at Cornell University.[16][17][18] Apple acquired FingerWorks and its multi-touch technology in 2005. Mainstream exposure to multi-touch technology occurred in 2007, when the iPhone gained popularity, with Apple stating it "invented multi touch" as part of the iPhone announcement.[19] However, both the function and the term predate the announcement and the related patent requests, except in the area of capacitive mobile screens, which did not exist before the FingerWorks/Apple technology (FingerWorks filed patents from 2001 to 2005;[20] subsequent multi-touch refinements were patented by Apple[21]).
Microsoft's table-top touch platform, Microsoft PixelSense, which started development in 2001, interacts with both the user's touch and their electronic devices. Similarly, in 2001, Mitsubishi Electric Research Laboratories (MERL) began development of a multi-touch, multi-user system called DiamondTouch, also based on capacitance, but able to differentiate between multiple simultaneous users (or rather, the chairs in which the users are seated or the floor pads on which they are standing); DiamondTouch became a commercial product in 2008.
Small-scale touch devices are rapidly becoming commonplace, with the number of touch screen telephones expected to increase from 200,000 shipped in 2006 to 21 million in 2012.[22]
Some of the first devices to support multi-touch were:
- Mitsubishi DiamondTouch (2001)
- Apple iPhone (announced January 9, 2007)
- Microsoft PixelSense (formerly Surface) (May 29, 2007)
- NORTD labs Open Source system CUBIT (multi-touch) (2007)
- ELAN eFinger
Brands and manufacturers
Apple has retailed and distributed numerous products using multi-touch technology, most prominently its iPhone smartphone and iPad tablet. Apple also holds several patents related to the implementation of multi-touch in user interfaces,[23] although the legitimacy of some of these patents has been disputed.[24] Apple additionally attempted to register "Multi-touch" as a trademark in the United States, but its request was denied by the United States Patent and Trademark Office on the grounds that the term is generic.[25]
Multi-touch sensing and processing occur via an ASIC sensor attached to the touch surface. Usually, separate companies make the ASIC and the screen that combine into a touch screen; conversely, a trackpad's surface and ASIC are usually manufactured by the same company. In recent years, several large companies have expanded into the growing multi-touch industry, with systems designed for everything from the casual user to multinational organizations.
It is now common for laptop manufacturers to include multi-touch trackpads on their laptops, and tablet computers respond to touch input rather than traditional stylus input; multi-touch is supported by many recent operating systems.
A few companies are focusing on large-scale surface computing rather than personal electronics, either large multi-touch tables or wall surfaces. These systems are generally used by government organizations, museums, and companies as a means of information or exhibit display.
Implementations
Multi-touch has been implemented in several different ways, depending on the size and type of interface. The most popular forms are mobile devices, tablets, touchtables, and touch walls. Both touchtables and touch walls project an image through acrylic or glass, and then back-light the image with LEDs.
Types[26]
- Capacitive Technologies
- Surface Capacitive Technology or Near Field Imaging (NFI)
- Projected Capacitive Touch (PST)
- Mutual capacitance
- Self-capacitance
- In-cell: Capacitive
- Resistive Technologies
- Analog Resistive
- Digital Resistive or In-Cell: Resistive
- Optical Technologies
- Optical Imaging or Infrared technology
- Rear Diffused Illumination (DI)
- Infrared Grid Technology (opto-matrix) or Digital Waveguide Touch (DWT)™ or Infrared Optical Waveguide
- Frustrated Total Internal Reflection (FTIR)
- Diffused Surface Illumination (DSI)
- Laser Light Plane (LLP)
- In-Cell: Optical
- Wave Technologies
- Surface Acoustic Wave (SAW)
- Bending Wave Touch (BWT)
- Dispersive Signal Touch (DST)
- Acoustic Pulse Recognition (APR)
- Force-Sensing Touch Technology
Optical touch technology functions when a finger or an object touches the surface, causing light to scatter; the reflection is captured by sensors or cameras that send the data to software, which determines the response to the touch depending on the type of reflection measured. Touch surfaces can also be made pressure-sensitive by adding a pressure-sensitive coating that flexes differently depending on how firmly it is pressed, altering the reflection.[27] Handheld technologies use a panel that carries an electrical charge. When a finger touches the screen, the touch disrupts the panel's electrical field. The disruption is registered and sent to the software, which then initiates a response to the gesture.[28]
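For the camera-based optical approach described above, the software side typically reduces each frame to regions ("blobs") where fingers scatter light and reports one centroid per blob as a touch point. Below is a minimal illustrative sketch of that idea using the OpenCV library; the threshold value, minimum blob area, and camera source are assumptions chosen for the example, not parameters of any particular product.

```python
import cv2

def detect_touch_points(frame, threshold=60, min_area=40):
    """Find finger 'blobs' in a grayscale camera frame from an optical touch surface.

    Returns a list of (x, y) centroids, one per detected contact.
    threshold and min_area are illustrative values; real systems calibrate them.
    """
    # Fingers scattering IR light typically appear as bright regions on a dark background.
    _, binary = cv2.threshold(frame, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    points = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:
            continue  # ignore noise smaller than a fingertip
        m = cv2.moments(contour)
        if m["m00"] > 0:
            points.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return points

if __name__ == "__main__":
    capture = cv2.VideoCapture(0)   # stand-in for the surface's camera
    ok, frame = capture.read()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        print(detect_touch_points(gray))
    capture.release()
```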
In the past few years, several companies have released products that use multi-touch. In an attempt to make the expensive technology more accessible, hobbyists have also published methods of constructing DIY touchscreens.[29]
Multi-touch gestures
Multi-touch gestures are predefined motions used to interact with multi-touch devices. An increasing number of products such as smartphones, tablets, laptops, and desktop computers feature functions that are triggered by multi-touch gestures. Typical gesture–function pairs include tapping to select, swiping to scroll, pinching two fingers together to zoom out, spreading them apart to zoom in, and rotating two fingers to rotate an on-screen object; a brief sketch of how a pinch gesture can be recognized follows.
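As an illustration of how software can recognize one of these gestures, the following sketch (hypothetical, not any platform's actual gesture API) computes the zoom factor for a pinch or spread gesture by comparing the current distance between two tracked touch points with their distance when the gesture began.

```python
import math

def pinch_scale(start_points, current_points):
    """Return the zoom factor implied by a two-finger pinch/spread gesture.

    start_points / current_points: two (x, y) tuples captured when the gesture
    began and at the latest frame. Values > 1 mean the fingers moved apart
    (zoom in); values < 1 mean they moved together (zoom out).
    """
    (x1, y1), (x2, y2) = start_points
    (u1, v1), (u2, v2) = current_points
    start_distance = math.hypot(x2 - x1, y2 - y1)
    current_distance = math.hypot(u2 - u1, v2 - v1)
    if start_distance == 0:
        return 1.0  # degenerate case: both fingers started on the same point
    return current_distance / start_distance

# Example: two fingers that started 100 px apart and are now 150 px apart.
print(pinch_scale([(100, 200), (200, 200)], [(75, 200), (225, 200)]))  # -> 1.5
```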
Popular culture
Popular culture has also portrayed potential uses of multi-touch technology in the future, including several installments of the Star Trek franchise.
The television series CSI: Miami introduced both surface and wall multi-touch displays in its sixth season. Another television series, NCIS: Los Angeles, makes use of multi-touch surfaces and wall panels as part of an initiative to go digital. Another form of multi-touch computer was seen in the film The Island, where the character played by Sean Bean has a multi-touch desktop to organize files, based on an early version of Microsoft Surface. Multi-touch technology can also be seen in the James Bond film Quantum of Solace, where MI6 uses a touch interface to browse information about the criminal Dominic Greene.[30] In an episode of the television series The Simpsons, Lisa Simpson travels to the underwater headquarters of Apple to visit Steve Jobs, who is shown performing multiple multi-touch hand gestures on a large touch wall.
A device similar to the Microsoft Surface was seen in the 1982 Disney sci-fi film Tron. It took up an executive's entire desk and was used to communicate with the Master Control Program.
The interface used to control the alien ship in the 2009 film District 9 features similar technology.[31]
Microsoft's Surface was also used in the 2008 film The Day the Earth Stood Still.[32]
In the 2002 film Minority Report, Tom Cruise uses a set of gloves that resemble a multi-touch interface to browse through information.[33]
See also
- Gesture-enhanced single-touch
- List of multi-touch computers and monitors
- Gesture recognition
- Human-Computer Interaction
- Jeff Han
- Natural User Interface
- Pen computing
- Sketch recognition
- Surface Computing
- Tenori-on
- Touchpad
- Touch user interface
- Sensomusic Usine
References
1. http://encyclopedia2.thefreedictionary.com/multi-touch
2. http://www.x2computing.com/support/glossary.aspx
3. http://books.google.fi/books?id=78-PIQQwv8kC&printsec=frontcover#v=onepage&q&f=false, page 33.
4. http://books.google.fi/books?id=78-PIQQwv8kC&printsec=frontcover#v=onepage&q&f=false
5. "What is Multitouch". Retrieved 30 May 2010.
6. Buxton, Bill. "Multitouch Overview".
7. Stumpe, Bent (16 March 1977). A new principle for x-y touch system. CERN. Retrieved 2010-05-25.
8. Stumpe, Bent (6 February 1978). Experiments to find a manufacturing process for an x-y touch screen. CERN. Retrieved 2010-05-25.
9. Mehta, Nimish (1982). A Flexible Machine Interface. M.A.Sc. thesis, Department of Electrical Engineering, University of Toronto, supervised by Professor K. C. Smith.
10. Nakatani, L. H.; Rohrlich, John A. (1983). "Soft Machines: A Philosophy of User-Computer Interface Design". Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '83): 12–15. doi:10.1145/800045.801573. Retrieved 2009-01-28.
11. Krueger, Myron. "Videoplace '88".
12. Krueger, Myron W.; Gionfriddo, Thomas; Hinrichsen, Katrin (1985). "VIDEOPLACE – An Artificial Reality". Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '85): 35–40.
13. Sears, A.; Plaisant, C.; Shneiderman, B. (June 1990). "A new era for high-precision touchscreens". In Hartson, R.; Hix, D. (eds.), Advances in Human-Computer Interaction, vol. 3, Ablex (1992): 1–33. HCIL-90-01, CS-TR-2487, CAR-TR-506.
14. Wellner, Pierre (1991). The Digital Desk. YouTube video.
15. Pierre Wellner's papers via DBLP.
16. Westerman, W.; Elias, J. G.; Hedge, A. (2001). "Multi-touch: a new tactile 2-D gesture interface for human-computer interaction". Proceedings of the Human Factors and Ergonomics Society 45th Annual Meeting, vol. 1: 632–636.
17. Shanis, J.; Hedge, A. (2003). "Comparison of mouse, touchpad and multitouch input technologies". Proceedings of the Human Factors and Ergonomics Society 47th Annual Meeting, Oct. 13–17, Denver, CO: 746–750.
18. Thom-Santelli, J.; Hedge, A. (2005). "Effects of a multitouch keyboard on wrist posture, typing performance and comfort". Proceedings of the Human Factors and Ergonomics Society 49th Annual Meeting, Orlando, Sept. 26–30, HFES, Santa Monica: 646–650.
19. Jobs, Steve (2006). "And Boy Have We Patented It". Retrieved 2010-05-14. "And we have invented a new technology called Multi-touch."
20. US patent 7,046,230, "Touch pad handheld device".
21. Jobs et al. "Touch Screen Device, Method, and Graphical User Interface for Determining Commands by Applying Heuristics".
22. Wong, May (2008). "Touch-screen phones poised for growth". http://www.usatoday.com/tech/products/2007-06-21-1895245927_x.htm. Retrieved April 2008.
23. Heater, Brian (27 January 2009). "Key Apple Multi-Touch Patent Tech Approved". PCMag.com. Retrieved 27 September 2011.
24. "Apple's Pinch to Zoom Patent Has Been Tentatively Invalidated". Gizmodo. Retrieved 12 June 2013.
25. Golson, Jordan. "Apple Denied Trademark for Multi-Touch". MacRumors. Retrieved 27 September 2011.
26. "Knowledge base: Multitouch technologies". Digest author: Gennadi Blindmann.
27. Scientific American (2008). "How It Works: Multitouch Surfaces Explained". Retrieved January 9, 2010.
28. Brandon, John (2009). "How the iPhone Works".
29. "DIY Multi-touch screen". http://www.humanworkshop.com/index.php?modus=e_zine&sub=articles&item=99
30. "Quantum of Solace Multitouch UI" (2009).
31. "District 9 – Ship UI".
32. Garofalo, Frank Joseph. "User Interfaces For Simultaneous Group Collaboration Through Multi-Touch Devices". Purdue University. p. 17. Retrieved 3 June 2012.
33. "Minority Report Touch Interface for Real". Gizmodo.com. Retrieved 2013-12-09.
External links
- Wikimedia Commons has media related to Multi-touch.
- Look up multi-touch in Wiktionary, the free dictionary.
- Multi-Touch Systems that I Have Known and Loved – An overview by researcher Bill Buxton of Microsoft Research, formerly at University of Toronto and Xerox PARC.
- The Unknown History of Pen Computing contains a history of pen computing, including touch and gesture technology, from approximately 1917 to 1992.
- Annotated bibliography of references to pen computing
- Annotated bibliography of references to tablet and touch computers
- Notes on the History of Pen-based Computing (YouTube)
- Multi-Touch Interaction Research @ NYU
- Camera-based multi-touch for wall-sized displays
- David Wessel Multitouch
- Jeff Han's multi-touch screen chronology archive
- Force-Sensing, Multi-Touch, User Interaction Technology
- LCD In-Cell Touch by Geoff Walker and Mark Fihn
- Touch technologies for large-format applications by Geoff Walker
- Video: Surface Acoustic Wave Touch Screens
- Video: How 3M™ Dispersive Signal Technology Works
- Video: Introduction to mTouch Capacitive Touch Sensing