Facial Action Coding System
Facial Action Coding System (FACS) is a system, originally developed by Paul Ekman and Wallace Friesen in 1976, for taxonomizing every conceivable human facial expression.[1] It is the most widely used standard for systematically categorizing the physical expression of emotions, and it has proven useful to psychologists and animators alike.
History
FACS and its action units (AUs) are based on Carl-Herman Hjortsjö's book Man's Face and Mimic [i.e., Facial] Language.[2] Hjortsjö was a professor of anatomy at Lund University in Sweden.
The original FACS was published in 1976 by Paul Ekman and Wallace V. Friesen. As they used the system in their laboratory over several years and trained new FACS coders, they updated its rules and definitions. At first these changes were handed out to new coders in the form of an addendum, but as the changes became more structural, a new version of FACS was needed.
In 2002, a new version of FACS was published, with large contributions by Joseph Hager.[3] Most co-occurrence rules were removed, a number of AUs were removed and some added, minimum requirements were eliminated, and a new intensity-scoring definition was introduced. The authors chose not to rename the system: it is still known simply as FACS rather than FACS2, FACS 2002, or FACS version 2. The website of Paul Ekman's lab refers to it as the "new" FACS.
Uses
Using FACS, human coders can manually code nearly any anatomically possible facial expression, decomposing it into the specific AUs and the temporal segments that produced the expression. Because AUs are independent of any interpretation, they can be used for any higher-order decision-making process, including recognition of basic emotions or pre-programmed commands for an ambient intelligence environment.
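As an illustration of what such a coding looks like in software, the following Python sketch stores AUs together with the A–E intensity grades introduced in the 2002 revision and prints them in the conventional "6C+12D" notation. The data structure is an assumption made for this sketch, not a format defined by the FACS manual; the AU numbers follow the list further below.

```python
# A minimal sketch, not the FACS manual's own format: one possible way to
# represent the FACS coding of a single expression in software.
from dataclasses import dataclass

@dataclass(frozen=True)
class ActionUnitScore:
    au: int          # FACS action unit number, e.g. 12 = Lip Corner Puller
    intensity: str   # "A" (trace) through "E" (maximum), per the 2002 revision

# Hypothetical coding of a broad smile with raised cheeks: AU 6 + AU 12.
coded_expression = [
    ActionUnitScore(au=6, intensity="C"),   # Cheek Raiser
    ActionUnitScore(au=12, intensity="D"),  # Lip Corner Puller
]

# Render in the conventional "6C+12D" notation.
print("+".join(f"{s.au}{s.intensity}" for s in coded_expression))
```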
FACS defines 32 AUs, each of which corresponds to the contraction or relaxation of one or more muscles. It also defines a number of Action Descriptors, which differ from AUs in that the authors of FACS have not specified their muscular basis and have not distinguished specific behaviors as precisely as they have for the AUs.
For example, FACS can be used to distinguish two types of smiles, as follows (see the sketch after the list):[4]
- insincere and voluntary Pan American smile: contraction of zygomatic major alone
- sincere and involuntary Duchenne smile: contraction of zygomatic major and inferior part of orbicularis oculi.
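As a hedged illustration, this distinction reduces to a check on which AUs are active: AU 12 (Lip Corner Puller, zygomatic major) alone versus AU 12 together with AU 6 (Cheek Raiser, orbicularis oculi). The function below is a simplification for illustration, not the FACS scoring procedure itself.

```python
# Sketch only: a simplified rule separating the two smile types described
# above from a set of active AU numbers. Real FACS coding also considers
# intensity and timing; this check is an illustrative reduction.
def classify_smile(active_aus: set[int]) -> str:
    if 12 not in active_aus:
        return "no smile coded"
    if 6 in active_aus:
        return "Duchenne smile (AU 6 + AU 12)"
    return "Pan American smile (AU 12 alone)"

print(classify_smile({6, 12}))  # -> Duchenne smile (AU 6 + AU 12)
print(classify_smile({12}))     # -> Pan American smile (AU 12 alone)
```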
Although labeling expressions currently requires trained experts, researchers have had some success in using computers to automatically identify FACS codes, and thus to quickly identify emotions.[5]
Computer graphical face models, such as CANDIDE or Artnatomy, allow expressions to be artificially posed by setting the desired action units.
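The sketch below illustrates the general idea behind such AU-driven models: each action unit contributes a displacement ("blendshape") that is scaled by its intensity and blended onto a neutral face. The mesh data and AU-to-displacement mapping are invented toy values, and the code does not use the actual CANDIDE or Artnatomy interfaces.

```python
# Toy sketch of AU-driven posing: each action unit adds a scaled vertex
# displacement to a neutral mesh. All numbers are invented for illustration;
# real systems such as CANDIDE define their own parameterizations.
import numpy as np

neutral = np.zeros((3, 3))  # toy "mesh": 3 vertices with x, y, z coordinates

au_displacements = {
    6:  np.array([[0.0, 0.2, 0.0], [0.0, 0.1, 0.0], [0.0, 0.0, 0.0]]),   # Cheek Raiser
    12: np.array([[0.3, 0.1, 0.0], [0.0, 0.0, 0.0], [-0.3, 0.1, 0.0]]),  # Lip Corner Puller
}

def pose(au_intensities: dict[int, float]) -> np.ndarray:
    """Blend AU displacements (intensities in 0..1) onto the neutral mesh."""
    mesh = neutral.copy()
    for au, weight in au_intensities.items():
        mesh = mesh + weight * au_displacements[au]
    return mesh

print(pose({6: 0.5, 12: 1.0}))  # a posed "smile" on the toy mesh
```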
FACS has been proposed for use in the analysis of depression[6] and in the measurement of pain in patients unable to express themselves verbally.[7]
A variant of FACS has been developed to analyze facial expressions in chimpanzees.[8]
Codes for action units
(Also see the list of facial muscles.)
Action units involving facial muscles
- 1 Inner Brow Raiser -- Frontalis (pars medialis)
- 2 Outer Brow Raiser -- Frontalis (pars lateralis)
- 4 Brow Lowerer -- Corrugator supercilii, Depressor supercilii
- 5 Upper Lid Raiser -- Levator palpebrae superioris
- 6 Cheek Raiser -- Orbicularis oculi (pars orbitalis)
- 7 Lid Tightener -- Orbicularis oculi (pars palpebralis)
- 9 Nose Wrinkler -- Levator labii superioris alaeque nasi
- 10 Upper Lip Raiser -- Levator labii superioris
- 11 Nasolabial Deepener -- Zygomaticus minor
- 12 Lip Corner Puller -- Zygomaticus major
- 13 Cheek Puffer -- Levator anguli oris (also known as Caninus)
- 14 Dimpler -- Buccinator
- 15 Lip Corner Depressor -- Depressor anguli oris (also known as Triangularis)
- 16 Lower Lip Depressor -- Depressor labii inferioris
- 17 Chin Raiser -- Mentalis
- 18 Lip Puckerer -- Incisivii labii superioris and Incisivii labii inferioris
- 20 Lip Stretcher -- Risorius with platysma
- 21 Neck Tightener
- 22 Lip Funneler -- Orbicularis oris
- 23 Lip Tightener -- Orbicularis oris
- 24 Lip Pressor -- Orbicularis oris
- 25 Lips Part -- Depressor labii inferioris, or relaxation of Mentalis or Orbicularis oris
- 26 Jaw Drop -- Masseter, relaxed Temporalis and internal pterygoid
- 27 Mouth Stretch -- Pterygoids, Digastric
- 28 Lip Suck -- Orbicularis oris
- 31 Jaw Clencher
- 38 Nostril Dilator
- 39 Nostril Compressor
- 43 Eyes Closed -- Relaxation of Levator palpebrae superioris; Orbicularis oculi (pars palpebralis)
- 45 Blink -- Relaxation of Levator palpebrae superioris; Orbicularis oculi (pars palpebralis)
- 46 Wink -- Relaxation of Levator palpebrae superioris; Orbicularis oculi (pars palpebralis)
Action Descriptors
- 19 Tongue Out
- 29 Jaw Thrust
- 30 Jaw Sideways
- 32 Lip Bite
- 33 Cheek Blow
- 34 Cheek Puff
- 35 Cheek Suck
- 36 Tongue Bulge
- 37 Lip Wipe
- 51 Head turn left
- 52 Head turn right
- 53 Head up
- 54 Head down
- 55 Head tilt left
- 56 Head tilt right
- 57 Head forward
- 58 Head back
- 61 Eyes turn left
- 62 Eyes turn right
- 63 Eyes up
- 64 Eyes down
- 65 Walleye
- 66 Cross-eye
See also
- Microexpression
- Facial feedback hypothesis
- Blink (book), a bestseller that includes a section on FACS
References
- ^ P. Ekman and W. Friesen. Facial Action Coding System: A Technique for the Measurement of Facial Movement. Consulting Psychologists Press, Palo Alto, 1978.
- ^ Hjortsjö, C. H. (1970). Man's Face and Mimic Language. Malmö: Nordens Boktryckeri. Swedish original: Människans ansikte och mimiska språket (1969). Malmö: Studentlitteratur.
- ^ Hager, Joseph C.; Ekman, Paul; Friesen, Wallace V. (2002). Facial action coding system. Salt Lake City, UT: A Human Face. ISBN 0-931835-01-1.
- ^ Del Giudice M, Colle L (2007). "Differences between children and adults in the recognition of enjoyment smiles". Developmental Psychology 43 (3): 796–803. PMID 17484588.
- ^ Facial Action Coding System. Retrieved July 21, 2007.
- ^ Reed LI, Sayette MA, Cohn JF (2007). "Impact of depression on response to comedy: A dynamic facial coding analysis". Journal of Abnormal Psychology 116 (4): 804–9. PMID 18020726.
- ^ Lints-Martindale AC, Hadjistavropoulos T, Barber B, Gibson SJ (2007). "A Psychophysical Investigation of the Facial Action Coding System as an Index of Pain Variability among Older Adults with and without Alzheimer's Disease". Pain Medicine 8 (8): 678–89. PMID 18028046.
- ^ Parr LA, Waller BM, Vick SJ, Bard KA (2007). "Classifying chimpanzee facial expressions using muscle action". Emotion 7 (1): 172–81. PMID 17352572.