A database of facial expressions in younger, middle-aged, and older women and men


What is FACES?

FACES is a set of images of naturalistic faces of 171 young (n = 58), middle-aged (n = 56), and older (n = 57) women and men displaying each of six facial expressions: neutrality, sadness, disgust, fear, anger, and happiness. The FACES database was developed between 2005 and 2007 by Natalie C. Ebner, Michaela Riediger, and Ulman Lindenberger at the Center for Lifespan Psychology, Max Planck Institute for Human Development, Berlin, Germany.

The database comprises two sets of pictures per person and per facial expression (set a vs. set b), resulting in a total of 2,052 images. A subset of 72 pictures is publicly available without registering for a personal account. Research-related publication and display of the publicly available FACES objects are permitted for the purpose of illustrating research methodology. If you plan to use these objects for such purposes, however, please register for FACES and send the FACES Platform Release Agreement (with a short description of how you plan to use the publicly available objects) to the FACES Technical Agent.

For detailed information about the development and validation of the database, see Ebner, N. C., Riediger, M., & Lindenberger, U. (2010). FACES—A database of facial expressions in young, middle-aged, and older women and men: Development and validation. Behavior Research Methods, 42(1), 351-362. https://doi.org/10.3758/BRM.42.1.351. Please cite this publication whenever you use objects from the FACES platform.

In a subsequent validation study, a total of 154 young (n = 52), middle-aged (n = 51), and older (n = 51) women and men rated the faces in terms of facial expression, perceived age, attractiveness, and distinctiveness. These picture-specific normative ratings can be downloaded here:

The first two dimensions (facial expression and perceived age) are described in Ebner, N. C., Riediger, M., & Lindenberger, U. (2010). FACES—A database of facial expressions in young, middle-aged, and older women and men: Development and validation. Behavior Research Methods, 42(1), 351-362. https://doi.org/10.3758/BRM.42.1.351. More detailed descriptions of the latter two dimensions are available in Ebner, N. C., Luedicke, J., Voelkle, M. C., Riediger, M., Lin, T., & Lindenberger, U. (2018). An adult developmental approach to perceived facial attractiveness and distinctiveness. Frontiers in Psychology, 9, Article 561. https://doi.org/10.3389/fpsyg.2018.00561.

Moreover, to provide picture-specific normative data on perceived trustworthiness, 199 adults (aged 22-78 years) rated the trustworthiness of each face in the FACES database. For details see Pehlivanoglu, D., Lin, T., Lighthall, N. R., Heemskerk, A., Harber, A., Wilson, R., Turner, G. R., Spreng, R. N., & Ebner, N. C. (2023). Facial Trustworthiness Perception Across the Adult Lifespan. The Journals of Gerontology, Series B: Psychological Sciences and Social Sciences, 78(3), 434-444. https://doi.org/10.1093/geronb/gbac166. These picture-specific normative trustworthiness ratings can be downloaded here:

Dynamic FACES

Dynamic FACES is an extension of the original FACES database: a set of morphed videos (n = 1,026) of young, middle-aged, and older adults displaying six naturalistic emotional facial expressions (neutrality, sadness, disgust, fear, anger, and happiness). The static images used for morphing came from the original FACES database. Videos were created by transitioning from a static neutral image to a target emotion. Videos are available at 384 x 480 pixels as .mp4 files or at the original size of 1280 x 1600 pixels as .mov files. All videos are accessible to registered FACES users.

For further information about Dynamic FACES see:

Holland, C. A. C., Ebner, N. C., Lin, T., & Samanez-Larkin, G. R. (2019). Emotion identification across adulthood using the Dynamic FACES database of emotional expressions in younger, middle-aged, and older adults. Cognition and Emotion, 33(2), 245-257. https://doi.org/10.1080/02699931.2018.1445981.

Scrambled FACES

All 2,052 images from the original FACES database were scrambled using MATLAB. With the randblock function, the original FACES files were treated as 800x1000x3 matrices (the third dimension holding the RGB values) and partitioned into non-overlapping 2x2x3 blocks. The matrices were then randomly shuffled at the level of these blocks, yielding final images that match the dimensions of the original image and are composed of the same individual pixels, merely rearranged. All scrambled images are 800x1000 JPEG files (96 dpi) and are accessible to registered FACES users.
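The block-shuffling idea behind this procedure can be sketched as follows. This is a minimal, stdlib-only Python illustration of the technique, not the original MATLAB randblock code; the function name `scramble_blocks` and the list-of-rows image representation are assumptions made for the example.

```python
import random

def scramble_blocks(img, bh=2, bw=2):
    """Randomly shuffle non-overlapping bh x bw blocks of an image.

    `img` is a list of rows; each row is a list of pixels (e.g. RGB
    tuples). The output has the same dimensions and contains the same
    pixels as the input, merely rearranged -- the idea behind the
    randblock-based scrambling described above.
    """
    h, w = len(img), len(img[0])
    assert h % bh == 0 and w % bw == 0, "image must tile evenly into blocks"
    # Cut the image into non-overlapping blocks, each a list of bh rows.
    blocks = [[row[x:x + bw] for row in img[y:y + bh]]
              for y in range(0, h, bh)
              for x in range(0, w, bw)]
    # Shuffle the block order, leaving each block's interior intact.
    random.shuffle(blocks)
    # Reassemble the shuffled blocks into an image of the original size.
    out = [[None] * w for _ in range(h)]
    for i, blk in enumerate(blocks):
        y = (i // (w // bw)) * bh
        x = (i % (w // bw)) * bw
        for dy in range(bh):
            out[y + dy][x:x + bw] = blk[dy]
    return out
```

For the actual 800x1000 RGB images, the same operation is naturally expressed as an array reshape-and-shuffle (as randblock does in MATLAB); the sketch above only demonstrates the invariant that dimensions and pixel content are preserved.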


Reference publications about FACES

Pehlivanoglu, D., Lin, T., Lighthall, N. R., Heemskerk, A., Harber, A., Wilson, R., Turner, G. R., Spreng, R. N., & Ebner, N. C. (2023). Facial Trustworthiness Perception Across the Adult Lifespan. The Journals of Gerontology, Series B: Psychological Sciences and Social Sciences, 78(3), 434-444. https://doi.org/10.1093/geronb/gbac166

Holland, C. A. C., Ebner, N. C., Lin, T., & Samanez-Larkin, G. R. (2019). Emotion identification across adulthood using the Dynamic FACES database of emotional expressions in younger, middle-aged, and older adults. Cognition and Emotion, 33(2), 245-257. https://doi.org/10.1080/02699931.2018.1445981

Ebner, N. C., Luedicke, J., Voelkle, M. C., Riediger, M., Lin, T., & Lindenberger, U. (2018). An adult developmental approach to perceived facial attractiveness and distinctiveness. Frontiers in Psychology, 9, Article 561. https://doi.org/10.3389/fpsyg.2018.00561

Voelkle, M. C., Ebner, N. C., Lindenberger, U., & Riediger, M. (2013). Here we go again: Anticipatory and reactive mood responses to recurring unpleasant situations throughout adulthood. Emotion, 13(3), 424-433. https://doi.org/10.1037/a0031351

Voelkle, M. C., Ebner, N. C., Lindenberger, U., & Riediger, M. (2012). Let me guess how old you are: Effects of age, gender, and facial expression on perceptions of age. Psychology and Aging, 27(2), 265-277. https://doi.org/10.1037/a0025065

Riediger, M., Voelkle, M. C., Ebner, N. C., & Lindenberger, U. (2011). Beyond "happy, angry, or sad?": Age-of-poser and age-of-rater effects on multi-dimensional emotion perception. Cognition and Emotion, 25(6), 968-982. https://doi.org/10.1080/02699931.2010.540812

Ebner, N. C., Riediger, M., & Lindenberger, U. (2010). FACES—A database of facial expressions in young, middle-aged, and older women and men: Development and validation. Behavior Research Methods, 42(1), 351-362. https://doi.org/10.3758/BRM.42.1.351


How to use FACES

FACES is freely available for use in scientific research.

Without a user account, only the pictures of six exemplary persons (72 pictures) can be viewed. Full access to this online service and all of its objects requires registration and login.

Researchers can apply for an account on a case-by-case (i.e., person-by-person and study-by-study) basis.


Partners, contact & copyright

The FACES database was conceived between 2005 and 2007 at the Center for Lifespan Psychology, Max Planck Institute for Human Development, Berlin, Germany. Since then, new collections of stimulus objects based on the original FACES pictures have been created and are available via the FACES platform. For detailed information about authorship, please refer to the collection-specific information.

The FACES software is based on imeji and was developed by the Max Planck Digital Library. For questions regarding FACES, please contact the FACES Technical Agent.

Copyright: Max Planck Institute for Human Development, Center for Lifespan Psychology, Berlin, Germany.