Estée Lauder have launched a new app in the UK that helps visually impaired users apply makeup. Applying makeup is challenging for people with vision loss, and this app has the potential to give many of them more confidence.

Powered by the company’s augmented reality and artificial intelligence capabilities and developed using machine learning, the first-of-its-kind app, called the Voice-enabled Makeup Assistant (VMA), analyses the makeup on a user’s face to assess uniformity, boundaries of application and coverage. Audio feedback will identify, for example, whether a user’s bronzer, foundation or lipstick is uneven, and offer descriptions of the specific areas that could be touched up, waiting for the user to make adjustments before scanning again. The user can customise the speed of the voiceover and change the voice using the accessibility settings on their device. The VMA app is free to download and use, and will detect any makeup — not just ELC brands.
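The scan-and-adjust loop described above — analyse the face, flag uneven areas, describe them aloud, then wait and rescan — can be sketched in a few lines. This is purely an illustrative mock-up, not ELC's actual implementation: the region names, coverage scores and tolerance threshold are all hypothetical assumptions standing in for the app's computer-vision analysis.

```python
# Hypothetical sketch of a VMA-style feedback loop.
# Names and thresholds are illustrative assumptions, not ELC's API.

def analyse_regions(coverage):
    """Flag face regions whose simulated coverage score deviates from target."""
    TARGET, TOLERANCE = 1.0, 0.15  # assumed values for illustration
    return {region: value for region, value in coverage.items()
            if abs(value - TARGET) > TOLERANCE}

def describe(flags):
    """Turn flagged regions into the kind of audio feedback the article describes."""
    if not flags:
        return "Your makeup looks even."
    return "; ".join(
        f"{region}: {'too light' if value < 1.0 else 'too heavy'}, touch up suggested"
        for region, value in flags.items())

# One simulated scan: left cheek under-applied, other regions within tolerance.
scan = {"left cheek": 0.7, "right cheek": 1.02, "lips": 1.1}
feedback = describe(analyse_regions(scan))
print(feedback)  # only the left cheek falls outside the tolerance band
```

In the real app this feedback would be spoken by the device's screen-reader voice, and the loop would repeat after the user makes their adjustments.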

It is great to see that ELC have not restricted the app to their own products, and that they have worked with the visually impaired community and adapted to the feedback provided.

ELC’s IT team researched a range of variables when developing the app, from its name to the speed and tone of the voice commands. Prior to release, the app was tested by users of a variety of ethnicities and with different types of visual impairment. “When we thought about voice selection, we thought the best would be a humanistic voice — something that sounds really realistic. When we actually did the research, what was most important was familiarity with a voice they were already using for a screen reader or Siri or an accessibility setting they had,” Rastogi says. That’s why speaking to members of the community was essential when developing the app, she says: to validate the assumptions being made and correct them according to people’s experiences and preferences.