<img height="1" width="1" style="display:none" src="https://www.facebook.com/tr?id=703753234704129&amp;ev=PageView&amp;noscript=1">
Oct 28-30, 2024 | Arlington VA
Oct 28-30, 2024 | Arlington VA
Exploring Innovative Use Cases and Future Possibilities with ReadSpeaker

Exploring Innovative Use Cases and Future Possibilities with ReadSpeaker

As ReadSpeaker continues to push the boundaries... Read More

Empowering Developers with GitHub Copilot

Empowering Developers with GitHub Copilot

Software development is pivotal in driving... Read More

Decoding AI Tools in Software Development

Decoding AI Tools in Software Development

Integrating AI into development processes isn't... Read More

Save the Date

VOICE & AI | Oct 28-30, 2024 | Arlington, VA

Add to Calendar

Bringing Beauty to All: The Estée Lauder Companies' Voice-Enabled Makeup Assistant

Posted by Modev Staff Writers on Apr 5, 2024 12:17:55 PM

In the world of beauty, accessibility is often an afterthought. However, The Estée Lauder Companies is changing the narrative by introducing an innovative tool that promises to make beauty routines more inclusive. Robin Shyam, a product manager in Inclusive Tech at The Estée Lauder Companies, spearheads the development of a voice-enabled makeup assistant (VMA) designed to empower the blind and low-vision community with independence in their beauty routines.

She joined us at VOICE & AI 2023 to present a tool that marries inclusive design with the world of beauty.

Inclusive is Beautiful

The concept of inclusivity in design is not new. It's exemplified by the ubiquitous curb cut, a simple dip in the sidewalk that, while initially intended for wheelchair users, has proven universally beneficial. This principle of universal design is at the heart of The Estée Lauder Companies' approach as they seek to serve a diverse consumer base across 150 territories and countries.

The VMA is a mobile application that uses voice input and output to guide users through their makeup application. It's a tool that initially caters to the blind and visually impaired but has the potential to assist a much wider audience. The app leverages artificial intelligence (AI) and immersive reality to identify makeup on the user's face and provide feedback on symmetry and coverage. It currently supports foundation, eyeshadow, and lipstick, with plans to expand its capabilities.
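
To make the idea concrete, here is a minimal Swift sketch of how the kind of scan result described above might be modeled. The types, names, and fields (Product, RegionFeedback, ScanResult) are hypothetical illustrations, not part of The Estée Lauder Companies' actual implementation.

```swift
// Hypothetical model of a single scan's output; the names and fields are
// illustrative only and are not taken from the actual app.
enum Product: String {
    case foundation, eyeshadow, lipstick
}

struct RegionFeedback {
    let region: String       // e.g. "upper lip" or "left cheek"
    let issue: String        // e.g. "coverage extends beyond the lip line"
    let suggestion: String   // spoken correction, e.g. "wipe off the excess lipstick"
}

struct ScanResult {
    let product: Product
    let isSymmetric: Bool
    let feedback: [RegionFeedback]  // empty when the application looks even
}
```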


The journey to creating the VMA began with extensive research. The Estée Lauder team discovered that many blind and low-vision individuals apply makeup, take a selfie, and then send it to a trusted friend or family member for feedback. This process, while effective, fostered a sense of reliance and insecurity. The VMA was designed to replace this reliance with a tool that promotes independence and confidence.

When Seeing Isn't Required to Believe

The VMA's functionality is best demonstrated through the experiences of two blind users, Hazal Bay Basin and Clara Sisk, who show how an AI-powered application can make beauty routines genuinely accessible to the blind and low-vision community.


Hazal Bay Basin, featured as the face of the application, uses the VMA to check her lipstick application. The app provides her with auditory feedback, informing her that she has applied lipstick beyond the border of her lip line on the upper lip. It then suggests that she consider wiping off the excess lipstick around the outline of her lips. This type of feedback is crucial for users like Hazal, as it allows them to make corrections independently without needing a sighted person's assistance.

Similarly, Clara Sisk, a beauty influencer and content creator, showcases the app's capabilities by using it to check her entire makeup routine, including foundation, eyeshadow, and lipstick. The VMA guides Clara through the process, providing feedback on her foundation application by detecting uneven coverage on the left and right sides of her face. It then suggests she blend the foundation in these areas for a more even application.


When Clara applies eyeshadow, the app instructs her to close her eyes and tip her head back slightly to avoid shadows, ensuring an accurate scan of her eyelids. For her lipstick application, the VMA detects five areas that need review, including extra coverage beyond the lip line and corners. It then waits for Clara to touch up her makeup before scanning again to confirm that the application looks fabulous.


The VMA's feedback loop is a critical feature that continues until the makeup application is perfect. It provides real-time, step-by-step guidance, allowing users to correct their makeup application as they go. This loop fosters a sense of independence and confidence and ensures users can achieve their desired look without relying on others.
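
The snippet below sketches what such a scan-and-correct loop could look like, building on the hypothetical types from the earlier sketch; the MakeupScanner protocol and the speak and waitForTouchUp callbacks are assumptions made for illustration, not the app's real API.

```swift
// Hypothetical scanner interface; the app's real AI pipeline is not public.
protocol MakeupScanner {
    func scan(for product: Product) -> ScanResult
}

// Repeats scan -> spoken feedback -> touch-up until no issues remain.
func guideApplication(of product: Product,
                      using scanner: MakeupScanner,
                      speak: (String) -> Void,
                      waitForTouchUp: () -> Void) {
    while true {
        let result = scanner.scan(for: product)
        if result.feedback.isEmpty {
            speak("Your \(product.rawValue) looks fabulous.")
            return
        }
        for item in result.feedback {
            speak("\(item.issue) on the \(item.region). \(item.suggestion)")
        }
        waitForTouchUp()  // the user touches up, then the app scans again
    }
}
```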


Beauty is in the Voice of the Beholder


The Estée Lauder Companies' VMA is a testament to AI's potential in creating inclusive beauty experiences. By providing auditory feedback and practical suggestions for correction, the app empowers individuals who are blind or have low vision to engage with their beauty routines confidently and independently. The VMA's intuitive design and user-centric approach make it a pioneering tool in the beauty industry, setting a new standard for accessibility and inclusivity.

User feedback has been integral to the VMA's development. Contrary to the team's initial expectation that a humanistic voice would be preferred, users preferred the familiarity of their existing screen-reader voices, which tend to sound more robotic. As a result, the VMA's default voice setting matches the user's existing screen reader or Siri preferences.
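
On iOS, respecting that preference can be as simple as speaking through Apple's public speech synthesis API with the device's current default voice instead of shipping a custom branded one. The snippet below is a generic sketch of that idea using AVSpeechSynthesizer; it is not a description of the VMA's actual code.

```swift
import AVFoundation

// Speak feedback with the device's default voice for the user's current
// text-to-speech language, rather than forcing a custom "humanistic" voice.
let synthesizer = AVSpeechSynthesizer()

func speakFeedback(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(
        language: AVSpeechSynthesisVoice.currentLanguageCode())
    synthesizer.speak(utterance)
}

speakFeedback("You have applied lipstick beyond the border of your lip line on the upper lip.")
```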


The Estée Lauder Companies' commitment to inclusivity is also reflected in the app's pricing—it's free, making it accessible in every sense of the word. The app launched in the UK and Ireland and is now available in the USA, with plans to expand to Android devices.

The VMA is a testament to the power of inclusive design. It's not just for the blind and low-vision community; it has the potential to assist anyone who wants to ensure their makeup is applied evenly. Makeup artists could even use it to create templated looks for clients. And as the app continues to evolve, The Estée Lauder Companies is committed to adding new features based on ongoing user interviews. The app is available for download on iOS, with an Android version planned, inviting users to experience the future of accessible beauty.

Wrapping Up


The Estée Lauder Companies' initiative is a call to action for all digital product creators to consider accessibility in their designs. By co-creating with the communities they serve, they can ensure that their products are useful and empowering. When you design for one, you design for all.

The VMA is more than just an app; it's a movement towards a more inclusive beauty industry. It acknowledges that beauty is not one-size-fits-all and that technology can be a powerful ally in making beauty accessible to everyone. The Estée Lauder Companies' voice-enabled makeup assistant is a pioneering step in the right direction, and it's exciting to think about the possibilities it opens up for individuals worldwide.

With continued development and user feedback, the VMA is poised to revolutionize how we think about beauty and accessibility. It's not just about looking good; it's about feeling confident and self-reliant, regardless of one's abilities. The future of beauty is inclusive, and The Estée Lauder Companies is leading the charge.
