Welcome to Reed Smith's viewpoints — timely commentary from our lawyers on topics relevant to your business and wider industry.
3 minute read

Always in Season: Luxury, Fashion, and the Law - Beauty Tech & Privacy

The beauty industry is always pushing the boundaries of cutting-edge innovation, and consumers today encounter novel AI-driven tools – try-on features offered via apps or virtual reality, smart mirrors, wearables, chatbots and other solutions – while trying on lipstick, mascara, moisturizers and other products. But technology is not the only aspect of the beauty business on the rise; so is the regulation of privacy. In this article, we look at some of the key privacy considerations in the UK, EU and US that come into play when technology is integrated into cosmetics and beauty products.

The legal landscape

In the UK and EU, the General Data Protection Regulation (GDPR) is by now a mature regulation that most businesses are familiar with, setting out various rules on the processing of personal data. Although it was designed as technology-neutral legislation, updated guidance and new applications continue to appear as innovations and novel uses of data emerge. Where electronic marketing, cookies or similar technologies are deployed, the ePrivacy Directive (as implemented under local laws) must also be considered.

The United States has seen an influx of comprehensive state privacy laws in the last five years, which can be a complicated labyrinth to navigate, especially for those in the beauty space who collect all forms of personal information – from the mundane, such as email addresses, to the more sensitive, such as gender identity, medical conditions or prescriptions. As in the EU/UK, the definition of personal information is broad, and there are increasing standards for data minimization in states like Maryland and California.

Key considerations

Some key considerations in the context of beauty tech are set out below, but bear in mind that this is just the tip of the iceberg: the privacy considerations in this space continue to evolve, depending on the jurisdictions involved, the nature of the data and the type of technology and consumer experience.


  • Is it really personal data anyway? Privacy laws mostly kick in only when data is personal data – meaning that it identifies, directly or indirectly, an individual. Beauty tech can collect personal data in obvious ways, such as when an individual completes an online form or provides a photo or video for skin analysis. However, care must be taken with personal data collection that may be less obvious. For example, an AI-based solution is likely to have collected personal data from third parties for training purposes, or a device may not directly ask for, say, a name but may still collect device information and location information that could still be considered personal data.
  • When does data become “sensitive data”? Beauty tech solutions may easily trip into processing sensitive data such as ethnicity, biometric or health information, which is then subject to more stringent regulatory requirements. For example, a virtual reality makeup service should be capable of being designed in a way that does not involve any of these categories, but careful privacy by design would be needed to ensure that this really is the case. In the United States, beauty tech providers should maintain a list of what is considered “sensitive” information across all the states with comprehensive privacy laws, including the requirements for biometric data. If a provider is collecting any form of sensitive information, it must know the consent requirements for collection, the limitations on use, the requirements for retention, and whether the company will be required to conduct a data protection impact assessment.
  • Will teenagers and children be using the technology? If so, in the United States, you should consider US state laws that require “opt-in” consent for targeted advertising to teens. This can be more burdensome than the “opt-out” model that applies to advertising that targets adults. Also, consider whether you will need an age gate to prevent children under the age of 13 from accessing the platform. In Europe, the processing of children’s data (under 18) is a hot topic, and specific care with such data is necessary – some activities will likely not be permitted (for example, use of their sensitive data for targeted advertising). A data protection impact assessment should be undertaken.
  • Will you have a secondary purpose for the personal information? Consumers will understand why they are providing some information – for example, an eyebrow liner virtual “try-on” application may require a facial scan. But consumers may not appreciate any further use of that facial scan for AI training – for example, skin color analysis or the gathering of information on the age of consumers who use the product for marketing. Any secondary use of personal information must be considered, and in some cases, explicit consent will be required from the consumer prior to use.
  • What are the key compliance steps? Once any personal data has been identified, a data protection impact assessment is likely to be required or at least is highly recommended. A lawful basis will be required, and data subjects will need to be informed about how their personal data is used and about their rights through a privacy notice or a similar form or message – not just in a written linked document, but at the time that they are using and interacting with the tech solution. Working with security and tech specialists to protect the security of the data will be essential.
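The age-gate point above can be sketched in code. The following is a minimal, hypothetical illustration only – not legal advice and not a production design: it classifies a self-declared birth date into tiers reflecting the under-13 block and the teen opt-in model for targeted advertising discussed above. The function names and tier labels are our own, and a real deployment would need an appropriate age-assurance method rather than self-declaration alone.

```python
from datetime import date

def age_on(birth_date: date, today: date) -> int:
    """Compute age in whole years as of `today`."""
    years = today.year - birth_date.year
    # Subtract one year if this year's birthday hasn't occurred yet.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def age_gate(birth_date: date, today: date) -> dict:
    """Hypothetical access tiers: block under-13 users, require opt-in
    consent for targeted ads for teens, default to an opt-out model
    for adults."""
    age = age_on(birth_date, today)
    if age < 13:
        return {"access": False, "targeted_ads": "blocked"}
    if age < 18:
        return {"access": True, "targeted_ads": "opt-in required"}
    return {"access": True, "targeted_ads": "opt-out model"}
```

The exact thresholds and consent mechanics vary by state, so in practice the tier logic would be driven by the jurisdictions in scope rather than hard-coded as here.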

Tags

always in season, luxury, data protection, beauty, ai, privacy, ccpa, gdpr, emerging technologies, entertainment & media