In late March, the French Data Protection Authority, the Commission Nationale de l’Informatique et des Libertés (“CNIL”), released a model regulation (the “Model Regulation”) governing the use of biometric access controls in the workplace. Unlike many items of personal information, biometric data (such as a person’s face or fingerprints) is unique and, if stolen or otherwise compromised, cannot be changed to avoid misuse. Under Article 9 of the GDPR, biometric data collected “for the purpose of uniquely identifying a natural person” is considered “sensitive” and warrants additional protections, and the GDPR authorizes Member States to implement such protections. Accordingly, the French Data Protection Act 78-17 of 6 January 1978, as amended, now provides that employers – whether public or private – wishing to use biometric access controls must comply with binding model regulations adopted by the CNIL, the first of which is the Model Regulation.
Laura Goldsmith is a partner in the Technology, Media and Telecommunications Group and member of the Privacy & Cybersecurity Group and Life Sciences Group. Her practice focuses on matters in technology, intellectual property, privacy and data protection across a range of industries including life sciences, media, entertainment, sports, sports betting, software, professional and financial services, healthcare, retail, fashion and communications.
Laura structures and negotiates complex technology transactions, such as license agreements, joint development agreements, supply, manufacturing or other services agreements, and software-as-a-service agreements. In particular, she regularly represents life science companies in licensing deals, co-commercialization arrangements, research collaborations, strategic acquisitions, and other transactions.
Laura also counsels clients in navigating compliance with international, federal and state laws related to privacy and data protection in the context of transactions, vendor relationships, internal compliance and external-facing policies. She is an editor of and contributor to Proskauer’s Privacy Law Blog and contributor to the State Privacy Laws and Financial Privacy chapters of the Proskauer on Privacy treatise published by PLI.
Laura is a member of the Proskauer Women’s Alliance Steering Committee and previously served as its co-chair.
Prior to her legal career, Laura worked as a consultant to global pharmaceutical companies formulating drug development strategy and clinical trial design. She also conducted scientific research in pharmacology and biology at Duke University Medical Center and her research has been published in peer-reviewed journals.
While at Boston University School of Law, Laura served as the Editor-in-Chief for the Review of Banking & Financial Law and interned for Judge Kiyo A. Matsumoto of the U.S. District Court for the Eastern District of New York.
Unwanted robocalls reportedly totaled 26.3 billion in 2018, prompting a growing number of consumer complaints to the FCC and FTC and increased legislative and regulatory activity to combat the practice. Some automated calls are beneficial, such as school closing announcements, bank fraud warnings, and medical notifications, and some caller ID spoofing is justified, such as for certain law enforcement or investigatory purposes and domestic violence shelter use. However, consumers have been inundated with spam calls – often with spoofed local area codes – that display fictitious caller ID information or circumvent caller ID technology, either to increase the likelihood that consumers will answer or to otherwise defraud them. To stem the rash of unwanted calls, Congress and federal regulators advanced several measures in 2019, and states have tightened their own telecommunications privacy laws in the past year. For example, within the last week, the Arkansas governor signed into law S.B. 514, which boosts criminal penalties for illegal call spoofing and creates an oversight process for telecommunications providers.
On February 21, 2018, the Securities and Exchange Commission (SEC) issued an interpretive Commission Statement and Guidance on Public Company Cybersecurity Disclosures (the “Guidance”) to assist public companies in meeting their cybersecurity disclosure requirements under the federal securities laws. The Guidance notes that, as reliance on networked systems and the…
This month, the Federal Trade Commission (FTC) issued guidance on privacy and security best practices for health-related mobile apps, such as fitness apps connected with wearables, diet and weight loss apps, and health insurance portals. At the same time, the FTC unveiled an interactive tool designed to direct health app developers to federal laws and regulations that may apply to their apps. The Mobile Health Apps Interactive Tool, which is the product of collaboration among the FTC, the Department of Health and Human Services’ Office of the National Coordinator for Health Information Technology (ONC), the Office for Civil Rights (OCR), and the Food and Drug Administration (FDA), seeks to unify guidance in a space governed by a complicated web of legal requirements. It also signals regulators’ continued focus on the protection of consumer health information in this rapidly evolving space.
Oregon became the first state to adopt the Revised Uniform Fiduciary Access to Digital Assets Act (“Revised UFADAA”) when Governor Kate Brown signed Oregon Senate Bill 1554 into law on March 3, 2016. The law will become effective on January 1, 2017.
Today, one month after the European Court of Justice decision that invalidated the Safe Harbor framework, the European Commission (the “Commission”) issued a Communication setting forth its position on alternative tools for the lawful transfer of personal data from the EU to the United States. The Commission also stated its objective to conclude negotiations with the U.S. government regarding the so-called Safe Harbor 2.0 within three months. This timeline dovetails with the Article 29 Working Party’s grace period, which continues until the end of January 2016.
In a non-binding opinion issued on September 23, 2015, an Advocate General for the European Court of Justice (“ECJ”) recommended that the ECJ suspend the U.S.-EU Safe Harbor program (“Safe Harbor”) and reexamine whether the Safe Harbor provides adequate protection for personal data of EU citizens. In light of its non-binding nature, the opinion did not effect any legal change and the ECJ is free to reject or adopt its recommendations. Nevertheless, the opinion has triggered widespread concerns about the future of the Safe Harbor, due in part to the frequency with which the ECJ follows the recommendations of its advisors.
Connecticut has joined the list of twenty-one states with statutes designed to preserve the privacy of employees’ personal online accounts and to limit the use of information related to such accounts in employment decision-making. Legislation directed to the online privacy of employees has also passed this year in Montana, Virginia, and Oregon, and similar legislation is pending in a number of other states.