A previous blog post discussed FTC Chairwoman Slaughter’s first priority as the newly designated chairwoman – the COVID-19 pandemic. The FTC’s second priority, racial equity, can be broken down into two sub-issues. First, the FTC plans to investigate biased and discriminatory algorithms that target vulnerable communities. As the FTC acknowledges, the analysis of data can help companies and consumers, “as it can guide the development of new products and services, predict the preferences of individuals, help tailor services and opportunities, and guide individualized marketing.” Nonetheless, the FTC cautions companies to weigh the following considerations before making decisions based on the results of big data analysis.
On January 21, 2021, President Biden designated Federal Trade Commission (the “FTC”) Commissioner Rebecca Kelly Slaughter as acting chair of the FTC. Soon thereafter, in one of her first speeches in her new role, Chairwoman Slaughter announced two substantive areas of priority for the FTC – the COVID-19 pandemic and racial equity.
Earlier this month, the FTC sent a letter to Wildec, LLC, the Ukraine-based maker of several mobile dating apps, alleging that the apps were collecting the personal information and location data of users under the age of 13 without first obtaining verifiable parental consent or otherwise complying with the Children’s Online Privacy Protection Act (COPPA). The letter pressed the operator to delete personal information collected from children (and thereafter comply with COPPA by obtaining parental consent before allowing minors to use the apps) and to disable any search functions that allow users to locate minors. The letter also advised that the practice of allowing children to create public dating profiles could be deemed an unfair practice under the FTC Act. Following the FTC’s allegations, the three dating apps in question were removed from Apple’s App Store and Google’s Google Play Store – a demonstration of the real-world effects of mere FTC allegations, and a response that might ultimately compel Wildec, LLC to comply with the statute (and cause other mobile apps to reexamine their own data collection practices). Wildec has responded to the FTC’s letter by “removing all data from under age accounts” and now prevents minors under the age of 18 from registering on the dating apps.
The past few years have seen exponential growth in the use of technology in the classroom, with applications ranging from the increased availability and use of e-books to the displacement of physical classrooms through Massive Open Online Courses (also known as MOOCs). One of the fastest growing segments of the education technology market relates to online educational services and applications, which are designed to track individual student progress and use the data gathered to deliver an individualized learning experience to each user. However, while online educational services and applications hold significant potential, the gathering of massive amounts of data has also sparked fears about what data will be collected, from whom, how it will be used, and whether it will ever be deleted. This fear is especially prevalent when it comes to online educational services and applications targeted at children.
On January 23, 2015, Senior Attorney Lesley Fair at the Federal Trade Commission (“FTC”) published a post on the agency’s business blog clarifying how the Children’s Online Privacy Protection Act (“COPPA”) applies to schools. COPPA seeks to protect the privacy of children by allowing parents to control what personal information about their children under the age of thirteen may be collected by “operators” of websites or online services, including apps, that are either directed to children or that knowingly collect personally identifiable information from children. Subject to certain regulatory exceptions, the entities covered by COPPA must notify parents and obtain consent before collecting, using, or disclosing any personal information from children under thirteen.
A substantial rise in schools’ use of online educational technology products has made educators increasingly reliant on these products to develop their curricula, deliver materials to students in real time, and monitor students’ progress and learning habits through the collection of data by third-party cloud computing service providers. Unfortunately, with these advances come the data security concerns that go hand-in-hand with cloud computing – such as data breaches, hacking, spyware, and the potential misappropriation or misuse of sensitive personal information. With the Family Educational Rights and Privacy Act (FERPA) – federal legislation enacted to safeguard the privacy of student data – in place for four decades, the education sector is ripe for new standards and guidance on how to protect students’ personal information in the era of cloud computing. California has tackled this issue head on, with the passage of two education data privacy bills by its legislature on August 30, 2014. Senate Bill 1177 and Assembly Bill 1442 (together, the Student Online Personal Information Protection Act (SOPIPA)) create privacy standards for K-12 school districts that rely on third parties to collect and analyze students’ data, and require that student data managed by outside companies remain the property of those school districts and stay within school district control.
Two and a half years after initiating a review of the Children’s Online Privacy Protection Rule (the “Rule”), the Federal Trade Commission (FTC) announced on December 19, 2012 that the Rule will be amended to clarify perceived ambiguities and to strengthen the Rule’s protections for children who engage in online…