Polonetsky: Are the Online Programs Your Child’s School Uses Protecting Student Privacy? Some Things to Look For

As CEO of a global data protection nonprofit, I spend my workdays focused on helping policymakers and companies navigate new technologies and digital security concerns that have emerged in the wake of the COVID-19 pandemic.

Meanwhile, my children have adopted many of these technologies and are participating in online learning via Zoom and dozens of other platforms and apps — some of which have sparked serious concerns about student privacy and data security in the classroom.

These things are not contradictory. Here’s why.

Specific laws have been put in place to protect especially sensitive types of data. Your doctor uses services that safeguard your health information, and your bank relies on technology vendors that agree to comply with financial privacy laws.

Similarly, as the use of technology in the classroom skyrocketed in the past decade, federal and state laws were established that require stringent privacy protections for students.

To comply, many general consumer companies like Google, Apple and Microsoft developed education-specific versions of their platforms that include privacy protections that limit how they will use student information. School districts set up programs to screen ed tech software, even though few of the new laws came with funding.

But many of these federal and state protections apply only to companies whose products are designed for schools, or if schools have a privacy-protective contract with vendors. As schools rushed to provide distance learning during their coronavirus shutdowns, some of the tools adopted were not developed for educational environments, leaving children’s data at risk for sale or marketing uses.

If your child’s school has rolled out new technology platforms for online learning, there are important steps you can take to determine whether the tool includes adequate safeguards to protect student privacy. First, ask whether your school has vetted the company or has a contract in place that includes specific limitations on how student information can be used. Don’t hesitate to ask your child’s teacher to explain what data may be collected about your child and how it will be used — you have a right to this information.

Second, check to see if the company has signed the Student Privacy Pledge, which asks companies that provide technology services to schools to commit to a set of 10 legally binding obligations. These include not selling students’ personal information and not collecting or using students’ personal information beyond what is needed for the given educational purposes. More than 400 education technology companies have signed the pledge in recent years, so this can be a quick resource for identifying businesses that have demonstrated a commitment to ensuring that student data are kept private and secure.

Most importantly, take time to review each program’s privacy settings with your child and have an honest discussion about behavior online. Even the strictest privacy controls can’t always prevent a student from disrupting class by making racist remarks in the chat or by sharing the class link or log-in credentials with outsiders. I hate to load another burden on parents who are trying to work from home, but making sure your kid isn’t an online troll is partly on you.

Now more than ever, we are relying on technology to keep in touch with work, school, and friends and family. It hasn’t been — and will never be — perfect. Policymakers can help schools ensure that the technologies they use meet privacy and security standards by providing the resources for schools to employ experts in those fields.

As we all try to adjust to this new normal, we should embrace technologies that can add value to students’ educational experience, enhance our ability to work remotely and help us stay connected. But we must first make sure the appropriate safeguards are in place so privacy and security don’t fall by the wayside.

Jules Polonetsky is the CEO of the Future of Privacy Forum, a Washington, D.C.-based nonprofit that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Previously, he served as chief privacy officer at AOL and DoubleClick, as New York City consumer affairs commissioner, as a New York state legislator, as a congressional staffer and as an attorney.
