Schools and districts across the country are currently grappling with how to vet new education technology tools that incorporate generative AI. Various organizations have created frameworks designed to help schools consider these tools holistically, but they often say little about how to scrutinize privacy compliance. To address this gap, the Future of Privacy Forum’s Review of Generative AI Tools for Use in Schools describes steps schools should consider incorporating into their more general edtech reviews and AI policies.
- Download: Scrutinize the generative AI tools your school uses
- Download: Incorporate generative AI into your school’s app review checklist
Audience
These resources were created for school and district personnel who review education technology tools for privacy compliance. They are also valuable for companies that build edtech tools, as these are the questions schools should be asking edtech vendors.
How to use these resources
This resource assumes that your school or district already has a process in place for reviewing and approving new education technology tools for privacy compliance (referred to here as app review), and that this process (1) accounts for legal requirements and other legal considerations, and (2) can incorporate the following key considerations about what makes tools that use generative AI unique:
- Consider your use cases. Compared to traditional edtech tools, generative AI tools take more open-ended user input and can produce output for any number of tasks, so schools must consider the specific use cases they intend to allow.
- Data collection. Student privacy laws typically cover use cases in which a tool requires a student’s personally identifiable information (PII) as input, or in which the tool’s output becomes part of the student’s record. Many use cases do not require student PII, and schools can pursue them without implicating most student privacy laws. Even so, schools often do not control all the information a tool collects, so they should consider whether they can reduce the risks of data collection or avoid it altogether.
- Transparency and explainability. For tools that use student PII, schools should consider how they meet transparency and explainability requirements for teachers, parents, and students. State privacy laws frequently require schools to disclose information about the student data they share and who receives it. Many edtech companies have created AI transparency pages to better explain what data their tools use and how they make decisions.
- Product improvements. Many generative AI tools rely on large amounts of data to continuously train the underlying models that generate responses. Other tools train the model initially but do not use student data to train it further. Important questions schools should ask include: Will the vendor use student PII to train the model? If so, will any additional products the vendor creates with that model be educational or commercial, and is that additional use permitted under state law?
- Unauthorized disclosure of student PII. If a model is trained on student PII, snippets of that PII may appear in the tool’s future output. Schools need to understand the steps companies are taking to prevent this type of unauthorized disclosure.
- High-risk decision making. Some proposed use cases involve substantive decision-making and may be regulated by long-standing rules or new AI laws. Other uses carry a much higher risk of harm to students, and schools should be cautious in pursuing them. Options schools may consider include allowing these use cases only with parental consent, requiring a human in the loop, or prohibiting the use case altogether.