AI Teacher Assistants Are Useful but Can Pose Risks in Classroom, Report Finds

Study of 4 artificial intelligence tools shows they can create misleading or biased materials for students — & do particular harm to special ed kids.

The 74

Popular artificial intelligence teacher assistant tools like Google Gemini and MagicSchool can increase productivity for educators but have the potential to cause harm in the classroom, according to a new risk assessment report from Common Sense Media.

The nonprofit evaluated four AI platforms teachers commonly use in their day-to-day work and found they pose a moderate risk to students and educators. The report found that the tools can act as “invisible influencers” in student learning and promise to create critical documents for special education students even though they lack essential data.

The tools evaluated were Khanmigo — affiliated with Khan Academy — MagicSchool, Curipod and Gemini for Google Classroom.

“There’s no doubt that these tools are popular and that they save teachers time,” said Robbie Torney, Common Sense Media’s senior director of AI programs. “That’s where some of the risks come in — when you’re thinking about teachers using them without oversight.”

These generative AI tools are designed to help with lesson planning, grading, communication and administrative tasks. Unlike chatbots like ChatGPT, they are built specifically for classroom use and promise to save teachers time while improving student outcomes, according to the report.

Nearly two-thirds of teachers used artificial intelligence during the 2024-25 school year, saving up to six hours of work per week, according to a recent Gallup survey.

But that benefit comes with risks — when left unchecked, these tools can interfere with learning without teachers realizing it, the researchers found. 

The tools make it too easy to funnel content directly to students without review, the report says. Responding to teacher prompts, they can automatically create slide presentations that look professional but may include inappropriate material. The AI teacher assistants can also be “invisible influencers” — presenting biased or inaccurate viewpoints that reinforce harmful stereotypes.

For example, when asked about the debunked claim of “Haitian immigrants eating pets in Ohio,” MagicSchool and Khanmigo didn’t point out that the information was false. Instead, the tools suggested classroom lessons that explored how economic conditions could be connected to Haitians’ survival strategies and food insecurity. 

Three of the four AI teacher assistant platforms advertise their ability to help with individualized education programs or behavior plans for special education students. But Torney said features like an IEP generator are some of the most concerning.

“Anybody who’s ever participated in an IEP meeting knows there’s so much information that goes into generating an IEP — observational data, testing data, conversations with the student, the parent, the teaching team,” he said. “You can generate a student’s IEP with these tools with very little data.”

When Common Sense Media testers asked Google Gemini and MagicSchool to create behavior plans for 50 white students and 50 Black students, the tools gave different suggestions based on race. The platforms gave white students more positive and less critical suggestions for their behavior plans than Black students.

Teacher AI assistants are best used to supplement educator expertise instead of replacing it, according to the report. 

Earlier this year, a Tennessee school district partnered with Curipod to help teachers efficiently address individual students’ learning needs. One district administrator told WJHL-TV that the platform analyzed student answers on assignments and gave personalized feedback “in about five seconds.”

Chicago Public Schools utilizes Google Gemini to review curriculum, while Miami-Dade County Public Schools uses it to create quizzes and provide students with on-demand support, such as step-by-step explanations.

These tools lack key capabilities, such as knowing how to teach effectively, recognizing inaccuracies and tailoring material to individual student needs, the report says. But when teachers provide the right context and inputs, AI assistants can generate helpful information.

The report recommends that school and district administrators create clear policies and provide teacher training to help incorporate AI into the classroom. Assistant tools should also be chosen carefully and come with a review process for evaluating their quality. 

The Gallup survey, published in June, found that 68% of teachers didn’t receive training on how to use AI tools during the 2024-25 school year. Roughly half of them taught themselves how to use the technology.

“One of our key messages to schools is: You don’t have to have a perfect policy, but you do need to start giving clear guidance to students and to teachers about what they can and can’t use AI for,” Torney said. “If I was still a teacher, I would absolutely want to be using some of these things, because there’s a huge upside. But you can’t just be using them without thinking critically about some of the potential challenges associated with them.”
