Image Source: Wikimedia Commons

Can AI Be a Fair Teacher? New Study Raises Big Questions

By Norah Rami for Chalkbeat

  • About a third of teachers report using AI at least weekly, including to develop goals for Individualized Education Programs (IEPs)
  • Black students are already subject to higher rates of suspension than their white counterparts in schools

Asked to generate intervention plans for struggling students, AI teacher assistants recommended more punitive measures for hypothetical students with Black-coded names and more supportive approaches for students the platforms perceived as white, a new study shows. These findings come from a report on the risks of bias in artificial intelligence tools published Wednesday by the non-profit Common Sense Media. Researchers specifically sought to evaluate the quality of AI teacher assistants such as MagicSchool, Khanmigo, Curipod, and Google Gemini for Education that are designed to support classroom planning, lesson differentiation, and administrative tasks, Chalkbeat reports.


Why This Matters: Common Sense Media found that while these tools could help teachers save time and streamline routine paperwork, AI-generated content could also promote bias in lesson planning and classroom management recommendations. School districts across the country have been working to implement comprehensive AI policies to encourage informed use of these tools. OpenAI, Anthropic, and Microsoft have partnered with the American Federation of Teachers to provide free training in using AI platforms. The Trump Administration has also encouraged greater AI integration in the classroom. However, recent AI guidelines released by the U.S. Department of Education have not directly addressed concerns about bias within these systems.

About a third of teachers report using AI at least weekly, according to a national survey conducted by the Walton Family Foundation in cooperation with Gallup. A separate survey conducted by the research organization Rand found that teachers specifically report using these tools to help develop goals for Individualized Education Programs, or IEPs. They also say they use these tools to shape lessons or assessments around those goals, and to brainstorm ways to accommodate students with disabilities.


Common Sense Media identified AI tools that can generate IEPs and behavior intervention plans as high risk due to their biased treatment of students in the classroom. Using MagicSchool’s Behavior Intervention Suggestions tool and Google Gemini’s “Generate behavior intervention strategies” tool, Common Sense Media’s research team ran the same prompt about a student who struggled with reading and showed aggressive behavior 50 times using white-coded names and 50 times using Black-coded names, evenly split between male- and female-coded names.

The AI-generated plans for the students with Black-coded names didn’t all appear negative in isolation. But clear differences emerged when those plans from MagicSchool and Gemini were compared with plans for students with white-coded names. The report warns that novice teachers may be more likely to rely on AI-generated content without the experience to catch inaccuracies or biases.


Black students are already subject to higher rates of suspension than their white counterparts in schools, and they are more likely to receive harsher disciplinary consequences for subjective reasons, like “disruptive behavior.” Machine learning algorithms replicate the decision-making patterns found in the data they are trained on, which can perpetuate existing inequalities. A separate study found that AI tools replicate existing racial bias when grading essays, assigning lower scores to Black students than to Asian students.

CBX Vibe: “Everybody’s Gotta Learn Sometime” Beck


Welcome to CultureBanx, where we bring you fresh business news curated for hip hop culture!