Max Bohun is Founder & CEO of GradeWiz, a Y Combinator-backed edtech company featured in TechCrunch’s “10 startups to watch from YC’s W25 Demo Day.”

Education faces a significant challenge: It isn’t personalized enough to meet individual student needs. Overburdened teachers juggle large class enrollments, standardized testing requirements and administrative tasks.

A 2022 survey by the EdWeek Research Center and the Winston School of Education at Merrimack College found that only 46% of a teacher’s time in school buildings is spent teaching. The rest is devoted to things like planning and administrative work, with grading alone taking an average of five hours weekly, which adds up to more than three years of work over a 30-year career.

As a former teaching assistant at Cornell and now as CEO of an edtech company that helps with grading, I’ve experienced firsthand the challenge of providing personalized feedback to students at scale.

The Power Of Personalized Education Models

Consider Oxford’s tutorial system, where students meet with tutors once or twice a week in groups of two or three to discuss course materials and work through questions. This human connection can make learning feel personal and engaging, and it can help students stay on track.

Why should only the most prestigious institutions enjoy this kind of personalized mentorship? With AI automating lower-value tasks, every student can now have access to personalized attention.

AI’s Potential In Education

I think the true potential of AI in education isn’t simply efficiency—it’s transformation. By automating administrative tasks, AI can free educators to do what only humans can do effectively: inspire, mentor and connect. AI can make education more equitable in several key ways:

1. Immediate, Comprehensive Feedback

AI systems can provide feedback on work for every student, regardless of class size, with rapid turnaround. (Disclosure: My company specializes in this, as do others.) This quick feedback loop accelerates learning and provides personalized guidance at scale.

2. Access To Tutoring

AI tutors with voice capabilities could provide interactive learning support, giving all students—not just those who can pay $50 an hour—access to tutoring.

3. Customized Learning Pathways

By analyzing performance data, AI can help identify each student’s unique strengths and weaknesses, enabling truly differentiated instruction tailored to individual learning styles and needs. Stanford University’s Accelerator for Learning is actively researching how AI can improve education.

4. Teacher Empowerment

Most critically, AI tools can empower teachers to focus their expertise where it matters most: building relationships, developing critical thinking skills and nurturing student passion for learning. As Victor Lee, faculty lead for Stanford’s AI + Education initiative, notes, “I’m heartened to see some movement toward creating AI tools that make teachers’ lives better – not to replace them, but to give them the time to do the work that only teachers are able to do.”

Key Challenges Of AI In Education

1. Hallucinations And Accuracy

AI systems, particularly large language models, can produce “hallucinations”—plausible-sounding but factually incorrect information. These hallucinations occur because AI models function like advanced autocomplete tools designed to predict the next word based on patterns, not to verify truth.

A 2023 study examining ChatGPT-generated research proposals found a concerning frequency of inaccuracies: Of 178 references cited by ChatGPT, 28 did not exist. In a similar study of medical articles generated by ChatGPT, 46% of references were fabricated, and only 7% were authentic and accurate.

For educational technology leaders addressing this challenge, I’ve found two approaches particularly effective: First, implement human-in-the-loop oversight for critical educational content. Second, educate students about AI limitations, teaching them to critically evaluate AI-generated information rather than accepting it as authoritative. The most successful implementations I’ve seen position AI as a starting point for investigation, not the final word.

2. Academic Integrity

The emergence of generative AI has fundamentally disrupted traditional assessment models. According to a recent survey of students, instructors and administrators, 51% of students would continue to use generative AI tools even if prohibited by their instructors or institutions.

Research by Copyleaks, analyzing data from January 2023 to January 2024, revealed a 76% surge in AI-generated material in student work over the year. Interestingly, plagiarism rates dropped by a notable 51% during the same period, suggesting that academic dishonesty is shifting toward AI-assisted work that is harder to identify.

For assignments vulnerable to AI generation, educators are pivoting to alternative assessment methods like verbal presentations or audio reflections. Many institutions are returning to pen-and-paper exams.

In my experience, the most promising approaches involve authentic project work that demonstrates application of knowledge, supplemented by AI tools that facilitate rather than replace student thinking.

3. Regulatory Frameworks

Educational regulatory bodies are increasingly focused on several critical concerns around AI implementation. Student data privacy ranks foremost—with AI systems potentially collecting and analyzing unprecedented amounts of information on student behavior and performance. Additionally, there are growing concerns about technological dependency and the potential erosion of core cognitive skills when AI tools are implemented without careful guardrails.

Despite the rapid advancement of AI technologies, school policies are struggling to keep pace. According to a 2023 Education Week survey, 79% of educators reported that their districts do not have clear AI policies, creating an uncertain environment for both teachers and students.

The California State University system recently announced an initiative providing its 460,000-plus students with equitable access to AI tools, including a partnership with OpenAI to implement ChatGPT Edu. Other states will likely follow, but implementation must be guided by thoughtful regulatory frameworks that protect student interests.

When addressing these regulatory challenges, I advise education technology leaders to proactively engage with policymakers rather than waiting for regulations to emerge. Develop transparent data governance frameworks that exceed current requirements, and design implementations that center human development rather than technological convenience.

The Human-Centered Future Of Education

AI in education isn’t about replacing teachers—it’s about amplifying their impact. With AI handling routine tasks, teachers can finally refocus their energy on teaching and mentoring their students.

Students should look forward to school. Teachers should reclaim their role as mentors rather than administrators. If implemented thoughtfully, AI won’t make education less human—it could make it more human than ever before. But this future requires us to approach these technologies with both optimism and clear-eyed recognition of their limitations.
