As revealed by The Daily Tar Heel, the University of North Carolina at Chapel Hill’s student newspaper, UNC has used Slate AI, a data analytics platform, in admissions since at least 2020. The software assigns each applicant an auto-generated essay score from one to four based on writing quality, along with scores in four categories: performance, program, extracurricular involvement, and personal qualities.
UNC is hardly alone — according to Inside Higher Ed, 17 percent of American universities now use AI in admissions, with this figure steadily rising as application numbers surge.
These AI platforms were implemented to address the biggest problem schools face in the admissions process: backlogs of thousands of applications. The University of North Carolina at Chapel Hill received 73,192 applications for the 2025 cycle, a nearly 10 percent increase from the previous year.
But using AI to score applications is inconsistent and amounts to cutting corners. The problem is clear: these systems evaluate writing primarily on technical metrics like grammar and structure while failing to meaningfully engage with content. This approach ignores the purpose of college essays, which serve as windows into applicants’ character, experiences, and potential contributions to campus communities.
It’s not known exactly how the auto-generated essay score influences a student’s final admission decision, yet any sorting of applicants by their writing score feels unjust.
With AI as the gatekeeper, something as powerful as a personal essay about overcoming adversity could receive a low score simply because of unconventional structure or minor grammatical issues. An algorithm can’t evaluate the emotional weight of a life-changing experience.
Additionally, it might penalize students from diverse linguistic backgrounds or those with unique writing styles.
The irony becomes particularly stark when we consider the potential for students to respond by using AI themselves. If universities employ algorithms to judge writing, nothing prevents applicants from using AI writing tools to craft essays perfectly calibrated to these scoring systems. This technological arms race benefits no one and further distances the admissions process from its intended purpose.
More troubling still is the lack of transparency. Most students remain completely unaware that their heartfelt personal statements are being evaluated by algorithms. The practice only came to light because a UNC alumnus reviewed his admissions file — which students at any public university are entitled to do under the Family Educational Rights and Privacy Act — and discovered the AI scoring system in the process.
Moreover, these systems potentially create privacy concerns, as deeply personal information shared in essays may be processed by third-party AI platforms.
While UNC’s guidelines state that reviewers should read essays in their entirety for “a more accurate sense of a student’s writing abilities,” there is no legal obligation for institutions to thoroughly evaluate every submission. This creates a scenario where an algorithm’s initial assessment could prematurely eliminate promising candidates before human eyes ever see their applications, simply because the software didn’t score their essay highly enough.
Not all universities have embraced this trend. Despite receiving over 90,000 applications last fall — a 25 percent increase from the previous year — the University of Texas at Austin has maintained its traditional review process. Similarly, the University of Michigan continues to prioritize holistic evaluation that considers each student’s unique context, including school resources and opportunities.
The solution isn’t mysterious. Instead of relying on AI, universities must invest in hiring more human readers. If financial constraints make this challenging, institutions should at minimum obtain explicit consent from applicants regarding AI evaluation and ensure each essay receives full human oversight.
As applications continue to flood admissions offices nationwide, the pressure to implement technological shortcuts will only intensify. But we must ask ourselves: what values do we prioritize in higher education? If we truly believe in equitable access and holistic review, then we cannot allow algorithms to become the arbiters of academic opportunity.
For prospective students, awareness is the first step toward change. By advocating for transparency, demanding human review of applications, and supporting legislation that regulates AI use in educational contexts, we can work toward an admissions system that honors the genuineness of humanity rather than reducing it to data points in an algorithm.
AI in college admissions threatens fairness
March 7, 2025
As the college admissions season winds down, it is important to look at the far-reaching effects of AI.
Oliver Peck, Editorial Director