College admissions essays are often the only part of an application where students can explain who they are in their own words. A new Cornell University study suggests artificial intelligence is not ready to take on that role.
Researchers reviewed 30,000 essays submitted to a selective university before the release of ChatGPT, then compared them with essays generated by eight large language models from OpenAI, Meta, Anthropic, and Mistral. The findings showed that AI text looked markedly different from human writing, even when the models were prompted with details about an applicant’s race, gender, or family background.
Patterns that reveal the difference
The team used statistical methods to measure how closely AI essays resembled human ones. Human-written essays clustered together yet varied widely from one individual to the next. By contrast, the AI essays formed a separate, highly uniform cluster.
Models also repeated words from the essay question rather than drawing on lived experience. They leaned on abstract terms such as “growth,” “journey,” or “community,” while human essays often included time markers, references to friends, or specific events.
One example generated by ChatGPT began, “Growing up in Lexington, South Carolina, with my Asian heritage, I often felt like a bridge between two cultures.” The line met the prompt but carried a formulaic tone, according to the researchers.
Why identity cues do not work
The researchers tested whether giving the models demographic information would improve the output. In practice, essays written with identity prompts still looked more like other AI essays than like real student writing.
In some cases, the identity prompts made the output less natural. Instead of integrating background into a personal story, the models inserted demographic terms directly, such as “Asian,” “immigrant,” or “college-educated.” This produced rigid phrasing that stood out from how students described themselves.
Statistical results
To confirm these observations, the team trained classifiers to distinguish between AI and human essays. The classifiers reached nearly perfect accuracy, with F1 scores close to 0.999. Even when only the identity-prompted essays were tested, AI text could still be separated from human writing.
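The F1 score cited above is a standard classification metric: the harmonic mean of precision and recall. The study's actual features and model are not described here, but a minimal sketch shows how an F1 near 0.999 arises when a classifier mislabels only one essay in a thousand (the toy labels below are illustrative, not the study's data):

```python
from typing import List

def f1_score(y_true: List[int], y_pred: List[int]) -> float:
    """Binary F1: harmonic mean of precision and recall (positive class = 1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical run: 500 AI essays (label 1) and 500 human essays (label 0),
# with a single AI essay misclassified as human.
labels = [1] * 500 + [0] * 500
preds  = [1] * 499 + [0] + [0] * 500
print(round(f1_score(labels, preds), 4))  # prints 0.999
```

A score that close to 1.0 means the two classes are almost perfectly separable, which is the study's central finding.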
Pairwise similarity scores told a similar story. Human-to-human comparisons showed more variation than human-to-AI comparisons, even when identity cues were added. The researchers concluded that AI outputs are too uniform to pass as authentic student writing.
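The paper's exact similarity measure is not specified in this summary; as an assumption, the idea can be sketched with cosine similarity over simple bag-of-words vectors. Two formulaic "AI-style" snippets score far closer to each other than either does to a concrete, specific one:

```python
import math
from collections import Counter

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity between bag-of-words vectors of two texts."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Illustrative snippets, not taken from the study's data:
ai_1  = "my journey of growth taught me the value of community"
ai_2  = "my journey of growth showed me the power of community"
human = "that summer at my uncle's garage I learned to rebuild carburetors"

print(cosine_similarity(ai_1, ai_2) > cosine_similarity(ai_1, human))  # prints True
```

In the study, the same pattern held at scale: AI essays sat close to one another, while human essays spread out.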
Demographic observations
The study also examined differences across gender, race, and first-generation status. Human essays in these groups displayed unique linguistic patterns. AI essays did not.
When identity prompts specified first-generation students, the essays aligned slightly better with that group, but not enough to match human variation. For race, prompts improved similarity in some cases, but gaps remained, especially for Black applicants.
Implications for students and admissions
The study suggests AI may still have a role in essay writing, but only as a support tool. It can provide feedback on clarity or grammar, but full AI-written drafts risk sounding generic and being easy to detect.
Lead author Jinsook Lee recalled her own essay writing as a useful exercise in reflection. She noted that students who skip this step and hand the task to AI may lose the chance to develop their own voice.
For universities, the findings carry a different lesson. Because AI writing remains distinct, admissions officers could identify it if they are looking for the signs.
Broader lessons about AI writing
The Cornell team concluded that large language models struggle to produce text that mirrors the diversity of human expression. They generate outputs that are consistent and safe, but also monotonous.
In high-stakes settings such as college admissions, that lack of individuality can be a liability. For now, students who rely on AI to write their essays are likely to stand out, and not in the way they hope.
Notes: This post was edited/created using GenAI tools.
