Can AdCom Detect AI-Written MBA Essays?


Here’s a question more MBA applicants should be asking before they hit submit: Does this essay actually sound like me?

Not “Is it good?” Not “Will it pass a plagiarism checker?” Does it sound like a real person, with a real story, who genuinely wants to be at this school?

The reason this matters is simple. AdCom readers are not algorithms. They are experienced professionals who have spent years learning to read between the lines of an application. Spotting writing that is polished but empty, confident but hollow, is something they do almost by instinct now. And with AI-generated essays flooding inboxes at every top program, that instinct is getting sharper by the season.


In this piece, we skip the speculation and go straight to the evidence. What have admissions directors actually said about AI on record? What do the numbers on detection tools really show? And what are the real consequences when an essay crosses the line?

Executive Summary: Can AdCom Detect AI?

What top admissions officers are really saying — and what the data reveals.

The Current Landscape

  • Rising Usage: Approximately 56% of MBA candidates want to use AI for their essays, though most seek clear guidelines first.
  • The Detection Gap: AI detection tools have significant weaknesses, including an estimated 1% false positive rate and a documented tendency to unfairly flag non-native English speakers.
  • Human Intuition: Experienced readers catch machine-authored patterns like “polished but hollow” voices and suspiciously perfect paragraph structures.

Top School Positions

  • Columbia Business School: Using AI to write complete essays is an Honor Code violation that can lead to rescinded offers.
  • Duke Fuqua: Views “AI” as “Authentic Individuality” and scans all essays for plagiarism.
  • Wharton & Berkeley Haas: Emphasize that applications must be a true reflection of lived experience, relying on multiple layers of review rather than software alone.

How to Use AI Responsibly

The boundaries of ethical usage:

Brainstorming themes, checking grammar/spelling, getting feedback on structural flow, and researching specific school programs.

The “Only-You” Test

“Could someone else with a similar background, industry, and career goal have written this exact essay?” If yes, it’s not specific enough.

Bottom Line: AdComs are looking for reasons to admit you, not just reasons to catch you. An AI-written essay provides no voice or person to connect with.

The AI Essay Epidemic: How Big Is the Problem?

The speed at which students have adopted AI writing tools is staggering. According to the Pew Research Center, about 26% of U.S. teens used ChatGPT for schoolwork in 2024, double the share from 2023. (Pew Research Center, January 2025)

Among MBA applicants specifically, the numbers tell a similar story. A Manhattan Prep/Kaplan survey found that 56% of MBA candidates want to use AI to help craft their essays, though most said they’d want clear guidelines before doing so.

And it’s not just light editing. Research by Turnitin, which scans hundreds of millions of academic submissions, found that 11% of papers showed evidence of AI use, with 3% classified as predominantly AI-generated. Given that AI-assisted writing is still rising year over year, those figures are likely even higher in today’s MBA applicant pool.

Turnitin analyzed over 200 million assignments and found 11% showed some AI usage, with 3% being mostly AI-generated. (Turnitin, 2024 — via Anara.com)

This is the backdrop against which Admissions Committees are reading your essays. They are not naive about what is happening in the applicant pool, and several schools have now responded with explicit policies and systematic review processes.

What MBA Admissions Officers Are Actually Saying

Rather than speculating about what AdCom thinks, let’s go straight to the source. Over the past two years, several admissions directors at leading MBA programs have spoken candidly about the use of AI in applications. Their message is consistent and worth reading carefully.

Columbia Business School

Clare Norton, Senior Associate Dean for Enrollment Management at Columbia Business School, is direct about the limits of AI when it comes to what the school is really looking for:

“Generative AI is not capable of [answering our essay questions] in a way that is authentic. It’s never going to give an answer that can tie together across the application. The best applications are reflective, truly, of the individual.”

— Clare Norton, Senior Associate Dean for Enrollment Management, Columbia Business School  |  Source: Admissions Straight Talk Podcast, Ep. 556

Columbia has also formalized its position at the policy level: using generative AI to write complete essay responses is listed as an Honor Code violation, and offers of admission can be rescinded if misrepresentation is discovered.

Duke Fuqua School of Business

Shari Hubert, Associate Dean of Admissions at Duke Fuqua, offered one of the most memorable framings of the issue in the admissions world:

“AI at Fuqua stands for authentic individuality. We’re going to assume positive intent and that applicants are ethical but your application must be a true and accurate reflection of your lived experience and exclusively your own.”

— Shari Hubert, Associate Dean of Admissions, Duke Fuqua School of Business  |  Source: Admissions Straight Talk Podcast, Ep. 556

Hubert also confirmed that Fuqua scans all essays using plagiarism detection software and that submitting verbiage that is improperly sourced is grounds for denying an application.

Wharton School, University of Pennsylvania

Blair Mannix, Executive Director of Graduate Admissions at Wharton, reinforced the authenticity standard that sits at the core of elite MBA evaluation:

“We do require that your application be a true and accurate reflection and representation of your lived experience and exclusively your own.”

— Blair Mannix, Executive Director of Graduate Admissions, Wharton School  |  Source: Admissions Straight Talk Podcast, Ep. 556

UC Berkeley Haas School of Business

Eric Askins, Executive Director of Full-Time MBA Admissions at Berkeley Haas, gave a notably candid response that reveals how AdComs have been thinking about this problem in real time:

“The first piece of this journey was, ‘I hope our fraud software can catch it.’ This is a tool like the calculator is a tool. The tool exists.”

— Eric Askins, Executive Director of Full-Time MBA Admissions, UC Berkeley Haas  |  Source: Admissions Straight Talk Podcast, Ep. 556

This is a revealing admission: AdComs know the detection technology is imperfect, and they’re relying on multiple layers of review, not just software, to evaluate authenticity.

Can AI Detection Tools Actually Catch It?

The short answer: sometimes. But far less reliably than applicants fear or schools might hope.

Tools like Turnitin, GPTZero, and Originality.ai have been widely adopted by universities for academic work. Some business schools do use them as a first-pass screening layer. But the technology has significant known weaknesses.

The False Positive Problem

One of the most serious flaws in AI detection is that it disproportionately flags non-native English speakers. Research from Packback highlights this sharply:

AI detection tools have an estimated 1% false positive rate — which sounds small, but could mean approximately 223,500 legitimate essays incorrectly flagged as AI-written each year. (Packback, 2024 — via Anara.com)

For Indian, Chinese, or other non-native English-speaking applicants who write in careful, formal prose, this is a genuine and unfair risk. Several schools have publicly acknowledged they will not reject an applicant solely on the basis of a detection tool’s output.

The False Negative Problem

On the flip side, a well-prompted AI essay can often slip through detection tools entirely. A Poets&Quants analysis tested detection tools on ChatGPT-crafted essays written for Harvard Business School and Stanford GSB prompts, and the tools failed to flag them. This is why schools cannot and do not rely on software alone.

There is also the famous Wharton benchmark: Professor Christian Terwiesch published research showing that ChatGPT scored a B to B- on his MBA final exam, concluding that AI could already match a solid business school student’s performance on structured tasks.

The conclusion: detection software is a signal, not a verdict. AdComs know this. Which brings us to the most reliable detection mechanism of all.

The Dead Giveaways That Human Readers Catch

After reading thousands of essays across multiple application cycles, AdCom readers develop something that no software can replicate: pattern recognition for authentic human storytelling. Here are the AI tells that stand out to experienced readers:

1. The ‘Polished But Hollow’ Voice

AI writing is technically proficient. It is grammatically correct, well-structured, and reads fluently. But it tends to be emotionally flat. There is no friction, no vulnerability, no moment of genuine uncertainty. Real people hedge, reflect, contradict themselves slightly. AI papers over all of that with confident symmetry.

2. The Opening Cliché

If your essay begins with any variation of ‘In today’s rapidly evolving business landscape…’ or ‘Throughout my professional journey…’, it is a near-certain AI flag. These openings are statistically among the most common outputs of large language models asked to write professional essays.

3. Suspiciously Perfect Structure

AI instinctively organizes essays into three neatly balanced points. Real human narratives are messier; they linger on what matters most, and they leave some things unresolved. When every paragraph is exactly the same length, and every argument has an equal counter-argument, it reads as machine-authored.

4. Generic ‘Why School’ Answers

AI-generated ‘Why Wharton’ or ‘Why Booth’ answers tend to name-check professors, clubs, and programs in a way that is clearly scraped from public information rather than felt. AdComs can tell instantly when a candidate has never actually visited, attended an information session, or spoken to a current student.

5. No Idiosyncratic Detail

The most powerful MBA essays contain specific, almost oddly personal details: the exact words a mentor said in a difficult conversation, the particular number of employees in a factory, the smell of a city during a formative trip. AI cannot invent these because they do not exist in any training data. When an essay is all generality and no granularity, readers notice.

The Interview: The Ultimate Lie Detector

Even if an AI-written essay somehow passes automated screening and slips past a first-read reviewer, there remains one final, unavoidable test: the admissions interview.

At schools like Wharton, MIT Sloan, and London Business School, behavioral interviews are specifically designed to probe and expand on written application materials. London Business School’s Director of MBA Recruitment and Admissions, David Simpson, has pointed to the school’s holistic approach as a safeguard:

“Our selection process combines the written application with a one-to-one interview. Our holistic approach means I’m satisfied our admissions process is rigorous enough that we will continue to select the very best talent.”

— David Simpson, Director of MBA Recruitment & Admissions, London Business School  |  Source: Poets&Quants, April 2023

The implication is clear: if your written narrative was constructed by AI rather than lived by you, the interview will expose it. AdComs are trained to ask probing follow-up questions (‘Tell me more about that moment.’ ‘What specifically made you choose that path over the alternative?’) that only someone who actually lived the story can answer fluently and naturally.

KEY RISK  If your essay describes an experience AI invented or embellished, your interview answers will contradict it. Inconsistency between written and verbal accounts is one of the fastest paths to rejection and, in some cases, a rescinded offer.

The Real Consequences of Getting Caught

Schools’ responses to confirmed AI misuse range from application denial to revocation of an accepted offer, and in the case of enrolled students, academic discipline.

Columbia Business School is explicit: using generative AI to write complete responses violates the Honor Code, and offers of admission will be rescinded if misrepresentation is found. Stanford’s application materials similarly prohibit having ‘another person or tool’ write essays, with the same potential consequence.

But even beyond the formal consequences, there is a subtler risk that often goes undiscussed: fit. The entire admissions process exists to determine whether you belong in this particular cohort, in this particular school’s culture. An AI-written essay does not help AdCom understand that. It gives them nothing real to connect with — and in a cycle where every other strong candidate is submitting authentic, vulnerable, specific writing, a generic AI essay will simply not compete.

INSIGHT  Schools aren't just looking for smart candidates — they're looking for interesting people who will make the class better. AI can write a competent essay. It cannot make you interesting.

How to Use AI Responsibly in Your MBA Application

The goal is not to avoid AI entirely; it is to use it as a tool that serves your voice, not one that replaces it. Here is the distinction that matters:

✅ Acceptable uses of AI: Brainstorming themes and potential essay angles, checking grammar and spelling, getting feedback on structural flow, generating alternative phrasings for a sentence you’ve already written, researching school-specific programs and clubs to include in ‘Why School’ answers.

❌ Unacceptable uses of AI: Asking AI to write the essay from a bullet-point brief, using AI output as a first draft that you lightly edit, letting AI generate your ‘Why School’ section without adding genuine personal experience, using AI to fabricate or dramatize experiences you didn’t have.

The golden rule: if an admissions reader sat across from you in an interview and asked you to expand on every single sentence in your essay, you should be able to do so effortlessly and naturally. If you can’t, because AI wrote it, that is the problem.

What the Best Essays Have That AI Never Will

The essays that get applicants into HBS, Stanford GSB, and Wharton share a quality that is deceptively simple but almost impossible to manufacture: they make the reader feel like they know the person.

This comes from specificity. Not ‘I led a team through a difficult time,’ but the exact conversation, the specific decision, the precise moment when something became clear. It comes from honest self-assessment: the acknowledgment of a failure or a limitation alongside the growth that followed. It comes from a point of view: a perspective on business, leadership, or the world that is distinctly yours and that emerges naturally from your particular experiences.

Ask yourself the ‘only you’ test: could this essay have been submitted by a thousand other candidates with broadly similar backgrounds? If the answer is yes, or if you’re not sure, the essay needs more of you in it and less of the algorithm.

THE ONLY-YOU TEST  Read your essay and ask: 'Could someone else with a similar background, industry, and career goal have written this exact essay?' If yes, it's not specific enough. The best essays are irreplaceable — they could only have come from one person.

Final Thoughts

Can AdCom always detect an AI-written essay? Probably not, at least not with certainty in every case. Detection technology is imperfect, human readers are busy, and a well-crafted AI essay can be hard to distinguish from a real one at first glance.

But here is what the evidence actually shows: AdComs are not looking for a reason to catch you. They are looking for a reason to admit you. An AI-written essay gives them almost nothing to hold onto: no voice, no story, no person they can imagine sitting in a classroom and contributing to the community.

The risk is never just getting caught. The risk is submitting an essay that doesn’t work, that doesn’t move anyone, doesn’t reveal anything, and doesn’t make a case for why you belong. That is the real cost of outsourcing your story.

Your essay is your first conversation with an admissions committee. Make it sound like you.

Author

  • Nupur Gupta

    Nupur Gupta is the Founder of Crack The MBA, a premier MBA admissions consulting firm. A Wharton MBA, former AIGAC President, and storytelling enthusiast, she’s passionate about helping applicants uncover their unique stories and get into top B-schools worldwide.
