2026 ERAS Cycle · Editorial Policy

Our no-AI editing policy

Every edit is performed by a human physician reviewer. Grammarly is used for mechanical proofreading only. No GPT, no Claude, no Gemini, no Perplexity — no generative AI is used at any point in our editing workflow. This page explains exactly what that means and why, in the context of the AAMC's 2026 ERAS certification that personal statements must not be the product of artificial intelligence.

Effective 12 May 2026 · For the 2026 and 2027 ERAS cycles

TL;DR — The 2026 MyERAS application requires you to certify that your personal statement is not the product of artificial intelligence. That certification is unambiguous. We do not use AI generation at any step of our editing process. The only software that touches your draft is Grammarly, used for mechanical proofreading (spelling, comma placement, simple grammar), with generative-rewrite suggestions ignored. Every editorial decision is made by a physician reviewer who has matched into an ACGME residency.


What the AAMC actually requires for 2026

The 2026 MyERAS application introduces an explicit certification: applicants must affirm that their personal statement is not the product of artificial intelligence. The certification is signed when the application is submitted; a violation is treated under the same plagiarism investigation rules that have governed ERAS for years — a substantiated finding is reported to every programme the applicant applies to in the current cycle and may be reported in subsequent cycles.

This is a meaningful shift from the previous AAMC position. AMCAS — the medical school application — has been explicit in permitting AI for brainstorming, proofreading and editing, with the requirement that the final submission represent the applicant's own work. ERAS, in 2026, does not contain that carve-out. The plain reading of the certification language is that the personal statement should not be drafted by AI in the first place, not merely revised back to look human after the fact.

Several large residency programmes have been clear about how they treat suspected AI personal statements. The diagnostic markers are well-known: uniform structure, predictable opening hooks, an absence of the specific clinical detail that only an applicant who has actually been there can produce, and stylistic patterns that match the GPT output the same reviewers have already seen across thousands of essays. Some programmes run drafts through third-party detection tools; others rely on the editorial intuition of selection-committee members who have already read hundreds of AI-drafted essays and recognise the patterns by eye.


What our reviewers actually do

Each draft is assigned to a physician reviewer matched to your specialty. The reviewer reads the draft in full, then performs four passes:

  1. Structural pass. Does the statement open well? Is there a coherent narrative arc? Are the experiences in the right order? Is the closing earned? This is where the most expensive editorial decisions get made, and it is also where AI tooling is least useful — structural judgement requires reading thousands of competing personal statements, which our reviewers have done and an AI has not.
  2. Specialty-fit pass. Does the statement land for the specialty? Internal medicine values different things from dermatology, which values different things from emergency medicine. Reviewers are matched to specialty so the editorial standard reflects what programme directors in that field actually look for.
  3. Truth pass. Inconsistencies between the personal statement and the rest of the application are an automatic interview-cut at competitive programmes. Reviewers cross-check the experiences, dates, and themes against your CV and the other application materials you provide. This is reviewer judgement work, not pattern matching.
  4. Mechanical proofreading. Grammarly is used here — and only here — for spelling, comma placement, and obvious grammatical issues. The generative-rewrite features in Grammarly Premium (paraphrase, tone shift, full-sentence regeneration) are not used. Where Grammarly proposes a regeneration, the reviewer ignores it or replaces it with their own edit.

What you receive back is a tracked-changes document where every change is attributable to the named reviewer, plus a feedback memo written by the same reviewer explaining the structural decisions. You can read the entire edit trail. If anything in the edit reads as AI-generated, the work is redone at no cost.


What we do not do

We do not draft personal statements from scratch. We do not prompt GPT, Claude, Gemini, Perplexity, or any other generative model at any stage of the workflow. We do not use Grammarly's generative-rewrite features. And we do not run your draft through any software other than Grammarly's mechanical proofreading checks and standard document tools.

What about drafts AI has already touched?

Bring it anyway. Many applicants experiment with ChatGPT during the brainstorming phase. The 2026 ERAS certification is about what you submit, not what you produced in private. Your reviewer treats AI-touched drafts the same as any other raw material: as input that needs honest editorial judgement to reach a defensible final submission. We will tell you if a passage reads as AI-generated and rewrite it with you to something authentic. The final version you certify must be your own work — that is what we deliver.


Frequently asked questions

Does the AAMC allow AI for the ERAS personal statement?

The 2026 MyERAS application requires every applicant to certify that their personal statement is not the product of artificial intelligence. That language is explicit and stronger than the AMCAS policy, which permits AI for brainstorming, proofreading and editing. ERAS does not have the same carve-out. Using a generative AI to draft a personal statement is a violation of the certification you sign when you submit, and the AAMC investigates substantiated plagiarism findings.

What is the difference between AI generation, AI editing, and AI proofreading?

AI generation means an AI produced the text — you prompt ChatGPT or Claude to write a draft from your CV or talking points. AI editing means an AI substantively rewrites sentences or paragraphs to improve flow, replace clichés, or shift tone. AI proofreading is mechanical: spelling, grammar, capitalisation, comma placement, simple subject-verb agreement. The grey zone is between editing and proofreading; the safe rule for the 2026 cycle is that anything generative — anything that changes meaning, structure, or voice — is off the table.

Why does MyERAS Editing not use AI for editing?

Two reasons. First, the AAMC certification on the 2026 MyERAS application is unambiguous: a personal statement that is the product of AI is non-compliant. We will not put applicants in a position where their work could be flagged. Second, the editorial decisions that move a personal statement from neutral to compelling — which experiences to lead with, which clichés to cut, how to structure a specialty-specific argument — require the judgement of a physician who has read thousands of competing applications and matched into an ACGME programme. AI tooling does not replicate that judgement.

Then what exactly do MyERAS reviewers do?

Every edit is performed by a physician reviewer. The reviewer reads your draft, identifies structural issues, marks specialty-specific risks (clichés the specialty over-uses, opening hooks that work versus ones that read as generic), suggests cuts, flags inconsistencies with the rest of your application, and writes a feedback memo. Tracked changes are made by the reviewer, not by an AI. The only software in our workflow that touches your draft is Grammarly, used for mechanical proofreading — spelling, comma placement, and obvious grammatical issues. No GPT, Claude, Gemini, Perplexity, or other generative model is used at any step.

Will programmes detect AI-generated personal statements?

The AAMC announced that Thalamus Cortex, the platform many residency programmes use to review applications, uses AI to assess applicant characteristics starting with Academic Career Interest for the 2026 ERAS season. Thalamus has stated that it does not use AI to score applications, automatically filter applicants, or determine whether essays were written with AI tools — but multiple residency programmes independently use third-party AI-detection software on personal statements. Stanford and several large IM programmes have publicly discussed using detection during the 2024-2025 cycle. Detection accuracy is imperfect, but the regulatory risk is what matters: a flagged statement is investigated, and a substantiated finding is reported back to programmes.

Is Grammarly considered AI?

Grammarly uses AI for its core grammar and spelling checks, and its newer suggestion features include AI-generated alternative phrasings. We use Grammarly only for the mechanical proofreading suggestions (spelling, punctuation, simple grammar), with all generative-rewrite features disabled. Where Grammarly suggests a paraphrase or a tone change, our reviewers either ignore it or apply their own judgement instead. The line is: Grammarly catches a missed comma; the physician reviewer decides whether to rewrite a sentence.

What happens if I have already used ChatGPT or Claude on my draft?

Bring the draft anyway. The 2026 ERAS certification is about what you submit, not what you produced privately during brainstorming. Your reviewer will treat AI-touched drafts the same way they treat any draft — as raw material that needs honest editorial judgement to reach a defensible final submission. The final version you certify must be your own work; that is what we deliver.

Why do other editing services use AI?

Some do. Many residency editing services use GPT or Claude internally to generate suggestions faster, which lets them advertise higher volumes and lower prices. The trade-off is the same trade-off the AAMC certification is designed to address: AI-drafted content is uniform, structurally predictable, and detectable by both human and automated review. The price compression is real; the regulatory risk is also real.

How can I verify your no-AI policy?

Three ways. First, every reviewer signs an internal no-AI editing agreement which we make available on request. Second, our edits are returned as tracked changes in Microsoft Word or Google Docs — you can see every change in context, who made it, and why. Third, our feedback memos are written by named physician reviewers; you receive their professional biography with every edit. If anything in your edit looks AI-generated, contact us; we will redo the work.

Does this policy apply to your other services?

Yes. The same no-AI editing policy applies to ERAS experience descriptions, letters of recommendation review, interview preparation materials, and SOAP rewrites. Every product on MyERAS Editing is human edited. The only software in the workflow is Grammarly for mechanical proofreading and standard document software.


Work with a physician reviewer

Every edit human. Every reviewer named. No AI generation, in line with the AAMC's 2026 ERAS certification.