A Reviewer Told Me My Reference Didn't Exist. Here Are 5 Citation Mistakes That Get Papers Rejected.
Real examples of citation errors caught during peer review — fabricated papers, wrong author names, retracted citations, and more. All preventable before submission.
The most embarrassing reviewer comment you can get
I'll be honest. I once received a review that said: "Reference [14] does not appear to exist. Please verify." The cause? A ChatGPT-generated citation I'd included without checking. At the time, I didn't fully understand AI hallucination in academic contexts.
That experience made me obsessive about checking references before submission. Here are five citation mistakes that actually get flagged in peer review — and how to prevent each one.
1. Citing a paper that doesn't exist
This is the worst one. When you use AI tools to generate references, fabricated papers can slip in. The formatting looks perfect. The author names are real researchers. But the paper itself? Never written, never published.
In 2024, Retraction Watch reported cases where publishers retracted multiple papers specifically for "containing AI-generated fabricated citations." Some of these papers had passed peer review before the problem was discovered. The reputational damage to those authors was severe.
How to prevent it: Run all citations through Crossref (search.crossref.org) or use Cite Checker for batch verification. Click every DOI link to confirm it resolves. Be especially thorough with any reference that came from an AI tool.
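The DOI lookup described above is easy to script. Here's a minimal sketch against Crossref's public REST API (the works endpoint needs no API key; the example DOI and the helper names are my own illustration, not any tool's actual code):

```python
# Sketch: verify that a DOI resolves to a real Crossref record.
# Assumption: https://api.crossref.org/works/{doi} returns 404 for unknown DOIs.
import json
import re
import urllib.error
import urllib.parse
import urllib.request

DOI_PATTERN = re.compile(r"^10\.\d{4,9}/\S+$")

def is_doi_syntax(doi):
    """Cheap local check: does the string even look like a DOI?"""
    return bool(DOI_PATTERN.match(doi))

def crossref_title(doi):
    """Return the registered title for a DOI, or None if Crossref has no record."""
    if not is_doi_syntax(doi):
        return None
    url = "https://api.crossref.org/works/" + urllib.parse.quote(doi)
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            record = json.load(resp)
        return record["message"]["title"][0]
    except (urllib.error.HTTPError, urllib.error.URLError, KeyError, IndexError):
        return None  # 404 means no such DOI; a network error means "unverified"

if __name__ == "__main__":
    # Example DOI (assumed here to be LeCun et al.'s Nature 2015 paper).
    for doi in ("10.1038/nature14539", "not-a-doi"):
        print(doi, "->", crossref_title(doi))
```

A `None` result is a prompt to investigate, not proof of fabrication; Crossref doesn't index every publisher.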
2. Wrong author names or publication years
This is the most common citation error, and the easiest to dismiss as trivial. It's not.
Real examples I've encountered:
- "Vaswani et al., 2017" written as "Vaswani et al., 2018" (Attention Is All You Need was published in 2017)
- "LeCun" split into "Le Cun"
- A four-author paper where the second author's initial was wrong
A reader searching for a paper with the wrong year might not find it. Google Scholar filters by year, so a one-year error can make a paper invisible in search results.
How to prevent it: Don't type bibliographic data manually. Import via DOI into Zotero or Mendeley. If you've already typed things by hand, Cite Checker's verification results will flag title and author mismatches.
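The cross-check a verification tool performs can be sketched as a plain field comparison between what you typed and what the DOI registry returned. The record layout below is illustrative, not Zotero's or Cite Checker's actual schema:

```python
# Sketch: flag discrepancies between a hand-typed citation and the
# registry's record. Field names are illustrative assumptions.

def flag_mismatches(local, registered):
    """Return human-readable discrepancies between two citation records."""
    issues = []
    for field in ("year", "first_author", "title"):
        mine = str(local.get(field, "")).strip().lower()
        theirs = str(registered.get(field, "")).strip().lower()
        if mine and theirs and mine != theirs:
            issues.append(f"{field}: you wrote {local[field]!r}, "
                          f"registry says {registered[field]!r}")
    return issues

# The off-by-one-year example from the list above:
typed = {"year": 2018, "first_author": "Vaswani",
         "title": "Attention Is All You Need"}
registry = {"year": 2017, "first_author": "Vaswani",
            "title": "Attention Is All You Need"}
print(flag_mismatches(typed, registry))  # flags only the year
```

The normalization step (lowercase, stripped) matters: "LeCun" vs "Le Cun" would still be caught, while harmless capitalization differences wouldn't.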
3. Citing a retracted paper
This one is sneaky because the paper was legitimate when you first read it. But between starting your manuscript and submitting it, papers get retracted. Data fabrication, plagiarism, methodology problems — the reasons vary.
The classic case: Wakefield's 1998 Lancet paper claiming a link between vaccines and autism has been cited thousands of times *after* retraction. Grieneisen and Zhang (2012) found that 31.8% of post-retraction citations didn't even mention the retraction.
How to prevent it: For critical citations, visit the journal page and look for retraction notices. In biomedical fields, search the Retraction Watch Database (retractiondatabase.org). Make it a habit to re-check your reference list right before submission.
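One way to automate part of this check, assuming Crossref's `updates` and `update-type` filters behave as documented: retraction notices are registered as works that "update" the original DOI, so a non-empty result is a red flag (not proof — confirm on the journal page):

```python
# Sketch: ask Crossref whether any retraction notice updates a given DOI.
# Assumption: the `update-type:retraction` and `updates:{doi}` filters
# work as described in Crossref's REST API documentation.
import json
import urllib.parse
import urllib.request

def retraction_query_url(doi):
    """Build the Crossref query listing retraction notices for `doi`."""
    filters = "update-type:retraction,updates:" + doi
    return ("https://api.crossref.org/works?filter="
            + urllib.parse.quote(filters, safe=":,/."))

def may_be_retracted(doi, timeout=10):
    """True if Crossref lists at least one retraction notice for the DOI."""
    with urllib.request.urlopen(retraction_query_url(doi), timeout=timeout) as resp:
        result = json.load(resp)
    return result["message"]["total-results"] > 0
```

This complements, rather than replaces, the Retraction Watch Database: coverage differs between the two sources.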
4. Inconsistent citation style
I once read a review report that said: "The reference list mixes APA and IEEE formats." This alone rarely causes rejection, but it tells reviewers you're careless about details. And if they think you're careless about formatting, they'll start wondering what else you were careless about.
Common patterns:
- Some citations include DOIs and others don't (if an entry has a DOI, include it; don't mix)
- Journal names alternate between full and abbreviated forms
- Author names switch between "Smith, J." and "John Smith"
How to prevent it: Read the target journal's Author Guidelines before you start writing. Use a reference manager like Zotero to enforce a single style. Do a final visual pass through the entire reference list.
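A rough script can catch the first pattern above. This is a heuristic sketch over plain-text entries, nothing more — it only detects the mixed-DOI case, not style mixing in general:

```python
# Sketch: flag a reference list where some entries carry DOIs and others don't.
import re

def doi_consistency(references):
    """Return (with_doi, without_doi, mixed?) for a list of entry strings."""
    with_doi = [r for r in references if re.search(r"\b10\.\d{4,9}/\S+", r)]
    without_doi = [r for r in references if r not in with_doi]
    return len(with_doi), len(without_doi), bool(with_doi) and bool(without_doi)

# Illustrative, made-up entries:
refs = [
    "Smith, J. (2020). Example title. J. Examples, 4(2), 10-20. "
    "https://doi.org/10.1000/xyz",
    "Doe, A. (2019). Another title. Journal of Samples, 3(1), 1-9.",
]
print(doi_consistency(refs))  # → (1, 1, True): a mixed list
```

Similar regex passes could flag the "Smith, J." vs "John Smith" inconsistency, but journal-name abbreviations really do need a human eye.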
5. Incomplete bibliographic information
No volume number. No page range. No DOI. Missing metadata means readers can't find the original source.
The most common offender: papers that have DOIs but where the author didn't include them. Most journal guidelines now explicitly state "include DOIs when available."
Another frequent issue: preprint citations without arXiv IDs. If you cite an arXiv paper, include the arXiv identifier. Without it, the citation is nearly impossible to track down.
How to prevent it: Verify metadata on Crossref and fill in any gaps. Cite Checker's verification results show the database's version of each citation's metadata — compare it against yours.
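A completeness pass is also easy to script. The required-field set below is an illustrative assumption — match it to your target journal's guidelines:

```python
# Sketch: list which required fields each entry is missing.
# The REQUIRED tuple is an assumption, not any style guide's official list.
REQUIRED = ("authors", "year", "title", "journal", "volume", "pages", "doi")

def missing_fields(entry):
    """Return the required fields that are absent or empty in `entry`."""
    return [f for f in REQUIRED if not entry.get(f)]

entry = {
    "authors": "LeCun, Y., Bengio, Y., & Hinton, G.",
    "year": 2015,
    "title": "Deep learning",
    "journal": "Nature",
    "volume": "521",
    "pages": "436-444",
    # no DOI recorded
}
print(missing_fields(entry))  # → ['doi']
```

For arXiv preprints, swap `journal`/`volume`/`pages` for an `arxiv_id` field in the required set.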
A 10-minute pre-submission checklist
This takes 10-15 minutes total. Not much, given what's at stake.
1. Upload your PDF to Cite Checker and note any "Not Found" citations
2. For "Not Found" journal articles, manually check Crossref
3. Click through all DOI links (open them in browser tabs in batch)
4. For high-stakes citations, check Retraction Watch
5. Visually scan the reference list for style consistency
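The batch step of clicking through DOI links can be done with a few lines of Python; the DOI list here is a placeholder:

```python
# Sketch: open every DOI from your reference list in browser tabs at once.
# `dois` is an illustrative placeholder list.
import webbrowser

def doi_url(doi):
    """DOIs resolve through the doi.org proxy."""
    return "https://doi.org/" + doi

dois = ["10.1000/xyz123", "10.1038/nature14539"]

if __name__ == "__main__":
    for doi in dois:
        webbrowser.open_new_tab(doi_url(doi))
```

Any tab that lands on a "DOI not found" page goes straight onto your investigation list.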
You don't need perfection. Catching obvious fabrications and major errors before submission dramatically reduces the chance of an embarrassing reviewer comment.
Tools help, but the final call is yours
Cite Checker, Google Scholar, Crossref — these are all assistants, not authorities. "Not Found" doesn't always mean fake, and "Found" doesn't always mean perfect.
The final check is always manual. Especially for any reference that involved AI in its creation, verify every single one. That extra 30 minutes might save you from a rejection letter.