
AI in peer review: what early-career researchers should know (and how to use it wisely)

For early-career researchers, AI has become part of everyday work, whether in drafting manuscripts, organizing ideas, or exploring literature. A survey of 1,645 active researchers, conducted by Frontiers, reveals that 87% of early-career researchers already use AI tools, far more than any other career stage.
When it comes to peer review, though, a place where many ECRs feel the most pressure, the rules often feel unclear. What’s allowed? What’s ethical? And how can AI actually improve your reviewing without compromising your judgment or credibility?
This Field notes article offers a practical, grounded look at how early-career researchers can navigate AI use in peer review responsibly and confidently. You can explore the complete findings of the Frontiers study in our whitepaper: Unlocking AI’s untapped potential: responsible innovation in research and publishing.
1. Yes, AI is already part of peer review, but mostly in small ways
More than half of reviewers (53%) have tried AI during peer review, but most use it only for surface-level tasks:
drafting or polishing review reports
summarizing sections of a manuscript
checking for unclear language
organizing feedback
Only a small group, around 19% of reviewers, uses AI to examine methodology, statistics, or the logic of arguments.
This is important because it means early-career researchers aren’t alone in experimenting, but they also aren’t alone in feeling unsure about how far to go.
2. Why early-career researchers use AI more, and why that matters
The generational divide is one of the clearest findings in the study:
61% of reviewers with ≤5 years’ experience use AI regularly
Only 45% of reviewers with 15+ years do
For many ECRs, AI feels like part of the normal toolkit. It helps overcome common early-career challenges:
building confidence in academic writing
managing time across teaching, research, and service
checking clarity when English isn’t a first language
understanding complex manuscripts more quickly
Used well, AI can become a support system - one that makes reviewing less intimidating.
The key is understanding what counts as responsible use.
3. What AI can (and can’t) do for you during peer review
Here’s a breakdown to help you stay on safe, ethical ground.
AI can help you:
Understand the manuscript more quickly
Ask AI to summarize long sections, clarify jargon, or outline the structure. Always verify summaries against the original.
Clarify your own understanding
Use AI to explain statistical approaches, methodological terms, or unfamiliar concepts. Think of it like a study buddy, not an authority.
Draft clearer, more constructive feedback
If you write bullet points, AI can help turn them into a cohesive, polite, and structured review. The scientific evaluation still comes from you.
Check for consistency in your comments
Some reviewers use AI to ensure their tone is constructive or to verify that their feedback is internally coherent.
AI should not do the following:
decide acceptance, rejection, or revision outcomes
generate scientific assessments that you copy uncritically
fabricate reasoning, references, or claims
upload confidential manuscripts to public or open AI tools when journal policy prohibits it
replace your judgment on methodology or data integrity
Think of AI as a supporting tool, not a decision-maker.
4. How to use AI responsibly as an early-career reviewer
Keep human judgment in control
Use AI to surface ideas but then apply your field expertise, experience, and critical reasoning to evaluate those ideas.
Verify everything
Researchers frequently report AI hallucinations (inaccurate claims, invented citations). Always check original sources before relying on AI-generated statements.
Avoid putting confidential manuscripts into public tools
Unless a journal explicitly allows it, assume you cannot share manuscript text with open AI systems.
If you’re unsure, ask the editorial office - they expect questions.
Disclose your use if journal policy requires it
Transparency builds trust, especially for early-career reviewers who genuinely want to embrace best practice.
Use AI for exploration, not shortcuts
AI can help you:
test whether your interpretation holds up
identify unclear logic
compare methods in similar articles
surface overlooked variables
These uses develop your skills rather than weakening them.
5. What early-career researchers say about AI
ECRs describe AI as:
“A way to get more confident with complex papers.”
“A time-saver that lets me focus on the real science.”
“Helpful for structuring feedback but not for making judgments.”
“Useful for translations - reading and writing.”
But many also share worries:
“I’m afraid of using it incorrectly.”
“I don’t know what journals allow.”
“I’m not fully confident in AI’s accuracy.”
These concerns are valid. They reflect a wider truth in the whitepaper: the rules are still catching up to the reality of researcher behavior.
6. If you want to grow as a reviewer, AI can actually help
Peer review skills develop over years. For early-career researchers, AI can speed up that learning curve without undermining the process.
Here are ways to use AI that build long-term capability:
Use it to improve your analytical thinking
Ask AI to propose alternative interpretations of results, then evaluate whether they hold up.
Use it to expand your vocabulary for constructive feedback
Many ECRs struggle with tone. AI can help you craft feedback that is firm but respectful, a core peer-review skill.
Use it to explore common standards in your field
Prompt AI to list typical methodological checks in your discipline, then consider which apply to the manuscript you're reviewing.
Use it to practice before your first real review
Upload your own old manuscript draft (or a synthetic one) and practice reviewing it with AI as a companion tool.
7. If you feel unsure about AI use, know you’re not alone
The whitepaper shows that even though ECRs use AI the most, they often receive the least formal training. Many learned through trial and error, personal networks, or online experimentation.
35% of all researchers are self-taught
18% do nothing to ensure best practice
As expectations evolve across journals and institutions, early-career researchers are in a unique position to shape emerging norms by adopting responsible practices early.
8. A practical checklist for ECRs reviewing with AI
Before you start:
Does the journal allow AI use in review?
Will your use preserve confidentiality?
During the review:
Did you use AI only for support, not judgment?
Did you verify AI-generated ideas or summaries?
Did you assess methodology and analysis yourself?
After the review:
Do you need to disclose your use?
Is your feedback your own, even if AI helped with wording?
If you can tick these boxes, you’re on solid ground.
AI use will only continue to grow
AI is becoming a natural part of how early-career researchers work, learn, and participate in scientific communication. The goal isn’t to avoid AI out of fear or to use it uncritically but to treat it as a tool that supports deeper understanding, clearer thinking, and higher-quality contributions to your field.
Peer review is a skill that builds with practice. AI can help accelerate that growth so long as your judgment stays in the lead.
Discover all the survey findings and read and download the whitepaper for an in-depth exploration of the use of AI in publishing.