Does anyone else feel like it becomes harder and harder to trust the information we consume every day? From algorithm-driven news cycles to the blazing speed at which information (regardless of its accuracy) travels, the climb toward research integrity, and toward restoring trust in science, can seem steeper and steeper.

As the public conversation around research becomes more incendiary, exaggerated, and skeptical, the integrity of the scholarly record has never been more critical. Scholarly publishing today has to not only ensure accurate, valid, and impactful science is shared, but also uphold public trust in how that record is created.  

This dual mission becomes challenging when the systems upholding scientific credibility remain invisible or misunderstood. The solution lies in the evolution of one of our most foundational systems: peer review.  

Peer review today is not what it was even five years ago. I argue that we need to recognize that peer review is not just a publishing workflow; it is a critical piece of public infrastructure in the defense of research.

From process to infrastructure 

While peer review has always been a vital part of publishing, its function in the broader knowledge ecosystem is changing. It is a signal of rigor, responsibility, and transparency. Beyond determining what gets published, peer review reflects how scientific communities govern themselves, how credibility is earned, and how evidence is evaluated.  

The weight of this process means that peer review is more than a workflow; it is the infrastructure that underwrites trust both within the academic community and beyond. That is, if it is well-designed, properly supported, and effectively executed.

Given this shift, we can’t treat peer review as static or operational. Peer review needs to adapt in structure and technology, as research integrity threats gain momentum and as the public knowledge ecosystem becomes more fraught.  

The rising complexity of editorial responsibility 

To be an editor in today’s scholarly publishing world, you have to be a juggler, deftly navigating:

  • The ethical use of generative AI in submissions 
  • Increasingly sophisticated fraud and peer review manipulation 
  • A growing demand for transparency from funders, institutions, and readers 
  • The tension between rapid dissemination and rigorous verification 
  • New technologies, which sometimes bring new complexities 

These pressures are not temporary. They reflect a structural shift in how science is communicated, and in who is paying attention.

The editorial role is, in effect, an act of stewardship. And that means thinking beyond individual articles to ask bigger questions: What does trust in our journal look like? What does it rest on? What tools, policies, and partnerships do we need to sustain it? How serious are the threats to research integrity, and how do I manage that burden on my peer reviewers? How do we approach contextualizing retractions to preserve the scholarly record? How do we communicate that retractions are in fact a trust signal?

AI-assisted peer review: scale and efficiency, with humans in the loop

We can’t talk about the changes to peer review without talking about emerging technologies and the AI explosion. AI can be a critical tool to support editorial workflows, and it can be especially critical in the context of research integrity. But adoption must be thoughtful, strategic, and even cautious. Used well, AI can assist editorial teams with tasks such as:

  • Triaging submissions based on scope and technical quality 
  • Flagging statistical or methodological inconsistencies 
  • Detecting duplication, image manipulation, or undisclosed AI use 
  • Identifying links to retracted literature or prior misconduct 

These tools are not a replacement for human judgment, but they can enhance our ability to scale. Editors armed with AI tools can work more efficiently and potentially catch issues earlier: before publication, before citation, before sharing.

In this context, early intervention is about more than operational quality, and more than saving the time and resources of your peer reviewers. AI-assisted peer review can be about preventing the erosion of trust downstream, where flawed papers can be amplified, misunderstood, and even go viral in today’s public discourse. Many organizations, including Silverchair, are exploring solutions of this nature (see the recent partnership announcement on Hum’s Alchemist Review).

Embedding trust architecture throughout the peer review process 

We can counteract many of the research integrity challenges publishers face by combining the efficiencies afforded by technology with the rigor and experience of reviewers and editors. Research integrity checks, embedded directly into peer review software to avoid adding the complexity of new applications, are getting better every day at screening for plagiarism, image manipulation, ethical compliance, data availability, and authorship conflicts.

These tools can do more than catch problems early; they also streamline editorial decision-making and reduce the burden on human reviewers. By surfacing key issues before peer review even begins, these checks allow reviewers to focus on scientific merit rather than policing standards. The result is a faster process that produces higher-quality, more reliable publications.

I think it is unlikely that these types of checks will ever replace peer review. There is too much room for nuance, and ceding control over peer review to these technologies risks undermining trust even further if they are found to be ineffective. Think of them as tools to help you meet the profound responsibility of being a steward of the scholarly record.  

Research integrity today may be as much about perceived legitimacy as about the quality of the research itself. By collaborating to co-create ecosystems that are efficient, ethical, and transparent, we can create the environment needed to reinforce the foundation of trust in science.

Science doesn’t ask for blind faith — it earns belief through process 

We may have moved beyond the information or knowledge economy to the attention economy. The best response to a system that thrives on noise is to make visible the systems that keep science ethical.

Peer review is one of those systems. When it works well, with transparency and flexibility, bolstered by automated integrity checks and AI assistants, it is more than a publishing workflow. It is a promise. A promise that scholarship, even when imperfect, is accountable. A promise that complex truths are worth defending and that trust can be rebuilt one review at a time.
