
Should AI code be available for reviewers?

Issue #76

Data, Numbers

by Michael Seadle


According to a Retraction Watch post by Ivan Oransky on 14 May 2024, “[a] group of researchers is taking Nature to task for publishing a paper earlier this month about Google DeepMind’s protein folding prediction program without requiring the authors publish the code behind the work.”¹ The issue is whether the new artificial intelligence tool “AlphaFold3” can reliably predict the behaviour of complex molecules such as DNA and RNA. The authors of the complaint write: “the model’s limited availability on a hosted web server, capped at ten predictions per day, restricts the scientific community’s capacity to verify the broad claims of the findings or apply the predictions on a large scale. Specifically, the inability to make predictions on novel organic molecules akin to chemical probes and drugs, one of the central claims of the paper, makes it impossible to test or use this method.”¹


Nature editor in chief Magdalena Skipper told Retraction Watch: “While seeking to enhance transparency at every opportunity, Nature accepts that there may be circumstances under which research data or code are not openly available. When making a decision on data and code availability, we reflect on many different factors, including the potential implications for biosecurity and the ethical challenges this presents.”¹ What remains unclear is what exactly is at stake other than Google’s proprietary interest in restricting access to its code.


Nature’s policy raises a larger question for the research community: under what circumstances is it legitimate to withhold the basis for the claims in a scholarly article from the people reviewing that article for its validity? It is easy to imagine companies less open than Google hampering reviewers’ ability to judge reliability by refusing to provide essential information. It also raises the question of whether artificial intelligence tools in general should be exempt from the normal rules of peer review, since a peer reviewer cannot reasonably make a judgement about a black-box system.

 

1. Oransky, Ivan. “Nature earns ire over lack of code availability for Google DeepMind protein folding paper.” Retraction Watch, 14 May 2024. https://retractionwatch.com/2024/05/14/nature-earns-ire-over-lack-of-code-availability-for-google-deepmind-protein-folding-paper/
