Retraction Watch - Misspelled cell lines and how to prevent them

Thanks to Jennifer Byrne and Retraction Watch for highlighting a very important reproducibility topic: the misspelling of reagent and resource names. As Dr. Byrne points out, these ghost cell lines take on a life of their own, most likely as they are shared between labs. This makes it impossible to verify a cell line's authenticity and introduces significant errors into the literature.

To combat this, Byrne suggests that all papers using cell lines check the cell line names against the authority, Cellosaurus, and add an RRID for each cell line used, so that the information about each cell line is accurate.
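As an illustration (not part of the Retraction Watch piece), a cell line RRID pairs the name with a Cellosaurus accession, e.g. "HeLa (RRID:CVCL_0030)". A minimal Python sketch of a syntax check for such citations, assuming the standard CVCL_ accession format; the regex is illustrative, not Cellosaurus's own validator:

```python
import re

# Cellosaurus cell line accessions have the form CVCL_ followed by
# four alphanumeric characters (illustrative pattern).
RRID_PATTERN = re.compile(r"RRID:\s*CVCL_[A-Za-z0-9]{4}")

def has_cell_line_rrid(sentence: str) -> bool:
    """Return True if the sentence contains a Cellosaurus-style RRID."""
    return bool(RRID_PATTERN.search(sentence))

print(has_cell_line_rrid("HeLa cells (RRID:CVCL_0030) were cultured."))  # True
print(has_cell_line_rrid("HeLa cells were cultured."))                   # False
```

A check like this only confirms that an RRID is present and well-formed; confirming that the accession actually matches the named cell line still requires looking it up in Cellosaurus.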

We could not agree more; thank you for highlighting how RRIDs can help clean up the scientific literature.

https://retractionwatch.com/2024/03/11/misspelled-cell-lines-take-on-new-lives-and-why-thats-bad-for-the-scientific-literature/

SciScore and the Karger Vesalius innovation award

Vesalius Innovation Award

by Karger Publishers

"From the third time on, it's tradition!" True to this principle, Karger Publishers is pleased to once again invite innovative companies with a focus on Open Science and Health Sciences to participate in the third Vesalius Innovation Award in 2022.

The award's namesake, the surgeon Andreas Vesalius, did not just revolutionize anatomy when he published De Humani Corporis Fabrica in 1543. His work also took typography and illustration to a new level, laying the foundation for an entirely new view of the human body for generations to come.

Fast forward to 2022: Health Sciences publishing is ready for a new revolution. The movement toward Open Research and the increased use of digital technologies in healthcare are fundamentally changing the way researchers, doctors, and patients create and consume knowledge.

To further develop the award, the focus for participating startups has this year been expanded to include "Open Science", which further reflects the innovative spirit of Andreas Vesalius.

Our Finalists

The last few weeks were very exciting for the Vesalius Innovation Award team and for the members of the jury, who had to choose the five finalists who will now benefit from the mentoring program.

Everyone involved was impressed by the high quality of the applications, which made the selection process pleasantly difficult but led to very fruitful discussions within the decision-making committee.

We are delighted that we can now present the five finalists.

alviss.ai develops AI software to assist scientists and publishers in the scientific article reviewing process. Our software provides a toolkit for users to optimize any article and streamline the publication process.

ImageTwin is a solution for detecting manipulations and duplications in the figures of scientific articles. By comparing figures against a database of existing literature, problematic images are identified within seconds for all relevant image types, including blots, microscopy images, and light photography.

Prophy believes that fair, transparent, and efficient peer review lies at the foundation of all good scientific research. As an organisation founded by scientists for science, they use Artificial Intelligence to power an automated expert finder, delivering independent reviewers who can review any manuscript from any discipline, ensuring you can trust the science you read.

SciScore is a scientific content checker / validation tool that verifies common rigor criteria (NIH, MDAR, ARRIVE) and research resources (antibodies, cell lines, organisms). SciScore uses text mining techniques to perform this critical validation in minutes, providing a report to the editors, reviewers, or authors about criteria that have and have not been addressed.

scientifyRESEARCH is an open-access, curated, and structured research funding database that connects researchers with research funding information. Our database covers global funding across all disciplines and all career stages.

https://www.karger.com/company/innovation/vesalius-innovation-award

FASEB Journals Now Provide Researchers with SciScore Tool to Improve Rigor and Reproducibility

FASEB has a new tool available for researchers to improve the rigor and reproducibility of science submitted to FASEB journals. An automated tool, SciScore, is now integrated into the journals’ submission system, and provides key recommendations and practical steps researchers can take to improve the rigor and reproducibility of their reported science.

“As part of our mission to advance health and well-being by promoting research and education in biological and biomedical sciences, FASEB has long demonstrated a commitment to scientific integrity,” says Darla Henderson, PhD, Director of FASEB Open Science and Research Integrity. “Rigor and reproducibility are a core component of scientific integrity and integrating the SciScore tool into our submission and peer review system provides instant feedback directly to the researcher outlining practical steps they can take to improve reporting.”

SciScore is an automated text-mining tool that evaluates the methods section of scientific articles and checks compliance against recommendations, requirements, and best practices for rigor and reproducibility. The tool generates for researchers both an overall score and a detailed report that provides guidance on potential improvements. FASEB has included the tool in their submission and peer review process as a support tool for authors, available for use at submission and again prior to publication.

“We are very excited to be able to collaborate with our FASEB colleagues and really hope that we can be of service to the authors.” says Anita Bandrowski, CEO of SciCrunch, the company that built SciScore. “At the end of the day, checking and verifying little things like whether an antibody includes enough information to find the reagent is fairly tedious and authors can miss something. However, that missing bit of information can really scuttle another researcher's project, so we hope that these little reminders to make the manuscript better at publication will improve the overall quality of the FASEB journals.” 

Henderson adds “Research integrity and quality are at the center of everything we do. By taking a slightly different approach, giving researchers direct access to the tool, the report, and the score at both initial submission and again pre-publication, we are empowering the community with resources to better understand rigor and reproducibility issues and to enact their own change, much in the same way we give researchers resources to improve data sharing and reuse through our recently launched DataWorks! initiative. FASEB will also be able to review aggregated scores over time and assess how our community is improving in this key metric for research integrity, helping us identify and solve additional researcher needs as we move together towards an open science world.”

https://stagingfaseb.citrodigital.biz/journals-and-news/latest-news/faseb-journals-provide-researchers-with-sciscore-tool-to-improve-rigor-and-reproducibility

OpenBehavior adopts RRIDs

The RRID Initiative by OpenBehavior and SciCrunch

The OpenBehavior project received support from the National Science Foundation in January 2021. There are three main goals for the initial funding period: (1) create a database of open-source tools used in behavioral neuroscience research and issue Research Resource Identifiers (RRIDs) for all projects featured on the OpenBehavior website, (2) initiate a repository of video recordings for common neuroscience tasks and community conversations on video analysis, and (3) host training sessions on the use of open-source hardware and software at conferences such as the annual Society for Neuroscience meeting.

The first goal of the project was achieved this summer thanks to the hard work of Marty Isaacson, a senior majoring in neuroscience at American University and working in the Laubach Lab, and of Anita Bandrowski and Edyta Vieth from SciCrunch.

Full article: https://edspace.american.edu/openbehavior/2021/08/05/rrid-initiative/

Research Square Launches Beta Testing for SciScore Automated Assessment Tool

The SciScore tool is now being offered to Research Square preprint authors. The tool detects RRIDs, verifies them, flags sentences where RRIDs should appear, and attempts to suggest RRIDs when it can find a catalog number. Please see the press release.
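As a rough illustration of the kind of check described above (this is a toy sketch, not SciScore's actual text-mining pipeline; the keyword list and regexes are assumptions), one can flag sentences that mention a research resource but carry no RRID:

```python
import re

# RRID prefixes in common use: AB_ (antibodies), CVCL_ (cell lines),
# SCR_ (software/tools), IMSR_ (mouse strains). Illustrative pattern only.
RRID = re.compile(r"RRID:\s*(AB_|CVCL_|SCR_|IMSR_)\w+")
# Toy keyword list standing in for real named-entity recognition.
RESOURCE_TERMS = re.compile(r"\b(antibody|cell line|mouse|mice|software)\b", re.I)

def sentences_missing_rrids(text: str) -> list[str]:
    """Return sentences that look like resource mentions but lack an RRID."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences if RESOURCE_TERMS.search(s) and not RRID.search(s)]

methods = ("Anti-tubulin antibody was used at 1:1000. "
           "Images were analyzed with ImageJ software (RRID:SCR_003070).")
print(sentences_missing_rrids(methods))  # ['Anti-tubulin antibody was used at 1:1000.']
```

A real tool would use trained entity recognition rather than keywords, and would verify each detected RRID against the corresponding authority before suggesting fixes.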

https://www.sspnet.org/community/news/research-square-launches-beta-testing-for-sciscore-automated-assessment-tool/

RRIDs are now part of the MDAR Checklist

The MDAR checklist has been announced:

https://www.cos.io/blog/minimal-reporting-standards

The checklist includes RRIDs for antibodies, cell lines, and organisms. https://osf.io/bj3mu/

This checklist is well accepted by authors:

http://crosstalk.cell.com/blog/testing-the-materials-design-analysis-reporting-mdar-checklist

The checklist is now required by Science publications:

https://blogs.sciencemag.org/sciencehound/2018/11/09/towards-minimal-reporting-standards-for-life-sciences/

https://science.sciencemag.org/content/367/6473/5.full

MedTech News: SciScore’s Innovative Solution Supports Pre-clinical Research Reproducibility

Boston's MedTech News finds the story about the RRID compliance tool SciScore intriguing:

https://massachusettsnewswire.com/medtech-news-sciscores-innovative-solution-supports-pre-clinical-research-reproducibility-44125/

SAN DIEGO, Calif., Nov 22, 2019 (SEND2PRESS NEWSWIRE) — SciScore announces the release of its innovative solution, the first and only working application of its kind, in support of the pre-clinical scientific research community’s pursuit of reproducibility and transparency.

“Finding the cure for any medical ailment facing our society costs money. And, rightly so, the public has great expectation that the money spent on research will advance healthcare,” says Anita Bandrowski, a neuroscience researcher at the University of California, San Diego and CEO of SciScore. “This tool makes it easier for researchers to focus on the work at hand by indicating when, or if, something was overlooked or omitted in the process of reporting the research in a manuscript.”

In January 2016, the National Institutes of Health (NIH) introduced new grant review guidelines focused on four key areas of reproducibility and transparency. This move changed the way in which grants are awarded today. “It remains to be seen in time, but it’s possible that NIH changed the business of pre-clinical medical research for the better, and for good,” Bandrowski said.

In conjunction with NIH, many journals have revised their author guidelines to direct researchers to include and emphasize elements required for reproducibility and transparency: PLoS, JBC, eLife, AACR, MBoC, and GSA. SciScore is being piloted by the following publishers: Wiley & Sons, Nature Research, and eLife.

SciScore provides a score and a supporting report that the agency, publisher, or individual author can use to identify whether key areas of reproducibility and transparency are addressed in the manuscript. It uses AI and deep learning technology to calculate a score by looking for evidence of randomization, blinded conduct of the experiment, sample size estimation, inclusion of sex as a biological variable, and cell line authentication or contamination checks. It also detects resource ambiguity, such as a mislabeled or unidentified cell line.

An author may improve a score by adding missing information or correcting information that is unclear. The manuscript submitted for analysis is removed from the cloud server almost immediately after scoring, keeping information secure and private.

For more information, visit http://sciscore.com/.

About SciScore:

SciScore (SciScore.com) is an application developed by SciCrunch Inc. (scicrunch.com) supported by the Small Business Innovation Research (SBIR) program grants R43OD024432 and R44MH119094.