SciScore and the Karger Vesalius Innovation Award

Vesalius Innovation Award

by Karger Publishers

"From the third time on, it's tradition!"- True to this principle, Karger Publishers is pleased to invite innovative companies with a focus on Open Science and Health Sciences to participate in the 3rd Vesalius Innovation Award in 2022 once again.

The award's namesake, the surgeon and anatomist Andreas Vesalius, did not only revolutionize anatomy when he published De Humani Corporis Fabrica in 1543. His work also took typography and illustration to a new level, laying the foundation for an entirely new view of the human body for generations to come.

Fast forward to 2022: Health Sciences publishing is ready for a new revolution. The movement towards Open Research and the increased use of digital technologies in healthcare are fundamentally changing the way researchers, doctors, and patients create and consume knowledge.

To develop the award further, the focus for participating startups has this year been expanded to include Open Science, which further reflects the innovative spirit of Andreas Vesalius.

Our Finalists

The past few weeks were exciting for the Vesalius Innovation Award team as well as for the members of the jury, who had to choose the five finalists that will now benefit from the mentoring program.

Everyone involved was impressed by the high quality of the applications, which made the selection process pleasantly difficult but led to very fruitful discussions within the decision-making committee.

We are delighted that we can now present the five finalists.

alviss.ai develops AI software to assist scientists and publishers in the scientific article reviewing process. Our software provides a toolkit for users to optimize any article and streamline the publication process.

ImageTwin is a solution for detecting manipulations and duplications in the figures of scientific articles. By comparing figures against a database of existing literature, it identifies problematic images within seconds for all relevant image types, including blots, microscopy images, and photographs.

Prophy believes that fair, transparent and efficient peer review lies at the foundation of all good scientific research. As an organisation founded by scientists for science, they use Artificial Intelligence to power an automated expert finder, delivering independent reviewers who can review any manuscript from any discipline, ensuring you can trust the science you read.

SciScore is a scientific content checker / validation tool that verifies common rigor criteria (NIH, MDAR, ARRIVE) and research resources (antibodies, cell lines, organisms). SciScore uses text mining techniques to perform this critical validation in minutes, providing a report to the editors, reviewers, or authors about criteria that have and have not been addressed.

scientifyRESEARCH is an open access, curated and structured research funding database to connect researchers with research funding information. Our database covers global funding across all disciplines and all career stages.

https://www.karger.com/company/innovation/vesalius-innovation-award

Rigor and Transparency Index: Large Scale Analysis of Scientific Reporting Quality - published in Journal of Medical Internet Research

JMIR Publications recently published “Establishing Institutional Scores With the Rigor and Transparency Index: Large-scale Analysis of Scientific Reporting Quality” in the Journal of Medical Internet Research (JMIR). The article reports that improving rigor and transparency measures should lead to improvements in reproducibility across the scientific literature, but that assessing measures of transparency is very difficult when performed manually by reviewers.

Video interview with the authors of this article: https://youtu.be/iWcNuCOKp7U

The overall aim of this study is to establish a scientific reporting quality metric that can be used across institutions and countries, as well as to highlight the need for high-quality reporting to ensure replicability within biomedicine, making use of manuscripts from the Reproducibility Project: Cancer Biology.

The authors describe an enhancement of the previously introduced Rigor and Transparency Index (RTI), which attempts to automatically assess the rigor and transparency of journals, institutions, and countries using manuscripts scored on criteria found in reproducibility guidelines (e.g., NIH, MDAR, ARRIVE).

Using work by the Reproducibility Project: Cancer Biology, the authors could determine that replication studies scored significantly higher than the original papers, all of which, according to the project, required additional information from the authors before replication efforts could begin.

Unfortunately, RTI measures for journals, institutions, and countries all currently score lower than the replication study average. If the RTI of these replication studies is taken as a target for future manuscripts, more work will be needed to ensure the average manuscript contains sufficient information for replication attempts.

Dr. Anita Bandrowski from the University of California San Diego said:

“Research reproducibility is necessary for scientific progress. However, over the last decade, numerous reports on research irreproducibility have shed light on a lingering problem, one that is proving to be both troublesome and costly.”

In an effort to encourage reproducibility, numerous scientific organizations and journals have adopted the Transparency and Openness Promotion guidelines, which focus on establishing best practices at the level of individual journals. 

In a similar vein, the publisher-driven Materials Design, Analysis, and Reporting framework is a multidisciplinary research framework designed to improve reporting transparency across life science research at the level of individual manuscripts.

This framework provides a consistent, minimum reporting checklist whose criteria were used, in part, to create the first RTI, a journal quality metric focusing on research methodologies and reporting transparency.

Specifically, the authors here introduce the latest version of the RTI, which represents the mean SciScore over a subset of papers, and demonstrate how it can be used to assess reporting transparency within research institutions.
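As a rough illustration of that definition, a minimal sketch of computing an institutional RTI as the mean per-paper SciScore might look like the following; the function name and the example scores are hypothetical, not taken from the article.

```python
from statistics import mean

def rigor_transparency_index(sciscore_values):
    """Illustrative sketch: the article describes the RTI as the mean
    SciScore (a 0-10 per-paper score) over a set of papers from one
    journal, institution, or country."""
    if not sciscore_values:
        raise ValueError("at least one scored paper is required")
    return round(mean(sciscore_values), 2)

# Hypothetical example: five scored papers from one institution
print(rigor_transparency_index([4.0, 6.5, 5.0, 7.0, 3.5]))  # -> 5.2
```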

While we cannot simply describe all papers scoring a “2” as not replicable and all papers scoring an “8” as replicable, as numerous fields and their subsequent best practices exist, we can state that higher scores are associated with more methodological detail and as such are likely easier to use to attempt a replication. 


###

DOI - https://doi.org/10.2196/37324

Full-text - https://www.jmir.org/2022/6/e37324

Free Altmetric Report - https://jmir.altmetric.com/details/130343509

JMIR Publications is a leading, born-digital, open access publisher of 30+ academic journals and other innovative scientific communication products that focus on the intersection of health and technology. Its flagship journal, the Journal of Medical Internet Research, is the leading digital health journal globally in content breadth and visibility, and it is the largest journal in the medical informatics field.

To learn more about JMIR Publications, please visit https://www.JMIRPublications.com or connect with us via:

Head Office - 130 Queens Quay East, Unit 1100, Toronto, ON M5A 0P6, Canada

If you are interested in learning more about promotional opportunities, please contact us at Communications@JMIR.org

The content of this communication is licensed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, published by JMIR Publications, is properly cited.

JMIR Publications is a registered trademark of JMIR Publications.

About SciScore

SciScore is a scientific content checker / validation tool that verifies common rigor criteria (NIH, MDAR, ARRIVE) and research resources (antibodies, cell lines, organisms). These guidelines can be checked by editorial staff, but the process is tedious and takes a lot of effort from a skilled professional, so checklists are enforced only in the best-resourced journals. SciScore uses text mining techniques to do the job in minutes, providing a report to the editors, reviewers, or authors about criteria that have and have not been addressed. Furthermore, it provides a numerical score, which allows editors to assess the percentage of criteria met or not met at a glance.
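To make the idea of a percentage-style score concrete, here is a minimal sketch under the assumption that the score is simply the share of expected criteria detected in the methods section; the criterion names and the rescaling to 0-10 are illustrative assumptions, not SciScore's actual scoring rules.

```python
def criteria_score(detected, expected):
    """Illustrative only: share of expected rigor criteria found in a
    methods section, rescaled to the 0-10 range mentioned elsewhere
    in these releases."""
    if not expected:
        return 0.0
    met = sum(1 for criterion in expected if criterion in detected)
    return round(10 * met / len(expected), 1)

expected = {"ethics statement", "blinding", "randomization",
            "power calculation", "cell line authentication"}
detected = {"ethics statement", "randomization"}
print(criteria_score(detected, expected))  # -> 4.0
```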

Contact Researchers: Anita Bandrowski | anita@scicrunch.com

Contact Media/Publishers: Martijn Roelandse | martijn@martijnroelandse.dev

LCSB partners with SciCrunch to promote reproducibility of research

Luxembourg, 3 December 2020 – As part of a broad initiative aiming to guarantee research quality and the reproducibility of scientific results, the Luxembourg Centre for Systems Biomedicine (LCSB) at the University of Luxembourg is partnering with SciCrunch Inc. The LCSB will be one of the first academic institutions to use SciScore – an automated validation tool for scientific articles – as part of its internal quality control process. It will contribute to further enhancing the rigour and reproducibility of the publications written by LCSB’s researchers.


Scientific research always faces new challenges and, with the increasing volume of data, the complexity of new tools and the fast pace of modern science, ensuring that experiments can be repeated and results validated is as crucial as ever. Over the past years, the scientific community has widely acknowledged that the reproducibility crisis needs to be addressed in order to guarantee trust in the published literature and best use of valuable resources.

Early on, the LCSB recognised reproducibility as a very important topic and decided to tackle the issue by implementing measures to promote research quality. Grouped under the umbrella of the Responsible and Reproducible Research (R3) initiative, these measures include state-of-the-art IT infrastructure, GDPR-compliant data processes and tools for high-quality scientific computing code. “A particular emphasis has been placed on a standardised publication workflow which will now be complemented through the development of a pre-publication check,” details Dr Christophe Trefois, leader of the R3 team.

This internal verification will monitor compliance with the latest standards and the quality of all manuscripts written at the LCSB, through a series of checks addressing issues such as plagiarism, data protection and source code quality. SciScore, through its rigor check, will be one of the main components in this pre-publication pipeline.

“Part of the recent research is not reproducible due to flaws in reference material, unreliable source identification, and similar issues,” explains Anita Bandrowski, Founder and CEO of SciCrunch, the company behind SciScore. “Our solution helps flag these issues before scientific articles become part of the permanent record.”

This automated validation tool verifies common rigor criteria and research resources in manuscripts. Scanning through the methods section of an article using text mining techniques, SciScore detects whether the authors address issues such as bias, sample size, blinding, randomisation and more. It also analyses sentences that mention research resources – such as antibodies, cell lines, organisms and software tools – and determines how uniquely identifiable each resource is. SciScore performs this critical validation in roughly a minute, generates a score to assess the percentage of criteria met and provides a report that will help improve the rigor and reproducibility of the manuscript.
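As an illustration of the resource-identification side, a very simplified check for RRIDs in a methods section could be sketched as a pattern match like the one below; the regular expression and sample text are assumptions for demonstration only and are far cruder than SciScore's actual entity recognition.

```python
import re

# Matches citations such as RRID:AB_2138153, RRID:CVCL_0031 or RRID:SCR_001622
RRID_PATTERN = re.compile(r"RRID:\s?([A-Z]+_?[A-Za-z0-9_:-]+)")

def find_rrids(methods_text):
    """Return the RRIDs cited in a methods section (illustrative sketch only)."""
    return RRID_PATTERN.findall(methods_text)

sample = ("Cells were stained with anti-GFAP (Sigma, Cat# G3893, RRID:AB_477010) "
          "and analysed in ImageJ (RRID:SCR_003070).")
print(find_rrids(sample))  # -> ['AB_477010', 'SCR_003070']
```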

“As an integral part of the pre-publication check, SciScore will help LCSB authors ensure their manuscripts are better prepared for peer review,” says Dr Trefois. The tool will first be used to go through a database of all the articles published at the LCSB, giving a good insight into some reproducibility and transparency aspects of previous research. “Then, as the pre-publication check is gradually implemented, our scientists will be able to assess their new manuscripts quickly and accurately.” It will make it easier for authors to focus on the work at hand by indicating when, or if, something was overlooked or omitted. This immediate feedback will allow them both to learn best practices for writing the methods section and to directly implement the relevant changes.

“Most of the stakeholders in the scientific community have acknowledged the reproducibility issue and many academic institutions are currently developing strategies to address it, but practical solutions are not yet widely implemented,” details Prof. Rudi Balling, director of the LCSB. “We made it a priority at the LCSB and we are very happy to be the first research centre to add SciScore to its toolbox to promote quality, transparency and reproducibility of science.”

 

---

 

About the LCSB

The LCSB is an interdisciplinary research centre at the University of Luxembourg. It is accelerating biomedical research by closing the link between systems biology and medical research. Collaboration between biologists, medical and computer scientists, physicists, engineers as well as mathematicians is offering new insights into complex systems like cells, organs and organisms. These findings are essential for understanding principal mechanisms of disease pathogenesis and for developing new tools in diagnostics and therapy.

Neurodegenerative diseases like Parkinson’s disease and the description of diseases as networks are the focus of the LCSB’s research. The Centre has established strategic partnerships with leading biomedical laboratories worldwide and with all major biological and medical research units in Luxembourg. The LCSB fosters collaboration with industrial partners and has founded several spin-off companies, thereby accelerating the translation of fundamental research results into clinical applications.

https://wwwen.uni.lu/lcsb

 

About SciCrunch Inc.

Based in San Diego, California, SciCrunch Inc. was founded on a desire to reduce scientific irreproducibility. Our mission is to improve the scientific literature through the development of tools and services around the provisioning of Research Resource Identifiers (RRIDs). Initially, SciCrunch was established specifically to address the long-term sustainability of technologies and data assets developed through the Neuroscience Information Framework (NIF) and the NIDDK Information Network (dkNET) projects. As a result, we are constantly advocating to ensure that data remain open and free to use in order to build for a more sustainable tomorrow.

SciCrunch Inc. was founded in December 2015 by Drs. Bandrowski and Martone. With over 50 years of combined experience in the industry, they saw a need for a company that not only knew the ins and outs of the research life cycle but also looked at the big picture with an emphasis on scientific reproducibility. Between them, they have developed and led dozens of teams across a multitude of projects, including Force11 and the Resource Identification Initiative, working to restore the public's confidence in scientific research.

https://www.scicrunch.com

 

Karger Starts Trial with Methods Review Tool SciScore

Basel, Switzerland, 5 November 2020

Karger Publishers supports scientists in evaluating their manuscripts for reproducibility.

Karger has started a trial with the artificial intelligence (AI) based methods review tool SciScore, which assesses the reproducibility of research. Initially, SciScore analyzes four of Karger’s journals, comprising 28,690 articles. This provides insight into the reproducibility and transparency of the research, both on a general level and in depth across various rigor criteria and research resources. Compliance with rigor criteria is evaluated, and the research resources used in the experiments are reviewed to see whether they can be uniquely identified using a persistent identifier (Research Resource Identifier, RRID). The analysis results in a score (0–10) for each scientific paper, and these scores are averaged for each journal every year.
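A minimal sketch of that per-journal, per-year averaging step, assuming article records with journal, year, and score fields (all names and numbers below are made up for illustration):

```python
from collections import defaultdict

def journal_yearly_averages(scored_articles):
    """Average the per-article SciScores (0-10) for each journal and year.
    Illustrative sketch; the field names are assumptions."""
    buckets = defaultdict(list)
    for article in scored_articles:
        buckets[(article["journal"], article["year"])].append(article["score"])
    return {key: round(sum(vals) / len(vals), 2) for key, vals in buckets.items()}

articles = [
    {"journal": "Liver Cancer", "year": 2019, "score": 4.5},
    {"journal": "Liver Cancer", "year": 2019, "score": 6.0},
    {"journal": "Dermatology", "year": 2019, "score": 3.5},
]
print(journal_yearly_averages(articles))
# {('Liver Cancer', 2019): 5.25, ('Dermatology', 2019): 3.5}
```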

The Karger pilot journals are:

Psychotherapy and Psychosomatics

Liver Cancer

Dermatology

Cytogenetics and Genome Research

In the future, Karger plans to use these key performance indicators in the peer review process to improve the reproducibility of the research published by Karger.

“By integrating SciScore, Karger is supporting scientists in evaluating their manuscripts for reproducibility, a key criterion of scientific quality. We continually strive to improve our products and services for our customers, as the Health Sciences community is at the center of everything we do. Partnerships with startups are crucial to this objective,” says Daniel Ebneter, CEO at Karger Publishers.

“We are pleased to partner with Karger to enhance reproducibility of published papers. We are also very excited to interact with authors ahead of publication to improve reporting standards at such a trusted publisher in Health Sciences,” said Anita Bandrowski, Founder and CEO of SciCrunch, the company that offers SciScore.

About Karger Publishers

Karger Publishers is a worldwide publisher of scientific and medical content based in Basel, Switzerland. It is independent and family-led in the fourth generation by Chairwoman and Publisher Gabriella Karger. Connecting and advancing health sciences since 1890, Karger has been continuously evolving, keeping pace with the current developments and shifts in research and publishing. The publishing house is dedicated to serving the information needs of the scientific community, clinicians, and patients with publications of high-quality content and services in health sciences. Karger Publishers has 240 employees and is present in 15 countries around the globe.

For more information about Karger Publishers, please visit karger.com

About SciScore

SciScore is a scientific content checker / validation tool that verifies common rigor criteria (NIH, MDAR, ARRIVE) and research resources (antibodies, cell lines, organisms). SciScore uses text mining techniques to perform this critical validation in minutes, providing a report to the editors, reviewers, or authors about criteria that have and have not been addressed. Furthermore, it provides a numerical score, which allows editors to assess the percentage of criteria met or not met at a glance.

For more information, visit scicrunch.com/sciscore.

Download media release here

Sneak peek at an automated MDAR report

Previously, we described the efforts of the MDAR group to create a rigor and transparency checklist that will phase in as publisher- or journal-specific checklists phase out.

Journals including the British Journal of Pharmacology, Nature, Cell, and many others are doing exceptional work improving rigor reporting using journal- or publisher-specific checklists. MDAR should be an improvement if publishers adopt the checklist as planned, because everyone will then, in theory, be working from the same checklist.

The checklist is currently required by Science (since January 2020, each publication is accompanied by an MDAR_reproducibility_checklist document; check it out)! Kudos to Science, this is an important step in the right direction.

Why is MDAR frightening to publishers?
Besides hearing the shrieks of authors filling out a 30+ point checklist, for the checklist to be effective in improving manuscripts someone has to verify it, as journals with checklists in place already know! That is horrifying indeed, because the MDAR group provides no acceptance criteria for what constitutes a well or badly filled-out checklist.

Because of the amount of work it asks of authors, Science only requires the MDAR checklist at the final "your manuscript has been accepted" stage. At that point the checklist is likely to be completed without too many audible shrieks from authors. However, this misses an important opportunity to influence the manuscript while authors and reviewers can still be alerted to potential problems and the checklist can be part of the conversation about the quality of the science.

...what to do...

On the one hand, the checklist is labor-intensive and painful for authors; on the other, the rigor items are important to address during review.

Automation to the Rescue?
The SciScore team thought about this a lot, then stopped thinking and started doing. 

[Figure: an author-created MDAR checklist (left) shown side by side with the SciScore-generated MDAR checklist (right)]

SciScore, an automated tool for checking rigor criteria in manuscripts, will get a major upgrade in the coming month. The figure above is a sneak peek at the new automated MDAR checklist (using data from Hansen et al., 2020). The author-created MDAR checklist is on the left and the SciScore-generated checklist is on the right. The two look very similar; the difference is largely in the mechanics. SciScore takes the methods section and, roughly a minute later, an MDAR report is created and a completeness score is made available.

The score is a quick way for editors to assess roughly what percentage of criteria are addressed by authors in the methods section, making the verification step a little more tenable for journals.
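To give a feel for what such an automated checklist plus completeness score might reduce to, here is a hedged sketch; the criteria, detected sentences, and report layout are placeholders, not output of the actual tool.

```python
def mdar_style_report(detections):
    """Print a minimal MDAR-style checklist from detected criteria
    (illustrative sketch; the real report is far richer)."""
    for criterion, sentence in detections.items():
        status = "addressed" if sentence else "not detected"
        print(f"{criterion:<22} {status:<14} {sentence or ''}")
    met = sum(1 for s in detections.values() if s)
    print(f"Completeness: {met}/{len(detections)} criteria detected")

mdar_style_report({
    "Ethics statement": "Approved by the local IACUC (protocol 12-345).",
    "Randomization": "Mice were randomly assigned to treatment groups.",
    "Blinding": None,
    "Power calculation": None,
})
```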

New Rigor Criteria, now being tested:

  • Ethics statement: IRB / Consent / IACUC / Field sample permit

  • Investigator blinding

  • Randomization of subjects into groups

  • Power calculation for group size

  • Statistical tests used

  • Sex as a biological variable

  • Subject demographics (age / weight)

  • Protocols / Clinical Trials IDs verified (including both EU and US clinical trials, and protocol journals / repositories like protocols.io)

  • Software repository URLs verified (GitHub, Bitbucket, Google Code)

  • Data repository IDs verified (GEO, ArrayExpress, UniProt, etc.)

  • Inclusion & Exclusion criteria / Attrition

  • Replication (type and number)

  • Cell line contamination / Authentication

  • Presence of or the need for all RRID types (antibodies, cell lines, plasmids, organisms etc)

We still do not know if having a software tool that validates a checklist is going to help authors and reviewers improve manuscripts, but the promise of a quick automated check may entice more journals to look at MDAR in their workflow.  

Curious to see an MDAR Sample Report? Click HERE

Research Square Launches Beta Testing for SciScore Automated Assessment Tool

Authors can use assessments to improve manuscripts for free during beta trials 

DURHAM, N.C., USA (September 29, 2020)--Research Square is beta testing the new artificial intelligence (AI) based SciScore assessment tool for its preprint platform.

SciScore will help scientists assess their manuscripts for reproducibility and adherence to rigor standards after uploading their preprints onto Research Square -- at no cost during the beta trial.

“As an integrated part of the preprinting process on our platform, SciScore will help authors ensure the science behind their manuscripts is better prepared for peer review,” said Rachel Burley, President of Research Square Company. “We are excited to expand the range of tools and services available to researchers through the Research Square platform and to work with such an innovative partner.”

SciScore scans preprint methods against various research guidelines and rigor criteria known to support the reproducibility of scientific research, including evidence of reagent identifiability, randomization, sample size estimation, and more. SciScore also analyzes sentences for uniquely identifiable research resources, then generates a Methods Completeness Score and report that will help authors improve the rigor and reproducibility of their preprints. 

“It’s estimated that 50 percent of the United States’ preclinical research spend in recent years is not reproducible, mainly due to flaws in reference material, unreliable source identification, and similar issues,” said Anita Bandrowski, Founder and CEO of SciCrunch, which produces SciScore. “Our solution helps flag cell line contamination and other issues for authors at the preprint phase, before they’re submitted to journals.”

Authors uploading their manuscripts to Research Square can opt to receive the SciScore-based Methods Completeness Score and report at no cost through November 1, 2020.

About Research Square

Research Square, a division of Research Square Company, exists to make research communication faster, fairer, and more useful. Our industry-leading preprint platform, launched in 2018, is a large, author-centric preprint server that brings transparency to the peer review process. Through our journal-integrated In Review service, innovative author dashboard, manuscript assessments, and research promotion services, we enable researchers to establish the primacy of their work, share it with the broader community, and receive useful feedback much earlier in the publication process. By improving the way science is shared, we accelerate the pace of global discovery and advancement.

For more information on our platform and research promotion services, visit researchsquare.com.

About SciScore

SciScore is a scientific content checker / validation tool that verifies common rigor criteria (NIH, MDAR, ARRIVE) and research resources (antibodies, cell lines, organisms). SciScore uses text mining techniques to perform this critical validation in minutes, providing a report to the editors, reviewers, or authors about criteria that have and have not been addressed. Furthermore, it provides a numerical score, which allows editors to assess the percentage of criteria met or not met at a glance.

For more information, visit scicrunch.com/sciscore.

###

Contacts: 

Phillip Bogdan, Communications Manager, Research Square Company

phillip.bogdan@researchsquare.com

Martijn Roelandse, Lead Business Development, SciScore

martijnroelandse@me.com

Aries Systems and SciScore Partner to Enable Enhanced Rigor and Reproducibility within Editorial Manager

North Andover, Massachusetts, 7 July 2020

Aries Systems Corporation, a leading technology workflow solutions supplier to the scholarly publishing community, is pleased to announce its partnership with SciScore™. Aries and SciScore have partnered to integrate Editorial Manager® (EM), a cloud-based manuscript submission and peer review system for scholarly journals, reference works, books and other publications, with SciScore, the leading methods review tool for scientific articles.

A poorly controlled study impacts the quality of scientific research. It is critical for those in the peer review process to adhere to rigor and transparency criteria to ensure their contributions support the reproducibility of scientific research. Focusing on these methods, SciScore detects whether authors address bias, sample size, sex, blinding, and the randomization of subjects, and whether key biological resources (i.e., research reagents) are properly identified within an Author’s manuscript, producing a score that roughly corresponds to the number of criteria addressed compared to the number expected.

Prior to the integration with Editorial Manager, Authors would need to send key manuscript metadata into SciScore to have their content evaluated. Now, Editorial Manager can send this content to SciScore automatically on the Author’s or journal’s behalf. SciScore then scans the manuscript’s methods section(s) and sentences that contain research resources to generate a reproducibility score, reports and resource tables. SciScore’s evaluation will be made available within Editorial Manager, providing Authors, Editors and Reviewers with easy access to the data. The integration enables a more efficient and seamless manuscript analysis workflow as Editorial Manager users can complete the entire process without ever needing to leave the system. Publications utilizing Editorial Manager now have another tool at their disposal designed to improve the quality of their scientific papers and research.

Aries Director of Product Management Tony Alves stated, “Transparency and rigor adherence are critical to the reproducibility of scientific research. SciScore is a fantastic methods review tool that helps researchers quickly, accurately, and securely score their research for rigor and transparency adherence. Authors, Editors and Reviewers have expressed a desire to have their SciScore reports and tables centralized within Editorial Manager for some time. We are always looking for ways to innovate and enhance our EM users’ experience and feel that this integration will bring a lot of value. Publications who are members of Editorial Manager will individually subscribe to SciScore services and I am really pleased to be able to offer this integration!”

SciScore CEO Anita Bandrowski said, “SciScore is a first of its kind editorial tool addressing a part of the manuscript that reviewers do not want to read, the methods section. The tool does not get bored reading a long list of reagents and verifying whether the information about each one is accurate. It flags problem reagents, a much smaller set that reviewers can ask about. The tool produces a simple report flagging multiple issues that typically take editors a longer time to verify”.

The SciScore and Editorial Manager integration will become available with the release of Editorial Manager/ProduXion Manager version 17.0.

About SciScore | www.sciscore.com
SciScore scans your submitted methods sections for a variety of rigor criteria that have been shown to contribute to the reproducibility of scientific research. It also analyzes sentences that contain research resources (antibodies, cell lines, plasmids and software tools) and determines how uniquely identifiable each resource is based on the provided metadata. Using this, SciScore generates a reproducibility criteria-compliance score and a report containing a rigor adherence table and a key resources table based on the "STAR" guidelines.

About Aries Systems​ | www.ariessys.com
Aries Systems transforms the way scholarly publishers bring high-value content to the world. Aries’ innovative workflow solutions manage the complexities of modern print and electronic publishing–from submission, to editorial management and peer review, to production tracking and publishing channel distribution. As the publishing environment evolves, Aries Systems is committed to delivering solutions that help publishers and scholars enhance the discovery and dissemination of human knowledge on a global scale. Aries Systems was acquired by Elsevier in September 2018. Publish faster, publish smarter, with Aries Systems.

SciScore to launch a pilot with the American Association for Cancer Research to help authors improve rigor and reproducibility in their published work

San Diego | Philadelphia, 2 June 2020

SciScore, an advanced, text-mining-based tool, is pleased to announce that the American Association for Cancer Research (AACR), the largest professional organization dedicated to advancing cancer research, will integrate SciScore into the AACR journals’ submission platform, eJournalPress, as part of a pilot program.

SciScore evaluates scientific manuscripts for compliance with recommendations and requirements designed to address different aspects of rigor and reproducibility in the published literature, e.g., the MDAR, ARRIVE, CONSORT, and RRID standards. The tool provides a score and a supporting report to identify whether key areas of reproducibility and transparency are addressed in the manuscript. Among other things, it will look for evidence of randomization, blinded conduct of experiments, sample size estimation, inclusion of sex as a biological characteristic, and animal/cell line authentication or contamination, and it will verify the identity of the antibodies used.

“Finding the cure for any medical ailment facing our society involves the expenditure of both time and money. Research funders and the public more generally have the just expectation that the money spent on research will advance healthcare,” says Anita Bandrowski, a neuroscience researcher at the University of California, San Diego and CEO of SciScore. “SciScore will make it easier for AACR’s authors to focus on the work-at-hand by indicating when, or if, something was overlooked or omitted in the process of reporting the research in a manuscript.”

“This value-add tool for our authors, reviewers, and editors will support our initiatives to disseminate critical and reproducible research that will lead to the conquest of cancer,” says Christine Battle, Publisher and Vice President of Scientific Publications at AACR. “We are delighted to be the first to integrate SciScore in the journal workflow across all nine of the AACR journals. It is an effective tool to measure — and ultimately improve — the quality of the science being published.”

About the American Association for Cancer Research (AACR)
The AACR is the largest professional organization dedicated to advancing cancer research and its mission to prevent and cure all cancers. The organization was founded in 1907, and its membership includes more than 47,000 laboratory, translational, and clinical researchers; population scientists; other healthcare professionals; and patient advocates residing in over 127 countries.
https://www.aacr.org/about-the-aacr/

About SciScore
SciScore is a scientific content checker / validation tool that verifies common rigor criteria (NIH, MDAR, ARRIVE) and research resources (antibodies, cell lines, organisms). SciScore uses text mining techniques to perform this critical validation in minutes, providing a report to the editors, reviewers, or authors about criteria that have and have not been addressed. Furthermore, it provides a numerical score, which allows editors to assess the percentage of criteria met or not met at a glance.

Contact Researchers: Anita Bandrowski | anita@scicrunch.com
Contact Media/Publishers: Martijn Roelandse | martijnroelandse@me.com