The quality of scientific research, in any field, is measured by its socio-economic benefits and its contribution to scientific advancement.
To that end, the research protocol must be based on a clear and accurate methodology, a recent and relevant bibliography, rigorous data analysis, and reproducible results, and must of course conform to the highest ethical standards throughout the research process.
Academic research in the 21st century is undergoing an irreversible transition to open science, that is, open access to publications and research data. This transition promotes faster dissemination of research findings and will therefore contribute greatly to improving research quality.
To support this transition, many innovative initiatives in scientific publishing aim at enhancing the quality and transparency of research:
- Open peer review: to increase the transparency and accountability of peer reviewers and to prevent abusive or fraudulent peer review.
- Registered Reports: to increase the reproducibility of research in the natural and social sciences. By reviewing study protocols before experiments are undertaken, this format allows the quality of a research project to be evaluated in advance.
- Preprints: to accelerate the dissemination of research results.
- Platforms for post-publication peer review (e.g. PubPeer: https://pubpeer.com/): when fraud is detected in published papers, it can lead to retraction (see Retraction Watch, http://retractionwatch.com/).
The gold open-access model (author-pays) has given rise to a "parasite": the so-called predatory journals. Articles published in these journals undergo a very superficial or non-existent peer review process and therefore have no credibility or value.
These pseudo-journals are fertile ground for scientific fraud (plagiarism, fabrication or falsification of data, etc.), which pollutes the scientific literature and constitutes a real danger to science.
In South Africa, predatory journals caused an estimated loss of US$25 million in 2017.
These journals are a waste of the already limited resources of countries of the Global South.
To counter this phenomenon, the well-known campaign "Think. Check. Submit." (http://thinkchecksubmit.org/) helps researchers evaluate and identify trusted journals to which to submit their manuscripts; blacklists of predatory journals serve as a complementary tool for avoiding them.
The evaluation of scientific output by bibliometric indicators, such as Clarivate Analytics' impact factor, has proved ineffective, as argued by the San Francisco Declaration on Research Assessment (DORA) and the Leiden Manifesto. Both oppose the use of these indicators to evaluate individual researchers and advocate the responsible use of metrics.
According to the Nature Index, countries of the Global South have a low, or even negligible, number of publications. Similarly, their performance is low according to Web of Science, as shown by a recent bibliometric study of scientific production in Africa over 15 years (2000–2015). This study found that South Africa and Egypt together contributed almost half of all African publications (Sooryamoorthy, 2018).
In addition, the quality of research is also measured by its positive impact on the development of nations and on human well-being. Applied research and R&D play a fundamental role in achieving this ambition. By involving socio-economic actors in research projects and strengthening the R&D component within companies, scientific research is now at the heart of any development model. Today's innovation, which is the keystone of all progress, cannot bear fruit and have lasting effects without being supported by
European Commission (29 May 2018) Integrated advice of the Open Science Policy Platform Recommendations.
Murray H. (31 May 2018) Registered Reports to support reproducibility – an author and reviewer in conversation. F1000 Research blog.
Makoni M. (23 February 2018) Network seeks to lift African research integrity. Nature Index.
Sooryamoorthy R. (2018) The production of science in Africa: an analysis of publications in the science disciplines, 2000–2015. Scientometrics, 115(1): 317–349. doi: 10.1007/s11192-018-2675-0.