Indian Journal of Psychological Medicine
EDITORIAL
Year: 2019  |  Volume: 41  |  Issue: 4  |  Page: 303-305

Beyond research reporting guidelines: How can the quality of published research be enhanced?


Department of Psychiatry, Jawaharlal Institute of Post Graduate Medical Education and Research, Dhanvantri Nagar, Puducherry, India

Date of Web Publication: 15-Jul-2019

Correspondence Address:
Dr. Vikas Menon
Department of Psychiatry, Jawaharlal Institute of Post Graduate Medical Education and Research, Dhanvantri Nagar, Puducherry - 605 006
India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/IJPSYM.IJPSYM_513_18


How to cite this article:
Menon V. Beyond research reporting guidelines: How can the quality of published research be enhanced? Indian J Psychol Med 2019;41:303-5

How to cite this URL:
Menon V. Beyond research reporting guidelines: How can the quality of published research be enhanced? Indian J Psychol Med [serial online] 2019 [cited 2019 Aug 20];41:303-5. Available from: http://www.ijpm.info/text.asp?2019/41/4/303/252709



Published original research papers, regardless of the impact factor and standing of the journal, often suffer from several shortcomings. At best, these papers clutter the scientific literature; at worst, they misinform clinicians and researchers. The fact remains that they see the light of day after slipping past the layers of quality control available in publishing today, such as double-blind peer review and multiple editorial checks. Partly in an effort to address these issues, and to bring some uniformity to the reporting of research, several evidence-based checklists, such as the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), Strengthening the Reporting of Observational Studies in Epidemiology (STROBE), Meta-analysis of Observational Studies in Epidemiology (MOOSE), and Consolidated Standards of Reporting Trials (CONSORT) statements, were developed for various research designs over the last two decades and are available on the Enhancing the QUAlity and Transparency Of health Research (EQUATOR) Network.[1]

Notwithstanding these positive developments, it is widely accepted that the dissemination of these guidelines has had only a modest impact on the quality of medical research reporting. Recently, Song et al., in a study of nearly 500 randomized controlled trials (RCTs)[2] divided into pre-CONSORT (n = 285) and post-CONSORT (n = 214) eras, found that the reporting of several essential steps in an RCT, including methods of randomization, allocation concealment, and blinding, continued to be inadequate 20 years after the first publication of the CONSORT statement.[3] Specialty-specific studies from obstetrics and gynecology, and from surgery, have reached similar conclusions.[4],[5] Even papers published in the most influential, high-impact biomedical journals are not immune to deficits in reporting.[6] As for observational studies, experts have opined that insisting on registered research protocols before publication is unnecessary, and may even be detrimental, and that simpler alternatives may need to be found.[7]

Though one may argue that the reporting guidelines have been inadequately disseminated, the problems underlying the poor quality of published research are far more complex and are unlikely to be resolved solely by strategies aimed at raising awareness of the many reporting guidelines. Rather, what is required is a multipronged strategy that includes a nuanced understanding of the limitations of these guidelines and incorporates educational and statutory elements. A few strategies to this effect are outlined below:

  1. Promoting awareness, adoption, and a balanced understanding of reporting guidelines: For starters, these guidelines are intended only as checklists for reporting; they do not pre-empt issues in design or methodology. Therefore, they cannot be expected to be a solution for the garbage-in, garbage-out phenomenon in research. Nevertheless, a good understanding of what is eventually expected should, ideally, prompt researchers to address some of these issues at the design stage itself. Focus group discussions with early-career researchers may elicit barriers to guideline adherence and provide inputs for designing appropriate remedial measures. Raising awareness of reporting criteria and their utility among all stakeholders, through periodic thematic workshops, may enhance guideline uptake. Research methodology workshops should devote an entire session to introducing reporting guidelines. Institutions should incentivize and popularize research papers that demonstrate good adherence to reporting guidelines. Nodal funding agencies such as the Indian Council of Medical Research (ICMR) and the Department of Science and Technology (DST) should design research process algorithms, along the lines of the Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) initiative,[8] aimed at guiding young researchers to develop guideline-compliant research protocols.
  2. Registered research reports and publishing of research protocols: The registration of research reports has taken a big leap in the last decade or so. In India, this trend is most obvious for clinical trials, probably owing to the insistence of many journals on prior registration of trials with the Clinical Trials Registry of India (CTRI) before publication.[9] Studies of any research design (observational or interventional) can now be registered with the CTRI, though few authors bother to register observational study protocols. Nearly a decade ago, an editorial in the British Medical Journal (BMJ) drew attention to the need for registering research protocols even for observational studies, to enhance research transparency.[10] It is argued that this would reduce the problems of data dredging and selective reporting by researchers who may indulge in these undesirable practices for career advancement or financial gain. Sister journals of the BMJ, such as BMJ Open, as well as the Lancet, have been publishing protocols of observational studies for quite a while now. However, it remains to be seen whether the editors of smaller, upcoming journals would be willing to sacrifice potential submissions at the altar of research transparency by insisting on prior registration of protocols. Editors may also consider publishing research protocols in their journals to promote transparency and enhance quality. Protocol publishing is one way to ensure that study results eventually get published regardless of their direction. Editors, too, are often guilty of disfavoring papers with negative results, and protocol publishing may help mitigate the publication bias that plagues biomedical research. If these measures are perceived to be resource intensive, a low-burden alternative for editors would be to insist on a simple declaration of transparency by the corresponding author that the study hypothesis arose before inspection of the data. Nevertheless, in the long run, a central repository of observational study protocols, akin to the clinical trial registry, would benefit researchers in two ways: by enabling easier access to previous similar research, thus reducing duplication, and by providing an opportunity to build and improve upon previous work, with greater scientific value and impact.
  3. Establishment of research oversight committees at both institutional and journal levels: It has been proposed that journals should consider appointing a qualified person to the post of research ombudsman.[11] While the initial goal of such an appointment was mainly to put a lid on the abuse of trust and power by journal editors,[12] in the current scenario there is a felt need to confer a wider educational (about research integrity, protocols, and guidelines) and consultancy (for research-related queries) role on this watchdog. However, recognizing the limitations and potential workload issues, journals and institutions should perhaps consider establishing an independent “research oversight committee” comprising chosen experts. The composition of this committee should reflect the aims and scope of the journal. For instance, a journal with a focus on consultation-liaison psychiatry could have experts from the medical, surgical, dental, and nursing fields, apart from psychiatrists, to fully evaluate issues of subject safety and quality. The remit of this committee could include education and consultancy, as described earlier. However, it should also oversee whether manuscripts conform to standard reporting guidelines and offer suggestions to this effect to erring authors.
  4. Augmentation of editorial practices and guidelines: This could include a requirement to publish the full table of findings as supplementary material online, which may address the problems of publication bias and cherry-picking of data by researchers. In addition, journals may mandate that authors include responses to two questions at the end of an article, “What is known” and “What this study adds” – a practice presently followed by only a few journals. Authors must specify their primary objective, and all other analyses must be treated as exploratory. While exploratory analyses are exciting and have their own importance in contributing to hypothesis generation, the fact remains that they must be treated as preliminary findings requiring further testing. To this effect, the practice of labeling certain analyses as “exploratory” in a paper will help prevent erroneous conclusions. Editors also need training in spotting data-mining practices, and when rejecting such manuscripts, they must clearly specify the reasons so that authors are better informed about the pitfalls of such practices.
  5. Formation of a National Editors' Consortium: A less commonly encountered but nevertheless significant issue is that papers rejected for lack of methodological rigor by one journal find their way into another journal of similar standing. If this occurs for a genuine reason, such as the author making significant improvements to the paper based on the comments of the initial reviewer(s), then it can only benefit science. Worryingly, this phenomenon more often owes its occurrence to the wide disparity in reviewing standards, which allows such articles to fall through the sieve. The obvious solution seems to be reviewer training. However, a central repository or consortium where review comments are accessible to all who wish to see them would have dual benefits: first, it would allow editors of various journals to access these comments for papers in the rejection-resubmission cycle; second, it would expose the predatory journals (where such articles commonly end up) for what they are and what they stand for. Peer reviewers' time is a precious commodity, and any step that augments the sanctity of this time-tested quality assurance mechanism in scientific publishing should be given serious thought by the powers that be. It is time editors worked together, and not in isolation, for the advancement of science.


In summary, only a combination of strategies targeting awareness, training, and sensitive enforcement can eventually enhance the quality of published research. None of these methods may be sufficient on its own, and mere awareness-raising will certainly not suffice; the bottom line is that we must constantly endeavor to find ways to improve the quality of published research. Editors need to be vigilant, yet sensitive, in framing journal policies and dealing with transgressions. Scientific publishing is serious business that demands time, training, and trust. Training is clearly required for all stakeholders, including editors and peer reviewers, apart from researchers and junior investigators.



 
References

1. Reporting guidelines | The EQUATOR Network [Internet]. Available from: https://www.equator-network.org/reporting-guidelines. [Last cited on 2018 Dec 10].
2. Song SY, Kim B, Kim I, Kim S, Kwon M, Han C, et al. Assessing reporting quality of randomized controlled trial abstracts in psychiatry: Adherence to CONSORT for abstracts: A systematic review. PLOS One 2017;12:e0187807.
3. Moher D. CONSORT: An evolving tool to help improve the quality of reports of randomized controlled trials. Consolidated Standards of Reporting Trials. JAMA 1998;279:1489-91.
4. Halpern SH, Darani R, Douglas MJ, Wight W, Yee J. Compliance with the CONSORT checklist in obstetric anaesthesia randomised controlled trials. Int J Obstet Anesth 2004;13:207-14.
5. Nagendran M, Harding D, Teo W, Camm C, Maruthappu M, McCulloch P, et al. Poor adherence of randomised trials in surgery to CONSORT guidelines for non-pharmacological treatments (NPT): A cross-sectional study. BMJ Open 2013;3:e003898.
6. Hays M, Andrews M, Wilson R, Callender D, O'Malley PG, Douglas K. Reporting quality of randomised controlled trial abstracts among high-impact general medical journals: A review and analysis. BMJ Open 2016;6:e011082.
7. Pearce N. Registration of protocols for observational research is unnecessary and would do more harm than good. Occup Environ Med 2011;68:86-8.
8. Chan A-W, Tetzlaff JM, Gøtzsche PC, Altman DG, Mann H, Berlin JA, et al. SPIRIT 2013 explanation and elaboration: Guidance for protocols of clinical trials. BMJ 2013;346:e7586.
9. Satyanarayana K, Sharma A, Parikh P, Vijayan V, Sahu D, Nayak BK, et al. Statement on publishing clinical trials in Indian biomedical journals. Indian J Ophthalmol 2008;56:177-8.
10. Loder E, Groves T, MacAuley D. Registration of observational studies. BMJ 2010;340:c950.
11. Satyanarayana K. The role of the ombudsman in biomedical journals. J Postgrad Med 2002;48:292.
12. Smith R. The trouble with medical journals. J R Soc Med 2006;99:115-9.




 
