Creating and using real-world evidence to answer questions about clinical effectiveness

Simon de Lusignan

University of Surrey, Guildford, UK

Laura Crawford

Eli Lilly and Company, Indianapolis, Indiana, USA

Neil Munro

University of Surrey, Guildford, UK

Cite this article: de Lusignan S, Crawford L, Munro N. Creating and using real-world evidence to answer questions about clinical effectiveness. J Innov Health Inform. 2015;22(3):368-373.

Copyright © 2015 The Author(s). Published by BCS, The Chartered Institute for IT under Creative Commons license http://creativecommons.org/licenses/by/4.0/

Author address for correspondence:

Simon de Lusignan

Professor of Primary Care and Clinical Informatics,

University of Surrey, Guildford, GU2 7XH, UK

E-mail: s.lusignan@surrey.ac.uk


ABSTRACT

New forms of evidence are needed to complement evidence generated from randomised controlled trials (RCTs). Real-World Evidence (RWE) is a potential new form of evidence, but remains undefined.

This paper sets out to fill that gap by defining RWE as the output from a rigorous research process which: (1) includes a clear a priori statement of a hypothesis to be tested or research question to be answered; (2) defines the data sources that will be used and critically appraises their strengths and weaknesses; and (3) applies appropriate methods, including advanced analytics. These elements should be set out in advance of the study commencing, ideally in a published protocol.

The strengths of RWE studies are that they are more inclusive than RCTs and can enable an evidence base to be developed around real-world effectiveness and to start to address the complications of managing other real-world problems such as multimorbidity. Computerised medical record systems and big data provide a rich source of data for RWE studies.

However, guidance is needed to help assess the rigour of RWE studies so that the strength of recommendations based on their output can be determined. Additionally, RWE advanced analytics methods need better categorisation and validation.

We predict that the core role of RCTs will shift towards assessing safety and achieving regulatory compliance. RWE studies, notwithstanding their limitations, may become established as the best vehicle to assess real-world effectiveness.

Keywords: computerized, data collection, epidemiological methods, evidence-based practice, health services research, medical record systems, real-world analytics, research design, statistics as topic


INTRODUCTION

New ways are needed to generate evidence in addition to conducting randomised controlled trials (RCTs).1, 2 This is particularly so as medicine struggles to understand how to apply an evidence base developed for single conditions to an ageing population with multimorbidity.3 Studies using Real-World Evidence (RWE) are a potential next step (Box 1).

Box 1 Definition of RWE

RWE:

• Is a form of evidence (along with RCTs, health economics studies, etc.) generated to answer a question or test a hypothesis

• Is derived from primary or secondary real-world data (RWD) sources, often data from computerised medical record (CMR) systems

• Includes appropriate and rigorous design and analyses, generally set out in a protocol in advance of conducting the study

• Provides evidence about patient populations, diseases, medicines, and health care that will inform clinical practice

• Generates further research questions

Patients, providers and payers all need evidence beyond that generated in clinical trials:

• Patients want to know to what extent health interventions work in the real world for patients like them;

• Providers of health care want to make more informed recommendations for patients;

• Commissioners and payers need to know how interventions work in their populations in the real world;

• Society needs to be informed of any long-term unexpected complications of therapies.

RWE is enabled by the ubiquity of health care information technology systems and the resultant increase in health care data that is born digital, as well as by advances in analytics. The move towards greater use of RWE is also driven by the general shift in health care from a transactional, fee-for-service model to a more value-based system with payments based on outcomes.4

In summary, RWE is a form of evidence generated by posing a research question or testing a hypothesis against real-world data (RWD), applying methods that generally include advanced analytics (Figure 1).


REAL-WORLD DATA (RWD)

RWD is either primary – collected specifically for the RWE study – or secondary use of data collected primarily for another purpose (Box 2). All these data have strengths and limitations5 that need to be set out in full.

Box 2 Primary and secondary real-world data (RWD)

Primary sources of RWD

• Prospective patient registries

• Prospective observational or longitudinal cohort studies

• Pragmatic clinical trials

• Patient and caregiver surveys

• Hybrid studies (e.g. retrospective database + survey)

Secondary sources of RWD

• Retrospective databases/patient registries

• Electronic medical records

• Administrative claim records

• Personal health records

• Genetic and biomarker databanks

• Patient-derived data (via web-based or smart technologies)

• Social media data

Figure 1. RWE is generated by applying a question or testing a hypothesis using RWD and applying analytical techniques set out in the study design

Computerised medical record (CMR) systems have been a key enabler of RWE investigations, particularly in health systems where individuals have a unique ID and can be followed over time.5,6 However, access to RWD is not always straightforward, and laws and policies to protect privacy can be a greater limitation on the ability to carry out RWE research than the analyses themselves.

Design of RWE studies

The design used in an RWE study needs to have rigour appropriate to the disciplines it draws on. Table 1 sets out the possible elements of an RWE observational study. Such studies should have written, published protocols and peer-reviewed outputs.


REAL-WORLD ANALYTICS

Advanced analytic methodologies are used to mine RWD and provide evidence for populations and subgroups. Whereas trials and conventional evidence inform how populations might respond to treatment, RWE looks to provide evidence for subgroups and eventually for individuals.
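As a purely illustrative sketch of one of the simpler techniques in this family (nothing below is drawn from this paper; the records, field names and figures are all invented), direct standardisation can adjust a treated-versus-untreated outcome comparison from observational RWD for a single confounder, yielding a subgroup-aware estimate rather than a crude population average:

```python
# Hypothetical example: adjust a treated-vs-untreated comparison for one
# confounder (multimorbidity) by direct standardisation. Assumes each arm
# has at least one patient in every confounder stratum.
from collections import defaultdict

records = [
    # (treated, multimorbid, outcome_event) - all values invented
    (True,  True,  1), (True,  True,  0), (True,  False, 0), (True,  False, 0),
    (False, True,  1), (False, True,  1), (False, False, 1), (False, False, 0),
]

def standardised_risk(records, treated):
    """Outcome risk for one arm, standardised to the whole cohort's
    distribution of the confounder."""
    by_stratum = defaultdict(lambda: [0, 0])  # stratum -> [events, n]
    stratum_sizes = defaultdict(int)
    for t, m, y in records:
        stratum_sizes[m] += 1
        if t == treated:
            by_stratum[m][0] += y
            by_stratum[m][1] += 1
    total = len(records)
    return sum((events / n) * (stratum_sizes[m] / total)
               for m, (events, n) in by_stratum.items())

risk_diff = standardised_risk(records, True) - standardised_risk(records, False)
print(round(risk_diff, 3))
```

Real RWE analyses would of course use richer methods (propensity scores, multivariable models) over many confounders, but the principle of weighting stratum-specific estimates to a reference population is the same.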

Strengths of RWE

RWE studies are likely to cost much less than RCTs, and retrospective RWE investigations can be completed much faster; the same may be true of some prospective investigations. RWE studies include people with multimorbidity, on usual prescribed doses and with standard patterns of adherence. Also, adverse effects of medication may be detected in RWE studies.

RWE studies use a larger sample size: whilst important RCTs might have thousands of subjects, RWE studies might use a million. Exclusion criteria for RCTs are often extensive and limit generalisability; RWE studies can use data that are statistically generalisable. RWE studies may particularly lend themselves to areas where case definition and outcomes might be readily measured from routine data. RWE is able to gain insights from complex patients with multimorbidities. The way an intervention is implemented in the ‘real world’ is not necessarily the same as the way it was planned to be delivered.

Data from CMR systems and administrative claims records are a mainstay of such studies. Importantly, records can be used to rapidly identify any unexpected harms and ensure patient safety. And, because of their observational nature, there is no need for subjects of such studies to specially attend clinics or clinical research centres beyond what is required for their routine care.


LIMITATIONS OF RWE

The primary limitation of RWE is that, as a new and emerging approach, it has no agreed-upon guidance for methodology and evaluation. There is uncertainty about sample size, what represents a valid and reliable method, how to compare care pathways (especially in people with complex health needs), the best method for economic evaluation, and how to reduce bias. A recent debate around possible harms of screening for lung cancer reflects this uncertainty.7

Table 1. The design of RWE studies

Guidance from the Enhancing the Quality and Transparency of health Research (EQUATOR) network (http://www.equator-network.org/), notably the Consolidated Standards of Reporting Trials (CONSORT), informs the assessment of whether an RCT is of adequate quality. However, despite all attempts to reduce the risk of bias, there are concerns that funding source may bias trial results,8,9 though there is not a consensus about this.10,11 The International Society for Pharmacoeconomics and Outcomes Research has to some extent sought to address this gap,12 though this does not address the study of RWE per se.

The RWD used in RWE studies are messy and often incomplete; sophisticated statistical methods must be used to gain rigorous, accurate RWE insights. Further, apparently easy-to-define conditions can actually be more complex – diabetes provides a good example of this.13 Pitfalls in case selection are all too easy and can only be detected if the coding lists used in analysis are published.14 Case finding, if it is to be reliable between different brands of computer system, let alone internationally, needs to be much more ontological15,16 (i.e. based on transparent models that link clinical concepts to the coding lists used to identify cases, outcomes, confounders, etc.).17
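The point about publishing coding lists can be made concrete with a minimal sketch (the codes and patient records below are invented for illustration, not taken from the cited studies): case finding driven by explicit, publishable inclusion and exclusion code lists makes the case selection reproducible and its pitfalls inspectable:

```python
# Hypothetical example: case finding from coded CMR data using explicit
# code lists. Codes and records are invented; real studies would publish
# their actual lists alongside the analysis.
DIABETES_CODES = {"C10", "C10E", "C10F"}   # hypothetical diagnosis codes
EXCLUSION_CODES = {"C10y", "21263"}        # e.g. a hypothetical "resolved" code

patients = {
    "p1": ["C10E", "bd3"],      # diabetes diagnosis code present
    "p2": ["C10F"],             # diabetes diagnosis code present
    "p3": ["C10", "21263"],     # coded diabetic but later marked resolved
    "p4": ["H33"],              # unrelated condition only
}

def find_cases(patients, include, exclude):
    """Return IDs of patients with >=1 inclusion code and no exclusion code."""
    return sorted(
        pid for pid, codes in patients.items()
        if set(codes) & include and not set(codes) & exclude
    )

print(find_cases(patients, DIABETES_CODES, EXCLUSION_CODES))  # → ['p1', 'p2']
```

Changing either list changes who counts as a case, which is why unpublished coding lists make miscoding and misclassification invisible to readers.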


RWE IS COMPLEMENTARY TO RCT EVIDENCE

Whilst the RCT remains the gold standard for testing hypotheses about treatment, other forms of evidence are needed, as RCTs fail to answer questions of benefit for those excluded from trials. RCTs will always be needed and form an important part of the licensing of medications; it is important that medications are demonstrated to be effective in the core target group who will receive them. However, a number of factors suggest that whilst RCTs will always be required, a first response of ‘Let’s do a trial’ to test a hypothesis, particularly about the effectiveness of treatments in clinical practice, should be accompanied by a careful evaluation as to whether an RCT is the best option.

Downsides of RCTs (Figure 2) include high costs, the time taken from hypothesis to creating evidence, and uncertainty about the relevance of trial results to an ageing population with multimorbidity. RCTs in the therapeutic world are short, mainly because that is all that regulators require. As a result, rare complications of therapies may not be picked up in a trial that lasts 6–12 months. Detecting unusual but significant drug-related issues, either on target or off target, is a very important function of RWE or post-marketing studies. It is not just the size of the numerator that is important; it is the possibility of tracking large numbers of patients over long periods that provides such useful information. Whilst RWE will most often not prove causality, it does enable a hypothesis to be raised and explored further.

The potential advantages of using RWE are that costs are potentially much lower and retrospective RWE investigations can take place much faster, and the same may be true of some prospective investigations. Whilst RWD are inherently messier, their advantage is that RWE studies will include people with multimorbidity, on usual prescribed doses and with standard patterns of adherence. Also, adverse effects of medication may not always be detected in the initial trials.18,19


CALL FOR THE ESTABLISHMENT OF RWE CENTRES

Notwithstanding these limitations, studies using RWE are probably here to stay; the urgent need is for methods to assess the quality of these studies. Unless we develop robust ways of working, RWE might be dismissed as just another attempt to create new ways to conduct comparative effectiveness research.20 Research funders should be sponsoring the development of RWE centres to develop methodologies and conduct RWE studies. Prospective RWE studies could focus on stratified medicine.21 These studies should look to identify the groups of patients in whom therapeutic interventions perform in line with trial evidence, and the groups in whom they are more or less effective in real-world use. Retrospective studies can rapidly answer questions about adherence, thresholds at which treatments are implemented, unanticipated safety issues, and other important questions. They should inform health services about the cost-effectiveness of different care pathway options; currently, such questions are often dealt with piecemeal.

Figure 2. The case for increasing the use of RWE


SUMMARY

RWE studies are needed to properly assess the benefit/risk of treatments when used in standard clinical practice. CMR systems can facilitate concurrent safety monitoring and data collection without direct patient contact, and enable large randomised study populations to be identified for pragmatic RCTs. Informaticians or investigators with a detailed understanding of the strengths and weaknesses of routine data – how to infer meaning and to avoid pitfalls – should lead such study teams. Ensuring privacy and ethical use of patient data is also essential for the success of these projects. Analysis of health data needs to take place within a secure environment, and there needs to be a high level of trust alongside professionalism in protecting privacy.22,23
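One common building block for privacy-protecting analysis is pseudonymisation; as a deliberately simplified, hypothetical sketch (the key handling and identifiers below are invented, and real deployments separate key custody from the analysts), a keyed hash lets records from different sources be linked on a pseudonym without exposing the underlying identifier:

```python
# Hypothetical example: pseudonymise a patient identifier with a keyed hash
# (HMAC-SHA256) before records leave the care setting. The key would be held
# by a trusted party, never hard-coded as it is here for illustration.
import hashlib
import hmac

SECRET_KEY = b"held-by-trusted-third-party"  # invented; illustration only

def pseudonymise(patient_id: str) -> str:
    """Deterministic keyed hash: the same ID always yields the same
    pseudonym, but the ID cannot be recovered without the key."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

gp_record = pseudonymise("943 476 5919")        # invented NHS-style number
hospital_record = pseudonymise("943 476 5919")
assert gp_record == hospital_record              # linkable across sources
assert gp_record != pseudonymise("943 476 5870")
```

Determinism is what makes cross-source linkage work; it is also why key management matters, since anyone holding the key can re-identify by hashing candidate identifiers.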

New et al. sum up the current situation and the need for RWE:

RCTs are the gold standard but additional information is needed about the risk/benefit profile of new treatments in real-world practice, and especially evidence providing relative ‘values’ on novel medications, in cash-limited health services.24

There are enormous opportunities for the informatics community if these predictions about the future research direction are correct. The future research landscape will be one in which RCTs are required for regulation and for approval of therapies, devices, and other interventions. RCTs are needed primarily to demonstrate comparative efficacy and safety in a controlled environment. RWE studies will become the norm for demonstrating effectiveness and cost-effectiveness, particularly in the context of an ageing population with multimorbidity.


CONFLICTS OF INTEREST

Simon de Lusignan is the lead academic and Neil Munro is a professor supporting the RWE centre based at University of Surrey funded by Eli Lilly and Company (http://www.surrey.ac.uk/fhms/research/researchgroups/real_world_evidence_centre_fhms.htm).


REFERENCES

1. Sanson-Fisher RW, Bonevski B, Green LW and D’Este C. Limitations of the randomized controlled trial in evaluating population-based health interventions. American Journal of Preventive Medicine 2007;33(2):155–61. http://dx.doi.org/10.1016/j.amepre.2007.04.007. PMid:17673104.

2. Williams BA. Perils of evidence-based medicine. Perspectives in Biology and Medicine 2010;53(1):106–20. http://dx.doi.org/10.1353/pbm.0.0132.

3. Schattner P, Barker F and de Lusignan S. Minimally disruptive medicine is needed for patients with multimorbidity: time to develop computerised medical record systems to meet this requirement. Journal of Innovation in Health Informatics 2015;22(1):250–4. http://dx.doi.org/10.14236/jhi.v22i1.136.

4. Porter M and Lee T. The strategy that will fix health care. Harvard Business Review 2013;91(10):50–70.

5. de Lusignan S and van Weel C. The use of routinely collected computer data for research in primary care: opportunities and challenges. Family Practice 2006;23(2):253–63. http://dx.doi.org/10.1093/fampra/cmi106. PMid:16368704.

6. de Lusignan S, Metsemakers JF, Houwink P, Gunnarsdottir V and van der Lei J. Routinely collected general practice data: goldmines for research? A report of the European Federation for Medical Informatics Primary Care Informatics Working Group from MIE2006, Maastricht, The Netherlands. Informatics in Primary Care 2006;14(3):203–9. http://dx.doi.org/10.14236/jhi.v14i3.632.

7. Harris RP, Sheridan SL and Lewis CL. Real-world evidence about potential psychosocial harms of lung cancer screening – Reply. JAMA Internal Medicine 2014;174(8):1416–7. http://dx.doi.org/10.1001/jamainternmed.2014.1636.

8. Lexchin J, Bero LA, Djulbegovic B and Clark O. Pharmaceutical industry sponsorship and research outcome and quality: systematic review. British Medical Journal 2003;326(7400):1167. http://dx.doi.org/10.1136/bmj.326.7400.1167. PMid:12775614; PMCid:PMC156458.

9. Lundh A, Sismondo S, Lexchin J, Busuioc OA and Bero L. Industry sponsorship and research outcome. The Cochrane Database of Systematic Reviews. 2012;12:MR000033. http://dx.doi.org/10.1002/14651858.MR000033.pub2.

10. Naci H, Dias S and Ades AE. Industry sponsorship bias in research findings: a network meta-analysis of LDL cholesterol reduction in randomised trials of statins. British Medical Journal 2014;349:g5741. http://dx.doi.org/10.1136/bmj.g5741.

11. van Lent M, Overbeke J and Out HJ. Role of editorial and peer review processes in publication bias: analysis of drug trials submitted to eight medical journals. PLoS One 2014;9(8):e104846. http://dx.doi.org/10.1371/journal.pone.0104846.

12. McGhan WF, Al M, Doshi JA, Kamae I, Marx SE and Rindress D. The ISPOR Good Practices for Quality Improvement of Cost-Effectiveness Research Task Force Report. Value Health 2009;12(8):1086–99. http://dx.doi.org/10.1111/j.1524-4733.2009.00605.x.

13. de Lusignan S, Sadek N, Mulnier H, Tahir A, Russell-Jones D and Khunti K. Miscoding, misclassification and misdiagnosis of diabetes in primary care. Diabetic Medicine 2012;29(2):181–9. http://dx.doi.org/10.1111/j.1464-5491.2011.03419.x.

14. de Lusignan S, Sun B, Pearce C, Farmer C, Steven P and Jones S. Coding errors in an analysis of the impact of pay-for-performance on the care for long-term cardiovascular disease: a case study. Informatics in Primary Care 2014;21(2):92–101. http://dx.doi.org/10.14236/jhi.v21i2.62.

15. Liaw ST, Taggart J, Yu H and de Lusignan S. Data extraction from electronic health records – existing tools may be unreliable and potentially unsafe. Australian Family Physician 2013;42(11):820–3. PMid:24217107.

16. Liaw ST, Taggart J, Yu H, de Lusignan S, Kuziemsky C and Hayen A. Integrating electronic health record information to support integrated care: practical application of ontologies to improve the accuracy of diabetes disease registers. Journal of Biomedical Informatics 2014;52:364–72. http://dx.doi.org/10.1016/j.jbi.2014.07.016.

17. de Lusignan S, Liaw ST, Michalakidis G and Jones S. Defining datasets and creating data dictionaries for quality improvement and research in chronic disease using routinely collected data: an ontology-driven approach. Informatics in Primary Care 2011;19(3):127–34. http://dx.doi.org/10.14236/jhi.v19i3.805.

18. Schroll JB, Maund E and Gøtzsche PC. Challenges in coding adverse events in clinical trials: a systematic review. Choonara I (Ed). PLoS One 2012;7(7):e41174. http://dx.doi.org/10.1371/journal.pone.0041174.

19. Zorzela L, Golder S, Liu Y, Pilkington K, Hartling L and Joffe A et al. Quality of reporting in systematic reviews of adverse events: systematic review. British Medical Journal 2014;348:f7668. http://dx.doi.org/10.1136/bmj.f7668.

20. Berger ML, Dreyer N, Anderson F, Towse A, Sedrakyan A and Normand SL. Prospective observational studies to assess comparative effectiveness: the ISPOR good research practices task force report. Value Health. 2012;15(2):217–30. http://dx.doi.org/10.1016/j.jval.2011.12.010.

21. Jones AG, Shields BM, Hyde CJ, Henley WE, Hattersley AT. Identifying good responders to glucose lowering therapy in type 2 diabetes: implications for stratified medicine. PLoS One. 2014;9(10):e111235. http://dx.doi.org/10.1371/journal.pone.0111235.

22. de Lusignan S, Chan T, Theadom A and Dhoul N. The roles of policy and professionalism in the protection of processed clinical data: a literature review. International Journal of Medical Informatics 2007;76(4):261–8. http://dx.doi.org/10.1016/j.ijmedinf.2005.11.003. PMid:16406791.

23. de Lusignan S. Effective pseudonymisation and explicit statements of public interest to ensure the benefits of sharing health data for research, quality improvement and health service management outweigh the risks. Informatics in Primary Care 2014;21(2):61–3. http://dx.doi.org/10.14236/jhi.v21i2.68.

24. New JP, Bakerly ND, Leather D, Woodcock A. Obtaining real-world evidence: the Salford Lung Study. Thorax 2014;69(12):1152–4. http://dx.doi.org/10.1136/thoraxjnl-2014-205259.




Online ISSN 2058-4563 - Print ISSN 2058-4555. Published by BCS, The Chartered Institute for IT