Our title reflects a maxim penned by Goethe. It continues in translation to read ‘With knowledge doubt increases.’1 The papers that appear in this issue, stemming from a special symposium held in Banff, Alberta, in February 2009, describe our progress toward agreement on the concept of access with evidence development (AED).2-6 Goethe’s reflection summarizes the dilemma. Much of decision making in health care is based on the premise that certainty can be achieved through randomized controlled trials and translated into policy decisions on issues such as insurance coverage. Unfortunately, we have become increasingly aware of the constraints imposed by reliance on evidence based solely on randomized controlled trials, and the health care system is now shifting rapidly toward the acceptance of other kinds of evidence to support a more nuanced form of decision making that gives appropriate weighting to the perspectives of patients, their families, and their health care providers.7-10
Executive Director, Child & Family Research Institute; Vice President, Provincial Health Services Authority; Associate Dean (Research), University of British Columbia
Senior Scientist, Centre for Clinical Epidemiology and Evaluation; Associate Professor, School of Population and Public Health, University of British Columbia
The symposium presented in this issue underscores the pressing need to consider anew the dynamics of technology diffusion at all levels. Such consideration is particularly appropriate at a time when economic pressure on health care systems is extreme. Those responsible for the publicly funded aspect of health care can certainly be forgiven for succumbing to the allure of evidence-based rationing,11 particularly when beset by the need to cope with the impact of what have been called ‘disruptive technologies’. Unfortunately, the view of disruptive technologies often appears to encompass all innovations without due regard for the potential benefit attached to each new intervention. There is a pressing need for clarity on what constitutes added value and for the adoption of new methodologies to allow assessment of value. On the flip side, there is also a clear need for evidence-informed decisions on disinvestment from technologies and other services.
One laudable step in the UK was the creation in 1999 of the National Institute for Clinical Excellence (NICE). In his OHE annual lecture in 2004, the late Alan Williams posed the question ‘What could be nicer than NICE?’12 In the words of Professor Williams, ‘NICE is the closest anyone has yet come to fulfilling the economist’s dream of how priority setting in health care should be conducted. It is transparent, evidence based, seeks to balance efficacy and equity and uses a cost per QALY benchmark as the focus for its decision making’. In spite of this ringing endorsement, Williams went on in his lecture to concede that ‘experience has taught me that it is not uncommon for an economist’s “dream-come-true” to be seen as a nightmare by everyone else’.
The challenges of evaluating evidence on the efficiency, effectiveness, and safety of new technologies are undeniable. The idea of permitting access to new technologies and approving reimbursement on a temporary basis is a positive and appealing step but one that is clearly taken in a spirit of compromise that may not prove sustainable. The limitations of the approach are well defined in the accompanying articles.
In a different context it has been said that ‘There’s nothing in the middle of the road but yellow stripes and dead armadillos’13 and this may yet turn out to apply to the concept of AED. There is particular danger in the idea of providing access and reimbursement approval on a temporary basis in the full knowledge that the final decision on the benefit of the new technology will be elusive. In the absence of randomized controlled trials, the potential for the misuse of other types of evidence is extreme. We also know that once initial uptake of a technology has occurred, later discontinuance may prove very difficult. Yet maximizing overall benefit from a given pot of resources can be achieved only when the opportunity cost of each decision is assessed against other competing spending options. Informing the debate with AED has potential, but the challenges must be understood and acknowledged.
Clearly we must consider new ways of valuing health outcomes. The ideal from an academic perspective is movement toward a deliberative process of decision making that takes context into consideration. Recent experience suggests that a growing emphasis on comparative effectiveness studies will be required.14,15 It is likely that we are moving towards a system in which policy decisions will be flexible and evidence-informed rather than categorically based on conclusive evidence. Inevitably this will bring about a greater use of pragmatic and observational studies conducted in real-world settings that will go beyond the evidence available from RCTs. Decision makers will, of necessity, be increasingly required to consider composite outcomes and to make judicious use of colloquial evidence.16 The framework for consideration of other kinds of evidence was described by Daniels and Sabin in the 1990s17-19 and updated in 2008.20 Their principles of accountability for reasonableness that may apply to the AED framework include four elements of legitimacy and fairness in decision making:
- stakeholder involvement
- transparency and openness to public dissemination of the basis of decision making
- clear description of the basis for revision or appeal
- assertive leadership, including acceptance of accountability for reasonableness
All of these elements are brought into sharp focus in the debate about AED. The consideration of new approaches to reimbursement decision making also brings into question the availability of the human resource pool needed for a continuing evidence development paradigm. We have recently conducted studies in Canada to assess our national preparedness for adoption of a life cycle approach to drug regulatory approval using epidemiologic and adaptive trial methodologies. The survey of human resources in Canada proved somewhat discouraging. We were able to identify only 300 individuals, independent of government, contract research organizations, and manufacturers, who were prepared to take on research studies of the kind that would be required within an AED environment.21
The principles that underpin AED are well described in the lead article from the Banff symposium and further details are provided in the accompanying papers. The principle likely to present the greatest challenges is number 5: ‘Governance should ensure the independence of the scheme from any parties with a vested interest in its outcomes.’ It is difficult to adequately reflect the interests of all stakeholders (clinicians, manufacturers, patients, and payers) while retaining complete independence on the part of the researchers involved. As noted by the symposium authors, a spirit of at least ‘passive cooperation’ is required amongst these stakeholders. In most literature related to AED the emphasis seems to be placed on independence of investigators and decision makers from influence by clinicians, patients, and manufacturers. Less attention has been paid to the need for independence from influence by payers. It is evident that payers in many cases see the requirements for evidence of efficiency, effectiveness, and safety as a means of building barriers to approval for reimbursement. This type of evidence-based rationing is often seen as an affront by the other stakeholders and certainly in many cases represents a barrier to innovation, depriving patients and their families of the benefit of medical progress.
1. von Goethe JW. Proverbs in Prose. 1819.
2. Access with evidence development: A first international consensus statement.
3. Stafinski T, McCabe C, Menon D. Funding the unfundable: Mechanisms for managing uncertainty in decisions on the introduction of new and innovative technologies into health care systems.
4. McCabe C, Stafinski T, Edlin R, Menon D. Access with evidence development schemes: A framework for description and evaluation.
5. Briggs A, Ritchie K, Fenwick E, Chalkidou K, Littlejohns P. Access with evidence development in the UK: Past experience, current initiatives and future potential.
6. Mohr PE, Tunis SR. Access with evidence development: The US experience.
7. Tunis SR, Stryer DB, Clancy CM. Practical clinical trials: Increasing the value of clinical research for decision making in clinical and health policy. JAMA 2003;290(12):1624-32.
8. Garrison LP Jr, Neumann PJ, Erickson P, Marshall D, Mullins CD. Using real-world data for coverage and payment decisions: The ISPOR real-world data task force report. Value in Health 2007;10:326-35.
9. Drummond M, Evans W, LeLorier J, Karakiewicz P, Martin D, Tugwell P, MacLeod S. Evidence and values: Requirements for public reimbursement of drugs for rare diseases – a case study in oncology. Can J Clin Pharmacol 2009;16(2):e273-e281.
10. Gooch KL, Smith D, Wasylak T, Faris PD, Marshall DA, Khong H, Hibbert JE, Parker RD, Zernicke RF, Beaupre L, Pearce T, Johnston DWC, Frank CB. The Alberta hip and knee replacement project: A model for health technology assessment based on comparative effectiveness of clinical pathways. Int J Technol Assess Health Care 2009;25:113-23.
11. MacLeod SM, Bienenstock J. Evidence-based rationing: Dutch pragmatism or government insensitivity? Can Med Assoc J 1998;158:213-4.
12. Williams A. What could be nicer than NICE? London: Office of Health Economics, 2004.
13. Hightower J. There’s nothing in the middle of the road but yellow stripes and dead armadillos: A work of political subversion. New York: HarperCollins, 1997 (ISBN 0-06-092949-9).
14. Chalkidou K, Tunis S, Lopert R, Rochaix L, Sawicki PT, Nasser M, Xerri B. Comparative effectiveness research and evidence-based health policy: Experience from four countries. Milbank Quarterly 2009;87:339-67.
15. Eden J, Wheatley B, McNeil B, Sox H, editors; Committee on Reviewing Evidence to Identify Highly Effective Clinical Services. Knowing what works in health care: A roadmap for the nation. 2009. ISBN-10: 0-309-11356-3.
16. Lomas J. Commentary: Whose views count in evidence synthesis? And when do they count? Healthc Policy 2006;1(2):55-7.
17. Daniels N. Rationing fairly: Programmatic considerations. Bioethics 1993;7:223-33.
18. Daniels N, Sabin J. The ethics of accountability in managed care reform. Health Affairs 1998;17:50-64.
19. Daniels N, Sabin JE. Setting limits fairly: Can we learn to share medical resources? New York: Oxford University Press, 2002.
20. Daniels N, Sabin JE. Accountability for reasonableness: an update. Br Med J 2008;337:a1850.
21. Soon JA, MacLeod S, Sharma S, Wiens M. Human resource and educational inventories to support the life cycle approach to the regulation of therapeutic products. Submitted to Health Canada, August 2009; publication pending.