A colleague pointed me to this news article regarding evaluation of new health plans:
The Affordable Care Act would fund a new research outfit evocatively named the Innovation Center to discover how to most effectively deliver health care, with $10 billion to spend over a decade.
But now that the center has gotten started, many researchers and economists are disturbed that it is not using randomized clinical trials, the rigorous method that is widely considered the gold standard in medical and social science research. Such trials have long been required to prove the efficacy of medicines, and similarly designed studies have guided efforts to reform welfare-to-work, education and criminal justice programs.
But they have rarely been used to guide health care policy — and experts say the center is now squandering a crucial opportunity to develop the evidence needed to retool the nation’s troubled health care system in a period of rapid and fundamental change. . . .
But not all economists think that randomization is the gold standard. Here’s James Heckman, for example, criticizing “the myth that causality can only be determined by randomization, and that glorifies randomization as the ‘gold standard’ of causal inference.” I try to put some of this in perspective here; see p. 956 of that article.
Meanwhile Prabhjot Singh offers some thoughts on the health policy innovation center:
The Innovation Center that is in the crosshairs of this article is the one new federal-level healthcare initiative that I truly think is transformative. Over just the past 3 years, they leveraged 1 billion dollars to change the organizational behavior of a 2.7 trillion dollar industry in the specific area of service delivery and payment systems. I have sat with the executive/strategy groups of 3 of this city’s largest hospital systems (covering 80% of the city’s patients) and many across the country as they scramble to figure out how their organization should shift strategy to capture some of this money. It’s actually sorta fascinating, because 1 billion divided by all the possible hospital systems in the US, who each submit multiple grants, is a pretty small amount of money. But the innovation center process is smart – hospital systems have to demonstrate that they would do what they suggest anyway. Everybody is angling to catch a piece of this large-sounding amount of cash and figure out what the innovation center really wants to see – and they game their applications accordingly and whisper intel across their institutions and the required set of regional partners they have to assemble. By the time initiatives are submitted, there is a remarkable amount of synchronization and consensus across internally fragmented organizations about the sort of payment and delivery innovations they each think have the best shot. I was surprised by how many leaders of interdependent healthcare sub-systems were meeting for the first time through this process. In my view, the sheer internal prep and intra-institutional cooperation required to submit a proposal for an innovation center challenge—repeated across the country—is worth $1b. The shared awareness of common challenges alone will promote the diffusion of successful demonstrations across institutions with synchronized priorities. This is notoriously difficult even with an iron-clad RCT in hand.
When the goal is changing system behavior, the design of the process should be a central concern. It creates a clear navigation path so the entire ecosystem can strive forward, not simply a few lucky demonstration projects that post improvements compared to controls. The demonstrations are like fruit on a tree. In a well designed process, the tree remains after the fruit is long gone. It’s easy to ignore the tree, much less its own interdependencies, when you’re entirely focused upon comparing control and treatment apples. They matter too, but an apple needs to satisfice a set of contextual requirements. Moreover, we already have a dedicated RCT/CEA funding stream for well designed healthcare delivery experiments called PCORI. The problem with forcing all innovation center demonstration projects to contain an RCT is that it massively constrains the space of participants and potential solutions, and would fundamentally compromise the systems-change process I described above. But I’d hardly expect people who seem to believe that informal domain knowledge is a fundamental source of bias to be eliminated to appreciate that.
Here [Singh writes] is a much more thoughtful exploration of these issues:
Over the last twenty or so years, it has become standard to require policy makers to base their recommendations on evidence. That is now uncontroversial to the point of triviality—of course, policy should be based on the facts. But are the methods that policy makers rely on to gather and analyze evidence the right ones? In Evidence-Based Policy, Nancy Cartwright, an eminent scholar, and Jeremy Hardie, who has had a long and successful career in both business and economics, explain that the dominant methods which are in use now—broadly speaking, methods that imitate standard practices in medicine like randomized controlled trials—do not work. They fail, Cartwright and Hardie contend, because they do not enhance our ability to predict if policies will be effective.
The prevailing methods fall short not just because social science, which operates within the domain of real-world politics and deals with people, differs so much from the natural science milieu of the lab. Rather, there are principled reasons why the advice for crafting and implementing policy now on offer will lead to bad results. Current guides tend to rank scientific methods according to the degree of trustworthiness of the evidence they produce. That is valuable in certain respects, but such approaches offer little advice about how to put that evidence to use. Evidence-Based Policy focuses on showing policymakers how to effectively use evidence, explaining what types of information are most necessary for making reliable policy, and offering lessons on how to organize that information.