Innovations Without Evidence: Breakthroughs in Healthcare Need Proof

Lidiya Tarasenka

Clinical Editor at Andersen

Healthcare
Jun 12, 2025
Reading time: 9 mins
  1. Introduction
  2. The innovation-evidence gap
  3. The limitations of traditional trials
  4. Faster paths to evidence: beyond the RCT
  5. Frameworks for safe adoption
  6. Conclusion

Introduction

The past few years have felt like a rolling earthquake, triggered by an explosion of new technologies across every field, and healthcare is no exception. Digital tools, AI, “game-changing” apps, and biotechnology breakthroughs promise to revolutionize patient care.

Yet amid this feverish innovation, a stubborn contradiction remains: medical progress moves at high speed, but medical evidence often crawls. Many technologies have raced ahead of the rigorous trials and real-world studies needed to prove they actually improve health outcomes. This disconnect between hype and proof is one of the defining challenges for health systems in the 21st century. Bridging this gap will require policy makers, regulators, and clinicians to find ways to separate genuine breakthroughs from digital snake oil – and to do so faster than in the past.

The innovation-evidence gap

As of early 2025, the global digital health landscape comprises approximately 337,000 health-related mobile applications.

Over 35% of adults in Europe reported active use of at least one health-related application or wearable device in 2024.

AI algorithms can now read medical images, wearable sensors record continuous health data, and telemedicine connects patients and doctors across continents. The COVID-19 pandemic turbocharged some of these trends, accelerating digital tool adoption. But the cadence of classical clinical studies is “misaligned with the ‘fail fast, fail often’ mantra” of technology start-ups; tech entrepreneurs iterate in months or weeks, while gold-standard trials can take years.

As a result, many tools become widely used before robust evidence catches up. In fields like digital health, entire categories of intervention have few or no randomized controlled trials (RCTs) to validate them, even as they reach patients’ hands. Consider clinical decision support software (CDSS) and AI diagnostic tools. These promise to assist clinicians by analyzing data or images faster and perhaps more accurately than humans. Dozens of AI systems have been approved for use in Europe. In the US, the Food and Drug Administration (FDA) had authorized 1,016 AI/ML-enabled medical devices as of September 2024, but only a few had been evaluated in prospective RCTs, raising concerns about real-world performance. Early deployments have revealed pitfalls: for example, a sepsis-prediction AI implemented in hospitals produced so many false alarms that clinicians began to suffer “alert fatigue” from faulty AI predictions. Such cases show that without evidence, even well-intentioned innovations can backfire or simply fail to deliver results.
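The alert-fatigue problem has a simple statistical root: when an event such as sepsis is rare, even a sensitive, fairly specific model generates mostly false alarms. A minimal sketch of this effect, using Bayes’ theorem with illustrative numbers (not figures from any published sepsis model):

```python
# Positive predictive value (PPV) of an alarm via Bayes' theorem.
# The sensitivity, specificity, and prevalence values below are
# illustrative assumptions, not results from a real clinical system.

def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Fraction of alarms that are true positives."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A model that looks strong on paper (85% sensitive, 90% specific)...
print(round(ppv(sensitivity=0.85, specificity=0.90, prevalence=0.02), 3))
# ...at 2% prevalence yields a PPV of ~0.148: roughly 85% of alerts
# are false, which is how alert fatigue sets in.
```

This is why trial-style accuracy metrics alone cannot predict how a tool behaves on a real ward, where disease prevalence is far lower than in curated study populations.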

Telemedicine experienced a meteoric rise during the pandemic; video consultations and remote monitoring became commonplace. Encouragingly, telehealth does rest on a growing base of research. By the mid-2010s, strong evidence of its effectiveness across multiple specialties – from cardiology to mental health – was already documented. Further studies in Europe and globally confirmed that virtual care can maintain quality and improve access. Still, telemedicine’s expansion raced ahead of policy, reimbursement models, and long-term comparative studies. Many health systems are now examining outcomes to determine when remote care truly benefits patients versus when an in-person visit is irreplaceable.

Wearable gadgets, meanwhile, have flooded the market – smartwatches that flag irregular heart rhythms, fitness bands that log vital signs, even “smart” clothing. Consumers have embraced these devices, and some clinicians use the data to guide care. But evidence for the clinical effectiveness of wearables remains scant. As of August 5, 2024, there were only six RCTs on the use of wearable devices outside of a hospital setting. In other words, wearables generate a deluge of data, but whether they measurably improve health outcomes (such as preventing heart attacks or improving chronic disease management) remains unproven in the majority of cases.

This hasn’t stopped eager early adopters, but it underscores the need for more validation. Even FDA-approved “digital therapeutics” – software prescribed as treatment – illustrate the gap between innovation and proof. In one notable case, an app for substance use disorder called reSET was cleared by regulators after initial studies. The company heralded its trial results as a breakthrough in boosting sobriety. However, an independent review by the Institute for Clinical and Economic Review found that the app’s pivotal trial had design flaws and that its primary endpoints showed no significant benefit. Despite these findings, the app’s clearance opened the door for two more apps (for opioid addiction and insomnia) to be approved without new clinical trials, as regulators deemed them substantially equivalent to the first product. The EU has been somewhat more cautious, requiring digital therapeutics to obtain a CE mark as medical devices and, in countries like Germany, to demonstrate some evidence of positive outcomes.

The limitations of traditional trials

RCTs – medicine’s gold standard for proof – excel at isolating the effect of a drug or intervention in a controlled setting, but they are slow (often taking years), expensive, and rigid. By the time an RCT is designed, approved, enrolled, and completed, a digital product may have undergone multiple software updates or become obsolete. Moreover, some digital interventions are difficult to test in traditional blinded trials. When testing a mobile app or wearable, for example, participants cannot be blinded to whether they are using the gadget – they know if they have the flashy new device or not. This introduces a “digital placebo effect”: studies without a tech placebo often report exaggerated benefits simply from the novelty or engagement of using a gadget. Incorporating appropriate control groups is tricky. In one case, after early uncontrolled studies suggested efficacy for a schizophrenia digital therapy, a larger trial added a “digital placebo” (a version of the app without the active component); the active app performed no better than the sham app.

Generalizability is another issue. A decision-support AI might be intended for deployment across millions of encounters in diverse hospitals, but RCTs typically involve a few hundred or thousand patients, under tightly managed conditions. Such a tool may subtly influence clinical decisions in ways that a narrow trial cannot capture. An algorithm might work well in a trial at academic hospitals, but falter in small community clinics or across different patient demographics.

Despite these challenges, it’s important to note that RCTs (and other robust studies) remain essential for many innovations. However, there is growing recognition that alternative methodologies can supplement or even replace the classic trial in generating credible evidence.

Faster paths to evidence: beyond the RCT

Several approaches aim to maintain scientific validity while speeding up feedback for innovators and reassurance for adopters:

Real-World Data and Real-World Evidence (RWD/RWE): Rather than lengthy new trials, regulators are looking at data from actual use in clinical practice. This can include electronic health records, patient registries, insurance claims, and digital device logs. The idea is to continuously harvest insights on safety and effectiveness from large populations in uncontrolled settings. Real-world studies can be done faster and cheaper than RCTs, though they must be designed carefully to avoid biases. Both the FDA and European Medicines Agency (EMA) have initiatives to harness RWE for decision-making. The EU’s upcoming European Health Data Space (EHDS) is explicitly intended to facilitate such data-sharing for research while “safeguarding individual privacy rights”. A learning health system, where every patient encounter feeds into outcome data, could dramatically accelerate evidence – provided the data is reliable and analyzed rigorously.

Adaptive and Pragmatic Trials: Adaptive trial designs allow modifications to the trial protocol in response to interim results – stopping early if something clearly works (or doesn’t), or shifting patients between arms.

Pragmatic trials, meanwhile, test interventions in real-world clinical settings with broad patient criteria and minimal interference in routine care. Both approaches can yield results faster or in more generalizable contexts than traditional RCTs. For instance, at Imperial College London, a digital health clinical simulation test bed has been used to safely observe how new health tech performs in a mock clinical environment, generating evidence without risking patient harm.
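To make the adaptive idea concrete, here is a toy sketch of interim-analysis logic for a two-arm trial: after each batch of outcomes, compute a two-proportion z-statistic and stop enrollment early only if the evidence crosses a deliberately conservative boundary. The boundary value and patient counts are illustrative assumptions, not a validated group-sequential design:

```python
# Toy sketch of an adaptive trial's early-stopping rule. A real design
# would use formal group-sequential boundaries (e.g., O'Brien-Fleming)
# chosen to control the overall false-positive rate across interim looks.
import math

def z_statistic(success_a: int, n_a: int, success_b: int, n_b: int) -> float:
    """Two-proportion z-test with a pooled variance estimate."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

def interim_decision(success_a, n_a, success_b, n_b, boundary=2.8):
    """Stop early only if the effect clearly crosses the boundary."""
    z = z_statistic(success_a, n_a, success_b, n_b)
    if abs(z) >= boundary:
        return "stop: clear effect (or harm)"
    return "continue enrolling"

# Hypothetical interim look after 200 patients per arm:
print(interim_decision(success_a=130, n_a=200, success_b=100, n_b=200))
# prints "stop: clear effect (or harm)" - a 65% vs 50% response rate
# crosses the boundary, so the trial can conclude early.
```

The conservative boundary is the key design choice: peeking at accumulating data repeatedly inflates the false-positive rate, so each interim look must demand stronger evidence than a single final analysis would.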

Continuous Post-Marketing Surveillance: Borrowing from pharmacovigilance, regulators can approve technologies on the condition of rigorous post-market data collection. This is somewhat akin to granting “provisional” approval: let the innovation out into the real world, but require the manufacturer to gather ongoing outcomes data and report back. If the benefits aren’t confirmed, approval can be withdrawn or limited. The EU Medical Device Regulation (MDR), adopted in 2017 and fully applicable since 2021, leans in this direction. It tightened requirements for medical devices (including software) to obtain CE marking, mandating a plan for clinical evidence and post-market follow-up proportional to risk. After a digital device is on the market, manufacturers in Europe are obliged to conduct post-market performance evaluations and update regulators – creating a feedback loop of evidence.

Iterative Evidence Building: Traditional drug development has phased trials (Phase I to III). Digital health might benefit from a similarly phased approach, but compressed. Early-stage studies can focus on usability and intermediate outcomes, mid-stage on surrogate clinical outcomes, and late-stage on definitive outcomes – all while the product is being refined. If done in tight iteration, this can resemble software version updates with evidence “patches” applied. One example is Germany’s DiGA (Digitale Gesundheitsanwendungen) fast-track program for digital health apps. Under the 2019 Digital Healthcare Act, apps that show initial promise in improving patient outcomes can be temporarily listed for reimbursement by statutory insurance. During this provisional phase of up to one year, the app’s developers must collect real-world data and prove a “positive healthcare effect.” If they succeed, the app gains permanent reimbursement status; if not, it can be delisted. This model encourages innovation but sets a clear timeframe for evidence – innovation with a safety net. Early results from Germany indicate that demonstrating clinical efficacy within that window requires substantial effort.

Frameworks and Guidelines for Evidence: To set clear expectations for innovators, health authorities have started publishing explicit frameworks. The UK’s National Institute for Health and Care Excellence (NICE), whose guidance is influential in Europe, introduced an Evidence Standards Framework (ESF) for Digital Health Technologies. First published in 2019 and updated in 2021–2022, the ESF lays out tiers of evidence required based on the risk and function of the technology. For example, a low-risk wellness app might only need basic usability and engagement data, whereas an AI diagnostic tool influencing clinical decisions would need rigorous clinical outcome evidence. NICE’s framework is meant to ensure digital tools are “clinically effective and offer value” before the National Health Service adopts them.

Such frameworks, along with regulatory guidance from bodies like the World Health Organization (WHO) and European Commission, help set benchmarks and prevent the most egregious hype-driven deployments. The WHO, for instance, released its first Digital Health Intervention Guidelines in 2019, stressing that digital innovations should be adopted on the basis of evidence and aligned with health system needs – not just because they are novel. The WHO cautions against the “if you build it, they will use it” mindset, noting that this approach has “failed time and again”.

Frameworks for safe adoption

Even with faster evidence generation, healthcare systems require structured channels to vet and absorb new technologies. Europe is leading on this front. Health Technology Assessment (HTA) agencies in countries like France, Germany, and Sweden are extending their remit beyond drugs to include digital health. An EU regulation taking effect in 2025 will standardize clinical assessments of high-impact technologies such as AI and telehealth, ensuring that robust evidence can be shared across borders. The EU’s Artificial Intelligence Act is also set to demand explainability and auditability in health algorithms.

Hospitals are establishing innovation committees comprising clinicians, ethicists, and patients, treating new tools as they would a novel procedure: evidence is reviewed, pilots are run, and deployment only proceeds if outcomes meet predefined thresholds. Tech champions may find this overly cautious, but patients should not be treated as beta testers.

Meanwhile, regulatory sandboxes offer supervised trial environments for emerging tools. In the UK, the MHRA and NHS AI Lab have tested AI applications under controlled conditions to support evidence-building. England’s Accelerated Access Collaborative further streamlines the uptake of validated technologies, fast-tracking those with proven benefit into the NHS.

Conclusion

In the long run, a culture of “innovations with evidence” will yield not just flashy press releases, but real gains in population health.
