One question comes up in almost every conversation with private hospitals, health insurers, and clinical operators evaluating a digital health platform: "What does implementation actually look like?"
It is a fair question. And it is one the industry has not answered honestly enough.
Most technology vendors describe implementation as a setup phase. A few weeks, maybe a month, and then you are live. What that framing misses is everything that determines whether the programme actually works: how care pathways are redesigned, how clinical staff are trained, how patients are enrolled, and how engagement is sustained after the first few weeks. Technology going live is not the same as care being delivered.
Here is a realistic picture of what 90 days looks like, and why each phase matters.
The myth that needs to go first
The most common assumption buyers bring to digital health implementation is that it is a heavy, disruptive IT project. That assumption shapes how internal teams approach the first month, and it creates unnecessary friction.
A well-designed digital health platform does not require dismantling existing clinical systems. It integrates around them. What does require careful work is not the technical setup. It is the pathway mapping: understanding which patient populations the programme serves, what the referral flow looks like, where the digital touchpoints sit within the existing clinical journey, and who on the care team owns the relationship with the programme.
This distinction matters because it changes where organisations direct their attention. The first thirty days are about alignment, rather than the product.
Days 1 to 30: Building the foundation
In the first month, the real work is organisational, rather than technical. Which clinical staff will use the clinician portal? Who owns patient onboarding? How does a patient get enrolled, and at what stage in their care journey? These questions do not have universal answers. They are specific to each organisation's structure, patient population, and existing workflows.
The WHO's Global Strategy on Digital Health 2020-2025 identifies workflow integration as one of the foundational conditions for successful digital health deployment. The strategy is explicit that the primary barriers to adoption are not technical in nature. They are structural: insufficient pathway design, unclear ownership, and misalignment between clinical staff expectations and how the platform actually functions.
What good looks like at the end of day thirty: a defined patient cohort, clinical staff trained on the portal, and at least one clinical lead who owns the programme internally. Not a perfect launch. A solid foundation.
Days 31 to 60: First patients, first signals
By the second month, the first cohort of patients is typically enrolled and active. This is where data starts to tell a story. Engagement rates in the early weeks, check-in completion, and the quality of coach-patient interactions are all leading indicators of whether the programme will hold over time.
This is also where one of the most persistent misunderstandings in digital health becomes visible. Digital tools do not automatically drive engagement. What drives engagement is structure: a personalised programme, a certified health coach available through the platform, and regular touchpoints built into the patient's routine rather than added on top of it.
NICE, which evaluates digital health tools against clinical evidence standards across the UK, consistently identifies sustained patient engagement as the differentiating factor between programmes that produce measurable outcomes and those that plateau after initial enrolment. A passive app is a different proposition from a structured, coach-supported programme. The difference in retention is significant.
Organisations that treat the platform as a passive tool see the drop-off patterns common to standalone wellness apps. Organisations that pair the technology with active, human-led coaching see a fundamentally different engagement curve.
Days 61 to 90: Optimisation, not completion
The third month is where a framing shift is essential. Ninety days is not when implementation is finished. It is when there is enough real-world data to begin optimising.
Are there points in the programme where engagement consistently dips? Are certain patient segments responding differently? Are coaches identifying clinical patterns that warrant adjustments to the care plan structure? These are the questions that should be on a clinical lead's desk at day sixty.
NHS evidence on digital chronic disease management, reflected in evaluations published through NHS England's digital transformation programme, consistently shows that long-term condition management requires longitudinal tracking. A 90-day window gives an organisation its first reliable baseline and the first genuine signal of what is working at the individual and cohort level. Chronic disease does not resolve in a quarter. The evidence base for what actually works in behaviour change, from dietary adjustment to physical activity adherence, points to programmes measured in months and years.
By the end of day ninety, the programme should have three things in place: a referral pathway that works with the existing clinical structure rather than alongside it; a coaching model that clinical staff understand and trust; and an analytics layer that gives leadership visibility without creating additional reporting overhead.
The human layer is not optional
One of the most consistent gaps in how digital health is procured is the assumption that software alone scales chronic care. It does not. Scaling personalised care for people living with Type 2 diabetes, obesity, or cardiovascular conditions requires redesigning how care is delivered, not just adding a new tool to the existing model.
The software is infrastructure. The programme is what drives outcomes. These are different things with different implications for implementation planning.
When a private hospital or insurer evaluates a digital health solution, the right question is not "what features does the platform have?" but "what does the care model look like, and how does the platform support it?" One is a feature evaluation. The other is an implementation strategy conversation. They lead to different decisions.
How Liva structures the first 90 days
Liva's platform supports three deployment models: digital only, virtual clinic, and in-person. That flexibility reflects the reality that no two organisations operate the same care delivery infrastructure. A private hospital running chronic disease programmes across multiple sites has different requirements from an insurer building a prevention programme for its membership base. What stays consistent across all deployments is the clinical governance structure: certified health coaches, a performance management framework, and clinical event management protocols embedded in the platform. The 90-day methodology is designed to establish that clinical foundation before scale is introduced, because scale without structure is where digital health programmes lose the clinical credibility needed to justify continued investment.
Ninety days is a starting point. The organisations that see lasting clinical and commercial outcomes are the ones that treat implementation as the beginning of a care programme, rather than the end of a procurement process.