The IFIC Podcast
The International Foundation for Integrated Care has defined nine pillars of integrated care based on the evidence accumulated over the last two decades. One of those pillars is Aligned Payments that Promote Integration. This is a difficult subject to understand, particularly for policymakers, service managers, and health and care professionals who are working to implement integrated care but are not financing and payment experts. This short podcast series features our Chief Executive Dr Niamh Lennox-Chhugani in conversation with four leading practitioners who have been researching and designing new payment models around the world. They demystify the language of payment models and the different models we see emerging in different countries.
Episodes
Friday Mar 13, 2026
In this episode, IFIC Chief Executive Dr Niamh Lennox-Chhugani is joined by Dr Robert Reid, Chief Scientist Emeritus at the Institute for Better Health in Toronto and Professor at the University of Toronto and McMaster University.
Robert is a global expert in population health, primary care, and learning health systems. In this conversation, he reflects on his journey from primary care physician into research and evaluation, and on what it takes to build learning health systems that genuinely improve health for patients and communities.
Drawing on his experience embedding research within health systems in Canada and beyond, Robert explores how evaluation can better support real-world decision-making. He discusses the importance of balancing the priorities of researchers, system leaders, and communities; how rapid mixed-method evaluations can generate useful evidence for policymakers; and why evaluation should be built into implementation from the beginning.
The conversation also looks at how learning health systems can expand beyond healthcare to address the wider determinants of health, working with partners across sectors such as education, urban design, and transportation. Throughout, Rob emphasises that evaluation is most powerful when it is used not just to judge success or failure, but to continuously improve care.
The discussion draws on Rob’s work with Sarah Greene in the article Gathering Speed and Countering Tensions in the Rapid Learning Health System, which explores why health systems still struggle to generate and use evidence quickly enough to improve care. The paper highlights the tensions that arise when researchers, health system leaders, and funders pursue different priorities, and argues that these tensions must be actively managed if learning health systems are to succeed.
Key insights from Robert Reid
On the purpose of evaluation
“It's not research and evaluation per se that's important — it's actually the practical applications of it to reach patients, all patients, and their communities.”
On whose priorities matter
“The whole purpose for the health system is to deliver health for patients and communities… those priorities should be our north star.”
On bringing stakeholders together
“When we bring people together, we can drive consensus in fairly efficient ways.”
On generating evidence quickly
“We have to generate evidence much quicker and be creative in the methods that we use.”
On mixed methods
“Mixing quantitative and qualitative evidence is absolutely essential.”
On evaluation that actually changes practice
“I view evaluation as essential for the improvement part of it… threading it through the implementation of any project.”
On learning health systems
“Learning health systems are really quality improvement on steroids.”
On looking beyond healthcare
“Health is a product of many things — the environment, work, transportation, family dynamics and social supports.”
Thursday Mar 05, 2026
In this episode, IFIC Chief Executive Niamh Lennox-Chhugani is joined by Professor Kathrin Cresswell, Professor of Digital Innovations in Health and Care at the Usher Institute, University of Edinburgh.
Kathrin is a social scientist with extensive experience evaluating large-scale digital transformation programmes, including the National Programme for IT, the Global Digital Exemplar Programme, and most recently the NHS AI Lab. Drawing on this work, she reflects on what formative evaluation can offer complex, digitally enabled change in health and care.
The conversation explores why impact evaluation alone is rarely enough in complex systems. Kathrin makes the case for formative and process evaluation that is embedded early, identifies emerging risks, and supports programmes to adapt in real time. Together, they discuss why some evaluations “sit on the shelf,” the tensions between independence and partnership, and the challenge of demonstrating impact when digital interventions can take years to stabilise.
Looking ahead, Kathrin argues for evaluation that is closer to practice — co-constructed with frontline teams, focused on learning, and continually asking whether an intervention is truly addressing the need it set out to solve.
Key insights from Kathrin Cresswell
On formative evaluation
“You become part of the intervention… you make it work.”
On the limits of traditional impact studies
“By then, you have no idea how it works… and you miss your chance to look at how you could make it work.”
On expectations of rapid impact
“Asking after two years whether a programme was successful… is absolutely crazy.”
On evaluations that lack real learning
“They’re the ones where the people who commission you want you to find something that they know in advance.”
On being involved early enough
“We always come in too late.”
On staying focused on purpose
“We need to keep coming back to what need this thing is meant to address.”
Thursday Feb 26, 2026
In this episode, IFIC Chief Executive Niamh Lennox-Chhugani is joined by Dr Ingo Meyer, Head of PMV Research at the University Hospital of Cologne.
With more than 15 years’ experience evaluating complex interventions across Europe, Ingo reflects on what it really means to evaluate integrated care in practice. From early European projects to his current work in Germany across oncology, palliative and primary care, he explores the persistent tensions between scientific rigour, practical relevance, and stakeholder expectations.
The conversation examines the blurred boundaries between research, evaluation and performance monitoring, and the challenge of delivering answers that are both methodologically sound and useful to decision-makers. Together, they discuss mixed methods, stakeholder communication, co-design approaches, and the growing — but still uncertain — role of artificial intelligence in evaluation.
Key insights from Ingo Meyer
On the complexity of evaluating integrated care
“The complexity of the evaluation is just as high as the complexity of what I want to evaluate.”
On rigour versus usefulness
“How can I look at things in an evaluation that is meaningful… and at the same time has enough scientific rigour so it’s done properly?”
On the difference between research and evaluation
“In research… maybe I find an answer, maybe not. In evaluation… it will be less open in terms of ‘sorry, we didn’t find anything.’”
On shifting the core evaluation question
“It is not often the right question, ‘Did it work, yes or no?’ — but rather, ‘How can I make it work?’”
On tailoring findings to different audiences
“My results need to be short, two pages, executive summary… but I always try to deliver the other things with it.”
On the importance of context alongside numbers
“I need to make sure that someone cannot just take the figure and run away with it… The context needs to be really glued to the numbers.”
On combining quantitative and qualitative insight
“I can see a lot of things, but I will never fully understand the why.”
On co-design and citizen science approaches
“At the beginning of my project, I’m pulling my stakeholders together… and we define at least a part of our research questions together.”
On artificial intelligence in evaluation
“I’m still a bit on the fence… I think it’s more a question of time rather than whether AI will ever be useful.”
Thursday Feb 19, 2026
In this episode, IFIC Chief Executive Dr Niamh Lennox-Chhugani is joined by Anna Wilding, Research Fellow at the University of Manchester, speaking from Melbourne.
Anna is a co-author of the recent paper Impact of the rollout of the national social prescribing link worker programme on population outcomes: evidence from a repeated cross-sectional survey, published in the British Journal of General Practice (available here: https://bjgp.org/content/75/761/e880). Drawing on this work, she reflects on how social prescribing has been implemented through primary care networks in England and what evaluation can tell us about its impact on population outcomes and patient experience.
The conversation highlights the practical challenges of evaluating complex, system-wide interventions — including data access and governance barriers, working with imperfect real-world data, and balancing methodological rigour with pragmatic decision-making. Together, they explore what evaluation can (and can’t yet) tell us about social prescribing at scale, why early involvement of evaluators matters, and how multidisciplinary teams can produce more meaningful and useful insights for policymakers and practitioners.
Key insights from Anna Wilding
On making complex evaluation accessible
“We knew we weren’t going for quite a general journal… so we wanted to make it as accessible as possible for people to understand.”
On linking data to study social prescribing
“We applied for data from the GP patient survey… and then we linked it with data sets from NHS Digital… that’s where the social prescribing link workers are funded from.”
On why evaluation design matters from the start
“The data wasn’t designed for research… so our ethics committee… wasn’t going to allow us to access that data for research.”
On the need to involve evaluators early
“It would be good to have the people who would be evaluating it embedded in the process from the beginning.”
On pragmatism versus perfection
“Sometimes done is better than perfect.”
On limits of causality in complex systems
“We have associations… but we might not know that this definitely caused this.”
On managing expectations about impact
“It’s not going to have these massive effects that we’re sort of expecting.”
On the value of multidisciplinary teams
“Them together actually makes a more powerful message… mixing together is better.”
On big data and its limits
“Sometimes the big data isn’t well collected either, even if you think it is.”
Thursday Feb 12, 2026
In this episode, IFIC Chief Executive Dr Niamh Lennox-Chhugani is joined by Deborah Cohen, Professor of Family Medicine at Oregon Health & Science University and a member of the US National Academy of Medicine.
Deb reflects on evaluating the scale and spread of an integrated, team-based care model in Oregon, originally piloted to support pregnant women living with substance use disorder. While early pilots showed promising outcomes, the expansion into rural settings revealed significant implementation challenges — offering a powerful real-world example of why evaluation needs to go beyond whether something “works” and focus on how and why interventions succeed or struggle in different contexts.
The conversation explores what evaluation can reveal about implementation, scale-up, and system readiness, and how evaluators can support learning in complex health and care systems — particularly when programmes move from successful pilots to wider adoption.
Key insights from Deborah Cohen
On what was known — and what wasn’t — about the pilot
“We know that this program is effective. We know that it costs more to deliver this care. What we don’t really know is how to implement it.”
On why scale-up struggled in new contexts
“None of those behavioral health organizations really have been able to navigate a relationship with the medical care part of the team such that they can truly integrate medical care and behavioral health care together.”
On what earlier evaluations missed
“That was work that had not been acknowledged in the other evaluation.”
On the role of evaluation in learning
“Evaluation, in my opinion, is meant to be designed to accelerate that learning process as rapidly as possible, because it’s all about making mistakes and learning from them in a transparent way.”
On how hard it is for implementers to hear difficult findings
“It’s very hard as an implementer to take in lessons when things aren’t working.”
On investing properly in evaluation
“If you shortchange your evaluation, you tend to sometimes shortchange what you can learn.”
On the value of mixed methods
“Evaluation gets stronger when those elements are being done together… and done together iteratively.”
On psychological safety and commissioning
“The tone and culture gets sort of set from the top and having a commissioner that’s open to really understanding the complexity of what’s going on on the ground can make a huge difference.”
On bringing evaluators in early
“That latter case is way better because you really get to build relationships early on.”
Wednesday Feb 04, 2026
Episode 1: Evaluation Explained — In Conversation with Tom Ling
In the first episode of this series, IFIC Chief Executive Dr Niamh Lennox-Chhugani is joined by Tom Ling, Senior Research Leader and Head of Evaluation at RAND Europe, and former President of the European Evaluation Society.
Drawing on decades of experience leading complex evaluations, Tom reflects on why evaluating integrated care remains so challenging — and why so much evaluation struggles to meaningfully inform practice, policy, and improvement. The conversation explores how evaluation can move beyond technical proficiency towards approaches that are more useful, more engaging, and more closely connected to the realities of people delivering and receiving care.
They also discuss Tom’s recent article, Reframing the Evaluation of Integrated Care: Examples from the NHS in England, which you can read here: https://www.rand.org/pubs/external_publications/EP71109.html
Key insights from Tom Ling
On how evaluations are communicated
“We produce almost astonishingly boring reports… some of them are unreadable. And even the executive summaries… are kind of 12 pages of really dense material.”
On what integrated care actually is
“It is treated in many evaluations as if it is a thing in the same sense that a new drug is a thing… That isn’t what integrated care is. It is an aspiration.”
On the danger of single measures
“Don’t get sucked into believing that one single measure is going to give you the perfect measure of integrated care. It will always be misleading.”
On the gap between what’s measured and what matters
“Very often what we’re asked to measure and what policymakers sometimes think they want to know isn’t the thing that matters most to patients, and indeed clinicians and carers.”
On involving people on the front line
“They’d given their energy, their time, their insights to the evaluators… and at the end of the process, the evaluators produced a study and a report which they couldn’t make sense of and didn’t describe their world. That has to be unacceptable.”
On the need for a better mix of evaluation approaches
“What we really need is a much longer scale study… instead of… the same evaluation questions… the same kind of time scale.”
On embedding evaluative thinking in decision-making
“That model is much more likely to create innovative insights and better thinking where you get evaluative thinking informing decisions.”
Monday Apr 07, 2025
In this episode of the International Foundation for Integrated Care's podcast series, 'Measuring the Impact of Integrated Care,' Niamh Lennox-Chhugani is joined by Dr Jason Cheah, Deputy Group CEO (Strategy, Planning and Resourcing) of the National Healthcare Group (NHG) and CEO of Woodlands Health (WH). He is also Adjunct Professor at the LKC School of Medicine.
Dr Cheah oversees cluster-wide strategy, finance, communications, resourcing and planning functions in NHG. These include driving financial and systemic change within the Group towards accountable care, implementing capitation design, managing costs, and transforming the workforce. He also leads NHG's organisational transformation roadmap and implementation plans, while working to strengthen the organisation's culture and identity.
Tuesday Apr 01, 2025
In this episode of the International Foundation for Integrated Care's podcast series, 'Measuring the Impact of Integrated Care,' Niamh Lennox-Chhugani is joined by David Harrison, an experienced accountant in the UK's health and social care system and Treasurer of IFIC's board.
David has a public-sector management, economics and commercial background. He is a qualified chartered accountant and MBA who, since qualifying, has worked extensively in the public and private sectors.
David’s advisory work on integrated care dates back to 2016. He is designated as one of NHS England’s integrated care “subject matter experts”, having been deployed in several roles, from policy developer to implementer, by NHS England to support and accelerate the adoption of integrated care in England.
Most recently, David spent four years as chair of a substantial, employee-owned social enterprise, delivering a mix of community health services and primary care services.
Key Moments
In this podcast episode hosted by the International Foundation for Integrated Care, Niamh Lennox-Chhugani converses with David Harrison, covering key discussions on measuring the impact of integrated care.
[02:35] They explore the transition to integrated care in England, the collaboration challenges between various health and social care organisations, and the importance of developing impact and outcome frameworks.
[08:33] Harrison highlights the necessity of senior leadership engagement, planning collaboratively to align institutional performance with system-level goals, and fostering trust among partners.
[14:10] How can integrated care systems isolate which improvements are necessary?
[21:40] The importance of collaborative planning so action can be taken early, instead of trying to understand problems when reviewing measurements after the fact.
Tuesday Mar 25, 2025
In this episode of the International Foundation for Integrated Care's podcast series, 'Measuring the Impact of Integrated Care,' Niamh Lennox-Chhugani is joined by Dr Ng Yeuk Fan, Director of Corporate Development at Yishun Health and Khoo Teck Puat Hospital, Singapore.
Yeuk Fan is a consultant public health physician experienced in health systems and services planning and development. At Yishun Health and Khoo Teck Puat Hospital, he is responsible for Health Systems and Services Planning and Evaluation, Insights and Analytics, Corporate Planning, and Organization Development. He is also Associate Professor at the Saw Swee Hock School of Public Health and Duke-NUS Medical School. His interests include integrated and value-based care, systems thinking, and health systems transformation.
Key Discussion Points
[05:12] – Discussion on real-world examples of integrated care implementation.
[10:45] – Challenges faced by healthcare professionals in adopting integrated care models.
[15:30] – Yeuk Fan shares insights on the importance of patient-centered approaches.
[20:20] – Discussion on how to evaluate the impact of integrated care on patient outcomes.
Tuesday Mar 18, 2025
In this episode of the International Foundation for Integrated Care's podcast series, 'Measuring the Impact of Integrated Care,' Niamh Lennox-Chhugani is joined by Ellen Nolte, Director of the NIHR Policy Research Unit in Policy Innovation and Evaluation, a collaboration between the LSHTM, the Care Policy Evaluation Centre at the London School of Economics and Political Science, and the University of Glasgow.
Ellen Nolte is an expert in public health. Her research focuses primarily on health services, international healthcare comparisons, and performance assessment, particularly addressing the needs of people with complex and long-term health issues.
Her work also explores person-centred health systems and the role of health system factors in improving care coordination and integration across sectors.
Key Discussion Points
[5:30-9:00] Defining impact in the context of integrated care, including the OECD's broad definition of impact and the nuances of treating integrated care as an intervention.
[9:00-12:00] Discussion on the difference between evaluation impact and the beneficiaries' perception of impact, stressing the need to clarify these differences.
[12:00-15:00] Examination of the policy perspective on integrated care, focusing on cost reduction and the importance of considering the broader impacts beyond just cost savings.
[15:00-18:00] Ellen emphasises the need for context-specific evaluation and the challenges of measuring integrated care across different settings and baselines.
[18:00-21:30] The importance of understanding patient and staff experiences and the relational aspects of care in evaluating integrated care.
[21:30-24:00] Discussion on the significance of allowing space and time for learning and making mistakes in integrated care implementation and evaluation.
[24:00-27:00] Highlighting the shift towards longer-term contracts in integrated care to provide stability and enable local systems to evolve.

Who are we?
At the International Foundation for Integrated Care (IFIC), our mission, as the leading international voice in integrated care, is to inspire, influence and facilitate the adoption of integrated care in policy and practice around the world.
Operating as a centre for excellence and underpinned by a high-quality, evidence-informed approach, we seek to work in collaborative partnership with our beneficiaries to develop, test, apply and lead this movement.
We seek to do this through the development and exchange of ideas among academics, researchers, managers, health and care professionals, users and carers of services, and policy and decision makers throughout the world.


