HACA 24 Recordings

For HACA 2024, the conference organising group compiled an exciting programme with quality speakers from across the UK, including thought-provoking talks from leaders in the analytics world, themed learning sessions, poster presentations and interactive workshops.

The recordings for the live streamed sessions in the programme can be found below.

Day 1

Auditorium

Opening Day 1: Welcome to HACA 2024

Recording Available Here!

Keynote: Sir Andrew Dilnot. Why the Analysis Counts.

Recording Available Here!

Analysis is essential to policy. Done well, analysis can make the work of policy making easy and clear. Ignored, or done badly, and we end up with muddled and failed policy. Andrew will argue that analysts and analysis are a vital tool in thinking about the challenges and opportunities we face, and how to respond to them.

 

Keynote: Anita Charlesworth, Health in 2040.

Recording Available Here!

Anita will provide an overview of recent trends in major illness and projections for future trends up to 2040. She will explore how trends in population health risk factors could shape future demand for health care, and the potential for a future expansion or compression of morbidity. Anita will outline how trends in major illness might affect the type of need which the NHS will be seeking to address, and will extend the analysis to share modelling results which attempt to estimate the implications of major illness trends for NHS funding and the model of care. She will look at the pattern of inequality associated with changing need and the implications for wider policy, including economic inactivity and pensions. The presentation will use the results of the Health in 2040 analytical partnership between the Health Foundation’s REAL Centre and the University of Liverpool, an example of the use of micro-simulation modelling for health policy. The study draws heavily on linked primary care and hospital data, together with evidence from survey data.

Participatory Research and Involving Experts by Experience: Talk and Panel Discussion.

Recording Available Here!

Main Presenter: The Health Foundation

Keynote: Andy Boyd. UK Longitudinal Linkage Collaboration: a transformative national resource linking UK longitudinal study data with diverse routine records

Recording Available Here!

The UK Longitudinal Linkage Collaboration (UK LLC) is a transformative national resource which enables analysts to pool rich and diverse data from over 25 UK longitudinal population studies with participants' linked health (NHS), socio-economic and environmental records. It enables analysts to investigate multidisciplinary research questions, service interactions and outcomes that could not previously be considered, thanks to the scale and detail of the combined information.
Initial projects using the UK LLC have demonstrated its potential to support the investigation of policy-relevant questions, and that rich pooled data, including detailed insights on behaviours and aspirations, can provide the data foundation needed to understand associations. The breadth of studies included provides a heterogeneous sample which is powered for sub-group analysis.
The UK LLC is a remotely accessible Trusted Research Environment: data is accessed through a single application process. UK LLC is currently free to access.

Reflections on the Role of the Analyst: An Audience with Anita Charlesworth.

Recording Available Here!

Interviewer: Fraser Battye

Supporting the Measles Outbreaks using Geospatial Technologies.

Recording Available Here!

Main Presenter/s: Ian Maxfield
Additional Presenter/ Co-Author/s: Richard Sanders

Measles is highly infectious; in fact, it is the most infectious disease transmitted by the respiratory route. It can be severe, especially in young infants and the immunosuppressed, and it can cause miscarriage and stillbirth in pregnancy. The most effective way to control it is high uptake of two doses of the Measles, Mumps and Rubella (MMR) vaccine. The World Health Organisation (WHO) set a target of 95% uptake; however, in England the rate for two doses at 5 years of age has been well below this, at 86%. The UK briefly achieved measles elimination in 2016 and 2017, but transmission of the disease returned in 2018, when Europe was suffering multiple epidemics. Cases dropped during the COVID-19 pandemic due to the societal and travel restrictions implemented, but measles cases have returned, with 789 cases logged by the UK Health Security Agency (UKHSA) between October 2023 and 21 March 2024, the majority in London and the Midlands.

Background/objective
Measles is the most highly infectious disease spread by the respiratory route. It can be controlled using the MMR vaccine; the WHO's target is an uptake rate of 95% across the population, but in England it is much lower, at 86%. Between October 2023 and March 2024, 789 cases of measles were recorded, most in the Midlands and London.
Public Health teams are tasked with increasing uptake rates but have limited access to the information resources they need to effectively plan and monitor interventions.

Method
In England, MMR vaccine uptake records among the 0-19 population are managed by Child Health Information Services (CHIS). Following a programme of Cancer Screening work, SCW Geospatial were asked by NHS England South West to map a dataset of MMR vaccinations from CHIS providers. Using methods pioneered during Covid and Flu immunisation programmes, an interactive online Geospatial Solution was developed. It presents users with a heatmap of vaccination uptake at a granular neighbourhood level of detail. The map can be filtered to focus on populations of different age groups, ethnic groups, specific NHS areas and on areas of socioeconomic deprivation.

Results/Further development
Following the initial version for the South West, an advanced version of the solution has been commissioned for the North West. This builds on the prototype, adding live data that is refreshed weekly, visualising change over time and presenting uptake rates at School and GP Practice level. Next steps are to agree and deliver the live solution to other Regions and enhance the functionality still further, showing School and GP Catchments and to calculate and visualise a Measles Susceptibility Index.

Conclusion
Feedback has been extremely positive; customers have been clear that the tools developed will have long-term value and could be extended to other vaccination programmes and to the adult population.

Building a Better Geospatial Data and Analytic Function for Improved Public Health Outcomes.

Recording Available Here!

Main Presenter/s: James Lewis
Additional Presenter/Co-Author/s: Joe Easley, Bryony Cook, Donna Clarke

The Geospatial Team at the UK Health Security Agency is learning the lessons of COVID-19 to enhance how geospatial data is managed, analysed and shared, to improve public health outcomes.
In 2022, the team partnered with Ordnance Survey to define a strategic approach to building improved geospatial capability, focusing on better geospatial data engineering, standardised geospatial products and developing a capability in Geospatial Analytics, Research and Insights (GARI). In addition, the team is developing a framework for geospatial standards and is working to grow membership of the fledgling UKHSA Geography Profession in partnership with the Government Geography Profession.
The capability presented and being developed by the team will support improved analytics for climate change, health equity and multiple hazards, including infectious diseases, CBRN incidents and environmental hazards. The team is exploring how automated data pipelines will support standardised analytics and packages, supporting accurate and trustworthy geospatial data and outputs. It is also developing the application of scalable address geography (UPRN) for enhanced common exposure and cluster detection analysis, and is leading on Geospatial Commission pilots to explore the use of Earth Observation and mobility data for public health insight.
In addition, the team is building partnerships across government and academia to consider research opportunities focused on relationships between the built and natural environment and the impact on population health.

Utilizing Location Analysis in NHS Blood and Transplant.

Recording Available Here!

Presenter: Gareth Humphreys

My presentation, titled "Utilizing Location Analysis in NHS Blood and Transplant", will showcase the pivotal role of location analysis in the strategic decision making of NHS Blood and Transplant (NHSBT). Specifically, it will demonstrate how this analytical approach aids the selection of optimal locations for new donor centres, thereby enhancing blood collection capacity and catering to diverse organisational objectives.

NHSBT employs a combination of temporary donation sessions, in venues such as church halls and sports clubs, alongside permanent donor centres in city centres or on hospital sites. Additionally, certain blood products, such as platelets and plasma, can only be donated at donor centres. In response to the nation's evolving donation needs, NHSBT has been charged with establishing new centres.

During the presentation, I will delve into the methodologies, datasets, and tools utilized to pinpoint suitable locations, starting from an extensive list of towns and progressively refining it to specific streets, for three distinct projects. These include maximizing the collection of Ro blood—an in-demand blood type used in treating Sickle Cell patients—expanding existing donor centre capacity by establishing a new centre, without diminishing existing sites, and relocating an existing centre once a lease has ended.

Furthermore, I will underscore the significance of high-quality data analysis in the decision-making process for selecting new centres, as well as its efficacy in addressing stakeholder challenges.

Ironbridge

Removing Friction in a Data Flow by Automated Geocoding of Assault Locations Captured as Free Text in Emergency Departments.

Recording Available Here!

Main Presenter/s: Michael Cheetham
Additional Presenter/Co-Author/s: Adam Woodgate, Martin Patrick Griffiths, Ben Bloom

Assaults recorded in ED represent a substantially different set of events to those recorded by the police. Since 2014, data on assault location and mechanism have been collected in Emergency Departments in England under the Information Sharing to Tackle Violence (ISTV) standard and are used to identify geographical hotspots, temporal trends and changes in the nature of violence. This informs a range of violence reduction activities. Implementing this nationally has been challenging because geocoding has been done manually at a local level. The objective of this study is to test the value of an automated geocoding algorithm.

Using data on all those presenting at a Barts Health UEC service in 2022 and 2023 recorded as having been assaulted (n= 6,998), this presentation describes a pilot natural language processing methodology for converting this free text to a point location. The methodology uses a combination of regular expressions (to identify postcode); phonetic matching using refined Soundex to a place name list (to identify streets and places) and supervised machine learning using subword vectors generated by fastText (to identify assaults happening at home).
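
As a rough illustration (not the authors' code), the sketch below shows how the three matching stages could fit together in Python: a postcode regular expression, a simple Soundex-style phonetic match against a hypothetical place-name list, and a placeholder keyword check standing in for the fastText-based classifier for assaults recorded at home.

```python
import re

# Simplified UK postcode pattern (illustrative; real validation is more involved)
POSTCODE_RE = re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}\b", re.IGNORECASE)

# Hypothetical gazetteer of local place/street names
PLACE_NAMES = ["Whitechapel Road", "Mile End", "Stratford", "Bethnal Green"]

def soundex(word: str) -> str:
    """Basic (unrefined) Soundex code, standing in for the refined variant used in the study."""
    word = re.sub(r"[^A-Za-z]", "", word).upper()
    if not word:
        return ""
    codes = {**dict.fromkeys("BFPV", "1"), **dict.fromkeys("CGJKQSXZ", "2"),
             **dict.fromkeys("DT", "3"), "L": "4", **dict.fromkeys("MN", "5"), "R": "6"}
    encoded = word[0]
    prev = codes.get(word[0], "")
    for ch in word[1:]:
        code = codes.get(ch, "")
        if code and code != prev:
            encoded += code
        prev = code
    return (encoded + "000")[:4]

def geocode_free_text(entry: str) -> dict:
    """Return the best location clue found in a free-text assault location entry."""
    postcode = POSTCODE_RE.search(entry)
    if postcode:
        return {"method": "postcode", "value": postcode.group().upper()}
    # Phonetic match each token against the gazetteer
    entry_codes = {soundex(tok) for tok in entry.split()}
    for place in PLACE_NAMES:
        if any(soundex(tok) in entry_codes for tok in place.split()):
            return {"method": "place_match", "value": place}
    # Placeholder for the supervised 'assault at home' classifier (fastText in the study)
    if re.search(r"\b(home|own address|flat)\b", entry, re.IGNORECASE):
        return {"method": "home", "value": "patient_home_address"}
    return {"method": "unmatched", "value": None}

print(geocode_free_text("assaulted outside shop, whitechaple rd"))
print(geocode_free_text("attacked near E1 4DG bus stop"))
```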

Up to 93% of the test data (n= 1,730) have the location identified as well as the entry allows. Of the errors, the majority are entries that should have been matched but are identified as not being matchable. This performance is good enough for most operational uses of ISTV data.
The intention is to refine this methodology to make this data available quickly across England and enable ISTV to be implemented nationally.

Automated Identification of Suicide Attempts in Police Record Reports.

Recording Available Here!

Main Presenter/s: Ehsan Taati
Additional Presenter/Co-Author/s: Lee Robertson

Keywords: Knowledge extraction, Natural Language Processing (NLP), Suicide attempts, Mental Health Interventions, Large Language Model

Aim:
This research aims to develop an automated pipeline to identify and extract reports of suicide attempts from police records. Using advanced Natural Language Processing (NLP) techniques, we intend to streamline the text labelling process by implementing prompt engineering with a large language model.

Background:
Manually analysing and labelling large volumes of textual data in the UK Police's record systems, specifically flagging incidents as suicide attempts, is labour-intensive and time-consuming. This slows the broader analysis and interpretation of critical information.

Methodology:
In developing our automated pipeline, we first employ text normalisation to standardise language from the police reports. Utilising prompt engineering with Llama2, we improve the pipeline’s ability to identify information about suicide attempts. Human evaluators with police and mental health expertise subjectively assess the pipeline, guiding iterative refinements in text normalisation and prompt engineering. This qualitative evaluation ensures the system aligns with nuances important to police and mental health. This integrated approach aims to create a sophisticated system to effectively identify and extract reports of suicide attempts from police records.
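
A minimal sketch of how such a prompt-based labelling step might be wired together (the prompt wording, normalisation rules and `call_llm` stub are illustrative assumptions, not the authors' implementation):

```python
import re

PROMPT_TEMPLATE = (
    "You are assisting with public-health analysis of police incident reports.\n"
    "Answer strictly YES or NO: does the following report describe a suicide attempt?\n\n"
    "Report: {report}\nAnswer:"
)

def normalise(text: str) -> str:
    """Basic text normalisation: lower-case and collapse whitespace."""
    return re.sub(r"\s+", " ", text.lower()).strip()

def call_llm(prompt: str) -> str:
    """Stand-in for a Llama 2 call (e.g. a local inference endpoint); replace with a real client."""
    return "NO"  # placeholder response so the sketch runs end to end

def label_report(report: str) -> bool:
    """Return True if the model labels the report as describing a suicide attempt."""
    prompt = PROMPT_TEMPLATE.format(report=normalise(report))
    answer = call_llm(prompt)
    return answer.strip().upper().startswith("YES")

print(label_report("Officers attended an address following a report of a disturbance."))
```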

Results:
Subjective evaluation, including calculating the number of correct and incorrect predicted labels in the dataset, demonstrated the system's accuracy. Efficiency was underscored by comparing to manual labelling efforts, while expert assessments emphasized high accuracy (83%). Dataset analysis confirmed the system's robust performance in identifying diverse instances of suicide attempts.

Conclusion:
In summary, our automated system utilizes advanced NLP and prompt engineering to effectively identify suicide attempt incidents within police reports. Validation through comparisons to manual labelling and expert evaluation confirms the system's efficiency and accuracy. This research contributes a valuable tool integrating technology, law enforcement, and public health, with the potential to enable timely and precise knowledge extraction to inform mental health interventions.

Accelerating NHS Feedback moderation at NHS.UK with NLP.

Recording Available Here!

Main Presenter: Eladia Liliana Valles Carrera
Additional Presenter/Co-Author/s: Alice Tapper, Daniel Goldwater, Matthew Taylor

Background:
Health and care service feedback mechanisms are crucial for continuous improvement and patient satisfaction. Every year we receive thousands of reviews for NHS services which must adhere to strict guidelines on personal information, abuse and discrimination. Moderating these reviews manually is a resource-intensive, time-consuming, and expensive task that hinders user experience.

Aim:
This project aimed to automate the content moderation process for written reviews on NHS.UK by using natural language processing (NLP) techniques to improve efficiency, scalability, and user satisfaction.

Methods:
Each rule in the content moderation policy was tackled separately, using approaches ranging from regex for the simpler rules to more sophisticated NLP tools such as Named Entity Recognition and Part of Speech tagging for identifying names and descriptions. We trained three models using logistic regression, BERT, and SVM classifiers: one to identify suicidal and self-harm related content, one to distinguish between personal experiences and generic comments, and a complaints model to flag reviews requiring formal escalation. The text in the reviews was transformed into a numerical representation using various embedding methods to capture the textual patterns and the semantic meaning of the reviews.
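
As an illustration of the rule-specific classifiers described above, the sketch below trains a toy complaints flag with scikit-learn; the TF-IDF features and made-up example reviews stand in for the richer embeddings and labelled data used in the project.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Toy, made-up examples standing in for labelled moderation data
reviews = [
    "The staff were kind and the ward was clean",
    "I want to make a formal complaint about my treatment",
    "Great experience at this GP practice",
    "This was negligent care and I am escalating a complaint",
]
is_complaint = [0, 1, 0, 1]

# TF-IDF stands in here for the embedding methods described in the abstract
complaints_model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("clf", LogisticRegression()),
])
complaints_model.fit(reviews, is_complaint)

print(complaints_model.predict(["I wish to raise a complaint about the waiting time"]))
```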

Results:
The implementation of the automoderation tool has enhanced the moderation process. It has reduced the time from several days to near real-time, decreased operational costs, and offers the potential to scale the moderation service. The models have performed on a par with, or even surpassed, manual moderation.

Conclusions:
The use of advanced analytics in the form of NLP has proven to be a powerful tool in addressing the challenges of moderating written reviews for health and care services. This project exemplifies how innovative solutions can lead to substantial improvements in service delivery and efficiency, and sets a precedent for future applications of NLP to enhance public health services.

National Competency Framework: The Future of the Data and Analytics Workforce.

Recording Available Here!

Main Presenter: Sarah Louise Blundell
Additional Presenter/Co-Author/s: Andrew Lavelle

Background:
The national competency framework launched in October 2023. The framework covers Data Analysts, Data Scientists and Data Engineers, looking not only at the competencies required within each discipline but also wider competencies in Leadership, Working in Projects and Behaviours. The Framework can be further nuanced by incorporating a Specialism dimension and an environment domain.

Aim:
To hear how its adoption has impacted the development of data and analytics staff, how the framework will be expanded in 2024/25, and how the framework forms the foundation of the Data and Analytics Academy.

Methods:
The approach throughout the Framework's development has been collaborative, engaging with colleagues across health and social care. Each development stage has been carried out in collaboration with experts within the data and analytics community to ensure a broad range of views could be collated.

Results:
By the HACA 2024 conference we will be able to showcase the impact of adoption on staff skill assessment and development planning in adopter organisations. Currently there are 36 organisations adopting the NCF (March 2024).

Conclusion:
The reality is there will be no conclusion for the Framework. It will continue to be dynamic, evolving and adapting as the requirements and demands on the data professional workforce change. By raising awareness, increasing interest and maximising engagement with the Framework, we hope to create a truly integrated, informative and constructive tool that will support analytical development and professionalisation for individuals, teams, organisations and collaborative system-level intelligence functions.

Nursing local, thinking global: UK Registered Nurses and their Intentions to Leave.

Recording Available Here!

Presenter: Nuha Bazeer

Background: In recent years, record numbers of nurses trained outside of the UK and EU (overseas-trained nurses) have moved to the UK and are now working in the NHS, filling vacancies. Whilst UK nurse “inflow” is well evidenced, outflow to other countries is more difficult to track, with limited evidence on the indicators prior to nurses leaving the UK, or the workforce.

Aim: To identify trends in the potential outflow of UK registered nurses between 2018/19 and 2022/23, and explore their connection to changes in the global labour market.

Methods: Nursing and Midwifery Council (NMC) registered nurses seeking to work abroad must apply for a Certificate of Current Practicing Status to prove their practicing status to foreign registration authorities.

The number of nurses applying for a certificate is therefore a reasonable proxy for potential outflow and was used to calculate the likelihood of UK registered nurses seeking employment abroad. Trends in application numbers were assessed against the length of time on the register before application, their training country, and the country they applied to, relative to the total number of nurses on the register.

Results: In 2022/23, overseas-trained nurses were 7 times as likely to apply as UK-trained nurses and accounted for 70% of the almost 12,000 overall applicants. Almost 25% of overseas-trained nurses had 3 or fewer years on the UK register before applying, compared to around 3% in 2018/19. Over 80% of applications were for potential destination countries with attractive labour market conditions, policies, and remuneration: Australia, New Zealand, and the USA.

Describing Trends in General Practice Staffing Mix using Latent Profile Analysis.

Recording Available Here!

Presenter: Alexander Lawless

Background:
General practices in England are currently facing major workforce pressures with falling numbers of full time General Practitioners (GPs) and rising demand for services. There is interest in understanding whether skill mix changes have improved access to general practice, increased the range of general practice services available to patients, or reduced use of other parts of the health care system.
This study aims to identify clusters in practice staffing levels using routine general practice consultations and to assess distribution of clusters within practice deprivation quintiles.

Methods:
This cross-sectional study utilises Clinical Practice Research Datalink (CPRD) data to derive practice-level daily full time equivalent (FTE) rates per 1,000 registered patients by staff group. Consultation data grouped by staff type were aggregated to medical, nursing, registered healthcare professional (HCP), healthcare assistant (HCA) or administrative roles. Practice FTE totals by staff group were analysed using Latent Profile Analysis (LPA) methods to identify clustering. Practice Index of Multiple Deprivation (IMD) quintile was linked to examine the relationship between deprivation and practice staffing configurations.
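
For readers less familiar with LPA, the sketch below uses a Gaussian mixture model (the standard statistical engine behind latent profile analysis) on made-up practice-level FTE rates, selecting the number of profiles by BIC; it is an illustrative analogue rather than the study's actual code.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Made-up practice-level FTE rates per 1,000 registered patients
# columns: medical, nursing, registered HCPs, HCAs, administrative
fte = np.vstack([
    rng.normal([0.5, 0.3, 0.2, 0.1, 1.0], 0.05, size=(50, 5)),  # "GP-led" style practices
    rng.normal([0.3, 0.5, 0.4, 0.2, 1.2], 0.05, size=(50, 5)),  # "mixed skill" style practices
])

X = StandardScaler().fit_transform(fte)

# Fit mixtures with 1-5 profiles and pick the number minimising BIC,
# mirroring the usual LPA model-selection step
models = [GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 6)]
best = min(models, key=lambda m: m.bic(X))
profiles = best.predict(X)

print("Chosen number of profiles:", best.n_components)
print("Practices per profile:", np.bincount(profiles))
```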

Results:
Analysis is currently in progress. All results will be completed and available by June 2024 in line with NHS England funding. Distribution of practices across latent profiles will be explored. Deprivation profiles of our staffing clusters will be generated and assessed; similarly, the geographical distribution within our profiles will be assessed to correlate geography and primary care staffing configurations.

Implications:
This analysis operationalises data on various staff groups and creates a composite measure that incorporates comparative levels of staffing within the practice. The utility of a composite measure that describes staffing can be compared against crude measures such as the proportion of staff at a practice who are medics. The next logical application is to examine the relationship between staffing mix and patient outcomes, for example secondary care admissions for avoidable conditions.

Understanding Drivers of Productivity in the General Practice Sector.

Recording Available Here!

Presenter: Anastasiia Zharinova
Additional Presenter/Co-Author/s: Sarah Blincko, Rob Scott

With funding challenges and growing demand on health services, it is more important than ever to ensure that the NHS is run efficiently. While NHS acute productivity is well researched, there is limited knowledge about how productive the General Practice sector is and what factors contribute to its productivity the most.

This work investigates different approaches to estimating productivity of General Practices. We explored two metrics: total factor productivity and labour productivity. We linked multiple datasets, such as GP Workforce Data, ONS Demographics, NHS Payments to GP, GP Appointments Dataset and others. We applied regression modelling to analyse how General Practice characteristics, patient characteristics and other factors affect productivity in general practices. To ensure quality of the model and mitigate data limitations, we built both panel models and cross-sectional models and conducted additional robustness checks. In addition, we investigated quality metrics to test feasibility of quality adjustments and evaluate if there is a correlation between quality and productivity of General Practices. Over the course of the project, we worked with Primary Care policy teams and data teams to ensure both a correct interpretation of data insights and clear practical outcomes from analytical insights.
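
A highly simplified sketch of the kind of cross-sectional regression described above, using made-up practice-level data and a crude labour-productivity measure (a panel specification would add practice and time effects); the variable names are illustrative assumptions, not the project's actual model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200

# Made-up practice-level data standing in for the linked datasets described above
df = pd.DataFrame({
    "appointments_per_1000": rng.normal(500, 60, n),
    "gp_fte_per_1000": rng.normal(0.5, 0.08, n),
    "nurse_fte_per_1000": rng.normal(0.3, 0.06, n),
    "pct_over_65": rng.uniform(10, 30, n),
    "imd_quintile": rng.integers(1, 6, n),
})
# Simple labour-productivity measure: appointments delivered per clinical FTE
df["labour_productivity"] = df["appointments_per_1000"] / (
    df["gp_fte_per_1000"] + df["nurse_fte_per_1000"]
)

# Cross-sectional model; a panel version would add practice and time fixed effects
model = smf.ols(
    "labour_productivity ~ gp_fte_per_1000 + nurse_fte_per_1000 + pct_over_65 + C(imd_quintile)",
    data=df,
).fit()
print(model.summary().tables[1])
```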

This analysis can inform policy decisions around additional support for General Practices. It can also support practices when making operational decisions about optimal skill mix of staff or addressing population needs. In addition, this work contributed to Primary Care Data Strategy and can inform future analysis in this area, for example, productivity of pharmacies.

Revolutionising the COVID-19 Epidemiological Bulletin.

Recording Available Here!

Main Presenter: Colleen Dempster
Additional Presenter/Co-Author/s: Emilia Dulaj Kmiecik

The COVID-19 Health Protection Surveillance team at the Public Health Agency (PHA), Northern Ireland, produce a weekly, public-facing COVID-19 Epidemiological Bulletin. Initially the production of the bulletin used various systems such as Access, Excel and Word, which were embedded together. This manual process was time consuming and error-prone due to the limitations of these tools. There was a risk that accurate data, critical to the surveillance of the ongoing pandemic, might not be available. This would impact decision making on testing policies, case management and vaccination campaigns.

It was crucial that we identified and implemented a more resilient and robust way of producing the COVID-19 Epidemiological Bulletin. We started by bridging the gap between Access and Microsoft by writing R scripts. This protected us from the immediate loss of data. We streamlined the process to R Markdown, moving away from Excel and Word. We worked with other teams moving to a cloud-based data analytics platform. We created various R-Studio-Connect apps to produce automated outputs that have made reporting processes more efficient and reduced the likelihood of error. This was a long process as we had to map out all data pipelines coupled with extensive data validation steps.

Revolutionising the production of the COVID-19 Epidemiological Bulletin was the result of a successful cross-departmental collaboration project lasting around 18 months. This project required staff upskilling and developmental work to take place in parallel, whilst ensuring information governance was in place and continuing to respond to the acute demands of the pandemic. Investing this time allowed automation of reports, which has reduced person hours, increased accuracy by removing any chance of human error and enhanced the skill set of our staff. This work has been used as a template for other workstreams, thereby further improving efficiencies and allowing staff to work in a more agile and collaborative way.

Wenlock

Alcohol Dependence Dashboard and Return on Investment Tool: Health Inequalities and Sustainability.

Recording Available Here!

Main Presenter: Bethany Thompson
Additional Presenter/Co-Author/s: Tom Frost

Keywords: dashboard, visualisation, metrics, ROI, impact, health inequalities

Introduction:
In response to Long Term Plan commitments, Alcohol Care Teams are being implemented in secondary care in England. Analysis of patient level data was required to monitor activity and outcomes, to identify health inequalities and to support the financial sustainability case for the programme.

Methodology:
Analysis of the patient-level data collection was used to create an Alcohol Dependence Dashboard which allows Trusts, ICBs and Regions to see their data through a variety of metrics. There has also been a particular focus on health inequalities within the dashboard, with breakdowns of all metrics by different demographics (age, gender, ethnicity, deprivation). The Return on Investment Tool was developed to support the case for continued investment in alcohol services at sub-national level. It is based on hospital data and insight from a case study in the NICE Quality and Productivity collection.

Conclusions and recommendations for further research:
Both the dashboard and ROI tool were well received by Trusts, ICBs and Regions and allowed them to show and quantify the benefits of ACTs. The dashboard has been updated multiple times since first publication, with improved functionality based on user feedback, and now that the collection is in its third year, the volume of data is providing important insights into inequalities at local and national level. Webinars were held to showcase the dashboard and the ROI tool and gave an overview of how the products worked.

Scaling Healthcare Equity Analysis of Access and Experience.

Recording Available Here!

Presenter: Matthew Eves

Health Inequalities in Rates of Activity in Emergency Departments for Children.

Recording Available Here!

Main Presenter: Robert Watkins
Additional Presenter/Co-Author/s: Adrian Harvey

Activity for children at University Hospitals Leicester has increased by 13% since 2018/19. To understand and address this increase in demand, a small team was commissioned to undertake analysis of rates of activity and possible links to health inequalities. The team included data analysts, Paediatric ED consultants, Public Health professionals and commissioners. The findings are to be presented to the ICB board.

Re-designing the Mental Health Learning Disability and Autism (MHLDA) Integrated Performance Reporting for the NHS South East Region.

Recording Available Here!

Main Presenter: Jamie Gale
Additional Presenter/Co-Author/s: Elisa Santoro, Neil Jackson

Purpose:
• The Integrated Performance Report (IPR) is used by the South East regional MHLDA Clinical Delivery Team for internal meetings and briefings, requests for information and discussions with ICBs.
• A survey was recently circulated to those who receive the IPR. Feedback was that it was too long, had too many metrics, and the charts were difficult to interpret.
• Overall, stakeholders found it challenging to get the intelligence they required, so it was agreed to re-design the report.

Methods:
• Meetings took place between programme leads and analysts to work collaboratively on the new report.
• The structure of the report and metrics were defined by the group based upon a patient pathway approach, considering business needs and regional priorities (i.e. aligned to the Regional and the NHS Long Term Plan).
• Any duplicate metrics were identified and consolidated.
• With the restructure of NHS England, the MH and LDA programmes were merged into one; additional meetings took place to integrate the LDA metrics within the report.
• Manual data collection processes were reviewed via a task and finish group; new metrics were added and those no longer required were stood down.
• As the report became established, it was automated using RStudio and the ‘plot the dots’ Statistical Process Control chart package (an illustrative sketch of the underlying SPC calculation is shown below).
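
For illustration, the SPC logic behind a 'plot the dots' style chart can be reproduced in a few lines; the sketch below computes XmR-style process limits (the chart type commonly used in NHS 'Making Data Count' reporting) on made-up monthly data and is not the team's production code.

```python
import numpy as np
import matplotlib.pyplot as plt

# Made-up monthly values for one metric in the report
values = np.array([52, 48, 55, 60, 47, 53, 58, 49, 62, 57, 51, 54], dtype=float)

mean = values.mean()
moving_range = np.abs(np.diff(values))
# XmR chart limits: mean +/- 2.66 * average moving range
upper = mean + 2.66 * moving_range.mean()
lower = mean - 2.66 * moving_range.mean()

plt.plot(values, marker="o")
plt.axhline(mean, linestyle="--", label="mean")
plt.axhline(upper, color="grey", label="upper process limit")
plt.axhline(lower, color="grey", label="lower process limit")
plt.legend()
plt.title("Illustrative SPC chart for one report metric")
plt.show()
```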

Results:
• Clear exception-based narrative enabled evidence-based and higher quality decision making.
• One version of the truth for the metrics provided, using existing data sources when possible.
• Simplified report with fewer, more concise slides and no duplicate metrics.
• Reduced analyst time to produce the report and make changes.
• Appendices with additional detail were built into the report, with no additional resource required.
• Manual data collections were streamlined, reducing the data burden placed upon ICBs and providers.

Co-producing and Piloting a Quality Early Warning Signs Framework.

Recording Available Here!

Main Presenter: Kathryn Marie Lupton
Additional Presenter/Co-Author/s: Michael Richardson

How do we get better at integrating analytics and research into national policy development? How do we co-produce analytical and policy tools in a meaningful and iterative way with end users?

This presentation will draw on emergent learning from the piloting of the National Quality Board’s Quality Early Warning Signs Framework, being led by NHS England. The QEWS framework has been designed to support providers, systems, regional and national teams to identify, mitigate and manage early signals about the quality of care, relating to leadership, culture, governance and other considerations.

The approach is underpinned by a dashboard hosted on Model Health System, and a framework signposting to qualitative data and wider sources of information. It is currently being piloted in four Integrated Care Boards and being overseen by a Working Group with representatives across system, regional and national teams.

By drawing on the learning from these pilots, this presentation will explore:
- The case for building analytics and social research into policy and strategy development as business as usual
- How we can do this through the conversations we have and how we work together across system, regional and national geographies, with people using services, clinicians, policy makers and analysts
- Key ingredients, enablers and challenges to achieving this, including culture, leadership and behaviours
- What difference this can make to how policies are adopted and the impact they have, with insights into how we spread the learning.

In-Depth Analysis and Insight: A Cancer Pathway Review within an Integrated Care Board.

Recording Available Here!

Main Presenter: Ruth Green
Additional Presenter/Co-Author/s: Kelly Bishop, Joanne Bayliss

This presentation will focus on the work NHS ML BI Consultancy Unit are leading, in collaboration with NHS ML Nursing and Urgent Care Team, to support an Integrated Care Board (ICB) to understand the potential opportunities for improvements in cancer pathways.

Background:
As referenced in the national NHS priorities and operational planning guidance for 2023/24, early cancer diagnosis is an NHS priority. One of the biggest actions the NHS can take to improve cancer survival is to diagnose cancer earlier.

Funded by the regional Cancer Alliance, the ICB identified the need for a deep dive cancer pathway analysis. The areas of focus of this in-depth review were late-stage cancer diagnosis and a review of primary care cancer referrals.

Aim:
The aim was to identify and investigate variations in cancer pathways. The insight gained informed recommendations to improve the patient experience by optimising referral to diagnosis timescales, and improving early diagnosis, where possible and appropriate.

Approach / Key message:
Our joint approach of clinical and analytical teams working closely together was key to the delivery of the project, which encompassed clinical audit as part of the analysis. Working in partnership with the ICB, primary care and acute hospital provider cancer services teams within the ICB geography meant we could draw on a wide range of data and information sources, and obtain the required input from clinical specialists as part of a case study approach.

Adopting a mixed method analysis enabled a rich study to develop, providing the evidence needed to inform recommendations for improvement and transformation of cancer pathways across the integrated care system, with the potential to be adopted across other ICBs / regions.

Using Machine Learning and Secondary Care Activity Data to Identify Risk of Cancer Earlier.

Recording Available Here!

Main Presenter: Hadi Modarres
Additional Presenter/Co-Author/s: Thomas Henstock, Divya Balasubramanian, Dimitris Pipinis, Gursimran Thandi, Achut Manandhar, Rupert Chaplin

Introduction:
Timely cancer diagnosis is critical for improving patient outcomes, yet late diagnosis remains a challenge associated with limited treatment options and increased mortality. The NHS aims to enhance early cancer detection by 2028, a goal further complicated by COVID-19 related disruptions. Identifying high-risk sub-populations for certain cancers, characterised by vague symptoms and low incidence rates, is particularly difficult. Leveraging advanced machine learning techniques offers promise in addressing this challenge within the NHS. This study utilises NHS England's patient-level data and data science capabilities to predict cohorts at elevated risk of future cancer diagnoses with the aim of targeting and tailoring interventions to sub-populations that are at higher risk.

Method:
A dataset comprising 29 million individuals over 40 in England was compiled, integrating various datasets including the master patient index, population segmentation data, 111 calls, hospital attendances, and mortality records. Features capturing healthcare interactions (e.g. number of 111 calls, number of hospital attendances), demographic, socioeconomic, and clinical diagnosis variables were developed. Machine learning algorithms were trained to predict future cancer risk for all cancers and specific priority sites identified by the cancer programme.
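
A toy sketch of the modelling step, with synthetic data standing in for the linked datasets and a gradient-boosted classifier standing in for whichever algorithms the team used; the feature names and outcome generation are illustrative assumptions only.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 5000

# Made-up feature table standing in for the linked NHS England datasets
X = np.column_stack([
    rng.integers(40, 95, n),   # age
    rng.poisson(0.5, n),       # 111 calls in the last year
    rng.poisson(1.0, n),       # A&E attendances in the last year
    rng.integers(1, 6, n),     # deprivation quintile
])
# Synthetic outcome loosely tied to age and A&E attendances, for illustration only
risk = 0.02 + 0.0015 * (X[:, 0] - 40) + 0.01 * X[:, 2]
y = (rng.random(n) < risk).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"AUC on held-out data: {auc:.2f}")
```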

Results:
The trained models achieved a performance of 0.79 AUC, comparable to existing literature. Notably, interactions with the healthcare system, such as A&E attendances, emerged as significant predictors of future cancer risk. Identified high-risk cohorts exhibited up to a tenfold increase in cancer incidence compared to baseline population rates.

Discussion:
This study highlights the potential of secondary care data coupled with machine learning in identifying high-risk cohorts for targeted interventions. The incorporation of primary care and cancer registry data could further enhance predictive accuracy, representing a key area for future research within this project.

The Lincolnshire Living with Cancer Programme Dashboard - Transforming Words and Feelings to Numbers and Statistics.

Recording Available Here!

Main Presenter: Kathie McPeake
Additional Presenter/Co-Author/s: Kelly Dixon

35,000 people are living with cancer in Lincolnshire. This is predicted to rise to over 45,000 by 2030. A cancer diagnosis can have a profound effect on a person, with not just physical health impacted. People’s mental health, finances, relationships and work can also be negatively affected. Evidence shows that people who have access to good, personalised support before and after a cancer diagnosis often have better outcomes than those who don’t.
The Lincolnshire Living with Cancer Programme is an innovative, collaborative, co-produced programme which is transforming the way in which people affected by cancer access personalised support. We are creating a better and sustainable future for supporting people living with cancer, involving and integrating all relevant parts of the health and social care system, using the assets we already have, supporting people in the place they would like and in the way they would like, and placing people at the centre of everything we do. Data drives our programme, and we measure impact by synthesising quantitative and qualitative data.
At the start of the programme, qualitative data in the form of patient stories told us what needed changing to improve patient experience. It informed our approach to transformation and what we needed to do. In our delivery phase, we need to understand the impact we’re having. We use quantitative and qualitative data to do this. Our data analysts have developed a dashboard bringing together all quantitative data relevant to the programme from different organisations in the Lincolnshire system, which shows the difference we’re making. They have also developed a methodology of measuring the impact of qualitative data (case studies) which quantifies patient experience and enables us to express patient experience as a return on investment for our ICB Investment Panel. We also use data to inform our service delivery.

Beckbury

Getting Our Tools Used: Learning from the First Year of an NHS Data Science Team.

Recording Available Here!

Main Presenter: Brandon Chapman
Additional Presenter/Co-Author/s: Joe Turner, Jo Davis

With a small data science team now in place at RCHT, we have both developed tools and worked on getting them used in operational practice. We have done this because of the ability of data science to open up new ways of understanding the range of information available to the hospital, especially on the predictive side, enabling the operational and clinical teams to take more informed actions.
To do this, we have developed a range of new products working with the clinical and operational teams. Successes have included a machine learning based DNA predictor which can predict with good accuracy which patients are most likely not to attend. Working with our outpatient transformation team, we have been contacting these patients to bring about an actual reduction in non-attendances.
We have also developed a suite of predictive tools in urgent care, for instance:
- An estimate of individual risk of emergency department admissions
- Urgent care alert system which generates an overall ED pressure score based on the indicators most related in practice to ED crowding
- Assessing the impact of seasonal tourism on emergency and trauma theatres
- Assessing demand for blood in emergency operating.
The results to date have included getting some tools used in decision-making, for instance delivering over 200 outpatient slots’ worth of benefit, but also an improvement in our understanding of how we improve take-up of what we develop.
Learning points have included the importance of project definition, addressing questions that operational teams want answered, as well as identifying operational questions which are boundaried enough to lend themselves to actions dependent on what the tools show. Also important is designing tools with an eye to how they will ultimately be used (e.g. high specificity for the DNA tool, to focus limited resource for calling patients).

Why Modelling of Inpatient Discharge Predictions is Key to Reducing ED Admitted Patient Delays and Elective Cancellation: An Example from UCLH.

Recording Available Here!

Main Presenter: Dr Zella King
Additional Presenter/Co-Author/s: Alison Clements

In a health system under pressure, with hospitals operating at capacity, difficult decisions are made daily. Operational managers must decide when to initiate escalation measures like discharging patients earlier, cancelling elective admissions or diverting ambulances because patient care is no longer safe.

These decisions depend on having a solid picture of current capacity and imminent pressures. Gaining that picture often involves manual data collection, validation of data in the Electronic Health Record (EHR), frequent meetings with ward managers, and the use of simple heuristics to predict end-of-day bed state. With such activities, the promise of EHRs to unleash greater productivity and better insight remains unrealised. This is unfortunate both for hospitals themselves and for the national, regional and sector bodies who are seeking real-time oversight of urgent and emergency care through System Control Centres and the Operational Pressures Escalation Levels (OPEL) framework.

At HACA 2023 we presented a predictive model of short-term demand for emergency beds, which is in daily use at UCLH. That model provides one tool to inform decision-making about escalation. However, for the full picture, the ‘supply side’ is also important. In this joint talk from an operations director and a data scientist, we introduce models that predict short-term discharge activity at different levels of aggregation and certainty. When combined with predictions of demand, and based on real-time data, such models give operational managers better insight about where capacity needs to be freed up, and when to initiate escalation measures.

We discuss the data challenges and some of the realities of hospital life that make this important work difficult. We lay out a vision of how predictive modelling of short-term capacity can supplement the view provided by an EHR, and how this can lead to better informed decision-making about the provision of urgent and emergency care.

A&E Admissions Forecasting: How to Develop, Maintain and Improve a Data Science Model in Production. Lessons Learnt.

Recording Available Here!

Presenter: Jane Kirkpatrick

Background:
Our team develops and maintains an A&E Forecasting tool which supports operational decision making.
We make daily 21-day ahead forecasts of non-elective admissions via A&E for 121 trusts in England. These are presented on a user-facing dashboard, with forecast explanations and historic accuracy metrics.

Rationale of our work:
Since this tool was stood up during the pandemic, we were working to tight timelines to deliver a product that met users’ top priorities. We now have an opportunity to improve the tool and expand its use. We have heard from users the improvements they would find useful. We want to share our code, and have refactored to improve code usability.

Actions:
This presentation will focus on the continuous development of the Bayesian hierarchical model we use to forecast – new features tried and lessons learnt. It will be an honest presentation of the reality of data science tool development – sometimes you try things, and they don’t work as planned!
Recently we have worked on:
- Providing forecasts with breakdown by site. To enable this we explored normalisation approaches for input data. We found this did not lead to expected performance improvements.
- Forecasting for attendances as well as admissions. This involved adapting the model, tuning parameters of prior distributions and modularisation of code.
- Exploring if data on respiratory infections could improve predictions.
- Refactoring and maintaining codebases for modelling and deployment.
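
For readers unfamiliar with the approach, the sketch below shows a toy Bayesian hierarchical (partially pooled) count model in PyMC on made-up admissions data; it illustrates the general structure only and is far simpler than the production forecasting model described above.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_trusts, n_days = 5, 60

# Made-up daily admission counts for a handful of trusts (illustration only)
true_rates = rng.uniform(20, 80, n_trusts)
admissions = rng.poisson(true_rates[:, None], size=(n_trusts, n_days))
trust_idx = np.repeat(np.arange(n_trusts), n_days)
y = admissions.reshape(-1)

with pm.Model():
    # Population-level distribution of (log) trust admission rates: partial pooling
    mu = pm.Normal("mu", 4.0, 1.0)
    sigma = pm.HalfNormal("sigma", 1.0)
    log_rate = pm.Normal("log_rate", mu, sigma, shape=n_trusts)
    pm.Poisson("obs", pm.math.exp(log_rate[trust_idx]), observed=y)
    idata = pm.sample(500, tune=500, chains=2, progressbar=False)

# Posterior mean (log) rate per trust; in practice this would feed a 21-day-ahead forecast
print(idata.posterior["log_rate"].mean(dim=("chain", "draw")).values)
```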

Outcome:
“I think you’ve got something which is ground-breaking and incredibly useful.”
Associate Director for Performance Information, Luton & Dunstable Hospital
Maintaining and improving an analytical tool is an important part of its lifecycle. This work has allowed us to better meet users’ needs, improved user buy-in and supported working towards publishing our code.

Uncertainty, Politics, and Analytics: Building Projections of the Elective Waiting List.

Recording Available Here!

Main Presenter: Melissa Co
Additional Presenter/Co-Author/s: Freya Tracey, Kathryn Marszalek

In January 2023, the Prime Minister pledged that ‘NHS waiting lists will fall and people will get care more quickly’. In October 2023, we analysed what it would take to achieve this pledge before the general election. The NHS elective care waiting list has been growing since 2013 and increased sharply during the pandemic because elective care was suspended. Even after lockdowns were lifted, the waiting list continued to grow because the number of treatments completed had not caught up with the steady stream of new referrals.
We created four projections for how the waiting list would change up to January 2025 with and without the effect of industrial action (strikes by consultants and junior doctors) and varying productivity (increases/decreases in completed treatments). To help illustrate the uncertainty in the system, we also created an interactive calculator using R Shiny to allow readers to explore their own alternative scenarios.
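
The underlying flow logic of such a scenario calculator can be sketched very simply: each month the list grows by new referrals and shrinks by completed treatments, with a productivity lever scaling treatment volumes. The figures and parameter names below are illustrative only, not the projections or inputs described in the talk.

```python
def project_waiting_list(start, monthly_referrals, monthly_treatments,
                         months=15, productivity_change=0.0):
    """Project the waiting list forward under a simple flow model:
    next month's list = this month's list + referrals - completed treatments."""
    waiting = start
    trajectory = [waiting]
    for _ in range(months):
        treatments = monthly_treatments * (1 + productivity_change)
        waiting = max(waiting + monthly_referrals - treatments, 0)
        trajectory.append(waiting)
    return trajectory

# Illustrative numbers only
print(project_waiting_list(start=7_700_000,
                           monthly_referrals=1_500_000,
                           monthly_treatments=1_480_000,
                           productivity_change=0.01)[-1])
```
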
The politicised backdrop – industrial action, the upcoming general election, the critically stretched NHS – added challenges beyond the analytical modelling itself. We had to balance our commitment to objectivity and accuracy while accepting we could not model the full, continually changing system.
We will discuss our analytical approach, including our choices when modelling the effect of industrial action and where and why we decided to allow users to make their own decisions. With an additional 9 months of data available by the time of the presentation, we will also compare our projections to reality and reflect on why they may differ.
National and local healthcare analytics will continue to face challenges working on uncertain and politically-sensitive topics – and this is increasingly likely as pressures on the NHS mount. We aim to share the lessons we have learned and our thoughts on how to manage these challenges.

Predicting Deaths on the Waiting List.

Recording Available Here!

Main Presenter: Kamil Barczak

The growing NHS waiting list has been in the spotlight for years. Cuts to the NHS budget, the growing elderly population and the effects of the 2020 COVID-19 pandemic have all contributed to record-breaking growth in the number of people waiting to be seen and in referral to treatment (RTT) times. A report by the Health Foundation suggests that the NHS waiting list is expected to peak at more than 8 million by summer 2024. Coinciding with long RTT times are deaths of patients on waiting lists. There is a real concern that longer RTT times mean more deaths and increased disease impact. Alongside reducing RTT times, methods to prioritise those who are most vulnerable to the effects of longer waits can be effective in minimising the impact of long waiting times.

Our aim was to develop a risk stratification tool using machine learning to identify patients most likely to die within 12 months of their pathway journey.

Retrospective open pathway and death data were collected, cleaned and temporalised. A recent 12-month period was chosen to track deaths within 12 months of starting a pathway. A previous 12-month period was used for training. Any information prior to the beginning of the training data was used to calculate metrics of interest, such as death rate by pathway. We used the Synthetic Minority Over-sampling Technique (SMOTE) and random under-sampling to reduce the impact of data imbalance. We also used a logistic regression-based feature selection method to reduce the dimensionality of the training data, and modelled risk of death per pathway using a logistic regression model, treating each record as a unique sample.
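
A compact sketch of that modelling pipeline using scikit-learn and imbalanced-learn, with synthetic data in place of the pathway extract; the feature columns, prevalence and sampling ratios are illustrative assumptions, not the project's settings.

```python
import numpy as np
from imblearn.over_sampling import SMOTE
from imblearn.under_sampling import RandomUnderSampler
from imblearn.pipeline import Pipeline
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 4000

# Made-up pathway-level features (e.g. age, wait so far, prior admissions, specialty death rate)
X = rng.normal(size=(n, 10))
# Rare outcome: death within 12 months of pathway start (imbalanced, illustrative prevalence)
y = (rng.random(n) < 0.03).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

model = Pipeline([
    ("smote", SMOTE(sampling_strategy=0.3, random_state=0)),               # over-sample the minority class
    ("under", RandomUnderSampler(sampling_strategy=0.6, random_state=0)),  # then thin the majority class
    ("select", SelectFromModel(LogisticRegression(max_iter=1000))),        # LR-based feature selection
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```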

Model outputs have been integrated into a Power BI dashboard and deployed onto Cornwall’s platform. We plan to collect feedback from the client in the upcoming weeks.

The Use Of Real-World Data In Decision Making At The National Institute For Health And Care Excellence (NICE).

Recording Available Here!

Presenter: Shaun Rowark

For the past 25 years the National Institute for Health and Care Excellence’s (NICE) core purpose has been to help practitioners and commissioners get the best care to people fast, while ensuring value for the taxpayer. While this purpose remains constant, how NICE makes decisions has changed to ensure we focus on what matters most, provide useful and useable advice, and constantly learn from data and implementation.
In this lightning talk I will describe how NICE has adapted to the proliferation of real-world data and the development of real-world evidence methods and how we use this data and methodology to make decisions for the development of our products. These products focus on national guidance as well as whether medicines are approved for patient access in England. I will also cover future considerations for NICE, such as patient reported data from wearables, synthetic data and data generated by artificial intelligence.

Right Data, Right Method, Right Insight - The Development of the Manchester Measuring Inequalities Toolkit.

Recording Available Here!

Main Presenter: Neil Bendel
Additional Presenter/Co-Author/s: Eliza Varga

Measures of socio-economic inequalities are important markers for health and social policy and for society in general. It is important that indicators and tools which purport to measure socio-economic inequalities are both accurate and presented in a way that minimises the chances of misinterpretation and supports better quality decision making.

Drawing on the earlier work of ScotPHO and PHE, the Manchester Measuring Inequalities Toolkit is designed to fill the need for an easy-to-use, interactive training package which illustrates and explains some of the most commonly used methods for measuring health and other inequalities, including simple measures of inequality gaps, regression-based inequality measures (e.g. the Slope Index of Inequality) and Lorenz-curve-based measures, and how these should be calculated, visualised and interpreted. Through this, the Toolkit will help to improve the development and monitoring of interventions to address socio-economic inequalities by helping information analysts and policy makers to produce and utilise more statistically rigorous and accurate outputs describing the scale and nature of the inequalities in a local area.
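
As an example of the kind of calculation the toolkit explains, the sketch below computes a Slope Index of Inequality from made-up decile-level data: the indicator is regressed on the population-weighted relative rank of each deprivation decile, and the slope gives the modelled gap across the full deprivation range.

```python
import numpy as np

# Made-up example: a health indicator by deprivation decile (1 = most deprived)
population = np.array([5200, 5100, 5050, 5000, 4980, 4950, 4900, 4850, 4800, 4750])
rate_per_100k = np.array([820, 760, 700, 655, 620, 590, 560, 540, 515, 500])

# Relative rank: midpoint of each decile's cumulative population share,
# ordered from most to least deprived
share = population / population.sum()
cum = np.cumsum(share)
rank = cum - share / 2

# Slope Index of Inequality: slope of a population-weighted regression of the rate
# on relative rank (negative here because the rate falls as deprivation decreases)
coeffs = np.polyfit(rank, rate_per_100k, deg=1, w=np.sqrt(population))
sii = coeffs[0]
print(f"SII: {sii:.0f} per 100,000 across the deprivation range")
```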

This presentation will describe in more detail the purpose of the Measuring Inequalities Toolkit, how it was commissioned and the initial content. It will go on to explain how selected modules were piloted by a group of analysts from a range of different organisations and how the content was then revised following the feedback provided. Finally, it will describe the next steps in the development of the toolkit and how this fits in with the wider programme of work to monitor the impact of Making Manchester Fairer – the City’s action plan for tackling health inequalities.

Supporting the Development of a 'Healthier and Fairer' Programme using System Wide Data and Insight.

Recording Available Here!

Presenter: Anna Pickford

The Healthier and Fairer Programme within the North East and North Cumbria is a whole health and care system transformational programme of prevention, healthcare inequalities, and social and economic inequality projects to improve population health.

The development of the programme has included the use of analytics from inception, adopting modelling and population health management approaches, a strong culture of evaluation and joint working across system partners.

The Business Intelligence function has supported the programme with five core components:

Intelligence and Insight – Providing evidence-based advice on emerging themes for the population, producing Healthcare Needs Assessments, delivering training on identifying inequalities within wider Programmes of work, advising how to measure and demonstrate impact.

Analytics and dashboard development – Providing cohort analysis for specific projects, creating a model of estimated need for increased weight prevalence, producing Power BI dashboards for workstreams to aid ongoing monitoring of key population issues specific to their agenda.

Performance – Demonstrating progress against national and local priorities such as CORE20plus5 agenda, ICB priority strategies and national policy.

Research and Evidence – Working in partnership with universities to evaluate specific projects. Writing quantitative protocols, undertaking analysis and contributing to the writing of academic and board papers.

Data quality and data flows – identifying appropriate data flows and where not available, working with providers to create new flows.

We currently have 5 self-service workstream dashboards live which are available to all system partners including Local Authorities, Acute and Mental Health providers and OHID.

The dashboards include: Alcohol Prevention, Healthy Weight and Treating Obesity, CVD Prevention, Tobacco Control, Public Health and Prevention in Maternity, and an overarching Healthier and Fairer dashboard.

We have embedded routine performance reports to the ICB executive, four evaluations are in progress, and three Healthcare Needs Assessments relating to the prevention agenda have been completed for NENC ICB.

Day 2

Auditorium

Opening Day 2: Welcome to HACA 2024

Recording Available Here!

Keynote: Emma Gordon. Linking health and administrative data: mitigating the missed use of data

Recording Available Here!

Imagine a world where it didn’t matter what your agency was: your priorities were rooted in one over-arching strategic plan for our nation. An ambition spanning multiple organisations, with every single one of us a pivotal part of the journey. A way of working powered by the rise of whole-systems insights through multi-agency collaborations, towards a single mission: to ensure every single citizen in our nation was THRIVING!

Imagine if we made this HACA the moment that changed our nations by putting away our logos and egos and combining our knowledge, technical expertise, wisdom and humility to transform how we do business together!

Keynote: Richard Humphries. Adult Social Care and Health: Policy, Purpose and the Power of Data.

Recording Available Here!

Current and Future Patterns of Inequalities in Diagnosed Illness by Deprivation.

Recording Available Here!

Main Presenter: Ann Raymond
Additional Presenter/Co-Author/s: Toby Watt, Hannah-Rose Douglas, Laurie Rachet-Jacquet, Anna Head, Chris Kypridemos

Introduction:
The existence of wide inequalities in health across England is well-documented. Our research adds to this evidence on inequalities in self-reported health by describing current patterns and projecting future patterns of inequality in diagnosed illness across multiple conditions by deprivation.

Methods:
We used the IMPACTNCD microsimulation model that simulates a close-to-reality synthetic population of adults in England from 2019 to 2040. This model combines individual-level data on demographics, health and mortality from linked administrative data for primary and secondary care with survey responses on individual-level risk factors and epidemiological evidence on the associations between risk factors and chronic illness.

We used the Cambridge Multimorbidity Score (CMS) as our multimorbidity measure. This assigns a weight to 20 common long-term conditions based on individuals’ healthcare use and their likelihood of death. We further focus on “major illness” which corresponds to a CMS greater than 1.5.

Results:
Our preliminary results project that health inequalities will not improve between 2019 and 2040. In 2019, the difference in the average time spent without major illness between the most and least deprived 10% of areas in England was 10.4 years. This is projected to remain largely unchanged at 10.7 years (8.8, 11.7).

We also find that in 2019, the share of working age people living with major illness in the most deprived 10% of areas in England (14.6%) was more than double the rate seen in the least deprived 10% of areas (6.3%). In 2040, we project these rates to remain largely unchanged at 15.2% (13.0%, 17.6%) and 6.8% (5.4%, 9.1%) respectively.

Discussion:
On current trends, health inequalities are projected to persist into the future. This has significant implications not just for population health but for labour supply and wider economic growth.

Data, Analytics, and Decision Making in Adult Social Care

Recording Available Here!

Access All Areas- Creating Space for Generic and Open Source Simulation Modelling.

Recording Available Here!

Presenter: Sally Thompson

Simulation models provide longitudinal insight into a system’s behaviour, whilst creating a space for risk-free exploration of diverse scenarios. Traditionally, these models are often commissioned for a singular customer group, against one agenda, yet the questions a model seeks to address are not likely to be unique, particularly within health and care. The bespoke nature of simulation modelling can be a barrier to its wider use, as can the need for specific, often expensive, software to be able to interact with the model.
The uptake and usage of simulation models can be increased by making models:
• more generic - with users able to apply their own data within a pre-defined structure;
• and more easily available – by creating apps with open-source software, or through publicly available repositories.
A system dynamics model that projects future demand for care home places is used as an example of this more accessible approach to sharing simulation models.

Keynote: Marc Farr. It’s the Data [and Analytics] Stupid

Recording Available Here!

Marc will discuss the importance of the data layer and the role of the analyst in how we tackle planning, population health and research. With a data landscape made even more chaotic (and exciting) than ever before by the FDP, the SDE, the emergence of AI, RAP and RPA, Marc will attempt to set out a framework for how we develop data strategies for the next five years.

Harnessing the Power of NHS Jobs Vacancy Data to Support the National Competency Framework's Vision for a Standardised Data Professional Workforce.

Recording Available Here!

Presenter: Thomas Owen

The NHSBSA has access to the NHS Jobs dataset, which contains job description text for hundreds of thousands of vacancies advertised through NHS Jobs each year. Analysis of the NHS Jobs dataset could complement initiatives, such as the National Competency Framework (NCF), and help improve the recruitment and retention of NHS staff. The NCF was introduced by NHS England to enhance the professionalisation and standardisation of Data Professional roles. An analysis using the NHS Jobs data could supplement the NCF by assessing the current variability within the Data Professional vacancies.

We analysed 1.7K Data Professional vacancies advertised through NHS Jobs during the 2021-22 and 2022-23 financial years. Technical skills were extracted from job descriptions using a dictionary-based approach, and vacancy similarities were estimated using the Jaccard Index. Dimensionality reduction, via multidimensional scaling, was applied to visualise the vacancy similarities. This approach quantifies the degree of variability across vacancies before the introduction of the NCF and helps us to assess the current Data Professional landscape.
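
As a rough illustration of the similarity step described above (not the authors' code), the R sketch below builds a small binary skills-by-vacancy matrix, computes Jaccard distances and projects them into two dimensions with classical multidimensional scaling; the vacancy and skill names are invented.

```r
# Illustrative sketch only: Jaccard similarity plus classical MDS on a toy
# skills-by-vacancy matrix (vacancy and skill names are invented).
skills <- matrix(
  c(1, 1, 0, 0,   # vacancy A requests R and SQL
    1, 1, 1, 0,   # vacancy B requests R, SQL and Python
    0, 1, 0, 1),  # vacancy C requests SQL and Power BI
  nrow = 3, byrow = TRUE,
  dimnames = list(c("vacancy_A", "vacancy_B", "vacancy_C"),
                  c("R", "SQL", "Python", "PowerBI"))
)

# For binary rows, dist(method = "binary") returns 1 minus the Jaccard index
jaccard_dist <- dist(skills, method = "binary")

# Classical multidimensional scaling down to two dimensions for plotting
coords <- cmdscale(jaccard_dist, k = 2)
plot(coords, type = "n", xlab = "Dimension 1", ylab = "Dimension 2")
text(coords, labels = rownames(skills))
```
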
The NCF focuses on enhancing Data Professional roles across the NHS. Our quantitative analysis of Data Professional job descriptions could help by providing a baseline understanding of the current Data Professional workforce. Notably, variability in the requested technical skills during recruitment is evident across the NHS for all three Data Professional roles (Data Scientists, Data Engineers and Data Analysts).

Our work aligns with the objectives of the NCF to help standardise and professionalise the NHS data workforce. A recent report by NHS England forecasts that the NHS Digital, Data, and Technology workforce will increase by 69% between 2020 and 2030 to meet demand. Given the rapidly evolving workforce, the success of initiatives such as the NCF is paramount. Our analysis could assist the NCF, providing a quantitative framework to measure its impact and efficacy on the workforce.

Combining Analytics and a Complexity-Informed Relational Approach to Facilitate System Change and High-Quality Collaborative Decision Making in Midlands In-hospital Paediatrics.

Recording Available Here!

Presenter: Jennifer Wood

When clinicians presented NHS senior leaders in the Midlands with complex problems that would require multi-organisation solutions, their ability to respond to the challenge was highly constrained. On the one hand, how could they generate the kind of “hard” evidence typically used to manage trade-offs in moving resource around? And on the other hand, how could they generate solutions that respond to the compositional, dynamic, experiential and governance complexity of a system of 23 Provider Trusts?
In this project we took a complexity-informed, mixed-methods approach to solving issues identified in Midlands in-hospital paediatrics. The aims were:
• to generate collective understanding and consensus on the issues and their potential solutions, supported by evidence and
• to facilitate mechanisms to enable effective collective decision making.
We started with a problem formulation workshop with clinicians. From this we generated a system map of the “as is” situation, including causal factors and how they interact.
We then initiated the analytical work, while concurrently carrying out qualitative work – interviewing decision makers, clinicians and subject matter experts, to broaden our understanding, and build commitment to action.
The quantitative analysis included exploratory descriptive analysis covering trends and flows, cohort-specific complex medical needs analysis, and future demand modelling.
Regular touchpoints between the qualitative and quantitative teams allowed us to learn from one another, generating hypotheses and allowing for analytical “deep dives”.
The product was a Case for Change, which presented a range of new and compelling evidence.
This was presented back to the system via two workshops in which 139 clinicians and NHS leaders were brought together in person. They were asked to evaluate and respond to the evidence, connect and network with one another, take individual action and generate collaborative solutions, which we harvested for the next stage of work.

Neurodiversity at Work.

Recording Available Here!

Main Presenter: Dani Collier
Additional Presenter/Co-Author/s: Kierstan Lowe

Background /Objective:
“What”, “How” and “Why” are the themes of this year’s event. I would like to add “Who”.
Recently, I was diagnosed autistic and ADHD and I struggled to communicate my needs.
I use the “Working with Me” document to help with this. I share openly, as I recognise that many of the traits that led to my diagnosis also appear as key skills within the analytical field, such as logic, pattern recognition, questioning, creativity and problem-solving.

Methods:
Awareness is improving, but there is still a lot of ambiguity and nervousness in relation to talking or asking about disability and neurodiversity.
Using the free “My User Manual” team building resource from Atlassian, the examples given helped me look at how I work best, whilst factoring in my needs as a neurodivergent individual.
With the support of my manager, we completed the document. We looked at how the information might land with colleagues and customers. The document is still very much my voice. As someone with communication differences, it is important to me to get my message across and help others understand how this is done.

Results:
I have shared the document with many colleagues, networks and organisations and had great feedback. Some people have created their own documents. My confidence has improved, whilst allowing me to be my authentic self. I feel able to have conversations and speak up where I may not have been able to previously.

Conclusion:
The “Working with me” document supports how people can work at their best, whatever their needs. It can be useful for all individuals and teams, not just neurodivergent and/or disabled people. It’s a document that helps build teams quickly, with understanding and compassion, and helps break down barriers, allowing people to get on with the job.

Closing of HACA 2024

Recording Available Here!

Ironbridge

Improving Acute Hospital Flow Through Real-Time Stochastic Modelling of Daily Discharge Requirements for Onward Care.

Recording Available Here!

Presenter: Nick Howlett

Failure to discharge acute hospital patients in a timely manner can lead to elevated acute bed occupancy which can compromise patient safety and have knock-on effects for upstream services such as Accident and Emergency. A barrier to timely acute discharge is often the availability of intermediate care services for patients that require continuing rehabilitative care past the point of being medically fit. In the NHS, these time-limited services are known as ‘Discharge to Assess’ (or D2A) and there are three pathways along which such patients can be routed following acute discharge readiness – Pathway 1 involves daily home visits and Pathways 2 and 3 involve bedded care (with the latter reserved for those with particularly complex needs). If there is insufficient capacity along the D2A pathways, then the patients wait (i.e., queue) within the acute hospitals. We develop a real-time computer simulation model to stochastically estimate, for each of the next ten days, (1) the number of acute patients that will become ready for discharge along each of the D2A pathways, and (2) the total number of acute patients that will be awaiting discharge (i.e., the queue size). These are based on personalised predictions of discharge readiness date and D2A pathway requirement for all currently admitted patients not yet in a D2A queue. These outputs are combined with the corresponding (non-personalised) predictions for new acute admissions, which are forecasted through a time-series method. The models, updated each day with the latest data, have been implemented in a large healthcare system in and around Bristol, with outputs used to support efforts to improve hospital flow through enhanced discharge planning.
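
The abstract above does not publish the model itself; as a loose illustration of the stochastic element only, the R sketch below simulates how many currently admitted patients might become discharge-ready on each of the next ten days, using hypothetical readiness-day predictions.

```r
# Loose illustration only (not the authors' model): Monte Carlo estimate of
# how many current inpatients become discharge-ready on each of the next
# ten days, given hypothetical per-patient readiness-day predictions.
set.seed(42)
n_patients <- 200
horizon    <- 10
n_sims     <- 1000

one_sim <- function() {
  # Hypothetical: each patient's discharge-ready day is drawn from 1..14;
  # days beyond the ten-day horizon are ignored
  ready_day <- sample(1:14, n_patients, replace = TRUE)
  tabulate(ready_day[ready_day <= horizon], nbins = horizon)
}

sims <- replicate(n_sims, one_sim())   # horizon x n_sims matrix of daily counts

# 5th, 50th and 95th percentiles of predicted discharge-ready counts per day
apply(sims, 1, quantile, probs = c(0.05, 0.5, 0.95))
```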

Development of a Python Package for Public Health Statistical Methods.

Recording Available Here!

Main Presenter: Jack Burden
Additional Presenter/Co-Author/s: Cameron Stewart, Thilaksan Vikneswaran

PHEindicatormethods is an R package which allows users to calculate key public health statistics quickly, based on the most up-to-date academic guidance. It provides functions for the generation of proportions, rates, Directly Standardised Rates (DSRs), Indirectly Standardised Rates (ISRs), means, life expectancy and the slope index of inequality (SII). Confidence intervals can also be generated, and data allocated to quantiles. The package is used not only in the Department of Health and Social Care (DHSC) but also by statisticians across the world, as it is publicly available through CRAN.
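
For orientation, a minimal usage sketch of the existing R package is shown below; the data are invented and argument details may differ between package versions.

```r
# Minimal usage sketch of the existing R package (illustrative numbers only;
# argument details may vary between package versions).
library(PHEindicatormethods)

df <- data.frame(area       = c("A", "B"),
                 events     = c(65, 82),
                 population = c(100, 120))

# Proportions with confidence intervals, expressed as percentages
phe_proportion(df, events, population, multiplier = 100)
```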

Developing Health Trends in England in Quarto.

Recording Available Here!

Main Presenter: David Jephson
Additional Presenter/Co-Author/s: Clare Griffiths, Alyson O'Neill, Sam Dunn, Annabel Westermann, Marika Kulesza

Since 2020 there has been increased interest in online dashboard-style products as a mechanism for displaying data in an interactive and accessible format for the public. The value of such an approach was clearly demonstrated during the pandemic by the UK coronavirus dashboard.

Health Trends in England (HTiE) is a new official statistics dashboard-style report which provides an easy to navigate summary of selected data from the Fingertips platform. Aimed at a non-technical audience, it presents trends for a selected key indicator for each of 11 public health focused topics via a summary page. A details page for each topic provides more information on the key indicator along with additional indicators on the topic. It brings together high-level trends data in one place for a coherent and accessible view of the nation’s health across topic areas. The dashboard uses existing data in a curated dashboard view with simpler visualisations.

The report has been developed using reproducible code via Quarto. Data is fed directly from an API and styled to meet gov.uk requirements, whilst providing charts that could be presented in an interactive way. Modular code was developed (following Government Analysis Function guidance), creating the building blocks from which to structure our project. Various techniques were used to produce the pages for the dashboard, such as conditional rendering and use of YAML parameters.
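
As a hedged sketch of the kind of modular building block described (not the team's actual code), a data-access function built on the fingertipsR client for the Fingertips API might look like the following; the indicator and area-type IDs are placeholders.

```r
# Hedged sketch only: one modular "building block" pulling indicator data from
# the Fingertips API via the fingertipsR package; IDs below are placeholders.
library(fingertipsR)
library(dplyr)

get_england_trend <- function(indicator_id, area_type_id) {
  fingertips_data(IndicatorID = indicator_id, AreaTypeID = area_type_id) %>%
    filter(AreaCode == "E92000001")   # keep the England-level series
}

england_trend <- get_england_trend(indicator_id = 90366, area_type_id = 15)
```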

Formal launch of public Beta is expected in Spring 2024, with updates published on a monthly basis. Our presentation at the HACA conference will describe the approach we took, and key lessons learned.

RAP - Faster, More Robust, and more Transparent Analytical Processes.

Recording Available Here!

Presenter: Warren Davies

The “Data saves lives” DHSC report states that “We cannot deliver the change that we need to see – and our 10-year plans for cancer, dementia and mental health – unless we embrace the digital revolution and the opportunities that data-driven technologies provide.”

A big part of achieving this is the RAP (Reproducible Analytical Pipelines) way of working, which emphasises automation, testing, sharing code, and the use of open-source tools. As noted in the Goldacre Review: “The NHS can and should rapidly adopt RAP working practices, both for service analysis and for research.” The RAP Squad at NHS England has spent over three years working to implement RAP across the organisation, engaging with multiple teams to help them improve their analytical processes.

The aim of this poster is to provide an overview of the RAP process based on the lessons learned while working with these teams, and provide actionable insights that attendees can take away to use in their own work. These include key principles such as:

• Automation
• Modular, re-usable code
• Transparency
• Open-source tools
• Version control
• Good coding practices
• Testing
• Peer review

The poster will provide an overview of the RAP approach and signpost where readers can learn more about how to apply the RAP principles in their own work. As more analysts adopt RAP, the impacts include:

• More robust processes which improve the accuracy of publications
• Increased automation which frees up analyst time so that they can work on other projects
• Cost savings – more efficient processes are cheaper to run, especially in pay-as-you-go cloud services
• Increased transparency to improve public trust and increase the chance that errors can be spotted
• Improved collaboration and reduced duplication of effort through code-sharing

Towards Effective Data Linkage: Introducing a Quality Assurance Framework for Enhanced Health and Care Analytics.

Recording Available Here!

Main Presenter: Giulia Mantovani
Additional Presenter/Co-Author/s: Amelia Noonan, Liliana Valles Carrera

Data linkage is a business-critical process within many government organisations, including NHS England. Research publications and official statistics, as well as many direct care applications, depend on data linkage. Its importance is further amplified when considering privacy-preserving principles that require minimising the use of patients' personally identifiable information. Consequently, data linkage is initiated early in the data lifecycle, establishing a substantial reliance of downstream applications on the quality of the linkage process.
However, too often data linkage is treated as purely a software development and data engineering exercise rather than a modelling challenge, and an appropriate level of quality assurance is not applied at the different stages of the process. This is why we are proposing the Quality Assurance Framework for Data Linkage, a practical tool for practitioners to determine the quality assurance levels needed at every stage of the data linkage process:
1. Preparation: data profiling, data assessment, data enrichment, computational resources needs
2. Implementation: techniques and tools, configuration of linkage parameters, version control
3. Evaluation: verification and validity, quality of data linkage, speed
4. Overall considerations: uncertainty management, communication of changes, safety, ethics and fairness, information governance, community engagement, knowledge management, continuous improvement and maintenance.
The required level of quality assurance varies by project and is determined by the data linker and data users. The triage questions in the framework provide a structured approach to deciding the minimum expected levels by type of project.
The Quality Assurance Framework guides stakeholders to make well-informed choices based on a clear understanding of potential risks and benefits. Additionally, it can be used as a detailed record-keeping tool that helps evaluate and manage data linkage project aspects.

Wenlock

Enhancing Data Linkage for the NHS- How Generalisable are Probabilistic Models?

Recording Available Here!

Main Presenter: Jonathan Laidler
Additional Presenter/Co-Author/s: Amelia Noonan, Amaia Imaz Blanco, Giulia Mantovani

Background:
Having a generalisable model is desirable; however, there is a trade-off between the generalisability of the model and the accuracy of the results. In NHS England we are exploring the use of probabilistic data linkage models to link patient records to PDS (the Personal Demographics Service). The parameters of probabilistic models can be customised to the data sets being linked, which results in better performance (Sayers et al. 2016). However, if such models need to be deployed for linking a variety of data sets, having many models with different sets of parameters can be more difficult to maintain, due to computation costs, time constraints, and the difficulty of quality assurance for a constantly changing model.

Aims:
To investigate whether a probabilistic linkage model that is trained on one pair of datasets can be reused on a different pair of datasets without compromising the accuracy of the results, in particular, when one of them is a consistent master dataset (such as PDS).

Methods:
We created a basic Fellegi-Sunter linkage model using Splink. The same model was then trained on different data sets, where one side of the linkage is always PDS, generating a set of model parameters (specifically, m-values). We ran a sensitivity analysis, using real-world data, to establish the robustness of the data linkage results to a change in the m-values.
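
Splink is a Python library, so the R fragment below is only an illustration of the Fellegi-Sunter arithmetic that the m-values feed into: an agreement on a field contributes log2(m/u) to the match weight and a disagreement contributes log2((1-m)/(1-u)); the m- and u-values shown are invented.

```r
# Illustration of Fellegi-Sunter match weights (not Splink itself; Splink is
# a Python library). m = P(field agrees | records match),
# u = P(field agrees | records do not match). Values below are invented.
m <- c(surname = 0.95, dob = 0.90, postcode = 0.70)
u <- c(surname = 0.01, dob = 0.05, postcode = 0.10)

agreement_weight    <- log2(m / u)
disagreement_weight <- log2((1 - m) / (1 - u))

# Example candidate pair: surname and dob agree, postcode disagrees
agrees <- c(surname = TRUE, dob = TRUE, postcode = FALSE)
match_weight <- sum(ifelse(agrees, agreement_weight, disagreement_weight))
match_weight
```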

Results and Conclusions:
Whilst the parameters of the model have a range of values, often the best link found is the same. This seems to indicate that the model’s outcomes have low sensitivity to the parameters generated in this context (when the datasets have a similar level of data quality, and encompass a similar population), and therefore these models can in fact be reused.

Diagnostics in the Midlands: From Data to Action.

Recording Available Here!

Main Presenter: Lee Cadwallader-Allan

The Diagnostics transformation programme at NHS England was a new initiative that began during the COVID-19 pandemic. Each region had a multi-million pound programme to implement the findings of the Mike Richards review, "Diagnostics: Recovery and Renewal". As identified in the review, data for Diagnostics services was lacking, but the programme could not hope to achieve anything without data. I was appointed as the Data Analyst for Diagnostics in the Midlands region with a blank page; the aim was to provide data, analysis and insights for the programme team to progress its objectives.

What?
The problems in Diagnostics had been exacerbated by the pandemic, with unprecedented waiting list volumes, uncertainty about additional patients waiting for services where data is not reported, and an overriding aim to ensure effective use of resources, including funding new Community Diagnostic Centres. Without data, these problems could not be addressed.

How?
The approach taken was a full assessment of what data was currently available, turning it into meaningful information for colleagues. I also set about capturing additional data, involving a full data analytics life cycle from objectives to visualisation. Stakeholder engagement was an important feature of my work. The ultimate aim for all existing and new data collections was to provide tools to internal and external stakeholders allowing them to draw insights. This resulted in an all-in-one Power BI app bringing together various dashboards.

Why?
The purpose was to allow funding decisions to be made, but also to highlight key focus areas that required additional support to enable recovery and renewal. The overarching aim was to recognise the 'unknown', whether this be diagnostic performance in services not reported or hidden patients.

The journey I went on as a newly appointed data analyst in a new area was a challenging yet exciting one - and that is a story worth telling.

An Early Warning System to Flag Uncontrolled Hypertension.

Recording Available Here!

Main Presenter: An Te
Additional Presenter/Co-Author/s: Farah Adam

Hypertension is considered uncontrolled if it is untreated or if the prescribed medications are ineffective. Nearly 14% of the population in Bedfordshire, Luton and Milton Keynes (BLMK) have been diagnosed with hypertension, a figure that is increasing annually. Presently, there are no tools available in BLMK for patient-level monitoring of hypertension. Our aim was to create a tool by leveraging population-level linked datasets for the early identification of hypertension patients with unmet needs. The tool will enable clinicians to case-find patients using clinical measures pertaining to hypertension and other cardiovascular risk factors.

Activity and Evaluation Analysis for Neighbourhood Mental Health Teams in Birmingham and Solihull.

Recording Available Here!

Presenter: John O'Neill

Birmingham and Solihull ICB have embarked upon a Mental Health (MH) Community Transformation programme. A component of this has been the deployment of ‘Neighbourhood Mental Health Teams’ that are co-located in primary care across Birmingham and Solihull.
A business intelligence product has been developed to enable monitoring and evaluation of the service in line with its core aims:
- Swift access to services
- Removing barriers to care
- Removing health inequalities
- A multi-agency approach.
The principal data sources for the analytical product are the MHSDS (Mental Health Services Dataset) and the NHS Spine, both patient-level resources. The product was built using SQL and visualised in Tableau. It is accessed via Aristotle, MLCSU’s data visualisation platform, which allows for a degree of ‘self-service’.
The report provides an overview of activity, analysis of patient characteristics, prevalence of MH conditions, access to services, and the deployment and impact of assessment/outcome tools (including DIALOG):
- The Activity Overview provides some operational management information in the areas of referral volumes, attendance rates, waiting times, referrals reasons, data quality and basic demographics.
- The Prevalence Accessing sections aim to identify variations in access to services by residence by segmenting access rates into electoral ward and constituency and visualising with heat maps. Drill downs include age bands and Indices of Multiple Deprivation (IMD).
- The Patient Characteristics sections identify under or over representation of specific patient groups receiving support, compared to the local population. Data is segmented into the patient groups of Age, Gender, IMD and ethnicity. Users can drill down to constituency and ward level visualisations to identify potential inequity.
- Finally, the volumes of Assessment (or Outcome) tools used with patients and the status of their use is analysed. This section goes on to examine the results of DIALOG assessment to evidence how MH care has improved the lives of patients and the life areas affected.

Using Causal Impact Analysis to Measure the Impacts of Changing from Multi-Bed to Single-Bed Hospital Rooms.

Recording Available Here!

Main Presenter: Sarah Lucas
Additional Presenter/Co-Author/s: Andrew Hood, Anya Ferguson

Traditionally, hospital beds have been in shared multi-bed bays; however, there has been a move both in the UK and globally towards single-bed inpatient rooms. Indeed, current NHS England guidelines state that at least 50% of new hospital inpatient beds must be single-bed rooms (Health Building Note, 2009). Recent reviews of published evidence have suggested that single-bed rooms improve bed management and patient flow, with reduced bed closures due to improved infection control, while also promoting greater privacy and dignity for patients and their families. However, there is a lack of strong quantitative evidence in the literature to support the benefits of single-bed rooms, and concerns have been raised about potential negative impacts. Hence it is important to better understand the potential impacts of a switch to single-bed accommodation when building future hospitals.

A number of hospital sites within England have already wholly or significantly switched to single-bed inpatient accommodation, allowing them to be studied as natural experiments. Causal Impact Analysis (Brodersen et al., 2015) is a technique that uses a control group (similar hospital providers, with similar catchment populations, patient case-mix and prior trends) to estimate a counterfactual against which we can quantify the change in the study hospitals following their switch to single-bed rooms. This method is implemented using the CausalImpact package in R.
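
For readers unfamiliar with the package, a minimal CausalImpact call looks roughly like the sketch below; the series are simulated stand-ins for a study hospital and a control covariate, and the intervention point is arbitrary.

```r
# Minimal illustration of the CausalImpact package (simulated data, not the
# study's own series). Column 1 is the outcome; remaining columns are controls.
library(CausalImpact)
set.seed(1)

time_points <- 100
control     <- 100 + arima.sim(model = list(ar = 0.8), n = time_points)
outcome     <- 1.2 * control + rnorm(time_points)
outcome[71:100] <- outcome[71:100] - 10   # artificial effect after the "switch"

data <- cbind(outcome, control)
pre.period  <- c(1, 70)
post.period <- c(71, 100)

impact <- CausalImpact(data, pre.period, post.period)
summary(impact)
plot(impact)
```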

Our analysis investigates the impact on measures from three domains 1) Health and Safety, 2) Patient and Staff Experience and 3) Productivity and Efficiency to determine the extent of any benefits, or negative impacts that need to be mitigated with single-bed room sites. Determining any reductions in length of stay is of particular interest as this has the potential to increase capacity and reduce costs, offsetting additional costs typically associated with building and maintaining hospitals with single-bed rooms.

Identifying Gaps in Service by Assessing Patients Against NHS England’s Discharge Ready Date Criteria: An Information Analyst and Flow Matron Discuss Implementing a Data Science Approach to Avoid Delayed Discharge.

Recording Available Here!

Main Presenter: Claire Tucker
Additional Presenter/Co-Author/s: Sally Beyzade, Craig Wood, Zella King

Hospitals are at capacity with spell length increasing. This is partly due to hospital inpatients staying beyond their Discharge Ready Date (DRD). Discharge-ready patients on Discharge To Assess (DTA) pathways 2 and 3 may be waiting for rehab beds or care homes. But for less complex patients (those on DTA pathways 0 and 1), additional services could be provided in the community or at home, reducing pressure on acute beds.

Data analysis can reveal the reasons for discharge delays and indicate whether any consistent reasons pointing to a service gap can be identified. However, in practice, this is tricky for two reasons (1) wards may vary in how they record DRD status, and the reasons for DRD reached, so data may be inconsistent (2) analysis may require use of multiple data sources, including unstructured notes, so can't be done in conventional (Power BI) reporting. However, tools like R make it possible to do higher-quality analysis.

We profiled 34,365 inpatients at UCLH in 2023 and found that patients on pathways 0 and 1 used 3,991 and 3,775 acute bed days post DRD - the equivalent of 11 and 10 beds respectively.

Using multiple data sources (structured text, notes review, discussions with clinical staff) we delved deeper into the 0 and 1 pathways to identify reasons for more straightforward discharges being delayed. We also examined ward consistency regarding DRD reviews and provided feedback to wards for education.

In this talk we will discuss our findings about (1) data quality, where data cleansing is needed on DRD data and ways to do it (2) a possible gap in services and whether we could identify anything actionable. We will also share our experience of using R for this purpose and report on what the hospital will do with the findings.

Estimating the Macro Level Impact of Efforts to Mitigate Hospital Activity in English Hospitals from 2013 to 2019: A Retrospective Database Study.

Recording Available Here!

Main Presenter: Gabriel Hobro
Additional Presenter/Co-Author/s: Steve Wyatt

Introduction:
Health systems endeavour to mitigate potentially unnecessary hospital activity via policies to prevent, redirect/substitute or de-adopt hospital activity, but little is known about the impact of such policies.

Aim:
We examine the macro level impact of mitigation in the English National Health Service (NHS) by comparing growth rates for hospital activity that is in the purview of mitigation versus activity which is not.

Methods:
We fitted regression models to hospital episode statistics from 2013 to 2019 to estimate percentage differences in growth rates (with 95% confidence intervals) of mitigatable versus usual (non-mitigatable) activity across four points of delivery (emergency department (ED) attendance, elective admissions, non-elective admissions, outpatient attendance), for mitigation via prevention, redirection/substitution or de-adoption. We assume that, having controlled for differences in the age and sex structure of the population, activity mitigation is the primary driver of differences in growth rates.
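
The abstract does not give the exact model specification; one plausible form, sketched below in R purely as an illustration with simulated data, is a negative binomial regression of annual, standardised activity counts with a year-by-mitigatable interaction, where exponentiating the interaction term approximates the percentage difference in annual growth rates.

```r
# Hedged sketch only: the abstract does not give the model specification.
# Simulated data stand in for age/sex-standardised annual activity counts.
library(MASS)
set.seed(10)

activity <- expand.grid(year = 2013:2019, mitigatable = c(FALSE, TRUE))
activity$count <- rnbinom(nrow(activity), size = 50,
                          mu = exp(10 + 0.02 * (activity$year - 2013) +
                                   0.5 * activity$mitigatable -
                                   0.03 * (activity$year - 2013) * activity$mitigatable))

# Year x mitigatable interaction: exp(coef) - 1 approximates the percentage
# difference in annual growth between mitigatable and usual activity
fit <- glm.nb(count ~ year * mitigatable, data = activity)
100 * (exp(coef(fit)["year:mitigatableTRUE"]) - 1)
100 * (exp(confint(fit)["year:mitigatableTRUE", ]) - 1)
```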

Results:
Activity in the purview of de-adoption grew at a slower rate than usual activity by -2.62% (-3.84% to -1.39%) for elective admissions and -2.74% (-3.11% to -2.38%) for outpatient attendances per annum. ED attendances in the purview of redirection/substitution grew at a slower rate, by -0.39% (-0.74% to -0.04%) per annum than usual activity. Elective and non-elective admissions in the purview of prevention grew at a faster rate than usual activity, by 1.39% (0.11% to 2.68%) and 1.46% (0.46% to 2.46%) per annum respectively. Non-elective admissions in the purview of redirection/substitution grew at a faster rate, by 4.52% (3.50% to 5.55%) per annum than usual activity.

Conclusion:
Hospital activity in the purview of de-adoption showed lower growth rates than usual hospital activity. Activity in the purview of mitigation via prevention showed faster growth rates. Activity in the purview of redirection/substitution showed lower growth rates for ED attendances and higher growth rates for non-elective admissions. These macro level findings prompt a more critical examination of different mitigation policies.

Beckbury

"So You Built a Dashboard…Now What?"

Recording Available Here!

Main Presenter: Natalie Cantillon
Additional Presenter/Co-Author/s: James Harrison

We are working in a time when health, wellbeing and social care data is available in greater amounts than ever before. As analysts we are challenging ourselves to produce innovative, interesting and timely analysis that reaches a wider audience with a varied skill base and differing roles in which they use the data - but are we doing enough to understand the additional challenges and opportunities that this presents?

At OHID we produce the Fingertips platform for data visualisation and analysis across public health, healthcare and social care. It reaches 15,000 users per week across 30+ profiles, with 1.8 million annual interactions, and yet despite this reach we have not been aligning our support offer to users with the data - until now.

Find out how Public Health Intelligence within DHSC is creating a community linked to Fingertips, how we are challenging ourselves to meet the varied needs of the Fingertips audience and eliminate pain points in our analysis, and how we are creating a central Public Health Intelligence community that builds connections between people and resources to support a system that is greater than the sum of its parts.

See how we are aligning workforce training and development in statistics, data and intelligence for users with a wider community. Understand the process through which we have understood our users' needs, how we are designing outputs that address the issues raised by stakeholders, and how we evaluate our products to make sure they continue to meet stakeholders' needs. Above all, find out why analysis alone isn’t enough, and why you must add value to your analysis by supporting people to understand and use your outputs better, ultimately facilitating the change which is needed.

Modelling Long-Term Changes in Population Health State and Associated Healthcare Resource Requirements: Application in BNSSG ICS.

Recording Available Here!

Presenter: Luke Shaw

Healthcare policy makers face regular challenges on how to allocate healthcare resources with limited budgets, both in the short and longer term. Mathematical and computer modelling tools can capture, subject to assumptions and simplifications, these interacting factors in estimating the long-term trajectory as well as the implications of different mitigatory measures. We have developed a mathematical model to support decisions around long-term commissioning needs within the Bristol, North Somerset, South Gloucestershire (BNSSG) ICS.

The model is a finite horizon discrete-time Markov chain where the state space representing the health state of individuals within the population is based on segmentation using Cambridge Multimorbidity Score. Essentially, the model accounts for the life-course of individuals as they age and (typically) advance through the states with declining health. Such movements are extrapolated from the observed transition rates within the system, anchored on demographic projections (births, deaths, and migration) from the Office for National Statistics (ONS).
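
As a stylised illustration of the projection mechanics (the states, transition probabilities and starting population below are invented and are not the BNSSG model's inputs), a finite-horizon discrete-time Markov chain amounts to repeatedly multiplying the population vector by an annual transition matrix:

```r
# Stylised illustration of the projection mechanics (states, probabilities and
# starting population are invented, not the BNSSG model's actual inputs).
states <- c("healthy", "low_cms", "high_cms", "dead")

# Annual transition probabilities; each row must sum to 1
P <- matrix(c(0.90, 0.08, 0.01, 0.01,
              0.00, 0.85, 0.12, 0.03,
              0.00, 0.00, 0.90, 0.10,
              0.00, 0.00, 0.00, 1.00),
            nrow = 4, byrow = TRUE, dimnames = list(states, states))

pop <- c(healthy = 700000, low_cms = 200000, high_cms = 80000, dead = 0)

horizon <- 20
trajectory <- matrix(NA, nrow = horizon + 1, ncol = length(states),
                     dimnames = list(0:horizon, states))
trajectory[1, ] <- pop
for (t in seq_len(horizon)) {
  # A full model would also add births, migration and ageing at each step
  trajectory[t + 1, ] <- trajectory[t, ] %*% P
}
round(trajectory[c(1, 11, 21), ])   # years 0, 10 and 20
```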

Our results indicate notably different growth rates for different healthcare settings, such as maternity and A&E attendances, which helps to demonstrate the added value of data science techniques – especially over common approaches based simply on ONS percentage uplifts. Ultimately, while the population is expected to increase by 14% over the 20-year horizon, the total cost is expected to increase by 41%, indicating the scale of the challenge ahead.

The model has already influenced financial decisions of resource allocation within the system, and we will present examples of the model being used in real life situations. Finally, we will review outputs against two other externally developed models.

This work follows on naturally from the Population Segmentation talk at HACA 2023 by Nick Hassey, which introduced the segmentation approach being used in BNSSG and which now forms a foundational building block in our dynamic model.

Developing Analytics for Better Pharmacy Stock Control.

Recording Available Here!

Main Presenter: Hazel Kirkland
Additional Presenter/Co-Author/s: Martin Utley, Chris Beeley, Devon Barrow, Nikolaos Kourentzes, Sara Simmons

Background:
Inventory management within hospital pharmacies is costly, labour intensive and safety critical. NHS hospital pharmacies have not adopted sophisticated forecasting techniques used in other industries such as retail and manufacturing.

Aim:
We aimed to explore the scope for using advanced forecasting techniques with pharmacy dispensing data and to combine stochastic forecasts of dispensing within an inventory management model.

Methods:
We built an inventory management model to suggest order quantities based on stochastic forecasts of demand, accounting for uncertain delivery times and drug-dependent tolerances to stockout and to breaching storage constraints. We mapped the processes and staff-time involved in ordering, receipt and invoicing and built a simulation model to explore the performance of different forecasting approaches in combination with the inventory model.
After an initial exploration of forecasting approaches with simulations using historical dispensing data across a sample set of medicines, an end-to-end analytical pipeline was developed and used to generate overnight forecasts of forthcoming demand and suggested order quantities for all 243 drugs from one wholesaler. This analytical pipeline solution was then run in shadow mode for a period of 9 weeks.
Recent work has focussed on more detailed assessment of a wider range of forecasting methods in a selection of 34 drugs chosen to reflect a range of demand profiles and vital / essential / desirable categorisation.

Results:
We developed and successfully tested a combined forecasting and inventory management model. Dispensing data is challenging to forecast given the high levels of intermittency for many drugs. Temporal Hierarchy Forecasting (THIEF) has emerged as a robust method to use across our varied demand-profiles.
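
For orientation, temporal hierarchy forecasting is available in R through the thief package; a minimal call on a single simulated, intermittent monthly demand series might look like the hedged sketch below (this is not the study's pipeline).

```r
# Hedged sketch: temporal hierarchy forecasting with the thief package on a
# simulated, intermittent monthly demand series (not the study's own data).
library(thief)
set.seed(7)

demand <- ts(rpois(72, lambda = 2) * rbinom(72, 1, 0.6),
             frequency = 12, start = c(2018, 1))

fc <- thief(demand, h = 12, usemodel = "arima")
plot(fc)
```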

Conclusion:
Hospital pharmacy inventory management faces challenges not seen in other industries but is ripe for better analytics to reduce stockouts and/or costs associated with excess stockholding and to release staff time for higher value work.

Using a Markov Chain Monte Carlo Method to Optimise the Blood Donation Appointment Booking System.

Recording Available Here!

Presenter: Janarth Duraisingham

Problem statement
To reliably deliver on supply commitments for different types of blood, and to ensure collection sessions run smoothly, NHS Blood and Transplant needs control over the final distribution of donors booked in to donate before a session.

Background
Donors can book into certain slots in the appointment grid based on their eligibility, defined by their blood group and donation frequency. The grid comprises slots, each with a slot-type that defines that slot’s respective eligibility criteria.

Methodology
We use a Markov Chain Monte Carlo simulation to explore optimal grid setups for different operational objectives. A grid setup consists of the slot-type eligibility criteria and the slot-types’ temporal distribution. Sampling from the final grid distribution, we observe the success of different grid setups as a function of session objectives, e.g. maximising new donors booked or blood group proportions.
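
The abstract does not detail the algorithm; purely as a generic illustration of the Monte Carlo idea, the sketch below runs a Metropolis-style search over slot-type assignments for a toy grid, scoring candidates against a made-up objective (closeness to a target mix of slot types).

```r
# Generic illustration only (the abstract does not give the algorithm details):
# a Metropolis-style search over slot-type assignments for a toy session grid,
# scored against a made-up objective favouring a target mix of slot types.
set.seed(123)
slot_types <- c("any_donor", "O_neg_only", "new_donor")
target_mix <- c(any_donor = 0.5, O_neg_only = 0.3, new_donor = 0.2)

score <- function(grid) {
  mix <- table(factor(grid, levels = slot_types)) / length(grid)
  -sum((mix - target_mix)^2)   # higher (closer to zero) is better
}

grid <- sample(slot_types, 40, replace = TRUE)   # 40 appointment slots
current_score <- score(grid)

for (i in 1:5000) {
  proposal <- grid
  proposal[sample(40, 1)] <- sample(slot_types, 1)   # perturb one slot
  new_score <- score(proposal)
  # Metropolis acceptance: always accept improvements, occasionally accept worse
  if (log(runif(1)) < (new_score - current_score) / 0.001) {
    grid <- proposal
    current_score <- new_score
  }
}
table(grid) / length(grid)   # resulting slot-type mix
```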

Impact
Using these models, we will be able to test the impact of different booking paradigms on key performance criteria including bookings, collection volume, blood mix, new donor recruitment and session overruns. This will enable NHSBT to better tailor its operations to meet its organisational targets.

The Marathon Training Guide: Automation in Public Health Surveillance.

Recording Available Here!

Main Presenter: Katie Binley
Additional Presenter/Co-Author/s: Adrianna Farmer

The Health Protection Surveillance team at the Public Health Agency (PHA), Northern Ireland, carry out routine monitoring of communicable disease testing and vaccination, and support with outbreak management. In recent years, the Surveillance team have been transforming their once manual and inefficient data pipelines and reporting outputs into highly efficient, automated workflows. Migration of data into a cloud-based system has created more secure, integrated datasets that can be used to enhance public health decision making. Adoption of modern tools, such as R, R Studio Connect and Microsoft Power BI, has improved efficiencies within the team and ensured key data are at stakeholders’ fingertips. Using git and DevOps repositories, the team have been able to work more collaboratively, and manage cross-team projects efficiently using Jira. Use of reproducible analytical pipelines has provided opportunities for service improvement and facilitated skill development within the team.
Considerations for this kind of transformation include the need for: technical training; dedicated data science support; acquisition of software licences; and ensuring information governance is in place. Finally, time and patience are big factors not to be overlooked, given the scale of change.
The journey to automation for the PHA Surveillance team has been long and is far from complete. We would like to share the progress of our journey so far, along with some reflection, which we hope will be valuable for other teams wishing to follow a similar path to automation.

Understanding Uptake of Physical Health Checks Amongst Patients with Serious Mental Illness in London.

Recording Available Here!

Main Presenter: Polly Sinclair
Additional Presenter/Co-Author/s: Joseph Bavington-Allen

The Health Innovation Network South London has been funded by the Cavendish Square Group (all London NHS Mental Health Trusts) to understand which patients with serious mental illness (SMI) have not received their annual physical health checks. People with SMI die 15-20 years earlier than the general population, largely due to preventable or treatable physical illness.

The focus of the analysis is on patients with SMI who are under the care of a mental health trust. Data was initially obtained from South London and Maudsley NHS Foundation Trust to test the data extraction. The extraction focused on the 6 physical health checks (specified in NHSE guidance for this group) conducted on each patient with SMI over the past year, and included demographic details of each patient: gender, age, ethnicity, deprivation decile and SMI diagnosis. Subgroup analysis was conducted for each demographic subgroup. Subgroups with at least 50 patients, and where the proportion of patients missing checks was at least 5 percentage points higher than the average across all patients, were flagged as needing a particular focus to increase uptake of physical health checks.
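
The flagging rule described above reduces to a few lines of code; the sketch below uses illustrative column names (not the project's actual schema) to flag any subgroup with at least 50 patients whose missing-checks rate exceeds the overall rate by 5 percentage points or more.

```r
# Illustration of the flagging rule described above; 'patients' and its
# columns are illustrative, not the project's actual schema.
library(dplyr)

overall_rate <- mean(patients$missed_all_checks)   # e.g. 0.17

flagged <- patients %>%
  group_by(subgroup) %>%                           # e.g. diagnosis or ethnicity by age band
  summarise(n    = n(),
            rate = mean(missed_all_checks),
            .groups = "drop") %>%
  filter(n >= 50, rate >= overall_rate + 0.05)
```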

Just under a fifth (17%) of patients had received none of the 6 physical health checks within the past year. The proportion of patients missing all checks was higher for:
• patients diagnosed with bipolar disorder (30%), particularly those aged 18-64 (31%).
• patients in IMD deciles 7-10 - least deprived - (22%), particularly those aged 18-65 (23%).
• white female patients (22%).
• female patients in deciles 5-6 (22%).

Work is now underway to obtain data in a standardised way from the other 8 mental health trusts across London to understand the pan-London picture on the completion of physical health checks.

Online Pre-HACA E-Lab Events

Voices from the Framework Frontline

Recording Available Here!

Presenters:

Sarah Blundell, Lead for Analytical Development from NHS England
Andrew Lavelle, Senior Manager Analytical Professionalisation NHS England

Welcome to this HACA 2024 E-Lab! Hear from people who have adopted the National Competency Framework for Data Professionals and the impacts that it is having on how they develop their skills and careers.

This is not the shiny car sales brochure, but the voices from the frontline who are using the Framework to drive change.

Who is it for? This e-lab session is for anyone at any level who works in data and analytics in a health or care setting.

What will you learn? After this session you will have an insight into the variety of ways people are using the Framework and the successes and challenges they are having, to inspire you to make a pledge and adopt the National Competency Framework for Data Professionals.

Professional Registration for Data & Analytics Professionals

Recording Available Here!

Presenters:
Emma Wright, Director of Professional Development, Association of Professional Healthcare Analysts (AphA)
Jane Johnston, Director of Member Services, Association of Professional Healthcare Analysts (AphA)

Welcome to this HACA 2024 E-Lab event! With the growing focus on data, AI and analytics across health and care, the importance of professional registration for those working in these areas is ever present. Join this session to gain an understanding of the process for professional registration.

Guiding leaders to better quality decisions: the pivotal role of analysts

Recording Available Here!

Presenters:
Samantha Riley, Director of Making Data Count, NHS England
Andrew Browne, Business Intelligence and Data Science Manager, University Hospitals of Morecambe Bay NHS Foundation Trust

Welcome to this HACA 2024 E-Lab! Guiding leaders to better quality decisions: the pivotal role of analysts! In this online event, we explore the crucial role analysts play in helping leaders make informed decisions.

Anatomy of a waiting list – how well do we understand our waiting lists?

Recording Available Here!

Presenters:
Neil Walton, Professor in Operations Management, Durham University Business School
Tom Smith, Insight Manager, Nottingham University Hospitals NHS Trust

Welcome to this HACA 2024 E-Lab! This online event is intended for anyone who works with waiting lists but will be of interest to many more.

Neil Walton (Professor in Operations Research) will take us through the anatomy of a waiting list: how it can be measured, how it can be managed, and some methods for managing pressures across multiple waiting lists. Tom Smith will give an overview of development work on the NHS-R Community’s {NHSRwaitinglist} R package, which is being built to help analysts apply Neil’s methods at scale on real hospital data. Both the theory and the practical package use will be helpful for anyone who finds themselves working with, and prioritising, waiting lists.

Using Systems Dynamics in Local Authority Public Health Practice.

Recording Available Here!

Presenters:
Abrahm P George, Consultant in Public Health
Peter Lacey, Founding Partner at the Whole Systems Partnership

Welcome to the final HACA 2024 E-Lab! Kent County Council Public Health have been actively using system dynamics modelling (SDM) over a number of years to demonstrate the impact of prevention interventions in JSNA-related reports.

This year it is being actively used to scenario-test the impact of commissioned health improvement services, as part of an ongoing service review to improve effectiveness and reduce health inequalities. We present some ongoing SDM examples, covering model development, design and testing; how stakeholders were engaged and how they influenced senior leadership decision making; team reflections on their experiences of SDM training; and the importance of key enablers, such as data linkage and other analytical expertise (for example, evaluation methods to generate robust assumptions for SDM).

HACA 24 Keynote Speakers

Andrew Dilnot

Warden of Nuffield College Oxford & Non-executive Chair of the Oxford University Press Finance Committee

Sir Andrew Dilnot is Warden of Nuffield College Oxford. He is the non-executive chair of the Oxford University Press Finance ...

Anita Charlesworth

Director of Research & The REAL Centre at the Health Foundation and Honorary Professor in the College of Social Sciences at the Health Services Management Centre (HSMC)

Anita Charlesworth is the Director of Research and the REAL Centre (Research and Economic Analysis for the Long term) at the ...

Andy Boyd

Director, UK Longitudinal Linkage Collaboration, University of Bristol and 'Trust & Transparency' Programme Co-lead, Health Data Research UK

Andy Boyd specialises in designing the data and governance infrastructure required for linking participants in longitudinal s...