6. The single assessment framework

6.1. Rationale for the single assessment framework

The rationale for the new framework is set out in ‘Our new single assessment framework’ published by CQC in July 2022. [7] This states that:

“There are three main reasons why we need to change:

  • We need to make things simpler so that we can focus on what really matters to people.
  • We need to better reflect how care is actually delivered by different types of service as well as across a local area.
  • We need one framework that connects our registration activity to our assessments of quality.”

While the 5 key questions and 4 quality ratings would remain central to CQC’s approach, the existing key lines of enquiry (KLOEs) and underlying prompts would be replaced with new ‘quality statements’. These changes were intended to reduce duplication across the 4 previous separate assessment frameworks, allow a focus on specific topic areas under each key question, and link to the relevant regulations and associated external guidance to make things easier for providers.

From my discussions with members of CQC staff, their understanding was that the aim of the single assessment framework was to provide:

  • consistency of approach across sectors
  • consistency of judgements
  • applicability to local health and care systems as well as to providers
  • simplicity
  • emphasis on people’s experiences.

6.2. What is the single assessment framework and how does it differ from the previous approach?

The single assessment framework is intended to be a single framework covering all the services (across health and care) that CQC regulates. It retains the 5 key questions and replaces the key lines of enquiry (KLOEs) that were previously used with 34 quality statements (Appendix 1). These quality statements have been mapped onto the 5 key questions as follows:

Safe

  • Safety learning culture
  • Safe systems, pathways and transitions
  • Safeguarding
  • Involving people to manage risks
  • Safe environments
  • Safe and effective staffing
  • Infection prevention and control
  • Medicines optimisation

Effective

  • Assessing needs
  • Delivering evidence-based care and treatment
  • How staff, teams and services work together
  • Supporting people to live healthy lives
  • Monitoring and improving outcomes
  • Consent to care and treatment

Caring

  • Kindness, compassion and dignity
  • Treating people as individuals
  • Independence, choice and control
  • Responding to people’s immediate needs
  • Workforce wellbeing and enablement

Responsive

  • Person-centred care
  • Care provision, integration and continuity
  • Providing information
  • Listening to and involving people
  • Equity in access
  • Equity in experiences and outcomes
  • Planning for the future

Well-led

  • Shared direction and culture (shared vision, strategy and culture)
  • Capable, compassionate and inclusive leaders
  • Freedom to speak up
  • Workforce equality, diversity and inclusion
  • Governance, management and sustainability
  • Partnerships and communities
  • Learning, improvement and innovation
  • Environmental sustainability – sustainable development

These 34 quality statements are broadly similar to the topics that were assessed in the previous hospital inspections and ratings. However, some are likely to be of greater importance in particular settings/sectors than others.

It could also be argued that some are identified under the wrong key question. For example, ‘workforce wellbeing and enablement’ is placed under the caring key question, while it might be better placed under well-led. In addition, some quality statements conflate concepts that would be better kept separate. For example, under the well-led key question, culture should be separated from vision (and strategy).

While the emphasis on people’s experience of care is clearly of major importance, it does appear to downplay the importance of outcomes and proxies for outcomes. In healthcare settings, patients may report a good experience of care while actually receiving treatment that is suboptimal and may affect their long-term morbidity or mortality. Patients under the care of breast surgeon Ian Paterson and GP Harold Shipman initially reported good experiences of care, but had disastrous outcomes.

In practice, not all 34 quality statements are assessed in any given inspection. For example, in adult social care, 5 quality statements were initially selected for inspection, though this has now typically increased to 10 to 12. In primary care, 18 quality statements are now being advocated for inspection, and 21 are being advocated for A&E/emergency department inspections. This raises the question of whether this is truly a ‘single’ assessment framework.

If only some quality statements are assessed for a particular key question, it is not possible to give a rating for that key question (without relying on past assessments, which may be several years out of date). In addition, assessing only a selection of quality statements may mean that an inspection does not cover all the fundamental standards set out in the regulations.

The currency and credibility of ratings is a key issue for providers and the public.

6.3. Evidence categories

Six evidence categories have been identified relating to each of the 34 quality statements. These are:

  • People’s experience of health and care services
  • Feedback from staff and leaders
  • Feedback from partners (e.g. commissioners and other local providers)
  • Observation
  • Processes
  • Outcomes

Although these sources of evidence were used from the outset of CQC inspections and ratings, the single assessment framework makes them more explicit. Equal weighting is meant to be given to each evidence category for each quality statement. However, the relative importance of different evidence categories may vary between different services and key questions. For example, the effectiveness of a hospital service may largely be measured through processes and outcomes, while caring may largely be measured through people’s experience and through observation. In addition, the availability of data varies widely between sectors (see Data and insight section).

6.4. Scores

The single assessment framework process involves scores being given to each of the relevant evidence categories for each of the quality statements on a 4-point scale (where 1 is worst and 4 is best).

While this might be thought to provide greater consistency and allow for automated aggregation of scores to provide an overall rating, this would depend on clear criteria being set out for each score for each evidence category and for each quality statement in all services. It is unclear where judgement and moderation should prevail in this approach, especially where aggregated scores are at the borderline between ratings (e.g. good versus requires improvement).
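To make the aggregation concern concrete, the sketch below shows, in Python, one plausible way in which 1 to 4 evidence category scores might be rolled up into quality statement scores and then a key question rating. The averaging rule, the percentage bands and the example quality statement names are illustrative assumptions rather than CQC’s definitive published method; the point is that a small change in a single evidence category score can move an aggregated result across a band boundary.

```python
# Illustrative sketch only: the averaging rule and percentage bands below are
# assumptions made for this example, not CQC's definitive methodology.
from statistics import mean

def quality_statement_score(evidence_scores: list[int]) -> float:
    """Combine the 1-4 scores given to the evidence categories assessed
    for one quality statement (here, a simple average)."""
    return mean(evidence_scores)

def key_question_rating(statement_scores: list[float]) -> str:
    """Express the quality statement scores as a percentage of the maximum
    possible score, then map that percentage onto a rating band
    (the band boundaries here are illustrative)."""
    percentage = 100 * sum(statement_scores) / (4 * len(statement_scores))
    if percentage > 87:
        return "Outstanding"
    if percentage >= 63:
        return "Good"
    if percentage >= 39:
        return "Requires improvement"
    return "Inadequate"

# Example: three quality statements assessed, each scored against several
# evidence categories on the 1-4 scale (the statement names are hypothetical).
statements = [
    quality_statement_score([3, 3, 2]),  # e.g. safe and effective staffing
    quality_statement_score([3, 4]),     # e.g. safeguarding
    quality_statement_score([2, 2, 3]),  # e.g. medicines optimisation
]
print(key_question_rating(statements))   # "Good" (70.8% of the maximum)
```

On arithmetic of this kind, a key question can fall in the good band even where a breach of a fundamental standard has been identified, unless a separate ratings limiter of the sort used in the previous methodology is applied (a problem CQC staff describe in section 6.5).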

6.5. The single assessment framework in practice: CQC staff perspective

A total of 1,379 inspections of providers were undertaken between December 2023 and September 2024 using the single assessment framework methodology. The breakdown by type of sector is as follows:

Sector                            No. of inspections
Adult social care                 885
Primary care                      350
NHS and independent hospitals     47
Mental health                     97

This is far fewer than would have been done in a comparable period before the Covid pandemic but is sufficient for staff in each sector to have formed clear views on the new methodology.

The views of CQC staff across all sectors who have been using the single assessment framework were almost unanimous and can be summarised as follows:

Views on quality statements and evidence:

  • The concept of a single assessment framework is superficially attractive, but it does not take account of the major differences in size, complexity or function between services/organisations, or in the nature of the information necessary to assess a service.
  • CQC personnel working in each of the sectors do not feel that the single assessment framework works for their services.
  • CQC staff in both the Operations Directorate and in Regulatory Leadership continue to find the 5 key questions helpful and are glad these have been retained.
  • The 34 quality statements are broadly acceptable as they are little different from the topics previously used. However, the wording of the statements is lengthy and some statements would benefit from modification, separation and being moved to a different and more relevant key question. Some of the quality statements overlap with each other, leading to confusion and duplication.
  • The rationale for the selection of ‘priority quality statements’ for assessing different service types is unclear and confusing.
  • There is insufficient emphasis on outcomes. These cannot be adequately measured for hospitals and primary care through people’s experiences. Much more informative datasets are available but are not being used.
  • The insistence on assessing several evidence categories for any individual quality statement is causing major difficulties, both in the assessment process and in report writing. This precludes writing a narrative report that would make sense either to a provider or to people trying to get information about a service.
  • Uploading of evidence from assessments of individual quality statements to the regulatory platform is extremely time-consuming and can delay publication of reports by several months. This is having a serious adverse impact on the overall number of inspections being undertaken. 

Views on producing scores and ratings:

  • There was virtually no support for the use of scoring for each evidence category. Although scoring may seem superficially logical, it precludes the use of judgement about the rating of a whole key question, or even for a quality statement. It was described to me as a ‘pseudoscience’. It also creates a risk of gaming to get the ‘right’ overall rating. Scores that are at a borderline (e.g. between good and requires improvement) can feel unfair, especially if the negative findings could be corrected and validated rapidly (e.g. between inspection and report).
  • Evidence that has previously been successfully developed for primary care inspections and has been welcomed by GPs, CQC inspectors and specialist professional advisers (SPAs) cannot be accommodated within the current single assessment framework, so assessments are considered less valid than previously. GP inspectors and SPAs found the use of templates for evidence and a narrative report much more meaningful. Comparison of ratings between around 150 primary care practices recently assessed using the old methodology and around 150 using the single assessment framework showed major differences in ratings.
  • The single assessment framework has made assessment of the well-led key question at NHS trust level more complex – not simpler. Assessment of multiple evidence categories for each quality statement, combined with equal scoring of each evidence category, is making the task almost impossible, especially when combined with the problems of uploading evidence to the regulatory platform. The previous framework for assessing well-led in trusts was developed jointly by CQC and NHS England/Improvement and worked well.
  • If only a limited number of quality statements relating to a key question are assessed, it is difficult – if not impossible – to determine a reliable current overall rating for that key question. This is especially true if previous ratings were given several years ago.
  • The ratings given by applying the single assessment framework do not give an accurate view of the quality of care in some services. In adult social care, the scoring system can give a rating of good even though there are sometimes multiple breaches of the regulations (Appendix 2). This is not a rare occurrence: over 96 assessments using the single assessment framework (around 10% of the total number undertaken to date) have been rated as “good with a breach”. A member of the public might see a rating of good and not be aware of the breach unless they read the full report. Under the old methodology, if a service was in breach of one of the fundamental standards, it would not be rated as good. In addition, the use of ratings limiters supported consistency in judgements.
  • Combining new ratings for individual quality assessments with old ratings (some of which were awarded several years ago) does not make sense. In some cases, this can make it impossible to upgrade a rating of a key question even when there has been improvement.

In summary:

  • The single assessment framework is not simpler than the previous approach and does not accurately reflect the quality of care delivered – which were 2 of its key objectives.
  • CQC staff feel that the single assessment framework was introduced without sufficient testing and training.
  • As one correspondent put it (and many others would agree): “It takes longer to look at less”.
  • A large number of people I spoke to advocated going back to the previous approach based on the 5 key questions and prompts/KLOEs.

6.6. Application of the single assessment framework to local authority assessments

CQC has only recently started to assess local authorities (LAs) in relation to their role as commissioners of adult social care. These assessments are being undertaken as part of CQC’s relatively new duties to reflect how care is delivered across a local area. In due course, it is anticipated that integrated care systems (ICSs) will also be assessed, though these assessments have not yet begun.

A dedicated team has been established to undertake the LA assessments. LAs are given 6 weeks’ notice of an inspection, with a substantial amount of information requested before a site visit is undertaken. To date, 26 of the 153 LAs have been inspected, reports have been published for 9 of these, and a further 58 LAs have started the process, with information having been requested. Inspection teams are made up of around 14 personnel, around 40% of whom are external expert reviewers. Case tracking forms part of the process.

Nine quality statements are assessed, with 4 of the 6 evidence categories being used for each. Although the quality statements cover several of the key questions, rating at key question level is not part of the process. A single overall rating is given with sub-scores for the 9 quality statements. The regulatory platform is not being used for LA assessments.

It is still too early to assess how well these assessments are working or the value of the reports. However, although the quality statements used come from the single assessment framework, it is questionable whether the move to a single assessment framework was needed to undertake these reviews. As with other assessments being undertaken using the single assessment framework, inspection teams report that scoring has been unhelpful, as it can drive towards a rating that is not felt to be appropriate.

6.7. Registration and the single assessment framework

Registration of health and social care providers is one of the key functions of CQC. All new locations from which services are to be provided have to be registered by law and certain changes to registered services have to be agreed with CQC.

When a provider wishes to register a service/location, they complete a standard application form. Following initial checks, this is passed to dedicated registration inspectors who review documentation, conduct interviews with the provider and manager and decide whether an on-site visit is needed. The assessment is conducted against regulations and is based on intent, as delivery of services will not have started.

Demand for registration increased by around 33% between 2020/21 and 2023/24. This has largely been driven by a major increase in applications related to domiciliary care (homecare) agencies. I was told that many of these applications are ultimately rejected.

Backlogs have increased markedly. In June 2022, 23.2% of registrations were waiting more than 10 weeks to be processed, but by May 2024 this had risen to 61.6%. Additional staff have now been recruited on fixed-term contracts to tackle these backlogs, but given the time needed for induction and training, backlogs are likely to remain for some time. This has a serious impact on providers, who may have invested substantial funds in developing a new service but cannot start to recoup this investment until they can deliver services.

Although one of the stated aims of the single assessment framework was to connect registration activity to assessments of quality of service delivery, registration managers report that there is no evidence of this happening.

6.8. Oral health and the single assessment framework

CQC regulates around 11,500 dental locations with a dedicated team of around 33 inspectors (i.e. around 350 locations per inspector). These services are deemed compliant or not compliant with regulations, but CQC has not been given the powers to rate dental services. CQC aims to inspect around 10% of dental practices each year. Overall, around 85% of practices are found to be compliant.

Seven quality statements are used for dental inspections, 3 of which relate to the safe key question (staffing, environment and infection prevention and control), with one quality statement each for the other 4 key questions. Initially, 22 evidence categories were used across these 7 quality statements, but this has now been reduced to 15. Each inspection typically requires one or two inspectors (depending on the size of the practice) and one specialist professional adviser (SPA).

The large majority of oral health inspectors wish to come off the regulatory platform and to dispense with evidence categories, as these (as in other sectors) impede the flow of a report. Previously, dental practices had to submit a provider information report (PIR) before an inspection, but this requirement has now lapsed; the PIR gave an indication of risk. Inspectors want better data and intelligence, and believe that large corporate providers hold intelligence of this kind that could support more effective assessment of quality and risk before an inspection.


Note

[7] ‘Our new single assessment framework’, CQC, July 2022, https://www.cqc.org.uk/news/our-new-single-assessment-framework