Tell Companies House you have verified someone’s identity

Beta assessment report for Companies House's 'Tell Companies House you have verified someone's identity' service


Assessment date 11 September 2025
Assessment stage Beta
Assessment type Assessment
Service provider Companies House
Result Amber

Service description

As part of the Economic Crime and Corporate Transparency Act, all company officers will need to verify their identity with Companies House (CH) over a 12-month period starting from 18 November 2025. Officers can verify directly with CH through GOV.UK One Login, or choose to use a third-party agent (e.g. a lawyer or accountant) that is registered with CH as a Companies House authorised agent (AA), also known in law as an Authorised Corporate Service Provider (ACSP).

This assessment covers the ‘Tell Companies House you have verified someone’s identity’ service, which agents use to tell CH they’ve verified their client’s identity to CH standards. The service captures identity information about the client, such as personal details, addresses and identity evidence (e.g. passport number).

Service users

Businesses that are supervised under the Money Laundering Regulations can register to become authorised agents at a cost of £55. A business needs to become an AA if it wants either to verify clients’ identities on behalf of CH or, from spring 2026, to present any company filings to CH on behalf of clients. The types of businesses that typically register as AAs are accountants and solicitors.

People who will be using the ‘Tell Companies House you have verified someone’s identity’ service:

Primary users – Members of the AA

  • Senior employees or directors (or equivalent) of businesses/AAs who are considering whether to offer identity verification as a service to their clients 
  • Staff members of AAs who need to tell CH they have verified their client’s identity to CH’s standard, or who need to make a change 

Secondary users – Clients of the AA

  • Company officers (e.g. directors, persons with significant control) who need to provide their documents to their AA so their identity can be verified. 

These users cannot access this service, but they provide the data to their agents. They can find guidance on GOV.UK on how to verify their identity with Companies House, although that service is not in scope for this assessment.

Things the service team has done well:

The service is well integrated into its wider context: the team has worked closely with colleagues and across organisations to make the user experience and guidance consistent across channels. They are sharing insights and learning from others, contributing to a better overall experience for these users.

The team has set itself up with a CI process that enables deployments at a cadence appropriate for agile software delivery, ranging from every few sprints at the upper end down to mid-sprint deployments for smaller releases. This means users get real working software as early and often as possible.

The team has collated a large amount of data, displayed it on dashboards, and maintains a strong awareness of PII and GDPR constraints. There is also clear evidence of how a Performance Analyst (PA) and a User Researcher (UR) have solved specific issues with data. 

From a technical perspective, the team has demonstrated strong alignment with the Service Standard by ensuring penetration and vulnerability testing is in place alongside robust security governance. Quality assurance and automation are embedded within release pipelines and ways of working, while coding in the open has been adopted where appropriate. The team has also shown effective cross-functional collaboration, ensuring the technical solution remains aligned with user needs.

1. Understand users and their needs

Decision

The service was rated amber for point 1 of the Standard.

During the assessment, we didn’t see evidence of:

  • Observations of private beta users engaging with the service, especially for the first time. The panel accepts the service has been thoroughly usability tested in alpha and private beta, and the team’s performance analytics show that people are completing the service. But the team is missing the opportunity to learn new things about their users that would benefit this service and the wider landscape. The point of this phase is to establish whether the service works for real people in their own environment; relying solely on people’s memory of an interaction, from a survey response or during an interview, is not a reliable way of answering this question. The panel recommends the wider UR team in the organisation works together to overcome the barriers that have contributed to the lack of contextual research, e.g. seeing live data during research sessions.
  • Research with users with accessibility needs. The team has conducted sessions with proxy users, and should continue taking a creative approach to identifying these users and involving them in research.
  • Clear personas to aid team and stakeholder understanding of user groups. Those presented used names and stock images, and it wasn’t clear whether they represented an individual or the experiences of a group. The team needs to be transparent about the evidence they’re presenting and what it’s based on.
  • A detailed public beta research plan that includes continuing to test the service with new users and working on new functionality.

2. Solve a whole problem for users

Decision

The service was rated green for point 2 of the Standard.

3. Provide a joined-up experience across all channels

Decision

The service was rated green for point 3 of the Standard.

4. Make the service simple to use

Decision

The service was rated green for point 4 of the Standard.

Optional advice to help the service team continually improve the service:

  • Consider whether there is an opportunity to safely remove repeated email address entry while still managing the risk of incorrect addresses being entered. GOV.UK Design System guidance on repeated email entry is to “only do this if your user research shows it to be effective” (https://design-system.service.gov.uk/patterns/email-addresses/). Repeating the input may increase validation errors without necessarily addressing the underlying problem.

5. Make sure everyone can use the service 

Decision

The service was rated amber for point 5 of the Standard.

During the assessment, we didn’t see evidence of: 

  • Research with users with a full range of accessibility needs. 
  • Testing (and mitigation if needed) for known usability issues in Select components with long lists (such as in the Country field), especially when used on zoomed browsers or smaller viewports.

Optional advice to help the service team continually improve the service:

  • The team has tried hard in both alpha and private beta to recruit users for accessibility testing but has found this very difficult. If it is not possible to find large numbers of users, the team could focus recruitment on needs not covered by their previous research, for example people with motor or auditory impairments.

6. Have a multidisciplinary team

Decision

The service was rated green for point 6 of the Standard.

7. Use agile ways of working

Decision

The service was rated green for point 7 of the Standard.

Optional advice to help the service team continually improve the service:

  • Continue to focus on ways to ensure the handover from Version 1 to the internal PA team happens smoothly.

8. Iterate and improve frequently

Decision

The service was rated green for point 8 of the Standard.

9. Create a secure service which protects users’ privacy

Decision

The service was rated green for point 9 of the Standard.

10. Define what success looks like and publish performance data

Decision

The service was rated green for point 10 of the Standard.

11. Choose the right tools and technology

Decision

The service was rated green for point 11 of the Standard.

12. Make new source code open

Decision

The service was rated green for point 12 of the Standard.

13. Use and contribute to open standards, common components and patterns

Decision

The service was rated amber for point 13 of the Standard.

During the assessment, we didn’t see evidence of: 

Optional advice to help the service team continually improve the service:

  • Where the team has departed from a standard pattern or component, or used one with known issues, they should share their variation and the research behind it with the GOV.UK Design System, both to evidence the need for the change and so that other teams can benefit from their research.

14. Operate a reliable service

Decision

The service was rated green for point 14 of the Standard.

Updates to this page

Published 11 April 2026