Apply for, maintain and renew an SIA licence


Service standard report

Apply for, maintain and renew an SIA licence

Assessment date 01/10/2025
Assessment stage Alpha
Assessment type Assessment
Service provider Security Industry Authority / Home Office
Result Amber

Service description

The service provides users with the ability to apply for and manage SIA licences. This is a legal requirement for working in a private security role in the UK. The licence confirms that holders are fit and proper and have the necessary skills and training for security work. Users are able to apply for licences, renew their licence and tell the SIA about changes to circumstances. The service maintains a list of people with licences and certified training holders.

The current service runs on technology that is out of support or close to out of support. We are seeking to solve this by re-platforming, while improving the service in line with user needs.

Service users

This service is for:

  • Licence applicants 
  • Frontline licences: door supervision, security guarding, public space surveillance (CCTV), close protection, cash and valuables in transit, vehicle immobilisers (Northern Ireland only), key holding
    • Non-frontline licences
  • Awarding bodies and training providers
  • Security companies (assisting with applications for and managing licences)
  • SIA officers (not direct users of the service but consumers of the information provided in the service)
  • The public accessing the online Register of Licence Holders
  • Law enforcement bodies

Things the service team has done well:

  • User research: clearly identified user needs and riskiest assumptions for eligible users across the stages of the application process, with cross-team collaboration on user research activities.
  • Technology: the team has chosen suitable cloud-based SaaS products to deliver the core infrastructure for the service and to integrate with future services, meeting the requirements laid out in the SIA transformation plan.
  • Agile ways of working: the team is multidisciplinary and takes a collaborative, open approach, communicating with stakeholders frequently and iterating designs and messaging based on data insights. Their thoughtful and detailed engagement with call-centre personnel is to be commended.

1. Understand users and their needs

Decision

The service was rated amber for point 1 of the Standard.

This is amber because:

Although the team has tested the happy paths with successful users of the service, it has not been able to fully explore and understand the needs of people who apply for licences while ineligible, which is the main problem the service is trying to solve. In Beta we'd expect to see more research with ineligible users who have submitted applications, to understand their misconceptions about eligibility and which service iterations will best overcome common misunderstandings.

Most of the challenges and risky assumptions are based on the needs of individual users rather than business users. For practical reasons a high proportion of testing has been with business users. The service would benefit from research with more of the individual users who represent the majority of applications and problems experienced.

2. Solve a whole problem for users

Decision

The service was rated amber for point 2 of the Standard.

This is amber because: 

While solid progress has been made in Alpha, there are still a number of unhappy journeys (offline and online) which need to be examined. The absence of user journey maps, empathy mapping and testing of alternative journey options are key concerns. We note the team plans to explore a number of problems more deeply in the Beta phase, including user behaviour after failing an eligibility check, delays to certification and badges, gaps in required information (for example address history or ID) and assisted digital requirements.

Optional advice:

  • Develop user journey maps that reflect diverse scenarios, including edge cases.
  • Introduce empathy mapping to better understand emotional and practical user needs.
  • Investigate reasons for failed applications and appeals to inform design improvements.
  • Consider engaging policy colleagues to understand external drivers of demand.

3. Provide a joined-up experience across all channels

Decision

The service was rated amber for point 3 of the Standard.

This is amber because:

There was very little evidence of the wider user journey or potential cross-channel activities and interactions. The focus of this Alpha was to solve the burning platform issue at pace, while improving the user experience as much as possible in the process. As a result, there is some tension between strategic goals, business needs, and user needs. The (pragmatic) narrowing of scope in the Alpha phase left little room for the team to map and research the wider touchpoints or offline interactions that would provide a more holistic view of the service. At Beta we’d expect to see evidence that the team has explored how users interact with the service across all touchpoints. In addition, we’d recommend they engage further with frontline staff and agencies to understand their role and impact and visualise how digital and non-digital elements work together.

Optional advice:
We note that many of the activities which will make this a truly joined-up user experience are on the backlog for the Beta phase. We'd recommend the team re-frames these backlog items as user-centred problem statements, maps the full end-to-end journey including offline and support channels, and facilitates workshops to align strategic, business and user goals.

4. Make the service simple to use

Decision

The service was rated green for point 4 of the Standard.

Optional advice: in Alpha, the team has focused on improving the usability and accessibility of the eligibility checker and online application form, making good use of GOV.UK design patterns. We'd recommend the work to make this service simple to use is applied end-to-end in Beta, researching and mapping some of the more complex user journeys and staying mindful of unhappy paths, as recommended in points 1 and 2.

5. Make sure everyone can use the service 

Decision

The service was rated amber for point 5 of the Standard.

This is amber because: 

  • There was limited testing with users with access needs and little evidence of research into assisted digital journeys, which may unintentionally exclude users who need support.
  • We'd recommend the team conducts inclusive research with users who have a range of access needs, tests assisted digital journeys to ensure the required support is available, and uses accessibility checklists and tools to validate design decisions.
  • There was no clear policy or user-facing statement on the collection and storage of sensitive data during the eligibility checking process, which may cause access issues for some users (see point 9).

6. Have a multidisciplinary team

Decision

The service was rated green for point 6 of the Standard.

Optional advice: it was noted that the team is almost entirely external, raising issues around future knowledge and skills transfer. We recommend a plan is put in place to mitigate these and the other risks associated with outsourcing digital design and build.

7. Use agile ways of working

Decision

The service was rated green for point 7 of the Standard.

8. Iterate and improve frequently

Decision

The service was rated green for point 8 of the Standard.

9. Create a secure service which protects users’ privacy

Decision

The service was rated amber for point 9 of the Standard.

This is amber because:

  • The service team has completed a DPIA, is planning an ITHC and has briefly discussed data migration; however, a clear plan for handling existing data was not presented.
  • The team needs to further consider the implications of holding detailed sensitive data (such as data about mental health) and, in Beta, document its approach to data handling and storage as well as providing data privacy statements for users.

Optional advice: we suggest the team regularly reviews the decision that there is no benefit in using GOV.UK One Login for authentication and/or identity verification.

10. Define what success looks like and publish performance data

Decision

The service was rated green for point 10 of the Standard.

Optional advice: metrics and KPIs outlined are adequate for the Alpha stage. We’d expect the proposed metrics for Beta to be reviewed and solidified as the service evolves, ensuring they remain relevant and actionable.

11. Choose the right tools and technology

Decision

The service was rated green for point 11 of the Standard.

Optional advice: the choice of cloud platform aligns with SIA strategy and public cloud usage; the team should ensure it reviews and monitors the financial and licensing implications of the chosen provider.

12. Make new source code open

Decision

The service was rated green for point 12 of the Standard.

Optional advice: there are no code artefacts at this stage; however, the team should look to establish a coding-in-the-open policy and secure processes around it, given the intent to use open source for the frontend service.

13. Use and contribute to open standards, common components and patterns

Decision

The service was rated green for point 13 of the Standard.

Optional advice: continue to use the GOV.UK Design System and reuse open-source components when building the frontend service.

14. Operate a reliable service

Decision

The service was rated green for point 14 of the Standard.

Optional advice: the choice of cloud platform aligns with SIA strategy and public cloud usage principles, and the architecture is designed to minimise or eliminate downtime; however, appropriate RTO and RPO targets should be established and routine business continuity exercises carried out. It is noted that the design and build team is almost entirely outsourced; the team will need to present a plan to run and maintain the live service going forward.

Updates to this page

Published 11 April 2026