Securing the future of ASSURE
The ASSURE scheme sees the expertise of cybersecurity professionals used to evaluate the security of disparate systems and processes within the aviation sector. But is the evidence-based compliance process too prescriptive? Anthony Dickinson, managing consultant at Prism Infosec, explores the issues and what needs to happen to ‘assure’ the standard’s future.
When the ASSURE scheme was introduced in January 2020, the move was welcomed by an industry now regularly subjected to cyber attacks, with 61.5% of airports attacked that year.
The scheme promises direct access to experienced cybersecurity specialists, as the audit must be carried out by a CREST or IASME accredited tester, and a holistic overview of the security posture, as IT and OT systems are assessed in parallel. But while it has delivered on these promises, it is proving challenging to implement.
ASSURE was developed to help aviation organisations – airlines, airport operating firms, and air navigation service providers – meet their regulatory requirements and communicate cyber risk and security issues to the board.
It comprises a six-step process that consists of engagement, critical systems scoping, a pre-audit cyber self-assessment, the audit itself, a provisional statement of assurance, and a final statement and certificate of compliance, with an accredited third-party ASSURE assessor required to carry out the audit.
It’s a process that has proven particularly difficult for airports, which are expected to comply by the end of 2021. Still reeling from the effects of the pandemic, shifts in working patterns, and a declining workforce, many now find themselves under-resourced to the extent that some have sought an extension. But it’s not just timescales that are proving problematic.
The ASSURE scheme is based upon sound principles, as it uses the National Cyber Security Centre (NCSC) Cyber Assessment Framework (CAF). But the CAF for aviation is far more prescriptive: where the NCSC CAF, which applies to critical national infrastructure, contains four objectives (A-D) with advisory statements under each, the CAF for aviation treats each of those statements as an objective in its own right, with indicators of good practice marked against each.
Every outcome has multiple sub-category statements spanning A-D, and airports are required to carry out all the assessments for each and every critical system. This means that a tier two airport will typically have to complete these assessments 20-30 times to cover airside and landside critical systems such as radar, telecoms, CCTV, ground lighting, and network storage, which can take months.
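To give a sense of the scale involved, a back-of-the-envelope estimate can be sketched as follows (the figures are taken from this article – 39 contributing outcomes, 20-30 critical systems for a tier two airport – not from the official ASSURE guidance):

```python
# Rough ASSURE workload estimate (illustrative only; figures from this
# article, not from the CAA's official scheme documentation)
caf_contributing_outcomes = 39   # outcomes audited per critical system
critical_systems = 25            # mid-range for a tier two airport (20-30)

total_assessments = caf_contributing_outcomes * critical_systems
print(total_assessments)
```

Even at the conservative midpoint, that is close to a thousand individual assessments to evidence, which helps explain why the process can take months.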
In a bid to create some flexibility, the requirements are open to interpretation. Airports can even choose to disregard the statements altogether and demonstrate compliance using alternative methods of their choosing. For example, perhaps the system being assessed features advanced technology not reflected in the indicators of good practice. In this case, airports can use an alternative method to show they have met the outcome.
Unfortunately, this flexibility has created a certain amount of ambiguity, which has resulted in some inconsistency as to how the CAF is applied. While some organisations are being extremely thorough in evidencing their compliance, which is to be applauded, others are doing much less – diluting the value of the exercise.
Acceptable evidence can take various forms: system documentation, manuals, reports, meeting minutes and interviews with key personnel are all admissible, but teams must document both the process of achieving compliance and the outcome. They'll need to identify what evidence to submit and liaise with one another to create and collate this information, which can prove logistically challenging.
For these reasons, many choose to appoint a third-party assessor for the self-assessment stage as well as for the audit proper to ensure the correct evidence is generated and time (and costs) are kept under control.
Another issue that has arisen as a result of basing the scheme on the NCSC CAF is that it has revealed an IT bias. Assumptions are made that the systems being assessed are modern, connect to the internet and are designed with security in mind when in fact many airport systems could be over thirty years old with no internet connectivity.
Radar systems, such as the Watchman, are a perfect example of systems that are classed as critical but actually have very little cyber functionality. Consequently, much of the criteria simply doesn't apply, yet the process must still be followed and evidence provided.
Once the evidence has been compiled, the organisation will need to select an independent assessor (a second assessor if the organisation has already used an ASSURE assessor during the self-assessment) who will carry out the audit.
The assessor weighs all the observations, evidence, controls, guidance, standards and good practice gathered during the self-assessment against the indicators of good practice, delivering a verdict of ‘achieved’, ‘partially achieved’ or ‘not achieved’, with commentary, for each contributing outcome of the CAF. Recommendations are then made for those areas deemed ‘partially achieved’ or ‘not achieved’, forming the basis of a Corrective Action Plan to be completed within an agreed timeframe.
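The verdict-and-remediation flow described above can be sketched as a simple data model. This is purely illustrative – the class names, fields and example outcome references are assumptions for the sketch, not part of the ASSURE scheme documentation:

```python
from dataclasses import dataclass

# The three verdicts the auditor can deliver per contributing outcome
ACHIEVED = "achieved"
PARTIALLY_ACHIEVED = "partially achieved"
NOT_ACHIEVED = "not achieved"

@dataclass
class OutcomeAssessment:
    outcome_id: str        # a CAF contributing outcome reference (hypothetical)
    verdict: str           # one of the three verdicts above
    commentary: str
    recommendation: str = ""

def corrective_action_plan(assessments):
    """Collect every outcome that needs remediation, i.e. anything
    not fully achieved, to form the basis of the Corrective Action Plan."""
    return [a for a in assessments
            if a.verdict in (PARTIALLY_ACHIEVED, NOT_ACHIEVED)]

# Hypothetical audit results
audit = [
    OutcomeAssessment("B4.a", ACHIEVED, "Network segregation evidenced."),
    OutcomeAssessment("C1.b", PARTIALLY_ACHIEVED, "Monitoring coverage gaps.",
                      "Extend log collection to OT systems."),
]
plan = corrective_action_plan(audit)
```

The design point is simply that only ‘partially achieved’ and ‘not achieved’ verdicts flow into the plan; fully achieved outcomes need commentary but no remediation.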
Happily, even organisations that have met the compliance requirements are commonly choosing to implement a comprehensive programme of works post-assessment to address issues, which shows the standard is not being viewed as a tick-box exercise.
To capitalise upon this enthusiasm, the ASSURE scheme now needs to adapt. It’s already built upon the sturdy foundations of the NCSC CAF and aligns with a number of aviation industry regulations and guidance documents. But it must become more relevant, focused and streamlined to ease workloads. We’d also like to see the auditor required to give more of a narrative response to each of the 39 questions, giving the organisation more information to act on.
As with any new standard, there were always going to be teething issues when introducing ASSURE. There are processes in place to allow for review, with assessors required to take part in a ‘wash-up’ post-audit. This allows a high-level discussion of how the assessment went, and it is anticipated that this feedback will help shape the standard and its guidance, ensuring ASSURE can fulfil its potential in making aviation systems and processes more cyber secure.