About the project
Robust methods and tools for ensuring the safety and security of cyber-physical systems are key to their future development and deployment. The long-term objective of this project is to develop formal methods that verify and ensure the security of cyber-physical systems with AI (neural network) components.
The increasing use of AI (neural network) components has raised concerns about the robustness and reliability of the autonomous and cyber-physical systems into which they are integrated. By deploying cutting-edge verification methods, you will discover new ways to make cyber-physical systems with neural network components more resilient to failures and adversarial attacks.
In this project you will:
- explore the range of vulnerabilities caused by neural networks in cyber-physical systems
- identify the vulnerabilities that can be formalised as neural network verification properties (a sketch of one such property follows this list)
- program the cyber-physical system environments that embed the neural networks in a safe-by-construction way.
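To give a flavour of what a neural network verification property looks like, below is a minimal, purely illustrative Python sketch of local (adversarial) robustness, the property studied in the reading list below. The toy classifier `f`, the input `x0`, and the bound `eps` are assumptions invented for this sketch, not artefacts of the project; note also that the sketch only samples perturbations to look for counterexamples, whereas a verifier such as those discussed in the references proves the property for all perturbations.

```python
# Illustrative sketch of a local robustness property for a neural network.
# All names (f, x0, eps) are hypothetical; the network is a random toy model.
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer ReLU network with random weights, standing in for a trained model.
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

def f(x):
    """Forward pass of the toy classifier: returns class scores."""
    h = np.maximum(W1 @ x + b1, 0.0)   # ReLU hidden layer
    return W2 @ h + b2

def locally_robust(x, eps, samples=10_000):
    """Empirically test the property
         for all x' with ||x' - x||_inf <= eps:  argmax f(x') == argmax f(x)
    by random sampling. Sampling can only find counterexamples; a verifier
    is needed to prove the property for every admissible x'."""
    label = np.argmax(f(x))
    for _ in range(samples):
        x_pert = x + rng.uniform(-eps, eps, size=x.shape)
        if np.argmax(f(x_pert)) != label:
            return False  # counterexample: an adversarial perturbation
    return True  # no counterexample found (not a proof)

x0 = np.array([0.5, -0.2, 0.1])
print(locally_robust(x0, eps=0.05))
```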
You can find some information about the topic here:
- The Vehicle Tutorial: Neural Network Verification with Vehicle
- Neural Network Robustness as a Verification Property: A Principled Case Study
- CheckINN: Wide Range Neural Network Verification in Imandra
You will join the Cyber Physical Systems (CPS) research group and the Cyber Security research group, recognised as an Academic Centre of Excellence in Cyber Security Research (ACE-CSR) and in Cyber Security Education (ACE-CSE), and work with Prof. Ekaterina Komendantskaya and Dr Erisa Karafili.
As part of the CPS and Cyber Security research groups, you will attend the regular research seminars and events they organise. You will also attend national and international summer schools, workshops and conferences.