Keynotes
Professor Alfred Hero - Biography
Abstract - Learning to Benchmark
Using mathematical models to benchmark the capability of a sensor platform to provide data for accurate signal detection, classification, or estimation has been an essential part of performance-driven system design. When a mathematical model is unreliable, or not available, a natural question to ask is whether it is possible to use machine learning to accurately benchmark the capability of a sensor solely from experimental data collected from the sensor. In this talk we will answer this question in the affirmative. For example, in the context of classification, empirical estimation of the minimal achievable classification error, i.e., the Bayes error rate, from labeled experimental sensor data can be framed as the meta-learning problem of estimating the Bayes-optimal misclassification error rate without having to estimate the Bayes-optimal classifier. The talk will cover relevant background, theory, algorithms, and applications of benchmark learning.
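As a simple illustration of benchmarking from labelled data alone (not the specific meta-learning estimators covered in the talk), the sketch below brackets the Bayes error rate using the leave-one-out 1-NN error together with the classical Cover-Hart bound, without ever constructing the Bayes-optimal classifier. The synthetic Gaussian data and all settings are assumptions.

```python
# Minimal sketch (illustrative only): bracket the Bayes error rate from
# labelled data via the leave-one-out 1-NN error and the Cover-Hart bound.
import numpy as np

rng = np.random.default_rng(0)
# Two overlapping 5-D Gaussian classes stand in for labelled sensor data.
X = np.vstack([rng.normal(0.0, 1.0, (500, 5)), rng.normal(1.0, 1.0, (500, 5))])
y = np.repeat([0, 1], 500)

# Leave-one-out 1-NN error as an estimate of the asymptotic NN error rate R_nn.
d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)          # exclude each point from its own neighbours
r_nn = np.mean(y[np.argmin(d, axis=1)] != y)

# Cover-Hart (two classes, asymptotic): (1 - sqrt(1 - 2*R_nn))/2 <= R* <= R_nn.
lower = 0.5 * (1.0 - np.sqrt(max(0.0, 1.0 - 2.0 * r_nn)))
print(f"1-NN error {r_nn:.3f}; Bayes error rate bracketed in [{lower:.3f}, {r_nn:.3f}]")
```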
Professor Andy Bell - Biography
Abstract - Defence and Security in the Information Age
We are in the Information Age, defined as a period of rapid transformation in methods of generation, access and exploitation of information. Its characteristics are connectivity, mobility, high data volume and rate, and rapid innovation. It is having a profound effect on defence and security, and this is particularly apparent in the many ways that threats now manifest themselves.
The opportunity to act, or be acted upon, in the virtual or social domains has been significantly increased by the information age. Boundaries which we have traditionally sought to protect are less secure, or no longer exist. There are also new ways in which our interests can be attacked, some of which may not yet have been recognised, and which cannot fully be countered by conventional military means. Fundamentally, information has passed from being an enabler for military action to a weapon, tactical or strategic, in its own right.
The MOD is investing significant resources to develop the underpinning mathematics, engineering and computer science needed to counter the threats, and exploit the opportunities, brought by the information age. This talk will describe the potential characteristics of operating in the information age, the MOD’s response to these challenges, and some examples of technologies which promise to deliver information advantage.
Invited Speakers
Professor Simon Maskell - Biography
Abstract - Big Hypotheses: A Generic Tool for Fast and Good Bayesian Machine Learning
There are many machine learning tasks that would ideally involve global optimisation across some parameter space. Researchers often pose such problems in terms of sampling from a distribution and favour Markov Chain Monte Carlo (MCMC) or its derivatives (e.g., Gibbs sampling, Hamiltonian Monte Carlo (HMC) and simulated annealing). While these techniques can offer the good results that are so important in defence contexts, they are stereotypically slow. We describe an alternative numerical Bayesian algorithm, the Sequential Monte Carlo (SMC) sampler. SMC samplers are closely related to particle filters and are reminiscent of genetic algorithms. More specifically, an SMC sampler replaces the single Markov chain considered by MCMC with a population of samples. This inherent parallelism makes the SMC sampler a promising starting point for developing a scalable Bayesian global optimiser, e.g., one that runs 86,400 times faster than MCMC and might ultimately prove correspondingly more computationally efficient. The University of Liverpool and STFC’s Hartree Centre have recently started work on a £2.5M EPSRC-funded project (with significant support from IBM, NVIDIA, Intel and Atos) to develop SMC samplers into a general-purpose, scalable numerical Bayesian optimiser and to embody them as a back-end in the software package Stan. This talk will summarise recent developments, initial results (on a subset of problems posed by AstraZeneca, AWE, Dstl, Unilever, physicists, chemists, biologists and psychologists) and work planned over the next four years towards a high-performance, parallel Bayesian inference implementation that can be used for a wide range of problems relevant to researchers across many application domains, including defence.
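To make the idea concrete, below is a minimal, illustrative SMC-sampler sketch (not the Big Hypotheses or Stan implementation): a population of samples is re-weighted along a tempering schedule, resampled when the effective sample size collapses, and refreshed with one Metropolis move per sample; the per-sample operations are embarrassingly parallel. The toy target, schedule and step sizes are all assumptions.

```python
# Minimal likelihood-tempered SMC sampler sketch (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def log_prior(x):
    # Broad Gaussian prior, N(0, 5^2).
    return -0.5 * (x / 5.0) ** 2

def log_lik(x):
    # Example "likelihood": mixture of two Gaussians (stand-in for a real model).
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

N = 1000                              # population of samples (parallelisable)
temps = np.linspace(0.0, 1.0, 21)     # tempering schedule: prior -> full posterior
x = rng.normal(0.0, 5.0, N)           # initial population drawn from the prior
logw = np.zeros(N)

for t0, t1 in zip(temps[:-1], temps[1:]):
    # Incremental importance weights for moving from temperature t0 to t1.
    logw += (t1 - t0) * log_lik(x)
    w = np.exp(logw - logw.max()); w /= w.sum()
    # Resample when the effective sample size collapses below N/2.
    if 1.0 / np.sum(w ** 2) < N / 2:
        x, logw = x[rng.choice(N, N, p=w)], np.zeros(N)
    # One random-walk Metropolis move per sample, targeting prior * lik^t1.
    prop = x + rng.normal(0.0, 0.5, N)
    log_alpha = (log_prior(prop) + t1 * log_lik(prop)) - (log_prior(x) + t1 * log_lik(x))
    x = np.where(np.log(rng.uniform(size=N)) < log_alpha, prop, x)

w = np.exp(logw - logw.max()); w /= w.sum()
print("posterior mean estimate:", np.sum(w * x))
```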
Professor Peter Willett - Biography
Abstract - Navigation and Destination-Aware Modeling for Highly-Maneuvering Threats
Concern has largely shifted from ballistic threats to those that execute high-speed and seemingly random maneuvers prior to final target engagement. For target protection it is vital that a method for very accurate tracking of such objects be developed. The scheme discussed here has four key ingredients: it adheres to physics; it assumes a Proportional Navigation feedback guidance model while the threat is steering toward its destination; it estimates the parameters of that feedback; and it allows for periods in which there are other, temporary sham destinations. It is applied to a 3D maneuvering-target state estimation problem with a target capable of high-magnitude, random lateral and vertical accelerations under a Proportional Navigation control policy. It is shown that, due to the observability of the feedback control parameters, the filter significantly reduces the position, velocity, and prediction errors.
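For concreteness, the sketch below simulates the Proportional Navigation guidance law that the tracker assumes: commanded lateral acceleration proportional to the line-of-sight rate times the closing velocity. The navigation gain, speed and geometry are illustrative assumptions, not values from the talk, and the simulation is 2-D rather than 3-D.

```python
# Minimal 2-D Proportional Navigation (PN) guidance sketch (illustrative only).
import numpy as np

N_GAIN = 3.0                          # navigation constant (typically 3-5)
SPEED = 300.0                         # threat speed [m/s], held constant
dt = 0.01                             # integration step [s]

p = np.array([0.0, 0.0])              # threat position [m]
v = SPEED * np.array([1.0, 0.3]) / np.linalg.norm([1.0, 0.3])   # threat velocity
dest = np.array([5000.0, 0.0])        # intended destination (defended point)

r = dest - p
prev_los = np.arctan2(r[1], r[0])
for _ in range(200000):               # safety cap on simulation length
    r = dest - p
    if np.linalg.norm(r) < 5.0:       # within 5 m: engagement complete
        break
    los = np.arctan2(r[1], r[0])              # line-of-sight (LOS) angle
    los_rate = (los - prev_los) / dt          # LOS rate drives the PN command
    prev_los = los
    v_close = np.dot(r, v) / np.linalg.norm(r)    # closing velocity
    a_lat = N_GAIN * v_close * los_rate           # PN lateral acceleration
    perp = np.array([-v[1], v[0]]) / SPEED        # unit vector normal to velocity
    v = v + a_lat * perp * dt
    v = SPEED * v / np.linalg.norm(v)             # constant-speed assumption
    p = p + v * dt

print("final miss distance [m]:", np.linalg.norm(dest - p))
```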
Professor Daniele Faccio - Biography
Abstract - Coherent Beam Control and Machine Learning for Enhanced Imaging Applications
Computational techniques and coherent control for beam shaping are promising new approaches to imaging in challenging scenarios. We will discuss the use of machine learning for imaging through optical fibres and for single-pixel LIDAR. Feedback-based coherent control of laser wavefronts, combined with the speckle memory effect, can be used to re-collimate a laser beam after scattering from a wall and then to scan a hidden scene, thereby providing a non-line-of-sight LIDAR with which we scan a hidden object at 100-micron resolution.
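As an illustration of feedback-based wavefront shaping (a simplified simulation, not the experimental system described in the talk), the sketch below models the scattering wall as a random complex transmission matrix and optimises an input phase mask segment by segment to maximise the intensity at one output point; the number of segments and phase steps are assumptions.

```python
# Minimal sketch of feedback-controlled wavefront shaping through a scatterer
# (illustrative simulation only).
import numpy as np

rng = np.random.default_rng(1)
n_segments = 64                       # SLM segments controlling the input phase
# Random complex transmission coefficients stand in for the scattering wall.
t = (rng.normal(size=n_segments) + 1j * rng.normal(size=n_segments)) / np.sqrt(2)
phases = np.zeros(n_segments)         # current phase mask

def focus_intensity(ph):
    # Intensity at one output speckle grain = |sum of segment fields|^2.
    return np.abs(np.sum(t * np.exp(1j * ph))) ** 2

# Stepwise sequential optimisation: adjust one segment at a time, keeping the
# phase that maximises the measured feedback intensity at the target point.
trial_phases = np.linspace(0, 2 * np.pi, 16, endpoint=False)
for k in range(n_segments):
    scores = []
    for p in trial_phases:
        ph = phases.copy(); ph[k] = p
        scores.append(focus_intensity(ph))
    phases[k] = trial_phases[int(np.argmax(scores))]

enhancement = focus_intensity(phases) / np.mean(
    [focus_intensity(rng.uniform(0, 2 * np.pi, n_segments)) for _ in range(200)])
print(f"focus enhancement over a random wavefront: {enhancement:.1f}x")
```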