RESEARCH

The Data Centric Engineering (DCE) group at the University of Exeter conducts leading research at the dynamic interface between applied mathematics, statistics and high-performance computing. Spanning all aspects of research, from fundamental theory in data science and AI to applied, industrially focused projects, the group works with major international companies, research centres and university partners.


Adaptive Multilevel MCMC Sampling. Developing an MCMC algorithm for efficient Bayesian inference in multilevel models.

Funding: The Alan Turing Institute. Developing and implementing a new method, Multi-Level Delayed Acceptance (MLDA), to drastically accelerate inference in the many real-world applications where spatial or temporal granularity can be adjusted to create multiple model levels.

Sampling from a Bayesian posterior distribution using Markov Chain Monte Carlo (MCMC) methods (Fig. 1) can be particularly challenging when evaluation of the posterior is computationally expensive and the parameter space and data are high-dimensional, because many MCMC samples are required to obtain a sufficient representation of the posterior. Such problems frequently occur in Bayesian inverse problems, image reconstruction and probabilistic machine learning, where calculating the likelihood depends on the evaluation of complex mathematical models (e.g. systems of partial differential equations) or large data sets.

Fig. 1 - An MCMC algorithm drawing samples from a 2D distribution. In each step a proposed sample is generated and is either accepted (blue) or rejected (green). If rejected, the previous sample of the chain is replicated. Samples occur more frequently in areas of high probability.

This project has developed and implemented an MCMC approach (MLDA) capable of accelerating existing sampling methods where a hierarchy (or sequence) of computationally cheaper approximations to the 'full' posterior density is available. The idea is to use samples generated at one level as proposals for the level above: a subchain runs for a fixed number of iterations, and the last sample of the subchain is used as the proposal for the higher-level chain (Fig. 2).

Fig. 2 - Three-level MLDA: for each step of the fine chain, a coarse subchain of length 3 is created. The last sample of the subchain is used as a proposal for the fine chain.

The sampler is suitable for situations where a model is expensive to evaluate at high spatial or temporal resolution (e.g. realistic subsurface flow models). In those cases it is possible to use cheaper, lower resolutions as coarse models. If the approximations are sufficiently good, this leads to good-quality proposals, high acceptance rates and a high effective sample size compared to other methods.

In addition to this main idea, two enhancements have been developed to increase efficiency. The first is an adaptive error model (AEM). If a coarse model is a poor approximation of the fine model, subsampling on the coarse chain will cause MLDA to diverge from the true posterior, leading to low acceptance rates and a low effective sample size. The AEM corrects coarse-model proposals using a Gaussian error model, which effectively offsets, scales and rotates the coarse-level likelihood function (Fig. 3). In practice, it estimates the relative error (or bias) of the coarser model for each pair of adjacent model levels and corrects the output of the coarser model accordingly.

Fig. 3 - Adaptive Error Model (AEM): when the coarse likelihood function (red isolines) is a poor fit to the fine one (blue contours), the AEM offsets the coarse likelihood function (middle panel) using the mean of the bias, and scales and rotates it (right panel) using the covariance of the bias.

The second enhancement is a new variance reduction technique, which allows the user to specify a Quantity of Interest (QoI) before sampling and calculates this QoI at each step on every level, together with the differences in the QoI between levels. The set of QoIs and QoI differences can be used to reduce the variance of the final QoI estimator.
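To make the subchain-as-proposal idea concrete, here is a minimal two-level sketch in Python. It is illustrative only: log_post_fine and log_post_coarse are toy Gaussian stand-ins for an expensive fine-level posterior and its cheap approximation, and mlda_step runs a short coarse Metropolis subchain whose end point is then accepted or rejected on the fine level with the delayed-acceptance correction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the posterior hierarchy: the "fine" density would be
# expensive in practice; here both are cheap Gaussians so the example runs fast.
def log_post_fine(x):
    return -0.5 * np.sum(x ** 2)

def log_post_coarse(x):
    # A deliberately biased, approximate version of the fine density.
    return -0.5 * np.sum((x - 0.1) ** 2) / 1.2

def mlda_step(x, subchain_len=3, step=0.5):
    """One two-level MLDA step: run a coarse Metropolis subchain started at x,
    then accept/reject its final state on the fine level."""
    # Coarse subchain (standard random-walk Metropolis on the coarse density).
    y = x.copy()
    for _ in range(subchain_len):
        z = y + step * rng.standard_normal(y.shape)
        if np.log(rng.random()) < log_post_coarse(z) - log_post_coarse(y):
            y = z
    # Delayed-acceptance correction on the fine level: the coarse terms cancel
    # the bias introduced by using the subchain end point as a proposal.
    log_alpha = (log_post_fine(y) - log_post_fine(x)) \
              - (log_post_coarse(y) - log_post_coarse(x))
    return y if np.log(rng.random()) < log_alpha else x

x = np.zeros(2)
chain = []
for _ in range(5000):
    x = mlda_step(x)
    chain.append(x)
print("posterior mean estimate:", np.mean(chain, axis=0))
```

Because the coarse density appears in the acceptance ratio, the bias of the coarse approximation is corrected and the chain still targets the fine posterior, while most of the proposal work is done at the cheap level.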

Intelligent Virtual Test Pyramids for High Value Manufacturing


Start Date 01/10/19 - Finish Date 30/09/24. Funding: UKRI 5-Year Turing AI Fellowship, ~£2.6M. Developing novel AI methods to build a more sustainable aviation industry.

There is a paradox in aerospace manufacturing. The aim is to design an aircraft that has a very small probability of failing, yet to remain commercially viable a manufacturer can afford only a few tests of the fully assembled plane. How can engineers confidently predict the failure of a low-probability event? This research is developing novel, unified AI methods that intelligently fuse models and data, enabling industry to slash conservatism in engineering design and leading to faster, lighter, more sustainable aircraft.

The engineering solution to building a safe aircraft is the test pyramid. Variability in structural performance is quantified from thousands of small tests of the material (coupon or element tests), hundreds of tests at intermediate length scales (details and components) and a handful of tests of the full system. 'Design allowables' for the higher levels are derived sequentially from tests at the lower length scales, yet this derivation is accompanied by significant uncertainty.

In practice, this uncertainty means that ad hoc 'engineering safety factors' have to be applied at all length scales. As we seek a more sustainable aviation industry, the compound effect of these safety factors leads to significant overdesign of structures. The overdesign results in extra weight, which severely limits the efficiency of modern aircraft. A clear route to flying lighter, more efficient aircraft is to better quantify the uncertainties inherent in the engineering test pyramid, which will allow the industry to confidently trim excessive safety factors.

New high-fidelity simulation capabilities in composites now mean that mathematical models can supplement limited experimental data at the higher length scales, but with this come additional uncertainties from the models themselves. Existing experimental tests, particularly at the lower length scales, can be used to reduce and quantify uncertainty in the models. The typical statistical approach is Bayesian, in which the distribution of a model's parameters is learned to fit the data probabilistically. However, existing capabilities are notoriously computationally expensive and are often limited to small-scale applications and simplified experimental data sets. For real engineering test pyramids, the available data and models are more heterogeneous and their connections more complex, driving the need for new fundamental research. This research will develop a novel, unified AI framework that intelligently fuses models and data at all levels of the engineering test pyramid, remains computationally feasible and is sufficiently robust to support ethical engineering decision-making, in order to slash conservatism in engineering design and deliver faster, lighter, more sustainable aircraft.

Collaborators
Professor Tim Dodwell, Academic Principal Investigator, https://emps.exeter.ac.uk/engineering/staff/td336
Professor Mark Girolami, http://www.eng.cam.ac.uk/profiles/mag92
Professor Philip Withers, https://www.royce.ac.uk/about-us/professor-phil-withers/

Partners
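As a purely illustrative sketch of the Bayesian idea behind the lowest level of the test pyramid (and not the fellowship's framework), the snippet below performs a conjugate Gaussian update of a mean material strength from hypothetical coupon data and reads off a lower predictive quantile as a stand-in 'design allowable'. The numbers and the names coupon_strengths and allowable are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical coupon-test strengths (MPa); real programmes use thousands of tests.
coupon_strengths = rng.normal(600.0, 25.0, size=30)

# Simple Bayesian update for the mean strength with a known noise level and a
# Gaussian prior: a stand-in for "learning a model's parameters probabilistically".
prior_mean, prior_var = 580.0, 50.0 ** 2
noise_var = 25.0 ** 2
n = len(coupon_strengths)

post_var = 1.0 / (1.0 / prior_var + n / noise_var)
post_mean = post_var * (prior_mean / prior_var + coupon_strengths.sum() / noise_var)

# A purely illustrative "design allowable": a low quantile of the predictive
# distribution for a future coupon, so the design value reflects both test
# scatter and parameter uncertainty rather than an ad hoc safety factor.
pred_sd = np.sqrt(post_var + noise_var)
allowable = post_mean - 1.645 * pred_sd   # ~5th percentile
print(f"posterior mean strength: {post_mean:.1f} MPa, allowable: {allowable:.1f} MPa")
```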

From RIBA to Reality - Deep Digital Twin to Enable Human-Centric Buildings for a Carbon Neutral Future


Start Date 01/04/20 - Finish Date 31/03/22. Funding: Innovate UK, Transforming Construction Call, $1.1M. Developing novel AI methods to track human usage of a building using mobile-to-WiFi connections, enabling adaptive control of energy consumption in the building uniquely tailored to its use.

'RIBA to Reality' is a transformative industrial research project to support the RIBA design process right through to delivery and operation of the next generation of carbon-neutral buildings. Using digital-twinning technology, the programme will deliver a holistic approach to proactive energy control by elevating Building Information Models (BIM) so that they are capable of tracking live, or simulating, both human building usage and energy demands.

The project is developing new technology to track human building usage using mobile-to-WiFi connections. Bayesian calibration techniques will then use this discrete data to build a live 'heat map' of building usage, drawing on the latest techniques in deep Gaussian processes and model order reduction. The human heat map will be embedded into active energy control models for the building (built on top of BIM), with the aim of creating a carbon-neutral building of the future.

'RIBA to Reality' is uniquely centred around two university buildings as real-world test-beds. The Living Systems Institute, which is built and in operation, will provide real-world data to trial and test the new technology and methods proposed, showing how digital planning assets can be aggregated to create an operational digital twin. Project North Park, a 70M visionary capital investment with the ambition to be carbon neutral over its lifetime, is in the early stages of design. The programme will explore how the new technology can be used to support design decisions at critical stages, in particular looking at how the building could exceed its already ambitious energy targets by taking into account its wider footprint.

Recent Updates
Michael Gibson and Ben Fourcin started 6 July 2020.
With City Science we have reviewed available modelling software for energy consumption and have down-selected EnergyPlus.
Obtaining building information data for the Living Systems Institute (LSI).

Collaborators
Professor Tim Dodwell, Academic Principal Investigator, https://emps.exeter.ac.uk/engineering/staff/td336
Professor Richard Everson, Co-Investigator, http://emps.exeter.ac.uk/computer-science/staff/reverson
Professor Jonathan Fieldsend, Co-Investigator, http://emps.exeter.ac.uk/computer-science/staff/jefields
Dr Matt Eames, Co-Investigator, https://emps.exeter.ac.uk/engineering/staff/py99mee
Mr Laurence Oakes-Ash, Industrial Principal Investigator, City Science, https://www.cityscience.com/team/

Partners
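A heavily simplified sketch of the heat-map idea is shown below: an ordinary Gaussian process (rather than the deep GP and model-order-reduction machinery the project will use) is fitted to hypothetical WiFi access-point connection counts to produce a smooth occupancy surface with uncertainty. The access-point locations, counts and grid are synthetic.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)

# Hypothetical access-point locations (x, y in metres) and connection counts
# for one time slice; real data would stream from the building's WiFi network.
ap_xy = rng.uniform(0, 50, size=(12, 2))
counts = rng.poisson(lam=5 + 10 * np.exp(-((ap_xy[:, 0] - 25) ** 2) / 200))

# An ordinary GP regression on log counts as a stand-in for the project's
# deep-GP / reduced-order machinery: it interpolates the discrete counts into
# a smooth occupancy "heat map" with pointwise uncertainty.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=10.0) + WhiteKernel(1e-2),
                              normalize_y=True)
gp.fit(ap_xy, np.log1p(counts))

# Evaluate the heat map on a grid covering the floor plate.
gx, gy = np.meshgrid(np.linspace(0, 50, 60), np.linspace(0, 50, 60))
grid = np.column_stack([gx.ravel(), gy.ravel()])
mean, sd = gp.predict(grid, return_std=True)   # sd gives per-location uncertainty
heat_map = np.expm1(mean).reshape(gx.shape)    # expected occupancy surface
print("peak estimated occupancy:", heat_map.max().round(1),
      "max predictive sd (log scale):", sd.max().round(2))
```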

Analysis and Design for Accelerated Production and Tailoring of composites (ADAPT)


Start Date 01/09/16 - Finish Date 31/12/20. Establishing novel manufacturing techniques that speed up deposition of stiffness-tailored material.

This project is developing novel manufacturing techniques to ensure delivery of better products by minimising the occurrence of manufacturing defects. It will enable production of high-performance composite components at rates suitable for the next generation of short-range aircraft. If demand for production of next-generation, short-range commercial aircraft is to be profitably met, current methods for composite airframe manufacture must achieve significant increases in material deposition rates at reduced cost. However, improved rates cannot come at the expense of safety or increased airframe mass. This project will enable a fourfold increase in productivity by establishing novel manufacturing techniques that speed up deposition of stiffness-tailored material. New continuum-mechanics-based forming models will ensure delivery of better products by minimising the occurrence of manufacturing defects. In a parallel stream of activity, new methodologies for the analysis and design of composite structures, in which the ply angle and thickness of fibre reinforcement are spatially tailored both continuously and discretely, will reduce the need for stiffening, leading to significant savings in structural mass (by up to 30%) and manufacturing cost (by up to 20%). Potential structural integrity and damage tolerance issues, such as the transition in fibre angle and the tapering of laminate thickness from one discrete angle to another, will be addressed.

Recent Updates
A new method for determining the formability of a stack of layers of different fibre orientations has been devised.

Collaborators
Professor Tim Dodwell, Academic Principal Investigator, https://emps.exeter.ac.uk/engineering/staff/td336
Professor Richard Butler, https://researchportal.bath.ac.uk/en/persons/richard-butler-2
Dr Andrew Rhead, https://researchportal.bath.ac.uk/en/persons/andrew-rhead

Partners

Multi-Level Inversion and Uncertainty Quantification for Groundwater Flow and Transport Modelling


Start Date 24/09/18 - Finish Date 23/09/22. Investigating cutting-edge multi-level Markov Chain Monte Carlo methods and machine learning techniques to improve inversion and uncertainty quantification in hydrogeology.

The problem being addressed is the prohibitive computational cost of running standard MCMC on high-dimensional distributions with expensive forward models, such as groundwater flow and transport problems. One solution is the use of surrogate models in multi-level MCMC model hierarchies, and we are currently exploring various machine learning techniques for ultra-fast approximation of the model response. The work has applications in both groundwater abstraction and remediation: improved estimates of groundwater flow patterns can improve decision support systems, allowing groundwater abstraction companies to make better sustainable-yield estimates and remediation companies to design tailor-made remediation campaigns.

Inversion of distributed environmental models is typically an ill-posed problem with noisy data, a well-known challenge that has been addressed from various angles over the past half-century. Markov Chain Monte Carlo (MCMC) overcomes the ill-posedness by treating models as realisations of a stochastic process, allowing for rigorous uncertainty quantification. However, this power often comes at a prohibitive computational cost, since MCMC requires the forward model to be evaluated many times and since, in the case of groundwater flow and potentially an advection/diffusion process, each forward-model evaluation involves solving a partial differential equation (PDE). Delayed Acceptance or Multi-Level MCMC alleviates some of the model evaluation cost by introducing a model hierarchy, in which an inexpensive reduced-order or surrogate model is employed as a filter that rejects highly unlikely parameter sets. Hence, a carefully chosen reduced-order model can accelerate MCMC significantly. Emerging machine learning techniques provide ultra-fast predictions for non-linear problems if tuned correctly, and may thus deliver previously unimaginable improvements to multi-level MCMC. This project is about identifying promising reduced-order models and multi-level MCMC techniques that allow fast inversion and uncertainty quantification for hydrogeological problems.

Recent Updates
We are currently working on a journal paper describing the use of a deep neural network as a surrogate model. The underlying code makes use of the high-performance PDE solver FEniCS along with the popular neural network framework Keras and an in-house MCMC code.

Collaborators
Professor Tim Dodwell, Academic Principal Investigator, https://emps.exeter.ac.uk/engineering/staff/td336
Dr David Moxey, PhD Supervisor, https://emps.exeter.ac.uk/engineering/staff/dm550
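The filtering idea can be sketched with a two-stage (Christen and Fox style) delayed-acceptance step. In the snippet below, log_like_expensive and log_like_surrogate are cheap analytic placeholders for the PDE-based likelihood (solved with FEniCS in the project) and its neural-network surrogate; only proposals that survive the surrogate screening trigger an "expensive" evaluation. This is a sketch of the general technique, not the project's in-house code.

```python
import numpy as np

rng = np.random.default_rng(3)

# Placeholders: in the project the expensive likelihood would solve a groundwater
# PDE, and the surrogate would be a trained deep neural network.
def log_like_expensive(theta):
    return -0.5 * np.sum((theta - 1.0) ** 2) / 0.5 ** 2

def log_like_surrogate(theta):
    return -0.5 * np.sum((theta - 0.9) ** 2) / 0.6 ** 2   # cheap, slightly biased

def log_prior(theta):
    return -0.5 * np.sum(theta ** 2) / 4.0

def delayed_acceptance_step(theta, step=0.3):
    """One two-stage step: the surrogate screens proposals so the expensive
    model is only evaluated for promising candidates."""
    prop = theta + step * rng.standard_normal(theta.shape)
    # Stage 1: cheap screening with the surrogate likelihood.
    log_a1 = (log_like_surrogate(prop) + log_prior(prop)) \
           - (log_like_surrogate(theta) + log_prior(theta))
    if np.log(rng.random()) >= log_a1:
        return theta, False                      # filtered out, no PDE solve
    # Stage 2: correct with the expensive likelihood so the chain still
    # targets the true posterior (the prior cancels in this ratio).
    log_a2 = (log_like_expensive(prop) - log_like_expensive(theta)) \
           - (log_like_surrogate(prop) - log_like_surrogate(theta))
    if np.log(rng.random()) < log_a2:
        return prop, True
    return theta, True

theta = np.zeros(2)
expensive_calls = 0
for _ in range(2000):
    theta, used_expensive = delayed_acceptance_step(theta)
    expensive_calls += used_expensive
print("expensive-model evaluations:", expensive_calls, "of 2000 steps")
```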

Uncertainty Quantification for Black Box Models


Start Date 01/04/19 - Finish Date 01/03/22. Using statistical uncertainty quantification methods for computationally expensive engineering problems.

Some PDE-based models in engineering are extremely challenging and expensive to solve, and are often characterised by high-dimensional input and output spaces. We aim to tailor existing methods from the emulation and calibration literature to such problems, providing full-field predictions of the model output at a fraction of the cost of solving the model directly. With the ability to predict the full model output (with uncertainty) at a particular setting of the input parameters, an emulator can be used as a proxy for the true model in many applications that are currently hindered by high computational cost. For example, in problems requiring domain decomposition, where the model is solved on a large number of subdomains, an emulator proxy can be used in place of the exact solution on every subdomain, with full solutions (unobtainable or extremely expensive directly) constructed from the individual emulators, incorporating the correlation structure between the different subdomains. How best to include this structure is an area of ongoing research within this project.

Recent Updates
Coming soon.

Collaborators
Professor Tim Dodwell, Academic Principal Investigator, https://emps.exeter.ac.uk/engineering/staff/td336
Peter Challenor, http://emps.exeter.ac.uk/mathematics/staff/pgc202
Danny Williamson, https://emps.exeter.ac.uk/mathematics/staff/dw356
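One common way to obtain cheap full-field predictions, shown as a hedged sketch below rather than as the project's chosen method, is to compress the field outputs with principal components and emulate each coefficient with a Gaussian process. Here expensive_model is an analytic stand-in for a PDE solve, and all design choices (40 training runs, 5 components) are illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(4)

# Hypothetical "expensive" model: maps 3 input parameters to a 200-point field.
# In the project this would be a PDE solve; here it is an analytic stand-in.
grid = np.linspace(0.0, 1.0, 200)
def expensive_model(theta):
    a, b, c = theta
    return a * np.sin(2 * np.pi * (grid + b)) + c * grid

# Design of experiments: a small number of expensive runs.
X_train = rng.uniform(0, 1, size=(40, 3))
Y_train = np.array([expensive_model(t) for t in X_train])

# Emulation strategy: compress the field outputs with PCA, then fit an
# independent GP to each retained coefficient.
pca = PCA(n_components=5).fit(Y_train)
coeffs = pca.transform(Y_train)
gps = [GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
         .fit(X_train, coeffs[:, k]) for k in range(coeffs.shape[1])]

def emulate(theta):
    """Cheap full-field prediction at a new input setting."""
    c = np.array([gp.predict(theta.reshape(1, -1))[0] for gp in gps])
    return pca.inverse_transform(c.reshape(1, -1))[0]

theta_new = np.array([0.5, 0.2, 0.8])
err = np.max(np.abs(emulate(theta_new) - expensive_model(theta_new)))
print(f"max emulation error on the field: {err:.3f}")
```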

Spatially Correlated Bernoulli Processes


Start Date 01/08/20 - Finish Date 01/08/22. Funding: UKRI 5-Year Turing AI Fellowship, ~£2.6M. Spatially correlated Bernoulli processes using de Bruijn graphs.

I aim to develop a spatially correlated Bernoulli process in n dimensions, so that samples from this process have a clustered structure instead of appearing at random. Initial work on a 1-d process (during my PhD) has focused on using de Bruijn graphs, which are a generalisation of Markov chains, so that we can control the spread of the correlation. I initially want to extend these ideas to higher dimensions, and then focus on possible applications, for example in classification.

Given a series of Bernoulli trials (or coin tosses) in 'space', it is often the case that nearby trials are more highly correlated than trials far apart. We then want to model spatially varying probabilities of success so that the probability of observing a 1 is higher near another 1 (and equally for 0s). Such processes are often modelled using logistic regression, which allows the probability of observing a 1 to vary smoothly in space, producing a marginal Bernoulli distribution with a probability of success at each point. If we sample a sequence of 0s and 1s from these models, however, we neglect any form of spatial correlation, so the 0s and 1s in the sample are much more likely to appear random.

An example of this arises in computer modelling or spatial classification. Consider a two-dimensional grid where the input space is split into two separate regions. We can fit a logistic regression to these regions, which gives the probability of being classified into one of the two regions at any point in space. To generate classification predictions, we would sample from an independent Bernoulli distribution at specified input values, where we would expect input points that are closer together to be more likely to be classified into the same region. Drawing marginally, however, gives a much higher chance of misclassification, especially close to the region boundaries.

We therefore want to produce a correlated Bernoulli process, and recent work has focused on using the structure of de Bruijn graphs. Given the symbols 0 and 1, a length-m de Bruijn graph is a directed graph whose nodes are all possible length-m sequences of 0s and 1s. Edges are drawn between nodes so that the end of the previous node matches the start of the next. Since probabilities can be attached to each edge to give the probability of transitioning from sequence to sequence, de Bruijn graphs can be seen as a generalisation of a Markov chain. The spread of the correlation can be controlled by changing both the length of the node sequences and the transition probabilities themselves. This means we can create both 'sticky' (clustered) and 'anti-sticky' (alternating) sequences of 0s and 1s. I have worked on producing both a run-length distribution and inference: given a sequence of 0s and 1s, we can estimate the de Bruijn process most likely to have generated it.

This work has successfully defined a correlated Bernoulli process in one dimension, and the next step is to extend it to higher dimensions. This is a difficult problem, since the direction associated with de Bruijn graphs does not easily apply to a spatial grid. Initial work has focused on adapting multivariate Markov chains and/or redefining the form of the dependent de Bruijn sequences. Due to the directional restrictions on a spatial grid, I will also develop a more generalised, non-directional de Bruijn process.

Recent Updates
PhD thesis submitted 09/04/20.
University of Exeter internal seminar 02/06/20.
Isaac Newton Institute talk 07/19.

Collaborators
Professor Tim Dodwell, Academic Principal Investigator, https://emps.exeter.ac.uk/engineering/staff/td336
Prof Peter Challenor, PhD Supervisor, https://emps.exeter.ac.uk/mathematics/staff/pgc202
Prof Henry Wynn (LSE), http://www.lse.ac.uk/CATS/People/Henry-Wynn-homepage
Dr Daniel Williamson, https://emps.exeter.ac.uk/mathematics/staff/dw356

Partners
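The 'sticky' behaviour can be illustrated with a small sampler over a de Bruijn graph of order m. The transition rule below (copy the majority symbol of the current node with probability p_repeat) is a deliberately simple illustrative choice, not the process studied in the thesis, where each edge carries its own transition probability.

```python
import numpy as np

rng = np.random.default_rng(5)

def sample_de_bruijn_bernoulli(n, m=3, p_repeat=0.9):
    """Sample a binary sequence from an order-m (de Bruijn graph) process.

    Each node is the last m symbols. With probability p_repeat the next symbol
    copies the majority symbol of the current node, giving 'sticky' (clustered)
    runs; p_repeat < 0.5 instead gives 'anti-sticky', alternating behaviour.
    """
    seq = list(rng.integers(0, 2, size=m))          # initial node
    for _ in range(n - m):
        node = seq[-m:]
        majority = int(sum(node) * 2 >= m)
        next_symbol = majority if rng.random() < p_repeat else 1 - majority
        seq.append(next_symbol)                     # traverse an edge to the next node
    return np.array(seq)

x = sample_de_bruijn_bernoulli(60, m=3, p_repeat=0.9)
print("".join(map(str, x)))

# Average run length as a quick check of "stickiness".
changes = np.flatnonzero(np.diff(x)) + 1
run_lengths = np.diff(np.r_[0, changes, len(x)])
print("mean run length:", run_lengths.mean().round(2))
```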

Modelling Multiphase, Multiscale Resin Flow in Composites


Start Date 14/08/17 - Finish Date 06/05/22. Developing a computer model to simulate the forming of composites during infusion manufacture.

In infusion manufacture, optimising the cure cycle time is a complex challenge: too quick a cure can lead to porosity, whereas a slower cure reduces output and increases cost. The aim of this project is to create a computational model that simulates the flow and cure of liquid resin within a composite fibre bed. This will allow the user to alter parameters to optimise the cure process without fabricating many costly prototypes. Numerous factors affect the infusion and cure processes, so the computational model needs to account for many parameters, including resin flow rate, local temperatures, time, cure rate, resin viscosity, displacement and fibre direction. To provide a complete picture, the model needs to encompass the infusion stage of manufacture, as this is when voids can occur in the resin. This requires tracking the fluid front in a two-phase fluid simulation (i.e. the air being expelled and the liquid resin being introduced).

Recent Updates
A three-dimensional, linear model has been created in Matlab, encompassing fibre compaction, fluid pressure, temperature and resin cure. A non-linear model of fibre compaction has now been created using FEniCS, to which the fluid behaviour of the resin will be added.

Collaborators
Professor Tim Dodwell, Academic Principal Investigator, https://emps.exeter.ac.uk/engineering/staff/td336
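As a toy illustration of fluid-front tracking (far simpler than the project's coupled, non-linear 3-D model), the sketch below advances a 1-D resin front through a fibre bed under constant injection pressure using Darcy's law and checks it against the analytic square-root-of-time filling solution. All material values are invented for the example.

```python
import numpy as np

# Illustrative property values, not the project's model (which couples
# compaction, temperature and cure in three dimensions).
K = 1e-10        # fibre-bed permeability, m^2
mu = 0.1         # resin viscosity, Pa.s
phi = 0.5        # porosity (volume fraction available to resin)
dP = 1e5         # pressure drop from inlet to flow front, Pa

dt, t_end = 0.01, 600.0
x_front = 1e-3                      # small initial wetted length to avoid 1/0
for _ in range(int(t_end / dt)):
    # Darcy velocity through the wetted length, converted to front speed.
    v_darcy = K * dP / (mu * x_front)
    x_front += (v_darcy / phi) * dt

analytic = np.sqrt(2 * K * dP * t_end / (phi * mu))
print(f"numerical front: {x_front:.3f} m, analytic: {analytic:.3f} m")
```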

...
...