Dr Dalia Chakrabarty
Senior Lecturer in Statistics
My D.Phil. (from St. Cross College, Oxford) was in Theoretical Astrophysics, and I was examined by Prof. James Binney. My doctoral thesis was dedicated to the development of a novel Bayesian learning method for learning the gravitational mass of the black hole at the centre of the Milky Way (along with the Galactic phase space density), and to the computational modelling of non-linear dynamical phenomena in galaxies. Thereafter, I continued to develop probabilistic learning methods and to undertake Bayesian inference within astronomical contexts until 2009, when I moved to Warwick Statistics and began developing Bayesian methodologies for application to diverse areas. After Warwick, I was a Lecturer in Statistics, first in Leicester Maths and then in Loughborough Maths, before moving to the Department of Mathematics at Brunel at the beginning of 2020. My current interest is strongly focused on the development of Bayesian learning methodologies for challenging data situations, such as data shaped as a hyper-cuboid, with diversely correlated components; absent training data; and data that is discontinuously distributed and/or changing with time. I am equally keen on learning graphical models and networks of multivariate datasets, as random geometric graphs, with the ultimate aim of computing the distance between a pair of learnt graphs. I am also interested in the development of Bayesian tests of hypotheses that are useful when computation within the alternative model is difficult or impossible, and I have recently initiated a method for optimising the mis-specified parameters of a parametric model while learning the desired model parameters. My current applications include areas such as healthcare, vino-chemistry, astronomy, test theory, and materials science. My research focuses on the development of methodologies within Computational & Mathematical Statistics, and I undertake inference primarily using Markov Chain Monte Carlo techniques.
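As a minimal illustration of the Markov Chain Monte Carlo techniques mentioned above, the sketch below implements a generic random-walk Metropolis-Hastings sampler. The target density (a standard normal), the proposal scale, and all names here are illustrative assumptions for the demonstration, not taken from my own methods.

```python
import math
import random

def metropolis_hastings(log_target, init, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings: draws a chain whose long-run
    distribution approximates the density proportional to exp(log_target)."""
    rng = random.Random(seed)
    x = init
    lp = log_target(x)
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)      # symmetric Gaussian proposal
        lp_prop = log_target(prop)
        # Accept with probability min(1, target(prop)/target(x)).
        if math.log(rng.random() + 1e-300) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Illustrative target: a standard normal "posterior" (an assumption for the demo).
draws = metropolis_hastings(lambda x: -0.5 * x * x, init=0.0, n_samples=20000)
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

In practice the `log_target` would be replaced by the log-posterior of the model at hand, and convergence diagnostics would be applied to the chain before summarising it.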
My research interests include Bayesian learning methods for different data situations, such as high-dimensional data; temporally-evolving and/or discontinuously distributed data; absent training data; and data that is large in size, or under-abundant. This has resulted in:
-- supervised learning methodologies for hypercuboidally-shaped, discontinuous and/or non-stationary data, using compounding of Gaussian Processes;
-- pursuit of graphical models & networks of multivariate data, as random graphs, followed by computing the distance between learnt graphical models, to inform on the inter-data correlation;
-- a novel method that allows for variable prediction at test data (when learning of the functional relation between variables is not possible), by embedding the sought variable vector within the support of the state space density;
-- new tests of hypotheses, motivated by the intractability sometimes encountered in the pursuit of this prediction without learning, in which we seek the probability of a simplifying model, conditioned on the data, where said simplification is undertaken to counter the intractability;
-- a recently developed 5-step method that optimises the mis-specification parameter vector in a parametric model, while Bayesianly learning the sought model parameters.
In addition, I have worked on developing a novel classification technique in lieu of training data, and on another occasion, trained the model for the causal relationship between the observable and covariates, using hierarchical regression. Applications of these methods are in Astronomy, Materials Science, Chemistry, Petrophysics, Testing theory, etc.
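The idea of learning a graph from multivariate data and then computing a distance between a pair of learnt graphs can be sketched crudely as follows. The thresholded-correlation graph and the Hamming-type edge distance used here are simple illustrative stand-ins, not my actual random-geometric-graph methodology.

```python
import numpy as np

def correlation_graph(data, threshold=0.5):
    """Learn a crude graphical model: nodes are the variables (columns),
    with an edge wherever the absolute sample correlation exceeds the
    threshold. This is an illustrative stand-in for a full Bayesian
    graph-learning step."""
    corr = np.corrcoef(data, rowvar=False)
    adj = (np.abs(corr) > threshold).astype(int)
    np.fill_diagonal(adj, 0)
    return adj

def edge_distance(adj_a, adj_b):
    """Hamming-type distance between two learnt graphs on the same node
    set: the fraction of possible edges on which the graphs disagree."""
    n = adj_a.shape[0]
    diff = np.abs(adj_a - adj_b)
    return diff[np.triu_indices(n, k=1)].mean()

rng = np.random.default_rng(1)
x = rng.normal(size=(500, 4))
x[:, 1] = x[:, 0] + 0.1 * rng.normal(size=500)   # one strongly correlated pair
y = rng.normal(size=(500, 4))                     # all variables independent
d = edge_distance(correlation_graph(x), correlation_graph(y))
```

Here the two datasets differ in exactly one of the six possible edges (the correlated pair in `x`), so the distance comes out as 1/6; a distance of 0 would indicate that the two datasets support identical dependence structures.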