My work centers on uncertainty quantification from a broad perspective, with a particular focus on the study of extreme events. My expertise lies at the intersection of computational mathematics, stochastic dynamical systems, machine learning, data assimilation, and extreme events. I consider myself a researcher who takes a holistic approach, with the perspective that newly developed methods need practical demonstration on hard problems in real-world applications and engineering tasks.
Most of my work has been on developing efficient methods to quantify extreme events (bursting phenomena) in physical systems. My research here has involved developing new quantification strategies that blend ideas from probability theory and dynamical systems theory. Most recently, I worked on a new technique utilizing Gaussian process regression, a popular machine learning method, to develop efficient sampling algorithms that target heavy-tailed output statistics.
Another recent topic of interest is data assimilation and Bayesian parameter estimation from noisy measurement instruments. In particular, I have been working on this topic in the context of predicting flow fields from measurements obtained by instruments (e.g., tracer particles) that are themselves advected by the flow. This is a practically important problem in the assimilation of tracer-particle observations used to track oceanic and atmospheric flows, both to understand various mechanisms in the climate and for state prediction.
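As a toy illustration of the setting (again a sketch, not my actual method): noisy positions of tracers advected by a flow are assimilated with an ensemble Kalman filter on an augmented state, which recovers an unknown drift speed of the flow. The scalar-drift flow model, ensemble sizes, and noise levels are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
a_true = 1.5                    # unknown drift speed of the flow (to estimate)
dt, n_steps = 0.1, 20
n_ens, n_tracers = 50, 5
obs_noise = 0.05

# Truth: tracers advected by dx/dt = a_true
x_true = rng.uniform(0, 1, n_tracers)

# Ensemble: each member carries tracer positions plus a drift estimate
x_ens = np.tile(x_true, (n_ens, 1)) + rng.normal(0, 0.05, (n_ens, n_tracers))
a_ens = rng.normal(1.0, 0.5, n_ens)      # prior guess for the drift

for _ in range(n_steps):
    # Forecast: advect both the true tracers and each ensemble member
    x_true = x_true + dt * a_true
    x_ens = x_ens + dt * a_ens[:, None]

    # Observe noisy tracer positions
    y_obs = x_true + rng.normal(0, obs_noise, n_tracers)

    # EnKF analysis on the augmented state (positions, drift);
    # only positions are observed, the drift is updated via correlations
    state = np.column_stack([x_ens, a_ens])       # (n_ens, n_tracers + 1)
    A = state - state.mean(axis=0)
    H = np.zeros((n_tracers, n_tracers + 1))
    H[:, :n_tracers] = np.eye(n_tracers)
    P = A.T @ A / (n_ens - 1)
    S = H @ P @ H.T + obs_noise**2 * np.eye(n_tracers)
    K = P @ H.T @ np.linalg.solve(S, np.eye(n_tracers))
    perturbed = y_obs + rng.normal(0, obs_noise, (n_ens, n_tracers))
    state = state + (perturbed - state[:, :n_tracers]) @ K.T
    x_ens, a_ens = state[:, :n_tracers], state[:, -1]
```

After the assimilation cycles, the ensemble mean of `a_ens` concentrates near the true drift: the filter never observes the flow parameter directly, only the positions of the advected instruments, which is exactly the structure that makes this problem interesting.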
I am a strong advocate of and contributor to open-source software, in particular the Julia language and packages in the Julia ecosystem. My projects and contributions are available on GitHub.