Searches for Lepton Universality violation at LHCb

Over the last four decades, an extensive series of experimental measurements---recently culminating in the discovery of the Higgs boson---has shown that the Standard Model (SM) of particle physics is an almost complete description of all physical phenomena except gravity. However, some observations remain unexplained, such as the fact that most of the gravitational pull in the universe is due to something we have not been able to observe in any other fashion (dark matter), or why our universe contains so much more matter than antimatter. I am broadly interested in addressing these foundational questions, and my tool of choice is the LHCb experiment at the CERN LHC in Geneva, Switzerland.

As part of the UMD flavor physics group, my research focuses on measurements that test lepton flavor universality (LFU), a fundamental assumption within the SM that states that the interactions of all charged leptons (electrons, muons, and taus) differ only because of their different masses. During my PhD at BaBar, we found the first evidence for an excess of B→D(*)τν decays, significant at the 3.4 σ level. Subsequent measurements by the Belle and LHCb experiments have found rates for decays that contain b→cτν (b→sμμ) transitions consistently above (below) the SM expectations when compared to the decay rates to lighter leptons, thus seemingly violating LFU. I co-wrote a detailed review of the situation with the b→cτν transitions in Reviews of Modern Physics and summarized the 2021 status of both of these anomalies in a seminar and a colloquium.

LFU anomalies 2021

Currently my group is working to take advantage of the very large samples of B mesons collected by the LHCb experiment to measure with unprecedented precision a host of B meson decays with tau leptons in the final state that could be key to making sense of the current anomalies.

Next generation of simulation techniques aided by machine learning

Modern analysis of particle collider data requires enormous quantities of Monte Carlo (MC) simulations of the collisions and of the interactions of their decay products with the various subdetectors. As particle colliders produced larger and larger data samples over the last few decades, the production of simulated events kept up thanks to Moore's law and its attendant increase in computing power. However, with Moore's law slowing down and the LHC poised to take a giant step forward in the size of its data sets thanks to the advent of the High Luminosity LHC (HL-LHC), new ideas are needed to produce sufficient samples of simulated events.

The solution will probably come from a combination of heterogeneous computing (e.g., CPUs, GPUs, FPGAs), massive parallelization, and enhancements made possible by the recent advances in machine learning. We have used machine learning in high energy physics for many years now, for instance Boosted Decision Trees and shallow neural networks. But the new generation of Deep Neural Networks (DNNs) has the potential to take these applications one step further, the way they did for image or speech recognition. Already they are helping us classify bottom and top quarks more effectively and, in some cases, separate signal from background as well. Generative techniques such as Generative Adversarial Networks (GANs) or normalizing flows may have the ability to produce very large amounts of simulated events employing orders of magnitude less computing power than standard MC methods. I am interested in exploring the architectures and training techniques, as well as the uncertainty estimation methods, required to make ML-aided generation of simulated events a reality, at least in some applications.
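To make the core idea concrete, below is a minimal from-scratch sketch of a GAN: a generator network learns to reproduce a one-dimensional Gaussian "observable" (standing in for an expensive fully simulated quantity) by playing against a discriminator that tries to tell real from generated events. The network sizes, learning rate, and target distribution are illustrative choices for this toy, not anything used in actual LHCb production.

```python
import numpy as np

rng = np.random.default_rng(0)

def real_samples(n):
    # Toy stand-in for events produced by full Monte Carlo simulation.
    return rng.normal(loc=4.0, scale=1.25, size=(n, 1))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

H, LR, BATCH, STEPS = 16, 0.02, 64, 2000

def init_net():
    # One-hidden-layer network with tanh activation (illustrative sizes).
    return {"W1": rng.normal(0, 0.5, (1, H)), "b1": np.zeros(H),
            "W2": rng.normal(0, 0.5, (H, 1)), "b2": np.zeros(1)}

G, D = init_net(), init_net()  # generator and discriminator

def g_forward(z):
    h = np.tanh(z @ G["W1"] + G["b1"])
    return h, h @ G["W2"] + G["b2"]

def d_forward(x):
    h = np.tanh(x @ D["W1"] + D["b1"])
    return h, sigmoid(h @ D["W2"] + D["b2"])

def d_grads(x, y):
    # Gradients of the binary cross-entropy w.r.t. D's weights and its input x.
    n = x.shape[0]
    h, p = d_forward(x)
    dlogit = (p - y) / n
    dh = (dlogit @ D["W2"].T) * (1 - h ** 2)
    grads = {"W1": x.T @ dh, "b1": dh.sum(0),
             "W2": h.T @ dlogit, "b2": dlogit.sum(0)}
    return grads, dh @ D["W1"].T

def sgd(net, grads):
    for k in grads:
        net[k] -= LR * grads[k]

for step in range(STEPS):
    z = rng.normal(size=(BATCH, 1))
    _, fake = g_forward(z)
    # Discriminator step: real events labeled 1, generated events labeled 0.
    gr, _ = d_grads(real_samples(BATCH), np.ones((BATCH, 1)))
    gf, _ = d_grads(fake, np.zeros((BATCH, 1)))
    sgd(D, gr)
    sgd(D, gf)
    # Generator step (non-saturating loss): push D(G(z)) toward 1.
    hg, fake = g_forward(z)
    _, dx = d_grads(fake, np.ones((BATCH, 1)))
    dhg = (dx @ G["W2"].T) * (1 - hg ** 2)
    sgd(G, {"W1": z.T @ dhg, "b1": dhg.sum(0),
            "W2": hg.T @ dx, "b2": dx.sum(0)})

# Once trained, sampling is a single cheap forward pass per event.
z = rng.normal(size=(5000, 1))
_, gen = g_forward(z)
print(f"generated: mean {gen.mean():.2f}, std {gen.std():.2f} (target 4.00, 1.25)")
```

The computational payoff is in the last three lines: generating an event is one forward pass through a small network rather than a full detector simulation. Real applications (e.g., calorimeter-shower generation) use much larger networks and must also tackle the uncertainty estimation question raised above, since a generative model can only be trusted where its fidelity has been validated.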

GAN

Readout electronics (PEPI) for the Upstream Tracker

During the 2019-20 long shutdown of the LHC, the LHCb detector will undergo a sweeping upgrade of all its subsystems to be able to cope with particle rates five times larger than before. The Upstream Tracker (UT) is a vital part of this upgrade and is also the first major US contribution to the LHCb detector. The UMD group is responsible for the development and construction of the readout electronics of the UT, which include the Data Concentrator Boards (DCBs), the PEPI backplane, and the linear regulator block. The signals from the silicon detectors are routed through the PEPI backplane to the DCBs and then shipped to the LHCb data acquisition system that sits in the counting rooms on the surface. The linear regulator block controls and provides low-voltage (LV) power to the PEPI crates and to the electronics on the silicon sensors.

This is a project that is perfectly suited for both undergraduate and graduate students to contribute to as it involves a wide set of skills in hardware, firmware, and software development. These are skills that not only lead to a deeper understanding of how particle physics measurements are done but also to enhanced job prospects outside of academia.

UMD people building UT

Selected publications

F. U. Bernlochner, M. Franco Sevilla, D. J. Robinson, and G. Wormser. Semitauonic b-hadron decays: A lepton flavor universality laboratory (2021). Accepted by Reviews of Modern Physics. arXiv: 2101.08326

CMS Collaboration. Search for Higgsino pair production in pp collisions at √s = 13 TeV in final states with large missing transverse momentum and two Higgs bosons decaying via H → bb.
Phys. Rev. D 97 (2018), 032007. arXiv: 1709.04896

G. Ciezarek et al. A Challenge to Lepton Universality in B Meson Decays.
Nature 546 (2017), 227–233. arXiv: 1703.01766

CMS Collaboration. The Phase-2 Upgrade of the CMS Muon Detectors.
CERN-LHCC-2017-012, CMS-TDR-016 (2017). Main editor of Chapter 4 on CSCs.

CMS Collaboration. Search for Supersymmetry in pp Collisions at √s = 13 TeV in the Single-Lepton Final State Using the Sum of Masses of Large-Radius Jets.
Phys. Rev. Lett. 119 (2017), 151802. arXiv: 1705.04673

BABAR Collaboration. Evidence for an excess of B → D(*)τν decays.
Phys. Rev. Lett. 109 (2012), 101802. arXiv: 1205.5442