Description
The Large Hadron Collider (LHC) at CERN is one of our most powerful tools for probing the fundamental particles of nature and their interactions. By colliding protons at extremely high energies (13 TeV centre-of-mass energy), the LHC recreates conditions similar to those of the early universe just after the Big Bang. Particle detectors, such as the Compact Muon Solenoid (CMS) experiment, are designed to reconstruct proton collisions from the complex systems of particles produced in them and so recreate the fundamental interaction that occurred. Detectors like CMS record huge quantities of data to do this, and as experimental particle physicists, our job is to analyse these data to determine whether some new particle (like the Higgs boson) or process can be seen. Producing and collecting data at the LHC is expensive, so making the most of the data we have is vital in our field. In this talk, I will discuss how we analyse data from experiments like CMS and how such analysis led to the discovery of the Higgs boson. I will also cover techniques from machine learning and data science that have been used to analyse our data, as well as new methods being proposed to analyse CMS data and potentially discover new physics in future data-taking runs of the LHC.