Tag Archives: Particle Physics

Particle Flow Calorimetry

I took a break today from thesis writing to attend the weekly computing seminar, since it had calorimetry in the title, which is supposedly one of my skills. The slides aren’t posted yet, but whenever they are you can find them here. The speaker was Mark Thomson, who also has a few papers on the subject on the web; this one seems to have the most overlap with the talk.

The idea goes a bit like this: if you want to test new physics at a collider, you need to measure the collision products very carefully. Specifically, testing models of the electroweak sector (all this Higgs hubbub) requires identifying and differentiating Z and W bosons really well. Singly produced Z’s and W’s both decay into quark-antiquark pairs. We can’t see quarks in detectors directly, but we observe sprays of particles called jets, which have basically the same energy and direction as the original quark. So they want to measure jets really well at the proposed and under-development International Linear Collider (ILC), and the main technique they are trying to develop to do this is called Particle Flow Calorimetry (PFC).
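To see roughly why this helps, here is a back-of-the-envelope resolution budget. The specific fractions and the form of the error budget are typical ILC-detector-study numbers quoted from memory, not from the talk, so treat them as illustrative:

```latex
% A typical jet's energy is carried roughly as follows:
%   ~60% charged hadrons -> measured in the tracker (excellent resolution)
%   ~30% photons         -> measured in the ECAL
%   ~10% neutral hadrons -> measured in the HCAL (worst resolution)
% Particle flow measures each component in the best-suited subdetector,
% so the jet energy resolution is, schematically,
\[
  \sigma_{\text{jet}}^2 \;\approx\;
  \sigma_{\text{track}}^2 \;+\; \sigma_{\gamma}^2 \;+\; \sigma_{h^0}^2
  \;+\; \sigma_{\text{confusion}}^2 ,
\]
% where the "confusion" term (calorimeter energy assigned to the wrong
% particle) is what actually limits the method. This is why PFC calls for
% highly granular calorimeters rather than just good intrinsic resolution.
```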

Continue reading


The end of the affair….

I’m part of a collaboration that built and operated a particle detector at a collider which finished taking data in July of last year. I wasn’t here to build it, not by a long shot, but I did run and repair part of it during the last two years of data taking. Now I’m only analysing data for my thesis, not doing any hardware work anymore, and I’m finding it’s a mixed blessing. Continue reading

A Non-Technical Explanation of Flavo(u)r Physics

The extra “u” is for Helen. 

I figured that with all of Homer’s excited postings about the new physics CKM fit, the blog could benefit from an (over)simple introduction to why flavor physics is interesting and important. The standard model of particle physics has six known types, called flavors, of quarks with somewhat fanciful names: up, down, charm, strange, top, and bottom. These quarks come in what’s known as “generations”: up/down, charm/strange, and top/bottom are the three generations. Because of this generational structure, an up quark really “wants to” interact with a down quark (and a W boson), a charm quark really wants to interact with a strange quark, and so on. However, there’s also a small chance that an up quark will interact with a strange quark or a bottom quark (and a W boson). In fact, any of the up-type quarks (up, charm, top) can interact with any of the down-type quarks (down, strange, bottom), although the dominant interaction is within a generation (why this is so remains a mystery). The relative strengths of these interactions are parametrized, in the standard model, by what’s known as the CKM (Cabibbo-Kobayashi-Maskawa) matrix.

Now, it turns out that you can specify the CKM matrix with three real angles and one complex phase. The fact that part of the matrix is complex is extremely important (and wouldn’t happen, for example, if there were only two generations). What it means in nature is that certain processes happen at different rates than their “mirror” images, where the definition of “mirror” is slightly technical. For example, a positively charged pion decaying to a positively charged muon and a muon neutrino is the mirror process of a negatively charged pion decaying to a negatively charged muon and a muon antineutrino. The complex phase in the CKM matrix predicts that, for certain pairs of mirror processes like these, the rates are slightly different!
This phenomenon, of nature not being mirror symmetric, is known as CP violation, and it rocked the physics world when it was discovered experimentally in the 1960s. In the standard model, all measured CP violation is thought to come from the complex phase in the CKM matrix.
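To make the generational hierarchy and the single phase concrete, here is the standard Wolfenstein parametrization of the CKM matrix, valid to order λ³ (standard textbook material, added here for illustration rather than taken from the original post):

```latex
\[
V_{\mathrm{CKM}} =
\begin{pmatrix}
  V_{ud} & V_{us} & V_{ub} \\
  V_{cd} & V_{cs} & V_{cb} \\
  V_{td} & V_{ts} & V_{tb}
\end{pmatrix}
\approx
\begin{pmatrix}
  1-\tfrac{\lambda^2}{2} & \lambda & A\lambda^3(\rho - i\eta) \\
  -\lambda & 1-\tfrac{\lambda^2}{2} & A\lambda^2 \\
  A\lambda^3(1-\rho - i\eta) & -A\lambda^2 & 1
\end{pmatrix},
\qquad \lambda \approx 0.22 .
\]
% The diagonal (within-generation) entries are close to 1, the off-diagonal
% entries are suppressed by powers of lambda, and all of the CP violation
% lives in the single parameter eta (the imaginary parts).
```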

Now, there are many good reasons to think that at energies that will soon be reached by the new LHC (Large Hadron Collider), we will see new physics and particles beyond the standard model. In most extensions of the standard model, the CKM matrix has to be made a bit larger to incorporate the new particles. In general, though, when we do this, we also must add new complex phases to the extended CKM matrix. For example, if we do the stupidest possible extension, just adding another, more massive generation of quarks (and leptons), the extended CKM matrix now has three complex phases. These new complex phases will contribute to CP violating observables; the CP violation in nature will be more than is predicted by the standard model alone. In basically any viable extension of the standard model, say supersymmetry, there are new complex phases running around everywhere, contributing to CP violation. In fact, many models can be ruled out experimentally precisely because we have not seen the amount of CP violation predicted by their orgy of complex phases.

The name of the experimental game is this: you measure a bunch of CP violating observables, as many as you can get your hands on. You take your experimental CP violating data and assume that all CP violation comes from the standard model only. Using this, you can translate your experimental result into a statement about some part of the CKM matrix, and translate that in turn into a nice picture called the unitarity triangle. If you do an experiment that measures some decay, you can maybe translate that result into a statement of where the apex of the triangle sits in the picture. If you measure some other decay and translate it into a result about the apex using the standard model, you should get the same place that you got with the first experiment. If you do forty different experiments that measure the apex, and the standard model is correct, they should all tell you that the apex is in the same place. But if you do a forty-first experiment and it tells you that the apex is somewhere wildly different in the picture, then either you made some mistake, or your assumption, that the standard model is the only source of CP violation, was incorrect. It has long been the hope that by making various measurements on the triangle in this way, some inconsistent result would be a signal of physics beyond the standard model. Until now, unitarity triangle measurements have been maddeningly consistent with the amount of CP violation predicted by the standard model alone.
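For the curious, the triangle in this picture comes from one of the unitarity conditions of the CKM matrix (again standard material, included for illustration):

```latex
\[
  V_{ud}V_{ub}^{*} \;+\; V_{cd}V_{cb}^{*} \;+\; V_{td}V_{tb}^{*} \;=\; 0 .
\]
% Three complex numbers summing to zero form a closed triangle in the
% complex plane. Dividing through by V_cd V_cb* fixes the base from (0,0)
% to (1,0), so every measurement becomes a constraint on where the apex,
% conventionally written (rho-bar, eta-bar), is allowed to sit. If the
% standard model is the whole story, all constraints must overlap at one point.
```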

Until now? The paper that Homer keeps talking about claims that there is a likelihood of new physics based on averaging various data sets of triangle measurements. I haven’t talked to anybody in the field who knows whether this result is really exciting, but there have been whispers of new physics from the flavor community before, only to have some mistake found and the standard model endure. So I’m taking a cautiously skeptical attitude until the numbers get a bit better than three sigma. Could be exciting though…