Office: 3245 BPS
Department of Physics and Astronomy
Michigan State University
3245 Biomedical Physical Sciences
East Lansing, MI 48824-2320
Currently, I'm teaching Physics 215,
an introduction to modern (20th century) physics for majors and others who've
taken the introductory calculus-based sequence. In the past, I've also taught
introductory physics for science majors (using algebra and trigonometry)
and the corresponding laboratory courses 251 and 252; and the lab courses 191 and 192, an
introduction to quantitative measurement for physics majors, engineers, and
other scientists. I've also taught
an introduction to particle physics; electronics; advanced
laboratory; and graduate-level statistical data analysis.
I'm looking forward to teaching a graduate-level introduction to
particle astrophysics. If you're curious why I spend more time
teaching elementary courses than advanced ones, the reason is simple:
there are a lot more students in beginning courses than there are
physics majors or PhD students.
I'm an experimenter in High Energy Physics and Astroparticle Physics. I am currently spending my time on the HAWC experiment. HAWC does gamma ray astronomy with photons of an energy of 1 TeV or more. Only rather exotic astronomical objects can generate such high energies: blazars (radiating supermassive black holes at the centers of other galaxies), or supernova remnants, such as the radiation associated with highly magnetized neutron stars. Our sun, as far as 1 TeV light is concerned, is more notable as an absorber than as a source. So our key aim is to extend the catalog of these rare sources (fewer than 200 known so far). We also look for violent transient phenomena, such as gamma ray bursts, and search for even more exotic primordial black hole decays, which might reveal themselves through never-before-observed Hawking Radiation.
I was in charge of the electronics of HAWC during its construction
phase, and involved in its remote monitoring. We much prefer to
operate and monitor it from the comfort of our offices, as it's at 13,500 feet
altitude in Mexico, nestled between a nice quiet pair of volcanoes. Milagro, a predecessor experiment, was sited on the ring-shaped
rim of an enormous volcano
in New Mexico, but "only" at 8650 feet altitude. Going higher lets
us "see" more of the left-overs (hitting our detector at ground
level) from the shower of particles which result when these very energetic
cosmic rays interact high in our atmosphere (starting about 20,000 feet, so
above where airplanes fly). To
build the detector, we had to truck up the mountain enough purified
water to give every man, woman, and child in Mexico a bottle. The
water is the W in HAWC; HA is for High Altitude, and C is for Cherenkov light, which is what we detect in the water when the showers from cosmic rays hit our detector.
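The Cherenkov light mentioned above follows a simple textbook relation: a charged particle radiates when it outruns light's phase velocity in the medium, and the light comes out on a cone whose angle depends only on the refractive index and the particle's speed. A small sketch, using the standard approximate value n = 1.33 for water (the numbers here are illustrative, not HAWC-specific):

```python
import math

# Cherenkov radiation: emitted when a charged particle's speed beta = v/c
# exceeds 1/n, on a cone with opening angle cos(theta) = 1 / (n * beta).
N_WATER = 1.33  # approximate refractive index of water

def cherenkov_angle_deg(n=N_WATER, beta=1.0):
    """Opening angle of the Cherenkov cone, in degrees."""
    cos_theta = 1.0 / (n * beta)
    if cos_theta > 1.0:
        raise ValueError("below Cherenkov threshold: beta < 1/n")
    return math.degrees(math.acos(cos_theta))

beta_threshold = 1.0 / N_WATER  # slowest speed (fraction of c) that still radiates
angle = cherenkov_angle_deg()   # about 41 degrees for a highly relativistic particle
```

So the fast shower particles light up the water with cones of blue light at roughly 41 degrees, which is what the photomultiplier tubes in the tanks record.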
My current graduate student, Sam Marinelli, has greatly improved the accuracy of HAWC's measurements of the energy of these high-energy photons, and he's preparing to compare the energy distributions emitted by different kinds of astronomical sources. My previous student Udara Abeysekara wrote firmware (detailed hardware instructions) for the HAWC timing and GPS control system, and studied the relationship between light emitted directly from a neutron star and light from the nebula surrounding it. I've been involved in studies to see whether we might be able to use Hawking Radiation as an accelerator to produce supersymmetric particles.
For much of my time at MSU, my main effort has been in the D0 experiment, a collider experiment at the Fermilab Tevatron. I often work intensively with computers, particularly in the area of triggering. A hadron collider can produce as many as a million proton-antiproton collisions in a second. Since our apparatus records more than 1/4 megabyte of information about each interaction, they can't all be recorded and analyzed in detail. "Triggering" the apparatus means deciding which of these interactions are interesting enough to keep. In our previous data run, we recorded 2 events per second; now we're up to 100 per second. The rejection of "less interesting" events takes place in 3 levels of triggering. The first level lasts a few microseconds (millionths of a second), and requires sophisticated custom hardware. Luckily, our electronics shop and its talented designers (Dan Edmunds and Philippe Laurens) specialize in just this sort of thing, and MSU built a large fraction of the trigger for D0. The second level of triggering is more relaxed in pace, allowing up to 100-1000 microseconds per event. This usually is carried out in a mixture of specialized hardware and carefully written software. The third level takes place in more conventional computers, albeit with strict requirements on memory usage and speed. Here the time scale is 50-200 milliseconds per event for a decision.
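The arithmetic behind that funnel can be sketched in a few lines. The text fixes only the endpoints: about a million collisions per second coming in, about 100 events per second written out, and roughly 0.25 MB recorded per event; the intermediate Level 1 and Level 2 accept rates below are illustrative assumptions, not D0's actual settings.

```python
# Back-of-envelope for a three-level trigger chain.
collision_rate_hz = 1.0e6   # proton-antiproton collisions per second
event_size_mb = 0.25        # data recorded per interaction

# Assumed accept rate out of each level (events/second); only Level 3 is
# anchored to the text, the other two are made-up but plausible stages.
accept_rate_hz = {"Level 1": 10_000.0, "Level 2": 1_000.0, "Level 3": 100.0}

rate_in = collision_rate_hz
for level, rate_out in accept_rate_hz.items():
    print(f"{level}: {rate_in:,.0f}/s in -> {rate_out:,.0f}/s out "
          f"(rejection ~{rate_in / rate_out:.0f}x)")
    rate_in = rate_out

# Keeping everything would mean ~250 GB every second; after triggering,
# only a modest stream actually reaches storage.
raw_rate_mb_s = collision_rate_hz * event_size_mb            # 250,000 MB/s
written_mb_s = accept_rate_hz["Level 3"] * event_size_mb     # 25 MB/s
overall_rejection = collision_rate_hz / accept_rate_hz["Level 3"]
```

The point of the staged design is that each level only has to be as fast as the rate it actually sees: the brutal microsecond decisions happen at Level 1, and the slow, thoughtful ones at Level 3.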
In our previous data run, I was in charge of the physics algorithms for the third level trigger selecting events online: we had 200 milliseconds per event to decide which 2% of events to keep. Our programs ran in an array of 48 microcomputers, and processed more than 10^9 events. Looking for a further challenge, I moved up to the second level trigger for the next run (2000-2003), and had a hand in examining nearly 10^12 events. MSU postdocs wrote code (in C++, with a specially modified version of the Linux operating system) for special versions of PC processors to control data flow and carry out the triggering decisions. The level 2 trigger is the first opportunity to carefully match up information from different parts of the apparatus. This allows us to verify whether a set of signals is really consistent with, for example, the track of a high energy electron, which would be a good tipoff that this event is interesting and unusual. Real trigger decisions involve several such conditions, in up to 4096 different combinations (luckily, we don't go much beyond 600, about one per collaborator!).
For the last couple of years, I've been coordinator for D0's databases, ranging from the database we use to keep track of how all those triggers were set up for each data run we've taken, to a 1 terabyte database we use to keep track, minute by minute, of the intensity of the particle beams the Fermilab accelerator complex has provided to us. That's the equivalent of several hundred movie DVDs' worth of data captured since the start of this run in 2001.
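The DVD comparison is easy to sanity-check. A single-layer DVD holds 4.7 GB; the assumed 3 GB for a typical movie file is my own illustrative number, not from the text:

```python
# How many DVDs is a 1 terabyte database?
database_gb = 1000.0     # ~1 TB, from the text
dvd_capacity_gb = 4.7    # single-layer DVD capacity
movie_gb = 3.0           # assumed size of a typical movie file

full_dvds = database_gb / dvd_capacity_gb   # ~213 completely full discs
movies = database_gb / movie_gb             # ~333 typical movies
```

Either way you count, "several hundred" is the right ballpark.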
In the course of this work, I've also developed interests in software engineering, the art of making reliable programs to do physics. I have also developed a fascination with statistics, the basis for many of the techniques we use both to tell signal from background events and to analyze our data. Since the interactions we measure are directly governed by quantum mechanics, and since we often seek rare occurrences, we almost always use statistical techniques to measure rates, test theories, and set limits on occurrences of rare processes. Dennis Gilliland and Raoul LePage of the MSU statistics department are among the people I've worked with on these problems. My picture above was taken in Canada when my family joined me after a statistics workshop.
I've been involved with several different analyses of D0 data. I had the great pleasure and excitement of participating in the discovery of the top quark in 1995. With my first thesis student, we measured the Drell-Yan process, a rare QCD process producing pairs of electrons from proton-antiproton collisions (MSU theorist C.P. Yuan, a member of the CTEQ phenomenology collaboration, is one of the world experts in the theory of this process). My next students searched for supersymmetric (SUSY) particles, in particular squarks and gluinos decaying into electrons or muons; here are the latest ATLAS SUSY results. We applied a systematic framework for interpreting the results to set limits on the possible parameters of supersymmetric models, which would extend and unify the standard model of particle physics. We also tried searches for deviations from expectations without restricting them to a specific model. The technique is quite thorough, and does many searches in parallel, though it does give up a bit of sensitivity compared to specific targeted searches. You can see some of the physics highlights from the D0 physics run.
In 2003, I spent a 7-month sabbatical leave in New Mexico at Los Alamos. As a result, I joined the Milagro experiment, which searched for cosmic rays, high-energy particles from space which hit the earth constantly. The particular kind we looked for comes in the form of extremely high energy photons. "TeV gamma rays" is the technical jargon: particles of light packing more energy than one of the protons in the Fermilab Tevatron collider beams I usually work with. These photon cosmic rays are rarer by a factor of a thousand than the usual cosmic rays of the same energy, most of which are protons, not photons. But the photons are interesting, because their direction isn't confused by magnetic fields in the galaxy, so they can point back to interesting astrophysical sources. And there's a good professional challenge in using statistical techniques to separate out these interesting signals. In 2007, my graduate student Aous Abdo finished a thesis in which he improved the overall sensitivity of Milagro by a factor of two: it was like suddenly having four times as much data (that's not a typo; how two becomes four is a tale of Poisson statistical fluctuations!). In spring of 2008, data taking (but not data analysis) on Milagro stopped and we started work on HAWC, a successor experiment with over ten times the sensitivity.
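The "two becomes four" tale can be made concrete. In a counting search dominated by Poisson fluctuations of the background, the significance of a signal is roughly S divided by the square root of B. Both the expected signal S and the background B grow linearly with exposure, so significance grows only as the square root of the exposure, and quadrupling the data doubles the significance. A minimal sketch, with made-up illustrative rates:

```python
import math

def significance(signal_per_day, background_per_day, days):
    """Gaussian approximation to Poisson significance: S / sqrt(B)."""
    s = signal_per_day * days
    b = background_per_day * days
    return s / math.sqrt(b)

# Illustrative rates: 1 signal event/day on top of 100 background events/day.
one_year = significance(1.0, 100.0, days=365.0)
four_years = significance(1.0, 100.0, days=4 * 365.0)
ratio = four_years / one_year   # 2: four times the data, twice the significance
```

Run the logic in reverse and you get the thesis result: an analysis that doubles the significance at fixed exposure is worth the same as quadrupling the amount of data.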
From 2010-2014, I worked on the next-generation collider in Europe with the Atlas experiment, at the CERN
LHC accelerator complex in Geneva, Switzerland, where I did my
postdoctoral work. The LHC is the highest-energy accelerator in the
world for at least the next decade. I coordinated simulations of the
upgrades to the Atlas calorimeter trigger, which were installed in 2014
and are part of data taking at the full LHC energy (13 TeV collision
energy). A student of mine, James Koll, measured
the production of top quarks at the LHC in his thesis.
CERN, by the way, is the particle physics lab where the World Wide Web was invented as a tool to help communication within international collaborations to do particle physics, such as D0 and Atlas. Now that's what I call a spinoff!
Besides time with my family, I find music, especially jazz, a great pleasure. I serve on the board of Happendance, a local professional modern dance troupe with an excellent company school. Happendance not only performs for the public, but also has an active outreach program, performing and teaching motion in schools.
Updated November 26, 2015