Hunting Neutrinos with Big Detectors and Super-Fast Computers
Coding Particle Physicist

About


That's me in front of the DCTPC prototype detector at MIT. DCTPC was a small prototype for directional fast neutron detection, and it was a really fun hardware project. You can read more about it here.

I am a high energy physicist working in the field of neutrino experiments. Neutrinos are elementary particles that only participate in "weak interactions," which means they can easily pass through matter (like you, or even the entire Earth) undetected. Fortunately, there are usually a lot of them.

To see neutrinos we work with big, BIG, and even BIGGER detectors. It is an exciting job to build and operate these detectors using cutting-edge technologies. An even more fun challenge is to find a rare signal in a big chunk of data. I enjoy making these things work in general, and the software aspect of the effort in particular.

I am currently involved in the MicroBooNE experiment as an associate staff scientist in the Elementary Particle Physics (EPP) division at SLAC National Accelerator Laboratory. Previously I worked on the same experiment as a post-doctoral research scientist at Nevis Laboratories, Columbia University. Before MicroBooNE, I worked on Double Chooz during my Ph.D. at MIT, and on KamLAND during my undergraduate studies at U.C. Berkeley.

My current focus is applying machine learning techniques, in particular deep neural networks, to data reconstruction tasks and physics analysis.

Here is my CV, and a list of publications. I have also put together a list of software development efforts to which I have contributed as a lead code developer.

Research


MicroBooNE

I am currently a member of the neutrino group in the Elementary Particle Physics (EPP) division at SLAC National Accelerator Laboratory. My main research focus is on the Deep Underground Neutrino Experiment (DUNE) and the Short Baseline Neutrino (SBN) program, in particular the MicroBooNE and ICARUS experiments.

We use a liquid argon time projection chamber (LArTPC) to detect electron neutrinos produced by the Booster Neutrino Beamline, an accelerator complex at Fermi National Accelerator Laboratory. The physics objective of MicroBooNE is to investigate the nature of the excess of electron neutrino events observed by the MiniBooNE experiment, which is hard to explain with our current understanding of neutrinos.

A LArTPC is essentially a "camera" that produces images of charged particles traveling through the detector. This detailed image data can tell us more about the physics nature of the observed excess, and shed light on its implications for our understanding of this elusive elementary particle.



Event Reconstruction

Though the LArTPC detector provides great detail about particle interactions inside the detector, interpreting those details is a non-trivial task. In order to extract high-level physics information, we must first "reconstruct" each neutrino event: we combine multiple 2D projections of the same event to understand the 3D topology and calorimetric nature of the particles. The movie shown above is the result of running the WireCell reconstruction tool, which employs very complicated hand-written algorithms to generate this stunning view. However, this step takes a great deal of computation time, and the clustering of dots and the recognition of particles are yet to be performed.
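To give a flavor of what "combining 2D projections" means, here is a deliberately minimal sketch: each wire plane reports hits as (wire, time tick) pairs, hits with the same drift time are paired across two planes, and the geometric intersection of the two wires gives a 3D point. The wire angles, pitch, and drift speed below are made-up illustrative values, not MicroBooNE's actual geometry, and real reconstruction (as in WireCell) handles noise, ambiguities, and a third plane.

```python
import math

def wire_line(angle_deg, wire_idx, pitch=0.3):
    """Represent a wire as a line {p : n . p = d} in the (y, z) plane."""
    a = math.radians(angle_deg)
    return (math.cos(a), math.sin(a)), wire_idx * pitch

def intersect(wire_u, wire_v, angle_u=60.0, angle_v=-60.0):
    """Intersect one wire from each of two planes; returns (y, z)."""
    nu, du = wire_line(angle_u, wire_u)
    nv, dv = wire_line(angle_v, wire_v)
    det = nu[0] * nv[1] - nu[1] * nv[0]
    y = (du * nv[1] - dv * nu[1]) / det
    z = (nu[0] * dv - nv[0] * du) / det
    return y, z

def match_hits(hits_u, hits_v, drift_speed=0.16):
    """Pair hits (wire, tick) with equal drift time across two planes.

    drift_speed (cm/tick) is an illustrative number; the drift
    coordinate x is simply tick * drift_speed.
    """
    by_tick = {}
    for wire, tick in hits_u:
        by_tick.setdefault(tick, []).append(wire)
    points = []
    for wire_v, tick in hits_v:
        for wire_u in by_tick.get(tick, []):
            y, z = intersect(wire_u, wire_v)
            points.append((tick * drift_speed, y, z))
    return points
```

In the real detector the time coincidence is never exact and many wire pairs intersect spuriously, which is exactly why this step is computationally expensive.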



Deep Learning

A trained deep neural network (AlexNet + Faster-RCNN) detecting the local region (red box) in which it thinks a neutrino interaction exists. The yellow box is the right answer drawn by us, and is hidden when the network analyzes this image.




I am attacking this event reconstruction and physics analysis challenge from a different perspective using machine learning algorithms. In particular, I am interested in applying Deep Learning, the state-of-the-art machine learning approach in the fields of artificial intelligence and computer vision, to analyze LArTPC image data. Deep learning has found a vast number of applications, ranging from automated human face recognition and real-time object detection for self-driving cars to teaching a robot Chinese and even playing Go. So I thought it was time to do this in physics!

In MicroBooNE, we published a paper reporting the field's first application of the technique to LArTPC image data. The result clearly showed that computers "learned" useful features in a LArTPC image to perform a given task.

I founded the DeepLearnPhysics group, a small collaboration of theoretical and experimental physicists developing machine learning for our research. My main interest is the development of a high-quality data reconstruction chain for LArTPCs and other particle imaging detectors.



Work on Electronics

MicroBooNE collects TPC and PMT data from a total of 10 readout crates. Our group was responsible for the production, installation, and commissioning of the readout electronics boards.


I co-led the group for the development, installation, and commissioning of the readout electronics hardware for the MicroBooNE experiment. You can read about the details of the system here. A brief description is given below.

When the accelerator complex notifies us of the arrival of the neutrino beam, the trigger module issues a signal to all readout electronics modules to read out data. There are a total of 8,256 TPC wires and 64 PMT channels to be read out. For each event, the front-end modules (FEMs) record a TPC waveform for 4.8 milliseconds, digitized at 2 MHz, from all 8,256 channels. The FEMs are equipped with a type of Huffman compression algorithm that compresses the data by a factor of 5 from the raw rate of 150 MByte/second. The PMT waveforms are digitized at 64 MHz, and hence require a further reduction in data rate. Our PMT FEM achieves this by applying zero-suppression after locating a signal region using constant fraction discriminator logic. These algorithms in the PMT and TPC FEMs are programmed in FPGAs, which enable extremely fast data processing. The collected data from the FEMs are sent to the data acquisition servers over optical fiber by the XMIT data transmitter module.
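The zero-suppression idea above is easy to illustrate in software, even though the real thing runs in FPGA firmware. The sketch below keeps a few samples before and after any sample crossing a threshold and discards the rest; the threshold and the pre/post window sizes are made-up illustrative parameters, not the actual firmware settings, and the real FEM locates the signal region with a constant fraction discriminator rather than a fixed threshold.

```python
def zero_suppress(waveform, threshold, pre=3, post=4):
    """Keep only samples near threshold crossings; drop the rest.

    Returns a list of (start_index, samples) regions of interest.
    The threshold and pre/post windows are illustrative only.
    """
    n = len(waveform)
    keep = [False] * n
    for i, adc in enumerate(waveform):
        if adc >= threshold:
            # Retain a window of samples around the crossing.
            for j in range(max(0, i - pre), min(n, i + post + 1)):
                keep[j] = True
    # Group contiguous kept samples into regions of interest.
    regions, start = [], None
    for i in range(n + 1):
        if i < n and keep[i]:
            if start is None:
                start = i
        elif start is not None:
            regions.append((start, waveform[start:i]))
            start = None
    return regions
```

For a mostly quiet waveform, only the small regions around pulses survive, which is where the large reduction in data rate comes from.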



Software Development

I have developed a lot of software critical for operating the whole experiment as well as for analyses.

Some examples are a large-scale data processing framework handling inter-cluster file transfer protocols, a grid job management system for multi-stage data production, and a data analysis framework. Most of my software is written in C++/Python, with SQL when needed. See the link above for a list of software I developed for MicroBooNE and Double Chooz!

I occasionally hold C++/Python software workshops to better educate students and post-docs (and sometimes senior PIs!), because I believe these are essential skills that, unfortunately, are not taught properly in our field. Here's an example from Summer 2015 given to summer students. Feel free to contact me if you are interested in having one!

Contact Me


If you are looking for me physically, I am usually at SLAC building 84 Room 198.

Or, send an e-mail to kazuhiro "at" slac "dot" stanford "dot" edu.

You can also find my LinkedIn and GitHub account at the bottom and contact me there.