
Hi, I’m Philip
Stanford University • B.S. Computer Science • GPA 3.98 • 2023–2027
Experiences

As a Software Developer Intern at the Stanford Shen Laboratory, I work on automation systems for microwave impedance microscopy (MIM) within a multidisciplinary research team combining physics, materials science, and software engineering. My primary focus is developing a particle-filter–based localization and automated scanning pipeline, which increased positioning accuracy by 40% and reduced manual scanning time by up to four days per experiment.
I designed and implemented this system in Python, leveraging tools such as OpenCV, scikit-image, and PyTorch for image processing and model training. The pipeline integrates both probabilistic and CNN-based localization methods, enabling robust alignment even in noisy imaging conditions.
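The core of the approach is a predict-update-resample loop. Below is a minimal generic sketch of that loop for 2-D position estimation, not the lab's actual pipeline; the noise parameters, particle count, and target position are all made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, measurement,
                         motion_noise=0.5, meas_noise=1.0):
    """One predict/update/resample cycle for 2-D position estimation.

    particles:   (N, 2) array of candidate positions
    weights:     (N,) normalized importance weights
    measurement: observed 2-D position (e.g. from image matching)
    """
    # Predict: diffuse particles with Gaussian motion noise
    particles = particles + rng.normal(0, motion_noise, particles.shape)

    # Update: reweight each particle by its measurement likelihood
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = weights * np.exp(-d2 / (2 * meas_noise ** 2))
    weights = weights / weights.sum()

    # Resample: draw particles in proportion to weight, then reset weights
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Usage: localize a stationary target at (5, 5) from repeated noisy looks
particles = rng.uniform(0, 10, (500, 2))
weights = np.full(500, 1 / 500)
for _ in range(20):
    particles, weights = particle_filter_step(particles, weights,
                                              np.array([5.0, 5.0]))
estimate = particles.mean(axis=0)
```

The weighted mean of the surviving particles gives a confidence-weighted position estimate, which is the quantity a scanner controller would act on.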
Beyond algorithm development, I built a modular PyQt5 GUI that unifies data processing, live visualization, and hardware communication. This interface now serves as a central control hub for the lab’s microscopy experiments, improving workflow efficiency for over 15 researchers and forming the basis for future automation work.

As a Data Science Intern at the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC), I worked under Professor Vahe Petrosian in the multiwavelength astrophysics group, studying emission correlations across different energy bands in active galactic nuclei. My main project focused on analyzing cross-band luminosity relationships to investigate how particle acceleration and jet physics shape broadband emission spectra.
Using Python (NumPy, SciPy, Pandas, Astropy), I processed over 10,000 luminosity samples from SDSS, Fermi-LAT, and VLBI surveys. I applied the Efron–Petrosian method to remove redshift biases from truncated astronomical data and performed correlation analyses (Kendall’s τ, Pearson) to identify statistically significant trends. The results revealed a stronger radio–gamma correlation (PCC = 0.53) compared to radio–optical (PCC = 0.40), consistent with a shared nonthermal origin of emission.
This work contributed to Professor Petrosian’s ongoing research on nonthermal radiation mechanisms and cosmological selection effects in high-energy astrophysics. I also developed visualization and preprocessing pipelines that standardized the analysis workflow for future luminosity correlation studies within KIPAC.
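The correlation statistics mentioned above are straightforward to compute once the luminosities are in hand. A small sketch on synthetic data (illustrative log-luminosities with a built-in relationship, not survey measurements; the slope and noise levels are assumptions):

```python
import numpy as np
from scipy.stats import kendalltau, pearsonr

rng = np.random.default_rng(42)

# Synthetic log-luminosities with a built-in linear trend plus scatter
log_l_radio = rng.normal(40.0, 1.0, 1000)
log_l_gamma = 0.8 * log_l_radio + rng.normal(0.0, 0.5, 1000)

# Rank-based (Kendall's tau) and linear (Pearson) correlation measures
tau, tau_p = kendalltau(log_l_radio, log_l_gamma)
pcc, pcc_p = pearsonr(log_l_radio, log_l_gamma)
```

Kendall's τ is robust to outliers and monotone rescalings, which is why it complements the Pearson coefficient when comparing emission bands; real truncated survey data additionally needs a debiasing step such as the Efron–Petrosian method before these statistics are meaningful.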

As a Machine Learning Intern at SLAC National Accelerator Laboratory, I worked with Dr. Julia Gonski in the Particle Physics and Artificial Intelligence Group on permutation-invariant anomaly detection for identifying beyond-the-Standard-Model particle-tracking signatures. Our project aimed to uncover subtle, previously unmodeled collision patterns in large-scale simulation datasets, helping expand the sensitivity of collider experiments to new physics phenomena.
I benchmarked a permutation-invariant anomaly detection model in Python (scikit-learn, NumPy) across more than one million simulated particle events, achieving AUC = 0.98 and outperforming supervised baselines such as the Particle Flow Network. The model’s architecture, designed to preserve event-level symmetries, demonstrated that unsupervised methods can rival fully labeled approaches in detecting exotic event topologies.
Working closely with Dr. Gonski, I also identified and diagnosed oversampling artifacts in pile-up simulations that caused model performance to collapse (AUC = 0.49). Tracing the problem to statistical imbalance, I helped design higher-statistics datasets and improved the preprocessing pipeline, laying the groundwork for future anomaly-detection research in high-energy physics.
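The AUC figures quoted above come from standard ROC analysis of anomaly scores. A minimal sketch with synthetic scores (two Gaussian score distributions standing in for background and signal; the separation and sample sizes are made up):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Illustrative anomaly scores: background scores low, signal scores high
background_scores = rng.normal(0.0, 1.0, 5000)
signal_scores = rng.normal(2.0, 1.0, 500)

labels = np.concatenate([np.zeros(5000), np.ones(500)])
scores = np.concatenate([background_scores, signal_scores])

# AUC = probability a random signal event outscores a random background event
auc = roc_auc_score(labels, scores)
```

An AUC near 1.0 means the score cleanly separates the classes, 0.5 means it is no better than chance, which is why the collapse to 0.49 on the oversampled pile-up datasets was a clear symptom of a data problem rather than a modeling one.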

As a Research Intern with the UCLA Nuclear Physics Group, I worked under Professor Huan Zhong Huang and Associate Researcher Gang Wang on the study of high-energy particle collisions at the Large Hadron Collider (LHC) and Relativistic Heavy Ion Collider (RHIC). My primary project focused on analyzing particle spectra and nuclear modification factors (RAA) to uncover trends in parton energy loss within the quark–gluon plasma.
Using Python and ROOT, I processed large datasets from the ALICE and STAR experiments, extracting transverse-momentum spectra and investigating centrality-dependent suppression patterns across multiple collision energies. The analysis revealed contrasting energy-loss behaviors between RHIC and LHC systems, providing new insight into medium-induced parton interactions and their scaling with system size.
This research culminated in a peer-reviewed publication, “Contrasting Features of Parton Energy Loss in Heavy-Ion Collisions at RHIC and LHC.” My contributions supported the paper’s quantitative findings and advanced ongoing efforts to characterize the properties of the quark–gluon plasma through cross-experiment comparisons.
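The nuclear modification factor at the center of this analysis has a simple per-bin definition: the heavy-ion yield divided by the binary-collision-scaled proton-proton reference. A sketch with invented numbers (the yields and N_coll value are illustrative, not ALICE or STAR data):

```python
import numpy as np

def nuclear_modification_factor(yield_aa, yield_pp, n_coll):
    """R_AA = (dN_AA/dpT) / (<N_coll> * dN_pp/dpT), evaluated per pT bin.

    yield_aa: per-event particle yield in heavy-ion (AA) collisions
    yield_pp: per-event yield in the proton-proton reference
    n_coll:   average number of binary nucleon-nucleon collisions
    """
    return np.asarray(yield_aa) / (n_coll * np.asarray(yield_pp))

# Illustrative pT bins: R_AA < 1 signals suppression, i.e. parton
# energy loss in the quark-gluon plasma
raa = nuclear_modification_factor([2.0, 0.8, 0.3],
                                  [0.02, 0.01, 0.005],
                                  n_coll=100.0)
# raa → [1.0, 0.8, 0.6]
```

Comparing how this suppression pattern varies with centrality and collision energy is what allows RHIC–LHC contrasts of the kind the paper reports.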
Projects
Stack: XGBoost, TensorFlow, Next.js, TypeScript, PostgreSQL
Built a hierarchical esports analytics platform for Overwatch 2 and Marvel Rivals, used proprietarily by professional teams in both games. It processes time-series game data into advanced statistics such as ultimate-usage win rates, first-death classifications, and character matchup spreads through an ML pipeline built on XGBoost gradient boosting and TensorFlow neural networks. Optimized PostgreSQL database performance, enabling Python ETL pipelines to process 17,000+ data points per game and cutting manual analysis time by 70%.
Stack: PyQt5, scikit-image, OpenCV, PyVISA, Matplotlib
A modular PyQt5 GUI unifying hardware control into a single interface. Generated confidence-weighted scanner movement using a particle filter method, enabling automated navigation across microscopy samples. Developed high-performance visualization using pyqtgraph for live streaming data and matplotlib for embedded plots.
Stack: Flask, Gemini 2.0 Flash, ElevenLabs, React, Vite, FFmpeg
3rd Place – AGI House Google AI Build Weekend Hackathon, for an AI poker commentator with real-time video analysis and synchronized speech. Built a Flask backend using Gemini 2.0 Flash and ElevenLabs, cutting playback lag from 10–15 s to near zero. Developed a React + Vite frontend and a cached FFmpeg pipeline for responsive commentary.
Resume
Grab a copy of my latest resume.
About

Hi, I’m Philip Suh
I’m currently a computer science student at Stanford with a background in physics, passionate about building intelligent systems that bridge automation, data, and human insight.
My work combines software engineering with computer vision and scientific computation. I’ve developed particle-filter-based localization pipelines that use contrast analysis, noise subtraction, and morphological filtering to track microscopic features with physical-unit precision. These systems accelerate imaging workflows and bring reproducibility to experimental research.
Outside the lab, I enjoy creating side projects that merge creativity with engineering: PokerSync, an AI poker commentator; a voice-controlled interface for video and streaming services; and esports analytics tools for competitive play. I like building projects that feel both novel and useful, exploring how human–computer interaction can make technology more intuitive and expressive.