Finite Element Modeling and Simulation

The AtC (Atoms-to-Continuum) package is a plugin for LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator) designed and written at Sandia National Laboratories. I am currently adding new functionality to the software and improving its overall runtime performance. For instance, my team and I are giving AtC the ability to use more complex mesh geometries (such as heterogeneous tetrahedra), the ability to make approximations using quadratic and higher-order interpolation functions, and the ability to parallelize its computations with spatial-partitioning methods.
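
To give a concrete sense of what higher-order interpolation means here, below is a minimal sketch of quadratic Lagrange shape functions on the 1D reference element [-1, 1], with nodes at -1, 0, and 1. This only illustrates the concept; AtC's actual basis functions live in its own element classes.

    import numpy as np

    def quadratic_shape_functions(xi):
        """Evaluate the three quadratic Lagrange shape functions at xi."""
        return np.array([
            0.5 * xi * (xi - 1.0),    # N1: equals 1 at xi = -1, 0 at the other nodes
            (1.0 - xi) * (1.0 + xi),  # N2: equals 1 at xi =  0
            0.5 * xi * (xi + 1.0),    # N3: equals 1 at xi = +1
        ])

    # Partition of unity: the shape functions sum to 1 everywhere in the element.
    for xi in np.linspace(-1.0, 1.0, 5):
        assert abs(quadratic_shape_functions(xi).sum() - 1.0) < 1e-12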

YouTube Personalized Recommendations

YouTube spends a lot of time and energy providing recommendations to users on what videos they might like to watch and what channels they might want to subscribe to. Traditionally, these recommendations are based on each user's individual watch history and set of subscriptions. However, new users don't yet have that history available. I worked on implementing a system to improve "cold start" recommendations for brand-new users. My improvements resulted in a significant increase in click-through rate in the suggestions pane.
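
As a rough illustration of the cold-start idea (YouTube's actual system is proprietary, and none of the names below come from it), one standard fallback is to rank candidates for a history-less user by a global popularity prior:

    from collections import Counter

    def cold_start_recommendations(global_watch_counts, k=10):
        """Rank candidate videos for a brand-new user by overall popularity."""
        return [video for video, _ in Counter(global_watch_counts).most_common(k)]

    # Hypothetical catalog mapping video id -> total watch count.
    catalog = {"vid_a": 90_210, "vid_b": 512, "vid_c": 1_000_000}
    print(cold_start_recommendations(catalog, k=2))  # ['vid_c', 'vid_a']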

Complex Networks in Twitter

During my semester abroad at AIT Budapest, I took a course on the Structure and Dynamics of Complex Networks as well as a pair of half-courses on Data Mining. These classes culminated in a project in which I conducted extensive data mining on retweet patterns on Twitter. I showed that the basic characteristics of the information-spread networks can be used to classify individual tweets as being personal, newsworthy, or related to popular culture. In the process, I also defined a new measure that I called "flux centrality", which is related to the statistical probability that a particle flowing through the network would pass through a given edge. The flux centrality of a simple lattice with a source and sink in opposite corners is shown at left.
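
The lattice example can be reconstructed with the standard electrical-flow analogy: inject a unit of current at the source, extract it at the sink, and measure the current carried by each edge. This is my own sketch of that closely related quantity, not the project's flux-centrality code:

    import numpy as np
    import networkx as nx

    n = 8
    G = nx.grid_2d_graph(n, n)                       # simple square lattice
    nodes = list(G.nodes())
    index = {v: i for i, v in enumerate(nodes)}
    source, sink = index[(0, 0)], index[(n - 1, n - 1)]  # opposite corners

    L = nx.laplacian_matrix(G, nodelist=nodes).toarray().astype(float)
    b = np.zeros(len(nodes))
    b[source], b[sink] = 1.0, -1.0                   # inject/extract one unit of flow

    # Ground the sink (the Laplacian is singular otherwise), then solve L v = b
    # for the remaining node potentials.
    keep = [i for i in range(len(nodes)) if i != sink]
    v = np.zeros(len(nodes))
    v[keep] = np.linalg.solve(L[np.ix_(keep, keep)], b[keep])

    # The "flux" through an edge is the magnitude of the current across it.
    flux = {(u, w): abs(v[index[u]] - v[index[w]]) for u, w in G.edges()}
    print(max(flux.items(), key=lambda kv: kv[1]))   # busiest edge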

Finite Element Model Construction

LaGriT (the Los Alamos Grid Toolbox) is an application for the creation of large and complex Finite Element meshes. While generating a simple rectangular (hexahedral) mesh is straightforward, advanced applications require that certain parts of the mesh fall along pre-determined surfaces, such as the earthquake fault shown at right. Generating a "nice" triangulation of such complex geometries is an NP-hard problem that is the subject of much active study. I spent three months adding features to the LaGriT package. My two primary contributions were the capability to stretch and deform a mesh to fill a volume, and the ability to insert a 2D surface into a 3D mesh, preserving the triangles of the surface as faces of tetrahedra in the final mesh.
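
The stretch-and-deform idea can be shown in miniature: triangulate a reference region once, then remap the node coordinates into a target region while leaving the element connectivity untouched. LaGriT's real operations are far more sophisticated; this toy version just shows the separation of connectivity from geometry:

    import numpy as np
    from scipy.spatial import Delaunay

    # Regular grid of points on the unit square, triangulated once.
    xs, ys = np.meshgrid(np.linspace(0, 1, 6), np.linspace(0, 1, 6))
    points = np.column_stack([xs.ravel(), ys.ravel()])
    mesh = Delaunay(points)                 # mesh.simplices holds the triangles

    # "Deform" by mapping nodes into a wedge: x is squeezed linearly with y.
    # The triangle list (connectivity) is untouched; only coordinates move.
    deformed = points.copy()
    deformed[:, 0] *= 1.0 - 0.5 * deformed[:, 1]

    print(mesh.simplices.shape, deformed.min(axis=0), deformed.max(axis=0))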

Platitude

Platitude was the term-long, class-wide project of my Large-scale Software Development class, taught by visiting professor Alexandre Francois. The game was a hybrid platformer / tower-defense experience, in which six players (the platformers) tried to cross an expansive level while one player (the defender) built towers and other obstacles to stop them. The game, including its music, networking, and physics engines, was built from scratch on top of GLUT and the SAI/MFSM concurrency framework.

Visual Identification of Arthropods

The BugID project is focused on the joint development of robot image capture techniques and advanced computer vision algorithms for automated identification of arthropods (aquatic insects, soil mesofauna, and zooplankton). The final goal of the project is to develop self-contained systems that could conduct population and biodiversity surveys in the field. I assisted in the design, implementation, and testing of feature detection algorithms. I also contributed improvements to the manual classification user interface and the robotic camera control module.
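
As a hedged sketch of the detect-and-describe step in such a pipeline (the project's feature detectors were custom; OpenCV's ORB and the file name below are stand-ins, not anything from BugID):

    import cv2

    image = cv2.imread("specimen.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input
    assert image is not None, "specimen.jpg not found"
    orb = cv2.ORB_create(nfeatures=500)    # keypoint detector + binary descriptor
    keypoints, descriptors = orb.detectAndCompute(image, None)
    print(f"found {len(keypoints)} keypoints")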

Text Clouds

This simple text cloud page was the final project for the first introductory computer science course I took at Harvey Mudd. Implemented entirely in Python, it crawls a website following links to a depth of 5, extracts human-readable text from the HTML, stems the words, and displays text clouds based on the relative frequencies of words found in the final corpus. Try it on a book at Project Gutenberg for some fun results.
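
For the curious, here is a rough sketch of that pipeline, assuming requests and BeautifulSoup as stand-ins for the original crawler, and with a deliberately crude stemmer in place of a real stemming algorithm:

    from collections import Counter
    from urllib.parse import urljoin
    import requests
    from bs4 import BeautifulSoup

    def stem(word):
        # Crude suffix stripping; the real project used a proper stemmer.
        for suffix in ("ing", "ed", "es", "s"):
            if word.endswith(suffix) and len(word) > len(suffix) + 2:
                return word[: -len(suffix)]
        return word

    def crawl(url, depth, seen=None):
        """Tally stemmed word frequencies over pages reachable within depth."""
        seen = set() if seen is None else seen
        if depth == 0 or url in seen:
            return Counter()
        seen.add(url)
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            return Counter()
        soup = BeautifulSoup(html, "html.parser")
        counts = Counter(stem(w.lower()) for w in soup.get_text().split() if w.isalpha())
        for link in soup.find_all("a", href=True):
            counts += crawl(urljoin(url, link["href"]), depth - 1, seen)
        return counts

    # Depth kept small here so the demo finishes quickly.
    print(crawl("https://www.gutenberg.org/ebooks/11", depth=2).most_common(20))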