Hey! I'm a second year computer science student at Boston University. Over the past
two years I've enjoyed doing CPU parallelization research at Harvard, traveling as far as
Montreal to compete in hackathons, and building myself cool tech gadgets in my spare time.
Last summer I interned with Fidelity Investments doing back-end software engineering. This summer I am interning with Wayfair.
I am currently seeking part-time work opportunities for Fall 2018. Shoot me a message over email or LinkedIn to get in touch!
Automatically Scalable Computation (abbreviated ASC) is an ongoing research project between Harvard and Boston University that
explores a new method of CPU parallelization. Rather than taking the traditional approach of compiling sequential programs into parallel ones, CPUs using ASC execute a sequential program on a single CPU core and use any remaining CPU resources to predict and proactively
execute future parts of the program.
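As a toy illustration of that idea (a hypothetical Python sketch, not the project's actual code), a runner can memoize state transitions it has already computed, so that when a previously seen state recurs, its successor comes from a cache instead of being re-executed:

```python
# Toy illustration of speculative reuse: memoize (state -> next state)
# transitions so repeated states skip recomputation. The step function
# and state encoding here are made up for the example.

def step(state):
    """One 'expensive' step of a sequential program (hypothetical)."""
    return (state * 3 + 1) % 1000

cache = {}   # transitions learned so far, as if filled by spare cores
hits = 0     # how often a cached prediction was reused

def fast_step(state):
    global hits
    if state in cache:          # state seen before: reuse its successor
        hits += 1
        return cache[state]
    result = step(state)        # otherwise compute and remember it
    cache[state] = result
    return result

state = 7
for _ in range(5000):           # states repeat, so the cache pays off
    state = fast_step(state)
```

Because the state space is finite, the run quickly enters a cycle and most iterations become cache hits, which is the intuition behind letting idle cores pre-compute future program states.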
I developed code for ASC in my senior year of high school that parses and visualizes data generated by CPUs at execution time. My parsed data was used for efficiency analysis by other members of the project. I wrote my code in Python 3.4 with the NumPy and SciPy packages, and also published a 32-page thesis on Amazon describing my work.
Snappeshop was an award-winning hackathon entry at BU Local Hack Day 2016. It is a mobile app that allows
a user to snap a picture of an item they want to buy and receive four links to that item at different price points.
Five other CS students and I developed this project in 12 hours. Snappeshop's front end was built in Android Studio and its back end is written in Python 3.5. It uses IBM's Watson API for image recognition, the eBay Shopping API for product discovery, and Heroku cloud services for deployment.
I wrote the code that interacts with IBM Watson and eBay to determine what product to search for on eBay and which four price points to show the user.
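One plausible way to spread four links across a price range is to pick quantiles from the sorted search-result prices. This is a hypothetical sketch, not the hackathon's exact selection logic:

```python
# Hypothetical sketch: pick four spread-out price points from a list
# of search-result prices (cheapest, two middle quantiles, priciest).

def four_price_points(prices):
    """Return four prices spanning the low-to-high range."""
    ordered = sorted(prices)
    n = len(ordered)
    # indices at roughly 0%, 33%, 66%, and 100% of the sorted list
    return [ordered[0], ordered[n // 3], ordered[2 * n // 3], ordered[-1]]

# Example prices (made up) for one product search
points = four_price_points([19.99, 4.50, 89.00, 32.75, 12.00, 55.25])
```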
Over the summer of 2016, I designed and built a computer-powered magic mirror that displays live information
such as the date, time, upcoming calendar events, weather forecasts, and news headlines.
I made it out of a Raspberry Pi, an LCD monitor, a 2-way mirror, and a pinewood enclosure that I designed and assembled myself. The enclosure is 16" x 26.25" with ventilation holes on its top and bottom sides. The mirror uses a customized version of Michael Teeuw's MagicMirror2 software on Raspbian to display its live information.
The Fenway Victory Gardens are the oldest continuously-operating WWII Victory Gardens in the US.
Across Fall 2016 and Spring 2017, the Global App Initiative at BU developed Android
and iOS companion apps for the gardens. The goal of these apps was to keep gardeners informed about
relevant events in their area and to maintain a comprehensive list of the resources in the Victory Gardens.
I was the programming lead for the iOS development team of this project (about 10 members). The app is now complete and more information about it can be found in our GitHub repository.
A classmate from high school and I programmed a physics-based dueling game inspired by the training sequences
in Orson Scott Card's "Ender's Game." In it, a player and a computer-controlled bot shoot projectiles at each
other in a two-dimensional zero-gravity arena. The duelists cannot directly change their positions in the arena; they can only move themselves by
generating momentum, which they do by firing projectiles and by shooting magnetic grappling hooks
at arena terrain.
I implemented several features of the game including its navigation menu, basic projectile physics, and grappling hook. The game was written in GML 1.4.
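The recoil mechanic boils down to conservation of momentum: firing a projectile one way nudges the duelist the other way. A minimal sketch in Python (the game itself was written in GML, and the masses here are made up):

```python
# Illustrative recoil physics: firing a projectile pushes the shooter
# in the opposite direction, scaled by the mass ratio. Values are
# hypothetical, not the game's actual tuning.

PLAYER_MASS = 80.0       # arbitrary units
PROJECTILE_MASS = 0.5

def fire(player_vx, player_vy, proj_vx, proj_vy):
    """Return the player's new velocity after firing one projectile."""
    ratio = PROJECTILE_MASS / PLAYER_MASS
    return player_vx - ratio * proj_vx, player_vy - ratio * proj_vy

# Shoot a projectile to the right; the stationary player drifts left.
vx, vy = fire(0.0, 0.0, 400.0, 0.0)
```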
A classmate from high school and I programmed a chatbot for the popular communication service Discord,
nicknamed "Doran's Bot." Our bot fetches and displays live information about League of Legends games,
such as profile statistics, champion winrates, and player availability.
The bot was written in Python 3.5 and uses several web APIs to collect its data.
Five out of the six dining plans at Boston University cost the same amount of money to enroll in. After agonizing
over which plan I should choose for about an hour, I gave up and began writing a food tracker app to calculate
which one I should choose. The result was this app, which calculates the optimal number of meal swipes
and dining dollars to use per day under a given plan. The app took about a week to code.
It is written in Python 3.5 and uses the datetime module to evaluate differences between dates.
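The core calculation is simple: spread what's left of a plan evenly over the days remaining in the semester. A stripped-down sketch with hypothetical numbers (the real app tracked more state than this):

```python
# Simplified version of the daily-budget math: divide remaining swipes
# and dining dollars by the number of days left in the semester.
from datetime import date

def daily_budget(swipes_left, dollars_left, today, semester_end):
    """Return (swipes per day, dollars per day) to break even."""
    days_left = (semester_end - today).days
    return swipes_left / days_left, dollars_left / days_left

# Hypothetical plan balance on Oct 1 with the semester ending Dec 20
swipes_per_day, dollars_per_day = daily_budget(
    120, 250.0, date(2017, 10, 1), date(2017, 12, 20))
```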
The Social Playlist was a software entry at Hack@Brown 2017. It is a dynamically-generated music playlist
hosted on the web, designed to solve the age-old problem of "whose music do we play?" The Social Playlist allows
people to queue songs into a collaborative playlist that is accessible on any device with a browser.
Alternatively, the app can use a webcam to gauge the emotional atmosphere in a room and try to play appropriate
music. Multiple playlists can exist at any given time, each organized by a party name and password.
Three other freshmen and I developed this project in 24 hours. It uses Meteor for its back end, basic HTML for its front end, plus the YouTube API and Microsoft Cognitive Services API for playlist management and emotion data respectively.
Scavenger Glasses was a hackathon entry at MakeMIT 2017. It is a pair of Raspberry Pi-powered glasses that
plays a scavenger hunt game through a web app hosted on a local network. It features a mounted camera that is used
with an image recognition API to detect whether the wearer of the glasses has found the correct item. Player score
and scavenger hunt tasks are displayed on the web app hosted by the Pi.
Two other BU students and I developed this project in 16 hours. It uses Python 3.5 for its back end and Flask for its front end, plus the Microsoft Cognitive Services API for image recognition. Also, we 3D printed all of Scavenger Glasses' non-Pi components!
This project was a datathon entry at the Brown Datathon 2017. It predicts whether any given user on tripadvisor.com will
purchase a hotel booking based on their behavior browsing the website. The program makes its predictions with a neural
network trained and tested on a large .csv file provided by TripAdvisor. In the end, our network
made booking predictions with roughly 82.3% accuracy.
Three other students and I developed this project in 24 hours. It is written entirely in MATLAB and uses the MATLAB neural network toolbox. It's interesting to note that because about 21% of users in our dataset booked hotels, our program's benchmark prediction accuracy was 79%; any accuracy lower than that could always be beaten by guessing "no purchase will be made" for every user. So our 82.3% accuracy was actually a successful result.
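That baseline argument is easy to sanity-check in a few lines (toy labels here, not the real TripAdvisor data):

```python
# Majority-class baseline: always predicting the most common label.
# A useful model has to beat this accuracy.

def majority_baseline(labels):
    """Accuracy of always guessing the most frequent label (0 or 1)."""
    positive_rate = sum(labels) / len(labels)
    return max(positive_rate, 1 - positive_rate)

# Toy dataset with a 21% booking rate, mirroring the write-up above
labels = [1] * 21 + [0] * 79
baseline = majority_baseline(labels)
```

With 21% positives, always guessing "no booking" is right 79% of the time, which is why 82.3% counted as a win.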
I loved video games when I was in 8th grade, so I built myself a gaming PC. This project took about 6 months and
involved setting a build budget, researching components in a spreadsheet, and assembling those components.
I named the computer Marvin, after the existential robot in The Hitchhiker's Guide to the Galaxy.
Five years later, I still maintain and use it. The current specs: Intel i5-2550K (3.4 GHz), GeForce GTX 960, 8 GB RAM.
Laverna was a personal project that I developed in my spare time during my internship in Summer 2017. It is a web app that allows university
students to meet and talk with each other anonymously, through a one-on-one SMS chat channel that resets every 24
hours. Laverna's goal as a platform is to help foster university communities that look past the social presuppositions
often attached to race, appearance, and gender.
Laverna is written in Python 3.5. It uses Flask and Socket.IO to display live user information on its landing page and uses the Twilio Programmable SMS API to send SMS.
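The 24-hour reset rule can be sketched in a few lines. This is a hypothetical simplification; the real service also had to pair users and relay their messages:

```python
# Simplified sketch of Laverna's daily reset: a chat channel expires
# once 24 hours have passed since it was created.
from datetime import datetime, timedelta

CHANNEL_LIFETIME = timedelta(hours=24)

def channel_expired(created_at, now):
    """True once a channel has lived out its 24 hours."""
    return now - created_at >= CHANNEL_LIFETIME

# A channel made yesterday morning is gone by this morning...
expired = channel_expired(datetime(2017, 7, 1, 9, 0),
                          datetime(2017, 7, 2, 10, 0))
# ...but one made the same day is still open that evening.
still_open = not channel_expired(datetime(2017, 7, 1, 9, 0),
                                 datetime(2017, 7, 1, 23, 0))
```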
This project was an award-winning hackathon entry at MoMath Expressions 2017. It visualized several mathematical algorithms
using MoMath's Dynamic Wall, a motorized wall made up of 128 programmable metal slats.
The algorithms this project visualized include bubble sort, selection sort, insertion sort, a "creased" sine wave (illustrated using the intersection of slats instead of just slat depth),
a "creased" Fourier series convergence, a binary representation of the Fibonacci sequence, and a "creased" smiley face algorithm designed by our team.
Three other students and I developed this project in 32 hours. It is written in Processing.
Genie was an award-winning hackathon entry at PennApps XVI. It is a mobile app that securely reads a user's investment portfolio to display account information,
plot relevant security prices, and talk to the user about their finances. Genie's chatbot interface responds to questions made in natural language with security information and risk analyses.
Three other students and I developed this project in 36 hours. Genie is written in React Native and its backend server Lamp is written in Express. Lamp is hosted via AWS, uses the BlackRock Aladdin API to perform portfolio analysis, and uses the api.ai NLP API to parse natural language.
BlackRock's Chief Engineer, Jody Kochansky, wrote about Genie in his Medium article "3 'Hacks' That Could Change the World"!
Serendipity was a hackathon entry at MIT's Reality, Virtually Hackathon. It is a mobile app that helps people navigate a city more intuitively than a top-down map does, by projecting destination markers into the streetscape in augmented reality.
I built Serendipity in a team of four using Unity, the Mapbox SDK, and ARKit.
Rise was a hackathon entry at Hack@Brown 2018. It is an IoT device with a web interface (pictured) that helps people wake up smoothly in the morning. Instead of loudly and suddenly waking a user from bed, Rise gradually opens their curtains to let in natural light, eventually turning on a light of its own and vibrating until the user wakes up calmly. It is also Alexa-integrated, so users can verbally set a wake-up time by asking Alexa.
Three other Brown students and I built Rise in 36 hours. We used a Particle Photon board, Flask, and the Alexa skill builder to bring it to life.