
Software Projects

Vendoor

Touchless Storefront Solution with Hand Tracking, Mobile AR

July 2020

I led a team (Gabriel Santa-Maria, Kavya Akash Bakshi, Cesar de Castro, and myself) to develop a touchless walk-up storefront solution powered by 3D hand tracking and mobile augmented reality, in the process winning Runner-Up in Ultraleap’s Beyond Touchscreens dev competition.

The COVID-19 pandemic instantly made millions of people wary of the health risks presented by shared public touchscreens. In the US, offline commerce still accounts for almost 90% of overall retail sales, and many retail businesses have struggled to adapt to the “new normal.”

For Vendoor, we used physical metaphors to deliver a fun, differentiated in-person shopping experience for the COVID-19 era, visualized through a demo built for a local ice cream shop.


HyperViz

Virtual Reality Medical Visualization with Hand Tracking

January 2020

I recruited and led a team (Eric Tao, Gabriel Santa-Maria, Beste Aydin, and myself) at the 2020 MIT Reality Hack to build a virtual reality medical visualization platform, designed around a hand-tracking interface.

Every year, over 80 million CT scans are performed in the U.S. alone. Adding in MRI and other types of imaging, the number is even higher. This data is fundamentally three-dimensional, but physicians consume it as 2D, black-and-white images. To understand complex internal structures, doctors must flip through dozens or even hundreds of images to build a mental 3D reconstruction.

HyperViz processes 2D images like CT scans into 3D models, then visualizes them in a VR environment, integrated with a photogrammetric scan of the patient. It segments the anatomical data to show different types of tissue, such as skin, soft tissue, and bone.
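As a rough illustration of the reconstruction step (a minimal sketch under assumed thresholds, file layout, and libraries, not the actual HyperViz pipeline), a stack of CT slices can be assembled into a volume, thresholded by Hounsfield units to isolate a tissue class, and converted to a surface mesh with marching cubes:

```python
# Hypothetical sketch: stack CT slices into a volume, isolate bone with a
# Hounsfield-unit threshold, and extract a 3D surface via marching cubes.
# File layout, threshold value, and the use of pydicom/scikit-image are
# assumptions, not the actual HyperViz implementation.
import glob
import numpy as np
import pydicom
from skimage import measure

slices = [pydicom.dcmread(f) for f in glob.glob("scan/*.dcm")]
slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))    # order by z position
volume = np.stack([s.pixel_array * s.RescaleSlope + s.RescaleIntercept
                   for s in slices])                           # Hounsfield units

bone = (volume > 300).astype(np.float32)                       # crude bone threshold
verts, faces, normals, _ = measure.marching_cubes(bone, level=0.5)
print(f"Extracted mesh: {len(verts)} vertices, {len(faces)} faces")
```

A mesh produced this way could then be imported into a game engine alongside the photogrammetric scan for viewing in VR.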

More information at Devpost.


ASA-VR

Virtual Reality Haptic Feedback System for First Responders

April-September 2019

I recruited and led a team (Cesar de Castro, Andreas Dias, Shaunak Bakshi, and myself) to enter the U.S. NIST (National Institute of Standards and Technology) PSCR (Public Safety Communications Research Division) 2019 Haptic Interfaces for Public Safety Challenge. We were selected as one of eight finalist Haptic Development Teams nationwide, advanced through the final VR phase of the challenge, and presented our work and ran demos at the 2019 Public Safety Broadband Stakeholder Meeting in Chicago.

We developed a software solution and haptic interface to aid first responders – law enforcement, firefighters, and EMS – in NIST PSCR’s virtual reality simulated environments. Our haptic interface provides navigation assistance (to allow firefighters to reach victims in a smoke-filled office environment), targeting assistance (to indicate the location of assailants to SWAT team members in an underground parking garage), and data on patient vital signs (for EMS conducting triage after a highway car crash).

On the hardware front, we partnered with Contact CI, a startup that has developed haptic gloves (including both fingertip feedback and “exotendons” that allow for force perception), to develop a customized haptic feedback glove for the Haptics Challenge and NIST PSCR’s Unreal Engine VR environments.


Bright

Augmented Reality Solution for the Visually Impaired

January 2019

I recruited and led a team (Cesar de Castro, Andreas Dias, Jan Simson, Charlene Yu, and myself) at the Reality Virtually 2019 AR/VR Hackathon at the MIT Media Lab to build an augmented reality solution for the visually impaired. We won the overall Best in AR grand prize, and the Best AR/VR in Health category prize.

According to the WHO, over 250 million people around the world suffer from moderate to severe vision loss, 81% of whom are above the age of 50. Many face significant challenges in daily life. Bright is an effort to make faces familiar again and everyday challenges manageable.

Bright runs as an app on Microsoft HoloLens and allows users to zoom in to a customizable level, hear written text (e.g. books, newspapers, documents, TV) read aloud, recognize other people nearby (giving names for stored people, and age/gender/emotion estimates for others), and contact others in the case of an emergency.

More information at Devpost.

I presented a paper based on this project at the IEEE GEM conference at Yale University in June 2019.


Tiny Media Converter

Photo / Video Converter for TinyDuino

July-August 2016

In the summer of 2016, I was working on Project Halo, a smart ring with which I was planning to propose to my now wife. The device was built around an Arduino-compatible TinyCircuits TinyScreen+, whose ARM processor was powerful enough to handle images and video, but not in the standard formats typically used for such media.

I wrote a desktop (WPF) app that batch-converts photos, videos, and Microsoft Living Images to a 16-bit RGB-565 (TSV) format viewable on the TinyScreen+.
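As an illustration of the core pixel conversion (a minimal sketch; the byte order, output layout, and function names are assumptions, not the actual Tiny Media Converter code or TSV format):

```python
# Hypothetical sketch: pack each 24-bit RGB pixel into 16-bit RGB-565 and
# write the frame as raw big-endian words. The byte order and file layout
# are assumptions, not the actual TSV format.
from PIL import Image
import struct

def rgb888_to_rgb565(r: int, g: int, b: int) -> int:
    """Pack 8-bit R, G, B channels into a single 16-bit RGB-565 value."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def convert_image(src_path: str, dst_path: str, size=(96, 64)) -> None:
    """Resize an image to the TinyScreen+'s 96x64 display and dump RGB-565 pixels."""
    img = Image.open(src_path).convert("RGB").resize(size)
    with open(dst_path, "wb") as out:
        for r, g, b in img.getdata():
            out.write(struct.pack(">H", rgb888_to_rgb565(r, g, b)))

convert_image("photo.jpg", "photo.tsv")
```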

Download app / source code from GitHub.


Marengo

Peer-to-Peer Interview Prep Marketplace

2014-2015

One of the best ways to prepare for an interview is to learn from someone who went through the same process recently.

To put this into action, I teamed up with Rob Li and Rachel Hu to co-found Marengo Inc., which aimed to connect job and school applicants with peer advisors for on-demand interview prep over live video chat. The system included built-in matching and scheduling, in-video annotation, recording, notifications, and more.

I created mockups for preliminary user trials and then a detailed spec to guide a team of developers. Eventually, we created a working prototype of the full system.


TA Maps

Multi-Source Global Mapping

2010-2012

When I moved to Mumbai in 2010 to work at Mahindra, I realized that no single mapping app offered a map search and navigation experience comparable to what I was used to back home in the U.S.

So I developed TechAutos Maps, which integrated multiple map data sources, such as Google Maps and OpenStreetMap, for optimal global coverage, and supported mixing different map layers, point-of-interest search, directions (with driving mode), and more.

By early 2011, TA Maps had become the most popular third-party mapping / navigation app for Microsoft’s Windows Phone 7 OS.


Clinifox

Clinical Trial Management System

2009-2010

After witnessing the frustration of doctors who conducted clinical trials, I realized there could be a better solution. Existing clinical trial management systems (CTMS) were clunky, difficult to use, required local server infrastructure and IT manpower, and above all hindered, rather than facilitated, the offices’ work.

My solution was a cloud-hosted rich web app aimed at making the day-to-day operations of a clinical trial site not just more efficient but even fun. After detailed user interviews, observation, and testing, I envisioned a CTMS that in some ways more closely resembled a social networking site than the ’80s-esque solutions of the time.

I founded Clinifox, LLC, developed a detailed technical spec, secured initial funding, and worked with developers to create a prototype of the system for a pilot trial.


Peek Desktop

Mobile Device Update Infrastructure

2008-2009

During my time working at Peek, the Peekux operating system on our mobile email device (TIME Gadget of the Year 2008) could initially be updated only via a delicate set of low-level tools, which were not made available to consumers at all. Loading new beta OS updates to test was quite a hassle, so I decided to build a desktop app to make downloading and installing Peekux updates easier for myself.

The company decided to ship my tool to users, so I completed the end-to-end update infrastructure by building a server back-end and a basic update manifest creation tool. I also created an “app store” for customized system content (e.g. email tones), though this part was not included in the consumer release.


Peek Auto Emailer

Email Infrastructure Stress Testing Tool

2008

The seamless Peek email experience resulted from lots of custom functionality, both in the Peek server infrastructure and onboard the device. With every update, we naturally conducted lots of testing (sending emails of different types from/to different account types), but it was difficult to manually run tests that regularly and adequately stressed all parts of the system.

I decided to build a tool that performed automated batch sending of test emails of various types at a programmed cadence. The tool also included features like SMTP Discover (automated discovery of outbound server settings for email accounts).
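A minimal sketch of the batch-sending idea (the server, credentials, message mix, and cadence below are hypothetical stand-ins, not the actual Peek tool, which also covered multiple account types and SMTP Discover):

```python
# Hypothetical sketch: send a batch of test emails of different types on a
# fixed cadence via SMTP. Server, credentials, and message mix are assumptions.
import smtplib
import time
from email.message import EmailMessage

TEST_CASES = [
    ("plain text", "Short plain-text body."),
    ("long body", "x" * 50_000),
    ("unicode", "Ünïcôdé çontent テスト"),
]

def send_batch(host: str, user: str, password: str, to_addr: str) -> None:
    """Send one test email per case through an authenticated SMTP session."""
    with smtplib.SMTP(host, 587) as smtp:
        smtp.starttls()
        smtp.login(user, password)
        for name, body in TEST_CASES:
            msg = EmailMessage()
            msg["From"] = user
            msg["To"] = to_addr
            msg["Subject"] = f"Stress test: {name}"
            msg.set_content(body)
            smtp.send_message(msg)

# Fire a batch every 10 minutes (the real tool's cadence was programmable).
while True:
    send_batch("smtp.example.com", "tester@example.com", "password",
               "peek-device@example.com")
    time.sleep(600)
```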


InfusionCalc

Image Processing for Brain Tumor Drug Delivery

2007

As part of my research at a Yale biomedical engineering lab, I helped develop a microinjection system to deliver drug-containing biodegradable nanoparticles to the site of a brain tumor. Targeted treatment would require being able to modulate the location, shape, and size of the delivery site, so we varied injection parameters and test-injected pieces of gel that approximated human brain tissue to see what shape of injection resulted.

However, the injection evaluation method entailed freezing the gel, manually slicing the frozen sample into hundreds of slices, mounting each slice onto a microscope slide, feeding those into a slide camera, and then computationally processing the layers into a 3D depiction. This took several months for each sample, so the process of matching injection parameters with outputs was glacially slow.

I realized most of the injection shapes were rotationally symmetrical, so I took UV images of my test injections from the side and built a tool that computationally estimated various properties of the injection shape — turning a multi-month process into a five-minute one (albeit with a bit less accuracy).
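As a rough illustration of the symmetry idea (not the original tool): if a side-view image gives the injection's radius at each depth, its volume follows from the disk method. The threshold and pixel scale in this sketch are assumptions.

```python
# Hypothetical sketch: estimate injection volume from a single side-view image,
# assuming rotational symmetry about the vertical injection axis (disk method).
# The intensity threshold and mm-per-pixel scale are assumptions.
import numpy as np
from PIL import Image

MM_PER_PX = 0.05          # assumed image scale

img = np.array(Image.open("injection_side_view.png").convert("L"))
mask = img > 60           # assumed threshold isolating the imaged injection region

volume_mm3 = 0.0
for row in mask:
    cols = np.flatnonzero(row)
    if cols.size:
        radius_mm = (cols[-1] - cols[0]) / 2 * MM_PER_PX    # half-width of this slice
        volume_mm3 += np.pi * radius_mm**2 * MM_PER_PX      # disk of thickness 1 px

print(f"Estimated injection volume: {volume_mm3:.1f} mm^3")
```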


BotWare

Semi-Autonomous Robot Control System

2003-Present

Since 2003, I’ve been developing successive generations of a relatively compact semi-autonomous robot, featuring on-board PCs, various localization sensors, cameras, Wi-Fi, and more. [more details on my hardware projects page]

The software side of the project entails on-board software for the robots (running on both the x86 PC and the microcontroller(s)) and software for a remote base station, enabling basic path planning / obstacle avoidance and teleoperation (via Xbox controller).
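A minimal sketch of how teleoperation and a basic obstacle-avoidance override can be combined (the sensor, motor, and joystick interfaces below are hypothetical stand-ins, not BotWare's actual code):

```python
# Hypothetical sketch of a teleoperation loop with an obstacle-avoidance
# override: joystick commands drive the robot unless a forward range sensor
# reports an obstacle too close, in which case forward motion is blocked.
import time

STOP_DISTANCE_M = 0.4     # assumed safety threshold

def read_forward_range_m() -> float:
    """Stand-in for the robot's forward-facing range sensor."""
    return 1.0

def read_joystick() -> tuple[float, float]:
    """Stand-in for the Xbox controller: (forward, turn) in [-1, 1]."""
    return 0.5, 0.0

def set_motor_speeds(left: float, right: float) -> None:
    """Stand-in for the motor controller interface."""
    print(f"L={left:+.2f} R={right:+.2f}")

while True:
    forward, turn = read_joystick()
    if read_forward_range_m() < STOP_DISTANCE_M and forward > 0:
        forward = 0.0                     # block forward motion near an obstacle
    set_motor_speeds(forward + turn, forward - turn)
    time.sleep(0.05)
```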


TechConnect

Multi-Engine Web Browser

2003-2006

In 2003, Mozilla Firefox was a newcomer to the browser market that brought some interesting features but lacked compatibility with a number of websites, while Microsoft’s Internet Explorer was essentially snoozing between IE 6 (2001) and IE 7 (2006).

I decided to create a browser that used both Microsoft’s Trident and Mozilla’s Gecko rendering engines for optimal performance and compatibility. TechConnect featured tabbed browsing, customizable toolbars, an RSS news client, download manager, popup blocker, malware scanner, media player, instant messaging client (AIM + MSN), HTML editor, and more.

My aim was to create a single application that could handle a good chunk of my personal internet workflows, and I ended up using TechConnect on a daily basis for several years.