March – July 2017
We were invited by the Imperial College Advanced Hackspace (ICAH) to create an interactive experience to represent the process of ideation. The team came up with a series of machines that transported ping pong balls, representing ideas, through the space. These 7 machines filled a 1600 square foot room during the Imperial Festival.
I was the lead designer on the third machine, 'Bounce'. Leading part of a project of this size on a tight timeframe developed my management skills, and I learned a great deal about working within a team to deliver a large installation.
Continuum will also be shown at the Victoria and Albert Museum Late on the 28th July, from 6:30 to 10 PM.
May - July 2017
Re~master is a collaboration with Sabina Weiss. We worked to create a seamless user experience for the digital creation of embroidery from hand-drawn sketches. The embroidery paths were generated by the software and sent to an embroidery machine.
Key technical skills involved were computer vision (accomplished using OpenCV with Python) to digitise the sketch, and custom graph algorithms for generating the embroidery path from the sketch. This technology is used to preserve and enhance a traditional hand craft.
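The custom graph algorithm isn't specified here, but one minimal sketch of the idea, assuming the digitised strokes have been reduced to a graph of stitch points (nodes) and stroke segments (edges), is a depth-first traversal that emits a single continuous needle path, retracing edges so the thread never has to jump (the function name and graph representation are illustrative assumptions, not the actual Re~master code):

```python
def embroidery_path(edges, start):
    # Build an adjacency list from the stroke graph.
    adj = {}
    for a, b in edges:
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)

    path = [start]
    visited = set()

    def walk(node):
        for nxt in adj.get(node, []):
            edge = frozenset((node, nxt))
            if edge not in visited:
                visited.add(edge)
                path.append(nxt)
                walk(nxt)
                path.append(node)  # retrace, so the path stays continuous

    walk(start)
    return path
```

Retracing keeps every move a valid stitch between adjacent points, at the cost of sewing some strokes twice; a production version would minimise retraced edges.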
Re~master was shown in the Royal College of Art Graduate Show 2017.
Autonomous Snack Delivery Android (ASDA)
May - June 2017
For our third year group project, I worked for 6 weeks as part of a team of 8 to develop an autonomous delivery system for snacks within an indoor environment. The robot would autonomously navigate downstairs to the café, place an order by voice, and return upstairs. We accomplished this task, successfully making our first order in the café in front of several excited onlookers.
My main role in this project was developing the system to push the lift button. This involved computer vision and visual servoing to manipulate the arm. I also designed several pieces of mechanical hardware for the project, including a tilting mount for the Kinect. This project allowed me to become familiar with ROS, as well as to further my computer vision skills.
October 2016 - April 2017
Eurobot is one of the leading student robotics competitions in Europe. As part of my role as Chair of the Imperial College Robotics Society, I pushed for the society to enter the competition for the first time in 3 years. The goal of the competition is to develop a robot that can autonomously navigate around an arena and collect and move various items.
In addition to supporting the project from a management position, I also designed and manufactured the two-axis grabber for the robot and wrote the computer vision algorithm for localisation.
At the PennApps XIII hackathon, I worked in a team of 4 to implement a research algorithm for transmitting data out of an air-gapped computer using the RAM bus. Without the computer having any connection to the outside world, we could transmit data at 2 bits per second to a software defined radio (SDR) 20 cm away.
I worked on the receiving algorithm for this project. It pushed the limits of my signal processing knowledge: the data received from the SDR passed through 4 stages of processing (binning, running average, differentiation, thresholding), which resolved the transmission's power levels into a digital signal that could then be reconstructed into ASCII.
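The four stages can be sketched in plain Python. This is a simplified illustration of the chain described above, not the original SDR code; the function names and toy parameters are my own:

```python
def bin_power(samples, bin_size):
    # Stage 1: binning. Average raw power samples into fixed-size bins.
    return [sum(samples[i:i + bin_size]) / bin_size
            for i in range(0, len(samples) - bin_size + 1, bin_size)]

def running_average(values, window):
    # Stage 2: smooth the binned signal with a trailing moving average.
    return [sum(values[max(0, i - window + 1):i + 1]) / (i - max(0, i - window + 1) + 1)
            for i in range(len(values))]

def differentiate(values):
    # Stage 3: first difference, which highlights power-level transitions.
    return [b - a for a, b in zip(values, values[1:])]

def threshold(values, level):
    # Stage 4: convert the differentiated signal into a digital bit stream.
    return [1 if v > level else 0 for v in values]
```

For example, a burst of high power after a quiet period shows up as a run of 1s after thresholding, which a final stage can decode back into bits and then ASCII.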
Winner of Grand Prize at PennApps XIII
At the Fishackathon, supported by the US Department of State, Fu Yong Quah and I created an automated system for measuring the length of fish. By placing a fish upon a printed piece of paper, any smartphone could be used to quickly measure the fish and log its size along with a photo. Quickly sizing fish is an important problem, as fish sizes are used to determine the health of fish stocks.
I worked on the computer vision for this project, including integrating the use of the Chilitags library for scaling the recorded sizes accurately and quickly. We won the local Fishackathon, and our project was then entered into the global competition, which we then also won.
Winner of the Global Fishackathon 2015, Judges Choice
FPGA Passive Autofocus
The final project of 1st year EIE is to create a computer vision based project using an FPGA. After using the provided cameras with the FPGAs for a short period of time, my project partner and I became frustrated with having to constantly refocus the camera. We decided to design an autofocus as our project!
The autofocus supported several modes, including 'click to focus', which focussed on the area you clicked on, and 'auto re-focus', which would automatically refocus when the scene changed. The autofocus was passive, meaning that it used no additional sensors, and depended solely on the image provided by the camera sensor.
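The description doesn't detail the focus metric, but a common passive approach, sketched here in Python rather than FPGA logic as a plausible illustration, is a contrast measure (a sharp image has strong local gradients, a defocused one doesn't), with the servo sweeping lens positions to find the peak score:

```python
def focus_measure(image):
    # Sum of squared horizontal and vertical pixel differences:
    # higher score means stronger local gradients, i.e. a sharper image.
    score = 0
    for y in range(len(image) - 1):
        for x in range(len(image[0]) - 1):
            dx = image[y][x + 1] - image[y][x]
            dy = image[y + 1][x] - image[y][x]
            score += dx * dx + dy * dy
    return score

def best_focus(frames):
    # Sweep over lens positions and pick the frame with the highest contrast.
    return max(range(len(frames)), key=lambda i: focus_measure(frames[i]))
```

'Click to focus' then amounts to evaluating the same measure over only the clicked region, and 'auto re-focus' to re-running the sweep when the measure drops.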
I worked on both the computer vision and the mechanical design for this project, including laser cutting gears and mounting the servo to the camera.
Winner of the First Year Group Project Prize
2012 - 2014
AirPi is a weather and air quality monitoring add-on for the Raspberry Pi. Developed in response to a competition to 'build something that will make the world a better place', it is more affordable and accessible than comparable solutions. Being based upon the Raspberry Pi also allows extensive customisation of the hardware and software, which is all open source.
Over 1000 units of the AirPi have been sold over the 2 year span it was available, turning over more than $100,000. The AirPi website has received over 1 million pageviews from more than 300,000 people.
Winner of PA Consulting Raspberry Pi Competition 2013
The PiCycle is a modified bicycle with diffused LEDs on the handlebars, allowing the bike to present information to the rider. The default configuration uses the leftmost and rightmost LEDs to give GPS guided navigation: upon approaching a left turn, the leftmost LED turns on, and vice versa for a right turn.
For the turn by turn navigation, the route is calculated using an online API, and is then returned as a series of waypoints. The software then calculates the distance from the next waypoint and determines when to engage the LED based on the cycling speed.
The software also includes a plugin infrastructure, so displaying information on the centre 3 LEDs would be simple - for example, weather or air pollution data.
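The waypoint logic described above can be sketched as follows. This is a simplified illustration, not the actual PiCycle code; the routing API is omitted, and the 5-second warning window is an assumed parameter:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two GPS coordinates.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_signal(distance_m, speed_mps, warn_seconds=5.0):
    # Light the turn LED once the rider is within warn_seconds of the waypoint,
    # so faster riders get the signal at a proportionally greater distance.
    return distance_m <= speed_mps * warn_seconds
```

Scaling the trigger distance by speed is the key design choice: a fixed distance would warn a fast rider too late and a slow rider too early.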
Winner of Best in Show at Young Rewired State 2013
SmartMove is a house-hunting app that incorporates various streams of data to allow you to find areas to live in that fulfil your criteria - for example, affordable housing or good schools. You can rank your priorities, and the quality of housing is then overlaid onto a shaded map of London.
The app also allows you to see houses and apartments on the market in areas that you are interested in. From there, you can click through to see the house details on the agency website.
I developed the front end iOS app for this project, which was built in one week for Young Rewired State.
Winner of Best in Show at Young Rewired State 2012