Tom Hartley

Design Information Engineer

Hey, I'm Tom! I've been creating projects with technology since 2011. I graduated with First Class Honours from the Electronic and Information Engineering MEng at Imperial College, where I developed a wide set of skills through solo and collaborative projects. I then joined the Innovation Design Engineering joint master's programme to explore new directions and push myself to bring innovation design into my practice.

My work seamlessly blends software, hardware and design. I have experience realising ideas using a wide variety of tools and platforms - from 3D modelling and computer vision, to electronics and human-centred design. I love collaborating with designers to bring ideas and creations to life.

If you want to get in contact with me, the best way is to email tom at this site, or on Twitter @tmhrtly. I'd love to hear from you!

Sony D&AD

July 2019

I was approached by Sony, who were looking to add a technological ‘pop’ to their upcoming installation for the D&AD Festival, attended by over 2000 leaders, professionals and students in the design industry. Their stand was focussed on promoting a new fashion watch, the FES Watch U, with an e-paper screen and strap. I was invited to develop two new interactions for visitors to their stand to experience.

The first interaction was a computer vision system which could automatically detect which of the 1440 posters was placed in front of it and display this information on a screen in the stand.
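
The recognition approach isn't detailed above, but one common way to match a camera frame against a fixed set of known artworks is a perceptual hash. The sketch below is a hypothetical illustration of that idea using a difference hash ('dHash') in NumPy; the function names and the 8x8 hash size are my own choices, not the installation's actual implementation.

```python
import numpy as np

def dhash(image, size=8):
    """Difference hash: block-average the greyscale image down to
    (size x size+1), then compare horizontally adjacent cells to get
    a size*size bit fingerprint that is robust to small noise."""
    h, w = image.shape
    rows = np.linspace(0, h, size + 1, dtype=int)
    cols = np.linspace(0, w, size + 2, dtype=int)
    small = np.array([[image[rows[i]:rows[i + 1], cols[j]:cols[j + 1]].mean()
                       for j in range(size + 1)]
                      for i in range(size)])
    return (small[:, 1:] > small[:, :-1]).flatten()

def identify(query, reference_hashes):
    """Return the index of the reference poster whose hash is closest
    (smallest Hamming distance) to the query image's hash."""
    q = dhash(query)
    distances = [np.count_nonzero(q != r) for r in reference_hashes]
    return int(np.argmin(distances))
```

At query time the live frame is hashed once and compared against all precomputed reference hashes, which stays fast even for 1440 posters because each comparison is just a short Hamming distance.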

The second interaction invited festival attendees to design their own watch strap, which was then digitised and displayed on a screen below a second stand. This tied into a competition in which the three best watch designs could win a watch.


April 2019

Tuonge is a USSD/SMS based platform designed for parents to easily access sexual and reproductive health (SRH) information and tips on how to approach SRH issues with their children. The platform adopts a model of gradual learning, with content curated by gender, age, and subject. A support hotline feature also connects parents with existing local community health services and workshops.

Project completed with Antonia Jara Contreras, Nelofar Taheri, Thomas Hartley, Yuying Zhang, Kennedy Kioli, and Eliud Munuve. GoGlobal programme organised by Imperial College, the Royal College of Art, Nairobi Design Institute, and Gearbox.


November 2018

Invited to create a 'gizmo', I took the spirograph - a children's toy - as a starting point. I was inspired by the hypnotic nature of the patterns that it produced, and wanted to bring them to life through automation. I prototyped with pen drawing, but a desire for continual operation led me to explore reusable media. Using sand, I realised that the marking could be achieved with magnets and the mechanism could be perfectly hidden underneath. I designed a custom r-theta rotational mechanism which was able to sketch continual spirals across the entirety of the sand base, and I turned a custom wooden case for the project on a lathe.

The inspiration for the patterns generated by my Gizmo was the spirograph - a simple children's toy.


February 2019

For this project, I set out to create a chair with an unusual, playful material - swimming pool noodles. I pushed the noodles to their limit, cutting, warping, joining and shaping them to give them a form completely unique and distinct from its origins. I used this project as a way to explore and develop an experimental process that allowed me to work fluidly within an unfamiliar domain.


June 2018

Spacescape came to fruition as an installation for Electromagnetic Field 2018, a festival of 'geekery in an internet-connected field'. In partnership with the Imperial College Advanced Hackspace, we designed and built a portable, single-person escape room. After EMF, its final destination was a breakout space within the Hackspace, so we needed to ensure that it was reliable and would automatically reset with no human intervention. To keep it portable, we manufactured it as two separate parts that could be combined in the field.

The escape room was constructed in two parts which could be easily transported to the festival, before being reassembled onsite.

Built as a team of four, we blended and pushed all of our skills to bring it together.

Museum Alive - Augmented Reality in Museums

Jan – May 2018

As my final year project, I created an app that used computer vision technology to automatically add context to the artworks that sit on the walls of museum galleries. Research showed me that many of the subtleties and much of the context of these pictures go unnoticed by visitors. With Museum Alive, a solo project, I aimed to shine a light on some of this context and background, making the museum-going experience more fun.

By holding your phone up to the image, augmented reality adds markers to the image at points of interest. These are stored in a database, which can be updated live at any point. Museums can also permit users to add their own comments to the images, building onto the curated comments. User submissions are shown separately to curated comments, and can be moderated by museum staff to remove potentially offensive content. I designed and implemented the project at the National University of Singapore, where I studied for the fourth and final year of my Imperial College undergraduate degree.


March – July 2017

We were invited by the Imperial College Advanced Hackspace (ICAH) to create an interactive experience to represent the process of ideation. The team came up with a series of machines that transported ping pong balls, representing ideas, through the space. These 7 machines filled a 1600 square foot room during the Imperial Festival.

I was the lead designer on the third machine, 'Bounce'. A project of this scale with a tight timeframe developed my management skills, and I learned a lot about working within a team environment to deliver an installation of this scale.

Continuum will also be shown at the Victoria and Albert Museum Late on the 28th July, from 6:30 to 10PM.

Over 4000 people experienced Continuum during the two days the exhibition was open.

A sketch of the machine, made during the concept phase of the project. Credit: Larasati

Putting the finishing touches on the project before it went on show.

The complete design as it was shown in the installation.

Many children enjoyed interacting with the machines.


May - July 2017

Re~master is a collaboration with Sabina Weiss. We worked to create a seamless user experience for the digital creation of embroidery from hand-drawn sketches. The embroidery paths were generated by the software and sent to an embroidery machine.

Key technical skills involved were computer vision (accomplished using OpenCV with Python) to digitise the sketch, and custom graph algorithms for generating the embroidery path from the sketch. This technology is used to preserve and enhance a traditional hand craft.
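
The graph algorithms themselves aren't specified above, but one textbook way to turn a set of digitised strokes into a single continuous stitch path is to treat the strokes as edges of a graph and compute an Eulerian trail, so the needle covers every stroke exactly once without retracing. A minimal sketch of Hierholzer's algorithm, assuming the stroke graph admits such a trail (it is connected with at most two odd-degree junctions):

```python
from collections import defaultdict

def stitch_path(edges):
    """Hierholzer's algorithm: return one trail that traverses every
    stroke (edge) exactly once. Each edge is a pair of junction ids."""
    adj = defaultdict(list)
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    # an Euler trail must start at an odd-degree junction, if any exists
    start = next((n for n in adj if len(adj[n]) % 2 == 1), next(iter(adj)))
    stack, trail = [start], []
    while stack:
        v = stack[-1]
        if adj[v]:
            u = adj[v].pop()
            adj[u].remove(v)  # consume the edge in both directions
            stack.append(u)
        else:
            trail.append(stack.pop())
    return trail[::-1]
```

In the real pipeline each junction would carry its pixel coordinates, so the returned trail maps directly to a sequence of needle positions.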

Re~master was shown in the Royal College of Art Graduate Show 2017.

Autonomous Snack Delivery Android (ASDA)

May - June 2017

For our third year group project, I worked for 6 weeks as part of a team of 8 to develop an autonomous delivery system for snacks within an indoor environment. The robot would autonomously navigate downstairs to the café, place an order using voice, and return upstairs. We accomplished this task, successfully making our first order in the café in front of several excited onlookers.

My main role in this project was developing the system to push the lift button. This involved computer vision and visual servoing to manipulate the arm. I also designed several pieces of mechanical hardware for the project, including a tilting mount for the Kinect. This project allowed me to become familiar with ROS, as well as furthering my computer vision skills.
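
As a minimal sketch of what image-based visual servoing with a proportional controller can look like: the button is located in the camera frame, and the arm is nudged by a fraction of the remaining image error each iteration. The callbacks, gain, and pixel target below are placeholders, not the actual ROS interface we used:

```python
def visual_servo(get_button_pixel, move_arm, target=(320, 240),
                 gain=0.4, tol=2.0, max_steps=100):
    """Image-based visual servoing: repeatedly measure the button's
    pixel position, compute the error to the desired image location,
    and command a small arm motion proportional to that error until
    the button is centred in the camera frame."""
    for _ in range(max_steps):
        px, py = get_button_pixel()
        ex, ey = target[0] - px, target[1] - py
        if (ex * ex + ey * ey) ** 0.5 < tol:
            return True  # converged: button centred, ready to press
        move_arm(gain * ex, gain * ey)  # P-control step in image space
    return False
```

Because each step cancels a fixed fraction of the remaining image error, the error shrinks geometrically and the loop converges without needing a precisely calibrated camera-to-arm model.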

Robot DE NIRO could order a variety of snacks using speech synthesis.

My main contribution to the project was the computer vision and motion planning to call the lift autonomously.

The team who worked on this project.


Jan 2016

At the PennApps XIII hackathon, I worked in a team of 4 to implement a research algorithm for transmitting data out of an air-gapped computer using the RAM bus. Without the computer having any connection to the outside world, we could transmit data at 2 bits per second to a software defined radio (SDR) 20 cm away.

I worked on the receiving algorithm for this project. I pushed the limits of my signal processing knowledge, sending the data received from the SDR through 4 different stages of processing (binning, running average, differentiation, thresholding). This allowed us to distinguish the power levels of the transmission, giving a digital signal that could then be reconstructed into ASCII.
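
To illustrate the idea, here's a much-simplified sketch of that style of receiver chain on synthetic power data. It keeps the running-average and binning stages but stands in a simple midpoint threshold for the differentiation stage; all parameter values are illustrative, not the ones we actually used:

```python
import numpy as np

def recover_bits(power, samples_per_bit, smooth=5):
    """Simplified receiver: running-average the raw power samples,
    bin them into one value per bit period, then threshold at the
    midpoint between the observed low and high power levels."""
    # running average to suppress per-sample noise
    kernel = np.ones(smooth) / smooth
    smoothed = np.convolve(power, kernel, mode="same")
    # binning: mean power over each bit period
    n_bits = len(smoothed) // samples_per_bit
    binned = (smoothed[:n_bits * samples_per_bit]
              .reshape(n_bits, samples_per_bit)
              .mean(axis=1))
    # threshold halfway between the two power levels
    threshold = (binned.min() + binned.max()) / 2
    return (binned > threshold).astype(int).tolist()
```

Once the bit stream is recovered, grouping it into 8-bit chunks reconstructs the transmitted ASCII, as in the original project.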

Winner of Grand Prize at PennApps XIII


June 2015

At the Fishackathon, supported by the US Department of State, Fu Yong Quah and I created an automated system for measuring the length of fish. With the fish placed on a printed piece of paper, any smartphone could be used to quickly determine its size and log it along with a photo. Quickly sizing fish is an important problem, as fish sizes are used to determine the health of fish stocks.

I worked on the computer vision for this project, including integrating the Chilitags library for scaling the recorded sizes accurately and quickly. We won the local Fishackathon, and our project was then entered into the global competition, which we also won.
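
The core of the measurement is a scale factor recovered from the printed fiducial: Chilitags reports the marker's corner positions in pixels, and the marker's printed size is known, so pixel lengths convert directly to real units. A toy sketch of that conversion (the 50 mm marker size and function names are made-up examples, not the app's actual values):

```python
def fish_length_mm(fish_pixels, marker_pixels, marker_mm=50.0):
    """Convert a fish's measured pixel length into millimetres using
    a fiducial marker of known printed size in the same image.
    Assumes the fish and marker lie in roughly the same plane, so a
    single scale factor applies to both."""
    pixels_per_mm = marker_pixels / marker_mm
    return fish_pixels / pixels_per_mm
```

For instance, a fish spanning 900 px next to a 150 px wide marker would measure 300 mm.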

Winner of the Global Fishackathon 2015, Judges Choice

FPGA Passive Autofocus

June 2015

The final project of 1st year EIE is to create a computer vision based project using an FPGA. After using the provided cameras with the FPGAs for a short period of time, my project partner and I became frustrated with having to constantly refocus the camera. We decided to design an autofocus as our project!

The autofocus supported several modes, including 'click to focus', which focussed on the area you clicked on, and 'auto re-focus', which would automatically refocus when the scene changed. The autofocus was passive, meaning that it used no additional sensors and depended solely on the image provided by the camera sensor.
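
Passive autofocus of this kind boils down to a contrast measure: a well-focused image has strong local intensity gradients, so the servo can sweep the lens and keep the position with the highest score. A hypothetical sketch of that idea (the metric and exhaustive sweep are my own simplification, not our FPGA implementation):

```python
import numpy as np

def sharpness(image):
    """Contrast-based focus measure: mean squared intensity gradient.
    This peaks when the image is in focus - no extra sensors needed."""
    gx = np.diff(image, axis=1)
    gy = np.diff(image, axis=0)
    return float((gx ** 2).mean() + (gy ** 2).mean())

def autofocus(capture, positions):
    """Sweep the focus servo over candidate positions and return the
    one whose captured frame maximises the sharpness measure."""
    return max(positions, key=lambda p: sharpness(capture(p)))
```

A real implementation would use a coarse-to-fine search rather than a full sweep, and 'click to focus' simply restricts the metric to the region around the clicked pixel.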

I worked on both the computer vision and the mechanical design for this project, including laser cutting gears and mounting the servo to the camera.

Winner of the First Year Group Project Prize


2012 - 2014

AirPi is a weather and air quality monitoring add-on for the Raspberry Pi. Developed in response to a competition to 'build something that will make the world a better place', it is more affordable and accessible than comparable solutions. Being based upon the Raspberry Pi also allows extensive customisation of the hardware and software, which is all open source.

Over 1000 units of the AirPi have been sold over the 2 year span it was available, turning over more than $100,000. The AirPi website has received over 1 million pageviews from more than 300,000 people.

Winner of PA Consulting Raspberry Pi Competition 2013

A talk about AirPi that I gave in 2016 for the London Open Source Hardware User Group.


June 2013

The PiCycle is a modified bicycle with diffused LEDs on the handlebars that allow the bike to present data to the rider. The default configuration uses the leftmost and rightmost LEDs to give GPS guided navigation: as the rider approaches a left turn, the leftmost LED turns on, and likewise the rightmost LED for right turns.

For the turn-by-turn navigation, the route is calculated using an online API and returned as a series of waypoints. The software then calculates the distance to the next waypoint and determines when to light the LED based on the cycling speed.
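
As a sketch of that trigger logic: the distance to the waypoint comes from a haversine on the two GPS fixes, and the LED lights once the rider is within a fixed warning window of travel time at their current speed. The eight-second window below is an illustrative assumption, not the PiCycle's actual tuning:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * asin(sqrt(a))

def should_signal(position, waypoint, speed_ms, warning_s=8.0):
    """Light the turn LED once the rider is within `warning_s` seconds
    of the next waypoint at their current speed."""
    distance = haversine_m(*position, *waypoint)
    return distance <= speed_ms * warning_s
```

Scaling the trigger distance by speed means a fast rider gets the signal earlier than a slow one, giving a consistent amount of warning time.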

The software also includes a plugin infrastructure, so displaying other information - for example, weather or air pollution data - on the centre three LEDs would be simple.

Winner of Best in Show at Young Rewired State 2013


June 2012

SmartMove is a house-hunting app that incorporates various streams of data to help you find areas to live in that fulfil your criteria - for example, affordable housing or good schools. You can rank your priorities, and the quality of housing is then overlaid onto a shaded map of London.

The app also allows you to see houses and apartments on the market in areas that you are interested in. From there, you can click through to see the house details on the agency website.

I developed the front end iOS app for this project, which was built in one week for Young Rewired State.

Winner of Best in Show at Young Rewired State 2012