Ling Tan is a designer, maker, and coder interested in how people interact with the built environment and with wearable technology. Trained as an architect, she enjoys building physical machines and prototypes, ranging from the urban scale to the wearable scale, to explore different modes of interaction between people and their surrounding spaces. Her work sits at the intersection of wearable technology, the Internet of Things (IoT), and citizen participation. She currently works at Umbrellium in London, where she investigates social wearables through community participation and leads and produces projects such as WearON, an open-source prototyping platform for wearables, and WearAQ, a series of wearable tools for exploring air quality issues through people's subjective perception. She has participated as an artist in residence at various festivals, including the Fak'ugesi African Digital Innovation Festival 2015, where she engaged residents of Johannesburg in mapping their perception of safety, using wearables as an expressive and social interface. Through that work she explores the complex issues surrounding the safety of the city, touching on demographics, race, gender, and people's subjective experience of the city.
As an artist, she is currently supported by FutureEverything's FAULT LINES programme and by Barbican and The Trampery's alt.barbican programme. She has worked with museums such as the Wits Art Museum in South Africa and Watermans Art Centre in the UK. Her work has been exhibited in shows such as Utopian Bodies: Fashion Looks Forward (2015) and featured in magazines and websites across the globe, including Dezeen, Wired and Fast Company.
Co-Scriptable Bodies: Wear AQ
Single Channel Video
WearAQ is an experimental project that explored how school children make sense of complex issues around air pollution, and considered how we might combine our innate subjective perception and intuition with wearable technology and machine learning algorithms to investigate air quality issues. Students at Marner Primary School in Tower Hamlets, London, went out into the surrounding neighbourhood to measure air quality both technologically and through their own perceptions, recording their subjective experience using low-tech wearable devices that catalogued their gestures. This data was compared with measurements from expensive, highly calibrated pollution monitoring equipment, and with other data such as temperature, wind and humidity, to look for correlations and contrasts. The perceptual data were then used in various data science experiments. The results were revealing: our machine learning model achieved 8/8 correct predictions on the students' perceptual data, and the recorded perceptual data matched the readings from the mobile pollution monitoring equipment in 6 out of 8 cases. We recognised that the dataset was small, but it was adequate for a first prototype, and the experiment showed a correlation between perceptual data and actual air quality measurements.
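To make the kind of comparison described above concrete, here is a minimal sketch of matching perceptual labels against environmental readings with a simple nearest-neighbour model. Everything in it is hypothetical: the feature set (temperature, humidity, PM2.5), the gesture-derived labels, and the data values are invented for illustration, and the project's actual model and dataset are not described in this text.

```python
import math

# Hypothetical records: (temperature °C, humidity %, PM2.5 µg/m³) paired with
# a gesture-derived perceptual label. All values are invented for illustration.
train = [
    ((14.0, 60.0, 8.0), "clean"),
    ((15.0, 55.0, 35.0), "polluted"),
    ((13.0, 70.0, 6.0), "clean"),
    ((16.0, 50.0, 40.0), "polluted"),
]

def predict(features):
    """Predict a perceptual label from sensor readings via 1-nearest-neighbour."""
    return min(train, key=lambda rec: math.dist(rec[0], features))[1]

# Evaluate agreement between model predictions and recorded perceptual labels,
# mirroring the accuracy-style comparison (e.g. 6/8, 8/8) used in the project.
test = [
    ((14.5, 58.0, 9.0), "clean"),
    ((15.5, 52.0, 38.0), "polluted"),
]
correct = sum(predict(f) == label for f, label in test)
print(f"{correct}/{len(test)} predictions match perceptual labels")
```

The point of the sketch is only the shape of the evaluation: each perceptual report becomes a label, each walk segment a feature vector of instrument readings, and accuracy is the fraction of held-out reports the model reproduces.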