collimation [2015-16]



The installation is composed of two microscopes watching and recording each other, and a three-screen projection which visualizes the process of interpretive analysis occurring within the software.
The system acts as a rudimentary form of AI: the visual stimuli are translated, in a performative act of seeing, into the centre image (composed of the two microscope feeds concatenated into one).
The resulting data is then transposed to the left-hand image, which takes the form of a neuron and is responsive in both its emergent growth and its behaviour.
The data chain continues, flowing to the right-hand image of a neuron. In a "mirror neuron" scenario, this image is influenced by the actions of the first, yet independently reacts and generates its own growth patterns and behaviours.
The audio, created by the apparatus itself and coupled with a composed score, is introduced into the system, where it too acts upon the behaviour and responsivity of the images.
The system's behaviour is, in a mimetic sense, reflective of several kinds of processes that operate through acts of translation and analysis. The parsing of information, existing as it does at the very foundations of embodied cognition, is central to our understanding of the bodies, networks and ecosystems in which we exist.
In the installation, the visual instantiation of complex notions of being collides with features of surveillance, and extends further into cartographic renderings spanning from the microscopic, in the form of neuronal imagery, to the macro, alluding to the mapping and visualization of connectivity (through data analytics) within socio-geo-political bodies.
credits: Elie Zananiri - code/software development / Dix2 - fabrication/engineering / Ian Ilavsky + Brad Todd - audio / Brad Todd - conception/design