THIS PROJECT EXPLORES THE CAPACITY OF DIGITAL ART TO CREATE UNCANNY VISIONS OF TECHNOLOGY ANTHROPOMORPHISED THROUGH THE DEVELOPMENT OF ARTIFICIAL INTELLIGENCE (AI). REFERENCING THE PAST, PRESENT AND FUTURE OF AI, THIS PROJECT SUBVERTS ITS FUNCTION AND BLURS THE LINE SEPARATING COMPUTER FROM MAN.
Continuing from my post about my experiments with chatbot technology and cloning through video cams (https://what-do-you-think.blog/2020/07/29/i-am-analogue/), I also wanted to show you the exploration I did into digital consciousness. As technology is pushed further, we come closer and closer to machines that can mimic human behaviour uncannily. But what is consciousness, and is it possible for 1s and 0s to feel? These are some visual explorations of that question.
‘Computer Defines Consciousness’ 2020
Size: max 1 min long each / 6 mins total
Media/process: A collection of short videos
Link: https://www.youtube.com/watch?v=gglRBMlHQEs&list=PLHEO4V8l_efOtHjsW9O6tRzNGDVWoN-PT&index=2
One of the topics I wanted to address within my project was the possibility of AI experiencing 'consciousness' (a question that echoes the Turing Test). One of the primary distinctions between bot and human is the ability to experience a subjective and emotional reality. Through this series of videos I attempt to make the subjective reality of 'feeling' as objective as possible.
To do this, I used Adam King's neural network https://talktotransformer.com/, which uses a language model to predict the next word in a text. I made a transcript of the various definitions it came up with for consciousness, life and so on, until I decided to focus on the six primary emotions. I then used text-to-voice software to create an AI voice reading them aloud. The unpredictability of the exercise makes for a humorous take on very heavy subject matter.
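For anyone curious about the mechanics: Talk to Transformer was built on OpenAI's GPT-2 language model, so you can roughly recreate the process yourself. Here is a minimal sketch using the Hugging Face transformers library for the generation and the pyttsx3 package for the voice. The prompt and settings below are my own illustrative choices, not the exact ones behind the videos.

```python
# Rough sketch: generate a machine-written "definition" with GPT-2
# (the model behind Talk to Transformer) and have a synthetic voice read it.
# The prompt and generation settings are illustrative only.
from transformers import pipeline
import pyttsx3

generator = pipeline("text-generation", model="gpt2")

prompt = "Happiness is the emotion that"
result = generator(prompt, max_length=40, num_return_sequences=1,
                   do_sample=True)  # sampling is what makes the output unpredictable
definition = result[0]["generated_text"]
print(definition)

# Text-to-voice: read the generated definition back in a computer voice.
engine = pyttsx3.init()
engine.say(definition)
engine.runAndWait()
```

Running it a few times with the same prompt gives a different, often absurd, definition each run, which is exactly the quality I was after.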
Furthering the translation of emotion into data, I created my own facial expression map based on research I did into behavioural psychology. These compositions became interesting in themselves, as the diagram format transforms such a seemingly abstract concept, emotion, into a scientific study. The contrasting colours add an amusing note.
‘I am ELIZA’ 2020
Size: 1 min 19 secs
Media/process: Video of performance
Link: https://www.youtube.com/watch?v=z_kALAB-Nsc
This is a performance piece which was my interpretation of the 1960s computer therapist, 'ELIZA'. I set up a table in the light and sound room, where I attempted to control all variables apart from the reaction of the participant. In this video Imogen Wilden takes the role of ELIZA, Hannah Urch is the 'patient' and I am behind the camera.
Originally I intended this to be a live performance, but I videoed it for the record. These were preliminary experiments intended to be developed further; lockdown prevented that, so instead I edited the footage numerous times. Here a human plays the role of a computer talking to another human, mixing up the machine with the analogue being.
These are the preliminary notes I made before the performance. There are instructions for myself (or whoever plays ELIZA) and separate instructions for the participant. I used phrases taken directly from the original programme, and the sketch below shows how that programme produced them. Having this preparation was key, as it gave the performance a framework.
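The original ELIZA was surprisingly simple: it matched keywords in the patient's sentence, swapped the pronouns around, and slotted fragments into canned templates. A minimal sketch of that mechanism in Python follows; the rules here are paraphrased in the style of the original DOCTOR script rather than copied from it.

```python
import re

# Swap first- and second-person words so the reply reads back naturally.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "you": "I", "your": "my"}

# A few illustrative ELIZA-style rules: regex pattern -> response template.
RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
    (r"(.*)", "Please go on."),  # catch-all, like the original's default replies
]

def reflect(fragment: str) -> str:
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

def eliza_reply(sentence: str) -> str:
    cleaned = sentence.lower().strip(".!?")
    for pattern, template in RULES:
        match = re.match(pattern, cleaned)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please go on."

print(eliza_reply("I am sad"))  # -> "How long have you been sad?"
```

Seeing how mechanical the trick is makes it all the stranger that patients in the 1960s confided in it, which is exactly the tension the performance plays on.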
These are plans for installations made in Adobe Dimension. The top one is a development of 'I am ELIZA', in which I would use my clone alongside ELIZA to create a therapy-style webcam piece where the visitor interacts with the bot from the therapy chair. The bottom one is an immersive experience called 'I am awake': a room full of data banks echoes with the audio of an AI voice reading out definitions of consciousness devised by Talk to Transformer.
If I were able to exhibit this project, this is how I envision it being curated in the space. Titled 'We are analogue', I would have: 'I am ELIZA' in the middle of the space; the 'We don't exist' portrait series hung in frames on the left wall; and the 'Computer defines consciousness' video series playing on televisions with headsets on the right wall.
In the next post I will share the contextual research, if you're interested 😉