The Nonhuman Turn

The readings were Grusin, Richard. "Introduction." In The Nonhuman Turn. University of Minnesota Press, 2015; and Chun, Wendy Hui Kyong. "Crisis, Crisis, Crisis; or, the Temporality of Networks." In The Nonhuman Turn (2015): 139–166.

I had not lifted the covers of writings like these until this term and the course on Computational Arts-based Research and Theory. This week is no easier: we have been given two texts to read and summarise. Grusin's introduction lays out the central theme of this collection of plenary addresses from the 2012 "Nonhuman Turn in 21st Century Studies" conference. Grusin examines the meaning of the word 'turn' and how it is used in the context of the nonhuman. The nonhuman is not restricted simply to machines but extends to 'animals, plants, organisms, climatic systems, technologies, or ecosystems'. He goes on to describe speculative realism: the first conference under that name was held at Goldsmiths in 2007, where a group of speakers challenged post-Kantian correlationism. Speculative realism defines itself 'loosely in its stance of metaphysical realism' and works to overturn anthropocentrism, the privileging of the human over the nonhuman, favouring distinct forms of realism against the dominant forms of idealism in so much of contemporary philosophy. He also describes how technology has accelerated the pace and intensity of the academic discourse around the 'turn', noting that in 2012 an estimated 51% of internet traffic was non-human.

In her presentation, Wendy Hui Kyong Chun argues that crisis both exceeds and is structurally necessary to networks. She contrasts the medium of television, citing catastrophe as central to commercial TV programming, with new media as a 'crisis machine'. She examines the many ways new media has affected us: how crises cut across our rules, even creating new norms. Many problems arise from the sheer speed of telecommunications, 'undermining the need for scholarly contemplation'. 'Crisis structures new media temporality'.

Donald Trump Twitter Chatbot and using Tensorflow

We understand that the old way of creating a chatbot was to build a substantial repository of questions and possible answers, creating the illusion of a conversation between the user and the computer. This approach is static, labour-intensive, and not a self-learning system.
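The flavour of that old, static approach can be sketched in a few lines. Everything here is invented for illustration – the patterns, the responses, and the function name are not from any real chatbot:

```python
# A minimal sketch of the "old way": a hand-written lookup table of
# patterns and canned responses. All entries are illustrative.

CANNED_RESPONSES = {
    "hello": "Hello! How can I help you?",
    "how are you": "I'm just a lookup table, but thanks for asking.",
    "bye": "Goodbye!",
}

def rule_based_reply(user_input, fallback="Sorry, I don't understand."):
    """Return a canned response if any known pattern appears in the input."""
    text = user_input.lower()
    for pattern, response in CANNED_RESPONSES.items():
        if pattern in text:
            return response
    return fallback  # a static system fails on anything unanticipated

print(rule_based_reply("Hello there"))     # matches the "hello" pattern
print(rule_based_reply("Tell me a joke"))  # falls through to the fallback
```

Every exchange the bot can hold has to be anticipated by hand, which is exactly the labour-intensive limitation a learning system avoids.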

More recently, AI seems to be in the news almost all the time – even this morning, BBC Radio 4 announced that over Christmas the Today show will be hosted by "AI". It is quite annoying that it is so much in the news; however, for our purposes, and to demonstrate AI as part of the "AI as a Tool for Art" topic to be presented at the end of term, I gingerly attempt to build/borrow/steal a working AI chatbot.

Without drowning in detail, a layered recurrent neural network is constructed as follows:

  1. Download a dataset (movie subtitle files or a Twitter feed, for example)
  2. Create a model
  3. Train the model
  4. Test the model
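Step 1 can be sketched in miniature: before any model is created, the raw text pairs have to be turned into the integer sequences a seq2seq network consumes. The toy corpus, the reserved token ids, and the function names below are my own illustrative choices, not code from easy_seq2seq:

```python
# Step 1 in miniature: build a vocabulary from question/answer pairs and
# encode sentences as integer ids. Toy data; real runs use thousands of pairs.

from collections import Counter

corpus = [
    ("how are you", "i am fine"),
    ("what is your name", "my name is bot"),
]

# Count every word on both sides of every pair, reserving low ids for
# padding and unknown words, as seq2seq preprocessing typically does.
counts = Counter(w for pair in corpus for side in pair for w in side.split())
vocab = {"<pad>": 0, "<unk>": 1}
for word in sorted(counts):
    vocab[word] = len(vocab)

def encode(sentence):
    """Map a sentence to the integer ids the network is trained on."""
    return [vocab.get(w, vocab["<unk>"]) for w in sentence.split()]

print(encode("how are you"))  # → [6, 3, 12]
print(encode("hello world"))  # unseen words map to <unk> → [1, 1]
```

The model in steps 2–4 then learns to map encoded questions to encoded answers; decoding reverses the vocabulary to produce text again.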

TensorFlow – Google's Python-based machine-learning framework – seemed the most productive approach. I installed it first on my Windows machine but soon realised it was much easier to use Linux, so I set up a virtual machine to install all the necessary Python libraries, and from GitHub I used the code from suriyadeepan/easy_seq2seq to start my experimentation.

The dataset I chose was Mr. Donald Trump's Twitter feed. I am already learning how to use seq2seq and will report back soon on my findings.

Week 2 of Art Based Research project

Timeline for Project at:

https://docs.google.com/spreadsheets/d/1wrPen6yY3jedji6wTFoX8n6p5K3wE0cZGJg-UUuI_9I/edit#gid=0

Boston Square mapping:
This is to help us think about and graph the relative positions of artworks, denoting on one axis each work's independence from the artist and on the other how much the work is machine-based.

x-axis: Human – Machine
y-axis: Machine independence – Manual
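The idea behind the square can be sketched as a simple coordinate scheme. The example works, their scores, and the quadrant labels below are invented placeholders, not entries from our actual chart:

```python
# A hedged sketch of the Boston Square: place each artwork at an (x, y)
# coordinate and name its quadrant. Works and scores are illustrative.

def quadrant(x, y):
    """x: 0 = fully human-made, 1 = fully machine-made.
       y: 0 = manual execution, 1 = fully independent of the artist."""
    horiz = "machine" if x >= 0.5 else "human"
    vert = "independent" if y >= 0.5 else "manual"
    return f"{horiz}/{vert}"

artworks = {
    "hand-painted portrait": (0.1, 0.1),
    "generative neural artwork": (0.9, 0.8),
}

for title, (x, y) in artworks.items():
    print(f"{title}: {quadrant(x, y)}")
```

Plotting these points on the shared Google Drawing then becomes a matter of reading off each work's (x, y) position.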

Located at:

https://docs.google.com/drawings/d/1QGvAiW3K8ciyIUhXkNf9dSB1PAxXaP1d8BS_-mRMvJ0/edit

 

PowerPoint presentation of my findings:

AI and Machine Art v1.0

 

EVERY THING AT ONCE

A very enjoyable exhibition of recent works at StoreX in Aldwych, London, in November 2017, held in conjunction with the Lisson Gallery.

I particularly enjoyed Ryoji Ikeda's installation, Test Pattern. It was an example of how far you can go with an immersive sound/light environment – it responded in real time to the movement of visitors in the space. I felt a bit queasy at the end; perhaps my brain was overloaded. A great experience, nonetheless.

I made a short video on my phone to record this and some of the other exhibits:

 

 

Algorithm

Last week we read the paper “Algorithm” by Andrew Goffey.

link to Goffey’s text

Goffey contributed this essay to explore different ways of defining the word. It is one of 36 essays in Software Studies: A Lexicon [1].

He starts the paper by offering the following definition of an algorithm (A.): "an unambiguous specification of how to solve a class of problems in a well-defined formal language [2] for calculating a function, for data processing or automated reasoning tasks". Goffey (G.) heads his paper with "Algorithm = logic + control." [3]
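Kowalski's slogan can be made concrete in a few lines. This is my own toy illustration, not an example from Goffey's essay: the same logic (the factorial relation n! = n × (n−1)!) implemented under two different control strategies.

```python
# "Algorithm = logic + control" in miniature: one logical specification,
# two control strategies. Both are correct; only the execution order differs.

def factorial_recursive(n):
    """Control: follow the definition top-down by recursion."""
    return 1 if n == 0 else n * factorial_recursive(n - 1)

def factorial_iterative(n):
    """Control: build the same product bottom-up in a loop."""
    result = 1
    for k in range(1, n + 1):
        result *= k
    return result

assert factorial_recursive(5) == factorial_iterative(5) == 120
```

The logic fixes what counts as a correct answer; the control decides how the machine arrives at it – which is why two very different programs can embody the same algorithm.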
Goffey points out that there is not only the scientist/mathematician's view of A. as a "theoretical entity … having an autonomous existence independent … of implementation details … in (class) libraries …", but also the software engineer's view of A. as having pragmatic efficiency in solving a particular problem.
G. points out that it is not enough to define algorithms as existing in a purely mathematical, abstract sense, for we cannot view A. in total isolation in the real world: they affect and are affected by the things around them. He goes on to point out specifically how A. depend on data, and how A. and data are useless without each other. Also, G. notes, the purely 'scientific' or 'software engineering' definitions do not adequately inform us about the social, cultural, and political roles algorithms play.

G. approaches the definition from a philosophical standpoint; he describes programming languages as 'syntactic artefacts': they exist as machines, a series of steps that produce an actual outcome. G. goes on to illustrate the problem of bridging the purely theoretical definition of A. and the 'real' physical world, referring in his paper to Foucault.
G. proposes that A. be thought of as a (series of) statement(s), as Foucault uses the term in The Archaeology of Knowledge [4]. That is too dense a work to do justice to in this short summary; nevertheless, the statement in the Foucauldian sense, applied to algorithms, would undermine the differentiation between the purely theoretical and the practical definitions. This returns us to the point that A. surely cannot exist in a void, independent of extrinsic factors.
The paper, written in 2006, also shows how rapidly the world is changing; coincidentally, Hinton, Osindero and Teh [5] published a paper in 2006 proposing a many-layered neural network acting as an unsupervised machine, fine-tuned with backpropagation.
This contrasts with rule-based algorithms, which are largely founded on formal logic, whereas the Hinton paper heralds the onset of machine learning, giving computers the ability to refine their algorithms with no human intervention. Goffey does not bring this into his essay – a shame, as he misses possibly the greatest leap in technology since the introduction of computers themselves.
The machine writes and re-writes its own algorithm with no human intervention. AI machines refining their own algorithms, with the near impossibility of deciphering what they have done or how, pose new questions. With AI, there is slim possibility of understanding how the machine-created algorithms work – a black box. Does this pose a threat to humans, or, possibly, will it save us and the planet?
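The idea of a machine refining itself can be shown in miniature. This toy (entirely my own, far simpler than a deep belief net) fits a single weight to the data y = 3x by gradient descent: the update rule is fixed by a human, but the learned value is not.

```python
# A toy illustration of a machine adjusting itself: gradient descent
# nudges one weight to fit y = 3x, with no human choosing the value.

data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]  # samples of y = 3x
w = 0.0    # the machine's initial guess
lr = 0.02  # learning rate

for step in range(200):
    # Gradient of the squared error with respect to w, summed over the data.
    grad = sum(2 * (w * x - y) * x for x, y in data)
    w -= lr * grad  # the update rule is fixed; the weight is learned

print(round(w, 3))  # converges close to 3.0
```

With one weight the result is transparent; with millions of weights adjusted the same way, the "black box" problem described above appears.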

[1] Software Studies: A Lexicon, edited by Matthew Fuller. The MIT Press, Cambridge, Massachusetts / London, England (2006).
[2]Rogers 1987:2
[3] Kowalski, Robert. "Algorithm = Logic + Control." Communications of the ACM, Volume 22, Issue 7, July 1979.
[4]Foucault, Michel. 1969. The Archaeology of Knowledge. Trans. A. M. Sheridan Smith. London and New York: Routledge, 2002. ISBN 0-415-28753-7.
[5]Hinton, G. E.; Osindero, S.; Teh, Y. W. (2006). “A Fast Learning Algorithm for Deep Belief Nets” (PDF). Neural Computation. 18 (7): 1527–1554. PMID 16764513. doi:10.1162/neco.2006.18.7.1527.

IS71076A Computational Arts-based Research (2017-18) Week 1

This page will be updated weekly as part of Computational Arts-based Research (2017-18)
given by Helen Pritchard as part of my MFA Computational Arts at Goldsmiths.

First week, we read and discussed Computational Aesthetics in The Practices of Art as Politics
Patricia Ticiento Clough, Queens College and The Graduate Center CUNY

Although a challenging paper on first reading, it offered a useful first glimpse of the course ahead. Notably, it was important to identify words and phrases used in the paper that we were at best uncertain of, if not totally unfamiliar with, including:
Ontology
Epistemology
Trauma
Potentiality
‘Calculative Ambience’
Subjectification
‘Ubiquitous calculation’
‘vicarious causality’
immanent
‘causal efficacy’
‘efficient causation’

Summary:

Computational Aesthetics in The Practices of Art as Politics

Patricia Ticiento Clough

Queens College and The Graduate Center CUNY

 

Patricia Clough (PC) delivered a tightly written and somewhat challenging paper for us to read and summarise as part of our first mini assignment. I read it carefully several times; the material is not the sort I am used to reading, but after several passes I have made some sense of it. The final two paragraphs helped, as they form a précis of the whole paper: the advent of digital technology has had a profound effect on how we evaluate and practise art in a political context. PC lays out several examples of how the practice of art is challenged not only by the rapid emergence of digital technologies but also by the intense commodification of human processes. She points out that these factors have affected the philosophical examination of art and the potentiality of objects, prompting a debate as to whether potentiality exists already in an object or whether it depends on the object's relation to other objects (possibly human).

Further examination of the change brought on by digital technology concerns aesthetics. Art's singular claim to aesthetics has been undermined by the commodification of human processes and the expansion of the digital. Art has become so intertwined with global market systems that it no longer serves as a means to inform or de-alienate us, serving more as a means to re-think itself politically, re-assert itself, and highlight the alienating and divisive effects of capitalism. PC quotes Claire Bishop, who also points out how art paradoxically dovetails perfectly with neoliberalism's recent forms.

PC discusses the opportunity that technology has given us tools to further explore the relationship of aesthetics, ontology and the term ‘calculative ambience’ (where calculation, action and materiality intertwine) via machine interfaces, big data and sensors, yielding a “mathematical seeing”.

PC then discusses the example of nanotechnology and big data, positing the differences between the primary and secondary qualities of objects in relation to object-oriented ontologies; she suggests this may ultimately displace the need for human intervention, which has been our only way to engage emotion (affect) through art as politics.

At this point I have to admit I became a bit overwhelmed – no doubt this is a great academic paper, but it became too complex, at least for me to understand. The paper concludes, however, that we have to re-evaluate aesthetics: all works can be considered works of art, and art is inexorably bound up with our ever-increasing commodification. She quotes Manning, for whom in art "we might glimpse the relational force of an eternal object coursing through the actual", and describes art as forms of play with objects, "with the mediatic spaces or the indeterminate, internal complexity of all objects or entities where incomputable probabilities are still real and present".

To realise other possibilities, art as politics must be interdisciplinary, incorporating philosophy, mathematics, science, media and technology, requiring all sorts of groupings and alliances.

Art as politics …”The practices of art as politics must lead the way, instructing us in how to play and with our play make the world anew”.