Last week we read the paper “Algorithm” by Andrew Goffey.

link to Goffey’s text

Goffey contributed this essay to explore different ways of defining the word. It is one of 36 essays in ‘Software Studies: A Lexicon’ [1].

He starts the paper by offering the following definition of an algorithm (A.): “an unambiguous specification of how to solve a class of problems in a well-defined formal language for calculating a function, for data processing or automated reasoning tasks” [2]. Goffey (G.) heads up his paper with Kowalski’s formula, “Algorithm = Logic + Control” [3].
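Kowalski’s formula can be made concrete with a small sketch (my own illustration, not Goffey’s): the logic component states *what* is to be established – here, whether a value occurs in a sorted list – while the control component decides *how* it is established. The two functions below share the same logic but use different control strategies:

```python
def contains_linear(sorted_list, x):
    # Control strategy 1: a simple linear scan, ignoring the ordering.
    for item in sorted_list:
        if item == x:
            return True
    return False

def contains_binary(sorted_list, x):
    # Control strategy 2: binary search, exploiting the ordering
    # to discard half the remaining candidates at each step.
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] == x:
            return True
        if sorted_list[mid] < x:
            lo = mid + 1
        else:
            hi = mid - 1
    return False
```

Both answer the same question for the same inputs; only the control – and hence the efficiency – differs, which is exactly the software engineer’s “pragmatic” concern Goffey contrasts with the mathematician’s abstract one.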
Goffey points out that there is not only the scientist’s/mathematician’s view of A. as a “theoretical entity … having an autonomous existence independent … of implementation details … in (class) libraries”, but also the software engineer’s view of A. as having “pragmatic efficiency to solve a particular problem”.
G. points out that it is not enough to define algorithms as existing in a purely mathematical, abstract sense – we cannot view A. in total isolation from the real world, for they affect and are affected by the things around them. He goes on to point out specifically how A. depend on data, and how A. and data are useless without each other. The purely ‘scientific’ or ‘software engineering’ definitions, G. adds, also fail to inform us about the social, cultural and political roles A. play.

G. approaches the definition from a philosophical standpoint; he describes programming languages as ‘syntactic artefacts’ that exist as machines – a series of steps performed to produce an actual outcome. G. goes on to illustrate the problem of bridging the purely theoretical definition of A. and the ‘real’, physical world, referring in his paper to Foucault.
G. proposes that A. be regarded as a (series of) statement(s), in the sense Foucault gives the term in ‘The Archaeology of Knowledge’ [4]. That work is too dense to unpack in this short summary; nevertheless, thinking of the statement, in Foucault’s sense, in the context of A. undermines the sharp differentiation between the purely theoretical and the practical definitions. This returns us to the point that A. surely cannot exist in a void, independent of extrinsic factors.
The paper, written in 2006, nevertheless shows how rapidly the world is changing; coincidentally, Hinton, Osindero and Teh [5] published a paper in 2006 proposing a many-layered neural network trained as an unsupervised machine and fine-tuned with back-propagation.
This contrasts with rule-based A., which is largely grounded in formal logic; the Hinton paper heralds the onset of machine learning, giving computers the ability to refine their own A. with little or no human intervention. Goffey does not bring this into his essay – a shame, as he misses possibly the greatest leap in technology since the introduction of computers themselves.
The machine writes and re-writes its own A. with no human intervention. AI machines refining their own algorithms, with the near impossibility of deciphering what they have done and how, pose us new questions. With AI there is slim possibility of understanding how the machine-created A. work – a black box. Does this pose a threat to humans – or, possibly, will it save us and the planet?
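The kind of self-refinement described above can be illustrated with a toy sketch (my own, vastly simpler than Hinton et al.’s deep belief nets): a single weight is adjusted by a gradient-descent loop – the computational core of back-propagation – with no human editing the rule:

```python
def fit_weight(data, lr=0.05, epochs=200):
    """Fit y = w*x to (x, y) samples by gradient descent on squared error."""
    w = 0.0  # initial guess; the loop below refines it automatically
    for _ in range(epochs):
        for x, y in data:
            error = w * x - y
            w -= lr * error * x  # gradient step: d/dw of (w*x - y)**2 / 2
    return w

# Samples drawn from y = 2x; the loop converges on w close to 2
# without anyone writing "w = 2" into the program.
samples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
learned_w = fit_weight(samples)
```

The “rule” the program ends up applying is discovered from the data rather than specified in advance – a miniature version of the shift away from hand-written, rule-based A. that the Hinton paper represents.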

[1] Fuller, Matthew (ed.). Software Studies: A Lexicon. Cambridge, Massachusetts / London, England: The MIT Press (2006).
[2]Rogers 1987:2
[3] Kowalski, Robert (1979). “Algorithm = Logic + Control.” Communications of the ACM, Volume 22, Issue 7, July 1979.
[4]Foucault, Michel. 1969. The Archaeology of Knowledge. Trans. A. M. Sheridan Smith. London and New York: Routledge, 2002. ISBN 0-415-28753-7.
[5]Hinton, G. E.; Osindero, S.; Teh, Y. W. (2006). “A Fast Learning Algorithm for Deep Belief Nets” (PDF). Neural Computation. 18 (7): 1527–1554. PMID 16764513. doi:10.1162/neco.2006.18.7.1527.

IS71076A Computational Arts-based Research (2017-18) Week 1

This page will be updated weekly as part of Computational Arts-based Research (2017-18)
given by Helen Pritchard as part of my MFA Computational Arts at Goldsmiths.

First week, we read and discussed Computational Aesthetics in The Practices of Art as Politics
Patricia Ticiento Clough, Queens College and The Graduate Center CUNY

Although a challenging paper on first reading, it offered a useful first glimpse of the course ahead. Notably, it was important to identify words and phrases used in the paper with which we are at best unsure, if not totally unfamiliar, including:
‘Calculative Ambience’
‘Ubiquitous calculation’
‘vicarious causality’
‘causal efficacy’
‘efficient causation’




Patricia Clough (PC) delivered a tightly written and somewhat challenging paper for us to read and summarise as part of our first mini-assignment. The material was not the sort I am used to reading; however, after several careful readings I have made some sense of it. The final two paragraphs helped, as they form a precis of the whole paper: the advent of digital technology has had a profound effect on how we evaluate and practise art in a political context. PC lays out several examples of how the practice of art is challenged not only by the rapid emergence of digital technologies but also by the intense commodification of human processes. She points out that these factors have affected the philosophical examination of art and the potentiality of the object(s), prompting a debate as to whether potentiality already exists in an object or whether it depends on the object’s relations to other objects (possibly human ones).

Further change brought on by digital technology concerns aesthetics. Art’s singular claim to aesthetics has been undermined by the commodification of human processes and the expansion of the digital. Art has become so intertwined with market systems globally that it no longer serves as a means to inform us or to de-alienate us, serving instead as a means to re-think itself politically, re-assert itself and highlight the alienating and divisive effects of capitalism. PC quotes Claire Bishop, who also points out how art paradoxically dovetails perfectly with neoliberalism’s recent forms.

PC discusses how technology has given us tools to further explore the relationship between aesthetics, ontology and the term ‘calculative ambience’ (where calculation, action and materiality intertwine) via machine interfaces, big data and sensors, yielding a “mathematical seeing”.

PC then discusses the example of nano-technology and big data, positing the differences between the primary and secondary qualities of objects in relation to object-oriented ontologies. She suggests this may ultimately lead us to displace the need for human intervention, it being our only way to engage emotion (affect) through art as politics.

At this point I have to admit I became a little overwhelmed – no doubt this is a fine academic paper, but it became too complex, at least for me, to follow. The paper nevertheless concludes that we have to re-evaluate aesthetics: all works can be considered works of art, and art is inexorably bound up with our ever-increasing commodification. She quotes Manning, for whom in art “we might glimpse the relational force of an eternal object coursing through the actual”, and for whom art is a form of play with objects, “with the mediatic spaces or the indeterminate, internal complexity of all objects or entities where incomputable probabilities are still real and present”.

To realise other possibilities, art as politics must be interdisciplinary, incorporating philosophy, mathematics, science, media and technology, requiring all sorts of groupings and alliances.

Art as politics: “The practices of art as politics must lead the way, instructing us in how to play and with our play make the world anew.”