Walkthrough of Google Arts and Culture App

I used my Samsung Galaxy tablet to install Google’s Arts and Culture App.

Having mixed feelings about Google, I try not to use their services too much. This app, however, did surprise me, despite the initial impression of the art hanging in the hallways of an expensive hotel or a modern hospital: anodyne, avoiding any unnecessary or embarrassing subjects, catering for well-heeled international tourists or a bored business traveller who has already read the in-flight magazine twice.

Digging deeper, it offered me the promise of (weekly) notifications – I had to trade in my privacy again, offering my location, and it almost certainly logs the items I choose to look at.

Let's get some of the screenshots as I installed and ventured into the app:

The iconic classical Greek logo, with its cartoonish, dumbed-down take on the Golden Mean, now reduced to a fast-food logo.

Over 1 million downloads! In the Play Store it has a 3.8 approval rating from 17,862 people and is classified under Education – the similar apps listed are all Google apps! Not very good classification in the Play Store, and quite a few negative comments: 3,713 one-star and 10,271 five-star.

“pretty disappointed because of the region lock and lack of proper communication about it. Have been checking every day hoping it would be unlocked, hopoefully soon. Also it would be cool if you could save articles you like to go back and download photos like so many other art/history museum archives are letting you do now days (sic).”

“region locking a feature like that makes no damn sense… “ etc

Looks like you need a VPN…

Also, what about taking screenshots, like I did??

However, I was viewing features on art in Japan and the US…

 

The red Greek temple logo shows collections you can open and visit in the app; the orange dots are venues with opening times and a Google map.

Allow Google to track your location??

Collections are grouped under subject or theme.

Zoom into a Google-approved artist

Who decides what to put there? Is this paid for by the exhibitor/curator? If so, how does this allow smaller, more innovative galleries to show themselves? All very “gallery system”.

Korean artist – good, I have found some new material.

English Heritage, very counter-culture!

Fodder for the tourists…

I liked the experimental section; this already exists on another Google site. Good material buried under ‘tourist’. This was lumped in with the English Heritage material.

A nice feature I found in this app is that certain galleries provide a Google Street View-style walkthrough of the gallery interior. For example, the Tokyo Fuji Art Museum:

The Japanese text was in kanji. Naturally, Google offers Google Translate…

Walk round the gallery…

and every nook and cranny

The intended audience is mostly visitors to a country – London has 52 collections located on the app’s Google map. This is a useful resource. However, as a means of exploring and researching foreign collections, it is somewhat limited.

The fairly mainstream and ‘establishment’ slant does not go far enough to push any boundaries in the creative arts, but it is not really designed to do this. It is a glorified guidebook.

In terms of ANT, the app will definitely inform the interested art hunter and perhaps alter their approach to exploring the gallery world. Another effect of the non-human network is to facilitate sharing a user’s likes and dislikes, as in so many other apps.

I am uncertain as to the extent that the app will alter its presentation to each user; as in all apps, the internal workings are hidden. If the app knows the exact identity of each user, this would be more practical to achieve.

As to what the metadata generated might be used for, it is uncertain; there is no obligation to reveal your identity as such, no password required to enter and so on. I am in no doubt, though, that Google knows enough to identify you, as a user will almost certainly have provided login information linking them to Google’s sister apps.


A few hidden treats offer the opportunity to roam around some of the collections; perhaps more could be provided.

The opportunity to share ‘liked’ items persists throughout; this is perhaps the only networked aspect of the app. The subject matter does not lend itself to much more extension in this way.

A couple more locations, first Thailand – the imagery really is repulsive, sorry!

I wonder how well these would sell at the Frieze? Art collectors? Oligarchs?

Another feature, zoom!

One of the few featured collections shown on the App in London, I love visiting here…

MoMA, good

Select by timeline

The walkthrough method

We are asked to think on the following:

– What is the walkthrough method?

– What is the methodology of the walkthrough method?

– How would you carry out the walkthrough method?

 

We were introduced to the paper The walkthrough method: An approach to the study of apps by Ben Light, Jean Burgess and Stefanie Duguay [1].

 

The study of Apps and their sociocultural and economic effects is proposed and a formal methodology is described in this paper.

 

The environment of expected use and the technical walkthrough are part of what is termed Critical Technocultural Discourse Analysis (CTDA). The first step is forensically examining the environment of expected use: identifying the app’s vision, its operating model and its governance. The walkthrough process then builds a foundational corpus of data, starting with the app’s intended purpose and its culturally embedded meanings, and stepping through all the processes involved in registering the user with the app (if required). Further to this, the technical walkthrough incorporates a data-gathering procedure covering not only registration but also everyday use of the app and how a user would go about leaving it, closing an account if one has been opened, and so on.

The walkthrough method uses interpretive techniques, drawing on Science and Technology Studies (STS) and cultural studies as a lens for app analysis. The walkthrough method as we use it is grounded in the principles of Actor-Network Theory (ANT), a specific aspect of STS.

Within ANT there are intermediaries and mediators, which in turn can be human or non-human. Intermediaries pass on meaning unchanged through a network of relations, while mediators may transform meaning. An example in an app might take some information and suggest related things; the example given in the paper was a dating app which, having gathered certain likes/dislikes, may suggest further matches based on the user’s profile.

The way the app presents itself – its menu structure (in more playful apps this may frequently change), the size of buttons, graphics, physical interaction gestures (e.g. the swipe in Tinder) – all go towards a transformative action by the non-human mediator to effect change in the user.

What happens when the app is running but removed from the user’s screen – or even when the machine is switched off?

What happens when a user subverts the app by using it for a ‘non-intended’ use?

Consideration of affordances – again, is the app presenting itself so as to bias the user in their reactions?

What extra features or changes occur over extended periods of use that are not apparent in the initial walkthrough?

I am not clear on how best I would go about using the walkthrough method.

  1. The paper seems very Anglo-centric; is it limited in its use? For example, how would this work in Japanese culture?
  2. The theory is presented well, but the practice of the methods described needs fleshing out.

 

 

 

[1] Light, B., Burgess, J. and Duguay, S., ‘The walkthrough method: An approach to the study of apps’, New Media & Society. DOI: 10.1177/1461444816675438

Wrestling With Olga

Theo and I assembled Olga in my garage… we have our assignment in projection mapping to complete. Much coding to do – more on this to come!

Please use the images below (right-click, Save As to grab them)

IMG_2464
IMG_2465
IMG_2466
IMG_2468
IMG_2480
IMG_2481
IMG_2486
all
wireframealpha
wireframe

Readings: Computational Art and Ubiquity

Why spoil things by the use of this word in this context?

PigeonBlog’s birds had the potential to test these interpellation models.
interpellation (noun): “a procedure in some legislative bodies of asking a government official to explain an act or policy, sometimes leading, in parliamentary government, to a vote of confidence or a change of government.”
Yes, this is an academic text, but I have encountered too many of these ‘rare’ words that do nothing to help the reader. The first reading, Complex Ubiquity-Effects by Ulrik Ekman, was even more densely packed with ‘difficult/unnecessary’ words. How many times do you have to see the word “qua” – the ablative feminine? The formal style of writing is used to eliminate ambiguity in the meaning of the text; it is a pity that a side effect of this discipline is to make the text harder to decipher and comprehend.
This comes from our second reading this week, chapter 21 of Tactical Biopolitics: Art, Activism, and Technoscience – “Reaching the Limit: When Art Becomes Science” by Beatriz da Costa; much easier to ‘read’ and more accessible.

Not only is the language she uses more accessible (apart from ‘interpellation’), but the project she describes, “PigeonBlog”, is fascinating.

The use of pigeons carrying sensors at 300 ft (an altitude hard to access in any event): “The pigeon “backpack” developed for this project consisted of a combined GPS (latitude, longitude, altitude)/GSM (cell phone tower communication) unit and corresponding antennas, a dual automotive CO/NOx pollution sensor, a temperature sensor, a subscriber identity module (SIM) card interface, a microcontroller, and standard supporting electronic components.”

Da Costa reflects on the reaction to this project, which she states became widely reported. What struck me as significant was her original strategic aim: “in situating itself between the academy and nonexpert participants”, this pigeon project carries huge potential for other investigations, not confined simply to using pigeons with tech to survey pollution levels but extending to many other projects, as signposted by the title of the book and chapter: Tactical Biopolitics: Art, Activism, and Technoscience, “Reaching the Limit: When Art Becomes Science”.
The first reading was the introduction to Complex Ubiquity-Effects by Ulrik Ekman: a very dense and powerful assembly of examples of how the ‘third wave of computing’ is already underway. I have not read any of the essays in this book yet, but Ekman describes how the explosion in the use of RFIDs, multiple sensors, small portable devices, wearables, portable computing, data flows and so on is changing how we experience the world. The lovely introduction I will include here… if only the text were more accessible!
Prelude
  • “Three tourists stop in front of a boom coming down at the entrance to a nature reserve, reading on the little display on the boom that the maximum number of people permitted in the reserve has been reached at this point in time, this close to the breeding season.
  • A series of digital signposts and the GPS in the car lead the driver and his family down a set of side streets due to road repair and construction.
  • A media art installation embedded in the city square has dynamic and interactive video portraits appear on the ground in front of busy passers-by and makes them stop, play, and wonder how they were followed and picked out beforehand.
  • Every once in a while a 17-year-old son gets irritated at having to use his mother’s computer on the Internet—because he is quite frequently asked to consider buying new candles, bathrobes, bras, and women’s magazines.
  • An academic who gets home after a long day at work only vaguely notices that the lighting in the smart home is subdued a bit, the vacuum cleaner stops, and a string quartet replaces the pop songs from yesterday.”

24th Nov, Reflections after our discussion “The Non Human Turn”

#royals_as_robots

Walking home after seeing the last surviving (playing) member of Captain Beefheart and the Magic Band, 50 years after I bought ‘Safe as Milk’ – I pick up an Evening Standard and wait for my take away chips. I feel older still. The cover headline shrieks “ROBOTS SAVE 500 LIVES IN LONDON”

Ignoring for a moment the ideological motivation to lull our fears about the privatisation of the NHS – reading an old-media, right-wing and biased free sheet – I stop to think of the implications of the leader article: where does this fit into the absolute survival of the human race?

After reading Richard Grusin’s introduction I felt pessimistic. Does this give me cause for hope? I may have my cancerous prostate removed more accurately so that I do not become impotent? It may be a concern for me. My discussion partner was optimistic: look how many improvements there are in the modern world… healthier, longer lives… comfort, communication… etc. I reflected on this, and I can only agree. BUT… at what cost? A minority of Londoners get freedom from impotence while 350 miles north of the capital, in Cumbria, hundreds are under water due to this week’s extreme weather events.

Ratko Mladić the war criminal gets his comeuppance through better communication and coordination via our improved telecommunications, and Mugabe is forced into resignation.

The Daily Mail spreads fake news of a terrorist attack at Selfridges, Oxford Street, London. The guitarist of the Magic Band is held up, nearly missing our concert.

But wait! Fake news spread by the Daily Mail about Oxford Street – this very evening…

http://www.onenewspage.com/n/World/75eiq6j1u/The-Daily-Mail-Erroneous-Tweet-On-London.htm

and –

Fake News by Daily Mail

 

I am left thinking: the non-humans out-act and out-implement consequences beyond our control or comprehension. But will they be a tool for our survival or the noose that finishes us off?

I suspect the latter, because the stupid and lazy people (humans) outnumber the proactive, thoughtful people. OK, not stupid or lazy, but possibly victims of the corrosive side effects of the non-human.

Maybe some of the Magic Band audience would benefit from robot surgery, many of us will need it soon. Then, maybe the alien Annunaki will save us 😉

 

Haque Burble

http://www.haque.co.uk/openburble.php

Open Burble. This dates back to 2007, but it makes beautiful use of SparkFun accelerometers in each balloon.

The Nonhuman Turn

The readings were Grusin, Richard, “Introduction”, in The Nonhuman Turn, University of Minnesota Press, 2015, and Chun, Wendy Hui Kyong, “Crisis, crisis, crisis; or, the temporality of networks”, The Nonhuman Turn (2015): 139-166.

Until this term and the course on Computational Arts-based research and theory, I had not lifted the covers of any writings like these. This week is no easier: we have been given two texts to read and summarise. The introduction by Grusin lays out the central theme of this collection of plenary addresses from the 2012 “Nonhuman Turn in 21st Century Studies” conference. Grusin points to the meaning of the word ‘turn’ and how it is used in the context of non-human situations. The nonhuman is not restricted simply to machines but extends to ‘animals, plants, organisms, climatic systems, technologies, or ecosystems’. He goes on to describe speculative realism: the first conference was held at Goldsmiths in 2007, when the group of speakers challenged post-Kantian correlationist philosophy, speculative realism defining itself ‘loosely in its stance of metaphysical realism’ and working to overturn anthropocentrism, the privilege of the human over the non-human. Speculative realism favours distinct forms of realism against the dominant forms of idealism in so much of contemporary philosophy. Grusin also describes how technology has accelerated the pace and intensity of academic discourse around the ‘turn’, noting that in 2012 it was estimated that 51% of internet traffic was non-human.

Wendy Hui Kyong Chun, in her presentation, argues that crisis both exceeds and is structurally necessary to networks. She contrasts the medium of television, citing catastrophe as central to commercial TV programming, with new media as a ‘crisis machine’. She examines the many ways new media has affected us and how crises cut across our rules, even creating new norms. Many problems arise from the sheer speed of telecommunications, ‘undermining the need for scholarly contemplation’. ‘Crisis structures new media temporality’.

Donald Trump Twitter Chatbot and using Tensorflow

We understand that the old way of creating a chatbot would have been to build a substantial repository of questions and possible answers to create the illusion of a conversation between the user and the computer. This is static, labour-intensive and not a self-learning system.

More recently (as we hear almost all the time) – even this morning, BBC Radio 4 announced that over Christmas the Today show will be hosted by “AI”. Quite annoying that it is so much in the news; however, for our purposes, and for the objective of demonstrating AI as part of the “AI as a Tool for Art” topic to be presented at the end of term, I gingerly attempt to build/borrow/steal a working AI chatbot.

Without drowning in detail, a layered recurrent neural net is constructed as follows:

  1. Download a dataset (this could be movie subtitle files or a Twitter feed, for example)
  2. Create a model
  3. Train the model
  4. Test the model
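Most of the work in step 1 is turning raw text into id sequences a seq2seq model can consume. Here is a minimal, library-free sketch of that preprocessing; the tweet list and token names are illustrative (they follow common seq2seq conventions rather than being taken from easy_seq2seq):

```python
# Toy preprocessing for a seq2seq chatbot: build a vocabulary and
# encode sentences as fixed-length id sequences. The special tokens
# mirror the usual seq2seq conventions (padding, decoder start,
# end-of-sentence, unknown word).
SPECIALS = ["_PAD", "_GO", "_EOS", "_UNK"]

def build_vocab(sentences):
    """Map each word to an integer id, reserving ids for specials."""
    vocab = {tok: i for i, tok in enumerate(SPECIALS)}
    for sentence in sentences:
        for word in sentence.lower().split():
            vocab.setdefault(word, len(vocab))
    return vocab

def encode(sentence, vocab, max_len=10):
    """Turn a sentence into ids, append _EOS, pad with _PAD."""
    ids = [vocab.get(w, vocab["_UNK"]) for w in sentence.lower().split()]
    ids = ids[: max_len - 1] + [vocab["_EOS"]]
    return ids + [vocab["_PAD"]] * (max_len - len(ids))

tweets = ["Make America great again", "Fake news media"]
vocab = build_vocab(tweets)
print(encode("fake news again", vocab))
```

Words outside the training corpus collapse to `_UNK`; the `_GO` token would prefix the decoder inputs during training, so it is unused in this encoding helper.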

TensorFlow, Google’s Python-based framework, seemed the most productive approach. I installed it first on my Windows machine but soon realised it was a lot easier to use Linux, so I used a virtual machine to install all the necessary Python libraries, and from GitHub I used the code from suriyadeepan/easy_seq2seq to start my experimentation.

The example I chose was Mr. Donald Trump’s Twitter feed. I am already learning how to use seq2seq and will report back soon on my findings.

Project End of Term 1

Invisible WiFi made Visible

Overview:

WiFi is all around us: the connections made by devices can be monitored via their MAC addresses, as can the data itself and the density of the signals. The aim of my project is to render one or more of these varying things as an array of self-lit cubes. I will be using an ESP8266 or similar to detect WiFi data and, if possible, extend the concept to mapping out a large space by converting the signal strength of WiFi sources.

Implementation:

I have already prototyped the ESP8266 to get MAC addresses into the Arduino IDE serial monitor, using a library developed by Ray Burnette to get a serial stream of Media Access Control (MAC) addresses. These will appear and disappear as mobile phones and laptops enter and leave the space. The first six hex digits of the MAC address identify the manufacturer; I will research what else I can derive from the MAC address. More recent iPhones obfuscate their identity by broadcasting fake MAC addresses periodically as they ping the WiFi. This could be identified in the installation.
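As a sketch of what can be derived from each address: the first three octets are the OUI (manufacturer prefix), and the 0x02 “locally administered” bit in the first octet is set on software-randomised addresses of the kind recent iPhones broadcast. The vendor table below is a tiny hypothetical stand-in; a real lookup would use the IEEE OUI registry:

```python
# Hypothetical mini OUI table -- a real one comes from the IEEE registry.
OUI_VENDORS = {"FC:FB:FB": "ExampleCorp"}

def inspect_mac(mac):
    """Return (vendor, randomised?) for a colon-separated MAC string."""
    mac = mac.upper()
    first_octet = int(mac.split(":")[0], 16)
    # Bit 0x02 of the first octet marks a locally administered
    # (i.e. software-randomised) address.
    randomised = bool(first_octet & 0x02)
    vendor = OUI_VENDORS.get(mac[:8], "unknown")
    return vendor, randomised

print(inspect_mac("fc:fb:fb:01:02:03"))
print(inspect_mac("02:00:5E:10:00:00"))
```

In the installation, a high churn of locally administered addresses would be the signature of randomising phones passing by.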

The ESP8266 will have a serial connection to an Arduino, which in turn parses the incoming data so that it can communicate with an Adafruit 16-channel servo shield. By parsing the serial data sent by the ESP8266, the Arduino will provide the necessary cues for the operation of the physical installation (the grid of flip-up doors), communicating with the controlling Arduino(s) via a serial bus.
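A sketch of the parsing step, written in Python for clarity rather than Arduino C; the “MAC,RSSI,channel” line format is my own hypothetical convention, not what Ray Burnette’s library actually emits:

```python
def parse_line(line):
    """Parse one hypothetical serial line, e.g. 'AA:BB:CC:DD:EE:FF,-67,6'."""
    mac, rssi, channel = line.strip().split(",")
    return mac, int(rssi), int(channel)

def cube_for_channel(channel, n_cubes=15):
    """Map a WiFi channel (1-14) onto one cube of the 5 x 3 grid."""
    return (channel - 1) % n_cubes

mac, rssi, channel = parse_line("AA:BB:CC:DD:EE:FF,-67,6\n")
print(mac, cube_for_channel(channel))
```

The same logic, ported to C on the Arduino, would then drive the corresponding servo channel on the Adafruit shield.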

Rendering is a part of the project I have yet to consider, but I would want more than just LEDs or an LCD panel, monitor, etc.

It is proposed that a 5 x 3 grid of PLA 3D-printed cubes, laid out inside a flat laser-cut MDF frame, will contain the electronics and a small LCD screen to plot the progress of the data. When a new MAC address appears, one of the cubes pops up and its LED lights up. As more connections are made on any particular channel, the respective cube will move in and out accordingly.

I will build a prototype to show a single cube in action.

Initial sketch:


layout for top of case

 

Lasercutting plan for case

 

 

Preliminary Fritzing (to be updated)

 

OpenSCAD sketch of the simple mechanism for moving cubes off the servo

 

 

The servo arm may well be just wire connecting to the (3D-printed) red cube; the yellow cylinder attached to the cube may be the same part as the red cube. The yellow servo is attached to the base, lying on its side. Another printed part will be used to hold the servo. See the Appendix for the Thingiverse part I created some time ago.

Statistics can also be shown on a small Nokia 5110 LCD to monitor progress, or alternatively on a 7-segment LED numeric display.

Power is to be supplied via a 5V phone charger. The servos may need an independent power supply, yet to be determined.

Further ideas:

Data logging to SD card (MAC addresses, visitors and time in and out).
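The in/out bookkeeping could look something like this (a pure-Python sketch to be run off-device; the row format is my own assumption):

```python
import time

# Track first-seen / last-seen times per MAC so each visit can be
# written to the SD card as a "mac,time_in,time_out" row.
visits = {}

def sighting(mac, now=None):
    """Record that `mac` was seen; widen its in/out window."""
    now = now if now is not None else time.time()
    first, _ = visits.get(mac, (now, now))
    visits[mac] = (first, now)

def csv_rows():
    """One 'mac,time_in,time_out' row per visitor."""
    return [f"{mac},{t_in},{t_out}" for mac, (t_in, t_out) in visits.items()]

sighting("AA:BB:CC:00:11:22", now=100)
sighting("AA:BB:CC:00:11:22", now=160)
print(csv_rows())
```

Randomised iPhone addresses would inflate the visitor count here, which is itself something the installation could surface.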

Integrate with other devices, e.g. a TC35 SMS module to send tweets to Twitter…

As well as using Fritzing to design the breadboard layout and the layout for the soldered components, I will use OpenSCAD to design the 3D-printed components and Tinkercad or http://www.makercase.com/ for the main MDF enclosure.

References:

Github.com Expressif/Arduino-esp32

@Igrr IvanGrokhotkov (www.doit.com)

https://youtu.be/9_Zw_Mls98c

The Glass Room Exhibit Oct-Nov 2017

“Unintended Emissions
Julian Oliver & Bengt Sjölén & Danja Vasiliev
The Critical Engineering Working Group
@julianOliver / @bengtsjolen / @k0a1a

As you make your way around the city each day you are constantly emitting data from your devices and being filmed on CCTV. As you stand here, The Critical Engineering Working Group is using radio receiver–like technology to passively scan the exterior pavement for signals from passing devices. Those signals are then being shown to pedestrians passing by The Glass Room in real time. The devices shown live on the screen are detected and located by ‘unintended electromagnetic emissions’, otherwise known as ‘data transmissions’. They are then represented here, creating a kind of livestream of data passing by.”

https://theglassroom.org/exhibit/

TfL to track customers’ MAC addresses

https://www.theregister.co.uk/2016/11/17/tfl_to_track_tube_users_by_wifi_device_mac_address/

“The trial, which will last four weeks from 21 November, “will help give TfL a more accurate understanding of how people move through stations, interchange between services and how crowding develops,” according to the transport authority.”

http://www.cbc.ca/news/politics/csec-used-airport-wi-fi-to-track-canadian-travellers-edward-snowden-documents-1.2517881

Servo holder stl I uploaded a while ago…

https://www.thingiverse.com/thing:688035/#files