NeoPixel Ring Compass Road Safety Dog Jacket

I abandoned the waterproof sleeve idea (from the previous blog post) and decided to do more sewing rather than PVC/plastic welding.

My dog Baxter is an excellent model, and his old coat is never used because he hates wearing it, so I gathered my Arduino LilyPad, compass sensor and the other materials listed here:

Bill of Materials: waterproof.fzz


Friday, February 9 2018, 09:35:35
Assembly List
Label Part Type Properties
Part1 Lilypad Arduino Board type Lilypad Arduino
Part2 Seven-Segment LED Backpack 1.2 Inch Digits variant Red; part # 1270
Part3 16 NeoPixel Ring variant variant 1; part # 1463
Part4 FTDI Basic Programmer type Basic; voltage 5V
Real Time Clock ZS-042 RTC Module chip DS3231; variant variant 4
U1 HMC5883L package 16lpcc; axis 3; variant smd
U2 LIPO-2000mAh package lipo-2000; variant 2000mAh
Shopping List
Amount Part Type Properties
1 Lilypad Arduino Board type Lilypad Arduino
1 Seven-Segment LED Backpack 1.2 Inch Digits variant Red; part # 1270
1 16 NeoPixel Ring variant variant 1; part # 1463
1 FTDI Basic Programmer type Basic; voltage 5V
1 ZS-042 RTC Module chip DS3231; variant variant 4
1 HMC5883L package 16lpcc; axis 3; variant smd
1 LIPO-2000mAh package lipo-2000; variant 2000mAh
(I used a 2 × AAA battery pack as well)

Exported with Fritzing 0.9.3-

I tried out the compass code (other components were plugged into the breadboard; please ignore them!).

I have not yet integrated the real-time clock into the project as I ran out of time. I will be using the pixel ring to flash up hour, minute and second pixels.
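As a sketch of how that could work (not yet part of the jacket code), the DS3231 time could be mapped straight onto the 16 pixels. This assumes the same ds3231 and Adafruit_NeoPixel libraries as the full listing further down; the pin, colours and brightness here are illustrative only:

#include <Wire.h>
#include <Adafruit_NeoPixel.h>
#include "ds3231.h"          // provides struct ts and DS3231_get()

#define RING_PIN    3
#define RING_PIXELS 16

Adafruit_NeoPixel ring = Adafruit_NeoPixel(RING_PIXELS, RING_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  Wire.begin();
  DS3231_init(DS3231_INTCN);         // initialise the RTC as in the ds3231 library example
  ring.begin();
  ring.setBrightness(30);
  ring.show();
}

void loop() {
  struct ts t;
  DS3231_get(&t);                    // read the current time from the RTC

  ring.clear();
  // hour (red), minute (green) and second (blue), each mapped onto the 16 pixels
  ring.setPixelColor(map(t.hour % 12, 0, 12, 0, RING_PIXELS), ring.Color(255, 0, 0));
  ring.setPixelColor(map(t.min,       0, 60, 0, RING_PIXELS), ring.Color(0, 255, 0));
  ring.setPixelColor(map(t.sec,       0, 60, 0, RING_PIXELS), ring.Color(0, 0, 255));
  ring.show();

  delay(200);
}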

Originally I planned to use a seven-segment display for the time and temperature, but I think the part I had was faulty. Additional functions could incorporate the temperature reading from the real-time clock…

A little Evo-Stik on the end of the conductive thread stops it unravelling and helps it thread through tiny component holes…

External USB power socket

LilyPad sewn in!

Power test

Added compass chip

Real-time clock and battery. I ran out of conductive thread, so I used light wiring sewn down instead.

ready for walkies

My patient model. An extra waterproof layer protects the components in the rain. Best results after dark!


Listing for the LilyPad (work in progress)


/*
  James Tregaskis
  NeoPixel ring for dog jacket
  9th Feb 2018

  This is code merged from two sources. I have not yet integrated the
  real-time clock into the project as I ran out of time. I will be using the
  pixel ring to flash up hour, minute and second pixels. Originally I planned
  to use a seven-segment display for the time and temperature, but I think
  the part I had was faulty. Additional functions could incorporate the
  temperature reading from the real-time clock.
  Based on the Adafruit library example for the HMC5883 magnetometer/compass,
  designed specifically to work with the Adafruit HMC5883 breakout.

  *** You will also need to install the Adafruit_Sensor library! ***

  These displays use I2C to communicate; 2 pins are required to interface.

  Adafruit invests time and resources providing this open source code,
  please support Adafruit and open-source hardware by purchasing products
  from Adafruit!

  Written by Kevin Townsend for Adafruit Industries, with some heading
  example code from Love Electronics.

  This program is free software: you can redistribute it and/or modify it
  under the terms of the version 3 GNU General Public License as published
  by the Free Software Foundation.

  This program is distributed in the hope that it will be useful, but
  WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY
  or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
  for more details.

  You should have received a copy of the GNU General Public License along
  with this program. If not, see <https://www.gnu.org/licenses/>.
*/
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_HMC5883_U.h>
#include <Adafruit_NeoPixel.h>
#include "ds3231.h"

#define BUFF_MAX 128
#define PIN 3                     // NeoPixel ring data pin

uint8_t time[8];
char recv[BUFF_MAX];
long previousMillis = 0;
//long interval = 1000;
unsigned int recv_size = 0;
unsigned long prev, interval = 5000;
boolean doFunkyThings = false;

/* Assign a unique ID to this sensor at the same time */
Adafruit_HMC5883_Unified mag = Adafruit_HMC5883_Unified(12345);

// Parameter 1 = number of pixels in strip
// Parameter 2 = pin number (most are valid)
// Parameter 3 = pixel type flags, add together as needed:
//   NEO_KHZ800  800 KHz bitstream (most NeoPixel products w/WS2812 LEDs)
//   NEO_KHZ400  400 KHz (classic 'v1' (not v2) FLORA pixels, WS2811 drivers)
//   NEO_GRB     Pixels are wired for GRB bitstream (most NeoPixel products)
//   NEO_RGB     Pixels are wired for RGB bitstream (v1 FLORA pixels, not v2)
Adafruit_NeoPixel strip = Adafruit_NeoPixel(16, PIN, NEO_RGB + NEO_KHZ400);

int fixedHeadingDegrees;          // Used to store heading value
float headingDegrees = 0;         // heading * 180 / M_PI

void displaySensorDetails(void)
{
  sensor_t sensor;
  mag.getSensor(&sensor);
  Serial.print("Sensor:     "); Serial.println(sensor.name);
  Serial.print("Driver Ver: "); Serial.println(sensor.version);
  Serial.print("Unique ID:  "); Serial.println(sensor.sensor_id);
  Serial.print("Max Value:  "); Serial.print(sensor.max_value); Serial.println(" uT");
  Serial.print("Min Value:  "); Serial.print(sensor.min_value); Serial.println(" uT");
  Serial.print("Resolution: "); Serial.print(sensor.resolution); Serial.println(" uT");
}

void setup(void)
{
  Serial.begin(9600);
  Serial.println("HMC5883 Magnetometer Test"); Serial.println("");

  /* Initialise the sensor */
  if (!mag.begin())
  {
    /* There was a problem detecting the HMC5883 ... check your connections */
    Serial.println("Ooops, no HMC5883 detected ... Check your wiring!");
    while (1);
  }

  /* Display some basic information on this sensor */
  displaySensorDetails();

  strip.begin();
  strip.setBrightness(30);    // adjust brightness here
  strip.show();               // Initialize all pixels to 'off'

  // clock stuff (real-time clock not yet fully integrated - see notes above)
  Wire.begin();
  memset(recv, 0, BUFF_MAX);
  Serial.println("GET time");

  colorWipe(strip.Color(255, 0, 0), 0);
}

void loop(void)
{
  unsigned long currentMillis = millis();

  /* Get a new sensor event */
  sensors_event_t event;
  mag.getEvent(&event);

  /* Display the results (magnetic vector values are in micro-Tesla (uT)) */
  // Serial.print("X: "); Serial.print(event.magnetic.x); Serial.print(" ");
  // Serial.print("Y: "); Serial.print(event.magnetic.y); Serial.print(" ");
  // Serial.print("Z: "); Serial.print(event.magnetic.z); Serial.print(" "); Serial.println("uT");

  // Hold the module so that Z is pointing 'up' and you can measure the heading with x & y.
  // Calculate heading when the magnetometer is level, then correct for signs of axis.
  float heading = atan2(event.magnetic.y, event.magnetic.x);

  // Once you have your heading, you must then add your 'Declination Angle',
  // which is the 'Error' of the magnetic field in your location.
  // Find yours here:
  // Mine is: -13* 2' W, which is ~13 degrees, or (which we need) 0.22 radians.
  // If you cannot find your declination, comment out these two lines; your compass will be slightly off.
  float declinationAngle = 0.22;
  heading += declinationAngle;

  // Correct for when signs are reversed.
  if (heading < 0)
    heading += 2 * PI;

  // Check for wrap due to addition of declination.
  if (heading > 2 * PI)
    heading -= 2 * PI;

  // Convert to degrees
  float headingDegrees = heading * 180 / M_PI;

  // To fix rotation speed of HMC5883L compass module
  if (headingDegrees >= 1 && headingDegrees < 240) {
    fixedHeadingDegrees = map(headingDegrees * 100, 0, 239 * 100, 0, 179 * 100) / 100.00;
  } else if (headingDegrees >= 240) {
    fixedHeadingDegrees = map(headingDegrees * 100, 240 * 100, 360 * 100, 180 * 100, 360 * 100) / 100.00;
  }

  int headvalue = fixedHeadingDegrees / 18;
  int ledtoheading = map(headvalue, 0, 15, 15, 0);

  // Serial.print("Heading (degrees): "); Serial.print("ledtoheading : "); Serial.print(ledtoheading); Serial.println(headingDegrees);

  if (currentMillis - previousMillis > interval) {
    // save the last time you blinked the LED
    previousMillis = currentMillis;
    doFunkyThings = !doFunkyThings;
  }
  if (doFunkyThings) {
    colorWipe(strip.Color(0, 0, 255), 0);
  }

  // Show the heading on the ring: one red pixel with a green pixel either side.
  if (ledtoheading == 0) {
    strip.setPixelColor(15, 255, 0, 50);               // Red
    strip.setPixelColor(0, 0, 255, 0);                 // Green
    strip.setPixelColor(14, 0, 255, 0);                // Green
  }
  else if (ledtoheading == 15) {
    strip.setPixelColor(0, 255, 0, 50);                // Red
    strip.setPixelColor(15, 0, 255, 0);                // Green
    strip.setPixelColor(1, 0, 255, 0);                 // Green
  }
  else {
    strip.setPixelColor(ledtoheading, 255, 0, 50);     // Red
    strip.setPixelColor(ledtoheading + 1, 0, 255, 0);  // Green
    strip.setPixelColor(ledtoheading - 1, 0, 255, 0);  // Green
  }
  strip.show();
}

// Fill the dots one after the other with a colour
void colorWipe(uint32_t c, uint8_t wait) {
  for (uint16_t i = 0; i < strip.numPixels(); i++) {
    strip.setPixelColor(i, c);
    strip.show();
    delay(wait);
  }
}
void doClockStuffinLoop() {
  char in;
  char buff[BUFF_MAX];
  unsigned long now = millis();
  struct ts t;

  // show time once in a while
  if ((now - prev > interval) && (Serial.available() <= 0)) {
    DS3231_get(&t);

    // there is a compile time option in the library to include unixtime support
#ifdef CONFIG_UNIXTIME
    snprintf(buff, BUFF_MAX, "%d.%02d.%02d %02d:%02d:%02d %ld", t.year,
             t.mon, t.mday, t.hour, t.min, t.sec, t.unixtime);
#else
    snprintf(buff, BUFF_MAX, "%d.%02d.%02d %02d:%02d:%02d", t.year,
             t.mon, t.mday, t.hour, t.min, t.sec);
#endif
    Serial.println(buff);
    prev = now;
  }

  if (Serial.available() > 0) {
    in = Serial.read();

    if ((in == 10 || in == 13) && (recv_size > 0)) {
      parse_cmd(recv, recv_size);
      recv_size = 0;
      recv[0] = 0;
    } else if (in < 48 || in > 122) {
      ;                                   // ignore ~[0-9A-Za-z]
    } else if (recv_size > BUFF_MAX - 2) {
      // drop lines that are too long
      recv_size = 0;
      recv[0] = 0;
    } else if (recv_size < BUFF_MAX - 2) {
      recv[recv_size] = in;
      recv[recv_size + 1] = 0;
      recv_size += 1;
    }
  }
}

void funky() {
  // Some example procedures showing how to display to the pixels:
  // colorWipe(strip.Color(255, 0, 0), 50); // Red
  // colorWipe(strip.Color(0, 255, 0), 50); // Green
  // colorWipe(strip.Color(0, 0, 255), 50); // Blue
}

void rainbow(uint8_t wait) {
  uint16_t i, j;

  for (j = 0; j < 256; j++) {
    for (i = 0; i < strip.numPixels(); i++) {
      strip.setPixelColor(i, Wheel((i + j) & 255));
    }
    strip.show();
    delay(wait);
  }
}

// Slightly different, this makes the rainbow equally distributed throughout
void rainbowCycle(uint8_t wait) {
  uint16_t i, j;

  for (j = 0; j < 256 * 5; j++) { // 5 cycles of all colors on wheel
    for (i = 0; i < strip.numPixels(); i++) {
      strip.setPixelColor(i, Wheel(((i * 256 / strip.numPixels()) + j) & 255));
    }
    strip.show();
    delay(wait);
  }
}

// Input a value 0 to 255 to get a color value.
// The colours are a transition r - g - b - back to r.
uint32_t Wheel(byte WheelPos) {
  if (WheelPos < 85) {
    return strip.Color(WheelPos * 3, 255 - WheelPos * 3, 0);
  } else if (WheelPos < 170) {
    WheelPos -= 85;
    return strip.Color(255 - WheelPos * 3, 0, WheelPos * 3);
  } else {
    WheelPos -= 170;
    return strip.Color(0, WheelPos * 3, 255 - WheelPos * 3);
  }
}
void parse_cmd(char *cmd, int cmdsize)
{
  uint8_t i;
  uint8_t reg_val;
  char buff[BUFF_MAX];
  struct ts t;

  //snprintf(buff, BUFF_MAX, "cmd was '%s' %d\n", cmd, cmdsize);

  // TssmmhhWDDMMYYYY aka set time
  if (cmd[0] == 84 && cmdsize == 16) {
    t.sec = inp2toi(cmd, 1);
    t.min = inp2toi(cmd, 3);
    t.hour = inp2toi(cmd, 5);
    t.wday = cmd[7] - 48;
    t.mday = inp2toi(cmd, 8);
    t.mon = inp2toi(cmd, 10);
    t.year = inp2toi(cmd, 12) * 100 + inp2toi(cmd, 14);
    DS3231_set(t);
  } else if (cmd[0] == 49 && cmdsize == 1) {      // "1" get alarm 1
    DS3231_get_a1(&buff[0], 59);
    Serial.println(buff);
  } else if (cmd[0] == 50 && cmdsize == 1) {      // "2" get alarm 2
    DS3231_get_a2(&buff[0], 59);
    Serial.println(buff);
  } else if (cmd[0] == 51 && cmdsize == 1) {      // "3" get aging register
    Serial.print("aging reg is ");
    Serial.println(DS3231_get_aging(), DEC);
  } else if (cmd[0] == 65 && cmdsize == 9) {      // "A" set alarm 1
    DS3231_set_creg(DS3231_INTCN | DS3231_A1IE);
    for (i = 0; i < 4; i++) {
      time[i] = (cmd[2 * i + 1] - 48) * 10 + cmd[2 * i + 2] - 48; // ss, mm, hh, dd
    }
    uint8_t flags[5] = { 0, 0, 0, 0, 0 };
    DS3231_set_a1(time[0], time[1], time[2], time[3], flags);
    DS3231_get_a1(&buff[0], 59);
    Serial.println(buff);
  } else if (cmd[0] == 66 && cmdsize == 7) {      // "B" set alarm 2
    DS3231_set_creg(DS3231_INTCN | DS3231_A2IE);
    for (i = 0; i < 4; i++) {
      time[i] = (cmd[2 * i + 1] - 48) * 10 + cmd[2 * i + 2] - 48; // mm, hh, dd
    }
    uint8_t flags[5] = { 0, 0, 0, 0 };
    DS3231_set_a2(time[0], time[1], time[2], flags);
    DS3231_get_a2(&buff[0], 59);
    Serial.println(buff);
  } else if (cmd[0] == 67 && cmdsize == 1) {      // "C" - get temperature register
    Serial.print("temperature reg is ");
    Serial.println(DS3231_get_treg(), DEC);
  } else if (cmd[0] == 68 && cmdsize == 1) {      // "D" - reset status register alarm flags
    reg_val = DS3231_get_sreg();
    reg_val &= B11111100;
    DS3231_set_sreg(reg_val);
  } else if (cmd[0] == 70 && cmdsize == 1) {      // "F" - custom fct
    reg_val = DS3231_get_addr(0x5);
    Serial.print("orig ");
    Serial.print(reg_val, DEC);
    Serial.print("month is ");
    Serial.println(bcdtodec(reg_val & 0x1F), DEC);
  } else if (cmd[0] == 71 && cmdsize == 1) {      // "G" - set aging register (here simply reset to 0)
    DS3231_set_aging(0);
  } else if (cmd[0] == 83 && cmdsize == 1) {      // "S" - get status register
    Serial.print("status reg is ");
    Serial.println(DS3231_get_sreg(), DEC);
  } else {
    Serial.print("unknown command prefix ");
    Serial.println(cmd[0], DEC);
  }
}

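For reference, parse_cmd listens for single-line commands on the serial monitor. The time is set with a 16-character string in the form TssmmhhWDDMMYYYY; as an illustrative value (not from the original post), sending T301512509022018 would set the clock to 12:15:30 on 9 February 2018, with 5 as the day-of-week digit. Single characters such as 1, 2, C and S read back the alarms, the temperature register and the status register.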

Machine Seeing

Three readings this week:
Ways of Machine Seeing by Geoff Cox
A Future for Intersectional Black Feminist Technology Studies by Safiya Umoja Noble
How we are teaching computers to understand pictures by Fei-Fei Li

“Drawing on the two readings, consider your example in relation to ‘ways of machine seeing’.”


Inspired by Lisa Nakamura’s keynote speech at this year’s Transmediale Berlin, “Call Out, Protest, Speak Back”, I will be looking further into the writings of bell hooks (Gloria Jean Watkins) and her influence on intersectional thought. Nakamura’s presentation focuses on VR, how it is being sold by the big tech companies, and how their marketing presents black women as users of the medium. She points out how superficial and misleading this image is. This connects with the second of the readings, by Safiya Umoja Noble.

Supershapes formula

Paul Bourke’s website, inspired by Johan Gielis’s work, deals with supershapes among other geometry.


see also

Extended to 3D, the formula becomes:
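(The formula itself was embedded as an image in the original post. For reference, and matching the Processing code below, the 2D superformula is

r(θ) = [ |cos(mθ/4) / a|^n2 + |sin(mθ/4) / b|^n3 ]^(−1/n1)

and in 3D two such radii, r1 over longitude θ and r2 over latitude φ, are combined as

x = r · r1(θ) cos θ · r2(φ) cos φ
y = r · r1(θ) sin θ · r2(φ) cos φ
z = r · r2(φ) sin φ

where r is an overall scale.)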

Daniel Shiffman has an excellent YouTube video on this; here is the code for the Processing sketch, taken from his GitHub:


Also see Reza Ali’s website for more supershapes.


To do:

Take the code below and convert it to a point cloud.

// Daniel Shiffman
// Code for:

import peasy.*;

PeasyCam cam;

PVector[][] globe;
int total = 75;

float offset = 0;

float m = 0;
float mchange = 0;

void setup() {
  size(600, 600, P3D);
  colorMode(HSB);                  // hue-based fill() values below assume HSB
  cam = new PeasyCam(this, 500);
  globe = new PVector[total+1][total+1];
}

float a = 1;
float b = 1;

float supershape(float theta, float m, float n1, float n2, float n3) {
  float t1 = abs((1/a)*cos(m * theta / 4));
  t1 = pow(t1, n2);
  float t2 = abs((1/b)*sin(m * theta/4));
  t2 = pow(t2, n3);
  float t3 = t1 + t2;
  float r = pow(t3, -1 / n1);
  return r;
}

void draw() {
  background(0);

  m = map(sin(mchange), -1, 1, 0, 7);
  mchange += 0.02;

  float r = 200;
  for (int i = 0; i < total+1; i++) {
    float lat = map(i, 0, total, -HALF_PI, HALF_PI);
    float r2 = supershape(lat, m, 0.2, 1.7, 1.7);
    //float r2 = supershape(lat, 2, 10, 10, 10);
    for (int j = 0; j < total+1; j++) {
      float lon = map(j, 0, total, -PI, PI);
      float r1 = supershape(lon, m, 0.2, 1.7, 1.7);
      //float r1 = supershape(lon, 8, 60, 100, 30);
      float x = r * r1 * cos(lon) * r2 * cos(lat);
      float y = r * r1 * sin(lon) * r2 * cos(lat);
      float z = r * r2 * sin(lat);
      globe[i][j] = new PVector(x, y, z);
    }
  }

  offset += 5;
  for (int i = 0; i < total; i++) {
    float hu = map(i, 0, total, 0, 255*6);
    fill((hu + offset) % 255, 255, 255);
    beginShape(TRIANGLE_STRIP);
    for (int j = 0; j < total+1; j++) {
      PVector v1 = globe[i][j];
      vertex(v1.x, v1.y, v1.z);
      PVector v2 = globe[i+1][j];
      vertex(v2.x, v2.y, v2.z);
    }
    endShape();
  }
}

Wearables Project – Information Sleeve – Concept

Weather, time, compass… and more: a high-viz sleeve for bikers and outdoor use

Be seen and anticipate problems on the road:

I did 14,000 miles last year: rode up to Tbilisi and down to Athens, plus two trips to France and the Pyrenees. I have information on board the bike (GPS, time, temperature, mpg, range…) but I have to click through the information button to get each piece of it. I would like something self-powered and easy to read while I am riding, particularly in bad weather. It would be a challenge to make something rugged and useful for long journeys.

I do not necessarily need another GPS, but a digital compass would be nice, along with a bright addressable LED strip along the sleeve to show temperature, and possibly barometric pressure as well, by changing colour and the number of lit LEDs.

  • Temperature – shown by LEDs along the arm of the sleeve
  • Battery – NiMH power bank
  • Ruggedised OLED display
  • Waterproof multicolour addressable LED strip; display temperature by the number of lit LEDs and barometric pressure by colour (see the sketch after this list)
  • Numeric displays for readouts
  • Real-time clock DS3231
  • HMC5883L digital compass
  • Barometer
  • Humidity sensor
  • MPU-6050 gyro/accelerometer
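As a rough sketch of how the temperature/pressure strip item above could work: pin, pixel count and sensor ranges here are illustrative assumptions, and the readings are placeholders until real sensors are wired in.

#include <Adafruit_NeoPixel.h>

#define STRIP_PIN   6
#define NUM_PIXELS  16

Adafruit_NeoPixel sleeve = Adafruit_NeoPixel(NUM_PIXELS, STRIP_PIN, NEO_GRB + NEO_KHZ800);

// Show temperature as the number of lit pixels and barometric pressure as their colour.
void showWeather(float tempC, float pressureHPa) {
  // -10..40 C mapped onto 0..16 lit pixels
  int lit = constrain(map((long)tempC, -10, 40, 0, NUM_PIXELS), 0, NUM_PIXELS);
  // 980..1040 hPa mapped onto a blue (low) to red (high) colour
  int redness = constrain(map((long)pressureHPa, 980, 1040, 0, 255), 0, 255);
  sleeve.clear();
  for (int i = 0; i < lit; i++) {
    sleeve.setPixelColor(i, sleeve.Color(redness, 0, 255 - redness));
  }
  sleeve.show();
}

void setup() {
  sleeve.begin();
  sleeve.setBrightness(40);
  sleeve.show();
}

void loop() {
  // Placeholder values; real readings would come from the barometer/temperature sensors above.
  showWeather(12.5, 1012.0);
  delay(1000);
}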

Possible enclosure for components?

LED addressable strip for temp/air pressure display

learn plastic welding of seams

I2C seven-segment display for time

OLED display for compass and other information

Waterproof addressable LEDs sealed into the sleeve

Powerbank to power the unit.

On/off button inside waterproof cushion

This concept for the wearables project is not necessarily ‘art’, but it offers an opportunity to design and make a rugged, easy-to-read sleeve one can wear outdoors: walking, cycling, motorcycling. As a keen motorcyclist, I already have a multi-function sensor display on the bike, but it is hard to read in bad weather, fiddly to operate, and can only show one piece of information at a time. This project will combine as many of the sensors listed above as I can possibly include.

I will experiment with waterproofing techniques, aiming to seal in an OLED screen and numeric LEDs. All will be connected via the I2C bus and powered by a decent power bank. A timer will shut the unit down to save power; pressing a button inside the waterproof cover will wake the device up or shut it down. I will use a LilyPad or similar to control the sensors.

Add Pi Zero with Camera? Luxury version!

Investigate friction welding to waterproof the items in pockets inside the sleeve.

Use of silicone and polycarbonate enclosure to protect OLED inside bespoke 3D printed case.

References: (more to be added)

1. Waterproofing article I found useful

2. Compass as a ring of LEDS

An exercise in intimacy

We were asked to pair up and touch each other’s palms for three minutes: 90 seconds with eyes closed, 90 seconds with them open, then to write up our experiences, taking 10 minutes.

Reflections on bodily contact

My view of Matthew:

The first part was to stare into each other’s eyes for ten seconds. This felt like ten minutes; in fact, we had to make several attempts at the gazing preliminary as one or other of us would look away or laugh, distracting ourselves. I do not know my opposite number beyond seeing him in class, so it was all genuinely difficult and left me feeling surprisingly uncomfortable.

The main feeling I experienced throughout was pain. I shut my eyes, at first uneasy at the unnatural circumstances of touching a stranger, so to speak. We English are so reserved… My hands were so warm, his hands cold from just walking into the classroom. I began to lose all sense of what was normal… the effort of holding up my arms made it feel like I was holding up the other person, as in a circus trick, balancing upwards. My fingers compensated and micro-adjusted so the fingertips would not ‘fall off’ the ends of his fingertips, like balancing on the edge of a precipice, a high wire. Feeling pulses of movement and the heat transferring from my hands into his. Each finger would twitch.

Releasing our shut eyes and opening them made the task even harder. Now I had to worry about having to avert my gaze. When our eyes meet I look away, embarrassed. We are conditioned not to threaten each other with this gaze avoidance, I think. If I did this to my dog, he would look away, just the same. I glance round the room, looking at the others to see what they are doing, some relaxed, some in embarrassed discomfort, just like me… I carry on, this seems like it is taking hours, certainly not something I would have chosen to do, but at last we are released and I drop my arms in absolute relief.

I wonder what I can take from this; the distortion of time, sensory input magnified with eyes closed.

This was Matthew’s viewpoint:

The contact zone moved. As our palms touched and our fingers aligned we knew this would be a long minute and a half. James’ hands felt large and warm. The pressure created between us was enough to sustain the strain of our extended limbs.

I could sense movement, twitches and some rigidity from James. Personally I was calm. A little apologetic for the coldness of my own hands. Questions started arriving. Was I sweating? Was I moving a lot? I felt like James was doing the moving, still I thought of the relationship between driver and passenger in a car. The driver anticipates.

After thirty or forty seconds my left index finger began to slip. It crept leftwards, gradually heading for the valley. Would we soon interlock fingers? I didn’t move, curious as to where this would go. James blinked first and corrected our alignment. The minute game of chicken was over.

When we opened our eyes James would not hold my gaze for more than a few seconds. Our separate selves had bonded for a few minutes there. I continued with the exercise and stared at James. Taking in his face, his eyes, his hair. He is several decades older than I am, I knew this change would happen to me too.


Part two

My viewpoint:

This time we did not touch: we closed our eyes for 90 seconds, then I used my phone to film in time-lapse mode, observing my partner through the phone for 90 seconds.

With eyes closed, I was totally disconnected from him. The classroom disappeared and I was in meditative mode; after over 25 years of meditation practice, I am conditioned to draw within, and I began to watch myself. Again, Gurdjieff watched over me, so to speak; I was observing myself, my thoughts passing, music, memories of my days when I was in the ‘work’. Now, John Cale… drifting thoughts swirling, bringing my mind back… but still no appearance of my partner in front of me… until I realised I had a task: to be in the room with my opposite number.

The 90 seconds seemed long, but I was comfortable this time, happy to spend another hour if need be.

The daylight returns and my task reappears, I hold my phone up and video Matthew in fast motion – maybe he would like to see it, it did not record any particular blow by blow representation, I wanted to shrink time if I ever came to look at it again. He looks a little uneasy, but I am thankful it was not me being observed. I saw him looking at me looking into my phone, he did not seem to like the idea, neither did I, like a sort of voyeuristic thing, I felt guilt. He was a victim, I was the prison warder forced to observe my prisoner. I was dominant in the exercise, not any better position to be in than the observed subject.

The time passed more slowly during the observation-through-the-phone sequence; it was not enjoyable. We were back dealing with our intimacy, and despite having given each other permission to do this, I was glad the second part was over.

And Matthew’s view:

I’m searching for James with my eyes closed. We are no longer connected, only present together. The yellow and orange of my eyelids turns to a muddy green. The hairs on my fingers are bristling. They’re searching for contact. My stomach rumbles and I’m reminded of my hunger. I distract myself with a few bars of a song.

When we open our eyes this second time James has been told to take out his phone and views me through it. I stare dead-eyed into the lens. Knowing this black circle will be the locus of my attention for the next period, I settle in.

I try to move the camera with my stare. That is I’m trying to move the man. I visualise pushing the device away to the side. The phone does start to move, a hand swap indicates that this is fatigue not telekinesis.

James looks away from the camera. I know he doesn’t enjoy this, I find this fun.

How does my face look? Am I locking with his invisible eyes? In the pre-meditation I considered discreetly lifting my hood and pulling the cords tight. Not out of shame or shyness but to make James laugh upon opening his eyes.


Readings; Rose Woodcock and Nishat Awan

Digital Narratives and Witnessing: The Ethics of Engaging with Places at a Distance by Nishat Awan, and Instrumental Vision by Rose Woodcock

We had two readings this week; the common theme was how technology impacts our perception of the world. In the first, we hear of the ethics of engaging with places at a distance; in the second, of engaging with objects ‘closer up’, in our heads, as part of virtual reality.

Nishat Awan describes how we in the developed world have a distorted view of our surroundings, particularly of distant locations. He proposes that this has been brought about to a large extent by the technologies of social media and also by the warfare technology of drones. He vividly describes his own direct experience of the locality of Gwadar, Pakistan, and how it presents itself through the lens of digital media. Gwadar, his example, is mediated as a place through power topologies [1], where the notion of distance is removed: the remote sensing of places, digital maps and even connections with people over great distances. As the paper puts it: “Topology in this context highlights the intensive nature of the world that such technologies create because as power reaches across space it is not so much traversing across a fixed space and time, as it is composing its own space–time.”

In addition, he points out how little critical engagement there is with the ways in which these technologies mediate our engagement with place. He expands this using the urgent example of localities in crisis, whether humanitarian or war, or both: Haiti, for example, where humanitarian aid can be accelerated through technology without the need to get too caught up in the crisis itself, thus keeping the aid agencies out of harm’s way. One side effect of this is to create the false impression that the distant locality is in constant crisis.

He recalls how a crisis is communicated: Michael Buerk reporting the Ethiopian famine for the BBC in 1984 gave us an immediate emotional response on the news. This was perhaps one of the earliest and best examples of how aid is stimulated; now, with digital media, this is central to the methodology of calling for help. The example used by Awan is Clouds Over Sidra, a virtual reality film about a girl in the Za’atari refugee camp in Jordan, shot in 2015 and presented to the World Economic Forum in Davos. This presents the story in a different way: “There is an authenticity and immediacy associated with such images, but at the same time they are easily exploited, misinterpreted, and hijacked by powerful actors.”

Dronestream, an artwork by James Bridle (Tate) uses publicly available Google Earth to show the after effects of drone strikes. Quote: “the politics of witnessing takes another twist. When difficult stories are being told by distant others, then the testimony of presence is suddenly rendered ineffective.”

We hear another example of how technology has changed witnessing, through citizen reporting: Eliot Higgins of Bellingcat, “tracking of missiles from Russia to parts of Ukraine under Russian control and of proving through this practice of tracking and location that a Russian-made missile was responsible for bringing down Malaysian Airways Flight MH17”… “what happens to the witness when the claims that are being made do not come from the testimony of individuals but are made through combining multiple narratives? Where do you locate the political subject in such an account and does it matter that witnessing can no longer be attributed to just one person? Are the multiple volunteers that contribute to Bellingcat the authors of this work or is it the various people from social media whose information has been used to piece together an account, or is it in actuality the figure of Higgins and his organization?”

These examples (and others in the paper) describe emergent practices that combine spatial analysis with investigative journalism to engage with places in conflict, where it is difficult to spend time in the field. However, despite their authenticity, this perhaps oversimplifies the events that took place.

They only tell part of the story; they lack the testimony of people on the ground and other ways of seeing, such as a feminist geopolitical viewpoint.

He describes at length the history of his chosen locality, Gwadar, culminating in an earthquake disaster that killed 800 people and was largely ignored by the international media, and how social media has played the greater part in healing from the disaster. However, this has not been without its downside; ‘dirty linen’ has been aired as well.

Digital technologies have transformed how we engage with distant places, but these techniques have also come with their limitations, particularly with regard to witnessing events. However, the earthquake in Gwadar has shown how social media can ameliorate matters, where the broader lens of the international media misses it.

The second reading, Instrumental Vision by Rose Woodcock, deals with another aspect of perceptual change through technology. She examines the ‘practice’ of vision and considers on what basis vision can have its own ‘materiality’. She asks many questions: what comes first, pictures or the capacity to see things pictorially?

Woodcock describes the work of Gibson, a perceptual psychologist who worked on an instructional film for AAF fighter pilots. He found that film was a far more effective means of training than any training manual, and went on to “develop a theory of what a motion picture shot could do that nothing else could”. Gibson’s “emphasis on how the animated display gives the observer a sense of ‘continuous covariation in time’ and particularly, how it inserted the observer’s own view-point at the centre of the flow of images, is more like a description of a virtual reality display than of conventional cinema.”

She concludes that immersive stereoscopic VR is uniquely empowered to instantiate the lit environment, since it alone, as a system of visual representation, can array surfaces in three dimensions, and it can illuminate itself. Virtual imaging technology is thus well fashioned as a fine tool for the creation not of pictorial illusionistic space but of three-dimensional, corporeal “actual” space. She goes on to describe Caravaggio’s Supper at Emmaus and how the artist conveys a sense of space in 2D oils through the use of reflected light, perspective and foreshortening.

“Realism” as opposed to “realness” thus marks a definitive difference between pictorial and real-world perception respectively. This difference is epistemological, rather than a matter of degree (for example, of detail or resolution), and corresponds to the way assumptions about what vision is for, find expression and manifest differentially within the enterprises of pictorial representation and the design of virtual worlds.


[1] Allen, J. 2011. Topological twists: Power’s shifting geographies. Dialogues in Human Geography 1 (3): 283–98.

Kinect Hack- doing without an adapter for connection to PC or Mac

Microsoft stopped making the adapter that joins a Kinect to a PC or a Mac in October 2017. In the video I go through the steps to work around this problem. A second-hand adapter now sells for about £18 on eBay; originally they were free.

First, rip out the cable (plug) with a pair of pliers. It will be tough, but it does come out! You are fighting with a square rubber grommet inside the case, which can be removed later. The next step is to watch the video; it’s about 35 minutes long.


You will need :

  • nylon tie wraps
  • female 2.1mm socket for connecting to 12V 1.5A power supply (you need that as well)
  • a USB type C “A to B” cable (as found with some scanners or printers)
  • A set of torx security tools

Led 3 x 3 Matrix and Neopixel LED strip

Link to video of Challenge 1 and 2

The exercises started in Hatchlab with breadboards… I found it frustrating that the connections I made on the breadboard were rather fragile, but I went ahead and got the two exercises underway.

I decided to take the homework home and solder up a mini board with my Nano to make a testbed for the matrix. It took a little longer than I would have liked, but it was worth it. I drew a sketch of the connections (snapshot in the video) rather than drawing it in Fritzing; it was quicker and helped me figure out what I had to do.

I started with the 3 × 3 matrix, then added another row of 3 LEDs and converted my Arduino sketches accordingly. This is also shown in the video.
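For reference, this is the kind of row/column scan involved. The pin numbers are illustrative, not my actual wiring, and it assumes the LED anodes are on the row pins and the cathodes on the column pins through current-limiting resistors:

// Row/column scan for a small LED matrix (3 columns x 4 rows after the extra row)
const int rowPins[4] = {2, 3, 4, 5};
const int colPins[3] = {6, 7, 8};

void setup() {
  for (int r = 0; r < 4; r++) {
    pinMode(rowPins[r], OUTPUT);
    digitalWrite(rowPins[r], LOW);    // rows off
  }
  for (int c = 0; c < 3; c++) {
    pinMode(colPins[c], OUTPUT);
    digitalWrite(colPins[c], HIGH);   // columns off
  }
}

void loop() {
  // light each LED in turn by driving its row high and sinking its column
  for (int r = 0; r < 4; r++) {
    for (int c = 0; c < 3; c++) {
      digitalWrite(rowPins[r], HIGH);
      digitalWrite(colPins[c], LOW);
      delay(100);
      digitalWrite(colPins[c], HIGH);
      digitalWrite(rowPins[r], LOW);
    }
  }
}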

The NeoPixel library had some good examples, and I used them to help learn how to use my 17-pixel strip (which I borrowed!).

I have ordered some more NeoPixel-type strip from China and will return to make a more elaborate display when that arrives. I am thinking of a 10 × 10 display.


Walkthrough of Google Arts and Culture App

I used my Samsung Galaxy tablet to install Google’s Arts and Culture App.

Having mixed feelings about Google, I try not to use their services too much. This app, however, did surprise me, despite the initial impression of art hangings in the hallways of an expensive hotel or a modern hospital: anodyne, avoiding any unnecessary or embarrassing subjects, catering for well-heeled international tourists or a bored business traveller who has already read the in-flight magazine twice.

Digging deeper, it offered me the promise of notifications (weekly) – I had to trade in my privacy again, offering my location and almost certainly logging the items I choose to look at.

Let’s look at some of the screenshots as I installed and ventured into the app:

The iconic Classical Greek temple with its cartoonish, dumbed-down take on the Golden Mean, now reduced to a fast-food logo.

1 million downloads! In the Play Store it has a 3.8 rating from 17,862 people. It is classified under Education, and the similar apps suggested are all Google apps! Not a very good classification in the Play Store, and quite a few negative comments: 3,713 one-star and 10,271 five-star reviews.

“pretty disappointed because of the region lock and lack of proper communication about it. Have been checking every day hoping it would be unlocked, hopoefully soon. Also it would be cool if you could save articles you like to go back and download photos like so many other art/history museum archives are letting you do now days (sic).”

“region locking a feature like that makes no damn sense… “ etc

Looks like you need a VPN…

Also, what about taking screenshots, like I did??

However, I was viewing features on art in Japan and the US…


The red Greek temple logo shows collections you can open and visit in the app; the orange dots are venues with opening times and a Google map.

Allow Google to track your location??

Collections are grouped under subject or theme..

Zoom into a Google-approved artist

Who decides what to put there? Is this paid for by the exhibitor/curator? If it is, how does this allow smaller, more innovative galleries to show themselves? All very “gallery-system”.

Korean artist, good, I have found some new material..

English Heritage, very counter-culture!

Fodder for the tourists…

I liked the experimental section; this already exists on another Google site. Good material buried under ‘tourist’. This was lumped in with the English Heritage material.

The nice feature I found in this app is that certain galleries provide a Google Street View style walkthrough of the gallery interior. For example, the Tokyo Fuji Art Museum:

The Japanese text was in kanji. Naturally, Google offers Google Translate…

Walk round the gallery…

and every nook and cranny

The intended audience is mostly visitors to a country: London has 52 collections located on the app’s Google map. This is a useful resource. However, as a means of exploring and researching foreign collections, it is somewhat limited.

The fairly mainstream and ‘establishment’ slant does not go far enough to push any boundaries in the creative arts, but it is not really designed to do this. It is a glorified guide book.

In terms of ANT, the app will definitely inform the interested art hunter and perhaps alter their approach to exploring the gallery world. In turn, another effect of the non-human network is to facilitate sharing a user’s likes and dislikes, as in so many other apps.

I am uncertain as to the extent to which the app will alter its presentation to each user because, as in all apps, the internal workings are hidden. If the app knows the exact identity of each user, then this would be more practical to achieve.

As to what the generated metadata might be used for, it is uncertain; there is no obligation to reveal your identity as such (no password required to enter, etc.). I am in no doubt, though, that Google knows enough about you to identify you, as a user will almost certainly have provided login information identifying them through other sister Google apps.


A few hidden treats offer the opportunity to roam around some of the collections; perhaps more could be provided.

The opportunity to share ‘liked’ items persists throughout; this is perhaps the only networked aspect of the app. The subject matter is not very conducive to extending in this way.

A couple more locations, first Thailand – the imagery really is repulsive, sorry!

I wonder how well these would sell at the Frieze? Art collectors? Oligarchs?

Another feature, zoom!

One of the few featured collections shown on the App in London, I love visiting here…

MoMA, good

Select by timeline

The walkthrough method

We are asked to think on the following:

– What is the walkthrough method?

– What is the methodology of the walkthrough method?

– How would you carry out the walkthrough method?


We were introduced to the paper The walkthrough method: An approach to the study of apps by Ben Light, Jean Burgess and Stefanie Duguay [1].


The study of Apps and their sociocultural and economic effects is proposed and a formal methodology is described in this paper.


The environment of expected use and the technical walkthrough are part of what is termed Critical Technocultural Discourse Analysis (CTDA), which begins with forensically examining the environment of expected use. This includes identifying the app’s vision, its operating model and its governance. The walkthrough process then builds a foundational corpus of data, starting with the app’s intended purpose and its culturally embedded meanings, and stepping through all the processes involved in registering the user with the app (if required). Further to this, the technical walkthrough would incorporate a data-gathering procedure covering not only registration but also everyday use of the app and how a user would go about leaving the app, closing an account if one has been opened, and so on.

The walkthrough method uses interpretive techniques; Science and Technology Studies (STS) and cultural studies as a lens for app analysis. The walkthrough method as we use it is grounded in the principles of Actor-Network Theory (ANT), as a specific aspect of STS.

Within ANT, there are intermediaries and mediators, which in turn can be human or non-human. The intermediaries pass on meaning unchanged through a network of relations, while the mediators may transform meaning. An example in an app might take some information and suggest related things; the example given in the paper was a dating app which, having gathered certain likes/dislikes, may suggest further likes for the user’s profile.

The way the app presents itself, its menu structure (in more playful apps this may frequently change), the size of buttons, graphics, physical interaction gestures (e.g. the swipe in Tinder): these all go towards a transformative action by the non-human mediator to effect change in the user.

What happens when the app is running (when removed from the user’s screen – or even, when machine is switched off)?

What happens when a user subverts the app by using it for a ‘non-intended’ use?

Consideration of affordances – again, is the app presenting itself to bias the user in their reactions?

What extra features/ changes occur over extended periods of use, not apparent in the initial walkthrough?

I am not clear on how best I would go about using the walkthrough method.

  1. The paper seems very Anglo-centric; is it limited in its use? For example, how would this work in Japanese culture?
  2. The theory is presented well, but the practice of the methods described needs fleshing out.




[1] Ben Light, Jean Burgess and Stefanie Duguay, ‘The walkthrough method: An approach to the study of apps’. DOI: 10.1177/1461444816675438