“Face Value” Transmediale 2018

 

My last visit to Berlin was in early March 1990, a few months after the fall of the Berlin Wall. Checkpoint Charlie was still in operation while I was there, and East Berlin still wore the ragged clothing of years of Soviet rule: the flower stall with its one bucket of daffodils; the greengrocer's shop, a pile of huge unwashed potatoes with a can of Coca-Cola placed on top.

Families wander through the park, some leaning against a hot dog stand – the whole family sharing one hot dog. Trabants sit broken down all along the barbed-wire-fenced motorway leading out of Berlin to West Germany; rows of repaired, identical alarm clocks with identical cardboard labels wait in the department store for customers to collect them. Capitalism manifests itself again in grimy car parks: Polish families on blankets, selling their children's toys for bread, as vile West Germans swoop in to buy up the bric-a-brac from the comfort of their Mercedes and BMWs.

Today, capitalism oozes out of every crevice, from 'hipster' Kreuzberg to the immense tower blocks capturing space all over the city. The transformation of the city was a shock to me – and yet its energy, and that of its people, helped me put those memories aside and threw me into a future… But why does no one speak German?

I recall why I am in the city: to continue my innocent ventures into Computational Art. I spent almost all of my four days attending Transmediale 2018 – I decided from the outset that I would attend as many talks as I could endure, allowing for mental and physical fatigue. My college classmates rolled in at 06:30 just as I was showering and getting dressed…

No, I did not 'enjoy' all the talks; overall, though, the sessions were extremely stimulating, and I had my likes and dislikes.

First, the dislikes.

The panel discussion Nefarious Values: On Artistic Critique and Complicity – Marc Garrett, Eric Kluitenberg, Sven Lütticken, Ana Teixeira Pinto, belit sağ, Lioudmila Voropai; moderated by Marc Garrett.

Eric Kluitenberg was absent. Sven Lütticken gave a measured, slightly vague presentation on rising inequality and the failure of capitalism. Lioudmila Voropai was really hard to follow: her English did not flow and her presentation was poor. Her main points concerned solely the question of critique and how the artist develops his or her practice; she seemed to ramble and could not make any point with clarity. Ana Teixeira Pinto was not much better, more concerned with rattling off a very dense text, proudly proclaiming she had finished it in 2 minutes 30 seconds of the allotted 5-minute slot. The third panellist (not listed) over-ran but showed a two-minute video concerned with the plight of the Kurds, war criminals in Turkey and censorship. She struck me as earnest and genuine, but the moderator seemed to take a dislike to her and challenged her, demanding she give some 'answers/solutions' to the problem. I left feeling quite annoyed at the pompous Voropai and the vague co-presenters, and felt it a wasted hour.

'Fuck Off Google' was another disappointment to me, if only because it reminded me of the unfocused anarchist meetings I attended during the Occupy period in 2011. The general premise: to demonstrate against the start-up space and Google's plan to move into Kreuzberg, a distinctly bohemian/creative area of Berlin which the presenters claimed is rapidly becoming gentrified. The two presenters were convincing enough, but I felt I could have spent less time at the session, even though I arrived a few minutes late. It left me asking: how far can we divorce ourselves from the racial capitalism that continues to dominate our world without completely descending into nihilism? There is a tension between the world as we have it and an idealised world we would like to have. It is important to speak up and point out imperfections, one of which is 'greenwashing' – corporates such as Google throwing crumbs of money off the table to appear green and 'right on/trendy/cool'. Google has gentrified cities like San Francisco; now Berlin is in its sights. I felt the motivation of the talk was valid, but the free-form questioning exposed how little the speakers had to say and went on too long.

However, I really liked all the other presentations of the day, particularly the keynote speech by Jonathan Beller on derivative living: "Platform Communism: A Program for Derivative Living Against Regimes of Informatic Subsumption". It is worth following up with the video and audio of his talk.


Beller described, eloquently and in a well-structured way, Toxic Media, Toxic Finance, Toxic Information and New Economic Spaces/ECSA.

His conclusion: blockchain offers technologies that could let small communities bypass the toxic capitalist system, a small window of opportunity for those imagining a new beginning. He did not claim to describe how this might all play out, as the technology is still in its infancy. The questions raised at the end were excellent too, challenging him, but he maintained his composure and responded credibly.

Another favourite of mine, the keynote speech by Professor Lisa Nakamura, 'Call Out, Protest, Speak Back', I found the most memorable and thought-provoking. Nishant Shah gave an excellent introduction, describing Professor Nakamura as an inspiration to him.

Her talk revealed my own ignorance, but that was OK, because I benefited – I followed up on her talk, reading bell hooks, Audre Lorde et al. Professor Nakamura points out examples of misogyny and racism 'strengthened and consumed' in gaming platforms and in the presentation of VR technology products, particularly via 'new media'. She offers an example: the image of a black woman using a VR product is not only shallow, it also reveals the efforts of tech companies to counterbalance the expectations created by their true white, middle-class customer base. A stroppy member of the audience pipes up 1 hour 16 minutes in, during the Q&A, demanding an explanation from Nakamura and claiming that she herself 'knew' VR and how it works… the audience howled in horror! Professor Nakamura calmly agrees that she does not know how to make VR, but asks us to look at how these products are being sold. Well worth listening to.

Perhaps I attended too many talks, overloading my brain; however, in no particular order of merit, all of these were good: 'Soundtrack for Webcams – Live',

'Hard Feelings: A Conversation on Computation and Affect' (with my lecturer Helen Pritchard), 'The Space In-Between: The Value of Interpretation and Interaction for the Next Generation Internet', 'Politics of Forgetfulness', 'Calculating Life' (with Heather Dewey-Hagborg, excellent!), 'Artists Re:Thinking the Blockchain', 'Reimagine the Internet: Affect, Velocity, Excess', 'The Weaponization of Language' and 'Growing a Repertoire: The Preservation of Net Art as Resistance to Digital Industrialism'.

 

The full programme, with recorded presentations, is available online.

 
NeoPixel Ring Compass Road Safety Dog Jacket

I abandoned the waterproof sleeve idea and decided to do more sewing rather than PVC/plastic welding (see previous blog post).

My dog Baxter is an excellent model, and his old coat is never used because he hates wearing it, so I gathered my Arduino LilyPad, compass sensor and the other materials listed here:

Bill of Materials: waterproof.fzz

/Users/jtreg/gold/physical/waterproof.fzz

Friday, February 9 2018, 09:35:35
Assembly List
  • Part1 – LilyPad Arduino Board (type: LilyPad Arduino)
  • Part2 – Seven-Segment LED Backpack, 1.2 Inch Digits (variant: Red; part # 1270)
  • Part3 – 16 NeoPixel Ring (variant 1; part # 1463)
  • Part4 – FTDI Basic Programmer (type: Basic; voltage: 5V)
  • Real Time Clock – ZS-042 RTC Module (chip: DS3231; variant 4)
  • U1 – HMC5883L (package: 16lpcc; axis: 3; variant: smd)
  • U2 – LIPO-2000mAh (package: lipo-2000; variant: 2000mAh)
Shopping List: one of each of the above.
(I also used a 2 x AAA battery pack.)

Exported with Fritzing 0.9.3- http://fritzing.org

I tried out the compass code (other components are plugged into the breadboard – please ignore them!).

I have not yet integrated the real-time clock into the project as I ran out of time. I will be using the pixel ring to flash up an hour, minute and second pixel.

Originally I planned to use a 7-segment display for the time and temperature, but I think the part I had was faulty. Additional functions could incorporate the temperature reading off the real-time clock…
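As a sketch of that idea (not yet in the working code below), the DS3231 time could be mapped straight onto the 16 pixels of the ring. This assumes the same Adafruit_NeoPixel and ds3231 libraries used in the main listing; the pin number is a placeholder:

// Sketch only: show hour, minute and second as single pixels on the 16-pixel ring.
// Assumes the Adafruit_NeoPixel and ds3231 (DS3231_*) libraries used in the main listing.
#include <Wire.h>
#include <Adafruit_NeoPixel.h>
#include "ds3231.h"

#define RING_PIN 3   // placeholder data pin
Adafruit_NeoPixel ring = Adafruit_NeoPixel(16, RING_PIN, NEO_RGB + NEO_KHZ400);

void setup() {
  Wire.begin();
  DS3231_init(DS3231_INTCN);     // start the real-time clock
  ring.begin();
  ring.setBrightness(30);
  ring.show();
}

void loop() {
  struct ts t;
  DS3231_get(&t);                // read the current time from the DS3231

  ring.clear();
  // scale hours (0-23) and minutes/seconds (0-59) onto the 16 pixels
  ring.setPixelColor(map(t.hour, 0, 23, 0, 15), ring.Color(255, 0, 0)); // hour = red
  ring.setPixelColor(map(t.min, 0, 59, 0, 15), ring.Color(0, 255, 0));  // minute = green
  ring.setPixelColor(map(t.sec, 0, 59, 0, 15), ring.Color(0, 0, 255));  // second = blue
  ring.show();
  delay(200);
}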

A little Evo-Stik on the end of the conductive thread stops it unravelling and helps it through the tiny component holes…

External USB power socket

LilyPad sewn in!

Power test

Added compass chip

Real-time clock and battery. I ran out of conductive thread, so I used light wiring sewn down instead.

ready for walkies

My patient model. An extra waterproof layer protects the components in the rain. Best results after dark!

 

Listing for the LilyPad (work in progress)

/*

James Tregaskis

NeoPixel ring for dog jacket
—————————-
9th Feb 2018
This is code I used from two sources and merged them.
I have not yet integrated the real-time clock into the
project as I ran out of time. I will be using the pixel
ring to flash up an hour, minute and second pixel.
Originally I planned to use a 7-segment display for the time and temperature,
but I think the part I had was faulty.
Additional functions could incorporate the temperature reading off the real-time
clock.

Bill of Materials: waterproof.fzz

/Users/jtreg/gold/physical/waterproof.fzz

Friday, February 9 2018, 09:35:35
Assembly List
Label Part Type Properties
Part1 Lilypad Arduino Board type Lilypad Arduino
Part2 Seven-Segment LED Backpack 1.2 Inch Digits variant Red; part # 1270
Part3 16 NeoPixel Ring variant variant 1; part # 1463
Part4 FTDI Basic Programmer type Basic; voltage 5V
Real Time Clock ZS-042 RTC Module chip DS3231; variant variant 4
U1 HMC5883L package 16lpcc; axis 3; variant smd
U2 LIPO-2000mAh package lipo-2000; variant 2000mAh

Shopping List
Amount Part Type Properties
1 Lilypad Arduino Board type Lilypad Arduino
1 Seven-Segment LED Backpack 1.2 Inch Digits variant Red; part # 1270
1 16 NeoPixel Ring variant variant 1; part # 1463
1 FTDI Basic Programmer type Basic; voltage 5V
1 ZS-042 RTC Module chip DS3231; variant variant 4
1 HMC5883L package 16lpcc; axis 3; variant smd
1 LIPO-2000mAh package lipo-2000; variant 2000mAh
(I used a 2 x aaa battery as well)

Exported with Fritzing 0.9.3- http://fritzing.org

*/
/***************************************************************************
This is a library example for the HMC5883 magnetometer/compass

Designed specifically to work with the Adafruit HMC5883 Breakout
http://www.adafruit.com/products/1746

*** You will also need to install the Adafruit_Sensor library! ***

These displays use I2C to communicate, 2 pins are required to interface.

Adafruit invests time and resources providing this open source code,
please support Adafruit and open-source hardware by purchasing products
from Adafruit!

Written by Kevin Townsend for Adafruit Industries with some heading example from
Love Electronics (loveelectronics.co.uk)

This program is free software: you can redistribute it and/or modify
it under the terms of the version 3 GNU General Public License as
published by the Free Software Foundation.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.

***************************************************************************/

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_HMC5883_U.h>
#include "ds3231.h"
#define BUFF_MAX 128
uint8_t time[8];
char recv[BUFF_MAX];
long previousMillis = 0;
//long interval = 1000;
unsigned int recv_size = 0;
unsigned long prev, interval = 5000;
boolean doFunkyThings = false;
/* Assign a unique ID to this sensor at the same time */
Adafruit_HMC5883_Unified mag = Adafruit_HMC5883_Unified(12345);
#include <Adafruit_NeoPixel.h>

#define PIN 3

// Parameter 1 = number of pixels in strip
// Parameter 2 = pin number (most are valid)
// Parameter 3 = pixel type flags, add together as needed:
// NEO_KHZ800 800 KHz bitstream (most NeoPixel products w/WS2812 LEDs)
// NEO_KHZ400 400 KHz (classic 'v1' (not v2) FLORA pixels, WS2811 drivers)
// NEO_GRB Pixels are wired for GRB bitstream (most NeoPixel products)
// NEO_RGB Pixels are wired for RGB bitstream (v1 FLORA pixels, not v2)
Adafruit_NeoPixel strip = Adafruit_NeoPixel(16, PIN, NEO_RGB + NEO_KHZ400);

int fixedHeadingDegrees; // Used to store Heading value
float headingDegrees = 0;//heading * 180 / M_PI;
void displaySensorDetails(void)
{
sensor_t sensor;
mag.getSensor(&sensor);
Serial.println("------------------------------------");
Serial.print ("Sensor: "); Serial.println(sensor.name);
Serial.print ("Driver Ver: "); Serial.println(sensor.version);
Serial.print ("Unique ID: "); Serial.println(sensor.sensor_id);
Serial.print ("Max Value: "); Serial.print(sensor.max_value); Serial.println(" uT");
Serial.print ("Min Value: "); Serial.print(sensor.min_value); Serial.println(" uT");
Serial.print ("Resolution: "); Serial.print(sensor.resolution); Serial.println(" uT");
Serial.println("------------------------------------");
Serial.println("");
delay(500);
}

void setup(void)
{
Serial.begin(9600);
Serial.println("HMC5883 Magnetometer Test"); Serial.println("");

/* Initialise the sensor */
if (!mag.begin())
{
/* There was a problem detecting the HMC5883 ... check your connections */
Serial.println("Ooops, no HMC5883 detected ... Check your wiring!");
while (1); // halt here - the NeoPixel setup that used to follow was unreachable and is done below instead
}
// clock stuff
DS3231_init(DS3231_INTCN);
memset(recv, 0, BUFF_MAX);
Serial.println("GET time");
//
strip.begin();
strip.setBrightness(30); //adjust brightness here
strip.show(); // Initialize all pixels to 'off'
/* Display some basic information on this sensor */
displaySensorDetails();
colorWipe(strip.Color(255, 0, 0), 0);
}

void loop(void)
{
unsigned long currentMillis = millis();
/* Get a new sensor event */
sensors_event_t event;
mag.getEvent(&event);

/* Display the results (magnetic vector values are in micro-Tesla (uT)) */
// Serial.print("X: "); Serial.print(event.magnetic.x); Serial.print(" ");
// Serial.print("Y: "); Serial.print(event.magnetic.y); Serial.print(" ");
// Serial.print("Z: "); Serial.print(event.magnetic.z); Serial.print(" "); Serial.println("uT");

// Hold the module so that Z is pointing 'up' and you can measure the heading with x & y
// Calculate heading when the magnetometer is level, then correct for signs of axis.
float heading = atan2(event.magnetic.y, event.magnetic.x);

// Once you have your heading, you must then add your 'Declination Angle', which is the 'Error' of the magnetic field in your location.
// Find yours here: http://www.magnetic-declination.com/
// Mine is: -13* 2' W, which is ~13 Degrees, or (which we need) 0.22 radians
// If you cannot find your Declination, comment out these two lines; your compass will be slightly off.
float declinationAngle = 0.22;
heading += declinationAngle;

// Correct for when signs are reversed.
if (heading < 0)
heading += 2 * PI;

// Check for wrap due to addition of declination.
if (heading > 2 * PI)
heading -= 2 * PI;

// Convert to degrees
float headingDegrees = heading * 180 / M_PI;

// To Fix rotation speed of HMC5883L Compass module
if (headingDegrees >= 1 && headingDegrees < 240)
{
fixedHeadingDegrees = map (headingDegrees * 100, 0, 239 * 100, 0, 179 * 100) / 100.00;
}
else {
if (headingDegrees >= 240)
{
fixedHeadingDegrees = map (headingDegrees * 100, 240 * 100, 360 * 100, 180 * 100, 360 * 100) / 100.00;
}
}
int headvalue = fixedHeadingDegrees / 18;
int ledtoheading = map(headvalue, 0, 15, 15, 0);

// Serial.print("Heading (degrees): "); Serial.print("ledtoheading : "); Serial.print(ledtoheading); Serial.println(headingDegrees);
if (currentMillis - previousMillis > interval) {
// save the last time you blinked the LED
previousMillis = currentMillis;
doFunkyThings = !doFunkyThings;
}
doClockStuffinLoop();
if (!doFunkyThings) {
funky();
}
else {
colorWipe(strip.Color(0, 0, 255), 0);

if (ledtoheading == 0) {
strip.setPixelColor(15, 255, 0, 50); //Red
strip.setPixelColor(0, 0, 255, 0); //Green
strip.setPixelColor(14, 0, 255, 0); //Green

}
else {
if (ledtoheading == 15) {
strip.setPixelColor(0, 255, 0, 50); //Red
strip.setPixelColor(15, 0, 255, 0); //Green
strip.setPixelColor(1, 0, 255, 0); //Green
}
else {
strip.setPixelColor(ledtoheading, 255, 0, 50); //Red
strip.setPixelColor(ledtoheading + 1, 0, 255, 0); //Green
strip.setPixelColor(ledtoheading - 1, 0, 255, 0); //Green

}
}
}

strip.setBrightness(50);
strip.show();
delay(100);
}
void colorWipe(uint32_t c, uint8_t wait) {
for (uint16_t i = 0; i < strip.numPixels(); i++) {
strip.setPixelColor(i, c);
strip.show();
delay(wait);
}
}
void doClockStuffinLoop() {
char in;
char buff[BUFF_MAX];
unsigned long now = millis();
struct ts t;

// show time once in a while
if ((now - prev > interval) && (Serial.available() <= 0)) {
DS3231_get(&t);

// there is a compile time option in the library to include unixtime support
#ifdef CONFIG_UNIXTIME
snprintf(buff, BUFF_MAX, "%d.%02d.%02d %02d:%02d:%02d %ld", t.year,
t.mon, t.mday, t.hour, t.min, t.sec, t.unixtime);
#else
//Serial.println("here it is..");
snprintf(buff, BUFF_MAX, "%d.%02d.%02d %02d:%02d:%02d", t.year,
t.mon, t.mday, t.hour, t.min, t.sec);
#endif

Serial.println(buff);
prev = now;
}

if (Serial.available() > 0) {
in = Serial.read();

if ((in == 10 || in == 13) && (recv_size > 0)) {
parse_cmd(recv, recv_size);
recv_size = 0;
recv[0] = 0;
} else if (in < 48 || in > 122) {
; // ignore ~[0-9A-Za-z]
} else if (recv_size > BUFF_MAX - 2) { // drop lines that are too long
// drop
recv_size = 0;
recv[0] = 0;
} else if (recv_size < BUFF_MAX - 2) {
recv[recv_size] = in;
recv[recv_size + 1] = 0;
recv_size += 1;
}

}
}
void funky() {
// Some example procedures showing how to display to the pixels:
// colorWipe(strip.Color(255, 0, 0), 50); // Red
// colorWipe(strip.Color(0, 255, 0), 50); // Green
// colorWipe(strip.Color(0, 0, 255), 50); // Blue
rainbow(1);
rainbowCycle(1);
}

void rainbow(uint8_t wait) {
uint16_t i, j;

for (j = 0; j < 256; j++) {
for (i = 0; i < strip.numPixels(); i++) {
strip.setPixelColor(i, Wheel((i + j) & 255));
}
strip.show();
delay(wait);
}
}
// Slightly different, this makes the rainbow equally distributed throughout
void rainbowCycle(uint8_t wait) {
uint16_t i, j;

for (j = 0; j < 256 * 5; j++) { // 5 cycles of all colors on wheel
for (i = 0; i < strip.numPixels(); i++) {
strip.setPixelColor(i, Wheel(((i * 256 / strip.numPixels()) + j) & 255));
}
strip.show();
delay(wait);
}
}

// Input a value 0 to 255 to get a color value.
// The colours are a transition r – g – b – back to r.
uint32_t Wheel(byte WheelPos) {
if (WheelPos < 85) {
return strip.Color(WheelPos * 3, 255 - WheelPos * 3, 0);
} else if (WheelPos < 170) {
WheelPos -= 85;
return strip.Color(255 - WheelPos * 3, 0, WheelPos * 3);
} else {
WheelPos -= 170;
return strip.Color(0, WheelPos * 3, 255 - WheelPos * 3);
}
}
void parse_cmd(char *cmd, int cmdsize)
{
uint8_t i;
uint8_t reg_val;
char buff[BUFF_MAX];
struct ts t;

//snprintf(buff, BUFF_MAX, "cmd was '%s' %d\n", cmd, cmdsize);
//Serial.print(buff);

// TssmmhhWDDMMYYYY aka set time
if (cmd[0] == 84 && cmdsize == 16) {
//T355720619112011
t.sec = inp2toi(cmd, 1);
t.min = inp2toi(cmd, 3);
t.hour = inp2toi(cmd, 5);
t.wday = cmd[7] - 48;
t.mday = inp2toi(cmd, 8);
t.mon = inp2toi(cmd, 10);
t.year = inp2toi(cmd, 12) * 100 + inp2toi(cmd, 14);
DS3231_set(t);
Serial.println("OK");
} else if (cmd[0] == 49 && cmdsize == 1) { // "1" get alarm 1
DS3231_get_a1(&buff[0], 59);
Serial.println(buff);
} else if (cmd[0] == 50 && cmdsize == 1) { // "2" get alarm 2
DS3231_get_a2(&buff[0], 59);
Serial.println(buff);
} else if (cmd[0] == 51 && cmdsize == 1) { // "3" get aging register
Serial.print("aging reg is ");
Serial.println(DS3231_get_aging(), DEC);
} else if (cmd[0] == 65 && cmdsize == 9) { // "A" set alarm 1
DS3231_set_creg(DS3231_INTCN | DS3231_A1IE);
//ASSMMHHDD
for (i = 0; i < 4; i++) {
time[i] = (cmd[2 * i + 1] - 48) * 10 + cmd[2 * i + 2] - 48; // ss, mm, hh, dd
}
uint8_t flags[5] = { 0, 0, 0, 0, 0 };
DS3231_set_a1(time[0], time[1], time[2], time[3], flags);
DS3231_get_a1(&buff[0], 59);
Serial.println(buff);
} else if (cmd[0] == 66 && cmdsize == 7) { // "B" set alarm 2
DS3231_set_creg(DS3231_INTCN | DS3231_A2IE);
//BMMHHDD
for (i = 0; i < 4; i++) {
time[i] = (cmd[2 * i + 1] - 48) * 10 + cmd[2 * i + 2] - 48; // mm, hh, dd
}
uint8_t flags[5] = { 0, 0, 0, 0 };
DS3231_set_a2(time[0], time[1], time[2], flags);
DS3231_get_a2(&buff[0], 59);
Serial.println(buff);
} else if (cmd[0] == 67 && cmdsize == 1) { // "C" - get temperature register
Serial.print("temperature reg is ");
Serial.println(DS3231_get_treg(), DEC);
} else if (cmd[0] == 68 && cmdsize == 1) { // "D" - reset status register alarm flags
reg_val = DS3231_get_sreg();
reg_val &= B11111100;
DS3231_set_sreg(reg_val);
} else if (cmd[0] == 70 && cmdsize == 1) { // "F" - custom fct
reg_val = DS3231_get_addr(0x5);
Serial.print("orig ");
Serial.print(reg_val, DEC);
Serial.print("month is ");
Serial.println(bcdtodec(reg_val & 0x1F), DEC);
} else if (cmd[0] == 71 && cmdsize == 1) { // "G" - set aging status register
DS3231_set_aging(0);
} else if (cmd[0] == 83 && cmdsize == 1) { // "S" - get status register
Serial.print("status reg is ");
Serial.println(DS3231_get_sreg(), DEC);
} else {
Serial.print("unknown command prefix ");
Serial.println(cmd[0]);
Serial.println(cmd[0], DEC);
}
}
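For reference, the serial commands handled by parse_cmd() above (the clock half of the merged code): send a line starting with T followed by ssmmhhWDDMMYYYY (e.g. T355720619112011) to set the time, '1' or '2' to read the alarms, 'C' to read the temperature register and 'S' to read the status register.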

 

Machine Seeing

Three readings this week:
Ways of Machine Seeing by Geoff Cox
A Future for Intersectional Black Feminist Technology Studies by Safiya Umoja Noble
How we are teaching computers to understand pictures by Fei-Fei Li

"Drawing on the two readings, consider your example in relation to 'ways of machine seeing'."

 

Inspired by Lisa Nakamura's keynote speech at this year's Transmediale in Berlin, 'Call Out, Protest, Speak Back', I will be looking further into the writings of bell hooks (Gloria Jean Watkins) and her influence on intersectional thought. Nakamura's presentation focuses on VR, how it is being sold by the big tech companies, and how that marketing uses black women as users of the medium. She points out how superficial and misleading an image this is. It connects with the second of the readings, by Safiya Umoja Noble.

Supershapes formula

Inspired by Johan Gielis's work, Paul Bourke's website deals with, among other geometry, supershapes.

THE FORMULA
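The image of the formula has not survived here; for reference, the 2D superformula as it is usually written (and as implemented in supershape() in the sketch below) is:

r(\theta) = \left( \left| \frac{\cos(m\theta/4)}{a} \right|^{n_2} + \left| \frac{\sin(m\theta/4)}{b} \right|^{n_3} \right)^{-1/n_1}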

see also https://www.researchgate.net/publication/282433972_Examples_of_Supershapes

Extended to 3D, the formula becomes:
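(The original image is missing here; the 3D version used in the sketch below is the spherical product of two superformulas, r_1 over longitude \theta and r_2 over latitude \phi.)

x = r_1(\theta)\cos\theta \cdot r_2(\phi)\cos\phi
y = r_1(\theta)\sin\theta \cdot r_2(\phi)\cos\phi
z = r_2(\phi)\sin\phi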

Daniel Shiffman has an excellent YouTube video on this; here is the code for the Processing sketch, taken from his GitHub:

 

Also see Reza Ali's website for more supershapes.

 

To do: take the code below and convert it to a point cloud.

// Daniel Shiffman
// http://codingtra.in
// http://patreon.com/codingtrain
// Code for: https://youtu.be/akM4wMZIBWg

import peasy.*;

PeasyCam cam;

PVector[][] globe;
int total = 75;

float offset = 0;

float m = 0;
float mchange = 0;

void setup() {
size(600, 600, P3D);
cam = new PeasyCam(this, 500);
colorMode(HSB);
globe = new PVector[total+1][total+1];
}

float a = 1;
float b = 1;

float supershape(float theta, float m, float n1, float n2, float n3) {
float t1 = abs((1/a)*cos(m * theta / 4));
t1 = pow(t1, n2);
float t2 = abs((1/b)*sin(m * theta/4));
t2 = pow(t2, n3);
float t3 = t1 + t2;
float r = pow(t3, -1 / n1);
return r;
}

void draw() {

m = map(sin(mchange), -1, 1, 0, 7);
mchange += 0.02;

background(0);
noStroke();
lights();
float r = 200;
for (int i = 0; i < total+1; i++) {
float lat = map(i, 0, total, -HALF_PI, HALF_PI);
float r2 = supershape(lat, m, 0.2, 1.7, 1.7);
//float r2 = supershape(lat, 2, 10, 10, 10);
for (int j = 0; j < total+1; j++) {
float lon = map(j, 0, total, -PI, PI);
float r1 = supershape(lon, m, 0.2, 1.7, 1.7);
//float r1 = supershape(lon, 8, 60, 100, 30);
float x = r * r1 * cos(lon) * r2 * cos(lat);
float y = r * r1 * sin(lon) * r2 * cos(lat);
float z = r * r2 * sin(lat);
globe[i][j] = new PVector(x, y, z);
}
}

//stroke(255);
//fill(255);
//noFill();
offset += 5;
for (int i = 0; i < total; i++) {
float hu = map(i, 0, total, 0, 255*6);
fill((hu + offset) % 255 , 255, 255);
beginShape(TRIANGLE_STRIP);
for (int j = 0; j < total+1; j++) {
PVector v1 = globe[i][j];
vertex(v1.x, v1.y, v1.z);
PVector v2 = globe[i+1][j];
vertex(v2.x, v2.y, v2.z);
}
endShape();
}
}

 

Wearables Project – Information Sleeve – Concept

Weather, time, compass… and more: a high-viz sleeve for bikers and outdoor use

Be seen and anticipate problems on the road:

I did 14,000 miles last year – rode up to Tbilisi and down to Athens, plus two trips to France and the Pyrenees. I have information on board the bike – GPS, time, temperature, mpg, range… but I have to click through the information button to get each piece of it. I would like something self-powered and easy to read while I am riding, particularly in bad weather. It would be a challenge to make something rugged and useful on long journeys.

I do not necessarily need another GPS, but a digital compass would be nice – also a bright addressable LED strip to show temperature, and possibly barometric pressure too, by changing colour and lighting up LEDs along the sleeve (a rough sketch of this display idea follows the list below).

  • Temperature – shown by LEDs along the arm of the sleeve
  • Battery – NiMH power bank
  • Ruggedised OLED
  • Waterproof multicolour addressable LED strip – display temperature by number of lit LEDs and colour for barometric pressure
  • Numeric displays
  • Real-time clock (DS3231)
  • HMC5883L digital compass
  • Barometer
  • Humidity
  • MPU-6050 gyro/accelerometer
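A minimal sketch of that display idea, assuming an addressable strip driven from the Arduino; the pin, LED count and the sensor-reading stubs are placeholders until the actual sensor parts are chosen:

// Sketch only: number of lit LEDs = temperature, colour of the lit LEDs = pressure.
#include <Adafruit_NeoPixel.h>

#define STRIP_PIN 6   // placeholder data pin
#define NUM_LEDS 20   // placeholder strip length
Adafruit_NeoPixel strip(NUM_LEDS, STRIP_PIN, NEO_GRB + NEO_KHZ800);

float readTemperatureC() { return 12.0; }   // stub - replace with the real sensor read
float readPressureHPa()  { return 1005.0; } // stub - replace with the real sensor read

void setup() {
  strip.begin();
  strip.setBrightness(40);
  strip.show();
}

void loop() {
  float tempC = readTemperatureC();
  float hPa = readPressureHPa();

  // -10..40 C mapped onto how many LEDs are lit
  int lit = constrain(map((int)tempC, -10, 40, 0, NUM_LEDS), 0, NUM_LEDS);

  // 950..1050 hPa mapped onto colour: low pressure = blue, high pressure = red
  int r = constrain(map((int)hPa, 950, 1050, 0, 255), 0, 255);
  uint32_t colour = strip.Color(r, 0, 255 - r);

  strip.clear();
  for (int i = 0; i < lit; i++) strip.setPixelColor(i, colour);
  strip.show();
  delay(1000);
}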

Possible enclosure for components?

LED addressable strip for temp/air pressure display

learn plastic welding of seams

I2C 7-segment display for time

OLED display for compass and other information

Waterproof, addressable LEDs sealed into  sleeve

Powerbank to power the unit.

On/off button inside waterproof cushion

This concept for the wearables project is not necessarily 'art', but it is an opportunity to design and make a rugged, easy-to-read outdoor sleeve that one can wear walking, cycling or motorcycling. As a keen motorcyclist, I already have a multi-function sensor display on the bike, but it is hard to read in bad weather, fiddly to operate, and can only show one piece of information at a time. This project will combine as many of the sensors listed above as I can possibly include.

I will experiment with waterproofing techniques, aiming to seal in an OLED screen and numeric LEDs. Everything will be connected via the I2C bus and powered by a decent power bank. A timer will shut the unit down to save power; pressing a button inside the waterproofing will wake the device up or shut it down. I will use a LilyPad or similar to control the sensors.
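A minimal sketch of the sleep/wake idea, assuming an ATmega328-based board such as the LilyPad; the pin number and timeout are placeholders:

// Sketch only: sleep after a period of inactivity, wake on a button press.
#include <avr/sleep.h>

const int WAKE_BUTTON = 2;                 // must be an external-interrupt pin
const unsigned long TIMEOUT_MS = 60000UL;  // go to sleep after 60 s of inactivity
unsigned long lastActivity = 0;

void wakeISR() {
  // nothing to do here: waking up is the side effect of the interrupt firing
}

void goToSleep() {
  set_sleep_mode(SLEEP_MODE_PWR_DOWN);
  noInterrupts();
  sleep_enable();
  attachInterrupt(digitalPinToInterrupt(WAKE_BUTTON), wakeISR, LOW);
  interrupts();
  sleep_cpu();                             // the MCU stops here until the button is pressed
  sleep_disable();
  detachInterrupt(digitalPinToInterrupt(WAKE_BUTTON));
  lastActivity = millis();
}

void setup() {
  pinMode(WAKE_BUTTON, INPUT_PULLUP);
  lastActivity = millis();
}

void loop() {
  // ...read sensors and update the displays here...
  if (millis() - lastActivity > TIMEOUT_MS) {
    goToSleep();
  }
}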

Add Pi Zero with Camera? Luxury version!

Investigate friction welding to waterproof the items in pockets inside the sleeve.

Use of silicone and polycarbonate enclosure to protect OLED inside bespoke 3D printed case.

References: (more to be added)

1. Waterproofing article I found useful

2. Compass as a ring of LEDs

An exercise in intimacy

We were asked to pair up and touch each other's palms for three minutes – 90 seconds with eyes closed, 90 seconds with them open – then to write up our experiences, taking 10 minutes.

Reflections on bodily contact

My view of Matthew:

The first part was to stare into each other's eyes for ten seconds. This felt like ten minutes; in fact, we had to make several attempts at the gazing preliminary, as one or other of us would look away or laugh, distracting ourselves. I do not know my opposite number – I have seen him in class – so it was all genuinely difficult and left me feeling surprisingly uncomfortable.

The main feeling I experienced throughout was pain. I shut my eyes, at first uneasy at the unnatural circumstances of touching a stranger, so to speak. We English are so reserved… My hands so warm, his hands cold from just walking into the classroom. I began to lose all sense of what was normal… the effort of holding up my arms made it feel as if I were holding up the other person, as in a circus trick, balancing upwards. My fingers compensated and micro-adjusted so that my fingertips would not 'fall off' the ends of his, like balancing on the edge of a precipice, a high wire. Feeling pulses of movement and the heat transferring from my hands into his. Each finger would twitch.

Releasing our shut eyes – opening them – made the task even harder. Now I had to worry about averting my gaze. When our eyes met I looked away, embarrassed. We are conditioned not to threaten each other with this gaze avoidance, I think. If I did this to my dog, he would look away, just the same. I glanced around the room, looking at the others to see what they were doing: some relaxed, some in embarrassed discomfort, just like me… I carried on. This seemed to take hours, certainly not something I would have chosen to do, but at last we were released, and I dropped my arms in absolute relief.

I wonder what I can take from this; the distortion of time, sensory input magnified with eyes closed.

This was Matthew’s viewpoint:

The contact zone moved. As our palms touched and our fingers aligned we knew this would be a long minute and a half. James’ hands felt large and warm. The pressure created between us was enough to sustain the strain of our extended limbs.

I could sense movement, twitches and some rigidity from James. Personally I was calm. A little apologetic for the coldness of my own hands. Questions started arriving. Was I sweating? Was I moving a lot? I felt like James was doing the moving, still I thought of the relationship between driver and passenger in a car. The driver anticipates.

After thirty or forty seconds my left index finger began to slip. It crept leftwards, gradually heading for the valley. Would we soon interlock fingers? I didn't move, curious as to where this would go. James blinked first and corrected our alignment. The minute game of chicken was over.

When we opened our eyes James would not hold my gaze for more than a few seconds. Our separate selves had bonded for a few minutes there. I continued with the exercise and stared at James, taking in his face, his eyes, his hair. He is several decades older than I am; I knew this change would happen to me too.


Part two

My viewpoint:

This time we did not touch – we closed our eyes for 90 seconds, then I used my phone to film in time-lapse mode, observing my partner through the phone for 90 seconds.

With eyes closed, I was totally disconnected from him. The classroom disappeared and I was in meditative mode; with over 25 years of meditation practice, I am conditioned to draw within, and I began to watch myself. Again, Gurdjieff watched over me, so to speak; I was observing myself, my thoughts passing, music – memories of my days in the 'work'. Now, John Cale… drifting thoughts swirling, bringing my mind back… but still no appearance of my partner in front of me… until I realised I had a task: to be in the room with my opposite number.

The 90 seconds seemed long, but I was comfortable this time, happy to spend another hour if need be.

The daylight returns and my task reappears. I hold my phone up and video Matthew in fast motion – maybe he would like to see it. It did not record any particular blow-by-blow representation; I wanted to shrink time if I ever came to look at it again. He looks a little uneasy, but I am thankful it was not me being observed. I saw him watching me looking into my phone; he did not seem to like the idea, and neither did I – it felt like a sort of voyeurism, and I felt guilt. He was a victim; I was the prison warder forced to observe my prisoner. I was dominant in the exercise, which is no better a position to be in than that of the observed subject.

Time passed more slowly during the observation-through-the-phone sequence, and it was not enjoyable. We were back dealing with our intimacy, and despite having given each other permission to do this, I was glad when the second part was over.

And Matthew’s view:

I’m searching for James with my eyes closed. We are no longer connected, only present together. The yellow and orange of my eyelids turns to a muddy green. The hairs on my fingers are bristling. They’re searching for contact. My stomach rumbles and I’m reminded of my hunger. I distract myself with a few bars of a song.

When we open our eyes this second time James has been told to take out his phone and views me through it. I stare dead-eyed into the lens. Knowing this black circle will be the locus of my attention for the next period, I settle in.

I try to move the camera with my stare. That is, I'm trying to move the man. I visualise pushing the device away to the side. The phone does start to move; a hand swap indicates that this is fatigue, not telekinesis.

James looks away from the camera. I know he doesn't enjoy this; I find it fun.

How does my face look? Am I locking with his invisible eyes? In the pre-meditation I considered discreetly lifting my hood and pulling the cords tight. Not out of shame or shyness but to make James laugh upon opening his eyes.

 

Readings: Rose Woodcock and Nishat Awan

Digital Narratives and Witnessing: The Ethics of Engaging with Places at a Distance by Nishat Awan, and Instrumental Vision by Rose Woodcock

We had two readings this week; the common theme was how technology affects our perception of the world. In the first, we hear of the ethics of engaging with places at a distance; in the second, of objects 'closer up' – in our heads, as part of virtual reality.

Nishat Awan describes how we in the developed world have a distorted view of our surroundings, particularly of distant locations. She proposes that this has been brought about to a large extent by the technologies of social media and by the warfare technology of drones. She vividly describes her own direct experience of the locality of Gwadar, Pakistan, and how it presents itself through the lens of digital media. Gwadar, her example, is mediated as a place through 'power topologies' [1], where the notion of distance is removed: the remote sensing of places, digital maps, even connections with people over great distances. As she puts it: "Topology in this context highlights the intensive nature of the world that such technologies create because as power reaches across space it is not so much traversing across a fixed space and time, as it is composing its own space–time."

In addition, she points out how little critical engagement there is with the ways in which these technologies mediate our engagement with place. She expands on this using the urgent example of localities in crisis, whether humanitarian or war, or both. Haiti, for example, where humanitarian aid can be accelerated through technology without the need to get too caught up in the crisis itself, thus keeping the aid agencies out of harm's way. One side effect is to create the false impression that the distant locality is in constant crisis.

She recalls how a crisis is communicated: Michael Buerk reporting the Ethiopian famine for the BBC in 1984 – his report on the news gave us an immediate emotional response. This was perhaps one of the earliest and best examples of how aid is stimulated; now digital media are central to the methodology of calling for help. The example used by Awan is the virtual reality film of a girl in the Za'atari refugee camp in Jordan, shot in 2015, Clouds over Sidra, presented to the World Economic Forum in Davos. It presents the story in a different way: "There is an authenticity and immediacy associated with such images, but at the same time they are easily exploited, misinterpreted, and hijacked by powerful actors."

Dronestream, an artwork by James Bridle (Tate), uses publicly available Google Earth imagery to show the after-effects of drone strikes. To quote: "the politics of witnessing takes another twist. When difficult stories are being told by distant others, then the testimony of presence is suddenly rendered ineffective."

We hear another example of how witnessing has changed through citizen reporting: Eliot Higgins of Bellingcat, "tracking of missiles from Russia to parts of Ukraine under Russian control and of proving through this practice of tracking and location that a Russian-made missile was responsible for bringing down Malaysian Airways Flight MH17"… "what happens to the witness when the claims that are being made do not come from the testimony of individuals but are made through combining multiple narratives? Where do you locate the political subject in such an account and does it matter that witnessing can no longer be attributed to just one person? Are the multiple volunteers that contribute to Bellingcat the authors of this work or is it the various people from social media whose information has been used to piece together an account, or is it in actuality the figure of Higgins and his organization?"

These examples (and others in the paper) describe emergent practices that combine spatial analysis with investigative journalism to engage with places in conflict, where it is difficult to spend time in the field. However, despite their authenticity, such accounts perhaps oversimplify the events that took place.

They tell only part of the story; they lack the testimony of people on the ground and other types of seeing, such as a feminist geopolitical viewpoint.

She describes at length the history of her chosen locality, Gwadar, culminating in an earthquake disaster that killed around 800 people and was largely ignored by the international media, and how social media has played the greater part in recovering from the disaster. This has not been without its downside, however; 'dirty linen' has been aired as well.

Digital technologies have transformed how we engage with distant places, but these techniques also have their limitations, particularly with regard to witnessing events. The earthquake near Gwadar showed how social media can ameliorate matters where the broader lens of the international media misses them.

The second reading, Instrumental Vision by Rose Woodcock, deals with another aspect of perceptual change through technology. She examines the 'practice' of vision and considers on what basis vision can have its own 'materiality'. She asks many questions: what comes first, pictures or the capacity to see things pictorially?

Woodcock describes the work of James Gibson, a perceptual psychologist who worked on an instructional film for AAF fighter pilots. He found that film was a far more effective means of training than any manual and set out to "develop a theory of what a motion picture shot could do that nothing else could". Gibson's emphasis on how the animated display gives the observer a sense of "continuous covariation in time", and particularly how it inserts the observer's own viewpoint at the centre of the flow of images, is more like a description of a virtual reality display than of conventional cinema.

She concludes that immersive stereoscopic VR is uniquely empowered to instantiate the lit environment, since it alone, as a system of visual representation, can array surfaces in three dimensions and can illuminate itself. Virtual imaging technology is thus well fashioned as a fine tool for the creation not of pictorial, illusionistic space but of three-dimensional, corporeal, "actual" space. She goes on to describe Caravaggio's Supper at Emmaus and how the artist conveys a sense of space from 2D, in oils, with the use of reflected light, perspective and foreshortening.

“Realism” as opposed to “realness” thus marks a definitive difference between pictorial and real-world perception respectively. This difference is epistemological, rather than a matter of degree (for example, of detail or resolution), and corresponds to the way assumptions about what vision is for, find expression and manifest differentially within the enterprises of pictorial representation and the design of virtual worlds.

 

[1] Allen, J. 2011. Topological twists: Power’s shifting geographies. Dialogues in Human Geography 1 (3): 283–98.

Kinect Hack – doing without an adapter for connection to a PC or Mac

Microsoft stopped making the adapter that joins a Kinect to a PC or a Mac in October 2017. In the video I go through the steps to work around this problem. Second-hand adapters now sell for about £18 on eBay; originally they were free.

First, rip out the cable (plug) with a pair of pliers – it will be tough, but it does come out! You are fighting with a square rubber grommet inside the case, which can be removed later. The next step is to watch the video; it's about 35 minutes long.

You will also need:

  • nylon tie wraps
  • a female 2.1 mm socket for connecting to a 12V 1.5A power supply (you need that as well)
  • a USB type C "A to B" cable (as found with some scanners or printers)
  • a set of Torx security tools

LED 3 x 3 Matrix and NeoPixel LED Strip

Link to video of Challenge 1 and 2

The exercises started in Hatchlab with breadboards… I found it frustrating that the connections I made on the breadboard were rather fragile, but I went ahead and got the two exercises underway.

I decided to take the homework home and solder up a mini board with my Nano to make a test bed for the matrix. It took a little longer than I would have liked, but it was worth it. I drew a sketch of the connections (snapshot in the video); rather than drawing it in Fritzing, this was quicker and helped me figure out what I had to do.

I started with the 3 x 3 matrix, then added another row of 3 LEDs and converted my Arduino sketches accordingly. This is also shown in the video.
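For reference, a minimal row/column scanning sketch for a 3 x 3 matrix of this kind; the pin numbers and the wiring polarity (rows as anodes, columns as cathodes) are assumptions rather than the exact circuit I built:

// Sketch only: drive one row at a time, lighting the columns that should be on in that row.
const int rowPins[3] = {2, 3, 4};   // row anodes (HIGH = row active)
const int colPins[3] = {5, 6, 7};   // column cathodes (LOW = LED on)

// pattern to display: 1 = LED on
const byte pattern[3][3] = {
  {1, 0, 1},
  {0, 1, 0},
  {1, 0, 1}
};

void setup() {
  for (int i = 0; i < 3; i++) {
    pinMode(rowPins[i], OUTPUT);
    pinMode(colPins[i], OUTPUT);
    digitalWrite(rowPins[i], LOW);    // rows off
    digitalWrite(colPins[i], HIGH);   // columns off
  }
}

void loop() {
  for (int r = 0; r < 3; r++) {
    digitalWrite(rowPins[r], HIGH);                            // enable this row
    for (int c = 0; c < 3; c++) {
      digitalWrite(colPins[c], pattern[r][c] ? LOW : HIGH);    // light the selected columns
    }
    delay(2);                                                  // short dwell; fast enough to look steady
    for (int c = 0; c < 3; c++) digitalWrite(colPins[c], HIGH);
    digitalWrite(rowPins[r], LOW);                             // disable the row before moving on
  }
}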

The NeoPixel library had some good examples, and I used them to learn how to use the 17-pixel strip (which I borrowed!).

I have ordered some more NeoPixel-type strip from China and will return to make a more elaborate display when it arrives. I am thinking of a 10 x 10 display.