ATtiny85

The ATtiny85 is a tiny chip that provides a cut-down version of the Arduino: it draws much less current and costs only about £1.

I connected the ATtiny to an Arduino to burn the bootloader – this allows other programs to be loaded afterwards. I also made a harness – a dedicated, soldered circuit to simplify future programming of the ATtiny without wiring up a breadboard each time.

Once the bootloader was loaded, I uploaded the ‘Blink’ sketch to prove it works.

Here is the PCB I made to program the ATtiny; it plugs on top of the Arduino.

Underneath! Filed down the solder to make a good connection to the Arduino…

And attached three LEDs to blink.

Schematic for programmer shield

 

 

Proposal for End of Term Project: Sand Plotter

Introduction

Many designs for sand plotters can be found on the Internet. When I first thought of doing this, I was unaware of how well trodden the path is; however, I am not discouraged, because it will be a good test of fabrication and still leaves plenty of scope for designing patterns that can also react to external environmental factors.

Movement or sound could modify the emerging pattern of the plotter, which may run continuously for extended periods. The patterns are drawn with a steel ball or cylinder in an enclosed chamber, which could be circular or square.

Construction

Under the enclosed chamber is a revolving double rail, powered by a high-torque stepper motor. The double rail will have a magnet mounted on it, pulled along the y axis by another stepper motor. The control board will be an Arduino Mega with a RAMPS shield (or similar) carrying Pololu stepper drivers.

So, with one rotary motion and one lateral motion, circular patterns can be drawn in the sand.

Software to control the RAMPS board will be developed, along with some runtime scripts to demonstrate the sand plotter.

Prototype

The first stage will be to construct a frame without the enclosure, mounting the mechanical parts. A very simple test script to move the turntable and lateral axis will also be part of this stage.

Second stage is to provide accurate control of the stepper motors

Third stage is to fabricate the sand enclosure and outer box (laser cut)

Fourth stage (possibly too much in the time allotted) – to build some interactivity with external sensors, thus modulating the sand pattern in real time.

Dimensions

I do not want to build anything too large, as it becomes impractical; however, it must be large enough to look impressive in a gallery setting. So, at least 50 cm in diameter.

Examples:

I am not the first…

https://youtu.be/7SyORW-bhLQ

http://forums.jjrobots.com/forum-34.html

Main Parts list:

Laser cut enclosure

Bearings for circular movement

2 x Nema high torque stepper motors

Toothed belt, possibly 3D-printed gears, metal belt pulleys, steel rods and linear bearings (I have most of these parts from building a 3D printer)

Limit switches – these may not be mechanical but optical sensors

RAMPS shield and Arduino Mega, Pololu stepper drivers

Steel rods and 3d printed supports

Neodymium magnet

15-20 mm diameter steel ball

Privacy Concerns, Tracking, Surveillance of individuals, Public Privacy

As part of our planning for the next Assignment for Computational Arts-based Research, my theme will be Privacy Concerns, Tracking, Surveillance of individuals, Public Privacy.

 

What are the key questions or queries you will address?

How much are we being tracked by government agencies, companies?

How much is revealed already?

“In the wake of the Government’s proposed “Snoopers’ Charter”, ORG asks why intrusive new laws are being suggested, if they are needed at all and what the alternatives are. Some of the UK’s most prominent surveillance experts examine the history of UK surveillance law and the challenges posed by the explosion of digital datasets. Contributors include journalist Duncan Campbell, legal expert Angela Patrick from Justice, Richard Clayton of Cambridge University Computer Labs and Peter Sommer, Visiting Professor at De Montfort University.”

Open Rights Group

Snowden Global surveillance disclosures

What Google has Hidden

Blank Spots on the Map: The Dark Geography of the Pentagon’s Secret World

places-google-earth-wont-let-you-see

Satellite_map_images_with_missing_or_unclear_data

ispy-cia-campaign-steal-apples-secrets

Trevor Paglen

Why are you motivated to undertake this project?

This is a concern of mine; for example, only yesterday (14th February 2018), Amber Rudd announced a new means of tracking citizens’ internet activity.

Google has increased the resolution of its street-mapping software. Until recently, a house might appear as only a vague blob; not any more.

My home address is revealed by a ‘whois’ search on my website; I have to pay the hosting company extra to hide it.

Tracking through cookies – possible eavesdropping on metadata from personal emails, e.g. correspondence appearing on Facebook.

Amber Rudd and her efforts to ‘fight’ terrorism may be circumvented:

http://www.wired.co.uk/article/isis-propaganda-home-office-algorithm-asi

by…

https://www.asidatascience.com/ (recruiting now!)

What theoretical frameworks will you use in your work to guide you?

I will explore the contrast between the uninvited (yet legal) surveillance of my own ‘back yard’ and my unnoticed (and illegal) intrusion into neighbouring houses’ Wi-Fi.

If there is a problem undertaking this option, I will instead investigate ‘hidden/secret’ sites in the UK and abroad to see how much can be gathered using web tools such as Google Maps.

I may, in the end, only enact the possibility of the latter to avoid problems with the law; however, I will use publicly available data online to show how international companies like Google intrude on our private lives. I will use Actor-Network Theory to take us through the players in this scene, illustrating with real-life examples.

What theoretical frameworks will you use in the analysis of your project?

I will investigate how much can be found out about me in the public domain online.

I will explore tools for intrusion, legal and illegal. (Kali Linux tools, Aircrack-ng, Reaver, Pixiewps, Wireshark.)

I will use Actor-Network Theory to determine who is the victim and who is the perpetrator, or whether they are both.

How will you document your project?

Video capture, demonstration of software intrusion tools using Kali Linux on Raspberry Pi.

How close can I get with Google Street View to private and secret establishments? I will record this using Camtasia.

Logging into password-protected Wi-Fi in public spaces using the open source Linux distro Kali Linux (covertly) via a hidden, battery-powered headless Pi Zero. (This may not be possible, as there may be legal issues here.)

Possible logging of public data traffic and using Wireshark for forensic study.

Timeline for project milestones

Week 1, 2. Further research

Week 3. Artefact 1; covert monitoring of data

Week 4. Google Street view compilation

Budget (if any)

Raspberry Pi 3, Pi Zero (already have these)

 

“Face Value” Transmediale 2018

 

My last visit to Berlin was in early March 1990, a few months after the fall of the Berlin Wall. Checkpoint Charlie was still in operation while I was there, and East Berlin still wore the ragged clothing of years of Soviet rule: the flower stall with its bucket of daffodils; the greengrocer’s shop, a pile of huge unwashed potatoes with a can of Coca-Cola placed on top.

Families wandered through the park, some leaning against a hot dog stand – the whole family sharing one hot dog. Trabants lay broken all along the barbed-wire-fenced motorway leading out of Berlin to West Germany; rows of repaired identical alarm clocks with identical cardboard labels waited for customers to collect them in the department store. Capitalism manifested itself in grimy car parks, with Polish families on blankets selling their children’s toys for bread as vile West Germans swooped in to buy up the bric-a-brac from the comfort of their Mercedes and BMWs.

Today, capitalism oozes out of every crevice, from ‘hipster’ Kreuzberg to the immense tower blocks capturing space all over the city. It was a shock to see the transformation of the city – and yet its energy and the power of its people helped me forget my memories of the ’80s and threw me into a future… But why does no-one speak German?

I recalled why I was in the city: continuing my innocent ventures into Computational Art, I spent almost all of my four days attending Transmediale 2018. I decided from the outset that I would attend as many talks as I could endure, allowing for mental and physical fatigue. My college classmates rolled in at 06:30 just as I showered and got dressed…

No, I did not ‘enjoy’ all the talks; overall the sessions were extremely stimulating, but I had my likes and dislikes.

First, the dislikes.

The panel discussion ‘Nefarious Values: On Artistic Critique and Complicity’: Marc Garrett, Eric Kluitenberg, Sven Lütticken, Ana Teixeira Pinto, belit sağ, Lioudmila Voropai; moderated by Marc Garrett.

Eric Kluitenberg was absent. Sven Lütticken gave a measured, slightly vague presentation discussing rising inequality and the failure of capitalism. Lioudmila Voropai was really hard to follow; her English did not flow and her presentation was poor. Her main points concerned solely the aspect of critique: how the artist will develop his or her practice. She seemed to ramble and was not able to make any points with clarity. Ana Teixeira Pinto was not much better, more concerned with rattling off a very dense text she read out, proudly proclaiming she had finished in 2 minutes 30 seconds of the allotted 5-minute slot. The third panellist (not listed) over-ran but showed a 2-minute video concerned with the plight of the Kurds, war criminals in Turkey and censorship. She was earnest and genuine, I felt, but the moderator seemed to take a dislike to her and challenged her, demanding she give some answers or solutions to the problem. I left feeling quite annoyed at the pompous Voropai and the vague co-presenters, and felt it a wasted hour.

‘Fuck Off Google’ was another disappointment, if only because it reminded me of the unfocused anarchist meetings I attended during the ‘Occupy’ period in 2011. The general premise: to demonstrate against the start-up space and Google’s plan to move into Kreuzberg, a distinctly bohemian/creative area of east Berlin which the presenters claimed was rapidly becoming gentrified. The two presenters were convincing enough, but I felt I could have spent less time at the session, even though I arrived a few minutes late. It made me ask: how far can we divorce ourselves from the racial capitalism that continues to dominate our world without completely descending into nihilism? There is a tension between the world as we have it and an idealised world we would like to have. It is important to speak up and point out imperfections, one of which is ‘greenwashing’ – crumbs of money off the table by corporates such as Google to appear green and ‘right on/trendy/cool’. Google has gentrified cities like San Francisco, and now Berlin is in its sights. I felt the motivation of the talk was valid, but the free-form questioning exposed how little the speakers had to say and went on too long.

However, I really liked all the other presentations of the day, particularly the keynote speech by Jonathan Beller, “Platform Communism: A Program for Derivative Living Against Regimes of Informatic Subsumption”. It is worth following up with the video of his talk:

Audio:

Beller described eloquently and in a well-structured way, Toxic Media, Toxic Finance, Toxic Information and New Economic Spaces/ECSA.

His conclusion: blockchain offers technologies to frame a means of enabling small communities to bypass the toxic capitalist system, offering a small window of opportunity to those imagining a new beginning. He did not claim to know how this will all play out, as he stated the tech is still in its infancy. The questions raised at the end were excellent too, challenging him, but he maintained composure and responded credibly.

Another favourite of mine, the keynote speech by Professor Lisa Nakamura, ‘Call Out, Protest, Speak Back’, I found the most memorable and thought-provoking. Nishant Shah gave an excellent presentation and introduced Professor Nakamura as an inspiration to him.

Her talk revealed my own ignorance, but that was OK, because I benefited – I followed up on her talk, reading up on bell hooks, Audre Lorde et al. Prof. Nakamura pointed out examples of misogyny and racism ‘strengthened and consumed’ in gaming platforms and in presentations of VR technology products, particularly via ‘new media’. She offered an example: a black woman being shown using a VR product is not only shallow but also reveals the efforts of tech companies to counterbalance expectations about their true white middle-class customer base. A stroppy member of the audience piped up 1 hour 16 minutes in, during the Q&A, demanding an explanation from Nakamura and claiming she ‘knew’ VR and how it works… the audience howled in horror! Prof. Nakamura calmly agreed that she does not know how to make VR, but asked us to look at how these products are being sold. Well worth listening to.

Perhaps I attended too many talks, overloading my brain – however – in no particular order of merit, all good: ‘Soundtrack for Webcams – Live’,

‘Hard Feelings: A Conversation on Computation and Affect’ (with my lecturer Helen Pritchard), ‘The Space In-Between: The Value of Interpretation and Interaction for the Next Generation Internet’, ‘Politics of Forgetfulness’, ‘Calculating Life’ (with Heather Dewey-Hagborg, excellent!), ‘Artists Re:Thinking the Blockchain’, ‘Reimagine the Internet: Affect, Velocity, Excess’, ‘The Weaponization of Language’ and ‘Growing a Repertoire: The Preservation of Net Art as Resistance to Digital Industrialism’.

 

Full programme available with recorded presentations

 

 

 

 

 

 

NeoPixel Ring Compass Road Safety Dog Jacket

I abandoned the waterproof sleeves idea (from a previous blog post) and decided to do more sewing rather than PVC/plastic welding.

My dog Baxter is an excellent model, and his old coat is never used because he hates wearing it, so I gathered my Arduino LilyPad, compass sensor and the other materials listed here:

Bill of Materials: waterproof.fzz

/Users/jtreg/gold/physical/waterproof.fzz

Friday, February 9 2018, 09:35:35
Assembly List
Label Part Type Properties
Part1 Lilypad Arduino Board type Lilypad Arduino
Part2 Seven-Segment LED Backpack 1.2 Inch Digits variant Red; part # 1270
Part3 16 NeoPixel Ring variant variant 1; part # 1463
Part4 FTDI Basic Programmer type Basic; voltage 5V
Real Time Clock ZS-042 RTC Module chip DS3231; variant variant 4
U1 HMC5883L package 16lpcc; axis 3; variant smd
U2 LIPO-2000mAh package lipo-2000; variant 2000mAh
Shopping List
Amount Part Type Properties
1 Lilypad Arduino Board type Lilypad Arduino
1 Seven-Segment LED Backpack 1.2 Inch Digits variant Red; part # 1270
1 16 NeoPixel Ring variant variant 1; part # 1463
1 FTDI Basic Programmer type Basic; voltage 5V
1 ZS-042 RTC Module chip DS3231; variant variant 4
1 HMC5883L package 16lpcc; axis 3; variant smd
1 LIPO-2000mAh package lipo-2000; variant 2000mAh
(I used a 2 x aaa battery as well)

Exported with Fritzing 0.9.3- http://fritzing.org

I tried out the compass code (other components are plugged into the breadboard – please ignore them!)

I have not yet integrated the real-time clock into the project as I ran out of time. I will be using the pixel ring to flash up hour, minute and second pixels.

Originally I planned to use a 7-segment display for the time and temperature, but I think the part I had was faulty. Additional functions could incorporate the temperature display from the real-time clock…

A little Evo-Stik on the end of the conductive thread stops it unravelling and helps it thread through tiny component holes…

External USB power socket

LilyPad sewn in!

Power test

Added compass chip

Real-time clock and battery. I ran out of conductive thread, so I used light wiring sewn down instead.

ready for walkies

My patient model. Extra waterproofing to protect the components in rain. Best results after dark!

 

Listing for LilyPad (work in progress)

/*

James Tregaskis

NeoPixel ring for dog jacket
—————————-
9th Feb 2018
This is code I used from two sources and merged them
I have not yet integrated the real time clock into the
project as I ran out of time. I will be using the pixel
ring to add hour, minute and second pixel to flash up.
Originally I planned to use a 7 segment display for the time and temp
but I think the part I had was faulty.
Additional functions could incorporate the temp display off the real
time clock

Bill of Materials: waterproof.fzz

/Users/jtreg/gold/physical/waterproof.fzz

Friday, February 9 2018, 09:35:35
Assembly List
Label Part Type Properties
Part1 Lilypad Arduino Board type Lilypad Arduino
Part2 Seven-Segment LED Backpack 1.2 Inch Digits variant Red; part # 1270
Part3 16 NeoPixel Ring variant variant 1; part # 1463
Part4 FTDI Basic Programmer type Basic; voltage 5V
Real Time Clock ZS-042 RTC Module chip DS3231; variant variant 4
U1 HMC5883L package 16lpcc; axis 3; variant smd
U2 LIPO-2000mAh package lipo-2000; variant 2000mAh

Shopping List
Amount Part Type Properties
1 Lilypad Arduino Board type Lilypad Arduino
1 Seven-Segment LED Backpack 1.2 Inch Digits variant Red; part # 1270
1 16 NeoPixel Ring variant variant 1; part # 1463
1 FTDI Basic Programmer type Basic; voltage 5V
1 ZS-042 RTC Module chip DS3231; variant variant 4
1 HMC5883L package 16lpcc; axis 3; variant smd
1 LIPO-2000mAh package lipo-2000; variant 2000mAh
(I used a 2 x aaa battery as well)

Exported with Fritzing 0.9.3- http://fritzing.org

*/
/***************************************************************************
This is a library example for the HMC5883 magnetometer/compass

Designed specifically to work with the Adafruit HMC5883 Breakout
http://www.adafruit.com/products/1746

*** You will also need to install the Adafruit_Sensor library! ***

These displays use I2C to communicate, 2 pins are required to interface.

Adafruit invests time and resources providing this open source code,
please support Adafruit and open-source hardware by purchasing products
from Adafruit!

Written by Kevin Townsend for Adafruit Industries with some heading example from
Love Electronics (loveelectronics.co.uk)

This program is free software: you can redistribute it and/or modify
it under the terms of the version 3 GNU General Public License as
published by the Free Software Foundation.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.

***************************************************************************/

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_HMC5883_U.h>
#include "ds3231.h"
#define BUFF_MAX 128
uint8_t time[8];
char recv[BUFF_MAX];
long previousMillis = 0;
//long interval = 1000;
unsigned int recv_size = 0;
unsigned long prev, interval = 5000;
boolean doFunkyThings = false;
/* Assign a unique ID to this sensor at the same time */
Adafruit_HMC5883_Unified mag = Adafruit_HMC5883_Unified(12345);
#include <Adafruit_NeoPixel.h>

#define PIN 3

// Parameter 1 = number of pixels in strip
// Parameter 2 = pin number (most are valid)
// Parameter 3 = pixel type flags, add together as needed:
// NEO_KHZ800 800 KHz bitstream (most NeoPixel products w/WS2812 LEDs)
// NEO_KHZ400 400 KHz (classic 'v1' (not v2) FLORA pixels, WS2811 drivers)
// NEO_GRB Pixels are wired for GRB bitstream (most NeoPixel products)
// NEO_RGB Pixels are wired for RGB bitstream (v1 FLORA pixels, not v2)
Adafruit_NeoPixel strip = Adafruit_NeoPixel(16, PIN, NEO_RGB + NEO_KHZ400);

int fixedHeadingDegrees; // Used to store Heading value
float headingDegrees = 0;//heading * 180 / M_PI;
void displaySensorDetails(void)
{
sensor_t sensor;
mag.getSensor(&sensor);
Serial.println("------------------------------------");
Serial.print ("Sensor: "); Serial.println(sensor.name);
Serial.print ("Driver Ver: "); Serial.println(sensor.version);
Serial.print ("Unique ID: "); Serial.println(sensor.sensor_id);
Serial.print ("Max Value: "); Serial.print(sensor.max_value); Serial.println(" uT");
Serial.print ("Min Value: "); Serial.print(sensor.min_value); Serial.println(" uT");
Serial.print ("Resolution: "); Serial.print(sensor.resolution); Serial.println(" uT");
Serial.println("------------------------------------");
Serial.println("");
delay(500);
}

void setup(void)
{
Serial.begin(9600);
Serial.println("HMC5883 Magnetometer Test"); Serial.println("");

/* Initialise the sensor */
if (!mag.begin())
{
/* There was a problem detecting the HMC5883 ... check your connections */
Serial.println("Ooops, no HMC5883 detected ... Check your wiring!");
while (1); // halt here; nothing after this line can run
}
// clock stuff
DS3231_init(DS3231_INTCN);
memset(recv, 0, BUFF_MAX);
Serial.println("GET time");
strip.begin();
strip.setBrightness(30); //adjust brightness here
strip.show(); // Initialize all pixels to 'off'
/* Display some basic information on this sensor */
displaySensorDetails();
colorWipe(strip.Color(255, 0, 0), 0);
}

void loop(void)
{
unsigned long currentMillis = millis();
/* Get a new sensor event */
sensors_event_t event;
mag.getEvent(&event);

/* Display the results (magnetic vector values are in micro-Tesla (uT)) */
// Serial.print("X: "); Serial.print(event.magnetic.x); Serial.print(" ");
// Serial.print("Y: "); Serial.print(event.magnetic.y); Serial.print(" ");
// Serial.print("Z: "); Serial.print(event.magnetic.z); Serial.print(" "); Serial.println("uT");

// Hold the module so that Z is pointing 'up' and you can measure the heading with x&y
// Calculate heading when the magnetometer is level, then correct for signs of axis.
float heading = atan2(event.magnetic.y, event.magnetic.x);

// Once you have your heading, you must then add your 'Declination Angle', which is the 'Error' of the magnetic field in your location.
// Find yours here: http://www.magnetic-declination.com/
// Mine is: -13* 2' W, which is ~13 Degrees, or (which we need) 0.22 radians
// If you cannot find your Declination, comment out these two lines, your compass will be slightly off.
float declinationAngle = 0.22;
heading += declinationAngle;

// Correct for when signs are reversed.
if (heading < 0)
heading += 2 * PI;

// Check for wrap due to addition of declination.
if (heading > 2 * PI)
heading -= 2 * PI;

// Convert to degrees
float headingDegrees = heading * 180 / M_PI;

// To Fix rotation speed of HMC5883L Compass module
if (headingDegrees >= 1 && headingDegrees < 240)
{
fixedHeadingDegrees = map (headingDegrees * 100, 0, 239 * 100, 0, 179 * 100) / 100.00;
}
else {
if (headingDegrees >= 240)
{
fixedHeadingDegrees = map (headingDegrees * 100, 240 * 100, 360 * 100, 180 * 100, 360 * 100) / 100.00;
}
}
int headvalue = fixedHeadingDegrees / 18;
int ledtoheading = map(headvalue, 0, 15, 15, 0);

// Serial.print("Heading (degrees): "); Serial.print("ledtoheading : "); Serial.print(ledtoheading); Serial.println(headingDegrees);
if (currentMillis - previousMillis > interval) {
// save the last time you blinked the LED
previousMillis = currentMillis;
doFunkyThings = !doFunkyThings;
}
doClockStuffinLoop();
if (!doFunkyThings) {
funky();
}
else {
colorWipe(strip.Color(0, 0, 255), 0);

if (ledtoheading == 0) {
strip.setPixelColor(15, 255, 0, 50); //Red
strip.setPixelColor(0, 0, 255, 0); //Green
strip.setPixelColor(14, 0, 255, 0); //Green

}
else {
if (ledtoheading == 15) {
strip.setPixelColor(0, 255, 0, 50); //Red
strip.setPixelColor(15, 0, 255, 0); //Green
strip.setPixelColor(1, 0, 255, 0); //Green
}
else {
strip.setPixelColor(ledtoheading, 255, 0, 50); //Red
strip.setPixelColor(ledtoheading + 1, 0, 255, 0); //Green
strip.setPixelColor(ledtoheading - 1, 0, 255, 0); //Green

}
}
}

strip.setBrightness(50);
strip.show();
delay(100);
}
void colorWipe(uint32_t c, uint8_t wait) {
for (uint16_t i = 0; i < strip.numPixels(); i++) {
strip.setPixelColor(i, c);
strip.show();
delay(wait);
}
}
void doClockStuffinLoop() {
char in;
char buff[BUFF_MAX];
unsigned long now = millis();
struct ts t;

// show time once in a while
if ((now - prev > interval) && (Serial.available() <= 0)) {
DS3231_get(&t);

// there is a compile time option in the library to include unixtime support
#ifdef CONFIG_UNIXTIME
snprintf(buff, BUFF_MAX, "%d.%02d.%02d %02d:%02d:%02d %ld", t.year,
t.mon, t.mday, t.hour, t.min, t.sec, t.unixtime);
#else
//Serial.println("here it is..");
snprintf(buff, BUFF_MAX, "%d.%02d.%02d %02d:%02d:%02d", t.year,
t.mon, t.mday, t.hour, t.min, t.sec);
#endif
#endif

Serial.println(buff);
prev = now;
}

if (Serial.available() > 0) {
in = Serial.read();

if ((in == 10 || in == 13) && (recv_size > 0)) {
parse_cmd(recv, recv_size);
recv_size = 0;
recv[0] = 0;
} else if (in < 48 || in > 122) {
; // ignore ~[0-9A-Za-z]
} else if (recv_size > BUFF_MAX - 2) { // drop lines that are too long
// drop
recv_size = 0;
recv[0] = 0;
} else if (recv_size < BUFF_MAX - 2) {
recv[recv_size] = in;
recv[recv_size + 1] = 0;
recv_size += 1;
}

}
}
void funky() {
// Some example procedures showing how to display to the pixels:
// colorWipe(strip.Color(255, 0, 0), 50); // Red
// colorWipe(strip.Color(0, 255, 0), 50); // Green
// colorWipe(strip.Color(0, 0, 255), 50); // Blue
rainbow(1);
rainbowCycle(1);
}

void rainbow(uint8_t wait) {
uint16_t i, j;

for (j = 0; j < 256; j++) {
for (i = 0; i < strip.numPixels(); i++) {
strip.setPixelColor(i, Wheel((i + j) & 255));
}
strip.show();
delay(wait);
}
}
// Slightly different, this makes the rainbow equally distributed throughout
void rainbowCycle(uint8_t wait) {
uint16_t i, j;

for (j = 0; j < 256 * 5; j++) { // 5 cycles of all colors on wheel
for (i = 0; i < strip.numPixels(); i++) {
strip.setPixelColor(i, Wheel(((i * 256 / strip.numPixels()) + j) & 255));
}
strip.show();
delay(wait);
}
}

// Input a value 0 to 255 to get a color value.
// The colours are a transition r – g – b – back to r.
uint32_t Wheel(byte WheelPos) {
if (WheelPos < 85) {
return strip.Color(WheelPos * 3, 255 - WheelPos * 3, 0);
} else if (WheelPos < 170) {
WheelPos -= 85;
return strip.Color(255 - WheelPos * 3, 0, WheelPos * 3);
} else {
WheelPos -= 170;
return strip.Color(0, WheelPos * 3, 255 - WheelPos * 3);
}
}
void parse_cmd(char *cmd, int cmdsize)
{
uint8_t i;
uint8_t reg_val;
char buff[BUFF_MAX];
struct ts t;

//snprintf(buff, BUFF_MAX, "cmd was '%s' %d\n", cmd, cmdsize);
//Serial.print(buff);

// TssmmhhWDDMMYYYY aka set time
if (cmd[0] == 84 && cmdsize == 16) {
//T355720619112011
t.sec = inp2toi(cmd, 1);
t.min = inp2toi(cmd, 3);
t.hour = inp2toi(cmd, 5);
t.wday = cmd[7] - 48;
t.mday = inp2toi(cmd, 8);
t.mon = inp2toi(cmd, 10);
t.year = inp2toi(cmd, 12) * 100 + inp2toi(cmd, 14);
DS3231_set(t);
Serial.println("OK");
} else if (cmd[0] == 49 && cmdsize == 1) { // "1" get alarm 1
DS3231_get_a1(&buff[0], 59);
Serial.println(buff);
} else if (cmd[0] == 50 && cmdsize == 1) { // "2" get alarm 2
DS3231_get_a2(&buff[0], 59);
Serial.println(buff);
} else if (cmd[0] == 51 && cmdsize == 1) { // "3" get aging register
Serial.print("aging reg is ");
Serial.println(DS3231_get_aging(), DEC);
} else if (cmd[0] == 65 && cmdsize == 9) { // "A" set alarm 1
DS3231_set_creg(DS3231_INTCN | DS3231_A1IE);
//ASSMMHHDD
for (i = 0; i < 4; i++) {
time[i] = (cmd[2 * i + 1] - 48) * 10 + cmd[2 * i + 2] - 48; // ss, mm, hh, dd
}
uint8_t flags[5] = { 0, 0, 0, 0, 0 };
DS3231_set_a1(time[0], time[1], time[2], time[3], flags);
DS3231_get_a1(&buff[0], 59);
Serial.println(buff);
} else if (cmd[0] == 66 && cmdsize == 7) { // "B" set alarm 2
DS3231_set_creg(DS3231_INTCN | DS3231_A2IE);
//BMMHHDD
for (i = 0; i < 4; i++) {
time[i] = (cmd[2 * i + 1] - 48) * 10 + cmd[2 * i + 2] - 48; // mm, hh, dd
}
uint8_t flags[5] = { 0, 0, 0, 0 };
DS3231_set_a2(time[0], time[1], time[2], flags);
DS3231_get_a2(&buff[0], 59);
Serial.println(buff);
} else if (cmd[0] == 67 && cmdsize == 1) { // "C" - get temperature register
Serial.print("temperature reg is ");
Serial.println(DS3231_get_treg(), DEC);
} else if (cmd[0] == 68 && cmdsize == 1) { // "D" - reset status register alarm flags
reg_val = DS3231_get_sreg();
reg_val &= B11111100;
DS3231_set_sreg(reg_val);
} else if (cmd[0] == 70 && cmdsize == 1) { // "F" - custom fct
reg_val = DS3231_get_addr(0x5);
Serial.print("orig ");
Serial.print(reg_val, DEC);
Serial.print(" month is ");
Serial.println(bcdtodec(reg_val & 0x1F), DEC);
} else if (cmd[0] == 71 && cmdsize == 1) { // "G" - set aging status register
DS3231_set_aging(0);
} else if (cmd[0] == 83 && cmdsize == 1) { // "S" - get status register
Serial.print("status reg is ");
Serial.println(DS3231_get_sreg(), DEC);
} else {
Serial.print("unknown command prefix ");
Serial.println(cmd[0]);
Serial.println(cmd[0], DEC);
}
}

 

Machine Seeing

Three readings this week:
‘Ways of Machine Seeing’ by Geoff Cox
‘A Future for Intersectional Black Feminist Technology Studies’ by Safiya Umoja Noble
‘How we’re teaching computers to understand pictures’ by Fei-Fei Li

“Drawing on the two readings consider your example in relation to “ways of machine seeing”.

 

Inspired by Lisa Nakamura’s keynote speech at this year’s Transmediale Berlin, ‘Call Out, Protest, Speak Back’, I will be looking further into the writings of bell hooks (Gloria Jean Watkins) and her influence on intersectional thought. Nakamura’s presentation focuses on VR, how it is being sold by the big tech companies, and how its marketing uses black women as users of the medium. She points out how this is a superficial and misleading image. This connects with the second of the readings, by Safiya Umoja Noble.

Supershapes formula

Inspired by Johan Gielis’s work, Paul Bourke’s website deals with, among other geometry, supershapes.

THE FORMULA
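The formula itself did not survive the export, so here it is again. This is the standard Gielis superformula, and it matches the supershape() function in the Processing code below (a and b scale the cosine and sine terms; m controls the rotational symmetry):

```latex
r(\theta) = \left( \left|\frac{\cos(m\theta/4)}{a}\right|^{n_2}
          + \left|\frac{\sin(m\theta/4)}{b}\right|^{n_3} \right)^{-1/n_1}
```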

see also https://www.researchgate.net/publication/282433972_Examples_of_Supershapes

Extended to 3D, the formula becomes:
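The 3D shape is the spherical product of two superformula evaluations, one over longitude θ ∈ [−π, π] and one over latitude φ ∈ [−π/2, π/2]; this is exactly the x, y, z calculation in the draw() loop below (up to the overall scale factor r):

```latex
\begin{aligned}
x &= r_1(\theta)\cos\theta \,\cdot\, r_2(\phi)\cos\phi \\
y &= r_1(\theta)\sin\theta \,\cdot\, r_2(\phi)\cos\phi \\
z &= r_2(\phi)\sin\phi
\end{aligned}
```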

Daniel Shiffman has an excellent YouTube video on this; here is the code for the Processing sketch taken from his GitHub:

 

Also see Reza Ali’s Website – more supershapes.

 

to do:

take the code below and convert it to a point cloud.

// Daniel Shiffman
// http://codingtra.in
// http://patreon.com/codingtrain
// Code for: https://youtu.be/akM4wMZIBWg

import peasy.*;

PeasyCam cam;

PVector[][] globe;
int total = 75;

float offset = 0;

float m = 0;
float mchange = 0;

void setup() {
size(600, 600, P3D);
cam = new PeasyCam(this, 500);
colorMode(HSB);
globe = new PVector[total+1][total+1];
}

float a = 1;
float b = 1;

float supershape(float theta, float m, float n1, float n2, float n3) {
float t1 = abs((1/a)*cos(m * theta / 4));
t1 = pow(t1, n2);
float t2 = abs((1/b)*sin(m * theta/4));
t2 = pow(t2, n3);
float t3 = t1 + t2;
float r = pow(t3, -1 / n1);
return r;
}

void draw() {

m = map(sin(mchange), -1, 1, 0, 7);
mchange += 0.02;

background(0);
noStroke();
lights();
float r = 200;
for (int i = 0; i < total+1; i++) {
float lat = map(i, 0, total, -HALF_PI, HALF_PI);
float r2 = supershape(lat, m, 0.2, 1.7, 1.7);
//float r2 = supershape(lat, 2, 10, 10, 10);
for (int j = 0; j < total+1; j++) {
float lon = map(j, 0, total, -PI, PI);
float r1 = supershape(lon, m, 0.2, 1.7, 1.7);
//float r1 = supershape(lon, 8, 60, 100, 30);
float x = r * r1 * cos(lon) * r2 * cos(lat);
float y = r * r1 * sin(lon) * r2 * cos(lat);
float z = r * r2 * sin(lat);
globe[i][j] = new PVector(x, y, z);
}
}

//stroke(255);
//fill(255);
//noFill();
offset += 5;
for (int i = 0; i < total; i++) {
float hu = map(i, 0, total, 0, 255*6);
fill((hu + offset) % 255 , 255, 255);
beginShape(TRIANGLE_STRIP);
for (int j = 0; j < total+1; j++) {
PVector v1 = globe[i][j];
vertex(v1.x, v1.y, v1.z);
PVector v2 = globe[i+1][j];
vertex(v2.x, v2.y, v2.z);
}
endShape();
}
}
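For the point-cloud to-do, one approach is to keep the same vertex computation but emit individual points instead of a triangle strip (in Processing, that would mean swapping beginShape(TRIANGLE_STRIP) for beginShape(POINTS) and dropping the second vertex per step). Here is a rough Python sketch of the geometry side, generating the grid of (x, y, z) points; the parameter values are borrowed from the sketch above, everything else is my own illustration:

```python
import math

def supershape(theta, m, n1, n2, n3):
    # Same superformula as the Processing sketch, with a = b = 1
    t1 = abs(math.cos(m * theta / 4)) ** n2
    t2 = abs(math.sin(m * theta / 4)) ** n3
    return (t1 + t2) ** (-1 / n1)

def point_cloud(total=75, r=200, m=3, n1=0.2, n2=1.7, n3=1.7):
    # Build the same (total+1) x (total+1) latitude/longitude grid as the
    # sketch, but return it as a flat list of points instead of a mesh.
    points = []
    for i in range(total + 1):
        lat = -math.pi / 2 + math.pi * i / total
        r2 = supershape(lat, m, n1, n2, n3)
        for j in range(total + 1):
            lon = -math.pi + 2 * math.pi * j / total
            r1 = supershape(lon, m, n1, n2, n3)
            x = r * r1 * math.cos(lon) * r2 * math.cos(lat)
            y = r * r1 * math.sin(lon) * r2 * math.cos(lat)
            z = r * r2 * math.sin(lat)
            points.append((x, y, z))
    return points

cloud = point_cloud(total=10)
print(len(cloud))  # 121 points for an 11 x 11 grid
```

The flat list could then be fed straight into any point-cloud viewer, or drawn in Processing with point(x, y, z) inside the loop.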

 

Wearables Project – Information Sleeve – Concept

Weather, time, compass… and more: a high-viz sleeve for bikers and outdoor use

Be seen and anticipate problems on the road:

I did 14,000 miles last year – rode up to Tbilisi and down to Athens, plus two trips to France and the Pyrenees. I have information on board the bike – GPS, time, temperature, mpg, range… but I have to click through an information button to get each piece of it. I would like something self-powered and easy to read while I am riding, particularly in bad weather. It would be a challenge to make something rugged and useful on long journeys.

I do not necessarily need another GPS, but a digital compass would be nice – also a bright addressable LED strip along the sleeve to show temperature, and possibly barometric pressure as well, through the colour and number of lit LEDs.

  • Temperature – shown by LEDs along the arm of the sleeve.
  • Battery – NiMH power bank
  • Ruggedised OLED
  • Waterproof multicolour addressable LED strip; display temperature by the number of lit LEDs and barometric pressure by colour
  • 7-segment numeric displays
  • Real-time clock DS3231
  • HMC5883L digital compass
  • Barometer
  • Humidity
  • MPU-6050 gyro/accelerometer
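The temperature and pressure display ideas above boil down to two simple mappings, sketched here in Python. The ranges, the 12-LED strip length, and the blue-to-red colour scheme are my assumptions for illustration, not fixed design decisions:

```python
def leds_lit(temp_c, t_min=-10.0, t_max=40.0, n_leds=12):
    # Map a temperature onto how many LEDs along the sleeve are lit
    frac = (temp_c - t_min) / (t_max - t_min)
    frac = max(0.0, min(1.0, frac))  # clamp out-of-range readings
    return round(frac * n_leds)

def pressure_colour(hpa, p_min=980.0, p_max=1040.0):
    # Map barometric pressure onto an RGB colour: blue = low, red = high
    frac = (hpa - p_min) / (p_max - p_min)
    frac = max(0.0, min(1.0, frac))
    return (int(255 * frac), 0, int(255 * (1 - frac)))

print(leds_lit(15.0))          # mid-range temperature -> 6 of 12 LEDs
print(pressure_colour(980.0))  # low pressure -> pure blue (0, 0, 255)
```

On the actual sleeve the same two functions would run on the microcontroller, with the tuple fed to the addressable strip.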

Possible enclosure for components?

LED addressable strip for temp/air pressure display

learn plastic welding of seams

I2C 7-segment display for time

OLED display for compass and other information

Waterproof, addressable LEDs sealed into the sleeve

Powerbank to power the unit.

On/off button inside waterproof cushion

This concept for the wearables project is not necessarily ‘art’, but it is an opportunity to design and make a rugged, easy-to-read sleeve one can wear outdoors – walking, cycling, motorcycling. As a keen motorcyclist, I already have a multi-function sensor unit on the bike, but it is hard to read in bad weather, fiddly to operate, and can only display one piece of information at a time. This project will combine as many of the sensors listed above as I can possibly include.

I will experiment with waterproofing techniques, aiming to seal in an OLED screen and numeric LEDs. All will be connected via an I2C bus and powered by a decent power bank. A timer will shut the unit down to save power; pressing a button inside the waterproof cushion will wake the device up or shut it down. I will use a LilyPad or similar to control the sensors.
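The timer shut-down and button wake behaviour can be prototyped as a small state machine before touching hardware. A Python sketch, with an assumed five-minute timeout (the real version would run on the LilyPad using its own timers and button input):

```python
class SleeveState:
    # Toggle awake/asleep on a button press; auto-sleep after inactivity.
    def __init__(self, timeout_s=300):
        self.timeout_s = timeout_s
        self.awake = False
        self.last_activity = 0.0

    def button_press(self, now_s):
        # The single button both wakes the device and shuts it down
        self.awake = not self.awake
        self.last_activity = now_s

    def tick(self, now_s):
        # Called periodically from the main loop; powers down once
        # timeout_s has elapsed since the last button press.
        if self.awake and now_s - self.last_activity >= self.timeout_s:
            self.awake = False
        return self.awake

s = SleeveState(timeout_s=300)
s.button_press(0.0)      # rider taps the button at t = 0
print(s.tick(100.0))     # True: still within the timeout
print(s.tick(400.0))     # False: timed out and shut down
```

Keeping the logic pure like this makes it easy to test on a laptop and then port to the microcontroller.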

Add Pi Zero with Camera? Luxury version!

Investigate friction welding to waterproof the items in pockets inside the sleeve.

Use of silicone and a polycarbonate enclosure to protect the OLED inside a bespoke 3D-printed case.

References: (more to be added)

1. Waterproofing article I found useful

2. Compass as a ring of LEDS

An exercise in intimacy

We were asked to pair up and touch each other’s palms for three minutes – 90 seconds with eyes closed, 90 seconds with them open – then write up our experiences, taking 10 minutes.

Reflections on bodily contact

My view of Matthew:

The first part was to stare into each other’s eyes for ten seconds. This felt like ten minutes; in fact, we had to make several attempts at the gazing preliminary, as one or other of us would look away or laugh, distracting ourselves. I do not know my opposite number – I have only seen him in class – so it was all genuinely difficult and left me feeling surprisingly uncomfortable.

The main feeling I experienced throughout was pain. I shut my eyes, at first uneasy at the unnatural circumstances of touching a stranger, so to speak. We English are so reserved… My hands so warm, his hands cold from just walking into the classroom. I began to lose all sense of what was normal… the effort of holding up my arms made it feel like I was holding up the other person, as in a circus trick, balancing upwards. My fingers kept compensating and micro-adjusting so the fingertips would not ‘fall off’ the ends of his, like balancing on the edge of a precipice, a high wire. I felt pulses of movement and the heat transferring from my hands into his. Each finger would twitch.

Releasing our shut eyes – opening them made the task even harder. Now I had to worry about averting my gaze. When our eyes meet I look away, embarrassed. We are conditioned not to threaten each other with this gaze, hence the avoidance, I think. If I did this to my dog, he would look away, just the same. I glance round the room, looking at the others to see what they are doing: some relaxed, some in embarrassed discomfort, just like me… I carry on; this seems like it is taking hours, certainly not something I would have chosen to do, but at last we are released, and I drop my arms in absolute relief.

I wonder what I can take from this; the distortion of time, sensory input magnified with eyes closed.

This was Matthew’s viewpoint:

The contact zone moved. As our palms touched and our fingers aligned we knew this would be a long minute and a half. James’ hands felt large and warm. The pressure created between us was enough to sustain the strain of our extended limbs.

I could sense movement, twitches and some rigidity from James. Personally I was calm. A little apologetic for the coldness of my own hands. Questions started arriving. Was I sweating? Was I moving a lot? I felt like James was doing the moving, still I thought of the relationship between driver and passenger in a car. The driver anticipates.

After thirty or forty seconds my left index finger began to slip. It crept leftwards, gradually heading for the valley. Would we soon interlock fingers? I didn’t move, curious as to where this would go. James blinked first and corrected our alignment. The minute game of chicken was over.

When we opened our eyes James would not hold my gaze for more than a few seconds. Our separate selves had bonded for a few minutes there. I continued with the exercise and stared at James. Taking in his face, his eyes, his hair. He is several decades older than I am, I knew this change would happen to me too.


Part two

My viewpoint:

This time we did not touch – we closed our eyes for 90 seconds, then I used my phone in time-lapse mode, observing my partner through it for 90 seconds.

With eyes closed, I was disconnected totally from him. The classroom disappeared and I was in meditative mode; after over 25 years of meditation practice, I am conditioned to draw within, and I began to watch myself. Again, Gurdjieff watched over me, so to speak: I was observing myself, my thoughts passing, music – memories of my days when I was in the ‘work’. Now, John Cale… drifting thoughts swirling, bringing my mind back… but still no appearance of my partner in front of me… until I realised I had a task: to be in the room with my opposite number.

The 90 seconds seemed long, but I was comfortable this time, happy to spend another hour if need be.

The daylight returns and my task reappears. I hold my phone up and video Matthew in fast motion – maybe he would like to see it. It did not record any particular blow-by-blow representation; I wanted to shrink time if I ever came to look at it again. He looks a little uneasy, but I am thankful it was not me being observed. I saw him looking at me looking into my phone; he did not seem to like the idea, and neither did I – a sort of voyeuristic thing, and I felt guilt. He was a victim, and I was the prison warder forced to observe my prisoner. I was dominant in the exercise, which is no better a position to be in than that of the observed subject.

The time passed more slowly during the observation through the phone, and it was not enjoyable. We were back dealing with our intimacy, and despite giving each other permission to do this, I was glad the second part was over.

And Matthew’s view:

I’m searching for James with my eyes closed. We are no longer connected, only present together. The yellow and orange of my eyelids turns to a muddy green. The hairs on my fingers are bristling. They’re searching for contact. My stomach rumbles and I’m reminded of my hunger. I distract myself with a few bars of a song.

When we open our eyes this second time James has been told to take out his phone and views me through it. I stare dead-eyed into the lens. Knowing this black circle will be the locus of my attention for the next period, I settle in.

I try to move the camera with my stare. That is, I’m trying to move the man. I visualise pushing the device away to the side. The phone does start to move, but a hand swap indicates that this is fatigue, not telekinesis.

James looks away from the camera. I know he doesn’t enjoy this, I find this fun.

How does my face look? Am I locking with his invisible eyes? In the pre-meditation I considered discreetly lifting my hood and pulling the cords tight. Not out of shame or shyness but to make James laugh upon opening his eyes.