nicole @ itp

very serious blog for schoolwork


[pcomp week 6] Allergic to Love, Joystick Version (serial communication)

October 14, 2015 / 1 Comment

2015-10-14 00.00.59

I ran into a lot of problems this week!

While doing this week’s labs on serial communication, lots of weird things kept happening — it would work sometimes and not other times, my computer crashed, Arduino couldn’t find the right port…

A few hours in, I realized that 95% of my problems were because I kept forgetting to turn the serial monitor on and off in p5. Even when I realized this, I would still forget to do it, or do it in the wrong order. This was very irritating.

Eventually, I got it sorted out. In the labs, it was suggested that we wire up one button and two potentiometers. Luckily for me, the joystick I got for this assignment was literally those three things in one, so once I figured out the code on the Arduino side, I could easily move it into my p5 sketch.

IMG_0500

The sketch I decided to add a sensor to was a game I made the other week in ICM, Allergic to Love. In the original version, you use your mouse to move your character along the x-axis and click to shoot your laser. Using a joystick to replace the mouse was fairly straightforward. The pot in the joystick that controls x-axis movement would replace mouseX, and the button on the joystick would replace mousePressed.

(This joystick also allows you to control movement on the y-axis, and even though this particular game doesn’t require it, I decided to keep that code in just in case in the future I decide to add some kind of movement up and down.)

Like in the lab, I made an array of data in p5 for the three sensor inputs: x, y, and button press. The p5 code using the sensor data ended up looking like this:

JavaScript
function serialEvent() {
  // read a string from the serial port
  // until you get carriage return and newline:
  var inString = serial.readStringUntil('\r\n');
  // check to see that there's actually a string there:
  if (inString.length > 0) {
    var sensors = split(inString, ','); // split the string on the commas
    if (sensors.length > 2) {           // if there are three elements
      if (sensors[0] == 539) {          // 539 is this joystick's resting x value
        cannon.dir = 0;
      } else if (sensors[0] < 200) {
        cannon.dir = -1;
      } else if (sensors[0] > 900) {
        cannon.dir = 1;
      }
      buttonPress = sensors[2]; // element 2 is the button
    }
  }
}

You can see how I made sensors[0], the x-axis, change direction depending on input, and how I set sensors[2], the button, to a simple variable.
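The same thresholding can be tried outside of p5 as a plain function. This is just a sketch, not my actual game code: parseJoystick is a made-up name, String.split stands in for p5's split(), the exact-center check is loosened into a dead zone, and the 200/900 cutoffs are specific to my joystick.

```javascript
// Standalone sketch of the parsing logic above — parseJoystick is a
// made-up name, and String.split stands in for p5's split().
// The 200/900 cutoffs come from my joystick; other hardware will differ.
function parseJoystick(inString) {
  var sensors = inString.trim().split(',');
  if (sensors.length < 3) return null;   // need x, y, and button
  var x = Number(sensors[0]);
  var dir = 0;                           // near rest: no movement
  if (x < 200) dir = -1;                 // pushed left
  else if (x > 900) dir = 1;             // pushed right
  return { dir: dir, buttonPress: Number(sensors[2]) };
}
```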

It’s pretty fun to play the game with a joystick. It definitely makes it feel more arcade-y. Even though I had to make it a bit easier to accommodate the joystick, it’s still pretty hard!

My P5 code and my Arduino code are below.

Posted in: Physical Computing Tagged: allergic to love, arduino, games, joystick, p5js, pcomp, serial communication, week 6

[video and sound week 3] How to fly a kite storyboard

October 5, 2015 / Leave a Comment

We’ve now moved on to the video portion of video and sound, and our first assignment is to draw a storyboard for a short video. I’m working with Melody and Aaron.

We decided to make a “How to Fly a Kite” video, though that may certainly change depending on how successful we are at flying a kite when it comes time to shoot. If that happens, the theme will likely be slightly different…

For now, we broke it up into three sections: 1) Get a kite, 2) Get the kite in the air, and 3) Keep the kite in the air.

Below are the storyboards we made. (Apologies for the lack of drawing ability in our entire group!)

IMG_0991

IMG_0992

IMG_0993

Posted in: Video & Sound Tagged: how to fly a kite, storyboard, video, video and sound, week 3

[icm week 4] Allergic to Love, v2.3

October 1, 2015 / Leave a Comment

Screenshot 2015-10-01 00.33.44

[TLDR: Here’s the second iteration of my game, Allergic to Love.]

Our assignment this week was to clean up the code of a previous assignment, which was quite welcome because my code from last week was a total mess. I’m not completely sure that it’s now sparkling clean, but it’s definitely much better than it was before.

Before I made any changes to the game, I went in and reorganized the code. Having not yet learned arrays or objects in class, I was wandering about in the array wilderness a little bit last week and ended up doing something really weird with them. I think I fixed them up now after looking up how to do an array of objects. I found the jitterbug example to be particularly helpful. Now my falling hearts are arranged like this:

JavaScript
function Falling() {
  this.x = random(width);
  this.y = random(-650, 0);
  this.broken = 0;
  this.hitByLaser = false;
  this.hitTheGround = false;
  this.speed = random(1, 4);
  this.clr = color(255, random(0, 255), random(0, 255));
 
  this.move = function() {
    this.y += this.speed;
  };
 
  this.display = function() {
    strokeWeight(7);
    stroke(this.clr);
    fill(this.clr);
    bezier(this.x, this.y, this.x - 12, this.y - 12, this.x - 11, this.y + 8, this.x, this.y + 14); //left
    bezier(this.x, this.y, this.x + 12, this.y - 12, this.x + 13, this.y + 8, this.x, this.y + 14); //right
  };
}

I went in and created a lot of my own functions as well.
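Once Falling is a constructor like the one above, the whole flock of hearts is just an array of instances, jitterbug-style. Here's a minimal standalone sketch of that pattern — Math.random stands in for p5's random(), and the particulars (ten hearts, one move per "frame") are illustrative, not my actual sketch:

```javascript
// Array-of-objects pattern from the jitterbug example, stripped down so
// it runs outside p5 — Math.random replaces p5's random().
function Falling() {
  this.y = -Math.random() * 650;        // start somewhere above the canvas
  this.speed = 1 + Math.random() * 3;   // like random(1, 4)
  this.move = function () {
    this.y += this.speed;
  };
}

// build the array once, then move every heart each frame
var hearts = [];
for (var i = 0; i < 10; i++) {
  hearts.push(new Falling());
}
hearts.forEach(function (h) { h.move(); });
```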

Finally, I started adding on to the game itself.

Screenshot 2015-09-30 23.21.22

Adding sounds made a big difference. I found music from the Free Music Archive and sounds from Free Sound. I’m not really sure where the proper place to attribute would be… I put it in comments in my code for the time being.

Screenshot 2015-09-30 23.23.11

Besides sounds, I made a few other changes — the girl now bounces up and down, the landscape is a bit different, and she smiles if you win the game. (But it’s still pretty hard to do so!)

Play the game here.

My full code is below.

Posted in: Computational Media Tagged: allergic to love, feelings, games, icm, p5js, projects, week 4

[pcomp week 4] The Lonely But Tender Ghost v.2, now with sound! (analog outputs)

September 29, 2015 / 1 Comment

IMG_0272

It was fun playing with speakers and servo motors this week. After doing the labs, I focused mostly on doing things with sound, but in the future I’d like to spend more time experimenting with servos…

In the tone lab, I had some fun with the pitch library, and found it pretty easy to change the song played to something else:

I wanted to continue building on my lonely ghost project from last week. When I last left it, I had hand-sewn an FSR that caused an RGB LED to change colors depending on how hard you squeezed it. It was supposed to express “feeling” with the colors — green was good, red was bad. At that point, color was the only output.

I added a speaker to my breadboard and worked on adding sound in addition to color as feedback for the toy’s feelings.

IMG_0270

The first thing I did was add the tone() function to the code so that when the variable “force” was 3 — that is, when the ghost was squeezed the hardest — the speaker would make a noise in addition to the LED turning red.

I thought the ghost could be made to be a bit needier. What if it got lonely if you didn’t pay attention to it for a period of time?

I used the millis() function to record the time at which the ghost was last squeezed. I then set a variable called lonelyTime, the amount of time that could pass before the ghost got lonely. When the difference between the current millis() count and the last squeeze exceeded lonelyTime, I had the speaker make a tone. It would stop when you squeezed the ghost again.

(I used the same method to make the LED blink when you weren’t squeezing the FSR, which I thought was a more natural neutral state than having the light just be white.)
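The real implementation lives in the Arduino code below, but the elapsed-time pattern itself is simple enough to sketch in JavaScript. Everything here is illustrative: ghostState is a made-up name, the 5-second lonelyTime is a placeholder, and on the Arduino `now` would come from millis().

```javascript
// Sketch of the loneliness timer — not the real Arduino code, just the
// same elapsed-time logic in JavaScript. On the Arduino, `now` would be
// millis() and lastSqueezed the timestamp of the last squeeze.
var lonelyTime = 5000; // ms of neglect before the ghost gets lonely

function ghostState(now, lastSqueezed, beingSqueezed) {
  if (beingSqueezed) return 'squeezed';               // squeezing resets loneliness
  if (now - lastSqueezed > lonelyTime) return 'lonely';
  return 'neutral';                                   // blinking, content enough
}
```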

This was nice, but all of the tones sounded pretty boring and static. That’s when I realized I could use the pitches library, like in the tone lab, to compose custom sounds for each state. I ended up making three:

in pain

happy

Screenshot 2015-09-29 13.17.28

lonely

I was a bit surprised by how much more effective the custom sounds were at expressing feeling compared to the basic speaker tones.

Now, the ghost feels much more like a pet or a needy toy. When he’s lonely, the light will turn yellow and he’ll make the lonely sound until you squeeze him. If you squeeze him gently, the light turns green and he makes the happy sound. If you squeeze him too hard, he’ll make a distressing sound and the light will turn red. The blink effect makes it feel more alive as well.

Check out the video (with sound) here:

My Arduino code is below.

Posted in: Physical Computing Tagged: arduino, pcomp, projects, soft lab, the lonely but tender ghost, week 4

[video & sound week 2] Late

September 26, 2015 / Leave a Comment
http://nicole.pizza/itp/wp-content/uploads/2015/09/late.m4a

Osama and I teamed up to do our Video & Sound assignment this week, which was to make a short soundscape.

We decided to tell a short story (of sorts) through sound, and did all the recording at my apartment. Osama ran the recorder and microphone while I did the “acting.”

Notes on editing:

  • Even though we ran through some basics in class, I found Logic to be pretty confusing — I guess it takes practice!
  • Upon opening the WAV files in Logic, we saw that our levels were slightly low. I’m not totally sure how to fix this except by adjusting the volume knob on the track…
  • I’m not sure how to get rid of some of the microphone noises (mostly heard in the bathroom in our piece.)
  • I don’t know how to adjust some of the background sounds, or if that’s even possible in editing. For example, we recorded the sound of “running away” and “running towards” separately and in separate places, and the background hum in the “running towards” part makes it hard for the transition to sound seamless. (You can hear what I’m talking about around 1:16.)
  • I wasn’t sure how to organize my different tracks, or when to use different tracks.

Overall, we had a fun time recording and putting it together.

You can find the uncompressed version of the file here, and the Logic file here.

Posted in: Video & Sound Tagged: projects, sound, video & sound

[icm week 3] Allergic to Love v. 1

September 24, 2015 / Leave a Comment

Screenshot 2015-09-24 03.21.50

[TLDR: Here’s my (unfinished but finished-enough-for-this-assignment) game, Allergic to Love.]

Oh man, this was a rough one. The beginning was good: Esther and I partnered up to start this project. I really wanted to make a game, so we went down the path of trying to build something that involved shooting at objects that moved on the screen.

Together we got as far as making things fall down, making things shoot up, and coming up with an interim solution for making objects disappear as they got hit. Esther did some pretty awesome work figuring out how to use arrays and create functions, and then we took that code and built separate things on top of it.

I quickly realized that what I wanted to do this week was beyond the scope of what we’ve learned so far in class, so it was quite difficult to sort out some specific things. Once I came up with the theme of the game, it took me a really long time to figure out how to make all the falling objects in the shape of hearts. After some experimenting with for loops, making functions, arrays and bezier curves, I got it!

Screenshot 2015-09-23 19.55.00

This was very exciting. I started adding things like making them fall from different heights and at different speeds. Some of the trickiest stuff to figure out was how to make things happen in the win state and lose state. I ended up having to add a bunch of Boolean arrays. It also started to come together aesthetically.

Screenshot 2015-09-24 03.20.04

I added some fun things, like making the girl’s skin turn green every time a heart hit the ground. (She’s allergic, after all.)

JavaScript
// if the heart hits the ground
if (hearts.y[i] >= height) { // >= since speed can step past the exact height
  if (hearts.hitTheGround[i] === false) {
    hearts.onGround++;
    // skin changes color
    skin.r = skin.r - 10;
    skin.g = skin.g + 5;
    skin.b = skin.b + 5;
    hearts.clr[i] = 0; // hearts turn black
  }
  hearts.hitTheGround[i] = true;
}

I also had some crazy mishaps experimenting with the gradient that frankly look way cooler than what I was going for.

Screenshot 2015-09-24 00.55.23

There’s still a lot I want to do, and the code is incredibly unfinished and messy. But it’s almost 4 am and I got this thing in a playable state, so I guess now is as good a time as any to stop. And even though I’ve been playing it all evening, it’s pretty hard! I feel like there’s still a lot to improve on here, but this will have to do for version 1.

Screenshot 2015-09-24 03.38.15

Check out Allergic to Love, v.1!

My (extremely messy) code is below.

Posted in: Computational Media Tagged: allergic to love, feelings, games, icm, p5js, projects, week 3

[pcomp week 3] The Lonely But Tender Ghost (digital and analog inputs, digital outputs)

September 21, 2015 / Leave a Comment

IMG_0156

This week we learned how to program the Arduino to take inputs from our sensors and program them to make stuff happen.

I went to the Soft Lab workshop on Friday, where I learned how to sew a simple button, so I used that in the first example of alternating LEDs with a switch:

The fun part was using analog sensors to change the brightness of LEDs — I wired up a force sensor and a photocell to control two different LEDs on the breadboard.

I had a ton of ideas for our assignment to do something creative with these sensors this week, many of which sounded great in my mind but in reality were all varying degrees of unfeasible for the time being. One thing that stuck with me — newly inspired by the Soft Lab — was the idea of doing something with a doll or plushie. My goal was to make a plushie that gave you the sense that it had feelings.

I decided to go with a force sensitive resistor. The idea was that I’d make a plushie with LED eyes that would change color depending on how hard you squeezed it.

Here’s the circuit I built on the breadboard:

The map() function was really helpful for me to turn the input from the sensor into three different states, which I could then turn into colors. I learned how to use an RGB LED with the help of this example from Adafruit, and I ended up using the setColor() function written in that sketch in my final code.
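Arduino's map() is just linear interpolation, so the three-state trick can be sketched standalone in JavaScript: map the 0–1023 reading down to 0–3 and floor it. This is an illustration, not my actual Arduino code, and forceState is a made-up name.

```javascript
// Arduino's map() re-implemented as plain linear interpolation so the
// three-state idea can be tried standalone. forceState is a made-up name.
function map(value, inMin, inMax, outMin, outMax) {
  return (value - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// squash a 0-1023 sensor reading into three states: 0, 1, or 2
function forceState(reading) {
  var f = Math.floor(map(reading, 0, 1023, 0, 3));
  return Math.min(f, 2); // a reading of exactly 1023 still counts as state 2
}
```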

IMG_0163

The next step was to make my plushie!

 

IMG_0151 IMG_0153

I realized that my original plan to sew two RGB LEDs into fabric as eyes was actually extraordinarily complicated, so I just made the light separate from the plushie and went with the next best thing: googly eyes.

I built my own force sensitive resistor with some conductive material and Velostat, and sewed it all up in some felt to make my little ghost plushie. I noticed that the input values from the commercial FSR ranged pretty accurately from 0 to 1023, but my homemade FSR started at around 750 rather than 0. I adjusted the variable in my code to accommodate this, and it worked perfectly well.

I decided to call him the Lonely But Tender Ghost. In his normal state, the light is white. When you squeeze him tenderly, the light turns green. If you squeeze him too hard the light turns red. 🙁

This is just a basic first project, but hopefully later on I can further explore building an object that makes you feel like it’s expressing human feelings, perhaps creating sympathy or empathy in you, the user.

My full Arduino code is below.

Posted in: Physical Computing Tagged: arduino, feelings, pcomp, projects, soft lab, week 3

[pcomp week 3] Design meets disability reading & observation

September 20, 2015 / Leave a Comment
from the Alternative Limb Project

from the Alternative Limb Project

This week’s reading was Graham Pullin’s Design Meets Disability, discussing both objects explicitly used to counteract a disability, like prosthetics, as well as objects used by people of all abilities that have varying levels of inclusiveness. Glasses are cited as an example of successful design for disability, to the point that people don’t consider poor eyesight a disability because glasses have transitioned from being medical devices to fashion accessories. This reminds me of Norman’s phrase, “Attractive things work better.”

I appreciate this perspective in the context of physical computing. If we’re designing for the human body, it’s important to take into consideration the ways in which people’s bodies and abilities are different, and to not take any particular ability for granted. I think it’s neat to see examples of things designed specifically for, say, wheelchair users, but also to see products that keep different preferences of usage in mind (a clicking sound and sensation, for example.)

(A small note on the examples: it was fun to see Nick’s Bricolo because we used to work together at my old job before ITP!)

——-

For my observation assignment, I decided to watch people use the entrance to the subway. More specifically, I watched them use the Metrocard slider that collects their fare.

A person swipes a MetroCard in a New York subway station. (Photo by Bilgin S. Sasmaz/Anadolu Agency/Getty Images)

According to the MTA, people swipe in about 1.7 billion times a year. That’s a lot! I’ve probably done it a thousand times myself.

That said, it’s certainly not perfect. My assumption was that people accustomed to the system — who know which way to swipe and the specific speed at which to swipe — can get through in about three seconds with no problem. But I predicted that tourists, anyone with insufficient fare or some other Metrocard problem, and people who move too slowly would have trouble with the machine.

I watched people use the machine at Union Square because there’s a lot of activity there, and locals and tourists alike.

I noticed that the people using the machines generally fell into three groups:

  1. Confident and experienced users who got through with no problem
  2. Confused users who had problems, likely tourists
  3. Confident users who had a problem with their card

The first group was the majority of users who moved through the system quickly. The second group usually approached the machines slowly and often in groups, and would often swipe too slowly or too quickly, receiving the “Swipe card again at this turnstile” message. They would try again and again until it worked. This usually would take something more like 10 or 15 seconds.

The third group actually ran into the most trouble. People who were experienced and confident moved forward with the speed of someone who would get through in a couple seconds, but were halted by the machine abruptly if the card didn’t work. Sometimes they would almost run into the turnstile because the momentum was carrying them forward. Other times there were almost collisions with people behind them, especially if they had to turn back to refill their card.

In the case of insufficient fare, people had to go back to the machines to refill them, which could take up to a few minutes.

Developing the correct motion to swipe in a way that the machine understands is a skill that improves with practice. This is probably one reason why most other (and more modern) subway systems around the world use a tapping system, which seems to be easier for anyone using the machine, even if they’ve never done it before.

The insufficient-fare problem seems harder to solve. It’s not an issue of failing to inform riders how much fare is left (that’s on the display when you swipe in); people forget that they need to refill even if they knew they had run out at the end of their last ride. It seems to be an issue of when riders are notified that they need to refill, which should ideally be when they walk into the station, not when they’re already at the turnstile.

A shorter term solution might be to design the space around the turnstiles in such a way that people can quickly exit the turnstile area if they need to, so it’s not a crowded bottleneck.

 

Posted in: Physical Computing Tagged: observation, pcomp, readings, week 3

[video & sound week 1] Plagiarism, originality and remix readings

September 19, 2015 / Leave a Comment

IA4_2ip_steallook_garnett_img_0

This week in Video & Sound we read and watched four pieces of media: Jonathan Lethem’s The Ecstasy of Influence: A Plagiarism; On the Rights of the Molotov Man: Appropriation and the Art of Context; Allergy to Originality; and Kirby Ferguson’s Embrace the Remix. The common thread among them all is the idea that no art or work is truly original or creative, and that this isn’t a bad thing. In fact, it’s inevitable, and it should be celebrated, because we progress collectively through the efforts of the past.

Ferguson’s talk cites a quotation from Henry Ford: “I invented nothing new. I simply assembled the discoveries of other men behind whom were centuries of work.” And, as Lethem writes, “Finding one’s voice isn’t just an emptying and purifying oneself of the words of others but an adopting and embracing of filiations, communities, and discourses.”

I agree with the sentiment that our culture stands on the ability to borrow and remix ideas, and that openly acknowledging our influences as artists or inventors is important. But I’m also sympathetic to the people whose work gets “stolen” as well. Ideally, we could live in a world where everyone openly admits to using other people’s work and happily allows anyone to use theirs as well. But because it’s so difficult for artists to make a living to begin with, our flawed copyright laws sometimes serve as the only kind of protection they have for their income. It’s hard to fault an artist for feeling protective in an imperfect system, which is why I found Susan Meiselas’s rebuttal sympathetic and a necessary perspective.

 

Posted in: Video & Sound Tagged: readings, video & sound, week 1

[video & sound week 1] Reaction to “Her Long Black Hair”

September 19, 2015 / Leave a Comment

photo_cardiff_04_view1_321x244w

This afternoon I took the train up to Central Park with my iPhone and my headphones to… listen to? experience? walk through? Janet Cardiff’s sound walk, “Her Long Black Hair.” I really enjoyed it.

There was something surreal about the way the recording overlapped with the sounds of the city. Immediately as it begins, you’re sitting facing the street traffic as the cars whiz by, and you’re not sure whether the sounds of tires and horns are coming from your headphones or from the street in front of you. (The reality is that it’s both.)

IMG_0091

My favorite moments throughout the whole piece were like these, where it almost tricks you into thinking a sound is coming from the outside when it’s actually in the recording. There was a part when I was walking by some tall rocks with children climbing all over, and it took me a second to realize that the voices of children I could hear talking to each other about climbing did not belong to the ones in front of me.

I also enjoyed the semi-linear, ambiguously fictional tone of the piece. It almost made it feel like a kind of proto-virtual reality, or even a type of videogame — after all, you follow her directions and feel rewarded whenever something lines up between the recording and your reality. Like when Cardiff describes the bird on the head of a statue, and you see it in front of you.

IMG_0106

One of the most poignant moments on my walk was under a tunnel. In the recording, there is a man singing. On my walk today, there was a man drumming. As she has you linger and listen, the two pieces of music lined up in my ears as an experience that was uniquely mine.

 

Posted in: Video & Sound Tagged: her long black hair, sound walk, video & sound, week 1




Copyright © 2018 nicole @ itp.
