Final Project

The project started as a consideration of ways to make visible, and thus to value, the gestural and cerebral communication between artist and artwork, specifically labor-intensive fiber handwork. I thought of attaching an accelerometer to a piece of fabric being embroidered to record its motion while also recording EEG readings from the artist. The data could be translated into another artwork through light, sound, or text. As a humorous commentary on production, which is slow in handwork, I also envisioned a printer that would rapidly produce consecutive artworks based on the gestural and cerebral data produced during slow stitching processes.


Then came the reality of time limitations. It quickly became apparent that the NeuroSky brain sensor had a steeper learning curve than I had time for. I settled on using an accelerometer to gather data from my own gestures and translate it into light emitted through fiber optics. The accelerometer detects eight static orientations in space as well as acceleration along the x, y, and z axes. Due to multiple glitches, I was only able to produce a working prototype. I like to try to solve my own problems; in hindsight, though, I probably should have recognized earlier that some of the problems were beyond my skills and not spent time spinning my wheels trying to solve them.
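To give a sense of the data the glove produces, here is a minimal sketch, assuming Adafruit's MMA8451 Arduino library and I2C wiring; the serial output format is my own illustration, not the project's actual code.

```cpp
// Sketch (assumed I2C wiring) that reads the MMA8451's eight static
// orientations plus acceleration along the x, y, and z axes.
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_MMA8451.h>

Adafruit_MMA8451 mma;

void setup() {
  Serial.begin(9600);
  mma.begin();                      // default I2C address 0x1D
  mma.setRange(MMA8451_RANGE_2_G);  // most sensitive range, suits hand gestures
}

void loop() {
  sensors_event_t event;
  mma.getEvent(&event);             // acceleration in m/s^2
  Serial.print(event.acceleration.x); Serial.print(' ');
  Serial.print(event.acceleration.y); Serial.print(' ');
  Serial.print(event.acceleration.z); Serial.print(' ');
  Serial.println(mma.getOrientation()); // one of 8 portrait/landscape states
  delay(100);
}
```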


The Adafruit MMA8451 accelerometer is sewn onto a fingerless glove and wired to an Adafruit Feather 32u4 Adalogger with a JST connector. The data logger is then connected to a strand of NeoPixels, again with a JST connector. The NeoPixels will attach to both ends of side-glow fiber optic strands in a frame. The JST connectors make the unit modular, so it can be used in a live performance, as a data collection tool without the lights, or as a means to play back previously gathered data.
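A sketch of how the glove data might drive the lights, assuming the Adafruit MMA8451 and NeoPixel libraries; the strip length, data pin, and the acceleration-to-colour mapping are illustrative assumptions, not the finished piece's code.

```cpp
// Illustrative mapping from glove motion to NeoPixel colour
// (strip length, data pin, and the mapping itself are assumed).
#include <Adafruit_Sensor.h>
#include <Adafruit_MMA8451.h>
#include <Adafruit_NeoPixel.h>

#define PIXEL_PIN 6
#define NUM_PIXELS 10

Adafruit_MMA8451 mma;
Adafruit_NeoPixel strip(NUM_PIXELS, PIXEL_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  mma.begin();
  strip.begin();
}

void loop() {
  sensors_event_t e;
  mma.getEvent(&e);
  // Map each axis (roughly -20..20 m/s^2) onto one colour channel.
  uint8_t r = map(constrain((long)e.acceleration.x, -20, 20), -20, 20, 0, 255);
  uint8_t g = map(constrain((long)e.acceleration.y, -20, 20), -20, 20, 0, 255);
  uint8_t b = map(constrain((long)e.acceleration.z, -20, 20), -20, 20, 0, 255);
  for (int i = 0; i < NUM_PIXELS; i++) {
    strip.setPixelColor(i, strip.Color(r, g, b));
  }
  strip.show();
  delay(50);
}
```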

Troubleshooting:

  • Initially I wanted fiber optics to be both warp and weft, but even the thinnest strands I found were too thick for a tight weave. Then I tried weaving with clear microfilament (invisible thread). With all the handling involved in weaving, the microfilament ended up damaging the fiber optics by slicing into the strands, so that light concentrated at the point of rupture rather than glowing throughout the strand.


  • To connect the fiber optics to the NeoPixels, I would need to gather the fiber optic strands together with shrink tubing and connect them with another shrink tube stitched around the NeoPixels. Shrinking the tube requires finesse so it doesn’t collapse down and cover the NeoPixel light. Ultimately the class suggested not sending the NeoPixel light THROUGH the fiber optic strands; instead, I could shine the lights behind the strands, using the strands or microfilaments as textural interest.


  • The unit became temperamental, producing random results. After testing conductivity and various connections, we determined that tiny stray fibers were causing shorts with movement. Once it was working, I secured the accelerometer connections with nail polish.

Final Help with Final Projects

Hi all,

I hope you are doing well in your preparations for Wednesday’s critique.

If you have questions or would like some help troubleshooting, let me know. I am on campus tomorrow (Tuesday) and can be available at 4pm after my class to assist as necessary.

Looking forward to seeing what you have been working on.

Best,

Christine

Project Development: Interactive Projection

Diagram of the new mat design.

My project has evolved again. After thinking about how the materiality of the interface could connect the viewer more to the process and purpose of interacting with the projection, I have changed the design to a giant Light Colour Wheel Map. This shape will be projected onto the wall, and the viewers will be ‘green screened’ into the projection.

The larger outer sections, Red, Green and Blue, will be large pressure sensors made of their assigned coloured fabric and chroma keyed individually into Max, with NeoPixels sewn underneath around the edges to give enough light for the camera. Each RGB section will be allocated its own area of the camera image through a mask, using three of the Suckah! object grids to pick out the specific colour of each section and separate the video footage.

The pressure sensor in each section will trigger various effects through Max/MSP, changing the video source with distortions such as noise or other manipulation. The Cyan, Yellow and Magenta sections will also be pressure sensors, but they will control the sound of the three videos. Possibly the volume or the pace?
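The post doesn’t specify how the pressure data gets into Max; one common route, sketched here under the assumption that the sensors are force-sensitive resistors read by an Arduino-class board (pin assignments hypothetical), is to stream the readings over serial to Max’s [serial] object.

```cpp
// Hypothetical sketch: read six fabric pressure sensors (FSR voltage
// dividers on analog pins A0-A5) and stream the values to Max/MSP.
const int sensorPins[6] = {A0, A1, A2, A3, A4, A5}; // R, G, B, C, Y, M (assumed order)

void setup() {
  Serial.begin(9600); // match the baud rate set on Max's [serial] object
}

void loop() {
  // One space-separated line per frame, e.g. "512 3 87 0 0 1023"
  for (int i = 0; i < 6; i++) {
    Serial.print(analogRead(sensorPins[i])); // 0-1023 on a 10-bit ADC
    Serial.print(i < 5 ? ' ' : '\n');
  }
  delay(33); // roughly 30 updates per second
}
```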

In the centre is an Adafruit colour sensor which, when activated by the viewer, can also change the area of colour projected onto the wall, sourced from the live video (for example, their pink top). There will be multiple NeoPixels in this section too, which will change along with the colour read by the sensor.
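The post doesn’t name the exact sensor; assuming it is Adafruit’s common TCS34725 colour sensor, and with the NeoPixel pin and count as placeholders, a minimal sketch mirroring the sensed colour on the pixels might look like this.

```cpp
// Hypothetical sketch: read a colour with an Adafruit TCS34725 (the exact
// sensor model is an assumption) and mirror it on nearby NeoPixels.
#include <Wire.h>
#include <Adafruit_TCS34725.h>
#include <Adafruit_NeoPixel.h>

Adafruit_TCS34725 tcs(TCS34725_INTEGRATIONTIME_50MS, TCS34725_GAIN_4X);
Adafruit_NeoPixel pixels(12, 6, NEO_GRB + NEO_KHZ800); // 12 pixels on pin 6 (assumed)

void setup() {
  tcs.begin();
  pixels.begin();
}

void loop() {
  float r, g, b;
  tcs.getRGB(&r, &g, &b); // channel values normalised to roughly 0-255
  for (int i = 0; i < pixels.numPixels(); i++) {
    pixels.setPixelColor(i, pixels.Color((uint8_t)r, (uint8_t)g, (uint8_t)b));
  }
  pixels.show(); // the same values could also be sent on to Max over serial
}
```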

I want the videos to be white noise when no one is on the mat. The three videos will be sampled from found sources and will make a slight comment on the effect of evolving digital space on people’s perspective of what is real: from the camera obscura to film to video and now entering virtual reality.

Previous design and idea of the installation set-up.