
Moving Image


Intention

Our group shot the video to mirror the sounds projected in our soundscape. It begins with the feet of a man walking from a crowded space into a dark, quiet space of his own. The video then follows the character through the drunken, clumsy state of writing and hating his work, locking away what little of his life he has left, and finally becoming a canine-like human without a value or desire left to cherish.

The video was embedded on top of an image of a theatre, and a component to display tweets was added. The idea was that tweets would be pooled based on predetermined keywords linked to the narrative of the video, and that at predetermined moments during the video, tweets would be chosen at random from specific pools and shown. The tweets would be recent, almost real time, loaded along with the page.

These tweets form a chorus of random voices narrating or reflecting the story. Every time the page is loaded, different tweets are used. The experience of watching the video will never be the same twice.

Process

The video was shot on iPhone and Canon cameras. The footage was then imported into iMovie and Final Cut, where it was cut and edited to transition seamlessly with the audio. We decided to exclude the piano interludes from the video version of the project: we deemed them important for moving listeners through the transitions of the audio, but unnecessary in the video, where the visuals aided the transitions well enough. In the revision process we did, however, overlay the video with a black-and-white, era-appropriate filter and embed it on top of an image of a theatre. These effects create an aesthetic that fits the time period in which the piece was written and provide contrast between the foreground and the background.

The process of creating the tweet feedback component was complex. Node-RED was used to interact with Twitter's API on a server separate from the Blogger hosting system. Node-RED is a powerful tool that brings the simplicity of a flow chart to server-side processing of content.

For our project, Node-RED maintains a connection to the Twitter API, searching for a specific set of keywords. The results are stored in a database powered by MongoDB. The collection in the database is capped so that it never exceeds a fixed size.
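
As a rough sketch of how the storage side could be wired up (the collection name, size limits, and field names here are illustrative rather than our exact configuration, and the message fields set by the Twitter input node should be checked against that node's documentation):

// Run once in the mongo shell to create a capped collection, so that old
// tweets are discarded automatically once the size limit is reached
// (name and limits are illustrative):
db.createCollection("tweets", { capped: true, size: 1048576, max: 500 });

// Sketch of a Node-RED function node placed between the Twitter input node
// and the mongodb output node. msg.tweet is assumed here to hold the full
// tweet object delivered by the Twitter node.
msg.payload = {
    keyword: msg.topic,               // search term that matched this tweet
    text: msg.tweet.text,             // tweet body shown in the speech bubble
    user: msg.tweet.user.screen_name, // author, in case we want attribution
    stored: new Date()
};
return msg;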

When this page loads, it dynamically queries the server running the Node-RED instance using a similar list of keywords. For each keyword, a pool of tweets is returned from the database. This means the tweets are only near real time, not actually in real time.
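
On the client, that query could look something like the sketch below, where the endpoint URL and keyword list are placeholders for the real Node-RED HTTP endpoint and the keywords tied to the video's narrative:

// Hypothetical endpoint exposed by a Node-RED "http in" node; the real URL
// and keyword list differ in our deployment.
var KEYWORDS = ["writing", "alone", "theatre"];
var tweetPools = {};

KEYWORDS.forEach(function (keyword) {
    fetch("https://example.com/tweets?q=" + encodeURIComponent(keyword))
        .then(function (res) { return res.json(); })
        .then(function (tweets) {
            // Each pool holds the most recent stored tweets for that keyword.
            tweetPools[keyword] = tweets;
        });
});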

A scripted set of time-based triggers tied into the YouTube API, combined with some randomness, displays the tweets sporadically in random locations over the seats in the image of the theatre. They are animated using simple CSS transitions. All of the client-side JavaScript, including that script, is embedded directly in this page; only the Node-RED code is hidden away.
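
A simplified version of that display logic might look like this, assuming a player object created with the YouTube IFrame Player API, the tweetPools object sketched above, and a container element positioned over the theatre image; the cue times, keywords, element IDs, and class names are all illustrative:

// Cue points (in seconds) pair moments in the video with keyword pools.
var CUES = [
    { time: 30,  keyword: "writing" },
    { time: 95,  keyword: "alone" },
    { time: 160, keyword: "theatre" }
];
var fired = {};

// Poll the YouTube player once a second and fire each cue when its time passes.
setInterval(function () {
    var t = player.getCurrentTime();
    CUES.forEach(function (cue) {
        if (!fired[cue.time] && t >= cue.time) {
            fired[cue.time] = true;
            showRandomTweet(cue.keyword);
        }
    });
}, 1000);

function showRandomTweet(keyword) {
    var pool = tweetPools[keyword] || [];
    if (pool.length === 0) return;
    var tweet = pool[Math.floor(Math.random() * pool.length)];

    var bubble = document.createElement("div");
    bubble.className = "tweet-bubble";  // styled with a simple CSS transition
    bubble.textContent = tweet.text;
    // Scatter the bubble somewhere over the seats in the theatre image.
    bubble.style.left = (10 + Math.random() * 80) + "%";
    bubble.style.top  = (55 + Math.random() * 35) + "%";
    document.getElementById("theatre").appendChild(bubble);

    // Remove after a few seconds; the fade is handled by the CSS transition.
    setTimeout(function () { bubble.remove(); }, 8000);
}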

Changes from Rough Draft

The aesthetic of the page was changed and the sources were updated, including changes to the background of the page and the tweet bubbles. The script was altered to include more keywords and more variety over the course of the video.