Wednesday, 10 December 2008

Motion tracking and object recognition

This week I am researching the possibilities of using motion tracking or object recognition in my project. In short, I want the camera to recognize people and their movement so that their postures will not be affected by special effects. Only the background is going to morph. This technique is possible to achieve in Max/MSP Jitter thanks to the 'cv.jit' library. It is all fine except one thing - I do not possess the full version of Jitter, which is needed for the library to work. Still, no problem - I can always work in the TVU facilities where the full version of Jitter is installed on the Macs. In the meantime, I am researching techniques similar to cv.jit, or an equivalent library.
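To get a feel for how the background/foreground split might work, here is a minimal sketch in plain C++ - not cv.jit itself, just an illustration of the principle, with an arbitrary threshold: pixels that differ from a stored reference frame are treated as the moving person and left alone, everything else is treated as background and gets the effect.

#include <cstdint>
#include <cstdlib>
#include <vector>

// Sketch of the idea only: keep moving pixels (the person) untouched,
// apply an effect where the frame still matches the stored background.
// Assumes 8-bit grayscale frames of equal size; the threshold is arbitrary.
std::vector<uint8_t> morphBackgroundOnly(const std::vector<uint8_t>& frame,
                                         const std::vector<uint8_t>& background,
                                         uint8_t threshold = 25)
{
    std::vector<uint8_t> out(frame.size());
    for (std::size_t i = 0; i < frame.size(); ++i) {
        int diff = std::abs(int(frame[i]) - int(background[i]));
        if (diff > threshold)
            out[i] = frame[i];          // moving pixel: leave the person alone
        else
            out[i] = 255 - frame[i];    // background pixel: apply the 'morph' (here a simple invert)
    }
    return out;
}

cv.jit does far more than this, of course, but this background/foreground split is the part I need.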

This is what I have found browsing the web. It is called CamSpace and it is a great piece of software for playing games without controllers such as joysticks, mouse, keyboard or pad. All you need to control anything on your PC is a camera. The software recognizes the objects seen by the camera thanks to colour recognition. Anything in front of the camera can become a controller. This is awesome! This invention can be expanded and used in many different ways, not just to play games. But enough reading. See it for yourself on their website. It is also ready to be installed on any PC for free.
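The principle behind it seems simple enough to sketch. Here is a rough illustration in C++ (my own guess at how such colour tracking could work, not CamSpace's actual code): mark every pixel close to a target colour and take the centroid of those pixels, which can then be mapped onto a mouse position, a paddle, or anything else.

#include <cstdint>
#include <cstdlib>
#include <vector>

struct RGB { uint8_t r, g, b; };
struct Point2D { float x, y; bool found; };

// Rough illustration of colour-based tracking (not CamSpace's actual code):
// collect all pixels close to a target colour and return their centroid.
Point2D trackColour(const std::vector<RGB>& frame, int width, int height,
                    RGB target, int tolerance = 40)
{
    long sumX = 0, sumY = 0, count = 0;
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            const RGB& p = frame[y * width + x];
            if (std::abs(p.r - target.r) < tolerance &&
                std::abs(p.g - target.g) < tolerance &&
                std::abs(p.b - target.b) < tolerance) {
                sumX += x; sumY += y; ++count;
            }
        }
    }
    if (count == 0) return {0.f, 0.f, false};
    return {float(sumX) / count, float(sumY) / count, true};
}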


I have found more on this topic of motion tracking. Here is a blog on which the creator explains and demonstrates how we can play 'ping pong' using live video tracking and a microphone.

Monday, 8 December 2008

Time-based imaging project part3

Here are some videos and tutorials about high quality timelapse effects. A remarkably cheap setup allows you to capture high quality images and then create a movie from the series. The tutorial and videos are described on
And here is an amazing time lapse video I spotted on YouTube

Thursday, 4 December 2008

The design of the glove

While working on the effects I am trying to apply in my project, I am also thinking more and more about the design of the glove. A good example of what I am aiming for is a scene from the 'Minority Report' movie. As a fan of Philip K. Dick's film adaptations, I like the design of the gloves in that movie.

What interests me more is the idea of lights on top of the fingers. Now, I am thinking whether it is possible to have a pair of gloves: on one hand, the camera with the ultrasonic sensor, and on the other a big multi-coloured LED in the palm of the hand. When both gloves/hands are close to each other, the one with the LED on can present the object in a different light and, thanks to that, change the effect seen on the screen. I also came up with this idea thanks to 'jit.chromakey' in Max/MSP Jitter. Chromakeying is the method of masking out one colour, within a tolerance, and replacing it with other data/footage. The colour can be altered, so I am planning to take a closer look at that.
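To make that last sentence concrete, here is roughly what a chromakey does per pixel - a small C++ sketch of the principle, not jit.chromakey itself, with the colour distance measure and tolerance value chosen arbitrarily:

#include <cstdint>
#include <cstdlib>

struct RGB { uint8_t r, g, b; };

// Rough per-pixel sketch of chromakeying (not jit.chromakey itself):
// if the source pixel is within 'tolerance' of the key colour,
// substitute the pixel from the replacement footage instead.
RGB chromakeyPixel(RGB src, RGB replacement, RGB key, int tolerance)
{
    int distance = std::abs(src.r - key.r)
                 + std::abs(src.g - key.g)
                 + std::abs(src.b - key.b);
    return (distance <= tolerance) ? replacement : src;
}

If the key colour can be changed while the patch is running, then changing the colour of the LED in the palm should be enough to switch which parts of the image get replaced.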

Sunday, 30 November 2008

Time-based imaging project part2

A few weeks ago I saw the live action animation "Tango" by Zbigniew Rybczynski. This Polish artist's video had a great impact on me. It shows a single small space - a room - where 36 characters, in different stages of life, are interacting. It would not be possible to have this much action happening in one room, so Rybczynski came up with a unique idea. This is how he explains it: 'I had to draw and paint about 16.000 cell-mattes, and make several hundred thousand exposures on an optical printer. It took a full seven months, sixteen hours per day, to make the piece.' http://www.zbigvision.com


I was deeply inspired by Zbig Rybczynski's "Tango" and so wanted to include this inspiration in my project. The fact is that he is using layers upon layers of animation. His piece is timeless. It reminded me of other, more contemporary artists. Semiconductor is a team of artists who experiment with digital animation and video-editing. 'Semiconductor make moving image works which reveal our physical world in flux; cities in motion, shifting landscapes and systems in chaos. Since 1999 UK artists Ruth Jarman and Joe Gerhardt have worked with digital animation to transcend the constraints of time, scale and natural forces; they explore the world beyond human experience, questioning our very existence.' http://www.semiconductorfilms.com. I believe they are using layer-on-layer animation in one of their pieces, 'Earth Moves'. Here are a few snapshots from their website:

I am certain that these inspirations will find a place in my project. My idea is to use a static camera and record a chosen location to use as the background of my video. The foreground will be a static image edited frame by frame in Photoshop. I am thinking of a city location and using an image of a building in the foreground. Like in the last picture above, I will edit the building 30 times for each second of footage. I am not planning to spend a full seven months editing as Rybczynski did :-) Thanks to the mathematical precision of Photoshop effects I can be much quicker in editing.

Time-based imaging project part1

This time, about my Time-based project. I have been working on my ideas for this project for one month now. In that time I focused on many aspects of the video-making and video-editing processes. But it is only now that I have an idea worth realizing on a big scale. I am posting my recent videos here and I will give a short description of the methods I used. This is what I came up with in recent weeks:

As you can see, I put more importance on pictures and editing in my project than on the moving image. This theme is the crucial part of my project, as my final artefact is based on this technique of video-making. I was happy with the final effect, but there is something I would like to work on. Namely, the fact that the whole scene is seen from one angle and the camera is static. My next approach was to make the camera move around the object. In order to do that, I had to find a shooting technique that would give me control over the camera and its angles. The idea was to come up with a final product that would use the same technique as in the video above and at the same time be relevant to modern shooting techniques. My next artefact was just an attempt using a phone camera and a shabby installation.

Sunday, 23 November 2008

Details

Hi everyone! Harder, faster, better... Recently nothing else matters, and still... Time makes no exceptions, and that is why I would like to inform you that I am closer every day to finishing my Experimental Media project. And so, this time I can show you a more detailed description of what I am building.

The idea is to have four URM37 V3.2 ultrasonic sensors attached to the fingers and a tiny camera attached to the palm, all sewn into a glove. The user is able to observe the world around him wearing this glove. Originally, I wanted the user to see the world through this glove via a visor worn on the head. This idea came from Char Davies's projects, where there is an image of a person wearing a visor connected to the rest of the equipment with cables. Obviously, this approach would be too expensive, so in this case a screen will do.

In my previous post I explained the ideological background of my project. Like I said, the human hand is the medium of recognition in my project. The world and the materials within it change thanks to human touch and sight. The user will be able to explore matter thanks to its virtual representation. A person wearing this glove sees the world changing via touch.

My plan is to use Max/MSP Jitter and Arduino: two platforms speaking to each other by sending data. The URM37 sends the values of the distance between the hand and an object to Max/MSP Jitter. Jitter and its visual effects are there to manipulate the picture. At this stage I am working on special effects in Jitter. I want to have as many effects as I can get, so that the user will be surprised by the vast diversity of virtual representations.
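As a first step, this is roughly what the Arduino side can look like. It is only a sketch under my own assumptions - URM37 in PWM mode, trigger on pin 2, echo on pin 3, and a placeholder pulse-width-to-centimetre factor; the manual linked below has the real wiring and scale. Max/MSP then reads the printed numbers through its serial object.

// Sketch of the Arduino side (assumptions: URM37 in PWM mode, trigger on
// pin 2, echo on pin 3; the 50 us/cm factor is a placeholder taken from
// memory -- check the URM37 manual for the real wiring and scale).
const int TRIG_PIN = 2;
const int ECHO_PIN = 3;

void setup() {
  Serial.begin(9600);            // Max/MSP listens at the same baud rate
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  digitalWrite(TRIG_PIN, HIGH);  // idle state
}

void loop() {
  digitalWrite(TRIG_PIN, LOW);   // falling edge starts a measurement
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, HIGH);

  unsigned long pulse = pulseIn(ECHO_PIN, LOW, 60000);  // echo pulse width in microseconds
  if (pulse > 0) {
    long distanceCm = pulse / 50;  // placeholder scale factor
    Serial.println(distanceCm);    // one value per line, easy to parse in Max
  }
  delay(50);
}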

Now, a short description of how I am dealing with the project:

First of all, I have bought two URM37 V3.2 sensors, paying approx. 10 pounds for each. It was hard to find the 'Parallax' ultrasonic sensor on eBay, but I thought that buying the URM37 was a better option anyway because of its additional ability to measure temperature changes. Here is the link to its specification: www.yerobot.com/download/mannual/URM3.2%20Mannual.pdf

Everything looked good until I had to connect the URM to the Arduino board. With the Parallax it is a piece of cake because it has got three pins and you cannot go wrong plugging it in. Here, I had to do some research on the URM as its board has got nine pins. The proper configuration can be found at: http://www.yerobot.com/forum/viewtopic.php?f=5&t=7&p=10&sid=72f4c2fbb84bf3

The next part is the Max/MSP Jitter patch that recognizes the different values of distance. This patch was downloaded from Lecture 6 of Experimental Media, uploaded on the TVU Blackboard thanks to Richard Colson:

Tuesday, 28 October 2008

References

Hi, I had a major problem with my internet connection last week. Nevertheless, I was working intensively on my projects, even while offline. The fruits of my divagations are materialising just now. But before I describe my projects in detail, I would like you to take a look at http://www.immersence.com/ where Char Davies explains her understanding of immersive virtual-reality environments and our self-awareness. 'Osmose' and 'Immersion' by Char Davies have had a great impact on people who experienced 'the immersion effect'. As it is explained: 'Such response has confirmed the artist's belief that traditional interface boundaries between machine and human can be transcended even while re-affirming our corporeality, and that Cartesian notions of space as well as illustrative realism can effectively be replaced by more evocative alternatives. Immersive virtual space, when stripped of its conventions, can provide an intriguing spatio-temporal context in which to explore the self's subjective experience of "being-in-the-world"—as embodied consciousness in an enveloping space where boundaries between inner/outer, and mind/body dissolve'.
Following this thinking, I began to contemplate human nature and human interaction with our surroundings and with nature. If human thought is considered the beginning of evolution, the body is the slave of the mind. Nowadays, people have a tendency to change nature according to their needs, in the name of development. Our mind has the ability to 'see' alternatives and to re-evaluate. We tend to control the environment with the power of our mind but, in order to make it happen, we need the power of our bodies. Since Albert Einstein's great equation E=mc² we have recognized the world and its objects by physical appearance. Nevertheless, the great minds of our times are trying to find the 'God particle' by building the Large Hadron Collider. Humankind is looking for something that seems to be mystical and religious because it has not yet been observed. The ideology behind my project is to try to bring people closer to the understanding of reality as something mystic and enigmatic. Alternative worlds and abstract thinking are part of human nature. For instance, every single piece of a broken mirror reflects the world as a separate entity and becomes detached from its previous state of a whole.
I am personally interested in understanding human emotions at the level of total surprise, when one is put in front of an unexpected change. Material things change their form and value at a human touch.
Do you remember the scene from 'The Matrix' when Neo, the protagonist, enters the world of the Matrix by touching the surface of a mirror that turns into fluid? A great metaphor for Alice in Wonderland. People explore using their senses. And so, Neo saw something unusual in the mirror and his next reaction was his finger touching its surface. Sight and touch are the tools of recognition. My project is based on these factors. A human hand is the medium of recognition in this case. A camera attached to the palm is the virtual eye, and the ultrasonic sensors attached to the fingers are the reality-distorting tools.
This is the ideological background behind my project. More technical details, sketches and information soon.   


Friday, 10 October 2008

Ok, I realized that reading Marcel Proust is unfortunately not enough to pass the exams at TVU. It suddenly became clear that I need to browse the internet in order to find something more contemporary. Besides bringing the dusty Arduino board back to life and reinstalling all the software from last year, I bought a new edition of Maplin's magazine. I actually thought that I would have ideas just by flicking through the magazine :-) Well, apparently, things don't work this way (at least not in my case). Assuming that I inherited my visual thinking from my parents, and that people are divided into those who have a visual memory and those who rather remember things just by listening, I decided to look for inspiration by watching videos on YouTube. Easy, all you do is type 'Arduino'. No need for reading books, presto, we live in a digital age. The first movie that pops up is 'Intro to the Arduino', in which two geeks in cool T-shirts are trying to show how cool Arduino can be. You have probably seen that one, so I won't leave a link to it. Then, I found a video of a guy who created something called 'Arduino Wireless POV', truly amazing, badass geek style :-)



This is not what I am looking for anyway. I am interested in using the Parallax Ping as an input and visual effects as the output. The process will involve the Arduino software and Max/MSP as the software holding the code. Preferably, I will be using Winamp AVS effects projected on the screen. The Parallax Ping is a relatively small device that can be attached, for instance, to a finger. And so, if I use 5 Pings, one attached to each finger, I can create different actions for each finger. Here you can find an Arduino code that can be the starting point of this project:

http://www.arduino.cc/playground/Main/UltrasonicSensor
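For reference, this is roughly what that starting point boils down to - my own condensed version, assuming the Ping's single signal pin is wired to digital pin 7 (the playground page has the original):

// Condensed version of the usual Ping))) reading code
// (assumption: the Ping's single signal pin is wired to digital pin 7).
const int PING_PIN = 7;

void setup() {
  Serial.begin(9600);
}

void loop() {
  // Trigger: the same pin first sends a short HIGH pulse...
  pinMode(PING_PIN, OUTPUT);
  digitalWrite(PING_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(PING_PIN, HIGH);
  delayMicroseconds(5);
  digitalWrite(PING_PIN, LOW);

  // ...then listens for the echo pulse on the same pin.
  pinMode(PING_PIN, INPUT);
  long duration = pulseIn(PING_PIN, HIGH);

  long cm = duration / 29 / 2;  // sound travels roughly 29 us per cm, there and back
  Serial.println(cm);
  delay(100);
}

With five of these, each finger's distance value could trigger its own visual effect.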

Hi, hello, welcome to my blog dedicated to new technologies in the world of digital art. This blog has been created in order to... make my life easier. Also, I want to share my ideas and thinking processes with my tutors and classmates from Thames Valley University. Moreover, I welcome anyone interested in electronics or video-processing, so if you are browsing this blog, I look forward to your feedback. I am focusing here on Arduino-based technologies, the correlation between two mechanisms mediated by an Arduino chip, video special effects, video-editing, post-processing and more. I hope that I will post regularly and that my ideas will be explained clearly enough.