Monday, December 18, 2006

I've just been looking at a new AHRC-funded project run by Sita Popat at Leeds University. The project is fairly new, I think, but the site already gives some details, including its research questions:

  • What defines the relationships between the performer-dancer, the projected image and the performer-operator?

  • What are the methodologies for the engagement of performance academics and digital technologists in effective collaborative research?

What interests me about this project is the way it seeks to explore the relationship between the performer and the video artist, or 'operator'.

'This project investigates a new set of interrelationships between performer, projection and technical operator, where the operator also becomes a 'performer' both controlling and being spontaneously present in the digital image on stage. The stage-performer interacts with the off-stage operator, who simultaneously sees, controls and 'performs' the projected image in the stage 'picture'. The expressive nature of the digital image concentrates the quality of the operator's movement through abstract forms that are choreographed with the stage performer.'

This interaction between performers is, I feel, fundamental to the creation of collaborative work using technology. The conversation between the stage and the projection needs to become one that draws on the skill of both performers. The human element of controlling and responding live to another performer is built into dance, but it has been lacking in live video and technological performance because of the limits of the technology and the methodologies around it.

The video of one of their experiments can be seen HERE

This video got Jonathan Green and me thinking about ways to trace points within a performance space. We discussed using a series of Max/MSP patches to do this:

  • We can start by tracking skin colour, or other colours, on the dancer's body. More than one point can be tracked at once by thresholding on RGB levels and/or luminosity (see the sketch after this list).
  • These points can then be traced over time by another patch.
  • The resulting points can then be filtered to create multiple images and styles on the stage. This set of filters can be affected by movement sensors on stage as well as by a 'controller'.
  • The resulting image can then be passed into a VJ package to be edited live and projected into the performance space.
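
To make that concrete, here's a rough sketch of the colour-tracking and tracing steps in Python with OpenCV, standing in for the Max/MSP patches we'd actually build. The colour bounds, luminosity gate and trail length are all placeholder values that would need tuning to the stage lighting:

    import cv2
    import numpy as np

    # Placeholder BGR bounds for a marker colour on the dancer's body
    # (tune these live against the stage lighting).
    LOWER = np.array([0, 0, 120])
    UPPER = np.array([90, 90, 255])
    MIN_LUMA = 60   # ignore very dark pixels (rough luminosity gate)

    cap = cv2.VideoCapture(0)   # stage camera
    trail = []                  # traced points, newest last

    while True:
        ok, frame = cap.read()
        if not ok:
            break

        # Threshold by colour range, gated by luminosity, as described above.
        luma = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        mask = cv2.inRange(frame, LOWER, UPPER)
        mask[luma < MIN_LUMA] = 0

        # Take the largest blob's centre as the tracked point.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            c = max(contours, key=cv2.contourArea)
            m = cv2.moments(c)
            if m["m00"] > 0:
                trail.append((int(m["m10"] / m["m00"]),
                              int(m["m01"] / m["m00"])))
                trail = trail[-64:]   # keep a short trace

        # Draw the trace. In the real setup this is where the filter
        # chain would sit, with its parameters driven by the movement
        # sensors or the operator's controller, before the image is
        # handed on to the VJ package for projection.
        for a, b in zip(trail, trail[1:]):
            cv2.line(frame, a, b, (255, 255, 255), 2)

        cv2.imshow("trace", frame)
        if cv2.waitKey(1) & 0xFF == 27:   # Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()

Tracking several points at once would just mean repeating the mask-and-blob step with a different colour range per marker.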

We have thought of a few ways of projecting:

  • Onto a semi-transparent screen covering the front of the stage.
  • Onto several banners and screens distributed around the space.

The great thing is that, using the CODA system and pre-designed Max/MSP patches, we can do this without too much design or engineering work.

