The Experience from the Ars Electronica Conference 2017: E-journey & Diary
The conference was very useful and inspired me a great deal. One thing I found at this conference is that people are trying to create new interfaces through which we can interact with the things around us, not only for activities in daily life but also as new interfaces that let artists present their performances in an interactive way. One example of an interface for the future is called
the Transparent Interface project, presented by Yuri Klebanov from Japan. He showed three demonstrations of controlling devices. First, using object detection, we can place three objects in front of a device to make it start working. He defined patterns for detecting the movement of an object, such as moving left and right, moving up and down, or turning around, so when we move the object in a way that matches a pattern, the device responds to us. Second, we can use our eye and hand movements to control a device. In the example, we look directly at the device we want to control and then move our hand up and down, or left and right. The device we are looking at starts working automatically according to our action. In the last example, he tried to communicate that in the future, objects will be able to know how we feel by detecting human behaviour and habits. He showed how an object can recognise when we feel cold or hot: we rub our arms when we feel cold, and we wave our hands when we feel hot. He trained the device to understand these human behaviours and react in the proper way, so when people feel hot the fan turns on, and when people feel cold the fan turns off and the light turns on.
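The last demonstration is essentially a mapping from recognised gestures to device actions. A minimal sketch of that idea, assuming a recogniser that already outputs hypothetical gesture labels (this is my own illustration, not Klebanov's actual implementation):

```python
# Sketch: map recognised human gestures to device actions, as in the demo.
# The gesture labels ("rub_arm", "wave_hand") and device names are hypothetical.

def react_to_gesture(gesture):
    """Return the device actions that should follow a recognised gesture."""
    if gesture == "rub_arm":        # rubbing an arm suggests the person is cold
        return {"fan": "off", "light": "on"}
    if gesture == "wave_hand":      # waving a hand suggests the person is hot
        return {"fan": "on"}
    return {}                       # unrecognised gesture: do nothing

print(react_to_gesture("rub_arm"))   # {'fan': 'off', 'light': 'on'}
```

The interesting part of the real system is of course the recogniser itself; once gestures are labelled, the reaction logic is a simple lookup like this.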
His project made me think that in the future, everything around us will become an intelligent device that can understand everything humans do. The future of interfaces will be gesture interaction, with less reliance on hardware interfaces.
A new approach to sound performance is the interactive sound performance. The artists Vibert Thio (TW) and Duanger Du (TW) developed an interactive sound show called Étude. Their performance uses a live-coding technique to make the show more interactive with the audience. Two things we can see them use in the show are a big movable ring and six Chinese lamps at the front of the stage. The melody changes when they move the ring, and they can move it slowly or quickly; the sound interface appears to detect the pattern of the ring's movement and then switch to the mapped melody. The six lamps turn on and off according to the melody, and the performers can also interact with them: the melody changes when they touch a lamp. The show is very impressive with its light and sound, and the interactive element makes it fascinating.
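The ring interaction can be read as a mapping from a detected movement pattern to a melody. A minimal sketch of that mapping, under my own assumptions (the speed threshold and note lists are invented for illustration, not taken from the artists' system):

```python
# Sketch: choose a melody from the detected speed of the ring's movement.
# Movement labels and note sequences are hypothetical.

MELODIES = {
    "slow": ["C4", "E4", "G4"],          # calmer phrase for slow movement
    "fast": ["C5", "D5", "E5", "G5"],    # brighter phrase for fast movement
}

def melody_for_movement(speed_hz):
    """Pick the mapped melody from the ring's movement speed (cycles/second)."""
    pattern = "fast" if speed_hz > 1.0 else "slow"
    return MELODIES[pattern]

print(melody_for_movement(0.5))   # ['C4', 'E4', 'G4']
```

In a live-coding performance, performers could also redefine such a mapping on stage, which is presumably part of what makes the show feel improvised.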
Another project, called SynapSense and created by an Australian artist, is an interactive sound-space installation. The space allows the actors who perform the show to interact with the space itself. The interactive points include circle or triangle markers on the wall, circle or triangle markers on the floor, and a hanging rope. The space also allows the audience to go inside and participate with the actors. A melody is generated when an actor stands on a point on the floor and touches an audience member. Furthermore, the audience can pull the hanging rope, and melodies are generated according to the movement of the rope. The space makes the performance more participatory for the audience.