Memory Horizon is an immersive media installation that captures the nature of human memory in 3D space through visual, auditory, and motion-based interaction.
The installation comprises five main features. A Kinect located at the front-center of the space detects the user's motion; in response, on-screen effects are displayed, the lights brighten, the clock gears move faster, and the audio effects increase in tempo.
01 interaction design
Below is the basic interaction map and component breakdown of the experience.
Based on the Kinect's sensing mechanism, various interactions are developed within the viewing space of the installation, using both horizontal position (dx, fig.3) and depth (dz, fig.4) values.
The change in horizontal position (dx) is mapped to the screen width and continuously drives the X coordinate of the panel's position. The depth value is an equally important input: the change in depth (dz) of the target drives the Z coordinate of the particle sphere displayed on the screen, the illumination of the five LED lights, and the tempo of the sound effects.
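The dx/dz mappings above can be sketched as simple linear re-mappings, analogous to Processing's map() function. The value ranges below (screen width, depth limits, tempo band) are illustrative assumptions, not the installation's actual calibration:

```java
public class InteractionMap {
    // Linear re-mapping, analogous to Processing's map() function.
    static float map(float v, float inMin, float inMax, float outMin, float outMax) {
        return outMin + (v - inMin) * (outMax - outMin) / (inMax - inMin);
    }

    public static void main(String[] args) {
        float dx = 0.5f;   // horizontal position, assumed normalized 0..1
        float dz = 1.2f;   // depth in metres, assumed range 0.5..4.0

        // dx drives the panel's X coordinate across an assumed 1920 px screen
        float panelX = map(dx, 0f, 1f, 0f, 1920f);

        // dz drives the sphere's Z coordinate, LED brightness, and audio tempo
        float sphereZ  = map(dz, 0.5f, 4.0f, 0f, -1000f); // recedes as viewer steps back
        float ledLevel = map(dz, 0.5f, 4.0f, 255f, 0f);   // brighter when closer
        float tempoBpm = map(dz, 0.5f, 4.0f, 120f, 60f);  // faster when closer

        System.out.printf("panelX=%.0f sphereZ=%.0f led=%.0f bpm=%.0f%n",
                panelX, sphereZ, ledLevel, tempoBpm);
    }
}
```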
The main object in Memory Horizon is a mechanical clock that runs on a combination of springs and gears, without electronics. Stepper motors A and B are installed on the two wheels that move the clock hands (Fig.5). Each motor is attached to the central axis of its wheel and turns its gear at a single speed (rpm value). As the viewer approaches the clock, the motors spin faster; as the viewer moves away, they slow down.
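A minimal sketch of that motor behaviour, mapping viewer depth to a stepper rpm. The 0.5–4.0 m depth range and the 5–60 rpm band are assumptions for illustration; the installation's actual calibration is not given:

```java
public class ClockMotor {
    // Map viewer depth (metres) to a stepper rpm: closer viewer -> faster gears.
    // The 0.5-4.0 m range and 5-60 rpm band are illustrative assumptions.
    static float rpmForDepth(float dz) {
        float clamped = Math.max(0.5f, Math.min(4.0f, dz));
        float t = (clamped - 0.5f) / (4.0f - 0.5f); // 0 = closest, 1 = farthest
        return 60f - t * (60f - 5f);                // 60 rpm close, 5 rpm far
    }

    public static void main(String[] args) {
        System.out.println(rpmForDepth(0.5f)); // viewer right at the clock
        System.out.println(rpmForDepth(4.0f)); // viewer at the back of the space
    }
}
```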
led light mechanism
In response to the audience's real-time location (dz) values, the screen, lighting, and sound produce the following interaction effects:
02 final prototype
On the screen, real-time imagery is rendered as a visual representation of particles. Each of the three spheres is composed of thousands of particles surrounding a core sphere. The sphere's dimensions change interactively in response to the audience's position, using the horizontal position value (dx) and depth value (dz) detected by the Kinect. The horizontal position value is mapped to the screen width, which determines the X coordinate of the particles. Mapping the depth value to the camera's depth allows viewers to experience a sense of immersion as they approach the sensor.
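A sketch of how such a particle sphere can be generated, placing n particles uniformly on the surface of a sphere around a core point. The particle count, radius, and center below are placeholders; the installation's actual values are not given:

```java
import java.util.Random;

public class ParticleSphere {
    // Generate n particle positions on the surface of a sphere of radius r,
    // centred on a core at (cx, cy, cz). Uniform directions are obtained by
    // normalizing 3D Gaussian samples.
    static float[][] particles(int n, float r, float cx, float cy, float cz, long seed) {
        Random rng = new Random(seed);
        float[][] pts = new float[n][3];
        for (int i = 0; i < n; i++) {
            double x = rng.nextGaussian(), y = rng.nextGaussian(), z = rng.nextGaussian();
            double len = Math.sqrt(x * x + y * y + z * z);
            pts[i][0] = (float) (cx + r * x / len);
            pts[i][1] = (float) (cy + r * y / len);
            pts[i][2] = (float) (cz + r * z / len);
        }
        return pts;
    }
}
```

In the installation, the core's Z coordinate would then be driven by dz, shifting the whole cloud toward or away from the camera.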
led & sound
To maximize the immersive experience, not only the screen but also the lighting and sound reflect the Kinect data and demonstrate real-time interaction. Five LED strip lights are installed along the bottom, on a circuit controlled by Arduino boards, and Processing and Arduino are connected to each other over serial communication.
The viewer's depth value, sent as a string over serial, is written to the board as an analog output (analogWrite), adjusting the overall intensity of the LEDs in real time. The clock sound tracks are likewise controlled by the user's location, an interaction built with the Sound library provided by Processing.
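The serial step above can be sketched as parsing the depth string and converting it to the 8-bit value that analogWrite() expects. The depth range is an assumption, and the helper name is hypothetical:

```java
public class LedLevel {
    // Convert a depth reading received over serial (as a string, per the text)
    // into an 8-bit PWM level for analogWrite(). The 0.5-4.0 m depth range
    // is an illustrative assumption.
    static int pwmForDepth(String depthMessage) {
        float dz = Float.parseFloat(depthMessage.trim());
        float clamped = Math.max(0.5f, Math.min(4.0f, dz));
        // Closer viewer -> brighter LEDs: 255 at 0.5 m, 0 at 4.0 m
        return Math.round((4.0f - clamped) / 3.5f * 255f);
    }

    public static void main(String[] args) {
        System.out.println(pwmForDepth("0.5")); // 255
        System.out.println(pwmForDepth("4.0")); // 0
    }
}
```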
media art installation
mar 2016 - oct 2016
Seoul, South Korea
2 team members