LUME Interactive Media Display

What are new and exciting ways to visualize and access relevant data within an environment?

The Interactive LUME Display project explores tangible interaction with digital content through spatially embedded computation in public space, at multiple scales from low to high resolution of interactivity. Composed of four interactive touch points, from a mobile app to a large display installation, LUME aims to build an identity for the department of Comparative Media Studies/Writing at the Massachusetts Institute of Technology by increasing the awareness and visibility of the department's research labs, displaying a variety of content, acting as a place finder, and engaging all potential users.

The signage wall is designed as the main identifying signage of the Department of CMS|W. It is also interactive, responding to the movement of passersby through PING ultrasonic range finder sensors. A series of sensors is placed along the bottom strip of the signage wall, detecting the presence of a nearby person and directing Arduino microcontrollers to turn LED pixels on and off in a designed sequence. The etched acrylic panel is fully lit at all times and begins to respond dynamically to people's movement, bringing attention to the otherwise static signage.
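The sensor-to-pixel mapping can be modeled as follows. This is a hedged sketch of the sequencing logic, not the deployed Arduino firmware (which would be written as an Arduino sketch); the threshold and spread values are assumptions for illustration.

```javascript
// Model of the signage logic: given one distance reading per PING sensor
// (in cm), decide which LED pixels along the strip should light up.
const TRIGGER_CM = 120; // presence threshold; assumed value
const SPREAD = 1;       // neighboring pixels lit around an active sensor; assumed

function ledStates(readings) {
  const states = new Array(readings.length).fill(false);
  readings.forEach((cm, i) => {
    // A reading of 0 means no echo; anything under the threshold is presence.
    if (cm > 0 && cm < TRIGGER_CM) {
      // Light the pixel at the sensor and its immediate neighbors,
      // producing the designed on/off sequence as a person walks by.
      const lo = Math.max(0, i - SPREAD);
      const hi = Math.min(readings.length - 1, i + SPREAD);
      for (let j = lo; j <= hi; j++) states[j] = true;
    }
  });
  return states;
}
```

Running this once per sensor poll cycle, a person walking along the wall sweeps a lit region across the strip.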

The Media Wall displays the highest resolution of information on three large TV screens, encased in linear strips of acrylic that visually divide each large screen into multiple sections. Each band of the screen is dedicated to a different lab and displays curated content such as lab news, latest research, and researcher bios. Content is served from a main server and displayed as three sites with slightly different web addresses, each assigned to the web browser of the respective smart TV. All three share the same backend; when content needs to update, they are connected through node.js, which allows socket push communication to the web browsers.
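The push architecture can be sketched with a minimal in-memory model. This is an assumption about the design, not the project's actual server code: three "screens" register with a shared hub, and an update to any lab's section is pushed to every screen, as the node.js backend would do over a socket connection.

```javascript
// Minimal model of the Media Wall's push setup: one hub, three subscribed
// screens, broadcast on update. In the real system the callbacks would be
// socket push messages to each smart TV's web browser.
class ContentHub {
  constructor() {
    this.screens = new Map(); // screenId -> update callback
  }
  subscribe(screenId, onUpdate) {
    this.screens.set(screenId, onUpdate);
  }
  publish(lab, content) {
    // Push the same backend update to every connected screen.
    for (const [screen, cb] of this.screens) cb({ screen, lab, content });
  }
}
```

Because all three screens share one backend, a single `publish` keeps the wall consistent without each TV polling for changes.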
User interaction with the Media Wall occurs through an accompanying mobile app that lets users "flick" (a simple finger gesture of flicking the mobile image toward the screen) their pictures to the screen whenever they are close to it. Lab researchers can use this function to share photos, research papers, or other pixel data on the large departmental screen.
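One plausible way to recognize the flick gesture is a velocity check on the touch track. This is a sketch under assumptions, not the app's actual detector; the speed threshold and sample shape are illustrative.

```javascript
// Assumed flick check: an upward swipe faster than a velocity threshold
// counts as flicking the image toward the wall.
const MIN_SPEED = 0.5; // px per ms; assumed threshold

function isFlick(start, end) {
  // start/end: { x, y, t } touch samples; y grows downward on screens,
  // so a positive (start.y - end.y) means the finger moved up.
  const dt = end.t - start.t;
  if (dt <= 0) return false;
  const upward = start.y - end.y;
  return upward / dt >= MIN_SPEED;
}
```

On a positive match the app would send the image to the server, which pushes it to the Media Wall.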

Nine palm-sized acrylic cubes placed on a shelf invite users to engage with the objects. Each cube is laser-etched with the logo of a different research lab and identified with an NFC (near field communication) tag. When a lab sends a tweet, the LED light under the respective lab object (acrylic cube) turns on, indicating new activity. Lifting or placing the object turns off the LED light and, via the NFC readers, sends a request to the server to update the content of the web browser displayed on the main screen.
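The cube behavior can be summarized as two event handlers. This is an illustrative model under assumed names (the tag IDs and request shape are not from the project), showing the tweet-to-LED and lift-to-request flow.

```javascript
// Model of the tangible-object logic: a tweet lights the cube's LED;
// lifting or placing the cube clears it and yields the request the NFC
// reader would send to update the main screen.
const leds = new Map(); // NFC tag id -> LED on/off

function onTweet(tagId) {
  leds.set(tagId, true); // new activity: light the cube
}

function onCubeMoved(tagId) {
  leds.set(tagId, false); // acknowledged: LED off
  // Assumed request shape; the real endpoint is not documented here.
  return { method: 'POST', path: '/update', lab: tagId };
}
```

The LED thus acts as an unread-notification indicator, and the physical gesture of picking up a cube both acknowledges it and switches the Media Wall to that lab's content.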

Download the project abstract from the Pervasive Displays conference, Copenhagen, Denmark, 2014.