In collaboration with Katherine Isbister (Scientist Collaborator for Eyebeam Computational Fashion Fellowship) and Jack Langerman (Projection mapping & Computer vision).
The Lightning Bug game is a two-person interactive game experience that uses costumes embedded with technology, projection on a half-dome surface, and custom software. The players represent the last remaining lightning bugs in a world consumed by pollution, and must cooperate with each other to fight a virtual enemy of darkness. Each player has a distinct role — one player shoots and the other collects power — and they must hold hands to transfer power from one player to the other in order to fight. This is part of a larger exploration of the potential of costumes as game controllers.
What “Costumes as Game Controllers” addresses and explores: When we play video games, we often play through characters in a story. We control the avatars, form a relationship with them, and experience the game through them. But what happens when we dress up in costume? Can we start to play the role of the avatar? And what happens when we embed technology into the costumes, allowing us to navigate and play the game experience through them? Would it draw players into a more compelling, fully realized world? I believe costumes can be a powerful tool that helps the player play a role and possess characteristics and behaviors the player normally wouldn’t.
We can already see examples of this kind of immersive experience in live-action role-playing games and cosplay — expressions of the desire to fully inhabit a character in a story. Wearing the costume of a character and acting out that character's scenarios can help create a rich, fantastical world, removing us from our everyday lives and identities. It’s not just about wearing the costumes, but about experiencing the character and the story physically and emotionally. The deeper we fall into the “magic circle” of the game world, the more opportunities there are for interactions and experiences — not only with the technology, but also with each other — that we would otherwise not have in our regular day-to-day lives.
While, of course, the mechanics of a game are also important, I believe that this act of stepping into a role, transforming from regular person to fantastical hero with superpowers, is also a crucial process to consider when creating an immersive experience. Culturally, the act of putting on a costume is often seen as a process of transformation; costumes can signify a sense of power that wasn’t there before — think of Superman and Wonder Woman. In combination with costumes, ritualized gestures can also aid the process of transformation — as in this transformation scene from the Japanese television series “Kamen Rider.”
Gameplay and Physical Interaction: For the last several years my objective has been to make games that utilize technology to encourage face-to-face interaction. While many digital interactions today are face-to-screen and can seem isolating, I ultimately believe that technology is a tool that can be harnessed to enhance face-to-face interactions. The Lightning Bug game requires two players to cooperate in order to battle the Dark Clouds. During the game, players rely not only on talking, but also on eye contact, body language, and touch. In this game, technology becomes a secondary element that amplifies the ultimate objective — to create an exciting, immersive face-to-face game experience between participants.
The two players fulfill distinctly different roles in the game. One player is the shooter, wearing a spiky gauntlet, while the other is the collector, accumulating power in the power-capsule backpack. To distribute power from the capsule to the shooter, the players must hold hands. To detonate bombs that momentarily stun the enemy, the players must embrace to charge them. When the power runs out, the player wearing the power pack must stand under a spotlight and rub their hands together to refill the capsule. These physical interactions, required to execute key actions, emphasize the interdependency needed to win the game.
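The cooperative rules above can be sketched as a small state machine. This is a hypothetical illustration written in Python for readability (the actual game was coded in Processing), and all names, thresholds, and rates here are invented for the sketch, not taken from the real implementation:

```python
# Hypothetical sketch of the game's power-transfer rules, inferred from the
# description above. Class name, method names, and numbers are illustrative.

class LightningBugState:
    POWER_MAX = 100

    def __init__(self):
        self.collector_power = 0   # stored in the power-capsule backpack
        self.shooter_power = 0     # available to the gauntlet

    def recharge(self, under_spotlight, rubbing_hands, amount=5):
        """The collector accumulates power only under the spotlight while rubbing hands."""
        if under_spotlight and rubbing_hands:
            self.collector_power = min(self.POWER_MAX, self.collector_power + amount)

    def transfer(self, holding_hands, amount=10):
        """Holding hands (contact sensors closed) moves power to the shooter."""
        if holding_hands:
            moved = min(amount, self.collector_power)
            self.collector_power -= moved
            self.shooter_power += moved

    def shoot(self, cost=2):
        """The gauntlet fires only while the shooter has power; returns True on a shot."""
        if self.shooter_power >= cost:
            self.shooter_power -= cost
            return True
        return False
```

The point of the structure is that neither role can act alone: the shooter's pool only fills through `transfer`, which requires physical contact between the players.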
Technology: A dedicated Wi-Fi network connects the gauntlet, the power pack, and the main computer, which runs the core of the game. Each controller is embedded with an Android phone and an Adafruit Mint IOIO board. The phone is useful because it provides powerful processing, Wi-Fi, Bluetooth, and an accelerometer. Using the phone also means fewer wired connections, and anyone who has worked with custom controllers knows that wired connections raise the risk of issues and, ultimately, frustration. Once connected to the IOIO board, the phone becomes something like an Arduino on steroids: it can easily control the LEDs in the 3D-printed spikes and the laser on the gauntlet, as well as the LEDs in the power pack. Bluetooth links the Android phone to the IOIO board so that the phone does not drain the IOIO board's battery. Open Sound Control (OSC) is used over the dedicated Wi-Fi network. The main game program on the computer and both Android apps were coded in Processing using the Ketai, OSC, and PIOIO libraries. There are additional contact sensors on the gloves, as well as a photoresistor (light sensor) on the power pack. Suspended from the ceiling above the play area are the circular mirror, the projector used for projection mapping, and the webcam used for tracking the laser from the gauntlet. This configuration allows the projection and the tracking to happen without being blocked by the players' bodies.
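To give a sense of what travels over that Wi-Fi network, here is a minimal sketch of how a controller event could be packed as an OSC message, following the OSC 1.0 encoding (null-terminated strings padded to 4-byte boundaries, big-endian arguments). The address `/gauntlet/trigger` and its arguments are hypothetical, and the actual project uses an OSC library in Processing rather than hand-packed bytes; this is only to illustrate the wire format:

```python
# Illustrative OSC 1.0 message packing in plain Python.
import struct

def osc_string(s):
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, *args):
    """Pack int32 and float32 arguments after the address and type-tag strings."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
    return osc_string(address) + osc_string(tags) + payload

# e.g. a hypothetical trigger event with a power level, ready to send over UDP:
packet = osc_message("/gauntlet/trigger", 1, 0.75)
```

Each controller can fire such a packet at the main computer's UDP port whenever a sensor changes, which is what makes OSC a convenient glue protocol for custom controllers.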
Materials & Fabrication: The gauntlet is made from laser-cut EVA foam, 3D-printed (Makerbot Replicator 2) translucent spikes embedded with LEDs, and stretchy fabric. The power pack consists of a scuba-tank holder carrying a 3D-printed, multi-layered capsule embedded with LEDs. The projection half dome is made from projection material (white mid-weight jersey with reflective particles on the surface), heavyweight canvas, and numerous fiberglass replacement tent poles.
Collaborations: I am collaborating with Dr. Katherine Isbister, Director of the NYU-Poly Game Innovation Lab and Associate Professor in the Computer Science Department, on this project. Her area of expertise and research is Human-Computer Interaction (HCI), especially in the video game context. Our collaboration explores ways in which art and science can work together and be mutually beneficial. Katherine invited me in 2012 to become the Artist in Residence at the lab. I am also collaborating with Jack Langerman, who is responsible for the projection mapping and the computer-vision tracking of the laser on the dome.
- Toni Pizza (Intern): Playtesting, Dome Set up/Break down, Documentation of fabrication process.
- Sebastian Teesdale (Intern): Dome Set up/Break down, Research/designing/prototyping to make controllers more accessible, Fabricating additional controller prototypes.
- Shoshana Kessock (Intern): Prose writing, Researching accessibility of power pack, Fabricating additional controller prototypes.
Documentation of Process:
- Notes regarding the process and various adventures in development can be found here.
- Fabric half dome making instructions will be found here (not posted yet!).
- Some initial brainstorming sketches:
- Other photos: