Wireless wanderings where I whittle away wonderful w-ideas.

In my search for a good way to communicate wirelessly between the Limb-troller and the game (see previous post), I’ve been researching different options. I very much doubt I’ll use infrared (IR) communication, because of its orientation limitations. I can’t and don’t want to guarantee that the Limb-troller will always be facing the receiver. Even if I designed the project that way, it would defeat my purpose of creating a controller that takes full advantage of clothing. A jump, a twirl, anything like that would likely keep the IR signal from reaching the receiver. Even if it didn’t, IR seems fairly iffy anyway, since sunlight or obstructions could mess up the reception.

“But why would you play a game from behind a wall?” you ask.

“Well, I wouldn’t; that’s silly. But I could imagine a situation where a game could work with more than one player, and the other player happens to pass between you and your sword-fighting game (I’m incredibly tired at the moment; sword fighting is plain but workable as an example). IR? No. XBee or Bluetooth? Yes. Which one? Still working on that.”

Which brings me to my two links, dealing with:

Bluetooth vs. XBee

XBee buying and design guide

I’m still wrapping my head around the first link (XBee is probably the way I’ll go for ease of use, as long as I can get the computer and the game to recognize commands), but the second seems fairly self-explanatory.

Project proposal: Limb-troller

Early last semester, I planned to work game design into my project for this semester. When I met with Prof. Galanter about this, he encouraged the possibility while reminding me that the project would still have to emphasize the physical/interaction side, and not lose that work to the game design side. I shelved the idea since I wasn’t sure at the time how I could work physical computing into my game design bent.

Then this semester I looked closer at the phenomenon of wearable computing, and I remembered some posts I’d read about how some people at Valve Software, a private video game development company, are experimenting with wearable computing in relation to gaming. Most of the posts I’ve seen paraphrase or quote the link below. The whole article is rather long, but the second half deals more specifically with wearable computing, or rather the author’s journey toward deciding to investigate it. The last post dealt with the “what” and some of the “how”; this link deals with the “why bother?”

Why Wearables?

I propose to create a wearable gaming controller that sends control input based on the motion and orientation of the arm (for me, the left one, as I am right-handed). The game used to demonstrate this will be as simple as possible; I’m picturing a sphere flying/bouncing/rolling through space and obstacles, with the simplest possible lighting and colors so as not to detract from the physical computing aspect. Basically a technical demo. I would like to get this “Limb-troller” (a portmanteau of limb and controller) to communicate wirelessly with the game if possible, maybe via Bluetooth; this is less important for the project than simply getting the controller to recognize the arm’s motion and orientation as usable input. The computer running the game should also be able to receive that input and break it down into usable actions (for instance, an elbow angle could work in conjunction with finger movements to move the player sphere somehow, or perform some action). I’ve mentioned Bluetooth as a possibility since many computers can now accommodate wireless devices. I realize some infrared controllers perform similar functions on gaming consoles, but laptop and desktop computers are my focus for this project.
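
To make the “break it down into usable actions” step concrete, here’s a minimal host-side sketch in Python (not the actual Unity/Processing game code; the packet format, field names, and thresholds are all invented for illustration) of how raw Limb-troller readings arriving over a serial link might be mapped to demo-game actions:

```python
# Hypothetical: the Limb-troller sends text lines like "elbow:30,wrist:12,fingers:5"
# where "fingers" is a decimal bitmask of thumb-to-finger contacts.

def parse_packet(line):
    """Split 'key:value,key:value' into a dict of ints."""
    fields = dict(pair.split(":") for pair in line.strip().split(","))
    return {k: int(v) for k, v in fields.items()}

def to_actions(state):
    """Map raw sensor values to demo-game actions (made-up thresholds)."""
    actions = []
    if state["elbow"] < 45:        # arm sharply bent -> speed boost
        actions.append("boost")
    if state["fingers"] & 0b001:   # index-finger contact -> jump
        actions.append("jump")
    if state["fingers"] & 0b100:   # ring-finger contact -> roll left
        actions.append("roll_left")
    return actions

packet = parse_packet("elbow:30,wrist:12,fingers:5")
print(to_actions(packet))  # -> ['boost', 'jump', 'roll_left']
```

The point is just that the game side only ever sees a small dictionary of numbers per frame, so the wireless transport (XBee, Bluetooth, whatever) can be swapped out without touching the action-mapping logic.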

So: design a game controller that can be comfortably worn on and operated by the entire left arm, and that will (wirelessly?) send signals to a simple demo game, resulting in actions within the game. I would likely need trimpots, or some other sensor that can be rigged to read the angle changes and angular velocity of arm joints. (I say other than trimpots because I noticed last time that the trimpots we had seemed a bit fragile for quick, repeated movements. Maybe something like a pulley: bending the arm changes the length of a wire/string, which is read to determine an angle and angular velocity. This deserves further investigation.) The fingers would also be used, potentially including the connections between thumb and fingers; these could interact with arm position and orientation to create many different combinations of game input. Future games using this system, or one like it, could be developed. My first choice for the demo game itself would be the free version of Unity3D, which has many systems already in place to facilitate game creation, especially for one as simple as this plan entails. The backup would be Processing’s 3D capabilities, which, while limited, do allow for basic movement. Aspects like collision detection and physics, however, which would make the demonstration more engaging and entertaining, would be better implemented in Unity, which already provides them with allowance for modifications.
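
The pulley/string idea can at least be sketched numerically. Assuming the string is anchored a fixed distance up the upper arm and down the forearm from the elbow (the 10 cm anchor distances below are made up, not measured), the law of cosines recovers the joint angle from the measured string length, and two successive readings give an angular velocity:

```python
import math

def elbow_angle(string_len, a=0.10, b=0.10):
    """Estimate the elbow angle (radians) from the measured string length.

    Assumes the string runs between anchors a meters up the upper arm and
    b meters down the forearm from the joint, so the law of cosines gives
    string_len^2 = a^2 + b^2 - 2*a*b*cos(theta).
    """
    cos_theta = (a**2 + b**2 - string_len**2) / (2 * a * b)
    cos_theta = max(-1.0, min(1.0, cos_theta))  # clamp against sensor noise
    return math.acos(cos_theta)

def angular_velocity(prev_angle, curr_angle, dt):
    """Finite-difference estimate of angular velocity (rad/s)."""
    return (curr_angle - prev_angle) / dt

# Short string = arm folded (angle near 0); string at a+b = arm straight (pi).
bent = elbow_angle(0.05)       # ~0.5 rad, sharply bent
straight = elbow_angle(0.195)  # ~2.7 rad, nearly straight
```

Whether a real string-and-pulley rig behaves this cleanly is exactly the “deserves further investigation” part; noise and stretch in the string would need the clamping (and probably some smoothing) shown above.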

A little note on why I’ve decided to use my left arm and hand even though I am right-handed: during games I control the mouse to aim with my right hand, but my left really does most of the work through the keyboard. This seems to be the default with most games: the left hand (if you’re right-handed) is expected to perform many actions and has many more buttons to choose from than the right. There are times when the amount of input required from my left hand exceeds what I feel is reasonable. This plan of focusing on my non-dominant arm may prove less fruitful than I hope; in that case I can modify the controller to be worn and operated on my right arm. Parameters would need to be reversed or slightly modified, but this seems doable.

Due to the different aspects that must come together for this to work (Limb-troller design and building, basic game/character control, and (wireless?) interaction), I can easily see this qualifying as a two-part project that fills the entire semester. I’d rather have one big thing to show that works very well than two smaller things that I wish I’d had more time on.

Introduction and interests

My name is Jack Eggebrecht and I am a 3rd/Nth year student in Texas A&M’s Visualization Lab. My thesis work is geared toward character (role) recognition in games as it pertains to level of detail and character design (that’s the best way I can describe it). This will be my second semester taking this course, since I thoroughly enjoyed it the first time around. As per the course setup, I will be developing 2 main projects (first and second half of the semester).

My main interests lie in video game development, specifically environment/character creation and user interaction. It is the user interaction that I’m most looking forward to expanding in this class. I’m still brainstorming, but I’m pretty sure I want to use my 2 projects as two halves of a single large project, so I can get the most polished and interesting result possible rather than two smaller results. I’d like to work toward some sort of wearable physical computing device in this class, and to that end I’ve found 5 links that have already provided (and should continue to provide) some interesting and helpful information.


Robotic Dress

Interesting how the physical computing aspect can be integrated so visibly into the clothing; the motion of the arms really helps. Maybe this could be combined with lights or sound to really set off the aesthetic. I’m not sure how it’s controlled: whether some sensor is picking something up, or it’s run manually by remote, or it’s even something the model is doing that’s not apparent. My guess is remote control by a human operator, since the video doesn’t say, but I could be wrong.


Lights, lights, lights!

Interesting how these clothes are designed so that LEDs integrated into the material display multicolored patterns, apparently driven by social media feeds like Twitter. I’m more into something involving motion, but the moving patterns here are still enjoyable to look at.


Soft Sensors

This is a good reference site for what types of cloth and thread work well for wearable/physical computing, along with some neat-sounding tips and tricks on which techniques work best for this pursuit and some good places to obtain these materials.


Soundly Made Sleeves

This is a practical example of a type of clothing that could be made using the materials and techniques from the previous link. This shirt has sleeves that produce sound based on how the fabric bends and moves as the wearer flexes and crosses their arms. With this link I now have a nice example each of motion, light, and sound. Finding an innovative way to combine these three aspects in my project is one of my long-term goals in this class.


Got Mail

This project looks fairly straightforward in its purpose: LEDs light up based on the amount of email the wearer has received. Aside from its cute, gimmicky nature, I note it for its use of a wirelessly connected Arduino LilyPad, a basic setup that seems very advantageous for wearable computing. I know the LilyPad isn’t as powerful as other Arduino boards, so it seems like a balance would have to be struck. I’m sure I’ll find out more about this balance as I progress in my project(s).


There we go. I look forward to an enriching semester, and to showcasing an awesome result from this class at VizaGoGo!

Now let’s get physical. Again.