False game vs. True game (pragmatism)

I’ve found some examples of people who have gotten Arduino/Unity serial communication to work…

…on Windows.

On Mac, which is what I’m using (I don’t own a Windows PC), this seems possible, but it requires more programs running in the background, more serial communication, and more potential bugs to work out. Some methods require custom plug-ins for Unity, which means a Unity Pro license (upwards of $1500, so no, not happening). Someone else got a good interface going with Max/MSP. I used that program last semester in my Electronic Composition class, so I do have some prior experience I could bring to bear. But all of this is piling more complexity onto a relatively small aspect of my project. Links to the two most promising-looking examples with actual code are below. Consider these the swan song of this project’s Arduino/Unity aspirations.

Squeezy Cube

Arduino 2 Unity (Windows)

However, Arduino/Processing serial communication works fine on a Mac, and it is already fully supported by both programs through existing libraries that simply have to be enabled at the start of each program. My project, quite simply, is not a game design project and was never intended to be. This is part technical demo, part art installation; serial communication between the controller and the “game” is required, but that “game” being built in Unity is not. Processing can manipulate and display lit 3D objects in space. It can’t accomplish nearly as much as an actual game development IDE, but it can work just fine for my project. My game was never going to be complicated anyway. A circle/sphere moving over sections of the screen that light up in sequence would be fine. Because the real meat of my installation will be “player” interaction with the controller, not the game, this Arduino/Processing option is desirable.

Here is a link to my previous Physical Computing blog (first time through), where I completed an exercise in serial communication: sensor data was sent from Arduino to Processing. That code will be very helpful in tweaking the handshake program to work with my project. Having a working handshake protocol to start from will be a real time saver when I’m designing the technical and aesthetic aspects of the controller.
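Since the handshake keeps coming up, here is a rough sketch of the call-and-response logic from that exercise, written in Python rather than Arduino/Processing code so the flow is easy to follow. The byte values and class names are placeholders of my own, not the actual code from the old blog.

```python
# Hypothetical sketch of the Arduino/Processing "call and response" handshake.
# In the real setup the Arduino side lives in the sketch's loop() and the
# Processing side responds in its serialEvent() callback; names here are
# illustrative only.

HELLO = b"A"      # byte the Arduino would repeat until contact is made
REQUEST = b"R"    # byte the game side sends to ask for a fresh reading

class FakeArduino:
    def __init__(self, read_sensor):
        self.read_sensor = read_sensor
        self.contact = False

    def on_byte(self, incoming):
        """React to one byte from the game side; return the reply, if any."""
        if not self.contact:
            self.contact = True          # first byte ends the hello phase
            return None
        if incoming == REQUEST:
            # one reading per request, so the serial buffer never backs up
            return b"%d\n" % self.read_sensor()
        return None

arduino = FakeArduino(read_sensor=lambda: 512)  # pretend pot at mid-range
arduino.on_byte(b"X")              # any byte establishes contact
print(arduino.on_byte(REQUEST))    # b'512\n'
```

The "one reading per request" rule is the whole point of the handshake: the Arduino never floods the serial line faster than the game can read.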

I intend for the controller itself to be a game in a way. Winning will mean getting the controller to function in the game. The game will be the sequence of events the player (sometimes players) must accomplish to allow the controller to send input into the “game”. Wrong inputs, or correct inputs out of sequence, could cause a loss of progress. For example: mess up too much in step 6, get pushed back to step 3 or 4, depending on the severity. This is the game, and the meat of the project: design the controller and its win condition as a game. A game that really makes the player think or act quickly, more akin to schoolyard games like “Red Rover” or “Red Light, Green Light”. And I want to put enough design time into the controller that it really looks inspired, so it grabs the eye and draws the player in even before anyone plays it. So, what should it look like? Where should the controls be placed in the design? These are the big artistic questions, not what the visual monitor game will look like.
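To make the setback rule concrete, here is a toy sketch of it in Python. The step count and the penalty table are invented for illustration, not a final design.

```python
# Toy model of the "lose progress on mistakes" rule: correct inputs advance
# one step, mistakes knock you back by an amount keyed to their severity.

class ControllerGame:
    SETBACKS = {"minor": 2, "major": 3}   # made-up penalty table

    def __init__(self, total_steps=8):
        self.step = 1
        self.total_steps = total_steps

    def correct_input(self):
        self.step = min(self.step + 1, self.total_steps)

    def mistake(self, severity="minor"):
        # e.g. a major mistake at step 6 pushes you back to step 3
        self.step = max(1, self.step - self.SETBACKS[severity])

    def won(self):
        return self.step == self.total_steps

game = ControllerGame()
for _ in range(5):
    game.correct_input()   # reach step 6
game.mistake("major")
print(game.step)           # 3
```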

This is why I’ve put “game” (the false game) in quotes when referring to the graphical aspect on the monitor. That is not the game, the controller is the game (the true game).

Rethinking Art, or Gameception

I had an insightful chat with our professor about my project implementation. He commented that while the idea of a game controller was sound, and while wearable computing was a potential direction to pursue, he worried that my project was headed too much towards the “mean.” According to him, my previous projects in his classes (I have had 3 previous ones with him) have always been easily recognizable as “me.” Apparently, I have a style, and I wasn’t really trying for one, specifically. 😀

So I brought up an idea I’d been batting around recently, after searching through Sparkfun’s online catalog. Sparkfun has many different sensors and actuators, but it also has a bunch of cool human interaction devices: switches, buttons, displays, that sort of thing. And not just the ones that might pop into your head at first. Sparkfun offers 6 inch tall number displays, of the type you probably see in your microwave. Fingerprint scanners; huge lighted, palm-sized buttons; switches with a flip cover, like you might see on some nuclear launch console. Sparkfun also offers many different types of joystick components, from old-style Atari to current gen.

Some developers are so scared of game piracy (a real problem) that they have designed their games to work incorrectly, if at all, when certain conditions are not met: no internet connection, a code not entered correctly, etc. They have a right to do this to combat people stealing their games, for reasons that are out of the scope of this post, but suffice it to say these precautions can often become onerous for normal, legal players who just want to enjoy the game.

But what about the controllers for these games? As far as I have been able to tell, controllers themselves are not protected by similar measures. You can physically operate the controller, maybe move the cursor on the screen, without actually being able to play the game. But what if you couldn’t?

What if the controller was deliberately designed to challenge the user, to ensure that anyone who can use it has both a great sense of patience and a sense of humor? A controller with, potentially, a fingerprint scanner that allows a switch to be read, that allows a code to be displayed, that must be entered, and so on. Different actions could branch off depending on what the potential player performed. In attempting to eventually access the game, the controller could become the game. Winning the controller would then open up the game, but using the controller is an achievement in itself. I’m not out to torture the player, or induce so much frustration that he or she quits, but to encourage further exploration and play every moment along the way. Like a good game should. I’ll explore this idea further, but as of now, this is my direction after discussing my project idea with my professor.
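One hypothetical way to wire up that branching sequence, sketched in Python: each stage names where a right or wrong action leads. The stage names and transitions below are made up to show the structure, nothing more.

```python
# Branching unlock chain: fingerprint -> armed switch -> code entry -> live.
# Wrong actions branch back instead of just failing, so exploring is safe.

STAGES = {
    "locked":       {"fingerprint": "switch_armed"},
    "switch_armed": {"flip_switch": "code_shown"},
    "code_shown":   {"enter_code": "controller_live",
                     "_wrong": "switch_armed"},   # a botched code only drops you one stage
}

def step(stage, action):
    """Advance the controller one stage based on the player's action."""
    branches = STAGES.get(stage, {})
    # unknown action: take the stage's "_wrong" branch, or fall back to locked
    return branches.get(action, branches.get("_wrong", "locked"))

s = "locked"
for action in ["fingerprint", "flip_switch", "enter_code"]:
    s = step(s, action)
print(s)   # controller_live
```

A table like this would also make it easy to tune how punishing each wrong turn is, one entry at a time.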

Well, it IS possible…

Using XBee to wirelessly connect an Arduino project with a Unity game is possible. I have a video right here.

Eureka! Of a sort.

Now, how is it possible? Next up: look for Unity scripting tutorials (I assume a script is used to interface, and I know where the Unity scripting reference docs are). Also, familiarize myself with how exactly XBee works. I was leaning toward XBee over Bluetooth (both components are available through SparkFun) since I felt it might be more straightforward for my purposes. Now that I know this is indeed possible, and not just my hopes and dreams, it is heartening. I’ll figure it out.

Project proposal, graphical form.

This builds off of my previous posts on what I want my project to be, but in visual terms.

The Kinect, an interaction device.

 

kinect

 

Some robot arm, a wearable device.

robot arm

 

The vast majority of game controllers to date use the hands.

controllers

 

This Limb-troller will not only use the hand, it will be worn on the hand and arm. It will BE the hand and arm.

Addendum: dominant hand chosen

As an addendum, I’ve decided to pursue using my right arm and hand instead of the left. I’m not disregarding my points from before, but after thinking more on the decision, I’ve decided that using my dominant hand gives this potentially more varied usability. Hypothetically, what if I’m wielding something in my dominant hand? Going off the example from earlier, a sword fighting game would play better from the dominant hand. I’ve been doing a little informal testing of each hand for dexterity, and my dominant hand seems slightly better in that respect. So maybe the reason the non-dominant hand traditionally controls everything but the aim and firing in a game is not because it’s supposed to be more dexterous with multiple options, but simply because nothing better is available. Well, that’s what I aim to fix, or at least address, with this Limb-troller, dominant hand side.

Wireless wanderings where I whittle away wonderful w-ideas.

In my search for a good way to wirelessly communicate between the Limb-troller and the game (see previous post), I’ve been researching different means. I very much doubt I’ll use infrared (IR) communication, because of its orientation limitations. I can’t, and don’t want to, guarantee that the Limb-troller will always be facing the receiver. Even if I designed the project that way, it would defeat my purpose of creating a controller that takes full advantage of clothing. A jump, a twirl, anything like that would likely stop the IR signal from reaching the receiver. Even if it didn’t, IR seems fairly iffy in general: sunlight or obstructions could mess up the reception.

“But why would you play a game from behind a wall?” you ask.

“Well, I wouldn’t, that’s silly. But I could imagine a situation where a game could work with more than one player. The other player happens to pass between you and your sword fighting game (I’m incredibly tired at the moment; sword fighting is plain but workable as an example).” IR? No. XBee or Bluetooth? Yes. Which one? Still working on that.

Which brings me to my two links, dealing with:

Bluetooth vs. Xbee

Xbee buying and design guide

I’m still working on wrapping my head around the first link (XBee is probably the way I’ll go due to ease, as long as I can get the computer and the game to recognize commands), but the second seems fairly self-explanatory.

Project proposal: Limb-troller

Early last semester, I planned to work game design into my project for this semester. When I met with Prof. Galanter about this, he encouraged the possibility while reminding me that the project would still have to emphasize the physical/interaction side, and not lose that work to the game design side. I shelved the idea since I wasn’t sure at the time how I could work physical computing into my game design bent.

Then this semester I looked closer at the phenomenon of wearable computing. And I remembered some posts I’d read about how some people at Valve Software, a private video game development company, are experimenting with wearable computing in relation to gaming. Most of the posts I’ve seen draw from the link below, in paraphrases or quotes. The whole article is rather long, but the second half deals more specifically with wearable computing, or rather the author’s decision journey toward investigating wearable computing. The last post dealt with the “what” and some of the “how”; this link deals with the “why bother?”

Why Wearables?

I propose to create a wearable gaming controller that will send control input based on the motion and orientation of the arm (for me, the left one, as I am right handed). The game used to demonstrate this will be as simple as possible; I’m picturing a sphere flying/bouncing/rolling through space and obstacles. The simplest possible lighting and colors would be used, so as not to detract from the physical computing aspect. Basically a technical demo. I would like to get this “Limb-troller” (a portmanteau of “limb” and “controller”) to communicate wirelessly with the game if possible. A Bluetooth type thing, maybe; this is a less important aspect of the project than simply getting the controller to recognize the arm’s motion and orientation as usable input. The computer running the game should also be able to receive the input and break it down into usable actions (for instance, an elbow angle could work in conjunction with finger movements to move the player sphere somehow, or perform some action). I’ve mentioned wireless Bluetooth as a possibility, since many computers can now accommodate wireless devices. I realize some infrared controllers can perform similar functions on gaming consoles, but laptop and desktop computers are my focus for this project.
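As a rough illustration of what “break it down into usable actions” could mean on the computer side, here is a Python sketch. The thresholds and action names are invented; the point is just that elbow angle and finger state combine into one action.

```python
# Hypothetical decoder: one elbow angle plus one finger flag in, one
# demo-game action out. Real input would come over the serial/wireless link.

def decode(elbow_deg, fingers_closed):
    """Map an elbow angle (degrees) and a closed-fingers flag to an action."""
    if fingers_closed:
        return "grab" if elbow_deg < 90 else "throw"
    return "steer_up" if elbow_deg < 90 else "steer_down"

print(decode(45, True))    # grab
print(decode(120, False))  # steer_down
```

With more joints and finger combinations, the same idea scales up into the “many different possible combinations of game input” mentioned above.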

So: design a game controller that can be comfortably worn on and operated by the entire left arm, and that will (wirelessly?) send signals to a simple demo game, resulting in actions within the game. I would likely need trimpots, or some other sensor that can be rigged to read angle changes and angular velocity of arm joints. (I say other than trimpots because I noticed last time that the trimpots we had seemed a bit fragile for quick, repeated movements. Maybe something like a pulley: bending the arm changes the length of a wire/string, which is read to determine an angle and angular velocity. This deserves further investigation.) The fingers would also be used, potentially also the connections between thumb and fingers. This could interact with arm position and orientation to create many different possible combinations of game input. Future games using this system, or one like it, could be developed. My first choice for the demo game itself would be the free version of Unity3D, which has many systems already in place to facilitate game creation, especially one as simple as this plan entails. The backup would be Processing’s 3D capabilities, which, while limited, do allow for basic movement. Aspects like collision detection and physics, however, which would make the demonstration more engaging and entertaining, would be better implemented in Unity, which already provides them and allows for modifications.
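The pulley idea is really just arc length: if the wire wraps a pulley of radius r at the joint, a change in wire length s gives a joint angle θ = s / r (in radians), and angular velocity is Δθ over the time between samples. A quick Python sketch, with a made-up radius and readings:

```python
# Arc-length math behind the pulley sensor idea. The pulley radius and the
# sample values are assumptions for illustration.

import math

R = 0.02  # pulley radius in meters (assumed)

def angle(wire_len_change):
    """Joint angle in degrees from the change in wire length (meters)."""
    return math.degrees(wire_len_change / R)

def angular_velocity(len_t0, len_t1, dt):
    """Degrees per second between two wire-length samples dt seconds apart."""
    return (angle(len_t1) - angle(len_t0)) / dt

print(round(angle(0.01), 1))                       # 28.6 degrees
print(round(angular_velocity(0.0, 0.01, 0.5), 1))  # 57.3 deg/s
```

So a centimeter of wire travel on a 2 cm pulley is a readable ~29° of elbow bend, which suggests the mechanism could give usable resolution without trimpot fragility.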

A little note on why I’ve decided to use my left arm and hand, even though I am right handed: during games I control the mouse to aim with my right hand, but my left really does most of the work through the keyboard. This seems to be the default with most games. The left hand (if right handed) is expected to perform many actions and has many more buttons to choose from than the right. There are times when the amount of input required from my left hand exceeds what I feel is reasonable. This plan of focusing on my non-dominant hand may prove not as fruitful as I hope. In that case I can modify the controller to be worn and operated on my right arm. Parameters would need to be reversed or slightly modified, but this seems doable.

Due to the different aspects that must come together for this to work (Limb-troller design and building, basic game/character control, wireless(?) interaction), I can easily see this qualifying as a two part project that can fill the entire semester. I’d rather have one big thing to show that works very well than two smaller things that I wish I could have had more time on.

Introduction and interests

My name is Jack Eggebrecht and I am a 3rd/Nth year student in Texas A&M’s Visualization Lab. My thesis work is geared toward character (role) recognition in games as it pertains to level of detail and character design (that’s the best way I can describe it). This will be my second time taking this course, since I thoroughly enjoyed it the first time around. As per the course setup, I will be developing 2 main projects (first and second half of the semester).

My main interests lie in video game development, specifically environment/character creation and user interaction. It is the user interaction specifically that I’m looking forward to expanding on in this class. I’m still brainstorming, but I’m pretty sure I want to use my 2 projects as a single large project in 2 parts, so I can get the most polished and interesting result possible, rather than two smaller results. I’d like to work toward some sort of wearable physical computing device in this class, and to that end I’ve found 5 different links that should provide (and already have provided) some interesting and helpful information.

 

Robotic Dress

Interesting how the physical computing aspect can be integrated so visibly into the clothing, plus the motion of the arms really helps. Maybe this could be combined with lights or sound to really set off the aesthetic. I’m not sure how it’s controlled: whether some sensor is picking something up, it’s operated manually by remote, or it’s something the model is doing that’s not apparent. My guess is it’s remote controlled by a human operator, since the video doesn’t say, but I could be wrong.

 

Lights, lights, lights!

Interesting how these clothes are designed so that LEDs integrated into the material display multicolored patterns, apparently based on social media like Twitter. I’m more into something involving some sort of motion, but the moving patterns are still enjoyable to look at here.

 

Soft Sensors

This is a good reference site for what types of cloth and thread work well for wearable/physical computing, as well as some neat sounding tips and tricks on what techniques work best for this pursuit and some good places to obtain these materials.

 

Soundly Made Sleeves

This is a practical example of a type of clothing that could be achieved using the materials and techniques from the previous link. This shirt has sleeves that produce sound based on how the fabric bends and moves as users flex and cross their arms. With this link, I have a nice example from motion, light, and now sound. Finding an innovative way to combine these three aspects into my project is one of my long term goals in this class.

 

Got Mail

This project looks fairly straightforward in its purpose: LEDs light up based on the amount of email the wearer has received. Aside from the cute, gimmicky nature of this, I note it for its use of a wirelessly connected Arduino LilyPad, a basic setup that seems very advantageous for wearable computing. I know the LilyPad isn’t as powerful as other Arduino boards, so it seems like a balance would have to be struck. I’m sure I’ll find out more about this balance as I progress in my project(s).

 

There we go, I look forward to an enriching semester, with me being able to showcase an awesome result from this class in VizaGoGo!

Now let’s get physical. Again.