Summary of Summer Projects
I learned about the video capabilities of nighthawk and how to configure its video output to different channels, which can then be received by various devices such as a monitor, the Immersadesk, and the Video Wall.
In order for the Immersadesk to use its stereo video output capabilities, nighthawk must be configured to 640x800 resolution and stereo mode. Once this was done, we could run the Virtual Reality demos!
The cavevars program is the diagnostic used to make sure all the components of CAVElib are working properly. Mainly, it outputs the locations and orientations of the sensors (the goggles and the wand) on the screen, as determined by the tracker software running on the Immersadesk and then passed off to the tracker daemon running on nighthawk.
CAVElib is the library that's used to take advantage of the stereo and tracking capabilities of the Immersadesk. It can be integrated with OpenGL and is versatile and easy to use. Three demos came with CAVElib: Crayoland, 5D, and Quake. Crayoland and 5D were distributed with source code; Quake was distributed with only some Quake configuration files.
My first task was to write a 'hello world!' program for the Immersadesk. This was not difficult, since sample programs were available both from the CAVElib distribution and the web. Therefore, instead of going for a 'hello world!' program, I decided to convert the code for my group project in CS4451 to use CAVElib instead. There were a few issues with doing this, and I had to figure them out the hard way most of the time. They are pretty OpenGL-specific, so I won't go into them here. But finally, I got some fractal mountains to show up in stereo video, and you could navigate through the mountains using the wand. The tracking and adjusting of the viewpoint was done entirely automatically by CAVElib, so moving your head around would already have the effect of changing your perspective.
Next, I decided the interactive menus used in the 5D demo were pretty cool and wanted to use them in my program. I found out that the menus were not part of CAVElib but were merely part of the 5D demo (it even included bitmaps for ASCII characters in the code, which was pretty cool). CAVElib does not offer such high-level functionality; it only provides the ability to query the states of the sensors and automatic stereo video rendering relative to the sensors. I decided to steal the menu code right out of the 5D demo. I had a bit more trouble than I expected doing this, again due to the differences between using CAVElib and the GLUT library. I eventually got to the bottom of it and had the menus functioning in my program.
I intend to write a HOWTO based on what I know about the Immersadesk and the nighthawk video setup.
Stampede Laser Tag Project
The vision was to use sensors throughout the Aware Home and channel the data to some central processing unit, which could then facilitate a game in real time. To do this, we – Nathen Bell, Shyam Jayaraman, Ross Hanahan, and I – learned the basics of the Stampede library and the Distributed Stampede library, did some brainstorming about the nature of the game, and then fiddled with an existing client-server program that channels video data, captured by USB cameras connected to Skiff boards, through the network.
The ideas for games we came up with were not very precise, but we decided that once the infrastructure was in place, it would be easy to implement many interesting games. As Ross said, "It's like, once you have a deck of cards, you can have however many games you want." Our games would have the following characteristics:
1) There would be cameras and maybe other sensors located throughout the playing area.
2) The playing area would consist of a number of rooms and hallways.
3) There would be some sensitive points in the playing area, each consisting of one or more sensors that can tell whether a particular player is at that point.
4) The identification of a player could be the color of his/her shirt.
5) Each player may carry an Ipaq, which could be fed video data or game information inferred by the game server from the various cameras in the playing field.
6) As an alternative or complement to the Ipaqs, computer terminals could be placed at various access points in the playing field to obtain video and other information about the game in progress.
As the 'hello world' program in Stampede, I modified the simple example given by Sameer to write a file from one "address space" to another.
We were given some USB camera code. Our next task was to automate the initialization of the clients remotely from the server, in much the same way as Sameer's Distributed Stampede example. This was done using the rsh unix program. The server would simply start the client process on the client machine using rsh, and the client would in turn connect to the server thread that's waiting. The remote commands to be executed were left to some configuration files for flexibility. However, many parts of the automatic startup process were still unclean and should be improved. We had some trouble getting the whole system working, mainly due to the lack of documentation. But eventually we were able to display the captured images on an x86 Linux box.
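The startup sequence described above can be sketched roughly as follows. This is an illustration only, not the actual Stampede code: the config-file format, hostnames, and function names are all my own assumptions.

```python
# Sketch of the rsh-based remote client startup: the server reads a
# configuration file listing, for each client host, the command to run,
# then launches each client over rsh. (Hypothetical config format.)
import shlex
import subprocess


def load_client_config(path):
    """Parse lines of the form 'host: command args...' into (host, argv) pairs.
    Blank lines and '#' comments are skipped."""
    entries = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            host, command = line.split(":", 1)
            entries.append((host.strip(), shlex.split(command)))
    return entries


def rsh_command(host, argv):
    """Build the rsh invocation that starts one client on a remote machine."""
    return ["rsh", host] + argv


def start_clients(entries):
    """Launch every configured client; each client process, once started,
    connects back to the waiting server thread."""
    return [subprocess.Popen(rsh_command(host, argv)) for host, argv in entries]
```

Keeping the remote commands in a plain config file, as the original setup did, means clients can be added or moved to other machines without touching the server code.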
Our next goal was to write a mindless image processing function to detect the presence of a certain color. This ability would be used at the sensitive points to determine the presence of a player and his identity. We did this by simply sampling the pixels of the video image, comparing each pixel's color to the target color, averaging this over all the sampled pixels, and comparing the result to a reference value. The reference value is the value calculated from a stabilized frame, i.e. a frame without any players in it. If the difference is above a certain threshold, we conclude that the player corresponding to the target color has entered the view of the camera.
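The detection procedure above can be sketched like this. It is a minimal illustration using plain lists of RGB tuples as frames; the real code worked on raw USB camera buffers, and the function names, sampling stride, and threshold value here are assumptions.

```python
# Sketch of the color-presence test: sample every stride-th pixel,
# score its similarity to the target color, average the scores, and
# compare against the score of a stabilized (player-free) frame.

def color_score(frame, target, stride=4):
    """Average similarity of sampled pixels to the target RGB color,
    in [0, 1], where 1.0 means every sampled pixel matches exactly."""
    total = 0.0
    count = 0
    for row in frame[::stride]:
        for (r, g, b) in row[::stride]:
            # Manhattan distance to the target color, normalized so an
            # exact match scores 1.0 and the farthest color scores 0.0.
            dist = abs(r - target[0]) + abs(g - target[1]) + abs(b - target[2])
            total += 1.0 - dist / (3 * 255)
            count += 1
    return total / count


def player_present(frame, reference_score, target, threshold=0.05):
    """True if the frame's score exceeds the stabilized reference frame's
    score by more than the threshold, i.e. the target color has appeared."""
    return color_score(frame, target) - reference_score > threshold
```

The reference score is computed once from an empty frame, so the test measures the change the player's shirt color introduces rather than the absolute amount of that color in the scene.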
Much work needs to be done to make the game a reality. Right now, a big roadblock lies in getting permission from the Aware Home to use their facilities as the playing field.
Access Grid
The Access Grid is a global multimedia multicasting application using Internet2. Eric Brown and I learned about its operation and wrote up a HOWTO for it on the swiki. Our goals in this area were to port the Access Grid to the Ipaq and the Mac OS X platform, among other things.
I have not done anything useful in this area since then, but Eric has tested the JVM on the Ipaq and has ported the RAT utility to the Ipaq as well.