25C3: STATE OF THE ART WEARABLE COMPUTING

[Kai Kunze] from the Embedded Systems Lab at Passau came to 25C3 to talk about Cyborgs and Gargoyles: State of the Art in Wearable Computing. There have been a lot of homebrew wearable computing solutions, but [Kai] specifically covered projects that could see everyday use in the real world.

The first was a prototype system they built for use in hospitals. The doctor wore a belt-buckle-sized Linux computer under his coat, attached to an RFID reader on his wrist. He would read the patient’s RFID wristband, which would bring up their chart on the screen. He could then scroll and select using a capacitive sensor built into the coat. Notes could be taken using a Bluetooth headset. The system kept the doctor’s hands completely free for examining the patient while still offering as much information as possible. They actually ran this system for 30 days in a hospital.
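To make the workflow concrete, here’s a rough Python sketch of that event loop. Everything in it is a hypothetical stand-in (the read_wristband and read_scroll_gesture hooks, the toy chart table), not the Passau team’s actual interfaces:

```python
# Rough sketch of the hands-free chart lookup described above.
# The device hooks and the chart table are invented stand-ins.
import random

CHARTS = {  # toy records store standing in for the hospital backend
    "patient-042": ["admitted 2008-12-20", "BP 120/80", "temp 37.1 C"],
}

def read_wristband():
    """Poll the wrist-mounted RFID reader; returns a patient ID or None."""
    return random.choice([None, "patient-042"])

def read_scroll_gesture():
    """Poll the capacitive strip in the coat; -1/0/+1 scrolls the chart."""
    return random.choice([-1, 0, 1])

chart, line = [], 0
for _ in range(20):                      # stand-in for the main event loop
    patient = read_wristband()
    if patient is not None:              # new wristband scanned:
        chart, line = CHARTS.get(patient, []), 0   # load that chart
    if chart:
        line = max(0, min(len(chart) - 1, line + read_scroll_gesture()))
        print(chart[line])               # would be drawn on the display
```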

The next example was a joint project with the carmaker Skoda. Quality assurance (QA) testing can be a long process, with many more steps than there are assembly operations. The team attached sensors to the worker to identify where the worker was in relation to the car and to get direct measurements of the object being tested. The wearable technology meant they got far more data than they typically would with standard QA testing, and they could immediately prompt the worker if a step was missed.
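Conceptually, the missed-step prompting boils down to checking each recognized action against a checklist. A minimal sketch, assuming an activity-recognition front end has already labeled what the worker just did (the step names and ordering below are invented, not Skoda’s process):

```python
# Invented QA checklist; in the real system steps come from the test plan.
QA_STEPS = ["check hood gap", "test left headlight", "test right headlight",
            "inspect windshield seal", "check trunk latch"]

def next_expected(completed):
    """Return the first checklist step not yet completed, or None."""
    for step in QA_STEPS:
        if step not in completed:
            return step
    return None

def on_step_detected(step, completed):
    """Called each time the wearable recognizes a step from sensor data."""
    expected = next_expected(completed)
    if expected is not None and step != expected:
        print(f"PROMPT: you skipped '{expected}'")  # nudge the worker now
    completed.add(step)

done = set()
for observed in ["check hood gap", "test right headlight", "check trunk latch"]:
    on_step_detected(observed, done)   # prompts about the skipped headlight
```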

[Kai] highlighted a couple of projects that would make developing your own system much quicker. The Context Recognition Network Toolbox helps you identify what actions are being performed. They’ve used it to build systems like an automated kung-fu trainer that can recognize poses. There’s also a context logger app for the iPhone that can be trained using accelerometer data to recognize different activities. He also showed a program developed with Zeiss for visually prompting workers as they carried out tasks; in testing, it was 50% faster than text instructions and 30% faster than voice.
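The training idea behind the context logger is simple enough to sketch: pull a couple of features out of each accelerometer window, average them per activity, then classify new windows by the nearest centroid. The features, numbers, and labels below are made up for illustration; the real toolchain is far more capable:

```python
# Toy accelerometer activity recognizer: nearest-centroid on two features.
import math

def features(window):
    """Mean magnitude and variance of one window of (x, y, z) samples."""
    mags = [math.sqrt(x*x + y*y + z*z) for x, y, z in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return (mean, var)

def train(labeled_windows):
    """Average feature vectors per label -> one centroid per activity."""
    sums = {}
    for label, window in labeled_windows:
        f = features(window)
        acc = sums.setdefault(label, [0.0, 0.0, 0])
        acc[0] += f[0]; acc[1] += f[1]; acc[2] += 1
    return {lbl: (s[0] / s[2], s[1] / s[2]) for lbl, s in sums.items()}

def classify(model, window):
    """Pick the activity whose centroid is closest in feature space."""
    f = features(window)
    return min(model, key=lambda lbl: (f[0] - model[lbl][0]) ** 2
                                      + (f[1] - model[lbl][1]) ** 2)

# Made-up training data: sitting hovers near 1 g with little variance,
# walking bounces around much more.
sitting = [("sitting", [(0.0, 0.0, 1.0)] * 50)]
walking = [("walking", [(0.3, 0.1, 1.0), (-0.4, 0.2, 1.2), (0.5, -0.3, 0.8)] * 17)]
model = train(sitting + walking)
print(classify(model, [(0.1, 0.0, 1.0)] * 50))   # -> sitting
```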

One of the more bizarre/interesting ideas we saw was a phone locator based on resonance (PDF). Developed for a Symbian device, it would play a sound and then record the result, which had been modified by the surroundings. Each surface has its own signature, so you could query the phone and it would report where it was, i.e. on the desk, on the sofa, or in the drawer. The same resonance sampling can also be done using the vibration motor.
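The matching step works roughly like this: take a coarse spectrum of the recorded ping and compare it against signatures captured earlier on each known surface. A sketch with stubbed-out audio and an invented signature table, standing in for the paper’s actual method:

```python
# Toy resonance matcher: compare a recorded ping's spectrum to stored ones.
import numpy as np

SIGNATURES = {  # pre-recorded per-surface spectra (values invented)
    "desk":   np.array([0.9, 0.4, 0.1, 0.05]),
    "sofa":   np.array([0.3, 0.2, 0.15, 0.1]),
    "drawer": np.array([0.6, 0.7, 0.5, 0.3]),
}

def spectrum(recording, bands=4):
    """Coarse magnitude spectrum of the ping, folded into a few bands."""
    mags = np.abs(np.fft.rfft(recording))
    return np.array([chunk.mean() for chunk in np.array_split(mags, bands)])

def locate(recording):
    """Return the surface whose signature best correlates with this ping."""
    s = spectrum(recording)
    s = s / np.linalg.norm(s)
    return max(SIGNATURES, key=lambda k: float(
        np.dot(s, SIGNATURES[k] / np.linalg.norm(SIGNATURES[k]))))

# The phone would play a tone and record the response here; we fake one.
fake_recording = np.sin(np.linspace(0, 40 * np.pi, 1024))
print(locate(fake_recording))
```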

The final point [Kai] touched on was privacy. If you’re wearing a sensor, you’re potentially giving away personal data. He showed an example of how systems could be designed to keep this information in the hands of the users. The first part was a video camera recording the movement of people in a room. It could identify where the faces were, but not who they were. One of the participants had an accelerometer recording their movements. By correlating the two data sets, that user could pick out his own movement through the space, but no one else could see the full picture.
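The correlation trick is easy to demonstrate: only the person holding the private accelerometer log can tell which anonymous camera track is theirs. A sketch with synthetic data (the track format and correlation test are our assumptions, not the demo’s actual pipeline):

```python
# Toy version of the privacy demo: match a private accelerometer log
# against anonymous motion tracks published by the camera.
import numpy as np

rng = np.random.default_rng(0)

# Anonymous per-face motion-energy tracks, as the camera might publish them.
tracks = {f"face-{i}": rng.random(100) for i in range(3)}

# The wearer's private log happens to follow 'face-1', plus sensor noise.
my_accel = tracks["face-1"] + 0.1 * rng.standard_normal(100)

def find_my_track(accel, tracks):
    """Return the camera track most correlated with the private log."""
    scores = {name: float(np.corrcoef(accel, sig)[0, 1])
              for name, sig in tracks.items()}
    return max(scores, key=scores.get), scores

name, scores = find_my_track(my_accel, tracks)
print(name, {k: round(v, 2) for k, v in scores.items()})   # -> face-1
```

The camera’s public data stays anonymous; the identifying correlation only exists for whoever holds the matching sensor log.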
