



Wearable Computing For The Masses

nym | 11:16 AM

As a reader of igargoyle, do you identify with cyborgs? Transhumanists? Hardware enthusiasts? Something else?

Vitorio Miliano responded with a very thoughtful answer that both pushes back on my framing and addresses wearable computing for the masses (a rough sketch of the filtering idea he describes follows his note):

"Busy." "Overworked." "Forgetful." "Human."

I see wearables, and ubiquitous computing in general, as a solution for time management and information storage, and as a way to eliminate the modern workplace's demand for continuous partial attention: we could go back to giving 100% of our attention to the task at hand by letting the computer dictate what that task should be.

Mediated reality, digital autoassociative memories: it seems to me that all of this is being toyed with for the sake of toying. There are no serious efforts to produce something usable by the mass market, nothing that will take all our inputs during the day (email, news feeds, TV, IMs, schedules, appointments, interrupting coworkers, family responsibilities) and filter out everything we don't want to deal with, shouldn't be dealing with, or could better deal with at another time.

There's no Jeff Hawkins for wearable/ubiquitous computing. There's no-one walking around with a block of wood strapped to their back and face, figuring out how a single-mom middle manager with two kids is going to most effectively use a device that can orchestrate her entire day for her, if only she would trust it.

There's no-one taking those use cases and building a multimodal UI that's consistent, efficient, effective, and unobtrusive, because a high-resolution HMD that lets you run Microsoft Word isn't how this sort of technology is going to take off. Input must be passive and hands-free except at a pointed moment in time, such as interrupting a conversation to say "computer" or pulling out a touchpad so you can write in Graffiti or on a BlackBerry-style chiclet keyboard.

Ubiquitous computing needs wearable computing to happen because of the bandwidth problem. The world will never be saturated with multi-megabit wireless bandwidth, and once you come to trust your computer, not having it available because you're between cell towers won't merely be unpleasant: it will be disorienting. Storage and processing capacity will always beat bandwidth in availability. Over time you'll store more information on you, not in the cloud, so you need ways to present your information ubiquitously, from a behind-the-bathroom-mirror screen to the seatback touchscreen on an airplane to the stereo in your car. Only the work done on multimodal wearable UIs will support that.

The PalmPilot wasn't created to replace the desktop, just to replace a pad of paper. Modern handhelds and phones have forgotten that. Wearables still haven't figured it out. Hardware is essentially a solved problem, and has been for years. Physical design and multimodal UI design for mass-market appeal and everyday use aren't. No-one has even started on them, because those who could are already sitting in front of a high-resolution multi-processor desktop ten hours a day.

I sold off my wearable prototyping hardware because messing with it was a distraction from the real work that needs to be done here: the user interfaces. A multimodal UI obviously includes a desktop component, because workstations will never go away, so nothing is stopping me from getting started right now besides my own false preconceptions.

All the pieces to accomplish this are out there, right now, today. They have been for years. Will the next Jeff Hawkins please stand up?

[ via the Wear-Hard mailing list ]
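To make the filtering idea a little more concrete, here's a minimal sketch of the kind of triage such a system would perform on the day's inputs. Everything in it (the event fields, the rules, the three-way now/later/drop decision) is a hypothetical illustration of the concept, not anything Vitorio actually built or specified:

```python
# Hypothetical sketch: every incoming event (email, IM, calendar alert,
# interrupting coworker) is triaged into "now", "later", or "drop" based
# on the wearer's current task and a few simple rules.

from dataclasses import dataclass
from enum import Enum


class Decision(Enum):
    NOW = "present immediately"
    LATER = "queue for the next break"
    DROP = "discard silently"


@dataclass
class Event:
    source: str        # "email", "im", "calendar", "coworker", ...
    sender: str
    urgency: int       # 0 (ignorable) .. 10 (emergency)


def triage(event: Event, current_task: str, trusted_senders: set) -> Decision:
    """Decide whether an event deserves the wearer's attention right now."""
    if event.urgency >= 8:
        return Decision.NOW                   # emergencies always interrupt
    if event.source == "calendar" and event.sender in trusted_senders:
        return Decision.NOW                   # the computer dictating the next task
    if current_task == "meeting" and event.source in {"email", "im"}:
        return Decision.LATER                 # protect attention during focused work
    if event.urgency <= 2:
        return Decision.DROP                  # not worth dealing with at all
    return Decision.LATER


if __name__ == "__main__":
    inbox = [
        Event("im", "coworker", urgency=3),
        Event("calendar", "assistant", urgency=5),
        Event("email", "newsletter", urgency=1),
    ]
    for e in inbox:
        print(e.source, "->", triage(e, current_task="meeting",
                                     trusted_senders={"assistant"}).value)
```

The point of the sketch isn't the rules themselves; it's that the hard part is deciding, per mode and per moment, what deserves the wearer's full attention, which is a UI and trust problem rather than a hardware one.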

Comments

I agree wholeheartedly that the user interface is key, but I disagree that hardware is a solved problem. Displays still suck, after years and years of affordable options being just around the corner. Hell, I would LOVE to be hacking away at the user-interface issues (and thought a great deal about that during my Media Lab time already) but there just isn't anything on the market that I can wear around town without looking like a fool, no matter what software it's running. So why bother... I won't be able to use it anyway.

The next Jeff Hawkins is working on other stuff, because he knows that wearables will still be stuck in the wooden-block stage for the foreseeable future, absent a monumental infusion of VC cash to get over the chicken-and-egg hardware problems.

(Sorry, the above is probably too cynical.)

Posted by: Edward Keyes at April 10, 2006 08:57 PM

I also said physical design, making it look fashionable, was unsolved, and that no-one was working on that, either. :)

But the hardware to power that design and that UI? That's been solved for years, for sure.

I think the reliance on displays is a big mental hurdle, too, hence my multimodal emphasis.

Posted by: Vitorio at April 11, 2006 12:01 AM

Fair enough. I tend to mix physical design into "hardware problems", mainly because I have the tools to solve software problems myself, but not to do decent physical fabrication. A shorthand for "problems other people have to solve first". Heh heh.

I just don't see much of a way around the reliance on displays, though: humans can just take in orders of magnitude more information visually than through any other channel. It doesn't even need to be a GREAT display that can run Office in 1280x1024 24-bit glory... I could do marvellous things with the equivalent of an original 160x160 monochrome PalmPilot display, if it were always in front of my eye.

It's just so frustrating. If Microoptical and Lumus would get together, they could build The Perfect Display eyeglasses. But instead the focus is on trying to get you to watch TV from your cellphone...

Posted by: Edward Keyes at April 11, 2006 08:16 AM

You're absolutely right, we certainly can take in more information visually. The problem is that when something else moves or becomes active in our visual field, e.g. your HMD's news alert or something, it takes precedence, and you walk into a tree or get into a car accident.

This is why so many states have distracted-driving laws: cell phones must be hands-free and have voice dialing, the driver cannot see any of the DVD players in the car, and so on.

Screens work best when you're not moving. If you've stopped moving, then you can probably turn your head to your arm, or pull out a screen from your pocket, or turn your glasses opaque and project onto them.

If you're moving, you should have other options for information transmission and input.

And if you're not moving for an extended period, you're probably at a desk, with a big monitor. So why not use that instead?

Posted by: Vitorio at April 12, 2006 12:26 AM