Saturday, November 29, 2008

Gesture and Touch Based Interface: G-Speak from Oblong

Check out this new gesture- and touch-based interface developed by Oblong Industries. I came across this video on the Interaction Design Blog, and as they rightfully state, this is the closest thing I've seen to the interface from Minority Report. The demo itself is pretty awesome. It showcases several cool features of this system; here is a quick overview of these elements:

Three-dimensional navigation that enables users to access content via an interface that can organize data within a three-dimensional environment. This enables users to "directly manipulate" data objects and quickly access information. This works because users are able to leverage their familiarity with physical space to navigate this virtual world.

This is not to say that users can just rely on their physical-world metaphors to successfully engage with this system. After all, computer-based metaphors always have an aspect of "magic" because the physical actions we carry out have meaning beyond their direct physical impact. Therefore, users will need to learn how to use their hands to pull, turn, push, select and act on objects within this virtual space.

Integration of physical and virtual spaces to enable users to interact seamlessly across multiple computers and screens within a given environment. This is a really cool feature that can enable a user to easily move files between computers and other devices, such as mobile phones. Better yet, users are still able to manipulate the files using the same interface across any of the connected devices. Compare this to Microsoft's Surface, which enables users to drag a file on the screen so that it is added to a mobile device. However, once the file is on the mobile device, the user needs to switch to that device's own interface to handle it.

One additional aspect of this physical-virtual integration is that it allows users to change the physical configuration of the system, which in turn changes the virtual configuration. This is illustrated in the video when the user rotates the "screen table". The data being displayed on the table rotates along with the physical movement of the table, except for the pointer being manipulated by the user. This pointer does not move because the user holds his hand steady to keep the pointer in place. It sounds like a small, simple element, but it adds a lot of possibilities to the interactions that can be supported by this system.
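One way to picture this behavior (my own sketch, not anything from Oblong's actual g-speak SOE) is as two coordinate frames: content is anchored in the table's frame, so it rotates with the table, while the pointer is anchored in the world frame tracked from the user's hand, so it stays put when the table turns. All names below are hypothetical illustration:

```python
import math

def rot(p, theta):
    """Rotate a 2-D point p about the origin by theta radians."""
    x, y = p
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y, s * x + c * y)

def table_to_world(p_table, table_angle):
    """Content lives in the table's frame, so it rotates with the table."""
    return rot(p_table, table_angle)

def world_to_table(p_world, table_angle):
    """The pointer lives in world space; its table coordinates change instead."""
    return rot(p_world, -table_angle)

# A content item pinned to the table at table coordinates (1, 0).
content = (1.0, 0.0)
# The pointer, held steady by the user's hand at world coordinates (0.5, 0.5).
pointer_world = (0.5, 0.5)

# The user rotates the table a quarter turn.
angle = math.pi / 2
content_world = table_to_world(content, angle)        # content moves with the table
pointer_table = world_to_table(pointer_world, angle)  # pointer stays fixed in the world
```

After the quarter turn, the content's world position has swung around with the table, while the pointer's world position is untouched; only its position relative to the table has changed.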

Collaborative interaction is enabled by the system's multi-user support, so several users can work simultaneously with the system. It is not clear whether different users can work on different types of tasks simultaneously within the same environment. One additional question: if multi-task support is provided, what are the requirements, and how would the behavior of the system change to support this type of usage?

The data interaction and visualization opportunities provided by systems of this type will likely enable humans to interact with increasingly complex and large data sets. I would love to see this system in action first hand. Until that is possible, I will have to be content with this video. Hope you enjoy it as well.

[via Interaction Design Blog]