From Research To Games: Interacting With 3D Space
 
 
by Joseph LaViola Jr. [Design, Programming, Art, Serious]
April 22, 2010
 

[While 3D interfaces are just taking off for consumers, years of research have been poured into the field. Dr. Joe LaViola of the University of Central Florida shares detailed academic findings about building useful and fun ways to interact with games.]

Over the last few years, we have seen motion controller technology take the video game industry by storm. The Sony EyeToy and the Nintendo Wii have shown that game interfaces can go beyond the traditional keyboard and mouse or game controller to create more realistic, immersive, and natural gameplay mechanics.



In fact, in the next few months, with the Sony Move, Microsoft's Natal, and Sixense's TrueMotion, every major console and the PC will have input peripherals that support 3D spatial interaction. This paradigm shift in interaction technology holds great potential for game developers to create game interfaces and strategies never thought possible before. These interfaces bring players closer to the action and afford more immersive user experiences.

Although these devices and this style of interaction seem new, they have actually existed in the academic community for some time. In fact, the fields of virtual reality and 3D user interfaces have been exploring these interaction styles for the last 20 years.

As an academic working in this field since the late '90s, I have seen a significant amount of research into new and innovative 3D spatial interface techniques, and that work is now becoming applicable to a domain that reaches millions of people each year.

As a field, we have been searching for the killer app for years, and I believe we have finally found one with video games.

I believe that the game industry needs to take a good look at what the academic community has been doing in this area. There is a wealth of knowledge that the virtual reality and 3D user interface communities have developed over the years that is directly applicable to game developers today.

Although it is important for game developers to continue to develop innovative interaction techniques to take advantage of the latest motion sensing input devices, it is also important for developers to not have to reinvent the wheel.

Thus, the purpose of this article is to provide a high-level overview of some of the 3D spatial interaction techniques that have been developed in academia over the last 10 to 15 years. It is my hope that game developers will use this article as a starting point to the rich body of work the virtual reality and 3D user interface communities have built.

As part of this article, I am also providing a short reading list with more detailed information about research in 3D spatial interfaces as well as their use in video games.

3D Spatial Interaction

What is 3D spatial interaction anyway? As a starting point, we can say that a 3D user interface (3D spatial interaction) is a UI involving human-computer interaction in which the user's tasks are carried out in a 3D spatial context, using 3D input devices or 2D input devices with direct mappings to 3D. In other words, 3D UIs involve input devices and interaction techniques for effectively controlling highly dynamic 3D computer-generated content, and video games are no exception.

There are essentially four basic 3D interaction tasks found in most complex 3D applications: navigation, selection, manipulation, and system control. There is actually a fifth task, symbolic input -- the ability to enter alphanumeric characters in a 3D environment -- but we will not discuss it here. Obviously, there are other tasks specific to particular application domains, but these basic building blocks can often be combined to let users perform more complex tasks.

Navigation is the most common virtual environment (VE) task, and it consists of two components. Travel is the motor component of navigation, and refers simply to physical movement from place to place. Wayfinding is the cognitive, or decision-making, component of navigation, and it asks the questions "where am I?", "where do I want to go?", "how do I get there?", and so on.

Selection is simply the specification of an object or a set of objects for some purpose.

Manipulation refers to the specification of object properties (most often position and orientation, but also other attributes). Selection and manipulation are often used together, but selection may be a stand-alone task. For example, the user may select an object in order to apply a command such as "delete" to that object.

System control is the task of changing the system state or the mode of interaction. This is usually done with some type of command to the system (either explicit or implicit). Examples in 2D systems include menus and command-line interfaces. It is often the case that a system control technique is composed of the other three tasks (e.g. a menu command involves selection), but it's also useful to consider it separately since special techniques have been developed for it and it is quite common.
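To make the taxonomy a bit more concrete, here is a minimal C++ sketch of how an engine's interaction layer might keep the four basic tasks separate. It is only an illustration -- the class and method names are hypothetical, not taken from any particular engine or toolkit.

struct Vec3   { float x, y, z; };
struct Quat   { float x, y, z, w; };
struct Object { int id; Vec3 position; Quat orientation; };

// Travel: the motor component of navigation -- move the viewpoint itself.
class TravelTechnique {
public:
    virtual ~TravelTechnique() {}
    virtual void moveViewpoint(Vec3 &cameraPosition, const Vec3 &direction,
                               float speed, float deltaTime) = 0;
};

// Selection: specify an object (or set of objects) for some purpose.
class SelectionTechnique {
public:
    virtual ~SelectionTechnique() {}
    virtual Object *pick(const Vec3 &inputPosition, const Quat &inputOrientation) = 0;
};

// Manipulation: change properties of a selected object, most often
// its position and orientation.
class ManipulationTechnique {
public:
    virtual ~ManipulationTechnique() {}
    virtual void apply(Object &target, const Vec3 &positionDelta,
                       const Quat &orientationDelta) = 0;
};

// System control: change the system state or interaction mode, usually
// via an explicit or implicit command (e.g. a menu item).
class SystemControlTechnique {
public:
    virtual ~SystemControlTechnique() {}
    virtual void issueCommand(const char *command) = 0;
};

Keeping each task behind its own interface like this makes it easy to swap one travel or selection technique for another without touching the rest of the interaction code.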

There are two contrasting themes that are common when thinking about 3D spatial interfaces: the real and the magical. The real theme or style tries to bring real-world interaction into the 3D environment. Thus, the goal is to mimic physical-world interactions in the virtual world.

Examples include direct manipulation interfaces, such as swinging a golf club or baseball bat or using the hand to pick up virtual objects. The magical theme or style goes beyond the real world into the realm of fantasy and science fiction. Magical techniques are limited only by the imagination; examples include spell casting, flying, and moving virtual objects with levitation.

Two technical approaches used in the creation of both real and magical 3D spatial interaction techniques are referred to as isomorphism and non-isomorphism. Isomorphism refers to a one-to-one mapping between the motion controller's movement and the movement of the corresponding object in the virtual world.

For example, if the motion controller moves 1.5 feet along the x axis, a virtual object moves the same distance in the virtual world. On the other hand, non-isomorphism refers to the ability to scale the input so that the control-to-display ratio is not equal to one. For example, if the motion controller is rotated 30 degrees about the y axis, the virtual object may rotate 60 degrees about the y axis.

Non-isomorphism is a very powerful approach to 3D spatial interaction because it lends itself to magical interfaces and can potentially give the user more control in the virtual world.
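As a concrete sketch of the two approaches (the types, units, and numbers below are illustrative only, reusing the 1.5-foot and 30-degree figures above), the following C++ snippet applies a motion controller's frame-to-frame delta to a virtual object through a control-to-display (C/D) ratio. A ratio of 1 gives an isomorphic, one-to-one mapping; any other value gives a non-isomorphic, scaled mapping.

#include <cstdio>

// What the motion controller reports each frame (illustrative units).
struct ControllerDelta {
    float dx, dy, dz;   // translation since the last frame, in feet
    float yawDegrees;   // rotation about the y axis since the last frame
};

struct VirtualObject {
    float x, y, z;      // position in the virtual world, in feet
    float yawDegrees;   // orientation about the y axis
};

// Apply a controller delta to an object, scaled by a control-to-display ratio.
// cdRatio == 1.0f gives an isomorphic (one-to-one) mapping;
// any other value gives a non-isomorphic (scaled) mapping.
void applyMapping(VirtualObject &obj, const ControllerDelta &d, float cdRatio)
{
    obj.x          += d.dx * cdRatio;
    obj.y          += d.dy * cdRatio;
    obj.z          += d.dz * cdRatio;
    obj.yawDegrees += d.yawDegrees * cdRatio;
}

int main()
{
    VirtualObject iso   = {0, 0, 0, 0};
    VirtualObject magic = {0, 0, 0, 0};
    ControllerDelta d   = {1.5f, 0.0f, 0.0f, 30.0f}; // 1.5 ft along x, 30 degrees about y

    applyMapping(iso,   d, 1.0f); // isomorphic: moves 1.5 ft, rotates 30 degrees
    applyMapping(magic, d, 2.0f); // non-isomorphic: moves 3.0 ft, rotates 60 degrees

    std::printf("isomorphic:     x = %.1f ft, yaw = %.0f degrees\n", iso.x, iso.yawDegrees);
    std::printf("non-isomorphic: x = %.1f ft, yaw = %.0f degrees\n", magic.x, magic.yawDegrees);
    return 0;
}

In practice the C/D ratio does not have to be a constant; making it a function of controller speed or distance, for example, is one way to build the kind of magical, scaled interactions described above.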

 
 
Comments

Dustin Chertoff
I feel like I took a class on this a couple years ago. =) (I'm a former UCF student, graduated there last year, and Joe was on my dissertation committee - k, disclosure complete.)



Seriously though, this is good stuff that game developers interested in 3DUI should be aware of. Great article and it puts everything in a nice, centralized location. And it serves as a great refresher for those already familiar with the concepts.

Simon T
@ Tim



Research informs creation.

Isaiah Williams
Research creates information.

John Mawhorter
This article is highlighting for me the fact that 3D UIs and Virtual Worlds are difficult to use and that the mouse and keyboard are by far the superior input device for most tasks. Seriously, controlling my movement by turning my head? This is uncomfortable on a basic level. Also the magical versus realist distinction and your constant use of "natural" and "immersion" are pretty silly.

John Mawhorter
Not that this isn't a useful starting point for thinking about using these devices in games, but there are many of these techniques that don't work in a time-intensive situation (i.e. most real-time video games) or when moving. And there's also the problem that head-tracking is needed for some of these, which most of the controllers won't provide. And the real problem is that the virtual world research mostly seems to be based on VR environments that are expensive and complicated, while also being used for specific tasks that aren't really very game-like (military training simulators excepted). If there were a real academic VR-game research community, it would be great.

Dustin Chertoff
@Tim Carter



Research does not guarantee that the results of the research are immediately applicable towards creating commercial products. In many cases, research exists solely for the sake of figuring out the truth of the very small part of the world the researcher is interested in. But at no point is research the antithesis of creation.



Creation cannot exist in a vacuum. Creation must be informed through observations of the world. How do you know what problem needs a solution to be created? How do you know how to build the solution? How do you test that your solution works? This is all research. Creation is the process of developing an informed response based upon the questions asked and answered through the research process. Development cannot exist without research to inform what to develop, and research cannot exist without development defining the problems that need to be researched.



And while punk rock pioneers could not play their instruments with the same technical prowess as their contemporaries, they had performed plenty of research regarding the type of music out there. They felt that the music did not let them express themselves the way they wished to express themselves (the problem). As a result, they created a new form of musical expression.



@John Mawhorter



Yeah, many of the techniques right now are very cumbersome for VR, let alone for gaming. Even the best VR equipment would choke trying to provide the same quality of experience you can get with a AAA PC game. But the technology is getting there, slowly... One of the issues, though, is the mindset that great games have to fit the "sit in one spot for 3+ hours" paradigm. A new genre of games based around 10-15 minute immersive experiences can emerge (where immersion refers to both physical and psychological immersion). Just like major game developers balked at the power of social gaming, only to realize now that it is a multi-billion dollar industry, the same can be said of the immersive casual game.



The Wii showed that people will buy the tech (if not third-party games). It was enough to make MS and Sony play catch-up with Natal/Move. This tech is not currently suited for AAA FPS games, but it is great for other genres. It's unwise (from a business perspective) to ignore this nascent market segment because it can't be applied to the current style of AAA game. Let the tech be incorporated and refined in the new genres, so that the mature version can be added to traditional blockbuster-style games.

Ruthaniel van-den-Naar
For me, a nice summary and something between science and design. I hate overly scientific pieces; this is an ideal combination.

