Urban modelling is arguably an area that has yet to benefit from 3D interaction, and understandably so: the work involved is often complex and large-scale, and doesn't lend itself easily to HCI experimentation. However, Autodesk Labs plans to change all that and has recently unveiled videos of a research prototype of a novel urban design application.
This prototype runs on a Multi-Touch Wall, a multi-touch input device produced by Perceptive Pixel and invented by researcher and TED conference luminary Jeff Han. Single-touch devices, such as tablet PCs or shopping-mall kiosks, have been around for years. What distinguishes a multi-touch device from its predecessors is that it recognizes more than one input simultaneously: instead of responding only to the tip of one finger, the device recognizes gestures that the user makes with one or more fingers. Recognizing these gestures gives computer applications new opportunities for processing user input.
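To give a rough sense of what "recognizing more than one input simultaneously" means in code, here is a minimal sketch that tells a one-finger pan apart from a two-finger pinch using the standard browser Touch Events API. Perceptive Pixel's actual SDK is not public, so the API choice, the "city-view" element id, and the gesture logic here are all illustrative assumptions rather than anything from the prototype itself.

    // Minimal sketch: one-finger pan vs. two-finger pinch, using the
    // standard browser Touch Events API. Purely illustrative -- this is
    // not Perceptive Pixel's actual (proprietary) SDK.
    const canvas = document.getElementById("city-view") as HTMLElement;

    // Distance between the first two active touch points.
    function pinchDistance(touches: TouchList): number {
      const dx = touches[0].clientX - touches[1].clientX;
      const dy = touches[0].clientY - touches[1].clientY;
      return Math.hypot(dx, dy);
    }

    let lastDistance: number | null = null;

    canvas.addEventListener(
      "touchmove",
      (event: TouchEvent) => {
        event.preventDefault(); // suppress the browser's own scroll/zoom

        if (event.touches.length === 1) {
          // One finger down: interpret as a pan gesture.
          const t = event.touches[0];
          console.log(`pan to (${t.clientX}, ${t.clientY})`);
          lastDistance = null;
        } else if (event.touches.length >= 2) {
          // Two or more fingers: interpret as pinch-to-zoom, scaling by
          // the ratio of the current finger spread to the previous one.
          const distance = pinchDistance(event.touches);
          if (lastDistance !== null) {
            const scale = distance / lastDistance;
            console.log(`zoom by factor ${scale.toFixed(3)}`);
          }
          lastDistance = distance;
        }
      },
      { passive: false } // required so preventDefault() takes effect
    );

    canvas.addEventListener("touchend", () => {
      lastDistance = null; // gesture over; reset pinch tracking
    });

The key point is simply that the event carries a list of touches rather than a single cursor position, so an application can branch on how many fingers are down and track their relative motion over time.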
Autodesk Labs believes multi-touch human-computer interfaces may dramatically change how products, infrastructure, and buildings are designed. In the video above you can see someone grab a section of a city and mock up a building on the site. It looks really exciting (despite the lack of audio commentary on the video) and could well open up new avenues in urban modelling research once the project is complete and made commercially available.