Summer 2012: Thesis Update - Week 9: User Interface
Success! I have made significant headway in prototyping the user interface for the iPad virtual camera system. In the couple of weeks since SIGGRAPH, I have locked down a combination of features from the Unity Mobile and FingerGestures frameworks (as well as some of my own custom code hackery) to create a workable user interface for a virtual camera. I have successfully built a standalone app to an iPad (seen above) and implemented the following UI features:
Multi-Finger Gesture Input and Visual Feedback
The app now displays a visual cursor to mark when it recognizes a touch input from the user. This can later be changed to a joystick, slider, etc., but for now it serves as visual confirmation that the app is reading input not only on the touch pads, but anywhere on the screen.
GUI-Button Help System
As an exercise, I have enabled a very simple GUI button-based help system which can later be extended to a more robust menu system. The GUI buttons can be used to trigger modifications to object properties, such as changing the focal length of the camera or toggling lights on and off.
Touch-Drag Object Manipulation
Any object instantiated with the touch-drag script can now be moved around the 3D world. The drag motion happens relative to the camera plane, so that manipulating a 3D object feels intuitive despite the 2D nature of the touch surface. A drag-trail line appears as the object is moved, thanks to FingerGestures, to denote the object's start and end points (also visible on the minimap located in the corner).
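The camera-plane-relative drag boils down to simple vector math: the 2D screen-space drag delta is projected onto the camera's right and up vectors rather than the world axes. Here is a minimal sketch of that idea in plain Python (not Unity or FingerGestures code; the function name, parameters, and pixels-to-units scale are all illustrative assumptions):

```python
def drag_object(obj_pos, drag_delta_px, cam_right, cam_up, units_per_px=0.01):
    """Move an object parallel to the camera plane.

    A 2D screen-space drag (dx, dy) in pixels is mapped onto the
    camera's right and up vectors, so the object slides across the
    plane facing the camera instead of along fixed world axes.
    (units_per_px is an assumed pixels-to-world-units scale factor.)
    """
    dx, dy = drag_delta_px
    return tuple(p + dx * units_per_px * r + dy * units_per_px * u
                 for p, r, u in zip(obj_pos, cam_right, cam_up))

# Camera looking down -Z: its right vector is +X, its up vector is +Y.
# A 100 px rightward, 50 px downward drag slides the object in the XY plane.
new_pos = drag_object((0.0, 0.0, 5.0), (100, -50), (1, 0, 0), (0, 1, 0))
```

Because the basis vectors come from the camera, the same finger motion produces a sensible result no matter how the camera is oriented in the scene.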
Pinch-to-Zoom Camera Control
By modifying the zoom scripts within the FingerGestures framework, I was able to enable a two-finger, iOS-style pinch-to-zoom gesture which modifies the MainCamera's field of view. I can map this gesture to any camera or object function; in the future, I may be able to use the pinch gesture to widen or narrow the penumbra angle of a spot light, for example.
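One straightforward way to map a pinch to field of view is to scale the FOV by the ratio of the starting and current finger separation, clamped to a sane range. This is a hedged sketch in plain Python of that mapping, not the actual FingerGestures zoom script; the function name, parameters, and clamp limits are my own illustrative choices:

```python
def pinch_to_fov(current_fov, start_dist, current_dist,
                 min_fov=10.0, max_fov=100.0):
    """Map a two-finger pinch to a camera field-of-view change.

    Spreading the fingers apart (current_dist > start_dist) shrinks
    the FOV (zooms in); pinching them together widens it (zooms out).
    The result is clamped to [min_fov, max_fov] degrees.
    """
    if current_dist <= 0:
        return current_fov  # degenerate gesture; leave FOV untouched
    scale = start_dist / current_dist
    return max(min_fov, min(max_fov, current_fov * scale))

# Fingers spread to twice their starting separation: FOV halves (zoom in).
zoomed_in = pinch_to_fov(60.0, start_dist=100.0, current_dist=200.0)  # 30.0
```

The same scale factor could just as easily be applied to a spot light's penumbra angle instead of the camera FOV, which is the kind of remapping described above.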
Minimap and Camera Position Warp
I overlaid a secondary top-down orthographic camera on the corner of the iPad display to give you an idea of where your camera is located in the 3D world. This should be even more useful in larger 3D worlds where your camera may be locked off to one area. To move about the 3D world, you can use the touch pads to manually move the camera to the desired location, OR you can simply tap a location directly on the minimap to instantly warp the camera to that position. I utilized a custom Move-to-Position script to translate the pixel location you tapped on the iPad display into the corresponding 3D coordinates of the desired destination.
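Since the minimap is rendered by an orthographic camera, the pixel-to-world translation is a linear remap: normalize the tap position within the minimap rectangle, then scale by the region the orthographic camera covers. The sketch below illustrates that remap in plain Python; it is not the actual Move-to-Position script, and every name, parameter, and the assumed screen-up-to-world-Z orientation is an illustrative assumption:

```python
def minimap_tap_to_world(tap_px, map_rect, cam_center, ortho_half_size, cam_y):
    """Convert a tap inside the minimap rectangle to a 3D world position.

    map_rect = (x, y, width, height) of the minimap overlay in screen
    pixels. The top-down orthographic camera centered at cam_center sees
    a square region extending ortho_half_size world units in each
    direction, so normalized tap coordinates map linearly onto world X/Z.
    The warped camera keeps its current height, passed in as cam_y.
    """
    tx, ty = tap_px
    rx, ry, rw, rh = map_rect
    # Normalize the tap to [-1, 1] within the minimap, origin at its center.
    u = (tx - rx) / rw * 2.0 - 1.0
    v = (ty - ry) / rh * 2.0 - 1.0
    cx, _, cz = cam_center
    # Assumes minimap-up corresponds to world +Z; flip v if it is reversed.
    return (cx + u * ortho_half_size, cam_y, cz + v * ortho_half_size)

# Tapping the exact center of a 100x100 px minimap warps to the map center.
dest = minimap_tap_to_world((50, 50), (0, 0, 100, 100), (0, 0, 0), 20.0, 5.0)
```

Keeping the camera's current height out of the remap means the warp only repositions you on the ground plane, which matches the top-down view the minimap presents.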
Over the next couple of days, I will be working on building the basic networking infrastructure, which will allow objects to be loaded into the scene remotely from a desktop computer and streamed directly to the iPad. I also plan on looking into how to record camera motion within Unity in such a way that I can export the animated spline to Maya for clean-up and animation. The last step is to piece together the networking and UI base with my existing PS Move-iPad prototype system, allowing the iPad to act as a traditional virtual camera system, but now with touch screen support.