A fantastic 8-bit rendition of Daft Punk’s Random Access Memories album.
In preparation for submitting some of my thesis work to conferences globally, I put together this overview demonstration video showcasing some of the features of the MobileVCS system I have developed.
Some of the core functionality available in the system thus far includes:
Looking forward to continuing to iterate and build upon this previsualization tool over the coming months!
It is no secret that I am a humungous fan of the animated short Paperman, which I first saw during Disney Animation’s screening at SIGGRAPH 2012. In my opinion, it is the perfect blend of 2D and 3D, with a unique style and a touching story. Take 5 minutes out of your day and relive the magic that can only be captured by Disney.
Brave - Filmmaking Process with Director Mark Andrews
An intelligent conversation on how the nature of storytelling differs between movies and games. Worth every minute.
Star Trek’s JJ Abrams and Valve’s Gabe Newell - D.I.C.E. SUMMIT 2013
Another month of progress developing the Mobile Virtual Camera System. With the hardware side of the system built for testing, it was important to focus on what I could achieve on the software end to make this form factor functional. What I hope to achieve, at this stage, is to rebuild the core functionality of freespace motion and record/playback, which I had previously achieved separately. The following includes some of my experiments and observations as I begin to build software functionality into the MobileVCS:
Building software for the system proved challenging, especially as I began converting my projects to the latest Unity 4 update. I started experimenting with how I could leverage some of the core iPad functionality, such as real-time depth of field on mobile, taking screenshots and saving them to the Photos app, and accessing the raw back-facing camera feed to be applied as a video texture in Unity.
I envisioned a feature within the application mimicking the tap-to-focus functionality that is so popular on mobile phone cameras. First I began experimenting in MotionBuilder with real-time depth of field on the traditional Virtual Camera System. Even there, the depth of field would work, but it caused significant performance hits when the camera was in motion, understandably so. I was hoping to have better luck in Unity, with my fingers crossed that it was optimized for mobile. Unity 4 includes a more robust script for calculating depth of field, enhanced using DirectX 11. With it comes a binary visualizer that shows the depth plane from the camera's perspective. Building the visualizer into the iPad app proved easy and it worked flawlessly in real-time, with the ability to dial in the focal plane on the fly. Unfortunately, enabling the full rendered view caused the framerate to drop greatly while the camera was moving in real-time with the Move, much like in MotionBuilder. That said, with the visualizer working on mobile in real-time, it is possible to build the two-color visualizer into the app to establish the focal plane for a camera and output that data to a program such as Maya for final rendering.
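To illustrate the tap-to-focus idea, here is a minimal Unity sketch (the component and field names are illustrative, not the actual Unity 4 depth of field script): raycast from the camera through the touch point and treat the hit distance as the focal plane.

```csharp
using UnityEngine;

// Minimal tap-to-focus sketch (names are hypothetical). On a tap, raycast
// from the camera through the touched point and use the hit distance as the
// focal plane for whatever depth-of-field effect or visualizer is attached.
public class TapToFocus : MonoBehaviour
{
    public Camera shotCamera;          // the Move-driven virtual camera
    public float focalDistance = 5f;   // consumed by the DoF effect/visualizer

    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            Ray ray = shotCamera.ScreenPointToRay(Input.GetTouch(0).position);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
            {
                // Distance from camera to the tapped surface becomes the focal plane.
                focalDistance = hit.distance;
            }
        }
    }
}
```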
Next up, I looked into building some core iPad functionality into the app, such as access to the back-facing camera and taking programmatic screenshots at the push of a GUI button.
The purpose of enabling the back-facing camera was to potentially overlay the 3D world onto the real world in real-time. This would allow you to physically see how far you need to move to achieve a particular camera move. While this is possible using Unity’s WebCamTexture, the camera stream proved not to be optimized for mobile use. I began testing another Unity script that could potentially enable access to the iOS camera as well as recording/playback functionality; this is something I hope to explore further and potentially implement in a future build of the application.
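For reference, a minimal sketch of the WebCamTexture approach mentioned above (using the Unity 4-era renderer shortcut; the mobile performance caveats still apply):

```csharp
using UnityEngine;

// Sketch: pipe the iPad's back-facing camera into Unity as a video texture.
public class BackCameraFeed : MonoBehaviour
{
    void Start()
    {
        // On iOS the rear camera is the non-front-facing device.
        WebCamDevice[] devices = WebCamTexture.devices;
        foreach (WebCamDevice device in devices)
        {
            if (!device.isFrontFacing)
            {
                WebCamTexture feed = new WebCamTexture(device.name);
                renderer.material.mainTexture = feed; // Unity 4-era shortcut property
                feed.Play();
                break;
            }
        }
    }
}
```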
Taking programmatic screenshots was another idea that spawned from discussions with layout artists. Screenshots give layout artists the ability to save out their camera blocking and develop simple storyboards. While you can take a screenshot on the iPad by holding the Power & Home buttons simultaneously, I wished to enable it programmatically so I could map it to a button on the Playstation Navigation controller. Again, saving from the app to the iPad proved challenging, but it is something that can be achieved using Prime31’s Etcetera plugin.
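A bare-bones sketch of the screenshot idea (the button mapping here is an assumption; the Navigation controller's actual bindings come through the Move wrapper):

```csharp
using UnityEngine;

// Sketch: capture a screenshot when a mapped controller button fires.
// JoystickButton0 stands in for whichever Navigation controller button
// gets assigned; the real mapping comes from the Move wrapper scripts.
public class ScreenshotButton : MonoBehaviour
{
    private int shotCount = 0;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.JoystickButton0))
        {
            // Unity 4-era API; on iOS this writes into the app's sandbox.
            // Pushing the image into the Photos app is where a plugin
            // like Etcetera would come in.
            Application.CaptureScreenshot("blocking_" + shotCount + ".png");
            shotCount++;
        }
    }
}
```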
One of the key reasons that gaining access to the hardware functionality of the iPad proved difficult is that it requires some expertise in Xcode, which handles compiling the Unity project in order to build it to iOS devices. Over the course of these experiments, I learned how to go about accessing different core iOS functions such as notifications, in-app purchases, access to other running apps, etc. Apple, by default, locks this functionality down to ensure that apps do not tamper with core functions, preserving a smooth, streamlined user experience. While I did not have success with these experiments, I was able to enable saving video to the Photos app in a later experiment (read ahead).
Record & Playback of Camera Takes
With the success I had with EZReplay Manager in the past, I modified the scripts to work with this latest build of the app. It was not as easy as before: I had to ensure that the Replay Manager could record a dynamically moving camera controlled by the Move Framework, which required a lot of cross-talk between the Move and recording scripts. With this in place, the Manager was able to play back recorded camera motions from the Move-controlled camera.
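Conceptually, the record/playback layer boils down to something like the following sketch (an illustration of the idea, not the actual EZReplay Manager scripts): sample the Move-driven camera's transform every frame, then replay those samples back onto it.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Conceptual sketch of camera take recording. LateUpdate is used so the
// sample is taken after the Move framework has already moved the camera
// that frame; that ordering is the "cross-talk" the scripts need.
public class CameraTakeRecorder : MonoBehaviour
{
    public Transform shotCamera;   // the Move-controlled camera
    public bool recording;
    public bool playing;

    private List<Vector3> positions = new List<Vector3>();
    private List<Quaternion> rotations = new List<Quaternion>();
    private int playhead;

    void LateUpdate()
    {
        if (recording)
        {
            positions.Add(shotCamera.position);
            rotations.Add(shotCamera.rotation);
        }
        else if (playing && playhead < positions.Count)
        {
            shotCamera.position = positions[playhead];
            shotCamera.rotation = rotations[playhead];
            playhead++;
        }
    }
}
```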
Playing back a recording did not seem sufficient - I wanted to be able to save a recording for future review. While I am still figuring out a method to output the camera path to another program for further tweaking, I was able to implement the iVidCap script for recording directly to the iOS camera roll in the iPad Photos app. This enables layout artists to save out and share takes for review with the director. iVidCap, while simple in concept, required some script modification to get it working with the dynamically moving Move-controlled camera. In addition, it required some Xcode modification, the inclusion of additional frameworks, and a custom script to call iVidCap outside of Unity when recording on the actual hardware device.
With all that overcome, both iVidCap and EZReplay Manager work seamlessly for recording and playing back the Playstation Move camera. While recording & playback works as fast as it did in my September build of the app, recording to the iOS camera roll takes about 3-4 minutes for a 30-second camera move because each frame is rendered to the device in real-time using the iPad’s on-board processing. As with the real-time depth of field, this recording function could become faster on the iPad 4 (currently I am using the earlier iPad 3), which has double the processing power of its predecessor.
Another experiment I toyed around with is the ability to see an overlay of my 3D world over the output of the Playstation Eye camera. Thanks to the latest updates to the PSMoveWrapper, I now have access to the real-time stream from the Playstation Eye, the device that tracks the position of my Playstation Move and streams that position information to the iPad in real-time.
With access to the 640x480 camera stream over IP, I was able to set up a simple mini-camera view in the corner of my iPad display, where the output of the Eye camera appears as a video texture beneath my 3D objects, which appear augmented into the space. It is a nifty feature: it creates a simple augmented reality setup where my real-world motions with the Playstation Move-controlled camera can be seen side-by-side from both the bird’s-eye perspective and the first-person perspective. I am still brainstorming practical applications of such a tool within a previsualization setup, but the functionality is there!
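The corner overlay itself is simple once the stream exists as a texture; a minimal sketch (how the texture gets filled from the PSMoveWrapper stream is assumed here):

```csharp
using UnityEngine;

// Sketch of the corner "mini-camera" overlay: draw whatever texture holds
// the 640x480 Playstation Eye stream in a corner of the iPad display.
public class EyeCameraOverlay : MonoBehaviour
{
    public Texture cameraFeed; // filled elsewhere from the Eye stream (assumed)

    void OnGUI()
    {
        if (cameraFeed != null)
        {
            // Quarter-scale 640x480 feed, pinned to the bottom-left corner.
            GUI.DrawTexture(new Rect(10, Screen.height - 130, 160, 120), cameraFeed);
        }
    }
}
```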
That is the progress thus far. Stay tuned in the coming week as I release my conference demonstration video overviewing the core functionality of the system, including true freespace motion on the iPad as well as a practical demonstration of the recording/playback functionality I developed this week. More to come in the coming month :]
Thesis: Virtual Cinematography - January 2012
Building the Prototype Mobile Virtual Camera System
Quite a bit of progress to update on! Over the past couple of weeks, I have been focusing on developing the prototype system as a single, cohesive unit. This required solving the following challenges:
iPad to Move.me Networking
One of the key pillars of this project was leveraging native iOS support in order to exploit key iOS features such as multi-touch gesture support, native keyboard input, and app distribution. After setting up a custom field for IP address entry and configuring the Unity project to be compatible with iOS development, it was a rather painless process to modify the existing network code to work natively on the iPad. With this, I no longer need to stream the data to a laptop as an intermediary; the iPad talks directly to the Move.me server running on the Playstation 3.
This means all the positional and rotational data from the Move controller, as well as the button inputs from the Navigation controller, are sent from the controllers, processed on the Playstation 3, and accessed directly over the network through Move.me by a standalone iPad app.
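As a rough sketch of the connection step (the connect call itself is hypothetical; the real networking goes through the Move wrapper scripts), the IP entry looks something like this:

```csharp
using UnityEngine;

// Sketch: a simple text field where the artist types the PS3's address
// before the app connects to the Move.me server. The receiving
// "ConnectToMoveServer" handler is a hypothetical stand-in for the
// wrapper's actual connect API.
public class MoveServerConnect : MonoBehaviour
{
    private string ipAddress = "192.168.1.2"; // placeholder default

    void OnGUI()
    {
        ipAddress = GUI.TextField(new Rect(10, 10, 200, 40), ipAddress);
        if (GUI.Button(new Rect(220, 10, 120, 40), "Connect"))
        {
            // Hand the address to the Move networking script.
            SendMessage("ConnectToMoveServer", ipAddress);
        }
    }
}
```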
Custom Character Controller
Camera Control Functionality: Pan, Dolly, Field of View, Aspect Ratio
Here comes the real meat of the software development thus far, and something I will continue to build over the coming months. With the Navigation controller script developed, and drawing on my previous research into iOS app development, I began implementing some of the core camera control functionality required in a Mobile Virtual Camera System. Informed by my personal research into performance capture and traditional Virtual Camera System capture techniques, along with time spent with Glenn Winters and Drexel University’s OptiTrack Insight VCS on our Vicon motion capture stage, I established the core camera tools I would need to navigate the 3D world for this initial prototype: the ability to pan/dolly the camera laterally along the base surface, crane the camera vertically, and rotate the world to orient the camera in the right direction.
With my new control script, I was able to pull the necessary values from each of the analog sticks and controller triggers, then parse and normalize those values so they can manipulate the position of the camera in the Unity scene. All camera transformations are stored as an offset to the Main Camera, which allows them to occur simultaneously with the real-time tracking from the Move controller.
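As a simplified sketch of that control scheme (the axis names are assumptions; the real build reads the Navigation controller through the Move wrapper, and a "Crane" axis would need to be defined in the Input Manager):

```csharp
using UnityEngine;

// Sketch of the pan/dolly/crane offsets. The offsets are applied to a rig
// parent rather than the camera itself, so they stack with live tracking.
public class CameraOffsetController : MonoBehaviour
{
    public Transform cameraRig;   // parent of the Main Camera
    public float dollySpeed = 2f;
    public float craneSpeed = 1f;

    void Update()
    {
        // Left stick: lateral pan/dolly along the ground plane.
        float panInput   = Input.GetAxis("Horizontal");
        float dollyInput = Input.GetAxis("Vertical");

        // Trigger (mapped to a custom axis here) cranes the rig vertically.
        float craneInput = Input.GetAxis("Crane");

        Vector3 offset = new Vector3(panInput, 0f, dollyInput) * dollySpeed
                       + Vector3.up * craneInput * craneSpeed;

        cameraRig.Translate(offset * Time.deltaTime, Space.World);
    }
}
```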
As added functionality, I created a dialog box on the iPad screen to allow for adjustment of the camera’s crop aspect ratio, with the aid of the AspectRatioEnforcer script. This script, applied to the main camera, parents a black crop region to the Main Camera, set by default to 1.33 (the iPad’s native 4:3 aspect ratio). This can be modified to any aspect ratio required for the scene being shot, whether Widescreen (1.85), HD (1.77), Classic 35mm (1.5), or Cinemascope (2.35).
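For reference, here is a minimal sketch of the same letterboxing idea implemented with the camera's viewport rect; note this differs from the AspectRatioEnforcer's actual approach of parenting a black crop region to the camera:

```csharp
using UnityEngine;

// Sketch: shrink the camera's viewport rect so the visible frame matches a
// chosen aspect ratio (pillarbox or letterbox as needed).
public class AspectRatioLetterbox : MonoBehaviour
{
    public float targetAspect = 1.33f; // 4:3 default; also 1.5, 1.77, 1.85, 2.35

    void Update()
    {
        float screenAspect = (float)Screen.width / Screen.height;
        Camera cam = GetComponent<Camera>();

        if (screenAspect > targetAspect)
        {
            // Screen is wider than the target: pillarbox horizontally.
            float width = targetAspect / screenAspect;
            cam.rect = new Rect((1f - width) / 2f, 0f, width, 1f);
        }
        else
        {
            // Screen is taller than the target: letterbox vertically.
            float height = screenAspect / targetAspect;
            cam.rect = new Rect(0f, (1f - height) / 2f, 1f, height);
        }
    }
}
```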
Constructing the Mobile VCS
Once I had established the core functionality of the system, I felt the right next step was to build a prototype to demonstrate and test some of the camera tools I have developed within Unity. In order to mimic the familiar design of a Virtual Camera System (screen, 2 analog controllers, and tracking markers), I set out to find the necessary clamp to act as the hub of my system. I found the iKlip microphone stand mount for iPad to be the perfect modular fit to hold the iPad. Instead of a microphone stand, I used .75” copper tubes, L- & T-connectors, duct tape, and plenty of zip ties to pull together a relatively “stable” construction of the system. For a look at the system in all its shallow depth-of-field’d glory, view the pictures at the beginning of this post.
While the system obviously has its limitations in construction (and is still being held together by duct tape and zip ties…), it serves as a solid enough design to continue prototyping and testing the software of my Mobile VCS. Check out the video demonstration, the first of many Thesis Vlogs, overviewing some of the core functionality of the system in action thus far:
In the coming weeks, I will be working to build a simple demo scene in which I can begin testing some virtual camera principles in action. In addition, I will work to further refine the construction of my system as well as continue to add previsualization functionality to the software. All in a day’s work, until next time :]
Thesis: Virtual Cinematography - Fall 2012 Recap
An overview video highlighting some of the progress I have made thus far with my virtual cinematography tool utilizing the Playstation Move and Apple iPad. Within the past couple months, I had the incredible opportunity to present my work to members of both the film and gaming industries from companies such as Microsoft Game Studios, Halon Entertainment, Firaxis Games, and Carbine Studios.
The next couple of months of development are crucial in determining the finer details of the proposed tool for layout artists. I will be working on native mobile tool development and user interface design within Unity3D specifically for the Apple iPad, implementing hardware design techniques to build the entire physical previsualization system, and actually getting the opportunity to travel on set for some film shoots to gain a greater understanding of the filmmaking process. In addition, I plan to submit and (hopefully) present my research at a couple of conferences, including SIGGRAPH 2013.
A lot done so far, and lots more to take on. Excited for the road ahead and the challenges it will bring :]
Carlo Atienza | “In The End” - Carissa Rae and Michael Alvarado (Linkin Park Cover)
A gorgeous and emotionally charged cover with some incredible lyrical hip hop choreography!
Why I Hate School But Love Education. Know your motives and follow your dreams; don’t simply preach it…live it.
You don’t have to understand music to enjoy music :]
Beautifully executed animated piece with a message.