Infinite Realities' amazing Oculus Rift demo (+Interview)

Lee Perry-Smith is not a stranger here on HDRLabs. Under his company label Infinite Realities he previously donated a 3D head scan (of his own head) for the graphics researcher community, and he's been a fan of our Smart IBL environment/lighting system from day one.

But now Lee has stepped it up a notch.
He put together a realtime demo for the famous Oculus Rift, those utterly futuristic VR glasses. The demo contains some nudity, namely incredibly realistic 3D scans of nude women (may be NSFW if you don't work in the art field). The demo also contains a variety of my panoramas as environments, which I'm rather thrilled about. Check it out:



Download the demo and see for yourself. It's impressive even if you don't have a Rift.

I found it so impressive that I was burning to ask Lee a few questions about how he did this.
Buckle up for a new literary format on this blog: an interview!


Interview with Lee Perry-Smith


Thanks for this opportunity, Blochi! I'm a huge fan of your books and website :)


[ Scanned 3D geometry ]


The detail in your figure scans is astonishing, in terms of laser-accurate geometry and brilliant textures. What's your setup?


At the moment I use 115 DSLRs for the full body capture system and 48 DSLRs for the separate face (FACS) capture system. Mostly Canons, with about eight Nikon D800s in there for full body reference shots. Back in 2011 it was the first system of its kind worldwide.


Wow, that's a mighty impressive setup. Just collecting the images from all these cameras sounds like a lot of work. Are they all wired together?


Yes, they are all wired together to fire in sync. The system is designed to be 99.9% reliable: there are never any black or missing images, which is integral to the reconstruction. There are miles and miles of cable!


The cameras are triggered using custom opto-isolated trigger hubs (designed by Merry Hodgkinson) and Pocketwizard III remotes. The data is streamed over a networked system running Breeze Software, with Agisoft Photoscan processing the point cloud data.

[ Panorama of Lee's VR Photostudio ]


Are you doing any special tricks with polarizers or flashes?


At the moment no, but I'm still running experiments to figure out the best method here. Linear polarizing film seems the best way to go. The end goal is better surface detail.

I've also run many multi-lighting tests using flashes and Photoshop, similar to the Lightstage method, to acquire world space normals. Disney's mesoscopic emboss method is much quicker and far easier in comparison, but the output is synthetic rather than true surface bump.
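
(A quick aside from me: for readers curious what acquiring world space normals from multi-lit photos actually involves, here is a minimal sketch of classic photometric stereo, the textbook idea behind Lightstage-style capture. This is my own illustration, not Lee's pipeline, and it assumes Lambertian reflectance, a static subject, and known flash directions.)

```python
# Minimal photometric-stereo sketch (Lambertian assumption):
# recover per-pixel surface normals from k photos of a static
# subject, each lit by a single flash from a known direction.
import numpy as np

def photometric_stereo(images, light_dirs):
    """images: (k, h, w) grayscale shots, one per flash.
    light_dirs: (k, 3) unit vectors pointing toward each light.
    Returns (h, w, 3) unit normals and an (h, w) albedo map."""
    k, h, w = images.shape
    I = images.reshape(k, -1)                            # (k, h*w)
    # Lambert: I = L @ g per pixel, where g = albedo * normal.
    # Solve for g in the least-squares sense across all pixels.
    g, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)   # (3, h*w)
    albedo = np.linalg.norm(g, axis=0)                   # (h*w,)
    normals = g / np.maximum(albedo, 1e-8)               # unit length
    return normals.T.reshape(h, w, 3), albedo.reshape(h, w)
```

With real skin the Lambertian assumption breaks down (specular highlights, subsurface scattering), which is exactly why production systems add polarizers and more elaborate solvers.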




My Oculus Rift is still on pre-order. The clips already look awesome on my laptop screen, but how would you describe the experience of the Oculus Rift?

You're in for a treat! It's hard to put into words. Really hard to describe until you see it, feel it.

It's a breathtaking moment when assets you've worked so hard on for so long in a 2D format suddenly appear in 3D. Palmer Luckey, John Carmack, and the team at Oculus VR will go down in history as true pioneers. Huge thanks to them!

We're talking true 3D stereo VR here. Not 3D cinema stereo but true stereoscopic vision. You sense depth, vibrant colors and good contrast. You can adjust your focus (minus DOF changes) in VR, and you really experience scale like never before.

And now you're populating it with realistic scans of real people!

Seeing scans of people in VR is amazing. It can feel strange at times, as if you're invading a person's space. Your sense of presence affects theirs, and you can't make out whether they are alive or dead because they don't move. Kind of like waxworks on steroids.

When you put your point of view in the position of the scanned person and look down, it's even stranger! Then you are that person. For a split second, you kind of sense what they must feel like living in that vessel that was given to them to use. Being John Malkovich! This could lead to some very interesting future experiences.

At the moment these are static scans. I'm working on movement, but this will take time. I'm pushing for something beyond traditional motion capture, which I think is quite dated.


I understand that you put this demo together in Unity3D. How involved is this process, from a general 3D artist's point of view? Is it a steep learning curve?


I was terrified of learning Unity. I put it off for months, even years, and started with UDK (Unreal Development Kit) instead. UDK was easy to slip into, but I found it had serious limitations. It felt bloated, and I had many problems just making simple shaders work with the lights. For publishing self-illuminated or baked models it was great. One cool feature of UDK is its loading system and ability to load in 8k textures, something Unity struggles with.

But saying that, Unity is incredibly easy to learn. I initially learned from the Digital-Tutors site, then I dissected the Tuscany Demo supplied with the Oculus Dev Kit SDK.


How well does Unity work with the Oculus Rift?

Adding Rift integration was a walk in the park thanks to Unity and Oculus VR joining forces. Googling Unity questions is a treat because there are thousands of help pages on every subject. If you can think it up, someone has inevitably already written a script for it. I've been lucky enough to quickly make some good friends in the VR/Unity community who are incredibly talented at writing shaders and scripts. We are able to do things that ILM once had difficulty doing for film with offline rendering in the '90s, like the T-1000 effect. We can simulate something similar to that now, in Unity, in real-time, at 60-120 fps, in stereo, in VR! It's quite mind-blowing.



What I find stunning is how well the environments interact with the characters' shading. Is this baked, or is this image-based lighting in realtime?

HDR and IBL are the key. It's easy now thanks to the research people like Paul Debevec did in the '90s, the work you do on HDRLabs, and people like Thomas Mansencal (sIBL GUI) and Bob Groothuis (Dutch Skies).

This paves the way for artists to utilize advanced lighting techniques easily. Working with sIBLs is as simple as drag and drop in Unity. Thanks also to the great work the Marmoset guys do with Skyshop and its Unity integration. Seeing their Skyshop tutorial is what inspired me to use Unity:



So yes, the lighting in my demo is real-time, non-baked, all interactive. Some colored AO is baked during the studio scanning session, but it's minimal. I'm also working on some new custom SSS shader integration.
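
(Another aside from me: to give a feel for what image-based lighting means under the hood, here is a back-of-the-envelope NumPy sketch of my own, not code from the demo. It integrates a lat-long HDR panorama against the Lambert cosine term to get the diffuse irradiance arriving at a single surface normal. Real-time engines precompute this convolution into blurred environment maps, which is how tools like Skyshop make it fast enough to swap lighting interactively.)

```python
# Minimal image-based-lighting sketch: diffuse irradiance that a
# surface normal receives from an equirectangular HDR panorama,
# computed as a cosine-weighted sum over every texel's direction.
import numpy as np

def diffuse_irradiance(env, normal):
    """env: (h, w, 3) linear HDR panorama in lat-long layout.
    normal: (3,) unit surface normal. Returns RGB irradiance."""
    h, w, _ = env.shape
    # Spherical angles at each texel center (theta: polar, phi: azimuth).
    theta = (np.arange(h) + 0.5) / h * np.pi           # (h,)
    phi = (np.arange(w) + 0.5) / w * 2.0 * np.pi       # (w,)
    sin_t, cos_t = np.sin(theta), np.cos(theta)
    dirs = np.stack([
        np.outer(sin_t, np.cos(phi)),                  # x
        np.outer(sin_t, np.sin(phi)),                  # y
        np.outer(cos_t, np.ones_like(phi)),            # z
    ], axis=-1)                                        # (h, w, 3)
    # Lambert cosine term, clamped to the upper hemisphere.
    cos_l = np.maximum(dirs @ normal, 0.0)             # (h, w)
    # Solid angle of each texel: sin(theta) * dtheta * dphi.
    d_omega = sin_t[:, None] * (np.pi / h) * (2.0 * np.pi / w)
    weight = (cos_l * d_omega)[..., None]              # (h, w, 1)
    return (env * weight).sum(axis=(0, 1))             # RGB triple
```

Summing over every texel per shaded pixel per frame would be hopeless, so the standard trick is to convolve the panorama once ahead of time and simply look the result up by normal direction at runtime.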


So these are custom shaders?


We (myself, Charles, and drash) have implemented a custom SSS solution with many different effects: 2 specular lobe control, IBL (bent normals) calculated reflections, cavity Fresnel falloff, 8x multi-region micro multi-bump (blended with RGB maps, thanks to Steve), 8x multi-region cavity maps, deep red shadow scattering, GGX specular distance distribution, deep skin scatter for ear glow, 1x GI bounce, screen space reflections, colored AO, and much more.
We're also working on a hair rendering solution for Unity, using IBL for lighting and strand reflections and 2 specular lobe control, as well as GI bounce and strand AO. This is a lot harder to implement.
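
(One more aside from me, to unpack a single item on that list: "2 specular lobe control" typically means blending two GGX distribution lobes of different roughness, so skin shows both a tight oily highlight and a broad soft sheen. The sketch below is my own, covers only the distribution term, and uses made-up parameter values; a full shader adds Fresnel and geometry/visibility terms.)

```python
# Minimal two-lobe GGX sketch: blend a sharp and a wide specular
# lobe, a common trick for the layered look of real skin.
import numpy as np

def d_ggx(n_dot_h, roughness):
    """GGX/Trowbridge-Reitz normal distribution term,
    using the Disney-style remapping alpha = roughness**2."""
    a2 = roughness ** 4
    denom = n_dot_h ** 2 * (a2 - 1.0) + 1.0
    return a2 / (np.pi * denom ** 2)

def two_lobe_specular(n_dot_h, rough_a=0.25, rough_b=0.65, mix=0.85):
    """Tight core highlight (rough_a) plus broad sheen (rough_b)."""
    return mix * d_ggx(n_dot_h, rough_a) + (1.0 - mix) * d_ggx(n_dot_h, rough_b)
```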

I still use Marmoset for real-time testing, but I find Unity more open creatively because of its scripting, sound, animation, and interactive executable publishing.
Although I have a feeling UDK4 is going to be something very special!
 

Where do you see the creative potential of the Oculus Rift?


I can imagine that in two to three years there will be no more monitors, no more keyboard or mouse. Just a set-top box (maybe even cloud based), a VR or AR headset, surround sound, and haptic gloves. We will be able to sculpt our favorite character or model, and actually feel the sculpt as we work, on a virtual hilltop mountainside somewhere, at sunrise.


The Oculus Rift will be especially interesting for directors. I run my own small business, so I've been a director (of a company) for a while, and I admire the work of film directors like Ridley Scott, Paul Thomas Anderson, and James Cameron, just like most 3D artists do. The trick here is that the term "director" takes on a whole new meaning because of real-time and VR, which allow us to REALLY direct an experience. You get to do things creatively that you could never do with film, and this is where VR, 3D, and 4D scans really kick in. You also get to add sound, music, visuals, and responsive interaction in ways you can't with film. You can manipulate time, manipulate a user's sense of being, drive 360 environments, and many other things.

VR for creative experiences, not just for games but for really immersive experiences, is like the old Wild West. It's a new frontier, ripe for the picking. At some point film, VFX, and games will merge into a directed experience. There is no doubt.



What's the next thing on your personal list?


My direction with VR goes into the adult entertainment industry, an area few dare to venture into, or have the balls to try. Other industry professionals have warned me about this direction, saying I won't get bookings from Disney et al., that it will affect my reputation, and so on. I am well aware of that; my intention isn't to sell out but to follow my own dreams. The adult industry is dying financially, and it needs a new lease of life, a new dimension. VR and scanning in this market can explore a lot of untapped areas and also really benefit many people: give them experiences they might otherwise never have access to, due to physical disabilities, social issues, or sexual orientation. And this is just one small niche of VR. The possibilities are endless.

Well, if the Oculus Rift is half as good as you say, I can see this becoming a big hit. I, for one, now await my pre-order with much suspense.
Thank you very much for this interview, and good luck!


Thanks again Blochi, your book and your site are a constant source of inspiration.

Lee




I hope you liked this interview. Maybe we can do more of these in the future.

Visit the Infinite Realities website to see more good stuff, grab the Rift demo, and don't forget to download the sIBL-of-the-month that Lee generously donated.
