Infinite Realities' amazing Oculus Rift demo (+ Interview)

Lee Perry-Smith is not a stranger here on HDRLabs. Under his company label Infinite Realities he previously donated a 3D head scan (of his own head) to the graphics research community, and he's been a fan of our Smart IBL environment/lighting system from day one.

But now Lee has stepped it up a notch.
He put together a realtime demo for the famous Oculus Rift - those utterly futuristic VR glasses everybody is talking about. The demo contains some nudity, that is, incredibly realistic 3D scans of nude women (may be NSFW if you don't work in the art field). It also features a variety of my panoramas as environments, which I'm rather thrilled about. Check it out:



Download the demo and see for yourself. It's impressive even if you don't have a Rift.

I found it so impressive that I was burning to ask Lee a few questions about how he did it.
Buckle up for a new literary format on this blog: an interview!


Interview with Lee Perry-Smith


Thanks for this opportunity Blochi! I'm a huge fan of your books and website :)


[ Rollover for scanned 3D geometry ]


The detail in your figure scans is astonishing, in terms of laser-accurate geometry and brilliant textures. What's your setup?


At the moment I use 115 DSLRs for the full-body capture system and 48 DSLRs for the separate face (FACS) capture system. Mostly Canons, with about eight Nikon D800s in there for full-body reference shots. It was the first system of its kind worldwide, back in 2011.


Wow, that's a mighty impressive setup. Just collecting the images from all these cameras sounds like a lot of work. Are they all wired together?


Yes, they are all wired together to fire in sync. The system is designed to be 99.9% reliable. There are never any black images or missing images. This is integral for reconstruction. There are miles and miles of cable!


The cameras are triggered using custom opto-isolated trigger hubs (designed by Merry Hodgkinson) and PocketWizard III remotes, with data streamed over a networked system running Breeze software; Agisoft PhotoScan then processes the images into point cloud data.
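Lee doesn't go into his pipeline scripts, but to illustrate why "never any black images or missing images" matters for photogrammetry, here is a minimal sanity-check sketch in Python. The folder layout, the camera count constant and the brightness threshold are my own assumptions for illustration, not part of Infinite Realities' actual system.

```python
import sys
from pathlib import Path

from PIL import Image, ImageStat  # pip install Pillow

CAMERA_COUNT = 115        # full-body rig size mentioned above (assumption: one file per camera)
BLACK_THRESHOLD = 4.0     # mean 8-bit luminance below this is treated as a misfire

def validate_capture(shot_dir: str) -> bool:
    """Return True if every camera delivered a usable frame for this shot."""
    frames = sorted(Path(shot_dir).glob("*.jpg"))
    ok = True

    if len(frames) != CAMERA_COUNT:
        print(f"missing frames: expected {CAMERA_COUNT}, found {len(frames)}")
        ok = False

    for frame in frames:
        with Image.open(frame) as img:
            mean_luma = ImageStat.Stat(img.convert("L")).mean[0]
        if mean_luma < BLACK_THRESHOLD:
            print(f"black frame (camera misfire?): {frame.name}")
            ok = False

    return ok

if __name__ == "__main__":
    sys.exit(0 if validate_capture(sys.argv[1]) else 1)
```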

[ Panorama of Lee's VR Photostudio ]


Are you doing any special tricks with polarizers or flashes?


At the moment, no, but I'm still running experiments to figure out the best method. Linear polarizing film seems the way to go; the end goal is better surface detail.

I've also run many multi-lighting tests using flashes and Photoshop, similar to the Light Stage method, to acquire world-space normals. Disney's mesoscopic emboss method is much quicker and far easier in comparison, but the output is synthetic rather than true surface bump.
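For readers curious about what those multi-lighting tests compute: classic Lambertian photometric stereo recovers a per-pixel normal (and albedo) from a handful of images lit from known directions. The sketch below is the textbook least-squares version in NumPy, not Lee's actual setup and not Disney's mesoscopic method.

```python
import numpy as np

def photometric_stereo(images, light_dirs):
    """
    Recover per-pixel surface normals from images lit by known directions.

    images:     list of grayscale images, shape (H, W), one per flash position
    light_dirs: (K, 3) array of unit light directions (world space)
    """
    I = np.stack([img.reshape(-1) for img in images], axis=0)   # (K, H*W)
    L = np.asarray(light_dirs, dtype=np.float64)                 # (K, 3)

    # Lambertian model: I = L @ (albedo * n). Solve for G = albedo * n per pixel.
    G, *_ = np.linalg.lstsq(L, I, rcond=None)                    # (3, H*W)

    albedo = np.linalg.norm(G, axis=0)
    normals = G / np.maximum(albedo, 1e-8)                       # unit world-space normals

    h, w = images[0].shape
    return normals.T.reshape(h, w, 3), albedo.reshape(h, w)
```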




My Oculus Rift is still on pre-order. The clips already look awesome on my laptop screen, but how would you describe the experience of the Oculus Rift?

You're in for a treat! It's hard to put into words. Really hard to describe until you see it, feel it.

It's a breathtaking moment when assets you've been working on for so long in a 2D format suddenly appear in 3D. Palmer Luckey, John Carmack, and the team at Oculus VR will go down in history as true pioneers. Huge thanks to them!

We're talking true 3D stereo VR here. Not 3D cinema stereo, but true stereoscopic vision. You sense depth, vibrant colors and good contrast. You can adjust your focus (minus DOF changes) in VR, and you experience scale like never before.

And now you're populating it with realistic scans of real people!

Seeing scans of people in VR is amazing. It can feel strange at times, as if you're invading a person's space. Your sense of presence affects theirs, and you can't tell whether they are alive or dead because they don't move. Kind of like waxworks on steroids.

When you put your point of view in the position of the scanned person and look down, it's even stranger! Then you are that person. For a split second, you kind of sense what it must feel like living in the vessel that was given to them, i.e. being John Malkovich! This could lead to some very interesting future experiences.

At the moment these are static scans. I'm working on movement, but this will take time. I'm pushing for something beyond traditional motion capture, which I think is quite dated.


I understand that you put this demo together in Unity3D. How involved is this process, from a general 3D artist's point of view? Is it a steep learning curve?


I was terrified of learning Unity. I put it off for months, even years, and started with UDK (Unreal Development Kit) instead. UDK was easy to slip into, but I found it had serious limitations. It felt bloated, and I had many problems just making simple shaders work with the lights. For publishing self-illuminated or baked models it was great. One cool feature of UDK is its loading system and its ability to load 8K textures, something Unity struggles with.

That said, Unity is incredibly easy to learn. I initially learned from the Digital-Tutors site, then I dissected the Tuscany demo supplied with the Oculus developer kit SDK.


How well does Unity work with the Oculus Rift?

Adding Rift integration was a walk in the park, thanks to Unity and Oculus VR joining forces. Googling for Unity help is a treat, because there are thousands of help pages on every subject. If you can think it up, someone else has inevitably already written a script for it. I've been lucky enough to quickly make some good friends in the VR/Unity community who are incredibly talented at writing shaders and scripts. We are able to do things that ILM once had difficulty doing for film with offline rendering in the 90's, like the T-1000 effect. We can simulate something similar now, in Unity, in real-time, at 60-120 fps, in stereo, in VR! ... It's quite mind-blowing.



What I find stunning is how well the environments interact with the characters' shading. Is this baked, or is this image-based lighting in realtime?

This is the key: HDR and IBL. It's easy now thanks to the research that people like Paul Debevec did in the 90's, the work that you do on HDRLabs, and people like Thomas Mansencal (sIBL_GUI) and Bob Groothuis (Dutch Skies).

This paves the way for artists to utilize advanced lighting techniques easily. Working with sIBLs is as simple as drag and drop in Unity. Thanks also go to the great work the Marmoset guys do with Skyshop and its Unity integration. This is what inspired me to use Unity, after seeing their Skyshop tutorial:



So yes, the lighting in my demo is real-time, non-baked, and fully interactive. Some colored AO is baked during the studio scanning session, but it's minimal. I'm also working on integrating a new custom SSS shader.
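As a rough idea of what image-based lighting actually computes for the diffuse term: the engine integrates the HDR environment over the hemisphere around each surface normal. Skyshop precomputes this on the GPU as pre-blurred sky textures; the NumPy sketch below is only a brute-force illustration of that same integral, with my own assumptions about the lat/long mapping and a Y-up orientation.

```python
import numpy as np

def diffuse_irradiance(env, normal, samples=4096, seed=0):
    """
    Brute-force diffuse irradiance for one surface normal, estimated from an
    equirectangular HDR environment map (H x W x 3, linear float, Y-up).
    """
    rng = np.random.default_rng(seed)
    n = np.asarray(normal, dtype=np.float64)
    n /= np.linalg.norm(n)

    # Uniform random directions on the unit sphere.
    d = rng.normal(size=(samples, 3))
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    cos_t = np.maximum(d @ n, 0.0)                         # back-facing directions contribute 0

    # Look up the environment color for each direction (lat/long mapping).
    h, w, _ = env.shape
    theta = np.arccos(np.clip(d[:, 1], -1.0, 1.0))         # polar angle from +Y
    phi = np.arctan2(d[:, 2], d[:, 0])                     # azimuth
    px = ((phi / (2.0 * np.pi) + 0.5) * (w - 1)).astype(int)
    py = ((theta / np.pi) * (h - 1)).astype(int)
    radiance = env[py, px]                                 # (samples, 3)

    # Monte Carlo estimate: E = 4*pi * mean( L(w) * max(cos, 0) ).
    # Multiply by albedo / pi for the final Lambertian surface color.
    return 4.0 * np.pi * (radiance * cos_t[:, None]).mean(axis=0)
```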


So these are custom shaders?


We've (myself, Charles and drash) implemented a custom SSS solution with many different effects: two-specular-lobe control, IBL reflections calculated with bent normals, cavity Fresnel falloff, 8x multi-region micro multi-bump (blended with RGB maps, thanks to Steve), 8x multi-region cavity maps, deep red shadow scattering, GGX specular distribution, deep skin scatter for ear glow, 1x GI bounce, screen-space reflections, colored AO and much more.
We're also working on a hair rendering solution for Unity, using IBL for lighting and strand reflections, 2 specular lobe control, as well as GI bounce and strand AO. This is a lot harder to implement.
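Lee's actual skin and hair shaders aren't public, but for reference, this is roughly what a single GGX specular lobe with Schlick Fresnel and a Smith-style visibility term looks like (written in NumPy for readability rather than as shader code). The "two-lobe" trick from the list above is simply a weighted sum of two such evaluations with different roughness and intensity values; all parameter names here are generic, not from his implementation.

```python
import numpy as np

def ggx_specular(n_dot_h, n_dot_l, n_dot_v, roughness, f0):
    """
    Single-lobe Cook-Torrance specular term with a GGX (Trowbridge-Reitz)
    distribution, Schlick Fresnel and a Smith-Schlick visibility term.
    All dot products are assumed clamped to (0, 1].
    """
    a = roughness * roughness
    a2 = a * a

    # GGX normal distribution function
    d = a2 / (np.pi * ((n_dot_h * n_dot_h) * (a2 - 1.0) + 1.0) ** 2)

    # Schlick's Fresnel approximation (f0 = reflectance at normal incidence).
    # n_dot_v stands in for v_dot_h to keep the sketch short.
    f = f0 + (1.0 - f0) * (1.0 - n_dot_v) ** 5

    # Smith-Schlick geometric shadowing/masking
    k = a / 2.0
    g = (n_dot_l / (n_dot_l * (1.0 - k) + k)) * (n_dot_v / (n_dot_v * (1.0 - k) + k))

    return d * f * g / np.maximum(4.0 * n_dot_l * n_dot_v, 1e-6)

# A crude "two-lobe" skin highlight: one tight lobe plus one broad, weaker lobe.
spec = 0.85 * ggx_specular(0.98, 0.7, 0.8, roughness=0.25, f0=0.028) \
     + 0.15 * ggx_specular(0.98, 0.7, 0.8, roughness=0.55, f0=0.028)
```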

I still use Marmoset for real-time testing but find Unity more open creatively because of scripting, sound, animation and interactive executable publishing.
Although I have a feeling UDK4 is going to be something very special!
 

Where do you see the creative potential of the Oculus Rift?


I can just imagine, in 2-3 years: no more monitors, no more keyboard or mouse. Just a set-top box (maybe even cloud based), a VR or AR headset, surround sound, and haptic gloves. We will be able to sculpt our favorite character or model, and actually feel the sculpt as we work, on a virtual mountainside somewhere, at sunrise.


The Oculus Rift will be especially interesting for directors. I run my own small business, so I've been a director (of a company) for a while, and I admire the work of film directors like Ridley Scott, Paul Thomas Anderson and James Cameron, just like most 3D artists do. The trick is that the term director takes on a whole new meaning with real-time and VR, which allow us to REALLY direct an experience. You get to do things creatively that you could never do with film, and this is where VR, 3D and 4D scans really kick in. You also get to add sound, music, visuals and interactive response in ways you can't with film. You can manipulate time, manipulate a user's sense of being, drive 360 environments and many other things.

VR for creative experiences, not just games but really immersive experiences, is like the old Wild West. It's a new frontier, ripe for the picking. At some point Film, VFX and Games will merge into a Directed Experience. There is no doubt.



What's the next thing on your personal list?


My direction with VR goes into the adult entertainment industry, an area few dare to venture into, or have the balls to try. Other industry professionals have warned me about this direction, saying I won't get bookings from Disney et al., that it will affect my reputation, and so on. I am well aware of that; my intention isn't to sell out but to follow my own dreams. The adult industry is dying financially, and it needs a new lease of life, a new dimension. VR and scanning in this market can explore a lot of untapped areas and really benefit many people, giving them experiences they may otherwise never have access to, due to physical disabilities, social issues, or sexual orientation. And this is just one small niche of VR. The possibilities are endless.

Well, if the Oculus Rift is half as good as you say, I can see this becoming a big hit. I, for one, now await my pre-order with much suspense.
Thank you very much for this interview, and good luck!


Thanks again Blochi, your book and your site are a constant source of inspiration.

Lee




Hope you liked this interview. Maybe we can do more of those in the future.

Visit the Infinite Realities website to see more good stuff, grab the Rift demo, and don't forget to download the sIBL-of-the-month that Lee generously donated.



Welcome to the sIBL family, Blender!

Raise your glasses, brothers! Blender just joined the exquisite ranks of sIBL-supported programs.


New Blender extension builds a bridge to sIBL_GUI for quick lighting and environment setups.
Screenshot by Jed Frechette.


Thanks to the development efforts of Jed Frechette, Blender users can now enjoy the one-click environment lighting setup that the Smart IBL system is famous for. Integration is done thoroughly, by using sIBL_GUI as browser and central library management hub. If you already use sIBL_GUI in conjunction with 3dsMAX or Maya, the workflow with Blender will be familiar:
  • Pick an environment preset
  • Pick Blender as setup template
  • Click the Send to Software button
Photorealistic lighting (and background integration) couldn't be easier.
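To give a rough idea of what a one-click environment setup boils down to inside Blender, here is a minimal bpy sketch that hooks an HDR panorama into the Cycles world nodes. This is not Jed Frechette's add-on and not the actual sIBL_GUI template (which also handles the sun light and the separate background/reflection images); the file path is a placeholder.

```python
import bpy

def setup_ibl(hdr_path):
    """Minimal Cycles world setup: lighting and background from one HDR panorama."""
    world = bpy.context.scene.world
    world.use_nodes = True
    nodes, links = world.node_tree.nodes, world.node_tree.links
    nodes.clear()

    # Environment texture feeding the world background shader.
    env = nodes.new("ShaderNodeTexEnvironment")
    env.image = bpy.data.images.load(hdr_path)

    background = nodes.new("ShaderNodeBackground")
    output = nodes.new("ShaderNodeOutputWorld")

    links.new(env.outputs["Color"], background.inputs["Color"])
    links.new(background.outputs["Background"], output.inputs["Surface"])

setup_ibl("/path/to/sIBL_environment.hdr")  # placeholder path
```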

Links:


Download and installation instructions on wiki.blender.org.
Say thanks or report bugs in our dev forum thread.

To celebrate this historic event, please enjoy this free sIBL-set of an iron bridge in full 16K glory.



KaleidoCam: the ultimate DSLR upgrade gadget

The highlight of this year's SIGGRAPH was the KaleidoCam, which introduces an entirely new idea to the world of photography. It's an add-on that goes between the lens and the camera, and it upgrades the camera's capabilities in ways never thought possible. Suddenly you have true single-shot multi-exposure HDR capture, multi-spectral imaging, variable polarization, and you can even capture a light field with a pixel resolution that far exceeds what you can get out of a Lytro camera - all that with your own DSLR and lenses!



The working principle is ingenious in its simplicity: the gadget contains a diffuser screen that intercepts the light rays coming from the lens at the very spot where the sensor would be. So instead of a sensor, there is now a rear-projection screen. A kaleidoscopic mirror arrangement then sends multiple copies of that image onto the actual sensor (9 images in total, arranged in a 3x3 grid). The trick is to insert a different filter into the optical path of each of these copies. For HDR, you would use varying ND filters - and voila, each snap of the shutter gives you 9 different exposures. Awesome!
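The paper covers the full calibration pipeline; the sketch below only illustrates the HDR half of the idea in NumPy: divide each of the nine sub-images by its known filter transmission and blend them with a simple hat-shaped confidence weight. The tile layout, perfect registration and a linear sensor response are all idealized assumptions here.

```python
import numpy as np

def merge_kaleidocam_hdr(frame, transmissions, grid=(3, 3)):
    """
    Merge the 3x3 sub-images of one KaleidoCam-style exposure into a linear HDR image.

    frame:         single sensor image, grayscale, normalized to [0, 1]
    transmissions: 9 relative ND filter transmissions, one per tile (row-major),
                   e.g. 1.0 for the clear tile, 0.5, 0.25, ...
    """
    rows, cols = grid
    th, tw = frame.shape[0] // rows, frame.shape[1] // cols

    weighted_sum = np.zeros((th, tw), dtype=np.float64)
    weight_sum = np.zeros((th, tw), dtype=np.float64)

    for i in range(rows):
        for j in range(cols):
            tile = frame[i * th:(i + 1) * th, j * tw:(j + 1) * tw].astype(np.float64)
            t = transmissions[i * cols + j]

            # Hat-shaped weight: trust mid-range pixels, distrust clipped/noisy ones.
            weight = 1.0 - np.abs(tile - 0.5) * 2.0

            weighted_sum += weight * (tile / t)   # undo the ND filter attenuation
            weight_sum += weight

    return weighted_sum / np.maximum(weight_sum, 1e-8)
```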



I really hope to see this come to market soon; if the research group went on Kickstarter, they would already have my money! Read the detailed technical paper here, and make sure to scroll down and check out the data sets (original images captured with the prototype).


Get the HDRI-Handbook 2.0 in a bundle!


My friends at Unified Color have just released version 3.0 of their professional HDR software line. That includes the stand-alone HDR Expose 3 and the Photoshop plugin 32Float v3 (which is what I use). As a cherry on top, you get my HDRI-Handbook 2.0 bundled with it (the e-book version, that is).

This is an excellent update: both programs run much faster, and the tonemapping algorithm is much improved. It now uses an adaptive tone curve to tune the overall contrast, similar to Picturenaut's Adaptive Logarithmic method.
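For the curious: Picturenaut's Adaptive Logarithmic method is based on Drago et al.'s 2003 operator, which compresses luminance with a log curve whose base adapts per pixel. The sketch below is a rough NumPy rendition of that published formula, not Unified Color's (proprietary) implementation.

```python
import numpy as np

def adaptive_log_tonemap(luminance, bias=0.85, ld_max=100.0):
    """
    Adaptive logarithmic tone curve, roughly after Drago et al. 2003.

    luminance: linear world luminance, any positive range
    bias:      0..1, controls how aggressively highlights are compressed
    ld_max:    maximum display luminance in cd/m^2 (100 for a typical monitor)
    """
    lw = np.asarray(luminance, dtype=np.float64)
    lw_max = lw.max()

    scale = (ld_max * 0.01) / np.log10(lw_max + 1.0)
    bias_term = np.power(lw / lw_max, np.log(bias) / np.log(0.5))

    # Log base smoothly varies from 2 (shadows) to 10 (highlights).
    ld = scale * np.log(lw + 1.0) / np.log(2.0 + 8.0 * bias_term)
    return np.clip(ld, 0.0, 1.0)
```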



HDR Expose 3 also has a completely revamped HDR merge mode with a nice ghost removal tool (where you can manually pick the hero image for marked areas) and a rock-solid image alignment (where you can manually set down control points and let the frames warp into place accordingly). And of course, there are the trademark features of this product line that I love most: full 32-bit color editing, white balance, veiling glare removal, all while keeping the overbright areas of the HDR image intact and fully valid for further post processing. So, even if this won't become your primary HDR tool, it is a mighty powerful addition to your bag of tricks.



So there you go - grab a pro HDR software and get my book for free!
www.unifiedcolor.com/hdri-handbook-giveaway (limited to 500, first come first served)

My publisher actually compiled a special eBook edition for this; it's a PDF with the correct layout and decent photo resolution. However, keep in mind that the bonus DVD content (which is needed to follow the tutorials) is not included in the eBook. For the full HDRI-Handbook 2.0 experience I still recommend getting the paperback edition (currently only $35 on Amazon).


How I helped the space shuttle Atlantis on its final mission

Here is the full scoop on the NASA project that kept me so busy for the last few months.
It's about the space shuttle Atlantis.

While NASA donated the other remaining space shuttles to various cities across the US (see my pics of the Endeavour flying over LA here), the Atlantis is kept at Cape Canaveral, Florida. It is now displayed at the Kennedy Space Center (KSC) Visitor Complex, in a grand new exhibition that opened to the public this weekend.
I had the honor of working as CG Supervisor / Lead Artist on this exhibit. I've always been a huge space nerd (I previously worked on all 4 seasons of Star Trek: Enterprise), so this was really a dream project for me.

The final mission of Atlantis is to amaze and inspire the next generation of space explorers. That's why the exhibit is not just the shuttle in a room with a little plaque. No, it's an immersive experience that pays tribute to the achievements and the rich history of this incredible vehicle. It's a friggin' real-life space ship! And you get to see it in flight!



Before you get to the shuttle, you see a 10-minute show in a dedicated 8K dome projection theatre. This is a bit of foreplay, meant to create excitement and awe. That's where the visual effects come in: they let you experience weightlessness inside the cockpit, take you up close to a docking maneuver with the ISS, let you fly inside a Hubble nebula, and take you on a ride with Atlantis during atmospheric reentry.
I can't show you pictures of this spectacle; you have to go and see it in person. NBC News says "fantasy becomes reality and the experience is nothing short of magical."

Orbiting the Earth at 6x speed


A big part of the experience is the 120-foot LED screen behind the shuttle. The clip playing there is now officially the longest CG shot of my career: 15,200 frames of CG animation (just over 11 minutes), describing a full Earth orbit. Since the exhibit is laid out to show the shuttle in flight, this shot provides the backdrop that makes the illusion perfect. I don't really expect anyone to stand there and watch the whole orbit, especially when there is a real space ship in the room. Yet here is some background info for all the space geeks:

What you see is an orbit with a 55-degree inclination, the highest inclination ever flown on a shuttle mission. All my NASA advisors agreed that this is the most scenic route, covering the maximum number of picturesque sights.
We start with a sunrise over the Pacific, fly over Baja California and then cross the US mainland in a north-east direction. We come across the Great Lakes and Canada, then head across the Atlantic and enter Europe over Great Britain (on much the same path that transatlantic airliners travel). For dramatic purposes Europe is shown at night, so you get to see a dense network of sparkling city lights. The orbit continues in a south-east direction over the Mediterranean, the Sahara desert, and the Middle East (at which point I switched back to daylight to showcase the fascinating desert colors). Then it gets dark again as we head towards the Antarctic Circle, where we fly through the shimmering green lights of the aurora australis. As we swing around, we pass over the tip of Australia, cross the entire length of New Zealand, and head out into the Pacific. This is where the loop starts over at the beginning (shortening the Pacific crossing, because that would really be a long and boring stretch).
In real life a shuttle orbit took 90 minutes; without the Pacific crossing that would be about 60 minutes. So my 11-minute animation is in fact showing the orbit at roughly 6 times the original speed.
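For the record, the speedup works out like this (a trivial sketch, using the 60-minute figure quoted above):

```python
# Rough check of the playback speed: a ~60-minute (Pacific-shortened) orbit
# compressed into the "just over 11 minutes" clip mentioned above.
real_orbit_minutes = 60
animation_minutes = 11

print(round(real_orbit_minutes / animation_minutes, 1))   # -> 5.5, i.e. roughly 6x
```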


Numbers for geeks

Here are some statistics that show the astronomical scope of this project:
  • 4.1 Terabytes - total project data (after extensive cleanups).
  • 3.5 Gigapixel - texture for the earth surface alone (86K Blue Marble NextGen).
  • 9.2 Gigabytes - accumulated textures for the earth (incl. clouds, city lights, etc.).
  • 6.2 TeraHertz - total processing power of our render farm (extended for all this).
  • 37,472 frames - cumulative frame count of all shots (over 26 minutes).
  • 10.5 Million - polygon count of our ISS model.


Software used


  • Modeling in modo, 3dsMAX, Lightwave 3D.
  • Animation and rendering in Lightwave 3D (huge thanks to the fine folks at NewTek for compiling a special version for us overnight, which sped up render times with our custom dome camera by a factor of 4).
  • Shading with infinimap (which made it possible to render with such gigapixel textures at all, even in record render time!).
  • Compositing and dome-projection testing in Fusion.


Credits

CG Visual Effects by Eden FX

  • Christian Bloch - CG Supervisor / Lead Artist / Compositing
  • Mark Hennessy-Barret - CG Artist (Spaceman sequence)
  • Anthony Vu - Modeling & Shading (ISS, Flight Deck)
  • Eric Hance - CG Artist (Swamp opening sequence)
  • Emmanuel Yatsuzuka - Modeling (Atlantis)
  • Dan DeEntremont, Keith Matz, Sean Jackson - Additional Modeling
  • Rebecca West - Project Manager
  • Carrie Stula - Coordinator

Mousetrappe, a Burbank, CA based design & production studio, worked with Delaware North Companies Parks & Resorts (operators of the Kennedy Space Center Visitor Complex for NASA), as well as PGAV Destinations and Nassal, to create this breathtaking media experience. Once again, Mousetrappe guides audience members through an all-new architectural projection adventure, a world where the intersection of rich storytelling and cutting-edge technology creates a powerful experience.


Media Coverage


