Shortcuts: Photomatix 5, HDR Light Studio, HDR video, Dual ISO

Here's a quick summary of various new HDR tools and toys that popped up recently.


Photomatix Pro 5.0 Beta


Our friends at HDRSoft have been working hard on the next generation of Photomatix.
  • All-new tonemapping algorithm "Contrast Optimizer" promises very natural results
  • Automatic Deghosting now offers a preview and hero image selection (very nice)
  • Realtime preview during tonemapping
  • Highlight recovery from RAW files
  • and much more
It's a really great update, I highly recommend checking it out!

Download Photomatix 5 Beta


HDR Light Studio integrates live with MODO and Bunkspeed


This is very close to my heart, since I'm an avid MODO user myself.
HDR Light Studio is a unique app for creating/editing spherical HDR panos for lighting purposes (see the quick introduction in my HDRI Handbook 2.0, pages 631-632). It's a total niche field, and it may at first sound odd or unnecessary, but the way it is executed makes it completely amazing. HDR Light Studio offers:
  • Spotlights and soft boxes with real-world settings for power (Watts) and color (Kelvin)
  • Growing library of HDR image elements, real studio light sources, ready to drop in
  • Select and position light sources by clicking on the highlights in your render
  • Live connection shows the result immediately in your 3D app's preview renderer


Putting all this lighting power at your fingertips, HDR Light Studio has quickly become the standard for industrial designers, especially in the automotive field.

Download the HDR Light Studio demo (registration required)
or see lots of eye candy on their blog.


New HDR Video samples for download


If chapter 7.4.1 in my HDRI-Handbook 2.0 has made you curious about HDR video (pages 596-600), then you will be delighted to hear that our friend Jonas Unger from Linköping University has updated his HDRv repository.



You can download a variety of video clips, all captured with the famous Spheron HDRv camera prototype. Files are provided as EXR sequences, ready for your own experiments in Nuke, Fusion, or After Effects. There are even some clips with mirror ball footage to get your feet wet with HDR video-based lighting (example here).
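
If you want to poke at the footage programmatically before firing up a compositor, loading an EXR sequence into floating-point buffers takes only a few lines. Here's a minimal Python sketch using OpenCV - assuming a build with OpenEXR support, and a made-up filename pattern you'd adjust to the clip you actually downloaded:

    import glob, os
    os.environ["OPENCV_IO_ENABLE_OPENEXR"] = "1"  # recent OpenCV builds gate EXR I/O behind this flag
    import cv2

    # Hypothetical filename pattern - adjust to the actual clip.
    paths = sorted(glob.glob("hdrv_clip/frame_*.exr"))
    frames = [cv2.imread(p, cv2.IMREAD_UNCHANGED) for p in paths]  # float32 BGR, linear radiance

    print(f"Loaded {len(frames)} frames at {frames[0].shape[1]}x{frames[0].shape[0]}")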

Go grab the goodies from the LiU HDRv Repository


Magic Lantern gets a Dual-ISO mode


The Magic Lantern firmware enhancement for Canon DSLRs now offers the ultimate HDR mode!


Dual ISO means half the sensor is read out with ISO 100 and the other half with ISO 1600 (or any other, for that matter). So you get two exposures at the exact same time! Time to say good-bye to ghost removal. Moving subjects are not an issue with this camera mod, as all the amazing examples clearly prove.
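
To see why ghosting disappears, consider what the merge looks like. The real conversion happens in Magic Lantern's cr2hdr tool on the raw data (the actual files interleave line pairs and need proper raw processing); this toy numpy sketch just assumes even rows at ISO 100, odd rows at ISO 1600, and values normalized to [0, 1]:

    import numpy as np

    def merge_dual_iso(raw, low_iso=100.0, high_iso=1600.0, clip=0.98):
        """Toy Dual-ISO merge: even rows at low_iso, odd rows at high_iso."""
        low  = raw[0::2].astype(np.float32)     # clean highlights, noisy shadows
        high = raw[1::2].astype(np.float32)     # clean shadows, clipped highlights
        high_lin = high * (low_iso / high_iso)  # rescale high-ISO rows to a common exposure
        # Prefer the low-noise high-ISO data, but fall back to the
        # low-ISO rows wherever the high-ISO rows are blown out.
        merged = np.where(high >= clip, low, high_lin)
        # Crude upsampling back to full height; real tools interpolate
        # to recover the lost vertical resolution.
        return np.repeat(merged, 2, axis=0)

Both halves come from the same shutter actuation, so nothing can move between the "exposures" - that's the whole trick.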

Download the Dual-ISO module



Infinite Realities' amazing Oculus Rift demo (+Interview)

Lee Perry-Smith is no stranger here on HDRLabs. Under his company label Infinite Realities he previously donated a 3D head scan (of his own head) to the graphics research community, and he's been a fan of our Smart IBL environment/lighting system from day one.

But now Lee has stepped it up a notch.
He put together a realtime demo for the famous Oculus Rift - the utterly futuristic VR headset. The demo contains some nudity - that is, incredibly realistic 3D scans of nude women (may be NSFW if you don't work in the art field). It also features a variety of my panoramas as environments, which I'm rather thrilled about. Check it out:



Download the demo and see for yourself. It's impressive even if you don't have a Rift.

I found it so impressive that I was burning to ask Lee a few questions about how he did this.
Buckle up for a new literary format on this blog: an interview!


Interview with Lee Perry-Smith


Thanks for this opportunity Blochi! I'm a huge fan of your books and website :)


[ Rollover for scanned 3D geometry ]


The detail in your figure scans is astonishing, in terms of laser-accurate geometry and brilliant textures. What's your setup?


At the moment I use 115 DSLRs for the full body capture system and 48 DSLRs for the separate face (FACS) capture system - mostly Canons, with about eight Nikon D800s in there for full body reference shots. It was the first system of its kind worldwide, back in 2011.


Wow, that's a mighty impressive setup. Just collecting the images from all these cameras sounds like a lot of work. Are they all wired together?


Yes, they are all wired together to fire in sync. The system is designed to be 99.9% reliable. There are never any black images or missing images. This is integral for reconstruction. There are miles and miles of cable!


The cameras are triggered using custom opto-isolated trigger hubs (designed by Merry Hodgkinson) and Pocketwizard III remotes, with data streamed over a networked system, running Breeze Software and Agisoft Photoscan to process the point cloud data.

[ Panorama of Lee's VR Photostudio ]


Are you doing any special tricks with polarizers or flashes?


At the moment no, but I'm still running experiments trying to figure out the best method here. Linear polarizing film seems the best way to go. The end goal is better surface detail.

I've also run many multi-lighting tests using flashes and Photoshop, similar to the Light Stage method, to acquire world-space normals. Disney's mesoscopic emboss method is much quicker and far easier in comparison, but the output is synthetic rather than true surface bump.




My Oculus Rift is still on pre-order. The clips already look awesome on my laptop screen, but how would you describe the experience of the Oculus Rift?

You're in for a treat! It's hard to put into words. Really hard to describe until you see it, feel it.

It's a breathtaking moment when assets you've been working hard on for so long in a 2D format, you see now in 3D. Palmer Luckey, John Carmack, and the team at Oculus VR will go down in history as true pioneers. Huge thanks to them!

We're talking true 3D stereo VR here - not 3D cinema stereo, but true stereoscopic vision. You sense depth, vibrant colors and good contrast. You can adjust your focus (minus DOF changes) in VR, and you really experience scale like never before.

And now you're populating it with realistic scans of real people!

Seeing scans of people in VR is amazing. It can feel strange at times, as if you're invading a person's space. Your sense of presence affects theirs, and you can't make out if they are alive or dead because they don't move. Kind of like waxworks on steroids.

When you put your point of view in the position of the scanned person and look down, it's even stranger! Then you are that person. For a split second, you kind of sense what they must feel like living in that vessel that was given to them, i.e. being John Malkovich!! This could lead to some very interesting future experiences.

At the moment these are static scans. I'm working on movement, but this will take time. I'm pushing for something past traditional motion capture, which I think is quite dated.


I understand that you put this demo together in Unity3D. How involved is this process, from a general 3D artist's point of view? Is it a steep learning curve?


I was terrified of learning Unity. I put it off for months/years and started with UDK (Unreal Development Kit) instead. UDK was easy to slip into, but I found it had serious limitations. It felt bloated, and I had many problems just making simple shaders work with the lights. For publishing self-illuminated or baked models it was great. One cool feature of UDK is its loading system and ability to load in 8k textures - something Unity struggles with.

But that said, Unity is incredibly easy to learn. I initially learned from the Digital-Tutors site, then I dissected the Tuscany Demo supplied with the Oculus Dev Kit SDK.


How well does Unity work with the Oculus Rift?

Adding Rift integration was a walk in the park thanks to Unity and Oculus VR joining forces. Googling Unity topics is a treat because there are just thousands of help pages on all subjects. If you can think it up, someone else has inevitably already written a script for it. I've been lucky enough to make some good friends quickly in the VR/Unity community who are incredibly talented at writing shaders and scripts. We are able to do things that ILM once had difficulty doing for film with offline rendering in the 90's, like the T-1000 effect. We can simulate something similar to that now, in Unity, in real-time, at 60-120 fps, in stereo, in VR!! ... It's quite mind-blowing.



What I find stunning is how well the environments interact with the characters' shading. Is this baked, or is this image-based lighting in realtime?

This is the key: HDR and IBL. It's easy now thanks to the research that people like Paul Debevec did in the 90's, the work that you do on HDRLabs, and people like Thomas Mansencal (sIBL GUI) and Bob Groothuis (Dutch Skies).

This paves the way for artists to utilize advanced lighting techniques easily. Working with sIBLs is as simple as drag and drop in Unity. Thanks also go to the Marmoset guys for their great work on Skyshop and its Unity integration. This is what inspired me to use Unity after seeing their Skyshop tutorial:



So yes, the lighting in my demo is real-time, non-baked, all interactive. Some colored AO is baked during the studio scanning session, but it's minimal. I'm also working on some new custom SSS shader integration.
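
For readers wondering what an IBL lookup actually involves: at its core it's just indexing a spherical HDR image by direction. A minimal Python sketch (emphatically not Skyshop's or Unity's actual shader code) for sampling an equirectangular map with a normal or reflection vector:

    import numpy as np

    def sample_equirect(env, d):
        """Nearest-neighbor lookup of an equirectangular HDR map env (H x W x 3)
        in world-space direction d (one common lat-long convention, y up).
        Diffuse IBL samples a pre-convolved copy with the surface normal;
        glossy reflections sample a blurred mip level with the reflection vector."""
        x, y, z = np.asarray(d, float) / np.linalg.norm(d)
        u = np.arctan2(x, -z) / (2.0 * np.pi) + 0.5   # longitude -> [0, 1]
        v = np.arccos(np.clip(y, -1.0, 1.0)) / np.pi  # latitude  -> [0, 1]
        h, w, _ = env.shape
        return env[min(int(v * h), h - 1), min(int(u * w), w - 1)]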


So these are custom shaders?


We've (myself, Charles and drash) implemented a custom SSS solution with many different effects, like 2 specular lobe control, IBL (bent normals) calculated reflections, cavity Fresnel falloff, 8x multi-region micro multi-bump (blended with RGB maps, thanks to Steve), 8x multi-region cavity maps, deep red shadow scattering, GGX specular distance distribution, deep skin scatter for ear glow, 1x GI bounce, screen space reflections, colored AO and much more.
We're also working on a hair rendering solution for Unity, using IBL for lighting and strand reflections, 2 specular lobe control, as well as GI bounce and strand AO. This is a lot harder to implement.

I still use Marmoset for real-time testing but find Unity more open creatively because of scripting, sound, animation and interactive executable publishing.
Although I have a feeling UDK4 is going to be something very special!
 

Where do you see the creative potential of the Oculus Rift?


I can just imagine: in 2-3 years, no more monitors, no more keyboard or mouse. Just a set-top box (maybe even cloud-based), a VR or AR headset, surround sound, and haptic gloves. We will have the ability to sculpt our favorite character or model - and actually feel the sculpt as we work - on a virtual mountainside somewhere, at sunrise.


The Oculus Rift will be especially interesting for directors. I run my own small business, so I've been a director (of a company) for a while, and I admire the work of film directors like Ridley Scott, Paul Thomas Anderson and James Cameron, just like most 3D artists do. The trick here is that the term "director" takes on a whole new meaning because of real-time and VR, which allows us to REALLY direct an experience. You get to do things creatively that you could never do with film, and this is where VR, 3D and 4D scans really kick in. You also get to add sound, music, visuals and interactive response in areas you can't with film. You can manipulate time, manipulate a user's sense of being, drive 360° environments and many other things.

VR for creative experiences - not just games, but really immersive experiences - is like the old Wild West. It's a new frontier, ripe for the picking. At some point Film, VFX and Games will merge into a Directed Experience. There is no doubt.



What's the next thing on your personal list?


My direction with VR goes into the adult entertainment industry - an area few dare to venture into, or have the balls to try. Other industry professionals have warned me about this direction, saying you won't get bookings from Disney et al., that it will affect your reputation, etc.! I am well aware of that; my intention isn't to sell out but to follow my own dreams. The adult industry is dying financially, and it needs a new lease on life, a new dimension. VR and scanning in this market can explore a lot of untapped areas and also really benefit many people - give them experiences they may otherwise never have access to, due to physical disabilities, social issues, or sexual orientation. And this is just one small niche of VR. The possibilities are endless.

Well, if the Oculus Rift is half as good as you say, I can see this becoming a big hit. I, for one, now await my pre-order with much suspense.
Thank you very much for this interview, and good luck!


Thanks again Blochi, your book and your site are a constant source of inspiration.

Lee




Hope you liked this interview. Maybe we can do more of these in the future.

Visit the Infinite Realities website to see more good stuff, grab the Rift demo, and don't forget to download the sIBL-of-the-month that Lee generously donated.


Welcome to the sIBL family, Blender!

Raise your glasses, brothers! Blender just joined the exquisite ranks of sIBL-supported programs.


New Blender extension builds a bridge to sIBL_GUI, enabling quick lighting and environment setups.
Screenshot by Jed Frechette.


Thanks to the development effort of Jed Frechette, Blender users can now enjoy the one-click environment lighting setup that the Smart IBL system is famous for. Integration is done thoroughly, by using sIBL_GUI as browser and central library management hub. If you already use sIBL_GUI in conjunction with 3dsMAX or Maya, the workflow with Blender will be familiar:
  • Pick an environment preset
  • Pick Blender as setup template
  • Click the Send to Software button
Photorealistic lighting (and background integration) couldn't be easier.

Links:
Download and installation instructions on wiki.blender.org
Say thanks or report bugs in our dev forum thread

To celebrate this historic event, enjoy this new free sIBL set of an iron bridge in full 16K glory:




KaleidoCam: the ultimate DSLR upgrade gadget

The highlight of this year's SIGGRAPH was the KaleidoCam, which introduces an entirely new idea to the world of photography. It's an add-on that goes between the lens and the camera, and it upgrades the camera's capabilities in ways never thought possible. Suddenly you have true single-shot multi-exposure HDR capture, multi-spectral imaging, and variable polarization; you can even capture a light field with a pixel resolution that far exceeds what you get out of a Lytro camera - and all that with your own DSLR and lenses!



The working principle is ingenious in its simplicity: the gadget contains a diffuse screen that intercepts the light rays coming from the lens at the very spot where the sensor would be. So instead of a sensor, there is now a rear-projection screen. A kaleidoscopic mirror arrangement then sends multiple copies of the image to the sensor (9 images in total, arranged in a 3x3 grid). The trick is to insert different filters into the optical path of each of these images. For HDR, you would use varying ND filters - and voilà, each snap of the shutter gives you 9 different exposures. Awesome!
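
In other words, a single raw frame already contains a full bracketed sequence, and the merge is an ordinary weighted HDR combine over the nine tiles. A hedged numpy sketch - assuming a registered 3x3 grid, a single channel, values normalized to [0, 1], and known ND densities per tile (the paper's actual pipeline also handles geometric and photometric calibration):

    import numpy as np

    def merge_kaleido(img, nd_stops):
        """Toy HDR merge of a 3x3 multi-exposure frame. img is the full
        single-channel sensor image in [0, 1]; nd_stops lists each tile's
        ND density in stops, row-major (hypothetical calibration values)."""
        h, w = img.shape[0] // 3, img.shape[1] // 3
        num = np.zeros((h, w), np.float64)
        den = np.zeros((h, w), np.float64)
        for i, stops in enumerate(nd_stops):
            tile = img[(i // 3) * h:(i // 3 + 1) * h,
                       (i % 3) * w:(i % 3 + 1) * w].astype(np.float64)
            t = 2.0 ** -stops                 # effective exposure factor of this tile
            weight = np.clip(1.0 - np.abs(tile - 0.5) * 2.0, 1e-3, 1.0)  # hat weighting
            num += weight * tile / t          # rescale each tile to a common radiance scale
            den += weight
        return num / den

    # e.g. hdr = merge_kaleido(frame, nd_stops=[0, 1, 2, 3, 4, 5, 6, 7, 8])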



I really hope to see this come to market soon - if the research group went on Kickstarter, they would already have my money! Read the detailed technical paper here, and make sure to scroll down and check out the data sets (original images captured with the prototype).


Get the HDRI-Handbook 2.0 in a bundle!


My friends at Unified Color have just released version 3.0 of their professional HDR software line. That includes the stand-alone HDR Expose 3 and the Photoshop plugin 32Float v3 (which is what I use). As a cherry on top, you get my HDRI-Handbook 2.0 bundled (the e-book version, that is).

This is an excellent update: both programs run much faster, and the tonemapping algorithm is much improved. It now uses an adaptive tone curve to tune the overall contrast, similar to Picturenaut's Adaptive Logarithmic method.



HDR Expose 3 also has a completely revamped HDR merge mode with a nice ghost removal tool (where you can manually pick the hero image for marked areas) and rock-solid image alignment (where you can manually set control points and let the frames warp into place accordingly). And of course, there are the trademark features of this product line that I love most: full 32-bit color editing, white balance, and veiling glare removal, all while keeping the overbright areas of the HDR image intact and fully valid for further post-processing. So even if this won't become your primary HDR tool, it is a mighty powerful addition to your bag of tricks.



So there you go - grab a pro HDR software and get my book for free!
www.unifiedcolor.com/hdri-handbook-giveaway (limited to 500, first come first served)

My publisher actually compiled a special eBook edition for this; it's a PDF with the correct layout and decent photo resolution. However, keep in mind that the bonus DVD content (which is needed to follow the tutorials) is not included in the eBook. For the full HDRI-Handbook 2.0 experience I still recommend getting the paperback edition (currently only $35 on Amazon).
