Lots of free HDRIs

Frequent visitors might have noticed that there was no free HDR in November. Ever since I started the monthly giveaway in June 2007, this was the first month I forgot. Well, just to keep the chain of free HDRIs unbroken, let me make it up to you: December brings 2 new sIBLs-of-the-month!

The first one is a 16K panorama of a landmark church in my hometown Halle, Germany. The church sits on a hill surrounded by a green park, which makes it a very popular hangout spot. Zoom in and pan around, and you will discover a champagne bottle, love graffiti, and bikes.

The second HDR is exactly 4 years old, but unreleased. It shows the Christmas tree in the fancy lobby of Tokyo’s Park Hotel. That’s a super high-class hotel in the heart of the futuristic Shiodome district. The hotel starts at the 25th level and spans 9 levels upwards, and the lobby is an atrium that fills the entire hollowed-out tip of the skyscraper.

This hotel also served as a location in the movie "Lost in Translation". I could only afford two nights there, and of course I switched rooms halfway through so I could shoot the cityscape in all directions. Just check out the vista:

3DWorld Advent Special: 4 Dutch Skies

And then there’s Bob Groothuis again, giving away 4 exclusive Dutch Skies!


Bublcam on Kickstarter: Panoramic HDR Video Cam

I have become addicted to Kickstarter. All the new VR gadgets show up there first and get mad funding. So far I backed the STEM tracker, the castAR glasses, and most recently the Bublcam.

Sure, you can get a similar result with one of the many GoPro pano rigs. However, it won't be any cheaper, because you need 5 GoPro cameras. And you'll get into all sorts of trouble synchronizing them, collecting the footage, and hiding the seams in the stitch. That's why an integrated solution like the Bublcam is much more reliable.

And while there have been several other ball-type all-in-one pano cameras proposed, this one seems to make the most sense:
  • Optimum orientation of the lenses.
  • Small form factor for minimizing perspective shift between sectors.
  • Four good-quality wide-angle lenses rather than a multitude of cheaper ones.


The Bublcam captures a single 4-quadrant multiplex image: the raw .MP4 that comes off the camera contains all four lens views tiled into one frame, and the Bubl software turns it into a full spherical experience. Such source footage is also extremely easy to stitch with regular tools like Fusion, Nuke, or After Effects.
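A hedged illustration of why this layout is so stitch-friendly: demuxing a 2x2-multiplexed frame is just slicing. The exact quadrant arrangement in the Bublcam's raw file is an assumption here, and `split_quadrants` is a hypothetical helper, not Bubl's code.

```python
def split_quadrants(frame):
    """Cut a 2x2-multiplexed frame (a list of pixel rows) into four views.

    The quadrant layout is assumed; the point is how trivial this demux
    step is compared to syncing and collecting footage from 5 cameras.
    """
    h, w = len(frame) // 2, len(frame[0]) // 2
    return [
        [row[:w] for row in frame[:h]],   # top-left lens
        [row[w:] for row in frame[:h]],   # top-right lens
        [row[:w] for row in frame[h:]],   # bottom-left lens
        [row[w:] for row in frame[h:]],   # bottom-right lens
    ]

# Toy 4x4 "frame" with one marker value per quadrant
frame = [[1, 1, 2, 2],
         [1, 1, 2, 2],
         [3, 3, 4, 4],
         [3, 3, 4, 4]]
print(split_quadrants(frame))
```

In a real pipeline the same slicing would run on decoded video frames before handing the four views to the stitcher.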

But the deciding factor for this blog is that it will also have an HDR mode! This was made possible when the project reached a stretch goal last week.

Now, if everyone here would help give it an extra push, we can have a time-lapse mode and maybe even double the resolution. There are only 5 days left. So, please, everyone, go ahead and reserve your very own Bublcam on Kickstarter!


Shortcuts: PMatix 5, HDRLightStudio, HDR video, Dual ISO

Here's a quick summary of various new HDR tools and toys that popped up recently.


Photomatix Pro 5.0 Beta

Our friends at HDRSoft have been working hard on the next generation of Photomatix.
  • All-new tonemapping algorithm "Contrast Optimizer" promises very natural results
  • Automatic deghosting now offers a preview and hero-image selection (very nice)
  • Realtime preview during tonemapping
  • Highlight recovery from RAW files
  • and much more
It's a really great update, I highly recommend checking it out!

Download Photomatix 5 Beta

HDR Light Studio integrates live with MODO and Bunkspeed

This is very close to my heart, since I'm an avid MODO user myself.
HDR Light Studio is a unique app for creating/editing spherical HDR panos for lighting purposes (see the quick introduction in my HDRI Handbook 2.0, pages 631-632). It's a total niche field, and may at first sound odd or unnecessary, but the way it is executed makes it completely amazing. HDR Light Studio offers:
  • Spotlights and soft boxes with real-world settings for power (Watts) and color (Kelvin)
  • Growing library of HDR image elements, real studio light sources, ready to drop in
  • Select and position light sources by clicking on the highlights in your render
  • Live connection shows the result immediately in your 3D app's preview renderer

By putting all this lighting power at your fingertips, HDR Light Studio has quickly become the standard for industrial designers, especially in the automotive field.
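A side note on those real-world color settings: a common way to turn a Kelvin value into an RGB tint is a blackbody curve fit. The sketch below uses Tanner Helland's published approximation purely for illustration; it is not HDR Light Studio's actual conversion.

```python
import math

def kelvin_to_rgb(kelvin):
    """Approximate the RGB tint of a blackbody at a given color temperature,
    using Tanner Helland's curve fit (reasonable between ~1000 K and 40000 K).
    Illustration only, not HDR Light Studio's internal code."""
    t = min(max(kelvin, 1000), 40000) / 100.0
    if t <= 66:
        r = 255.0
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        r = 329.698727446 * (t - 60) ** -0.1332047592
        g = 288.1221695283 * (t - 60) ** -0.0755148492
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    clamp = lambda x: int(round(min(max(x, 0.0), 255.0)))
    return clamp(r), clamp(g), clamp(b)

print(kelvin_to_rgb(6600))  # neutral white: (255, 255, 255)
print(kelvin_to_rgb(3200))  # warm tungsten: red-heavy, blue-weak
```

Multiplying a light's intensity by such a tint is the usual way to get a plausible warm/cool look from a single Kelvin slider.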

Download the HDR Light Studio demo (registration required)
or see lots of eye candy on their blog.

New HDR Video samples for download

If chapter 7.4.1 in my HDRI Handbook 2.0 has made you curious about HDR video (pages 596-600), then you will be delighted to hear that our friend Jonas Unger from Linköping University has updated his HDRv repository.

You can download a variety of video clips, all captured with the famous Spheron HDRv camera prototype. Files are provided as EXR sequences, ready for your own experiments in Nuke, Fusion, or After Effects. There are even some clips with mirror-ball footage to get your feet wet with HDR video-based lighting (example here).
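Once you have such an EXR frame loaded as linear float values (via OpenEXR bindings or a compositing app), even a one-line global tonemap makes it viewable. Here is a minimal sketch of the classic Reinhard operator, just to show how simple first experiments can be:

```python
def reinhard(x, exposure=1.0):
    """Global Reinhard tonemap: maps linear radiance [0, inf) into [0, 1).

    EXR pixel values are linear floats and can exceed 1.0 by far;
    this squashes them into display range while preserving order."""
    v = x * exposure
    return v / (1.0 + v)

# A shadow, a mid-tone, and a bright light source:
for radiance in (0.18, 1.0, 50.0):
    print(round(reinhard(radiance), 3))  # 0.153, 0.5, 0.98
```

Apply it per pixel per frame and the whole HDR clip becomes previewable; serious experiments would of course use the grading tools in Nuke or Fusion instead.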

Go grab the goodies from the LiU HDRv Repository

Magic Lantern gets a Dual-ISO mode

The Magic Lantern firmware enhancement for Canon DSLRs now offers the ultimate HDR mode!

Dual ISO means half the sensor is read out at ISO 100 and the other half at ISO 1600 (or any other combination, for that matter). So you get two exposures at exactly the same time! Time to say good-bye to ghost removal: moving subjects are not an issue with this camera mod, as all the amazing examples clearly prove.
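The recovery step can be sketched roughly like this: scale the high-ISO rows down by the gain ratio, and fall back to the low-ISO rows wherever the amplified samples clipped. This is a naive illustration only; Magic Lantern's actual cr2hdr converter also interpolates the missing rows and works on raw Bayer data.

```python
def merge_dual_iso(rows, gain=16.0, clip=0.95):
    """Naive merge of a dual-ISO frame where even rows were read at ISO 100
    and odd rows at ISO 1600 (gain = 1600 / 100 = 16).

    Values are normalized sensor readings in [0, 1]. High-ISO samples give
    clean shadows; low-ISO samples preserve the highlights they clipped."""
    merged = []
    for low, high in zip(rows[0::2], rows[1::2]):
        out = []
        for lo, hi in zip(low, high):
            if hi < clip:
                out.append(hi / gain)  # unclipped: use the low-noise shadow sample
            else:
                out.append(lo)         # clipped highlight: use the ISO 100 sample
        merged.append(out)
    return merged

raw = [[0.10, 0.90],   # even row, ISO 100
       [0.80, 0.99]]   # odd row,  ISO 1600 (second pixel clipped)
print(merge_dual_iso(raw))  # [[0.05, 0.9]]
```

The shadow pixel comes from the amplified row (0.80 / 16 = 0.05, with 16x less read noise in linear terms), while the blown highlight falls back to the ISO 100 reading.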

Download the Dual-ISO module


Infinite Reality's amazing Oculus Rift demo (+Interview)

Lee Perry-Smith is no stranger here on HDRLabs. Under his company label Infinite Realities he previously donated a 3D head scan (of his own head) to the graphics research community, and he's been a fan of our Smart IBL environment/lighting system from day one.

But now Lee has stepped it up a notch.
He put together a realtime demo for the famous Oculus Rift - the utterly futuristic VR goggles. The demo contains some nudity, that is, incredibly realistic 3D scans of nude women (may be NSFW if you don't work in the art field). It also contains a variety of my panoramas as environments, which I'm rather thrilled about. Check it out:

Download the demo and see for yourself. It's impressive even if you don't have a Rift.

I found it so impressive that I was burning to ask Lee a few questions about how he did this.
Buckle up for a new literary format on this blog: an interview!

Interview with Lee Perry-Smith

Thanks for this opportunity Blochi! I'm a huge fan of your books and website :)

[ Rollover for scanned 3D geometry ]

The detail in your figure scans is astonishing, in terms of laser-accurate geometry and brilliant textures. What's your setup?

At the moment I use 115 DSLRs for the full-body capture system and 48 DSLRs for the separate face (FACS) capture system. Mostly Canons, with about 8 Nikon D800s in there for full-body reference shots. Back in 2011 it was the first system of its kind, worldwide.

Wow, that's a mighty impressive setup. Just collecting the images from all these cameras sounds like a lot of work. Are they all wired together?

Yes, they are all wired together to fire in sync. The system is designed to be 99.9% reliable: there are never any black or missing images, which is integral for reconstruction. There are miles and miles of cable!

The cameras are triggered using custom opto-isolated trigger hubs (designed by Merry Hodgkinson) and Pocketwizard III remotes, with data streamed over a networked system, running Breeze Software and Agisoft Photoscan to process the point cloud data.

[ Panorama of Lee's VR Photostudio ]

Are you doing any special tricks with polarizers or flashes?

At the moment no, but I'm still running experiments to figure out the best method here. Linear polarizing film seems the best way to go; the end goal is better surface detail.

I've also run many multi-lighting tests using flashes and Photoshop, similar to the Light Stage method, to acquire world-space normals. Disney's mesoscopic emboss method is much quicker and far easier in comparison, but the output is synthetic rather than true surface bump.


My Oculus Rift is still on pre-order. The clips already look awesome on my laptop screen, but how would you describe the experience of the Oculus Rift?

You're in for a treat! It's hard to put into words. Really hard to describe until you see it, feel it.

It's a breathtaking moment when you finally see assets in 3D that you've been working hard on for so long in a 2D format. Palmer Luckey, John Carmack, and the team at Oculus VR will go down in history as true pioneers. Huge thanks to them!

We're talking true 3D stereo VR here. Not 3D-cinema stereo, but true stereoscopic vision. You sense depth, vibrant colors, and good contrast. You can adjust your focus (minus DOF changes) in VR, and you really experience scale like never before.

And now you're populating it with realistic scans of real people!

Seeing scans of people in VR is amazing. It can feel strange at times, as if you're invading a person's space. Your sense of presence affects theirs, and you can't make out if they are alive or dead because they don't move. Kind of like waxworks on steroids.

When you put your point of view in the position of the scanned person and look down, it's even stranger! Then you are that person. For a split second, you kind of sense what they must feel like living in that vessel that was given to them to use, i.e. being John Malkovich!! This could lead to some very interesting future experiences.

At the moment these are static scans, I'm working on movement but this will take time. I'm pushing for something past traditional motion capture, which I think is quite dated.

I understand that you put this demo together in Unity3D. How involved is this process, from a general 3D artist's point-of-view? Is it a steep learning curve?

I was terrified of learning Unity. I put it off for months/years and started with UDK (Unreal Development Kit) instead. UDK was easy to slip into, but I found it had serious limitations. It felt bloated, and I had many problems just making simple shaders work with the lights. For publishing self-illuminated or baked models it was great. One cool feature of UDK is its loading system and its ability to load in 8K textures - something Unity struggles with.

That said, Unity is incredibly easy to learn. I initially learned from the Digital-Tutors site, then I dissected the Tuscany demo supplied with the Oculus Dev Kit SDK.

How well does Unity work with the Oculus Rift?

Adding Rift integration was a walk in the park, thanks to Unity and Oculus VR joining forces. Googling Unity topics is a treat because there are just thousands of help pages on all subjects. If you can think it up, someone else has inevitably already written a script for it. I've been lucky enough to quickly make some good friends in the VR/Unity community who are incredibly talented at writing shaders and scripts. We are able to do things that ILM once had difficulty doing for film with offline rendering in the 90's, like the T-1000 effect. We can simulate something similar to that now, in Unity, in real-time, at 60-120 fps, in stereo, in VR!! ... It's quite mind-blowing.


What I find stunning is how well the environments interact with the characters' shading. Is this baked, or is this image-based lighting in realtime?

This is the key: HDR and IBL. It's easy now thanks to the research that people like Paul Debevec did in the 90's, the work that you do on HDRLabs, and people like Thomas Mansencal (sIBL GUI) and Bob Groothuis (Dutch Skies).

This paves the way for artists to utilize advanced lighting techniques easily. Working with sIBLs is as simple as drag and drop in Unity. Thanks also go to the great work the Marmoset guys do with Skyshop and its Unity integration. Seeing their Skyshop tutorial is what inspired me to use Unity:

So yes, the lighting in my demo is real-time, non-baked, all interactive. Some colored AO is baked during the studio scanning session but it's minimal. I'm also working on some new custom SSS shader integration.

So these are custom shaders?

We've (myself, Charles and drash) implemented a custom SSS solution with many different effects, like 2 specular lobe control, IBL (bent normals) calculated reflections, cavity Fresnel falloff, 8x multi-region micro multi-bump (blended with RGB maps, thanks to Steve), 8x multi-region cavity maps, deep red shadow scattering, GGX specular distance distribution, deep skin scatter for ear glow, 1x GI bounce, screen space reflections, colored AO and much more.
We're also working on a hair rendering solution for Unity, using IBL for lighting and strand reflections, 2 specular lobe control, as well as GI bounce and strand AO. This is a lot harder to implement.

I still use Marmoset for real-time testing but find Unity more open creatively because of scripting, sound, animation and interactive executable publishing.
Although I have a feeling UDK4 is going to be something very special!

Where do you see the creative potential of the Oculus Rift?

I can just imagine: in 2-3 years, no more monitors, no more keyboard or mouse. Just a set-top box (maybe even cloud-based), a VR or AR headset, surround sound, and haptic gloves. We will have the ability to sculpt our favorite character or model, and actually feel the sculpt as we work, on a virtual hilltop mountainside somewhere, at sunrise.

The Oculus Rift will be especially interesting for directors. I run my own small business, so I've been a director (of a company) for a while, and I admire the work of film directors like Ridley Scott, Paul Thomas Anderson, and James Cameron, just like most 3D artists do. The trick here is that the term "director" takes on a whole new meaning because of real-time and VR, which allow us to REALLY direct an experience. You get to do things creatively you could never do with film, and this is where VR, 3D, and 4D scans really kick in. You also get to add sound, music, visual and response interaction in areas you can't with film. You can manipulate time, manipulate a user's sense of being, drive 360° environments, and many other things.

VR for creative experiences - not just games, but really immersive experiences - is like the old Wild West. It's a new frontier, ripe for the picking. At some point film, VFX, and games will merge into a directed experience. There is no doubt.

What's the next thing on your personal list?

My direction with VR goes into the adult entertainment industry, an area few dare to venture into, or have the balls to try. Other industry professionals have warned me about this direction, saying I won't get bookings from Disney et al., that it will affect my reputation, etc.! I am well aware of that; my intention isn't to sell out but to follow my own dreams. The adult industry is dying financially, and it needs a new lease on life, a new dimension. VR and scanning in this market can explore a lot of untapped areas and really benefit many people: give them experiences they may otherwise never have access to, due to physical disabilities, social issues, or sexual orientation. And this is just one small niche of VR. The possibilities are endless.

Well, if the Oculus Rift is half as good as you say, I can see this becoming a big hit. I, for one, now await my pre-order with much suspense.
Thank you very much for this interview, and good luck!

Thanks again Blochi, your book and your site are a constant source of inspiration.



Hope you liked this interview. Maybe we can do more of those in the future.

Visit the Infinite Realities website to see more good stuff, grab the Rift demo, and don't forget to download the sIBL-of-the-month that Lee generously donated.


Welcome to the sIBL family, Blender!

Raise your glasses, brothers! Blender just joined the exquisite ranks of sIBL-supported programs.


New Blender extension builds a bridge to sIBL_GUI for quick lighting and environment setups.
Screenshot by Jed Frechette.

Thanks to the development efforts of Jed Frechette, Blender users can now enjoy the one-click environment lighting setup that the Smart IBL system is famous for. Integration is done thoroughly, using sIBL_GUI as the browser and central library-management hub. If you already use sIBL_GUI in conjunction with 3dsMAX or Maya, the workflow with Blender will be familiar:
  • Pick an environment preset
  • Pick Blender as setup template
  • Click the Send to Software button
Photorealistic lighting (and background integration) couldn't be easier.
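Under the hood, each sIBL set describes itself in a small INI-style .ibl file that sIBL_GUI and the setup templates read. The excerpt below is hypothetical (section and key names should be checked against a real set), but it shows how simple the format is to parse:

```python
import configparser

# Hypothetical excerpt of an sIBL description file (INI-style text).
# Real .ibl files list the background, reflection, and environment images
# of a set; names and keys here are illustrative only.
ibl_text = """
[Header]
Name = Iron Bridge

[Background]
BGfile = IronBridge_Bg.jpg

[Enviroment]
EVfile = IronBridge_Env.hdr
EVgamma = 1.0
"""

cfg = configparser.ConfigParser()
cfg.read_string(ibl_text)
print(cfg["Header"]["Name"])        # Iron Bridge
print(cfg["Enviroment"]["EVfile"])  # IronBridge_Env.hdr
```

This is exactly the kind of metadata a setup template consumes to wire the lighting and background images into the host app in one click.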

Download and installation instructions on wiki.blender.org
Say thanks or report bugs in our dev forum thread

To celebrate this historic event, enjoy this new free sIBL set of an iron bridge in full 16K glory:

