Tim's Vermeer and Bob's HDR pano tutorial

If you haven't seen Tim's Vermeer yet, go see it!
It's about using technology to make great art. It's about persistence and plowing through an insane project one day at a time. Things every photographer and 3D artist can relate to.

My friend Bob Groothuis, maker of the famous Dutch Skies 360 collections, actually played a vital part in this movie and was at the movie shoot in the Vermeer museum. He talks about this close encounter here, and he shot this panorama with Tim and all the cast and crew.

Bob's been very busy.

He just published Vol. 5 of his Dutch Skies collection, with ... wonderful new HDRIs containing all the golden light and fluffy clouds the Dutch painters - including Vermeer - were famous for. All his Dutch Skies Vols. 1, 2, 3, 4, and 5 taken together are 232 skies. And the strange thing is, he doesn't seem to get tired of it. The more he shoots, the more accurate and higher quality his HDRIs get.

We had a little chat about techniques and how he has streamlined his approach, and the result is that Bob sponsored this month's free sIBL-set and wrote this sweet tutorial for us. There's a lot of good stuff about HDR color management, HDR stitching, and just the general workflow of producing massive amounts of HDR panos.

I still recommend reading my HDRI Handbook 2.0 (and working through the tutorials in it), but if all you want is a quick online tutorial, then listen to the things Bob has to say.

[ Guest Article ]

Bob Groothuis' big HDR panorama tutorial


In recent years I shot an enormous amount of skies here in Scheveningen (Netherlands).

Previously I made Dutch Skies 360° Volumes 1, 2, 3 & 4. Now I have finished the series with the last one: Dutch Skies 360° Volume 5. It's a bit later than planned (I wanted to launch it 1.5 years ago), but I had other projects that needed attention first.

I have an external storage place, and during a revision of Dutch Skies 360° Volume 2 I found some panos that I had completely forgotten about.

You might wonder if this is the last one? Yes, because I'm already shooting new stuff with the Nikon D800, so the resolution will be a bit bigger than in Vols. 1 through 5. Instead of 11k skies I now have 19k skies. So I'm starting a new product line simply called Dutch Skies 360° XL (eXtra Large). The first XL skies will be available soon (probably together with a brand new website & online shop). I'm not going to create volumes anymore; file sizes are getting way too big, so the XL skies can be bought separately.

Enough of this shameless self-promo. Let me show you some production notes and tips from the creation of the HDRLabs free sIBL set of the month February 2014: DS360_119a. Please note that there could be better methods; I only show how I did it.

Postproduction workflow notes

The whole post production is not that difficult, but a crucial thing is being consistent. A helpful tool is a checklist that shows exactly where you are in the workflow and assures you keep the full dynamic range and quality. A mistake is easily made, so such a checklist helps a lot.

Here is a reproduction of a more extended checklist. Old school, on paper.


Now let me explain some of the most important steps in more detail:

- Adobe Lightroom 5 cleaning / retouch / export
- HDRI generation with Photomatix
- Rendering the HDRI panorama with PTGUI Pro
- Bottom retouch trick
- Advanced HDRI color profile conversion method with LittleCMS Color Translator

Important note:

A suggestion that could help you: to prevent going nuts in case your HD crashes, your office burns down, or you get a visit from burglars, be sure to back up a lot. Here I do a daily backup to an external HD. I also upload finished files to an FTP server (I use Filefactory). And then I also back up to Blu-ray. Those Blu-ray backups are taken to a storage place outside the office. It almost sounds like paranoia but believe me, it's not. You will feel very horrible when you have to redo your work from scratch. I worked for about 2 months almost full-time on Volume 5, and it would be a nightmare to do it all over (or even partially).

Preparation in Adobe Lightroom 5

The source files from DS360_119a were shot in JPG and have a Nikon aRGB color profile (more about that later). To follow this tutorial you can download the original files here. Compare with the completed DutchSkies360_119a.

Lately I've been using Adobe Lightroom 5 for cleaning the source files: removing birds and spots on the lenses. It is inexpensive, and cleaning goes pretty quickly.

I'm not going to explain the whole process, only mention the most important steps. If you want a more in-depth tutorial, you should check out Christian's HDRI-Handbook 2.0.

1) Use white balance as shot (with Nikon the safest WB setting when shooting is Cloudy, especially when you are in a hurry).

2) Tone Curve MUST be disabled.

3) Turn on "remove Chromatic Aberration" - just default settings.

4) Use Camera Calibration - Process 2012 - Profile - Embedded.


After that I cleaned out all the birds & spots with the Clone tool (this can be a bit time consuming when you need to clean 63 images ;) ).


When that's done the images can be exported. Because the source files are Nikon aRGB, we can safely export as 8-bit Adobe RGB TIFF files (Nikon aRGB and Adobe RGB don't differ that much).


Be sure, after cleaning all 63 images, to export the whole catalog (File / Export as Catalog…). That assures all settings (like the cloning) are saved, so you can open that catalog again later to do some refinements if needed.

HDR merging in Photomatix

We now move the 8-bit aRGB TIFF files exported from Lightroom into separate folders, so we end up with 7 folders, each containing all the exposures of one angle (the pano was shot as 6 horizontal shots and one top shot).
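If you process panos in bulk, this folder shuffling can be scripted. Here is a small hypothetical Python sketch (not part of Bob's actual workflow) that assumes the exported TIFFs sort in shooting order, i.e. all brackets of angle 1 first, then angle 2, and so on:

```python
import shutil
from pathlib import Path

def sort_brackets(src_dir, dst_dir, angles=7, brackets=9):
    """Distribute sequentially numbered exposure files into one
    folder per shooting angle (angle_1 ... angle_7).

    Assumes the files sort in shooting order: all brackets of
    angle 1 first, then angle 2, and so on (7 x 9 = 63 files).
    """
    files = sorted(Path(src_dir).glob("*.tif"))
    assert len(files) == angles * brackets, "unexpected file count"
    for i, f in enumerate(files):
        angle = i // brackets + 1              # which of the 7 angles
        folder = Path(dst_dir) / f"angle_{angle}"
        folder.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, folder / f.name)
```

Running `sort_brackets("lightroom_export", "photomatix_input")` would then leave you with the 7 folders ready for Photomatix batch processing.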

Use Batch Processing to create the HDRIs in Photomatix using the following settings:


Try to shoot your HDR panorama source files at the lowest ISO possible. That reduces the noise in the images. In this case I used a mild denoise on the lower exposures. I also used normal deghosting. Always make some tests to check that everything is OK.

At the same time as Photomatix generates the HDRIs, it tonemaps the images with the default Tone Compressor method. Those files can be used to make the stitch template in PTGUI Pro (more on this in the next step).

Extra deghosting note: be careful with the High deghosting setting in Photomatix. In areas with fast-moving objects, there can be severe noise in the HDR after merging. Always check all your merged HDRs before stitching. You can retouch the bad spots in Photoshop by mixing in a normal deghosted version.

HDR Panorama stitching in PTGUI (Pro)

In the previous step I mentioned enabling the tonemapped (Tone Compressor) output together with the HDRI generation.

It goes much quicker when you do the stitch in PTGUI (Pro) with JPG files instead of HDRs. Once you have stitched the JPG angles successfully, save that as a PTGUI template. Now create a new document (stitch) in PTGUI and load the HDR files created with Photomatix. Then apply the JPG template. Don't forget to enable HDR Stitching in the Exposure/HDR tab.

Now save the stitch as DS360_119a_PTGUI_11k_aRGB.pts and render the HDR.

NOTE: Often you have to use PTGui's masking feature on some parts of the sky, or render the image in layers so you can do some HDR editing in Photoshop. Especially skies with lots of moving clouds can be time consuming. Do not give up easily there. Also be careful with retouching the outer left or outer right edge of the image. This is the fragile panorama seam; you can better retouch/correct that part in the next step.

Bottom retouch

For the bottom retouch I have a nice and simple trick to keep maximum quality, and to prevent HDRI quality loss when converting the PTGUI output (equirectangular) into cube faces. Some conversion applications can lose quality during the conversion.

Here is a trick that prevents loss:

1) Use PTGUI (Pro) to convert the Equirectangular into Cube faces (6 separate files - .hdr file format)

2) Make sure you are in aRGB color mode when you retouch the bottom part in your favorite image retouch application (I use Photoshop here). After retouching, save the bottom part by overwriting the original bottom part.

3) Now convert the cube faces back into an equirectangular. I use the tool Pano2VR for this.

4) Check that the image sizes are the same (the original PTGUI Pro output and the new Pano2VR output need to have the same dimensions for this step; sometimes the Pano2VR output is a bit smaller).

5) Now open both panoramas in Photoshop (or any other image manipulation program) and paste them in one document. Use a layer mask to paint in the bottom part.

With this method, if there is any quality loss, it's only in the bottom part!

6) Save the comp first as DS360_119a_PTGUI_11k_aRGB.exr (or .hdr) and next as a DS360_119a_PTGUI_11k_aRGB.tiff file - 32 bit float with a Linear Adobe RGB profile.

NOTE: .hdr cannot store profile information, so you may be better off using the .exr format. In case you use .hdr and want to open it later, just assign the aRGB profile.
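The layer-mask composite in step 5 boils down to simple per-pixel math. As an illustration only (Bob does this in Photoshop), here is a hypothetical numpy sketch that assumes both panoramas are loaded as float arrays of identical size:

```python
import numpy as np

def blend_bottom(original, retouched, mask):
    """Composite the retouched pano over the original via a mask.

    original, retouched: float32 HDR images, shape (H, W, 3)
    mask: float values in 0..1, shape (H, W); 1 = take retouched
    (e.g. a soft-edged blob covering the nadir area).
    """
    m = mask[..., np.newaxis]          # broadcast the mask over RGB
    return retouched * m + original * (1.0 - m)
```

Because the mask is zero everywhere except the bottom area, this reproduces the "quality loss only in the bottom part" property of the trick: all untouched pixels come straight from the original PTGUI render.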

LittleCMS Color Translator

For years Marti Maria has been known for his excellent work on the free color engine LittleCMS. Recently he made a nifty application called Color Translator. It's the definitive tool for converting, assigning, and embedding ICC profiles in TIFF, JPEG, and PNG images. A must-have for anyone working in any type of media production (print, website creation, 3D, etc.).

Lately, together with my good friend Gerardo Estrada, I successfully tested the Color Translator application in an HDRI workflow. More info below.

So we now have from the previous steps:

a) An 11k .hdr based on aRGB primaries that can be used by users who want to take advantage of a wider color range.

For creating the sIBL set it's better to convert this aRGB-based panorama into an sRGB version. So we need:

b) An 11k .hdr based on sRGB primaries that is used for generating the sIBL HDR set and for regular usage in sRGB range.

For converting the profiles we need to install the color profiles for Color Translator.

The needed profiles for this work can be downloaded here.

You need to install these profiles in your color profile folder. For a very good manual on how to do that on your platform, have a look here.

When opening Color Translator, you will see a black folder icon on the right in the Input & Output tabs. Just click on it and select your color profiles folder as mentioned above.


Now we are going to use the Translator to convert to Linear sRGB (D65):

Default input tab

Be sure you have the default RGB space set to the Linear Adobe RGB (D65) profile we just installed.


Output tab

The destination space must be set to the Linear sRGB IEC61966-2.1 (D65) profile we just installed.


Tiff tab

In this specific case we keep the Pixel Depth set to default “Keep Original Depth”


The output is now a linear sRGB (D65) FP 32-bit TIFF file that can be saved as a .hdr or .exr with Photoshop. Note that .hdr files cannot store a profile, but the image is still sRGB (D65) based. So when opening it in Photoshop, you simply assign the sRGB profile (Photoshop automatically assigns the correct linear variant).

The sRGB version panorama can be used for generating your own sIBL sets + tonemapping.
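For the curious: on linear floating-point data, a primary conversion like this amounts to a single 3×3 matrix multiply through XYZ, with no gamma involved. The sketch below uses the standard published D65 matrices for Adobe RGB (1998) and sRGB; it only illustrates the math, while a dedicated tool like Color Translator additionally handles ICC profiles, rendering intents, and file I/O properly:

```python
import numpy as np

# Linear Adobe RGB (1998) -> XYZ and XYZ -> linear sRGB, both D65
# (standard reference matrices; an ICC engine may differ slightly)
ADOBE_TO_XYZ = np.array([
    [0.5767309, 0.1855540, 0.1881852],
    [0.2973769, 0.6273491, 0.0752741],
    [0.0270343, 0.0706872, 0.9911085],
])
XYZ_TO_SRGB = np.array([
    [ 3.2404542, -1.5371385, -0.4985314],
    [-0.9692660,  1.8760108,  0.0415560],
    [ 0.0556434, -0.2040259,  1.0572252],
])
ADOBE_TO_SRGB = XYZ_TO_SRGB @ ADOBE_TO_XYZ

def adobe_linear_to_srgb_linear(img):
    """Convert a linear Adobe RGB HDR image (H, W, 3) to linear sRGB.

    No gamma curve is applied because both spaces are linear here.
    Colors outside the smaller sRGB gamut come out negative and may
    need clipping, depending on the target application.
    """
    return img @ ADOBE_TO_SRGB.T
```

A quick sanity check of the matrices: pure white (1, 1, 1) in Adobe RGB maps back to white in sRGB, since both spaces share the D65 white point.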

Extra note: This workflow used Photomatix, Photoshop & Color Translator 1. PTGUI Pro is also mentioned, but for the color conversion specifically we just need Photoshop and Color Translator. Soon I will publish a new workflow using the latest Color Translator 2. We still need to test it, but version 2 has OpenEXR support, so most likely we can then skip the Photoshop part and do the whole conversion with Color Translator 2! The process described above is otherwise the same in version 2.

75% Discount!

I proudly announce a collaboration with LittleCMS:
You can order LittleCMS Color Translator at a 75% discount! Normally LittleCMS Color Translator costs 19.95 euro (about $27.44); now you can get it for only 5 euro (about $6.90)!

More info and ordering here on my blog.


Many thanks to Marti Maria ( LittleCMS ) for his great discount offer, and especially to Gerardo Estrada, who helped with the workflow research! Also thanks to Christian Bloch and HDRLabs for making this publication possible.

Legal notes:

All other trademarks and trade names mentioned here are the property of their respective holders.

Bob Groothuis

[ END of guest article ]

Post remarks

Thanks Bob, that was a really great article.
I'd like to remind all readers to download Bob's free sIBL-of-the-month. The original files (for this tutorial) can be downloaded here.
And if you want some more tutorials on shooting and stitching HDR panoramas you should get my HDRI Handbook 2.0. It has extra in-depth information and step-by-step instructions for you. Just check out the interactive table of content or read the stellar reviews on Amazon!


Lots of free HDRIs

Frequent visitors might have noticed that there was no free HDR in November. Ever since I started the monthly giveaway in June 2007, this was the first month I forgot. Well, just to keep the chain of free HDRIs unbroken, let me make it up to you: December brings 2 new sIBLs-of-the-month!

The first one is a 16K panorama of a landmark church in my home town Halle, Germany. The church sits on a hill surrounded by a green park, which makes it a very popular hangout spot. Zoom in and pan around, and you will discover a champagne bottle, love graffiti, and bikes.

The second HDR is exactly 4 years old, but unreleased. It shows the Christmas tree in the fancy lobby of Tokyo's Park Hotel. That's a super high-class hotel in the heart of the futuristic Shiodome district. The hotel starts at the 25th level, rises 9 levels upwards, and the lobby is an atrium that fills the entire hollowed-out tip of the skyscraper.

This hotel was also used as a location in the movie "Lost in Translation". I could only afford two nights there, and of course I switched rooms half-way so I could shoot the cityscape in all directions. Just check out the vista:

3DWorld Advent Special: 4 Dutch Skies

And then there’s Bob Groothuis again. He gives away 4 exclusive Dutch Skies!


Bublcam on Kickstarter: Panoramic HDR Video Cam

I have become addicted to Kickstarter. All the new VR gadgets show up there first and get mad funding. So far I backed the STEM tracker, the castAR glasses, and most recently the Bublcam.

Sure, you can get a similar result by using one of the many GoPro pano rigs. However, this won't be any cheaper because you need 5 GoPro cameras. And you'll get into all sorts of trouble synchronizing them, collecting the footage, and hiding the seams in the stitch. That's why an integrated solution like the Bublcam is much more reliable.

And while there have been several other ball-type all-in-one pano cameras proposed, this one seems to make the most sense:
  • Optimum orientation of the lenses.
  • Small form factor for minimizing perspective shift between sectors.
  • Four good-quality wide-angle lenses rather than a multitude of cheaper ones.


The Bublcam captures a single 4-quadrant multiplex image. This is the raw .MP4 that comes off the camera; placed in Bubl's software, it becomes a full spherical experience. Such source footage is also extremely easy to stitch with regular tools like Fusion, Nuke, or After Effects.
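To illustrate why such multiplexed footage is easy to handle: if the four sub-images are packed in a simple 2×2 layout (an assumption on my part, the actual Bubl frame layout may differ), extracting them is plain array slicing:

```python
import numpy as np

def split_quadrants(frame):
    """Cut a 2x2 multiplexed video frame (H, W, 3) into its four
    sub-images, returned as (top-left, top-right, bottom-left,
    bottom-right). Assumes even frame dimensions and a simple
    quadrant packing, which is an assumption for illustration.
    """
    h, w = frame.shape[0] // 2, frame.shape[1] // 2
    return (frame[:h, :w], frame[:h, w:],
            frame[h:, :w], frame[h:, w:])
```

Each returned sub-image can then be fed to a stitcher as if it came from a separate camera.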

But the deciding factor for this blog is that it will also have an HDR mode! This was made possible when the project reached a stretch goal last week.

Now, if everyone here would help give it an extra push, we can have a time-lapse mode and maybe even double the resolution. There are only 5 days left. So, please, everyone, go ahead and reserve your very own Bublcam on Kickstarter!


Shortcuts: PMatix 5, HDRLightStudio, HDR video, Dual ISO

Here's a quick summary of various new HDR tools and toys that popped up recently.


Photomatix Pro 5.0 Beta

Our friends at HDRSoft have been working hard on the next generation of Photomatix.
  • All-new Tonemapping algorithm "Contrast Optimizer" promises very natural results
  • Automatic Deghosting now offers a preview and hero image selection (very nice)
  • Realtime preview during tonemapping
  • Highlight recovery from RAW files
  • and much more
It's a really great update, I highly recommend checking it out!

Download Photomatix 5 Beta

HDR Light Studio integrates live with MODO and Bunkspeed

This is very close to my heart, since I'm an avid MODO user myself.
HDR Light Studio is a unique app for creating/editing spherical HDR panos for lighting purposes (see the quick introduction in my HDRI Handbook 2.0, pages 631-632). It's a total niche field, and may at first sound odd or unnecessary, but the way it is executed makes it completely amazing. HDR Light Studio offers:
  • Spotlights and soft boxes with real-world settings for power (Watts) and color (Kelvin)
  • Growing library of HDR image elements, real studio light sources, ready to drop in
  • Select and position light sources by clicking on the highlights in your render
  • Live connection shows the result immediately in your 3D app's preview renderer

Putting all this lighting power at your fingertips, HDR Light Studio has quickly become the standard for industrial designers, especially in the automotive field.

Download the HDR Light Studio demo (registration required)
or see lots of eye candy on their blog.

New HDR Video samples for download

If chapter 7.4.1 in my HDRI-Handbook 2.0 has made you curious about HDR video (pages 596-600), then you will be delighted to hear that our friend Jonas Unger from the Linköping University has updated his HDRv repository.

You can download a variety of video clips, all captured with the infamous Spheron HDRv camera prototype. Files are provided as EXR sequence, ready for your own experiments in Nuke, Fusion, or After Effects. There are even some clips with mirror ball footage to get your feet wet with HDR video-based lighting (example here).

Go grab the goodies from the LiU HDRv Repository

Magic Lantern gets a Dual-ISO mode

The Magic Lantern firmware enhancement for Canon DSLRs now offers the ultimate HDR mode!

Dual ISO means half the sensor is read out with ISO 100 and the other half with ISO 1600 (or any other, for that matter). So you get two exposures at the exact same time! Time to say good-bye to ghost removal. Moving subjects are not an issue with this camera mod, as all the amazing examples clearly prove.
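To get a feel for the idea, here is a toy sketch (emphatically not Magic Lantern's actual cr2hdr algorithm): normalizing each row by its analog gain puts everything on one radiometric scale, and high-ISO rows that clipped in the highlights can be filled in from the neighboring base-ISO rows:

```python
import numpy as np

def merge_dual_iso(raw, high_rows_gain=16.0, clip=1.0):
    """Toy reconstruction of a dual-ISO raw frame.

    raw: float array (H, W); even rows captured at base ISO,
    odd rows at base ISO * high_rows_gain (cleaner shadows,
    but they clip earlier in the highlights).
    Returns a merged frame on the base-ISO scale.
    """
    out = raw.astype(np.float64).copy()
    out[1::2] /= high_rows_gain          # bring high-ISO rows onto base scale
    # Where the high-ISO rows clipped, their data is gone:
    # fill from the average of the neighboring base-ISO rows.
    clipped = raw[1::2] >= clip
    above = out[0:-1:2]                          # base-ISO row above each odd row
    below = np.vstack([out[2::2], out[-2:-1]])   # row below (clamped at the edge)
    fill = 0.5 * (above + below)
    out[1::2][clipped] = fill[clipped]
    return out
```

The real tool works on Bayer data and does far more sophisticated interpolation and noise matching, but the principle is the same: one sensor readout yields two exposures, so moving subjects cannot ghost.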

Download the Dual-ISO module


Infinite Reality's amazing Oculus Rift demo (+Interview)

Lee Perry-Smith is no stranger here on HDRLabs. Under his company label Infinite Realities he previously donated a 3D head scan (of his own head) to the graphics researcher community, and he's been a fan of our Smart IBL environment/lighting system from day one.

But now Lee has stepped it up a notch.
He put together a realtime demo for the infamous Oculus Rift - the utterly futuristic VR glasses. The demo contains some nudity, that is, incredibly realistic 3D scans of nude women (may be NSFW if you don't work in the art field). The demo also contains a variety of my panoramas as environments, which I'm rather thrilled about. Check it out:

Download the demo and see for yourself. It's impressive even if you don't have a Rift.

I found it so impressive that I was burning to ask Lee a few questions on how he did this.
Buckle up for a new literary format on this blog: an interview!

Interview with Lee Perry-Smith

Thanks for this opportunity Blochi! I'm a huge fan of your books and website :)

[ Rollover for scanned 3D geometry ]

The detail in your figure scans is astonishing, in terms of laser-accurate geometry and brilliant textures. What's your setup?

At the moment I use 115 DSLRs for the full body capture system and 48 DSLRs for the separate face (FACS) capture system. Mostly Canons, with about 8 Nikon D800s in there for full body reference shots. It was the first system of its kind, worldwide, back in 2011.

Wow, that's a mighty impressive setup. Just collecting the images from all these cameras sounds like a lot of work. Are they all wired together?

Yes, they are all wired together to fire in sync. The system is designed to be 99.9% reliable. There are never any black images or missing images. This is integral for reconstruction. There are miles and miles of cable!

The cameras are triggered using custom opto-isolated trigger hubs (designed by Merry Hodgkinson) and Pocketwizard III remotes, with data streamed over a networked system, running Breeze Software and Agisoft Photoscan to process the point cloud data.

[ Panorama of Lee's VR Photostudio ]

Are you doing any special tricks with polarizers or flashes?

At the moment no, but I'm still running experiments trying to figure out the best method here. Linear polarizing film seems the best way to go. The end goal being better surface details.

I've also run many multi-lighting tests using flashes and Photoshop, similar to the Lightstage method, to acquire world space normals. Disney's mesoscopic emboss method is much quicker and far easier in comparison, but the output is synthetic rather than true surface bump.


My Oculus Rift is still on pre-order. The clips look already awesome on my laptop screen, but how would you describe the experience of the Oculus Rift?

You're in for a treat! It's hard to put into words. Really hard to describe until you see it, feel it.

It's a breathtaking moment when assets you've been working hard on for so long in a 2D format, you see now in 3D. Palmer Luckey, John Carmack, and the team at Oculus VR will go down in history as true pioneers. Huge thanks to them!

We're talking true 3D stereo VR here. Not 3D cinema stereo but true stereoscopic vision. You sense depth, vibrant colors, and good contrast. You can adjust your focus (minus DOF changes) in VR; you really experience scale like never before.

And now you're populating it with realistic scans of real people!

Seeing scans of people in VR is amazing. It can feel strange at times, as if you're invading a person's space. Your sense of presence affects theirs, and you can't make out if they are alive or dead because they don't move. Kind of like waxworks on steroids.

When you put your point of view in the position of the scanned person and look down, it's even stranger! Then you are that person. For a split second, you kind of sense what they must feel like living in that vessel, that was given to them to use i.e. being John Malkovich!! This could lead to some very interesting future experiences.

At the moment these are static scans, I'm working on movement but this will take time. I'm pushing for something past traditional motion capture, which I think is quite dated.

I understand that you put this demo together in Unity3D. How involved is this process, from a general 3D artist's point-of-view? Is it a steep learning curve?

I was terrified of learning Unity. I put it off for months/years and started with UDK (Unreal Development Kit) instead. UDK was easy to slip into but I found it had serious limitations. It felt bloated, and I had many problems just making simple shaders work with the lights. For publishing self-illuminated or baked models it was great. One cool feature of UDK is its loading system and ability to load in 8k textures. Something Unity struggles with.

But saying that, Unity is incredibly easy to learn. I initially learned from the Digital-Tutors site, then I dissected the Tuscany Demo supplied with the Oculus Dev Kit SDK.

How well does Unity work with the Oculus Rift?

Adding Rift integration was a walk in the park thanks to Unity and Oculus VR joining forces. Googling Unity topics is a treat because there are just thousands of help pages on all subjects. If you can think it up, someone else has inevitably already written a script for it. I've been lucky enough to quickly make some good friends in the VR/Unity community who are incredibly talented at writing shaders and scripts. We are able to do things that once upon a time ILM had difficulty doing for film with offline rendering in the 90's, like the T-1000 effect. We can simulate something similar to that now, in Unity, in real-time, at 60-120 fps, in stereo, in VR!! ... It's quite mind blowing.


What I find stunning is how well the environments interact with the characters' shading. Is this baked, or is this image-based lighting in realtime?

This is the key, HDR and IBL, it's easy now thanks to the research that people like Paul Debevec did in the 90's, the work that you do on HDRLabs and people like Thomas Mansencal (sIBL GUI) and Bob Groothuis (Dutch Skies).

This paves the way for artists to be able to utilize advanced lighting techniques easily. Working with sIBL's is as simple as drag and drop in Unity. Also thanks to the great work the Marmoset Co guys do with Skyshop and Unity integration. This is what inspired me to use Unity after seeing their Skyshop tutorial:

So yes, the lighting in my demo is real-time, non-baked, all interactive. Some colored AO is baked during the studio scanning session but it's minimal. I'm also working on some new custom SSS shader integration.

So these are custom shaders?

We've (myself, Charles and drash) implemented a custom SSS solution with many different effects, like 2 specular lobe control, IBL (bent normals) calculated reflections, cavity Fresnel falloff, 8x multi-region micro multi-bump (blended with RGB maps, thanks to Steve), 8x multi-region cavity maps, deep red shadow scattering, GGX specular distance distribution, deep skin scatter for ear glow, 1x GI bounce, screen space reflections, colored AO and much more.
We're also working on a hair rendering solution for Unity, using IBL for lighting and strand reflections, 2 specular lobe control, as well as GI bounce and strand AO. This is a lot harder to implement.

I still use Marmoset for real-time testing but find Unity more open creatively because of scripting, sound, animation and interactive executable publishing.
Although I have a feeling UDK4 is going to be something very special!

Where do you see the creative potential of the Oculus Rift?

I can just imagine in 2-3 years: no more monitors, no more keyboard or mouse. Just a set-top box (maybe even cloud based), a VR or AR headset, surround sound, and haptic gloves. We will have the ability to sculpt our favorite character or model, and actually feel the sculpt as we work, on a virtual hilltop mountainside somewhere, at sunrise.

The Oculus Rift will be especially interesting for directors. I run my own small business, so I've been a director (of a company) for a while, and I admire the work of film directors like Ridley Scott, Paul Thomas Anderson, and James Cameron, just like most 3D artists do. The trick here is that the term director takes on a whole new meaning because of real-time and VR, which allows us to REALLY direct an experience. You get to do things creatively you couldn't ever do with film, and this is where VR, 3D and 4D scans really kick in. You also get to add sound, music, visual and response interaction in ways you can't with film. You can manipulate time, manipulate a user's sense of being, drive 360° environments, and many other things.

VR for creative experiences, not just for games but really immersive experiences is like the old Wild-West. It's a new frontier, rich for the pickings. At some point Film, VFX and Games will merge into a Directed Experience. There is no doubt.

What's the next thing on your personal list?

My direction with VR goes into the adult entertainment industry. An area few dare to venture, or have the balls to try. Other industry professionals have warned me of this direction, saying you won't get bookings from Disney et al, saying it will affect your reputation, etc! Which I am well aware of; my intention isn't to sell out but to follow my own dreams. The adult industry is dying financially and it needs a new lease of life, a new dimension. VR and scanning in this market can explore a lot of untapped areas and also really benefit many people. Give them experiences they may otherwise never have access to, due to physical disabilities, social issues, or sexual orientation. And this is just one small niche area of VR. The possibilities are endless.

Well, if the Oculus Rift is half as good as you're saying I can see this becoming a big hit. I, for one, await my pre-order with much suspense now.
Thank you very much for this interview, and good luck!

Thanks again Blochi, your book and your site are always a constant source of inspiration.



Hope you liked this interview. Maybe we can do more of those in the future.

Visit the Infinite Realities website to see more good stuff, grab the Rift demo, and don't forget to download the sIBL-of-the-month that Lee generously donated.


Welcome to the sIBL family, Blender!

Raise your glasses, brothers! Blender just joined the exquisite ranks of sIBL-supported programs.


Blender extension builds a bridge to sIBL_GUI, enables quick lighting setups.
Screenshot by Jed Frechette.

Thanks to the development effort of Jed Frechette, Blender users can now enjoy the one-click environment lighting setup that the Smart IBL system is famous for. Integration is done thoroughly, by using sIBL_GUI as browser and central library management hub. If you already use sIBL_GUI in conjunction with 3dsMAX or Maya, the workflow with Blender will be familiar:
  • Pick an environment preset
  • Pick Blender as setup template
  • Click the Send to Software button
Photorealistic lighting couldn't be easier.

Download and installation instructions on wiki.blender.org
Say thanks or report bugs in our dev forum thread

To celebrate this historic event, enjoy this new free sIBL set of an iron bridge in full 16K glory:
