
Author Topic: De-lighting experiment  (Read 1861 times)

Mak11

  • Sr. Member
  • ****
  • Posts: 374
De-lighting experiment
« on: November 13, 2019, 01:38:58 PM »
Just had a go with the De-Lighter for the first time. I usually do de-lighting manually in Substance Designer/Painter because my source photos are well lit with no hard shadows. But this time I had some pretty abysmal shots taken in a museum with my smartphone, with the subject lit by three different light sources of varying colour temperature, so I decided to give the De-Lighter a go.

The first step, before any photogrammetry work, was to tweak the photos in Lightroom. Once reconstruction was done and the low-poly model was finished, UV-mapped, etc., I first removed shading & AO, then did one pass to remove hard shadows with highlight & colour artifact suppression set to high. The result was relatively good but not perfect either, which was expected given the source. I painted the rest out in Substance Painter.

Sketchfab:

https://skfb.ly/6OJ7r

Source:



De-lighted:



Renders (horribly compressed images, sorry):

MAK
« Last Edit: November 13, 2019, 02:15:35 PM by Mak11 »

Mak11

Re: De-lighting experiment
« Reply #1 on: November 14, 2019, 06:51:55 PM »
Speaking of de-lighting... this new method (published in June 2019) called "Sea-Thru: A Method for Removing Water From Underwater Images" could become a must-have for underwater photogrammetry (it's essentially partly based on the same principles, which means it could/should be possible to integrate it directly into photogrammetry software like Metashape).



https://www.youtube.com/watch?v=ExOOElyZ2Hk


http://openaccess.thecvf.com/content_CVPR_2019/html/Akkaynak_Sea-Thru_A_Method_for_Removing_Water_From_Underwater_Images_CVPR_2019_paper.html
Quote
Abstract

Robust recovery of lost colors in underwater images remains a challenging problem. We recently showed that this was partly due to the prevalent use of an atmospheric image formation model for underwater images. We proposed a physically accurate model that explicitly showed: 1) the attenuation coefficient of the signal is not uniform across the scene but depends on object range and reflectance, 2) the coefficient governing the increase in backscatter with distance differs from the signal attenuation coefficient. Here, we present a method that recovers color with the revised model using RGBD images. The Sea-thru method first calculates backscatter using the darkest pixels in the image and their known range information. Then, it uses an estimate of the spatially varying illuminant to obtain the range-dependent attenuation coefficient. Using more than 1,100 images from two optically different water bodies, which we make available, we show that our method outperforms those using the atmospheric model. Consistent removal of water will open up large underwater datasets to powerful computer vision and machine learning algorithms, creating exciting opportunities for the future of underwater exploration and conservation.

Paper PDF: http://openaccess.thecvf.com/content_CVPR_2019/papers/Akkaynak_Sea-Thru_A_Method_for_Removing_Water_From_Underwater_Images_CVPR_2019_paper.pdf
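For anyone curious how the two steps in the abstract fit together, here's a rough NumPy sketch of the idea, NOT the authors' code: fit a backscatter curve per channel from the darkest pixels in each depth bin, subtract it, then undo range-dependent attenuation. The exponential forms, the bin/percentile choices, and the simplification to a single attenuation coefficient per channel are all my own illustrative assumptions.

```python
# Illustrative sketch of the Sea-thru recovery steps, assuming a linear RGB
# image `img` (H, W, 3) in [0, 1] and a per-pixel range map `depth` (H, W)
# in metres. Simplified: real Sea-thru uses a spatially varying illuminant
# estimate and a range/reflectance-dependent attenuation coefficient.
import numpy as np
from scipy.optimize import curve_fit

def estimate_backscatter(img, depth, frac=0.01, n_bins=10):
    """Step 1 (per the abstract): estimate backscatter from the darkest
    pixels and their known range, fitting B(z) = B_inf*(1 - exp(-beta_B*z))
    per channel. Dark pixels carry little direct signal, so their value is
    mostly backscatter."""
    z = depth.ravel()
    edges = np.linspace(z.min(), z.max(), n_bins + 1)
    params = []
    for c in range(3):
        vals = img[..., c].ravel()
        zs, bs = [], []
        for i in range(n_bins):
            m = (z >= edges[i]) & (z < edges[i + 1])
            if m.sum() < 10:
                continue
            k = max(1, int(frac * m.sum()))
            idx = np.argsort(vals[m])[:k]   # darkest pixels in this bin
            zs.append(z[m][idx].mean())
            bs.append(vals[m][idx].mean())
        model = lambda zz, Binf, beta: Binf * (1.0 - np.exp(-beta * zz))
        (Binf, beta), _ = curve_fit(model, np.array(zs), np.array(bs),
                                    p0=[0.1, 0.1], maxfev=5000)
        params.append((Binf, beta))
    return params

def remove_water(img, depth, params, beta_D):
    """Step 2 (simplified): subtract backscatter, then invert attenuation,
    J_c = (I_c - B_c) * exp(beta_D_c * z), with beta_D held constant per
    channel for illustration only."""
    out = np.empty_like(img)
    for c, (Binf, beta) in enumerate(params):
        B = Binf * (1.0 - np.exp(-beta * depth))
        out[..., c] = (img[..., c] - B) * np.exp(beta_D[c] * depth)
    return np.clip(out, 0.0, 1.0)
```

The key point the abstract makes is visible in the two functions: the backscatter coefficient `beta` and the signal-attenuation coefficient `beta_D` are fitted/applied separately, instead of being one atmospheric-style constant.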


MAK