Forum

Author Topic: Problem with using depth maps of smartphones  (Read 2188 times)

Kiesel

  • Sr. Member
  • ****
  • Posts: 332
Problem with using depth maps of smartphones
« on: July 11, 2022, 03:31:30 PM »
Dear Agisoft team,

Today I tried to follow your article https://agisoft.freshdesk.com/support/solutions/articles/31000162212-smart-cameras-with-depth-sensor-data-processing with photos taken with the Google Pixel 4a 5G. But when I import those photos via 'File/Import/Import Laser Scans...' and select them all as 'Dynamic Depth Image (*.jpg *.jpeg)', Metashape surprisingly reopens the same folder, which now appears empty, and asks me to choose a folder. When I go one folder up and choose the same folder with the photos in it, I get an error message (see appendix) that Metashape can't read the embedded depth data. As a result, no photos are imported.

To make sure that the photos have embedded depth data, I opened one in Photopea, with the result that it has 4 layers (see also appendix):
Layer 1: white-filled depth data layer
Layer 2: depth map
Layer 3: original photo
Layer 4: photo processed with the depth data
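For anyone who wants to check their own photos without Photopea: the Dynamic Depth format stores the extra images (depth map, processed photo) as whole JPEG streams concatenated after the primary image. A rough, hypothetical Python sketch (not an Agisoft tool, and it assumes the marker bytes don't also appear inside compressed image data) can locate and split them out:

```python
# Hypothetical sketch: split the secondary JPEG streams (e.g. the
# Dynamic Depth map) out of a container JPEG by scanning for the
# JPEG start-of-image marker sequence FF D8 FF.

def find_embedded_jpegs(path):
    """Return the byte offset of every JPEG SOI marker in the file."""
    data = open(path, "rb").read()
    offsets, pos = [], 0
    while True:
        pos = data.find(b"\xff\xd8\xff", pos)
        if pos == -1:
            return offsets
        offsets.append(pos)
        pos += 3  # skip past this marker and keep scanning

def extract_embedded_jpegs(path):
    """Write each embedded stream to its own file and return the names.

    Each stream is taken to run from its SOI marker to the next SOI
    marker (or end of file) -- a simplification that is good enough
    for inspecting the layers.
    """
    data = open(path, "rb").read()
    offsets = find_embedded_jpegs(path)
    names = []
    for i, start in enumerate(offsets):
        end = offsets[i + 1] if i + 1 < len(offsets) else len(data)
        name = f"{path}.part{i}.jpg"
        with open(name, "wb") as f:
            f.write(data[start:end])
        names.append(name)
    return names
```

On a Pixel portrait photo this should yield several `.part*.jpg` files, one of which is the grayscale depth map.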

Thanks in advance for any help.

best regards,

Kiesel
« Last Edit: December 08, 2022, 03:46:56 PM by Kiesel »

johnnokomis

  • Newbie
  • *
  • Posts: 29
Re: Problem with using depth maps of Google Pixel smartphones
« Reply #1 on: July 14, 2022, 07:18:52 PM »
I have the exact same problem, except I'm using an LG V60 with the ToF sensor. I've hunted the internet and posted in other forums, on Reddit and on GitHub, and nobody can offer any suggestions. I know the depth data is there, because I can extract it using other methods.

Kiesel

  • Sr. Member
  • ****
  • Posts: 332
Re: Problem with using depth maps of Google Pixel smartphones
« Reply #2 on: July 14, 2022, 11:51:01 PM »
The Agisoft team is working on it.
So there is some hope.  :)
Perhaps you could send them some photos with depth maps made with your phone too, to help solve the problem.

Kiesel
« Last Edit: July 15, 2022, 12:22:35 AM by Kiesel »

johnnokomis

  • Newbie
  • *
  • Posts: 29
Re: Problem with using depth maps of Google Pixel smartphones
« Reply #3 on: July 15, 2022, 04:07:16 AM »
Here are 196 photos with depth maps in the EXIF. These would make a great model, if only the embedded data could be taken advantage of.

https://photos.app.goo.gl/xRtpjWqbd24BqgN38

I hope this helps.

Kiesel

  • Sr. Member
  • ****
  • Posts: 332
Re: Problem with using depth maps of Google Pixel smartphones
« Reply #4 on: December 08, 2022, 12:22:48 PM »
Dear Agisoft team,

Yesterday we tried to import photos with depth maps taken with the iPhone 13 Pro in portrait mode. The depth maps look much better than those of the Google Pixel 4a 5G.
But it wasn't possible to import those photos in either Metashape version 1.8.4 or the 2.0 beta via 'Import/Import Depth Images...'. We get an error (see appendix) that the
Quote
'photo... contains depth data with "relative" accuracy value. Only photos with "absolute" accuracy from True Depth camera are supported.'

Why is that? The freshdesk article https://agisoft.freshdesk.com/support/solutions/articles/31000162212-smart-cameras-with-depth-sensor-data-processing states:
Quote
Depth sensor is installed on the range of Apple devices with TrueDepth camera, that is the front camera of iPhone starting from the 10th version (except SE) and the front camera of iPad pro.

Even though the freshdesk article seems to contain an error (it mentions the front camera, not the back camera with its ToF sensor), shouldn't it work?

I have uploaded these photos here, so that you can test them: https://drive.google.com/file/d/1IH27FCKRliK_TIwP4309y29cyGTSFyqC/view?usp=share_link

Please help!

Best regards,

Kiesel
« Last Edit: December 08, 2022, 03:47:48 PM by Kiesel »

johnnokomis

  • Newbie
  • *
  • Posts: 29
Re: Problem with using depth maps of smartphones
« Reply #5 on: December 09, 2022, 06:24:24 AM »
This isn't addressing the issue you're having...

It seems the two of us are determined to help Agisoft implement this depth map feature in the best way possible. I took some test shots today and wanted to show what my LG V60's portrait mode looks like in Photopea and Metashape. It's sad that, when imported into Lightroom or Photoshop, it's just a single-layer photo and you'd never know about the depth map layer, while a free browser-based editor shows you three extra layers within the image. I haven't had a chance to collect a whole dataset to see what the outcome is, but I will soon.

johnnokomis

  • Newbie
  • *
  • Posts: 29
Re: Problem with using depth maps of smartphones
« Reply #6 on: December 09, 2022, 06:25:48 AM »
Metashape

Kiesel

  • Sr. Member
  • ****
  • Posts: 332
Re: Problem with using depth maps of smartphones
« Reply #7 on: December 09, 2022, 09:16:02 PM »
Hi johnnokomis,

are you able to import the photos with depth maps into Metashape, or are those depth maps generated in Metashape? The depth maps in Metashape are color-coded, not gray-coded. In a former test, I was able to import depth maps, and they were color-coded as well.


Kiesel

johnnokomis

  • Newbie
  • *
  • Posts: 29
Re: Problem with using depth maps of smartphones
« Reply #8 on: December 09, 2022, 10:38:23 PM »
Yes, those depth maps were imported with the original image. I was expecting a black-and-white depth map, like the one Photopea shows. I guess Metashape is automatically upscaling the depth map on import..? The ToF sensor has a much lower resolution than the 16-megapixel RGB photo.
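To illustrate the resolution gap: pairing a low-resolution ToF grid with a high-resolution RGB photo requires resampling the depth map up to the photo's size, which is presumably roughly what Metashape does on import. A hypothetical nearest-neighbour sketch (not Metashape's actual method):

```python
# Hypothetical sketch: nearest-neighbour upscaling of a low-resolution
# ToF depth grid to the RGB photo's pixel grid, so that every RGB pixel
# gets the depth value of its nearest ToF sample.

def upscale_depth(depth, out_w, out_h):
    """depth: 2-D list (rows of depth values); returns an out_h x out_w grid."""
    in_h, in_w = len(depth), len(depth[0])
    return [
        [depth[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]
```

Nearest-neighbour keeps the original measured values but looks blocky; a smoother (e.g. bilinear) filter would explain why an upscaled map no longer looks like the raw sensor output.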
« Last Edit: December 12, 2022, 06:01:57 PM by johnnokomis »

Kiesel

  • Sr. Member
  • ****
  • Posts: 332
Re: Problem with using depth maps of smartphones
« Reply #9 on: December 12, 2022, 10:27:01 AM »
Hi johnnokomis,

thanks!

I don't think that the depth maps of the iPhone Pro, or the ones of your LG V60, are the result of the ToF sensor alone; I think they add other calculations, for example from the dual-pixel sensor.



Kiesel
« Last Edit: December 12, 2022, 01:05:13 PM by Kiesel »

Kiesel

  • Sr. Member
  • ****
  • Posts: 332
Re: Problem with using depth maps of smartphones
« Reply #10 on: December 20, 2022, 09:31:23 AM »
For all interested,

According to the Agisoft team's answer, unfortunately only the front camera of the iPhone can be used in Metashape at this time, which isn't very practical because you don't see what you are photographing unless you have a second screen for that.
My hope was that the back camera and the ToF sensor of the iPhone could be used in a form like in Apple's 3D Scanner App, Recon-3D https://www.recon-3d.com/ or DotProduct's Dot3D for iOS https://www.dotproduct3d.com/ios.html. So there is still some hope that this will be the case in a future version.

Best regards

Kiesel
« Last Edit: December 20, 2022, 09:49:12 AM by Kiesel »

johnnokomis

  • Newbie
  • *
  • Posts: 29
Re: Problem with using depth maps of smartphones
« Reply #11 on: December 21, 2022, 05:47:45 AM »
I don't understand why this wouldn't work. The iPhone's LiDAR sensor has the same resolution as my V60's. I know Apple has many different 3D scanning apps to choose from; Android has only one that takes advantage of depth sensors. How does Apple handle the LiDAR metadata in photos? Maybe that's where the issue is, I don't know.

Source- https://www.techinsights.com/blog/apple-iphone-14-image-sensor-preliminary-analysis#:~:text=The%20iPhone%2014%20Pro%2FMax%20LiDAR%20camera%2C%20with%20a%200.3,Max%20LiDAR%20(Figure%2010).