
Author Topic: Python API Doesn't Use GPU for pickPoint  (Read 3184 times)

Ahmed Maher

Python API Doesn't Use GPU for pickPoint
« on: February 10, 2022, 03:57:00 PM »
Hello All,

I am using Metashape Pro's Python API to get the geo-coordinate of each pixel in the input images (by input images I mean the images captured by the physical camera), reusing code already posted on this forum. However, on a mesh with a large number of faces, chunk.model.pickPoint takes ~30 ms per point on a fairly powerful machine, which makes it unusable for a large number of pixels. Watching CPU and GPU usage, I see the CPU pinned at 100% while the GPU is never used, even when I explicitly set the GPU mask inside the script. As far as I know, pickPoint is a ray-intersection algorithm that should benefit dramatically from a GPU (the machine has an RTX GPU, too).

So, how can I make the Python API use the GPU for pickPoint? Is it not currently supported? And if it is not supported in Python, is it supported in the Java API?
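For context, here is a minimal sketch of the kind of per-pixel lookup being described, based on the recipe commonly posted on this forum. The pixel coordinates and camera choice are placeholders, and details may vary by Metashape version:

Code: [Select]
import Metashape

chunk = Metashape.app.document.chunk
camera = chunk.cameras[0]          # camera whose pixels we want to geolocate
surface = chunk.model              # the mesh used for ray intersection

x, y = 1000, 750                   # example pixel coordinates (placeholders)

# Build a ray from the camera center through the pixel, in chunk coordinates.
origin = camera.center
target = camera.transform.mulp(
    camera.sensor.calibration.unproject(Metashape.Vector([x, y])))

# pickPoint intersects the ray with the surface; this is the ~30 ms/call step.
point = surface.pickPoint(origin, target)
if point is not None:
    # Convert from chunk-internal coordinates to the chunk CRS.
    geo = chunk.crs.project(chunk.transform.matrix.mulp(point))
    print(geo)

Looping this over every pixel of every image is where the per-call cost adds up, since each call runs on the CPU.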

Jordan Pierce

Re: Python API Doesn't Use GPU for pickPoint
« Reply #1 on: November 15, 2022, 06:46:23 PM »
Finding the same thing, sadly. It'd be great if pickPoint were run on the GPU.

For others who are also bummed about the speed of pickPoint, I recommend running it against a low-resolution tiled model instead of the mesh or dense point cloud, which are significantly slower; see the sketch below.
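A minimal sketch of that swap, assuming the API version in use exposes pickPoint on Metashape.TiledModel (as this suggestion implies) and that a low-resolution tiled model has already been built for the chunk:

Code: [Select]
import Metashape

chunk = Metashape.app.document.chunk

# Use the coarser, faster tiled model as the intersection surface when
# available; fall back to the full-resolution mesh otherwise.
surface = chunk.tiled_model if chunk.tiled_model is not None else chunk.model

camera = chunk.cameras[0]
origin = camera.center
target = camera.transform.mulp(
    camera.sensor.calibration.unproject(Metashape.Vector([960, 540])))

point = surface.pickPoint(origin, target)  # same call, cheaper surface

The ray setup is unchanged; only the surface handed to pickPoint differs, trading some positional accuracy for speed.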