Author Topic: Dual GTX1060 GPU Processing Issues - Photoscan  (Read 3513 times)

inked88

  • Newbie
  • *
  • Posts: 4
Dual GTX1060 GPU Processing Issues - Photoscan
« on: April 19, 2018, 09:41:57 AM »
Hey all, I'm using PhotoScan and it has been working really well. I'm just having issues getting PhotoScan to use the GPUs during the point cloud generation stages; it seems to only use the CPU. We specifically built this computer with multiple GPUs to take advantage of the faster processing times. The specs are:
  • AMD Ryzen Threadripper 1900X 8-core processor
  • Two NVIDIA GeForce GTX 1060 3 GB GPUs
No matter what I try, I cannot get PhotoScan to use the GPUs, let alone both GPUs in parallel. I have tried the following:
  • Checked the GPU drivers (latest)
  • Checked for PhotoScan updates and confirmed the version (latest)
  • Disabled 4 of the 16 logical cores during processing through Task Manager's Set Affinity dialog
  • Selected the GeForce cards in the GPU preferences menu in PhotoScan and unticked "Use CPU when performing GPU accelerated processing"

I have not had any luck disabling the cores through the OpenCL settings. Disabling them via Set Affinity does work: I have verified that four cores are not being used by Agisoft, while the other cores run pretty close to 99%.

Any advice on getting Agisoft to use the GPUs instead of the CPU when it can? Also, is there another way to disable the cores that might change things? I have also opened the NVIDIA Control Panel and selected 'All' under CUDA - GPUs. Under the program settings for PhotoScan, CUDA - GPUs is set to All and the OpenGL rendering GPU is set to one of the two 1060 cards.
Is there anything else I can try to get PhotoScan to use the GPUs, preferably both in parallel? Thanks in advance.
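One quick check that confirms both cards are at least visible to the driver is listing them from a command prompt; the path below assumes the NVIDIA driver's default install location on Windows:

Code:
"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -L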

inked88

  • Newbie
  • *
  • Posts: 4
Re: Dual GTX1060 GPU Processing Issues - Photoscan
« Reply #1 on: April 20, 2018, 06:56:23 AM »
Any suggestions? I tried rolling back Agisoft to version 1.2.5 and using the OpenCL options in the settings. It still won't use the GPUs, and even with those options it seems to want to use all the cores.

Alexey Pasumansky

  • Agisoft Technical Support
  • Hero Member
  • *****
  • Posts: 14846
Re: Dual GTX1060 GPU Processing Issues - Photoscan
« Reply #2 on: April 20, 2018, 01:54:01 PM »
Hello inked88,

PhotoScan 1.4 supports GPU utilization for the following stages:
- image matching (the first phase of the Align Photos stage, which includes feature point detection, preselection and the image matching itself),
- depth maps calculation (the first phase of the Build Dense Cloud stage and of the experimental mesh generation procedures),
- the refine mesh operation.

All other stages use CPU only.

Also note that you do not need to use system settings (like the Set Affinity dialog) to disable CPU cores.
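The active GPU devices can also be inspected and selected from the PhotoScan Python console. A minimal sketch, assuming the 1.4 Python API, where gpu_mask is a bitmask in which bit N enables device N:

Code:
import PhotoScan

# list the GPU devices PhotoScan detects (both GTX 1060 cards should appear)
devices = PhotoScan.app.enumGPUDevices()
for index, device in enumerate(devices):
    print(index, device['name'])

# enable every detected device, e.g. 0b11 for two GPUs
PhotoScan.app.gpu_mask = (1 << len(devices)) - 1

# same effect as unticking "Use CPU when performing GPU accelerated processing"
PhotoScan.app.cpu_enable = False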
Best regards,
Alexey Pasumansky,
Agisoft LLC

inked88

  • Newbie
  • *
  • Posts: 4
Re: Dual GTX1060 GPU Processing Issues - Photoscan
« Reply #3 on: April 23, 2018, 03:44:26 AM »
Thanks for the reply. I am aware it doesn't use the GPU for everything. However, in both 1.2.5.xxx and 1.4.xx it seems to use the CPU instead of the GPU. I have attached a screenshot of what I am seeing in Agisoft and the Task Manager; the screenshot is from the newest version of Agisoft.

Alexey Pasumansky

  • Agisoft Technical Support
  • Hero Member
  • *****
  • Posts: 14846
Re: Dual GTX1060 GPU Processing Issues - Photoscan
« Reply #4 on: April 23, 2018, 08:55:18 AM »
Hello inked88,

According to your screenshot, the current processing stage in PhotoScan is "dense cloud filtering", which is a CPU-only stage.

The previous stage, depth maps generation, has been completed on your dual-GPU setup (also according to your screenshot).
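Note also that the default GPU graphs in Windows Task Manager show graphics engine load rather than CUDA compute load, so they can stay near zero while depth maps are being calculated on the GPUs. As a sketch, GPU utilization can instead be watched from a command prompt, refreshing every second:

Code:
nvidia-smi -l 1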
Best regards,
Alexey Pasumansky,
Agisoft LLC

inked88

  • Newbie
  • *
  • Posts: 4
Re: Dual GTX1060 GPU Processing Issues - Photoscan
« Reply #5 on: April 24, 2018, 01:57:33 AM »
Interesting, thanks Alexey. We read the post from Puget Systems and were under the impression that the depth map filtering part of the dense cloud generation was also expected to use the GPU. We might do some more testing and checking across it all.

Appreciate the response.  Thanks.