Show Posts

Messages - aniket@aus.co.in

Pages: 1 [2]
16
Python and Java API / How agisoft gets the origin for a 2D image
« on: April 12, 2022, 10:47:46 AM »
I'm trying to add GCP tagging information using the Python API, but Agisoft's image origin starts from a different corner for different images. Which parameter decides where this origin is located?

I've attached three images in which the origin is at the bottom right, top left, and top right respectively.
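
For context, this is roughly how I'm setting the pixel coordinates (a trimmed sketch; the project path, labels, and coordinates are placeholders):

import Metashape

doc = Metashape.Document()
doc.open("project.psx")        # placeholder project path
chunk = doc.chunk

camera = chunk.cameras[0]      # image the GCP appears in (placeholder choice)
marker = chunk.addMarker()
marker.label = "GCP_01"        # placeholder label

# Pixel coordinates of the GCP in this image. I'm assuming (0, 0) is the
# top-left corner, which the attached images seem to contradict in some cases.
pixel = Metashape.Vector([1024.5, 768.0])
marker.projections[camera] = Metashape.Marker.Projection(pixel, True)

doc.save()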


17
General / Addition of GCP images data using Python API
« on: March 08, 2022, 02:15:20 PM »
We first want to load GCP data as comma-separated values:

GCP label, X, Y, Z

where X, Y, Z are longitude, latitude, and altitude.

We then want to add the GCP points to the chunk using the API, again from a CSV:

GCP label, Image label, X, Y

where X and Y are the pixel coordinates of the GCP in the image.


What is the best possible way to do it?
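
Something along these lines is what I'm currently imagining (file names and labels are placeholders; I'm not sure importReference with create_markers is the intended approach):

import csv
import Metashape

doc = Metashape.Document()
doc.open("project.psx")                       # placeholder project path
chunk = doc.chunk

# 1) GCP reference coordinates: label, longitude, latitude, altitude
chunk.importReference("gcps.csv",             # placeholder file name
                      format=Metashape.ReferenceFormatCSV,
                      columns="nxyz", delimiter=",",
                      items=Metashape.ReferenceItemsMarkers,
                      create_markers=True)

# 2) Per-image projections: GCP label, image label, x, y (pixels)
cameras = {c.label: c for c in chunk.cameras}
markers = {m.label: m for m in chunk.markers}
with open("gcp_projections.csv") as f:        # placeholder file name
    for gcp_label, image_label, x, y in csv.reader(f):
        marker = markers.get(gcp_label)
        if marker is None:
            marker = chunk.addMarker()
            marker.label = gcp_label
            markers[gcp_label] = marker
        pixel = Metashape.Vector([float(x), float(y)])
        marker.projections[cameras[image_label]] = Metashape.Marker.Projection(pixel, True)

doc.save()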

18
Python and Java API / Re: Even with GPU the processing takes too long
« on: January 28, 2022, 11:56:55 AM »
I'm using GRID drivers with this one. Does Agisoft support GRID drivers?

19
Python and Java API / Getting error while using cloud scripts
« on: January 28, 2022, 11:33:58 AM »
While trying to install Agisoft cloud-scripts on a g4dn.4xlarge instance in AWS with Ubuntu 20.04 installed on it, I'm getting an error. Please find the attached logs.

20
Python and Java API / Re: Even with GPU the processing takes too long
« on: January 28, 2022, 10:34:59 AM »
PFA

21
Python and Java API / Even with GPU the processing takes too long
« on: January 21, 2022, 09:12:30 AM »
I'm using the Metashape command line tool on a g4ad.4xlarge instance in AWS.

Every step is fast, but building the DEM is really slow.

loaded elevation data in 0.000105 sec
base level size: 80091 x 90724
base level interpolated in 7.6594 sec
loaded elevation data in 0.00014 sec
base level size: 80091 x 90724
base level interpolated in 7.67838 sec
loaded elevation data in 0.000106 sec
base level size: 80091 x 90724
base level interpolated in 7.67397 sec
loaded elevation data in 0.000103 sec
base level size: 80091 x 90724
base level interpolated in 7.67114 sec
loaded elevation data in 0.000105 sec
base level size: 80091 x 90724
base level interpolated in 7.67829 sec
loaded elevation data in 0.000104 sec
base level size: 80091 x 90724
base level interpolated in 7.70326 sec
loaded elevation data in 0.000104 sec
base level size: 80091 x 90724
base level interpolated in 7.71685 sec
loaded elevation data in 0.000105 sec
base level size: 80091 x 90724
base level interpolated in 7.68442 sec
loaded elevation data in 0.000162 sec
base level size: 80091 x 90724
base level interpolated in 7.71292 sec
loaded elevation data in 0.000105 sec
base level size: 80091 x 90724
base level interpolated in 7.65465 sec
loaded elevation data in 0.000109 sec
base level size: 80091 x 90724
base level interpolated in 7.66969 sec
loaded elevation data in 0.000105 sec
base level size: 80091 x 90724
base level interpolated in 7.7343 sec
loaded elevation data in 0.000115 sec

This all takes much longer than expected.

22
General / Re: Agisoft Metashape 1.8.0 pre-release
« on: December 06, 2021, 12:50:12 PM »
I'm trying to build a DEM using the Python API. It works fine with Metashape 1.7 but not with 1.8, and I couldn't find anything related to this in the changelog. I've attached logs for a small dataset that throws an "Empty DEM" error.
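
The DEM step in my script is essentially the following (a trimmed sketch; the project path is a placeholder and the remaining arguments are left at defaults):

import Metashape

doc = Metashape.Document()
doc.open("project.psx")   # placeholder project path
chunk = doc.chunk

# Same call and dataset: completes under 1.7, raises "Empty DEM" under 1.8.
chunk.buildDem(source_data=Metashape.DenseCloudData,
               interpolation=Metashape.EnabledInterpolation)
doc.save()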

23
Bug Reports / Re: std: bad_alloc error inside a ubuntu docker container
« on: December 02, 2021, 02:42:01 PM »
I can't find a link to download Metashape 1.8.

24
Bug Reports / Re: std: bad_alloc error inside a ubuntu docker container
« on: December 02, 2021, 08:50:51 AM »
Here are the attached logs.

25
Bug Reports / std: bad_alloc error inside a ubuntu docker container
« on: December 01, 2021, 09:34:14 PM »
I'm trying to run the Metashape CLI in an Ubuntu Docker container with 68 GB of RAM. I'm not sure whether it is an out-of-memory error. I've also added 20 GB of swap, but even then it fails with the same error. The dataset has 250 images. Looking at the memory usage profiles, it doesn't seem to be an out-of-memory issue. Any ideas on the possible cause of the failure?
