Author Topic: Metashape on AWS EC2 and S3  (Read 3015 times)

freetec1

  • Newbie
  • Posts: 7
Metashape on AWS EC2 and S3
« on: February 22, 2022, 07:34:46 PM »
Hello,

I am currently working on running Metashape on AWS EC2 and processing the data via the API. My pipeline runs on my own server without problems, but I need faster processing and therefore want to move everything into the cloud.

The images are on S3 and are mounted via s3fs as a local volume. The project is stored on S3 as well. Storing the project locally works fine, but storing it directly on S3 does not.

The pipeline runs in general, but there seems to be a problem when reloading the previously stored point cloud.
Code:
LoadProject: path = /mnt/photogrammetry_output/Testdatensatz/Testdatensatz.psx
loaded project in 0.042141 sec
Error: Bad local file header signature
OSError: Can't load projections: /mnt/photogrammetry_output/Testdatensatz/Testdatensatz.files/0/0/point_cloud.3/point_cloud.zip

I tried to unzip "point_cloud.zip" manually but got the same bad-header-signature error. It seems Metashape has a problem when writing zip files to S3 and reopening them later. Is this a problem related to Metashape or to S3? Has anybody tried something similar, or can anyone give me a hint how to solve this?
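For what it's worth, the check unzip performs can be reproduced with Python's standard zipfile module. This is a minimal sketch (the function name is my own); it walks the central directory and verifies that each entry's local file header starts with the expected magic bytes, which is exactly what "bad local file header signature" refers to:

```python
import zipfile

LOCAL_HEADER_SIG = b"PK\x03\x04"  # magic bytes that start every local file header

def check_local_headers(path):
    """Return names of zip entries whose local file header signature is wrong."""
    bad = []
    with zipfile.ZipFile(path) as zf, open(path, "rb") as raw:
        for info in zf.infolist():
            # central directory stores each entry's local header offset
            raw.seek(info.header_offset)
            if raw.read(4) != LOCAL_HEADER_SIG:
                bad.append(info.filename)
    return bad
```

Running this against point_cloud.zip should return an empty list for an intact archive; on the corrupted one it should list the same entries that unzip rejects.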

Kind Regards

Moritz



Alexey Pasumansky

  • Agisoft Technical Support
  • Hero Member
  • Posts: 14813
Re: Metashape on AWS EC2 and S3
« Reply #1 on: February 24, 2022, 01:40:31 PM »
Hello Moritz,

Which version of Metashape are you using? And can you reproduce similar problematic behavior without Metashape, for example by copying some large archives to the storage, or by packing the data into an archive directly from the node to the storage?

Best regards,
Alexey Pasumansky,
Agisoft LLC

freetec1

  • Newbie
  • Posts: 7
Re: Metashape on AWS EC2 and S3
« Reply #2 on: February 24, 2022, 03:19:31 PM »
Hello,

thanks for the reply. I am using version 1.8.1 build 13915 (64 bit).

I did some testing myself in the meantime.

Right now it does not seem 100% consistent (once the zip file was fine but I still got the same error from Metashape). Most of the time the zip file seems "corrupt" and will not unzip correctly.
Code:
unzip point_cloud.zip
Archive:  point_cloud.zip
 extracting: tracks.ply
points0.ply:  mismatching "local" filename (p0.ply),
         continuing with "central" filename version
 extracting: points0.ply
file #3:  bad zipfile offset (local header sig):  3295365
file #4:  bad zipfile offset (local header sig):  3582151
file #5:  bad zipfile offset (local header sig):  3962441
file #6:  bad zipfile offset (local header sig):  4345579
file #7:  bad zipfile offset (local header sig):  4734349
file #8:  bad zipfile offset (local header sig):  5123327
file #9:  bad zipfile offset (local header sig):  5509905
file #10:  bad zipfile offset (local header sig):  5896419
file #11:  bad zipfile offset (local header sig):  6286917
file #12:  bad zipfile offset (local header sig):  6684119
file #13:  bad zipfile offset (local header sig):  6997049
file #14:  bad zipfile offset (local header sig):  7323020
file #15:  bad zipfile offset (local header sig):  7725119
file #16:  bad zipfile offset (local header sig):  8112450
file #17:  bad zipfile offset (local header sig):  8491589
file #18:  bad zipfile offset (local header sig):  8872104
file #19:  bad zipfile offset (local header sig):  9254475
file #20:  bad zipfile offset (local header sig):  9639902
file #21:  bad zipfile offset (local header sig):  10032145
file #22:  bad zipfile offset (local header sig):  10429364
file #23:  bad zipfile offset (local header sig):  10734983
file #24:  bad zipfile offset (local header sig):  11040810
file #25:  bad zipfile offset (local header sig):  11440029
 extracting: p23.ply
 extracting: p24.ply
 extracting: p25.ply
 extracting: p26.ply
 extracting: p27.ply
 extracting: p28.ply
 extracting: p29.ply
 extracting: doc.xml
Using tar, on the other hand, works just fine, so there seems to be a problem specifically with unzip.

Processing the project locally on the EC2 instance, copying it to S3, and then reopening it (and also unzipping the .zip files) works fine. My theories are that either s3fs somehow blocks the file during the upload process so that Metashape can't access it in the meantime, or it has to do with s3fs's "multipart" upload, which splits larger uploads into multiple smaller parts for storage on S3.
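Since that local-then-copy route works reliably, a pragmatic workaround is to script exactly that: let Metashape write the project to local disk and only copy the finished project onto the s3fs mount afterwards. A hedged sketch (function name and paths are illustrative, not Metashape API):

```python
import os
import shutil

def sync_project_to_s3(local_dir, s3_mount_dir):
    """Copy a finished .psx project and its .files folder to the s3fs mount.

    Copying sequentially after processing avoids Metashape re-reading
    archives while s3fs may still be flushing multipart uploads.
    """
    os.makedirs(s3_mount_dir, exist_ok=True)
    for name in os.listdir(local_dir):
        src = os.path.join(local_dir, name)
        dst = os.path.join(s3_mount_dir, name)
        if os.path.isdir(src):
            shutil.copytree(src, dst, dirs_exist_ok=True)
        else:
            shutil.copy2(src, dst)
```

Called as, e.g., sync_project_to_s3("/home/ec2-user/work", "/mnt/photogrammetry_output/Testdatensatz") after the pipeline finishes.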

I will try your ideas and will come back to this later.

freetec1

  • Newbie
  • Posts: 7
Re: Metashape on AWS EC2 and S3
« Reply #3 on: February 24, 2022, 03:39:09 PM »
I just tested zipping the project folder from local EC2 storage to the mounted S3 bucket; this should be the same process Metashape performs, right? This worked fine without any problems, as did unzipping it on S3. Zipping took about 20 seconds.
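That manual round trip can also be scripted for repeated testing. A small sketch using only the standard library (function name and paths are mine, for illustration): if this succeeds on the mount but Metashape's own archives come back corrupt, the suspicion falls on how the archive is written rather than on the mount itself.

```python
import shutil

def roundtrip_zip(project_dir, mount_dir):
    """Zip project_dir onto the s3fs mount, then unpack it there again."""
    # make_archive appends ".zip" to the base name automatically
    archive = shutil.make_archive(f"{mount_dir}/project_test", "zip", project_dir)
    shutil.unpack_archive(archive, f"{mount_dir}/project_test_unzipped")
    return archive
```

For example, roundtrip_zip("/home/ec2-user/project", "/mnt/photogrammetry_output") mimics the 20-second test described above.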

What I did experience, on the other hand, was a rather slow unzipping process (around 5 minutes), as every file is uploaded separately to S3. Perhaps this is the issue: some kind of performance/responsiveness that Metashape expects but doesn't get?

Is there some way of telling Metashape to store the point cloud etc. in only one file rather than in multiple smaller ones?