

Messages - sergio

1
Hi,

We use Docker containers as a core part of our workflow. We often deactivate a Metashape license on one container and then reactivate it on another. But what would happen if we activate a license on a container and then remove the container without deactivating the license first? Are we doomed? This would be a very complicated issue for our product.

Best,

Sergio

2
Hi,

In our application we run Metashape tasks in a Docker container.
I noticed that every time I stop and start the container I get a license error ("wrong host for license"), so I have to deactivate and reactivate the license. This is not a problem for testing, but it will be a problem when we deploy our app. What is the best way of running a containerized application? Should I activate the license in the container before running the application and then deactivate it once the application is done? I would prefer not to include the license number in the code. Also, with this option we would not be able to run our tasks in parallel (the license would be limited to one Docker instance).

Any advice on this?
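What I had in mind is roughly the following sketch. The `AGISOFT_LICENSE_KEY` variable name and the wrapper function are my own naming, and the `Metashape.License` activate/deactivate calls should be double-checked against the API version you run:

```python
import os

def license_key_from_env(var="AGISOFT_LICENSE_KEY"):
    """Fetch the key from the environment so it never appears in the code."""
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"{var} is not set")
    return key

def run_licensed(task):
    """Activate on startup, run the task, and always deactivate on the way out."""
    import Metashape  # assumed to be installed inside the container
    lic = Metashape.License()
    lic.activate(license_key_from_env())
    try:
        task()
    finally:
        lic.deactivate()  # release the license even if the task raises
```

That would at least keep the key out of the image, but it still ties the license to one container at a time.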

3
And this one is from the GUI:
Code:
2021-07-15 11:04:47 LoadProject: path = /Users/sergiobromberg/Documents/RECON/Research/Data/Gray/ms_images_dense.psz
2021-07-15 11:04:47 Loading project...
2021-07-15 11:04:47 loaded project in 0.174483 sec
2021-07-15 11:04:47 Finished processing in 0.175545 sec (exit code 1)
2021-07-15 11:06:14 Finished processing in 0.012221 sec (exit code 1)
2021-07-15 11:06:56 Finished processing in 0.01292 sec (exit code 1)
2021-07-15 11:07:08 BuildModel: quality = Ultra high, depth filtering = Mild, PM version, source data = Dense cloud, surface type = Arbitrary, face count = Low, interpolation = Disabled, vertex colors = 0
2021-07-15 11:07:08 Generating mesh...
2021-07-15 11:07:08 grid size: 768x768x768
2021-07-15 11:07:08 block size: 256x256x256
2021-07-15 11:07:08 resolution: 0.0104193
2021-07-15 11:07:08 point count: 568172
2021-07-15 11:07:08 initializing... done in 0.168525 sec
2021-07-15 11:07:08 generating surface... done in 5.29774 sec
2021-07-15 11:07:13 generated 0 vertices, 0 faces
2021-07-15 11:07:13 Finished processing in 5.52177 sec (exit code 0)
2021-07-15 11:07:13 Error: Empty surface

4
Hello Alexei, this is my output when trying from the API:

Code:
ImportPoints: path = /workspace/data/Gray/ms_images_replaced_filtered_all.ply, format = PointsFormatPLY
Working in temporary directory /tmp/import_points.tmp.Zxj9b8
Importing point cloud in memory. Save project in .psx format for big file import.
Analyzing points...
processing 1715416 points...
Importing points...
processing 1715416 points...
Peak memory used: 74.77 MB at 2021-07-15 01:54:43
SaveProject: path = /workspace/data/Gray/ms_images_dense.psz
saved project in 0.700668 sec
BuildModel: source data = Dense cloud, surface type = Arbitrary, face count = High, interpolation = Enabled, vertex colors = 1
Grid size: 1148 x 1148 x 1148
Tree depth: 11
Tree set in 2.42821s (0 points)
Leaves/Nodes: 32768/37449
Traceback (most recent call last):
  File "/workspace/aseeo-research/depth_debug/metashape_build_from_pointcloud.py", line 63, in <module>
    chunk.buildModel(surface_type=Metashape.Arbitrary, source_data=Metashape.DenseCloudData, interpolation=Metashape.EnabledInterpolation)
Exception: Empty surface

5
Hi,

I am trying to build a mesh from an imported point cloud. I have defined the bounding box so that the points are inside it (I have around 1 million points). With every interpolation option I get the "Empty surface" error. I have tried both the GUI and the Python API, with no success...

Any help would be greatly appreciated.
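For reference, this is roughly how I set the region before building. The `bounding_region` helper and the margin value are my own, and `importPoints` / the `region` fields should be checked against your Metashape version:

```python
def bounding_region(points, margin=0.05):
    """Axis-aligned center and size enclosing `points`, padded by `margin`."""
    xs, ys, zs = zip(*points)
    center = tuple((max(c) + min(c)) / 2 for c in (xs, ys, zs))
    size = tuple((max(c) - min(c)) * (1 + margin) for c in (xs, ys, zs))
    return center, size

if __name__ == "__main__":
    import Metashape  # requires a licensed Metashape installation
    doc = Metashape.Document()
    chunk = doc.addChunk()
    chunk.importPoints("/workspace/data/Gray/ms_images_replaced_filtered_all.ply")
    # Point coordinates shown schematically; in practice they come from the cloud.
    center, size = bounding_region([(0, 0, 0), (1, 1, 1)])
    chunk.region.center = Metashape.Vector(center)
    chunk.region.size = Metashape.Vector(size)
    chunk.buildModel(surface_type=Metashape.Arbitrary,
                     source_data=Metashape.DenseCloudData,
                     interpolation=Metashape.EnabledInterpolation)
```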

6
Python and Java API / Proper workflow for depthmap replacement
« on: July 01, 2021, 04:18:02 AM »
Hi,

In our application we want to use Metashape with depthmaps generated externally. So far our processing is the following:

1. We build our depthmaps in the "standard" way:

addPhotos>>matchPhotos>>alignCameras>>buildDepthMaps

2. Then we replace the depth maps using roughly the following (pseudocode):
Code:
for cam, our_depth_image in zip(cameras, our_depths):

    if cam in chunk.depth_maps.keys():
        ms_dm = chunk.depth_maps[cam]

        if replace_depth:
            # Reuse the original map's dimensions and pixel layout so the
            # replacement matches what Metashape expects.
            ms_image = ms_dm.image()
            w, h = ms_image.width, ms_image.height
            byte_image = our_depth_image.tobytes()
            ms_dm.setImage(ms_image.fromstring(
                byte_image,
                w, h,
                channels=ms_image.channels,
                datatype=ms_image.data_type))

    else:
        print(cam)
        print("Camera has no depth map")

3. Finally, we build the model:

Code:
chunk.buildModel(surface_type=Metashape.Arbitrary, source_data=Metashape.DepthMapsData, interpolation=Metashape.EnabledInterpolation)
When the model is saved to .psz we can see that the depth maps have been replaced correctly; however, the model does not look as expected. I wonder if we are missing something in the process (e.g. depth scaling, or some other step required in this kind of workflow).
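One thing I plan to check is whether our depths are in the same units and range as the maps Metashape produced itself. A small sanity check (my own helpers, nothing Metashape-specific):

```python
def depth_stats(values):
    """Min/max of valid depths; zeros and NaNs are treated as 'no data'."""
    valid = [v for v in values if v > 0 and v == v]  # v == v filters NaN
    return min(valid), max(valid)

def scale_ratio(ms_range, our_range):
    """Ratio of depth spans; far from 1.0 hints at a unit mismatch."""
    return (our_range[1] - our_range[0]) / (ms_range[1] - ms_range[0])
```

If the ratio came out near 1000, for example, that would suggest the external maps are in different units (millimetres vs. metres) than what Metashape wrote.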

Best,

Sergio
