Author Topic: Duplicating chunks RAM problem  (Read 4531 times)

igor73

  • Full Member
  • Posts: 228
Duplicating chunks RAM problem
« on: February 16, 2015, 02:28:06 AM »
I have a large arbitrary project with 3000 images. Everything is aligned and the dense cloud is already generated.  I have 64 GB RAM, and for meshing I have to split the project into chunks.  In previous models I duplicated the chunk 4 times if I needed to split in 4, then resized the region and removed unnecessary cameras from the duplicated chunks.  The problem with this project is that it's so huge that duplicating a chunk takes up about 7 GB of RAM.  I tried removing all my cameras and also tried cropping the dense cloud, but that only saved about 0.2 GB.  So with 4 duplicates I have used 30 GB before even starting to mesh.  Not a great start. 
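
The "remove unnecessary cameras" step could in principle be scripted. A rough sketch, assuming the Pro edition's Python API (PhotoScan 1.x names, which may differ between versions; Standard has no scripting) — the padding factor and the position-based test are just a heuristic made up for illustration, since a camera outside the box can still see into it:

Code: [Select]
# Disable (rather than delete) cameras whose station lies outside a
# padded copy of the region. Heuristic only: a camera outside the box
# can still see into it, hence the generous padding.
import PhotoScan

PAD = 1.5  # keep cameras within 1.5x the region extents (made-up value)

chunk = PhotoScan.app.document.chunk
region = chunk.region

for camera in chunk.cameras:
    if camera.center is None:  # camera was not aligned; leave it alone
        continue
    # camera position relative to the region centre, in the region's own axes
    local = region.rot.t() * (camera.center - region.center)
    inside = all(abs(local[k]) <= PAD * region.size[k] / 2 for k in range(3))
    camera.enabled = inside

As the 0.2 GB figure above suggests, though, dropping cameras saves little memory when meshing from the dense cloud.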

So what are my options?  One option would be to not duplicate the chunk at all and just resize the region, process, export the mesh, resize again, process and export, etc.  The problem is that's annoying, because I can't set up a batch process that way.  I just want to leave the computer running and batch-process all the chunks.  Merging the models again might also be problematic this way? 
Any options I have missed?
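
One way to make the no-duplicate variant batchable anyway is a small script that walks a grid of sub-regions over the single chunk, meshing and exporting each part in turn. A minimal sketch, assuming the Pro edition's Python API (PhotoScan 1.x names, which may differ between versions; Standard has no scripting) — the 2x2 grid and output file names are made up for illustration:

Code: [Select]
# Resize region -> build mesh -> export, in a loop over a grid of
# sub-regions. Only one part's mesh lives in the chunk at a time.
import PhotoScan

doc = PhotoScan.app.document
chunk = doc.chunk

region = chunk.region
size, center, rot = region.size, region.center, region.rot
GRID = 2  # split the region into a 2 x 2 grid of sub-regions

for i in range(GRID):
    for j in range(GRID):
        sub = PhotoScan.Region()
        sub.rot = rot  # keep the region's orientation
        sub.size = PhotoScan.Vector([size.x / GRID, size.y / GRID, size.z])
        # centre of this grid cell, offset from the full region's centre,
        # expressed in the region's own (rotated) axes
        local = PhotoScan.Vector([((i + 0.5) / GRID - 0.5) * size.x,
                                  ((j + 0.5) / GRID - 0.5) * size.y, 0.0])
        sub.center = center + rot * local
        chunk.region = sub
        chunk.buildModel(surface=PhotoScan.Arbitrary,
                         source=PhotoScan.DenseCloudData,
                         face_count=PhotoScan.HighFaceCount)
        chunk.exportModel("part_%d_%d.obj" % (i, j))

The exported parts can then be merged in an external tool, which sidesteps the merge question above.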

And a feature request: make it possible to define several regions in a single chunk, so we don't have to use a workaround for generating the mesh in chunks.  BTW, I'm using the Standard version.  Maybe Pro has this? 
« Last Edit: February 16, 2015, 02:54:29 AM by igor73 »

igor73

  • Full Member
  • Posts: 228
Re: Duplicating chunks RAM problem
« Reply #1 on: February 16, 2015, 03:13:29 AM »
Flat line, looks like I made it. 


igor73

  • Full Member
  • Posts: 228
Re: Duplicating chunks RAM problem
« Reply #2 on: February 16, 2015, 03:17:15 AM »
3 min later. 

Mfranquelo

  • Full Member
  • Posts: 171
Re: Duplicating chunks RAM problem
« Reply #3 on: September 12, 2015, 08:17:43 PM »
Hello Igor,

How did you sort out your problem with RAM?

I am currently processing a project with a dense cloud of about 400 million points, which results in a 50-million-polygon model.
I can't process this amount of data on my 64 GB computer (the mesh generation runs forever and usually crashes with low-memory warnings).

I've just tried duplicating the chunk in two, removing half of the dense cloud in one chunk and leaving the other half in the other chunk. After that I resize the bounding box, but I am not removing "unnecessary cameras". What's that for? Does it make the process run faster?

Thanks,
Manuel.
« Last Edit: September 12, 2015, 08:54:31 PM by Mfranquelo »

Mohammed

  • Full Member
  • Posts: 191
Re: Duplicating chunks RAM problem
« Reply #4 on: September 14, 2015, 03:31:25 PM »
I always face the same problem, even with the Pro edition. Maybe the best solution is to use the split-chunks Python script, or to do it like Manuel said.
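
For reference, the core of such a split script might look like this — a sketch only, assuming the Pro Python API (PhotoScan 1.x names, which may differ between versions), using the same region arithmetic as the loop sketch in the first post; the 2x2 grid and labels are made up:

Code: [Select]
# Duplicate the active chunk once per grid cell and shrink each copy's
# region to that cell, so meshing can then be queued per chunk in a
# single Batch Process.
import PhotoScan

doc = PhotoScan.app.document
source = doc.chunk
region = source.region
size, center, rot = region.size, region.center, region.rot
GRID = 2  # 2 x 2 split, for illustration

for i in range(GRID):
    for j in range(GRID):
        part = source.copy()  # NB: each copy carries the dense cloud too
        part.label = "part %d-%d" % (i, j)
        cell = PhotoScan.Region()
        cell.rot = rot
        cell.size = PhotoScan.Vector([size.x / GRID, size.y / GRID, size.z])
        local = PhotoScan.Vector([((i + 0.5) / GRID - 0.5) * size.x,
                                  ((j + 0.5) / GRID - 0.5) * size.y, 0.0])
        cell.center = center + rot * local
        part.region = cell
doc.save()

Because every copy() duplicates the dense cloud, this has exactly the memory cost igor73 described; the loop-and-export sketch earlier in the thread avoids that cost by reusing one chunk.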

sc

  • Jr. Member
  • Posts: 59
Re: Duplicating chunks RAM problem
« Reply #5 on: September 14, 2015, 06:03:16 PM »
That's the safest way to do it. I always split the chunk into parts if the dense cloud is over 100M points, just to be safe. The mesh generation is much faster that way as well.