I am running network processing with two machines, an Alienware Aurora (GTX 1080, i7-7700K, 32GB RAM) and a Mac Pro 2013 (6-core Xeon E5, dual AMD FirePro D500, 48GB RAM). The Aurora acts as server, client and node, while the Mac Pro acts solely as a node. The machines are currently aligning the cameras of a project of some 6,300 photos and roughly 50 GCPs. Accuracy is set to Highest.
The Mac Pro is using its two AMD GPUs to full capacity, whilst its CPU is hardly doing anything. RAM use is at roughly 25%.
The Aurora is using 5% of its GPU and 14% of its CPU; RAM use is at 50%.
Processing now stands at 41.9% after just over 26 hours.
Why is the Aurora a) not using more of its RAM to feed the GPU/CPU, and b) using so little of its processing power?
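In case it helps narrow things down: my understanding is that each node keeps its own GPU/CPU settings, so something like the sketch below, run in the Python console on the Aurora, should show whether the GTX 1080 is actually enabled on that node. This assumes the software here is Agisoft Metashape; enumGPUDevices, gpu_mask and cpu_enable are from its Python API, and the exact names may differ between versions.

    import Metashape  # run inside Metashape's own Python console on the node in question

    # List the GPU devices this node can see and whether each is enabled.
    # gpu_mask is a bitmask: bit N set means device N is enabled.
    for i, dev in enumerate(Metashape.app.enumGPUDevices()):
        enabled = bool(Metashape.app.gpu_mask & (1 << i))
        print(i, dev["name"], "enabled" if enabled else "disabled")

    # Enable only the first GPU (the GTX 1080 here), in case it was masked off.
    Metashape.app.gpu_mask = 1

    # Also let the CPU work alongside the GPU.
    Metashape.app.cpu_enable = True

If the mask on the Aurora already shows the 1080 enabled, then presumably that isn't the issue, and I'd like to understand whether such low utilisation on the server/client machine is simply expected during alignment.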