Running NoMachine Server 6.6.8 on an NVIDIA Tegra TX2 module (arm64) running Ubuntu 16.04.6 LTS (Linux4Tegra R28, revision 2.1). The client machine runs the same NoMachine version on Kubuntu 18.10. The Tegra board has no display connected, so it is serving a virtual display.
Right after booting the server, the combined NoMachine + Xorg CPU usage looks normal (around 10%). After a few minutes, however, it starts to climb: nxnode.bin reaches around 60%, nxcodec.bin around 88%, and Xorg around 30%.
Is there any additional information I can provide to help find the cause of this high CPU usage?
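In case it helps, here is one way I could capture the growth over time: sample the NoMachine and Xorg processes periodically and log the result. This is just a sketch; the log file name and the 60-second interval are arbitrary choices, not anything NoMachine prescribes.

```shell
# Append a timestamped CPU/memory snapshot of the NoMachine and Xorg
# processes every 60 seconds. Stop with Ctrl-C and attach the log.
# (nx-cpu-usage.log and the 60 s interval are hypothetical choices.)
while true; do
    date
    ps -eo pcpu,pmem,comm --sort=-pcpu | grep -E 'nxnode|nxcodec|Xorg'
    sleep 60
done >> nx-cpu-usage.log
```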
Can you check whether the problem still appears if you disable the X server and allow NoMachine to create its own virtual framebuffer?
To do that, stop the running desktop manager (open a terminal, or connect via SSH, and execute the command: sudo service lightdm stop), then restart nxserver (sudo /usr/NX/bin/nxserver --restart). You should then be able to connect via NoMachine and create a session.
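For reference, the whole sequence on the Tegra board would look something like this. I am assuming lightdm is the display manager in use, which is the default on L4T/Ubuntu 16.04; substitute your own display manager's service name if it differs.

```shell
# Stop the display manager so no X server owns the console.
sudo service lightdm stop

# Restart the NoMachine server so it detects the headless state and
# creates its own virtual framebuffer.
sudo /usr/NX/bin/nxserver --restart

# Optionally confirm the server came back up before reconnecting.
sudo /usr/NX/bin/nxserver --status
```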
Sorry for my late answer; I was expecting an email notification for new messages, so I didn't see yours until I checked the forums!
If I kill the desktop manager and restart NoMachine, then when I try to connect again I briefly get a black screen and am kicked back to the "Recent connections" screen on the client, no matter how many times I retry. Rebooting the server board lets me connect once again, with the original desktop manager session.
If you disable the X server again and still get a black screen, please collect logs with sudo /usr/NX/bin/nxserver --debug --collect and send them, together with the .xserver-errors file (it should be in your home directory), to forum[at]nomachine[dot]com.