Performance over LAN

    #32970
    hellcats
    Participant

    A very pleasant use case for NoMachine is using the Mac client to connect to PC and Linux boxes over a gigabit LAN. I create multiple full-screen desktops on the Mac and capture the keyboard and mouse in NoMachine. I set quality to max with a specific frame rate of 50 or 60 Hz. The effect is nearly indistinguishable from using the remote computers directly, and you can flick between systems with a four-finger swipe on the Mac trackpad. Lovely.

    But (there’s always a but), the high frame rate slowly degrades after a few minutes of use, settling at around 24-27 FPS (measured with a digital camera). Not bad, but not as good as just after first connecting. Disconnecting and reconnecting restores the high frame rate, but only for a few more minutes. Maximum network traffic is about 40 Mbps when dragging a window rapidly around the screen, so bandwidth isn’t really an issue. I presume that NoMachine is applying some kind of adaptive flow control to smooth out the bandwidth, but on a LAN I really don’t need that. I would really enjoy having the option to disable the adaptive algorithm so that the high frame rate is retained at all times.
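    (In case anyone wants to reproduce the numbers above: the 40 Mbps figure is easy to double-check by watching the transmit counter on the server’s NIC. The rough Python sketch below just reads /proc/net/dev on the Linux side once per second; the interface name is only an example, substitute your own.)

        #!/usr/bin/env python3
        # Rough per-second transmit throughput for one interface, read from /proc/net/dev.
        import time

        IFACE = "enp5s0"  # example name only -- change to your NIC

        def tx_bytes(iface):
            with open("/proc/net/dev") as f:
                for line in f:
                    if line.strip().startswith(iface + ":"):
                        # after the colon: 8 receive fields, then transmitted bytes
                        return int(line.split(":", 1)[1].split()[8])
            raise ValueError(f"interface {iface} not found")

        prev = tx_bytes(IFACE)
        while True:
            time.sleep(1)
            cur = tx_bytes(IFACE)
            print(f"{(cur - prev) * 8 / 1e6:.1f} Mbit/s out")
            prev = cur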

    My config:

    NoMachine version 7.4.1 on all systems.

    • 15″ MacBook Pro with Radeon Pro 560 4 GB (with a 3440×1440 external monitor)
    • Linux box #1: Ubuntu 20.04; AMD 5800X, Nvidia 3090, 64 GB RAM
    • Linux box #2: Ubuntu 18.04; Intel 6-core, Nvidia 1080Ti, 64 GB RAM
    • PC: Windows 10; Intel 6-core (5820), Nvidia 980 Ti, 64 GB RAM
    #33047
    fra81
    Moderator

    Hi,

    This is strange indeed. Did you have the chance to verify whether this behaviour only occurs when connecting from the Mac? Does using the integrated monitor instead of the external one change anything? And would you be willing to run a debug package to gather more information?

    #33079
    hellcats
    Participant

    I performed some more tests this weekend and can replicate the problem from the PC client as well as from the Mac client connecting to a Linux box. I can more easily get it to switch into “slow mode” on the Linux machine with the Nvidia 3090 than on the one with the 1080Ti (FYI, the 3090 machine has an AMD 5800X and the 1080Ti machine has an Intel 6-core i7). I also tried using the MacBook’s built-in Retina display, and the performance was abysmal. Another thing I noticed is that the GPU usage on the Linux server reported by nvidia-smi hovers around 1-2% when things are working well, but climbs into the 3-6% range and stays there when things slow down. Disconnecting and reconnecting restores the fast frame rate and lowers GPU usage.
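    (A rough logger along these lines makes it easy to timestamp that GPU-usage shift against the slowdown; it just polls nvidia-smi once per second on the server, using the standard utilization.gpu and temperature.gpu query fields.)

        #!/usr/bin/env python3
        # Log GPU utilization and temperature once per second via nvidia-smi,
        # so the jump from ~1-2% to ~3-6% can be timestamped. Assumes a single GPU.
        import subprocess
        import time

        QUERY = ["nvidia-smi",
                 "--query-gpu=utilization.gpu,temperature.gpu",
                 "--format=csv,noheader,nounits"]

        while True:
            out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
            util, temp = (field.strip() for field in out.stdout.strip().split(","))
            print(f"{time.strftime('%H:%M:%S')}  gpu={util}%  temp={temp}C", flush=True)
            time.sleep(1)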

    I am willing to use a debug build to try to track this issue down. Thanks for looking into it.

     

    #33090
    dark_sylinc
    Participant

    This may be a silly answer, but I noticed that some of the rigs you mention (particularly your two Linux boxes) are severely prone to overheating.

    Check your sensors to make sure you’re not just being thermally throttled. For your AMD rig on Ubuntu you’ll need a mainline kernel of 5.11+, otherwise the CPU sensor data will show up as 0°C.
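    Something like the quick sysfs dump below is enough to eyeball the temperatures on the Linux boxes (the sensors command from lm-sensors shows the same data); it simply prints whatever hwmon temperature sensors the kernel exposes, which is why the 5.11+ kernel matters on the 5800X.

        #!/usr/bin/env python3
        # Dump every temperature sensor the kernel exposes under /sys/class/hwmon,
        # to check for thermal throttling. On a Ryzen 5800X the CPU readings only
        # show up on recent kernels, as noted above.
        from pathlib import Path

        for chip in sorted(Path("/sys/class/hwmon").glob("hwmon*")):
            name = (chip / "name").read_text().strip()
            for temp in sorted(chip.glob("temp*_input")):
                label_file = chip / temp.name.replace("_input", "_label")
                label = label_file.read_text().strip() if label_file.exists() else temp.name
                print(f"{name:12s} {label:16s} {int(temp.read_text()) / 1000:.1f} °C")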

    If you suspect it’s adaptive flow control, try disabling it outright (Ctrl+Alt+0 -> Display -> Change Settings -> Modify -> Disable Network-adaptive display quality). There are also other options you can try tweaking.

    Another thing you can try is toggling HW acceleration server-side.

    Cheers,

    Note: I don’t work for NoMachine; I’m just another user.

    #33135
    fra81
    Moderator

    @dark_sylinc

    You don’t work for NoMachine, but it seems like you do! 😉


    @hellcats

    I’d try the suggestions dark_sylinc rightly provided, starting with toggling hardware acceleration. This can also be done from the Server settings GUI, in the Performance panel.

    #33198
    hellcats
    Participant

    Well, my earlier response didn’t submit successfully.

    Basically, I’ve tried changing settings, etc. on both the client and server side to try to get the highest performance over LAN, and I can get high performance for a little while, but the frame rate always degrades over time to around 20-25 fps. Not unusable, but also not as good as NoMachine is capable of.

    I see the problem most easily when connecting to the Linux box with the Nvidia 3090 (from both the Mac and PC clients), but I also see it on the other Linux computer (with the 1080Ti).

    I am willing to run a debug version to try to get to the bottom of this if someone on the development team is interested in having me run some tests.

    – hellcats

     

    #33327
    dark_sylinc
    Participant

    Quoting hellcats: “Basically, I’ve tried changing settings, etc. on both the client and server side to try to get the highest performance over LAN, and I can get high performance for a little while, but the frame rate always degrades over time to around 20-25 fps. Not unusable, but also not as good as NoMachine is capable of.”

    Did you try disabling HW acceleration as I suggested? Disabling it should in theory give you slower performance, but in practice enabling it can cause a ton of problems.

    If you have it enabled, the client should say: Display 1920×1080, codec H.264 NVENC/VAAPI, audio Opus 22kHz stereo

    If you have it disabled, the client should say: Display 1920×1080, codec H.264/VAAPI, audio Opus 22kHz stereo

    Another source of slowdown may be Ubuntu’s compositor. I’m not familiar enough with Gnome’s compositor to suggest tweaks for it. On XFCE you’d go to Window Manager Tweaks -> Compositor -> Enable display compositor (and turn it off).

    Try using OBS (Open Broadcaster Software) to record the screen and see if the same slowdown appears after a while (the way OBS and NoMachine capture the display appears to be very similar).


This topic was marked as solved.