3840×1600 Resolution

This topic contains 5 replies, has 2 voices, and was last updated by fra81 6 days, 19 hours ago.

Viewing 6 posts - 1 through 6 (of 6 total)
    #23532
    avvid
    Participant

    Hello,

    I’m using NoMachine on Linux with an NVIDIA GPU and VirtualGL enabled.

    My monitor is 3840×1600, and whenever I use my maximum resolution with H.264, I get artifacts when scrolling or during any other fast-moving animation. Pictures attached. These all go away at a lower resolution.

    Any thoughts?

    #23541
    fra81
    Moderator

    Hi,

    Can I assume you already tried the VP8 encoder and that it solved the problem?

    Please check if any of the following options solves the problem with H.264 (try them in this order):

    1) disable hardware decoding by checking the ‘Disable client side hardware decoding’ option in the session menu, under Display settings

    2) disable hardware encoding on the server side by adding this line to the /usr/NX/etc/node.cfg file (then restart the server):

    EnableHardwareEncoding 0
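For reference, the edit above can be scripted. This is a minimal sketch, not NoMachine tooling: the `set_cfg_key` helper is my own, and the /usr/NX paths assume a default Linux install.

```shell
# set_cfg_key FILE KEY VALUE: replace the key (commented out or not),
# or append it if it is not present at all.
set_cfg_key() {
  file=$1; key=$2; value=$3
  if grep -q "^[#[:space:]]*$key" "$file"; then
    sed -i "s/^[#[:space:]]*$key.*/$key $value/" "$file"
  else
    printf '%s %s\n' "$key" "$value" >>"$file"
  fi
}

# Usage on a default install (run as root, and keep a backup):
#   cp /usr/NX/etc/node.cfg /usr/NX/etc/node.cfg.bak
#   set_cfg_key /usr/NX/etc/node.cfg EnableHardwareEncoding 0
#   /usr/NX/bin/nxserver --restart
```

The helper handles the common case where the key already exists in node.cfg as a commented-out default.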

    #23542
    avvid
    Participant

    Hey,

    The results are different from what I expected.

    VP8 worked flawlessly before I created this post; you were correct. The downside of VP8 was that it used my client-side GPU, which wasn’t desired. So I tried your options in turn:

    1) Disable client side hardware decoding: No change

    2) Disable hardware encoding: Works great!

    My fear was (and is) that my client’s GPU would be used now, but that doesn’t seem to be the case at all; maybe my CPU is being used slightly more, but that’s it. Also, nxserver no longer appears in my server-side NVIDIA process list like it used to.

    So my question is: if my server-side card isn’t encoding and my client GPU isn’t being used, then what is doing all the work?

    Thanks for the help.

    #23545
    fra81
    Moderator

    Note that the server and client sides can each use the GPU or the CPU independently: the server side can use the hardware encoder or a software fallback, and the client side can use the hardware decoder or a software fallback. Both sides can use the hardware (GPU) at the same time, or only one of them, or neither, depending on hardware capabilities and on the settings you’re using. To answer your question: when the GPU is not used (on either side), the CPU does the work.
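A rough way to check which side of that split is active on the server, sketched below. The helper name is mine; the idea of grepping nvidia-smi for an nx process follows from the observation above that nxserver shows up in the NVIDIA process list when hardware encoding is in use.

```shell
# gpu_encode_check: with hardware encoding active, an nx process should
# appear in nvidia-smi's process list; with the software fallback it
# does not (watch CPU use with top instead).
gpu_encode_check() {
  if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi | grep -i nx || echo "no nx process on the GPU"
  else
    echo "nvidia-smi not available on this host"
  fi
}

gpu_encode_check
```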

    That said, it seems there is a problem with hardware encoding. Please provide more info so we can investigate further:

    – server OS version (as I understand, Fedora 29)

    – Nvidia card model

    – video drivers type and version

    Also, server-side logs would be useful. You can gather them as explained in https://www.nomachine.com/AR10K00697 and send them to forum@nomachine.com, if you prefer.

    #23546
    avvid
    Participant

    Got it.

    Please see the information below and attached:

    – Fedora 29 Kernel 5.2.7

    – GeForce GTX 1050

    – Nvidia Driver Version: 435.21

    – CUDA Version 10.1

    #23589
    fra81
    Moderator

    Unfortunately, we’re not able to reproduce the issue with hardware encoding in our labs. Would you test one more thing? Restore the EnableHardwareEncoding key to its original value and change the encoder’s rate control mode instead (the EncoderMode key), i.e.:

    EnableHardwareEncoding 1

    EncoderMode bitrate
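If it helps, a small sketch to double-check that both keys are set as above before restarting. The `verify_encoder_keys` helper and the /usr/NX paths are assumptions for a default install, not NoMachine tooling.

```shell
# verify_encoder_keys FILE: succeed only if both keys hold the values
# suggested above, and print their current lines.
verify_encoder_keys() {
  grep -q '^EnableHardwareEncoding 1' "$1" &&
  grep -q '^EncoderMode bitrate' "$1" &&
  grep -E '^(EnableHardwareEncoding|EncoderMode)' "$1"
}

# Usage on a default install (run as root):
#   verify_encoder_keys /usr/NX/etc/node.cfg && /usr/NX/bin/nxserver --restart
```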

