3840×1600 Resolution


#23532
    avvid
    Participant

    Hello,

    I’m using NoMachine on Linux with an NVIDIA GPU and VirtualGL enabled.

My monitor is 3840×1600, and whenever I use my maximum resolution with H.264, I get artifacts when scrolling or during any other fast-moving animation. Pictures are attached. These all go away at a lower resolution.

    Any thoughts?

    #23541
    fra81
    Moderator

    Hi,

can I assume you already tried the VP8 encoder, and did that solve the problem?

    Please check if any of the following options solves the problem with H.264 (try them in this order):

    1) disable hardware decoding by checking the ‘Disable client side hardware decoding’ option in the session menu, Display settings

2) disable hardware encoding on the server side by adding this line to the /usr/NX/etc/node.cfg file (then restart the server):

    EnableHardwareEncoding 0
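
For example, assuming the default installation path, applying this from a shell could look roughly like the sketch below (check the file first, as the key may already be present, possibly commented out):

# check whether the key is already present in the node configuration
grep -n 'EnableHardwareEncoding' /usr/NX/etc/node.cfg
# if it is absent, append it; otherwise edit the existing line instead
echo 'EnableHardwareEncoding 0' | sudo tee -a /usr/NX/etc/node.cfg
# restart the server so the change takes effect
sudo /usr/NX/bin/nxserver --restart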

    #23542
    avvid
    Participant

    Hey,

the results are different from what I expected.

VP8 worked flawlessly before I created this post; you were correct. The downside of VP8 was that it used my client-side GPU, which wasn't desired. So I tried your options in turn:

    1) Disable client side hardware decoding: No change

    2) Disable Hardware Encoding: Works great!

My fear was (and is) that my client's GPU would be used now, but that doesn't seem to be the case at all; maybe my CPU is being used slightly more, but that's it. Also, nxserver no longer appears in my server-side Nvidia process list like it used to.

So my question is: if my server-side card isn't encoding and my client GPU isn't being used, then who is doing all the work?

    Thanks for the help.

    #23545
    fra81
    Moderator

Note that the server and client sides can each use the GPU or the CPU independently: the server side can use the hardware encoder or a software fallback, and the client side can use the hardware decoder or a software fallback. Both sides can use the hardware (GPU) at the same time, or only one of them can, or neither, depending on hardware capabilities and on the settings you're using. To answer your question: when the GPU is not used (on either side), the CPU does the work.
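
As a rough way to see which side is actually doing the work (an informal check, not an official procedure), you can look at the GPU process list on the server and watch CPU usage of the NoMachine processes:

# on the server: list processes currently using the Nvidia GPU
nvidia-smi
# on either side: watch CPU usage of the NoMachine processes
top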

    That said, it seems there is a problem with hardware encoding. Please provide more info so we can investigate further:

– server OS version (Fedora 29, as I understand)

    – Nvidia card model

    – video drivers type and version

Also, server-side logs would be useful. You can gather them as explained in https://www.nomachine.com/AR10K00697 and send them to forum[at]nomachine[dot]com, if you prefer.

    #23546
    avvid
    Participant

    Got it.

    Please see the information below and attached:

    – Fedora 29 Kernel 5.2.7

    – GeForce GTX 1050

    – Nvidia Driver Version: 435.21

    – CUDA Version 10.1

    #23589
    fra81
    Moderator

Unfortunately we're not able to reproduce the issue with hardware encoding in our labs. Would you test one more thing? That would be to restore the EnableHardwareEncoding key to its original value and change the encoder's rate control mode instead (the EncoderMode key), i.e.:

    EnableHardwareEncoding 1

    EncoderMode bitrate
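
For example, a minimal sketch of applying these two keys and restarting (assuming EnableHardwareEncoding is already present uncommented from the earlier step; adjust if your node.cfg differs):

# restore hardware encoding and set the rate control mode
sudo sed -i 's/^EnableHardwareEncoding.*/EnableHardwareEncoding 1/' /usr/NX/etc/node.cfg
echo 'EncoderMode bitrate' | sudo tee -a /usr/NX/etc/node.cfg
# restart the server so the new settings take effect
sudo /usr/NX/bin/nxserver --restart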

    #23704
    avvid
    Participant

    Hello —

    I ended up rebuilding the machine and installing slightly different drivers and NoMachine 6.8.1.  The problem is no longer present.  Thanks for the help.

    #23710
    fra81
    Moderator

    Hi,

would you tell us exactly which new drivers?

    #23726
    avvid
    Participant

    Sure.

Originally I used the official Nvidia drivers from nvidia.com.

This time I used a Fedora package from this repository: https://negativo17.org/nvidia-driver/
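
In case it's useful to anyone else, the install was roughly along these lines (the repo URL and package names are as documented on that site; double-check them there before running):

# add the negativo17 Nvidia repository on Fedora
sudo dnf config-manager --add-repo=https://negativo17.org/repos/fedora-nvidia.repo
# install the driver packages
sudo dnf install nvidia-driver nvidia-driver-cuda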

    #23738
    fra81
    Moderator

    So it seems the official ones are affected. Thank you for the info 😉


This topic was marked as solved and is closed to new replies.