Forums / NoMachine for Linux / Glitches/artifacts

Viewing 8 posts - 1 through 8 (of 8 total)
  • #28011

    Split from

    Sorry for the late response, totally forgot after the initial response.

    “Something good came out of it. We discovered that Windows editors misbehave when editing unix files, so we will improve the software to make sure this doesn’t happen :)”

    Well I’m glad to hear that, I thought this was all on me. Which it was, since I edited the config files. But glad in the end there’s a logical reason for it! 🙂

    You can consider purchasing a dummy dongle. This will make your system think there is a monitor attached, so the GPU will function correctly.

    I did this and it worked! Due to my CPU/onboard GPU I was stuck at 1080p max, though. So I bought a simple (NVIDIA) GPU to get a higher resolution. For some reason, with the HDMI dummy attached to that GPU, I don’t get any visuals at all anymore.

    And when I connect my TV to the Linux machine instead, I constantly get visual glitches and artifacts. It seems like the desktop is bleeding through, and it’s also very laggy… Is there a logical explanation for that?

    On the TV itself the Linux machine doesn’t show these glitches/artifacts at all, so it’s probably not an (NVIDIA) driver issue. There is also no lag on the TV, only when I access the machine through NoMachine from my Windows machine. I’ve recorded this, if that might be helpful.

    Thanks in advance


    Zenith, it’s not clear whether you mean there are glitches and artifacts when connecting with NoMachine. Since this can be considered a new topic, unrelated to the authentication issue you previously had, please post the details of your setup here.


    I just discovered this was split into a new topic, so sorry for the late response.

    You want my hardware details?

    I’ve also recorded the issue on my screen; I can send that through e-mail if that’s helpful?


    Send any attachments as before, making sure you put the new topic title in the subject.

    Please write the details of your environment here, for the developers’ convenience and for anyone else reading the thread.


    E-mail is sent!


    System specifications:

    Linux server: i5-2400, 8GB RAM, NVIDIA GT 1030 and GTX 1650 (both GPUs tested)

    Windows client: i7-8700K, 32GB RAM, GTX 1080ti


    Hi Zenith,

    The fact that you don’t see glitches on the TV doesn’t rule out a driver problem. Rendering (pushing) to the GPU may work correctly, while reading the screen content back from video memory may not. Unfortunately this is not something that NoMachine can control.


    But can I force NoMachine to use the onboard GPU instead of the NVIDIA one? As it is, this is unusable: besides the glitches, it also becomes very laggy.


    It is the display server that uses the GPU, and which one it uses depends on the configuration of the machine and of the display server. NoMachine does not decide which GPU they use.
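    If the display server is Xorg, one common way to pin it to the onboard GPU is an explicit Device section in the Xorg configuration. This is only a sketch, not NoMachine documentation: the BusID below is a typical address for an Intel iGPU and must be verified on your own machine with `lspci`, and the iGPU must be enabled in the BIOS (with the dummy plug or a monitor attached to the motherboard output).

    ```
    # /etc/X11/xorg.conf.d/20-intel.conf (example path; adjust for your distro)
    Section "Device"
        Identifier "IntelOnboard"
        Driver     "modesetting"
        # Find the real bus ID with:  lspci | grep -i vga
        # e.g. "00:02.0 VGA compatible controller: Intel ..." -> PCI:0:2:0
        BusID      "PCI:0:2:0"
    EndSection
    ```

    After restarting the display server, Xorg should render on the Intel GPU, and NoMachine will capture whatever that server displays.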


This topic was marked as solved, you can't post.