June 8, 2020 at 10:32 #28011
Sorry for the late response, totally forgot after the initial response.
“Something good came out of it. We discovered that Windows editors misbehave when editing unix files, so we will improve the software to make sure this doesn’t happen :)”
Well I’m glad to hear that, I thought this was all on me. Which it was, since I edited the config files. But glad in the end there’s a logical reason for it! 🙂
You can consider purchasing a dummy dongle. This will make your system think there is a monitor attached and the GPU will function correctly.
I did this and it worked! Due to my CPU/onboard GPU I was stuck at 1080p max, though, so I bought a simple (NVIDIA) GPU to be able to have a higher resolution. For some reason, with the HDMI dummy attached to the GPU I don’t get any video output at all anymore.
And when connecting my TV to the Linux machine I constantly get visual glitches and artifacts. It seems like the desktop is glitching through, and it’s also very laggy… Is there a logical explanation for that?
On my TV the Linux machine doesn’t show these glitches/artifacts at all, so it’s probably not a (NVIDIA) driver issue. Also there is no lag on my TV only when I access it through NoMachine on my Windows machine. I’ve recorded this if that might be helpful.
Thanks in advance

June 8, 2020 at 10:40 #28026
Zenith, it’s not clear whether you mean there are glitches and artifacts when connecting with NoMachine. Since this can be considered a new topic, unrelated to the authentication issue you previously had, please post here the details of your setup.

June 11, 2020 at 14:23 #28078
I just discovered this was split into a new topic, so sorry for the late response.
You want my hardware details?
I also have recorded the issue on my screen, so I can send that through e-mail if that’s helpful?

June 11, 2020 at 14:37 #28087
Send any attachments as before, making sure you put the new topic title as the subject.
Please write here the details of your environment for the convenience of the developers and anyone else reading the thread.

June 12, 2020 at 14:36 #28099
E-mail is sent!
Linux server: i5-2400, 8GB RAM, NVIDIA GT 1030 and GTX 1650 (both GPUs tested)
Windows client: i7-8700K, 32GB RAM, GTX 1080 Ti

June 16, 2020 at 18:37 #28156
fra81 Moderator
The fact that you don’t see glitches on the TV doesn’t rule out a problem with drivers. Rendering (pushing) to the GPU may work correctly, but pulling the screen content back from video memory may not. Unfortunately this is not something that NoMachine can control.

June 17, 2020 at 11:50 #28159
But can I force NoMachine to use the onboard GPU instead of the NVIDIA one? Because as it is now this is unusable: besides the glitches, it also becomes very laggy.

July 6, 2020 at 14:22 #29034
It is the display server that uses the GPU, and which one it uses depends on the configuration of the machine and the display server. NoMachine does not decide which GPU they use.
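For anyone finding this thread later: on a typical X11 setup you can try pinning the display server to the onboard Intel GPU with an xorg.conf snippet along these lines. This is only a sketch, and the file path and BusID are examples that must be checked against your own system (run `lspci | grep -i vga` to find the real PCI address):

```
# Example only — save as e.g. /etc/X11/xorg.conf.d/20-intel.conf
# (the config directory may differ per distribution).
Section "Device"
    Identifier "IntelOnboard"
    Driver     "modesetting"
    BusID      "PCI:0:2:0"    # placeholder: verify with lspci
EndSection
```

If the display server then runs on the onboard GPU, NoMachine would capture from that GPU instead; whether this actually avoids the artifacts depends on the driver, as fra81 noted above.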
This topic was marked as solved.