Reply To: Blurry fonts with latest NoMachine viewer

#1438
titan
Participant

We are working on that, believe me. We’ll give you text that is pixel-perfect, because we know that anything less than pixel-perfect is not good enough for text. In version 3, text was pixel-perfect because it was rendered by encoding and caching the X commands. This is not feasible anymore, for a list of reasons too long to explain here. The most important is that even the X clients that still use the X protocol for rendering (many basically don’t anymore and just push pixels down the link) do it so badly that the only reason version 3 worked was a HUGE amount of work by the NoMachine developers to get around the inefficiencies and blindness of these clients.

Even with those optimizations, version 3 could be brought to its knees by any multimedia load or any demanding application, simply because X is stateful (we made it far LESS stateful, but we could not get around the stateful nature completely) and there is a limit to the amount of compression you can do. You can compress gigabytes to megabytes, but if your clients generate gigabytes per second you have to surrender. And you can’t drop content to recover it later, because the amount of state you would have to store is so large that it becomes more convenient to use a different protocol. It is like choosing between TCP and UDP: you’d better use UDP, when appropriate.
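
To make the stateful-versus-stateless point concrete, here is a rough sketch in Python (purely illustrative, not NoMachine code) of why a command-based stream cannot drop updates under load while a frame-based one can:

    # Stateful model: every drawing command mutates server-side state
    # (cached glyphs, pixmaps, graphics contexts), so the receiver needs
    # the full ordered history. Dropping one command under load corrupts
    # everything rendered after it.
    class StatefulSession:
        def __init__(self):
            self.history = []

        def send(self, command):
            self.history.append(command)  # nothing here is ever disposable

    # Stateless model: each update fully describes a screen region, so
    # under load the sender can simply skip straight to the newest frame.
    class StatelessSession:
        def __init__(self):
            self.latest = {}

        def send(self, region, pixels):
            self.latest[region] = pixels  # older updates become droppable

In the first model the backlog only grows under congestion; in the second, congestion just means coarser intermediate frames, and the final image is always recoverable from the latest update.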

There is no indication that future applications are going to be less graphically demanding, or that application developers are going to take more care with the networked case (basically they can’t: making applications is difficult enough already, and the networked case is only interesting to a small minority, at least at the moment). So we designed a protocol that combines all the advantages of being stateless with all the advantages of being aware of what’s on screen, so that we can render the screen in the best way based on the network, the available hardware, the use case and the content. The system is not perfect yet, and there is a long way to go, but I can say that at the present moment it already gives you the best speed and quality, by a wide margin, among all the similar systems we tested. It’s a good start for something that is basically a 1.0.
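
As an illustration of what “aware of what’s on screen” can mean in practice, here is a hypothetical policy sketch (the names, codecs and thresholds are made up for the example, not NoMachine’s actual logic) that picks an encoding per screen region from the content type and the link conditions:

    # Hypothetical, purely illustrative encoding policy.
    def choose_encoding(region_kind, bandwidth_mbps, has_hw_decoder):
        if region_kind == "text":
            # Text must stay sharp: lossless even on slow links.
            return "lossless"
        if region_kind == "video":
            # Moving pictures tolerate loss; prefer hardware decoding.
            return "h264" if has_hw_decoder else "lossy"
        # Mixed or unknown content: trade quality for latency when the
        # link is slow, refine to lossless when bandwidth allows it.
        return "lossy" if bandwidth_mbps < 5 else "lossless"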