Nvidia adds integer scaling support.

Lossless Scaling

All-in-one gaming utility for scaling and frame generation

Following Intel, which announced that its new Gen11 graphics will support integer scaling, Nvidia has just added integer scaling support to its drivers... for Turing only. While these are, on the whole, positive changes for the industry, it is remarkable how graphics card makers' marketing works under heightened market competition. It's ridiculous to read Nvidia's claim that only Turing has the special cores needed to implement integer scaling, or Intel's claim that Gen9 graphics lack hardware support for nearest-neighbor algorithms (an algorithm about as complex as 2 * 2 = 4). Still, until recently we didn't have it at all. The more the community uses integer scaling, talks about it, and puts pressure on graphics card manufacturers, the sooner the day will come when everyone can freely use it on any graphics card. Until then, you can use the free Lossless Scaling Demo with the Auto function, which does exactly what the new Nvidia drivers do, with the exception of scaling the desktop.
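For reference, nearest-neighbor integer scaling really is that simple: every source pixel is duplicated into an N x N block, so the image stays perfectly sharp with no filtering blur. Here is a minimal sketch in Python using NumPy (the function name and frame sizes are illustrative, not anyone's actual implementation):

```python
import numpy as np

def integer_scale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbor integer scaling: duplicate each source pixel
    into a factor x factor block. No interpolation means no blur."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

# Example: a 1080p frame scaled 2x to fill a 4K (2160p) display.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
scaled = integer_scale(frame, 2)
print(scaled.shape)  # (2160, 3840, 3)
```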