When Intel releases their Ice Lake CPUs and, later, their Xe discrete graphics cards, they will support integer scaling through the Intel Graphics Command Center. Typical display scaling runs the image through a filtering algorithm that averages together the contributions of neighboring source pixels to fill out the destination resolution. That keeps gradients smooth, but sharp changes in color, such as the edges of text and pixel art, come out looking blurry.
Think of it like this: You have a 6×1 grid of three red pixels and three blue pixels, as is the case in the image above. You then upscale to 7×1. With a filtered upscale you now have three red pixels, three blue pixels, and a weird purple or magenta pixel in the middle. If you use the nearest-neighbor algorithm instead, you get either three red pixels and four blue pixels, or four red pixels and three blue pixels. Of course, this is still wrong because you don’t have enough resolution to be right, but it will look better in cases where there is supposed to be a sharp divide between colors. It will just look a little blocky and weird.
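If you want to see the difference play out in code, here is a minimal sketch of that 6-to-7 pixel example. This is not Intel's implementation and not how a GPU scaler actually works internally; the helper functions `upscale_nearest` and `upscale_linear` are just my own illustrative names for the two approaches described above.

```python
# A minimal sketch contrasting nearest-neighbor with a simple linear blend
# when stretching a 6-pixel row of red/blue to 7 pixels.

def upscale_nearest(row, new_width):
    """Nearest-neighbor: each output pixel copies the closest source pixel."""
    old_width = len(row)
    return [row[int(x * old_width / new_width)] for x in range(new_width)]

def upscale_linear(row, new_width):
    """Linear filtering: each output pixel blends its two nearest source pixels."""
    old_width = len(row)
    out = []
    for x in range(new_width):
        # Map the output pixel center back into source coordinates, clamped to the row.
        src = max(0.0, min(old_width - 1.0, (x + 0.5) * old_width / new_width - 0.5))
        left = int(src)
        right = min(old_width - 1, left + 1)
        t = src - left
        out.append(tuple(round((1 - t) * a + t * b)
                         for a, b in zip(row[left], row[right])))
    return out

RED, BLUE = (255, 0, 0), (0, 0, 255)
row = [RED] * 3 + [BLUE] * 3            # the 6×1 strip from the example

print(upscale_nearest(row, 7))          # four solid reds, three solid blues
print(upscale_linear(row, 7))           # three reds, a purple (128, 0, 128), three blues
```

The nearest-neighbor version never invents a new color; it just duplicates one of the existing pixels, which is exactly why it keeps pixel art crisp (and blocky). The linear version manufactures that purple pixel at the seam.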
Of course, people who play pixel art games like them to be blocky (and often weird).
Unfortunately, none of Intel's current GPUs will support this feature, because they lack hardware support for the nearest-neighbor algorithm. I must admit that I stopped and thought for a second when I heard that – I had always assumed that nearest-neighbor filtering was the absolute baseline.
Maybe I was just spoiled by MS Paint on Windows 95. And now you have heard that sentence in a completely genuine, unironic context. Congratulations.