
The problem here is that it is not a unit of length or a unit of area; it is a count. If you have a grid of 10 gummy bears by 10 gummy bears, you have 100 gummy bears, not 100 gummy bears^2. A pixel is a discrete thing, not a continuous value like a meter, and it does not measure area. For that you need to know the size and spacing (the pitch) of a pixel.
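To make that concrete, here is a minimal sketch of going from a pixel count to a physical area. The pitch value is an assumption on my part (roughly a 24-inch 1080p panel), not something from the thread:

    #include <stdio.h>

    int main(void) {
        /* A pixel count is dimensionless: 1920 x 1080 = 2,073,600
           pixels, not pixels^2. */
        long pixel_count = 1920L * 1080L;

        /* The physical units come from the pixel pitch (assumed here
           to be 0.277 mm per pixel). */
        double pitch_mm  = 0.277;
        double width_mm  = 1920 * pitch_mm;
        double height_mm = 1080 * pitch_mm;

        /* The mm^2 comes entirely from the pitch, not the pixels. */
        double area_mm2  = width_mm * height_mm;

        printf("%ld pixels cover about %.0f mm^2\n",
               pixel_count, area_mm2);
        return 0;
    }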
In a way, that seems to be what the paper is showing evidence for. Basically, they looked at what happens if you disable the optimizations the LLVM compiler performs when it knows something is undefined (and thus should never appear in a well-written program). And they claim to have found minimal performance regressions, which can largely be mitigated in other ways.
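For anyone who hasn't seen the kind of optimization in question, here is a sketch. The function is made up, but `-fwrapv` is a real GCC/Clang flag that defines signed overflow as wrapping, which is the same general idea as what the paper does (disable the UB-based assumption and measure the cost):

    #include <stdbool.h>

    /* Signed integer overflow is UB in C, so at -O2 the compiler is
       allowed to assume `x + 1` never overflows and fold this whole
       function to `return false;`. */
    bool is_int_max(int x) {
        return x + 1 < x;
    }

    /* Compiled with -fwrapv, signed overflow becomes defined
       (two's-complement wrap), so the comparison must actually be
       emitted and is_int_max(INT_MAX) returns true. */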
And that has been the biggest argument for having UB in C/C++: it lets compilers optimize things in ways they could not if everything were well defined. That might have been true in the past, when there was a lot more variation in CPU designs, but this paper seems to conclude it is no longer the case. That raises the question of why we still need so much UB in C/C++, if performance is not a big issue for modern programs on modern CPUs.