So one of our readers asked us this question the other day: Why can’t humans see UV light, while other animals can?
Our photoreceptors work by producing electrochemical responses when photons of the proper wavelength hit the photopigment molecules inside them. Rods are sensitive to a wide range of wavelengths, while each cone type has a distinct peak sensitivity, enabling color vision – but cones only work when there are enough of the differently colored photons around (which is why we only see colors properly in good lighting).
Photons of UV wavelengths can trigger rods, and the near-UV end of the spectrum (just below 400 nm) overlaps slightly with the blue cone's sensitivity range. So if you could see UV light, you would probably perceive it as bluish – which is consistent with historical reports of this happening; more on that later.
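To make that overlap concrete, here is a toy model of the three cone types. This is not real colorimetry: the Gaussian shape and the 40 nm width are simplifying assumptions of mine, and only the peak wavelengths (~420/534/564 nm) are real ballpark figures.

```python
import math

# Toy model: treat each cone type's sensitivity as a Gaussian around its
# peak wavelength. Peaks are real ballpark values; the Gaussian shape
# and width are purely illustrative assumptions.
CONE_PEAKS_NM = {"S (blue)": 420, "M (green)": 534, "L (red)": 564}
WIDTH_NM = 40  # assumed standard deviation, illustrative only

def relative_response(cone_peak_nm, wavelength_nm, width_nm=WIDTH_NM):
    """Relative cone response (0..1) to light of a given wavelength."""
    return math.exp(-((wavelength_nm - cone_peak_nm) ** 2) / (2 * width_nm ** 2))

# A near-UV photon at 380 nm, just below the visible range:
for name, peak in CONE_PEAKS_NM.items():
    print(f"{name}: {relative_response(peak, 380):.3f}")
# -> S (blue): 0.607
# -> M (green): 0.001
# -> L (red): 0.000
```

The "blue" cone responds two to three orders of magnitude more strongly than the other two, so anyone whose eye lets UV through would see it skewed heavily toward blue.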
Now, the reason people don't normally see UV wavelengths is that they are blocked by the lens of the human eye – a feature shared with most other long-lived mammals.
Many short-lived species – insects, some birds, small mammals like rats, etc. – can see UV because their lenses don't block it, so UV is simply part of their visible range. Rats, for example, have dichromatic vision, and their "blue" cones are shifted towards UV, which lets them follow urine trails that reflect UV light. Too bad for them that some birds can also track those urine trails, since they see UV as well. And many flowers carry patterns that stand out strongly in UV, which attracts pollinating insects.
So why do long-lived animals have an adaptation that prevents them from seeing this part of the spectrum? The most common hypothesis is that UV light reaching the retina causes cumulative damage, including retinal cancer. A short-lived animal can reproduce before that damage matters, but for our long-lived ancestors a UV-blocking lens was clearly beneficial: it protected their eyes, letting them live longer and have more offspring. That created positive selective pressure for lenses that stop us from seeing UV but shield our retinas from the harmful radiation.
However, we still do have the ability to perceive UV wavelengths, and there are historical examples of this. If the lens is absent (aphakia) or has been removed, as in early cataract surgery, people can indeed see UV light – but since our color vision is optimized for the visible spectrum, UV light pretty much just messes with it.
The most famous (likely) example of this is Claude Monet, whose "blue era" came after cataract surgery in which the lenses in his eyes were removed (modern cataract surgery typically replaces the original, accommodating lens with an artificial, fixed-focal-length implant that also blocks UV). With his lenses gone he saw more blue, and his paintings shifted accordingly – because in his eyes, the bluer canvases matched what he was actually seeing.
Actually changing the wavelengths your cone cells are tuned to would require some pretty advanced genetic manipulation to alter the structure of the photopigment molecules in the photoreceptors. Extreme changes would probably require "inventing" a new photopigment, and then somehow figuring out what genetic code you'd need to insert into the genome to make a new kind of photoreceptor produce that pigment and respond to the target wavelength.
While this might have some use – particularly adding photoreceptors sensitive to the near-IR spectrum – it would be extremely difficult in practice, and is probably beyond what genetic engineering can do now or in the near future.
Actual "heat vision", however, would never quite work. Say you wanted heat vision that lets you see the glow of human or other animal bodies in the darkness. The photopigment would have to be sensitive enough to be activated by the low-energy IR light emitted by a 36–37 degree Celsius (living) human body.
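Two standard physics formulas show just how weak that glow is. This is a rough back-of-the-envelope sketch using Wien's displacement law and the photon energy E = hc/λ; the constants are real, the rounding is mine.

```python
# Wien's displacement law: a blackbody at temperature T (kelvin) emits
# most strongly at wavelength lambda_peak = b / T.
WIEN_B = 2.898e-3   # Wien's displacement constant, m*K
H = 6.626e-34       # Planck's constant, J*s
C = 2.998e8         # speed of light, m/s
EV = 1.602e-19      # joules per electronvolt

def peak_wavelength_m(temp_k):
    """Wavelength of peak blackbody emission at temp_k, in metres."""
    return WIEN_B / temp_k

def photon_energy_ev(wavelength_m):
    """Energy of a single photon of the given wavelength, in eV."""
    return H * C / wavelength_m / EV

body_peak = peak_wavelength_m(37.0 + 273.15)  # a living human body
print(f"body emission peak:   {body_peak * 1e6:.1f} um")            # -> 9.3 um, far IR
print(f"body-heat photon:     {photon_energy_ev(body_peak):.2f} eV")  # -> 0.13 eV
print(f"green photon (520 nm): {photon_energy_ev(520e-9):.2f} eV")    # -> 2.38 eV
```

A body-heat photon carries roughly 1/18 the energy of a visible-light photon, so even before the warm-eye problem, an ordinary photopigment has far too little energy to work with per photon.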
The first obvious problem is that the pigment itself sits inside an eye, which sits inside a human head that is at that same temperature – probably slightly higher than skin temperature, actually.
This is also why true thermal IR camera sensors need to be cooled below the temperature of their surroundings; otherwise they would simply be flooded by their own thermal radiation, which would make them pointless. It's also the reason why – aside from our ability to crudely sense skin temperature (really more the change of temperature) – actual thermal imaging in the animal kingdom is pretty much limited to cold-blooded animals, such as pit vipers. These animals are at nearly the same temperature as their environment, so they can, to some extent, sense the thermal signature of warm-blooded animals. This is how pit vipers "see" their prey and can strike accurately in perfect darkness. It's never going to work in warm-blooded animals – unless you also genetically engineer eye stalks that keep the eyes significantly cooler than the body.
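The self-flooding argument can be put in numbers with the Stefan–Boltzmann law. A rough sketch: I'm assuming an emissivity of 1 everywhere, and the 77 K figure is the liquid-nitrogen temperature some cooled IR sensors actually use.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def thermal_exitance(temp_k):
    """Total blackbody radiated power per unit area, W/m^2 (emissivity 1 assumed)."""
    return SIGMA * temp_k ** 4

target = thermal_exitance(310.0)      # warm-blooded prey at ~37 C
warm_sensor = thermal_exitance(310.0) # a sensor (or eye) at body temperature
cooled = thermal_exitance(77.0)       # sensor cooled with liquid nitrogen

print(f"target:              {target:.0f} W/m^2")       # -> 524 W/m^2
print(f"warm sensor (self):  {warm_sensor:.0f} W/m^2")  # -> 524 W/m^2
print(f"cooled sensor (self): {cooled:.2f} W/m^2")      # -> 1.99 W/m^2
```

A sensor at body temperature radiates exactly as much as the target it is trying to image – zero contrast – while cooling it to 77 K cuts self-emission by a factor of about 260.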
I think I would rather grab an IR camera or IR vision goggles.