I wonder if that has to do with the light source -- the carbon-arc and mercury-vapor lamps -- because later, with the tungsten incandescent lamps, 5,000 and 10,000 watts and even higher were and are common.
I fear we are going to be finding eye damage with the LED lamps, too, because they are such an unnatural and harsh light source.
In my opinion, incandescent is the only good form of electrical light. And just think of the beauty of so many movies from the 1930s to the 1990s that used tungsten incandescent, compared even to movies shot on film today that often use HMI and LED.
If the light isn't "warm" but "white" it can have a higher UV proportion, which could damage eyes faster.
But back in the silent era it probably was just the amount of light. Around 1905 they decided to record a boxing match on film - probably the first boxing match filmed at full length.
They were indoors and had several cameras, as a camera back then could not record for very long - and they had a hell of a lot of floodlights, because film (and lenses) were that slow. The result was that the boxers had to rest between rounds because the floodlights produced so much heat; they paused for 5 or 10 minutes after each round.
In the silent era it was the vast amount of light needed rather than the UV proportion of a carbon-arc lamp - though the UV proportion surely didn't help.
Today another problem is that there is no standard for the color reproduction of digital sensors, as far as I know. In the days of analog film there were standards for how a film stock should reproduce certain colors (not every film met them), and developing labs also had means to measure the color reproduction of developed film. Sometimes a director used a faint color filter so a scene would get a certain mood - but the lab would treat this as a film or developing mistake and filter the print back to natural color by default. The director then had to contact the lab and tell them he wanted that scene to be light green, or light red, etc. - because there were standards for color film.
With digital sensors it was different, at least at first. There was no standard for how a sensor should reproduce green or red, etc. - and if you shoot the same subject with different digital (video) cameras you'll end up with different color reproduction - I once did, and got exactly that.
Meaning you can have the best (incandescent) light but still end up with a greenish or bluish shot.
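Just to illustrate the kind of correction this forces on you, here is a minimal white-balance sketch in Python. The numbers are made up (not from any real camera); the idea is simply that each sensor's reading of the same neutral gray card is used to scale its channels back toward neutral, which is a crude version of what a colorist or lab does more carefully with charts and 3x3 matrices:

```python
import numpy as np

# Assumed readings: two different sensors photograph the same neutral gray
# card and report different RGB values (one greenish, one bluish).
gray_card_cam_a = np.array([0.48, 0.55, 0.46])   # reads greenish
gray_card_cam_b = np.array([0.44, 0.47, 0.56])   # reads bluish

def white_balance(image_rgb: np.ndarray, gray_reading: np.ndarray) -> np.ndarray:
    """Scale each channel so the gray-card reading becomes neutral (linear RGB, 0..1)."""
    gains = gray_reading.mean() / gray_reading
    return np.clip(image_rgb * gains, 0.0, 1.0)

# Balanced with its own gains, each camera's gray card lands on the same
# neutral value even though the raw readings differed.
print(white_balance(gray_card_cam_a, gray_card_cam_a))  # ~[0.497, 0.497, 0.497]
print(white_balance(gray_card_cam_b, gray_card_cam_b))  # ~[0.49, 0.49, 0.49]
```

Real cameras need more than a per-channel gain (a full color matrix from a chart), but the point stands: without a common standard, every sensor needs its own correction just to agree on what "neutral" looks like.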
And movies still shot on film today usually get scanned before they reach the screen - and in the scanner there is, again, a digital sensor.
At least today there are LEDs with good color reproduction - but those of course are more expensive.
Also, they started to use the possibilities of digital post-production, often with awful colors as the result - but they don't care, because they want to stand out from the crowd - or because these de-saturated colors are somehow in fashion now.
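For what it's worth, the de-saturated look itself is technically simple: in grading, each pixel is blended toward its own luminance. A small sketch (illustrative values; only the Rec.709 luma weights are a real standard):

```python
import numpy as np

# Rough sketch of a "desaturated" grade: blend each pixel toward its luminance.
# sat = 1.0 leaves the image alone, sat = 0.0 gives grayscale.
REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def desaturate(image_rgb: np.ndarray, sat: float) -> np.ndarray:
    """image_rgb: H x W x 3, values 0..1. Returns the image with reduced saturation."""
    luma = image_rgb @ REC709_LUMA                 # H x W
    gray = np.repeat(luma[..., None], 3, axis=-1)  # H x W x 3
    return gray + sat * (image_rgb - gray)

# A saturated red patch pulled halfway toward gray.
red = np.full((1, 1, 3), [0.9, 0.1, 0.1])
print(desaturate(red, 0.5))   # ~[0.585, 0.185, 0.185]
```

One slider, applied to the whole movie - which is exactly why the look spreads so easily once it is in fashion.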
In the 80s and 90s colors in movies were often bright and cheerful; around 2000 the zombie and vampire movies came up and colors got darker - and for quite a while now colors have been de-saturated. Not everywhere, but I see a pattern there.
But now there is digital: new possibilities, but also more competition - who really cares about colors? Just make them match the current fashion, or make them stand out, so we will have at least some success - on Netflix or the internet, where there is a hell of a lot of competition.
....
But on the other hand, film manufacturers at least strove for natural color reproduction when film was the standard - and so did, and still does, Kodak. Their Kodak Vision line is a color film with very natural color reproduction - and this improved from Vision1 to Vision3, as far as I know.
The revived Ektachrome also got a small change in color reproduction, as the old version had a bit too much saturation to count as natural color reproduction (by measurement, in a lab, etc.).
So there is a mix of different light sources, digital sensors, post-production, competition and changing fashion...
...none of which really helps the result.