The speed rating is supposed to give you near-perfect exposures if your equipment, film, and everything else is within spec. If you (and everyone else) find that with every (in-spec) camera, lens, and meter that your exposures are insufficient, then the speed rating is wrong. It's that simple.
Hi. I think when you say "It's that simple," you're going out on a limb a bit.
I spent over 40 years working full-time in photography, the great majority of it doing tech-type work in a large lab. Back in the day we kept about 60 or 80 ANSI standards (all current) on file; these included both film speed and exposure-meter standards. And we USED, not just read, many of them. I don't recall ever reading anything like your first sentence.
When you say "is supposed to...," I suspect you really mean it's something of an ideal rather than a requirement. And I have to say, I don't even really know what a "perfect exposure" is. Now, in the case of professional color-neg portrait/wedding films, the manufacturers typically gave some guidance on achieving a "normal" exposure (as opposed to under- or overexposure): basically a range of density aims for a gray card and skin tones. And at one time one could buy sets of "printer setup negs," which included a "normal exposure" negative, allowing one to compare on a video analyzer of the day. So aside from such a so-defined "normal" exposure, I don't know what "perfect" really means, at least not in concrete terms.
Should I go on? I don't feel like I can really stop now. Regarding what an ANSI (or ISO) film speed rating means, I mostly take it as a reference point to put everything on a similar basis; a sort of driving a stake in the ground, so to speak. The films I am most familiar with, pictorial b&w and color neg films, both use an exposure point that produces a slight density over "base plus fog," and there are defined conditions for the testing procedure. But equal ANSI speeds aren't the whole story.

Case in point: in the late 1960s I was shooting weddings for a local guy; the pro color film of the day was Kodak CPS. A few years later I was doing high-volume portrait shooting (no, not school pix; this was the HARD sort of work) on the same film, CPS in long-roll 70mm. This film was KNOWN to need more exposure than the ASA speed would imply. A few years later I got into lab work where we were doing plenty of sensitometric testing (a fresh EG&G sensitometer, and an EG&G photo-radiometer to check calibration). It was very clear that the CPS film (yes, properly processed) had contrast TOO LOW to achieve the density aim values for "normal exposure" when rated at its ASA speed. So I would say this shows a hard case contrary to the OP's first sentence.
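To put some rough numbers on that "slight density over base plus fog" speed point: for pictorial b&w negative film, ISO 6 defines the speed from the exposure that yields a density of 0.10 above base-plus-fog under the standard's development conditions, via S = 0.8/Hm. The sketch below is just that arithmetic; the Hm value is a made-up illustration, not measured data.

```python
# Sketch of the ISO 6 speed calculation for pictorial b&w negative film.
# S = 0.8 / Hm, where Hm is the exposure (lux-seconds) producing a
# density of 0.10 above base-plus-fog under the standard's conditions.
def iso_speed(h_m_lux_seconds):
    return 0.8 / h_m_lux_seconds

h_m = 0.008  # HYPOTHETICAL speed-point exposure, for illustration only
print(round(iso_speed(h_m)))  # 0.8 / 0.008 -> speed 100
```

Note this says nothing about contrast or tone reproduction; two films can share a speed point and still print very differently, which is the CPS story above.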
To repeat: correct ASA speed, properly calibrated exposure meters, correct color temperature of light, and it still was NOT enough exposure to meet Kodak's definition of "normal" exposure.
Now, would we PREFER the meter to produce a "correct exposure?" Yes, but it did NOT, and the ANSI film speed WAS correct. The obvious fix was to derate the film speed on the exposure meter, which is what the OP seems to be suggesting; that is a bit different from saying the ASA speed was wrong (it was not).

A bit more color film history: not long after that came Kodak's new C-41 process and VPSII film. It was also lowish in contrast and also needed increased exposure (though I'm thinking not as much as CPS). A few years later VPSIII was introduced. It started out with lower contrast, but before long the contrast was increased in several running product changes, after which it worked fine at the metered ASA speed. I know these things from first-hand experience; each time we received a new emulsion of film we pulled a couple of 100-ft rolls for sensi testing. So we tracked the entire history of VPSIII film that way, albeit with emulsion numbers unique to my employer (full emulsion runs were reserved for our use).
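Derating on the meter translates directly into stops of extra exposure, which is easy to sketch. The speeds below are hypothetical examples, not CPS's actual rated or recommended values.

```python
import math

def extra_stops(rated_speed, meter_setting):
    # Setting the meter dial below the film's rated speed gives
    # log2(rated / setting) stops of additional exposure.
    return math.log2(rated_speed / meter_setting)

# HYPOTHETICAL example: film rated ASA 100, meter dial set to 64.
print(round(extra_stops(100, 64), 2))  # about 0.64 stop more exposure
```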
All this (mostly) in response to "It's that simple."
PS: as a note, when films, etc., are well behaved, I would guess that the most likely cause of poor exposure is the exposure-metering stage. The metering standards are not as rigid as many might think, and are not necessarily aligned with the spectral sensitivity of the films used. Further, there are a number of potentially tricky metering situations where the photographer's judgement is important. For example, if skin tones are important but shot against a snow scene vs a shaded foliage background, the photographer should ideally know not to trust an overall reflective meter reading. (A close-up skin reading ought to be taken, perhaps compensating for the shade of the complexions; or perhaps an incident meter reading, with hemispheric diffuser, at the subject position.)
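The snow-scene trap can be put in rough numbers. A reflected-light meter tries to render whatever fills its field as a mid tone; the reflectance figures below (about 18% for the assumed mid tone, about 90% for sunlit snow) are common approximate values used for illustration, not anything from a standard.

```python
import math

MID_TONE_REFLECTANCE = 0.18  # approximate mid-gray a reflected meter assumes
SNOW_REFLECTANCE = 0.90      # rough reflectance of bright snow (illustrative)

# Metering off snow renders the snow as mid-gray, underexposing the
# scene by roughly log2(snow / mid-gray) stops:
error_stops = math.log2(SNOW_REFLECTANCE / MID_TONE_REFLECTANCE)
print(round(error_stops, 1))  # roughly 2.3 stops to open up
```

This is consistent with the common rule of thumb of opening up about two stops for snow, and it is why a close-up skin reading or an incident reading at the subject sidesteps the problem.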
I should apologize for getting so wordy, and for drifting off topic from the T-Max films. I'm mainly trying to make the case that things can often be more complicated than we might want them to be. Often a simpler view works fine, within some limitations; other times a deeper examination is warranted. I think the last handful of posts, at least, need the deeper examination.