What Class A really means is this: “Grid bias and alternating grid voltages are such that plate current in a tube flows at all times.” Got that? That’s the entire definition from the RCA Tube Manual: the Ultimate Authority itself. I’ve pondered that statement for decades and have concluded that it’s both simpler and more complex than it appears.
...
Class A would simply be a set of operating parameters such that the valve never shuts off completely and some amount of current, even if it’s just a trickle, is always flowing through it. This is the best part of Class A operation, because it’s when a tube stops and starts --cuts off, then resumes conducting current-- that most distortion (or “non-linearity”) occurs. And distortion, to the RCA engineers, was Bad. In their world, amps were never intended to be turned up into distortion. But even the purest Class A amp, if turned up loud enough, will go beyond Class A and into cut-off.
At some point those “alternating grid voltages” (that’s the signal, which increases with loudness) will add so much to the fixed bias voltage, and create such a strongly negative field at the grid, that current will indeed cease flowing. In radios and hi-fis, designers can assume you’ll never turn it up into heavy clipping because it sounds so bad. Thus they can say an amp is always Class A, because they can predict the maximum input signal.
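The arithmetic in that paragraph can be sketched as a tiny check. To be clear, the bias, cutoff, and signal-peak figures below are made-up illustration values, not numbers from any real tube’s datasheet:

```python
def stays_in_class_a(bias_v, signal_peak_v, cutoff_v):
    """Return True if the grid voltage never swings below cutoff.

    The instantaneous grid voltage is the fixed bias plus the signal.
    Its most negative excursion is bias minus the signal's peak; if
    that dips below the cutoff voltage, plate current stops flowing
    and the amp has left Class A operation.
    """
    most_negative_grid_v = bias_v - signal_peak_v
    return most_negative_grid_v > cutoff_v

# Hypothetical tube: -6 V bias, current cuts off below -12 V on the grid.
print(stays_in_class_a(-6.0, 4.0, -12.0))  # small signal -> True (still Class A)
print(stays_in_class_a(-6.0, 8.0, -12.0))  # loud signal -> False (driven into cut-off)
```

The point of the sketch is simply that Class A is a property of the operating point *and* the signal level together: crank the input far enough and any amp crosses into cut-off.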
Then there are guitar amps. No such prediction can be made; in fact, the opposite is true. Huge signals are deliberately used to create overdrive and distortion. Those poor old engineers would be shocked and stunned at what we do to their tubes! And in the name of music, no less! I’ve hung out with some of these old guys and indeed, they were appalled, once they got over their disbelief! (Try describing an amp spewing out a barrage of hard-core krang to a guy whose only exposure to guitar is campfire folk songs!)