I have an amplifier with two pairs of 6L6GC tubes coupled to an 8 ohm load through an output transformer. It looks like a Class AB circuit with grounded cathodes. According to the manufacturer's schematic, the maximum plate voltage is 500 VDC and the maximum plate current is 90 mA per pair.

If I adjust the bias for a plate current of 45 mA per tube and set the plate voltage to 500 VDC, that works out to 90 watts RMS max (500 V x 0.045 A x 4 tubes). Yet the manufacturer claims it's a 100 watt amp. Since the tubes are rated at 30 watts each, it seems I could raise the plate current and reach 100 watts of output, but then I'd be violating the schematic's 90 mA-per-pair spec.

Should I just leave it where it is, at roughly 71 watts (470 VDC x 0.038 A x 4 tubes), or increase the plate currents and get the 100 watts the amp is advertised at? It's almost as if the amp was upgraded from 25 watt 6L6s to 30 watt 6L6GCs and the schematic was never updated. I'm tempted to leave it alone: currently 470 VDC plate voltage and 38 mA plate current per tube (about 60% of max plate dissipation). Any opinions?
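
In case it helps anyone check my numbers, here's a quick Python sketch of the idle-point math. It only covers DC quiescent dissipation per tube against the published 30 watt 6L6GC plate-dissipation rating; it says nothing about actual output power under drive, which is why I'm asking.

```python
# Quiescent (idle) plate dissipation per tube, as a fraction of the
# 6L6GC's 30 W plate-dissipation rating. DC idle math only -- this is
# not the RMS output power under signal.

RATED_PD_W = 30.0  # published 6L6GC max plate dissipation, watts

def idle_check(plate_v, plate_ma_per_tube):
    pd_w = plate_v * plate_ma_per_tube / 1000.0   # watts per tube
    pct = 100.0 * pd_w / RATED_PD_W               # percent of rating
    print(f"{plate_v:.0f} V @ {plate_ma_per_tube:.0f} mA -> "
          f"{pd_w:.1f} W/tube ({pct:.0f}% of rating)")

idle_check(470, 38)  # where the amp sits now: ~17.9 W/tube, ~60%
idle_check(500, 45)  # the schematic's limit:   22.5 W/tube,  75%
```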