On his site, Robrob recommends the Output Transformer Resistance Method as the best and most accurate biasing method. However, I've always had a lot of trouble with this method, mainly because the measurements I take do not stay constant. Whenever I measure a voltage or resistance, it's in a constant state of flux, over both short and long time periods. For example, when I measure the plate voltage, I can't simply say "this is 410.1v". Instead, it oscillates around, moving constantly between, say, 409.5v and 410.3v. And to make things worse, if I go check it again about 3 minutes later, it'll be oscillating around a different range, maybe between 412.2v and 413.5v. The same goes for measuring anything else - the center tap voltage, the OT resistance - constant flux, over both short and long time periods.

Given that, I can't imagine that taking the two voltages, subtracting them, and dividing by the resistance will be accurate. What if the plate voltage happened to be in a low part of its range when I measured it, and the center tap voltage was in a high part of its range when I measured that? Then the subtracted voltage - and the current I get by dividing it by the OT resistance - will come out significantly lower than what I would have gotten by just measuring the current directly, right?

I've tended to use the shunt method instead, though I know it suffers from issues as well. Most notably, I think that if the resistance of the multimeter is too high relative to the resistance of the winding it's shunting, then not enough current will flow through the meter, and the reading will be off.

So my question is: can anyone justify which method is better than the other? And if the OT resistance method is better, how can I take measurements better so I don't have to worry about fluctuating, inaccurate readings?
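To make my worry concrete, here's a quick sketch of the arithmetic the OT resistance method relies on, and how a supply drift between the two readings propagates into the inferred current. All the numbers here are hypothetical, just for illustration:

```python
# Sketch of the OT resistance method's arithmetic (hypothetical values).
# The method infers plate current from the voltage drop across one half
# of the OT primary:  I = (V_center_tap - V_plate) / R_winding

def plate_current_ma(v_center_tap, v_plate, r_winding_ohms):
    """Inferred winding current in mA from two voltage readings."""
    return (v_center_tap - v_plate) / r_winding_ohms * 1000.0

r_ot = 155.0  # assumed winding resistance in ohms

# Both readings taken at the same instant:
i_true = plate_current_ma(412.1, 410.1, r_ot)   # 2.0 V drop -> ~12.9 mA

# Same true current, but the supply sagged 1.3 V between the two
# readings, so the apparent drop shrinks to 0.7 V:
i_bogus = plate_current_ma(412.1 - 1.3, 410.1, r_ot)  # -> ~4.5 mA

print(i_true, i_bogus)
```

The point being: the winding drop is a tiny difference between two large numbers, so even a 1-volt drift between readings is a large fraction of the quantity you're actually trying to measure.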
(For the purposes of this post, I'm referring to an amp that does not have 1 ohm resistors on the cathodes, so I can't use that method.)

Update: Consider this scenario I posed later in the thread, illustrating my problem with the OT resistance method. I take a reading at the tube's plate: 410.1v. One minute later I take a reading at the center tap: 408.1v. The difference is therefore 2.0v, and dividing by my OT resistance of 155 ohms gives a current of 12.9mA. The tube is therefore dissipating 5.29 watts, which I determine is 38% dissipation for a 6V6GT. I bias the tube hotter as a result.

But wait! Between when I took the plate measurement and when I took the center tap measurement, the whole voltage of the system had actually increased without my knowing. The plate voltage had risen to 411.8v in that time (but I wasn't aware of it - I thought it was still 410.1v), which throws off all my calculations. The real voltage difference would have been 3.8v (411.8v - 408.1v), meaning the current was actually 24.5mA and the tube was already at 70% dissipation. That's a massive difference, from a small fluctuation in voltage! Now when I bias hotter, I'll be pushing the tube past 70% dissipation, which will shorten its lifespan.

So I'm not sure how to reconcile the fact that the system fluctuates so much, if even these small variations can produce drastically different calculations. To be clear, I'm not suggesting that subtle variations in current will ACTUALLY cause massive changes in bias or sound. I'm suggesting that subtle variations in voltage will cause you to miscalculate the current and think it's way different than it actually is - which is why I'm especially wary of the OT resistance method being accurate.