A question that's been plaguing me: not which method is best, but which one do you TRUST as yielding ACTUAL bias? Staying honest here, with no money on the game as far as any prescribed method of biasing a tube amp goes, I can say that different methods yield different results.

The transformer resistance method requires a digital multimeter with resolution out to several decimal places, so it's useless for anyone who doesn't own a Fluke or similar. The transformer shunt method is considered accurate but it's dangerous, and again, it doesn't yield the same results as the method above. Which isn't to say there aren't reasons for this disparity, but this isn't about the reasoning behind the disparity; it's that the disparity exists at all.

NOW we get to the cathode resistor method. This takes your power tube's cathode voltage and converts it into milliamps; basically, it moves some decimal places for you. The big issue I have with it, aside from the fact that it yields vastly different readings than the two methods above, is that it amounts to taking your cathode voltage reading and calling that the bias reading (assuming you can move a decimal place in your head). Big-name commercial plug-in tube testers use this method. HOWEVER, nowhere have I ever seen anyone suggest that (pretend for a moment the cathode resistor method didn't exist) you can accurately obtain the bias reading just by taking the cathode voltage. It's confounding!

FURTHER, why should we keep believing that all bias and dissipation methods will give us some universal standard reading we accept as right, when in actuality they all give different readings? That is, taking a cathode voltage and using it to calculate 50% dissipation is not going to give us the same reading as the transformer shunt method. So why do we persist in applying every method against the same preconceived notions of "correct" bias? That seems lacking in logic and clarity given the amount of time spent discussing and changing and biasing tubes. (I've sketched the arithmetic for each method at the bottom of this post so we're at least comparing the same numbers.)

Now, I have 20 years' experience working on amps. I have the tools and the knowledge. I represent more than just the average hobbyist, but if I'm doing something wrong, then EVERYONE below my skill and knowledge level is doing it wrong too. So this suggests yet another issue: until the industry standardizes on a method, we cannot refer to some arbitrary chart or calculator to ascertain what exactly our "bias" is.

Now, this hasn't only come up as my NOS tube stash runs low. I don't care what modern production tubes run at; they will go microphonic or fail before they wear down. But with NOS tubes it suddenly becomes an issue to me. It's deceiving yourself to accept one method as correct "just because" when in fact the tubes could be under- or over-biased. Or are they? How will we ever really know? I know there are methods involving lifting an output transformer lead, and some say THIS will reveal true bias, but that yet again compounds the issue. If we aren't getting the same readings with different methods, then the whole thing negates itself.

But the cathode resistor method, which I like because of its ease of use, not to mention the safety aspect for myself and the amp, makes the least sense to me. To "bias" an amp with cathode voltage, AND adjust bias based on the same "chart" as, say, the transformer shunt method? Those yield two totally different readings. What am I missing? And if I am missing something, then the whole world of hobbyists, and professionals too, is missing something as well. Thanks.
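For concreteness, here's the cathode resistor arithmetic as I understand it, sketched in Python. The 1 ohm resistor value and the 0.035 V reading are made-up examples, not from any particular amp:

# Cathode resistor method: read the DC voltage across a known cathode
# resistor and convert it to current with Ohm's law (I = V / R).
def cathode_current_ma(v_cathode_volts, r_cathode_ohms=1.0):
    # With the common 1 ohm resistor this is literally "move the
    # decimal point", which is exactly the shortcut I'm questioning.
    return (v_cathode_volts / r_cathode_ohms) * 1000.0

# Note this measures total CATHODE current (plate plus screen), one
# commonly cited reason it won't match a plate-only reading.
print(cathode_current_ma(0.035))  # -> 35.0 mA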
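Same idea for the transformer resistance method; the 40 ohm winding resistance below is a placeholder, since every OT measures differently:

# OT resistance method: measure the DC voltage dropped across one half
# of the OT primary, then divide by that winding's DC resistance.
# Winding resistances are small, which is why a meter with several
# decimal places matters here.
def ot_plate_current_ma(v_drop_volts, r_winding_ohms):
    return (v_drop_volts / r_winding_ohms) * 1000.0

# Example: 1.4 V drop across a made-up 40 ohm half-primary.
print(ot_plate_current_ma(1.4, 40.0))  # -> 35.0 mA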
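And the dissipation math the charts are built on. The 25 W rating (roughly an EL34's plate rating) and the 450 V plate voltage are examples only, not recommendations for any amp:

# Plate dissipation is plate-to-cathode voltage times plate current,
# and the "50%" or "70% of max" chart targets are just this solved
# for current.
def dissipation_watts(v_plate_volts, i_plate_ma):
    return v_plate_volts * (i_plate_ma / 1000.0)

def target_current_ma(p_max_watts, percent, v_plate_volts):
    return (p_max_watts * (percent / 100.0)) / v_plate_volts * 1000.0

# Example: 450 V on the plate, 25 W tube, 70% target -> about 38.9 mA.
print(round(target_current_ma(25.0, 70.0, 450.0), 1))
# Feed this formula a cathode-method reading and a shunt-method reading
# from the same amp and you get two different "percent dissipation"
# answers, which is my whole complaint.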