Greetings. I have read several articles on cathodyne phase inverters and how to improve their response to overdrive, and then, to my surprise, I bumped into this. So, out of curiosity, I plotted the cathodyne load line from the values on the schematic:

Ebb = 260V
Eb-k = 188V - 75V = 113V
R_load_dc = 68K + 68K = 136K
I_max = Ebb / R_load_dc = 260V / 136K = 1.91 mA

From the plot, to get a 113V drop across the 12AX7, a plate current of 1.08 mA is needed, set by a grid bias of about -0.7V. The difference between this plate-to-cathode voltage and the supply voltage is divided between the two 68K resistors, that is, (260V - 113V) / 2 = 73.5V each (close enough).

Looking around the net, I found a few more vintage guitar amps (dubbed "bargain bin" models) that use a similar cathodyne grid-leak-bias PI design. I thought grid-leak bias was OK for input stages, where the signal level is low (approx. 100 mV or so), but I had never seen that bias method used on the stage driving the output tubes. Therefore, I would like to benefit from the experience of you guys who have worked on these models, and ask a couple of questions to get a better understanding of this PI design.

Question 1: In your experience, is it possible to get a 0.7V grid-leak bias on a 12AX7 with a 1M grid resistor? I thought this would require a grid-leak resistor of 6M8 to 10M... (?) As a reference, I plotted (in GREEN) the values for a hypothetical 1K5 cathode bias resistor.

Question 2: I also thought that these hot-biased cathodyne PIs (Ec1 near zero) are prone to drawing grid current and producing unpleasant distortion, as pointed out in several cathodyne PI posts. However, I have heard people playing this amp (and similar ones) on YouTube, and they sound quite nice. How is this possible? Do they have enough gain to drive the output tubes into overdrive?

Any clarification would be much appreciated. Thanks.
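P.S. For anyone who wants to check the arithmetic, here is a small Python sketch of the load-line numbers above. The 1.08 mA operating point is the value I read off the 12AX7 plate curves, so treat it as approximate; the grid-current line at the end is just the Ohm's-law implication of Question 1, not a measurement.

```python
# Sanity check of the cathodyne load-line arithmetic from the post.
# Component values are taken from the schematic.

Ebb = 260.0          # supply voltage (V)
Ra = 68e3            # plate load resistor (ohms)
Rk = 68e3            # cathode load resistor (ohms)
R_dc = Ra + Rk       # total DC load: 136K

I_max = Ebb / R_dc   # load-line intercept on the current axis
Ia = 1.08e-3         # plate current at the operating point (read off the curves)

V_tube = Ebb - Ia * R_dc   # plate-to-cathode drop across the 12AX7
V_each = Ia * Ra           # drop across each 68K resistor

print(f"I_max  = {I_max * 1e3:.2f} mA")   # ~1.91 mA
print(f"V_tube = {V_tube:.1f} V")         # ~113 V
print(f"V_each = {V_each:.1f} V")         # ~73.4 V

# Question 1 side note: a 0.7 V grid-leak bias across a 1M grid resistor
# would imply a grid current of only I = V / R:
Ig = 0.7 / 1e6
print(f"Implied grid current = {Ig * 1e9:.0f} nA")   # 700 nA
```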