I thought I'd share an interesting test I performed over the weekend:
I stumbled across a variac (variable transformer) that I'd had lying around. For kicks, I decided to see how varying the AC voltage to my Mark III and IIC affected their tone.
I varied the input voltage to each amp from 106VAC to 128VAC (roughly +/- 10% from the nominal 117VAC; a quick check of that math follows the list). Here's what I found:
- The AC fan in each amp changed speed, as you'd expect - you could hear its pitch change as I varied the voltage.
- clean channel = Only a very subtle change in tone as I varied the voltage. The amp seemed to get slightly louder and brighter as the voltage increased, but the change was barely noticeable.
- distortion channel(s) = A significant tone change was evident as the voltage changed. At lower voltages the output was quieter; as the voltage increased, it got louder and brighter, with more distortion. The difference was most pronounced at the extremes of the range, and the biggest transition seemed to happen somewhere around 115-120 volts. To my ears the best tone was around 120VAC - the amp seemed lively at that voltage - while below 115VAC the tone seemed a little subdued.
- The III seemed more affected by the voltage changes than the IIC.
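For anyone wondering about the "roughly +/- 10%" figure, here's a minimal Python check (plain arithmetic; the variable names are my own, nothing amp-specific):

NOMINAL = 117.0                  # nominal line voltage I worked from
for volts in (106.0, 128.0):     # the two ends of my sweep
    pct = (volts - NOMINAL) / NOMINAL * 100.0
    print(f"{volts:.0f}VAC is {pct:+.1f}% from nominal")

That prints -9.4% for 106VAC and +9.4% for 128VAC, hence "roughly +/- 10%".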
Conclusion: For me, this quick test confirmed that the AC voltage an amp sees does affect its tone, and that some amps and settings are more sensitive to it than others.
I'll also add that although there was a tone change, it wasn't enough to make me run out and buy a voltage regulator or use a variac when playing live. However, if I were recording and seeking the absolute best tone possible, I would make sure the voltage was at least 120VAC.