Someone mentioned this a few dozen pages before your message, I don't recall who, and I meant to reply then but forgot, so before this turns into an (incorrect) urban legend...
This is wrong; it's backwards. If you lower the voltage, the load draws more current to maintain the same power. With a 20 watt load supplied at 10 volts, it draws 2 amps; supply that same 20 watt load at 5 volts and it draws 4 amps. That's why the LV-EV draws more current than the standard (6v) EV.
Watts (Power) = Volts (Voltage) * Amps (Current).
So if you want to check whether the current rating of your switch is high enough, you should be calculating with the lowest voltage the cell(s) reach before they cut off, NOT the highest voltage they have fresh off the charger.
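To put numbers on that, here's a quick sketch. The 20 watt load is from the example above; the 4.2 V and 3.2 V cell voltages are assumed Li-ion figures for illustration, not specs for any particular battery:

```python
# Size a switch for worst-case current, which for a constant-power load
# occurs at the battery's CUTOFF voltage, not its fresh-off-the-charger voltage.

def load_current(power_w, supply_v):
    """Current (amps) a constant-power load draws at a given supply voltage."""
    return power_w / supply_v

POWER_W = 20.0    # example 20 W heater load from the post
FRESH_V = 4.2     # assumed fully charged Li-ion cell voltage
CUTOFF_V = 3.2    # assumed low-voltage cutoff

print(load_current(POWER_W, FRESH_V))   # ~4.76 A off the charger
print(load_current(POWER_W, CUTOFF_V))  # 6.25 A at cutoff -- rate the switch for this
```

Note the switch would need roughly a third more current capacity than a calculation at the fresh voltage suggests.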
Note that no heater core has a constant power draw. It will most likely draw more power when it's cold, then stabilize once it reaches its running temperature; as you draw air through it, the core will draw more power to keep the temp up. You could dig up a 0.05 or 0.01 ohm kilowatt-rated load (maybe OF has one), put it in series with the battery/batteries, hook up a storage scope, press the button, and watch what shows up on the scope. Or you could ask TV Tim if he can tell you what the cold-state inrush current to a given heater core is.
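For anyone who does the shunt-and-scope experiment above, the math to turn the scope trace into current is just Ohm's law. The 0.01 ohm shunt matches the post's suggestion; the 0.08 V scope reading is a made-up number for illustration:

```python
# Convert a scope voltage reading across a low-value series shunt into current.
# R_SHUNT is the 0.01 ohm shunt suggested above; v_drop values are hypothetical.

R_SHUNT = 0.01  # ohms

def current_from_shunt(v_drop, r_shunt=R_SHUNT):
    """Ohm's law: I = V / R across the series shunt."""
    return v_drop / r_shunt

# e.g. an 0.08 V peak on the scope at button press:
print(current_from_shunt(0.08))  # 8.0 A inrush (for this made-up reading)
```

The reason for such a tiny shunt value is visible here too: even at 8 amps it only drops 0.08 V, so it barely disturbs the circuit it's measuring.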
Onward thru the Fog.