Just how is it that a high VSWR can damage the final transistors in an RF power amplifier? Is it simply the wrong impedance (after transformation by the feedline) appearing at the terminals, or is the transmission line itself particularly important?
It depends on the design of the amplifier you're using.
If the reflection coefficient seen by the amplifier is \$\Gamma=-1\$ (thus \$\rm{VSWR}=\infty\$), that's equivalent to driving a short circuit, and you can see why that would be an overload condition for just about any type of amplifier.
If the reflection coefficient is \$\Gamma=+1\$ (again \$\rm{VSWR}=\infty\$), that's equivalent to driving an open circuit. If your amplifier's output stage looks like a common-emitter amplifier with a resistive pull-up (a CML buffer, for example), that's not going to be a problem at all. In other configurations with reactive elements, though, the increased output voltage could cause breakdown of the output devices.
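To put numbers on those two cases, here's a minimal Python sketch (the 10 V forward wave is an arbitrary value I've assumed for illustration). Both \$\Gamma=-1\$ and \$\Gamma=+1\$ give infinite VSWR and the same standing-wave peak somewhere on the line, but the short forces a voltage null right at the mismatch while the open puts the doubled voltage there:

```python
# Minimal sketch (plain Python) relating reflection coefficient to
# VSWR and line voltages. The 10 V forward wave is an arbitrary
# illustrative value, not something from the answer above.

def vswr(gamma):
    """VSWR from a complex reflection coefficient; infinite when |gamma| = 1."""
    mag = abs(gamma)
    return float("inf") if mag >= 1.0 else (1 + mag) / (1 - mag)

V_f = 10.0  # forward (incident) wave amplitude, volts

for gamma in (-1.0, +1.0, 0.5, 0.0):
    v_max = V_f * (1 + abs(gamma))    # standing-wave antinode on the line
    v_load = abs(V_f * (1 + gamma))   # total voltage right at the mismatch
    print(f"gamma={gamma:+.1f}  VSWR={vswr(gamma):6.2f}  "
          f"Vmax on line={v_max:5.1f} V  V at load={v_load:5.1f} V")
```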
Is it reflected power being absorbed and dissipated in the transistors or something else?
If your amplifier's output impedance has a real part, then by definition it absorbs some of the reflected wave, and that power is dissipated in the output devices.
However, the reflected wave will generally be coherent with the outgoing wave the amplifier is producing, so interference between the two can either increase or reduce the stress on the amplifier, depending on the phase relationship between them.
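As a sketch of that phase dependence, here the reflection magnitude is held at an assumed \$|\Gamma|=0.5\$ (VSWR = 3) in an assumed 50 Ω system while its phase is swept; the amplifier sees everything from \$Z_0/3\$ up to \$3Z_0\$, with reactive loads in between:

```python
# Sweep the phase of a fixed-magnitude reflection and compute the
# impedance the amplifier terminals actually see. Z0 = 50 ohms and
# |gamma| = 0.5 (VSWR = 3) are assumed example values.
import cmath, math

Z0 = 50.0
mag = 0.5

for deg in range(0, 360, 45):
    gamma = cmath.rect(mag, math.radians(deg))
    Z = Z0 * (1 + gamma) / (1 - gamma)   # load seen at the amp terminals
    print(f"phase={deg:3d} deg   Z = {Z.real:6.1f} {Z.imag:+6.1f}j ohms")
```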
If you're driving a long line, then small changes in the signal frequency, or even the temperature of the line, could change the reflected wave phase significantly, so it would probably be unwise to design on the assumption that you can control the phase of the reflection.
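To get a feel for how touchy that phase is, this sketch rotates the reflection by the round-trip electrical length \$2\beta\ell\$; the 30 m of coax, 0.66 velocity factor, and frequencies near 145 MHz are all assumed example values:

```python
# The reflection arrives back at the amplifier rotated by the
# round-trip electrical length: gamma_in = gamma_L * exp(-j*2*beta*L).
# 30 m of cable, velocity factor 0.66, and ~145 MHz are assumed values.
import math

c = 3.0e8   # free-space speed of light, m/s
vf = 0.66   # cable velocity factor (assumed)
L = 30.0    # cable length, metres (assumed)

def round_trip_phase_deg(f_hz):
    beta = 2 * math.pi * f_hz / (c * vf)   # phase constant, rad/m
    return math.degrees(2 * beta * L) % 360

for f_mhz in (145.0, 145.5, 146.0):
    print(f"f = {f_mhz:.1f} MHz   round-trip phase = "
          f"{round_trip_phase_deg(f_mhz * 1e6):5.1f} deg")
```

With these numbers, a frequency change of about 0.3 % rotates the reflection phase by roughly 55°; a temperature-driven change in the cable's electrical length acts the same way.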
If you're driving a short line, then controlling the phase of a reflection by controlling the line length is common practice; we do it every time we use a stub or shunt element as part of a matching network, for example.
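For example, here's a rough numerical sketch of single-stub matching (the \$100+j50\;\Omega\$ load and 50 Ω line are assumed values, and the brute-force scan stands in for the usual Smith-chart construction):

```python
# Rough single-stub matching sketch: find the distance d from the
# load where the line admittance has real part 1/Z0, then the length
# of a short-circuited stub that cancels the leftover susceptance.
# ZL = 100 + 50j ohms and Z0 = 50 ohms are assumed example values.
import math

Z0 = 50.0
ZL = complex(100, 50)

def z_in(d_wl):
    """Impedance looking toward the load from d_wl wavelengths away."""
    t = math.tan(2 * math.pi * d_wl)
    return Z0 * (ZL + 1j * Z0 * t) / (Z0 + 1j * ZL * t)

# Brute-force scan of 0..0.5 wavelength for the point where Re(Y) = 1/Z0.
d = min((k / 10000 for k in range(1, 5000)),
        key=lambda d: abs((1 / z_in(d)).real - 1 / Z0))

b = (1 / z_in(d)).imag   # residual susceptance to cancel
# Shorted stub: Y_stub = -j*cot(beta*l)/Z0, so we need cot(beta*l) = b*Z0.
l = math.atan2(1.0, b * Z0) / (2 * math.pi)

print(f"stub at d = {d:.4f} wavelengths from the load, "
      f"stub length l = {l:.4f} wavelengths")
```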