Decreasing the number of bits shouldn't necessarily increase the noise. It will have an effect on the resolution, though (and 12 bits still gives roughly 72 dB of dynamic range, which is pretty good in guitar terms). The noise floor should only come up if dither has been added to compensate for truncation effects.
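To make the resolution point concrete, here's a minimal sketch of a plain uniform requantizer with optional dither. This is just the textbook operation, not a claim about how Eventide actually does it, and the function names are mine:

```python
import random

def requantize(x, bits, dither=False):
    """Quantize a float in [-1.0, 1.0] to `bits` bits, with optional TPDF dither."""
    step = 2.0 / (2 ** bits)                      # quantizer step size
    if dither:
        # Triangular (TPDF) dither, one step peak-to-peak, added before rounding;
        # this is what raises the noise floor slightly while killing truncation distortion.
        x += (random.random() - random.random()) * step
    return round(x / step) * step

# Dynamic range scales at ~6.02 dB per bit, so 12 bits is about 72 dB:
print(6.02 * 12)   # 72.24
```

Without dither the error is correlated with the signal (truncation distortion); with dither it becomes a slightly higher but benign noise floor, which matches the "noise should only go up if dither has been added" point above.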
I'm really curious about HOW Eventide has implemented this algorithm (it's a trade secret I'm sure, but I'm curious nonetheless). When I select 12 bits, do I get 12 bits scaled to cover the actual signal level, or do I just get the most significant 12 bits of the Timefactor's 24-bit converters? There's a big difference there. For example, if the Timefactor's 24-bit A/D input range runs from +20 dBV down to about -124 dBV, then if we just get the most significant 12 bits, our range only goes from +20 dBV down to roughly -52 dBV, throwing away much of the signal and not making use of the top part, because you'd need to really crank the level going into the Timefactor to get the input up there. Most of those old digital delay units had input volume controls to let you drive the A/D converters with good resolution… but we don't have that on the Timefactor, so we don't know how many of the upper bits we're throwing away.
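A toy picture of the "which 12 bits?" question. The level numbers here are assumptions for illustration only, not measurements of the Timefactor:

```python
def keep_msbs(sample, total_bits=24, keep=12):
    """Zero out the bottom (total_bits - keep) bits of an integer sample,
    i.e. the 'most significant 12 bits' interpretation."""
    shift = total_bits - keep
    return (sample >> shift) << shift

def effective_bits(bits, headroom_db):
    """Bits a signal actually exercises when it sits headroom_db below the
    converter's full scale (~6.02 dB per bit)."""
    return bits - headroom_db / 6.02

# If a guitar signal sits, say, 30 dB below an assumed +20 dBV clip point,
# MSB truncation to 12 bits leaves only about 7 bits doing real work:
print(effective_bits(12, 30.0))   # ~7.0
```

This is the crux of the input-gain complaint: with MSB truncation, every dB of unused headroom at the top costs real resolution at the bottom.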
I suspect (pure speculation, don't quote me on it) that Eventide is throwing away the least significant bits in their algorithm, and that, combined with the fact that the TF input range is set up for some pretty high-level signals (try and see just how much level you need to put in before the clip lamp lights), results in users not getting 'representative' performance from the vintage delay algorithm. By 'representative' I mean: if you want your box to sound like an old 12-bit Korg SDD-3000, you'd think that dialing up the 12-bit setting would be correct, but since you don't have input preamp control on the Timefactor, it might be more sonically accurate to dial in a 14-16 bit setting.
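A back-of-the-envelope check on the 14-16 bit idea, under the same truncation guess. The 12-18 dB of unused headroom is an assumed figure, not anything measured:

```python
def compensated_setting(target_bits, unused_headroom_db):
    """Bit setting that still leaves `target_bits` of real resolution after
    losing unused_headroom_db off the top of the range (~6.02 dB per bit)."""
    return target_bits + unused_headroom_db / 6.02

# To get a genuine 12 bits of resolution on the signal itself:
print(compensated_setting(12, 12.0))   # ~14
print(compensated_setting(12, 18.0))   # ~15
```

So if the guess about LSB truncation is right, 12-18 dB of wasted headroom lands you right in that 14-16 bit range.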
Maybe one day I'll hook my soundcard up to the Timefactor and do some analysis…