Hypothesis: the TCQ trellis structure induces autocorrelation in quantization errors that could be exploited, or at least must be modeled.
IMPORTANT NEGATIVE: TCQ errors are effectively iid (lag-1 autocorrelation −0.007; all higher lags < 0.005). The FWHT rotation destroys any trellis-induced error structure by mixing all elements before quantization: the trellis sees rotated (near-Gaussian) data, and any sequential dependencies among quantization indices map to scattered positions in the original (unrotated) domain. This validates the white-noise model for theoretical analysis of TCQ error impact on attention; no correlated-error corrections or noise shaping are needed.
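A minimal sketch of the measurement behind this result. A uniform scalar quantizer stands in for the actual TCQ codec (the trellis search is omitted), and `fwht` is an orthonormal fast Walsh-Hadamard transform; `fwht`, `lag_autocorr`, and all parameters here are illustrative assumptions, not the original pipeline.

```python
import numpy as np

def fwht(x):
    """Orthonormal fast Walsh-Hadamard transform (self-inverse)."""
    x = x.copy()
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            a = x[i:i + h].copy()
            x[i:i + h] = a + x[i + h:i + 2 * h]
            x[i + h:i + 2 * h] = a - x[i + h:i + 2 * h]
        h *= 2
    return x / np.sqrt(n)

def lag_autocorr(e, k):
    """Normalized autocorrelation of e at lag k."""
    e = e - e.mean()
    return float(e[:-k] @ e[k:] / (e @ e))

rng = np.random.default_rng(0)
n = 1 << 14                        # power of two, as FWHT requires
w = rng.standard_normal(n)         # stand-in for a weight vector

w_rot = fwht(w)                    # rotate before quantization
delta = 0.1
q = delta * np.round(w_rot / delta)  # uniform scalar quantizer (TCQ stand-in)
err = fwht(w_rot - q)              # quantization error back in the original domain

# Lag-1..4 autocorrelations of the unrotated-domain error;
# values near zero support the white-noise error model.
acs = [lag_autocorr(err, k) for k in range(1, 5)]
print(acs)
```

With a real TCQ codec substituted for the scalar quantizer, the same `lag_autocorr` check on the inverse-rotated error reproduces the measurement quoted above.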