Let X^n be a sequence drawn from a discrete memoryless source, and let Y^n be the corresponding reconstruction sequence output by a good rate-distortion code. This paper establishes a property of the joint distribution of (X^n, Y^n). It is shown that, for D > 0, the input-output statistics of an R(D)-achieving rate-distortion code converge (in normalized relative entropy) to the output-input statistics of a discrete memoryless channel (DMC). The DMC is 'backward' in that it is a channel from the reconstruction alphabet to the source alphabet. It is also shown that this property does not necessarily hold when normalized relative entropy is replaced by variational distance.
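One plausible formalization of the convergence claim, under notation assumed here (P_{X^n Y^n} for the joint input-output distribution induced by the code, P_{Y^n} for its reconstruction marginal, and W for the backward DMC achieving R(D)), is:

```latex
\frac{1}{n}\, D\!\left( P_{X^n Y^n} \,\middle\|\, P_{Y^n} \times W^n \right) \;\longrightarrow\; 0
\quad \text{as } n \to \infty,
\qquad \text{where } W^n(x^n \mid y^n) = \prod_{i=1}^{n} W(x_i \mid y_i).
```

That is, the code's joint statistics are asymptotically indistinguishable, in per-symbol relative entropy, from those obtained by drawing Y^n from the code's reconstruction distribution and passing it through the memoryless backward channel W.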