We consider a multiterminal source coding problem in which a source is estimated at a central processing unit from lossy-compressed remote observations. Each lossy-encoded observation is produced by a remote sensor. The sensor first obtains a noisy version of the source, then compresses this observation so as to minimize a local distortion measure that depends only on the marginal distribution of its observation. The central node, on the other hand, has knowledge of the joint distribution of the source and all the observations and produces the source estimate that minimizes a different distortion measure between the source and its reconstruction. In this paper, we investigate the problem of optimally choosing the rate of each lossy-compressed remote estimate so as to minimize the distortion at the central processor, subject to a bound on the sum of the communication rates between the sensors and the central unit. We focus, in particular, on two models of practical relevance: the case of a Gaussian source observed in additive Gaussian noise and reconstructed under quadratic distortion, and the case of a binary source observed in bit-flipping noise and reconstructed under Hamming distortion. In both scenarios we show that there exist regimes in which having more remote encoders does not reduce the source distortion. In other words, having fewer, high-quality remote estimates yields a smaller distortion than having more, lower-quality estimates.
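The trade-off described above can be illustrated with a toy numerical sketch for the Gaussian case. This is not the paper's analysis: it models each sensor's quantizer by an additive-noise approximation (quantization noise of variance Var(Y) 2^{-2r} at rate r) and has the central unit form the linear MMSE estimate of the source from the compressed observations. All parameter values are illustrative assumptions.

```python
import math

def central_distortion(sigma_x2, sigma_n2, rates):
    """Toy MMSE distortion at the central unit for a Gaussian source
    X ~ N(0, sigma_x2) observed by len(rates) sensors as Y_i = X + N_i,
    N_i ~ N(0, sigma_n2), each compressed at rate rates[i] bits/sample.

    Quantization is modeled (an approximation, not the paper's analysis)
    as additive noise of variance Var(Y) * 2^(-2 r_i); the center then
    combines all compressed observations by linear MMSE estimation.
    """
    sigma_y2 = sigma_x2 + sigma_n2
    inv_info = 1.0 / sigma_x2  # prior precision of the source
    for r in rates:
        q = sigma_y2 * 2.0 ** (-2.0 * r)      # quantization-noise variance
        inv_info += 1.0 / (sigma_n2 + q)      # sensor noise + quantization
    return 1.0 / inv_info

# Sum rate R = 8 bits, split evenly over K sensors (illustrative values).
R = 8.0
d_one  = central_distortion(1.0, 0.01, [R])          # one high-rate sensor
d_four = central_distortion(1.0, 0.01, [R / 4] * 4)  # four low-rate sensors
```

In this sketch, with clean observations (sensor-noise variance 0.01) the single high-rate encoder attains a lower distortion than four encoders sharing the same sum rate, since its quantization noise scales as 2^{-2R} rather than 2^{-R/2}; with noisy observations the ordering flips, as averaging across sensors dominates.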