Abstract: Previous works have shown that regular distributions whose differential entropy or mean-squared error behavior is close to that of the Gaussian are also close to the Gaussian with respect to distances such as the Kolmogorov-Smirnov or Wasserstein distances, and vice versa. In keeping with these results, we show that, under the assumption of a functional dependence on the Gaussian, any regular distribution that is almost Gaussian in differential entropy exhibits the mean-squared error behavior of an almost linear estimator. A partial converse is established under the addition of an arbitrary independent quantity: a small mean-squared error yields a small entropy difference. The proofs use basic properties of Shannon's information measures and can be employed in an alternative solution to the missing corner point problem of Gaussian interference channels.
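For concreteness, the main implication can be rendered schematically as follows; the notation and the form of the bound are illustrative assumptions on our part, not the paper's exact statement. Writing $X = f(G)$ for a standard Gaussian $G$ (the functional-dependence assumption) and $h(\cdot)$ for differential entropy, the result asserts, roughly, that
\[
h\big(\mathcal{N}(0,\sigma_X^2)\big) - h(X) \le \epsilon
\quad\Longrightarrow\quad
\min_{a,b\in\mathbb{R}} \, \mathbb{E}\big[(X - aG - b)^2\big] \le \delta(\epsilon),
\]
where $\sigma_X^2$ is the variance of $X$ and $\delta(\epsilon) \to 0$ as $\epsilon \to 0$: a small entropy gap to the Gaussian forces the best linear estimator of $X$ from $G$ to be nearly optimal in mean-squared error.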