I don't think this has a real-world application, but I would like to know the reason anyway.

When we calculate the covariance of two variables X and Y, we can make a mistake in calculating one of the two means (the means we subtract from each observation to get the deviations, which are then multiplied together), and the result is the same as if we had used the correct mean. We tried using an incorrect mean for both variables, but then the result doesn't hold; it only holds when exactly one of the means is incorrect.
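The observation above can be reproduced with a short sketch (the sample data and the size of the error in the mean are made up for illustration):

```python
import numpy as np

# Hypothetical sample data
x = np.array([2.0, 4.0, 6.0, 8.0])
y = np.array([1.0, 3.0, 2.0, 5.0])
n = len(x)

x_mean, y_mean = x.mean(), y.mean()
wrong_x_mean = x_mean + 10.0  # deliberately incorrect mean for X
wrong_y_mean = y_mean + 10.0  # deliberately incorrect mean for Y

# Covariance with the correct means (population form, dividing by n)
cov_correct = np.sum((x - x_mean) * (y - y_mean)) / n

# Same formula, but with the wrong mean for X only
cov_wrong_x = np.sum((x - wrong_x_mean) * (y - y_mean)) / n

# Wrong means for both variables
cov_wrong_both = np.sum((x - wrong_x_mean) * (y - wrong_y_mean)) / n

print(cov_correct)     # matches cov_wrong_x
print(cov_wrong_x)
print(cov_wrong_both)  # differs from the other two
```

With one wrong mean the two results agree exactly; with both wrong they differ.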

I don't think it is exactly a re-scaling issue, and it doesn't seem to follow from the shift property of covariance [Cov(X+a, Y) = Cov(X, Y)] either.

Thanks in advance!