g*****z posts: 333 | 1 I am trying to estimate the variance of a normally distributed data set. All
the information I have about the data set is its mean, min, max, and the number
of points in the data set. Is there any algorithm to estimate the variance
from this information?
I did some experiments in R and tried some formulae. Two of them looked OK.
1. (mean - min)(max - mean)/18
2. [(min - mean)^2 + (max - mean)^2]/2/18
Is there any theoretical basis for those experimental results?
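A minimal sketch of the kind of R experiment described above; the sample size, true variance, and replication count are illustrative choices, not values from the original post:

set.seed(1)

n      <- 100      # points per simulated data set (illustrative)
sigma2 <- 4        # true variance used to generate the data (illustrative)
reps   <- 10000    # number of simulated data sets

est1 <- est2 <- numeric(reps)
for (i in seq_len(reps)) {
  x  <- rnorm(n, mean = 10, sd = sqrt(sigma2))
  m  <- mean(x); lo <- min(x); hi <- max(x)
  est1[i] <- (m - lo) * (hi - m) / 18             # formula 1 from the post
  est2[i] <- ((lo - m)^2 + (hi - m)^2) / 2 / 18   # formula 2 from the post
}

# Compare the average of each estimator to the true variance
c(true = sigma2, formula1 = mean(est1), formula2 = mean(est2))

How close the averages come to the true variance with the fixed constant 18 will depend on the chosen sample size.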
Thanks a lot for any suggestions! |
g*****z posts: 333 | 2 Maybe I should try asking on the Statistics board? |
g*****z posts: 333 | 3 Just found that the constant 18 is actually related to the sample size (the number of
points in the data set) logarithmically. The formula that works for all
sample sizes is:
(mean - min)(max - mean)/(1.624 * log(sample size))
where log() denotes the natural logarithm.
Any suggestions on why it behaves like this? Or is there a better way of
doing this?
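A minimal sketch (again with illustrative settings) of how one might check by simulation whether the denominator needed to turn (mean - min)(max - mean) into an estimate of the variance grows like log(sample size), and what the constant is:

set.seed(1)

ns   <- c(50, 100, 500, 1000, 5000)   # sample sizes to try (illustrative)
reps <- 2000                          # simulated data sets per sample size

# Average of (mean - min)(max - mean) for standard normal data, i.e. the
# denominator that would make the estimator match sigma^2 = 1 on average
denom <- sapply(ns, function(n) {
  mean(replicate(reps, {
    x <- rnorm(n)
    (mean(x) - min(x)) * (max(x) - mean(x))
  }))
})

# If denom is roughly c * log(n), the fitted slope estimates the constant c
fit <- lm(denom ~ log(ns))
coef(fit)
plot(log(ns), denom); abline(fit)

If the points lie close to a line in log(n), the fitted slope is the empirical constant (to compare with the 1.624 above); a nonzero intercept would suggest that the pure log(n) scaling is only an approximation.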
Thanks a lot! |