n***s posts: 1257 | 1 [Forwarded from the Music board]
From: novas (novas), Board: Music
Subject: Re: Cover of Beyond's 《海阔天空》 - Cantonese-learning mashup version
Posted at: BBS 未名空间站 (Thu Feb 23 19:29:11 2012, US Eastern)
I guess I didn't learn entropy well. Isn't it that the bigger the entropy of something is, the more information it carries? | N***m posts: 4460 | 2 It should be the opposite.
[Quoting n***s's post: "Isn't it that the bigger the entropy of something is, the more information it carries?"]
| n***s posts: 1257 | 3 Thanks! Do you mean the opposite of my claim or of his? Simply put, who is correct, me or him?
[Quoting N***m's post: "should be the opposite"]
| D***r posts: 7511 | 4 For a list of words {S1, S2, ..., Sn},
suppose their probabilities are {P1, P2, ..., Pn}.
Entropy: H = -sum_i( Pi * log(Pi) )
When P1 = P2 = ... = Pn, the entropy reaches its maximum.
Such a text would carry the maximal information.
[Quoting n***s's post: "who is correct, me or him?"]
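The formula above is easy to check numerically. A minimal Python sketch (the function name `entropy` and the example distributions are my own, not from the thread) computes H = -sum(Pi * log2(Pi)) for a few distributions over four words, showing that the uniform one reaches the maximum, log2(4) = 2 bits:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.
    Terms with p == 0 contribute nothing (the limit p*log p -> 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform    = [0.25, 0.25, 0.25, 0.25]  # P1 = P2 = P3 = P4
skewed     = [0.7, 0.1, 0.1, 0.1]      # one word dominates
degenerate = [1.0, 0.0, 0.0, 0.0]      # outcome is certain

print(entropy(uniform))     # 2.0 bits -- the maximum, log2(4)
print(entropy(skewed))      # about 1.36 bits
print(entropy(degenerate))  # 0.0 bits -- a certain outcome carries no information
```

This also settles the question in the thread the same way D***r's post does: a higher-entropy source is less predictable, so each word it emits carries more information.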