Where am I now? - opening display 2
"Why aren't the second and third rows of the keyboard staggered relative to each other?" - 教えて!goo:
I'm also thinking of partially consolidating my material in the form of an answer to that question.
I think http://blog.goo.ne.jp/raycy/e/c11db5b33d4a1d67900e568ab0dc6273 would be a bit off-topic for that thread.
Something like fleshing out http://www6.atpages.jp/~raycy/Q/ while incorporating the work in progress from http://www6.atpages.jp/raycy/blog2btron/door and the like...
→
"log-likelihood" (対数尤度)
"log degrees of freedom" (対数自由度)
"log diversity" (対数多様度)
nat, hartley, bit
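The three units above (nat, hartley, bit) measure the same quantity of information with logarithm bases e, 10, and 2 respectively; a quick sketch:

```python
import math

p = 0.5  # probability of one of two equally likely outcomes

# The self-information -log(p), measured in each unit:
bits = -math.log2(p)        # base-2 logarithm  -> bits
nats = -math.log(p)         # natural logarithm -> nats
hartleys = -math.log10(p)   # base-10 logarithm -> hartleys

# Conversion: 1 nat = 1/ln(2) bits, 1 hartley = log2(10) bits
print(bits, nats, hartleys)
```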
In that sense, I understand the log-likelihood as a measured value of (probabilistic) entropy. (Hirotugu Akaike, p. 248)
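Akaike's point, that the log-likelihood is a measurement of (probabilistic) entropy, can be sketched numerically: for data drawn from a distribution p, the mean log-likelihood under the true model converges to -H(p). A toy example (the distribution is my own choice, not from the source):

```python
import math
import random

random.seed(0)
# A simple discrete source: P(a)=0.5, P(b)=0.25, P(c)=0.25
p = {"a": 0.5, "b": 0.25, "c": 0.25}

# True (probabilistic) entropy in nats: H = -sum p log p
H = -sum(q * math.log(q) for q in p.values())

# Draw a large sample and compute the mean log-likelihood under the true model
sample = random.choices(list(p), weights=list(p.values()), k=100_000)
mean_loglik = sum(math.log(p[x]) for x in sample) / len(sample)

# Akaike's observation: the mean log-likelihood is approximately -H,
# i.e. a measurement of the entropy
print(H, -mean_loglik)
```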
Society journal "えんとろぴい" (Entropy), No. 5 (Nov. 1985)
"On information entropy (log diversity)", Atsushi Tsuchida
→ "判読限界" 再生紙 OR エントロピー
The legibility limit in the paper-and-pencil system. A white-noise intensity limit of the paper?
The decoding performance of the recycled-paper-and-pencil system.
Required characteristics of paper as a message medium. Search: office recycled paper, required characteristics OR required specifications OR required performance
Can it be read with little stress, easily, quickly, and with few errors?
Can it be written on likewise?
"Grammatical Man" shannon entropy Neumann
http://humanthermodynamics.wetpaint.com/page/Information+theory
Information theory - Encyclopedia of Humanthermodynamics
Why did Shannon define the “missing information” in telephone lines as entropy?
A “measure of uncertainty”, or attenuation?
“My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’”
5. M. Tribus, E.C. McIrvine. (1971). “Energy and information”, Scientific American, 224 (September).
-------
According to another source: [7]
"When Shannon discovered this function he was faced with the need to name it, for it occurred quite often in the theory of communication he was developing. He considered naming it "information" but felt that this word had unfortunate popular interpretations that would interfere with his intended uses of it in the new theory. He was inclined towards naming it "uncertainty" and discussed the matter with the late John Von Neumann. Von Neumann suggested that the function ought to be called "entropy" since it was already in use in some treatises on statistical thermodynamics... Von Neumann, Shannon reports, suggested that there were two good reasons for calling the function "entropy". "It is already in use under that name," he is reported to have said, "and besides, it will give you a great edge in debates because nobody really knows what entropy is anyway." Shannon called the function "entropy" and used it as a measure of "uncertainty," interchanging the two words in his writings without discrimination.
7. M. Tribus, "Information theory and thermodynamics", in Harold A. Johnson (ed.), Heat Transfer, Thermodynamics and Education: Boelter Anniversary Volume. New York: McGraw-Hill, 1964; page 354.
------
According to another source: [8]
At first, Shannon did not intend to use such a highly charged term for his information measure. He thought “uncertainty” would be a safe word. But he changed his mind after a discussion with John von Neumann, the mathematician whose name is stamped upon some of the most important theoretical work of the first half of the twentieth century. Von Neumann told Shannon to call his measure entropy, since “no one really knows what entropy is, so in a debate you will always have the advantage.”
8. Campbell, Jeremy. (1982). Grammatical Man – Information, Entropy, Language, and Life, (pg. 22). New York: Simon and Schuster.
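The "uncertainty function" that von Neumann urged Shannon to call entropy is, in modern notation, H = -Σ p_i log2(p_i). A minimal Python sketch (the function name is my own):

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon's 'uncertainty': H = -sum p_i log(p_i), in bits by default."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty; a certain event carries none.
print(shannon_entropy([0.5, 0.5]))   # fair coin
print(shannon_entropy([1.0]))        # no uncertainty
print(shannon_entropy([0.9, 0.1]))   # biased coin: less than 1 bit
```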
--------
11. Müller, Ingo. (2007). A History of Thermodynamics - The Doctrine of Energy and Entropy (ch. 4: Entropy as S = k ln W, pgs. 123-126). New York: Springer.
The economy of the communication channel, the economy of packing, the economy of signals, the economy of thought, the economy of logical step length. The diseconomy of computer computation time.
The economy of packing → storage, schedule books, hara-hachibu (eating to 80% full)
data compression, KL divergence
information compression, KL divergence
information compression OR data compression; KL divergence OR Kullback
theoretical compression ratio; KL divergence OR Kullback
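The search terms above pair compression with the Kullback-Leibler divergence; the standard connection is that coding a source p with a code built for a mismatched model q costs H(p) + D(p||q) bits per symbol, so D(p||q) is the compression overhead. A sketch (assuming discrete distributions over the same alphabet):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits: the optimal average code length."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]        # true source statistics
q = [1/3, 1/3, 1/3]          # mismatched model used to build the code

# Optimal code length is H(p); coding with q costs H(p) + D(p||q)
H = entropy(p)
cost_with_q = -sum(pi * math.log2(qi) for pi, qi in zip(p, q))
print(H, kl_divergence(p, q), cost_with_q)
```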
Landauer's principle
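Landauer's principle puts a thermodynamic floor under all of this: erasing one bit dissipates at least k_B T ln 2 of heat. A quick back-of-envelope computation (room temperature is my assumption):

```python
import math

# Landauer's principle: erasing one bit costs at least k_B * T * ln(2)
k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0            # assumed room temperature, K

E_per_bit = k_B * T * math.log(2)
print(E_per_bit)     # on the order of 1e-21 J per bit
```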