http://wiki.naturalphilosophy.org/index.php?title=Bayesian_Entropy_and_Inference&feed=atom&action=historyBayesian Entropy and Inference - Revision history2021-11-29T20:42:12ZRevision history for this page on the wikiMediaWiki 1.34.0http://wiki.naturalphilosophy.org/index.php?title=Bayesian_Entropy_and_Inference&diff=17016&oldid=prevMaintenance script: Imported from text file2017-01-01T17:05:20Z<p>Imported from text file</p>
<table class="diff diff-contentalign-left" data-mw="interface">
<col class="diff-marker" />
<col class="diff-content" />
<col class="diff-marker" />
<col class="diff-content" />
<tr class="diff-title" lang="en">
<td colspan="2" style="background-color: #fff; color: #222; text-align: center;">← Older revision</td>
<td colspan="2" style="background-color: #fff; color: #222; text-align: center;">Revision as of 17:05, 1 January 2017</td>
</tr><tr><td colspan="2" class="diff-lineno" id="mw-diff-left-l12" >Line 12:</td>
<td colspan="2" class="diff-lineno">Line 12:</td></tr>
<tr><td class='diff-marker'> </td><td style="background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>==Abstract==</div></td><td class='diff-marker'> </td><td style="background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"><div>==Abstract==</div></td></tr>
<tr><td class='diff-marker'> </td><td style="background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"></td><td class='diff-marker'> </td><td style="background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;"></td></tr>
<tr><td class='diff-marker'>−</td><td style="color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;"><div>The traditional definition of entropy employed in statistical mechanics and in Shannon's information theory, &#8722;&sum;<sub>n</sub>p<sub>n</sub> ln(p<sub>n</sub>), may be viewed as a noninvariant special case (associated with an implicit uniform prior) of an invariant covering theory based on &#8722;&sum;<sub>n</sub>p<sub>n</sub> ln(p<sub>n</sub>/p), where p<sub>n</sub> refers to a posterior probability distribution, as affected by the arrival of “new data,” and p refers to a Bayesian prior probability distribution. This generalized or explicitly Bayesian form of “entropy” thus quantifies the transition between two states of knowledge, prior and posterior, exactly as does Bayes' theorem, and may be considered to have the same scope and information content as that theorem. Constrained extremalization of this form of entropy is demonstrated to be useful in solving three types of classical probability problems: (1) those for which the availability or presumption of single-parameter information allows a Poisson distribution to serve as “universal prior,” (2) those for which additional prior information justifies a known departure from the Poisson law, and (3) those for which statistical sampling provides arbitrary nonuniform prior information; that is, prior to additional data input. In all cases the “new data” must be of the aggregated or summed type expressible as Lagrange constraints. By reference to an example taken from Deming and by extension of the proof of Shannon's “composition law” (hitherto thought to be valid only for the traditional form of entropy), it is shown that use of Bayesian entropy can broaden the scope of information theory, with interpretation of “information” as that which quantifies a transition between states of knowledge. Shannon's “monotonicity law” becomes superfluous and can be eliminated. This generalized form of entropy also promises a more powerful means of treating nonequilibrium thermodynamics, by freeing statistical-mechanical entropy from implicit connection to the equilibrium (uniform prior) or thermostatic state.[[Category:Scientific Paper]]</div></td>
<td class='diff-marker'>+</td><td style="color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>The traditional definition of entropy employed in statistical mechanics and in Shannon's information theory, &#8722;&sum;<sub>n</sub>p<sub>n</sub> ln(p<sub>n</sub>), may be viewed as a noninvariant special case (associated with an implicit uniform prior) of an invariant covering theory based on &#8722;&sum;<sub>n</sub>p<sub>n</sub> ln(p<sub>n</sub>/p), where p<sub>n</sub> refers to a posterior probability distribution, as affected by the arrival of “new data,” and p refers to a Bayesian prior probability distribution. This generalized or explicitly Bayesian form of “entropy” thus quantifies the transition between two states of knowledge, prior and posterior, exactly as does Bayes' theorem, and may be considered to have the same scope and information content as that theorem. Constrained extremalization of this form of entropy is demonstrated to be useful in solving three types of classical probability problems: (1) those for which the availability or presumption of single-parameter information allows a Poisson distribution to serve as “universal prior,” (2) those for which additional prior information justifies a known departure from the Poisson law, and (3) those for which statistical sampling provides arbitrary nonuniform prior information; that is, prior to additional data input. In all cases the “new data” must be of the aggregated or summed type expressible as Lagrange constraints. By reference to an example taken from Deming and by extension of the proof of Shannon's “composition law” (hitherto thought to be valid only for the traditional form of entropy), it is shown that use of Bayesian entropy can broaden the scope of information theory, with interpretation of “information” as that which quantifies a transition between states of knowledge. Shannon's “monotonicity law” becomes superfluous and can be eliminated. This generalized form of entropy also promises a more powerful means of treating nonequilibrium thermodynamics, by freeing statistical-mechanical entropy from implicit connection to the equilibrium (uniform prior) or thermostatic state.</div></td></tr>
<tr><td colspan="2"> </td><td class='diff-marker'>+</td><td style="color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div> </div></td></tr>
<tr><td colspan="2"> </td><td class='diff-marker'>+</td><td style="color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;"><div>[[Category:Scientific Paper<ins class="diffchange diffchange-inline">|bayesian entropy inference</ins>]]</div></td></tr>
</table>Maintenance scripthttp://wiki.naturalphilosophy.org/index.php?title=Bayesian_Entropy_and_Inference&diff=2503&oldid=prevMaintenance script: Imported from text file2016-12-30T04:58:13Z<p>Imported from text file</p>
<p><b>New page</b></p><div>{{Infobox paper<br />
| title = Bayesian Entropy and Inference <br />
| author = [[Thomas E Phipps]], [[Michael H Brill]]<br />
| keywords = [[Bayesian Entropy]], [[Inference]]<br />
| published = 1995<br />
| journal = [[Physics Essays]]<br />
| volume = [[8]]<br />
| number = [[4]]<br />
| pages = 615-625<br />
}}<br />
<br />
==Abstract==<br />
<br />
The traditional definition of entropy employed in statistical mechanics and in Shannon's information theory, &#8722;&sum;<sub>n</sub>p<sub>n</sub> ln(p<sub>n</sub>), may be viewed as a noninvariant special case (associated with an implicit uniform prior) of an invariant covering theory based on &#8722;&sum;<sub>n</sub>p<sub>n</sub> ln(p<sub>n</sub>/p), where p<sub>n</sub> refers to a posterior probability distribution, as affected by the arrival of “new data,” and p refers to a Bayesian prior probability distribution. This generalized or explicitly Bayesian form of “entropy” thus quantifies the transition between two states of knowledge, prior and posterior, exactly as does Bayes' theorem, and may be considered to have the same scope and information content as that theorem. Constrained extremalization of this form of entropy is demonstrated to be useful in solving three types of classical probability problems: (1) those for which the availability or presumption of single-parameter information allows a Poisson distribution to serve as “universal prior,” (2) those for which additional prior information justifies a known departure from the Poisson law, and (3) those for which statistical sampling provides arbitrary nonuniform prior information; that is, prior to additional data input. In all cases the “new data” must be of the aggregated or summed type expressible as Lagrange constraints. By reference to an example taken from Deming and by extension of the proof of Shannon's “composition law” (hitherto thought to be valid only for the traditional form of entropy), it is shown that use of Bayesian entropy can broaden the scope of information theory, with interpretation of “information” as that which quantifies a transition between states of knowledge. Shannon's “monotonicity law” becomes superfluous and can be eliminated. This generalized form of entropy also promises a more powerful means of treating nonequilibrium thermodynamics, by freeing statistical-mechanical entropy from implicit connection to the equilibrium (uniform prior) or thermostatic state.[[Category:Scientific Paper]]</div>Maintenance script
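The constrained extremalization the abstract describes has a concrete numerical reading: minimizing the Bayesian entropy ∑n pn ln(pn/qn) against a prior qn, subject to a normalization constraint and a Lagrange constraint on an aggregate datum, yields an exponentially tilted posterior pn ∝ qn exp(−λn). The sketch below is an illustration of that general technique, not code from the paper; the prior rate, target mean, truncation point, and bisection bracket are all assumptions chosen for the example. It realizes case (1) of the abstract, a Poisson “universal prior” updated by an aggregate mean constraint:

```python
import math

def poisson_pmf(rate, n_max):
    """Truncated Poisson(rate) probabilities for n = 0..n_max."""
    return [math.exp(-rate) * rate**n / math.factorial(n)
            for n in range(n_max + 1)]

def min_relative_entropy(prior, target_mean):
    """Minimize sum_n p_n ln(p_n/q_n) subject to sum_n p_n = 1 and
    sum_n n*p_n = target_mean.  The Lagrange solution is the tilted
    distribution p_n ~ q_n * exp(-lam*n); find lam by bisection."""
    ns = list(range(len(prior)))

    def tilted(lam):
        w = [q * math.exp(-lam * n) for n, q in zip(ns, prior)]
        z = sum(w)
        p = [wi / z for wi in w]
        return p, sum(n * pn for n, pn in zip(ns, p))

    # Bracket assumed wide enough for this example; the tilted
    # mean is strictly decreasing in lam.
    lo, hi = -5.0, 5.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        _, m = tilted(mid)
        if m > target_mean:
            lo = mid
        else:
            hi = mid
    return tilted(0.5 * (lo + hi))[0]

# Case (1): a Poisson "universal prior" with rate 5, updated by the
# aggregate datum that the mean is actually 3.
prior = poisson_pmf(5.0, 60)
posterior = min_relative_entropy(prior, 3.0)
```

Because exponential tilting maps one Poisson law to another, the posterior here coincides with Poisson(3) up to truncation error, a quick closed-form consistency check on the Lagrange solution.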