Lorentz JÄNTSCHI (lori) works ?id=289
- [id] => 289
- [recorddate] => 2016:08:26:11:24:25
- [lastupdate] => 2016:08:26:11:24:25
- [type] => conference
- [place] => Online, Internet
- [subject] => mathematics - probability theory; mathematics - statistics
- [relatedworks] =>
- 3 (low):
- Some applications of statistics in analytical chemistry, ?id=5
- Quantitative structure-activity relationships: linear regression modelling and validation strategies by example, ?id=268
- 4 (some):
- Correlations and regressions with MathLab, ?id=20
- [file] => ?f=289
- [mime] => application/pdf
- [size] => 924529
- [pubname] => 2nd International Electronic Conference on Entropy and Its Applications, November 15-30, 2015
- [pubinfo] => Molecular Diversity Preservation International
- [pubkey] => https://sciforum.net/conference/ecea-2/paper/3251
- [workinfo] => ECEA-2/B001
- [year] => 2015
- [title] => Shannon's entropy usage as statistic
- [authors] => Lorentz JÄNTSCHI, Sorana D. BOLBOACĂ
- [abstract] =>
The distribution of measured data is important in applied statistics for conducting an appropriate statistical analysis.
Different statistics are used to assess a general null hypothesis (H0): the data follow a specific distribution.
Shannon's entropy (H1) is introduced as a statistic, and its performance was evaluated against the Anderson-Darling (AD), Kolmogorov-Smirnov (KS), Cramér-von Mises (CM), Kuiper V (KV), and Watson U2 (WU) statistics.
A contingency table covering four continuous distributions (error function, generalized extreme value, normal, and lognormal), six statistics (including Shannon's entropy as a statistic), and fifty datasets of active chemical compounds with sample sizes from 14 to 1714 was constructed.
Fisher's combined probability test was applied to obtain the overall p-value from the different tests bearing upon the same null hypothesis for each dataset.
Two scenarios were analyzed, one without (Scenario 1: AD & KS & CM & KV & WU) and one with (Scenario 2: AD & KS & CM & KV & WU & H1) inclusion of Shannon's entropy as a statistic.
One hundred and sixty-eight rows of cases were valid and included in the analysis.
The number of H0 rejections varied from 0 to 14, and Shannon's entropy (H1) was the statistic with the smallest number of rejections.
The overall combined test gave identical results for the error function, generalized extreme value, and normal distributions whether Shannon's statistic was included (Scenario 2) or not (Scenario 1).
In the case of the lognormal distribution, inclusion of Shannon's statistic decreased the number of rejections from 16 to 14.
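The abstract applies Fisher's combined probability test to merge the per-test p-values into one overall p-value per dataset. A minimal sketch of that combination step, in plain Python (the function name and the example p-values are illustrative, not taken from the paper; the closed-form survival function works because 2k degrees of freedom is always even):

```python
import math

def fisher_combined_pvalue(pvalues):
    """Fisher's combined probability test.

    Under H0, X = -2 * sum(ln p_i) over k independent tests follows a
    chi-squared distribution with 2k degrees of freedom. Since 2k is
    even, its survival function has the closed form
    exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!, so no external library
    is required.
    """
    k = len(pvalues)
    x = -2.0 * sum(math.log(p) for p in pvalues)
    half = x / 2.0
    term = 1.0    # (x/2)^0 / 0!
    total = 1.0
    for i in range(1, k):
        term *= half / i          # build (x/2)^i / i! incrementally
        total += term
    return math.exp(-half) * total

# Combining a single p-value returns it unchanged (up to rounding):
print(fisher_combined_pvalue([0.05]))
# Five tests on the same H0, as in Scenario 1 (AD, KS, CM, KV, WU);
# the values here are invented for illustration:
print(fisher_combined_pvalue([0.20, 0.10, 0.30, 0.25, 0.15]))
```

Scenario 2 would simply pass six p-values (adding the one from the Shannon's-entropy statistic) instead of five to the same combiner.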
- [keywords] => distribution; Shannon’s entropy; statistic
- [acknowledgment] => Thanks to Prof. Dr. Badong Chen for his fair evaluation of the work during the conference.