Marcin J. Schroeder
Akita International University, 193-2 Okutsubakidai, Yuwa-machi, 010-1211 Akita, Japan.
E-mail: [email protected]
Received: 18 October 2004 / Accepted: 14 December 2004 / Published: 14 December 2004
Abstract: Entropy has been the main tool in the analysis of the concept of information since information theory was conceived in the work of Shannon more than fifty years ago. There were some attempts to find a more general measure of information, but their outcomes were mainly of formal, theoretical interest, and none has provided better insight into the nature of information. The strengths of entropy seemed so obvious that little effort has been made to find an alternative to entropy that gives different values, but is consistent with entropy in the sense that the results obtained in information theory thus far can be reproduced with the new measure. In this article the need for such an alternative measure is demonstrated based on a historical review of the problems with the conceptualization of information. Then, an alternative measure is presented in the context of a modified definition of information applicable outside of the conduit metaphor of Shannon's approach, and formulated without reference to uncertainty. It has several features superior to those of entropy. For instance, unlike entropy it can be easily and consistently extended to continuous probability distributions, and unlike differential entropy this extension is always positive and invariant with respect to linear transformations of coordinates.
Keywords: entropy; measures of information; information theory; semantics of information.
MSC 2000 Codes: 94A17