Jean-Bernard Brissaud
Lab/UFR High Energy Physics, Physics Department, Faculty of Sciences, Rabat, Morocco.
E-mail: [email protected]
Received: 19 November 2004 / Accepted: 14 February 2005 / Published: 14 February 2005
Abstract:
Entropy is a basic physical quantity that has led to various, and sometimes apparently conflicting, interpretations. It has been successively identified with different concepts such as disorder and information. In this paper we revisit these conceptions and establish the following three results:
1. Entropy measures lack of information; it also measures information. These two conceptions are complementary.
2. Entropy measures freedom, and this allows a coherent interpretation of entropy formulas and of experimental facts.
3. To associate entropy with disorder, one must define order as absence of freedom. Disorder, or agitation, is shown to be more appropriately linked with temperature.
Keywords: entropy; freedom; information; disorder.
MSC 2000 codes: 94A17