arXiv Open Access 1997

Heat capacity in bits

P. Fraundorf

Abstract

Information theory this century has clarified the 19th century work of Gibbs, and has shown that natural units for temperature kT, defined via 1/T = dS/dE, are energy per nat of information uncertainty. This means that (for any system) the total thermal energy E over kT is the log-log derivative of multiplicity with respect to energy, and (for all b) the number of base-b units of information lost about the state of the system per b-fold increase in the amount of thermal energy therein. For "un-inverted" (T>0) systems, E/kT is also a temperature-averaged heat capacity, equaling "degrees-freedom over two" for the quadratic case. In similar units the work-free differential heat capacity C_v/k is a "local version" of this log-log derivative, equal to bits of uncertainty gained per 2-fold increase in temperature. This makes C_v/k (unlike E/kT) independent of the energy zero, explaining in statistical terms its usefulness for detecting both phase changes and quadratic modes.
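The central identity above can be checked numerically with a toy example (my own illustration, not from the paper): for a system of quadratic modes, multiplicity scales as W ∝ E^(ν/2), so the log-log derivative d ln W / d ln E should equal ν/2, and the information gained per 2-fold increase in energy, measured in base 2, should be ν/2 bits. The sketch below assumes ν = 6 and drops the multiplicative constant in W, which cancels in both quantities.

```python
import math

NU = 6  # assumed number of quadratic modes for this toy system

def log_multiplicity(E):
    # ln W for W ∝ E^(nu/2); the dropped constant cancels in derivatives
    return (NU / 2) * math.log(E)

def E_over_kT(E, h=1e-6):
    # E/kT = d ln W / d ln E, via a central difference in ln E
    return (log_multiplicity(E * math.exp(h)) -
            log_multiplicity(E * math.exp(-h))) / (2 * h)

# Log-log derivative of multiplicity: should give nu/2 = 3
print(E_over_kT(1.0))

# Base-2 information gained per 2-fold increase in energy (in bits):
bits_per_doubling = (log_multiplicity(2.0) - log_multiplicity(1.0)) / math.log(2)
print(bits_per_doubling)
```

Both printed values come out to 3.0, i.e. "degrees of freedom over two" for ν = 6, matching the abstract's claim that E/kT counts base-b information units lost per b-fold energy increase.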


Citation Format

Fraundorf, P. (1997). Heat capacity in bits. https://arxiv.org/abs/cond-mat/9711074

Journal Information
Year Published: 1997
Language: en
Database Source: arXiv
Access: Open Access ✓