Entropy, in thermodynamics, is a property originally introduced to explain the part of the internal energy of a thermodynamic system that is unavailable as a source for useful work.
Entropic force, the product of temperature and the gradient of the entropy density, viewed as an effective force yielding a gradient in the energy density of a system
Configuration entropy, the entropy change due to a change in the knowledge of the position of particles, rather than their momentum
Conformational entropy, the entropy change due to a change in the "configuration" of a particle (e.g. a right-handed vs. a left-handed polyatomic molecule)
Tsallis entropy, a generalization of Boltzmann-Gibbs entropy
von Neumann entropy, entropy in quantum statistical physics and quantum information science
Entropy of entanglement, related to the Shannon and von Neumann entropies for entangled systems; reflecting the degree of entanglement of subsystems
Algorithmic entropy, an (incomputable) measure of the information content of a particular message
Rényi entropy, a family of diversity measures generalising Shannon entropy; used to define fractal dimensions
Measure-theoretic entropy, a measure of exponential growth in dynamical systems; equivalent to the rate of increase of Shannon entropy of a trajectory in trajectory-space
Topological entropy, a measure of exponential growth in dynamical systems; equivalent to the rate of increase of the α → 0 Rényi entropy of a trajectory in trajectory-space.
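The relationship between the Shannon and Rényi entropies listed above can be made concrete with a short numerical sketch. The function below is an illustrative helper (not from any listed source): it computes the Rényi entropy of order α for a discrete distribution, with the α = 1 case handled as the Shannon limit and α = 0 giving the Hartley (max) entropy, the logarithm of the number of outcomes with nonzero probability.

```python
import math

def renyi_entropy(probs, alpha):
    """Rényi entropy of order alpha for a discrete distribution, in bits.

    alpha = 1 is treated as the Shannon limit; alpha = 0 gives the
    Hartley (max) entropy, log2 of the number of nonzero-probability
    outcomes. Illustrative sketch only.
    """
    nonzero = [p for p in probs if p > 0]
    if alpha == 1:
        # Shannon entropy, the alpha -> 1 limit of the Rényi family
        return -sum(p * math.log2(p) for p in nonzero)
    return math.log2(sum(p ** alpha for p in nonzero)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 1))  # Shannon entropy: 1.5 bits
print(renyi_entropy(p, 0))  # Hartley entropy: log2(3) ≈ 1.585 bits
print(renyi_entropy(p, 2))  # collision entropy: log2(8/3) ≈ 1.415 bits
```

Varying α reweights how much rare versus common outcomes contribute, which is why the family yields a spectrum of diversity measures and, via scaling of the α → 0 member, the growth rates used for topological entropy.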