From 1908 to 1912, Brillouin studied physics at the École Normale Supérieure in Paris. From 1911 he studied under Jean Perrin until he left for the Ludwig Maximilian University of Munich (LMU) in 1912. At LMU, he studied theoretical physics with Arnold Sommerfeld. Just a few months before Brillouin's arrival at LMU, Max von Laue had conducted his experiment showing X-ray diffraction in a crystal lattice. In 1913, he went back to France to study at the University of Paris; it was in this year that Niels Bohr submitted his first paper on the Bohr model of the hydrogen atom.[1] From 1914 until 1919, during World War I, he served in the military, developing the valve amplifier with G. A. Beauvais.[2] At the conclusion of the war, he returned to the University of Paris to continue his studies with Paul Langevin, and was awarded his Docteur ès sciences in 1920.[3] Brillouin's thesis jury was composed of Langevin, Marie Curie, and Jean Perrin, and his thesis topic was the quantum theory of solids. In his thesis, he proposed an equation of state for solids based on the atomic vibrations (phonons) that propagate through them. He also studied the propagation of monochromatic light waves and their interaction with acoustic waves, i.e., scattering of light with a frequency change, which became known as Brillouin scattering.[4][5]
Brillouin was a founder of modern solid-state physics, in which he discovered, among other things, Brillouin zones. He applied information theory to physics and the design of computers, and coined the term negentropy to demonstrate the similarity between entropy and information.[4][5]
Science and Information Theory
A classic source for exploring the connections between information theory and physics, this text is geared toward upper-level undergraduates and graduate students. The author, a giant of twentieth-century physics, applies the principles of information theory to a variety of issues, including Maxwell's demon, thermodynamics, and measurement problems. Brillouin begins by defining and applying the term "information" and proceeds to explore the principles of coding, coding problems and solutions, the analysis of signals, a summary of thermodynamics, thermal agitation and Brownian motion, and thermal noise in an electric circuit. A discussion of the negentropy principle of information introduces Brillouin's renowned examination of Maxwell's demon. Concluding chapters explore the associations between information theory, the uncertainty principle, and physical limits of observation, in addition to problems related to computing, organizing information, and inevitable errors.
It is also important to bear in mind that all attempts to derive Landauer's principle, to date, have been based upon classical information processing. While it would appear that a lower bound, very similar in form to Landauer's principle, can be derived for quantum computational operations, unlike the classical case there appears no proof as yet of the existence of processes that can in principle reach this bound. It remains possible, therefore, that quantum computations may need to incur additional thermodynamic costs. This appears to be true even for the quantum analog of logically reversible operations: Bennett's (1973) procedure for avoiding the cost of storing additional bits involves an operation which cannot in general be applied to quantum operations (Maroney 2004 [in Other Internet Resources]). Finally, as noted above, Allahverdyan and Nieuwenhuizen argued in the opposite direction, that the derivations of this lower bound involve assumptions which can be violated by quantum theory in the low temperature regime.
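For orientation, the classical bound in question is Landauer's limit: erasing one bit of information dissipates at least k_B T ln 2 of heat. A minimal sketch of the arithmetic, assuming an illustrative room temperature of T = 300 K (a value not taken from the text above):

    import math

    # Landauer's bound: erasing one bit dissipates at least k_B * T * ln(2) of heat.
    k_B = 1.380649e-23   # Boltzmann constant in J/K (exact under the 2019 SI definition)
    T = 300.0            # assumed room temperature in kelvin

    E_min = k_B * T * math.log(2)   # minimum dissipation per erased bit, in joules
    print(f"Landauer limit at {T:.0f} K: {E_min:.3e} J per bit")  # ~2.87e-21 J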
Both the Szilard engine and Landauer's principle seem to raise a similar problem about the relationship between knowledge and thermodynamic entropy: if one could know which side of the engine the molecule was located in, one could extract work; if one could know which logical state the device was in, one could set it to zero without work. Without this knowledge, it is necessary to design a process that acts independently of the specific state the system is in. But it is plain that this does not tell us that, even without the knowledge, it is impossible to design a clever process that can still extract the work from the engine without compensation, or a clever process that can still reset the bit without work. Hamiltonian mechanics and Liouville's theorem seem to play a vital, if largely unnoticed, role. As Zhang and Zhang's demon demonstrates, unconstrained violations of the second law are clearly possible given non-Hamiltonian flows, and no appeal to information theory or computation would seem able to avoid this.
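To see why the knowledge at stake is worth exactly one bit of work, a back-of-the-envelope sketch may help (an illustration under textbook assumptions, not an argument made in the text above): the work extractable from the one-molecule Szilard engine by isothermal expansion from half the box to the whole box is k_B T ln 2, which equals the Landauer cost of erasing the demon's one-bit record of the molecule's location, so a complete cycle yields no net gain.

    import math

    # One-molecule Szilard engine at an assumed temperature T = 300 K.
    # Isothermal expansion work: W = k_B * T * ln(V_final / V_initial).
    k_B = 1.380649e-23   # Boltzmann constant in J/K
    T = 300.0            # assumed temperature in kelvin

    work_extracted = k_B * T * math.log(2.0)  # volume doubles: half the box -> whole box
    erasure_cost = k_B * T * math.log(2.0)    # Landauer cost of resetting the one-bit record

    print(f"work extracted:     {work_extracted:.3e} J")
    print(f"erasure cost:       {erasure_cost:.3e} J")
    print(f"net gain per cycle: {work_extracted - erasure_cost:.1e} J")  # zero: no second-law violation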
The concept and phrase "negative entropy" were introduced by Erwin Schrödinger in his 1944 popular-science book What is Life?[1] Later, Léon Brillouin shortened the phrase to negentropy.[2][3] In 1974, Albert Szent-Györgyi proposed replacing the term negentropy with syntropy. That term may have originated in the 1940s with the Italian mathematician Luigi Fantappiè, who tried to construct a unified theory of biology and physics. Buckminster Fuller tried to popularize this usage, but negentropy remains common.
In information theory and statistics, negentropy is used as a measure of distance to normality.[5][6][7] Out of all distributions with a given mean and variance, the normal or Gaussian distribution is the one with the highest entropy. Negentropy measures the difference in entropy between a given distribution and the Gaussian distribution with the same mean and variance. Thus, negentropy is always nonnegative, is invariant by any linear invertible change of coordinates, and vanishes if and only if the signal is Gaussian.
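As a small worked example (the uniform distribution and the numbers below are illustrative assumptions, not taken from the text): a variable uniform on [0, 1] has differential entropy 0 nats and variance 1/12, so its negentropy is the entropy of a Gaussian with variance 1/12, i.e. (1/2) ln(2πe/12) ≈ 0.18 nats.

    import math

    # Negentropy J(X) = H(Gaussian with the same variance) - H(X), in nats.
    # Illustrative case: X uniform on [0, 1] (an assumed distribution, not from the text).
    var = 1.0 / 12.0                                      # variance of Uniform(0, 1)
    h_uniform = math.log(1.0 - 0.0)                       # entropy of Uniform(a, b) = ln(b - a) = 0
    h_gauss = 0.5 * math.log(2 * math.pi * math.e * var)  # entropy of a Gaussian with the same variance

    negentropy = h_gauss - h_uniform                      # always >= 0; zero only for a Gaussian
    print(f"negentropy of Uniform(0, 1): {negentropy:.4f} nats")  # ~0.1765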