READING: You should be done with Ch. 1 and into Ch. 2 by now.
The way we do this is as follows: we assume U is only known (or specified) within some uncertainty dU and that dU (while very small) is much larger than the separation between any discrete energy levels in its range. We must then think of g(U) = D(U) dU in terms of the density of states D(U) per unit energy.
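As a concrete sketch of this idea (the model, names, and numbers here are my own, not from the text), we can count the states of a small spin system inside an energy window dU that spans many discrete levels:

```python
from math import comb

# Model system (assumed for illustration): N spins of moment m in field B.
# The spin excess s runs from -N/2 to N/2, the energy of each level is
# U = -2*s*m*B, and the multiplicity of that level is C(N, N/2 + s).
N = 50          # small enough to count states exactly
mB = 1.0        # m*B in arbitrary energy units; level spacing is 2*mB
levels = {-2 * s * mB: comb(N, N // 2 + s) for s in range(-N // 2, N // 2 + 1)}

def g_of_U(U, dU):
    """g(U) = D(U)*dU: the number of states with energy in [U, U+dU)."""
    return sum(mult for E, mult in levels.items() if U <= E < U + dU)

# A window dU much wider than the level spacing contains many levels,
# so g(U) is a smooth-ish function of U even though the levels are discrete.
print(g_of_U(-10.0, 8.0))
```

The point of the sketch is only that g(U) depends on the window dU through the density of levels near U, which is why D(U) is the more fundamental quantity.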
The temperature of a system is a meaningful concept only for predicting the most probable configuration (how the energy gets shared) of two systems in thermal contact. However, this can be very powerful!
For the spin system, the entropy σ(U) vanishes at both extremes of the energy range (U = ±NmB, where all the spins are aligned one way or the other) and reaches its maximum somewhere in between. At that value of U the inverse temperature 1/τ = ∂σ/∂U goes through zero.
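A quick numerical check of this claim, for an assumed small spin system (the values of N and mB are made up for illustration): the slope of σ(U) is positive below the entropy maximum and negative above it, so 1/τ passes through zero at the peak.

```python
from math import comb, log

# Assumed model: N spins, energy U = -2*s*m*B, entropy sigma = ln g(s,N).
N = 100
mB = 1.0
s_vals = range(-N // 2, N // 2 + 1)
U = [-2 * s * mB for s in s_vals]                      # energy of each level
sigma = [log(comb(N, N // 2 + s)) for s in s_vals]     # sigma = ln g

# Finite-difference slope d(sigma)/dU between neighbouring levels:
pairs = sorted(zip(U, sigma))
slopes = [(sig2 - sig1) / (u2 - u1)
          for (u1, sig1), (u2, sig2) in zip(pairs, pairs[1:])]

# Positive slope (1/tau > 0) at the bottom of the energy range,
# negative slope (1/tau < 0) at the top:
print(slopes[0] > 0, slopes[-1] < 0)
```

The entropy maximum sits at U = 0 for this model, which is where the sign change occurs.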
Zeroth Law: If two systems are in thermal equilibrium with (have the same temperature as) a third system, they are in thermal equilibrium with (have the same temperature as) each other.
(Well, duh!)
First Law: Energy is conserved; heat is a form of energy.
(``You can't get something for nothing.'')
More details on this later.
Second Law: If a closed system is in a configuration that is not the equilibrium (most probable) configuration, the most probable consequence is that the entropy of the system will increase monotonically with time.
(``You can't break even.'')
This is actually somewhat circular, as it is our only way of defining the ``arrow of time.'' I did indulge in a short philosophical digression on this subject. Again, ``more later.''
Third Law: The entropy of a system approaches a constant as its temperature approaches zero.
(We are always talking about absolute temperature here, of course. See the textbook's Appendix B on thermometry.)
This Law is a little arcane. We may get back to it later or we may not. In any case the notion of temperature smoothly approaching zero is invalid for small systems (see above).
Thermal Equilibrium & 2nd Law:
Plot the entropy σ1 of system 1 as a function of U1 from 0 to U; at each point, the slope of the curve is the inverse temperature 1/τ1 of system 1.
On the same graph, plot the entropy σ2 of system 2 ``backwards'' as a function of U2 = U - U1 (i.e. from U2 = 0 on the right to U2 = U on the left). The slope of that curve (rise over run, taking the ``run'' leftward) is the inverse temperature 1/τ2 of system 2.
The condition of maximum net entropy (most probable configuration) is satisfied at the value of U1 (and U2 = U - U1) where the slopes are the same.
If the combined system starts in some other configuration, ... (see the Second Law).
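The graphical construction above can be mimicked numerically. This is a minimal sketch with made-up sizes and energies, using the Gaussian spin-system entropy σ_i(U_i) = const - U_i²/(2m²B²N_i): the split of U that maximizes the total entropy is the one where the two slopes (inverse temperatures) agree.

```python
# Assumed systems and total energy (illustrative values only):
N1, N2 = 1000, 3000
mB = 1.0
U = -800.0                      # total energy to be shared

def sigma(Ui, Ni):
    # Gaussian spin-system entropy, dropping the constant term
    return -Ui**2 / (2 * mB**2 * Ni)

# Scan candidate splits U1 (with U2 = U - U1) and keep the most probable one:
candidates = [U * k / 10000 for k in range(10001)]
best = max(candidates, key=lambda U1: sigma(U1, N1) + sigma(U - U1, N2))

# Equal-slope (equal-temperature) prediction: U1/N1 = U2/N2,
# i.e. U1 = U * N1 / (N1 + N2)
print(best, U * N1 / (N1 + N2))
```

Both numbers come out the same, confirming that the entropy maximum coincides with the equal-slope condition.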
For the model system of N spins (each of magnetic moment m in a field B, so that U = -2smB where s is the spin excess), using the Gaussian approximation to the multiplicity function, g(s,N) ≈ g(0,N) exp(-2s²/N), valid for |s| ≪ N, we can express the entropy in terms of the energy as σ(U) ≈ log g(0,N) - U²/(2m²B²N).
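One can check the Gaussian approximation directly against the exact binomial multiplicity (N here is an assumed value, chosen just to make the check fast):

```python
from math import comb, exp

# Compare the Gaussian form g(s,N) ~ g(0,N) exp(-2 s^2 / N)
# with the exact multiplicity C(N, N/2 + s) for |s| << N.
N = 1000
g0 = comb(N, N // 2)

for s in (0, 5, 10, 20):
    exact = comb(N, N // 2 + s)
    approx = g0 * exp(-2 * s**2 / N)
    print(s, approx / exact)      # ratio should stay very close to 1
```

The ratio stays within a fraction of a percent of unity over this range, even though the multiplicities themselves span many orders of magnitude.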
When two spin systems are in thermal contact, the sharpness of the combined multiplicity g = g1(s1,N1) · g2(s2,N2) can be defined in terms of the fractional deviation (in 2s1/N1) at which g drops from its peak value g_peak (defining the most probable configuration) to [let us say] 1/e² ≈ 13.5% of g_peak. This deviation is shown to be given by 2/√N1 if N1 ≪ N2 (the smaller system determines the sharpness).
For small systems this is not all that sharp (20% for a system of 100 spins) but when N1 = 10²⁰ (not atypical for a small sample of magnetic material) the sharpness is on the order of 10⁻¹⁰, which (to paraphrase Boltzmann) is ``the practical equivalent of fixed.''
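The 20% figure for 100 spins is easy to reproduce numerically. In this sketch (sizes assumed, with N1 ≪ N2) we scan outward from the peak of the combined multiplicity until it has dropped to 1/e² of its maximum:

```python
from math import exp, sqrt

# Two spin systems in thermal contact; energy conservation means that when
# system 1's spin excess deviates from the most probable value by ds,
# system 2's deviates by -ds.  Relative to the peak, the combined
# multiplicity drops by exp(-2 ds^2 / N1) * exp(-2 ds^2 / N2).
N1, N2 = 100, 100000            # assumed sizes, N1 << N2

def g_drop(ds):
    return exp(-2 * ds**2 / N1) * exp(-2 * ds**2 / N2)

# Find the deviation ds at which g falls to 1/e^2 of its peak:
target = exp(-2)
ds = 0.0
while g_drop(ds) > target:
    ds += 0.001

# Fractional deviation in 2*s1/N1; prediction is 2/sqrt(N1) = 0.2 here.
print(2 * ds / N1)
```

The result is 20% to within the scan resolution, dominated entirely by the smaller system, as claimed.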
We shall shortly move on to a general assumption that the thermal equilibrium value of a statistical quantity is indistinguishable (for all intents and purposes) from the actual, instantaneous value. This is not mathematically true, of course, so it is important to bear in mind what conditions must be satisfied in order to make it ``good enough.''