One typical, and not easy to answer, question is: what is entropy?
It is hard to answer because the concept of entropy has come to be used, perhaps over the last decades, in several fields of knowledge. This post, however, is devoted to the perspective of chemical thermodynamics.
A little history
Although several researchers may have noticed entropic phenomena earlier, it was Rudolf Julius Emanuel Clausius who gave us the term entropy, more than 164 years ago.
Well, entropy comes from a Greek word meaning transformation. Transformation of what? Clausius expressed entropy by means of an inequality,
$dS>\dfrac{\delta Q_{irr}}{T}$ Eq. (01)

which does not help too much either. However, there are several demonstrations that arrive at the Clausius inequality, Eq. (01), using the Carnot cycle for a series of process transformations. In Clausius's mind, the subject is the relationship of entropy to heat transfer and temperature. The Clausius inequality, Eq. (01), is also a way of measuring something called directionality: the direction in which a spontaneous process, such as an expansion, can occur. The strict inequality holds for irreversible processes; for reversible processes it becomes an equality.
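A minimal numeric check of the inequality, assuming the textbook case of a free (Joule) expansion of an ideal gas: no heat is exchanged, yet the entropy of the gas still increases, so Eq. (01) is satisfied strictly. The values below are illustrative.

```python
import math

R = 8.314  # gas constant, J/(mol K)

# Free (Joule) expansion of 1 mol of ideal gas into double the volume.
# No heat is exchanged (Q_irr = 0), yet the state-function entropy change
# is dS = nR ln(V_f/V_i) > 0, so Clausius's inequality dS > dQ_irr/T holds.
n, T = 1.0, 298.15
V_i, V_f = 1.0, 2.0

dS = n * R * math.log(V_f / V_i)  # ≈ 5.76 J/K
dQ_irr_over_T = 0.0 / T           # free expansion exchanges no heat

assert dS > dQ_irr_over_T
print(dS)
```

The same comparison fails in reverse: a gas does not spontaneously compress itself, which is exactly the directionality the inequality encodes.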
Clausius's idea makes sense at the macroscopic scale. At the molecular level, Ludwig Boltzmann, in 1877, took the same concept of entropy and adapted it to molecular statistics. In Boltzmann's view, entropy is the degree of disorder in a system, or, as J. Willard Gibbs called it, the mixed-up-ness.
An idea of entropy, from Boltzmann's statement
Since then, many others have done what Boltzmann did: carried the term entropy into another field (with some licence). Nowadays you can find entropy even in informatics.
About physics of entropy
Clausius basically says that a transformation may not occur if inequality Eq. (01) is not satisfied. That is, an engine may not be as efficient as thought, or something else in nature may simply not be possible.
In Boltzmann's statistical picture, entropy counts microstates, $S=k_B\ln W$. In classical thermodynamics, the definition due to Clausius is

$dS\equiv \dfrac{\delta Q_{rev}}{T}$ Eq. (02)

and the Clausius inequality holds along with the idea of disorder. From here comes the idea of estimating the maximum work we can get from an engine (for example). Continuing with the disorder idea, the larger the entropy, the greater the disorder.
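Boltzmann's statistical reading can be made concrete with his formula $S=k_B\ln W$, where $W$ is the number of equally likely microstates. A small sketch, with illustrative numbers:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """Statistical entropy S = k_B ln(W) for W equally likely microstates."""
    return k_B * math.log(W)

# Doubling the number of accessible microstates (more "disorder")
# raises the entropy by k_B ln 2, regardless of the starting W.
dS = boltzmann_entropy(2e6) - boltzmann_entropy(1e6)
print(dS)  # k_B * ln 2 ≈ 9.57e-24 J/K
```

Because the logarithm turns ratios into differences, only the *factor* by which $W$ grows matters, which is why more microstates always means more entropy.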
How to estimate entropy changes
First, entropy is also a thermodynamic function of state.
Also, since the math gets complicated, several cases are treated separately so that easier-to-use formulas can be obtained.
For a non-isothermal process
In this case the temperature changes,
$\Delta S=nC_P\ln \dfrac{T_f}{T_i}$ Eq. (03)
where $C_P$ is the molar heat capacity at constant pressure, in J/(mol K).
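Eq. (03) can be evaluated directly; a short sketch, assuming 1 mol of a monatomic ideal gas as an illustrative case:

```python
import math

def dS_temperature(n, Cp, T_i, T_f):
    """Entropy change for heating n moles at constant pressure, Eq. (03)."""
    return n * Cp * math.log(T_f / T_i)

# 1 mol of a monatomic ideal gas (Cp = 5R/2 ≈ 20.79 J/(mol K))
# heated from 300 K to 600 K:
print(dS_temperature(1.0, 20.79, 300.0, 600.0))  # ≈ 14.4 J/K
```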
For a non-isochoric process
In this case the volume changes (for an ideal gas at constant temperature),
$\Delta S=nR\ln \dfrac{V_f}{V_i}$ Eq. (04)
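Eq. (04) in code, with illustrative amounts and volumes:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def dS_volume(n, V_i, V_f):
    """Entropy change for an ideal gas changing volume at constant T, Eq. (04)."""
    return n * R * math.log(V_f / V_i)

# 2 mol expanding from 10 L to 30 L (only the ratio V_f/V_i matters):
print(dS_volume(2.0, 10.0, 30.0))  # ≈ 18.3 J/K
```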
For a non-isobaric process
In this case the pressure changes (for an ideal gas at constant temperature),
$\Delta S=-nR\ln \dfrac{p_f}{p_i}$ Eq. (05)
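Eqs. (04) and (05) are two faces of the same isothermal ideal-gas result, since by Boyle's law $p_iV_i=p_fV_f$ at constant temperature. A quick consistency check, with illustrative pressures:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def dS_pressure(n, p_i, p_f):
    """Entropy change for an ideal gas changing pressure at constant T, Eq. (05)."""
    return -n * R * math.log(p_f / p_i)

# Dropping the pressure to one third (p: 3 -> 1) corresponds, by Boyle's
# law, to tripling the volume, so Eq. (05) must agree with Eq. (04):
assert math.isclose(dS_pressure(1.0, 3.0, 1.0), 1.0 * R * math.log(3.0))
print(dS_pressure(1.0, 3.0, 1.0))  # ≈ 9.13 J/K
```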
The above equations come from the fact that, entropy being a state function, it is determined by the conditions of the system alone. Also, since several effects may appear together in a given process, it is convenient to split them so that each case can be studied separately and the results added up.
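The split-and-add idea can be sketched numerically. A hypothetical process in which 1 mol of gas is both heated and depressurized is decomposed into one Eq. (03) step and one Eq. (05) step; the $C_P$ value is illustrative.

```python
import math

R = 8.314   # gas constant, J/(mol K)
Cp = 29.1   # illustrative Cp, roughly a diatomic gas, J/(mol K)

# 1 mol heated from 300 K to 450 K while the pressure drops from 2 bar
# to 1 bar. Because S is a state function, split the change into two
# idealized steps and add the results:
dS_T = 1.0 * Cp * math.log(450.0 / 300.0)  # Eq. (03): temperature effect
dS_p = -1.0 * R * math.log(1.0 / 2.0)      # Eq. (05): pressure effect

print(dS_T + dS_p)  # ≈ 11.8 + 5.76 = 17.6 J/K
```

The order of the steps does not matter, which is precisely what being a state function guarantees.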