Entropy: it's more than you think
The universe is constantly changing.
There was a time when this statement would have been met with criticism, and the Church's labelling of it as blasphemy would soon have followed, with a trial and perhaps even the silencing of whoever said it.
But now we know better: experiment shows that the universe is indeed changing.
What exactly is changing, though? Is the configuration of galaxies changing? Is the very fabric of space-time itself changing? No; it is simply that the universe is getting more chaotic. Not chaotic in the sense that the whole universe is at Chandni Chowk, though.
The mood was much the same in the mid-1800s, when people first began speaking about entropy. One of them, Rudolf Clausius, proposed the theory of disgregation: a measure of the degree to which the molecules of a body are separated from each other.
This concept works well enough in a two-particle system, where there isn't much chaos to keep track of. It's like Batman and the Joker in the interrogation room: you can see what's happening quite clearly, and you can gauge what might happen next. But now think of the brawl between Bane's goons and the police. As a crowd, they collectively descend into chaos, yet each individual's intentions are unknown; Bane's men might have wanted to desert, or the policemen might have got cold feet. Neither happened, but anyway.
I quite like the Batman movies, so here are some pictures. Notice before chaos, and during chaos.
This is roughly how the idea of entropy began, though it was refined with better examples and more experiments, and it suffered an identity mix-up along the way: the name 'entropy' was originally given to a different property, while 'disgregation' was actually closer to the entropy we know today. Disgregation, however, was flawed, as the Batman example shows.
All of this went on for a while, until the late 1870s, when Ludwig Boltzmann provided a statistical model for entropy: entropy measures the number of possible microscopic arrangements, or microstates, of the individual atoms and molecules of a system that are consistent with its macroscopic condition. In plain English, entropy counts all the ways the particles of a system could be arranged while the system still looks the same from the outside; the more such arrangements there are, the more random the system can be. The second law of thermodynamics adds that, in an isolated system, entropy can only increase (or stay constant) with time. This means the difference between the final and initial entropy of such a system is always greater than or equal to zero, provided the surroundings do no work on the system.
ΔS = S_f − S_i
ΔS ≥ 0 for an isolated system, where no work is done on it by the surroundings.
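Boltzmann's idea is usually written as S = k_B ln W, where W is the number of microstates consistent with the macrostate. Here is a minimal Python sketch of that formula (my own illustration, not from the original post), using a toy system of four two-state particles as the assumed example:

```python
import math

# Boltzmann constant in J/K
K_B = 1.380649e-23

def boltzmann_entropy(microstates: int) -> float:
    """Entropy S = k_B * ln(W), where W is the number of microscopic
    arrangements consistent with the macroscopic state."""
    return K_B * math.log(microstates)

# Toy example: 4 distinguishable two-state particles, of which exactly 2
# are in the "up" state. The number of microstates for that macrostate
# is C(4, 2) = 6, versus just 1 microstate for "all particles up".
w_mixed = math.comb(4, 2)   # 6 arrangements
w_ordered = 1               # 1 arrangement

# More microstates means more entropy; a single arrangement means S = 0.
print(boltzmann_entropy(w_mixed) > boltzmann_entropy(w_ordered))
```

Since W is at least 1 for any physical system, S = k_B ln W can never be negative, which is why entropy is always zero or positive.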
Consider your standard chaotic school playground before the prayer meet. Once the PE teacher arrives and hits a few disobedient students (just suppose), everybody quietens down and files into class lines to avoid further embarrassment in front of their friends. The entropy of the playground drops drastically because of the work done by the PE teacher. In reality, particles don't have minds, and PE teachers don't inspire quite that much fear, but anyway.
Related to this lower limit on entropy is the third law. It states that at 0 K, a perfectly crystalline solid has zero entropy, and this can hold only for a perfect crystal. Diamond and graphite, which have highly ordered crystalline structures, are the standard examples; calculations show their entropy approaches zero at 0 K. Any imperfect crystal or amorphous solid, by contrast, retains some entropy even near absolute zero, because its particles stay partly disordered. Diamond and graphite are extremely stable allotropes of carbon, even at STP and NTP, thanks to their strong bonding; both have melting points of about 3600 ℃, which reflects that bonding and goes with their low entropies at STP or NTP and their near-zero entropies as the temperature approaches absolute zero. Liquids and gases, meanwhile, never survive as liquids and gases down to 0 K at all: liquid nitrogen, for instance, condenses at −196 ℃ and freezes at −210 ℃, well above absolute zero. We can formula-check all this, but not for-real-check it, because 0 K is not a temperature humans can actually reach.
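The third law can be read straight off Boltzmann's formula: a perfect crystal at 0 K has exactly one possible arrangement, so S = k_B ln 1 = 0, while a disordered solid keeps some "residual" entropy. Here is a hedged sketch using the textbook example of solid carbon monoxide, where each molecule can freeze in one of two orientations (the CO example is my own illustration, not part of the original post):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

# Perfect crystal at 0 K: only one microstate (W = 1), so S = k_B*ln(1) = 0.
perfect_crystal_entropy = K_B * math.log(1)

# Imperfect crystal at 0 K: solid CO molecules can each be frozen in one
# of 2 orientations, so W = 2^N and S = N * k_B * ln(2). Per mole (N = N_A)
# this is about 5.76 J/(mol*K) of residual entropy.
residual_molar_entropy = N_A * K_B * math.log(2)

print(perfect_crystal_entropy)           # 0.0
print(round(residual_molar_entropy, 2))
```

This is why only a perfect crystal can reach zero entropy: anything with leftover disorder still has more than one way to be arranged, and ln W stays above zero.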
It is unclear exactly what happens at 0 K, but one thing is certain: all substances would end up with near-zero entropy there. This was such a revolutionary idea at the end of the 19th century that many leading scientists of the time shunned it, unwilling to accept that the universe was chaotic rather than the cold, clockwork machine of Newton's physics.
A whole new field, statistical mechanics, emerged after Boltzmann's ideas were proved correct. Its probabilistic way of thinking is the very thing that now drives our markets, through complex calculations that can still be undone with just the flick of a switch at the wrong time.
And finally, it reminds us that chaos cannot decrease, and there is nothing we can do about it except go along with it and remember how large the universe is, and how insignificant we are.