View Full Version : Entropy, Disorder, Information etc HELP!!!

Fontwell
5th December 2008, 12:15 PM
Help me Oh Jref, you're my only hope!

This is going to be a bit of a confessional. I'm interested in science (or popular science), I've been an electronic engineer for 19 years plus another six years of study and I don't really get what is meant by entropy. Please note that I'm not disputing anything here, my UK education never mentioned the word once and also I gave up chemistry at age 14/15 and so I just don't get it.

I have read and watched a lot of stuff in which the term is used and also read a few definitions so I know roughly what is meant by entropy but something doesn't feel right about it. Usually when that has happened to me in the past it has been due to one of two things. One is that there was some extra piece of information that I was assumed to know but somehow didn't. The other is that someone is using a term to mean something and I am using it in a different way. I don't know which it is here but possibly both.

I've got the classic problem that I half know what my problem is but I don't know what to ask. So I'm just going to dive in and say what I think and see what happens.

So as I understand it, entropy is disorder. It's not just disorder but "S" a measure of disorder (units? range? I don't know). Now I'm already a bit lost, because I can rate the disorder in my kitchen on a scale of 1 to 10 (family visit in 1 hour to PAAAARTY!!!). But it is an arbitrary scale, like the Beaufort wind scale. I couldn't do any meaningful mathematics with it. So how do we get a meaningful number from a woolly concept like disorder? Is the term disorder being used in a very technical sense to mean something very specific - well I guess it must be, so what does it <u>really</u> mean? What process and what inputs are needed to come up with an actual number? Verbal descriptions preferred to detailed equations.

I also have similar problems with information and complexity and order. It may be related but it could equally be a separate issue. Annoyingly I did once read something that clarified it for me but I can't remember what it was and I've reverted to being confused.

Right. If you release some energy you can swap it for more order (I hope that bit is right!). That order could also be described as complexity(?). I'm imagining building something, like say, a bicycle. So it started as a load of bits of ore etc in the ground - fairly disordered - and ends up as a bicycle - fairly well ordered. I understand that the work done to do this will probably end up as heat which is 'very disordered' (whatever that means!) so we haven't broken any laws, that's not my problem. My problem is this. If I make another bicycle it is pretty much the same as the first one (say). This new bicycle provides no new information to the world (or not much more). To describe this new bicycle I could say - see that old bicycle over there, well the new one is just like that, only it's in my shed. Whereas, to describe the constituent parts in the ground before it was made would take a huge description. So it looks like the amount of information is going down as orderedness goes up (because the heat loss/entropy increase was the same as for the first bike but we lost information by ending up with two similar things). Is that right? Are two bicycles twice the complexity of one bicycle? Is complexity the same as order? How does information relate to entropy? Also, HELP!!

BTW I have looked around the internets but I seem to be after something that is between levels - I'm not a physicist looking to evaluate the entropy change in freezing argon but "Entropy = Disorder" is not enough for me.

Any external links or explanations here would be very much appreciated.

drkitten
5th December 2008, 12:27 PM
So as I understand it, entropy is disorder.

Good enough for folk music.

It's not just disorder but "S" a measure of disorder (units? range? I don't know). Now I'm already a bit lost, because I can rate the disorder in my kitchen on a scale of 1 to 10 (family visit in 1 hour to PAAAARTY!!!). But it is an arbitrary scale, like the Beaufort wind scale. I couldn't do any meaningful mathematics with it. So how do we get a meaningful number from a woolly concept like disorder?

There are several different formulations, depending upon exactly what you're measuring. You're measuring your kitchen? Well, how many micro-states are there that would all count as "clean"? For example, I could swap the knives and the spoons in the silverware drawer, and it would still be "clean." But I couldn't scatter the knives all over the counter and still have it be "clean," right?

There are lots and lots of ways to arrange the "stuff" in your kitchen to have it be messy. There are very few ways to arrange stuff that you would consider to be clean. The higher number of rearrangements means that there is more entropy in a "messy" kitchen than in a clean one.

Is the term disorder being used in a very technical sense to mean something very specific - well I guess it must be, so what does it <u>really</u> mean? What process and what inputs are needed to come up with an actual number?

To get the actual number, calculate the number of microstates and take the logarithm. The more microstates, the higher the value.
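As a toy sketch of that recipe in Python (the "kitchen" counts here are made up for illustration, not a real physical system):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_microstates):
    """S = k_B * ln(W): entropy from a count of equally likely microstates."""
    return K_B * math.log(num_microstates)

# Toy "kitchen": 10 items, each of which could sit in any of 4 places when
# messy, but in exactly one place when clean.
s_clean = boltzmann_entropy(1)      # ln(1) = 0: a unique arrangement has zero entropy
s_messy = boltzmann_entropy(4**10)  # many equivalent "messy" arrangements
print(s_clean, s_messy)
```

The point is only the shape of the formula: more ways to count as "the same state" means a bigger logarithm, means more entropy.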

Right. If you release some energy you can swap it for more order (I hope that bit is right!). That order could also be described as complexity(?). I'm imagining building something, like say, a bicycle. So it started as a load of bits of ore etc in the ground - fairly disordered - and ends up as a bicycle - fairly well ordered. I understand that the work done to do this will probably end up as heat which is 'very disordered' (what ever that means!) so we haven't broken any laws, that's not my problem.

Again, close enough for folk music, although I don't like saying you're "releasing" energy. More like "spending" it.

My problem is this. If I make another bicycle it is pretty much the same as the first one (say). This new bicycle provides no new information to the world (or not much more). To describe this new bicycle I could say - see that old bicycle over there, well the new one is just like that, only it's in my shed. Whereas, to describe the constituent parts in the ground before it was made would take a huge description. So it looks like the amount of information is going down as orderedness goes up (because the heat loss/entropy increase was the same as for the first bike but we lost information by ending up with two similar things). Is that right?

Yes.

But it's not a simple 1:1 ratio between "information" and "order."

Are two bicycles twice the complexity of one bicycle?

No.

Is complexity the same as order?

No, but they're near neighbors.

How does information relate to entropy?

Look up "Shannon entropy" and see if those links help.
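A minimal sketch of Shannon entropy, in case it helps connect the two uses of the word (the example distributions are just illustrative):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # a fair coin carries 1 bit per toss
print(shannon_entropy([1.0]))       # a certain outcome carries no information
print(shannon_entropy([0.9, 0.1]))  # a biased coin is more predictable: under 1 bit
```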

Fontwell
5th December 2008, 12:42 PM
Thanks drkitten. One thing I don't get straight away is the "more microstates" definition. I think I understand roughly what you mean by microstates and how a rarer condition could be thought of as more ordered but it seems a very anthropomorphic/cultural way of defining order. Is my kitchen more ordered when the tea cups are randomly arranged in the cupboard than when they are in size order along the floor? Who gets to choose which states are equivalent?

How is one state deemed to be less ordered than another? It can't be 'because I think it looks a mess', surely? There has to be some sort of objective sciencey measure.

fuelair
5th December 2008, 12:54 PM
Actually, more to the point, the fact that your teacups are teacups rather than their component atoms randomly distributed is order. Eventually (see Heat Death of the Universe) all the order the Universe and we have brought out of the basic building blocks will go back to a random, even arrangement of almost unmoving particles filling the Universe (about 1 per cubic meter or more) at a temperature of a few billionths or trillionths of a Kelvin above Absolute Zero. No order, no higher objects. (No light, or other e.m. either.)

And that, he said, more-or-less quoting G. A. Effinger, is What Entropy Means to Me!

drkitten
5th December 2008, 01:02 PM
Thanks drkitten. One thing I don't get straight away is the "more microstates" definition. I think I understand roughly what you mean by microstates and how a rarer condition could be thought of as more ordered but it seems a very anthropomorphic/cultural way of defining order.

Hey, you wanted it without the math.

Is my kitchen more ordered when the tea cups are randomly arranged in the cupboard than when they are in size order along the floor? Who gets to choose which states are equivalent?

Ultimately, physics does. But that's an unsatisfactory answer, so I need to dig a little deeper.

In general, any two electrons are identical and literally cannot be told apart by any method known to science. Similarly, any two atoms are identical if they have the same number of protons and neutrons -- you literally can't tell one uranium atom from another if they're the same isotope. If I swapped them for my own nefarious reason, you'd literally never know.

In this ultimate physical sense, then, we're counting microstates that cannot be told apart. If you had teacups that could not be told apart at all, then "randomly arranged" would be the only possible way they could be (arranged) -- and they'd all be interchangeable and hence high-entropy.

Fontwell
5th December 2008, 01:06 PM
Shannon entropy stuff looks very good. It will take a while for me to digest but quite a lot of it looks OK. I touched on some of this in electronics but we didn't use quite the same terms (they had to keep it simple for us!). Thanks.

Fontwell
5th December 2008, 01:12 PM
Thinking...

Fontwell
5th December 2008, 01:28 PM
Actually, more to the point, the fact that your teacups are teacups rather than their component atoms randomly distributed is order.

But why?! It isn't just density of matter is it? What do you measure that gives a little number in teacups but not in "their component atoms randomly distributed"? I mean, at a subjective level I can see they are different but what has changed? What if I went to great trouble to get those atoms arranged just how I wanted them floating around as a gas? I know this must be painful but I keep getting the feeling that there is some piece of information you take for granted and I don't know what it is.

RecoveringYuppy
5th December 2008, 01:34 PM
If you have them "arranged" then your arrangement of atoms isn't acting like a gas. Gases have different entropy from a solid made of the same atoms specifically because the atoms are moving (experiencing a wider range of micro-states).

Fontwell
5th December 2008, 02:21 PM
One last attempt. Say I set off a firework and it explodes in the sky and makes a pretty pattern all the way down just how I wanted. I get the feeling that as it explodes it is already gaining entropy. But as a human experience it isn't becoming disordered, everything is just right. What do we measure to spot the increase in entropy - is it just some energy density thing?

Actually, it occurs to me that this might be a 'conservation of energy' type scenario (but not literally). By which I mean, it is almost impossible to explain conservation of energy in any helpful way to someone who doesn't get it, other than to keep repeating how it is true. But eventually, when you've seen a hundred different calculations involving momentum/electrical power/heat you just get used to the fact that all the energy put in ends up somewhere else as other energy out and you absorb that universal truth almost by osmosis.

progressquest
5th December 2008, 03:05 PM
One last attempt. Say I set off a firework and it explodes in the sky and makes a pretty pattern all the way down just how I wanted. I get the feeling that as it explodes it is already gaining entropy. But as a human experience it isn't becoming disordered, everything is just right.

Your feeling is correct: it is gaining entropy. The answer is that it had a lower entropy before the explosion; and even less before it was shot into the sky. The firework has to be carefully packaged (ordered) in order to first shoot up, then explode precisely as it did.

To illustrate, imagine trying to take the bits that fall to the ground and sticking a match to the remains. Nothing (interesting) happens. That is because it most definitely had more order (less entropy) before the match was touched to the fuse.

Fontwell
5th December 2008, 03:28 PM
So if I had a firework, how would you, in principle, calculate its entropy? drkitten talked about microstates - are there fewer microstates in the firework before it is set off? It is still the same number of atoms afterwards, just spread out a bit more. How does that change the number of microstates?

Ziggurat
5th December 2008, 03:48 PM
But why?! It isn't just density of matter is it?

No, it isn't. It's also a function of the fact that within the cup, atoms are NOT randomly arranged. They're arranged in rather particular groupings, with certain atoms stuck next to certain other atoms in regular structures.

What do you measure that gives a little number in teacups but not in "their component atoms randomly distributed"?

As a practical matter, one doesn't generally measure absolute entropy, but entropy change. And that's usually done with heat capacity measurements. The teacup example is rather far from typical thermodynamics problems, but if you really wanted to do it, this is what you'd do: you measure how much heat it took (as a function of temperature - that bit is important) to vaporize your cup into its constituent atoms, then cool the gas back down and measure how much heat you can extract as it cools (again, as a function of temperature). Then you integrate the equation dS = dQ/T (where dQ is something you measured, as a function of temperature) to find out how much entropy changed on the way up, and on the way back down, and the difference is your total change in entropy.
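That integration can be sketched numerically. A rough Python version, using a hypothetical constant heat capacity (real heat capacities vary with temperature, which is exactly why you have to measure them as a function of T):

```python
import math

def entropy_change(heat_capacity, t_start, t_end, steps=100_000):
    """Integrate dS = dQ/T = C(T) dT / T with a simple midpoint-rule sum."""
    dt = (t_end - t_start) / steps
    total = 0.0
    for i in range(steps):
        t = t_start + (i + 0.5) * dt  # midpoint of each temperature slice
        total += heat_capacity(t) / t * dt
    return total

c = lambda t: 75.3  # hypothetical constant heat capacity, J/K
ds = entropy_change(c, 300.0, 350.0)
print(ds, 75.3 * math.log(350.0 / 300.0))  # numeric result vs closed form C*ln(T2/T1)
```

With a constant C the two numbers agree closely, which is a handy sanity check on the numeric approach before feeding in measured C(T) data.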

What if I went to great trouble to get those atoms arranged just how I wanted them floating around as a gas?

Then they would be in a low-entropy state (because there's only one microstate they could be in, the one you put them in). But they wouldn't STAY in a low-entropy state, and will soon end up in a high-entropy state, because nothing is constraining them to remain in that one microstate, and it has LOTS of other microstates it could change to. Whereas the tea cup will not gain entropy over time, because it does have constraints on it (the chemical bonds that keep atoms in their place).

Ziggurat
5th December 2008, 03:53 PM
So if I had a firework, how would you, in principle, calculate its entropy? drkitten talked about microstates - are there fewer microstates in the firework before it is set off? It is still the same number of atoms afterwards, just spread out a bit more. How does that change the number of microstates?

The more places a given atom could be, the more microstates it could be in. You can (at least in principle) calculate the number of microstates by finding all the quantum energy eigenstates of the system. But you can also get a VERY good approximation of this number by looking at classical phase space (meaning a graph of the possible positions and momentums of all your particles). The number of accessible microstates is proportional to the area of classical phase space that is accessible to the system. The more possible momentums of each particle, the higher the entropy. The more possible places an atom could be, the higher the entropy. So a mol of gas at room temp in a small box has less entropy than a mol of gas at room temp in a large box.

Fontwell
5th December 2008, 04:42 PM
Thanks for all the effort guys. I think it is a bit clearer but what I'm really picking up is that there probably isn't a satisfying explanation that works at my level. I either need to really dig down a whole extra layer or just stick with the popular science.

As an aside, I've never really been comfortable with the phrase 'increase in entropy' - I want it to be a 'decrease in order' - don't ask why coz I don't know. It just feels like saying, "I'm a year less young". It's like in electrical stuff when you start working in admittance instead of impedance - it all feels upside down. Anyway, cheers for the replies.

sol invictus
6th December 2008, 04:07 AM
Thanks for all the effort guys. I think it is a bit clearer but what I'm really picking up is that there probably isn't a satisfying explanation that works at my level. I either need to really dig down a whole extra layer or just stick with the popular science.

The best way to understand it is to think of simple examples. Take a room full of air. Now divide the room into two halves by an airtight wall, and evacuate all the air from one half with a pump (so it's all in the other half). That configuration has some entropy, given more or less by the log of the number of possible configurations of the air.

Now remove the barrier, and the air will rush back into the other half. Why did that happen? It's not because the molecules are pushing on each other - it would happen even if each molecule were totally independent of the rest, with no collisions. The reason (or one way to think about it at any rate) is that the entropy is vastly larger when the air spreads out, because each molecule has twice as much volume to live in - so the number of states increases by a factor of 2^N for N molecules, where N~10^23. That's an incredibly huge increase. So it's simply overwhelmingly probable that the air will rush into the evacuated part, because there's nothing stopping it, and because there are so many more possible arrangements in which it does than in which it does not.
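That 2^N counting turns into a concrete number easily enough (a sketch, taking N to be roughly a mole of molecules):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
N = 6.022e23        # roughly a mole of molecules

# Doubling the volume available to each molecule multiplies the number of
# microstates by 2**N, so the entropy (k_B times the log of that count)
# rises by N * k_B * ln(2).
delta_s = N * K_B * math.log(2)
print(delta_s)  # about 5.76 J/K - a perfectly ordinary macroscopic number
```

Notice how taking the logarithm tames the absurd 2^N factor into an everyday-sized quantity; that is much of why entropy is defined as a log in the first place.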

Uncayimmy
6th December 2008, 01:10 PM
Sol, perhaps you can expand (pardon the pun) on your room example. Suppose for the sake of argument this air can be cooled into a liquid and then into a solid form.

When the energy is removed, it cools to a liquid. Entropy is then decreased, yes? That energy is moved somewhere else (say a block of aluminum), so entropy will be increased there, but not as much, yes? When the liquid becomes solid, then there are even fewer places for the molecules to be.

Am I heading in the right direction or is my brain showing signs of entropy? :-)
Ziggurat
6th December 2008, 01:25 PM
When the energy is removed, it cools to a liquid. Entropy is then decreased, yes?

Yes.

That energy is moved somewhere else (say a block of aluminum), so entropy will be increased there,

Yes.

but not as much, yes?

No. The entropy of the aluminum must increase by at least as much as the cooled gas decreases. If the aluminum ends up at a higher temperature than the liquefied gas (which will happen unless it starts out both very cold AND large enough to absorb all that heat without increasing in temperature too much), then it will actually take more energy to increase the entropy of the aluminum sufficiently to compensate for the loss of entropy in the air. This is why it takes energy to run a refrigerator, even though all you want to do is move energy: you can't get rid of entropy.
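The bookkeeping in that argument can be checked with toy numbers (all values here are illustrative, not measured):

```python
# Move heat q out of a cold gas into warmer aluminum, with no work input:
q = 1000.0     # J of heat moved
t_cold = 80.0  # K, the gas being liquefied
t_hot = 300.0  # K, the aluminum block

ds_gas = -q / t_cold   # entropy lost by the gas
ds_al = q / t_hot      # entropy gained by the aluminum
print(ds_gas + ds_al)  # negative: net entropy would decrease, so this can't happen for free

# A refrigerator must dump extra heat (the work input w) into the hot side
# so that (q + w) / t_hot >= q / t_cold. The minimum work is:
w_min = q * (t_hot / t_cold - 1)
print(w_min)  # J - the Carnot-limited cost of moving that heat
```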

When the liquid becomes solid, then there are even fewer places for the molecules to be.

Yes. But if it's colder (and not simply at higher pressure), then there are also fewer possible momentums for each molecule.

69dodge
6th December 2008, 06:50 PM
In this ultimate physical sense, then, we're counting microstates that cannot be told apart.

If they cannot be told apart, what does it mean to say that there are many of them? Why isn't there just one?

It seems like we need to imagine that they are, simultaneously, distinct but also identical. That's weird.

69dodge
6th December 2008, 06:57 PM
Then they would be in a low-entropy state (because there's only one microstate they could be in, the one you put them in).

But they're always in whatever single microstate they're in, whether he put them there or not.

69dodge
6th December 2008, 07:19 PM
If you have them "arranged" then your arrangement of atoms isn't acting like a gas.

No, it's the same gas. The only difference is that he has decided that he wants all those atoms to be exactly where they are, rather than not caring about where they are.

Gases have different entropy from a solid made of the same atoms specifically because the atoms are moving (experiencing a wider range of micro-states).

So you're saying that the gas has a high entropy now, because it will later pass through many different microstates?

(Which aren't really different...)

More weirdness.

RecoveringYuppy
6th December 2008, 09:04 PM
No, it's the same gas. The only difference is that he has decided that he wants all those atoms to be exactly where they are, rather than not caring about where they are.

So you're saying that the gas has a high entropy now, because it will later pass through many different microstates?

(Which aren't really different...)

More weirdness.
Because they have momentum now. (which made me question "arranged")

fuelair
6th December 2008, 09:57 PM
Note the air/divided room example: remove the blockade and you get more entropy as the atoms/molecules (N2, O2, CO2, H2O and trace) move into more space/further apart, gaining entropy. The highest entropy is when the most (all) particles are the greatest distance apart they can be (evenly spread throughout the Universe) - aka the Heat Death of the Universe!

Ziggurat
7th December 2008, 09:51 AM
But they're always in whatever single microstate they're in, whether he put them there or not.

Yes, but your knowledge of what microstates they can be in matters if you want to calculate expectation values of any quantity. Which is what statistical mechanics is all about. If you have more information, the statistics change.

Ziggurat
7th December 2008, 10:02 AM
In general, any two electrons are identical and literally cannot be told apart by any method known to science. Similarly, any two atoms are identical if they have the same number of protons and neutrons -- you literally can't tell one uranium atom from another if they're the same isotope. If I swapped them for my own nefarious reason, you'd literally never know.

In this ultimate physical sense, then, we're counting microstates that cannot be told apart.

This is not correct. The microstates can, in principle, be told apart. Not without interfering with the system (and possibly transferring energy to or from the system - ie, Maxwell's demon doesn't work), and in practice there are just FAR too many parameters to pin down in anything but the smallest systems, but they are indeed distinct. Microstates are NOT the same as fundamental particles.

69dodge
7th December 2008, 08:18 PM
Yes, but your knowledge of what microstates they can be in matters if you want to calculate expectation values of any quantity. Which is what statistical mechanics is all about. If you have more information, the statistics change.

So, entropy is like (Bayesian) probability? We talk about them as if they were properties of an object but actually they're properties of our knowledge of the object?

Ziggurat
7th December 2008, 08:37 PM
So, entropy is like (Bayesian) probability? We talk about them as if they were properties of an object but actually they're properties of our knowledge of the object?

Yes. But in equilibrium thermodynamics, you can't know more about an object unless you're putting additional constraints on that object (your example of putting a gas in a specified microstate and letting go is a non-equilibrium situation), and those constraints directly affect the properties of the object. So it's generally not unreasonable to treat entropy as being a property of the system in question.

sol invictus
8th December 2008, 04:32 AM
Yes. But in equilibrium thermodynamics, you can't know more about an object unless you're putting additional constraints on that object (your example of putting a gas in a specified microstate and letting go is a non-equilibrium situation), and those constraints directly affect the properties of the object. So it's generally not unreasonable to treat entropy as being a property of the system in question.

I don't disagree with that, but I would phrase it slightly differently. Entropy is not uniquely defined even for a given system in a given state, precisely because of the ambiguity that I think is bothering 69dodge and perhaps the OP.

Assuming the system is in a specific microstate, the fine-grained entropy is zero (and will remain so absent interactions with another system). But we can choose a coarse-graining, meaning we can choose to regard some classes of states as indistinguishable. In that case, the entropy will be non-zero (roughly it will be the log of the number of states we've chosen to consider indistinguishable from the true state). Depending on the state and our choice, that entropy can change - it usually stays near a maximum, or rises rapidly towards it if it started off smaller.

In my example of the gas in the box above, the coarse-graining in question was over volume - one can imagine drawing a little ball around each molecule, and any region inside at least one such ball counts as occupied. But one could of course choose a different coarse graining - only over momentum, for example - and get a rather different answer for what happens to the entropy.
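A toy illustration of that point - the very same set of positions gets a different entropy depending on the coarse-graining you choose (the numbers and bin widths are made up):

```python
import math
from collections import Counter

def coarse_grained_entropy(positions, bin_width):
    """Shannon entropy (in nats) of positions after binning, ie coarse-graining."""
    counts = Counter(int(x // bin_width) for x in positions)
    total = sum(counts.values())
    return -sum(c / total * math.log(c / total) for c in counts.values())

positions = [0.1, 0.2, 1.3, 2.7, 3.8, 5.5, 7.9, 9.4]  # one specific "microstate"
print(coarse_grained_entropy(positions, 1.0))   # fine bins: several occupied, entropy > 0
print(coarse_grained_entropy(positions, 10.0))  # one coarse bin: entropy is exactly 0
```

Same microstate, two different entropies - the number depends on which distinctions you choose to ignore.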

dakotajudo
9th December 2008, 09:02 AM
Is my kitchen more ordered when the tea cups are randomly arranged in the cupboard than when they are in size order along the floor? Who gets to choose which states are equivalent?

Perhaps this thought experiment will help. I won't claim it's strictly accurate, but should help with the general concept.

Consider your kitchen. Imagine a giant coming along and shaking the building until all the contents of the kitchen are evenly distributed on the floor. After enough shaking, nothing really moves, because each object has reached its lowest energy state (or, at least, the nearest local minimum of such a state).

How much work would it take to put the kitchen back in order? That's a measure of the original entropy of the kitchen. A very messy kitchen would not take many shakes, would it? Nor take much time to get back to the original mess.

Consider cups randomly arranged in the cupboard, vs cups in size order along the floor. Which state would require fewer shakes to reach a mess?

As Ziggurat notes, you can't really measure entropy in meaningful units; it's easier to measure a change, heat being among the easiest.

Ziggurat
9th December 2008, 11:49 AM
As Ziggurat notes, you can't really measure entropy in meaningful units; it's easier to measure a change, heat being among the easiest.

That's not a unit problem, that's a problem of defining your zero. I think that's what you meant, but the distinction does matter. And actually, there are ways to deal with the absolute value of entropy (not just the change) as well, even experimentally. In particular, the entropy of most solids approaches zero at zero temperature, and the absolute entropy of high-temperature, low-pressure (ie, ideal) gases is known analytically. It's frequently not worth the effort to measure the heat capacity down to very low temperature (get low enough and you can extrapolate to zero quite reliably) or up to very high temperature (where everything eventually vaporizes), but it can be done.

Fontwell
14th December 2008, 05:20 PM
Thanks for all the extra posts guys. I've been too busy to properly digest everything posted here at the moment, plus it hasn't helped that the OP got moved and I lost where it went. When I get time to consider it all, I'll let you know of any breakthroughs.