But perhaps this discussion takes me too far away from the main subject of this thread. I want to know what information is. I want to know what entropy is. And I hope to learn!
Then let us continue. Let us move towards entropy. I have waited a long time for this. However, we will begin atypically by discussing permutations.
Imagine a discrete Universe where space is divided into cells, with at most one particle per cell. This assumption will apply to all of the examples in this post. To begin with, let's assume that we have an isolated system composed of two cells and two particles. Our particles and cells are distinguishable. Let's number the particles 1 and 2, and let the cells intuitively be numbered from left to right.
So we have the following situation: two numbered particles to place in two numbered cells. In how many ways can we arrange the particles in the cells? How many possible settings are there in this system? We see that there are only two:

1 2
2 1
So let us create a slightly more complicated system. Now we have three cells and three particles. How many possible settings are there? Let's list them:
1 2 3
1 3 2
2 1 3
2 3 1
3 1 2
3 2 1
So we have six possible settings. How many possible settings will there be in a system that consists of four particles and four cells? And how many in a system with n particles and n cells?
I will answer this question. The general formula is n!. We need to know that

n! = 1 · 2 · 3 · … · n

Hence we know that:

1! = 1
2! = 1 · 2 = 2
3! = 1 · 2 · 3 = 6
4! = 1 · 2 · 3 · 4 = 24

and so on.
This function is called factorial, and it is by no means a new invention. As a digression, I will quote an excerpt from Wikipedia:
The use of factorials is documented since the Talmudic period (200 to 500 CE), one of the earliest examples being the Hebrew Book of Creation Sefer Yetzirah which lists factorials (up to 7!) as a means of counting permutations. Indian scholars have been using factorial formulas since at least the 12th century. Siddhānta Shiromani by Bhāskara II (c. 1114–1185) mentioned factorials for permutations in Volume I, the Līlāvatī. Fabian Stedman later described factorials as applied to change ringing, a musical art involving the ringing of several tuned bells.
Why is that, and how does it apply to our settings? Well, when we have n cells and n particles, after assigning the first particle to a specific cell, the next particle has only n − 1 possibilities, the one after that only n − 2, and so on down to 1. We can check this on any simple example, for instance n = 3: the first particle has 3 choices, the second 2, the third 1, giving 3 · 2 · 1 = 6.
Nevertheless, we must remember that 0! = 1. Why is it like that? If anyone is interested, I can write a separate comment on this, treating the function as a mapping of sets. For now, however, we can simply take it as an assumption. So 0! = 1. Let us remember this.
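We can convince ourselves of all this with a quick brute-force sketch in Python: enumerate all arrangements of n distinguishable particles in n cells and compare the count with n!. Note that for n = 0 the enumeration yields exactly one arrangement (the empty one), which is why 0! = 1 is the natural convention.

```python
import math
from itertools import permutations

# Count arrangements of n distinguishable particles in n cells
# by brute-force enumeration, and compare with n!.
for n in range(5):
    count = len(list(permutations(range(n))))
    assert count == math.factorial(n)
    print(n, count)  # n = 0 gives 1: the single empty arrangement
```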
Now one of my favourite questions should be asked. When does matter begin (I mean classical matter, like a solid or a gas)? Is one atom already classical matter? Or maybe one molecule of water is classical matter? And 5 atoms? And 100 atoms/molecules? And 6.02214076 × 10²³? But why 6.02214076 × 10²³? Is that some kind of magic number? I will come back to this question in a moment.
First of all, what does it actually mean that we are dealing with matter? Well, I'm going to start with the statistical idea of entropy, which applies to thermodynamics and statistical physics. So I'm abandoning my beloved quantum world, because I want a macroscopic model: my world is to be Newtonian. Therefore, let me state my question more precisely: how many atoms or molecules do I need before I can describe their behaviour in classical (Newtonian) terms?
Now let’s go back to our magic number: 6.02214076 × 10²³. In physics and chemistry, this constant is called Avogadro’s constant and is denoted by the symbol N_A. More specifically:

N_A = 6.02214076 × 10²³ 1/mol
So it is nothing more than the school definition of a mole: one mole contains 6.02214076 × 10²³ elementary objects. Elementary objects may be atoms, molecules, and even pieces of paper, unicorns or whatever we can think of. When we look at the periodic table of elements, we find, among other things, the atomic mass of each element. This atomic mass corresponds to the number of grams of the given substance in one mole. So one mole of hydrogen atoms weighs 1 gram, one mole of oxygen atoms weighs 16 grams, and hence one mole of H₂O weighs 2 · 1 g + 16 g = 18 g, and so on.
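The arithmetic above can be sketched in a few lines of Python. The rounded atomic masses (1 g/mol for hydrogen, 16 g/mol for oxygen) are the ones used in the text, not precise tabulated values:

```python
N_A = 6.02214076e23  # Avogadro's constant, in 1/mol

atomic_mass = {"H": 1.0, "O": 16.0}  # g/mol, rounded as in the text
molar_mass_h2o = 2 * atomic_mass["H"] + atomic_mass["O"]  # 2*1 + 16 = 18 g/mol

# 18 g of water therefore contains exactly one mole of molecules:
molecules = (18.0 / molar_mass_h2o) * N_A
print(molar_mass_h2o, molecules)
```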
So when we talk about a mole of atoms or molecules, we are dealing with macroscopic sizes that obey the principles of classical mechanics. However, the question of where matter begins still remains unanswered. Is half a mole already matter or not? And a quarter of a mole? Is this boundary sharp or fluid? These are just more questions, but let's make an assumption to make the basic concept of entropy easier to grasp.
Imagine a typical classical system. Suppose we have, for example, 10²⁵ atoms or molecules. Going back to our world that is made up of cells, we also have 10²⁵ cells. How many possible settings do we have now? Let us look at how quickly n! grows:

n      n!
2      2
3      6
4      24
5      120
10     3 628 800
20     about 2.4 × 10¹⁸
10²⁵   about 10^(2.5 × 10²⁶)
Now let us move on to our everyday life. Let us imagine we have two paints: red and blue. Red paint consists of red particles, and blue paint of blue particles. We mix them together and get a violet colour. How long would we have to mix the paints to separate them again, ending up with red and blue paint once more?
If we mix forever, we will surely succeed. However, let’s try to estimate the probability that we will be able to achieve the intended effect much faster. Let the number of all molecules equal the number of all cells and be equal to 10²⁵. We assume that molecules can move between cells in any way. We do not pay attention to whether the cells are close together or far apart. Contrary to appearances, this is by no means so important when we consider such large numbers.
Our set of all possible settings now contains (10²⁵)! elements. How many settings are there in which the colours are separated? Of course, more than one, maybe even a billion, but that is still almost nothing compared to (10²⁵)!. Far more numerous are the settings in which the paints are not separated, and this is the fundamental idea of entropy. We say that the Universe tends towards disorder, but is it really so? Or maybe the Universe only selects states that do not seem ordered to us?
As I mentioned in one of the previous posts, when we roll a die ten times and get the same number of dots all ten times, we are surprised: the result seems different, interesting. However, it is just as likely as any other "uninteresting" sequence, which is no less unique.
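The dice observation is easy to make concrete. Every particular sequence of ten rolls has the same probability; the "all identical" outcomes merely form a tiny, easily recognised family of six such sequences:

```python
# Every specific sequence of ten die rolls is equally likely.
total_sequences = 6 ** 10               # 60,466,176 possible sequences
p_one_sequence = 1 / total_sequences    # probability of any ONE sequence

# "All ten rolls identical" covers exactly six of those sequences
# (all ones, all twos, ..., all sixes):
p_all_identical = 6 / total_sequences
print(total_sequences, p_one_sequence, p_all_identical)
```

So the "surprising" pattern is not rarer per sequence; there are simply very few sequences that our eyes single out as special, exactly as with the separated paints.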
This observation is the first secret of entropy. In the next post, I will introduce Maxwell's demon, who will want at all costs to find harmony within the chaos.