6/17/2021 – How many different games of chess are possible? Everyone knows it is a very, very large number. It can be written down in seconds, using just a few digits, yet it is unimaginably huge. Unimaginable? The great scientist Enrico Fermi recommended that we at least try to understand very large numbers, to estimate what they entail. Mathematics professor Christian Hesse attempts to do this for the number of chess games. You will be stunned!

Enrico Fermi was one of the most renowned physicists of the 20th century. He worked on the development of the atomic bomb in Los Alamos during World War II. In Fermi’s opinion every educated person should be able to carry out plausibility calculations in order to quickly estimate any quantity. He asked his students questions like: how many grains of sand are there in the Sahara? How much does Mount Everest weigh? In this article we will deal with a Fermi question: how many chess games are there?

As you have probably heard, it is a very large number. But how large? Is it at all imaginable?

But before we come to it, there is another well-known very large number in chess: when the inventor, Sissa ibn Dahir, presented the wonderful new game to his king, the grateful monarch wanted to know how he wished to be rewarded for his invention.

In false modesty Sissa asked for a single grain of wheat on the first square of the chessboard, two on the second, four on the third, and so on, always doubling the number on the following square until the chessboard was covered. The king agreed, until he realized that 1 + 2 + 4 + 8 + ... meant he owed Sissa 2^{64} – 1 grains, which is 18,446,744,073,709,551,615. That is more than eighteen quintillion grains of wheat, about 2,000 times the annual world production. Legend has it that the king did not grant Sissa’s request, but instead took his head.
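As a quick check of the legend's arithmetic, the doubling sum can be computed directly; a minimal sketch in Python:

```python
# Sissa's reward: one grain on the first square, doubling on each
# of the 64 squares, i.e. 1 + 2 + 4 + ... + 2**63.
total = sum(2**square for square in range(64))
print(total)  # 18446744073709551615, i.e. 2**64 - 1
```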

Now back to the question: how many different chess games are theoretically possible?

Games with more than 60 moves are rare, and on average there are about 30 legal moves in each position that arises on the board. This gives 30^{120} possible games. Naturally the vast majority of these games are blatantly nonsensical. But the fact remains that this is on the order of 10^{180} different games.
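That exponent is easy to sanity-check with logarithms, assuming the article's figures of 30 legal moves per position and 120 half-moves:

```python
import math

# Assumed figures from the article: 30 legal moves per position,
# 120 half-moves (60 full moves) per game.
branching, half_moves = 30, 120
exponent = half_moves * math.log10(branching)
print(f"30**120 is about 10**{exponent:.0f}")  # about 10**177
```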

Now this is a number that has absolutely no relevance to anything in our real existence. The number of elementary particles in all the stars of the known universe is unimaginably smaller.

So let us try to modify the number, make it more manageable. To do this we try to define what we will call “meaningful” chess games. We assume that there are only five reasonable moves in an average chess position. In many there may of course be more, and in many fewer. The average of five moves per position seems reasonable.

Again we will assume that the overwhelming majority of reasonable chess games will be no more than 60 moves long, i.e. a maximum of 120 half-moves. With these two assumptions we arrive at a much smaller number of meaningful chess games: 5 x 5 x 5 … x 5 with 120 factors. Multiplied out, this comes to roughly 10^{80} meaningful chess games. In mathematical terms this is a hundred tredecillion chess games.
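The same logarithm check for the "reasonable" tree, assuming five moves per position (note that 5^120 is in fact closer to 10^84; the article works with the rounder figure of 10^80):

```python
import math

# Same estimate with the article's five "reasonable" moves per position.
exponent = 120 * math.log10(5)
print(f"5**120 is about 10**{exponent:.0f}")  # about 10**84, rounded down in the article to 10**80
```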

10^{80} instead of 10^{180}. What a relief! That, surely, is a more manageable number.

10^{80} happens to roughly correspond to the number of atoms in the known universe, which consists of an estimated two trillion galaxies, with a few hundred billion stars in each galaxy.

But is 10^{80} really a number we can comprehend (in the Fermi sense)? Let us try.

*Warning: the passages you are about to read can cause dizziness and severe mental fatigue – so a little caution is advised. Maybe you should put some pillows on the floor in case you faint.*

We are going to try to break down the number of reasonable chess games into something we Earthlings can just about grasp – like the number of grains of sand in the Sahara, the weight of Mount Everest, the distance to the moon.

We start by setting up a computer that plays through a million (reasonable) games per second, and task it with doing this for a hundred tredecillion (10^{80}) games. After starting it we walk across to the Sahara. There we pick up one grain of sand. This we transport to Arizona, crossing the Atlantic in a row-boat, and toss the grain of sand into the Grand Canyon.

Mind you, we are very slow walkers, and require quite some time for each step we take. Similarly, we are very slow rowers, and each oar stroke takes a great deal of time. Transporting this single grain of sand takes a hundred years.

As soon as the grain is in the Grand Canyon, we begin a second task. We walk over to Mount Everest and carve out a teaspoonful of it. This we transport, at the same speed (a teaspoonful per century), to Canada. There we deposit it on the ground. Then we return to Everest for a second teaspoonful. And we continue doing this until the entire mountain is standing in Canada. After this we reverse the process, moving Everest, teaspoon by teaspoon, back to Nepal. Once this is completed, we return to our computer. Has it played through the 10^{80} games, running at a million games per second? Nowhere close!

Now, we walk and row back, at the same breath-taking speed, to Africa. We pick up a second grain of sand in the Sahara and transport it to the Grand Canyon. After this, we go back to Mount Everest and transport the entire mountain, teaspoon by teaspoon, to Canada, and then back to Nepal. Only then are we ready for the third grain of sand. We keep doing this, one grain of sand at a time, and after each grain we move the entire Mount Everest back and forth, until the Grand Canyon is filled with Saharan sand.

Now we proceed to do the reverse: we return all the sand, grain by grain, back to Africa. And between two grains we transport Mount Everest with a teaspoon to Canada and back. We do this until the Grand Canyon is empty. I grant you it takes a very, very, very long time.

In terms of the Sahara and Everest, we are back to square zero. To make a note of this, we take a sheet of paper, one square meter in size, and use a pencil to make a dot in the top left-hand corner. Then we proceed to repeat the entire cycle: filling the Grand Canyon, grain by grain, emptying it again, and between two grains we move all of Everest, teaspoon by teaspoon, and return it to its original place. When the cycle is complete we make a second dot on the paper, next to the first. After this we repeat the procedure: a first grain of sand, all of Mount Everest back and forth, a second grain of sand, etc. We keep doing this until the entire paper is filled with dots. Has the computer now at last finished playing through at least a few tredecillion games? Not yet!

So we repeat the entire process with a second sheet of paper, and when that is filled, with a third, and a fourth, putting one sheet on top of the other. We keep doing this until we have a pile of paper sheets that reaches the moon.

Exponential Growth, from TED talk by Adrian Paenza (animation by TED-Ed)

Then we start erasing the dots, one at a time, filling and emptying the Grand Canyon, dismantling and reconstructing Everest after each grain of sand, before erasing each dot. Slowly we are getting tired.

So please tell us, you may say: when all the sheets of paper are empty, did the computer at last play through all 10^{80} games? No, but we are closing in on the number. We have to repeat the process, with a new pile of sheets reaching all the way up to the moon, filled with dots and then erased – and do this again and again. How often? Ten times or a thousand times? No, in fact we have to make 200,000 similar piles to the moon before we are done. Only then has the computer at last finished its task: playing through all the reasonable games of chess that are possible.
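Setting the sand and the teaspoons aside, the raw workload of the computer alone can be bounded in a few lines; a minimal sketch in Python:

```python
# Runtime of the computer alone, ignoring all the sand and teaspoons:
games = 10**80                 # the article's count of "reasonable" games
rate = 10**6                   # games played through per second
seconds = games // rate
assert seconds == 10**74
years = seconds / (365.25 * 24 * 3600)
print(f"about 10**74 seconds, i.e. roughly {years:.1e} years")  # roughly 3.2e+66 years
```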

What is the purpose of this deeply outlandish story? It is to give you a faint impression of what really big numbers mean. Saying there can be 10^{80} possible games of chess is probably a bigger deal than you could ever have imagined.

**Also read:** How God plays chess

Discussion and Feedback
Join the public discussion or submit your feedback to the editors

I was following the professor's argument with some awe when he arrived at the square metre sheets of paper. He did not say how many dots each sheet can carry! Without this we cannot determine if the figure is really big or not. It was at this point that I lost grasp of the number...

OMG what an interesting work, many thanks for that revelation !

@Frederick interesting fun fact. I was also amazed when Lawrence Krauss pointed out that we are able to measure and detect gravitational waves, which are so small by the time they reach us that it required a fine piece of engineering to do so. They have two systems of light channeling. One is in Louisiana, the other maybe in Arkansas, and when a gravitational wave reaches us, the length of space temporarily changes at one of the systems and then at the other. They detect such anomalies and validate that a gravitational wave indeed reached us if both systems were affected in short succession. Gravitational waves travel at the speed of light, so the difference between the two reflects that, but in real life it is not that simple, because the wave can come from any direction (but we expect them to be more likely to come from the center of the galaxy).

@dumkof good point about particles. In some circumstances there are no atoms. Also, according to Einstein, energy is another manifestation of mass, as his famous formula suggests. I would also add to your skepticism with my own, since we only know what we know. While this sounds obvious, it is a serious argument, since outside the bounds that we can observe, we do not know how much material is spread. It could be that the universe (or universes in some theories) is so large that the observable universe is just a speck inside of it. And, to further add to the fun, we say that there are more chess games/moves than atoms. Which implies, using the word "is" that we are comparing them now. However, the information we get from distant places travelled for millions or even billions of years until it reached us, so a star we see now may be long gone.

dlemper, life is too short to count all these particles. Neutrons and protons are ok, but photons and electrons are too fast to be counted. It's not worth the effort ;)

The number of all possible move sequences in chess is probably another 10^50 times bigger than the number of particles.

Awesome article.

But I am uneasy when someone tries to add up the universe. The first issue is that Professor Hesse mentions atoms and particles. These are different.

Cosmologists commonly give 10^80 as the number of atoms. Actually the majority, perhaps 90 %, are not 'atoms' but charged ions in the plasma of stars.

Counting difficulties escalate if you try to enumerate particles. There are neutrons packed densely in collapsed stars. What about the electrons, neutrinos and photons in space ? Do we count hypothetical gravitons ? They might be real. What about quarks, virtual particles and the unknown of dark matter and inside singularities ?

Counting is establishing a one-to-one correspondence between the set being enumerated and a subset of the natural numbers. Particles are constantly splitting, merging, being created or annihilated. During a supernova, the collapse of a massive star, an average of 10^57 neutrinos are released.

Even if we could magically fly through the universe and click a Veeder-Root for each particle, one could never complete the count.

Whether one can count an ephemeral, constantly changing set should be left to the philosophy of mathematics. I suppose they'll pull some sort of limit out of the hat. Norbert Wiener would laugh if you tried to count clouds.

A physicist friend asked me if we knew (and could imagine) the smallest number. In CERN they have managed to measure the weight difference between two charm-mesons in quantum superposition. It is 10^-38 or 0.00000000000000000000000000000000000001 grams. Try to imagine that! Here's an article on the subject: https://www.ox.ac.uk/news/2021-06-08-subatomic-particle-seen-changing-antiparticle-and-back-first-time

The author obviously made the assumption that most games do not last more than 60 moves, which is correct. But there are many games that last longer, and mind you, we have a single starting position and deep complications starting from there. Why does the author assume that using the 61st move as a starting position we would not see a similar pattern of deep complexity, maybe even deeper than the 60-move bound the author boxes chess into? Also, "sensible" is a subjective term. What makes sense to a GM does not make sense to an amateur. What makes sense to a computer does not make sense to a GM.

So, let's focus on the actual numbers, without adding subjectivity. There are three main factors: 1. the length of the longest game (2 plies * 50-move rule * number of pawns * 7 squares for the pawn to move * number of nonking pieces - number of pawn takes), 2. the average number of legal moves on each ply, 3. anomalies, like bottleneck factors, that cut a few branches from the imagined exponential game tree.

1. is known, 2. is unknown but can be observed, 3. is unknown. In order to do proper research, we need to address 2 and 3. But we also need to avoid fallacies. Let's exclude subjectivity, like "reasonable" games, and let's exclude the assumption that we play the game well. It's too complex, so it well might be that entities with 4000 Elo would have an average game length of 150 moves or so. I'm not saying that this is the case, but I do say that we should not take human games as universal, as we are very limited in our abilities in comparison with what a perfect chess game would necessitate.

"we will assume that the overwhelming majority of reasonable chess games will be no more than 60 moves long, i.e. a maximum of 120 half-moves."

That's a premature assumption. Mind you that from the starting position we have a single game with many variations. The article calculated with 30 moves on average on each ply. However, if there is a single reasonable game which lasts from the 60th move onwards, maybe containing some promotions and several queens on the board for a while, with an open position and lots of options, then we could easily reach very long endgames, with more than 30 possible moves on average and possibly longer than a sequence of 60 moves. So the assumption in the article is bogus. I understand the need to limit our research to well-defined bounds, but this approach prematurely ignores lots of things. Let me emphasize this: if there is a single 61st move that ends up in a 200-move game and is reasonable, having more than two queens and a few rooks on the board (an easily provable fact), then the complexity dramatically increases, since from the starting position we had a single game with subvariations as well in the first place. And there is more than a single 60+ move game.

@floryncd You have 2^0 for the first square and since the power is incremented by one for each square, having 64 squares, you get:

2^0 + 2^1 + ... 2^63

Written in base two, this number is represented by 64 bits, all set to 1.

2^0 + ... 2^(n-1) = 2^n - 1

Proof: take a number of n bits, all ones, and add one to it. The rightmost bit becomes 0 and carries a one to the left; because of that, the second-rightmost bit also becomes 0, and so on. Since all the binary digits were 1, each and every one becomes 0 and a new digit appears on the left with a value of 1. Hence:

[2^0 + ... 2^(n-1)] + 1 = 2^n

Let's subtract 1 from both sides:

2^0 + ... 2^(n-1) = 2^n - 1

Since n = 64 (the exponents run from 0 to 63), the number of grains of wheat is 2^64 - 1, so the article was correct about this.
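The carry argument can also be verified mechanically; a small Python sketch checking a few values of n, including the chessboard's n = 64:

```python
# The sum 2**0 + ... + 2**(n-1), written in base two, is n one-bits,
# which is exactly 2**n - 1.
for n in (3, 5, 64):
    s = sum(2**i for i in range(n))
    assert s == 2**n - 1
    assert bin(s) == "0b" + "1" * n   # n ones in base two
print("identity holds for n = 3, 5, 64")
```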

Excellent story! Very picturesque!! Thanks....

Stunning!!

@floryncd Actually I think it's 2 to 64th power minus 1. Let's observe, for instance, the 3rd square. It's 1+2+4 = 7 = 2 to 3rd power minus 1. The 5th square goes as follows: 1+2+4+8+16 = 31 = 2 to 5th power minus 1. That makes the 64th square: 2 to 64th power minus 1. Or maybe I make a mistake...

Not 2 to 64th power minus 1, but 2 to 63rd power minus one. Please, correct.

wow! simply wow...

Breathtaking! Absolutely incredible! what a beautiful, vivid, imaginative picture...
