The need to generate numbers by chance dates back to humanity's earliest civilizations. In Mesopotamia, around 3000 BC, the Sumerians used knucklebones (astragali) to obtain random outcomes during divination rituals. In ancient Greece, Athenian democracy relied on the kleroterion, a lot-drawing machine invented in the 5th century BC, which randomly selected citizens to serve as jurors or magistrates. Aristotle himself argued that sortition was more democratic than election. In Rome, the Sortes Virgilianae involved opening Virgil's Aeneid at a random page and reading an omen — a primitive form of randomness drawn from text.
Throughout the Middle Ages and the Renaissance, chance remained inseparable from the sacred. Dice, the ancestors of modern number generators, served both as gaming instruments and divination tools. In 1494, the mathematician Luca Pacioli presented in his Summa de Arithmetica one of the first formal statements of the "problem of points": how to divide the stakes fairly when a game of chance is interrupted. Later, in 1654, the famous correspondence between Blaise Pascal and Pierre de Fermat on that same problem laid the foundations of probability theory, providing the first rigorous mathematical framework for the concept of a random number.
The modern era saw the first systematic attempts to produce tables of random numbers. In 1927, British statistician Leonard H.C. Tippett published the first table of random digits, 41,600 of them, derived from census data. In 1947, the RAND Corporation launched a far more ambitious project: using an electronic roulette wheel, it generated one million random digits, published in 1955 in the landmark book "A Million Random Digits with 100,000 Normal Deviates," which became an indispensable reference for researchers worldwide for decades.
The computing revolution transformed the field radically. In 1946, mathematician John von Neumann proposed the "middle-square" method for ENIAC, one of the first computers: take a number, square it, and extract the middle digits as the next number. Despite its flaws (many seeds quickly collapse to zero or fall into short cycles), this method inaugurated the era of pseudo-random generators. In 1949, Derrick Henry Lehmer invented the linear congruential generator (LCG), based on the recurrence Xₙ₊₁ = (aXₙ + c) mod m, which remained the standard algorithm for decades. In 1997, Makoto Matsumoto and Takuji Nishimura created the Mersenne Twister, whose astronomical period of 2¹⁹⁹³⁷−1 made it the most widely used pseudo-random generator in the world.
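Both historical algorithms are simple enough to sketch in a few lines of Python. The parameters below are illustrative choices (a four-digit middle-square and glibc-style LCG constants), not the exact values used on ENIAC or by Lehmer:

```python
def middle_square(seed: int, n: int) -> list[int]:
    """Von Neumann's middle-square method (4-digit variant):
    square the current value and keep its middle four digits."""
    values, x = [], seed
    for _ in range(n):
        x = (x * x) // 100 % 10_000  # middle 4 digits of the 8-digit square
        values.append(x)
    return values

def lcg(seed: int, n: int,
        a: int = 1103515245, c: int = 12345, m: int = 2**31) -> list[int]:
    """Linear congruential generator: X_{n+1} = (a*X_n + c) mod m.
    Constants are glibc-style parameters, chosen here for illustration."""
    values, x = [], seed
    for _ in range(n):
        x = (a * x + c) % m
        values.append(x)
    return values

# The middle-square flaw is easy to observe: 1000² = 1,000,000,
# whose middle four digits are 0000, so the sequence dies immediately.
print(middle_square(1000, 4))  # → [0, 0, 0, 0]
print(lcg(42, 3))
```

Note how the middle-square's fate depends entirely on the seed, whereas the LCG's period and quality depend on the choice of a, c, and m, which is why so much of the later literature concerns picking those constants well.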
Cognitive psychology has revealed that humans are poor random number generators. A classic 1972 study by Willem Wagenaar showed that when asked to produce random sequences, subjects systematically avoided repetitions and regular patterns, generating sequences that were too "balanced" to be truly random. In 1991, psychologist Peter Ayton demonstrated that people overestimate the probability of alternation in random sequences, a bias closely related to the "gambler's fallacy" (or "Monte Carlo fallacy"). Research by Daniel Kahneman and Amos Tversky showed that our brains search for patterns even in pure noise, a tendency sometimes called apophenia.
Today, random number generators are ubiquitous and critical. Modern cryptography relies on CSPRNGs (Cryptographically Secure Pseudo-Random Number Generators) such as Fortuna, designed by Bruce Schneier and Niels Ferguson in 2003. Monte Carlo simulations, invented by Stanislaw Ulam and John von Neumann in 1946 at Los Alamos National Laboratory, use billions of random numbers to model complex phenomena, from finance to nuclear physics. For "true" randomness, quantum devices exploit the fundamental indeterminacy of quantum mechanics: the Australian National University streams real-time random numbers generated by quantum vacuum fluctuations.
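The Monte Carlo idea can be illustrated with the classic textbook example of estimating π by sampling random points in the unit square. This sketch uses Python's standard random module, whose default generator happens to be the Mersenne Twister mentioned above:

```python
import random

def estimate_pi(n_samples: int, seed: int = 0) -> float:
    """Monte Carlo estimate of π: sample points uniformly in the unit
    square and count the fraction landing inside the quarter circle
    x² + y² ≤ 1; that fraction converges to π/4."""
    rng = random.Random(seed)  # CPython's Random is a Mersenne Twister
    inside = sum(
        1
        for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4 * inside / n_samples

print(estimate_pi(100_000))  # approaches 3.14159... as n_samples grows
```

For cryptographic uses, by contrast, a statistically good generator like the Mersenne Twister is not enough: its internal state can be reconstructed from its output, which is why CSPRNGs such as Fortuna (or, in Python, the secrets module) exist as a separate category.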