Building No Man’s Sky’s life-sized digital universe

A lot of ink has been spilled over this lovely indie game, and the technology behind it is notable regardless of how the actual gameplay holds up.

Raffi Khatchadourian, for the New Yorker:

To build a triple-A game, hundreds of artists and programmers collaborate in tight coördination: nearly every pixel in Grand Theft Auto’s game space has been attentively worked out by hand. [Chief architect Sean Murray] realized early that the only way a small team could build a title of comparable impact was by using procedural generation, in which digital environments are created by equations that process strings of random numbers.

The article cites Acornsoft’s Elite as one of the first games to employ procedural world generation, owing to the technique’s inherent economy: generate things on the fly, and you only have to store the most basic parameters of each item as you encounter it. But procedural worlds have drawbacks, too. Pure chaos looks like nonsensical noise, or, even worse, like monotony. To avoid that, Murray and his team developed techniques that bring some order to the chaos. Figuring out where that balance is struck is an interesting part of the process as well:
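That economy comes from determinism: if a world is a pure function of its coordinates and a fixed seed, nothing about it ever needs to be saved, because revisiting the same place regenerates it identically. A minimal sketch of the idea in Python (this is an illustration of the general technique, not Hello Games’ actual code; the property names and ranges are invented):

```python
import random

def planet_properties(x, y, z, universe_seed=42):
    """Derive a planet's traits purely from its coordinates.

    Hypothetical sketch: seeding a PRNG with the position means the
    same planet is regenerated identically on every visit, so only
    the coordinates and the universe seed ever need to be stored.
    """
    rng = random.Random(hash((universe_seed, x, y, z)))
    return {
        "radius_km": rng.uniform(2000, 8000),
        "has_water": rng.random() < 0.3,
        "terrain_roughness": rng.random(),
    }

# The same coordinates always reproduce the same planet:
a = planet_properties(10, -4, 7)
b = planet_properties(10, -4, 7)
assert a == b
```

This is exactly the trick Elite used in 1984: its eight galaxies of 256 planets each were unfolded from a handful of seed numbers rather than stored on disk.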

Because of No Man’s Sky’s algorithmic structure—with every pixel rendered on the fly—the topography would not be known until the moment of encounter. Theoretically, the game could quickly render a sample of the terrain before deciding that a particular pixel belonged to a river, but then it would also have to render a sample of the terrain surrounding that sample, and so on. “What would end up happening is what we call an intractable problem to which there is only a brute-force solution,” Murray said. “There’s no way to know without calculating everything.” After much trial and error, he devised a mathematical sleight of hand to resolve the problem. Otherwise, the computer would have become mired in building an entire world merely to determine the existence of a drop of water.
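Murray doesn’t reveal the actual “sleight of hand,” but the shape of the solution is a standard one in procedural generation: make every point query answerable locally, as a pure function of that point’s coordinates, so deciding whether a spot is underwater never requires generating the terrain around it. A toy sketch, assuming a cheap hash-style value-noise function (the constants are a common shader idiom, and `sea_level` is an invented parameter):

```python
import math

def noise(x, y, seed=0.0):
    """Hash-style value noise: a deterministic pseudo-random value
    in [0, 1) computed from the coordinates alone."""
    n = math.sin(x * 12.9898 + y * 78.233 + seed) * 43758.5453
    return n - math.floor(n)  # fractional part

def is_water(x, y, sea_level=0.4):
    """Point query answered locally: no surrounding terrain is
    generated to decide whether this one spot is underwater."""
    # Blend a few scales of noise into an elevation value in [0, 1).
    # (Assumption: this toy stands in for whatever Hello Games uses.)
    scales = (1, 2, 4)
    elevation = sum(noise(x / s, y / s) / s for s in scales)
    elevation /= sum(1 / s for s in scales)
    return elevation < sea_level
```

Because `is_water` depends only on `(x, y)` and the seed, the cost of the query is constant: the brute-force cascade Murray describes, where each sample demands samples of its own surroundings, never gets started.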