Low on story, high on simulation and experimental features, Terminator requires you to move a group of robot explorers across the face of an alien planet in order to collect unconscious astronauts from the surface and bring them to your ship, where they can receive medical attention. Meanwhile, the terminator line, representing the arrival of destructive sunlight, moves in steadily from the east. You have only so many turns before the encroaching light forces your spaceship to launch, abandoning any astronauts and robots remaining.
On my first tour I managed to rescue only three of six crew members; the other three were within sight of my spaceship but just hadn’t quite made it.
At your disposal you have three “hauler” robots (capable of picking up fallen astronauts) and three “scout” robots (faster to move, longer-sighted, and able to lay down beacons that will provide direction to the other robots). They will all obey your directional commands, both as absolute directions (EAST, WEST) and as relative directions based on landmarks and other robots (GO TOWARDS SPACESHIP). This is particularly ingenious because you can command multiple robots at once, and of course ROBOTS, GO TOWARDS SPACESHIP might mean each of those robots moving in a different compass direction.
So there are several quite cool things going on here, from an experimental point of view. One, the space is being modeled as an extent, and the robots in the space are generating descriptions of what they can see around them, including things that are technically in other “rooms”. Because the scouts can see further than the haulers, it’s sometimes necessary to piece together information from the reports of multiple robots to work out where they all are relative to one another: Scout Alpha can see Hauler Zeta, and Scout Beta can see Scout Alpha and the Spaceship; thus I can work out where they all are relative to the ship.
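That chain of inference can be sketched as a graph walk. The following is a hypothetical illustration, not the game’s actual code: the function names, coordinate offsets, and report format are all my own invention, standing in for whatever the game does internally.

```python
# Sketch: chaining sight reports so every robot gets coordinates
# relative to the spaceship. Each report means "observer sees target
# at offset (dx, dy)". All offsets here are invented for illustration.

from collections import deque

def resolve_positions(reports, anchor="SPACESHIP"):
    """Breadth-first walk over the sight reports, accumulating offsets
    so every entity reachable from the anchor gets a position."""
    # Build a bidirectional offset graph: if you see A at (dx, dy),
    # then A sees you at (-dx, -dy).
    graph = {}
    for observer, target, dx, dy in reports:
        graph.setdefault(observer, []).append((target, dx, dy))
        graph.setdefault(target, []).append((observer, -dx, -dy))

    positions = {anchor: (0, 0)}
    queue = deque([anchor])
    while queue:
        here = queue.popleft()
        hx, hy = positions[here]
        for neighbor, dx, dy in graph.get(here, []):
            if neighbor not in positions:
                positions[neighbor] = (hx + dx, hy + dy)
                queue.append(neighbor)
    return positions

# The chain from the review: Beta sees the ship and Alpha; Alpha sees Zeta.
reports = [
    ("SCOUT BETA", "SPACESHIP", 3, 0),
    ("SCOUT BETA", "SCOUT ALPHA", -2, 1),
    ("SCOUT ALPHA", "HAULER ZETA", 0, 2),
]
print(resolve_positions(reports))
# → Zeta ends up placed relative to the ship even though no single
#   robot can see both at once.
```

The point of the sketch is just that two overlapping lines of sight are enough to anchor a robot nobody at the ship can see directly.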
Two, the system is handling multiple simultaneous action commands for various NPCs. At its best, this system made me feel as though I was conducting a rather majestic dance, sending my robots moving in concert across the planetary surface. What’s more, this construct is possible because of the flexibility of the parser command line, which lets you do things like “BETA AND DELTA, GO TOWARDS BEACON THREE” or “SCOUTS, EAST” or “ROBOTS, GO TO SPACE SHIP.” While it’s possible to imagine constructing a choice-based interface that would sort of let you do this, it seems like it would be a good bit more fiddly.
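To make the “same order, different moves” idea concrete, here is a minimal sketch, again with invented names and positions rather than anything from the game itself: one relative command is applied to a whole group, and each robot resolves it to its own compass direction.

```python
# Sketch: a single relative order ("GO TOWARDS SPACESHIP") resolving
# to a different compass move per robot. Positions are invented.

def step_towards(pos, goal):
    """Pick the compass direction that closes the larger gap first."""
    dx, dy = goal[0] - pos[0], goal[1] - pos[1]
    if abs(dx) >= abs(dy):
        return "EAST" if dx > 0 else "WEST"
    return "NORTH" if dy > 0 else "SOUTH"

def command(robots, names, goal):
    """Apply one relative order to a whole group at once."""
    return {name: step_towards(robots[name], goal) for name in names}

robots = {"ALPHA": (-5, 1), "BETA": (-3, 0), "ZETA": (2, 5)}
spaceship = (0, 0)

# "ROBOTS, GO TOWARDS SPACESHIP" — each robot gets its own heading.
print(command(robots, robots.keys(), spaceship))
# → {'ALPHA': 'EAST', 'BETA': 'EAST', 'ZETA': 'SOUTH'}
```

This is exactly what would be fiddly in a choice-based interface: the menu would have to enumerate groups and targets, where the parser lets the player compose them freely.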
Terminator is kind of more interesting to think about than it is fun to play, though. The whole conceit requires that locational information be presented in a high-friction way; you have to go to a lot of effort to work out where the robots are, and it’s hard not to wish for a nice grid screen that would just lay it out for you. Don’t get me wrong — I did enjoy playing it — but it had a kind of strained, unnatural quality as a piece of game design. You’d only make this game, I felt, if you started from the premise that it was going to be a text game and then came up with the interaction hook; if you started with the interaction, you’d wind up with a graphical map at the least.
Is it possible to imagine these technical innovations in some context that would make more sense as a text-based game? I think so. My mind immediately offers me Harem Simulator 2015 (“BELLA AND FREDERICO, REMOVE PANTS. WOMEN, EAT GRAPES.”) but perhaps more plausible would be a heist, or a dance, or a game about high-level diplomacy (“CHINA AND RUSSIA, PRESSURE NORTH KOREA. UN, SANCTION IRAN.”).