Zendegi by Greg Egan (Gollancz, 2010).

This is an entertaining but rather slight novel that places Egan’s usual concerns—transhumanism, uploading of minds, the ethics of artificial intelligence—in a democratic Iran in the early and mid twenty-first century. (You can read chapter 2 of the novel at Egan’s website.)

Martin Seymour is an Australian reporter who travels to Iran to cover the pro-democracy demonstrations surrounding the 2012 elections. He marries Mahnoosh, a political activist, and settles in Tehran. Then Mahnoosh is killed in a car crash, and Martin discovers that he is dying of cancer and needs to make provision for the future welfare of his young son, Javeed. His friends Omar and Rana will provide materially for Javeed, but Martin worries that they won’t be able to instil in Javeed the western values that Martin believes in.

Mahnoosh’s cousin Nasim Golestani has a background in artificial intelligence research, but has quit academia to work for ‘Zendegi’, a developer of a multiplayer online video game platform. She applies her expertise to the creation of more realistic non-player characters (“Proxies” in the book) using the fictional technology of “side-loading” (an ingenious name, paralleling the traditional sf term “uploading”). Side-loading requires a test subject to respond to stimuli in an MRI scanner while a neural network is trained to respond in the same way: not just in input/output terms, but in structure and detail.
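The behavioural half of that idea, fitting a model so that it produces the same outputs as a subject given the same inputs, is ordinary imitation learning, and can be sketched in a few lines. (This is my own toy illustration, not anything Egan specifies: the “subject” here is a trivial function and the “Proxy” a one-parameter linear model fitted by gradient descent.)

```python
import random

def subject_response(stimulus):
    """Stand-in for the scanned test subject: maps a stimulus to a response."""
    return 3.0 * stimulus + 1.0

def side_load(trials=2000, lr=0.01):
    """Fit a linear 'Proxy' (w, b) to mimic the subject's responses."""
    w, b = random.random(), random.random()
    for _ in range(trials):
        s = random.uniform(-1.0, 1.0)      # present a stimulus
        target = subject_response(s)       # subject's response in the scanner
        pred = w * s + b                   # Proxy's current response
        err = pred - target
        w -= lr * err * s                  # nudge the Proxy's parameters
        b -= lr * err                      # toward matching behaviour
    return w, b

if __name__ == "__main__":
    w, b = side_load()
    print(round(w, 1), round(b, 1))        # close to the subject's 3.0 and 1.0
```

The fictional leap is the second half of the sentence above: matching the subject’s internal structure as well as its input/output behaviour, which no amount of behavioural data alone can guarantee.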

Martin realises that he may be able to use this technology to create a Proxy which can meet and mentor Javeed in the game-worlds of Zendegi on his behalf after his death.

The idea of a father seeking some way to form the character of his son after his death is a strong one, and Martin’s plan to do this via immersive video games seems psychologically plausible in a way that, say, writing a book setting out his moral values does not. It adds pathos to the scenes in which Martin accompanies Javeed through games in Zendegi, which take up a substantial portion of the book.

However, in the end I felt let down: the novel never delivers the emotional impact that it promises. Ned Beauman in SFX calls it a “tepid meditation on fatherhood and Middle Eastern democracy,” which is a fair summary. Egan’s characterisation is simply not good enough to support the story he wants to tell. In his better novels this doesn’t matter, either because he’s writing about post- or non-humans, as in Diaspora or Incandescence, or because the science-fictional ideas are so overwhelming that a weak human story doesn’t hurt, as in Quarantine. But in Zendegi something more is needed, and Egan fails to deliver.

For example, in order for the plot to work, Martin needs to have a genuine worry about Omar bringing up Javeed. This is covered by a couple of scenes in which Omar makes racist and sexist remarks. But these scenes stick out badly as having been inserted for this purpose: they don’t arise naturally from Omar’s character.

For a second example, consider Nasim’s change of heart near the end, when she is persuaded that it is cruel and immoral to run her semi-sentient Proxies as if they were insentient automata. All sorts of human factors should make this decision very difficult for her: she has invested her life’s work in this project; her career depends on the success of the Proxies, as does the quality of the game worlds in Zendegi; there’s the moral imperative from Martin’s dying wish; and not least the fact that she is being blackmailed into this change by saboteurs. She seems to change her mind much too easily for this scene to be psychologically plausible.

Another area where I think Egan has missed a trick is the treatment of the games. Martin and Javeed play many games that run on the Zendegi platform, of which four are described in detail. With my game-developer hat on, I have to say that the Zendegi games, as portrayed, seem feeble, bland experiences, more like simulations than games proper. (This is a common fault of video games as presented in novels, though, and I’m sure writers have a corresponding complaint about the quality of dialogue and plot in video games. And I think it is insightful for Egan to present the major source of competition in the games industry as being between game platforms, not individual games.)

Nonetheless, the games provide a great opportunity to deepen the novel thematically, by creating correspondences between the ‘real’ world of the novel and the game-worlds. For example, the idea that it’s immoral to treat artificially created conscious beings as slaves could be treated powerfully in a fantasy game. I looked out for correspondences along these lines, and if they are meant to be there, I have to say I missed them.

Perhaps Egan thought that thematic correspondences between the ‘real’ world and the games would be too much of a cliché? (I think they would be justifiable here, because it’s the characters who choose which games to play, so it’s plausible that they might, consciously or otherwise, pick stories which resonate with their situation.) Or perhaps he was keen to incorporate scenarios from the Iranian national epic, the Shahnameh, and nothing therein was suitable?

Egan has always had difficulty in portraying characters whose views he disagrees with: they end up seeming like puppets or strawmen, pure mouthpieces for a viewpoint. This causes trouble in another strand of Zendegi, a mildly satirical look at transhumanism. You can satirise by nastiness or by mockery, but Egan is too nice for the former and not accurate enough a mimic for the latter. The satire ends up feeble, and its targets are unlikely to be much hurt.

Who are the targets of Egan’s satire? Well, here’s one of them, appealing to Nasim to upload him:

“I’m Nate Caplan.” He offered her his hand, and she shook it. In response to her sustained look of puzzlement he added, “My IQ is one hundred and sixty. I’m in perfect physical and mental health. And I can pay you half a million dollars right now, any way you want it. […] when you’ve got the bugs ironed out, I want to be the first. When you start recording full synaptic details and scanning whole brains in high resolution—” […] “You can always reach me through my blog,” he panted. “Overpowering Falsehood dot com, the number one site for rational thinking about the future—”

(We’re supposed, I think, to contrast Caplan’s goal of personal survival with Martin’s goal of bringing up his son.)

“Overpowering Falsehood dot com” is transparently overcomingbias.com, a blog set up by Robin Hanson of the Future of Humanity Institute and Eliezer Yudkowsky of the Singularity Institute for Artificial Intelligence. Which is ironic, because Yudkowsky is Egan’s biggest fan: “Permutation City […] is simply the best science-fiction book ever written” and his thoughts on transhumanism were strongly influenced by Egan: “Diaspora […] affected my entire train of thought about the Singularity.”

Another transhumanist group is the “Benign Superintelligence Bootstrap Project”: the name alludes to Yudkowsky’s idea of “Friendly AI”, and the description echoes his argument that recursive self-optimization could rapidly propel an AI to superintelligence. From Zendegi:

“Their aim is to build an artificial intelligence capable of such exquisite powers of self-analysis that it will design and construct its own successor, which will be armed with superior versions of all the skills the original possessed. The successor will produce a still more proficient third version, and so on, leading to a cascade of exponentially increasing abilities. Once this process is set in motion, within weeks—perhaps within hours—a being of truly God-like powers will emerge.”
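The “cascade” being described, in which each generation improves the very capacity for improvement, is what would make the promised growth super-exponential rather than merely exponential. A toy numerical sketch of the distinction (mine, not Egan’s or Yudkowsky’s):

```python
def cascade(generations, ability=1.0, gain=1.1):
    """Each successor is more capable, and better at building successors:
    the growth factor itself grows, so ability outruns any fixed
    exponential (which would hold `gain` constant)."""
    history = [ability]
    for _ in range(generations):
        gain *= 1.05          # successor improves its own rate of improvement
        ability *= gain       # ...and is therefore more capable overall
        history.append(ability)
    return history

print(cascade(5))
```

The generation-to-generation growth ratio itself increases each step; that compounding of the compounding rate is the whole of the Project’s pitch.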

Egan portrays the Bootstrap Project as a (possibly self-deluding, it’s not clear) confidence trick. The Project persuades a billionaire to donate his fortune to them in the hope that the “being of truly God-like powers” will grant him immortality come the Singularity. He dies disappointed and the Project “turn[s] five billion dollars into nothing but padded salaries and empty verbiage”.

Here’s a summary of the differences between Egan and Yudkowsky:

Likely route to AI? Yudkowsky: a goal-directed system along the lines of traditional AI, but recursively self-optimizing. Egan: simulation of human brains.

Superintelligent AI? Yudkowsky: likely; quantitative differences in speed will lead to qualitative improvements in intelligence, a “difference similar to that distinguishing a human from a chimpanzee, or a dog, or possibly a rock.” Egan: unlikely; there’s nothing an AI can do that you or I couldn’t do with suitable computer support.

Imperative for AI researchers? Yudkowsky: prevent the AI from destroying humanity. Egan: treat conscious AI ethically.

The transhumanist movement? Yudkowsky: it’s irrational not to take it seriously. Egan: it has aspects of a quasi-religious cult.1

There are plenty of fine aspects to the novel. The Iranian setting is unusual in science fiction (in English, anyway). I liked the description of the immersive video game technology. And one passage early on made me laugh, while making a point about software complexity.

[Nasim asks her sysadmin how come her mobile phone is continuously reporting her location to the world via Google Maps.]

“You know AcTrack? […] It’s a reality-mining plug-in that learns about academic networking using physical proximity, along with email and calling patterns. Last semester we put it on everyone’s phones.”

“All right,” she said, “so I’m running AcTrack. Is everyone else who’s running AcTrack appearing on Google Maps?”

“No,” Christopher conceded. “But you know Tinkle? […] It’s a new femtoblogging service going through a beta trial. […] Like microblogging, only snappier. It tells everyone in your network where you are and how you’re feeling, once a minute. […]”

“But why am I running it at all,” Nasim asked wearily, “and why is it telling complete strangers where I am?”

“Oh, I doubt you’re actually running a Tinkle client,” Christopher said. “But on the server side, AcTrack and Tinkle are both application layers that run on a lower-level platform called Murmur. It’s possible that there’s been some glitch with Murmur—maybe a server crash that was improperly recovered and ended up corrupting some files. Tinkle does hook into Google Maps, and though it shouldn’t be putting anyone on the public database, if you don’t belong to any Tinkle Clan it might have inadvertently defaulted you to public.”

  1.  See for example this thread on lesswrong.com.

Update 2011-01-10. There’s some discussion of this review at LessWrong.