Far West, artificial intelligence and philosophical reflection: Westworld

Westworld, the new HBO show written by Jonathan Nolan and Lisa Joy, premiered on October 2nd, 2016, reaching 3.3 million viewers and thus breaking Game of Thrones' premiere record (2.2 million viewers).

What’s Westworld? Westworld is a massive western game with a huge map and a hundred different storylines. What’s so special about it? It’s real. And by real I mean physically concrete. The players, also known as Guests, interact with Hosts, androids specifically programmed to imitate human behaviour but ultimately designed to repeat the same actions over and over again, totally forgetting what they did the previous day. Since they are not human, the Hosts can be used by the Guests as they please, or killed, without any legal consequence. On the other hand, humans can’t be harmed by the Hosts, so the Guests are totally free to do whatever they want.

Westworld has the potential to be a great show, and not only because of this basic idea, inspired by the original film Westworld (1973) directed by Michael Crichton, but also because of its great story development – which lets us follow different characters both inside and outside the park – its exquisite music, and the deep ethical and philosophical questions that constantly arise, especially from the point of view of the Hosts’ creators. It is certainly difficult to weave so many reflective passages into the narration, but the writers manage to hold the audience’s attention thanks to tense dialogue and action scenes wisely distributed in between.

And that’s one thing I love: the show introduces complex themes (even in just the first two episodes!) in an engaging way. “Some people choose to see the ugliness in this world. The disarray. I choose to see the beauty. To believe there is an order to our days, a purpose,” Dolores, one of the Hosts, repeats many times, prompting us to wonder about two interesting points.

First of all, the Hosts don’t just have knowledge of simple everyday things; they are also given a philosophical outlook on life in general. Dolores’ perspective is thus very optimistic, but we know that in the end her life has no meaning at all, because she is no more than an object the Guests can play with. Is it really like that, though? Before being decommissioned, her “father”, another Host, started to realize how things really were after he found a photograph a Guest had lost in the park, and he managed to say a few words to her: “These violent delights have violent ends” (quoted from Act 2, Scene 6 of Shakespeare’s Romeo and Juliet). At first she believes he’s just mad, but then she starts having visions that bring back memories from her previous “lives”. When she says the same words to Maeve, another Host, Maeve experiences the same thing, though she is also able to recall her memories as dreams. Maeve plays the part of the brothel keeper and has her own view of life: she seeks only pleasure and jokes about her “little voice” – a clear reference to her conscience – which left her when she started a new life in the new world.

However, none of these anomalies would have happened if one of the creators, Ford, hadn’t added a new script to the Hosts’ update. Normally, as Theresa, the head of Quality Assurance, says, “the Hosts are supposed to stay within their loops, stick to their scripts with minor improvisations”, but now they have small programming quirks, intentionally left in, that make them perform little gestures – the “reveries” – linked to their memories. They therefore have a sort of subconscious, even though they shouldn’t remember exactly what happened to them. Ford wants to make the Hosts as lifelike as possible and sees the “reveries” as an experiment to see how they evolve. Bernard, a programmer who didn’t know about it, thought someone was sabotaging the Hosts. “The simplest solution”, he says, but Ford replies, “Ah. Mr. Occam’s razor. The problem, Bernard, is that what you and I do is so complicated. We practice witchcraft. We speak the right words. Then we create life itself out of chaos. William of Occam was a 13th century monk. He can’t help us now, Bernard. He would have us burned at the stake.” Where is all this going? We have to wait and see.

One more consideration: this is one of the few examples of films and TV series where AI doesn’t strictly follow the Three Laws of Robotics. Actually, the point is not entirely clear: there is a script that should prevent the Hosts from hurting living things, but they can’t tell themselves and humans apart, and in some narratives they shoot each other and defend themselves when attacked, to make the “game” more challenging. Humans, however, can’t be harmed, and it seems that weapons inside Westworld have no effect on Guests: they are hit but not injured. Aside from that, the difference between Hosts and Guests is reduced almost to nothing. Anyway, as a Host says when asked if she’s human, “Does it matter if you can’t tell?”.

Westworld’s first two episodes really surprised me. If the next ones keep up this level, I would say this is one of the best TV series ever. But will they?

Amedeo Zorzi
