If you follow current pop culture, you will know that Rosalía has suffered a disgusting abuse at the hands of a semi-known reggaeton artist who, in an attempt to attract attention, edited photos of the singer to make it look as if she appeared naked. Many of you will remember the period when, every so often, intimate images of celebrities were leaked, a terrifying violation whose seriousness we did not always know how to gauge. In this case, things go a step further, since the artist's body (or the cited body; a body that, in that moment, reads as Rosalía's) has been put on display without her having even participated in the creation of the image. I imagine the feeling of strangeness at seeing herself and not fully recognizing herself; also the feeling of usurpation at discovering herself sexualized, reduced to a joke and displayed before a world eager to consume her. I say that I imagine it but, in reality, I am incapable of reproducing within myself, in my own flesh, the kind of anguish she must have experienced. Empathy only reaches so far: pain is, fortunately or unfortunately, non-transferable.
According to what he himself has recounted (I refuse to name him and give him what he is looking for), the tool used for such a vulgar misdeed was Photoshop; nothing too sophisticated, then, despite its effectiveness. That said, as soon as I heard about the incident, one thing came immediately to mind: artificial intelligence and, specifically, the danger its development poses to women. You do not need to have read Simone de Beauvoir to see that, although progress brings advances in rights and freedoms, it also brings subtler and more refined ways of hurting vulnerable people. Science and technology are subject to a context; they are the product of a context and are used within that context, and, since our context is misogynistic, it would do no harm, amid the excitement and revelry over how ChatGPT responds to us, to prepare ourselves for the possible misogynistic drifts of its application.
Last December, Manticore, a film by Carlos Vermut that deals with the limits (legal and moral) of fantasy, premiered in Spain. In the film, the protagonist uses virtual reality to act out a desire of a pedophilic nature, taking inspiration from his neighbor's son (a child who, of course, exists) to design the childlike character that accompanies him in that non-existent world. The debate it raises is this: how do we judge what has not really happened? By now, most of you will have come across a deepfake: one of those videos in which a familiar face (Donald Trump's, for example) says things that the person it belongs to never said. Until a few years ago it was relatively easy to detect errors in facial gestures, small clues that revealed to the viewer that what they were watching was an artifice, but for some time now it has been almost impossible to tell. In Rogue One: A Star Wars Story, Princess Leia appears with the face of a young Carrie Fisher when, in reality, she was played by another actress, the Norwegian Ingvild Deila. Rogue One was released in 2016, and artificial intelligence learns at a fast pace.
I can think of countless areas in which this is going to be a problem. I sense that, to begin with, democratic processes will be profoundly altered, since the already arduous task of identifying the truth will grow more complicated by the day. To continue: I suspect that technological advancement will take particularly disturbing forms in that highly complex arena that is sex (and the sex industry). Which brings me back to the subject at hand: women. If we already struggle to get simple concepts like consent understood, if we already live under the threat of spiteful men disseminating images we sent within a framework of trust and privacy, if a worrying number of breakups end in obsessions that culminate in femicides... if daily life presents obstacles of this nature, how can we not fear the use that will be made of such a powerful tool?
In the dystopia I foresee, an ex unable to turn the page masturbates to a moving body onto which he has superimposed the face of the girlfriend who left him, a teenager sends his friends a fake video of the teacher who failed him, thousands of men act out repulsive fantasies by stealing the faces of the women around them; women who have not consented to be with them but who are bound to them anyway, though not really. I am not a technophobe, and I celebrate innovation, but I am not stupid either. The spell of novelty does not switch off the alarm in the face of what I know all too well. We know how cumbersome legislating is and, given that countless media trials confirm that society often resists adapting to change, I am wary of how quickly we will react to coordinates that escape us. The only thing I can say today is that I hope I am wrong. I will end by returning to the still-warm present to send a big hug to Rosalía.