
Image and identity

Is your image your identity? Is your identity just what other people see, whether in real life or in media? Or is your identity somewhere else? Is your image more a kind of "tool" curated by your (mostly hidden) inner identity? Do you have an image? Do you feel bad if you don't?

Do you control your image?

As we just saw, identity in the sense of "image" is a thing that digital technology makes easy to simulate. Other people mainly know our identity as an image, a voice, mannerisms - outward manifestations that they experience through their senses. These can be recorded, edited, and even simulated for our online identity. If we become an online presence, many people may only ever know us as media images. In the future, because most of us are turning ourselves into media, will our identities become as hard for us to control as Tupac's is now, given that Tupac no longer exists as anything but media? Will an egotistical husband have us all praising our daughter's marriage choice from beyond the grave?

Can you still separate your identity from your profilic public image? Even if you don't personally accept that your image is your identity, will the rest of the world still insist that it is? To the extent that someone's "identity" exists through electronic media and recordings, that identity can now be edited, repurposed, framed, taken out of context, and added to by others, whether or not the person is still alive. This is no longer true only for celebrities, though of course that is where identity-as-image is still most pronounced. But it could happen to you. If you're lucky?

Deep Fakes

Technology that allows someone's image to be appropriated is progressing by leaps and bounds. Deep fakes are made with software that swaps faces and bodies to create videos that appear to show someone doing or saying something that someone else actually did or said. The technology uses artificial intelligence to combine two video sources. An estimated 95% of existing deep fakes put a famous person's head on a porn performer's body. So you could watch "Rihanna" do porn, for instance. The technology is now also used on YouTube for humorous parodies, but it could presumably become common in advertising, in movies, and for dangerous political purposes.

As the Belgian developer says in the news feature, people have become worried about the use of deep fakes for political propaganda, for instance making Justin Trudeau seem to say or do things in a video that he never said or did. As these technologies mature, it will presumably become harder to use video footage as evidence in a court of law (or the court of public opinion), since it could so easily be faked. The developer also discusses the considerable work being done on technology to detect these fakes, so that we can know when video footage is a true representation of something that actually happened.

As mentioned earlier, deep fake technology was originally developed for, and is still most often seen in, pornographic "entertainment products." In a TED talk from 2019, Mitali Thakor talks about how this is being treated as relevant only now that it might be used for political purposes, and how the way people are trying to combat it is by creating more technology to expose when something is a fake. With a "skill-thinging" focus, the practice of faking people's identities - which originated in porn - is not debated from a moral and political perspective, but only a technical one. These "entertainment products," whether comedic or pornographic, are treated as harmless fun, and the process is apparently not technically illegal in most contexts:

On YouTube and TikTok you can find lots of funny deep fake videos – Elon Musk as a baby, etc. – presented as mere comic entertainment. The legalities around this kind of identity appropriation are unclear, and these "mashups" – like most of the appropriation that currently goes on online – are largely tolerated. Tom Cruise is still alive (unless he's just a hologram), so at least he has some recourse if he is not happy with being face-swapped. Dead people, however, should arguably be left to rest in peace. Their identities have been finalized, and whatever truly made them them is gone.

Much of the general public has come to see all media as entertainment, and therefore wants to shrug off this type of "identity theft" as meaningless fun. There may also be a tendency to see celebrities as "asking for it" by being famous: if they want to be famous, they should expect to have their images misappropriated. And finally, there is a tendency in our culture to treat women in particular as images, objects, and commodities, not as people. To concentrate on the freedom the consumer has to imagine they're watching a celebrity crush have sex, as Thakor says, is to ignore the human rights violations suffered by the people (even if they are performers) whose images are being appropriated for your pleasure. And remember – as Thakor points out – it isn't only celebrities who are vulnerable to deep fakery. The porn actors are also being appropriated, presumably against their will and, in many cases, without remuneration.

And you could be next. If you have shared enough images of yourself via social media, you may find yourself face-swapped onto a porn star, or into some other situation you wouldn't have opted for. Among the technologies currently being perfected is a kind of couples' "deep fakes" platform called DaF Masking (DaF stands for Dreams and Fantasies). I'll talk about it more in the next lesson, on sex.

All of these technologies call into question whether there is an important difference between a person's identity outside the mediated realm of profilicity and the media images of that person, which can be simulated or edited by others. Some would say that the integrity of human identity, to the extent that it is created by others' perceptions of us, is thus in danger of being undermined by fake media that is as realistic as "real" media, including, of course, the fake (or curated) media we create of ourselves.

And that may include you, even if no one cares about seeing you in a porno. Imagine you are 90 or 100 years old and pass away. Your bereaved family orders a holographic re-creation of you, so that they can still keep you around. You are gone, but you continue to grow and change – as a virtual re-creation. Is this a form of appropriation that needs more individual and perhaps legislative attention? Do you have no control over your own image? Are individuals basically reducible to media? Are you an image? Do you consider your self a media construct? Now that's hyperreality.

