The Singularity is coming, and cute cat videos are its herald.
This all became clear to me yesterday when Google’s Assistant placed an offering before me, like a ruthless hunting cat depositing a dead bird on its human’s doorstep.
It was a video, like so. Watch it. It’s only 21 seconds, and you can mute the silly music:
Cute cat video, right?
Here’s the thing. I didn’t make it. I didn’t even ask for it.
Of its own volition, Google Assistant rummaged through the photos I’ve uploaded there over the past few years. Somehow it – they? what mix of human and AI? – decided to make a video for me. Somehow they determined that it should be about cats. Then that shadowy decider not only identified some of my photos as including cats – accurately – but also selected just the ones featuring one particular cat, Hunter. (We have three cats now.)
That Google Assistant-thing then assembled them in a sequence according to some logic, assigning a time for each photo to be shown, and limited the whole lot according to some mysterious rubric. Somehow it paired similar photos, like two of Hunter sprawling over my laptops, or several of him sleeping, all in a row. An appropriate musical track appeared (from where? by whom?). Then the bot placed the results before my sleepy morning eyes.
All I did was share some photos of my cat Hunter over time, and, like Faust accepting a TOS from Mephistopheles, click “accept.” That’s the limit of my involvement and work.
This is how the posthuman era begins. With plenty of fur and meowing.
All right, I’ll get more serious. For a couple of years I’ve been telling people to think about the many ways by which automation participates in the world of digital creativity, and how that’s evolving (here’s a sample). Yesterday’s untitled cat video shows one way this theme plays out: a giant company using AI to remix user-generated content into a coherent artifact. Facebook and Google have been doing this for a few months at least.
So we’re already in the age of posthuman videos being handed to us. That’s interesting.
And unsettling. There’s no transparent logic here, as both Facebook and Google are black boxes on this score. The way they rifle through our content is also discomfiting, to say the least. This use of our content is part of the tech giants’ now-familiar practice of aiming to make money on our uncompensated work. Moreover, as Alan Levine has argued, when a giant and powerful entity does this for us, it might not encourage us to make stuff on our own.
Are there any advantages to this? If we had more control over such Assistants, we could point them in directions to aid our creativity, as some cross between collaborator and tool. I could, say, ask Alexa to whip me up a music video based on CC-licensed photographs about driving in winter, then work with the results. Or I could benefit from what Google, Facebook, et al. discover when they sift through my various digital outputs; that’s akin to what Meta is trying to do with the science research corpus. What connections or ideas will the machines glimpse that I haven’t yet perceived?
Let’s look ahead a few years, and assume that such AI creative services keep improving. What impact will that have on humans? They could wow us into passivity, like broadcast media did all too often in the 20th century. Or perhaps they’ll spur us to make more stories and art on our own.
From a different, gloomier angle, consider the postmortem issue. After my cats pass (they are already ten years old), will Google Assistant or its successors realize this? Will it seek to cheer me up with memories by creating more elaborate videos and stories? Or will they offer a new service, an emulation of the cat with which I can interact as a way of mourning? Already people are exploring after-death emulations of humans for this purpose (here’s the most powerful fiction about it I’ve come across). Beloved pets might be easier to simulate.
These are still early days, and the age of emulation looks to be far off. But the artificial creative wave is rising, and we need to think hard about it now.