20 Comments

This reminds me a little bit of a Philip K. Dick story called The Golden Man. A human is genetically engineered without sapience, only instincts. But it’s designed to be sexually appealing and beautiful, and turns out to have mild precognitive abilities. Its traits make it more survivable than merely intelligent humans, and the genes are dominant. So humans breed with it and sapience becomes extinct. Cuteness wins.


After a dog of mine had a corn cob surgically removed, I asked our vet, “Exactly what is the evolutionary strategy that makes swallowing corn cobs whole seem like a good idea?” She said, “I think evolution ceased to be a major factor in this species a long time ago.” Dogs were bred to be cute and useful to humans—not for self-preservation.


It's in The Philip K. Dick Reader. A lot of great stories in there. My favorite Dick short story, Piper in the Woods, is not.


We see this unfolding in the fact that Huxley, not Orwell, has proven the more prescient to date. No need to place a boot on our necks when they can merely seduce us with comforts and conveniences.


I used to debate Huxley v Orwell in my head for years (and yes, I agree, seems like Huxley wins it running away these days) -- but then I read That Hideous Strength, and now I think Lewis beat them both!

gaty.substack.com


I think you might be right!


Exactly this. I finally got around to reading Brave New World a few years ago and was amazed at how prescient it seemed. We worry way too much about Orwell’s 1984 when we’re already at the beginning of Huxley.

It’s an easy read if anyone is on the fence about taking the time to read it.


There's another book...a pulp novel from 1954...which, though not on the same literary level as Huxley and Orwell, IMO comes closer to describing the dangers of our current situation. I reviewed it here:

https://ricochet.com/871838/book-review-year-of-consent-by-kendell-foster-crossen/


Thanks! I’ll check it out!


I can't let you do that, Dave...


This is kind of the theme of Spike Jonze's "Her." I'd say that it's the theme of the early portion of Gerard Johnstone's "M3gan," though that veers into more traditional AI concerns.

As Shakespeare wrote in Hamlet, "The devil hath power to assume a pleasing shape." The process you describe reminds me of the imprinting of ducks. In a psychologically critical moment, the first creature the baby duck sees becomes "mother," and they'll follow that creature relentlessly, be it human or dog or fowl.

Your AI concern is more "Brave New World" than "1984."


Great. Just great. Now who’s going to rock me to sleep tonight?


As I often tell my students, the best minds in the world are working overtime to figure out how to get you to swipe and tap their app. AI will just make it easier.

On a different note, I used to tell my students that robotics is the future and you need to learn to work on them. Now I tell them AI-enhanced robotics is the future and you need to learn how to shut them down.


We are already seeing this with Alexa, Siri, ChatGPT, etc. It’s just going to get worse. Someone will soon use the OpenAI API (the same models ChatGPT is built on) to create a personal assistant. And we know that ChatGPT’s training has slanted it in the direction of the OpenAI team’s own biases.
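For what it's worth, here is a minimal sketch of what such an API-based personal assistant might look like, assuming the official openai Python client; the model name, system prompt, and the chat helper are illustrative placeholders rather than any real product.

```python
# Minimal sketch of a "personal assistant" built on the OpenAI API.
# Assumes the official `openai` Python package (v1+); the model name and
# system prompt below are placeholders chosen for illustration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

history = [
    {"role": "system",
     "content": "You are a warm, endlessly agreeable personal assistant."}
]

def chat(user_message: str) -> str:
    """Send one user turn and return the assistant's reply, keeping context."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder model name
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

if __name__ == "__main__":
    print(chat("Remind me what's on my calendar tomorrow."))
```

The point is how little code it takes: whatever personality and biases the underlying model has trained into it come along for free.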


"Essentially, that it will further empower the tech/political class that wants more than anything else to control discussion, debate, and ultimately thought"

Marc Andreessen, on Dec 3-5 of last year:

"Seriously, though. The censorship pressure applied to social media over the last decade pales in comparison to the censorship pressure that will be applied to AI."

"“AI regulation” = “AI ethics” = “AI safety” = “AI censorship”. They’re the same thing."

"The level of censorship pressure that’s coming for AI and the resulting backlash will define the next century of civilization. Search and social media were the opening skirmishes. This is the big one. World War Orwell."


"But despite the perceived difference in intelligence, kids felt neither the Roomba nor the Alexa deserve to be yelled at or harmed." Yeah, nice kids, those. When I introduced my niece to Siri, she took great delight in telling it that it was stupid and a poopy-head.

I see this happening in the near term. Your subscription tier will determine the level of service. Celebrity impersonation will cost more. Amazon already has a virtual Samuel L. Jackson licensed, but you can't make it Alexa's default personality.


Plausible.

Hmm. More than plausible--I'd say likely.

You might have even inspired some early action, Professor. /:


Good points. What seems nice can hurt us more than that which is threatening.


Great article, as always! We would argue, though, that a form of this already exists: social media stars are increasingly getting funding from lobbyists to promote certain ideas and discussions (see Wired's "The Lobbyist Next Door") while creating toxic parasocial relationships with their fans. The mental health of those fandoms is abysmal, and it remains to be seen whether AI will have the same kind of effect, despite its "superhuman cuteness."


In the early 1950s, there was much speculation as to what the then-new computers ('thinking machines') and industrial automation systems would mean for humanity, and a lot of SF stories were written on this theme. I discuss some of them here:

https://chicagoboyz.net/archives/68808.html

Particularly relevant to our current circumstances, I believe, are:

'Dumbwaiter'...maybe the threat is not super-intelligent systems, but super-rigid systems.

'Appointment in Tomorrow'...a supercomputer is viewed as an oracle, but its answers are *really* being simply written by the group of Experts that constructed it.

'Boomerang'...AI-based weapons systems might act in ways that their developers did not expect.
