Growing up, you were probably asked by your parents, “If everyone jumps off a cliff, will you jump too?” whenever you did something stupid. The idea is to appeal to your logic and critical thinking, which should (hopefully) stop you from doing something you know is stupid just because everyone else is doing it.

So if Google’s brilliant new AI tells you to put glue in your pizza, you wouldn’t do it, right? Well, some reporters will do whatever it takes to draw more eyeballs to their articles:

Google AI said to put glue in pizza — so I made a pizza with glue and ate it

I knew my assignment: I had to make the Google glue pizza. (Don’t try this at home! I risked myself for the sake of the story, but you shouldn’t!)

Hey, at least she’s telling us we shouldn’t do it. Not that it’s going to help: some people are just dumb.

She also did the research:

I did use Google to make sure that “nontoxic” glue was indeed semisafe to eat. Google’s AI answer said that small quantities might lead to an upset stomach but not, say, death. That’s good enough for me.

And since it was good enough, she made the pizza and ate it. Google said so.

What does this all mean? For me personally, this means that I’m an idiot who eats glue. But what does it mean for Google and the future of AI-powered search?

I don’t doubt Katie Notopoulos is smarter than she pretends to be in this article. Her article is now trending all over the place, and hey, she even made it to this blog.

It’s time to stop blaming Google for doing exactly what it’s supposed to do (minimum work for maximum profit) and to start applying some common sense. As for tech journalism… well. Tech journalism. What can I say?