Weekly recap: 2023-03-19

Posted by Q McCallum on 2023-03-19

What you see here is the last week’s worth of links and quips I have shared on LinkedIn, from Monday through Sunday.

For now I’ll post the notes as they appeared on LinkedIn, including hashtags and sentence fragments. Over time I might expand on these thoughts as they land here on my blog.

2023/03/13: Google goes “code red” on generative AI

I get it: Google feels caught off-guard by OpenAI/ChatGPT. But this … sounds like madness to me.

“Google’s Plan to Catch ChatGPT Is to Stuff AI Into Everything” (Bloomberg)

Senior management has declared a “code red” that comes with a directive that all of its most important products—those with more than a billion users—must incorporate generative AI within months, according to a person with knowledge of the matter.

Asking teams to look into generative AI and see where it could help? Perfect. Telling them that they must use it? Not so much.

Every AI professional has seen this before: a company desperate to jam the Hot New Technology somewhere, anywhere, hoping to keep up with the competition. So I’m not exactly surprised to see a company do this with generative AI.

I just didn’t expect Google to be among them.

This approach rarely goes well. Not for AI, not for anything else. Case in point:

Some Google alumni have been reminded of the last time the company implemented an internal mandate to infuse every key product with a new idea: the effort beginning in 2011 to promote the ill-fated social network Google+.

(If you’d prefer the thoughtful path to implementing AI in your company and products, hit me up. We can take the time to map out meaningful use cases: https://qethanm.cc/consulting/ )

2023/03/15: AI is hot … again

I posted a short writeup describing the recent VC interest in AI, thanks to generative tools like ChatGPT.

It’s called “Same name, new face for AI.”

2023/03/16: A subtle lesson: when AI doesn’t say “I don’t know”

There’s a key AI lesson in this article, but it’s subtle.

“How Siri, Alexa and Google Assistant Lost the A.I. Race” (New York Times)

Let’s start with this excerpt:

In contrast [to LLM-based chatbots like ChatGPT], Siri, Alexa and Google Assistant […] can understand a finite list of questions and requests like “What’s the weather in New York City?” or “Turn on the bedroom lights.” If a user asks the virtual assistant to do something that is not in its code, the bot simply says it can’t help.

It’s all in that last sentence. Siri/Alexa/Google have been built to say “I don’t know. I can’t handle that.”

This is … not exactly a problem.

When it comes to AI, this is a feature. This is a Good Thing™!

Why so? As I’ve noted before, ML/AI models have no idea when they are operating out of their depth. They’re going to give you SOME answer, no matter what you’ve asked.

It’s up to you to draw boundaries around them so they can’t run wild. And it sounds like the creators of the popular voice assistants have done just that. Invalid questions don’t make it all the way to the models.
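
The voice assistants show one way to draw those boundaries: gate incoming requests on a known set of intents and refuse everything else before it ever reaches a model. Here’s a minimal sketch in Python – the intent names and regex patterns are hypothetical stand-ins for the far more sophisticated intent classifiers these products actually use:

```python
import re

# Requests we know how to handle. The patterns here are hypothetical;
# real assistants use trained intent classifiers, not regexes.
KNOWN_INTENTS = {
    "weather": re.compile(r"\bweather\b", re.IGNORECASE),
    "lights": re.compile(r"\bturn (on|off) the \w+ lights?\b", re.IGNORECASE),
}

def dispatch(intent: str, utterance: str) -> str:
    """Placeholder for the real handlers (API calls, device commands, ...)."""
    return f"[handling {intent!r} request: {utterance!r}]"

def handle_request(utterance: str) -> str:
    """Route a request to a known handler, or refuse it outright."""
    for intent, pattern in KNOWN_INTENTS.items():
        if pattern.search(utterance):
            return dispatch(intent, utterance)
    # The boundary: out-of-scope requests never reach a model.
    return "Sorry, I can't help with that."

if __name__ == "__main__":
    print(handle_request("What's the weather in New York City?"))  # handled
    print(handle_request("Write me a sonnet about bank runs."))    # refused
```

That refusal branch is the whole point: an out-of-scope request gets a flat “I can’t help with that” instead of a confident guess.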

Digging deeper, this gets at the popularity of LLM-based chatbots: the current design – “ask me anything and I’ll answer!!” – appeals to society’s unhealthy faith in people who never say “I don’t know.” (Western business culture, in particular, worships at the altar of the overconfident. Ask anyone who’s ever been berated for giving a nuanced answer to a question…)

2023/03/17: This week’s newsletter: the Silvergate meltdown

Tuesday’s Block & Mortar newsletter is mostly about the Silvergate meltdown, with lessons from SVB and other bank runs.

https://blockandmortar.xyz/newsletter/046.form-letters-yet-another-meltdown-and-a-cookie-market/

Silvergate was known as a “crypto bank” … but this story isn’t so much about crypto as it is about:

1/ Concentration risk. When one bank is so popular with a single market segment, the bank and that segment are each exposed to any problems in the other.

2/ The illusion that a single dollar can exist in multiple places at once. When enough people finally need that dollar to be in just one place, there’s trouble.