Noah Brier | April 6, 2023

The AI Overconfidence Edition

On code, software, and predictions

Noah here. With all the AI stuff I’ve been doing over at BrXnd, I’ve been trying to keep a bit of church-and-state separation and steer clear of the topic around here. But then last week, at least ten people sent me this SK Ventures piece, “Society's Technical Debt and Software's Gutenberg Moment.” The article’s thesis is that software production is ripe to be fundamentally changed by the emergence of AI in the form of Large Language Models (LLMs), which will collapse the cost of production. From the piece:

For now, however, let’s turn back to software itself. Software is even more rule-based and grammatical than conversational English, or any other conversational language. Programming languages—from Python to C++—can be thought of as formal languages with a highly explicit set of rules governing how every language element can and cannot be used to produce a desired outcome. Programming languages are the naggiest of grammar nags, which is intensely frustrating for many would-be coders (A missing colon?! That was the problem?! Oh FFS!), but perfect for LLMs like ChatGPT. 
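The “grammar nag” point in the quote is easy to see in practice. Here’s a minimal Python sketch (the function name and strings are my own, purely illustrative) showing how a single missing colon makes otherwise-sensible code fail to parse at all:

```python
# Two versions of the same one-line function. The first is missing
# the colon after the parameter list -- the classic "grammar nag".
bad_code = "def greet(name)\n    return 'hi ' + name"
good_code = "def greet(name):\n    return 'hi ' + name"

def parses(source):
    """Return True if the source text parses as Python, False on a SyntaxError."""
    try:
        compile(source, "<example>", "exec")
        return True
    except SyntaxError:
        return False

print(parses(bad_code))   # False: one missing colon and the whole thing is invalid
print(parses(good_code))  # True
```

Natural language tolerates a dropped comma; a formal grammar rejects the entire program over one. That all-or-nothing strictness is exactly what makes code such a well-behaved target for an LLM.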

Why is this interesting?

The implications here are pretty massive. The world will change dramatically if, all of a sudden, the cost of producing software drops to almost nothing. Even more so if, as many people believe, the ability to make it expands to everyone with a ChatGPT account. But I’m much less convinced that’s what’s happening here. For one, that’s not what the tools are good at today. 

We are experimenting with running some weekly classifieds in WITI. If you’re interested in running an ad, you can purchase one through this form. If you buy this week, we’ll throw an extra week in for free on any ad. If you have any questions, don’t hesitate to drop a line.

There are fewer than 10 tickets left for the BrXnd Marketing X AI Conference. If you haven’t got one yet, get yours today.

I know it’s early days, and it’s very possible I’m displaying a complete lack of imagination. But when I use these tools, particularly the coding ones, I don’t see something that magically gives development abilities to folks with no technical expertise. I’m probably in the 90th+ percentile of technical knowledge if you took all humans, but if you narrowed it down to just software developers, that number drops sharply. For people like me, this is amazing: a true superpower. It helps me with syntax, makes me significantly faster, and now, with the addition of GPT-4, it gives me the confidence to try new things. But I know where to start. I can and have done this independently and can guide, debug, and ask the right questions. (There’s a whole other edition to be written about how creating software is a social problem, not a technical one.)

Again, it’s very possible the tools will evolve immensely, but it’s not the thing I see today. I keep coming back to this Tweet from Miles Brundage, who does policy at OpenAI:

I’m spending tons of time with this tech, and while it doesn’t give me much in the way of foresight (it’s all moving so fast, and it’s so fun to play with, that it’s hard to step back and predict anything), it does make me feel like I have a well-tuned overconfidence detector. (NRB)

Thanks for reading,

Noah (NRB) & Colin (CJN)

Why is this interesting? is a daily email from Noah Brier & Colin Nagy (and friends!) about interesting things. If you’ve enjoyed this edition, please consider forwarding it to a friend. If you’re reading it for the first time, consider subscribing.

© WITI Industries, LLC.