Today's sharp take: LLMs are coming for deployment pipelines
The next billion-dollar software business is plain-English deployment
Technical breakthroughs unlock downstream startup value. Think of what Amazon and Shopify did for e-commerce: they let anyone sell worldwide without needing to be a tech whiz. We should think of LLMs and coding the same way. As of mid-2024, you can whip up an app in minutes using just plain English. Running locally doesn't equal production-ready yet, but it's been magical enough to turn thousands of people onto coding in just the last few weeks, including my 8-year-old kid.
But running on my Mac isn’t the trick. The real shift will come when we move from prototyping in plain English to building scalable, maintainable software in plain English.
And that is coming. The coding barrier has dropped to nearly zero, and new developers won't have the patience or interest (by and large) to learn to deploy and sustain production code the old-fashioned way. They'll want it abstracted away, just as Vercel abstracted CDNs for developers who don't want to configure edge distribution manually.
Really, we've done this kind of cheat-code simplification for as long as we've had computers, and LLMs are the ultimate cheat code. Builders working in the prototype-to-deploy-and-sustain space have a massive opportunity to reinvent software now. LLMs aren't there yet, but they will be!
Today's sharp thought: we need to project exponentially around LLM opportunity spaces. Just two years ago, the thought of my kid coding like this would have been astonishing; now it's obvious, and we forget how strange it was. If deploying and sustaining production code in plain English seems just as wild, wait a year.