During this period of AI-driven cosmological inflation in tech, there is a market premium on shipping. You’re either shipping or you’re being left behind. S-curves and exponential growth are the order of the day. Lovable is a unicorn eight months after launch. In June 2025, Replit CEO Amjad Masad announced that his company had crossed $100M ARR, up from just $10M at the end of 2024.
Aggression, developer experience and shipping product are everything right now. As an industry analyst it’s hard to keep up with the pace of innovation. As I have said – everything is in play. My colleague Stephen concurs.
Take Google, for example, which is playing the game adroitly and steering into the curve.
AI was always Google’s market to lose. Of all the hyperscalers and major tech vendors, Google was the most closely associated with AI, and felt, with some justification, that it was uniquely positioned to benefit from it. So when ChatGPT dropped and the world suddenly changed, Google ended up on the back foot. OpenAI had built on the Transformer architecture Google had invented (see the seminal paper Attention is All You Need) – productising Google’s invention. The best packager in any tech wave wins, and wins big.
Inventing a technology is of course no guarantee of success in tech. Xerox invented modern desktop computing, only to see Apple and Microsoft dominate the space. IBM invented the relational model, only to see Oracle dominate the market for SQL databases.
So ChatGPT dropping in late 2022 was a burning platform moment for Google. It was Netscape in 1994, or the iPhone in 2007.
Google was on the back foot through 2023, but last year it hit its stride in terms of delivering on the promise of AI, and this year it’s in increasingly good shape. Its current push around its capable Gemini frontier model is a case study in big company aggression. At Google Next in April it had its swagger back. With a set of industry partners it launched A2A, an open, industry-backed protocol for agent-to-agent communication. But it also made it very clear that it was adopting the competing Model Context Protocol. Most importantly in terms of swagger, though, Google was able to lead with Gemini 2.5. A month later, at Google I/O, it had plenty more to say and launch. Two events in the space of just a few weeks, both packed with news.
With the launch of Gemini CLI in June 2025, Google put down a marker – an open source Claude Code competitor, built from the ground up by a small Google team, with a focus on getting to market as quickly as possible. Perhaps just as impressive as the launch itself is the rate of new feature delivery – for example, shipping support for pasting images into the CLI (a favourite feature in coding agents and assistants of my colleague Kate Holterhoff) just a few weeks after the initial launch.
Oh yeah, Gemini CLI is already at 61k GitHub stars and 5.6k forks since launch last month – maybe there’s life in open source yet.
Game on. And then there’s Google CEO Sundar Pichai casually hiring some of Windsurf’s senior staff, including CEO Varun Mohan and co-founder Douglas Chen, and licensing a chunk of its IP in a deal worth $2.4bn to further accelerate its own efforts. This after some outlets had reported that OpenAI had already acquired Windsurf. TL;DR – OpenAI and Meta definitely aren’t going to get all the talent. DeepMind is still an aspirational place to carry out fundamental research in AI. And apparently, it pays well. Google is back in the game. Like I said, swagger.
But what of Amazon Web Services, which has been uncharacteristically uncertain in the face of the AI big bang, shipping enterprise products like Bedrock but leaving developers cold with some of its other efforts? This week it seems to have finally shaken off its shackles. It launched Kiro, a vibe coding tool which attempts to bring back a little software engineering rigour alongside the You Only Live Once (YOLO) vibes, with a spec-driven development approach. Vibe coding meets AI Engineering. Kiro’s reception by developers has been really positive – so much so that the service has been returning errors, and Amazon has had to reintroduce a waitlist to throttle adoption.
Even Corey Quinn is impressed. The RedMonk team has already kicked Kiro’s tyres a fair bit, and Kate likes it enough to be using it most evenings, and coined the phrase Hot Vibe Code Summer accordingly.
Talking of heat, Deepak Singh, AWS VP DevEx & Agents, lit the fire with this product. Having helped drive AWS to success in the container market, he’s now leading efforts here. Intriguingly, Kiro isn’t branded AWS or even Amazon; it stands alone – it is simply Kiro. Amazon is going to markedly increase velocity in building its AI dev tools. Kiro: watch this space.
As I said on LinkedIn this week:
If Amazon Web Services (AWS) plays its cards right with Kiro it’s going to be one of the company’s fastest growing products ever. People like the tool. If Amazon can get out of the way, play the token game adroitly, and scale the service, it will be in really good shape. One way or another this is the most successful developer launch from AWS we’ve seen in a long time. The team deserves a lot of credit.
Selling to enterprises isn’t enough. You’ve got to sell to the builders. Kiro takes that very much to heart.
If you’re wondering about my take on Microsoft in the context of this piece, you should read this post about its Build conference – the TL;DR is essentially that Microsoft is in decent shape, but really needs a moonshot for its own frontier LLM.
To conclude – the industry giants of the cloud buildout may have taken a while to get their acts together and start successfully addressing AI market opportunities, but we’re certainly seeing interesting moves that map to the need for aggressive shipping, with a much stronger focus on developer experience, in the AI cosmic inflation era.
disclosure: AWS, Google Cloud, and Microsoft are all clients.