- 25 Nov 2024
For more than a decade, the tech industry has been defined by two economic zeros. The "zero interest rate policy" (ZIRP) across the western world saw the price of money plummet, letting startups run at a loss for years and giving investors massive appetite for risky bets that might pay off in a big way. At the same time, the "zero marginal cost" of the software industry gave outsized returns to effort, allowing for situations like WhatsApp: 55 employees serving 420 million users and selling to Facebook for $19bn.
But both those conditions are coming to an end. Governments around the world have raised interest rates in a desperate attempt to keep post-pandemic inflation under control, while the rise of AI technologies threatens the production model that brought the sector to its current dominance. And because of that, the next decade could be very different from the last.
'A ZIRP phenomenon'
Most of the western world didn't actually see interest rates hit zero, but as inflation and growth flatlined after the Great Recession, rates were cut so low as to make no difference. In America, the Federal Reserve cut rates to 0.25% in 2008 and kept them there for seven years, before gradually raising them as high as 2.5% in 2018 and then cutting them back to near-zero in the midst of the pandemic. In the UK, the rate was cut from 1% to 0.5% in 2009, and didn't return to that level for the next 13 years.
An economic interlude: central bank interest rates have two major effects on the economy. On the one hand, they're effectively the "cost of money". If you need cash, you can borrow it, and pay interest on it; if the interest rate is low, you pay less for your money, and can borrow more of it for the same price. On the other hand, they also provide investors with a benchmark "risk-free" rate of return. By lending to a central bank, you are guaranteed your money back, which sets a floor on potential investments. A zero interest rate is a low floor indeed, which leads to investors hungrily chasing riskier bets that could pay off.
In the economy at large, then, low interest rates stimulate greater investment, free up credit and hopefully kickstart an economy in the doldrums.
In the tech sector, though, that general push had very specific outcomes. With low rates of return from conventional investments, the venture capital ecosystem - one of the few legitimate financial products that tries to offer a thousand-fold return on investment - became flush with cash. Yes, the risk was high, but with rates so low it was a risk that was worth taking.
And that flood of inward investment was patient. Rates were low, so it didn't matter if the payoff was a year or a decade away: a company that could promise megabucks in five years' time was far more compelling than one that would simply turn a modest, sustainable profit next quarter.
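That patience falls out of simple discounting arithmetic: the value today of a future payoff is the payoff divided by (1 + rate) raised to the number of years you have to wait. A minimal sketch (the payoff figure and rates are invented for illustration, not from the article):

```python
# Illustrative only: discounting a future payoff at different "risk-free" rates.
# The payoff figure and the rates below are made-up numbers for the example.

def present_value(payoff: float, rate: float, years: int) -> float:
    """Value today of a payoff received `years` from now, discounted at `rate`."""
    return payoff / (1 + rate) ** years

# A risky bet promising $1bn in 10 years' time:
payoff = 1_000_000_000

# Near-zero rates barely discount the future...
pv_zirp = present_value(payoff, 0.0025, 10)   # ~$975m
# ...while a 5% rate shrinks the same promise substantially.
pv_high = present_value(payoff, 0.05, 10)     # ~$614m

print(f"PV at 0.25%: ${pv_zirp:,.0f}")
print(f"PV at 5%:    ${pv_high:,.0f}")
```

When the discount rate is near zero, a dollar a decade away is worth almost a dollar today, which is why "megabucks in five years" could beat "modest profit next quarter".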
The downstream effects of that didn't just change the technology sector. They defined it. Everything from "Blitzscaling" (the Uber-like practice of growing so fast that your competitors simply run out of money and go bust trying to compete with you) to seven-figure starting salaries (as you bid for engineering talent against a pool of competitors who all have access to the same infinite capital as you) has its roots in ZIRP.
And the effects go further still. Facebook's enormous annual profits are in large part because of its enormous advertising revenue, and much of that revenue comes from venture-funded startups paying huge sums to acquire customers at a loss, as they race to scale up.
But ZIRP is over. Interest rates are sky-high and the pool of cash is drying up. We're already seeing some of the short-term effects of this on the industry, in the form of sector-wide layoffs and startups panicking about preserving their "runway", the period of time they can survive without extra investment.
Free sushi lunches? That's a ZIRP phenomenon. Massive discounts for new users? ZIRP phenomenon. Burning money on a metaverse? Definite ZIRP phenomenon. In Silicon Valley, it's even become a vaguely trendy insult. Your pal's not getting as many dates any more? Maybe all those Tinder swipes were a ZIRP phenomenon.
Free as in beer
Then there's the other zero: marginal cost. The marginal cost of a product is the cost of making one more unit. It doesn't take into account expensive fixed costs like your R&D, factories or CEO salary. But in textbook economics, it's the cornerstone of basic theories of pricing and supply and demand.
Again, the simple economics explanation is that the marginal cost sets a floor for prices: if you sell a product for less than it costs to make it, you go out of business extremely quickly. And once you've invested the fixed cost of creating your product, it's always worth selling more of it at any price above the marginal cost. So, in the long run, production expands and prices fall until price equals the marginal cost.
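The textbook logic above can be sketched in a few lines. All the figures here are invented for illustration; the only claim is the structural one from the paragraph, that below marginal cost every extra sale deepens the loss, while above it every extra sale helps recoup the fixed cost:

```python
# A toy sketch of the textbook pricing floor (numbers invented for illustration).

FIXED_COST = 100_000      # sunk costs: R&D, factories, CEO salary
MARGINAL_COST = 2.0       # cost of producing one more unit

def profit(price: float, units: int) -> float:
    """Total profit at a given price and sales volume."""
    return (price - MARGINAL_COST) * units - FIXED_COST

# Selling below marginal cost loses money on every unit, so more volume hurts:
assert profit(1.0, 200_000) < profit(1.0, 100_000)

# Above marginal cost, every extra unit sold helps recoup the fixed cost:
assert profit(3.0, 200_000) > profit(3.0, 100_000)

# At price == marginal cost, volume is irrelevant: you just eat the fixed cost.
print(profit(2.0, 1_000_000))
```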
But software wrecks it. Because the marginal cost of almost anything in the world of software is as close to zero as makes no difference. Signing up for a Facebook account, downloading an app, or reading an article on a newspaper website - all of these things have zero marginal cost.
That means they can be, and frequently are, offered for free, with the fixed costs of production recouped in other ways: frequently advertising, but also revenue streams like donations, merchandise sales or selling customer data on the sly.
And then came AI. There's a lot to be said about the rise of generative AI like ChatGPT and Midjourney, but one of the important undercurrents is that it is meaningfully expensive. The fixed cost of training the models has been well covered, with a GPT-scale AI costing billions to train, but even getting results out of a trained model is expensive, between the electricity required to operate and the risk of congestion in the datacentres.
As a result, a single ChatGPT prompt has been estimated to cost around a hundred times that of a web search, and that was before OpenAI rolled out GPT-4, a substantially bigger model that is correspondingly more expensive to run.
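A back-of-envelope calculation shows why that hundred-fold multiplier matters at scale. Every number below is an assumption chosen for illustration; only the "around a hundred times" ratio comes from the article:

```python
# Back-of-envelope only: the per-search cost and usage volume are assumed,
# not figures from the article. Only the 100x ratio is taken from the text.

SEARCH_COST = 0.0003              # assumed cost of one web search, in dollars
PROMPT_COST = SEARCH_COST * 100   # "around a hundred times" a web search

daily_prompts = 10_000_000        # hypothetical daily usage
daily_bill = daily_prompts * PROMPT_COST

print(f"One prompt: ${PROMPT_COST:.4f}")
print(f"Daily bill: ${daily_bill:,.0f}")
```

At these assumed numbers, costs that round to nothing for a search engine become a six-figure daily bill for a chatbot, which is the economics pushing the sector towards subscriptions.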
That's why so much of the cutting edge of this field is subscription-based. ChatGPT Plus charges users for access to GPT-4 and still limits them to a hundred queries a day, while Midjourney allows free users just 24 minutes of processing time ever before they're prompted to take out a monthly subscription starting at $10.
If you want to offer generative AI to your users, in other words, you have to charge them. But that's hard to do: we've spent a decade expecting consumer technology to be free at the point of use, with maybe some fees for bonus features like removing adverts.
Unlike ZIRP, the death of zero marginal cost isn't guaranteed. There's a push to run some cutting-edge AI "on-device", slimming models down to the point that they can use the powerful processors in an iPhone or laptop rather than relying on expensive datacentres.
But that's a technical challenge, and it seems likely that the most powerful AI models will always be those hosted centrally and costing huge sums to run.
We can't know for sure what the next decade holds, and it's tempting to think that the massive economic shift from the death of the two zeros will be rendered moot by the even bigger technological one from the rise of AI. But I'm not so certain. The shape of the last tech boom was fundamentally set by these two economic facts: what's the next one going to look like without them?
If you want to read the complete version of the newsletter, please subscribe to receive TechScape in your inbox every Tuesday