What We’re Reading (Week Ending 06 June 2021)
Reading helps us learn about the world, and it is a really important aspect of investing. The legendary Charlie Munger even goes so far as to say, “I don’t think you can get to be a really good investor over a broad range without doing a massive amount of reading.” We (the co-founders of Compounder Fund) read widely across a range of topics, including investing, business, technology, and the world in general. We want to regularly share the best articles we’ve come across recently. Here they are (for the week ending 06 June 2021):
1. The Cost of Cloud, a Trillion Dollar Paradox – Sarah Wang and Martin Casado
However, as industry experience with the cloud matures — and we see a more complete picture of cloud lifecycle on a company’s economics — it’s becoming evident that while cloud clearly delivers on its promise early on in a company’s journey, the pressure it puts on margins can start to outweigh the benefits, as a company scales and growth slows. Because this shift happens later in a company’s life, it is difficult to reverse as it’s a result of years of development focused on new features, and not infrastructure optimization. Hence a rewrite or the significant restructuring needed to dramatically improve efficiency can take years, and is often considered a non-starter.
Now, there is a growing awareness of the long-term cost implications of cloud. As the cost of cloud starts to contribute significantly to the total cost of revenue (COR) or cost of goods sold (COGS), some companies have taken the dramatic step of “repatriating” the majority of workloads (as in the example of Dropbox) or in other cases adopting a hybrid approach (as with CrowdStrike and Zscaler). Those who have done this have reported significant cost savings: In 2017, Dropbox detailed in its S-1 a whopping $75M in cumulative savings over the two years prior to IPO due to their infrastructure optimization overhaul, the majority of which entailed repatriating workloads from public cloud.
Yet most companies find it hard to justify moving workloads off the cloud given the sheer magnitude of such efforts, and quite frankly the dominant, somewhat singular, industry narrative that “cloud is great”. (It is, but we need to consider the broader impact, too.) Because when evaluated relative to the scale of potentially lost market capitalization — which we present in this post — the calculus changes. As growth (often) slows with scale, near term efficiency becomes an increasingly key determinant of value in public markets. The excess cost of cloud weighs heavily on market cap by driving lower profit margins.
2. Twitter thread that rebuts the “The Cost of Cloud, a Trillion Dollar Paradox” article – Zack Kanter
Excellent *financial* analysis of using *commoditized* cloud infrastructure (vanilla servers). It misses: i) the (long-term devastating) cultural cost of recruiting world-class engineers to do undifferentiated heavy lifting; ii) it’s unfeasible to recreate noncommodity infra. 1/n
On i: saving 50% on COGS sounds great – until you realize that it means recruiting & retaining engineers instead of paying an AWS/GCP invoice. Opportunities to buy technical competence with a credit card are extremely rare; you can’t buy core product competence per API call. 2/n
Every sufficiently-funded software CEO on earth will tell you that their constraining factor is hiring great engineering talent – repatriating commodity servers to save on COGS means increasing engineering headcount requirements, definitionally making the constraint worse. 3/n
It follows that the optimal strategy is to do *the exact opposite* of reducing third-party API COGS: fanatically review labor COGS and shift it to third-party API COGS wherever possible – regardless of cost! You’re effectively buying autoscaling, on-demand top talent. 4/n…
…Part ii [it’s unfeasible to recreate noncommodity infra]: if you look at what your cloud provider is doing for you & your takeaway is “we could do this cheaper ourselves,” then your problem is you’re using the cloud incorrectly by choosing lowest common denominator services. 8/n
Instead of saying “we can run servers ourselves for cheaper,” you should be asking: how can we use AWS/GCP in ways that we couldn’t possibly do better ourselves? This is called “servicefull” architecture – using your provider’s cloud-native services to replace server code. 9/n
If you’re using AWS/GCP to run vanilla servers, you’re building software to work the same way it did when companies ran servers in their office 15 yrs ago. That should be a wake up call about your technology choices – not a call to put servers back in your figurative office. 10/n
3. How the World Ran Out of Everything – Peter S. Goodman and Niraj Chokshi
The most prominent manifestation of too much reliance on Just In Time is found in the very industry that invented it: Automakers have been crippled by a shortage of computer chips — vital car components produced mostly in Asia. Without enough chips on hand, auto factories from India to the United States to Brazil have been forced to halt assembly lines.
But the breadth and persistence of the shortages reveal the extent to which the Just In Time idea has come to dominate commercial life. This helps explain why Nike and other apparel brands struggle to stock retail outlets with their wares. It’s one of the reasons construction companies are having trouble purchasing paints and sealants. It was a principal contributor to the tragic shortages of personal protective equipment early in the pandemic, which left frontline medical workers without adequate gear.
Just In Time has amounted to no less than a revolution in the business world. By keeping inventories thin, major retailers have been able to use more of their space to display a wider array of goods. Just In Time has enabled manufacturers to customize their wares. And lean production has significantly cut costs while allowing companies to pivot quickly to new products.
These virtues have added value to companies, spurred innovation and promoted trade, ensuring that Just In Time will retain its force long after the current crisis abates. The approach has also enriched shareholders by generating savings that companies have distributed in the form of dividends and share buybacks.
Still, the shortages raise questions about whether some companies have been too aggressive in harvesting savings by slashing inventory, leaving them unprepared for whatever trouble inevitably emerges.
“It’s the investments that they don’t make,” said William Lazonick, an economist at the University of Massachusetts.
Intel, the American chip-maker, has outlined plans to spend $20 billion to erect new plants in Arizona. But that is less than the $26 billion that Intel spent on share buybacks in 2018 and 2019 — money the company could have used to expand capacity, Mr. Lazonick said.
Some experts assume that the crisis will change the way companies operate, prompting some to stockpile more inventory and forge relationships with extra suppliers as a hedge against problems. But others are dubious, assuming that — same as after past crises — the pursuit of cost savings will again trump other considerations…
…Just In Time was itself an adaptation to turmoil, as Japan mobilized to recover from the devastation of World War II.
Densely populated and lacking in natural resources, Japan sought to conserve land and limit waste. Toyota eschewed warehousing, while choreographing production with suppliers to ensure that parts arrived when needed.
By the 1980s, companies around the globe were emulating Toyota’s production system. Management experts promoted Just In Time as a way to boost profits.
“Companies that run successful lean programs not only save money in warehouse operations but enjoy more flexibility,” declared a 2010 McKinsey presentation for the pharmaceutical industry. It promised savings of up to 50 percent on warehousing if clients embraced its “lean and mean” approach to supply chains.
Such claims have panned out. Still, one of the authors of that presentation, Knut Alicke, a McKinsey partner based in Germany, now says the corporate world exceeded prudence.
“We went way too far,” Mr. Alicke said in an interview. “The way that inventory is evaluated will change after the crisis.”
Many companies acted as if manufacturing and shipping were devoid of mishaps, Mr. Alicke added, while failing to account for trouble in their business plans.
“There’s no kind of disruption risk term in there,” he said.
4. Can Apple Change Ads – Ben Evans
Apple regards itself not just as a platform provider but as a system provider. Your iPhone is a system, and Apple decides how it works and what developers can do on it, and just as Apple controls security, wireless networking, power management or multi-tasking, it also controls privacy. This year Apple started requiring apps to get permission before sharing information to track users across different sites (‘ATT’), just as the EU and California’s cookie laws have required the same on the web. The main reason to do this tracking is to make advertising more relevant (and therefore more valuable for publishers), and ATT, cookie laws, and Apple and Google’s decision to block third party cookies on the web anyway, in Safari and Chrome, all mean that the foundation of a lot of online advertising has collided with privacy and shattered, with very little clarity on what comes next.
In parallel, Apple has built up its own ad system on the iPhone, which records, tracks and targets users and serves them ads, but does this on the device itself rather than on the cloud, and only in its own apps and services. Apple tracks lots of different aspects of your behaviour and uses that data to put you into anonymised interest-based cohorts and serve you ads that are targeted to your interests, in the App Store, Stocks and News apps. You can read Apple’s description of that here – Apple is tracking a lot of user data, but nothing leaves your phone. Your phone is tracking you, but it doesn’t tell anyone anything.
This is conceptually pretty similar to Google’s proposed FLoC, in which your Chrome web browser uses the web pages you visit to put you into anonymised interest-based cohorts without your browsing history itself leaving your device. Publishers (and hence advertisers) can ask Chrome for a cohort and serve you an appropriate ad rather than tracking and targeting you yourself. Your browser is tracking you, but it doesn’t tell anyone anything, except for that anonymous cohort.
Google, obviously, wants FLoC to be a generalised system used by third-party publishers and advertisers. At the moment, Apple runs its own cohort tracking, publishing and advertising as a sealed system. It has begun selling targeted ads inside the App Store (at precisely the moment that it crippled third-party app install ads that relied on the IDFA), but it isn’t offering this tracking and targeting to anyone else. Unlike FLoC, an advertiser, web page or app can’t ask what cohort your iPhone has put you in – only Apple’s apps, including the App Store, can do that.
So, the obvious, cynical theory is that Apple decided to cripple third-party app install ads just at the point that it was poised to launch its own, and to weaken the broader smartphone ad model so that companies would be driven towards in-app purchase instead. (The even more cynical theory would be that Apple expects to lose a big chunk of App Store commission as a result of lawsuits and so plans to replace this with app install ads. I don’t actually believe this – amongst other things I think Apple believes it will win its Epic and Spotify cases.)
Much more interesting, though, is what happens if Apple opens up its cohort tracking and targeting, and says that apps, or Safari, can now serve anonymous, targeted, private ads without the publisher or developer knowing the targeting data. It could create an API to serve those ads in Safari and in apps, without the publisher knowing what the cohort was or even without knowing what the ad was. What if Apple offered that, and described it as a truly ‘private, personalised’ ad model, on a platform with at least 60% of US mobile traffic, and over a billion global users?
5. Explained: Neural networks – Larry Hardesty
Deep learning is in fact a new name for an approach to artificial intelligence called neural networks, which have been going in and out of fashion for more than 70 years. Neural networks were first proposed in 1944 by Warren McCulloch and Walter Pitts, two University of Chicago researchers who moved to MIT in 1952 as founding members of what’s sometimes called the first cognitive science department.
Neural nets were a major area of research in both neuroscience and computer science until 1969, when, according to computer science lore, they were killed off by the MIT mathematicians Marvin Minsky and Seymour Papert, who a year later would become co-directors of the new MIT Artificial Intelligence Laboratory.
The technique then enjoyed a resurgence in the 1980s, fell into eclipse again in the first decade of the new century, and has returned like gangbusters in the second, fueled largely by the increased processing power of graphics chips…
…Neural nets are a means of doing machine learning, in which a computer learns to perform some task by analyzing training examples. Usually, the examples have been hand-labeled in advance. An object recognition system, for instance, might be fed thousands of labeled images of cars, houses, coffee cups, and so on, and it would find visual patterns in the images that consistently correlate with particular labels.
Modeled loosely on the human brain, a neural net consists of thousands or even millions of simple processing nodes that are densely interconnected. Most of today’s neural nets are organized into layers of nodes, and they’re “feed-forward,” meaning that data moves through them in only one direction. An individual node might be connected to several nodes in the layer beneath it, from which it receives data, and several nodes in the layer above it, to which it sends data.
To each of its incoming connections, a node will assign a number known as a “weight.” When the network is active, the node receives a different data item — a different number — over each of its connections and multiplies it by the associated weight. It then adds the resulting products together, yielding a single number. If that number is below a threshold value, the node passes no data to the next layer. If the number exceeds the threshold value, the node “fires,” which in today’s neural nets generally means sending the number — the sum of the weighted inputs — along all its outgoing connections.
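The node computation described above is simple enough to sketch in a few lines of Python. This is only an illustration of the weighted-sum-and-threshold rule the passage describes; the function name, weights, and threshold values below are made up for the example.

```python
# Minimal sketch of the node computation described above: a weighted sum of
# inputs compared against a threshold. All names and numbers here are
# illustrative, not taken from the article.

def node_output(inputs, weights, threshold):
    """Return the weighted sum if the node 'fires', otherwise None."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    if weighted_sum > threshold:
        return weighted_sum  # the node "fires": the sum goes out along every outgoing connection
    return None              # below the threshold: no data is passed to the next layer

# Example: a node with three incoming connections and hypothetical weights.
print(node_output([0.5, 0.2, 0.9], [1.0, -0.5, 0.3], threshold=0.4))  # fires (prints ~0.67)
print(node_output([0.1, 0.2, 0.1], [1.0, -0.5, 0.3], threshold=0.4))  # prints None
```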
When a neural net is being trained, all of its weights and thresholds are initially set to random values. Training data is fed to the bottom layer — the input layer — and it passes through the succeeding layers, getting multiplied and added together in complex ways, until it finally arrives, radically transformed, at the output layer. During training, the weights and thresholds are continually adjusted until training data with the same labels consistently yield similar outputs.
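To make the training setup concrete, here is a hedged sketch of a tiny feed-forward net in the same spirit: every weight and threshold starts out random, and data fed to the input layer flows through the layers in one direction. The article does not spell out the update rule itself; in practice the hard threshold is swapped for a smooth activation so the weights can be adjusted by gradient descent (backpropagation). The layer sizes, random seed, and example input below are all illustrative.

```python
import random

random.seed(0)  # illustrative; the article only says initial values are random

def random_layer(n_inputs, n_nodes):
    """Each node gets one random weight per incoming connection plus a random threshold."""
    return [
        {"weights": [random.uniform(-1, 1) for _ in range(n_inputs)],
         "threshold": random.uniform(0, 1)}
        for _ in range(n_nodes)
    ]

def forward(inputs, layers):
    """Feed-forward pass: data moves through the layers in one direction only."""
    activations = inputs
    for layer in layers:
        next_activations = []
        for node in layer:
            total = sum(x * w for x, w in zip(activations, node["weights"]))
            # A node "fires" (passes its weighted sum on) only above its threshold;
            # 0.0 stands in here for "passes no data to the next layer".
            next_activations.append(total if total > node["threshold"] else 0.0)
        activations = next_activations
    return activations

# Hypothetical untrained net: 3 inputs -> 4 hidden nodes -> 1 output node.
layers = [random_layer(3, 4), random_layer(4, 1)]
print(forward([0.5, 0.2, 0.9], layers))  # output is meaningless until the weights are trained
```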
6. Ant Group searches for direction in a new era of Chinese fintech – AJ Cortese
When Ant Group, then Ant Financial, launched its payment service Alipay in 2004, it was meant to be a complementary function to improve shoppers’ checkout experience on e-commerce marketplace Taobao. Few could have predicted the main role it would go on to play in the development of mobile payments in China.
Ant Financial was spun off from Alibaba in 2014, with Alipay as its core business. The mobile payment platform generated the majority of the firm’s revenue until 2018. However, as Ant’s business strategy began to shift in 2019 to cultivate different growth engines based on new digital financial services, Alipay’s central role began to diminish, with the e-wallet generating only 36% of the firm’s total revenue in 2020. At the same time, the company’s credit services alone accounted for a whopping 40% of total revenues for Ant Group in the first half of 2020.
In July 2020, Ant Financial was rebranded as Ant Group to better reflect its role as “an innovative global technology provider” for businesses and financial institutions. Ant leveraged Alipay’s 900 million users to create a one-stop marketplace for financial products, including short and long-term credit loans, insurance products, and wealth management offerings. The restructuring and the new offerings were the special sauce that powered the company to within arm’s reach of the world’s largest IPO.
In March 2020, in the lead-up to the IPO, Ant Group’s former CEO, Simon Hu, unveiled a new horizontal strategy for Alipay, looking to expand the range of services available on the platform, from food delivery to travel bookings. Alipay’s Chinese slogan was changed from the mundane but functional “Use Alipay to make payments” to the broader and catchier “Live well, Alipay.”
However, following the stalled IPO and new regulations severely limiting consumer loan products like Huabei and Jiebei—which can no longer be offered as direct payment options—Alipay is back at the center of Ant Group’s thinking. The shift was reinforced by an announcement from the People’s Bank of China (PBOC) in December 2020 clearly instructing Ant Group to “return to its roots in payments.”
Despite the changes, Ant Group’s range of offerings will not be limited to just mobile payments. The company can still offer a wide range of services as long as it stays away from lending. In fact, the firm is actively onboarding service providers, aiming to reach 50,000 in total by 2023, up from 10,000 in mid-2020. This service-oriented approach represents the next logical development in Alipay’s maturation…
…Just this week, Alipay was incorporated into the PBOC’s digital yuan rollout pilot program, which expanded its scope to include private operators like Ant Group and Tencent. Alipay and WeChat Pay are likely to maintain a prominent position when the digital yuan is rolled out at scale.
“It is a misconception that the digital yuan is meant to be a direct competitor to Alipay and WeChat. In fact, the expanded rollout of the digital yuan will allow new industries, payment flows, and data to be digitized in areas like employee salaries,” Turrin explained.
Instead of routing workers’ wages through a bank’s system to then be transferred into a digital wallet like Alipay, salaries could be directly integrated into these wallets using the digital yuan, Turrin said. Going forward, the aim is to facilitate more use cases for the digital currency, which is reliant on consumer platforms like Alipay. “The digital yuan will fail without these types of consumer platforms,” Turrin said.
“It is not a zero-sum game where the central bank’s digital currency takes market share away from the payment platforms. Actually, the digital yuan will enlarge the overall size of the digital payment pie,” he added.
If China’s digital yuan isn’t meant to cut into Alipay and WeChat’s duopoly in mobile payments, the two payment platforms will likely retain and even reinforce their dominant positions, especially as other verticals of their businesses face regulatory challenges.
7. Own The Internet – Packy McCormick
What if I told you about a business with strong network effects and 200x YoY revenue growth that was preparing to offer a 25% dividend and implement a permanent share buyback program? Is that something you might be interested in?
That’s pretty much Ethereum. It’s one of the most fascinating and compelling assets in the world, but its story is obfuscated by complexity and the specter of crypto.
Ethereum is so many things at once, all of which feed off of each other. Ethereum, the blockchain, is a world computer, the backbone of a decentralized internet (web3), and the settlement layer for web3. Its cryptocurrency, Ether (ETH), is a bunch of things, too:
- Internet money.
- Ownership of the Ethereum network.
- The most commonly-used token in the Great Online Game.
- Yield-generating.
- A Store of Value (SoV).
- A bet on more on-chain activity, or the web3 future.
Because Ethereum is so much at once, it’s hard to understand. This post is an attempt to make Ethereum easier to understand. To a group like us, people interested in technology businesses, finance, and strategy, it’s much more fascinating than bitcoin, but that comes with a tradeoff. It’s much harder to grok than bitcoin, and because of that, it hasn’t gotten the mainstream or institutional attention that bitcoin has…
…Ethereum is so much more than a cryptocurrency. It’s a “world computer,” and the “value layer” of the internet. It lets people build apps and products with money baked into the code. If you believe that web3 is going to continue to grow, then you likely believe that over time, Ethereum will become the settlement layer of a new internet. All sorts of transactions, whether they happen on Ethereum, another blockchain, or even Visa, will turn to Ethereum to exchange funds and keep secure, immutable records. A year ago, I wouldn’t have said that.
Disclaimer: None of the information or analysis presented is intended to form the basis for any offer or recommendation. Of all the companies mentioned, we currently have a vested interest in Alphabet (parent of Google), Apple, and Tencent (parent of TenPay). Holdings are subject to change at any time.