What We’re Reading (Week Ending 28 August 2022) - 28 Aug 2022
Reading helps us learn about the world and it is a really important aspect of investing. The legendary Charlie Munger even goes so far as to say that “I don’t think you can get to be a really good investor over a broad range without doing a massive amount of reading.” We (the co-founders of Compounder Fund) read widely across a range of topics, including investing, business, technology, and the world in general. We want to regularly share the best articles we’ve come across recently. Here they are (for the week ending 28 August 2022):
1. Once Science Fiction, Gene Editing Is Now a Looming Reality – Katie Hafner
Ask any expectant couple what they hope their baby will be, and one answer is likely to be “healthy.”
But one gene gone awry can imperil a child’s health, causing serious disease or a disability that leaves one more susceptible to health issues. With advances in gene-editing technology, though, biomedicine is entering an uncharted era in which a genetic mutation can be reversed, not only for one person but also for subsequent generations.
Public debate has swirled around genetic engineering since the first experiments in gene splicing in the 1970s. But the debate has taken on new urgency in recent years as gene modification has been simplified with CRISPR (short for Clustered Regularly Interspaced Short Palindromic Repeats) technology.
Scientists have compared the technology to word processing software: It acts like a cursor placed next to a typo, capable of editing a gene at a level so granular it can change a single letter in a long genetic sequence.
While still highly theoretical when it comes to eliminating disabilities, gene editing has drawn the attention of the disability community. The prospect of erasing some disabilities and perceived deficiencies hovers at the margins of what people consider ethically acceptable.
“People are understandably very scared of it, of the many different roads it could take us down as a society,” said Meghan Halley, a bioethics researcher at Stanford University and mother of three children, including a 5-year-old with a disability. “Broadly speaking, this is always going to be problematic because of the many things that disability means.”
CRISPR is widely seen as holding great promise for treating diseases that until now have been intractable.
There are two main types of CRISPR-based editing. One is the correction of a gene in an individual living with a condition or disease. This is known as somatic cell editing (“somatic” refers to the body). In June, NPR reported that Victoria Gray, a woman with sickle cell disease, experienced a significant decrease in her episodes of painful sickle crises in the first year after gene-editing treatment.
In the future, Ms. Gray’s children or grandchildren may be able to take advantage of the other type of CRISPR “fix”: an edit to the human germ line. This involves making changes to a fertilized egg that not only last through the life of an individual but also are passed on to future generations.
This type of “inheritable” gene editing is inapplicable to conditions like autism or diabetes, in which the hereditary component is caused by many different genes. But it is suited to disorders caused by variation in a single gene. Sickle cell disease fits into that category, as do cystic fibrosis and Duchenne muscular dystrophy.
Yet bioethicists point out that inheritable gene editing raises large societal questions, given the dire consequences of an error, as well as the ethical questions that arise at the prospect of erasing disability from human existence. There is also concern that gene editing for health reasons will be out of reach for many because of its cost…
…The debate grew more heated in 2018, after a scientist in China announced the birth of the world’s first gene-edited babies — twin girls — using CRISPR to give them immunity to H.I.V. The announcement generated outrage around the world. In December 2019, a court in China sentenced the scientist to three years in prison for carrying out “illegal medical practices.”
Last year, a group of scientists from seven countries called for a global moratorium on changing inheritable DNA to make genetically modified children.
Professor Halley’s middle child, Philip, was born with multiple anomalies of his gastrointestinal system. In addition, Philip has had a succession of health complications, including a stroke just before age 2, leading his doctors to suspect a still-unidentified genetic disorder.
Families like hers, Professor Halley said, offer good examples of “very ripe cognitive dissonance” around the topic of inheritable gene editing. As a mother, she said, she would do anything to prevent the pain her son has been through.
Yet she is aware of the inconsistency between that desire and her unwillingness to “do anything that would take away him, take away who he is,” she said. “And he is who he is partly because of the challenges he has faced.”…
…That ambivalence toward gene editing was reflected in a 2018 Pew Research Center survey, which found that a majority of Americans approved of gene editing that would result in direct health benefits while they considered the use of such techniques to bolster a baby’s intelligence to be going “too far.”
The survey also found that the biggest worry was that gene editing would be available only to those who could afford it.
“That’s a critical ethical issue,” Dr. Marson said. “We have a real responsibility when we develop these technologies to think about how to make them scalable and accessible.”
2. Mark Tomasovic – ChargePoint: Leading the EV Charge – Jesse Pujji and Mark Tomasovic
[00:06:27] Jesse: Talk about how they evolved with the industry. It’s somewhat of an obvious business model, electric cars on the road. Something needs to charge them. Let’s make the stuff that’s going to allow them to be charged all over. But how has the industry evolved? What are sort of the important pieces to understand, if you’re going to get into a business like this?
[00:06:43] Mark: If we go back to that 1% number, 1% of the total number of cars on the road today are electric. What that means is there are 2 million electric vehicles in the US. And one in every 15 cars sold in the US is an electric vehicle, so about 7% of all cars sold today in the US are electric vehicles. But what we’ve seen historically in other countries, for instance in Norway, is that when a country reaches 5% electric vehicle adoption in new sales, a tipping point occurs and adoption becomes exponential. And we think today we’re at that tipping point. And just to put it in context, in 2018, there were 12 car commercials that ran during the Super Bowl. None of them were for an electric vehicle, but this year there were nine car commercials that ran during the Super Bowl, and seven of those nine commercials featured an EV.
So it’s still pretty early in terms of the overall relative market share of EVs versus combustion engine vehicles. But we think over the next five years, the US EV growth rate will be a 40% CAGR, and we’re going to roughly 4x the total number of EVs on the road by 2027 versus today. Going back to your question on what’s causing EV charging infrastructure adoption: a lot of that adoption is based on the improvement of battery technologies, which is decreasing the cost of electric vehicles, increasing their range, and making them broadly more affordable and more attractive to the average consumer.
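As a rough sanity check on those growth figures, here is a minimal sketch of how the compounding works. It assumes the 40% CAGR applies to the total fleet of EVs on the road (the transcript does not spell out whether the CAGR refers to sales or the fleet):

```python
# Rough check: how a 40% CAGR compounds into a fleet multiple.
# Assumption (not from the transcript): the CAGR applies to the total
# number of EVs on the road, starting from the ~2 million quoted above.
evs_today = 2_000_000
cagr = 0.40

for years in (4, 5):
    fleet = evs_today * (1 + cagr) ** years
    print(f"{years} years at 40% CAGR: {fleet / 1e6:.1f}M EVs "
          f"(~{fleet / evs_today:.1f}x today's fleet)")

# 4 years -> ~7.7M EVs (~3.8x), roughly the "4x by 2027" cited above
# 5 years -> ~10.8M EVs (~5.4x)
```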
[00:08:19] Jesse: And the infrastructure, who are the players in all of this? Not just competitors, but even the value chain a bit. It seems like ChargePoint has the vast majority of market share, but help us break it down a little bit to better understand it.
[00:08:30] Mark: There’s about four main players across the whole EV charging infrastructure layer. And the first is the equipment supplier itself. So the first is people like ChargePoint, that are the hardware OEMs, or the OEMs at the physical charging infrastructure. Their role involves engineering and manufacturing of the chargers. And we think of this space as a pretty competitive space. So we expect over time that margins in the hardware OEM category will decrease, as the hardware becomes more commoditized. And so these suppliers will have to find some way to differentiate, which could involve having yet unique, go to market motion, or unique partnerships, or involve selling proprietary software systems to run their chargers. And they’ll have to figure out ways to be cost efficient in how they source and manufacture their products. The second important part in the value chain is that software layer. As I mentioned, one way that the equipment suppliers could differentiate, is by offering software systems to run their chargers. And these software systems provide capabilities like payments and the ability to lock and unlock charge points and see into ChargePoint performance and even manage electrical load on the chargers themselves.
And so a lot of the hardware providers, the OEMs of the equipment, are really good at building hardware, but they’re not really good at building software. So there’s this whole second ecosystem of software providers that can work with certain types of hardware, or across all different hardware types. They’ve really established themselves as software-only, capital-light businesses that are just the operating system for the charging hardware itself. The third player in the value chain is the installer; think of them as mid- to large-scale electrician shops. Some of them are specialized charger installation companies, or engineering and procurement firms that offer turnkey solutions. This part is super fragmented, with low barriers to entry. They actually do the physical construction and installation. And then finally, the last players are the site owners and the charge point operators. The site owner is typically the one that owns the physical real estate and sells the electricity. And if they don’t want to operate the actual charger itself, then they’ll hire a charge point operator that can monitor the charging status and coordinate maintenance.
[00:10:47] Jesse: How do the auto manufacturers play into this? How do they have relationships with ChargePoint, or what’s their relationship to this world?
[00:10:55] Mark: Yeah, well, auto manufacturers like to stay agnostic, because the goal of the auto manufacturer is to decrease range anxiety across potential customers. And what I mean by range anxiety is the inability to find a charger when you’re traveling across the state or traveling to work. It’s one of the major reasons why people are hesitant to purchase an EV: they feel like, “Hey, if I need to travel a couple hundred miles, I won’t be able to fill up like I do at a traditional gas station.” The auto OEMs like to stay agnostic to the different types of hardware players, because they want to minimize range anxiety and they want to sell more cars. What’s interesting is, ChargePoint has actually been able to capitalize on this by partnering with a couple auto OEMs. So ChargePoint has what’s called an open-wall network. Any auto OEM’s vehicle can charge on ChargePoint chargers, and ChargePoint has partnered with a couple of these OEMs so that you can have the ChargePoint app as part of your car’s dashboard. And so you can control things like payments and reserving parking spots from the ChargePoint app in your car.
[00:12:07] Jesse: Whenever I’m with my dad in his Tesla, he’s searching in Tesla, and is he finding ChargePoint things or they have their own system? And how do you view that?
[00:12:14] Mark: So Tesla is a closed-wall network. If you want to use a Tesla charger, you can find Tesla chargers that only work with Tesla vehicles, to do things like super fast charging, where you can charge your vehicle within 30 minutes. But you can also use Teslas on ChargePoint chargers, for instance, because ChargePoint likes to stay agnostic to the type of auto OEM brand.
[00:12:37] Jesse: They’re saying they’re the Switzerland of charging: “We’ll charge anyone. We don’t care.” And obviously if you’re a gas station or whoever wants to buy these things, you want it to be open to as many people as possible.
[00:12:46] Mark: Exactly, because the gas station will make a margin on the electricity that they sell. And also, in certain instances, if you have a charger at your location, then it becomes a strategic advantage because the folks that are charging can come in and buy your products, whether it’s at a grocery store or a retail location or something like a gas station.
[00:13:07] Jesse: You guys expect this market to be very competitive over time. What do people compete on? What are the big areas and why is ChargePoint winning so far? And why do you think they’ll probably continue to keep winning?
[00:13:19] Mark: When we look across just the overall EV charging infrastructure and landscape, it is a land grab. It’s a physical land grab, because you’re out there and you’re trying to secure these parking lots or these locations before anyone else does. So ChargePoint definitely has the first mover advantage in this land grab. They were the first to market for all intents and purposes. They’re the first public company in this space and they developed significant first mover advantage before many of their competitors. So I would say overall, first mover advantage is a clear advantage in the space because the hardware is getting commoditized. The second advantage is things like being able to offer services that make these charge points relatively simple and easy to maintain with high uptime. So if I’m a charge point owner and I want to have charge points at my apartment complex, for instance, I don’t know how to operate a charge point.
I don’t want to deal with it. I don’t want to schedule maintenance. I want someone else to do that for me. And ChargePoint, because they offer their proprietary app and other services like ChargePoint Assure, or even one called ChargePoint as a Service, can be this fully integrated, hands-off solution, where they say, “Hey, buy our hardware and buy our services, and we’ll take care of it all for you.” So that becomes an advantage, because it becomes relatively low touch for the actual charge point owner themselves. I would say the third is actually partnerships too. So if I’m ChargePoint and I need to acquire site-by-site retail locations, there’s not very much distribution leverage in that. And it becomes really difficult, with high customer acquisition costs, if I need to go around to individual shopping malls or apartment complexes, or whatever it may be, and acquire these individual customers.
But if I can form relationships with the installers, the local electricians, the engineering and procurement firms, for example, and I can have those installers recommend ChargePoint to their networks, well, then I just gained a whole bunch of distribution leverage. So I can have one partnership with one installer, and then let the installer recommend the hardware and recommend the ChargePoint software on top of that. And so, ChargePoint has done a really good job of forming these partnerships within the industry, whether it’s with the auto OEMs themselves or with the installers, or with real estate owners that own massive amounts of real estate across the US…
…[00:19:36] Jesse: How much does one of these things cost?
[00:19:37] Mark: They can be expensive. Let me use this to quickly get into the difference between level one, level two and level three chargers, because they all have varying degrees of cost. Level one chargers are what are called trickle chargers. What they’re used for is primarily charging your EV at home, overnight. So they plug into your standard 120 volt household outlet. They usually require no installation costs, and actually most EVs are sold with one of these level one chargers. They charge at a rate of one to two kilowatts, which means it can often take several days to charge a full car battery. The second type of charger is a level two charger, and these chargers can be found in the home, the workplace or public settings. And if you want to install a level two charger at home, it requires a 240 volt outlet, which is typically what you would need for your washer and dryer.
The level two chargers are more expensive. They’re about $10,000 to $30,000, but they can charge significantly faster than those level one chargers. So the level two chargers can fully charge an EV in between four and 10 hours, and over 80% of public charge points in the US are level two chargers. So they’re great for places where your car is stationary for a while, such as workplaces or parking garages, things of that nature. And then finally, ChargePoint also sells level three chargers, and level three chargers are also known as DC fast chargers. These can charge your car in about 20 to 30 minutes, but the downside is they’re significantly more expensive. So a level three charger can cost over $150,000, and a site upgrade for a level three charger could be over a million dollars. So currently, they only make up about 20% of overall chargers available in the US, but they work well with fleets and they work well in locations where you need a quick charge. So in highway corridors, for example…
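A back-of-the-envelope sketch of why those charger tiers differ so much in practice. The battery sizes and exact power ratings below are illustrative assumptions, not figures from the transcript, and real-world times vary with charging losses and how the charge curve tapers:

```python
# Rough charge-time arithmetic for the three charger levels described above.
# Battery sizes and charger power ratings are illustrative assumptions.
charger_power_kw = {
    "Level 1 (trickle, 120 V outlet)": 1.4,
    "Level 2 (240 V)": 7.2,
    "Level 3 (DC fast)": 150.0,
}

for battery_kwh in (60, 100):
    print(f"\n{battery_kwh} kWh battery:")
    for name, kw in charger_power_kw.items():
        hours = battery_kwh / kw  # naive energy / power estimate
        print(f"  {name}: ~{hours:.1f} h")

# e.g. a 100 kWh pack on a 1.4 kW trickle charger is ~71 h (about 3 days),
# while a DC fast charger brings the same pack down to well under an hour.
```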
…[00:33:14] Jesse: Can you speak a little bit to the regulatory environment?
[00:33:17] Mark: Well, we’ll mainly focus on the US here, but first there’s obviously the infrastructure bill. So in November of last year, the infrastructure bill was passed in the House with $7.5 billion of investment in EV charging over the next five years, which includes $5 billion for highway corridor charging and then $2.5 billion for alternative fueling infrastructure to support Biden’s executive order. But actually, there are also electric vehicle tax credits, which we expect to contribute to the overall adoption of EVs in general. Certain EVs can get up to a $7,500 tax credit if they’re purchased from companies that have sold fewer than 200,000 electric vehicles.
So this will change with the Inflation Reduction Act, but currently it doesn’t apply to Tesla, GM or Toyota. There are EV tax credits available for the other brands, though, where you can get up to $7,500 if you purchase an EV from one of them. And then finally there’s the LCFS, the low carbon fuel standard. It’s an emissions trading scheme enacted in California in 2007. And if a fuel has a carbon intensity lower than certain guidelines, that asset can create an LCFS credit. So ChargePoint can actually create these credits by selling electricity in California, and then those credits can be sold to other regulated parties under the LCFS.
3. Meditations: A Requiem for Descartes Labs – Mark Johnson
Descartes Labs was on a mission to better understand our planet through satellite imagery. To enable such a lofty mission, we built a data refinery, a petabyte scale repository of satellite and other geospatial data combined with petaflop scale supercomputing power. Because of our platform, we attracted some of the best scientists from around the world, often fleeing universities and government institutions, looking for an intellectual island to do impactful science. Our Cartesians built some of the most remarkable demonstrations of technology I’ve ever seen in my career.
Even with almost $100m of invested capital (strangely, Crunchbase doesn’t list the last two rounds), >$200m in total going into the company (revenue+investment), ending 2021 with over $17m in revenue, and multiple 8-figure government contracts, the company was sold in a fire sale.
Writing that sentence is absolutely shocking to me. How is it possible that a company with an incredible team, so many successes, so much revenue, and so much invested into the underlying technology could possibly be worth close to nothing?
Given the wild ride of Covid and the economic rollercoaster ride we’re currently on, it’s easy to dismiss Descartes Labs as a victim of macroeconomic circumstances. I don’t believe that to be the case and will make the argument that there were two main reasons for the mismatch in the actual value of the company versus the price that was paid:
- The company was burning too much cash.
- The sales process (i.e., the process of selling the company) was run poorly.
At fault is the management team, who executed poorly, and especially the board, who knew these facts and chose to do nothing…
…Though I, too, was extremely skeptical of their business plan, I hopped on a plane to New Mexico on Bastille Day 2014 because I had been obsessed with The Manhattan Project since I was a little kid. When I met the cofounders, I was blown away by their brilliance and humility. Immediately after that first meeting, I knew we needed to start a company.
By the end of 2014, we had settled on satellite imagery instead of media search, we created a company, gave it a “temporary” name of Descartes Labs, raised $275k in angel capital by October, signed a deal with LANL, and closed a $3m seed funding with Crosslink Capital in December of 2014…
…And did we ever! After releasing our predictions ahead of a very important US Department of Agriculture (USDA) forecast, we singlehandedly moved the market 3%, thanks to a well-timed Bloomberg article. Even though we moved the price in the wrong direction (2015 wasn’t our most accurate prediction), it didn’t matter. Big agricultural companies who weren’t returning our phone calls started answering and we were able to raise another $5m from AgTech VC Cultivian-Sandbox.
In 2016, we focused on selling our global agricultural yield forecasts. Cargill, the largest agricultural company in the world and largest privately-held company in the US by revenue (~$130b), approached us with a challenge: could we combine our data with their proprietary data to create better models for their agricultural supply chain business. After many months of hard work, Cargill decided to partner with us, becoming both a customer and an investor.
It’s hard to overstate the importance of Cargill to Descartes Labs. Not only did the relationship provide revenue to the company, allowing us to raise a Series B and hire a lot more people, but the thought partnership between Cargill and our team helped us to figure out who we were. They realized before we did that the magic of Descartes Labs wasn’t in a SaaS product to democratize satellite imagery. No, we were a modern AI consulting company (cf., Palantir) who had assembled brilliant minds, built an internal set of tools that gave our scientists access to huge amounts of data, and created a collaborative internal environment to solve complex problems…
…By the end of 2016, we were feeling pretty good about defying the odds of being a New Mexico spinout of a National Lab. We were even profitable in Q4 2016, unheard of for a science startup.
Descartes Labs had become the Miracle on the Mesa.
All the while, a debate was brewing within the company, known as the Descartes Labs Dialectic: are we building a product or a services company?
Descartes Labs crossed $10m of revenue in 2017, typically a massive milestone for software startups. The problem was that 90% of our revenue was in just a handful of accounts. Thanks to a brilliant cofounder in sales and a technical team to back up whatever custom deal was sold, we scored some really big wins for ourselves and for our customers.
The problem was that we hadn’t built any products yet.
Our corn (and eventually soy) forecast was a product of sorts, but we had trouble selling it. Proving that it had “alpha” (i.e., an edge that gives a trader an information advantage) was difficult and the value of a straight forecast was limited. I came to believe that selling data products was the wrong business model for AI startups. (A post from a16z penned by two of their partners in early 2020 is an even better statement of the sentiment.)
Another potential product was the underlying data refinery. Perhaps we were building a platform? We used it to great effect with our customers in building custom models for them. But, the companies that could derive the most value out of our platform were typically in commodities like agriculture, shipping, metals & mining, and forestry. They didn’t have the technical expertise to extract the full value out of the data or computing power within our platform.
We, like many AI companies like Palantir, were a hybrid consulting company: we built a robust platform that we were the best in the world at using and charged our clients lots of money for building unique, extremely valuable, proprietary (read: cannot be sold to others) models. Even if we structured the revenue cleverly by selling our customers a platform subscription and subscriptions to the models we built for them, we still weren’t a SaaS company.
If I could go back and change just one thing, it would be the resolution of the Descartes Labs Dialectic. I would have shut down internal debate in the company from the crowd that thought we should be building a software company. I would have been much clearer in our fundraising decks about what strategy we were pursuing: AI companies can build an enormous amount of enterprise value through specialized consulting contracts.
I succumbed to the market narrative, pushing startup founders to pursue a Software-as-a-Service (SaaS) business model.
To “solve” the Dialectic and package Descartes Labs as a software company, we separated the team into the Platform Team and the Applied Science team. The Platform Team built the tools and the Applied Science team built the models. Our typical customer engagement started off with a pilot project, usually around $100k, and our theory was “land-and-expand” (another popular SaaS trope): grow the small account into a much larger account, paying >$1m / year. Even though we were taking consulting contracts in the short term, in the long term, a product would emerge from the engineering team.
Or, at least that’s the theory on which we raised our $30m series B from March Capital in the Summer of 2017. Series B was another pre-emptive round, raised without me having to go through a full process and shake the money trees on Sand Hill Road. This new capital was intended to grow our sales & marketing team, build out our “product,” and move into a growth stage so we could raise a healthy series C.
Looking back on Series B, I’m conflicted. On one hand, I absolutely believe in taking money when it’s available. In 2017, we were 8 years into a bull run and we wondered how long the gravy train would keep running. On the other hand, when you take a big round, it’s important to be prudent with the cash and expand only when you believe that there is a viable business model. This requires investors and the company to be aligned around what constitutes product-market fit and what signals indicate that the business model is ready to be scaled.
In 2018, we scored a few minor wins, which kept our revenue up and to the right, close to $20m. Thanks to superior performance against our original DARPA contract, we were able to translate that contract into a much larger deal to build a geospatial data refinery for the government, in a contract worth up to $7.2m. We also made a considerable amount of progress booking pilot contracts from $50-$150k, not bad for initial projects. For 2019, our plan was to continue that pipeline of pilots and translate some of those pilots into large, multi-year 7-figure contracts and we’d be flying high.
However, by the end of 2018, we realized that our land-and-expand thesis was encountering roadblocks. We were losing money on pilots (read: negative gross margins) and translating those pilots into larger deals was difficult. Predicting revenue was nearly impossible (read: not a repeatable business model), driven by long lead times and uncertainty around how much value our pilot projects would deliver to the customer.
Things got dicey around our Q4 2018 board meeting, where we presented our 2019 plan. We projected that 2019 would be a 50% increase in revenue year-over-year, given that we hadn’t quite figured out our sales process or product. The board was not pleased; they wanted to see a much steeper growth curve.
Now the Descartes Labs Dialectic reared its head. Our investors wanted us to be a SaaS company with SaaS metrics and SaaS growth. We simply were not. We should have structured our entire business around being a high-end consulting company. Perhaps we wouldn’t have gotten SaaS multiples. Perhaps. Or perhaps we would have focused our energy on what we did best. I’d rather have $100m / year of long-term consulting contracts than burn expensive venture capital on a fantasy SaaS product.
Ultimately we capitulated to the board’s desire for an unrealistic revenue expectation because, in my mind, isn’t that what I signed up for when I raised the money? Unfortunately, it caused the company to focus on improbable but high-value deals instead of getting our product philosophy and sales strategy in order. By pushing us into unreasonable growth expectations, the board drove us to hit our numbers in the short term, not build a long-term engine for growth.
4. Robert F. Smith – Investing in Enterprise Software – Patrick O’Shaughnessy and Robert F. Smith
So Robert, I was toying with where to begin our discussion. And because we’re in such an interesting market for software companies, let’s say, if you just looked at public equities where multiples have come way down, my first question is one of perspective. You’ve been investing in software, maybe more specifically enterprise software, for a very long time. I think if you have to find the right pond to fish in, you found this pond earlier than most. Maybe just lend some perspective to us on what this market looks and feels like to you versus your long history backing these sorts of businesses.
Robert [00:03:03] The whole reason I even got into this space, I started my career as a chemical engineer. It was at a time when we were really doing some interesting things in that area. And part of it was we were digitizing the operational elements of what we call unit operations, i.e., running plants and facilities, et cetera. And we were going from an analog model, where you’re, what I call, observing an event, might be a reaction, a state of reaction, and then making the adjustment, adjusting various inputs to modulate that reaction.
Well, when you introduce computing technology into that environment, you go from an observation being episodic or periodic, depending upon who’s the operator and what they happen to be thinking about at that moment in time or human foibles, yes, exactly, literally, to you’re actually measuring the output of a system thousands of times a second, which gives you the ability to then make changes to the inputs at a similar rate. The nature of that is it actually eliminated a tremendous amount of waste in process industries.
And then once Moore’s Law kicked in, computing capacity became more readily available, and the insights and skill sets of people like myself and others spread through our colleges and university systems, we were able to bring that productivity into the broader industrial environment and then, of course, the office environment. There we went from doing things quite manually to doing them in an automated way, with things as simple as word processing, calculations and spreadsheets, and ultimately utilizing things like artificial intelligence to aid in decision-making, which created massive efficiency and productivity in our economy. The insight around all of this was that at the end of all of it is enterprise software: business software that has now been identified as the most productive tool introduced into our business economy over the last 50 years and likely for the next 50…
…Patrick [00:05:53] If you think about the percentage of time that a given professional spends in some form of software, it’s crazy high. I don’t know what the percentage is, but it’s really, really high. And I think about it almost like an oil well or something: how much is there left to mine or extract of people’s time and their workflow? And my question is how mature the world of enterprise software feels to you. In one sense, we’re all using software all the time, so it seems like a penetrated story: lots of people use software in their daily work all the time. When you got started, that probably was much less the case. As you think about the next 50 years, what drives that? Do you share that sense that there is some saturation in this market? We’ll talk about the business model and a million other things in a minute. But just at a high level, how far into this does it feel to you?
Robert [00:06:41] Not to be too much of a cliche, but we are in the early innings of this. You have to remember, in the early days, the access to computing power was the rate-limiting step. It was like who could afford a computer? Governments and large corporations and ultimately, universities, et cetera, and then you went through the micros and the minis and now the next generation of this, the superscalers, even the hosted computing systems, cloud computing is how people really think about it.
What that has done, in many cases, is you look at some of these businesses that have multibillion-dollar valuations, and all they really did, in some cases, was digitize a manual workflow. There are some that you can think about now for managing sales flows. Before, you had a bunch of salespeople sitting around, and they put together their information about, “Oh, yes. All right. Here’s what I think I’m going to do, and here’s my prospect list.” And they’d send it to the regional manager, and that person kind of moved their Ouija board around and said, “Here’s what I think it is.” Then it would go to the national manager, and then they’d report to the CEO and the Head of Finance: “Okay, here’s what we think our sales are going to be for this month, this quarter, this year,” whatever it might be.
And a lot of big companies, actually what they’ll do is they digitize that. But did they really bring any insights or start to create predictive analysis? That is just now starting. That’s why I talk about the data that now has been digitized across these systems is now accessible. And then you can start to implement algorithmic systems on those to now make it predictive and a little more accurate and, in some cases, now reach forward and saying, “Here’s where you ought to spend your time.” Or how do you now bring in 2 or 3 of those systems and platforms together so that it’s not just one system of record, i.e., what your salesperson put in, but also starts to evaluate what your customers are actually doing every day. What are their buying and spending patterns? And that confluence brings some insights into that.
If you think about it, I call it the second order effect of enterprise software, which is data. The analytics of that data is in its infancy. And that is where the actual true expansion of productivity will now come. I do think there’s a lot of opportunity set there. I think it is orders of magnitude bigger than what we’ve seen in the productivity to date but is going to continue to require the adoption, I’ll call it, of the platforms and then, of course, the application of the analytics on top of those platforms.
And frankly, there are constraints still around how we do that. One of the biggest constraints, of course, is finding capable people. We always talk about this war for talent in our space. There are 7.5 billion people on the planet, and there’s only 29 million of us who write code for a living. I mean, it’s the most productive tool out there. It impacts every business. During the time of COVID, we all “went home” or went to some place that wasn’t work. We’re accessing our work through these systems, but there’s only 29 million of us who actually write the code for those systems…
…Patrick [00:14:33] If you think about the landscape, if I had every single eligible, let’s say, enterprise software business in the world together and I wanted to arrange them in a room from lowest quality to highest quality, they’re not all created equal, right? Everyone seems to have come to accept that software businesses can be the best businesses in the world based on just some economic features of them, but some are very bad. So if you think about that spectrum from bad to great, how is the spectrum itself defined? What’s a Vista company? What are the qualities boiled down that are most critical to you when evaluating one of these businesses?
Robert [00:15:07] Sure. And I’ll give you some of the things that are critically important to have and some of the things we really do, I’ll call it, in a differentiated way versus anyone else out there. So look, everybody knows the nature of enterprise software lends itself to what we call these mission-critical, business-critical type of businesses. And if you run them well, you’ll have high retention rates with your customers. You’ll be able to have businesses that have visible recurring revenue components to the business. Those are things that we look at, not surprisingly, and understanding the quality of all of those elements and what are the quality of the relationships and the quality of the product. Where are they in the stages of what I call product superiority? And then what are the elements of execution excellence that the company needs to emphasize or support?
But on the other side, and this is where people get it wrong most of the time, when enterprise software companies have failed and gone bankrupt or had all sorts of financial challenges, it’s typically because they have too much technical debt. Most people don’t really understand what that is. They think about debt in terms of financial debt. But technical debt is compounding. And as you write code over a year, 5 years or 10 years or 20 years, often, if you don’t take the right product development approach to it, you create a tremendous amount of code that has some flaws. It has some bugs. It has some architectural idiosyncrasies that might work for 8 customers or 200 but not for the other 5,000, such that every time you make an upgrade in the code, you have to go back and make those adjustments so that your customers’ products continue to work with their solutions, the way they use them. And that is an oversight for almost every investor that I know of outside of Vista; they don’t really spend enough time on that.
And if and when you do, this is when we take a pass on a company, it’s often because they have too much technical debt relative to the pace of the market that they are in. We have a whole set of best practices around reducing and then ultimately eliminating technical debt. To me, that is one of the more liberating elements that one can do in managing these businesses. And it’s not something that you just naturally know how to do as an investor. A lot of investors think, “Oh, let me just buy a good company and hope the management team can figure it out.” But you’d be surprised at how many management teams I know that we spend time with in due diligence evaluating businesses that actually don’t really have a sense for the amount of technical debt that they currently have and never thought of that, yet they provide increasing levels of resources against managing their existing code base and not really realizing that they’re losing ground every day because they actually haven’t taken the right approach to eliminating technical debt. They just figure it’s just the cost of doing business…
…Patrick [00:20:32] And in terms of just now software has become so proliferated, there’s lots of subcategories. There’s cyber software, there’s critical market, operating system-type software. Are there categories that you’ve gravitated to or away from that you feel most lend themselves to this sort of engineered approach that you’ve outlined to managing them?
Robert [00:20:51] There are some industries where the participants, the software providers have relinquished or given up a lot of their intellectual property to their customers. You think about pharmaceuticals, those are sort of areas, for instance, where, often, the end user environment is so concentrated, they actually have the market power in how that software and that code advances. We look for more broadly distributed customer bases, where you don’t have high concentration risk in your customers, and we look for where there’s actually still a high ROI, return on investment, of the products that you were selling to your customer base. I mean this is just an internal statistic. But we measure every year what’s the average return on investment for the products that we sell to our customers.
Think about it: we have 80-plus software companies, 300 million users of our software across, I think it’s now, 2.2 million customers, I think 800,000-plus enterprise and 1.4 million small to medium businesses. But we look for what is the value of the software that we’re selling. The average ROI of the products that we sell to our average customer is 640%. Now think about it: there’s very little business investment that you can make that gives you that level of ROI. For a small to medium business it’s actually typically higher, more like 900%, and for enterprise, depending upon what it is in terms of the product, maybe a little lower, but again the average is 640%. What that says is your next incremental dollar, for almost any business in any industry, is best spent on buying more software.
Patrick [00:22:24] How do you measure that? That seems like a hard thing to measure and capture.
Robert [00:22:27] Not really. It’s as simple as this: if I implement, pick it, a solution for payroll (this is old school, but just how many payroll clerks do I no longer need if I implement a payroll software system?), it’s pretty easy to calculate. Here’s how much it costs, all right? Or waste: how much waste do I eliminate by implementing this software solution to measure some energy usage or something? It’s math at the end of the day, and you’ve got to have some analytics around the math. But that’s what we do. And we do it for every one of the companies. And so that gives you a sense for, okay, here’s how valuable that software is to that industry. Media industry: “Okay, if we give you this system to manage how your media spends or buys are done, how much more efficiently can you promote and/or sell it? How many fewer resources do you need to develop or to devote to that activity?”
Robotic process automation, so you asked me an earlier question about, “Okay, where are we in the inning?” Okay, if you’ve got a bunch of people going around and doing some clerical work, let’s say, and moving one set of data from one system to another day-to-day, I call it a swivel chair kind of an enterprise, pull it out of one system, put it in another, versus put a robotic process automation system and, guess what, ROI is massive. It could be as simple as transaction reconciliation, how you reconcile a purchase order versus what was shipped. And if that’s done manually versus put an RPA system in it, guess what, you get massive, massive implications of ROI.
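A minimal sketch of the kind of ROI arithmetic Smith is describing here. The payroll figures below are hypothetical, purely to show the calculation; they are not Vista's numbers:

```python
# Hypothetical example of the "payroll clerk" ROI math described above.
# All figures are illustrative assumptions, not Vista data.
clerks_no_longer_needed = 3
fully_loaded_cost_per_clerk = 60_000   # assumed annual salary + overhead
annual_software_cost = 25_000          # assumed subscription + support

annual_savings = clerks_no_longer_needed * fully_loaded_cost_per_clerk
roi = (annual_savings - annual_software_cost) / annual_software_cost

print(f"Annual savings: ${annual_savings:,}")
print(f"ROI on the software spend: {roi:.0%}")  # -> 620% with these inputs
```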
Patrick [00:23:56] So the 640% or whatever the number is, that napkin math is an absolute no-brainer for the customer. If we take that napkin math, I’ll call it, IRR napkin math now to the investing side, as you think about what you’re willing to pay in terms of a multiple, let’s say, for one of these businesses, how has that evolved over time? Like public multiples on enterprise software got so crazy a year ago, and now they’ve come basically back down to their long-term norm. But what are the key things in the napkin math that you’ve evolved over time as you think about the price you’re willing to pay for a single one of these businesses?
Robert [00:24:26] We kind of look at it in 2 buckets: there are the companies that are growing faster, and then there are what I call the companies that have met their addressable market and are operating on a profitability metric. Take out the last 6 months: the prior 2 years were the most frothy from a valuation perspective that we’ve seen in the history of enterprise software, and the history of software, period. We at Vista took advantage of that and took 6 companies public and sold into that market. Valuations were rich, so, guess what, let’s sell into that market, right? And we actually were one of the few, and actually one of our investors pointed that out, that decelerated our investing in that marketplace in terms of the public market, not in our investing broadly, but in the public market, which was really experiencing those lofty valuations.
Interestingly enough, we actually maintained the same market multiple; it actually biased down over those 2 years relative to the prior 5. We used a thing called a growth adjusted multiple in the first, call it, half of the decade. The growth adjusted multiple, on average, in the overall market was like 0.43. The market in the last couple of years maxed out at twice that amount, 0.93.
Patrick [00:25:42] And what does that number mean? What is the growth adjustment?
Robert [00:25:44] How do you think about a price-to-revenue-growth multiple? What’s the price that you’re paying for something relative to the growth of that business? So you’ve heard of PEG ratios, price/earnings-to-growth ratios?
Patrick [00:25:56] Yes, of course. Sure.
Robert [00:25:58] Rather than earnings, for a fast-growing software company, use revenue. So you do that, and you divide it basically by the growth rate of that business. So it’s kind of doubled over those 2 years. Our average was still, over the last 2 years, below that 0.43. Whereas the market went as high as 0.93, the market average went to 0.61, 0.62. These are people buying into these enterprise software businesses. We’ve always maintained a very disciplined approach to how we buy those businesses. Over the last couple of years, we’ve been buying them at half the price of the overall market. The vast majority of those businesses we bought the last couple of years were private.
We only did 2 take privates in that period of time. Whereas any time there’s a downturn, we took 6 before that and 2 before that, 2 before that in terms of public to private type of models because those public markets were experiencing those lofty valuations. That’s one way that we really think about it, really evaluate and understanding that when the markets are being lofty, that’s a good time to sell into them, not surprisingly, makes sense. And when the markets have come down where they are today, and if you look at our overall market, these growth adjusted revenue multiples back down towards 0.41 type of a level, which is just slightly above where we’ve been buying our companies over the last couple of years.
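A minimal sketch of the growth-adjusted multiple Smith describes, a PEG-style ratio that swaps earnings for revenue. The example company's figures are hypothetical; only the 0.43 and 0.93 benchmarks come from the interview:

```python
# Growth-adjusted revenue multiple: (price / revenue) divided by the
# revenue growth rate expressed as a percentage number, analogous to PEG.
def growth_adjusted_multiple(enterprise_value, revenue, growth_rate_pct):
    price_to_revenue = enterprise_value / revenue
    return price_to_revenue / growth_rate_pct

# Hypothetical company: $1.2B enterprise value, $100M revenue, growing 28%/yr.
gam = growth_adjusted_multiple(1_200_000_000, 100_000_000, 28)
print(f"Growth-adjusted multiple: {gam:.2f}")
# -> 0.43, roughly the long-run market average cited in the interview;
#    the frothy-market peak cited above was about 0.93.
```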
Patrick [00:27:15] What have you learned about the role of churn, like customer churn inside these businesses? That seems to be a variable that’s coming at me from every direction, that this is the ultimate arbiter of revenue quality for a software business, not net dollar retention but just actual customers leaving and why. What did you learn about that over the 22 years?
Robert [00:27:34] We have a couple of metrics. One is recurring revenues, and one is retention rate. And we look at net dollar retention rate as an important statistic or KPI that we measure. In many cases, I mean, our overall portfolio, which is, whatever, call it fifth-largest enterprise software company if you just added all of it up together, over these last 2 years, it’s a 104% net retention rate. Think about that. That’s pretty staggering in terms of, okay, showing that what we provide to our customers is of value. The thing you have to think about in an environment like this, in a recessionary environment is, okay, where do you get that churn? We just walked through the mission-critical nature of these businesses, the actual ROI of the products that we sell to our customer base and how important it is.
So if you’ve got churn, you’ve got to look at it. Is it because the customers picked another solution or they went out of business? If they’ve gone out of business, is there some fundamental underlying element of that industry that you’re serving that gives you some suggestion that this is not one that you want to necessarily invest in?
Our teams — and I’ve just got a marvelous investment team, frankly — really think about that and unpack that and get very granular by geography, by segment, by customer, by customer size. And we really spend a lot of time on it to understand: are there some inherent issues with the industry or the product’s position in that marketplace that should give us some pause or confidence on how mission-critical, business-critical, not only the solution is, but that specific solution is with the company that we’re evaluating or the company that we own? And then that informs us as to how we bring forward our value creation motion and what are the things we want to focus on. Because that may inform me, guess what, here’s a segment in the marketplace that we need to spend more time on or invest in or build a different set of solutions for, or enhance, call it, the go-to-market opportunity with that segment.
5. An Interview With Eric Seufert About the Future of Digital Advertising – Ben Thompson and Eric Seufert
Let’s start there. Walk me through SKAdNetwork 4.0 and the changes that are coming.
ES: SKAdNetwork 4.0 is basically like a page one rewrite of SKAdNetwork. It seems like they just burned down the schematic and built something totally new with a blank page. It’s better across the board and so all credit due to Apple, this is a welcome change, I think there’s a lot of potential here. I wrote a post about it, there’s a lot of detail.
There’s three big buckets of change — really four but three that I care about. Basically at a high level, big concept, SKAdNetwork 3.0 very much operates on this binary conception of privacy, you either surpass some threshold that Apple has determined for privacy or you don’t, and if you don’t, then you get almost no in-app context back in the postback that gets sent, you basically just get an install log but it’s sent on this random timer system and so you don’t get it in real time, you get it on this delay. There’s this privacy threshold which is not public, we don’t know if it’s dynamic, there’s a lot of mystery around it. But either you surpass that or you don’t in terms of the number of events that have been captured from a specific campaign from a specific app, and if you don’t then you get no context.
What they’ve done is they’ve added gradations now. It’s not just binary yes or no, but there are gradations tied to what they’ve now termed — the term isn’t new, but it’s new to this — Crowd Anonymity. Basically, the more people you have in a group, the harder it is to identify any single one of them; that’s the crux of that idea. So what they’ve done is, as you increase the level of Crowd Anonymity with the number of events that are generated, you unlock more value from the system. At the highest level of Crowd Anonymity, you get many more possible source identifier values that pertain to the campaign, and you can codify those however you want. You could potentially get creative information embedded in that, and various parameters of the campaign information encoded in that. You can set it up however you want, but if you don’t reach that level, then you get just the normal 100. They’ve done the same with Conversion Values. Now, if you reach the highest level of Crowd Anonymity, you get the 6-bit options of Conversion Values. But if you don’t, if you just reach that middle tier (and there are three tiers), then you get what they call a coarse-grained Conversion Value, which is just three values: high, medium, low. But you can encode information in that to help you understand whether that’s a good campaign or not.
Just to jump in on this because I think this is actually a really important point. Big picture, we were pretty grumpy at Apple last time, we could try to put on our “Let’s try to give the best possible explanation for what Apple is doing” hats here. I think the concern is if you are sending super specific feedback and you only have ten users just to use an extreme example, you can back out who specific users are and so what they’re saying is if you have millions of users we can send you very specific events because it’s basically impossible to figure out whoever every single one was. I just want to put a stake in the ground on that, because I think that it’s actually super critical to understanding broader changes in the ecosystem. We’ll talk about the Unity stuff later, but by implication you have to be large because if you’re going to get the best feedback, you have to have a huge number of impressions and events to make this all work. I just wanted to double down on that specific point.
ES: What I would add to that, I’m not going to dig into the details here because I think it goes beyond the scope of the conversation, but yes you just described it really well. But the problem was before, there was a chicken-and-egg problem. I have a campaign, and I’m spending some minimum amount of money against it because I want to measure the performance before I start increasing the amount of money that I spend against it. But I’m not getting any data from the postbacks, because I’m not spending enough money to generate enough events.
You don’t want to dump millions of dollars into that campaign, you don’t know if it works.
ES: I’m at an impasse, I don’t want to spend more money on it because I’m not getting any performance feedback, but I’m not getting any performance feedback cause I’m not spending any money on it — so, chicken-and-egg. Now with this, if I spend a little bit more money and I get to that middle level of crowd anonymity, I get the high, medium, low feedback, and I can encode that to mean something that is relevant to me to give me some signal as to whether this campaign could be a good one. Just having that intermediary step gives me a lot of really valuable context that prevents this from being binary yes or no and breaks me out of that chicken-and-egg problem. Now I could spend a little bit more money because I’m getting some signal that, “Hey, this is all coming back with the high value” and if I’ve set that to mean the predicted value of this user after ninety days, “Okay that’s high”, that means these are good users, I should spend more money, and it gives you the signal that you need in that early stage to determine whether you can spend more money against the campaign or not to really unlock all the value.
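A minimal sketch of the kind of encoding Seufert describes: mapping a user's predicted 90-day value into SKAdNetwork 4.0's fine (0-63) and coarse (low/medium/high) conversion values. The revenue thresholds and bucket widths here are illustrative assumptions, not Apple's or Seufert's:

```python
# Illustrative mapping of predicted 90-day user value onto SKAdNetwork 4.0
# conversion values. Thresholds are assumptions for the sake of the example.
def fine_conversion_value(predicted_d90_revenue: float) -> int:
    """Bucket predicted revenue into the 6-bit (0-63) fine value."""
    bucket_width = 0.50  # assumed: $0.50 of predicted revenue per bucket
    return min(63, int(predicted_d90_revenue / bucket_width))

def coarse_conversion_value(predicted_d90_revenue: float) -> str:
    """Map predicted revenue to the low/medium/high coarse value."""
    if predicted_d90_revenue >= 10.0:  # assumed "high" threshold
        return "high"
    if predicted_d90_revenue >= 2.0:   # assumed "medium" threshold
        return "medium"
    return "low"

for ltv in (0.40, 3.75, 18.00):
    print(ltv, fine_conversion_value(ltv), coarse_conversion_value(ltv))

# At the highest crowd-anonymity tier an advertiser would receive the fine
# value in the postback; at the middle tier, only the coarse signal.
```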
Are they also speeding up the feedback loop so that once you start spending high, you can immediately see if this is worth it or not before you’ve spent too much and make changes?
ES: Yes. The other thing that I’m really excited about is they just did away with the timer system, which was overly engineered, complex, and convoluted. Now there are just attribution windows, and there are three of them. So now you can potentially get up to three postbacks, not just a single postback, and the attribution windows are fixed. The first window isn’t a single day; it’s zero to two days, but it’s fixed, so you know when you’ll get that postback. On top of that, because you know what the fixed interval is, you can actually model much more easily on top of that to get to install estimates, whereas before you couldn’t, because the timer system was random…
…Just to rewind though, you had three things about SKAdNetwork. You had the gradations of privacy, you had the faster, set time limits for when they’re giving you feedback. What was the third one?
ES: Just to recapitulate: there are hierarchical source identifiers, hierarchical conversion values, and the third one is multiple conversions — those three are the primary considerations for me. Then there’s a fourth one: SKAdNetwork can now handle attributions from the web. So there are four big new pieces of functionality coming.
There is one more point I wanted to make about SKAdNetwork: I would say that if at the outset, if at WWDC 2020, what Apple had presented to advertisers was SKAdNetwork 4.0, the spec, and they had unified the personalized ads texts across Apple’s own opt-in prompt and the ATT prompt, and Apple Search Ads used SKAdNetwork instead of the Apple Ads Attribution API, if all those things were true, and those all seem like reasonable things to ask for, I would’ve applauded ATT. I would’ve said, “This is the right thing to do, this is good for consumers, this is the direction that advertisers should have predicted they were going to have to move in.” I would’ve said, “ATT is excellent and I fully support this.”
The problem was that SKAdNetwork 2.0 — 2.0 is what they unveiled at WWDC 2020 — was just totally dysfunctional and didn’t work. Then there were the differences in treatment: Apple’s own network got its own opt-in prompt, while other ad networks got the ATT prompt. If all those things were addressed, I would be absolutely an advocate for ATT; I think it would’ve been the right policy. But it’s the preferential treatment on top of this totally dysfunctional SKAdNetwork which, to your point, was unnecessarily dysfunctional; it created pain that wasn’t necessary to protect consumer privacy…
…You mentioned this a little bit earlier, but as we’re wrapping up here, looking forward you can see Apple maybe doing some more crackdowns on fingerprinting or some of this third party stuff, there’s potential regulation. It kind of feels inescapable to me that there’s going to be more consolidation. The returns to being very large are going to be huge and I think particularly for Meta, it’s pretty interesting because if you have this situation where 1) you need to be large and 2) ads need to be increasingly contextual, which is going to just require a lot more computing capacity and modeling and things along those lines relative to an IDFA, which was kind of straightforward — you just match column A to column B. Is it wrong to actually be somewhat optimistic about Meta’s fortunes looking forward?
ES: No, I think you’re exactly right. So Sheryl [Sandberg] — what a run she had at Meta — in her last earnings call, brought up click-to-message ads like five times or ten times.
We’ll miss the Sheryl anecdotes in general, I should note.
ES: Yeah, she seems very optimistic about those, and that kind of aligns with my content fortress thesis. [CEO] Mark [Zuckerberg] even said that one of the big initiatives of the company is to grow first-party understanding of people’s interests by making it easier for people to engage with businesses in our own apps, in other words, to generate much more first-party data that they can use to target ads. The other thing is Reels. Obviously, if your ability to target ads has been degraded, the value of the ads decreases. Well, how do you maintain the same amount of revenue? You show more ads, but you probably don’t want to increase the ad load — ad load is the number of ads seen per session, or per minute of session, however you measure it. So instead you increase the session length; then ad load can stay the same and you still get more ads, even though they’re at lower CPMs. That is exactly what happened to them last quarter: total ad impressions were up 15%, the average price per ad was down 14%, and revenue was down 1% year over year.
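For what it is worth, those figures hang together arithmetically: a 15% increase in impressions combined with a 14% decline in average price per ad implies revenue roughly 1% lower, as the quick check below shows.

```python
# Quick sanity check of the quarter's arithmetic: revenue moves roughly
# with the product of the change in impressions and the change in price.
impressions_change = 1.15   # total ad impressions up 15%
price_change = 0.86         # average price per ad down 14%

revenue_change = impressions_change * price_change - 1
print(f"{revenue_change:+.1%}")  # -> -1.1%, consistent with the ~1% year-over-year revenue decline
```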
This is the counterintuitive thing about these businesses: that’s great news! I remember back when Google first IPOed and people would look at it and say, “Oh, the cost per ad is going down.” It’s like, “That’s good!” It’s very counterintuitive, because it’s dangerous when — this happened to Facebook a couple of times — they just run out of places to grow as far as ads go.
This happened right before Stories. Their price per ad was going up, and people were getting excited about this: “Wow, look at the pricing power.” And it was like, “No, this is actually a very problematic sign,” because it means they don’t have sufficient inventory to show. It also means advertisers are going to other platforms, which gives those platforms money and the ability to improve their competing products. Stories was this huge triumph from a business perspective because they just exploded their ad inventory and their price per ad plummeted.
It’s hilarious, because their stock got killed; the second-biggest drop, after Facebook’s record drop earlier this year, was Facebook after the Stories earnings, when they announced that this was happening. I remember at the time thinking, “No, this is really great news,” because it basically set them up for multiple years of growth opportunities, which is what happened. And just to double down on your Reels point: of all the stuff that’s gone wrong for Meta/Facebook, the fact that they’re doing this Reels double-down at the same time they’re having these ad business challenges is actually a piece of good fortune, in my opinion.
ES: Yeah, I think so. Sheryl even mentioned that this is the playbook: we get a toehold, we find a new front and expand our inventory, and then we get better at targeting ads in that inventory, so the value of the ads goes up. But that’s a process: you have to find the new point of entry for showing these ads, and then over time you increase their value because you get better at understanding which signals to use for targeting.
Just to follow up on what you said, because it’s important: I think we’re getting to a point where size, scale, sophistication, all of those things are going to be really important just for any sort of competitive staying power, especially with SKAdNetwork 4.0, because 2.0 and 3.0 kind of leveled the playing field.
(laughing) It was terrible for everyone.
ES: Yes, exactly. With 4.0, when it comes to parsing meaning out of the added context you’ll get, with potentially three postbacks and the hierarchical conversion values, Facebook is going to be able to do a lot more with that than probably any other ad platform, except for maybe Google. So they’ll be better positioned to take that new context and build value out of it. And Sheryl said this too: we’re at the beginning stages of clawing back the measurement that we lost. I think they’re very realistic about where they are in this process. Just giving them more surface area to extract value from will benefit them in a way that a lot of the app install networks probably can’t match, maybe even in a way that a Snap, a Pinterest, a Twitter can’t match. I think it’s going to make the topography more diverse again, so the playing field will no longer be level.
Yeah, you just need rules of the road. The problem with SKAdNetwork 2.0 was that, because it was worthless, everyone was just adrift at sea. With the new SKAdNetwork, if you actually know the rules to play by, then yes, those rules might be crappier than the rules you had previously, so you’re going to have to build it again, but as long as there’s something you can rely on and something you can model against, you can build something again. And the harder that is to do, the more it favors Facebook. It’s kind of like the GDPR thing. I wrote this at the beginning: “This is so clearly going to benefit the biggest companies, because they can afford to hire all the compliance officers, they can afford to do all this work.” I think it’s going to be a similar dynamic here.
ES: 100% and I think it’s doubly true with the AI initiatives. I think when Zuckerberg talks about AI, he doesn’t mean it in the way that most people interpret it. He’s talking about parsing signal out of video.
And it’s going to be the same problem. Show the right video to someone in Reels, and show them the right ad, it’s the same challenge.
ES: Exactly. But it’s a much more appreciable challenge than doing that with text, and who can compete there? Facebook’s got its own data centers with its own hardware. That was always Snap’s problem, by the way: their cost of revenue was so high because they were totally dependent on Google Cloud and Amazon Cloud.
That’s a great point. That’s going to actually get worse for them now that they have to start doing this AI analysis of video.
ES: Yeah, Facebook said they’re doubling or tripling the number of GPUs that they’re going to have in their own proprietary hardware, in their own data centers. Who can compete with that?
6. Why Huawei founder Ren Zhengfei’s new memo has gone viral on China’s internet – Iris Deng
A leaked internal memo by Ren Zhengfei, founder of Huawei Technologies Co., has gone viral on China’s social media, as his bleak outlook of the global economy strikes a chord in the country’s business and technology sectors.
The memo, which was first reported by Chinese media outlet Yicai on Tuesday, painted a gloomy picture of a world heading into economic recession. It called for employees to focus on the company’s survival and give up on wishful thinking.
Huawei declined to confirm or deny the memo, but sources told the Post that the text, which has been widely reported locally and shared on the internet, is authentic.
“The next 10 years will go down as a painful period in history, as the world economy goes into recession … Huawei needs to tone down any over-optimistic forecasts and make survival its most important creed in the next three years,” Ren wrote.
This is not the first time that Ren, 77, has reminded Huawei employees that the firm is navigating a business crisis. Huawei’s rotating chairman, Eric Xu Zhijun, also repeatedly said in 2020 and 2021 that the company’s goal was to survive US sanctions, which barred its access to US-origin technology, such as advanced smartphone chips.
However, Ren’s new warnings come amid fresh challenges, as Beijing carries on with draconian Covid-19 controls despite the Chinese economy being in its worst shape in decades.
China’s gross domestic product grew only 0.4 per cent in the second quarter, the worst since the first quarter of 2020, when the coronavirus shut down large swathes of the country, driving GDP down by 6.8 per cent.
7. Big beliefs – Morgan Housel
Most fields are a hierarchy of truths with big ideas at the top and laws, rules, and finer details branching off below them. Viewing ideas in isolation, without recognizing the family tree of where they came from, gives a distorted view of how a field works and can overcomplicate what are often simple answers.
Beliefs are the same. How many business and investing beliefs do I have – opinions, ideas, models, etc? I don’t know, thousands probably. It’s a complex topic. But most of them derive from a few core beliefs.
A few big things I believe:
The inability to forecast the past has no impact on our desire to forecast the future. Certainty is so valuable that we’ll never give up the quest for it, and most people couldn’t get out of bed in the morning if they were honest about how uncertain the future is…
...It takes less effort to increase confidence than it does ability. Confidence gives the impression of removing uncertainty, which we desperately want and are quick to embrace, while ability is constantly under attack from competition and an evolving economy…
…Sitting still feels reckless in a fast-moving world, even in situations where it offers the best odds of long-term compounding. It’s like being told that you should play dead if a grizzly charges you – running for your life just feels more practical. The bias towards action is one of the strongest forces in business and investing for three reasons: It can be the only signal to yourself and others that you’re not oblivious to risks. It can be the only signal to others that you’re worth your salary. And it can provide the illusion of control in a world where so much is out of your hands.
It’s hard to determine what is dumb luck and what is unfortunate risk. Investing is a game of probabilities, and almost all probabilities are less than 100%. You can make a good bet with the odds in your favor and still lose, and a reckless bet and still win. It makes it difficult to judge others’ performance – lots of good decisions end up on the unfortunate side of risk and vice versa.
Calm plants the seeds of crazy. If markets never crashed they wouldn’t be risky. If they weren’t risky they would get expensive. When they’re expensive they crash. Same for recessions. When the economy is stable people become optimistic. When they get optimistic they go into debt. When they go into debt the economy becomes unstable. Crazy times aren’t an accident – they’re an inevitability. The same cycle works in reverse, as depressed times create opportunities that plant the seeds of the next boom. One way to summarize it: Nothing too good or too bad lasts indefinitely.
Disclaimer: None of the information or analysis presented is intended to form the basis for any offer or recommendation. Of all the companies mentioned, we currently have a vested interest in Alphabet (parent of Google), Apple, Meta Platforms (parent of Facebook), and Tesla. Holdings are subject to change at any time.