What We’re Reading (Week Ending 28 April 2024) - 28 Apr 2024
Reading helps us learn about the world and it is a really important aspect of investing. The late Charlie Munger even went so far as to say that “I don’t think you can get to be a really good investor over a broad range without doing a massive amount of reading.” We (the co-founders of Compounder Fund) read widely across a range of topics, including investing, business, technology, and the world in general. We want to regularly share the best articles we’ve come across recently. Here they are (for the week ending 28 April 2024):
1. 10 Questions with Chris Beselin – Michael Fritzell and Chris Beselin
Today, I’ll be interviewing Chris Beselin, who runs a Vietnam-focused activist fund and two tech businesses from his base in Ho Chi Minh City…
…3. Why do you think Vietnam has been so successful as an economy – why has it developed faster than almost any other nation on earth?
There are a range of factors, of course, but just to outline a few:
It’s a balanced economy and growth model – it’s not your typical emerging market, where the economy is overly dependent on one or a handful of commodities.
Rather, the Vietnamese growth model has multiple core engines: it’s one of the most trade-focused economies in the world (measured as (export+import)/GDP), with free trade agreements signed with countries representing 60% of global GDP; it has a young and well-educated population where English proficiency is on par with e.g. India and South Korea; it has a sizeable and confident middle class that is rapidly growing; and it has a stable government that has been focused on pro-market deregulation for the past 35 years.
And in contrast to what many people think from the outset, Vietnamese exports are primarily engineering-driven (as opposed to lower value-add textiles and similar). Around 45% of the exports are electronics, smartphones, laptops and machinery components. In this sense, my conviction is that Vietnam is much more the next South Korea or Japan than the next China.
To me, this all boils down to the fact that the number one asset of the country is its young, savvy and hungry engineering population (ca. 100,000 engineers are educated per year in Vietnam, of which around 50% are within software). The attractiveness of the Vietnamese engineering talent pulls foreign capital to invest in onshore engineering-centered manufacturing, which in turn has vast ripple effects on the employment of thousands of additional factory workers around the engineers…
…5. What misconceptions do you think foreigners typically have about the country?
I think there are many. Just to name a few:
The first one is perhaps “Vietnam is almost like China, but smaller and less developed”. I went through a bit of the difference in the fabric of the economies and demographics previously, but then there is also the very important difference in politics. Geopolitically, Vietnam is not and will never be or perceive itself to be a global superpower like China – it doesn’t have any geopolitical ambitions outside its own borders like China has.
Vietnam is primarily interested in developing its economy through trade and FDI; this in turn means that Vietnam in practice benefits from being geopolitically neutral between East and West and from trading/being friends with “everyone”. So far the country has managed this balance very astutely for decades.
Another common misconception (particularly for Westerners growing up during the Vietnam War) is that “Vietnam is just getting back on its feet after the recent war”. Obviously, this perspective is wildly outdated, but it’s still surprisingly common among foreign visitors. To put it into perspective, perhaps a suitable analogy is if you would have been saying/thinking similar things about France or the UK in the mid 90s… (also then ca. 50 years from the end of the Second World War, just like Vietnam is today 50 years away from its war ending in 1975).
2. Private Credit Offers No Extra Gains After Fees, New Study Finds – Justina Lee
A trio of academics has a bold take on the booming $1.7 trillion private credit market: after accounting for additional risks and fees, the asset class delivers virtually no extra return to investors…
…“It’s not a panacea for investors where they can earn 15% risk-free,” said Michael Weisbach, a finance professor at Ohio State University who co-wrote the research with Isil Erel and Thomas Flanagan. “Once you adjust for the risk, they basically are getting the amount they deserve, but no more.”
Behind the research is complex math to try to untangle the alpha part of a return that’s down to skill, and the beta part that might just come from stumbling into a bull market. While comparing stock pickers to a market benchmark like the S&P 500 is standard by now, it’s not obvious what the right yardstick is for private-credit funds, which make idiosyncratic and opaque loans to a wide array of companies…
…The three economists dissected MSCI data on 532 funds’ cash flows, covering their incoming capital and distributions to investors. They compared the industry’s performance to stock and credit portfolios with similar characteristics, whose fluctuations end up explaining the majority of private-credit returns. The study makes the case that these private-credit funds also carry some equity risk, since around 20% of their investments contain equity-like features such as warrants.
After accounting for those risks, they find that there’s still alpha left on the table — which only vanishes once fees paid to these managers are deducted…
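The kind of decomposition the study performs can be sketched in a few lines. The toy regression below is my own illustration, not the authors' actual methodology: every number is invented, and the benchmark portfolios are simulated stand-ins for the stock and credit portfolios the paper constructs. The idea is simply that the regression intercept on the replicating portfolios is "gross alpha", and fees are then deducted from it.

```python
import numpy as np

# Hypothetical illustration of an alpha/beta decomposition: regress a
# fund's returns on replicating stock and credit benchmark returns;
# the intercept is gross alpha, from which fees are then deducted.
# All numbers below are invented.

rng = np.random.default_rng(0)
n = 40  # quarters of simulated data

credit = rng.normal(0.015, 0.02, n)  # simulated credit-benchmark returns
equity = rng.normal(0.02, 0.05, n)   # simulated equity-benchmark returns

# Simulated fund: beta to credit, some equity exposure (warrants etc.),
# plus a small "true" gross alpha of 0.5% per quarter and noise.
gross = 0.9 * credit + 0.2 * equity + 0.005 + rng.normal(0, 0.005, n)

# Least-squares fit: column of ones recovers the intercept (gross alpha).
X = np.column_stack([np.ones(n), credit, equity])
coef, *_ = np.linalg.lstsq(X, gross, rcond=None)
gross_alpha = coef[0]

fees = 0.004  # assumed fee drag per quarter (~1.6%/yr)
net_alpha = gross_alpha - fees
print(f"gross alpha/qtr: {gross_alpha:.4f}, net of fees: {net_alpha:.4f}")
```

In this made-up setup the fitted intercept comes out close to the 0.5% per quarter that was baked in, and deducting the assumed fees eats most of it, which is the shape of the study's finding: alpha exists gross of fees and largely vanishes net of them.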
…As private markets boom, some quants — most notably Cliff Asness of AQR Capital Management — have suggested that investors are being misguided by returns that mask volatility and may be less impressive than they appear.
True at Adams Street Partners, who co-wrote one of the first papers on private-credit performance, cautions that until the industry faces its first downturn it may be hard to determine real alpha. But he says the NBER study is a good step toward digging beneath the surface of private-credit returns.
3. Americans are still not worried enough about the risk of world war – Noah Smith
When Americans think of World War 2, we usually think of the roughly four years of the war that we participated in, from late 1941 through 1945. Those years were indeed the most climactic and destructive of the war, by far, but the war actually began earlier. In fact, although the official start date is September 1, 1939, it’s easy to make an argument that the war began long before that…
…But throughout the 1930s, there were a number of conflicts that led into World War 2, and eventually merged with that overall conflict, like tributaries emptying into a great river. Let’s do a quick timeline of these.
In 1931-32, Japan seized Manchuria from China, an act that led inexorably to a wider war between the two powers. The Manchurian war and occupation also set Japan on a path toward militarism, bringing to power the regime that would ultimately prosecute WW2 itself.
In 1935-36, fascist Italy invaded and conquered Ethiopia. The League of Nations halfheartedly tried to stop the war and failed, leading to the League being discredited and the post-WW1 order being greatly weakened. That emboldened the fascist powers. Ethiopian resistance to Italian rule would eventually become a minor theater of WW2.
From 1935 through 1939, Japan and the Soviet Union fought an undeclared border war, ultimately culminating in major battles in 1939, which the USSR won. That led to Japan seeking an alliance with Nazi Germany, and eventually led to the Soviets’ entry into the war against Japan at the very end of WW2. (The realization that Japan couldn’t defeat the Soviets and conquer Siberian oil fields also prompted Japan to try to take Southeast Asian oil instead, when it needed oil to prosecute its war against China; this led to Pearl Harbor and the Pacific War.)
From 1936 through 1939, Nazi Germany, fascist Italy, and the Soviet Union fought each other in a proxy war: the Spanish Civil War. Units from all three powers officially or unofficially engaged in the fighting. When the Nationalists won, it emboldened the fascist powers even further.
In 1937, Japan invaded the remainder of China, in what’s called the Second Sino-Japanese War. This became a major theater of World War 2, accounting for almost as many deaths as the Nazi invasion of the USSR. It also prompted Japan to go to war with Britain and the U.S., in order to seize the oil fields of Indonesia to support the invasion of China. (The fact that we don’t count this as the start of WW2 seems like pure eurocentrism to me.)
In 1939, before the Soviet Union joined World War 2, it invaded Finland in what’s known as the Winter War, seizing some territory at great cost. This war would continue all the way through WW2 itself.
So there were no fewer than six wars in the 1930s that ended up feeding into World War 2 itself!..
…So if you were living at any point in 1931 through 1940, you would already be witnessing conflicts that would eventually turn into the bloodiest, most cataclysmic war that humanity has yet known — but you might not realize it. You would be standing in the foothills of the Second World War, but unless you were able to make far-sighted predictions, you wouldn’t know what horrors lurked in the near future.
In case the parallel isn’t blindingly obvious, we might be standing in the foothills of World War 3 right now. If WW3 happens, future bloggers might list the wars in Ukraine and Gaza in a timeline like the one I just gave.
Or we might not be in the foothills of WW3. I think there’s still a good chance that we can avert a wider, more cataclysmic war and instead have a protracted standoff — Cold War 2. But I’m not going to lie — the outlook seems to be deteriorating. One big reason is that China appears to be ramping up its support for Russia…
…So it makes sense to view the Ukraine War as a European proxy conflict against Russia. But what’s more ominous is that it also makes an increasing amount of sense to view it as a Chinese proxy conflict against Europe.
A little over a year ago, there were reports that China was sending lethal aid to Russia. Believing these reports, I concluded — perhaps prematurely — that China had gone “all in” on Russia’s military effort. Some of the reports were later walked back, but the fact is, it’s very hard to know how much China is doing to provide Russia with drones and such under the table. But now, a year later, there are multiple credible reports that China is ramping up aid in various forms.
For example, the U.S. is now claiming that China is providing Russia with both material aid and geospatial intelligence (i.e. telling Russia where Ukrainian units are so Russia can hit them):
The US is warning allies that China has stepped up its support for Russia, including by providing geospatial intelligence, to help Moscow in its war against Ukraine.
Amid signs of continued military integration between the two nations, China has provided Russia with satellite imagery for military purposes, as well as microelectronics and machine tools for tanks, according to people familiar with the matter.
China’s support also includes optics, propellants to be used in missiles and increased space cooperation, one of the people said.
President Joe Biden raised concerns with Xi Jinping during their call this week about China’s support for the Russian defense industrial base, including machine tools, optics, nitrocellulose, microelectronics, and turbojet engines, White House National Security Council spokesperson Adrienne Watson said.
This is very similar to the aid that Europe and the U.S. are providing Ukraine. We also provide geospatial intelligence, as well as materiel and production assistance. If reports are correct — and this time, they come from the U.S. government as well as from major news organizations — then China is now playing the same role for Russia that Europe and the U.S. have been playing for Ukraine.
In other words, the Ukraine War now looks like a proxy war between China and Europe…
…Of course, World War 3 will actually begin if and when the U.S. and China go to war. Almost everyone thinks this would happen if and when China attacks Taiwan, but in fact there are several other flashpoints that are just as scary and which many people seem to be overlooking.
First, there’s the South China Sea, where China has been pressing the Philippines to surrender its maritime territory with various “gray zone” bullying tactics…
…The U.S. is a formal treaty ally of the Philippines, and has vowed to honor its commitments and defend its ally against potential threats.
And then there’s the ever-present background threat of North Korea, which some experts believe is seriously considering re-starting the Korean War. That would trigger a U.S. defense of South Korea, which in turn might bring in China, as it did in the 1950s.
It’s also worth mentioning the China-India border. China has recently reiterated its claim to the Indian state of Arunachal Pradesh, calling it “South Tibet” and declaring that the area was part of China since ancient times. India has vigorously rejected this notion, of course. An India-China border war might not start World War 3, but the U.S. would definitely try to help India out against China, as we did in 2022…
…America hasn’t mustered the urgency necessary to resuscitate its desiccated defense-industrial base. China is engaging in a massive military buildup while the U.S. is lagging behind. This is from a report from the Center for Strategic and International Studies:
The U.S. defense industrial base…lacks the capacity, responsiveness, flexibility, and surge capability to meet the U.S. military’s production needs as China ramps up defense industrial production. Unless there are urgent changes, the United States risks weakening deterrence and undermining its warfighting capabilities against China and other competitors. A significant part of the problem is that the U.S. defense ecosystem remains on a peacetime footing, despite a protracted war in Ukraine, an active war in the Middle East, and growing tensions in the Indo-Pacific in such areas as the Taiwan Strait and Korean Peninsula.
The United States faces several acute challenges.
First, the Chinese defense industrial base is increasingly on a wartime footing and, in some areas, outpacing the U.S. defense industrial base…Chinese defense companies…are producing a growing quantity and quality of land, maritime, air, space, and other capabilities. China…is heavily investing in munitions and acquiring high-end weapons systems and equipment five to six times faster than the United States…China is now the world’s largest shipbuilder and has a shipbuilding capacity that is roughly 230 times larger than the United States. One of China’s large shipyards, such as Jiangnan Shipyard, has more capacity than all U.S. shipyards combined, according to U.S. Navy estimates.
Second, the U.S. defense industrial base continues to face a range of production challenges, including a lack of urgency in revitalizing the defense industrial ecosystem…[T]here is still a shortfall of munitions and other weapons systems for a protracted war in such areas as the Indo-Pacific. Supply chain challenges also remain serious, and today’s workforce is inadequate to meet the demands of the defense industrial base.
“The Chinese defense industrial base is increasingly on a wartime footing.” If that isn’t a clear enough warning, I don’t know what would be. You have now been warned!
4. Memory Shortage and ASML – Nomad Semi
Although we are down to 3 major DRAM manufacturers, memory has been a commodity since the 1970s. The wild swing in SK Hynix’s gross margin within a single year proves the point. Supply is the underlying driver of the memory cycle. Memory prices should trend down over time along with cost as memory manufacturers migrate to the next process node, which allows for higher bits per wafer. There is always a duration mismatch between demand and supply due to the inelasticity of supply. When there is supernormal profit, capex will go up and supply will follow in 2 to 3 years. Supply will then exceed demand, and DRAM prices should theoretically fall to cost. This post will instead focus on the current upcycle and how it could actually be stronger than the 2016 cycle.
How we got here is a result of the proliferation of AI and the worst downcycle since the GFC. To be fair, the 2022 downcycle was driven by a broad-based inventory correction across all the verticals rather than a very aggressive capex hike from the memory producers. The downcycle led to negative gross margins for SK Hynix, Micron and Kioxia. Negative gross margins led to going-concern risk, which led to Hynix and Micron cutting their capex to the lowest level in the last 7 years. This is despite the fact that we have moved from 1x nm to 1b nm, which requires higher capex per wafer.
HBM (High Bandwidth Memory) has become very important in AI training, and you can’t run away from talking about HBM if you are looking at DRAM…
…HBM affects current and future DRAM supply in 2 different ways. The 1st is cannibalization of capex from DRAM and NAND, for which Fabricated Knowledge gave a very good analogy. The 2nd is, as Micron mentioned in the last call, that “HBM3E consumes approximately three times the wafer supply as DDR5 to produce a given number of bits in the same technology node”. In fact, this ratio will only get worse in 2026 as HBM4 can consume up to 5x the wafer supply as DDR5. The way it works is that an HBM die is double the size of a DDR5 die (which already suffers from a single-digit die size penalty vs DDR4). HBM die sizes will only get bigger as more TSVs are needed. The yield rate of HBM3E is below 70% and will only get harder to maintain as more dies are stacked beyond 8-hi. The logic base die of the HBM module is currently produced in-house by Micron and Hynix, although this could be outsourced to TSMC for HBM4. In summary, not only is HBM consuming more of the current wafer supply, but it is also cannibalizing the capex for future DRAM and NAND capacity expansion.
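Micron's ~3x figure is roughly consistent with the two penalties mentioned above. As a back-of-the-envelope check — my own arithmetic with assumed round numbers, not figures from the post:

```python
# Rough arithmetic check of why HBM3E might consume ~3x the wafer supply
# of DDR5 per bit: a ~2x die-size penalty compounded by sub-70% yield.
# Assumed round numbers, not official figures.

die_size_penalty = 2.0   # HBM die assumed ~2x the area of a DDR5 die (same node)
hbm_yield = 0.70         # assumed HBM3E stack yield ("below 70%")
ddr5_yield = 1.0         # DDR5 yield normalised to 1 for comparison

wafer_multiple = die_size_penalty * (ddr5_yield / hbm_yield)
print(f"wafers per bit vs DDR5: ~{wafer_multiple:.1f}x")  # ~2.9x, in line with ~3x
```

The same arithmetic hints at how HBM4's up-to-5x ratio could arise: a further die-size increase from more TSVs, multiplied by worse yields on taller stacks.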
In past upcycles, capex would often go up as memory producers gained confidence. Nobody wants to lose market share, as current capex = future capacity → future market share. However, SK Hynix and Micron will be unable to expand their DRAM wafer capacity meaningfully in 2024 and 2025.
SK Hynix has limited cleanroom space available for DRAM expansion (~45k wpm) at M16, and this will be fully utilized by 2025. The company will have to wait until 2027 before the Yong-In fab can be completed. Even when the balance sheet situation for SK Hynix improves in 2025, it will be limited by its cleanroom space.
For Micron, the situation is slightly better. The Taichung fab also has limited space available for capacity expansion, but this will likely be earmarked for HBM production. Micron will have to wait until the new Boise fab is ready in 2026. Both Micron and Hynix will be limited in capacity expansion in 2025 against their will.
5. Artificial intelligence is taking over drug development – The Economist
The most striking evidence that artificial intelligence can provide profound scientific breakthroughs came with the unveiling of a program called AlphaFold by Google DeepMind. In 2016 researchers at the company had scored a big success with AlphaGo, an AI system which, having essentially taught itself the rules of Go, went on to beat the most highly rated human players of the game, sometimes by using tactics no one had ever foreseen. This emboldened the company to build a system that would work out a far more complex set of rules: those through which the sequence of amino acids which defines a particular protein leads to the shape that sequence folds into when that protein is actually made. AlphaFold found those rules and applied them with astonishing success.
The achievement was both remarkable and useful. Remarkable because a lot of clever humans had been trying hard to create computer models of the processes which fold chains of amino acids into proteins for decades. AlphaFold bested their best efforts almost as thoroughly as the system that inspired it trounces human Go players. Useful because the shape of a protein is of immense practical importance: it determines what the protein does and what other molecules can do to it. All the basic processes of life depend on what specific proteins do. Finding molecules that do desirable things to proteins (sometimes blocking their action, sometimes encouraging it) is the aim of the vast majority of the world’s drug development programmes.
Because of the importance of proteins’ three-dimensional structure there is an entire sub-discipline largely devoted to it: structural biology. It makes use of all sorts of technology to look at proteins through nuclear-magnetic-resonance techniques or by getting them to crystallise (which can be very hard) and blasting them with x-rays. Before AlphaFold over half a century of structural biology had produced a couple of hundred thousand reliable protein structures through these means. AlphaFold and its rivals (most notably a program made by Meta) have now provided detailed predictions of the shapes of more than 600m.
As a way of leaving scientists gobsmacked it is a hard act to follow. But if AlphaFold’s products have wowed the world, the basics of how it made them are fairly typical of the sort of things deep learning and generative AI can offer biology. Trained on two different types of data (amino-acid sequences and three-dimensional descriptions of the shapes they fold into) AlphaFold found patterns that allowed it to use the first sort of data to predict the second. The predictions are not all perfect. Chris Gibson, the boss of Recursion Pharmaceuticals, an AI-intensive drug-discovery startup based in Utah, says that his company treats AlphaFold’s outputs as hypotheses to be tested and validated experimentally. Not all of them pan out. But Dr Gibson also says the model is quickly getting better…
…A lot of pharma firms have made significant investments in the development of foundation models in recent years. Alongside this has been a rise in AI-centred startups such as Recursion, Genesis Therapeutics, based in Silicon Valley, Insilico, based in Hong Kong and New York, and Relay Therapeutics, in Cambridge, Massachusetts. Daphne Koller, the boss of Insitro, an AI-heavy biotech in South San Francisco, says one sign of the times is that she no longer needs to explain large language models and self-supervised learning. And Nvidia—which makes the graphics-processing units that are essential for powering foundation models—has shown a keen interest. In the past year, it has invested in or made partnership deals with at least six different AI-focused biotech firms, including Schrodinger, another New York-based firm, Genesis, Recursion and Genentech, an independent subsidiary of Roche, a big Swiss pharmaceutical company.
The drug-discovery models many of the companies are working with can learn from a wide variety of biological data including gene sequences, pictures of cells and tissues, the structures of relevant proteins, biomarkers in the blood, the proteins being made in specific cells and clinical data on the course of disease and effect of treatments in patients. Once trained, the AIs can be fine tuned with labelled data to enhance their capabilities.
The use of patient data is particularly interesting. For fairly obvious reasons it is often not possible to discover the exact workings of a disease in humans through experiment. So drug development typically relies a lot on animal models, even though they can be misleading. AIs that are trained on, and better attuned to, human biology may help avoid some of the blind alleys that stymie drug development.
Insitro, for example, trains its models on pathology slides, gene sequences, MRI data and blood proteins. One of its models is able to connect changes in what cells look like under the microscope with underlying mutations in the genome and with clinical outcomes across various different diseases. The company hopes to use these and similar techniques to find ways to identify sub-groups of cancer patients that will do particularly well on specific courses of treatment.
Sometimes finding out what aspect of the data an AI is responding to is useful in and of itself. In 2019 Owkin, a Paris-based “AI biotech”, published details of a deep neural network trained to predict survival in patients with malignant mesothelioma, a cancer of the tissue surrounding the lung, on the basis of tissue samples mounted on slides. It found that the cells most germane to the AI’s predictions were not the cancer cells themselves but non-cancerous cells nearby. The Owkin team brought extra cellular and molecular data into the picture and discovered a new drug target. In August last year a team of scientists from Indiana University Bloomington trained a model on data about how cancer cells respond to drugs (including genetic information) and the chemical structures of drugs, allowing it to predict how effective a drug would be in treating a specific cancer.
Many of the companies using AI need such great volumes of high quality data they are generating it themselves as part of their drug development programmes rather than waiting for it to be published elsewhere. One variation on this theme comes from a new computational sciences unit at Genentech which uses a “lab in the loop” approach to train their AI. The system’s predictions are tested at a large scale by means of experiments run with automated lab systems. The results of those experiments are then used to retrain the AI and enhance its accuracy. Recursion, which is using a similar strategy, says it can use automated laboratory robotics to conduct 2.2m experiments each week…
…The world has seen a number of ground breaking new drugs and treatments in the past decade: the drugs targeting GLP-1 that are transforming the treatment of diabetes and obesity; the CAR-T therapies enlisting the immune system against cancer; the first clinical applications of genome editing. But the long haul of drug development, from discerning the biological processes that matter to identifying druggable targets to developing candidate molecules to putting them through preclinical tests and then clinical trials, remains generally slow and frustrating work. Approximately 86% of all drug candidates developed between 2000 and 2015 failed to meet their primary endpoints in clinical trials. Some argue that drug development has picked off most of biology’s low-hanging fruit, leaving diseases which are intractable and drug targets that are “undruggable”.
The next few years will demonstrate conclusively whether AI is able to materially shift that picture. If it offers merely incremental improvements, that could still be a real boon. If it allows biology to be deciphered in a whole new way, as the most boosterish suggest, it could make the whole process far more successful and efficient—and drug the undruggable very rapidly indeed. The analysts at BCG see signs of a fast-approaching AI-enabled wave of new drugs. Dr Pande warns that drug regulators will need to up their game to meet the challenge. It would be a good problem for the world to have.
Disclaimer: None of the information or analysis presented is intended to form the basis for any offer or recommendation. We currently have a vested interest in Alphabet (parent of Google DeepMind), ASML, and TSMC. Holdings are subject to change at any time.