What We’re Reading (Week Ending 22 December 2024)

Reading helps us learn about the world and it is a really important aspect of investing. The late Charlie Munger even went so far as to say that “I don’t think you can get to be a really good investor over a broad range without doing a massive amount of reading.” We (the co-founders of Compounder Fund) read widely across a range of topics, including investing, business, technology, and the world in general. We want to regularly share the best articles we’ve come across recently. Here they are (for the week ending 22 December 2024):

1. Meet Willow, our state-of-the-art quantum chip – Hartmut Neven

Errors are one of the greatest challenges in quantum computing, since qubits, the units of computation in quantum computers, have a tendency to rapidly exchange information with their environment, making it difficult to protect the information needed to complete a computation. Typically the more qubits you use, the more errors will occur, and the system becomes classical.

Today in Nature, we published results showing that the more qubits we use in Willow, the more we reduce errors, and the more quantum the system becomes…

…This historic accomplishment is known in the field as “below threshold” — being able to drive errors down while scaling up the number of qubits…

…There are other scientific “firsts” involved in this result as well. For example, it’s also one of the first compelling examples of real-time error correction on a superconducting quantum system — crucial for any useful computation, because if you can’t correct errors fast enough, they ruin your computation before it’s done. And it’s a “beyond breakeven” demonstration, where our arrays of qubits have longer lifetimes than the individual physical qubits do, an unfakable sign that error correction is improving the system overall.

As the first system below threshold, this is the most convincing prototype for a scalable logical qubit built to date. It’s a strong sign that useful, very large quantum computers can indeed be built…
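A toy model helps make "below threshold" concrete. In surface-code error correction, the logical error rate is expected to fall by a roughly constant factor each time the code distance is increased, provided physical error rates are below the threshold. The sketch below uses made-up numbers (the starting error rate and the suppression factor are assumptions), not Willow's measured data:

```python
# Toy model of "below threshold" error suppression (illustrative numbers only).
# In a surface code, each step up in code distance d (3 -> 5 -> 7 ...) uses more
# physical qubits; below threshold, the logical error rate shrinks by a roughly
# constant factor with each step.

def logical_error_rate(base_rate: float, suppression_factor: float, distance: int) -> float:
    """Logical error rate at code distance `distance`, assuming geometric
    suppression from a distance-3 baseline. Purely illustrative."""
    steps = (distance - 3) // 2
    return base_rate / (suppression_factor ** steps)

base = 3e-3   # assumed logical error rate at distance 3
lam = 2.0     # assumed suppression factor per distance step (below threshold => > 1)

for d in (3, 5, 7, 9, 11):
    print(f"distance {d:2d}: ~{logical_error_rate(base, lam, d):.1e} errors per cycle")
```

If the physical error rate were above threshold, that factor would be below 1 and adding qubits would make the logical error rate worse, which is why crossing the threshold is the milestone that matters.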

…As a measure of Willow’s performance, we used the random circuit sampling (RCS) benchmark. Pioneered by our team and now widely used as a standard in the field, RCS is the classically hardest benchmark that can be done on a quantum computer today…

…Willow’s performance on this benchmark is astonishing: It performed a computation in under five minutes that would take one of today’s fastest supercomputers 10^25, or 10 septillion, years. If you want to write it out, it’s 10,000,000,000,000,000,000,000,000 years. This mind-boggling number exceeds known timescales in physics and vastly exceeds the age of the universe. It lends credence to the notion that quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse, a prediction first made by David Deutsch…
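To put that figure in perspective, here is the back-of-the-envelope arithmetic, taking the age of the universe as roughly 13.8 billion years:

```python
# Back-of-the-envelope comparison of the claimed classical runtime with a familiar timescale.
classical_runtime_years = 1e25       # the "10 septillion years" figure above
age_of_universe_years = 13.8e9       # ~13.8 billion years

ratio = classical_runtime_years / age_of_universe_years
print(f"Ratio to the age of the universe: {ratio:.1e}")
# -> roughly 7e14, i.e. about 700 trillion times the age of the universe
```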

…Willow was fabricated in our new, state-of-the-art fabrication facility in Santa Barbara — one of only a few facilities in the world built from the ground up for this purpose. System engineering is key when designing and fabricating quantum chips: All components of a chip, such as single and two-qubit gates, qubit reset, and readout, have to be simultaneously well engineered and integrated. If any component lags or if two components don’t function well together, it drags down system performance…

…The next challenge for the field is to demonstrate a first “useful, beyond-classical” computation on today’s quantum chips that is relevant to a real-world application. We’re optimistic that the Willow generation of chips can help us achieve this goal. So far, there have been two separate types of experiments. On the one hand, we’ve run the RCS benchmark, which measures performance against classical computers but has no known real-world applications. On the other hand, we’ve done scientifically interesting simulations of quantum systems, which have led to new scientific discoveries but are still within the reach of classical computers. Our goal is to do both at the same time — to step into the realm of algorithms that are beyond the reach of classical computers and that are useful for real-world, commercially relevant problems.

2. X (previously Twitter) thread on quantum computing and Google’s Willow – Jeffrey Scholz

Like a regular computer, a quantum computer keeps bits in groups. So a 64-bit quantum computer would have a vector of 64 2D vectors serving as its “word.”

Here is where the speedup happens: in a regular computer, each of the 64 bits knows nothing about the value of any of the other bits.

If we want one bit to affect another bit, we have to explicitly combine them with a logic gate.

However, in a quantum computer, each of the 64 qubits can “talk to each other” via “quantum entanglement.”

Running a quantum circuit means you plug in a quantum vector, run it through a bunch of matrix multiplications, then collapse the output.

The final vector will be the correct answer. Technically, quantum computers can give wrong answers, but if you run the computation multiple times, then you will get the correct answer on average…
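A tiny state-vector simulation makes the "vector in, matrix multiplications out, then collapse" description concrete. This is a generic two-qubit example in NumPy, written for illustration and unrelated to Google's actual circuit:

```python
import numpy as np

# A 2-qubit state is a vector of 2**2 = 4 complex amplitudes.
state = np.zeros(4, dtype=complex)
state[0] = 1.0                      # start in |00>

# Gates are unitary matrices; a circuit is just a sequence of matrix multiplications.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.kron(H, I) @ state       # Hadamard on the first qubit
state = CNOT @ state                # entangle the two qubits

# "Collapsing the output" = sampling a bitstring with probability |amplitude|^2.
probs = np.abs(state) ** 2
samples = np.random.choice(4, size=1000, p=probs)
counts = np.bincount(samples, minlength=4)
print(dict(zip(["00", "01", "10", "11"], counts)))   # roughly 50/50 between 00 and 11
```

Each individual sample is random; it is only the distribution over many runs that encodes the answer, which is why repeated runs are needed.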

…The current problem with quantum computers is that as the circuit gets bigger, they become less correct on average. All of the “talking to each other” creates so much noise the system stops working.

Once your probability of being correct drops below a certain threshold your quantum computer becomes useless. This is a major blocker for current quantum compute.
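The "run it multiple times" fix only works while each run is right more often than chance. A minimal simulation (with made-up per-run accuracies and a simplified binary outcome) shows how repetition rescues a slightly noisy computer but not one that has fallen to chance level:

```python
import numpy as np

rng = np.random.default_rng(0)

def majority_vote_success(per_run_accuracy: float, runs: int, trials: int = 10_000) -> float:
    """Fraction of trials in which the majority of `runs` noisy repetitions is correct.
    Assumes a binary outcome, purely for illustration."""
    correct = rng.random((trials, runs)) < per_run_accuracy
    return float(np.mean(correct.sum(axis=1) > runs / 2))

for acc in (0.7, 0.55, 0.5):   # 0.5 = no better than a coin flip
    print(acc, [round(majority_vote_success(acc, n), 3) for n in (1, 11, 101)])
# Accuracies above 0.5 improve rapidly with repetition; at 0.5, repetition does not help at all.
```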

Let’s look at a specific (oversimplified but helpful) example. Suppose you shine a laser beam into an ice cube.

Actually simulating what the laser will do when it exits the ice cube is very hard because quantum phenomena are involved.

To actually compute what the laser will do means you have to explicitly compute quantum entanglement, which is slow for classical computers but “built in” to a quantum computer.

However, you can *estimate* the distribution of how the laser will scatter without a quantum computer, so you can have at least a rough idea if your answer might be correct…

…By analogy, this is what Google was doing. The computation Google was doing was a “pseudo-random quantum circuit” (think pseudorandom ice cube), but we know a quantum circuit is just matrix multiplications (on crack). Therefore, it is a bunch of random matrix multiplications with an output that looks right.

Google’s actual breakthrough was that the output of the circuit “looks correct” — which sounds underwhelming — and compared to the headlines, it definitely is. The academic breakthrough is that Google was able to use a larger circuit and notice an apparent *increase* in accuracy when modeling how a laser shines through an ice cube. That is noteworthy.

You can definitely tell if a computation has failed, and it seemed to be failing less as the circuit got bigger…

…However, note that the problem is “rigged” in favor of quantum computers. The benchmark is explicitly modeling a quantum phenomenon, so *of course* we get a speedup.

In other words, Google created a random distribution on the output that “seems correct.” Why does it “seem correct”? Well, because by design, the computation cannot be run on a classical computer. But if we can’t run it on a classical computer, how do we know the quantum computer is actually giving the right answer? The answer is we don’t, and this is a serious gap…

…Quantum computing is kind of at the stage right now where some smart teenager wired a few logic gates together in a random fashion and said “hey look, my circuit made a random output and didn’t explode!” Compared to previous attempts, it is an improvement. But he is still a long way from training an LLM.

3. Volatility: A Double-Edged Sword for Long-Term Equity Investors – Daniel Crowley

The ability to measure risk in a portfolio has long been a puzzle for the financial world. When Harry Markowitz introduced Modern Portfolio Theory in 1952, he revolutionized how institutions approached risk and return. His use of standard deviation as a proxy for volatility offered a clean, mathematical way to quantify the unpredictability of markets. It gave investors a seemingly precise tool to compare assets and assess portfolio risk. Over time, this approach became gospel, with concepts like beta and the Sharpe ratio reinforcing volatility as the core measure of risk.

But here’s the problem: volatility tells only part of the story. Financial markets don’t follow the neat patterns of a normal distribution, which is what these models assume. Extreme events occur far more often than traditional models predict. We’ve seen this play out time and again—from the collapse of Long-Term Capital Management to the Great Financial Crisis. The models couldn’t account for the market’s tendency to behave irrationally and with far greater extremes than the math suggested. That’s why I’ve come to view volatility not as risk itself but as a signal, an invitation to investigate further…

…Volatility is often misunderstood because it treats upward and downward price movements as equal. A stock with erratic upward swings may have high volatility but poses little risk if the business fundamentals are sound. Conversely, a stock that steadily declines might appear “safe” on paper but can quietly destroy wealth.

The market’s reliance on volatility as a measure of risk often misses these nuances.
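A quick illustration of the point that standard deviation treats gains and losses symmetrically. The two hypothetical return series below are made up for this sketch: one swings upward erratically, the other declines steadily, yet the rising one registers far higher "volatility":

```python
import numpy as np

# Two hypothetical monthly return series, purely illustrative.
erratic_up = np.array([0.12, -0.04, 0.15, -0.06, 0.18, -0.03, 0.20, -0.05])
steady_down = np.array([-0.02] * 8)

for name, returns in [("erratic but rising", erratic_up), ("steadily declining", steady_down)]:
    total = np.prod(1 + returns) - 1       # cumulative return over the period
    vol = np.std(returns, ddof=1)          # standard deviation, i.e. "volatility"
    print(f"{name:20s} total return {total:+.1%}   volatility {vol:.1%}")
# The rising series looks far "riskier" by standard deviation even though it compounds
# wealth, while the declining series looks calm while quietly destroying it.
```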

This misunderstanding creates a divide among investors. On one side are those who cling to volatility as the ultimate arbiter of risk, building models that rely on neat equations and assumptions about market behavior. On the other are those who dismiss it entirely, treating volatility as irrelevant noise.

My view lies somewhere in the middle. Volatility is neither good nor bad—it’s just a clue. It’s a signal to dig deeper and assess whether the market’s movements are justified by changes in a business’s intrinsic value.

What I’ve come to appreciate about volatility is its ability to surface opportunity. Markets are emotional, driven by fear, greed, and short-term thinking. Prices frequently diverge from reality, creating moments where high-quality businesses are available at steep discounts. When markets panic, as they did during the COVID-19 pandemic or the Great Financial Crisis, those who can stay calm and look beyond the noise can identify extraordinary opportunities.

Volatility, far from being a risk, is often the price of admission for outsized returns.

4. The AI nuclear renaissance – SMRs role – Rihard Jarc

Nuclear power supplies about 10% of global electricity (a market of roughly $350-$400B annually) and around 32% of zero-carbon electricity generation.

As of 2023, nuclear energy accounted for about 18.6% of total electricity generation in the United States. The International Energy Agency (IEA) highlights that global nuclear power output must more than double by 2050 to meet net-zero emission targets. Most of the U.S.’s nuclear power plants are over 50 years old and nearing the end of their operational lives. While their lifespans have been extended to support the grid, they will need to be replaced in the coming decades…

…The introduction of ChatGPT and the AI boom of the last two years have only accelerated this demand, as AI workloads and AI chips consume much more energy than traditional data center workloads. This Nuclear Energy expert gives a good example:

» If you provide a simple search in Google, you consume 0.3 W per hour of electricity. If you do the same with ChatGPT or Alexa or Gemini, any AI that we can imagine, this 0.3 W transforms into 2.9 W, so it means 10X the consumption.«…

…Driven by artificial intelligence (AI), cloud computing, and digital transformation, U.S. data centers consumed an estimated 150 TWh of electricity in 2023, equivalent to around 3% of the nation’s power demand. According to Goldman Sachs estimates, global data center demand hovered at 340 TWh in 2023, which is about 1.3% of worldwide electricity use. U.S. data center power use is expected to roughly triple between 2023 and 2030 and will require about 47 gigawatts of new generation capacity…
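The quoted figures translate into grid-scale numbers fairly directly. The conversions below use the article's estimates (150 TWh in 2023, roughly tripling by 2030) plus the per-query energy figures cited earlier; this is back-of-the-envelope arithmetic, not a forecast:

```python
# Back-of-the-envelope conversions using the figures quoted in the article.
HOURS_PER_YEAR = 8_760

us_dc_2023_twh = 150            # estimated US data-center consumption, 2023
us_dc_2030_twh = 150 * 3        # "expected to roughly triple" by 2030

avg_power_2023_gw = us_dc_2023_twh * 1e12 / HOURS_PER_YEAR / 1e9
avg_power_2030_gw = us_dc_2030_twh * 1e12 / HOURS_PER_YEAR / 1e9
print(f"Average draw: {avg_power_2023_gw:.0f} GW in 2023 -> {avg_power_2030_gw:.0f} GW in 2030")

# Per-query energy quoted above: ~0.3 Wh for a search vs ~2.9 Wh for an AI query.
ai_queries_per_twh = 1e12 / 2.9   # 1 TWh = 1e12 Wh
print(f"1 TWh covers roughly {ai_queries_per_twh:.1e} AI queries at 2.9 Wh each")
```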

…Nuclear energy has become very attractive because companies want to be carbon-neutral and have stable power. An additional benefit of nuclear power is that it can provide more stable long-term contracts that are less sensitive to inflation and supply chain problems…

…Interest in nuclear energy, particularly Small Modular Reactors (SMRs), is growing as they have been heralded as a solution to streamline nuclear power production, offering flexibility, lower upfront costs, and modular deployment. The simplest way to imagine an SMR is as a smaller version of a traditional nuclear reactor. One of their most significant benefits is that they are modular. They are designed to be built in factories, not on-site. Because they are built in factories, they are easier to assemble and control, from quality checks to a more predictable supply chain and workforce. Once assembled, they are shipped to the site of the nuclear plant, where they are stacked together to form the whole plant. In terms of energy output, traditional nuclear plants have outputs between 1,000 and 1,600 megawatts electric (MWe) per reactor, while SMRs are around 50-300 MWe per module. Some SMRs are also said to be safer due to passive safety features, which rely on natural processes like convection to prevent meltdowns in emergencies. But they also come with cons. The primary one is that they are much smaller than traditional nuclear plants, so they do not have the cost benefits of economies of scale. Because of that, producing the same amount of energy is more expensive than at a traditional nuclear plant…
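To get a feel for the scale difference, here is the simple arithmetic on how many SMR modules it would take to match the output of one traditional reactor, using the ranges quoted above:

```python
# How many SMR modules match one traditional reactor, using the quoted output ranges.
traditional_mwe = (1_000, 1_600)   # per traditional reactor
smr_mwe = (50, 300)                # per SMR module

low = traditional_mwe[0] / smr_mwe[1]    # best case: large modules, small plant
high = traditional_mwe[1] / smr_mwe[0]   # worst case: small modules, large plant
print(f"Roughly {low:.0f} to {high:.0f} SMR modules per traditional reactor")
# -> roughly 3 to 32 modules, which is why per-MWh economics matter so much.
```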

…Over 25 countries, according to the International Atomic Energy Agency (IAEA), are investing in SMRs. In March, Wood Mackenzie estimated the pipeline of SMR projects was worth more than $176 billion and that SMRs could account for as much as 30% of the global nuclear fleet by 2050…

…We can look at the example of NuScale, which has its Pressurised Water Reactor design. Their levelized cost of electricity ranges from $89-135/MWh, while traditional nuclear plants are in the $110-160/MWh range. However, the most common alternative for data centers is a combination of solar and gas: gas costs $45-70/MWh, and solar plus storage costs $30-60/MWh…
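Those per-MWh differences add up quickly at data-center scale. The sketch below prices a hypothetical 100 MW facility running around the clock at the midpoint of each quoted LCOE range; the facility size is an assumption chosen purely for illustration:

```python
# Annual energy cost of a hypothetical 100 MW, always-on data center
# at the midpoint of each quoted LCOE range (illustrative only).
HOURS_PER_YEAR = 8_760
facility_mw = 100                            # assumed facility size
annual_mwh = facility_mw * HOURS_PER_YEAR    # 876,000 MWh per year

lcoe_midpoints = {
    "NuScale SMR ($89-135/MWh)":          (89 + 135) / 2,
    "Traditional nuclear ($110-160/MWh)": (110 + 160) / 2,
    "Gas ($45-70/MWh)":                   (45 + 70) / 2,
    "Solar + storage ($30-60/MWh)":       (30 + 60) / 2,
}

for source, price in lcoe_midpoints.items():
    print(f"{source:38s} ~${annual_mwh * price / 1e6:.0f}M per year")
```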

…State-backed projects in countries like China and Russia have made more progress, leveraging integrated supply chains, controlled costs, and assured revenue streams. But even for them, the costs to build these reactors have turned out much higher than the first estimates…

…We must also face reality: only two SMRs are operational right now, one in Russia and the other in China.

Another important topic when assessing nuclear energy is the problem of nuclear waste and its storage. Most SMR designs produce a similar amount of nuclear waste per unit of output as traditional nuclear plants, so the problem of storing nuclear waste remains.

5. How to invest without relying on target prices – Chin Hui Leong

The US stock market is soaring to new heights. But what does that mean for your stock returns in 2025? I would like to give you a definite answer but if I did so, I would be lying to you. In fact, you should view anyone who gives you target prices with suspicion.

Here’s the hard truth: No one can control where the market is headed in the short term. Yet, the allure of target prices persists…

…The answer lies in the inherent difficulty in predicting the future of rapidly evolving technologies.

The best example is Amazon.com. In mid-2010, when I first invested in the company, it had just reported US$24.5 billion in annual revenue, primarily from its online retail business. Here is the twist: it was impossible to know what the business would look like a decade later…

…Fast forward to 2023, and AWS had become a financial cash cow with nearly US$90 billion in annual revenue and an impressive US$24.6 billion in operating income. In other words, AWS, an insignificant division back in 2009, had generated more operating income in 2023 than the entire company’s revenue in 2009…

…I like to go back to the reason why valuation is used in the first place: to reduce your investment risk. The way I see it, valuation is one of the many ways you can employ to manage risk. But valuation is not the only risk in investing.

A weak, shrinking business can pose risks that no amount of stock valuation can solve. Hence, starting with high-quality businesses is my preferred approach.


Disclaimer: None of the information or analysis presented is intended to form the basis for any offer or recommendation. We currently have a vested interest in Alphabet (parent of Google) and Amazon. Holdings are subject to change at any time.

Ser Jing & Jeremy
thegoodinvestors@gmail.com