What We’re Reading (Week Ending 08 February 2026)

Reading helps us learn about the world, and it is a really important aspect of investing. The late Charlie Munger even went so far as to say that “I don’t think you can get to be a really good investor over a broad range without doing a massive amount of reading.” We (the co-founders of Compounder Fund) read widely across a range of topics, including investing, business, technology, and the world in general. We want to regularly share the best articles we’ve come across recently. Here they are (for the week ending 08 February 2026):

1. Software Is Dead. Long Live Software – Eugene Ng

SaaS stocks have declined significantly since October 2025 amid broader concerns that software is in decline, being disrupted, displaced, and replaced by AI. Companies will use AI to redesign and unbundle their workflows over time, and the markets are effectively pricing in a software apocalypse.

The selloff has been almost indiscriminate, and the market is overly pessimistic…

…Software is a digital tool. It does not make sense to keep reinventing tools (e.g., a calculator or a hammer). If there are new tasks that have not yet been automated and can now be automated with software, now is the best time to do so. Software is a TAM accelerator, and companies can create more new products in shorter time frames.

The future appears to be agentic, with agents constituting the new digital workforce for humans, working for us and with other agents on exploratory, low-value, and repetitive tasks, thereby allowing us to focus on higher-value creative and strategic tasks.

The fact that everyone has a pen or a keyboard does not mean that we will have a rush of great writers, authors, or coders. The best work will still be done by the select minority, not the vast majority. Writing code is easy. Shipping a basic V1 is just 1% of the work. 99% of building enterprise software is about writing code that actually works and keeps working, maintaining it, iterating on it, securing it, and scaling it, and that is where the real difficulties lie. Vibe coding might be incredible for prototypes, internal tools, and new products, but it is not replacing a proven tool.

It is the same with AI. It does not mean that, if one can code faster with AI assistants, one can write great code or develop a great product. It still requires deep understanding, intent, judgment, and taste. And that’s where the bottleneck lies. Try getting a first-year coder to “vibe-code” a massive CRM database, and you will soon realise that it is not as easy as it sounds. Automation scales whatever structure already exists. Agents tend to work best when intent is explicit and stable, and struggle when it is implicit and judgment-intensive.

SaaS is heterogeneous, not homogeneous. One cannot simply be lazy and lump everything into a single category of thought. The idea that enterprises will dump all their existing software and “vibe-code” their own replacements with AI agents is wildly optimistic. Larger, more complex SaaS platforms with substantial codebases, deep workflows, extensive API connectors/regulatory licenses, strong network effects, and extensive hardware infrastructure are likely to be more insulated.

Deterministic systems, where precision is critical, non-negotiable, and must be 100% right all the time, are likely to be more insulated, as “close enough” is simply unacceptable. Probabilistic systems, conversely, tend to tolerate some errors and accept good-enough performance, and are primarily focused on pattern recognition, content generation, basic automation, and simple decision-making. If an LLM can replicate your probabilistic product with 90% of the quality at 10% of the cost, you likely no longer have a sound business model. Even a great UI or UX won’t save you.

High-value, mission-critical, must-have software is likely to be more insulated than low-value, non-mission-critical, good-to-have software. Functions such as cybersecurity, payments, and infrastructure are likely to remain robust, because when these go down, the business stops. Customers should continue to be willing to pay premium prices for quality and peace of mind, remain highly sticky, and rarely switch because the cost of failure is too high. Such products tend to have high gross retention (customers don’t leave) and high net retention (customers spend more over time), and customers are willing to pay more as their business grows.

2. The Utilities Analyst Who Says The Data Center Demand Story Doesn’t Add Up (Transcript here) – Tracy Alloway, Joe Weisenthal, and Andy DeVries

Tracy: Interesting. One of the reasons we wanted to talk to you is because you have that contrarian take on the data center build-out, and we wrote it up in the Odd Lots newsletter, which everyone should subscribe to. It got a lot of attention. Your analysis, interestingly, is just based on some pretty simple math. So, just to start out with, why don’t you walk us through the calculations that you’re actually making to try to analyze how much capacity the utilities are taking on to actually power data centers?

Andy: As you said, it’s pretty simple math here. So data centers now are consuming around 45 GW of power. And you can switch between capacity and throughput – I’m going to stick with capacity. So 45 GW of power. And then there are lots and lots of third-party estimates for where they’re going to be in 2030, and they are centered around 90 GW to 95 GW. So you need to add 50 GW. For 2035, there are a lot fewer estimates; you come out around 160 GW. These estimates are all over the place – they come from sell-side banks, they come from consultants, they come from everyone. BNEF has one. They’re, I think, one of the best out there.

Joe: Thank you.

Andy: We use them a lot. So that’s on the demand side on where you’re going to come out on these. Then you look at the supply – and everyone talks about the demand right – but then you look at the supply and all these tech bros are too cool to actually look at the supply and do utility analysis. Who wants to be a utility analyst? You were making fun of us before. So you look at the supply and these utilities are tracking all these data centers connecting to the grid because they’ve got to do a lot of work. Spend a lot of money on transmission, distribution, new substations, transformers, it’s a lot of work. But it boosts their earnings growth so they’re happy to talk about this. You look at where they’re at and where they see things coming, they’ve got around 140 GW of near-term supply. Kudos to the utilities, they break out what’s firm, committed, signed, contracted, versus pipeline behind it. Because there’s a lot of double, triple, quadruple counting. If you’re going to build a data center in the Southeast, you’re going to tell Duke, you’re going to tell Southern, you’re going to tell Dominion, you’re going to build one. So that’s the pipeline potential. But looking just at the firm, committed, whatever they want to call it, around 140 GW.

Now you’ve got to PUE-adjust that. When you connect a data center to the grid, you’ve got lights, you’ve got cooling. Those third-party estimates I gave you are just for raw compute.

Tracy: Why did you split those out though? All data centers are going to need to be cooled down, right? What’s the point of splitting it out?

Andy: I’m not splitting it up. I’m just adjusting it downward, because the third-party estimates are just compute. When you’re connecting to the grid, you’re going to ask for the lights, the cooling, and everything. I want to go apples to apples versus the third-party estimates.

Joe: What does PUE stand for? 

Andy: Power usage effectiveness. So they’re at 140 GW, and that power comes down to 110 GW on an apples-to-apples basis. Just to go back, you only need 50 GW on the demand side between now and 2030. The utilities are working on connecting 110 GW, so they are already working on connecting almost as much as you need by 2035. Again, just to make sure we are on the same page: third-party estimates put data centers at 45 GW now, going to 95 GW. That’s 50 GW. Utilities are working on 110 GW. They don’t give timing for that; some of it’s going to be past 2030. What I’m trying to say is there is a lot of supply of data centers coming and it’s very unclear if there’s going to be demand for this…

…Tracy: The wild card to me seems to be the demand forecasts. We’re already seeing those change pretty wildly. I know you mentioned Bloomberg NEF – they’ve raised their forecast, because of the data center buildout. They’ve raised their forecast of how much energy is actually needed. How much confidence do you have in those demand numbers, and how could they change over time?

Andy: Moderate confidence. Look at where we are right now. OpenAI built all of ChatGPT using 2 GW. All the big tech hyperscalers haven’t given their 2025 volumes yet, but if you take their 2024 volumes and double them – and this is output, so I’m going to transfer it back to capacity – and you assume a 60% capacity factor, all the hyperscalers combined come to around 15 GW. That’s got to be over half the data center demand. To talk about 95 GW – it’s a staggering number. Then you get more advances in Nvidia chip efficiency – obviously Jevons Paradox kicks in, you’ve had numerous guests talk about that – it’s just a lot of power.

Tracy: Can you just remind us 1 GW is enough to power what? I like these comparisons.

Andy: A million homes. It depends if you’re in Florida or the northeast. But generally speaking, that’s where you’re at…

…Andy: But then you don’t need as many new power plants as everyone’s saying. Constellation’s CEO said this on a call the other day. He said, “Use the Texas market.” He said, “87 GW peak market, you could add 10 GW to Texas tomorrow, which would be the equivalent of sending every single Nvidia chip for an entire year to Texas and running them 24/7. That’s 10 GW. You could run it right now, existing grid, existing plants, for all but 40-50 hours a year.” We stress tested it. There are some coal plants that could ramp up capacity factor. There are plenty of gas plants that can. I don’t know if it’s 40 hours, 100 hours, 140 hours, but it makes more sense to pay someone else not to run their chemical plant or refinery for 40-50 hours a year, rather than have the utilities go out and spend $10 billion connecting faraway wind farms. That’s the argument. We’ve come out in the middle of it, but there is plenty of existing capacity on the grid that could ramp up to meet it. Then, as other guests on Odd Lots have pointed out, the peak demand of the grid is 850 GW. The overall size of the grid is around 1,200 GW, and then you’re adding 50 GW a year of solar, and then you’re going to start adding 20 GW of gas. We’re going to handle it. I’m not really worried about any brownouts or anything.
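
To make the arithmetic in the excerpt above easier to follow, here is a minimal back-of-envelope sketch using the round numbers quoted in the conversation. The PUE value is simply the one implied by the 140 GW to roughly 110 GW adjustment, and the ~79 TWh hyperscaler energy input is a hypothetical figure chosen only so the output lands near the ~15 GW DeVries cites; neither is an official statistic.

```python
# Back-of-envelope sketch of the demand-vs-supply arithmetic in the excerpt.
# Round numbers come from the transcript; assumptions are noted in comments.

demand_now_gw = 45        # estimated data-center compute load today
demand_2030_gw = 95       # rough midpoint of third-party 2030 estimates
demand_2035_gw = 160      # rough 2035 estimate

supply_firm_gw = 140      # firm/committed grid connections utilities report
assumed_pue = 1.27        # implied by the 140 GW -> ~110 GW adjustment
supply_compute_gw = supply_firm_gw / assumed_pue   # ~110 GW, apples to apples

added_need_2030_gw = demand_2030_gw - demand_now_gw   # ~50 GW
added_need_2035_gw = demand_2035_gw - demand_now_gw   # ~115 GW

print(f"Extra compute needed by 2030: ~{added_need_2030_gw} GW; by 2035: ~{added_need_2035_gw} GW")
print(f"Compute the utilities are already working on: ~{supply_compute_gw:.0f} GW")

# The later hyperscaler sizing: converting annual energy output back to
# capacity with the 60% capacity factor Andy assumes. The ~79 TWh input is
# a hypothetical figure chosen only so the result lands near his ~15 GW.
hyperscaler_energy_twh = 79
capacity_factor = 0.60
hours_per_year = 8760
implied_capacity_gw = hyperscaler_energy_twh * 1000 / (hours_per_year * capacity_factor)
print(f"Implied hyperscaler capacity: ~{implied_capacity_gw:.0f} GW")
```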

3. Incentives > Intelligence: The Real Barrier(s) to Agentic AI – Abdullah Al-Rezwan

Such a “disingenuous yet clever” strategy actually offers a good glimpse of the barriers to agentic AI’s adoption. While most of us focus too much on the technical capabilities of AI, we may still be underestimating the challenges related to incumbents’ (lack of) incentives, as well as the legal frameworks needed for agentic AI to flourish. “Ghosts of Electricity” had a very good piece explicitly laying out a couple of real headaches:

“we highlight two main obstacles that stand in the way of AI agents becoming true digital partners. The first has to do with the design of the internet itself – the interface of nearly every website was meticulously optimized for humans. But what works for humans does not necessarily work for AI agents. Until AI can truly emulate every aspect of a human being, we will likely need to design a parallel internet for agentic commerce to work. But there are reasons to suspect that this will not happen soon: some firms have little to gain, and potentially much to lose, from investing in and facilitating a machine-readable web. This leads us to the second obstacle, which is even simpler: many use-cases for AI agents are illegal, or at least legally ambiguous. The rights around AI agents need to be clarified and developed in order for agents to participate meaningfully in economic transactions and interactions.”

In the piece, they substantiated these headaches with a couple of examples. Some excerpts below:

“Let’s say you tell your favorite AI tool (ChatGPT Atlas, Perplexity Comet, Claude, Gemini Antigravity) to purchase a concert ticket for you or to shop on Amazon. Take seat selection. The agent reaches the seat map and gets stuck because it can’t tell what’s actually available or what counts as a “good” choice. The map isn’t a simple list: seats change color when you hover, prices only appear after clicking, and availability updates every second as other people buy tickets. While the agent pauses to figure out what to do, the seat disappears, the page refreshes, and it loses its place. Every pause – waiting for pages to load, retrying after errors, handing control back to you – adds friction. What takes a human a few minutes to do turns into a brittle, ten-minute ordeal…”

4. The Slow Singularity – Abdullah Al-Rezwan

To understand why the future might be sluggish, the authors first had to decode the past. In a methodological twist that fits the subject matter perfectly, they employed OpenAI’s Deep Research to dig through economic history and construct a dataset of 150 essential tasks over the last century. This analysis revealed a counterintuitive “Zero Productivity Paradox”: switching a task from labor to capital contributes zero to Total Factor Productivity (TFP) growth at the exact moment it happens, because firms switch exactly when the costs of the two are equal. The growth comes entirely from what happens after the switch: the task is now performed by a machine that improves exponentially faster than a human.

They estimate that while machine productivity on automated tasks grows at a blistering 5% annually, human task efficiency grows at a meager 0.5%, and in some sectors human efficiency appears to be declining. To prove how vital this dynamic is, they calculated a “frozen” counterfactual: if we had stopped automating new tasks in 1950, but allowed computers to keep getting faster at the things they were already doing, US economic growth would have essentially flatlined for the last 70 years…
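
To see why that “frozen” counterfactual matters so much, here is a minimal sketch that simply compounds the two growth rates quoted above over the 70-year horizon of the 1950 thought experiment; the starting level of 1x is arbitrary, and only the rates come from the article.

```python
# Compounding the two growth rates cited in the piece over 70 years.
years = 70
machine_factor = 1.05 ** years    # automated task, ~5% annual improvement
human_factor = 1.005 ** years     # human-performed task, ~0.5% annual improvement

print(f"Automated task productivity after {years} years: ~{machine_factor:.0f}x")   # ~30x
print(f"Human task productivity after {years} years: ~{human_factor:.1f}x")         # ~1.4x
```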

…The same logic explains why the AI “singularity” is likely to be a slow burn rather than an explosion. The economy operates on a “weak link” principle. Production requires a chain of complementary tasks; you need high-speed coding, but you also need management, legal compliance, physical logistics, and so on. Because these tasks are interlinked, the economy is constrained by its slowest components. Even if AI automates cognitive tasks with infinite speed, total output remains bottlenecked by the essential tasks that still require slow-improving human labor.
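
Here is a minimal sketch of the “weak link” idea, modelling the chain with a min()-style production function (our own simplification, not necessarily the paper’s exact specification) and purely illustrative task throughputs.

```python
# "Weak link" production: total output moves at the pace of the slowest
# complementary task. Task names and throughputs are illustrative only.
def chain_output(task_speeds: dict[str, float]) -> float:
    return min(task_speeds.values())   # bottlenecked by the slowest link

baseline = {"coding": 1.0, "management": 1.0, "legal": 1.0, "logistics": 1.0}
with_ai = {"coding": 100.0, "management": 1.1, "legal": 1.05, "logistics": 1.0}

print(chain_output(baseline))  # 1.0
print(chain_output(with_ai))   # still 1.0: 100x faster coding, but logistics binds
```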

5. The Hidden Book Value of Community Banks: Why Call Reports Matter More Than Public Financials – Dirt Cheap Banks

Call Reports exist for safety and soundness, not for investors. They are not designed to be friendly, summarized, or marketed. They are designed to tell regulators whether a bank can survive stress, fund itself, and absorb losses. That is exactly why they are so valuable.

The first thing to understand is structure. When you buy stock in a small community bank, you are almost always buying the holding company, not the bank itself. The holding company often has no real operations. It owns one asset, the bank. It might have a little cash, maybe some legal expenses, sometimes holding company debt, but that is it. The bank owns the loans, the deposits, the securities, the real estate, and the earnings power…

…Public financial statements typically show the holding company only, and often only once a year…

…Call Reports are different. They are filed quarterly by the bank itself. They show the full balance sheet, income statement, and capital position of the operating bank. If the bank earns money and retains it, equity goes up in the Call Report immediately, whether or not a dividend is paid to the parent. If securities move and AOCI changes, you see it. If credit costs rise, you see it. If loan growth accelerates, you see it.

…When people ask which book value is the real one, the answer from decades of bank investing is simple. The bank-level equity in the Call Report is the economic book value. That is what generates earnings. That is what a buyer would pay for in a sale. That is what regulators protect. The parent-level equity is just an accounting wrapper…

…West Shore Bank Corporation $WSSH is a textbook case of how public financials can materially misstate economic reality for small community banks, and why Call Reports create an information advantage…

…At December 31, 2024, the consolidated balance sheet shows:

Total stockholders’ equity of approximately $48.2 million.

This is the number scraped by data aggregators. It is the number displayed on OTC Markets. It is the number most investors implicitly anchor to when thinking about book value.

With a current market capitalization of roughly $45 million, West Shore appears to be trading at or near book value based on these public financials. To a casual observer, the stock looks fairly valued. There is no obvious discount screaming off the page…

…In the Call Report, under Total bank equity capital, the number is dramatically higher.

As of the most recent Call Report dated 9/30/2025, total bank equity capital is approximately $73 million.

This is the capital base regulators use to determine whether the bank is well capitalized. It reflects retained earnings, balance sheet growth, and changes in AOCI on a quarterly basis.

Nothing magical happened between these two documents. There was no recapitalization. No asset sale. No accounting maneuver.

The difference exists because the two statements are answering different questions.

The annual report answers:

What does the holding company’s GAAP equity look like at year end?

The Call Report answers:

How much capital does the operating bank have today?

Those are not the same question, and in small community banks, the answers often diverge significantly over time.

Using the same $45 million market capitalization:

  • Based on public financials, West Shore appears to trade at roughly 0.9x to 1.0x book value
  • Based on Call Report data, West Shore is trading at approximately 0.6x bank-level book value

That is the entire disconnect.
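
The two multiples are just simple division on the figures quoted above; a quick sketch, with the market-cap and equity values rounded as in the article.

```python
# Price-to-book on the two different equity bases quoted above ($ millions).
market_cap = 45.0       # approximate market capitalization
holdco_equity = 48.2    # holding-company GAAP equity at 12/31/2024
bank_equity = 73.0      # bank-level equity per the 9/30/2025 Call Report

print(f"P/B on public financials: {market_cap / holdco_equity:.2f}x")  # ~0.93x
print(f"P/B on Call Report equity: {market_cap / bank_equity:.2f}x")   # ~0.62x
```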


Disclaimer: None of the information or analysis presented is intended to form the basis for any offer or recommendation. We currently have a vested interest in Alphabet (parent of Gemini) and Amazon. Holdings are subject to change at any time.

Ser Jing & Jeremy
thegoodinvestors@gmail.com