#51 - Book Review: Work

Summary and commentary on "Work: A Deep History, from the Stone Age to the Age of Robots" by James Suzman

In 1957, Vere Gordon Childe, a recently retired, world-famous archaeologist, was hiking along Govett’s Leap in Australia’s beautiful Blue Mountains. The sun was beating down, and he was exhausted, so he paused to remove his glasses and wipe away the beads of sweat dripping onto the lenses and down the frame. In that moment, the short-sighted professor lost his footing and fell, bringing his illustrious career and life to a tragic end. Or, at least, that’s what the coroner concluded upon finding Childe’s lifeless body.

Twenty-three years later, the truth about Childe’s death was revealed: Suicide. A few days before plunging off Govett’s Leap, Childe wrote a letter to Professor William Grimes, his successor at the University of London’s Institute of Archaeology. Childe asked Grimes to keep the letter to himself for at least a decade, but eventually, Grimes published the letter in full. Excerpted, it reads:

For myself I don’t believe I can make further useful contributions to prehistory. I am beginning to forget what I laboriously learned—forget not only details (for these I never relied on memory), but even that there is something relevant to look up in my note-book. New ideas very rarely come my way. I see no prospect of settling the problems that interest me most . . . In a few instances I actually fear that the balance of evidence is against theories that I have espoused or even in favour of those against which I am strongly biased . . . I have no wish to hang on the fringe of learned societies or university institutions . . . I have become too dependent on a lot of creature comforts—even luxuries—to carry through some kinds of work for which I may still be fitted . . . I am just a burden on the community. . . On my pension I certainly could not maintain the standard without which life would seem to me intolerable and which may be really necessary to prevent me becoming a worse burden on society as an invalid. I have always intended to cease living before that happens . . . Life ends best when one is happy and strong.

Childe sits at one extreme in his attitude toward work: Be productive and useful to society, or die. We admittedly may not go that far, but let’s be honest—even if we dial this back a couple of notches, some of this resonates. Many of us find moral value, dignity, or self-respect in our ability to work and to produce good work.

What’s up with our attitude towards work? In Work: A Deep History, from the Stone Age to the Age of Robots, James Suzman attempts to answer this question through a historical lens. Like other good histories, Suzman doesn’t merely provide a smattering of dots; he connects them to form a narrative of the human relationship to work and where we’re going from here.

I. Hunter-gatherers

Work wasn’t always like this. Our hunter-gatherer ancestors, dubbed the “original affluent society” by Suzman, spent less than twenty hours per week doing work. They hunted and gathered only when they felt like it, and… that was that.

Suzman spills a lot of ink on the importance of fire and how it changed work for our early ancestors, but I think his discussion can be distilled into three points. First, by enabling humans to cook and consume more nutrients, fire opened up a ton of time for hunters and gatherers to do things other than work—leisure, essentially. Second, when our ancestors outsourced energy requirements to fire, they took the first steps toward creating a world where being physically strong wasn’t the only important quality—specialization of work, essentially. Third, fire made it easier for some members of early human communities to feed those unable to feed themselves—welfare programs, essentially.

Speaking of welfare, here’s how welfare worked in foraging societies: Anyone who wanted anything from another person could simply request it, and the other person would grant it. No hard feelings. No being looked down upon. No undertones of theft or freeloading. Suzman calls this a “demand-sharing” society.

To us, this should sound strange. Indeed, contrast demand-sharing with today’s economic theories of capitalism and socialism. In capitalism, the capitalists scorn the idle poor, and in socialism, the labor class scorns the idle rich. Under both of these economic schemes, you can’t achieve equality without taking away someone’s liberty, and you can’t grant everyone liberty without taking away the promise of equality. However, unlike capitalism and socialism, which are opposite sides of the same laziness coin, foragers in demand-sharing societies didn’t have a concept for laziness: Ask for whatever you’d like—you can have whatever you’d like! As a result, these societies achieved equality and liberty at the same time.

But Suzman doesn’t stop here. Contrary to today’s technologists, who claim that the Internet has abolished scarcity and created a world of abundance, Suzman claims that it was actually the hunter-gatherers, thousands of years ago, who lived in a world of abundance! According to Suzman, hunter-gatherers never worried about where their next meal would come from; they believed that nature and the environment would always provide for them. As a result, hunter-gatherers didn’t plan ahead. They were short-term focused, living in the present. When they were hungry, they spent a few hours hunting for food, but they never hunted in excess. They only accumulated enough for what they needed for the day, and having the ambition to hunt and gather in excess was looked down upon.

II. Farmers

Things began to change when we made the shift from hunter-gatherers to farmers. Here, for instance, are some of the big differences I picked up from Suzman’s book:

  • Contentment vs. anxiety/ambition: To hunter-gatherers, everything existed in the present. They didn’t carry baggage from the past, nor did they worry about the future, since they believed that nature would always provide. In other words, they typically believed their world would be more or less as it had always been. As a result, they were content with what they had, stoically accepted occasional hardships, and weren’t hostage to outsized ambitions. On the other hand, farmers lived simultaneously in the past, present, and future, creating an anxiety-filled life. Almost every task on a farm is focused on achieving a future goal or managing a future risk based on an ongoing feedback loop of past experience.

  • Demand-sharing vs. transactional: As we’ve already discussed, hunter-gatherer society was both free and equal as people shared unconditionally among themselves. Suzman argues that this demand-sharing among members of a tribe was an organic outgrowth of how foragers viewed the environment: The environment freely gave to the foragers, so why shouldn’t the foragers give freely amongst themselves? Farmers, though, had a transactional relationship with the environment: Farmers invested labor into the environment, so the land subsequently owed the farmers a debt in the form of a bountiful harvest. Suzman argues that this transactional relationship with the environment translated into transactional relationships with one another. So, farmers invented currency and began to exchange with one another at arm’s length, marking the start of a barter society.

  • Abundance vs. scarcity: To hunter-gatherers, time was always plentiful, but to farmers, time was scarce. Hunter-gatherers could afford to take a day off whenever they wanted, in large part because they enjoyed the fruits of their labor immediately. Putting off the food quest for a day or two wouldn’t have any serious ramifications. But the story was different for farmers, whose efforts produced delayed returns far into the future. Sure, there were windows of time for farmers to take breaks, but outside these windows, when work urgently needed to be done, the consequences of not doing work were almost always considerably greater for farmers than for foragers. They had countless responsibilities to tend to, including irrigating thirsty crops, dealing with pests, removing weeds, and repairing fences for animals.

Starting to sound familiar?

III. Industrialization

Excess agricultural output, increased specialization, and the development of more sophisticated tools provided the spark that ignited humanity’s first cities. While cities have existed since around 4000 B.C., I’ll focus here on Suzman’s discussion of industrialized cities, those that arose during the Industrial Revolution.

Industrial society saw a high level of stratification between the aristocracy, the merchants, the working class, and the peasants. Thanks to the development of new textiles and homeware in the 1500s, conspicuous consumption ran rampant as members of the lower classes found new ways to present themselves outwardly as members of the higher classes. Some anguished aristocrats even intentionally dressed down to distinguish themselves from the try-hard rabble who were dressing up. The optics of rank were apparently so intense that they were even enforced and entrenched through laws. For example, a 1571 Act of Parliament required all men and boys older than six (other than nobles) to wear distinctive woolen caps every Sunday and all other holy days. Suzman also suggests that the rise of mass print advertising in the 1700s misled people into believing in upward social mobility. According to Suzman, with newspapers advertising the latest and greatest gadgets, people who were able to purchase those products falsely believed that they were upwardly mobile and closing the gap between themselves and others.

During the industrial era, workers also became commoditized and funneled into mechanical, boring tasks. According to Suzman, this is largely thanks to Frederick Winslow Taylor, a high-strung, grade-A stickler who apparently needed to put himself in a straitjacket to go to bed. Taylor wrote The Principles of Scientific Management, a book in which he took a scientific eye to extracting maximum efficiency out of workers. The tenets of this book would eventually be used by the likes of Henry Ford to turn work into a rote, mechanical task and people into just another cog in a machine.

But Taylor wasn’t all terrible for workers. Taylor believed that, in return for their efficiency at work, first-class workers should be rewarded for their productivity in the form of higher wages and more time off. But was this tradeoff—less meaningful work in exchange for higher pay and more time off—worth it? Labor movements and trade unions apparently thought so, as they focused almost all of their attention on securing better pay and more leisure rather than trying to make their jobs more interesting and fulfilling. In fact, in the industrial era, we begin to see workers saying that pay is king, more important than leisure. For example, in the 1930s, Kellogg (the cereal company) was able to reduce working hours to 30 hours per week while maintaining similar levels of pay (thanks to increased efficiency and economic output), but interestingly enough, in the 1950s, 75% of staff began asking to reinstate the 40-hour work week so they could take home more money.

Suzman also emphasizes how communities changed in the industrial era due to increasing specialization in functions. In more primitive societies, chiefs and shamans could simultaneously be foragers, hunters, farmers, and builders. Because everyone performed interchangeable roles, there was shared solidarity, understanding, norms, and beliefs among entire communities. During the industrial era, however, roles began to splinter into their own silos. A lawyer couldn’t moonlight as a doctor or vice versa. There arose a lawyer perspective of the world, a doctor perspective of the world, and so on all the way down the line across different jobs. This made it harder to bind large communities together, leading to “anomie,” the breakdown of society through the severing of social bonds.

IV. Post-industrialization

A post-industrial society is one where the services sector has eclipsed the manufacturing sector. But what, exactly, is the services sector, you ask? To be honest, I don’t really know, and even Suzman himself admits that services are hard to pin down—the sector includes “any job that does not involve producing or harvesting raw materials as in farming, mining, and fishing, or the manufacture of actual things, like the knives and forks and nuclear missiles, from those raw materials.” Notice how Suzman defines services in the negative rather than the positive (i.e., “services are not X” rather than “services are X”), making services extremely amorphous.

So, let’s put two and two together. First, you have services, this really amorphous industry—what does it actually mean? Second, in a post-industrial society, the vast majority of people are engaged in services… So Suzman’s logical next question is: What, exactly, are people actually doing?

Echoing David Graeber, Suzman says that many, many, many of these service sector jobs are “bullshit jobs,” jobs that are “so completely pointless, unnecessary, or pernicious that even the employee cannot justify its existence” and include the likes of corporate lawyers, PR execs, health and academic administrators, and financial service providers. Suzman admits that some service jobs are important, and he admits that some people might find bullshit jobs engaging, but he ultimately argues that the vast majority of people in those bullshit jobs don’t. He then hits you with a barrage of statistics highlighting modern society’s administrative and bureaucratic bloat and people’s profound dissatisfaction with their jobs. What happened here?

Suzman puts forward a few hypotheses but ultimately believes that we did this to ourselves, not to create value for society, not to make us fulfilled, but—get this—just to keep ourselves busy. Remember the shift from hunter-gatherers to farmers? Yeah, well according to Suzman, this shift created “a culture that makes us intolerant of freeloaders and canonize gainful employment as the basis of our social contract with one another even if many jobs don’t serve much purpose other than keeping people busy.” In other words, according to Suzman, we as a society would rather have people twiddle their thumbs for money and call it a “job” than have people twiddle their thumbs for leisure alone.

Suzman twists the knife even further and condemns post-industrial society’s widening gap between the haves and have-nots. Since the 1980s, society has been in the midst of “The Great Decoupling”: Productivity, output, and GDP have all continued to grow, but real median wages have been stagnant. A smaller and smaller elite have been capturing a larger and larger slice of the growing economic pie. Meanwhile, the lower and middle classes are hardly any better off and must incur more debt and work longer hours to climb up. Suzman attributes our current level of economic inequality to a “corporate conspiracy” pushed by the paper-pushers at McKinsey. In 1998, McKinsey published its “War for Talent,” which urged HR departments across the world to open their eyes to the diminishing pool of available talent and to wage a war to attract these rare “masters of the universe.” This led to fattening executive compensation, ballooning ambition, and the arrogant sense that you are where you are, perched on top of the world, because of merit.

Suzman, of course, believes this whole McKinsey thesis is bad and wrong. He cites scholars who argue that there’s no shortage of top talent and that the obsession with talent is actually creating a corrosive work culture, which not only makes work less enjoyable but also makes organizations less efficient. Suzman also seems to implicitly reject the idea that there is a clear, meritocratic correspondence between wealth and hard work, but he admits that it’s not so easy for society to shrug this off. After all, the wealthy like to believe that they are worthy of their rewards, and the poor don’t want to mess with their own sense of agency, their dream that they, too, might make it one day if they work hard enough.

Over thousands of years, we have worked ourselves into a pickle of paradoxes. Our economy is now more prosperous than ever before, but we face historically abnormal levels of suicide and social stress. Despite labor productivity in industrialized nations having risen 4-5x since the end of World War II, weekly working hours everywhere have continued to gravitate toward an average of ~40 hours per week. Finally, we’re being pinched from both sides of the work equation: Exhaustion due to over-work and detachment due to under-work. People throughout history have over-worked themselves to death, but for the first time ever, we’re working ourselves to death not because we have to (i.e., due to poverty or hardship), but because of our own ambitions. At the same time, some of us are also literally dying from our jobs being so useless, mundane, and devoid of skill.

So where do we go from here? Suzman ends by painting a bleak picture of our future: Artificial intelligence will automate most of our jobs away, creating immense value for the few capitalists that developed the AI and leaving the rest of the world jobless and depressed. But Suzman argues that it doesn’t need to be this way. Even if robots do take over, our current obsession with work and scarcity is not an immutable property of the human experience. There’s an inescapable sense that Suzman is nostalgically looking to the past to find our future, urging us to re-kindle our hunter-gatherer instincts from ages past to form a society after capitalism.

V. Commentary

The book is 400+ pages long, so there are a lot of interesting things I didn’t cover, like how and why we made the transition from hunter-gatherer to farming to industrial society, and how there actually exist animal species, other than humans, that do work for no good evolutionary purpose. This is all really fascinating, and I encourage you to read the book to find out more!

The book overall was an enjoyable read, as much of a page-turner as other very good history books like Sapiens. That said, I didn’t find Work particularly eye-opening in showing me a new way to think about work. I appreciate that Suzman cites many examples of our relationship with work through history—I hadn’t known about many of these examples, and they are very interesting!—but for me, these examples just confirm a long narrative arc of work that I had already known about.

One issue I have is the amount of time Suzman spends discussing foraging societies—about 35-40% of the entire book. I guess it’s not unexpected, given that Suzman is an anthropologist who studies modern-day hunter-gatherer tribes. But when authors spend so much time talking about the distant past, they all run into the same problem: Lack of documentation. This leads Suzman to rely heavily on phrases like “It was almost certainly the case that…,” “One can't help but wonder that…,” etc., so my epistemic status when reading things like this is… very skeptical, to say the least.

But I have a bigger gripe with the focus on forager societies: I’m not convinced that a return to the forager relationship with work is in our future! Sure, I appreciate that Suzman gives us a blueprint for the future based on the past—after all, I believe that base human desires stay constant but are constantly being repackaged and channeled into new form factors and products. So, Suzman thinks that a return to the forager view of work is entirely possible—it’s not like we’re creating a new relationship with work that fundamentally contravenes human nature! But even if you agree with Suzman that this is possible, I am skeptical that it is feasible (or even desirable). The forager relationship with work is one that we abandoned a loooong time ago, and the fabric of our society today is stitched with ambition, planning, desire, and innovation. Does Suzman’s vision of the future require pulling society apart thread by thread, until innovation has been sufficiently decelerated? If so, then no thanks.

The attention paid to forager societies comes at the expense of additional detail on industrial society, which is much more tangible to us today and could therefore serve as a much more feasible blueprint for our future. How has work changed, century to century, since the 1400s? What was good, and how can we return to that? And what was bad, and how can we avoid that? I believe that answers to these types of questions can provide more insightful, actionable paths towards a better relationship with work.

📚 What I’m reading

  1. Trillion Dollar Paint Job. (Pirate Wires)

  2. Enterprise Metaverses, Horizon Workrooms, Workrooms’ Facebook Problem. (Stratechery)

  3. The Great Climate Opportunity. (Space Capital)

  4. Compounding Crazy. (Not Boring)

#50 - Space rush

A quick look at space SPACs

One way to think about SPACs is that they democratize access to public markets by allowing companies to provide detailed financial projections of the future. In a traditional IPO process, you typically won’t say a word about future projections because you run the risk of facing lawsuits for failing to hit those projections. But SPACs actually let you do this! The result is that a lot of companies that couldn’t have gone public via a traditional IPO—because they depend heavily on future growth—are now going public via SPAC because they can now safely provide a picture of that future growth to the public.

So, space is one category of companies that SPACs work pretty well for. Over the past year, we’ve seen a bunch of space companies announce that they’re going public via SPAC, so today I’ll provide a very, very brief write-up of these companies. [Note: I’m not looking at Virgin Galactic because it SPAC’d over a year ago, and you probably know what Virgin Galactic is, anyways 😉]

Rocket Lab. Vertically-integrated space launch company capable of on-demand launch for small satellites. Currently working on a new, larger rocket that can launch medium-sized payloads. Also manufactures satellite components. Generated ~$40m revenue in 2020.

Spire. Collects space-based data for the maritime, aviation, and weather industries using a proprietary constellation of multi-purpose satellites. Currently has 143 satellites in orbit. Has a software platform that delivers this data to customers as a subscription. Generated ~$36m revenue in 2020.

Satellogic. Manufactures low-cost Earth observation satellites to capture images with really high spatial resolution. Currently has 13 satellites in orbit and can map the entire Earth on a monthly basis but only recently started generating revenue (in 2021). Goal is to have 300 satellites in orbit by 2025 and map Earth on a daily basis.

Redwire. Provider of “space infrastructure” solutions. To be honest, I barely know what this means, but it looks like Redwire manufactures a bunch of hardware stuff for space products (e.g., sensors, cameras, antennas, etc.). Generated > $100m revenue in 2020.

BlackSky. Earth observation. Basically, on-demand satellite imagery + AI/ML image analytics + bespoke, advanced satellite development programs for customers. Generated ~$22m in revenue in 2020.

Planet Labs. Again, Earth observation. Satellites to do Earth monitoring, maps, and analytics. The company has over 200 satellites in orbit and bills itself as the “Bloomberg Terminal” for Earth data. Generated ~$96m in subscription revenue in 2020 from 600+ customers.

[The remaining space SPAC companies are those that haven’t generated sustainable revenues to-date. I’ve previously written about this phenomenon here.]

Astra. Small satellite launch provider that has conducted three test launches but won’t conduct its first commercial launch until later this summer. It eventually hopes to do hundreds of launches per year, on-demand.

AST. Satellites that provide global Internet connectivity for mobile phones at broadband speeds. It launched its first satellite in April 2021 but needs 20 satellites to start generating revenue.

Momentus. Another satellite launch provider. However, the company is embroiled in securities fraud litigation with the SEC for lying to investors, and its former CEO is a Russian national with strong ties to the Russian government. So strong, in fact, that CFIUS deemed it a national security concern, prompting him to step down. At the time of the SPAC announcement, Momentus was valued at $1.2bn—now, the valuation has been adjusted downward to ~$600m.

And, here are some figures:

Figure 1. Overview of Space SPACs. Many companies explicitly used 2023 (or 2024/25) revenue multiple for current valuation.

Figure 2. Revenue Projections for Space SPACs. Note how Astra, AST, and Momentus have the fastest revenue growth and the highest 2025 revenue, even though they are the ones that are pre-revenue.

And some takeaways:

  • Just because you’re pre-revenue doesn’t mean you can’t command a higher valuation than a post-revenue, post-product company, which I find… interesting…

  • The three pre-revenue space companies have higher revenue projections than the other post-revenue space companies, which I find… also interesting…

  • I looked at a bunch of these companies’ investor presentations, and a lot of them are (very explicitly) calculating current valuation as a ~5-10x multiple on future revenue in ‘23 (i.e., 2 years out).

  • There’s tons of interest in Earth observation. How are companies differentiating themselves here? 🤔 There’s also interest in space launch, although Rocket Lab seems to be leading.
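The forward-multiple arithmetic in the third takeaway is simple enough to sketch. The figures below are hypothetical, made up purely for illustration, not taken from any particular investor deck:

```python
# Hypothetical sketch of the "multiple on future revenue" math some of these
# investor presentations use. All figures are made up for illustration.

def implied_valuation(projected_revenue_m: float, forward_multiple: float) -> float:
    """Current valuation implied by applying a multiple to projected revenue."""
    return projected_revenue_m * forward_multiple

# A pre-revenue company projecting $200m of 2023 revenue at a 7x forward
# multiple can pitch a $1.4bn valuation today, years before that revenue exists.
print(implied_valuation(projected_revenue_m=200, forward_multiple=7))  # → 1400
```

The oddity, of course, is that the multiple is anchored to a projection, so the more aggressive the projection, the bigger the valuation you can justify today.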

📚 What I’m reading

  1. Amplification and its discontents. (Daphne Keller)

  2. Everything that rises must converge. (Ross Douthat)

  3. Mark in the metaverse. (The Verge)

  4. The California dream is dying. (The Atlantic)

#49 - Prestige and Y Combinator

How do we think about YC's increasing batch size?


A few years ago, Yale University finished building two new “residential colleges,” increasing the school’s total undergrad capacity from 5,400 students to 6,200. I went to Yale, I think it’s a fine school, and I’m happy that more students can take advantage of a Yale education. But I remember talking with my friend about the expansion, and he was actually mildly peeved about it! The reasoning was something along the lines of, “If more students are admitted to Yale, then that dilutes the value of my degree.”

Hm, well, to be fair, at some point this has to be true. If Yale increased its class size by 10,000%, then a Yale degree would intuitively begin to mean less than it does now. The reason is obvious: The Yale degree carries some level of prestige, but college admissions prestige is a zero-sum game. One student is admitted at the expense of another. To get in, you have to beat someone out. So if you open the floodgates and let in significantly more applicants, then it’s less meaningful of an accomplishment to get in.

But hold on—what if it’s not a zero-sum game? What if we can grow the prestige pie by admitting more students? Consider two worlds that are equal in every respect, except in World A, Yale admits 2,000 students per year, and in World B, Yale admits 4,000 students per year. Everything else about Yale is exactly the same, in particular (1) the student experience and (2) the quality of admitted students. Yes, in world B, Yale admits more students, so it’s less “prestigious” to get in, but World B Yale produces more graduates (on an absolute basis) that could presumably really, really, really change the world. Doesn’t the wild success of alumni increase the prestige of the school, which might offset any loss in prestige from the increased number of admits? In other words, does prestige have network effects?

World B is completely hypothetical, but I honestly don’t think the assumptions are too wild. For example, I feel like Yale could more than afford to double its undergraduate student population while preserving the student experience by constructing more residential colleges, hiring more professors, building new classrooms, etc. with its $32bn endowment. I also feel like Yale could afford to increase its undergraduate student population while preserving high intellectual standards for its students. How different is the applicant ranked #2,000 from the applicant ranked #2,001? #3,000? #4,000? I don’t know—these elite institutions are always saying that they have way more qualified applicants than they have room for, and sure, maybe it’s a lie to preserve applicant ego, but I’m inclined to take their word for it.

Look, we’ve talked about prestige before. In order to win, someone else must lose. The desire for prestige is an outgrowth of a scarcity mindset.


Prestige is usually associated with some form of picking: You’ve been picked to win an award, to get a good grade, to get into a college, etc. But I think of picking as consisting of two specific types: (1) Picking those who meet a threshold and (2) Picking those to meet a fixed quota. The first type is benign. If you’re good enough, you’re picked. If you score 90% on a multiple choice exam, you get an A. If you pass the bar, you can practice law. There is no cap on the number of As or the number of lawyers. The second type, however, is nefarious. This is essentially how college admissions work. Even if you’re good enough, you’re still not good enough because others are better. There are a fixed number of winners and an uncapped number of losers.
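The distinction between the two picking functions can be sketched in a few lines of Python. The applicants and scores here are made up purely for illustration:

```python
# Illustrative (hypothetical) sketch of the two "picking" regimes:
# threshold-picking admits everyone above a bar, while quota-picking
# admits only a fixed number of top scorers.

def threshold_pick(scores, threshold):
    """Benign: anyone good enough gets picked; no cap on winners."""
    return [name for name, score in scores.items() if score >= threshold]

def quota_pick(scores, quota):
    """Zero-sum: only the top `quota` get picked, however good the rest are."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:quota]

applicants = {"Alice": 95, "Bob": 92, "Chris": 91, "Dana": 80}

# Under a threshold of 90, three applicants win...
print(threshold_pick(applicants, threshold=90))  # → ['Alice', 'Bob', 'Chris']

# ...but under a quota of 2, Chris loses despite clearing the same bar.
print(quota_pick(applicants, quota=2))  # → ['Alice', 'Bob']
```

The nefarious part lives in that final slice: Chris’s outcome depends not on Chris’s score but on how many people happen to score above him.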

Some industries are so monastic and prestigious that their entire structure is built around quota-picking. Let’s look at the legal profession, which I am admittedly somewhat a part of. I mentioned that there is no bar on the total number of lawyers in the U.S. (and this is true!), but there is certainly a de facto bar on the total number of big corporate lawyers (whose jobs are very prestigious). According to some of my classmates, the top law firms have a quota for how many Stanford Law School students they’re allowed to hire. Also, district court judges have a fixed number of clerks, appellate court judges have a fixed number of clerks, and Supreme Court Justices have a fixed number of clerks; district court clerkships are less prestigious than appellate court clerkships, which are less prestigious than Supreme Court clerkships, so it’s all just a large rat-race to get into the highest-prestige quota possible.

So if you’re a typical law school in the United States trying to fit this rat-race mold, you’re naturally going to grade students on a strict curve. You place a quota on the number of students that get a good grade in a class! You want a nice gradient of grades so you can say that Alice is definitely better than Bob, but Bob is wayyyy better than Chris, so Alice, Bob, and Chris can succeed in the stratified legal profession according to their “skills.” It’s quotas all the way down!

Well, actually, there is an interesting exception. Harvard and Yale apparently don’t use quota-picking for grades; they use threshold-picking! In other words, any number of students can get a good grade in a course. My take on this is that Harvard and Yale are saying that their students are so damn good that future law firms, judges, government employers, etc. don’t need to see where their students land in the internal rat race for grades. If a student has great grades, then he meets our standards for greatness, so shouldn’t he meet yours as well, regardless of what grades other students have? If an institution is just that good, then perhaps it can afford to relax a bit on the quota-picking.


At this point I’ve buried the lede a bit, but let’s move on to talking about Y Combinator (“YC” for short). For those unfamiliar, here’s how a startup accelerator like YC works:

  1. An early-stage startup needs help growing its business, so it applies to a startup accelerator.

  2. A startup accelerator accepts only a handful of companies to participate in a so-called “batch.” The accelerator hands each accepted startup a small amount of capital (in return for a small amount of equity), gives it access to a network of seasoned mentors and other similarly situated startups, and provides it with some amount of prestige for being associated with the accelerator.

  3. After the accelerator program for the current batch is over, the accelerator hosts a “demo day,” where the current batch’s startups present to venture capital investors.

  4. If the startup company does well and the startup accelerator is able to increase the value of its equity in the startup, then the accelerator will have an easier time attracting new startups to apply, and the cycle starts over again.

In my opinion, the most interesting step is (4): For startup accelerators, success begets even more success. If a startup accelerator does well, it’ll have more money to give to startups, it’ll have a wider network to help startups, it’ll increase in prestige, and as a result, it’ll have more startups banging on its door to participate in a batch.

But step (4) needs to be balanced against step (2). Just because you’ve accumulated tons of resources to invest in early-stage startups doesn’t mean you can pick startups like throwing darts. The picking function is important, and you still need to be selective in how you pick companies! If you start picking bad companies, then your returns decrease, and the flywheel in (4) begins to slow down.

But what if you start picking more companies? Here’s a chart of the number of YC companies over time:

Does relaxing the quota dilute the prestige of the brand? A few people on Hacker News seem to believe that it does. This is an important question to answer! If the prestige of YC goes down, then theoretically you might have two bad results: (1) Fewer startups want to apply to YC; (2) Fewer venture capitalists, who attend YC’s demo day to invest in companies in the batch, will be excited to invest.

How do we think about prestige and batch size?

One place to start is by intuiting, as we did for Yale undergrad admissions, whether a “World B YC,” which accepts more startups per batch, can (1) provide the same level of support for each startup and also (2) maintain the quality of startups accepted to each batch, such that the success of the larger batches outweighs any decrease in prestige from the larger batches. First, yes. We already went over this one above. Because YC is so successful, it’s cultivated an ever-wider network and more resources to devote to each batch. In fact, it’s quite possible that the larger the batch, the better the experience for each startup (again, network effects?). Second, sure. I don’t think the bar to get into YC is necessarily going down just because more startups are accepted. The acceptance rate into YC is still at ~2%—more startups are being formed, and more startups are applying, and if we assume some constant proportion of “good” to “bad” startups being formed, it’s only natural that YC can afford to increase its batch size while maintaining its threshold for “good.”
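The arithmetic behind that last point is worth spelling out. Here’s a back-of-the-envelope sketch (the ~2% acceptance rate is from above; the 5% “good” fraction is a number I made up purely for illustration): if the applicant pool grows while the share of good startups stays constant, a fixed acceptance rate yields a bigger batch that is still drawn entirely from startups clearing the bar.

```python
def batch_stats(applicants, acceptance_rate=0.02, good_fraction=0.05):
    """Return (batch size, number of applicants clearing the quality bar).
    The good_fraction is a hypothetical constant, as assumed in the text."""
    return int(applicants * acceptance_rate), int(applicants * good_fraction)

for applicants in (5_000, 10_000, 20_000):
    batch, good = batch_stats(applicants)
    # As long as good >= batch, YC can fill a larger batch entirely
    # with startups that meet its fixed threshold for "good."
    print(applicants, batch, good)
```

The takeaway: as long as the pool of threshold-clearing applicants grows in proportion to the pool overall, a bigger batch doesn’t mechanically imply a lower bar.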

A 2016 empirical study of YC companies conducted by Andrew Barnes also seems to confirm these points. He found that batch size had a negligible impact on the amount of subsequent funding obtained by each startup in the batch and a negligible impact on the time to successful exit (via acquisition). That is, even as YC has grown larger, the startups in the bigger batches had just as good of outcomes as startups in the smaller batches. Interestingly, Barnes hypothesizes that his findings are driven by the fact that YC is already renowned globally as the premier accelerator, so YC can afford to increase its batch size without compromising quality. As an aside, is this similar to Harvard and Yale’s threshold-picking grading scheme?

Another thing to consider is whether the startup / entrepreneurship market as a whole is monastic like the legal profession is. In other words, whether the creation of startups is a fixed-pie rat-race based on scarcity. The answer to this is, of course, no. Sure, fine, the dream for any startup is to dominate a market, but this doesn’t act as a complete bar for new market entrants as a quota would. For instance, there are like a bajillion enterprise software applications, many of which are occupying similar or adjacent markets. Venture capital funding is at all-time highs, and it’s easier than ever to become an entrepreneur. Silicon Valley has historically adopted a bias towards thinking that new value can always be created. Welcoming more entrepreneurs = good, Excluding entrepreneurs = bad.

Oh, and also, if YC has gotten less prestigious, then why hasn’t venture capital gotten less prestigious as well? You don’t need to squint too hard to see that venture capital investing is kinda similar to startup accelerators:

  1. Venture capitalists raise a pool of money from what are called “limited partners.”

  2. Venture capitalists deploy chunks of that pool of money into startups they think will earn them a huge return.

  3. After the investment, the venture capitalists help to support that company along its lifecycle. More often than not, the company will fizzle out, but every so often, a company is a hit, generating the vast majority of returns for the pool of money that the VC investors had originally raised. A percentage of the proceeds from the IPO / acquisition gets returned to the limited partners.

  4. If the venture capitalists did well with their pool of money, they’ll have an easier time raising their next pool of money, and the cycle starts over again.

A VC firm has a nice flywheel in step (4), just as a startup accelerator does: If a venture capital firm does well, it’ll have more startups banging on its doors for money, it’ll have more limited partners banging on its doors to give money, and it’ll have more talented investors banging on its doors for a job. Prestige is also very important in VC, just as prestige is important for startup accelerators. Entrepreneurs that are able to say “a16z led our round” will have an easier time attracting customers and subsequent capital.

Top VC firms have raised more and more capital to deploy, and their portfolios are getting larger and larger, yet I don’t hear as many people saying that a16z has lost its prestige as much as I hear people saying that YC has.

So it seems, at least to me, that the prestige of YC shouldn’t go down due to increasing batch size, but sure, I think reasonable minds can disagree here. For people who do think that YC prestige has dropped, though, Dan Gackle of YC has a great rebuttal:

YC has always sought to filter out the resume-padding sort of founder who sees YC as a step up the status ladder. Such founders are not up for the gruelling slog of really building a startup, and tend to bail not long after the batch since they've already gotten what they wanted out of it.

People should apply to YC because they believe it would [help] them build a successful startup, not because they want prestige. If expanding YC leads to a decrease in prestige (alongside an increase in real value to startups), that's a win for everybody: more business-builders get access to YC, YC gets to fund more successful companies, and status-seekers can find something else that will look better on their resume.

In this sense YC is very different from the elite colleges whose brand depends on their exclusiveness. The upper bound on YC's expansion is how much startup opportunity exists in the world.

📚 What I’m reading

  1. Ezra Klein interview with Sam Altman. (NYTimes)

  2. Curtailing GPT-3 toxicity. (WIRED)

  3. G7 leaders grapple over China rebuke. (Politico)

  4. On synthetic data. (MIT Tech Review)

#48 - Fictions (pt. 1)

We live in a world of fictions

After a long break, I’m finally back to writing! I’m still trying to figure out what I want to do with this newsletter/blog thing, but moving forward, I’ll try to write 1-2 posts per month, still usually about tech but sometimes about other random things I’m thinking about (like this post). Thanks for reading. —Chris


Four years ago, when I was graduating college, I remember being in the pews of a large chapel, elbow-to-elbow with some of my classmates. Parents and friends were sitting in the balcony with tears in their eyes, watching newly-minted graduates sift into our seats. We were wearing black robes and black square hats with long, limp strings dangling from the side, as per usual for a graduation, and we were sweating underneath it all as spring in New England was cresting into summer. Once everyone got seated, an organ started playing, we began singing our college song, and all of a sudden, I felt out of place. I felt strange. For a brief moment or two, I couldn’t bring myself to sing.

It was a strange, visceral sensation. I can perhaps best describe it as an “out-of-body” experience: I felt like I wasn’t experiencing the world in first-person as an active participant, but in third-person as a spectator. The world zoomed out. I looked around me, and I just felt strange… What are we doing?

I’ve experienced this strange feeling with increasing frequency through the years. Just the other day, I was on a walk with a friend; we encountered a random person walking her dog, and the out-of-body feeling suddenly washed over me: We purchase these small four-legged creatures for hundreds / thousands of dollars, put collars and leashes on them, watch them poop, and talk to them in slightly higher-pitched voices as though they understand us. I had to stop and remark to my friend how weird this all was, at which point she just laughed it off.

Perhaps I’m crazy. In these moments, I engage in a thought experiment: If intelligent aliens from lightyears away visited the Earth today and watched what we were doing, how much would they understand? Graduation ceremonies would definitely be quite mysterious. So too would dog-rearing, probably. We live in a world of fictions. Much of our society—our world?—is stitched together by these fictions, things that have little to no intrinsic or universal meaning but have meaning only because we have agreed to give them meaning.


Maybe it’s easiest to explain with more examples.

  • Marriage. First and foremost, marriage is a legal fiction, a fiction backed and mediated by the law. The day before you get married vs. the day after you get married, you’re still the same person—nothing about you has intrinsically changed. But the law now recognizes you as a different person! You can now file joint income tax returns with the IRS and state tax authorities. You can now obtain insurance benefits through your spouse’s employer. You are now entitled to inherit a share of your spouse’s estate.

    Marriage is also a social fiction, one that is backed and mediated by societal traditions. There is no inherent reason why diamond engagement rings are a thing (they became popular in the West only starting in the 1940s, and other cultures have historically used other types of gifts), nor is there a transcendental reason why engagement rings should be 1/6 of your yearly salary or whatever (this notion was promulgated by De Beers, which long held a monopoly on the diamond industry). Wedding vows, wedding venues, flower girls, bouquet-throwing, open bars, groomsmen and bridesmaids, etc. — we do many things in weddings because this is part of our current social fiction. To sharpen the point: If all cultures could rebuild the “custom of marriage” from the ground up, there’s no way we would all organically converge on the practices we do today. There is no transcendental, canonical conception of a marriage or a wedding.

  • Money. Again, a legal fiction. A dollar bill is merely a piece of paper with barely any intrinsic value, but it magically gets value from the fact that the state says you can use it to purchase goods. Gold, similarly, has little intrinsic value. Sure, it is a catalyst in chemistry and a crown in dentistry, but outside of niche uses, gold largely derives its value from the state declaring it as a store of value. Money could also be a sort of social fiction. I’m thinking here about certain speculative trading assets (see the meteoric rise of Dogecoin over the past month or two) whose monetary values are fundamentally untethered from any underlying value thanks to meme pumping and dumping.

  • Facebook. At the outset, I note that Facebook, like all other corporations, is obviously still a legal fiction: Facebook is recognized by the state as a legal entity, a corporation that is incorporated in Delaware, has a board of directors and shareholders, can be sued, must be taxed, blah blah. Even if I replaced Zuck as the CEO, even if 100% of its engineers were fired and replaced with monkeys, Facebook would still exist as a legal fiction fabricated by the state.

    You might fairly push back, though, and say that Facebook isn’t a fiction because it has intrinsic value. It has a tangible product and a market cap of $900 bn, right? Yes, I concede that Facebook the entity has value and creates value for people. But, perhaps it’s more accurate to say that the relationships / ideas that bind Facebook (indeed, all companies) together and give rise to its product or market cap are fictions.

    For example, Facebook has an internal system of rules that employees abide by. Facebook has an internal company culture that guides employee decision-making. Facebook has a mission statement that guides the future of its product development and investment of resources. All of these things were “made up” in the sense that they are not immutable truths. The executive team, or higher-level managers, create these rules and can modify them at any point. Facebook’s future (or product or market cap) is hardly set in stone—it will depend on the fabrication of these fictions and the level to which employees buy into them. Executive management promulgates these corporate fictions, and all other employees likewise buy into them. Is there a canonical Facebook in the sense that there is an objectively right or best way to run the company?


In The Matrix, the protagonist Neo initially exists in a high-fidelity virtual simulation (the eponymous “Matrix”) until he is awakened and pulled into the real world. For the first time, Neo realizes that for the past god-knows-how-many-years, his consciousness was plugged into a computer program, a fake world with fake rules that merely mimicked the immutable rules of the real world.

In one scene, after Neo is already pulled into the real world, his consciousness is plugged into a kung fu simulation so he can test the boundaries of the virtual world via a sparring match with his mentor Morpheus. Before they spar, Morpheus tells Neo that the programmed rules of these sorts of simulations can be bent and broken. But Neo initially seems not to get it. In the ensuing match, Neo huffs and puffs while Morpheus easily disposes of him without breaking a sweat. Then, with Neo face-down, belly on the floor, and complaining that Morpheus is too fast, Morpheus asks Neo, “Do you believe that my being stronger or faster has anything to do with my muscles in this place? You think that’s air you’re breathing now?”

Unlike Neo, we don’t live in a virtual simulation, but many parts of our world are fictional in the sense that they aren’t immutable truths. To be a contrarian is to reject the fiction. To be an entrepreneur (or legislator or changemaker or whatever) is to rewrite the fiction. The challenge is being able to see that the walls around us perhaps aren’t even walls in the first place.

I’ll conclude with one of my favorite Steve Jobs quotes:

When you grow up you tend to get told the world is the way it is and your life is just to live your life inside the world. Try not to bash into the walls too much. Try to have a nice family life, have fun, save a little money.

That’s a very limited life. Life can be much broader once you discover one simple fact, and that is – everything around you that you call life, was made up by people that were no smarter than you. And you can change it, you can influence it, you can build your own things that other people can use.

The minute that you understand that you can poke life and actually something will, you know if you push in, something will pop out the other side, that you can change it, you can mold it: That’s maybe the most important thing. It’s to shake off this erroneous notion that life is there and you’re just gonna live in it, versus embrace it, change it, improve it, make your mark upon it.

📚 What I’m reading

Most of these links are from a few months ago, but they’re great reads nonetheless!

  1. Why is everything liberal? (Richard Hanania)

  2. On seriousness. (Katherine Boyle)

  3. How global tech executives view U.S.-China tech competition. (Brookings)

  4. Goldman analysts work too hard. (Matt Levine)

  5. Infrastructure, governance, and trust. (Francis Fukuyama)

#47 - The iconoclastic VC

Does she exist?

Administrative note: I’m going to be taking a break from writing for a bit.


Last week, I analogized entrepreneurship to a magic act: The entrepreneur is the magician, and the venture capitalists are the audience members, inspecting the entrepreneur’s act to find the trick but nevertheless hoping to be dazzled.

I also framed this in terms of The Pledge, The Turn, and The Prestige (from the famous movie The Prestige):

The Pledge is the entrepreneur’s visionary pitch to the investor . . .

The Turn is when the entrepreneur’s pitch is so dazzling that, in spite of the investor’s due diligence, she still excitedly cuts a check . . .

But the investor doesn’t stand up to clap yet, nor does the entrepreneur, because the hardest part has yet to happen: The Prestige . . . The Prestige requires the entrepreneur to grow the company, meet and exceed financial projections, and eventually not only return money to the venture capitalist, but also change the world in the way the entrepreneur had envisioned during The Pledge.

I framed (1) The Pledge, (2) The Turn, and (3) The Prestige from the point of view of the entrepreneur, but it turns out that we can assess these three functions from the point of view of the investor as well. VC firms need to do these three things well in order to be successful:

(1) Deal flow: VC firms need to have a constant stream of companies at the top of the funnel. Using the magic analogy, they need to be hustling to get tickets to as many shows as possible.

(2) Picking: VCs need to know what companies to pick and what companies to walk away from.

(3) Support: After picking a company, VCs need to provide support to help the company grow and accomplish The Prestige. For example, does the VC firm have a wide network that the company can plug into? Does the firm have some sort of expertise in a particular area (e.g., regulatory issues, hardware expertise, etc.)?

[Note: The fourth function the VC needs to do well is winning the deal, which happens in between picking and support. I’ll try to discuss this in another post sometime.]


The stereotypical VC is the singular visionary who excels at the picking function. She thinks futuristically, pondering what the world will look like ten years from now and how a startup could bring that future to fruition. She has strong convictions but at the same time is willing to suspend disbelief of outlandish ideas. The paragon VC can identify diamonds in the rough that everyone else, even other VCs, passes over. In short, she is an iconoclast.

However, I wonder if the picking function is getting easier, if the iconoclastic VC is in shorter and shorter supply (or even non-existent). Venture capital has been around for quite a while now, and a lot of ink has been spilled on how to identify superstar companies. For example, consumer apps with product-market fit are supposed to see their user numbers skyrocket, at something like > 100% growth per year for the first few years. Strong SaaS companies are supposed to triple their annual revenue, then triple, double, double, and double it again (called “T2D3”). Because the strength of a SaaS company is relatively easy to ascertain, VC firm Tribe Capital has even quasi-automated the picking function for SaaS companies with its Magic 8-Ball software.

To be fair, I’m not super confident in saying that picking has become drastically easier. Every company is different, and there is no strict mold for success—if a SaaS company grows revenues by 150% instead of 200%, a venture firm probably isn’t going to immediately write it off... Moreover, people might ask why, if picking really were that much easier, do venture returns still adhere to power laws? Why don’t we see a higher proportion of winners among an investment portfolio?

My point is not that picking has become easy on an absolute basis. Investing in startups, especially at the early stage, is still really risky (ahem, startups still have to complete The Prestige). Rather, my point is merely that there are ballpark markers for future success (e.g., user growth, revenue growth, employee growth, founder quality, etc.) that will de-risk an investment to a certain point, and many VC firms know to diligence these markers when thinking about an investment opportunity. Perhaps you can get the investment risk down to an asymptotic value, but the continuing existence of high amounts of risk and VC power laws doesn’t automatically mean that picking hasn’t gotten at least relatively easier.

Figure 1. Risk of investment into Company X. Note that this is an over-simplification of risk assessment. A VC firm can further de-risk an investment opportunity (to a point below the asymptote) based on how well the VC firm supports the startup post-investment.

If my hypothesis is correct, that picking has gotten a bit easier, then perhaps what we’d really expect to see is more VCs getting on unicorn deals (rather than only the “best” VCs being uniquely able to identify the unicorns). And indeed, this seems to be what we’re seeing. According to a 2014 report by Cambridge Associates, from 2000 through 2012, 70 VC firms registered at least one “top 10” deal (based on return), and no firm accounted for more than 7.7% of the top 10 deals across the period. In contrast, during the pre-2000 period, only 25 firms had at least one top 10 deal, and five firms accounted for at least 8% of all the top 10 deals in the period. Moreover, as shown in the Figure below, more and more of the value in the “top 100” investments (based on return) is being captured by new and emerging venture funds. This all lends some credence to the idea that picking is, indeed, getting a bit easier.


I’ve spoken to a few of my friends who’ve founded companies. Many don’t have positive conceptions of VCs, in no small part because they think VCs aren’t the iconoclastic visionaries they’re made out to be. And this makes sense—with so much capital in the ecosystem today, and with so many VCs having studied the home-run VC exits, you have a proliferation of VC firms that don’t need to be iconoclasts in order to earn strong returns.

But I don’t think this means the iconoclast VC investor is a myth. Some VCs proactively craft theses / roadmaps about particular markets and doggedly chase startups that match their vision of the future. It’s all very forward-thinking. Bessemer, for instance, tends to do this quite well. Other VCs like Lux Capital make a conscious decision to invest in industries or technologies that most other VCs tend to shy away from. For example, while B2B SaaS is flush with capital, deep tech hardware attracts very few VCs because, as discussed last week, deep tech is way riskier. Because there is less of a uniform playbook for deep tech investing, many VC firms are scared away, thinking the risk is too high, no matter how much due diligence they conduct (see Figure below). Iconoclastic VC firms, though, often come up with their own playbooks for the future of deep tech and have sufficient levels of conviction to make an investment, notwithstanding the risk.

Figure 3. Deep tech vs. traditional software investing. Many average VC firms love software startups but will shy away from deep tech. No matter how much diligence they do on deep tech company X, they simply won’t make an investment due to their lack of conviction. Iconoclast VC firms may be more willing to invest in deep tech because they have stronger convictions about the technology and the future. Is a VC firm’s level of conviction concomitant with the level of risk for a particular company? Note that this graph is an over-simplification because different VC firms likely have different risk tolerance vs. conviction graphs (in green).

📚 What I’m reading

  1. Late-stage pandemic is messing with your brain. (The Atlantic)

  2. The robots are coming for Phil in accounting. (New York Times)

  3. U.S. to impose sweeping rule aimed at China technology threats.
    The Biden administration plans to let the Trump-era rule on technology purchases and deals take effect, despite U.S. business objections about its scope. (Wall Street Journal)

  4. How Dapper Labs scored NBA crypto millions. (Protocol)

  5. A year of secrets. COVID-era confessions, from ski trips to lovers to second jobs. (The Cut)

  6. Scientists developed a clever way to detect deepfakes by analyzing light reflections in the eyes. 🤯 (The Next Web)

  7. Moore’s Law for everything. (Sam Altman)

  8. The end of Silicon Valley as we know it? (Tim O’Reilly)

  9. What we learned about Clearview AI and its secretive ‘co-founder.’ (New York Times)

  10. MOOCs failed, short courses won. A brief look at Coursera’s upcoming IPO. (Inside Higher Ed)
