The Indies I Was Copying Were Just the Ones Who Survived
The 17 indies in my benchmarking folder were just the planes that came back. I revisit Wald's 1943 bomber problem and Denrell's 2003 vicarious-learning paper to unpack survivorship bias inside the indie ecosystem.
April 2026 · GoCodeLab · Dev Mindset
When I started out as an indie, I had a bookmark folder named benchmarking. Inside it sat 17 Twitter profiles, 9 newsletters, and 6 YouTube channels. All of them were $10k/month indies. They were the people I wanted to become. I copied their build-in-public threads. I copied the tools they used. I read the books they recommended. I shipped Apsity, and the first month's revenue was zero.
Three years later, last fall, I opened that folder again, out of pure curiosity. I clicked through every one of those 17, 9, and 6 to see where they had ended up. The result was a lot more depressing than I had expected. While I was sorting through it, the name Wald came back to me.
The Planes That Aren't in the Room — Wald, 1943
1943. World War II. American bombers were getting shot down at high rates. The military had to decide where to add more armor. They inspected the planes that came back from missions. They mapped the bullet holes as red dots. The dots clustered on the wings and the outer fuselage. The conclusion looked obvious. Reinforce the spots with the most dots.
At the end of the table sat a statistician named Abraham Wald. Wald said the opposite: put the armor where there are no dots.
The reason is one line. A plane that came back is a plane that took hits there and survived. The real weak spots are where the planes that didn't come back got hit. The shot-down planes aren't in the room. The room only contains the survivors. Wald looked at the data that wasn't there. He cracked the trap inside the visible data.
This is the canonical example that hardened into the term survivorship bias. In one line: when your sample contains only the things that survived, your conclusions get bent from end to end.
The Indie Ecosystem's Conference Room
The Twitter/X timeline I look at every day is exactly that conference room. Only the planes that came back are in it.
The reason is simple. Indies who failed don't write. They can't. Right after failing, you don't have the headspace for a retro. Time passes and you don't want it preserved on the internet. You go quiet. You take the next job. Your SaaS attempt vanishes silently from LinkedIn, and the About section refreshes with a new title. The indies who survived, on the other hand, write. The writing becomes marketing for the next launch. As the follower count grows, the next launch performs better. Success summons content, and content summons more success. Failure stays silent in the opposite direction.
Then the algorithm adds another layer. "$10k/month" posts get clicked. "3 years and 10 users" posts don't. The algorithm picks out only the planes that came back and shows them to you. Your feed sits at the center of the room.
Stanford's Jerker Denrell wrote a paper in Organization Science in 2003 called Vicarious Learning, Undersampling of Failure, and the Myths of Management that addresses exactly this problem academically. Denrell's argument: we learn by observing other companies. But failed companies disappear quickly. When they disappear, they fall out of the observable sample. As a result, the cases we can observe are systematically tilted toward the ones that survived. Denrell ran simulations showing how failure undersampling alone is enough to make aggressive strategies look superior. Aggressive strategies have high variance, but failed companies are invisible while jackpot ones are everywhere — so aggressive = superior is what we end up seeing.
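Denrell's mechanism can be sketched with a toy simulation. This is my own illustration, not the paper's actual model: two strategies with the same true expected payoff, where firms below a failure floor vanish from the observable sample.

```python
# Toy model of Denrell's undersampling-of-failure effect.
# Assumption (not from the paper): payoffs are one-shot draws
# from N(0, sd), and firms below `floor` fail and disappear.
import random

random.seed(42)

def surviving_mean(sd, n=10_000, floor=-1.0):
    """Mean payoff among firms that stayed above the failure floor."""
    payoffs = [random.gauss(0, sd) for _ in range(n)]
    survivors = [p for p in payoffs if p > floor]
    return sum(survivors) / len(survivors)

conservative = surviving_mean(sd=0.5)  # low-variance strategy
aggressive = surviving_mean(sd=3.0)    # high-variance strategy

# Both strategies have the SAME true expected payoff (zero),
# but among the survivors we can actually observe,
# the aggressive strategy looks dramatically better.
print(f"conservative survivors' mean: {conservative:.2f}")
print(f"aggressive survivors' mean:   {aggressive:.2f}")
```

Nothing about the aggressive strategy is actually superior here; the entire gap comes from the failed high-variance firms dropping out of the sample before anyone could observe them.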
Move this onto the indie ecosystem and you get the scenery I look at every day. Aggressive launch playbooks, bold price hikes, going full-time within three months. All high-variance decisions. The failed cases are invisible. The jackpot cases float across my feed. I read it as aggressive = superior. I copy. I'm the plane on the not-jackpot side.
What Happened to the 17, the 9, and the 6 I Tracked
Last fall I opened the benchmarking folder and clicked through them one by one. Here's what I found.
Of the 17 Twitter indies, 6 are still running the same product three years later. Two of those 6 went back to a day job while keeping the product up. One had joined another company that had acquired the product. The number actually running it as a full-time indie, same product, same seat was 3. Three out of 17. 17.6%.
Of the 9 newsletters, 4 were still publishing, but two of those 4 had put out four or fewer issues over the past year. Actively running: 2 out of 9. 22.2%.
Of the 6 YouTube channels, only 2 had posted a video in the last six months. 2 out of 6. 33.3%.
Calling these numbers survival rates would be a stretch. The population definition is sloppy and the sample is only 32 people. I'm not an academic. But the result of 32 people I tracked by hand was clearly different from the scenery I had in my head — the one where everyone seemed to be surviving. Even of the 6 + 2 + 2 = 10 who are still running, only 4 still appear regularly on my feed. Of the 17 I marked as benchmarking, 13 have, in some form, disappeared from my feed. People who disappeared can't even be seen disappearing. That's the heart of it.
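The counts above reduce to a few lines of arithmetic. A quick sketch, with "alive" meaning each channel's own definition from the text (full-time on the same product, actively publishing, a video in the last six months):

```python
# Recomputing the hand-tracked rates quoted above.
tracked = {
    "Twitter indies":   (17, 3),  # full-time, same product, same seat
    "newsletters":      (9, 2),   # actively publishing
    "YouTube channels": (6, 2),   # posted in the last six months
}

for name, (total, alive) in tracked.items():
    print(f"{name}: {alive}/{total} = {alive / total:.1%}")

# Pooled over all 32, using the looser 6 + 2 + 2 = 10 "still running" count:
pooled = 10 / 32
print(f"still running overall: 10/32 = {pooled:.1%}")
```

The pooled number is the generous reading; the strict per-channel definitions land well below it.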
When I was copying them, I was reading yesterday's posts from the people who had survived long enough to keep writing. The people who took off alongside them at the same time and crashed weren't in my view. I looked at 17 surviving planes and concluded: fly with this posture and you make it. The planes that flew next to them in the same posture and got shot down weren't in the room.
Copying Isn't the Problem — Sample Awareness Is
There's a familiar trap to fall into here: ending with "so don't copy anyone." I don't agree with that conclusion.
Copying is one of the most efficient ways humans learn. Denrell didn't say stop doing vicarious learning. He said do it knowing that vicarious learning runs on top of undersampled failure. The act of building a benchmarking folder is fine in itself. What broke was that I treated the folder as the whole population. It was a survivor sample. What I should have been copying wasn't the actions of those people, but the action distribution of the cohort that started from the same position. That distribution includes the actions of the people who failed.
Translate this into practice and you get two things.
First, deliberately mix failed people into your benchmarking set. The "shut down" tag on r/SaaS, "I'm closing my product" threads on Indie Hackers, searches for "post-mortem" or "failed startup" on Twitter. The algorithm won't surface them. You have to type the keywords. For every surviving plane you study, intentionally study one that got shot down. Next to your benchmarking folder, build a post-mortem folder. Both have to be visible at once for the room to function as a room.
Second, look at your benchmarking subjects with the time axis fixed. Not yesterday's post from a current $10k/month indie, but their tweets from three years ago. Look at who was next to them three years ago. Count how many of them have disappeared. That's what I did with the 17 last year. Even adding the time axis once changes the scenery.
Listening to the Silence of the Dead
I still keep my benchmarking folder open. I still add new indies to it from time to time. I haven't stopped copying. What's different is that every time I open the folder I throw in one extra question: out of the 100 people who started from the same place at the same time as this person, where are the other 99?
The answer is almost always "I don't know." That's the honest answer. "I don't know" makes me move more carefully. "I don't know" lets me see this person's behavior as one sample instead of the whole population. Even when I copy, it pulls my estimate of the probability that this works from somewhere around 90% down to somewhere around 10%. That gap saves my schedule and my mental health.
$10k/month indies really exist. Among the 17 I tracked last year, some of them are clearly still alive. But next to them are planes that flew the same posture at the same time and never made it back to the room. When I can hear the silence of those planes, only then do I earn the right to copy the 17. Until then, copying is the fastest route to becoming another one of the planes that doesn't return.
I was that plane too. That's why I'm writing this.
- Wald, A. (1943). A method of estimating plane vulnerability based on damage of survivors. Statistical Research Group, Columbia University.
- Denrell, J. (2003). Vicarious learning, undersampling of failure, and the myths of management. Organization Science, 14(3), 227–243.
This article reflects information as of April 2026. The cited studies don't change with time, but my interpretation of them might shift with better data.