Greatness is the sum of all parts.

Copyright © 2025 All Rights Reserved by Arconic Capital Pty Ltd.
The Shipworm that Ate an Industry
Deen Khan


The Alto Viaje
Columbus' third voyage ended in jail. He had governed Hispaniola so badly that the Spanish arrested him and shipped him home. But he was persistent. After his release, he talked the king into one more voyage, to find a new passage to the Indian Ocean. In May 1502, he set off from Spain with four ships and over a hundred men.
He spent months searching the Central American coastline, but he never found the passage. What he found instead was the naval shipworm, which thrives in those warm Caribbean waters. It enters a wooden hull as a tiny larva, grows inside and eats the timber from within – leaving the outer surface intact. Columbus eventually left Panama with only two leaking ships, crawling north while his crew constantly pumped water. The shipworms had eaten his boats from the inside out.


Columbus beached in Jamaica. There was no Spanish colony on Jamaica. No seaworthy vessel. And no way to call for help. He sent his most loyal lieutenant in a canoe across a hundred miles of open ocean to Hispaniola (the colony Columbus had recently brutalised). The lieutenant made it, but the new governor of Hispaniola was in no rush to send help.
Back in Jamaica, the crew mutinied. The native Taíno, sick of feeding the Spaniards, started withholding food. Facing starvation, Columbus checked his almanac and saw that a lunar eclipse was coming. He told the Taíno that God was angry and would turn the moon blood red. When it happened, the Taíno were terrified. They rushed food to the ships, begging Columbus to intervene. Rescue boats finally arrived after more than a year. Columbus went home with no ships, no passage and no profit. He died eighteen months later.


A Country of Geniuses in Datacentres
“AI could eliminate half of all entry-level white-collar jobs within five years”
– Dario Amodei
Dario Amodei has been thinking about this longer than most. In 2024 he wrote fourteen thousand words on AI's upside: compressing centuries of progress in biology, neuroscience and economic development. A world of geniuses in datacentres. But he didn't have much to say on what happens to work – just that it was the "hardest question", without clear answers. Two years later, he answered it. Half of entry-level white-collar jobs gone within five years. Unemployment at twenty percent. Right now, AI augments human work, making us more productive. Soon, he argues, augmentation will give way to automation. The mechanism is quiet: CEOs stop hiring, let attrition thin the ranks, then replace humans with AI. On Anthropic's own hiring, he was blunt: they need fewer people on the junior and even intermediate end, not more.
A few things make AI different from previous technological revolutions. It's moving faster than we can adapt. It's broader – matching us across nearly all cognitive work at once, not just one task in one industry. And it eats from the bottom up, starting with entry-level work and climbing.


Rulers and Lesions
AI learns from examples. The middle of the skill distribution is where most examples live. Expert knowledge - the kind that was never written down, that lives in the judgement of people who spent years getting things wrong - is where the examples are thinnest. AI gets good at routine work first because there's the most to learn from. It stays bad at expert work because there's the least.
So the expert's role doesn't disappear. It becomes catching what AI gets wrong. And AI gets things wrong in ways only an expert would notice. It will find the easiest path to a reward, even if that means solving a different problem. A neural network trained to diagnose skin cancer matched dermatologists. Then researchers found it wasn't analysing lesions - it was looking for rulers in the photos. Dermatologists place rulers next to suspicious moles for scale, so the model learned ruler means cancer. An expert catches this because they know what right looks like.
A novice can't – from either direction. They can't define the problem precisely enough going in, because they don't know what they don't know, so AI solves a different, easier problem. And they can't spot the failure coming out, because the output looks right.
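The ruler shortcut can be reproduced in miniature. The sketch below is illustrative only – synthetic data and a toy perceptron, not the actual dermatology study: a spurious feature ("ruler present") perfectly predicts the label in training, so the model learns the shortcut and fails once rulers stop correlating with the diagnosis.

```python
# Toy reproduction of shortcut learning. Hypothetical data, not the real study:
# feature 0 is a weak, noisy "real" signal; feature 1 is the ruler flag.
import random

random.seed(0)

def make_example(cancer, ruler):
    signal = (1.0 if cancer else 0.0) + random.gauss(0, 1.5)
    return [signal, 1.0 if ruler else 0.0], 1 if cancer else 0

# Training set: every cancerous photo has a ruler, every benign one doesn't.
train = [make_example(c, ruler=c) for c in [True, False] for _ in range(200)]

# Train a simple perceptron.
w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(50):
    for x, y in train:
        pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
        err = y - pred
        w[0] += lr * err * x[0]
        w[1] += lr * err * x[1]
        b += lr * err

def accuracy(data):
    return sum(
        (1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0) == y for x, y in data
    ) / len(data)

# Deployment set: rulers now appear independently of the diagnosis.
test = [make_example(c, ruler=random.random() < 0.5)
        for c in [True, False] for _ in range(200)]

print(f"train accuracy: {accuracy(train):.2f}")  # near-perfect, via the shortcut
print(f"test accuracy:  {accuracy(test):.2f}")   # collapses toward chance
```

The model looks excellent by every metric computed on data that contains the shortcut. Only someone who knows what the model should be attending to – the lesion, not the ruler – would think to test it on photos where the correlation breaks.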
The Congregation Reads the Bible
For centuries the Catholic Church controlled access to scripture. The Bible was in Latin. Most people couldn't read it. Clergy interpreted on their behalf. Then Gutenberg built the printing press. Within decades the Bible was in common languages and ordinary people read it for themselves.
This didn’t lead to enlightenment. The Bible is complex, contradictory and context-dependent. Laypeople got the text without the training to interpret it. Hundreds of competing sects emerged. Religious wars consumed Europe. Gutenberg didn't democratise understanding. He democratised access.
Before AI, producing professional-quality work required professional-level skill, or the money to hire it. That kept the number of producers roughly aligned with the number who could check what was being produced. AI breaks that alignment. More people producing work. Not more people who understand it. And the better AI gets, the worse this becomes - more convincing output makes it harder for a novice to spot when it's wrong.
The same gap – access without understanding – is now opening in every profession AI touches. Software got there first. Vibe coding made production accessible to anyone with a prompt. And like the Bible, code looks the same whether or not the person who produced it understands it. Tea, a dating safety app that hit number one on the App Store, was built by a non-technical founder. Seventy-two thousand user images leaked. Over a million private messages followed. The database had no authentication. Tea isn't an outlier – CodeRabbit found that AI co-authored code had nearly three times more security vulnerabilities than human-written code. Access without expertise, at scale.
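The failure class behind the Tea breach is worth seeing in code, because it illustrates why the output "looks right". A hypothetical sketch – not Tea's actual stack – of a record lookup with no authentication, next to the one-check fix:

```python
# Minimal sketch of a broken-authentication bug. Hypothetical data and
# function names, not Tea's actual implementation.

MESSAGES = {
    "msg-1": {"owner": "alice", "text": "private note"},
    "msg-2": {"owner": "bob", "text": "another private note"},
}

def get_message_insecure(message_id):
    # No authentication: anyone who can guess or enumerate an ID gets the data.
    # Functionally, this "works" – which is exactly why a novice ships it.
    return MESSAGES.get(message_id)

def get_message(message_id, requesting_user):
    # The fix is a single check: the caller must own the record.
    record = MESSAGES.get(message_id)
    if record is None or record["owner"] != requesting_user:
        return None  # deny by default
    return record

print(get_message_insecure("msg-1"))    # leaks Alice's message to anyone
print(get_message("msg-1", "bob"))      # None: denied
print(get_message("msg-1", "alice"))    # Alice reads her own message
```

Both versions pass the novice's test – ask for your own message, get your own message back. The insecure one only fails when someone asks for a message that isn't theirs, and that is precisely the case a novice never thinks to try.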


The Hollow Hull
Dario describes how AI eats the skill distribution from the bottom up. He doesn't describe the shape of what's left behind.
It takes over entry-level work first, then climbs through progressively more senior roles, until only experts remain. At the same time, it pulls in people who were never on the distribution at all. A small business owner who never would have hired a lawyer drafts contracts with AI. A founder who can't code ships a feature. The result is a dumbbell. Experts at one end – people whose judgement AI can't replace, who set the goals, handle the edge cases and decide whether the answer is right. And a much larger group of AI-augmented novices at the other – producing work they couldn't produce alone, but unable to tell whether it's right. Everything in between is hollowed out. The surface of these industries still looks intact. The structural middle is being eaten from within.
This is already happening. Anthropic's own 2026 labour report found that while overall unemployment in AI-exposed jobs hasn't risen, hiring of twenty-somethings has slowed. Entry-level software postings dropped sixty-seven percent. Senior roles are up fourteen percent. And it's moving beyond the bottom. Klarna froze developer hiring. Block cut forty percent of its workforce, explicitly because of AI.
The Rebundling
The word "computer" used to be a job title. During World War II, two hundred women calculated artillery trajectories by hand. Then the army built ENIAC - thirty tons of vacuum tubes that did in twenty seconds what took them three days. The women weren't replaced. Six became ENIAC's first programmers. They'd done the maths by hand, so they knew what right looked like.
Here's where a lot of people get it wrong. They think about jobs. But a job isn't one thing. It's a bundle of tasks. Automate some and the rest don't disappear - they get promoted.
ATMs were supposed to replace tellers. They didn't. They replaced the counting cash part. Once tellers didn't have to do that all day, they started advising customers on loans and mortgages. Banks opened more branches. Teller employment actually rose. Spreadsheets didn't kill bookkeepers either. They killed the arithmetic. Bookkeepers stopped adding columns and started asking better questions.
Every time, the bundle of tasks changed but the job survived. And every time, it worked because people still did the work themselves. They still touched the numbers. Still built a feel for what was right and what wasn't.
Every previous technology also had a ceiling. ATMs couldn't learn to advise customers. Spreadsheets couldn't learn to ask the questions that replaced the arithmetic. The new version of the job was safe because the technology couldn't follow humans into it. AI might not have that ceiling. It may be able to learn the new jobs too.
And the rebundling never happened by accident. Banks deliberately rebuilt what tellers did. The ENIAC women taught themselves from schematics because no one else was going to.
Right now, nobody is rebuilding anything. Companies are removing tasks from jobs - hiring freezes, attrition, replacing people with AI - without putting anything back. The path from novice to expert isn't being rebuilt. It's just disappearing.
The shipworm didn't sink Columbus' ships in a day. It ate slowly, from within, while the surface held. The hull looked fine until it wasn't. By the time the crew noticed, the wood was gone. History says the jobs will rebundle. But it also says that takes intent. Someone has to build the new middle. That's the part nobody is doing.
The Missing Rung
The problem with the dumbbell is there’s no path from one end to the other. Every expert alive got there through the middle - years of getting things wrong and learning why. That middle is the only pipeline from novice to expert. AI is hollowing it out.
The expert end shrinks. People retire, burn out, leave. Nobody's coming up behind them. Companies stopped hiring juniors because AI does junior work. But junior work was never just labour. It was training. Now you ask AI, get the fix, and move on. The friction was the education. Remove it and the education disappears with it.