From Solder Smoke to Silicon Clouds
- Rich Washburn


This all started with a phone call.
An old friend of mine, Boris — a fellow IBM alum and one of the few people who still remembers what IRQ conflicts felt like — called me out of the blue a few weeks back. He had a question about AI. Simple enough.
But if you’ve ever talked to two lifelong tech guys, you know how that goes.
Five minutes in, we were no longer talking about AI — we were talking about everything that led to AI. We fell straight down the nostalgia rabbit hole: Altairs, Sinclair machines, luggables, 300-baud modems, and the miracle that was Windows 3.1. It wasn’t planned, it just… happened.
Somewhere between “remember COM ports?” and “holy crap, you carried that IBM thing to school?”, I realized this was more than a catch-up call. This was the conversation — the thread that ties my entire life in tech together. I probably should’ve written it two months ago on my 51st birthday, but here we are.
Because what struck me that night was this: forty-something years ago, I was talking to a computer — literally, talking to it — by flipping switches on an Altair. It didn’t say much back, just a few blinking LEDs and maybe a beep or two, but that was enough. It was the start of a dialogue.
And here I am now, dictating this story to an AI. Same impulse. Same curiosity. Same need to connect with the machine — just a whole lot more vocabulary.
Back then, the conversation was binary. Today, it’s natural language. But it’s still the same pursuit: to make the fiction real. Just like the kid who watched Star Trek, heard Spock mention an ion drive, and grew up to build one for NASA. Or how we all laughed at tricorders and now walk around with iPhones doing half of what the Enterprise could.
This is the ride we’ve all been on — the journey from soldered boards and serial ports to a world where tech doesn’t sit on your desk, it lives around you.
So yeah — this isn’t a manifesto. It’s a memoir. A look back at how a curious kid with a screwdriver ended up here, talking to AI, still asking the same question I asked at eight years old:
“What can this thing really do?”
Before the Blink: The Maker’s Instinct
Before computers, there were clocks. And radios. And anything else I could take apart before someone caught me. I wasn’t trying to break stuff — I was trying to understand it.
By seven, that curiosity turned kinetic. My neighbor, a cop, helped me piece together what might have been one of the first “hybrid electric” go-karts ever made. We hacked together a weed-whacker engine that spun an alternator to charge car batteries, then used a police cruiser’s starter motor for drive power.
That little monster could run for seven minutes on battery alone — silent mode. Everyone else’s go-karts screamed like lawnmowers; mine glided. It had headlights, taillights, switches. I didn’t know it then, but that was my first lesson in engineering for function and cool factor.
That was the beginning — the moment I learned that with enough stubbornness and scrap metal, you could make your own future.
The Early ’80s: The Altair and the Birth of Code
In 1981, when I was eight, I built my first computer — an Altair kit. It took two summers, a soldering iron, and the patience of a saint.
It didn’t do much — just blinked and beeped. But it was mine. And it spoke a language I could learn. Binary. Zeroes and ones. It was like talking to an alien, and I loved every second of it.
That Altair wasn’t about productivity or entertainment — it was about possibility. The idea that a kid could build a brain.
The Mid-’80s: Sinclair, Apple IIc, and the TRS-80 Era
Then came the Sinclair ZX — that little black-and-orange wonder with the squishy membrane keyboard. It plugged straight into the TV and made me feel like I was hacking WarGames.
After that, the Apple IIc. Clean design, compact, functional. That’s where I learned BASIC. That’s where I realized you could make the computer do something for you.
And then there was the TRS-80 Model 4 — my Frankenstein machine. Twin floppy drives that I eventually replaced with a dual-bay setup: one 5.25”, one 3.5”, and a 10MB hard drive that made me feel like I owned the Library of Congress. When I could boot that thing without a disk, it felt like witchcraft.
The Late ’80s: Luggables and Lunacy
Enter the IBM Luggable. Portable in the same way a bowling ball is portable. I carried that beast to school, one arm permanently longer than the other. It had a green screen I swapped for amber, then color — because who wants monochrome when you can have glory?
It ran at 4.77 MHz and had a 20MB hard card. I loved that thing. I loved what it represented — freedom. Computing without tether. Never mind that I could barely lift it.
That Luggable taught me that innovation doesn’t have to make sense to be worth chasing.
The 1990s: When the Interface Found Its Face
The 286. Cirrus Logic graphics. Windows 3.1. That was the moment the command line gave way to icons. For the first time, computing felt human.
Add a Sound Blaster card, and the machine could sing. I remember ripping a single CD track onto a 20MB drive — it filled the entire thing — but pressing play felt like summoning the future.
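That wasn't hyperbole. Uncompressed CD-quality audio (44.1 kHz, 16-bit, stereo) runs about 10 MB per minute, so a 20MB drive tops out at roughly two minutes of raw sound. A quick back-of-the-envelope sketch:

```python
# Why one uncompressed CD track could swamp a 20MB drive.
SAMPLE_RATE = 44_100   # samples per second (Red Book CD audio)
SAMPLE_BYTES = 2       # 16-bit PCM
CHANNELS = 2           # stereo

bytes_per_minute = SAMPLE_RATE * SAMPLE_BYTES * CHANNELS * 60
print(f"Raw CD audio: {bytes_per_minute / 1_000_000:.1f} MB per minute")  # ~10.6 MB

DRIVE_MB = 20
minutes = DRIVE_MB * 1_000_000 / bytes_per_minute
print(f"A {DRIVE_MB}MB drive holds about {minutes:.1f} minutes of raw audio")  # ~1.9
```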
I worked part-time at a small computer shop then, fixing, building, and arguing with DOS 3.3 through 5.0. It felt like everything was accelerating.
We thought tech was evolving fast back then — but if we saw today’s AI curve in 1993, we would’ve lost our minds.
Dial-Up Dreams and Midnight Downloads
The sound of dial-up still triggers nostalgia and mild trauma. The screech, the handshake, the hope.
We were digital insomniacs, timing downloads for 1:00 a.m. because it was the only hour no one else needed the phone line. 300 baud was entry-level. 14.4K was god-tier. A 1MB download was an all-nighter.
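Those numbers hold up. A rough sketch of the arithmetic, using nominal line rates (real throughput was lower once you factor in start/stop bits, line noise, and retrains), with 2400 baud thrown in for comparison:

```python
# Best-case download times for a 1MB file at classic modem speeds.
# Nominal bits-per-second line rates; real-world throughput was lower,
# so treat these as floors, not averages.
FILE_BYTES = 1_048_576  # 1 MB

for name, bps in [("300 baud", 300), ("2400 baud", 2400), ("14.4K", 14_400)]:
    seconds = FILE_BYTES * 8 / bps
    hours, rem = divmod(seconds, 3600)
    print(f"{name:>9}: {int(hours)}h {rem / 60:02.0f}m")
# 300 baud comes out near eight hours: the all-nighter was literal.
```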
Connection wasn’t automatic. It was earned. Every successful handshake felt like winning the lottery.
The Early 2000s: The Wi-Fi Awakening
Wi-Fi didn’t arrive as a neat little router you grabbed at Best Buy. No — the first Wi-Fi setups were janky, beautiful chaos.
Back at IBM, I got my hands on a demo system: a wired Ethernet bridge connected to a pair of IBM 802.11b PCMCIA cards — the first Wi-Fi cards I ever saw.
They gave me the box for a month and told me to return it, but said I could keep the cards. So naturally, I found a way to keep the Wi-Fi, too.
Using Windows ME (yes, that Millennium Edition), I managed to share one machine’s internet connection wirelessly with another laptop downstairs. It worked. Sort of. And when it did, it felt like magic.
The first time I loaded a web page without a cable plugged in, I just sat there and stared. No wire. Just air.
That moment — that first leap into invisible connectivity — was when the world changed. The cord had been cut.
The Cloud Revolution (and the “Fire” Skeptics)
Around that same time, “the cloud” wasn’t a thing yet — it was an argument.
We were talking about virtualizing infrastructure, moving compute off-prem, centralizing data, and people lost their minds. I remember legitimate engineers saying, “You can’t do that — it’ll all burst into flames.” Like actual flames.
It felt like the early days of nuclear testing — “Are we sure we’re not going to destroy the atmosphere?” But we did it anyway. We virtualized. We scaled. And in doing so, we quietly rewrote the physics of IT.
Those early skeptics accidentally predicted something true — just not the way they meant it. Because yes, the cloud did catch fire. It ignited an entirely new era of connectivity.
The Wild West of Mobile
The 2000s were a circus of phone designs. Flip, slide, twist, swivel — every shape imaginable. Each manufacturer was in an arms race to invent the future in plastic and pixels.
Then the iPhone landed, and the noise stopped. The screen became the device. Everything flattened.
We stopped holding phones. We started holding portals.
The 2010s: Ambient Technology
By the 2010s, tech had disappeared into the background.
Wi-Fi was as taken for granted as oxygen. Bluetooth was everywhere. The cloud wasn’t a buzzword anymore; it was infrastructure.
Windows CE quietly lived on inside kiosks, car systems, and forensic gear.
The stuff we once saw as devices became invisible infrastructure.
Technology stopped being something you used. It became something you lived inside of.
The 2020s: AI and the Great Compression
Now here we are — the AI decade. The culmination of everything that started with that Altair.
Keyboards, mice, monitors — they’re slowly dissolving into natural interfaces: language, vision, voice. Soon, maybe just thought.
We went from carrying 20 gadgets to carrying one. And in another decade, maybe we’ll carry none.
Tech is collapsing into invisibility — merging with our senses, embedding in our environments. We’re not just using it anymore. We’re co-evolving with it.
Epilogue: Magic Wands and Missed Chances
After fifty years of watching this evolution firsthand, here’s the truth: humanity’s surrounded by magic — and half of us never wave the wand.
We’ve built tools that can automate, amplify, and augment almost everything we do — and yet, we still get stuck in the mundane. We built technology to free us, but we often let it cage us in comfort.
Still, I’m optimistic. Every few decades, something comes along that resets everything. The Altair. The Internet. Wi-Fi. The Cloud. Now AI.
And just like back then, I find myself right where I started — talking to a machine that’s listening, learning, and responding.
Only this time, it’s not blinking back in binary.
It’s answering in kind.
Because the truth is — whether it’s 1981 or 2025 — I’m still chasing the same spark, asking the same question:
What can this thing really do?
