Most people, when they turn on the tap, don't know where the water comes from. Try asking someone, "What is the physical principle that makes the water come out of the tap? How do they make it come out?" You might be surprised how many people don't know and, more importantly, don't care.
The water comes out. The water has always come out, every time, so it's not really a thing worth investigating. Like the sunrise.
In many many domains I am that person.
If a person doesn't know (except in the vaguest terms) where their water comes from, where their poo goes when they flush, where their food comes from (the supermarket!), or the energy that heats their home... what do they really know? Most of us know very little about the concrete networks and systems that keep us alive.
I feel a certain discomfort when I don't know these things. Knowing is annoying, and yet it brings me many things as well. When I drive off in my car, I half visualize the clutch plates coming into contact with each other, and so on.
But my kids call any internet connection WiFi. My wife didn't understand why she couldn't print with the WiFi switch off (back when we had switches). And every time I try to explain "how the internet flows", I take them to the hotspot and tell them what WiFi is, how the UTP cable goes to the modem and the fiber goes into the ground and somewhere it gets information from some other computer. And I tell them why they have fewer issues with our local Minecraft server than when one of them gets invited to a world on someone's PlayStation across town (in Bugrock).
It's tiring in a way, even more so for people around me. And still, it also brings me many nice things.
> What is the physical principle that makes the water come out of the tap? How do they make it come out?
Curious coincidence, I was literally thinking yesterday: “but why does the water come out of the tap?” I self-answered “must be the pressure somehow” but did not dig much more…
The post raises several points that I wholeheartedly agree with, but the framing is poor and honestly kind of elitist (or just short-sighted). Maybe to the point that I think much of it might just be bait, lol. For example:
> Ask a twenty-two-year-old to connect to a remote server via SSH. Ask them to explain what DNS is at a conceptual level. Ask them to tell you the difference between their router’s public IP and the local IP of their laptop. Ask them to open a terminal and list the contents of a directory. These are not advanced topics. Twenty years ago these were things you learned in the first week of any serious engagement with computers.
What? Computers were everywhere in all kinds of domains by 2006, but you can bet that your average accountant of the time would most likely not be able to SSH into a server (nor should they need to...) I guess it really depends on what the author qualifies as a "serious engagement with computers."
They've basically got the dates wrong. It would make sense if they'd said 35 years ago; that's when it was common to know that.
I'd say almost all of that became redundant for the average person with the Windows 3.1 release (34 years ago) or, maybe more so, with Windows 95 (31 years ago).
I remember desperately trying to get two computers to talk to each other so we could play Doom in the early 90s; whatever black magic we had to do seemed to take hours to get working.
The time we had 3 or even 4 computers playing Baldur's Gate together, I swear we started trying to get the computers talking at 7pm and didn't start playing till 10 (but it was amazing).
> Ask a twenty-two-year-old to connect to a remote server via SSH. Ask them to explain what DNS is at a conceptual level.
Modern IT has become a ubiquitous commodity, much like the car. You don't need to know how an engine works to drive; while that knowledge might make you more efficient, it isn't strictly necessary to get from A to B. Besides, most twenty-two-year-olds ten years ago didn't know how to use ssh, either.
However, if you want to call yourself an engineer (and work in the field), you must understand the underlying mechanics. IMHO if you want to defeat a competitor today, you don’t need industrial espionage - you just have to cut their internet and/or AI subscriptions. Modern vibe engineers would struggle to function.
> The man page is dead for most users. The RFC is unread by most developers who depend on the protocols it describes.
Well, those who are accustomed to using man pages still use them today. I find them far more accurate than whatever an AI might spit out at any given moment. As for RFCs, they were always read by a small population - either those implementing the protocols or the few of us who like to brag about obscure technical details.
> You can now write complete programs without understanding what a single line of them does... until something goes wrong in production at two in the morning and you are completely without tools to respond.
I’m not worried about this. When things go south, there will still be experts who will know how to fix them. But since those experts will be fewer and farther between, they will likely charge $1k/hr, and rightfully so. If you are in that field, more power to you! :D
I'd used computers for about 35 years before the first time I tried to "connect to a remote server via SSH". Go figure.
DNS is a phone book, I think!
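The phone-book analogy is about right, and it's easy to make concrete from the standard library: hand the resolver a name, get back a number. (A sketch; `gethostbyname` may consult `/etc/hosts` before any actual DNS server, but the name-in, address-out shape is the same.)

```python
import socket

# DNS as a phone book: look up a name, get an address back.
addr = socket.gethostbyname("localhost")
print(addr)  # typically 127.0.0.1
```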
But yeah, maybe "bad examples" by the author.
The one that really confuses me is this, though:
> You’ve built a generation that can’t extract a zip file without a dedicated app and calls it innovation.
Sorry, what are you saying? Software exists to unzip files. It used to be a "dedicated app" like WinZip, 7zip, WinRAR, etc. Now it's built into Windows. Or you use the 'unzip' command in Linux.
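And even the "dedicated app" framing undersells it: zip handling has been in standard libraries for decades. A minimal round trip with Python's `zipfile`, no extra software involved (file names here are just for illustration):

```python
import pathlib
import tempfile
import zipfile

with tempfile.TemporaryDirectory() as tmp:
    archive = pathlib.Path(tmp) / "demo.zip"

    # Create a small archive...
    with zipfile.ZipFile(archive, "w") as zf:
        zf.writestr("hello.txt", "hi there")

    # ...and extract it again, no WinZip required.
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(tmp)
    print((pathlib.Path(tmp) / "hello.txt").read_text())  # hi there
```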
I don't think the average power user needs to understand DNS. Knowing just that it can be changed (and can fix things or break things and whatnot) is probably already plenty.
Connecting to SSH seems like something a "power user" should be able to learn but not necessarily know already (probably more likely they know what a VPN is)
> However, if you want to call yourself an engineer (and work in the field), you must understand the underlying mechanics. IMHO if you want to defeat a competitor today, you don’t need industrial espionage - you just have to cut their internet and/or AI subscriptions. Modern vibe engineers would struggle to function.
True, but on the other hand: when I started programming (hell, even before the whole LLM craze began), if you took away my internet/Stack Overflow/Google I would also have drastically lost productivity. Especially in my more junior years. Later, of course, I could still write code, but if I had to figure out how a certain library worked or why a certain error happened in the auth layer, without the internet I would be nowhere.
> These are the same people who would lose their minds if their city government told them they could only buy food from vendors the city had approved, licensed, and taxed
But it is exactly like this in the developed world, and not many would buy food from a trunk of a roadside car.
I agree with the sentiment but to a certain degree you can vote with your currency. Also, in many places you can certainly vote for elected officials who are interested in using government tools to prevent and breakup monopolies.
...but you can. Open your Facebook Marketplace (I know, I know) and just type in any food you like. You can buy it there, made by people like you. Risky, maybe, but you can.
It obviously depends on local laws, but it's very commonly illegal to sell prepared food without a license/permit. You might not get caught selling food on FB Marketplace, but that doesn't make it any less allowed.
I agree with the author regarding Apple's walled-garden app distribution, but the analogy just doesn't work here.
That's... Normal. Technology has always been moving towards higher-level abstractions. In terms of software, many engineers nowadays know how to code in high-level languages like JS or Java, while maybe 30-40 years ago many folks probably knew C, assembly, and all the low-level stuff like e.g. explicit memory management that most modern devs never deal with.
Most likely because the cutoff for ATDT (the Hayes modem dial command) being any use to most users was the mid-1990s. I still like the fact that it is still used millions of times per day by cell phones.
DNS and SSH were/are things 'techie' people know. I can assure you most people had no idea what their IP was or what DNS was. As the "hey, my computer is acting kinda slow, can you look at it" guy, it felt like they actively sought not to know. Honestly, cellphones and tablets have basically ended my endless side job of 'hey, can you look at my computer', because they hid all of that techno junk that is interesting to me but to most people isn't.
What I'm saying is that being a 'power user' is not a static thing, it's relative. It changes and evolves (or downgrades) over time.
But most importantly, and what the author missed, is that it works both ways. I know how to connect to a BBS, but I was literally paralysed by the fact that there is no LAN game in Counter-Strike 2. Where is LAN?! Why do I need a Steam account for every player to play with friends sitting in the same room? Why would I even need external servers for this?
idk, it's a modern world and I don't belong to it, so perhaps we should accept the slow death of the 90s and 00s 'power users', and the rise of new 'power users' of the 20s, who won't even know what the floppy disk icon on the Save button means.
That is totally fair. I never really looked at it that way before. I just see it as another blob of skills to add to my growing list of useless knowledge. :)
So true. That's why I love the Google interview question: how does a modern browser work? (Basically one needs to understand the entire journey from a file on the server, through DNS, to the browser's rendering engine.) Sadly, this is becoming lost to many folks.
Pre-AI, I worked with devs who didn't even know what an HTTP request is (the difference between GET/POST/etc.); we were building enterprise software where higher-level libraries abstracted that away. With AI, it's becoming even worse now: just ask Claude.
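What those libraries hide is just structured text on a socket. A sketch of the two request shapes (the host and paths here are made up for illustration): a GET carries its parameters in the URL and has no body, while a POST carries them in the body and must declare its length.

```python
# The raw text of an HTTP/1.1 GET request: parameters in the URL, no body.
get_request = (
    "GET /users?id=42 HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "\r\n"
)

# A POST moves the data into the body and announces its size.
body = "name=Jo"
post_request = (
    "POST /users HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "Content-Type: application/x-www-form-urlencoded\r\n"
    f"Content-Length: {len(body)}\r\n"
    "\r\n"
    f"{body}"
)

print(get_request.splitlines()[0])   # GET /users?id=42 HTTP/1.1
print(post_request.splitlines()[0])  # POST /users HTTP/1.1
```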
I don't know if I agree. Look at modern application development: even building the most basic GUI application in any platform, but especially the web, is significantly more "power-user" than Visual Basic ever was.
People are still building these apps at every level of developer experience so the kids must be alright.
Yes people, in general, have less knowledge of file systems and networking now because things just work. Every LAN party as a kid took at least an hour of networking to get started. Now kids don't do LAN parties because everything is already networked by default. But there is also an order of magnitude more people doing these things now -- in the past those people just went without.
The vast majority of the knowledge I had from the time the author describes is obsolete now anyway. I can still remember so much, but that's not going to help me with my React app.
I take issue with this. Knowledge like connecting to a remote server via SSH or explaining what DNS is have ALWAYS been niche topics. The article claims these are things you learned in the first week of "seriously engaging" with a computer twenty years ago, and that's just false.
You see it in how services have twisted the meaning of "saving" something. Very rarely does it mean actually putting a file on your computer for you to access on your own terms as long as you have possession of it. More often, it just means associating something to your account, which is ultimately subject to the whims of the service provider.
The purpose of what he describes is to preserve the structure of denial around the implications of possessing, and the consequences of using, any or all technology. Politics as found -- war, law, class, etiquette -- depends entirely upon that structure of denial.
"Don't Make Me Think" wasn't descriptive. It was a prescription that became descriptive. An entire professional class, PMs and UX designers, adopted as axiomatic that cognitive effort is friction, friction is bad, therefore understanding is a design failure.
Then they spent 25 years engineering understanding out of every single interaction, and now point to the resulting learned helplessness as validation. "See? Users don't read!" No, you spent decades training them not to by ensuring that reading was never rewarded and never necessary.
My read has always been it was painful for a certain type of PM to think and so they assumed "minds like mine" and ... here we are.
> “I have nothing to hide” is the response, which is not an argument — it’s a thought-terminating cliché that makes it socially awkward to point out that privacy is not about criminality, it’s about power. Whoever holds your behavioral data holds power over you. That’s true whether or not you’ve done anything wrong.
To put a slightly finer point on it for puncturing the cliche, "needing to hide" has a time-component. Everyone has something to hide from a potential future, whether they're good at predicting it or not.
I "have nothing to hide" about my religion today, but if extremists seize power and declare "death to apostates", the exact same fact-pattern will very much need hiding.
> The industry isn’t going to fix this. Every financial incentive points the other way.
Cory Doctorow has a hopeful--perhaps over-hopeful--idea that a disruptive wedge can be created, where a profit-motive will promote breaking the system of control. Specifically, that some place with a legal haven for tinkerers and wall-breakers will reap benefits from letting them openly sell device-unlockers, export-your-data tools, etc.
Maybe you haven't worked with someone like this? You ask them to do the simplest thing and they can't. And not like, they don't know so they go figure it out, they are just like "I don't know how to do that". Then they attempt to engineer (er, vibecode) enormously complicated solutions to solve problems that aren't problems for anyone else on the team, because other folks on the team know how to use a terminal or text editor. Like I asked someone (an engineer!) to open a text editor and he said "What's that". It's truly bizarre.
While on one hand I suggest revisiting the old Ironies of Automation by Lisanne Bainbridge on the other I've noticed that what's really missing everywhere is IT education. I come from a Computer Engineering background and I saw absolutely NOTHING regarding IT except for a few mentions here and there; I had to teach myself. Those coming from CS see even less (at least speaking for EU universities, I don't know about the US but I suspect it's not much different).
There's an attempt to deny the need for IT knowledge and expertise at every level, Big Tech does it out of self-interest, while most others do it out of ignorance. They often claim, "Oh, it only takes a few minutes or days on your own; just a couple of clicks and you can do everything." Yet, those who say this don't actually know what they're talking about and refuse to even try to prove their own theory.
The outcome is even worse: nowadays, doing it yourself is a struggle even when you have the right skills. All recent software is built to be unmanageable because there's no operation/infra vision. Don't even get me started on documentation; everyone talks about the need for a "documentation culture" yet what actually gets documented ranges from nothing to total garbage (basically text that's useless unless you feed it to an LLM and hope it can make some sense of it).
To make matters worse, standard hardware is getting more and more expensive (first graphics cards, then RAM, now NVMe drives), with the result that many people simply don't want to or can't afford to buy, so they're literally living on someone else's computers even if they don't like it. This is especially true for students, who are at the best stage of their lives for learning and who won't have the time or energy to do it later on.
To complete the picture, the business model just isn't sustainable; no matter how much is invested, a real digital evolution isn't possible while living on the computers of four giants limited by their own services, and this implies that a social collapse awaits us regardless.
For me, the solution is gaining enough leverage to push for mandatory FLOSS and open hardware de jure, in order to limit the damage, plus geopolitical upheavals that push everyone to relocalize, which necessarily implies starting over on a small scale. I see something coming: Nostr, Meshtastic, the Fediverse, and the rise of self-hosters (and their average age) show that there's still an active group of people who want a different world. But they are few and far between, burdened by significant technical debt in a world that's becoming increasingly hostile, and that's exactly where things need to change.
The problems caused by centralization, from various companies getting burned by relying on giant third-party providers, to banking scandals driving crypto (not stablecoins), to the need for resilience that requires cutting down on single points of failure, might actually make a difference. I hope it'll be enough, and I hope anyone who gets it does their part to spread that understanding while we still can.
I find the author is projecting heavily: having entirely bought into the Unix way of doing things, he's become his own jailer. That a filesystem is somehow "fundamental" to the function of a computer? Nope, it's an abstraction designed to help the programmer, and it should be thrown away. These kinds of unified approaches are going to lose to specialization every time. There are endless things to understand; deprecated tech is at the very end of that list.
If it were only the Apples and Googles who thought sandboxed apps were the future, you might have a point, but most tech-savvy people arrive at something that looks an awful lot like sandboxed apps. You see power; most see[1] a[2] dumpster-fire[3].
I read the text. It's something I would agree with in principle, but the text is exceedingly whiny and in many ways wrong.
Power users, tinkerers, and so on were always extremely niche. By definition they were always only a few. They are still a few, probably in similar numbers as before.
The only thing that changed is that normal people now have access to computing devices. My wife does not want to know what a file system is, or what happens behind the scenes when an app is installed. She has no idea what a DNS is. Why would she? She is a lawyer with little interest in technology. She wants to use Instagram, not self-host a Matrix instance.
The normal users are the majority, and it's more profitable to serve those users than assholes like me who get pissy when they can't sideload an apk.
> Mobile Platforms Did the Most Damage, and They Did It on Purpose
So true, but this has been going on for quite a while. Phones accelerated it, and I have seen many of these concerns come up in IT where I worked.
A couple of examples:
1. My favorite, from about 10 to 15 years ago: a user said a finance report always had 2147483647 in the total. Another group looked at this for weeks.
After a few weeks our manager's manager called a meeting with everyone to look at the issue. Nobody had any idea what to do. When I saw the number, it looked familiar, and I realized it was the maximum value of a 32-bit int. I told them the variable holding the total was too small. A simple change fixed the issue.
Another old programmer who was not at the meeting asked me what happened. I showed him the report and he knew instantly what it was too.
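Why that number looks familiar: 2147483647 is 2^31 - 1, the largest value a signed 32-bit integer can hold. A quick sketch of the arithmetic (the helper below simulates how a C or Java `int` would store the sum; Python's own integers don't overflow):

```python
import struct

INT_MAX = 2**31 - 1
print(INT_MAX)  # 2147483647 -- the number stuck on the report

def add_int32(a, b):
    # Keep the low 32 bits of the sum, then reinterpret those bits
    # as a signed 32-bit value, the way a C/Java int would.
    return struct.unpack("<i", struct.pack("<I", (a + b) & 0xFFFFFFFF))[0]

# One past INT_MAX, a 32-bit total wraps around instead of growing:
print(add_int32(INT_MAX, 1))  # -2147483648
```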
2. Hex dumps: no one can read them now. About 25 years ago I was looking at a dump to find where a packed numeric value was, and people who saw me thought it was magic. I had to explain how that number was read and what the hex represented.
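A "packed numeric value" in a mainframe-era dump usually means packed decimal (COBOL COMP-3): two decimal digits per byte, with the sign in the final nibble. Assuming that format, a minimal decoder looks like this (the example byte values are illustrative, not from the original dump):

```python
def unpack_comp3(data: bytes) -> int:
    """Decode a packed-decimal (COMP-3) field: two digits per byte,
    with the last nibble holding the sign (0xD = negative)."""
    digits = ""
    for b in data[:-1]:
        digits += f"{b >> 4}{b & 0x0F}"   # high nibble, then low nibble
    last = data[-1]
    digits += str(last >> 4)              # final digit
    sign = -1 if (last & 0x0F) == 0x0D else 1
    return sign * int(digits)

print(unpack_comp3(bytes([0x12, 0x34, 0x5C])))  # 12345
print(unpack_comp3(bytes([0x00, 0x98, 0x7D])))  # -987
```

So the hex `12 34 5C` on screen really does spell out 12345; the "magic" is just knowing the nibble layout.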
I fear what will happen if AI becomes a real thing.
Thank you for sharing this article. I found it very insightful while eloquently proposing root causes. In particular, I found this bit to be wonderful: "The smartphone didn’t just shift computing to a smaller screen. It replaced a computing paradigm — one built on ownership, modification, and composability — with a consumption paradigm built on managed access, curated experience, and dependency."
On one hand, it's a "people problem"... on the other, it's a software problem too.
"Back in my time", you'd have some C code and one .conf file; you'd run "make", edit the config (or hope it works with default settings), run it, and you'd have a program running. Now you need five different services running; it comes in a Docker container, running on some random port, proxied to another random port; the configs are split into 12 YAML files; plus it needs 7 gigs of disk space...
..sometimes providing the same functionality as the old 300kB software of the yesteryear.
I wonder how much of this complexity is a form of resume driven development. Jobs are webdev with a host of microservices and a datastore and and and... This leads people to building apps in that style instead of a command line tool (or non-electron GUI application) to get the same job done.
I think this is also a big part of it, the whole "system" has become so complex and broken into so many different service layers that no one can be a power user over it all. Which somewhat resembles the "death" of the power user
> All of this was sold as a feature. “It just works.” Safety. Privacy. User experience. What it actually was, was control — Apple’s control over what you could do with hardware you supposedly bought.
Can't both of these be true?
Apple, Microsoft, and the evil tech companies us nerds love to vilify actually brought computing to the masses. In the early 1980s only tech geeks and corporations used computers—the Apple Macintosh changed that. In the early 1990s, only tech geeks and universities used the internet—Microsoft Windows changed that. In the mid 2000's, only tech geeks and business people used "smart" devices[1]—the iPhone changed that.
At every technological leap, business-savvy entrepreneurs saw an opportunity to expand their markets by making their products enticing and useful to millions more people than the previous generation of products did.
Unfortunately, this also came at the expense of the apparent "dumbing down" of computers, as every new abstraction hid more of the actual computer users had to interface with. And it also made things easy to control and lock down for corporations.
But I don't think we would've seen the explosion in the popularity of computing had this played out any other way.
I also disagree with the article's premise that power users are dying. We're still here, but we're a tiny minority of computer users now. We're both amused and frustrated at the insanity of where technology is taking us, and who is leading us there, but we still have our corners of computing we can retreat to.
And I also disagree that our favorite layer of computing is somehow more "real" than anyone else's. We scoff at Gen Z's inability to use the terminal as much as Baby Boomers scoffed at our inability to program in assembly. It's all relative. Except "AI". That is more of a disabler than an enabler, even though we're too hypnotized to see it now.
[1]: Yes, the BlackBerry was a cultural phenomenon, but it didn't have the capabilities nor mass appeal of the iPhone.
I see this even within the developer community: people who understand systems programming are becoming rarer and rarer. Developers constantly look for GUI solutions and don't bother learning how to script. I think this stems from a similar cause as the smartphone: we software engineers separated the discipline of DevOps from Software Engineering, which led to people not caring about the infrastructure behind their solutions.
I agree with what I read of the article before stopping because of how AI this reads.
The user's other blogs also have the same telltale em dash usage and usage of “” instead of "". Reading obviously AI generated text feels unpleasant. I don't know how much thought was put into this. Is this just "hey Claude write me a blog post about the death of the power user"? Maybe they just used AI for proofreading. I can't really know but it's hard to justify putting in effort to read this when there could have been zero effort put into writing it.
By default on a Mac, "smart quotes" is enabled that turns "" into “”. I don't think that's a sign of an LLM. I believe Microsoft Word on all platforms does this too.
I found the article tedious, even though I agree with its main point. However, the so-called “obvious” signs it was written by an AI—quotation marks and em-dashes—are within the ability of anyone with a Compose key (on Linux/BSD, ask your window manager or DE; on Mac, Karabiner; on Windows, WinCompose).
The whole time, I was trying to figure out if it was AI.
I was ambivalent about it. The way the arguments are presented doesn't read as much like AI.
Perhaps it was indeed just for proofreading. Although I know that MS Word turns single dashes into em-dash. Maybe they used Word when writing the thing?
It's one of the reasons I am sort of abandoning dashes when writing, and using semicolons instead.
> The version of me you know — the writer, the ranter, the one who tears into accessibility failures or rips Linux a new one — that’s a persona.
> Not fake. Not dishonest. Just deliberate.
> It’s tuned for the internet, built to survive in a world that eats subtlety alive. It’s the volume turned up, the emotion sharpened, the thoughts sculpted until they’re worth reading.
It is justification. AI writing has certain characteristics more frequently than non-AI writing. These in particular are easy to see and cite as justification. Others, like the "it's not X, it's Y" construction and the way adjectives are used, are noticeable too, but not as easy to point to as justification because longer explanations are necessary.
Indeed. People use em dashes. And curly quotation marks. And emojis in headers. And use boldface and italics to emphasize things. And overuse the "it's not X, it's Y" construction. People could also do all of these things at the same time. Or maybe do them sometimes and not other times for fun.
It is obviously impossible to be able to tell with 100% certainty whether or not something is written using AI. But I think this most likely is. Could be that it's just copy edited with AI and not wholly AI slop. Who knows. Either way, it reads very much like AI to me.
> If a person doesn't know (except in the vaguest terms) where their water comes from, where their poo goes when they flush, where their food comes from (the supermarket!), or the energy that heats their home... what do they really know?
But this is what civilisation is.
> Ask a twenty-two-year-old to connect to a remote server via SSH. Ask them to explain what DNS is at a conceptual level.
I feel like when I was twenty two I would have been very surprised if more than a couple of my peers knew this stuff.
DNS is a phone book, I think!
But yeah, maybe "bad examples" by the author.
The one that really confuses me is this, though:
> You’ve built a generation that can’t extract a zip file without a dedicated app and calls it innovation.
Sorry, what are you saying? Software exists to unzip files. It used to be a "dedicated app" like WinZip, 7zip, WinRAR, etc. Now it's built into Windows. Or you use the 'unzip' command in Linux.
Connecting to SSH seems like something a "power user" should be able to learn but not necessarily know already (probably more likely they know what a VPN is)
True, but on the other hand, when I started programming (hell, even before the whole LLM craze began) and you took away my internet/stackoverflow/google I would also drastically lose productivity. Especially in my more junior years, and later, of course I could still write code, but if I had to figure out how a certain library worked or why a certain error in the auth layer happened, without internet I would be nowhere.
> These are the same people who would lose their minds if their city government told them they could only buy food from vendors the city had approved, licensed, and taxed
But it is exactly like this in the developed world, and not many would buy food out of the trunk of a roadside car.
That's the point.
I agree with the author regarding Apple's walled-garden app distribution, but the analogy just doesn't work here.
In my time, not being able to read assembler code meant you weren't a power user.
DNS and SSH were/are things 'techie' people know. I can assure you most people had no idea what their IP was or what DNS was. I was the "hey, my computer is acting kinda slow, can you look at it?" guy. It felt like they actively sought not to know. Honestly, cellphones and tablets have basically ended my endless side job of "hey, can you look at my computer?", because they hid all of that techno junk that is interesting to me but to most people isn't.
But most importantly, and what the author missed, is that it works both ways. I know how to connect to a BBS, but I was literally paralysed by the fact that there is no LAN game in Counter-Strike 2. Where is LAN?! Why do I need a Steam account for every player to play with friends sitting in the same room? Why would I even need external servers for this?
idk, it's a modern world and I don't belong to it, so perhaps we should accept the slow death of the 90s/00s 'power users', and the rise of new 'power users' of the 20s, who won't even know what the floppy disk icon on the Save button means.
Pre-AI, I worked with devs who didn't even know what an HTTP request was (or the difference between GET/POST/etc.); we were building enterprise software where higher-level libraries abstracted that away. With AI, it's becoming even worse; just ask Claude.
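For anyone who never had it explained: the GET/POST distinction is visible right at the wire level. A minimal sketch (hostname, path, and parameters are made up):

```python
from urllib.parse import urlencode

params = {"user": "alice", "page": "2"}
encoded = urlencode(params)  # "user=alice&page=2"

# GET: parameters travel in the URL's query string; no request body.
get_request = f"GET /search?{encoded} HTTP/1.1\r\nHost: example.com\r\n\r\n"

# POST: the URL stays clean; the same parameters travel in the body.
post_request = (
    "POST /search HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "Content-Type: application/x-www-form-urlencoded\r\n"
    f"Content-Length: {len(encoded)}\r\n"
    "\r\n"
    f"{encoded}"
)

print(get_request.splitlines()[0])   # GET /search?user=alice&page=2 HTTP/1.1
print(post_request.splitlines()[0])  # POST /search HTTP/1.1
```

The practical upshot: GET parameters end up in logs and browser history, POST bodies generally don't, which is one reason the distinction matters even when a library hides it.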
People are still building these apps at every level of developer experience so the kids must be alright.
Yes people, in general, have less knowledge of file systems and networking now because things just work. Every LAN party as a kid took at least an hour of networking to get started. Now kids don't do LAN parties because everything is already networked by default. But there is also an order of magnitude more people doing these things now -- in the past those people just went without.
The vast majority of the knowledge I had from the time the author describes is obsolete now anyway. I can still remember so much of it, but that's not going to help me with my React app.
It's engineered dependency.
Then they spent 25 years engineering understanding out of every single interaction, and now point to the resulting learned helplessness as validation. "See? Users don't read!" No, you spent decades training them not to by ensuring that reading was never rewarded and never necessary.
My read has always been it was painful for a certain type of PM to think and so they assumed "minds like mine" and ... here we are.
To put a slightly finer point on it for puncturing the cliche, "needing to hide" has a time-component. Everyone has something to hide from a potential future, whether they're good at predicting it or not.
I "have nothing to hide" about my religion today, but if extremists seize power and declare "death to apostates", the exact same fact pattern will very, very much need hiding.
> The industry isn’t going to fix this. Every financial incentive points the other way.
Cory Doctorow has a hopeful--perhaps over-hopeful--idea that a disruptive wedge can be created, where a profit-motive will promote breaking the system of control. Specifically, that some place with a legal haven for tinkerers and wall-breakers will reap benefits from letting them openly sell device-unlockers, export-your-data tools, etc.
These are the type of persons who would get a girlfriend without a master degree in psychology.
There's an attempt to deny the need for IT knowledge and expertise at every level, Big Tech does it out of self-interest, while most others do it out of ignorance. They often claim, "Oh, it only takes a few minutes or days on your own; just a couple of clicks and you can do everything." Yet, those who say this don't actually know what they're talking about and refuse to even try to prove their own theory.
The outcome is even worse: nowadays, doing it yourself is a struggle even when you have the right skills. All recent software is built to be unmanageable because there's no operation/infra vision. Don't even get me started on documentation; everyone talks about the need for a "documentation culture" yet what actually gets documented ranges from nothing to total garbage (basically text that's useless unless you feed it to an LLM and hope it can make some sense of it).
To make matters worse, standard hardware is getting more and more expensive: first it was graphics cards, then RAM, and now NVMe drives. The result is that many people simply don't want to, or can't afford to, buy their own, so they're literally living on someone else's computers even if they don't like it. This is especially true for students, who are at the best stage of their lives for learning and who won't have the time or energy to do it later on.
To complete the picture, the business model just isn't sustainable; no matter how much is invested, a real digital evolution isn't possible while living on the computers of four giants limited by their own services, and this implies that a social collapse awaits us regardless.
For me, the solution is managing to have enough leverage to push for mandatory FLOSS and open hardware de jure in response, in order to limit the damage and the geopolitical upheavals that push everyone to relocalize, which necessarily implies starting over on a small scale. I see something coming: Nostr, Meshtastic, the Fediverse, and the rise of self-hosters (and their average age) show that there's still an active group of people who want a different world. But they are few and far between, burdened by significant technical debt in a world that's becoming increasingly hostile, and that's exactly where things need to change.
The problems caused by centralization, from various companies getting burned by relying on giant third-party providers, to banking scandals driving crypto (not stablecoins), to the need for resilience that requires eliminating single points of failure (SPOFs), might actually make a difference. I hope it'll be enough, and I hope anyone who gets it does their part to spread that understanding while we still can.
If it were only the Apples and Googles of the world who thought sandboxed apps were the future, you might have a point, but most tech-savvy people arrive at something that looks an awful lot like sandboxed apps. You see power; most see[1] a[2] dumpster fire[3].
[1] ls /usr/lib | wc -l
[2] ls /usr/bin | wc -l
[3] find /usr/share/man/man* | wc -l
Power users, tinkerers, and so on were always extremely niche. By definition they were always only a few. They are still a few, probably in similar numbers as before.
The only thing that changed is that normal people now have access to computing devices. My wife does not want to know what a file system is, or what happens behind the scenes when an app is installed. She has no idea what a DNS is. Why would she? She is a lawyer with little interest in technology. She wants to use Instagram, not self-host a Matrix instance.
The normal users are the majority, and it's more profitable to serve those users than assholes like me who get pissy when they can't sideload an apk.
And I am okay with that.
So true, but this has been going on for quite a while. Phones accelerated it, and I have seen many of these concerns come up in IT where I worked.
A couple of examples:
1. My favorite, about 10 to 15 years ago. A user said this finance report always had 2147483647 in the total. This was looked at for weeks by another group.
After a few weeks, our manager's manager called a meeting with everyone to look at the issue. No one had any idea what to do. When I saw the number, it looked really familiar to me. I then realized it was the maximum value of a signed 32-bit int. I told them the variable holding the total was too small; a simple change fixed the issue.
Another old programmer who was not at the meeting asked me what happened. I showed him the report and he knew instantly what it was too.
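For anyone who doesn't recognize the number on sight: 2147483647 is 2^31 - 1, the largest value a signed 32-bit integer can hold. A quick sketch, using `ctypes` to emulate the fixed-width variable the report presumably used (Python's own ints don't overflow):

```python
import ctypes

INT32_MAX = 2**31 - 1
print(INT32_MAX)  # 2147483647 -- the number stuck in that report

# A running total stored in a signed 32-bit int wraps around once it
# passes the maximum (some systems clamp at the max instead, which is
# how a report ends up permanently showing 2147483647):
total = ctypes.c_int32(INT32_MAX)
total.value += 1
print(total.value)  # -2147483648
```

Widening the variable to 64 bits (the "simple change" in the story) raises the ceiling to 9223372036854775807.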
2. Hex dumps: no one can read them now. About 25 years ago I was looking at a dump to find where a packed numeric value was; people who saw me thought it was magic. I had to explain how that number was read and what the hex represented.
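For the curious: a "packed numeric" here likely means IBM packed decimal (COMP-3), which stores two decimal digits per byte with the final nibble holding the sign, so the value is readable straight off the hex dump once you know the trick. A minimal decoder sketch (the sample bytes are illustrative, not from the story):

```python
def unpack_comp3(data: bytes) -> int:
    """Decode packed decimal (COMP-3): two BCD digits per byte,
    with the low nibble of the last byte holding the sign
    (0xD means negative; 0xC/0xF mean positive/unsigned)."""
    nibbles = []
    for byte in data:
        nibbles.append(byte >> 4)
        nibbles.append(byte & 0x0F)
    sign = nibbles.pop()  # last nibble is the sign, not a digit
    value = int("".join(str(d) for d in nibbles))
    return -value if sign == 0xD else value

# The bytes "12 34 5C" in a hex dump decode to +12345:
print(unpack_comp3(bytes.fromhex("12345C")))  # 12345
print(unpack_comp3(bytes.fromhex("98765D")))  # -98765
```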
I fear what will happen if AI becomes a real thing.
"back in my time", you'd have some C code and one .conf file; you'd run "make", edit the config (or hope it works with default settings), run it, and you'd have a program running. Now you need five different services running, it comes in a Docker container on some random port, proxied to another random port, the configs are split into 12 YAML files, plus it needs 7 gigs of disk space...
...sometimes providing the same functionality as the old 300 kB software of yesteryear.
Can't both of these be true?
Apple, Microsoft, and the evil tech companies we nerds love to vilify actually brought computing to the masses. In the early 1980s only tech geeks and corporations used computers—the Apple Macintosh changed that. In the early 1990s, only tech geeks and universities used the internet—Microsoft Windows changed that. In the mid 2000s, only tech geeks and business people used "smart" devices[1]—the iPhone changed that.
At every technological leap, business savvy entrepreneurs saw an opportunity to expand their markets by making their products enticing and useful to millions of more people than the previous generation of products did.
Unfortunately, this also came at the expense of the apparent "dumbing down" of computers, as every new abstraction hid more of the actual computer users had to interface with. And it also made things easy to control and lock down for corporations.
But I don't think we would've seen the explosion in the popularity of computing had this played out any other way.
I also disagree with the article's premise that power users are dying. We're still here, but we're a tiny minority of computer users now. We're both amused and frustrated at the insanity of where technology is taking us, and who is leading us there, but we still have our corners of computing we can retreat to.
And I also disagree that our favorite layer of computing is somehow more "real" than anyone else's. We scoff at Gen Z's inability to use the terminal as much as Baby Boomers scoffed at our inability to program in assembly. It's all relative. Except "AI". That is more of a disabler than an enabler, even though we're too hypnotized to see it now.
[1]: Yes, the BlackBerry was a cultural phenomenon, but it didn't have the capabilities nor mass appeal of the iPhone.
The user's other blogs also have the same telltale em dashes and curly quotes (“”) instead of straight ones (""). Reading obviously AI-generated text feels unpleasant. I don't know how much thought was put into this. Is this just "hey Claude, write me a blog post about the death of the power user"? Maybe they just used AI for proofreading. I can't really know, but it's hard to justify putting in effort to read this when there could have been zero effort put into writing it.
For the record, I am not an AI.
I was ambivalent about it. The way the arguments are presented don't read as much like AI.
Perhaps it was indeed just for proofreading. Although I know that MS Word turns single dashes into em dashes. Maybe they used Word when writing the thing?
It's one of the reasons I am sort of abandoning dashes when writing, and using semicolons instead.
Oh for fuck’s sake. Can we please stop with this nonsense? You must think my blog is LLM-generated too.
I’m so tired of the illiterate denying the humanity of the literate. How do you think LLMs were trained? On human writing.
I have no idea what your long form writing looks like. But can you look at something like this https://fireborn.mataroa.blog/blog/if-you-recognize-me-in-pu... and seriously come out of it not thinking it's AI slop?
> The version of me you know — the writer, the ranter, the one who tears into accessibility failures or rips Linux a new one — that’s a persona.
> Not fake. Not dishonest. Just deliberate. It’s tuned for the internet, built to survive in a world that eats subtlety alive. It’s the volume turned up, the emotion sharpened, the thoughts sculpted until they’re worth reading.
Or perhaps the radically different stylistic decision of using emojis and using italics and boldface for emphasis in this older blog https://fireborn.mataroa.blog/blog/hellcaptcha-accessibility... could be more convincing?
But it's not a justification. You made your argument worse, not better, by citing em dashes and quotes.
Wikipedia uses these, among other characteristics, as potential signs of AI writing: https://en.wikipedia.org/w/index.php?title=Wikipedia:AICURLY https://en.wikipedia.org/w/index.php?title=Wikipedia:AIDASH
From the same article:
“Do not rely too much on your own judgment.”
“human editors and writers often use em dashes”
It is obviously impossible to tell with 100% certainty whether or not something is written using AI. But I think this most likely is. Could be that it's just copy-edited with AI and not wholly AI slop. Who knows. Either way, it reads very much like AI to me.