IntangibleMatter

Training for the Future

or, Why I'm an Advocate of the Trial by Fire



On Thursday, the 23rd of May, 2024, my high school psychology teacher claimed that “the future started 2 days ago.” He was referring to GPT-4o, and to how, in another class of his, several students had come forward saying that, despite the anti-AI measures implemented in various courses, they were still able to “cheat” with artificial intelligence without being caught.

While his lecture had several points, the core was this: AI is the future, and we need to rule it so that it doesn’t rule us. Being a neophobe will hurt you incredibly, and will ultimately leave you behind.

This isn’t what I want to discuss, though.

A secondary point he brought up several times was that new tools which make things easier (calculators, for example) render the older methods at least partly obsolete, and so people spend less time bothering to actually learn them. He stated that the same thing would soon happen in several fields due to the use of AI.

This thought about spending less time actually learning, actually building skills in a given field, is something I’ve been thinking about a lot lately, because I keep seeing it happen in many, many different fields. For me, the one that sticks out is computers, because that’s where I spend a lot of my time.

I’ve noticed an incredibly sharp decrease in actual computer skills over the past several years, especially among younger people, even though computers are more prevalent than ever. This is due to the babyproofing provided by walled gardens and other systems that don’t trust the user. In making technology more “accessible” to everyone by making it more “foolproof”, we’ve eroded people’s actual computer skills. If I had a nickel for every time I heard someone younger than me, or my own age, ask how to print, how to save or find a file, or even what a file is, I wouldn’t have enough money to actually buy anything, but I’d still have an upsetting number of nickels. It’s akin to the infamous “calculator claim”, where someone insists they don’t need to learn math because they’ll always have a calculator on them, and to other claims of the same shape made in similar scenarios.

(And none of this even touches on how many small children I see with their own smartphones, which is a related issue.)

While there’s no perfect solution, I propose a universal policy of learning the hard way before being granted access to the easy one. The approach is inspired by math classes, where a teacher may introduce a new subject with a complex proof and a convoluted method, and only once the students are decent with the hard way, having proven their competency, are they granted access to the “easy” version: the shortcut that saves time and brainpower and lets them spend it on other things.

Let’s take the computer example from earlier and see how this would apply.

Instead of being introduced to computers through a simplified design where much of the complexity is abstracted away (iOS, Android, modern Windows, ChromeOS, and so on), people should first be introduced to computers on a more difficult system. Exactly which system that should be is up for debate, but I always half-jokingly point to Windows XP; Linux may be another good candidate. These have a higher skill floor, and let you break things in a way that modern systems don’t. Then, once competence has been established in the more difficult environment, people are granted access to the more modern, “easier” tools, but with the knowledge of how to properly use them, why they work the way they do, and how to fix things when they go wrong. If we let people mess up, but support them when they do, they can learn from their mistakes and ultimately get better at whatever it is they’re doing.

I’m an example of this being effective. I grew up on XP, and fucked my computer up several times over the years. An early example: I noticed a folder on the computer that seemed to me like a better place for my music library than its current location, so I moved the library over. This, of course, broke iTunes and made it so that no songs could easily be played. When my father discovered the issue I was running into, he helped me fix it, and in doing so I learned more about how computers work and how to fix them when they break. When I later started using more “foolproof” tech, I tended to push up against it, because I knew how much more I could do when given full access to a system. So while I appreciate the easy-to-use nature of some things, I tend to deliberately choose more difficult environments (I’m writing this in Neovim on Linux, after all) because they let me do more.

Being able to mess up and learn from my mistakes made me who I am, and it’s a shame that we’re taking that opportunity away from people.

I don’t expect people to take this proposal particularly seriously, but I nonetheless think that, if properly considered and well implemented, it would genuinely be one of the better methods for learning things in the modern day.

At the very least, we’d have students complaining about driver errors and about being locked out of fixing them themselves, instead of asking how printing works in the first place.
