When Dartmouth College launched the BASIC language 50 years ago, it enabled ordinary users to write code. Millions did. But we've gone backwards since then, and most users now seem unable or unwilling to create so much as a simple macro.
Also most student-age people today who would have become programmers 20 years ago probably won’t, because AI will be generating most code. The definition of “programming” will change to writing and tweaking effective specs for AI to generate code from. Back in the 80s and 90s I liked to say our ultimate goal as programmers was to eliminate our own jobs. Well I’ll be darned…
50 years ago people thought everyone would be able to program using BASIC; now you think everyone will be able to program using AI. It seems nothing has changed in 50 years.
Well, reading comprehension hasn’t changed. I said “most,” not “everyone.” Amazingly, the world isn’t binary.
What’s funny is AI is learning from developer code to write code. If it runs out of this dataset, it has to eat its own output. This is a recipe for disaster.
Software dev myself (retired) and I’ve been very skeptical about AI-generated code, but a friend of mine uses it daily in his work. During one of our in-person D&D games he told it to create a SQLite app to keep track of some game info, and in seconds he was using the app. AI is currently a super-emotional issue riddled with misinformation and fantasy, but there’s no denying its usefulness.
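For anyone curious what that kind of tool amounts to: a minimal sketch in Python of the sort of SQLite game tracker an AI assistant might generate on a prompt like that. The table name, fields, and helper functions here are my own illustration, not what my friend's app actually looked like.

```python
import sqlite3

# Illustrative schema: track party members' hit points and notes.
conn = sqlite3.connect(":memory:")  # a real app would use a file path
conn.execute(
    "CREATE TABLE party (name TEXT PRIMARY KEY, hp INTEGER, notes TEXT)"
)

def set_member(name, hp, notes=""):
    """Insert a party member, or update their HP/notes if they exist."""
    conn.execute(
        "INSERT INTO party (name, hp, notes) VALUES (?, ?, ?) "
        "ON CONFLICT(name) DO UPDATE SET hp = excluded.hp, notes = excluded.notes",
        (name, hp, notes),
    )
    conn.commit()

def get_hp(name):
    """Return a member's current HP, or None if they aren't tracked."""
    row = conn.execute(
        "SELECT hp FROM party WHERE name = ?", (name,)
    ).fetchone()
    return row[0] if row else None

set_member("Thorin", 27, "poisoned")
set_member("Thorin", 22)      # took damage; upsert overwrites the old row
print(get_hp("Thorin"))       # prints 22
```

Whether an AI writes it or you do, the whole thing is maybe thirty lines of stdlib Python, which is exactly why "in seconds he was using the app" is believable.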
Terrible take.