In defence of traditional computer science learning for students in the age of AI
The practical lessons learned in my Computer Science degree that are only clear in hindsight
While in college, I was forever questioning what seemed like an outdated curriculum for the age of machine learning and AI. I couldn’t shake the feeling that I was wasting four years learning things that would never be practically useful, and that I could get much further, much faster through one-off courses on various MOOC platforms. I always felt like my curriculum was holding me back rather than giving me a leg up.
I did finish my degree, however, and gradually climbed from intern to senior engineer while watching the tech world progress from classical machine learning to LLMs and ChatGPT in real time. In the last two years in particular, I’ve started to see the benefits of my “outdated” traditional learning in contrast to many who may not have had the same journey.
Disclaimer before we get any further: you can absolutely get the same benefits outside of a traditional school. Online platforms like boot.dev and Low Level Academy, along with many YouTube creators, have done great things to democratise this information. Everything you can learn in college is free or significantly cheaper online - it’s just up to you to find and study these things on your own.
In defence of “outdated” courses
In college, we had courses on data structures, algorithms, graph theory, database management, etc. that I knew were “useful” courses. Others, like compiler design, computer architecture, and computer networks, were a different story - I just could not connect the course material (a lot of which dated from the late ‘90s!) to the present world. Coding in assembly?! I could not for the life of me imagine why we were learning that and not Python or Go.
In hindsight, I can now see that I was really learning the fundamentals of how computers work, much of which hasn’t changed since the punch-card days of the early 1900s. You can learn Go, Rust, or the latest JavaScript framework till the cows come home, but knowing the fundamentals of DNS, TCP, or SSL may well be what comes to the rescue when that legacy microservice - written by an engineer who has since left the company - suddenly stops working even though “there are no code changes” and “it works on my machine”, because it’s running with outdated libraries on an outdated OS (no, that hasn’t happened to me, why do you ask?).
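As a concrete (if simplified) illustration of the kind of fundamentals-level check that saves you in those moments, here is a short Python sketch - the host name is just a placeholder - that walks those layers by hand: resolve the name over DNS, open a TCP connection, complete the SSL/TLS handshake, and read the certificate’s expiry.

```python
import socket
import ssl
from datetime import datetime, timezone

# Placeholder host -- point this at the service you're actually debugging.
host, port = "example.com", 443

# DNS: does the name resolve at all, and to which addresses?
for *_, sockaddr in socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP):
    print("resolves to:", sockaddr[0])

# TCP + TLS: can we complete a handshake, and is the certificate still valid?
context = ssl.create_default_context()
with socket.create_connection((host, port), timeout=5) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        cert = tls.getpeercert()
        expiry = datetime.fromtimestamp(
            ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc
        )
        print("TLS version:", tls.version())
        print("certificate expires:", expiry)
```

Nothing here is exotic - it’s all in Python’s standard library - but knowing which layer to poke at, and in what order, is exactly the instinct those “outdated” networking courses build.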
Similarly, once you deeply understand how expensive regular expression matching can be - a backtracking engine can take exponential time on a pathological pattern - you’ll think very hard before letting one anywhere near untrusted input in production.
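To see that cost in practice, here is a minimal sketch using Python’s built-in re module (the pattern and inputs are purely illustrative): the nested quantifier in (a+)+ triggers catastrophic backtracking, and the match time roughly doubles with every extra character.

```python
import re
import time

# Nested quantifiers make the backtracking engine explore an exponential
# number of ways to split the string before giving up on a non-match.
pattern = re.compile(r"^(a+)+$")

for n in (18, 22, 26):
    text = "a" * n + "!"  # the trailing "!" guarantees there is no match
    start = time.perf_counter()
    pattern.match(text)
    elapsed = time.perf_counter() - start
    print(f"n={n}: {elapsed:.3f}s")
```

Engines like RE2 (and Go’s regexp package) guarantee linear-time matching precisely to avoid this failure mode - a trade-off you only appreciate once you’ve seen the theory behind it.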
Once you write enough assembly code, you can start to internalize why you might use recursion on Leetcode but simple iteration is often better for production: every recursive call pushes another frame onto the call stack, and a deep enough input can exhaust it entirely.
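Here is a small Python sketch of that trade-off (the function is a toy example): the iterative version runs in constant stack space, while the recursive version hits CPython’s default recursion limit long before the loop would break a sweat.

```python
import sys

def sum_to_recursive(n: int) -> int:
    # Every call adds a stack frame that lives until the base case returns.
    if n == 0:
        return 0
    return n + sum_to_recursive(n - 1)

def sum_to_iterative(n: int) -> int:
    # A loop reuses the same frame, so memory use stays flat no matter how big n is.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

print("recursion limit:", sys.getrecursionlimit())   # ~1000 by default in CPython
print("iterative:", sum_to_iterative(100_000))       # fine
try:
    print("recursive:", sum_to_recursive(100_000))
except RecursionError:
    print("recursive: blew the call stack")
```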
Only when you have deeply felt the pain of manual memory management and segfaults in C can you appreciate garbage collection in modern languages (and know if, when, and how you should ditch the GC and manage your own memory).
Memory corruption is still the number one source of software vulnerabilities - buffer overflows, use-after-free bugs, out-of-bounds reads and writes. We need to build an understanding of these fundamentals so that we can move towards a safer, more secure digital future.
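To make that concrete without leaving Python, here is a deliberately unsafe sketch using the standard ctypes module: the out-of-bounds write that a Python list refuses to perform becomes a silent buffer overflow the moment you drop down to a raw C buffer. (The overflowing memmove is undefined behaviour - it may do nothing visible, corrupt neighbouring memory, or crash the interpreter.)

```python
import ctypes

# A Python list is bounds-checked: stepping past the end raises an error
# instead of scribbling over whatever happens to live next door in memory.
safe = [0] * 8
try:
    safe[8] = 42
except IndexError as exc:
    print("Python caught it:", exc)

# A raw C buffer has no such guard rails. Copying 16 bytes into an
# 8-byte buffer is a textbook buffer overflow: no exception, no warning,
# just undefined behaviour.
raw = ctypes.create_string_buffer(8)
ctypes.memmove(raw, b"A" * 16, 16)  # overflow -- do not do this for real
print("no exception raised; any damage is silent")
```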
In defence of a lack of “current” courses
Aside from the fact that there’s only so much that can be taught in a classic four-year program, computer science is developing more rapidly than ever, and the single most crucial skill to build is learning how to learn.
Learning does not stop when you get the course completion certificate - if you work in this domain, you will have to learn something new every single day on the job. The gaps in a set curriculum are the best place to start building this skill. Most programs will not teach you how to build APIs, applications, or real-world projects from scratch. They won’t teach you how to configure (or maybe even access) remote servers, how to package and run your code safely and securely, or how to debug and resolve the edge cases and nitty-gritty issues that will make up the majority of your job as a developer in the real world. Learning these skills independently will teach you more than a course ever could.
Build a game, a CLI tool, a web app, an Android or iOS app - whatever it is, build it from scratch and try to host it remotely. A whole world of practical learning happens between “it works on my machine” and “it works for everyone”. Software development experience is found not so much in writing code as in deployment, instrumentation, monitoring, tooling, debugging, and everything else in between.
In conclusion
All this to say: if you are in a traditional course, take what you can from it - I can guarantee that if you see it as learning how to learn, and not literally “how to write recursion in assembly”, it will set you up for success. If you are not in a traditional course, step back a little from the high-level programming languages and technologies that today’s computing systems are built on and spend some time in the low-level weeds of how the computing world works. You will be a better developer for it.
Stay sharp!
PS - The Algorithm threw this video at me as I was writing this post and I highly recommend watching it -
Resources
https://www.boot.dev is a great resource for a comprehensive, hands-on backend engineering curriculum
https://lowlevel.academy is a great resource for the nitty-gritty fundamentals of computing