Colleges Don’t Teach Useful Software Development


With regard to what I shared in the previous post – What Do You Expect From Being a Software Developer? – the author provided 10 things to expect from working in the industry. I enjoyed the article and thought it’d be a good exercise to look at each point through my own experience.

At the very least, perhaps these things will be something to keep in mind if you’re looking to enter the industry or even for those who have been working in it. At most, that article provides a solid perspective from one person’s experience (which likely echoes that of many others, too).

This led me to think about my experience both in college and in my career in software development and how it relates to the ten points mentioned in the last post. It’s easy to say colleges don’t teach useful software development, but how true is that? Again, I’m not arguing with the original post. But perhaps my perspective is a bit different.

I’m not going to give a deep dive into my entire history doing this (who would want to read that, anyway? 🙂), but why not share my thoughts on the 10 points mentioned?


Distinguishing Between Computer Science and Software Development

I think of computer science and software development as two different things.

This is an important distinction because if you’re talking to someone who works in the field of computer science, they may or may not be in the field of software development; however, if you talk to someone in the field of software development then it’s likely they’ll be doing exactly what the title implies.

And as the field has grown in the last 10-15 years, entire degree programs or certification programs specifically for software development have emerged (as opposed to those just for computer science). That said, there are also degrees and certifications that have a foundation in computer science with a focus in software engineering, software development, web development, and so on.

All of this is worth mentioning because when we start talking about whether college will prepare you for a career in software development, I think it’s important to be nuanced enough to map what the college program was teaching against what you’re expecting from the job market.

Further, it’s also worth stating that it’s not necessary to go to college to get a job in this field. This was the path I chose; others have their own.

“College Will Not Prepare You for the Job”

This is where I think it’s important to distinguish between computer science and software development. When I was in school, I majored in computer science with a focus in software engineering.

If I had to summarize it, I took a lot of classes that were focused on computer science – that is, a lot of math – and then I had a lot of courses on software engineering – that is, how to analyze, design, build, and/or maintain software. For experience, I worked as a teaching assistant and participated in a few internships.

So there’s my context for the experience I’ve had thus far.

In the original article, Mensur writes:

The college will prepare you for some basics, but what most of the colleges teach is so far away from day-to-day jobs. Most of the professors who teach at universities are not good software engineers.

Only a small percentage of them even worked as software engineers. Also, the university curriculums are heavily outdated. They trot years behind the software development market needs.


There are three main points in this passage I found most relevant to what I’ve done over the last 16 years (including what I’m doing now).

1. Most Professors Are Not Good Software Engineers

I can definitively say that, in my experience, you could tell who the academics were and who the practitioners were.

This isn’t to say either is better than the other. If, however, you’re going into the job market, then those who have been there have more to offer than those who have stayed in academia. Conversely, if you’re looking to continue on the track of research and education, then those who have worked in the job market may not be the ones who are your best resource.

Even then, you have to know whether the professor is working at the university to do research or whether they are practicing software development by building out something that’s yet to be made publicly available. Some institutions do that and, as such, it’s not as easy to cleanly divide the two.

First, I can unequivocally say the professors who had the greatest impact on my career and my interest in software as a career were those who had spent time outside of an academic setting and worked in the industry (in fact, I still remember their names and many of the projects we did for those classes because they were designed from real-world scenarios). A lot of practical skills were taught.

Secondly, I’m not saying those who had not worked in the industry weren’t good professors. It’s just that their methods of teaching and the projects they assigned felt far more academic in nature. A lot of theory was taught.

I know: Giving two examples relevant to my own experience doesn’t make for a solid case in one direction or the other. The goal is just to add another perspective to an already published article.

The point is simply this:

  • The majority of the professors who prepared us for working in the industry were those who had already been there.
  • Those who had not worked in the industry were teaching foundational concepts – they were teaching us how to think.

And I’ll have more to say about that later.

2. University Curriculum Is Heavily Outdated

This point isn’t going to be very long because I can’t speak to the state of university curriculum. Generally, it’s a strong maybe from me. I’m sure it depends on the university or the program in which you’re enrolled.

At the time, I didn’t feel like much of the content I was learning was out of date. But it’s hard to know when you’re in school, right? What do you have to judge it against?

But looking back, there’s only one course I took that felt out of date, and it didn’t so much have to do with the content as with the language we were using: Smalltalk.

There’s a caveat here, though: Part of the reason we were tasked with using Smalltalk was to help inculcate the foundations of object-oriented programming, where everything is an object that receives messages (versus classes and functions that are invoked on an instance of the class).
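
To give a rough sense of that distinction, here’s an illustrative sketch in Python (not Smalltalk, and the names are made up): the idea is that you ask an object to handle a named message rather than calling a function on some data.

```python
# Illustrative only: mimicking "objects receive messages" in Python.
class Account:
    def __init__(self, balance):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount
        return self.balance


def send(receiver, message, *args):
    # Look up the handler by message name and let the object respond,
    # loosely like sending #deposit: to an object in Smalltalk.
    return getattr(receiver, message)(*args)


account = Account(100)
print(send(account, "deposit", 25))  # 125
```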

So even that served its purpose because it taught us how to conceptually adapt to an unusual requirement.

And for those who are wondering, there were a lot of languages used in school at the time (Java, C, Python, Smalltalk, JavaScript, SQL, and so on). So to say the curriculum was heavily outdated isn’t as true for me as it might be for others.

  • Was Smalltalk outdated? Yes. But the concepts taught in that class – designing and improving object-oriented systems – were among the most useful.
  • What about working with a team to scope, design, build, and test a photo sharing application? This was more applicable to what I was going to be doing day-to-day.

Now, I’ve probably forgotten certain things we were taught that are no longer needed – as is apt to happen – but one thing I distinctly remember learning, and that I only used in my first job out of school, was UML diagrams.

I get the point of them and I understand what they’re supposed to convey, but teams rarely have the time to do this when tasked with a project. Further, it’s harder to diagram part of a project when a much larger system already exists (especially if even part of it is a legacy system).

3. Learn How to Learn

As I mentioned in the first point, the biggest advantage that I came away with when graduating with my degree was that I had learned how to learn.

No, I didn’t know all the languages but I didn’t need to know them. No, I didn’t know all the various tools, systems, languages, platforms, and so on, but I knew how to learn and adapt. And that one skill alone has likely been the one that’s paid the most dividends.

So if you’re in a program – be it something from freeCodeCamp, a course in high school, at a local college, a university program, or anything in between – it’s important to learn how to pick up something new and integrate it into your workflow.

For example, when you understand how something, say dependency injection, works in one language, it’s more or less a matter of semantics when using it in another language. And that’s because the act of being able to do a thing transcends the tools used to do the thing.
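
To make that concrete, here’s a minimal sketch of dependency injection in Python (the class names are made up for illustration); the same shape carries over to Java, C#, TypeScript, and the rest with only the syntax changing.

```python
class SmtpMailer:
    def send(self, to, body):
        print(f"SMTP -> {to}: {body}")


class FakeMailer:
    def __init__(self):
        self.sent = []

    def send(self, to, body):
        self.sent.append((to, body))  # handy for tests


class Notifier:
    def __init__(self, mailer):
        self.mailer = mailer  # the dependency is injected, not constructed here

    def welcome(self, user):
        self.mailer.send(user, "Welcome aboard!")


Notifier(SmtpMailer()).welcome("alice@example.com")  # production wiring
Notifier(FakeMailer()).welcome("bob@example.com")    # test wiring
```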

Anyway, figuring out how to learn is key, and we all have different ways of doing it, so it’s important to find what works best for you. If that means going to office hours, doing independent reading, watching more content on YouTube, meeting with the professor or a group of other students, or hanging out with like-minded people at a conference, then do it. Whatever opportunities are available that are tailored to sharpen your ability to learn, take advantage of them.

Ultimately, learning the why behind something helps you start formulating your own ways of how to learn something new. And when you get a good hold on that, picking up new technologies or applying something from a previous project or a previous job to solving a new problem becomes easier.


Nothing I’ve shared here is a rebuttal to the previous article nor is it meant to even be argumentative. If anything, this is additional content. It’s an extension to an article someone else has already written.

And if nothing else, this is a good exercise in writing something other than how to achieve something programmatically.
