Andy Hedges' Blog

Learn the Fundamentals not the Shiny New Technology

Too often in software engineering, people are quick to focus on the latest technology and reluctant to develop their knowledge of the fundamentals. I suppose the reason for this is to build marketable skills. Recruiters and employers tend to focus on the technology their hiring managers (often other engineers) ask them for, which is usually the latest shiny thing, and so the cycle continues.

Let’s start by describing what I mean by the fundamentals, with some examples. The first set are typically the things you’d learn in a good Computer Science undergraduate programme. Examples are:

  • Data structures
  • Algorithms
  • Operating system design
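
To give a concrete flavour of that first category, here’s the kind of algorithm any good CS programme drills: binary search over a sorted list. This is just an illustrative sketch in Python (the language choice is mine, not anything the post prescribes):

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent.

    Classic divide-and-conquer: halve the search window each step,
    giving O(log n) comparisons instead of O(n).
    """
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1  # target, if present, is in the upper half
        else:
            hi = mid - 1  # target, if present, is in the lower half
    return -1
```

It’s a ten-line function, yet it encodes ideas (invariants, logarithmic growth, off-by-one care) that show up everywhere from database indexes to version-control bisection.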

The next set are generally good things to know for any software engineer but aren’t always covered in mandatory modules in CS degrees. Things like:

  • Design patterns
  • Networking
  • Low level computing
  • Cryptographic primitives
  • Bit twiddling

I could go on, but you get the idea: you won’t regret learning these things. Whatever you go on to do in software engineering you’ll find them useful, possibly essential. They will also help you learn the latest shiny technology more quickly, because you’ll already know the building blocks of such things.
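
As a small taste of the second list, here’s a pair of bit-twiddling classics, sketched in Python for illustration. Both rest on one fact: `n & (n - 1)` clears the lowest set bit of `n`:

```python
def is_power_of_two(n):
    # A power of two has exactly one set bit, so clearing the lowest
    # set bit leaves zero exactly for powers of two.
    return n > 0 and (n & (n - 1)) == 0

def popcount(n):
    # Kernighan's trick: each iteration clears one set bit, so the
    # loop runs once per set bit rather than once per bit position.
    count = 0
    while n:
        n &= n - 1
        count += 1
    return count
```

Tricks like these look like trivia until you’re sizing hash tables, packing flags into a header, or reading someone else’s systems code, at which point they’re table stakes.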

There’s more too: if you skip a cycle or two of shiny things, you won’t notice in a few months or years. Does anyone regret not learning grunt, then gulp, then webpack? Not really; you just need one that works, and nothing too bad happens if you skip one. But have little to no knowledge of networking or algorithms and you’re going to severely limit what you are capable of.

So, should you avoid learning shiny new technology altogether? Absolutely not! Whilst learning those technologies you’ll pick up reusable skills for other tech, but guess what: those reusable things are the fundamentals. Thus if you want to accelerate your speed of learning and flatten those difficult learning curves, make sure you’re spending enough time on the basic underpinnings of computer science. They’re that important.

Andy Hedges