Why should developers be full stack?


Posted 21 July 2023

Written by Matt Thornfield

“It’s not enough to be a (insert your programming language of choice here) developer any more, you really need to be full stack”. More and more we hear phrases like this both from employers and job hunters. The term “full stack” first appeared in the mid 2000s, but its use has grown hugely over the last 10 years and it shows no sign of slowing down.

Take a look here to see the growth in use of the term “full stack” in worldwide Google searches.

In this blog post I’ll be explaining what the term really means and why it’s so important for companies and their developers to be “full stack”.

But first, a bit of history

I started my career as a professional developer in the late 1990s building applications that could run on Windows computers. I was required to know how to program in Visual Basic 6, and to use the Microsoft Access database system. At the time, this was all that was needed. The programming language, together with the couple of desktop tools it shipped with, was sufficient to build fully functioning applications. This was of course in the early days of the adoption of the internet, and many years before the development of the smartphone or tablet.

I worked on my first web application in the early 2000s. The thinking at that time was that “3-tier” development was the best approach. The idea was that you could write better, more maintainable software if you split your code into three separate layers, or tiers, that were independent of each other but could work together. Those three tiers were:

  1. The user interface tier – what the user would see and interact with on screen.
  2. The middleware tier – the business logic that your application needed to implement.
  3. The data tier – where your data would be saved, typically in some form of database, and how the system would interact with it.
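To make the three tiers concrete, here's a minimal sketch in plain Java. The class names (OrderRepository, OrderService, OrderApp) are my own illustrative inventions, not from any real system – the point is simply that each tier only talks to the one beneath it:

```java
import java.util.ArrayList;
import java.util.List;

// Data tier: responsible only for storing and retrieving records.
class OrderRepository {
    private final List<String> orders = new ArrayList<>();
    void save(String order) { orders.add(order); }
    List<String> findAll() { return new ArrayList<>(orders); }
}

// Middleware tier: the business logic; knows nothing about the screen.
class OrderService {
    private final OrderRepository repository;
    OrderService(OrderRepository repository) { this.repository = repository; }

    void placeOrder(String item) {
        if (item == null || item.isBlank()) {
            throw new IllegalArgumentException("An order must name an item");
        }
        repository.save(item);
    }

    int orderCount() { return repository.findAll().size(); }
}

// User interface tier: presentation only; delegates every decision downwards.
public class OrderApp {
    public static void main(String[] args) {
        OrderService service = new OrderService(new OrderRepository());
        service.placeOrder("coffee");
        System.out.println("Orders placed: " + service.orderCount());
    }
}
```

Because each tier depends only on the one below it, you could swap the in-memory list for a real database, or the console output for a web page, without touching the business logic in the middle.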

As developers our role was to write all the software and make it work. We might get the assistance of some specialists, such as database engineers to create the database, or graphic designers to come up with a nice front-end, but the developer’s job was to write the entire code for the full application.

Over the next 20 years, the world of software development changed remarkably. The tools, techniques, and even the programming languages needed to develop software have evolved significantly. The way we think about and approach developing new applications is very different today.

Splitting up the monolith

Perhaps one of the biggest changes is the development of “microservices”. To keep this simple, in my early days of programming, if we built a new piece of software we did it as a single application. There might be thousands of lines of code all in one application. Today we favour a different approach – we build lots of smaller applications which interact with each other, and together the suite of applications form the required software. This approach is known as microservices – I’ll save the reasons for this change and more details of what microservices are for another blog post.

We would now call what we used to build a “monolith” application – a giant single piece of code. A team of software developers might work on that application, but the thinking was “this is software, and you are a developer, so get coding”.  

As technologies evolved and became more complex and capable, developers started to specialize. When I became a technical trainer, around 10 years ago, companies would employ a web developer to build a website, an app developer to build a mobile app, a back-end developer for back-end systems, and the list goes on. These individuals would work as part of a team and between them would build the full application suite. However, each had their own area of expertise and specialization. This approach makes sense because no-one can possibly be an expert in every kind of technology that exists.

The move to full stack

When the term “full stack” was first introduced – believed to have been in an article on TechCrunch back in 2004 – it was used to describe web developers who were able to build more than just the website part of an application. Today this meaning has evolved, and it applies to mobile apps, desktop apps and potentially other types of software too.

There’s no formal, official definition of the term, but a full stack developer is one who can develop all the different parts of a typical suite of software. We might expect a typical full stack developer to be able to build:

  • Web, mobile or desktop applications (the “front-end” user interface)
  • Server applications (the “back-end” business logic, security and other key elements)
  • Data storage (normally some kind of database)

If this list looks familiar, it’s because it’s not that far removed from the 3-tier architecture I referred to earlier. However, the difference is that to build these three elements today, the developer needs a much broader level of knowledge.

How do I become a full stack developer?

As a senior instructor at Neueda I’ve run full stack development courses for both those starting out on their developer career, and for more experienced staff who are retraining.

The list below summarises what a good full stack developer needs to know, along with an example of what we cover on one of our full stack courses to match those requirements. There’s no single range of technologies or languages that applies here. For example, you could swap Java for C++, React for Angular or MySQL for MongoDB. Rather than being an exhaustive list, it is intended to show the typical range of knowledge that we expect developers to have.

  • Programming languages: Java, JavaScript, HTML, CSS
  • Development frameworks: React, Redux, Spring Boot, Hibernate
  • Communication: REST standards
  • Databases: MySQL
  • Automated testing: Jest, JUnit, Mockito
  • Build tools: Maven or Gradle
  • Version control: Git, with GitHub or Bitbucket
  • Continuous Integration / Deployment: Jenkins
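To give a flavour of the back-end part of that list, here's a minimal sketch of a REST-style endpoint. On a course this would typically be built with Spring Boot; to keep the sketch self-contained it uses only the JDK's built-in com.sun.net.httpserver package, and the class name and the /api/status path are invented for illustration:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class StatusServer {

    // Create and start a server exposing a single JSON endpoint.
    static HttpServer start(int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/api/status", exchange -> {
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        start(8080);
        System.out.println("Listening on http://localhost:8080/api/status");
    }
}
```

A front-end developer on the same team would consume this endpoint from, say, a React application – which is exactly why a shared understanding of REST standards appears in the list above.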

Can a developer really know everything?

This list might seem daunting to the new developer, and something I find myself saying quite often on our courses is that a full stack developer has an appreciation of all these parts, but can’t be an expert in everything. Except in very small organisations or teams, most developers won’t find themselves responsible for an entire application suite.

Instead, many developers today specialise in one or two aspects of application building. You might, for example, be a front-end developer who occasionally needs to get involved in the back-end, or vice versa; that would be quite typical. All developers in a team might need to use version control and understand REST standards, for example, but there is almost always a separation of responsibilities – you can’t be an expert in everything.

And that brings us on to the “why”…

Why is being a full stack developer important?

If you can’t be an expert in everything, why is being a full stack developer so sought after? I think it comes down to being better able to work as part of a larger development team. If you understand what your colleagues need to do, and what their challenges and frustrations might be, because you could do that role yourself – and they can say the same about you and your role – you are likely to have a far more cohesive and effective team.

I remember someone saying to me, early in my career, that a skill that good developers need to have is the ability to communicate with a “business” person (someone who doesn’t work in technology). The idea is that if you can communicate effectively with the person who is asking for software, you are more likely to be able to build the software they want. In practice this means that the developer needs a minimum level of knowledge about the industry they are working in, whether that’s banking, insurance, defence, production lines, etc.

As the range of technology expertise has grown, what the good developer needs today is the ability to understand and communicate with other technology experts. If you are a back-end developer but you’re full stack, you’re far more likely to be able to work productively with your front-end developers and your data experts, because you know what their requirements, expectations, limitations and constraints are. This is why being full stack is so essential for creating a team that is truly productive.

About the author

Matt Thornfield

With a background in developing software for retail banking, ecommerce and video content delivery, Matt is an experienced instructor specializing in full stack development and dev-ops programs. He’s passionate about building confidence alongside technical skills and finding ways to overcome blocks to learning.

View Matt Thornfield’s profile