On May 6, 1937, the Hindenburg airship burst into flames while docking, killing 36 people (35 of the 97 aboard and one member of the ground crew) and bringing the airship era to a sudden close. In hindsight, it seems tragically obvious. Fill a giant bag with highly flammable hydrogen gas and trouble is bound to follow.
But hydrogen is also especially buoyant, which means airships filled with hydrogen could carry more passengers and more cargo than airships filled with other gases. Helium, which is not flammable, would have allowed only about one-quarter of the payload, and it’s more expensive and harder to obtain.
Someone, at some point, decided that the functionality offered by hydrogen was worth the risk. Maybe they didn’t fully understand the risks, maybe they underestimated the risks, or maybe they just closed their eyes and hoped for the best.
People who build software face similar choices about risk all the time. As a species, we build machines to do work for us, but frequently we neglect questions of safety and security in pursuit of functionality.
Software applications are nothing more than extremely complex machines. Software has enabled us to layer complexity on top of complexity to do wonderful things, but our ability to make sure things cannot be exploited by bad people lags far behind our ability to make things work.
Part of the problem is the dizzying pace of innovation. We simply haven’t had time to get used to building one kind of software before we start building the next kind of software, and the next. Software technologies come and go like mayflies, and we’re just building, building, building as fast as we can.
What is becoming acutely clear is that security cannot be separate from software development. If you don’t think about security when you’re designing software, you will probably have design flaws that no amount of clever coding will be able to fix. If you don’t think about security when you’re testing software, you will leave vulnerabilities in your software that attackers can exploit. If you don’t think about security even after release, you won’t know about new vulnerabilities found in open source components that are part of your software.
Thinking about security through every phase of development is the basis of a secure software development life cycle (SDLC). But that’s only the beginning of the story. Software products developed using a secure SDLC can still be deployed and used in an insecure way.
For example, imagine you run a bank. You carefully research online banking software and find a vendor that uses a secure SDLC to create its product. But if you deploy that software into a cloud cluster with weak authentication or poorly configured access, attackers can compromise the software and its data. And if you deploy the software but assign every user the same password, attackers can access anyone’s account. And if you deploy the software but forget to use HTTPS, attackers can see passwords and data as they go over the network.
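The shared-password failure above is avoidable with basic credential hygiene. As a minimal sketch using only Python's standard library (the function names and iteration count are illustrative, not drawn from any particular banking product), each user gets a unique random salt and a slow key-derivation hash, so no two accounts ever share a stored secret:

```python
import hashlib
import secrets

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return a (salt, digest) pair for one user's password."""
    salt = secrets.token_bytes(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return secrets.compare_digest(candidate, digest)

# Because each user has a distinct salt, even two users who happen to
# choose the same password end up with different stored values.
salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong password", salt, digest)
```

The point is not the specific algorithm; it is that per-user secrets, like TLS and properly configured access, are deployment-time decisions no vendor's secure SDLC can make for you.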
It’s never safe to assume that security is the responsibility of your vendor. Obviously, you must find vendors that are using a secure SDLC to minimize risk, but the deployment and operation of software are equally important. Even when you don’t create your own software, security must be deeply embedded in how you acquire and operate software.
Part of what is so exciting about software is that it is so easy. Anyone with a computer and the necessary gray cells can write software. A team of four developers working in a garage can create an application that upends an industry.
Writing better, more-secure software requires different thinking and different planning. Nevertheless, it’s the only way. A team of four developers working in a garage can create a better, more-secure application that upends an industry.
We’re all learning at the same time. Interestingly, universities are struggling to catch up to the idea that security and software development are inseparable. The analyst firm Forrester observed in an April 2019 report that not one of the top 40 computer science programs in the United States required a security class. We’re still churning out developers who know how to make things work but don’t know how to make them better and more secure.
In the end, it’s all about risk management. As fast as we build software, criminals move just as fast to learn how to exploit software. The recent plague of ransomware is a good reminder of how far ahead of ourselves we’ve gotten with building things.
Making software better with a secure SDLC lowers risk, but overall risk won’t be lower until the entire ecosystem understands how to build, deploy, operate, and maintain software with security in mind at every step.