We live in a world that runs on software. In 2011, Marc Andreessen declared “software is eating the world,” and in the ensuing four years, software has only become more voracious. Software pervades every aspect of our lives, from the things you touch every day (laptops, smartphones, TVs, cars) to the infrastructure of society (hospitals, utilities, transportation, finance, government). It is difficult to imagine a device, system, or organization today that performs some meaningful function and does not depend on software in some way.
At the same time, nearly all software systems and devices connect to the Internet or other networks to send and receive information. Some studies estimate that by 2020 there will be over 50 billion devices connected to the Internet. The interconnected, always-on nature of our world demands systems and software that are built from the ground up to be safe, robust and secure. Without security, there is no privacy. Without privacy, there is no freedom.
Buyers of software-based systems and devices deserve the means to assess and manage acceptable levels of risk associated with their use. Builders of these systems and devices need tools and processes that enable them to produce safe, robust and secure products without compromising on functionality and speed to market.
This is what makes me so excited about Codenomicon joining forces with Coverity as part of the new Synopsys Software Integrity Group. Coverity and Codenomicon were founded around the same time, and both have worked tirelessly to perfect their respective security testing technologies. We have both fought for the betterment of software and, by extension, for the resilience of our interconnected world.
Now, combining Codenomicon’s industry-leading suite of black-box security testing technologies with Coverity’s award-winning source code analysis solutions yields an unprecedented suite of solutions that can meet the software security needs of both buyers and builders. Buyers of software can use our combined solution suite to assess and mitigate the risk of procuring or using a piece of software. Builders of software can use the same suite of tools to locate and remedy software bugs and vulnerabilities.
To frame how our combined product suite should be deployed, I’d like to propose the concept of total vulnerability management.
The Internet is a hostile and unpredictable environment. Both reactive and proactive approaches are needed to build secure products and to operate them safely. On the reactive side of the total vulnerability management paradigm, organizations should adhere to established security best practices: running anti-malware software, deploying firewalls, enforcing appropriate security policies and controls, managing their known vulnerability exposure, and subscribing to threat intelligence feeds, to name a few. All of these are needed to respond appropriately to a constantly changing threat landscape.
Alas, many of the reactive methods mentioned here are needed only because we must protect ourselves from poorly implemented, designed or configured software riddled with known and unknown vulnerabilities. Broadly speaking, known vulnerabilities in their various forms are errors that are generally known to the public and for which patches or remedies exist. Managing known vulnerabilities, and the vicious patch-and-penetrate cycle that goes with them, imposes significant total software lifecycle costs on both buyers and builders. Instead of treating the symptoms of poor software development, we could more effectively reduce the cost and risk of operating software by addressing the root cause. For implementation or programming bugs, the root cause lies in the development process, which is where they should be caught, so that bugs that would become vulnerabilities in a live environment never get released into the wild.
An unknown vulnerability is a dormant bug in a piece of software or a system. It is typically the result of a programming mistake, and it can be used to crash the software, cause a denial-of-service condition, amplify DDoS attacks, compromise the affected systems or leak private information, for example. The vulnerability is unknown because it has not been discovered by anyone or, if it has, the discovery has not been made public to those who operate or produce the software. Attacks based on unknown vulnerabilities are often undetectable by typical anti-virus (AV) or intrusion detection systems (IDS). Computer worms such as Code Red, Stuxnet and Flame leveraged unknown or unpatched vulnerabilities to propagate.
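To make that concrete, here is a minimal sketch in C (the function and its inputs are hypothetical, not drawn from any particular product) of the kind of dormant programming mistake that becomes an unknown vulnerability. The code behaves correctly on every well-formed input, ships, and simply waits to be discovered:

```c
#include <string.h>

/* Hypothetical message handler illustrating a dormant bug: if an
 * attacker-controlled "name" field is 16 bytes or longer, strcpy()
 * writes past the end of the stack buffer. Until someone discovers
 * and reports it, this is an unknown vulnerability: no signature-based
 * defense knows to look for it. */
void handle_name(const char *name) {
    char buf[16];
    strcpy(buf, name);   /* no bounds check: classic buffer overflow */
    /* ... use buf ... */
}
```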
Given the availability of source code, static source code analysis today can be highly accurate and can find complex inter-procedural coding defects that human eyes would never see. It is almost unthinkable to develop security-critical code (that is, almost any product today) without using a static source analysis solution. Using static source code analysis is a natural first step in producing safe and secure software. The Coverity static analysis platform is the preeminent source code analysis solution.
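As an illustration of why inter-procedural reasoning matters, consider this hypothetical C fragment (my own sketch, not Coverity output). The NULL return sits in one function and the unchecked dereference in another; that is exactly the kind of cross-function path a whole-program analyzer connects and a human reviewer easily misses:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical inter-procedural defect. */
static char *load_config(const char *path) {
    FILE *f = fopen(path, "r");
    if (f == NULL)
        return NULL;          /* failure path the caller forgets about */
    /* ... parse the file ... */
    fclose(f);
    return strdup("parsed-config");
}

void init(void) {
    char *cfg = load_config("/etc/app.conf");
    printf("config starts with %c\n", cfg[0]);  /* NULL dereference when
                                                   the file is missing */
    free(cfg);
}
```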
Dynamic fuzz testing complements source analysis and adds another layer of defense against unknown vulnerabilities. It can be deployed without access to source code, although when source code and appropriate developer tooling are available, the defects it locates can be remedied quickly and effectively. Fuzzing can bring additional confidence beyond source analysis results, detect vulnerabilities in binary-only third-party components, and address less common sources of vulnerabilities such as compiler mistakes and hardware flaws. Defensics is a comprehensive platform for discovering and remediating unknown vulnerabilities in software and devices: it sends deliberately malformed, fuzzed inputs to expose vulnerabilities in software and systems. Fuzzing remains the leading method for both black hat and white hat hackers to locate unknown vulnerabilities. In the spirit of total vulnerability management, fight fire with fire!
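For a feel of the mechanics, here is a toy mutation fuzzer in C. It is a deliberate simplification: Defensics is model-based and generates protocol-aware anomalies systematically, whereas this sketch merely corrupts random bytes of a valid seed input. The parse_message function is a hypothetical stand-in for the code under test:

```c
#include <stdlib.h>
#include <string.h>

/* Toy mutation fuzzer, for illustration only. */
extern int parse_message(const unsigned char *buf, size_t len);

void fuzz(const unsigned char *seed, size_t len, int iterations) {
    if (len == 0)
        return;
    unsigned char *buf = malloc(len);
    if (buf == NULL)
        return;
    for (int i = 0; i < iterations; i++) {
        memcpy(buf, seed, len);          /* start from a valid message */
        for (int j = 0; j < 4; j++)      /* corrupt a few random bytes */
            buf[rand() % len] ^= (unsigned char)(1u << (rand() % 8));
        parse_message(buf, len);         /* a crash or sanitizer report
                                            here flags an unknown bug */
    }
    free(buf);
}
```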
Static binary analysis, referred to by Gartner Research as software composition analysis (SCA), is a relatively new technique for managing vulnerabilities in the third-party components included with software. “Using components with known vulnerabilities” was added to the OWASP Top 10 sources of vulnerabilities in software for the first time in 2013. Software is only as secure as its weakest link: your first-party code can be perfect, yet a single vulnerability in a third-party component can jeopardize the security of the entire software package or system. Keeping track of your third-party components (what we refer to as a software bill of materials, or BOM), as well as the vulnerability exposure of those components, can be a daunting task.
Our binary analysis solution addresses this challenge by giving both buyers and builders visibility into their software’s composition, enabling them to manage and secure their cyber supply chain. From a compiled binary or firmware image, it produces a software BOM and tracks the vulnerability exposure of each component, supporting informed risk decisions.
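To give a feel for the simplest ingredient of such analysis, here is a toy C sketch (my own illustration, not how our product works internally) that scans a raw binary image for a version string fingerprinting a third-party component. A match is a cue to check that component’s known-vulnerability history; real SCA tools combine many signals beyond embedded strings:

```c
#include <string.h>

/* Toy component fingerprinting: look for a known version string
 * anywhere in a binary image. */
int contains_component(const unsigned char *image, size_t size,
                       const char *fingerprint) {
    size_t flen = strlen(fingerprint);
    if (flen == 0 || flen > size)
        return 0;
    for (size_t i = 0; i + flen <= size; i++)
        if (memcmp(image + i, fingerprint, flen) == 0)
            return 1;   /* e.g. "OpenSSL 1.0.1" found: look up its CVEs */
    return 0;
}
```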
As Codenomicon and Coverity join forces, I look forward to what we can achieve by combining the functionality of our products and the metrics they produce, resulting in more powerful analysis and faster remediation of software vulnerabilities. At Codenomicon, we aspire to build a more resilient world, and by joining Coverity as part of the Synopsys Software Integrity Group, we can pursue that goal in a way that was not previously possible.
We are facing some very exciting and challenging times in the cyber trenches. Our goal is to help you become Truly Covered™ with the deepest level of protection, from a suite of best-of-the-best products, all automated and seamlessly integrated into your software development methodology so that security is built in.
Let’s get started.
Mikko Varpiola
Co-Founder, Codenomicon