Report: Software Quality Issues Costing U.S. Companies $2.08 Trillion

By DevOps.com

A report published today by the Consortium for Information & Software Quality (CISQ) estimates that poor software quality collectively cost U.S. companies $2.08 trillion in 2020.

The report, co-sponsored by software quality and security tools provider Synopsys, attempts to calculate, at least theoretically, the cost of known software failures. It identifies operational software failures, estimated at $1.56 trillion, as the single largest cost. The next-largest contributor is unsuccessful development projects, which total $260 billion, an estimated 46% increase since 2018. The project failure rate itself has remained steady at about 19% for over a decade, according to the report.

Finally, legacy systems issues contributed $520 billion to costs in 2020, a significant decrease from the $635 billion these issues cost in 2018.

That estimate does not include technical debt, which CISQ estimates at another $1.13 trillion. Technical debt costs have been growing at a rate of 14% per year since 2018, the report noted.

Joe Jarzombek, director of government and critical infrastructure programs at Synopsys, said it’s apparent there’s not enough focus on quality assurance within software supply chains. However, as organizations begin to realize how dependent their digital business transformation initiatives are on quality software, business executives will be forced to shift focus.

Unfortunately, most business executives today still don’t understand or appreciate how software is constructed. Most organizations don’t even have a bill of materials describing the components that make up an application, noted Jarzombek.
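A software bill of materials is, in essence, a structured inventory of an application’s components. As a rough illustration only, a minimal SBOM in the style of the CycloneDX JSON format might be assembled like this (the component names and versions below are hypothetical, not drawn from the report):

```python
import json

def build_sbom(components):
    """Assemble a minimal CycloneDX-style SBOM from (name, version) pairs.

    Illustrative sketch only: real SBOMs carry far more detail
    (suppliers, hashes, licenses, dependency relationships).
    """
    return {
        "bomFormat": "CycloneDX",
        "specVersion": "1.4",
        "components": [
            {"type": "library", "name": name, "version": version}
            for name, version in components
        ],
    }

# Hypothetical example components for demonstration.
sbom = build_sbom([("log4j-core", "2.14.1"), ("openssl", "1.1.1k")])
print(json.dumps(sbom, indent=2))
```

With an inventory like this on hand, an organization can at least check a newly disclosed vulnerability against the components it actually ships, which is exactly the correlation Jarzombek says most organizations cannot draw.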

That issue becomes especially problematic in the event of a security breach, because too many organizations can’t correlate known vulnerabilities with the specific components of an application they affect, Jarzombek added. At the same time, cybercriminals are becoming more efficient at scanning applications to discover and exploit those vulnerabilities.

Many organizations also don’t appreciate how adept cybercriminals have become at exploiting what are perceived as low-level vulnerabilities to move malware laterally across an application, Jarzombek said. There is a tendency to patch only the most critical vulnerabilities within an application, leaving the rest to future updates. When fixing low-level vulnerabilities is not prioritized, they can persist across multiple release cycles.
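The triage pattern described above can be sketched as a simple severity cutoff. This is a hypothetical illustration, not anything prescribed by the report: the threshold, CVE identifiers and CVSS-style scores are all assumptions.

```python
def triage(findings, threshold=9.0):
    """Split findings into (patch_now, deferred) by a CVSS-style score.

    Models the tendency to patch only the most critical vulnerabilities;
    everything below the threshold is deferred to a future update.
    """
    patch_now = [f for f in findings if f["cvss"] >= threshold]
    deferred = [f for f in findings if f["cvss"] < threshold]
    return patch_now, deferred

# Hypothetical findings for demonstration.
findings = [
    {"id": "CVE-A", "cvss": 9.8},  # critical: patched immediately
    {"id": "CVE-B", "cvss": 4.3},  # low severity: deferred
]
patch_now, deferred = triage(findings)
```

The deferred list is precisely where the risk Jarzombek describes accumulates: anything that stays below the cutoff release after release remains exploitable the whole time.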

Ultimately, Jarzombek said, rather than treating security separately from the application development process, it should be addressed as part of a comprehensive approach to software quality assurance. The immediate challenge, however, is that not enough developers have access to the tools they need to address these ongoing cybersecurity issues.

The impact of poor software quality on individual companies will vary widely, but poor software quality is itself a universal issue. Most software development teams are not deliberately ignoring these issues; instead, the processes currently employed to build software are not especially robust, mature or, for that matter, automated, leaving organizations vulnerable.
