Entropy and the art of secure software development

Those who paid attention in science class should not be surprised by the almost daily deterioration in business information security. Entropy, the quantity at the heart of the second law of thermodynamics, is defined by Merriam-Webster as "the degradation of the matter and energy in the universe to an ultimate state of inert uniformity." For some reason, many in the application development world seem to think they are not governed by the laws of thermodynamics. I would suggest otherwise.

Consider the following scenario: Acme Corporation is developing a new web application. They have a well-defined software development life cycle and follow development best practices. Their developers receive ongoing security training and consider the OWASP Top 10 in their development effort. They use a commercial tool to check their code and conduct internal and external vulnerability testing. Any defects discovered are fed immediately back to the developers for resolution. They deploy the application in accordance with a defined change management process, with executive review and approval. Everything is done by the book; it is a model development operation. Once a version is done and tested, they immediately focus on the new functionality requested by marketing for version two.
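
To make the scenario concrete, here is a minimal sketch of the kind of build gate such a team might run so that newly discovered defects go straight back to the developers. Everything named in it is a hypothetical stand-in: the acme-scan command, its JSON output, and the baseline file are illustrative only; substitute whatever scanner and report format your own pipeline actually uses.

import json
import subprocess
import sys

BASELINE_FILE = "known_findings.json"  # findings the team has already triaged

def current_findings() -> set[str]:
    """Run the (hypothetical) acme-scan tool and return its finding IDs."""
    result = subprocess.run(
        ["acme-scan", "--format", "json", "src/"],
        capture_output=True, text=True, check=True,
    )
    return set(json.loads(result.stdout))

def main() -> int:
    with open(BASELINE_FILE) as f:
        baseline = set(json.load(f))
    new = current_findings() - baseline
    if new:
        # Fail the build so the defects go immediately back to the developers.
        print(f"{len(new)} new finding(s): {sorted(new)}", file=sys.stderr)
        return 1
    return 0

if __name__ == "__main__":
    sys.exit(main())

Run as the last step of the build, a nonzero exit blocks deployment until the new findings are triaged.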

In a static world, their "secure" application would pass additional vulnerability testing next week, next month and next year. Sadly, the information security world is anything but static, with the pace of deterioration increasing monthly. Examples of the changes that impact ongoing application security include:

  • Operating system patches and configuration changes on servers and clients
  • Browser changes
  • Discovery of previously unknown vulnerabilities
  • New hacking techniques

Some years ago, maintaining good information security was much like a chess match: relatively slow-moving, and at least somewhat predictable. Now it is more like running NORAD: a constant, rapid flow of new information that must be considered and addressed. Thus, if you are resting comfortably in the knowledge that your web application was secure last month, you are resting on an invalid assumption.

Software in general, and web applications in particular, must be continually evaluated and monitored. We cannot control the changes that occur, so we must make good security practices a continuous process.

The recent CareFirst breach serves as a great example of how not to ensure adequate ongoing information security. According to a report in CSO Online, the vulnerability was first discovered in 2014 and was believed by company officials to have been resolved. Ten months later, they learned that they were wrong. The good news is that they apparently discovered the issue proactively, but they offset that success by failing to add it to their ongoing monitoring process.

Another timely example is the U.S. government breach reported this week. According to a New York Times article, the agency involved was warned about the vulnerabilities by its inspector general last year. The vulnerabilities may have been considered a low priority at the time, but once again entropy prevailed.

Until we accept as an industry that secure systems and networks are a rapidly moving target, we will continue to hear breach reports like that of CareFirst. We must succeed in integrating appropriate information security practices into our day-to-day operations. I would suggest the following ideas to help accomplish this:

  • Security monitoring must be someone's daily focus. Larger organizations are creating security operations centers, which do for security what network operations centers do for network stability. Smaller organizations may not be able to afford such an operation, but this does not relieve them of the responsibility. They must dedicate someone to this, be it an employee or a vendor.
  • The security monitoring process must be dynamic, adapting to day-to-day changes in the threat landscape. The person or persons overseeing it must have time to read, study, and think, applying what they learn to the monitoring process. An organization that makes the same person responsible for daily security monitoring and for keeping up with threat intelligence and new discoveries is destined to fail.
  • Tools need to be part of the monitoring strategy. There are a wide variety of automated monitoring and testing tools available, and using some of them is essential to keeping up with changes in a cost-effective manner.
  • Automated tools cannot be installed and ignored. This may seem to contradict the previous point, but both are true. Tools are important to the process, but no single tool can be assumed to be right and current all of the time. Some redundancy, either multiple tools or a single tool backed by human monitoring and testing, is essential to the process; the sketch following this list illustrates one way to arrange it.
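
To make the last two points concrete, here is a minimal sketch of a recurring monitoring loop with tool redundancy. Everything named in it is a hypothetical stand-in: scan-tool-a and scan-tool-b represent whatever scanners an organization actually runs, the target URL is a placeholder, and the daily interval is an arbitrary choice.

import json
import subprocess
import time

SCANNERS = ["scan-tool-a", "scan-tool-b"]  # hypothetical scanner CLIs
TARGET = "https://app.example.com"         # placeholder application URL
INTERVAL_SECONDS = 24 * 60 * 60            # rescan daily, not just once

def run_scanner(command: str, target: str) -> set[str]:
    """Run one scanner and return the set of finding IDs it reports."""
    result = subprocess.run(
        [command, "--target", target, "--format", "json"],
        capture_output=True, text=True, check=True,
    )
    return set(json.loads(result.stdout))

while True:
    reports = [run_scanner(cmd, TARGET) for cmd in SCANNERS]
    all_findings = set().union(*reports)
    agreed = set.intersection(*reports)
    # Findings only one tool reports still go to a human reviewer:
    # no single tool is right and current all of the time.
    needs_review = all_findings - agreed
    print(f"{len(all_findings)} findings; {len(needs_review)} need human review")
    time.sleep(INTERVAL_SECONDS)

The key design point is in the last few lines: findings the tools agree on can feed an automated workflow, while findings only one tool reports are routed to a person, which is exactly the redundancy the list above calls for.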

I confess that I have never read any books by William S. Burroughs, but one of my favorite quotes is attributed to him: “When you stop growing you start dying.” We in the information security world must grow every day to keep up, because I can assure you that the hackers do.
