On September 1, someone posted to BugTraq the code to Code Green. The code, which ostensibly fixes systems that are still infected with the Code Red worm, was left for users to assemble and use--if they wanted. The author, Herbert HexXer, added the following: "I will not take responsibility for any damage that might be caused by this code. Be sure to have understood the code and it's [sic] purpose before beginning to play with it." Another post included the code for CRclean, which was deliberately broken by its author, Markus Kern. Both were intended to force the issue: either you patch your system, or I will find a way to do it for you.
The patch for the .ida vulnerability that Code Red exploited had existed for some time, yet a number of IIS servers (for whatever reason) remained unpatched. As I write this, yet another primary-color worm, Code Blue, is attacking IIS servers that haven't patched the Web Server Folder Traversal vulnerability. The existence of Code Green and CRclean shows that some people want to start automating the process of installing patches. As with any innovation, there are pros and cons.
Worms, by design, propagate rapidly in network environments. They seek out specific types of servers and specific flaws. So, doesn't it seem reasonable that a patch could be engineered to do the same thing? A new vulnerability could be announced and then patched worldwide in a matter of hours by a do-gooder worm that terminates upon the successful installation of the patch.
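The propagate-patch-terminate cycle described above can be sketched as a simple loop. This is a toy Python simulation, not worm code: the host records, the `vulnerable` flag, and the in-place fix are all hypothetical stand-ins for probing and patching real servers.

```python
# Toy simulation of a "do-gooder worm": visit each reachable host,
# patch it if vulnerable, then terminate once nothing is left to fix.
# Hosts are plain dicts standing in for real, probe-able servers.

def patch_worm(hosts):
    """Patch every vulnerable host, then stop (the worm 'dies')."""
    patched = []
    for host in hosts:
        if host.get("vulnerable"):
            host["vulnerable"] = False   # stand-in for installing the patch
            patched.append(host["name"])
    # No self-replication beyond this point: successful patching
    # removes the very flaw the worm uses to spread.
    return patched

network = [
    {"name": "web1", "vulnerable": True},
    {"name": "web2", "vulnerable": False},
    {"name": "web3", "vulnerable": True},
]

fixed = patch_worm(network)  # → ["web1", "web3"]
```

The key property the sketch illustrates is self-limitation: because the worm spreads only through the hole it closes, a fully patched network leaves it nowhere to go.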
One immediate question I have is, who would administer such do-gooder worms? How would you feel if Microsoft, for example, started issuing worms? Would you feel any better if the do-gooder worm arrived courtesy of some second-year network administrator for a security-related startup company? And given that software that is sold over the counter is not currently held to any set standards of excellence, who would verify that the do-gooder worm itself wasn't buggy--or worse, might open your system to more malicious worms in the future?
Even if the do-gooder worm were solid, applying the patch itself is sometimes the most inconvenient part of the process. That inconvenience is often the reason people and companies alike don't patch their systems. For example, say that in order to apply a certain patch, the server needs to be rebooted--maybe more than once. If you're running a commercial Web site and the do-gooder worm comes along in the middle of heavy traffic, your site might suddenly go offline while the do-gooder reboots your server. Not good, especially when you can't control the timing of the reboot.
I favor creating a disinterested third party to establish industrywide software standards and a central repository for all vendor-supplied software patches. I know, there's already CERT and SANS, but I'm thinking of a new start-to-finish group that would test and certify all software for uniform standards regarding security, integrity, and most of all, vendor accountability. This group could raise the level of excellence expected from each participating vendor, spur real programming innovation, and perhaps even level the playing field so that smaller developers might grow and flourish.
I realize I'm being vague, so go discuss the details amongst yourselves. My point: if we don't figure out soon how to resolve the problem of patching software, someone's going to write a worm that'll do it for us, on its own terms. Like it or not, automatic patches might fill the vacuum while the industry struggles toward an answer that pleases everyone.
What's your opinion of automatic patches? Will they be a help or just make matters worse? TalkBack to me.