The Race2Zero contest at Defcon added a new voice, that of an eager young student from New Zealand, to a conversation about anti-virus scanner evasion that has been going on for years. At its core, the organizer wanted to demonstrate how easily AV scanners can be evaded by tweaking already-compiled malware, and to reveal some of the more sophisticated methods that can be used for evasion. It was unclear whether he had any experience or skill with the techniques himself. However, by organizing the event, he claimed that as a researcher he would somehow be able to quantify efforts and results to help with a cost/benefit analysis of software defense: “Quantifying how much an attacker must invest to circumvent the defences that a defender has invested in is a key part of being able to evaluate where best to place security spend to gain the most benefit. Race to Zero is one way in which we as researchers can proactively answer these and other questions, while at the same time challenging some of the best minds available in the security community.”
He wanted to demonstrate AV shortcomings by providing competing teams with a set of malware samples already detected by AV scanners, one after another. Participants would tweak each sample so that the software’s core activity was unchanged but the file would slip past on-demand file scanning, remaining undetected by all 32 scanners in use. One team would eventually race to “zero detection” across all ten samples first. And he wanted it to be fun: “Reverse engineering and code analysis is fun.”
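As a benign illustration of why such byte-level tweaks can matter: any scanner that relies on matching a whole-file hash will miss a file after even a single appended byte, while the program’s actual behavior is unchanged. The sketch below uses a made-up byte string as a stand-in for a compiled binary; it shows only the hash mismatch, not any actual evasion technique.

```python
import hashlib

# Stand-in bytes for a compiled binary (hypothetical, not real malware).
original = b"MZ\x90\x00...imagine a compiled PE file here..."

# Append a single padding byte; a real program's logic would be unaffected.
tweaked = original + b"\x00"

h_orig = hashlib.sha256(original).hexdigest()
h_tweak = hashlib.sha256(tweaked).hexdigest()

# A whole-file-hash signature no longer matches the tweaked file.
print(h_orig == h_tweak)  # prints: False
```

This is exactly why practical scanners use partial signatures, heuristics, or behavioral monitoring rather than whole-file hashes alone.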
What he succeeded in demonstrating, from what I could tell, is just how much complexity is involved in the setup, preparation, support, and understanding that his “competition” required.
Understanding malware, an environment for working with it, the variety of antivirus products and their uses, PE files, assembly-level programming, network traffic, exploits and their delivery vectors, and the relevance of each to AV scanner effectiveness are all beefy topics that the organizers and their helpers seemed either not to fully grasp, not to have the resources to deal with adequately, or both.
Running a handful of command-line scanners across a handful of questionably selected malware samples (an MS-DOS variant, several widespread worms from years ago, exploits against Word 2000 without any copy of Word 2000 to test against, etc.) doesn’t produce enough quantifiable results to support large claims about a cost/benefit analysis of security defense or the evaluation of AV scanners. Professional AV testing and review groups have a difficult enough time carrying out this sort of evaluation with hundreds, and sometimes tens of thousands, of samples and days or weeks of paid, competent effort, and they usually work without the constraints faced by a group of volunteer organizers and speakers attempting the project.
While the AV evasion black market is always an interesting subject for those pushing a behavior-based technology like ThreatFire, this first “competition” didn’t seem to live up to the attention it received, or that the organizer seemed to expect. We’ll wait for the technical paper that was proposed:
“We hope to be able to give a presentation of findings from Race to Zero at DefCon, a paper has been submitted but a decision on it has not yet been made. Following the contest, when further analysis has been conducted, a technical paper will be publicly released.”
Maybe the public paper or an event next year will bring more interesting results with it.