2013-07-01
Abstract
In the latest of his ‘Greetz from Academe’ series, highlighting some of the work going on in academic circles, John Aycock looks at the thorny issue of ethics in academic security research.
Copyright © 2013 Virus Bulletin
There is often a disconnect between academic security research and anti-malware industry research – in both directions. In the ‘Greetz from Academe’ series, Dr John Aycock, Associate Professor at the Department of Computer Science, University of Calgary, picks out some of the work going on in academic circles and summarizes the key points – Ed.
This month, I’d like to tell you about a trip I took to San Francisco. San Francisco is a beautiful city, with world-famous sights such as Fisherman’s Wharf, Alcatraz, and the Golden Gate Bridge. Or so Wikitravel tells me [1] – I was there for a workshop and saw exactly none of it.
As for why this particular workshop was interesting, I should perhaps back up a bit. I have always found the AV community to be very sensitive to ethics, and concerned with acting ethically. Sometimes it’s just statements in conversation of the form ‘X is unethical’, and at other times it’s on a larger scale. For example, I recall one VB conference presentation some years back in which the presenters had accessed a botnet’s C&C server and had debated using the botnet’s ‘kill’ command to try and clean the compromised computers in the botnet. It was remarkable to witness the audience’s wide eyes and the collective drawing in of breath as the ethical and legal ramifications of that action hit home. Even the vetting process for new recruits used by one AV company sounded like an evaluation not just of technical chops, but of whether or not the potential employee shared the same ethics as the team.
Ethics figures prominently in academic security research, too. Of course, ethics itself is ancient, dating back to Aristotle harping on about virtue and wondering how to keep his toga from chafing, but there are modern flavours that are more specialized – like computer ethics [2], [3], although they pay scant attention to the toga issue. The professional organizations that many academics belong to, such as the ACM and IEEE, also have codes of ethics [4], [5].
Additionally, there is quite a bit of research oversight. For me to undertake academic research with humans, or gather data from humans, I first have to submit an ethics application to my friendly neighbourhood research ethics board (sometimes called an institutional review board, or IRB). In it, I have to detail everything I want permission to do, which includes not just the research itself, but also how I’m planning to recruit people, how they’ll give informed consent, and how I’ll be storing and disposing of data. All of this is mandatory, courtesy of the atrocities that occurred during World War II and sundry dubious experiments in the last century.
However, if the emperor can be said to have no clothes, then the emperor of academic security research ethics... well, let's just say that he can sometimes count to 21 without difficulty. Academics don't have to belong to professional organizations, and it's not clear how strictly those codes of ethics are enforced anyway; the ACM Code itself [4] describes the worst case as: 'membership in ACM may be terminated'. Ouch.
Review by a research ethics board works well for research involving humans, because that’s where the research guidelines they follow derive from. Throwing computers into the mix doesn’t help [6]. Being a computer security expert is not a prerequisite for reviewing ethics applications; subtle but critical nuances may be missed. There are also plenty of loopholes for security research. Say that I’m building an undetectable, highly destructive piece of malware in my lab. (Chill out, I’m not.) I’m not doing research with humans, and therefore my work requires no ethics oversight, even though it should probably have some: if the dreaded W32/Aycock escapes my lab, it may have a profound impact on humans.
This extends to academic publication venues. Let me pick on WOOT, for instance, the USENIX Workshop on Offensive Technologies [7]. As the name implies, the workshop is all about new attack methods. Of the 60 papers that have appeared there between 2007 and 2012, how many have contained any mention of ethics? Two. While heated discussion of ethics may be taking place behind closed doors, very little heat seems to leak out.
Change is coming, starting at the academic grassroots level. The workshop I attended this May in San Francisco was CREDS, the Cyber-security Research Ethics Dialog & Strategy workshop [8], and this was not the first such event. Three workshops known as WECSR, the Workshop on Ethics in Computer Security Research, were run between 2010 and 2012 [9]. There is also a recent set of guidelines on how ethical security research may be conducted, called the Menlo Report [10].
Academic publication venues are changing too. The calls for papers for some notable security venues, such as SOUPS (usable security), PETS (privacy), and the USENIX Security Symposium [11], [12], [13], now include an ethics requirement.
In conclusion, some flawed processes aside, ethics is a common concern in the AV community as well as in academic security research. Prevention of toga chafing, on the other hand, is still an open question.
[1] Wikitravel. http://wikitravel.org/en/San_Francisco.
[2] Baase, S. A Gift of Fire: Social, Legal, and Ethical Issues for Computing Technology, 4th edition. Prentice Hall, 2012.
[4] ACM Code of Ethics and Professional Conduct. http://www.acm.org/about/code-of-ethics.
[5] IEEE Code of Ethics. http://www.ieee.org/about/corporate/governance/p7-8.html.
[6] Buchanan, E.; Aycock, J.; Dexter, S.; Dittrich, D.; Hvizdak, E. Computer Science Security Research and Human Subjects: Emerging Considerations for Research Ethics Boards. Journal of Empirical Research on Human Research Ethics 6(2), 2011, pp.71-83.
[7] USENIX Workshop on Offensive Technologies. https://www.usenix.org/conference/woot13.
[8] Cyber-security Research Ethics Dialog & Strategy Workshop (CREDS). http://www.caida.org/workshops/creds/1305/.
[9] Workshop on Ethics in Computer Security Research (WECSR). http://www.cs.stevens.edu/~spock/wecsr2012/ (contains links to the previous years of WECSR workshops).
[10] Bailey, M.; Dittrich, D.; Kenneally, E.; Maughan, D. The Menlo Report. IEEE Security & Privacy, March/April 2012, pp.71-75.
[11] Symposium On Usable Privacy and Security. http://cups.cs.cmu.edu/soups/2013/cfp.html.
[12] Privacy Enhancing Technologies Symposium. http://petsymposium.org/2013/cfp.php.
[13] USENIX Security Symposium. https://www.usenix.org/conference/usenixsecurity13/submitting-papers.