“You Get Where You’re Looking For: The Impact of Information Sources on Code Security”
Location: LTS Auditorium, 8080 Greenmead Drive
Vulnerabilities in Android code have enabled real-world privacy leaks and motivated research cataloguing their prevalence and impact. Researchers have speculated that appification promotes security problems, as it increasingly allows inexperienced laypeople to develop complex and sensitive apps.
In this paper, we present the first systematic analysis of how the use of information resources impacts code security. We first surveyed 295 app developers who have published in the Google Play market about how they use resources to solve security-related problems. Based on the survey results, we conducted a study with 54 Android developers (students and professionals), in which participants wrote security- and privacy-relevant code under time constraints. Each participant was assigned one of four conditions: free choice of resources, Stack Overflow only, official Android documentation only, or books only. Participants who were allowed to use only Stack Overflow produced significantly less secure code than those using the official Android documentation or books, while participants using the official Android documentation produced significantly less functional code than those using Stack Overflow.
Taken together, our results confirm that official API documentation is secure but hard to use, while informal documentation such as Stack Overflow is more accessible but often leads to insecure code. Given time constraints and economic pressures, we can expect that Android developers will continue to choose the resources that are easiest to use; our results therefore firmly establish the need for secure-but-usable documentation.
Joint work with Yasemin Acar, Michael Backes, Sascha Fahl, Doowon Kim†, and Christian Stransky (CISPA, Saarland University; †University of Maryland, College Park).
Michelle Mazurek is an assistant professor in the Department of Computer Science with an appointment in UMIACS.
Her research aims to improve security- and privacy-related decision-making by understanding user needs and then building sound tools and systems.
Recent projects include analyzing how users learn and process security advice, applying machine learning to partially automate access control in the cloud, examining convenience/security tradeoffs in end-to-end encryption, and examining how and why developers make security and privacy mistakes when building mobile apps.
Mazurek received her doctorate in electrical and computer engineering from Carnegie Mellon University in 2014. Her research is funded by NIST, LTS, the NSA Science of Security lablet, and Google.