Department of Computer Science and Quantitative Methods
College of Business Administration
Office: 304 Thurmond
Email:
Phone: 803-323-4825
Office hours are all held in Thurmond 304. You can also make an appointment outside of my office hours if you need to.
See your syllabus for office hour times or the hours posted on my door.
My research is in the area of Privacy and Security, particularly with regard to Human-Computer Interaction. I use a variety of methods in my work, including prototype development and both quantitative and qualitative analysis. I am working on creating effective ways to configure security and privacy policies, currently focusing on the configuration of policies for third-party applications. As part of this research, I am exploring contextual configuration of policies at runtime instead of install time. Instead of forcing users to learn the way security systems are designed, we should be designing our security systems to work for our users. On occasion, this may mean modifying the underlying access control frameworks to meet those needs. This is how I have approached the usable security problem in my research: I try to understand users and their needs in order to inform the design of the underlying access control systems.
Application platforms on both Social Network Sites (SNS) and mobile platforms have become increasingly popular. Unfortunately, applications on these platforms have had access to massive quantities of user data, and many users did not understand the implications of their sharing [2]. As a first step, I proposed a new mechanism and interface to correct users' mental models and provide a more secure framework for SNS application platforms. Instead of starting from the access control system and trying to fit a design on top of it, the design itself was used to derive the access control model. The design consisted of granular permissions paired with the user's real data to illustrate the types of information being shared. It also made visible the fact that friends' data was being consumed. Finally, it presented a link to additional information about the reason for the data sharing and an indication of other users' willingness to share.
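To make that design concrete, here is a minimal sketch in Python of how a granular permission request might pair each requested attribute with the user's actual data, flag attributes that also expose friends' data, and carry a rationale link plus an indication of how many other users chose to share. The names and structure are illustrative assumptions, not the actual prototype.

```python
from dataclasses import dataclass

@dataclass
class PermissionRequest:
    """One granular permission requested by a third-party application."""
    attribute: str          # e.g. "hometown"
    example_value: str      # the user's real data, shown so the decision is concrete
    includes_friends: bool  # whether friends' data would also be consumed
    rationale_url: str      # link explaining why the application wants this
    share_rate: float       # fraction of other users who chose to share

def render_prompt(app_name: str, requests: list[PermissionRequest]) -> str:
    """Build the text of a permission prompt from granular requests."""
    lines = [f"{app_name} is requesting access to:"]
    for req in requests:
        scope = "you and your friends" if req.includes_friends else "you"
        lines.append(
            f"  [allow/deny] {req.attribute}: \"{req.example_value}\" "
            f"(from {scope}; {req.share_rate:.0%} of users share this) "
            f"| why: {req.rationale_url}"
        )
    return "\n".join(lines)

print(render_prompt("ExampleQuizApp", [
    PermissionRequest("hometown", "Rock Hill, SC", True, "https://example.com/why", 0.62),
]))
```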
The resulting access control model included "friendship-based protection," which offered additional protection to users whose data was being consumed without their knowledge. I performed a user study of the prototype and found that one group of users was motivated to configure policies appropriate to the nature of the application requesting the data, yet others were not [3]. I believe this partial success occurred because permissions were being configured at install time, when participants had little to reason with other than the name of the application.
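The core idea behind friendship-based protection can be sketched roughly as follows, under the assumption that the data owner's own policy governs release even when the owner is not the person who installed the application. The function and policy encoding are my own illustration, not the published model.

```python
def can_release(app: str, attribute: str, owner: str, installer: str,
                policies: dict[tuple[str, str, str], str]) -> bool:
    """Decide whether `app`, installed by `installer`, may read `attribute`
    belonging to `owner`.

    The key point: the owner's policy decides, even when the owner did not
    install the app. Missing entries default to deny, so friends who never
    saw the install prompt are still protected.
    """
    return policies.get((owner, attribute, app), "deny") == "allow"

# Example: Alice installs a quiz app; it asks for Bob's birthday.
policies = {("alice", "birthday", "quiz"): "allow"}  # Bob never granted anything
print(can_release("quiz", "birthday", owner="bob", installer="alice",
                  policies=policies))  # False: Bob's data stays protected
```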
This has prompted my inquiry into contextual access control. Contextual access control configures policies at runtime, as close as possible to the interaction for which the permission is needed. By addressing policy configuration at runtime, the user is able to reason with more information than at install time, and may therefore arrive at more appropriate policies. I have performed an initial study on Android and am preparing a study on Facebook [6]. This work will lead to a better understanding of the tradeoffs between configuring access control policies at install time and at runtime.
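As an illustration of the install-time versus runtime tradeoff, the sketch below defers the permission decision until the moment a feature actually needs the data, so the prompt can describe the concrete action being performed. The API is hypothetical and is not the study prototype.

```python
class ContextualPermissions:
    """Grants are requested at the point of use rather than at install time."""

    def __init__(self, ask_user):
        self._ask_user = ask_user          # callback: show a prompt, return True/False
        self._grants: dict[str, bool] = {}

    def request(self, permission: str, context: str) -> bool:
        """Ask for `permission` only when it is needed, with the surrounding context."""
        if permission not in self._grants:
            self._grants[permission] = self._ask_user(
                f"Allow access to {permission}? Needed right now to: {context}"
            )
        return self._grants[permission]

perms = ContextualPermissions(ask_user=lambda prompt: input(prompt + " [y/n] ") == "y")

def attach_photo_to_post():
    # The user decides while looking at the feature that needs the data.
    if perms.request("photos", "attach a photo to the post you are writing"):
        print("photo picker opened")
    else:
        print("feature skipped; no data shared")
```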
When configuring access control policies for social applications, users must decide whether to allow or deny access to profile data attributes. For many users this decision is not straightforward, and they may not know which choice is best. Social navigation lets users see previous decisions made by themselves or others in order to help with the decision at hand. I performed an empirical study with 408 Amazon Mechanical Turk workers to see whether social navigation could have an observable effect on user decision making in a privacy context. I found that social navigation can produce statistically significant differences in participant behavior, but only for those given a strongly negative indication [5]. The results are encouraging because when users are provided a sufficiently negative warning, they adjust their behavior to be consistent with perceived social norms.
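For concreteness, a small sketch of how a social navigation cue might be attached to a sharing decision, with stronger wording reserved for the case where few other users shared. The thresholds and phrasing here are illustrative assumptions, not those used in the study.

```python
def social_cue(share_rate: float) -> str:
    """Turn the fraction of prior users who allowed access into a cue string."""
    if share_rate < 0.2:
        # Strongly negative indication: the condition that shifted behavior.
        return f"Only {share_rate:.0%} of previous users shared this information."
    if share_rate > 0.8:
        return f"{share_rate:.0%} of previous users shared this information."
    return ""  # no cue shown for middling rates

print(social_cue(0.08))
```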
Social network sites brought about the phenomenon of tagged photos: photos uploaded by one person that are shared on another person's profile. I performed a series of focus groups to understand users' concerns, design considerations, and approaches to conflict resolution [1]. Since no mechanism existed, I created "Restrict Others" around the design considerations extracted from the focus groups. It allows a tagged user to request that any number of their friends be prevented from seeing the photograph, relying on the social relationship to influence the uploader to respect the moral obligation to protect the tagged user. A study of Restrict Others suggests that in most cases uploaders will apply the policies on behalf of the tagged users [4]. This work received a best paper nomination at CHI, the premier international conference on Human-Computer Interaction and the top venue for HCI research.
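A minimal sketch of the Restrict Others idea as described above: the tagged user names friends to hide the photo from, and the uploader, who controls the photo, chooses whether to honor that request when the audience is computed. The data structures and names are my own illustration, not the deployed prototype.

```python
from dataclasses import dataclass, field

@dataclass
class Photo:
    uploader: str
    audience: set[str]                       # friends the uploader shares with
    tagged: set[str] = field(default_factory=set)
    restrict_requests: dict[str, set[str]] = field(default_factory=dict)

    def request_restriction(self, tagged_user: str, hide_from: set[str]) -> None:
        """A tagged user asks that certain friends not see this photo."""
        if tagged_user in self.tagged:
            self.restrict_requests[tagged_user] = hide_from

    def visible_to(self, apply_requests: bool = True) -> set[str]:
        """Audience after the uploader (optionally) honors the requests; the
        mechanism relies on social pressure, so applying them is the
        uploader's choice rather than automatic."""
        audience = set(self.audience)
        if apply_requests:
            for hide_from in self.restrict_requests.values():
                audience -= hide_from
        return audience

photo = Photo(uploader="alice", audience={"bob", "carol", "dave"}, tagged={"bob"})
photo.request_restriction("bob", {"dave"})
print(photo.visible_to())  # contains 'bob' and 'carol', but not 'dave'
```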
I would like to continue my research in the area of usable privacy and security. There is a seemingly endless supply of interfaces with dropdowns, checkboxes, and confusing models that represent the underlying access control systems or privacy configuration mechanisms across many domains. Solving these hard problems has the potential to protect millions of users from unnecessary harm. I look forward to finding areas where my background is a natural fit for collaboration with students, researchers, and faculty.