
Patients are Not the Enemy: Qualitative Inquiry into Security & Healthcare

Aubrey Baker1, Laurian Vega2, Tom DeHart2, Steve Harrison2
Grado Department of Industrial and Systems Engineering1 & Center for Human Computer Interaction2, Virginia Tech, Blacksburg, VA, USA 24060
{AABaker, Laurian, TDeHart, SRH}@VT.edu

Summary
Part of the job of healthcare providers is to manage client information. Most of this information is routine, but some is sensitive. For this reason, physicians' offices provide a rich environment for understanding complex, sensitive information-management issues as they pertain to privacy and security. We present findings from interviews and observations of 15 offices in rural-serving southwest Virginia. Our work demonstrates how the current socio-technical system fails to meet the security needs of the patient. In particular, we found that tensions between work practice and security, and between electronic and paper records, resulted in insecure management of files.

Problem and Motivation


Traditionally, electronic and physical security have been concerned with creating rules, locks, and passwords. However, security systems that neglect people as a significant part of the equation are seldom secure in practice [3]. Practice is what happens in the moment; it is the activity; it is what is actually done. It is often in the human-centered moment, and not in the computer-centered planning stages, that security policies or mechanisms break down and the safety of sensitive information is compromised. For this reason, we propose that there is a need to study socio-technical systems to understand what roles humans and technology play in creating usable security practices that complement current technological mechanisms [1]. Specifically, we focus on physicians' offices, where a plethora of sensitive patient information exists in various stages and forms of documentation. Physicians' offices are valuable loci of study given the collaborative nature of the work and the increasing adoption of electronic medical records [7]. We present data from interviews and observations of 15 physicians' offices in rural-serving southwest Virginia to continue the discussion of usable security within a particular location and with a focus on practice.

Background and Related Work


The study of usable security in healthcare is an amalgamation of prior work on healthcare, security, and HCI [1, 6]. Patients serve as users, as owners of sensitive information, and as part of the healthcare system. In regards to security, prior work has demonstrated that balance is essential between policies and software solutions constructed to account for: the social and organizational context, temporal factors from actions in that context, possible threats from information usage, and trade-offs made by the user [1]. Relevant considerations include the location of computers and paper files within the physician's office, and users being inconvenienced by extra steps, such as entering a password every time they return to a computer or re-shelving files between frequent accesses. These factors demonstrate that not all solutions are technical: the social context must be accounted for in order to fully represent the needs of the users, as argued more generally in the work of Palen & Dourish [4]. Despite the need for such context, there has been little work on real social practices in regards to privacy and security. Thus, our work is a valuable contribution to the growing need for observations in real social environments.

Within prior work there have been few examples of qualitative analysis of security and privacy in healthcare (with valuable exceptions [1]). Qualitative methods, such as interviews and observations, allow researchers to gain a deeper understanding of lived experiences, exposing taken-for-granted assumptions by witnessing how participants live in their environment [5]. In particular, prior qualitative research in security has focused on technologically adept locations, with little research regarding those who opt not to use technology [8]. For these reasons we present qualitative data from rural-serving physicians' offices in regards to their security practices.

Approach and Uniqueness


Computing as a field tends to focus on presenting technological solutions to problems. In security, this can mean designing more usable interfaces and more secure password systems. These systems are useful, but they fail to account for the collaborative practices of work settings like healthcare. Our work, for this reason, uses qualitative inquiry to approach the use of technology in this novel area. We therefore used interviews and observations as our sources of data. Fifteen interviews were conducted with directors of physicians' offices, and 61.25 hours of observation were carried out at 5 locations. Participants had, on average, 20.16 years of experience. The average staff size was 10 people, with approximately 128 patients seen weekly. Given the lack of diversity among sites, more identifying information cannot be provided without compromising participant anonymity. All participants were unpaid. The interview protocol was developed and vetted by two external researchers. Participants were asked demographic questions, questions about their daily information-management practices, and questions about their electronic systems. Pictures and forms were collected from offices during interviews. We used phenomenology to derive the essence of security and privacy within the collaborative management of patient information. Phenomenology is a qualitative method used frequently in healthcare research; see [5] for details. For our study, data were analyzed by creating a set of themes, clustering the data into sets of meanings, and establishing agreement between the researchers before examining the data.

Results and Contributions


Through our interviews and observations the following findings emerged: passwords were rarely used; when used, passwords were shared; patient electronic and paper files were lost; electronic systems crashed, losing sensitive client information; and patient information was freely available and accessible to anyone who worked at the center. More information about this data can be found in [9]. Given the brevity of this abstract, we highlight the topics below that we would like to discuss with the GHC audience.

Supporting Collaborative Tasks

The breakdown in password utilization and personal password security reflects that the need for this feature is not represented in the work carried out in systems that have password functionality. In other words, users do not see the need for passwords, so individual passwords are not used. Similarly, office staff often left information out of files or did not return files to shelves immediately. This means that systems should support quick access to information based on the needs of the group, not on individual access control.

Systematic Flaws

Electronic record systems crashing, data backups failing, difficulty locating paper patient files, and files left in the open can all be attributed to flaws within the socio-technical system. The unreliability of electronic systems requires practices to maintain their paper files as a reliable backup, resulting in twice the number of files to maintain and twice the amount of data to secure. Leaving information out of files, or files off the shelf, even temporarily between uses, is in direct conflict with keeping that information secure, in the sense that it is not locked away and protected from prying eyes. Redundant information represents a system flaw in regards to security, but it was created to support the social system.
Designers should consider the affordances of paper files that are difficult for electronic systems to replicate, such as having a physical location, recognizable handwriting, and supporting the spotting of inconsistencies (e.g., missing information within a file).

Is Patient Privacy a Fallacy?

Further improvements can be made to enhance the reliability and security of electronic systems. Updates can be tracked, and regular backups can alert the system administrator when they fail to run successfully. Additionally, machine-learning algorithms can process individual user access to patient files in order to identify unusual behavior. The flaws we observed are not flaws of malice, but flaws of negligence, where the work of making client information secure and private is not clearly embodied in the practice of managing patient information. Our future work is to respond to these issues by prototyping solutions that do represent the social needs of information management.
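To make the anomaly-detection suggestion concrete, the following is a minimal sketch of how per-user access counts to patient files could be screened for outliers. The function name, the z-score approach, and the threshold are our own illustrative assumptions; they are not part of the study or of any deployed record system.

```python
# Hypothetical sketch: flag users whose volume of patient-file accesses
# is a statistical outlier relative to their peers. All names and the
# threshold value are illustrative assumptions.
from collections import Counter
from statistics import mean, stdev

def flag_unusual_access(access_log, threshold=2.0):
    """Return the set of users whose access count is an outlier.

    access_log: list of (user, patient_file) access events.
    A user is flagged when their count exceeds the mean by more
    than `threshold` standard deviations across all users.
    """
    counts = Counter(user for user, _ in access_log)
    values = list(counts.values())
    if len(values) < 2:
        return set()  # not enough users to establish a baseline
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return set()  # everyone behaves identically; nothing stands out
    return {u for u, c in counts.items() if (c - mu) / sigma > threshold}

# Example: ten staff members each open 5 files; one account opens 100.
log = [(f"user{i}", f"file{j}") for i in range(10) for j in range(5)]
log += [("userX", f"file{j}") for j in range(100)]
print(flag_unusual_access(log))  # the high-volume account is flagged
```

A real system would of course need richer features than raw counts (time of day, role, patient relationship), but even this simple baseline illustrates how access logs, which the offices we observed already generate implicitly, could be turned into a negligence-catching safeguard.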

References
[1] Adams & Blandford (2005). "Bridging the gap between organizational and user perspectives of security in the clinical domain." IJHCS 63(1-2).
[2] Adams & Sasse (1999). "Users are not the enemy." Communications of the ACM.
[3] Bellotti & Sellen (1993). "Design for Privacy in Ubiquitous Computing Environments." Conference on CSCW, Kluwer Academic Publishers.
[4] Palen & Dourish (2003). "Unpacking 'privacy' for a networked world." Conference on Human Factors in Computing Systems, Ft. Lauderdale, Florida, ACM.
[5] Starks & Trinidad (2007). "Choose your method: A comparison of phenomenology, discourse analysis, and grounded theory." Qual Health Res 17(10).
[6] Carayon (2006). "Human factors of complex sociotechnical systems." Applied Ergonomics 37(4).
[7] Berner, Detmer & Simborg (2005). "Will the Wave Finally Break? A Brief View of the Adoption of Electronic Medical Records in the United States." JAMIA 12(1).
[8] Satchell & Dourish (2009). "Beyond the user: Use and non-use in HCI." OZCHI, Melbourne, ACM.
[9] Baker, Vega, DeHart & Harrison (2011). "Healthcare & Security: Understanding & Evaluating the Risks." Human-Computer Interaction International (HCII 2011), ACM, Orlando, Florida, USA, July 9-14, 2011.