Human Factors in Privacy Research

Bibliographic Details
Contributors:
Place / Publishing House: Cham: Springer International Publishing AG, 2023.
©2023.
Year of Publication:2023
Edition:1st ed.
Language:English
Online Access:
Physical Description:1 online resource (380 pages)
Table of Contents:
  • Intro
  • Foreword
  • Acknowledgements
  • About This Book
  • Contents
  • Part I Theory
  • Data Collection Is Not Mostly Harmless: An Introduction to Privacy Theories and Basics
  • 1 Introduction
  • 2 Privacy Theories
  • 2.1 How (Not) to Define Privacy
  • 3 Why Do We Need Privacy?
  • References
  • From the Privacy Calculus to Crossing the Rubicon: An Introduction to Theoretical Models of User Privacy Behavior
  • 1 Introduction
  • 2 Homo Economicus
  • 3 Antecedents → Privacy Concerns → Outcomes (APCO) Model
  • 4 Theory of Planned Behavior
  • 5 Cognitive Consistency Theories
  • 6 Transactional Model of Stress and Coping
  • 7 Rubicon Model
  • 8 Capability, Opportunity, Motivation → Behavior (COM-B) System
  • 9 Health Action Process Approach
  • 10 Conclusion
  • References
  • Part II Methodology
  • Empirical Research Methods in Usable Privacy and Security
  • 1 Introduction
  • 2 Research Methods in UPS Studies
  • 2.1 Systematic Literature Reviews
  • 2.2 Interviews
  • 2.3 Focus Groups
  • 2.4 Co-Creation Methods
  • 2.5 Surveys
  • 2.6 Analyzing Measurement Data (Data Logs)
  • 2.7 Extracting Online Datasets
  • 2.8 Experience Sampling Method
  • 2.9 Experiments
  • 3 Techniques that Can Be Used in Combination with Methods
  • 4 Participant Recruitment
  • 5 Basics of Ethical Research Design with Human Participants
  • 5.1 Ethical Core Principles
  • 5.2 Ethical Considerations for Deceptive Research
  • 5.3 Ethical Review Boards
  • 6 Biases in Research with Human Participants
  • 7 Conclusion
  • References
  • Toward Valid and Reliable Privacy Concern Scales: The Example of IUIPC-8
  • 1 Introduction
  • 2 Information Privacy Concern
  • 2.1 What Is Information Privacy Concern?
  • 2.2 Information Privacy Concern Instruments
  • 3 Validity and Reliability
  • 3.1 Construct Validity
  • 3.2 Reliability
  • 4 Factor Analysis as Tool to Establish Measurement Instruments
  • 4.1 Estimation Methods for Ordinal Non-normal Data
  • 4.2 Comparing Nested Models
  • 4.3 Global and Local Fit
  • 5 Approach
  • 5.1 Analysis Methodology
  • 5.2 Sample
  • 5.3 Validity and Reliability Criteria
  • 6 The Validation of IUIPC-8
  • 6.1 Sample
  • 6.2 Descriptives
  • 6.3 Construct Validity
  • Factorial Validity
  • Model Fit
  • CFA Model, Convergent, and Discriminant Validity
  • 6.4 Reliability: Internal Consistency
  • 7 Discussion
  • 8 Summary
  • Appendix
  • Materials and Sample
  • Thresholds
  • References
  • Achieving Usable Security and Privacy Through Human-Centered Design
  • 1 Introduction
  • 2 Background
  • 2.1 Human-Centered Design
  • 2.2 Usable Security and Privacy
  • 3 Mental Models in Security and Privacy
  • 3.1 Mental Models in Human-Computer Interaction
  • 3.2 Mental Models in Usable Security and Privacy
  • 3.3 Mental Model Elicitation
  • 4 Usable Security and Privacy Needs
  • 4.1 USP Needs as a Requirements Type
  • 4.2 USP Needs Elicitation and Analysis
  • 4.3 USP Needs Documentation and Validation
  • 4.4 Example Case Study
  • 5 User Group Profiles and Privacy Personas
  • 5.1 User Group Profiles
  • 5.2 Privacy Personas
  • 6 Summary and Conclusion
  • References
  • What HCI Can Do for (Data Protection) Law – Beyond Design
  • 1 Introduction
  • 2 The Call for Effective Measures: A Door Opener for Empirical Sciences
  • 3 Going Beyond Designing Law: The Case for the Full Toolbox of HCI Research
  • 4 Levels of Engagement: How HCI and Law Can Make Data Protection More Effective
  • 4.1 Case 1: Cookie Banners
  • 4.2 Case 2: Data Subject Rights
  • 4.3 Implementation: What Can Design Do for Law?
  • 4.4 Evaluation: How Well Is Law Currently Working?
  • 4.5 Identification: Challenging Existing Legal Interpretations and Concepts
  • 5 The Road Ahead
  • References
  • Expert Opinions as a Method of Validating Ideas: Applied to Making GDPR Usable
  • 1 Introduction
  • 2 Method
  • 2.1 Collecting Interview Data
  • 2.2 Participants
  • 2.3 Thematic Analysis
  • 3 The Need to Evaluate and Measure Usability of Privacy
  • 3.1 Evaluating Usability of Privacy
  • 3.2 Measuring Usability of Privacy
  • 4 Usable Privacy Definition Adapts Well to ISO 9241-11:2018
  • 5 A Comprehensive List of Usable Privacy Goals
  • 6 Ways to Meet the Usable Privacy Criteria
  • 7 Usable Privacy Cube Model as an Abstraction of Known and Implied Principles of Privacy Evaluations
  • 8 Summarizing the Results of the Validation Study
  • 9 Conclusion
  • References
  • Part III Application Areas
  • Privacy Nudges and Informed Consent? Challenges for Privacy Nudge Design
  • 1 Introduction to Nudging
  • 2 An Overview on Privacy Nudges
  • 3 Ethical Considerations
  • 4 Challenges of Designing Privacy Nudges
  • 5 Discussion of Approaches
  • 5.1 Design of Privacy-Preserving Nudges
  • 5.2 Design of Nudges that Target Reflective Thinking
  • 5.3 Ask the Users
  • 5.4 Choose a Combination of Approaches
  • 6 Summary
  • References
  • The Hows and Whys of Dark Patterns: Categorizations and Privacy
  • 1 Introduction
  • 2 Dark Patterns
  • 2.1 Why Do Dark Patterns Work?
  • Heuristics and Biases
  • 2.2 Privacy Decision-Making
  • 2.3 Categorization of Dark Patterns
  • 3 Privacy Dark Patterns
  • 3.1 Examples of Privacy Dark Patterns
  • Invisible to the Human Eye
  • UI Design Tricks
  • Constrained Actionability
  • Emotion-Related
  • Affecting Comprehension
  • Time-Related
  • Affecting Privacy Options
  • 3.2 Tackling (Privacy) Dark Patterns
  • 3.3 Dark Patterns and Implications on Businesses
  • 4 Concluding Remarks
  • References
  • "They see me scrollin'" – Lessons Learned from Investigating Shoulder Surfing Behavior and Attack Mitigation Strategies
  • 1 Introduction
  • 2 Investigating the Phenomenon
  • 2.1 Defining Shoulder Surfing (Attacks)
  • 2.2 Research Methods
  • 2.3 Key Findings on Shoulder Surfing Behavior
  • 3 Mitigating Shoulder Surfing Attacks
  • 3.1 Threat Models
  • 3.2 Algorithmic Detection of Attacks
  • 3.3 Prevention Strategies
  • 4 Challenges and Future Research Directions
  • 5 Conclusion
  • References
  • Privacy Research on the Pulse of Time: COVID-19 Contact-Tracing Apps
  • 1 Introduction
  • 2 Tracing Technologies
  • 2.1 Proximity Tracing
  • 2.2 Risk Calculation and Informing Those at Risk
  • 3 Privacy and Contact Tracing Apps – User Studies
  • 3.1 Results from User Studies – Privacy Concerns
  • 3.2 Influence of Privacy on Using a CTA
  • 4 Privacy: A Matter of Asking? Looking at Different Methods
  • 4.1 Timing and Context
  • 4.2 Who Is Asked?
  • 4.3 Privacy Concerns ≠ Privacy Concerns
  • 5 Conclusion
  • References
  • Privacy Perception and Behavior in Safety-Critical Environments
  • 1 Introduction
  • 2 On the Relationship Between Cyber Privacy and Security Behavior
  • 3 Awareness on Data Sharing Functionalities and Acceptance of Private Data Sharing
  • 4 Critical Environment I: Digital Privacy Perceptions of Asylum Seekers in Germany
  • 5 Critical Environment II: The Role of Privacy in Digitalization – Analyzing Perspectives of German Farmers
  • 6 Conclusion
  • References
  • Part IV Solutions
  • Generic Consents in Digital Ecosystems: Legal, Psychological, and Technical Perspectives
  • 1 Challenge and Vision
  • 2 Generic Consents
  • 3 Legal Assessment
  • 3.1 Personal Information Management Systems in Digital Ecosystems
  • 3.2 Obtaining Consent via a PIMS
  • 3.3 Using Allowlists in Digital Ecosystems
  • Solution 1: Organizational Allowlists
  • Solution 2: User-Defined Allowlists
  • 3.4 Legal Conclusion
  • 4 User-Oriented Redesign of Consent Handling
  • 4.1 Psychological Effects of Cookie Banners
  • Problem 1: Upfront Consents
  • Problem 2: Coerced Consents
  • Problem 3: Poor User Experience
  • Problem 4: Unclear Utility
  • Problem 5: Dark Patterns
  • Problem 6: Repeated Consents
  • 4.2 Solutions for Improved User Experience
  • Solution 1: Make Cookies Something of Later Concern
  • Solution 2: Reject Until Further Notice
  • Solution 3: Provide Differentiated Decision Support
  • Solution 4: Encourage Decision Review
  • 5 Feasibility of Technical Implementation
  • 5.1 Consent Representation Formats
  • 5.2 Consent Forwarding
  • 5.3 Data Forwarding
  • 6 Discussion
  • 6.1 Allowlists Created by NGOs (Solution 1)
  • 6.2 Allowlists Created by the User (Solution 2)
  • 6.3 Blocklists
  • 6.4 Usability
  • 7 Conclusion
  • References
  • Human-Centered Design for Data-Sparse Tailored Privacy Information Provision
  • 1 Motivation
  • 2 Overview of Extant Transparency-Enhancing Technologies
  • 2.1 Tailoring Potential of Transparency-Enhancing Technologies
  • 3 Solution Space for Tailoring Challenges
  • 3.1 Privacy Preferences
  • 3.2 Technical Privacy-Preserving Mechanisms
  • 4 Solution Archetypes for Tailored Privacy Information Provision
  • 4.1 Suitability of Tailoring Approaches
  • 4.2 Feasibility of Local and Remote Processing
  • 5 Conclusions
  • References
  • Acceptance Factors of Privacy-Enhancing Technologies on the Basis of Tor and JonDonym
  • 1 Introduction and Background
  • 2 Methodology
  • 2.1 Questionnaire Composition
  • 2.2 Questionnaire Data Collection
  • 2.3 Questionnaire Evaluation
  • Quantitative Methods
  • Qualitative Methods
  • 2.4 Interview Data Collection
  • 2.5 Interview Evaluation
  • 3 Results
  • 3.1 Internet Users' Information Privacy Concerns
  • 3.2 Technology Acceptance Model
  • 3.3 Evaluation of Open Questions
  • 3.4 Customers' Willingness to Pay or Donate
  • 3.5 Companies' Incentives and Hindrances to Implement PETs
  • 4 Discussion and Conclusion
  • References
  • Increasing Users' Privacy Awareness in the Internet of Things: Design Space and Sample Scenarios