Human Factors in Privacy Research.

Bibliographic Details
Main Author: Gerber, Nina
Contributors: Stöver, Alina, Marky, Karola
Place / Publishing House: Cham : Springer International Publishing AG, 2023.
©2023.
Year of Publication: 2023
Edition: 1st ed.
Language: English
Online Access: https://ebookcentral.proquest.com/lib/oeawat/detail.action?docID=30702982
Physical Description: 1 online resource (380 pages)
id 50030702982
ctrlnum (MiAaPQ)50030702982
(Au-PeEL)EBL30702982
(OCoLC)1395077403
collection bib_alma
record_format marc
spelling Gerber, Nina.
Human Factors in Privacy Research.
1st ed.
Cham : Springer International Publishing AG, 2023.
©2023.
1 online resource (380 pages)
text txt rdacontent
computer c rdamedia
online resource cr rdacarrier
Description based on publisher supplied metadata and other sources.
Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2024. Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries.
Electronic books.
Stöver, Alina.
Marky, Karola.
Print version: Gerber, Nina. Human Factors in Privacy Research. Cham : Springer International Publishing AG, ©2023. 9783031286421
ProQuest (Firm)
https://ebookcentral.proquest.com/lib/oeawat/detail.action?docID=30702982
language English
format eBook
author Gerber, Nina.
spellingShingle Gerber, Nina.
Human Factors in Privacy Research.
author_facet Gerber, Nina.
Stöver, Alina.
Marky, Karola.
author_variant n g ng
author2 Stöver, Alina.
Marky, Karola.
author2_variant a s as
k m km
author2_role Contributor
Contributor
author_sort Gerber, Nina.
title Human Factors in Privacy Research.
title_full Human Factors in Privacy Research.
title_fullStr Human Factors in Privacy Research.
title_full_unstemmed Human Factors in Privacy Research.
title_auth Human Factors in Privacy Research.
title_new Human Factors in Privacy Research.
title_sort human factors in privacy research.
publisher Springer International Publishing AG,
publishDate 2023
physical 1 online resource (380 pages)
edition 1st ed.
contents Intro -- Foreword -- Acknowledgements -- About This Book -- Contents -- Part I Theory -- Data Collection Is Not Mostly Harmless: An Introduction to Privacy Theories and Basics -- 1 Introduction -- 2 Privacy Theories -- 2.1 How (Not) to Define Privacy -- 3 Why Do We Need Privacy? -- References -- From the Privacy Calculus to Crossing the Rubicon: An Introduction to Theoretical Models of User Privacy Behavior -- 1 Introduction -- 2 Homo Economicus -- 3 Antecedents → Privacy Concerns → Outcomes (APCO) Model -- 4 Theory of Planned Behavior -- 5 Cognitive Consistency Theories -- 6 Transactional Model of Stress and Coping -- 7 Rubicon Model -- 8 Capability, Opportunity, Motivation → Behavior (COM-B) System -- 9 Health Action Process Approach -- 10 Conclusion -- References -- Part II Methodology -- Empirical Research Methods in Usable Privacy and Security -- 1 Introduction -- 2 Research Methods in UPS Studies -- 2.1 Systematic Literature Reviews -- 2.2 Interviews -- 2.3 Focus Groups -- 2.4 Co-Creation Methods -- 2.5 Surveys -- 2.6 Analyzing Measurement Data (Data Logs) -- 2.7 Extracting Online Datasets -- 2.8 Experience Sampling Method -- 2.9 Experiments -- 3 Techniques that Can Be Used in Combination with Methods -- 4 Participant Recruitment -- 5 Basics of Ethical Research Design with Human Participants -- 5.1 Ethical Core Principles -- 5.2 Ethical Considerations for Deceptive Research -- 5.3 Ethical Review Boards -- 6 Biases in Research with Human Participants -- 7 Conclusion -- References -- Toward Valid and Reliable Privacy Concern Scales: The Example of IUIPC-8 -- 1 Introduction -- 2 Information Privacy Concern -- 2.1 What Is Information Privacy Concern? -- 2.2 Information Privacy Concern Instruments -- 3 Validity and Reliability -- 3.1 Construct Validity -- 3.2 Reliability -- 4 Factor Analysis as Tool to Establish Measurement Instruments.
4.1 Estimation Methods for Ordinal Non-normal Data -- 4.2 Comparing Nested Models -- 4.3 Global and Local Fit -- 5 Approach -- 5.1 Analysis Methodology -- 5.2 Sample -- 5.3 Validity and Reliability Criteria -- 6 The Validation of IUIPC-8 -- 6.1 Sample -- 6.2 Descriptives -- 6.3 Construct Validity -- Factorial Validity -- Model Fit -- CFA Model, Convergent, and Discriminant Validity -- 6.4 Reliability: Internal Consistency -- 7 Discussion -- 8 Summary -- Appendix -- Materials and Sample -- Thresholds -- References -- Achieving Usable Security and Privacy Through Human-Centered Design -- 1 Introduction -- 2 Background -- 2.1 Human-Centered Design -- 2.2 Usable Security and Privacy -- 3 Mental Models in Security and Privacy -- 3.1 Mental Models in Human-Computer Interaction -- 3.2 Mental Models in Usable Security and Privacy -- 3.3 Mental Model Elicitation -- 4 Usable Security and Privacy Needs -- 4.1 USP Needs as a Requirements Type -- 4.2 USP Needs Elicitation and Analysis -- 4.3 USP Needs Documentation and Validation -- 4.4 Example Case Study -- 5 User Group Profiles and Privacy Personas -- 5.1 User Group Profiles -- 5.2 Privacy Personas -- 6 Summary and Conclusion -- References -- What HCI Can Do for (Data Protection) Law-Beyond Design -- 1 Introduction -- 2 The Call for Effective Measures: A Door Opener for Empirical Sciences -- 3 Going Beyond Designing Law: The Case for the Full Toolbox of HCI Research -- 4 Levels of Engagement: How HCI and Law Can Make Data Protection More Effective -- 4.1 Case 1: Cookie Banners -- 4.2 Case 2: Data Subject Rights -- 4.3 Implementation: What Can Design Do for Law? -- 4.4 Evaluation: How Well Is Law Currently Working? -- 4.5 Identification: Challenging Existing Legal Interpretations and Concepts -- 5 The Road Ahead -- References -- Expert Opinions as a Method of Validating Ideas: Applied to Making GDPR Usable.
1 Introduction -- 2 Method -- 2.1 Collecting Interview Data -- 2.2 Participants -- 2.3 Thematic Analysis -- 3 The Need to Evaluate and Measure Usability of Privacy -- 3.1 Evaluating Usability of Privacy -- 3.2 Measuring Usability of Privacy -- 4 Usable Privacy Definition Adapts Well ISO 9241-11:2018 -- 5 A Comprehensive List of Usable Privacy Goals -- 6 Ways to Meet the Usable Privacy Criteria -- 7 Usable Privacy Cube Model as an Abstraction of Known and Implied Principles of Privacy Evaluations -- 8 Summarizing the Results of the Validation Study -- 9 Conclusion -- References -- Part III Application Areas -- Privacy Nudges and Informed Consent? Challenges for Privacy Nudge Design -- 1 Introduction to Nudging -- 2 An Overview on Privacy Nudges -- 3 Ethical Considerations -- 4 Challenges of Designing Privacy Nudges -- 5 Discussion of Approaches -- 5.1 Design of Privacy-Preserving Nudges -- 5.2 Design of Nudges that Target Reflective Thinking -- 5.3 Ask the Users -- 5.4 Choose a Combination of Approaches -- 6 Summary -- References -- The Hows and Whys of Dark Patterns: Categorizations and Privacy -- 1 Introduction -- 2 Dark Patterns -- 2.1 Why Do Dark Patterns Work? -- Heuristics and Biases -- 2.2 Privacy Decision-Making -- 2.3 Categorization of Dark Patterns -- 3 Privacy Dark Patterns -- 3.1 Examples of Privacy Dark Patterns -- Invisible to the Human Eye -- UI Design Tricks -- Constrained Actionability -- Emotion-Related -- Affecting Comprehension -- Time-Related -- Affecting Privacy Options -- 3.2 Tackling (Privacy) Dark Patterns -- 3.3 Dark Patterns and Implications on Businesses -- 4 Concluding Remarks -- References -- ``They see me scrollin''-Lessons Learned from Investigating Shoulder Surfing Behavior and Attack Mitigation Strategies -- 1 Introduction -- 2 Investigating the Phenomenon -- 2.1 Defining Shoulder Surfing (Attacks).
2.2 Research Methods -- 2.3 Key Findings on Shoulder Surfing Behavior -- 3 Mitigating Shoulder Surfing Attacks -- 3.1 Threat Models -- 3.2 Algorithmic Detection of Attacks -- 3.3 Prevention Strategies -- 4 Challenges and Future Research Directions -- 5 Conclusion -- References -- Privacy Research on the Pulse of Time: COVID-19 Contact-Tracing Apps -- 1 Introduction -- 2 Tracing Technologies -- 2.1 Proximity Tracing -- 2.2 Risk Calculation and Informing Those at Risk -- 3 Privacy and Contact Tracing Apps-User Studies -- 3.1 Results from User Studies-Privacy Concerns -- 3.2 Influence of Privacy on Using a CTA -- 4 Privacy: A Matter of Asking? Looking at Different Methods -- 4.1 Timing and Context -- 4.2 Who Is Asked? -- 4.3 Privacy Concerns != Privacy Concerns -- 5 Conclusion -- References -- Privacy Perception and Behavior in Safety-Critical Environments -- 1 Introduction -- 2 On the Relationship Between Cyber Privacy and Security Behavior -- 3 Awareness on Data Sharing Functionalities and Acceptance of Private Data Sharing -- 4 Critical Environment I: Digital Privacy Perceptions of Asylum Seekers in Germany -- 5 Critical Environment II: The Role of Privacy in Digitalization-Analyzing Perspectives of German Farmers -- 6 Conclusion -- References -- Part IV Solutions -- Generic Consents in Digital Ecosystems: Legal, Psychological, and Technical Perspectives -- 1 Challenge and Vision -- 2 Generic Consents -- 3 Legal Assessment -- 3.1 Personal Information Management Systems in Digital Ecosystems -- 3.2 Obtaining Consent via a PIMS -- 3.3 Using Allowlists in Digital Ecosystems -- Solution 1: Organizational Allowlists -- Solution 2: User-Defined Allowlists -- 3.4 Legal Conclusion -- 4 User-Oriented Redesign of Consent Handling -- 4.1 Psychological Effects of Cookie Banners -- Problem 1: Upfront Consents -- Problem 2: Coerced Consents.
Problem 3: Poor User Experience -- Problem 4: Unclear Utility -- Problem 5: Dark Patterns -- Problem 6: Repeated Consents -- 4.2 Solutions for Improved User Experience -- Solution 1: Make Cookies Something of Later Concern -- Solution 2: Reject Until Further Notice -- Solution 3: Provide Differentiated Decision Support -- Solution 4: Encourage Decision Review -- 5 Feasibility of Technical Implementation -- 5.1 Consent Representation Formats -- 5.2 Consent Forwarding -- 5.3 Data Forwarding -- 6 Discussion -- 6.1 Allowlists Created by NGOs (Solution 1) -- 6.2 Allowlists Created by the User (Solution 2) -- 6.3 Blocklists -- 6.4 Usability -- 7 Conclusion -- References -- Human-Centered Design for Data-Sparse Tailored Privacy Information Provision -- 1 Motivation -- 2 Overview of Extant Transparency-Enhancing Technologies -- 2.1 Tailoring Potential of Transparency-Enhancing Technologies -- 3 Solution Space for Tailoring Challenges -- 3.1 Privacy Preferences -- 3.2 Technical Privacy-Preserving Mechanisms -- 4 Solution Archetypes for Tailored Privacy Information Provision -- 4.1 Suitability of Tailoring Approaches -- 4.2 Feasibility of Local and Remote Processing -- 5 Conclusions -- References -- Acceptance Factors of Privacy-Enhancing Technologies on the Basis of Tor and JonDonym -- 1 Introduction and Background -- 2 Methodology -- 2.1 Questionnaire Composition -- 2.2 Questionnaire Data Collection -- 2.3 Questionnaire Evaluation -- Quantitative Methods -- Qualitative Methods -- 2.4 Interview Data Collection -- 2.5 Interview Evaluation -- 3 Results -- 3.1 Internet Users Information Privacy Concerns -- 3.2 Technology Acceptance Model -- 3.3 Evaluation of Open Questions -- 3.4 Customers' Willingness to Pay or Donate -- 3.5 Companies' Incentives and Hindrances to Implement PETs -- 4 Discussion and Conclusion -- References.
Increasing Users' Privacy Awareness in the Internet of Things: Design Space and Sample Scenarios.
isbn 9783031286438
9783031286421
callnumber-first B - Philosophy, Psychology, Religion
callnumber-subject BF - Psychology
callnumber-label BF1-990
callnumber-sort BF 11 3990
genre Electronic books.
genre_facet Electronic books.
url https://ebookcentral.proquest.com/lib/oeawat/detail.action?docID=30702982
illustrated Not Illustrated
oclc_num 1395077403
work_keys_str_mv AT gerbernina humanfactorsinprivacyresearch
AT stoveralina humanfactorsinprivacyresearch
AT markykarola humanfactorsinprivacyresearch
status_str n
ids_txt_mv (MiAaPQ)50030702982
(Au-PeEL)EBL30702982
(OCoLC)1395077403
carrierType_str_mv cr
is_hierarchy_title Human Factors in Privacy Research.
author2_original_writing_str_mv noLinkedField
noLinkedField
_version_ 1792331072630947840
fullrecord <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>11028nam a22004573i 4500</leader><controlfield tag="001">50030702982</controlfield><controlfield tag="003">MiAaPQ</controlfield><controlfield tag="005">20240229073851.0</controlfield><controlfield tag="006">m o d | </controlfield><controlfield tag="007">cr cnu||||||||</controlfield><controlfield tag="008">240229s2023 xx o ||||0 eng d</controlfield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9783031286438</subfield><subfield code="q">(electronic bk.)</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="z">9783031286421</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(MiAaPQ)50030702982</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(Au-PeEL)EBL30702982</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)1395077403</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">MiAaPQ</subfield><subfield code="b">eng</subfield><subfield code="e">rda</subfield><subfield code="e">pn</subfield><subfield code="c">MiAaPQ</subfield><subfield code="d">MiAaPQ</subfield></datafield><datafield tag="050" ind1=" " ind2="4"><subfield code="a">BF1-990</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Gerber, Nina.</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Human Factors in Privacy Research.</subfield></datafield><datafield tag="250" ind1=" " ind2=" "><subfield code="a">1st ed.</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">Cham :</subfield><subfield code="b">Springer International Publishing AG,</subfield><subfield code="c">2023.</subfield></datafield><datafield tag="264" ind1=" " ind2="4"><subfield code="c">©2023.</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">1 online resource (380 pages)</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">computer</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">online resource</subfield><subfield code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="505" ind1="0" ind2=" "><subfield code="a">Intro -- Foreword -- Acknowledgements -- About This Book -- Contents -- Part I Theory -- Data Collection Is Not Mostly Harmless: An Introduction to Privacy Theories and Basics -- 1 Introduction -- 2 Privacy Theories -- 2.1 How (Not) to Define Privacy -- 3 Why Do We Need Privacy? 
-- References -- From the Privacy Calculus to Crossing the Rubicon: An Introduction to Theoretical Models of User Privacy Behavior -- 1 Introduction -- 2 Homo Economicus -- 3 Antecedents → Privacy Concerns → Outcomes (APCO) Model -- 4 Theory of Planned Behavior -- 5 Cognitive Consistency Theories -- 6 Transactional Model of Stress and Coping -- 7 Rubicon Model -- 8 Capability, Opportunity, Motivation → Behavior (COM-B) System -- 9 Health Action Process Approach -- 10 Conclusion -- References -- Part II Methodology -- Empirical Research Methods in Usable Privacy and Security -- 1 Introduction -- 2 Research Methods in UPS Studies -- 2.1 Systematic Literature Reviews -- 2.2 Interviews -- 2.3 Focus Groups -- 2.4 Co-Creation Methods -- 2.5 Surveys -- 2.6 Analyzing Measurement Data (Data Logs) -- 2.7 Extracting Online Datasets -- 2.8 Experience Sampling Method -- 2.9 Experiments -- 3 Techniques that Can Be Used in Combination with Methods -- 4 Participant Recruitment -- 5 Basics of Ethical Research Design with Human Participants -- 5.1 Ethical Core Principles -- 5.2 Ethical Considerations for Deceptive Research -- 5.3 Ethical Review Boards -- 6 Biases in Research with Human Participants -- 7 Conclusion -- References -- Toward Valid and Reliable Privacy Concern Scales: The Example of IUIPC-8 -- 1 Introduction -- 2 Information Privacy Concern -- 2.1 What Is Information Privacy Concern? -- 2.2 Information Privacy Concern Instruments -- 3 Validity and Reliability -- 3.1 Construct Validity -- 3.2 Reliability -- 4 Factor Analysis as Tool to Establish Measurement Instruments.</subfield></datafield><datafield tag="505" ind1="8" ind2=" "><subfield code="a">4.1 Estimation Methods for Ordinal Non-normal Data -- 4.2 Comparing Nested Models -- 4.3 Global and Local Fit -- 5 Approach -- 5.1 Analysis Methodology -- 5.2 Sample -- 5.3 Validity and Reliability Criteria -- 6 The Validation of IUIPC-8 -- 6.1 Sample -- 6.2 Descriptives -- 6.3 Construct Validity -- Factorial Validity -- Model Fit -- CFA Model, Convergent, and Discriminant Validity -- 6.4 Reliability: Internal Consistency -- 7 Discussion -- 8 Summary -- Appendix -- Materials and Sample -- Thresholds -- References -- Achieving Usable Security and Privacy Through Human-Centered Design -- 1 Introduction -- 2 Background -- 2.1 Human-Centered Design -- 2.2 Usable Security and Privacy -- 3 Mental Models in Security and Privacy -- 3.1 Mental Models in Human-Computer Interaction -- 3.2 Mental Models in Usable Security and Privacy -- 3.3 Mental Model Elicitation -- 4 Usable Security and Privacy Needs -- 4.1 USP Needs as a Requirements Type -- 4.2 USP Needs Elicitation and Analysis -- 4.3 USP Needs Documentation and Validation -- 4.4 Example Case Study -- 5 User Group Profiles and Privacy Personas -- 5.1 User Group Profiles -- 5.2 Privacy Personas -- 6 Summary and Conclusion -- References -- What HCI Can Do for (Data Protection) Law-Beyond Design -- 1 Introduction -- 2 The Call for Effective Measures: A Door Opener for Empirical Sciences -- 3 Going Beyond Designing Law: The Case for the Full Toolbox of HCI Research -- 4 Levels of Engagement: How HCI and Law Can Make Data Protection More Effective -- 4.1 Case 1: Cookie Banners -- 4.2 Case 2: Data Subject Rights -- 4.3 Implementation: What Can Design Do for Law? -- 4.4 Evaluation: How Well Is Law Currently Working? 
-- 4.5 Identification: Challenging Existing Legal Interpretations and Concepts -- 5 The Road Ahead -- References -- Expert Opinions as a Method of Validating Ideas: Applied to Making GDPR Usable.</subfield></datafield><datafield tag="505" ind1="8" ind2=" "><subfield code="a">1 Introduction -- 2 Method -- 2.1 Collecting Interview Data -- 2.2 Participants -- 2.3 Thematic Analysis -- 3 The Need to Evaluate and Measure Usability of Privacy -- 3.1 Evaluating Usability of Privacy -- 3.2 Measuring Usability of Privacy -- 4 Usable Privacy Definition Adapts Well ISO 9241-11:2018 -- 5 A Comprehensive List of Usable Privacy Goals -- 6 Ways to Meet the Usable Privacy Criteria -- 7 Usable Privacy Cube Model as an Abstraction of Known and Implied Principles of Privacy Evaluations -- 8 Summarizing the Results of the Validation Study -- 9 Conclusion -- References -- Part III Application Areas -- Privacy Nudges and Informed Consent? Challenges for Privacy Nudge Design -- 1 Introduction to Nudging -- 2 An Overview on Privacy Nudges -- 3 Ethical Considerations -- 4 Challenges of Designing Privacy Nudges -- 5 Discussion of Approaches -- 5.1 Design of Privacy-Preserving Nudges -- 5.2 Design of Nudges that Target Reflective Thinking -- 5.3 Ask the Users -- 5.4 Choose a Combination of Approaches -- 6 Summary -- References -- The Hows and Whys of Dark Patterns: Categorizations and Privacy -- 1 Introduction -- 2 Dark Patterns -- 2.1 Why Do Dark Patterns Work? -- Heuristics and Biases -- 2.2 Privacy Decision-Making -- 2.3 Categorization of Dark Patterns -- 3 Privacy Dark Patterns -- 3.1 Examples of Privacy Dark Patterns -- Invisible to the Human Eye -- UI Design Tricks -- Constrained Actionability -- Emotion-Related -- Affecting Comprehension -- Time-Related -- Affecting Privacy Options -- 3.2 Tackling (Privacy) Dark Patterns -- 3.3 Dark Patterns and Implications on Businesses -- 4 Concluding Remarks -- References -- ``They see me scrollin''-Lessons Learned from Investigating Shoulder Surfing Behavior and Attack Mitigation Strategies -- 1 Introduction -- 2 Investigating the Phenomenon -- 2.1 Defining Shoulder Surfing (Attacks).</subfield></datafield><datafield tag="505" ind1="8" ind2=" "><subfield code="a">2.2 Research Methods -- 2.3 Key Findings on Shoulder Surfing Behavior -- 3 Mitigating Shoulder Surfing Attacks -- 3.1 Threat Models -- 3.2 Algorithmic Detection of Attacks -- 3.3 Prevention Strategies -- 4 Challenges and Future Research Directions -- 5 Conclusion -- References -- Privacy Research on the Pulse of Time: COVID-19 Contact-Tracing Apps -- 1 Introduction -- 2 Tracing Technologies -- 2.1 Proximity Tracing -- 2.2 Risk Calculation and Informing Those at Risk -- 3 Privacy and Contact Tracing Apps-User Studies -- 3.1 Results from User Studies-Privacy Concerns -- 3.2 Influence of Privacy on Using a CTA -- 4 Privacy: A Matter of Asking? Looking at Different Methods -- 4.1 Timing and Context -- 4.2 Who Is Asked? 
-- 4.3 Privacy Concerns != Privacy Concerns -- 5 Conclusion -- References -- Privacy Perception and Behavior in Safety-Critical Environments -- 1 Introduction -- 2 On the Relationship Between Cyber Privacy and Security Behavior -- 3 Awareness on Data Sharing Functionalities and Acceptance of Private Data Sharing -- 4 Critical Environment I: Digital Privacy Perceptions of Asylum Seekers in Germany -- 5 Critical Environment II: The Role of Privacy in Digitalization-Analyzing Perspectives of German Farmers -- 6 Conclusion -- References -- Part IV Solutions -- Generic Consents in Digital Ecosystems: Legal, Psychological, and Technical Perspectives -- 1 Challenge and Vision -- 2 Generic Consents -- 3 Legal Assessment -- 3.1 Personal Information Management Systems in Digital Ecosystems -- 3.2 Obtaining Consent via a PIMS -- 3.3 Using Allowlists in Digital Ecosystems -- Solution 1: Organizational Allowlists -- Solution 2: User-Defined Allowlists -- 3.4 Legal Conclusion -- 4 User-Oriented Redesign of Consent Handling -- 4.1 Psychological Effects of Cookie Banners -- Problem 1: Upfront Consents -- Problem 2: Coerced Consents.</subfield></datafield><datafield tag="505" ind1="8" ind2=" "><subfield code="a">Problem 3: Poor User Experience -- Problem 4: Unclear Utility -- Problem 5: Dark Patterns -- Problem 6: Repeated Consents -- 4.2 Solutions for Improved User Experience -- Solution 1: Make Cookies Something of Later Concern -- Solution 2: Reject Until Further Notice -- Solution 3: Provide Differentiated Decision Support -- Solution 4: Encourage Decision Review -- 5 Feasibility of Technical Implementation -- 5.1 Consent Representation Formats -- 5.2 Consent Forwarding -- 5.3 Data Forwarding -- 6 Discussion -- 6.1 Allowlists Created by NGOs (Solution 1) -- 6.2 Allowlists Created by the User (Solution 2) -- 6.3 Blocklists -- 6.4 Usability -- 7 Conclusion -- References -- Human-Centered Design for Data-Sparse Tailored Privacy Information Provision -- 1 Motivation -- 2 Overview of Extant Transparency-Enhancing Technologies -- 2.1 Tailoring Potential of Transparency-Enhancing Technologies -- 3 Solution Space for Tailoring Challenges -- 3.1 Privacy Preferences -- 3.2 Technical Privacy-Preserving Mechanisms -- 4 Solution Archetypes for Tailored Privacy Information Provision -- 4.1 Suitability of Tailoring Approaches -- 4.2 Feasibility of Local and Remote Processing -- 5 Conclusions -- References -- Acceptance Factors of Privacy-Enhancing Technologies on the Basis of Tor and JonDonym -- 1 Introduction and Background -- 2 Methodology -- 2.1 Questionnaire Composition -- 2.2 Questionnaire Data Collection -- 2.3 Questionnaire Evaluation -- Quantitative Methods -- Qualitative Methods -- 2.4 Interview Data Collection -- 2.5 Interview Evaluation -- 3 Results -- 3.1 Internet Users Information Privacy Concerns -- 3.2 Technology Acceptance Model -- 3.3 Evaluation of Open Questions -- 3.4 Customers' Willingness to Pay or Donate -- 3.5 Companies' Incentives and Hindrances to Implement PETs -- 4 Discussion and Conclusion -- References.</subfield></datafield><datafield tag="505" ind1="8" ind2=" "><subfield code="a">Increasing Users' Privacy Awareness in the Internet of Things: Design Space and Sample Scenarios.</subfield></datafield><datafield tag="588" ind1=" " ind2=" "><subfield code="a">Description based on publisher supplied metadata and other sources.</subfield></datafield><datafield tag="590" ind1=" " ind2=" "><subfield code="a">Electronic reproduction. Ann Arbor, Michigan : ProQuest Ebook Central, 2024. 
Available via World Wide Web. Access may be limited to ProQuest Ebook Central affiliated libraries. </subfield></datafield><datafield tag="655" ind1=" " ind2="4"><subfield code="a">Electronic books.</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Stöver, Alina.</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Marky, Karola.</subfield></datafield><datafield tag="776" ind1="0" ind2="8"><subfield code="i">Print version:</subfield><subfield code="a">Gerber, Nina</subfield><subfield code="t">Human Factors in Privacy Research</subfield><subfield code="d">Cham : Springer International Publishing AG,c2023</subfield><subfield code="z">9783031286421</subfield></datafield><datafield tag="797" ind1="2" ind2=" "><subfield code="a">ProQuest (Firm)</subfield></datafield><datafield tag="856" ind1="4" ind2="0"><subfield code="u">https://ebookcentral.proquest.com/lib/oeawat/detail.action?docID=30702982</subfield><subfield code="z">Click to View</subfield></datafield></record></collection>
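The fullrecord field above is a standard MARC 21 record serialized as MARCXML (namespace http://www.loc.gov/MARC21/slim). As a minimal sketch, assuming the <collection> element shown above has been saved to a local file named record.xml (the file name and the helper subfield_values are illustrative, not part of the record), the main access points can be read with Python's standard library:

import xml.etree.ElementTree as ET

NS = {"marc": "http://www.loc.gov/MARC21/slim"}

def subfield_values(record, tag, code):
    # Yield the text of every subfield `code` inside datafields with the given MARC tag.
    for field in record.findall(f"marc:datafield[@tag='{tag}']", NS):
        for sub in field.findall(f"marc:subfield[@code='{code}']", NS):
            yield (sub.text or "").strip()

tree = ET.parse("record.xml")                              # assumed local copy of the MARCXML above
record = tree.getroot().find("marc:record", NS)

title = next(subfield_values(record, "245", "a"), None)    # "Human Factors in Privacy Research."
isbn = next(subfield_values(record, "020", "a"), None)     # "9783031286438" (electronic)
link = next(subfield_values(record, "856", "u"), None)     # ProQuest access URL

print(title, isbn, link)

Field tags 245 (title), 020 (ISBN), and 856 (electronic location) follow the MARC 21 bibliographic format and correspond to the datafields visible in the record above.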