000 07954nam a22006255i 4500
001 978-3-030-49062-1
003 DE-He213
005 20240730165937.0
007 cr nn 008mamaa
008 200602s2020 sz | s |||| 0|eng d
020 _a9783030490621
_9978-3-030-49062-1
024 7 _a10.1007/978-3-030-49062-1
_2doi
050 4 _aQA76.9.U83
050 4 _aQA76.9.H85
072 7 _aUYZ
_2bicssc
072 7 _aCOM079010
_2bisacsh
072 7 _aUYZ
_2thema
082 0 4 _a005.437
_223
082 0 4 _a004.019
_223
245 1 0 _aHuman-Computer Interaction. Multimodal and Natural Interaction
_h[electronic resource] :
_bThematic Area, HCI 2020, Held as Part of the 22nd International Conference, HCII 2020, Copenhagen, Denmark, July 19-24, 2020, Proceedings, Part II /
_cedited by Masaaki Kurosu.
250 _a1st ed. 2020.
264 1 _aCham :
_bSpringer International Publishing :
_bImprint: Springer,
_c2020.
300 _aXXI, 735 p. 325 illus., 246 illus. in color.
_bonline resource.
336 _atext
_btxt
_2rdacontent
337 _acomputer
_bc
_2rdamedia
338 _aonline resource
_bcr
_2rdacarrier
347 _atext file
_bPDF
_2rda
490 1 _aInformation Systems and Applications, incl. Internet/Web, and HCI,
_x2946-1642 ;
_v12182
505 0 _aGesture-based Interaction -- A Human-centered Approach to Designing Gestures for Natural User Interfaces -- Comparing a Mouse and a Free Hand Gesture Interaction Technique for 3D Object Manipulation -- Research on Gesture Interaction Design for Home Control Intelligent Terminals -- A Comparative Study of Hand Gesture Recognition Devices for Games -- The Social Acceptability of Peripheral Interaction with 3D Gestures in a Simulated Setting -- Research of Interactive Gesture Usability of Navigation Application Based on Intuitive Interaction -- Gesture-based interaction: Visual gesture mapping -- The Potential of Gesture-Based Interaction -- Detecting Gestures through a Gesture-Based Interface to Teach Introductory Programming Concepts -- A Mouth Gesture Interface Featuring a Mutual-Capacitance Sensor Embedded in a Surgical Mask -- Speech, Voice, Conversation and Emotions -- The Effects of Body Gestures and Gender on Viewer's Perception of Animated Pedagogical Agent's Emotions -- Integrating Language and Emotion Features for Multilingual Speech Emotion Recognition -- A new approach to measure user experience with voice-controlled intelligent assistants: A pilot study -- Comparing the user preferences towards emotional voice interaction applied on different devices: An empirical study -- Research on Interaction Design of Artificial Intelligence Mock Interview Application Based on Goal-directed Design Theory -- The Effect of Personal Pronouns on Users' Emotional Experience in Voice Interaction -- The Effect of Naturalness of Voice and Empathic Responses on Enjoyment, Attitudes and Motivation for Interacting with a Voice User Interface -- Impression Detection and Management Using an Embodied Conversational Agent -- Expectation and Reaction as Intention for Conversation System -- Augmented Tension Detection in Communication: Insights from Prosodic and Content Features -- How to Design the Expression Ways of Conversational Agents Based on Affective Experience -- Deep Learning-based Emotion Recognition from Real-Time Videos -- Multimodal Interaction -- Designing An AI-Companion to Support the Driver in Highly Autonomous Cars -- SilverCodes: Thin, Flexible, and Single-Line Connected Identifiers Inputted by Swiping with a Finger -- A Defocus Based Novel Keyboard Design -- Affective Haptics and Multimodal Experiments Research -- Recent Multimodal Communication Methodologies in Phonology, Vision, and Touch -- A Framework of Input Devices to Support Designing Composite Wearable Computers -- Introducing Mobile Device-Based Interactions to Users: An Investigation of Onboarding Tutorials -- Multimodal Analysis of Preschool Children's Embodied Interaction with a Tangible Programming Environment -- Identification Method of Digits for Expanding Touchpad Input --
FingerTalkie: Designing A Low-cost Finger-worn Device for Interactive Audio Labeling of Tactile Diagrams -- A Virtual Mouse Interface for Supporting Multi-User Interactions -- Floating Hierarchical Menus for Swipe-based Navigation on Touchscreen Mobile Devices -- Touch Position Detection on the Front of Face Using Passive High-functional RFID Tag with Magnetic Sensor -- Human Robot Interaction -- One-hand Controller for Human-Drone Interaction - a Human-centered Prototype Development -- Sexual Robots: the Social-Relational Approach and the Concept of Subjective Reference -- Theses on the Future Design of Human-Robot Collaboration -- Trust on Service Robots: A Pilot Study on the Influence of Eyes in Humanoid Robots during a VR Emergency Egress -- Modelling the Collaboration of a Patient and an Assisting Humanoid Robot during Training Tasks -- Multi-Human Management of Robotic Swarms -- The Current Status and Challenges in Augmented-Reality Navigation System for Robot-Assisted Laparoscopic Partial Nephrectomy -- Database Semantics for Talking Autonomous Robots -- Emotion Synchronization Method for Robot Facial Expression -- Human-Robot Interaction in Health Care: Focus on Human Factors -- Evaluating a Mouse-based and a Tangible Interface Used for Operator Intervention on two Autonomous Robots -- On positive effect on humans by poor operability of robot -- Human-Drone Interaction: Using Pointing Gesture to Define a Target Object -- Enhancing Drone Pilots' Engagement Through a Brain-Computer Interface -- The Effects of Different Robot Trajectories on Situational Awareness in Human-Robot Collaboration.
520 _aThe three-volume set LNCS 12181, 12182, and 12183 constitutes the refereed proceedings of the Human Computer Interaction thematic area of the 22nd International Conference on Human-Computer Interaction, HCII 2020, which took place in Copenhagen, Denmark, in July 2020.* A total of 1439 papers and 238 posters were accepted for publication in the HCII 2020 proceedings from 6326 submissions. The 145 papers included in these HCI 2020 proceedings were organized in topical sections as follows: Part I: design theory, methods and practice in HCI; understanding users; usability, user experience and quality; and images, visualization and aesthetics in HCI. Part II: gesture-based interaction; speech, voice, conversation and emotions; multimodal interaction; and human robot interaction. Part III: HCI for well-being and Eudaimonia; learning, culture and creativity; human values, ethics, transparency and trust; and HCI in complex environments. *The conference was held virtually due to the COVID-19 pandemic.
650 0 _aUser interfaces (Computer systems).
_911681
650 0 _aHuman-computer interaction.
_96196
650 0 _aArtificial intelligence.
_93407
650 0 _aComputers, Special purpose.
_946653
650 0 _aSoftware engineering.
_94138
650 0 _aComputer vision.
_991301
650 1 4 _aUser Interfaces and Human Computer Interaction.
_931632
650 2 4 _aArtificial Intelligence.
_93407
650 2 4 _aSpecial Purpose and Application-Based Systems.
_946654
650 2 4 _aSoftware Engineering.
_94138
650 2 4 _aComputer Vision.
_991302
700 1 _aKurosu, Masaaki.
_eeditor.
_4edt
_4http://id.loc.gov/vocabulary/relators/edt
_991303
710 2 _aSpringerLink (Online service)
_991304
773 0 _tSpringer Nature eBook
776 0 8 _iPrinted edition:
_z9783030490614
776 0 8 _iPrinted edition:
_z9783030490638
830 0 _aInformation Systems and Applications, incl. Internet/Web, and HCI,
_x2946-1642 ;
_v12182
_991305
856 4 0 _uhttps://doi.org/10.1007/978-3-030-49062-1
912 _aZDB-2-SCS
912 _aZDB-2-SXCS
912 _aZDB-2-LNC
942 _cELN
999 _c86634
_d86634