
Human-Centered Intelligent Systems Lab.

Human-Vehicle Interaction

High-Fidelity Simulation of Autonomous Driving with VR/MR/Motion Support (A.I. Basic Research)


  • Goal

    • Build a first-person-view virtual driving system as a basis for further Human-Vehicle Interaction research

  • Characteristic

    • Use the A3 motion platform system for a realistic driving experience

    • Build the virtual environment as a full map of the GIST campus using the Unity engine

    • Build environmental elements the driver interacts with, such as day/night changes and roadside objects

    • Integrate eye tracking and physiological sensors to monitor the driver's condition (see the sketch below)
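
As a rough illustration of the driver-monitoring idea, the sketch below fuses eye-tracking and physiological samples into a coarse attentiveness flag. The sample fields, thresholds, and classification rule are assumptions made for illustration, not the lab's actual pipeline.

```python
# Minimal sketch (not the lab's actual pipeline): fuse eye-tracking and
# physiological samples into a coarse driver-condition flag. Field names
# and thresholds here are hypothetical.
from dataclasses import dataclass

@dataclass
class Sample:
    timestamp: float      # seconds since session start
    gaze_on_road: bool    # eye tracker: is the gaze inside the road area of interest?
    heart_rate: float     # physiological sensor: beats per minute

def driver_condition(window: list[Sample]) -> str:
    """Classify a short window of samples as 'attentive' or 'distracted'."""
    if not window:
        return "unknown"
    on_road_ratio = sum(s.gaze_on_road for s in window) / len(window)
    mean_hr = sum(s.heart_rate for s in window) / len(window)
    # Illustrative rule only: low road gaze or unusually low arousal
    # is treated as a sign of reduced attentiveness.
    if on_road_ratio < 0.6 or mean_hr < 55:
        return "distracted"
    return "attentive"

if __name__ == "__main__":
    window = [Sample(t * 0.1, t % 4 != 0, 72.0) for t in range(50)]
    print(driver_condition(window))  # -> 'attentive'
```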

VROOM & MROOM


  • Concept

    • Human-Vehicle Interaction experiments that run a virtual reality (VROOM) or mixed reality (MROOM) driving simulation inside a real vehicle

  • Characteristic

    • Realism: participants feel the movement of the actual car

    • Reliability: data collection is more reliable than in indoor laboratory experiments

    • Autonomous driving effect: achieved simply through Wizard-of-Oz (WoZ) driving by the experimenter

  • Benefits

  • System Concept

    • Real-road autonomous driving simulator combining VR/MR simulation with field experiments (see the synchronization sketch below)
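
The sketch below illustrates the system concept under stated assumptions: the experimenter drives the real vehicle (WoZ), and the virtual scene is advanced from real-vehicle telemetry so that the visuals match the motion the participant physically feels. The telemetry values and the pose-integration details are hypothetical placeholders, not the actual VROOM/MROOM software.

```python
# Minimal sketch of the VROOM/MROOM idea: the experimenter drives the real
# vehicle (Wizard-of-Oz), and the virtual scene shown in the headset is
# advanced from the real vehicle's state so that visuals match the motion
# the participant physically feels. Telemetry access and rendering are
# placeholders, not a real API.
import math
from dataclasses import dataclass

@dataclass
class VirtualPose:
    x: float = 0.0        # meters, world frame of the virtual GIST map
    y: float = 0.0
    heading: float = 0.0  # radians

def integrate_pose(pose: VirtualPose, speed_mps: float,
                   yaw_rate_rps: float, dt: float) -> VirtualPose:
    """Dead-reckon the virtual vehicle pose from real-vehicle telemetry."""
    heading = pose.heading + yaw_rate_rps * dt
    return VirtualPose(
        x=pose.x + speed_mps * math.cos(heading) * dt,
        y=pose.y + speed_mps * math.sin(heading) * dt,
        heading=heading,
    )

if __name__ == "__main__":
    pose = VirtualPose()
    # Placeholder telemetry: 10 m/s straight, then a gentle left turn.
    telemetry = [(10.0, 0.0)] * 20 + [(10.0, 0.1)] * 20
    for speed, yaw_rate in telemetry:
        pose = integrate_pose(pose, speed, yaw_rate, dt=0.05)
    print(f"virtual pose: x={pose.x:.1f} m, y={pose.y:.1f} m")
```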

Case Study 1: Driver-Aware Disengagement in Semi-Autonomous Driving Situations (HCIK 2018, A.I. Basic Research)

Real-time discrimination of the driver's task load → classification of the driver's attentiveness level and condition → suggestion of situation-based hand-over notification interface design guidelines for highly reliable control of autonomous vehicles

  • Procedure

    • Survey of 100 people (age M = 32.66, SD = 11.20, range 21–64; 70% male, 30% female)

    • Analysis of hand-over method guidelines in terms of the driver's perception and task load for each possible situation in an autonomous vehicle

    • Suggestion of a novel hand-over HMI (Human-Machine Interface) environment and methodology according to the driver's condition, based on a review of more than 10 hand-over studies

  • Result

    • Evaluation of the task load imposed by hand-over notifications according to the driver's condition when driving control is handed back to the driver → suggestion of notification interface design guidelines that decrease task load and increase the driver's take-over speed

    • Hand-over notification result, in order: sound → vibration → visual (an escalation sketch follows this list)

    • Drivers prefer to be notified through devices they are already using (e.g. phone, tablet)
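
A minimal sketch of how the reported modality order could be applied as an escalation sequence for take-over requests. The device names, timings, and response callback are illustrative assumptions, not part of the study.

```python
# Minimal sketch (illustrative only): escalate a take-over request through
# the modalities in the order reported above (sound -> vibration -> visual),
# routed to the device the driver is currently using. Device names and
# timings are hypothetical, not from the study.
import time

ESCALATION_ORDER = ["sound", "vibration", "visual"]

def notify(modality: str, device: str) -> None:
    # Placeholder for the actual notification call on each device.
    print(f"[take-over request] {modality} cue on {device}")

def request_take_over(active_device: str, driver_responded,
                      step_seconds: float = 2.0) -> bool:
    """Issue escalating hand-over notifications until the driver responds."""
    for modality in ESCALATION_ORDER:
        notify(modality, active_device)
        time.sleep(step_seconds)
        if driver_responded():
            return True
    return False  # fall back to a safe-stop strategy (out of scope here)

if __name__ == "__main__":
    taken_over = request_take_over("tablet", driver_responded=lambda: False,
                                   step_seconds=0.1)
    print("driver took over:", taken_over)
```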

Case Study 2: User-Centric Intelligibility of Autonomous Vehicles (HCIK 2018, A.I. Basic Research)

Exploring which modality positively influences the driver's attitude when a combination of a 'Why' message (e.g. construction site) and a 'How' message (e.g. decrease driving speed) is provided according to the vehicle's behavior in an autonomous driving situation

  • Procedure

    • Design of contextual message types, scenario-specific modality distribution, and different messages by notification type (a message-table sketch follows this list)

    • Delivery of the messages on an autonomous HVI (Human-Vehicle Interaction) test bed according to four kinds of situations and scenarios

    • Interim questionnaires to measure the driver's attitude (the driver's feelings and perceived vehicle reliability)

    • Measurement of eye-tracking and physiological data
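
A minimal sketch of how the 'Why'/'How' message pairs and their modalities could be tabulated per scenario. Only the construction-site example comes from the description above; the second scenario and the modality assignments are hypothetical placeholders.

```python
# Minimal sketch of the message design: each scenario pairs a 'Why' message
# with a 'How' message and a presentation modality. Only the construction-
# site example comes from the study description; the other entry and the
# modality assignments are illustrative placeholders.
from typing import NamedTuple, Optional

class VehicleMessage(NamedTuple):
    why: str            # reason for the vehicle's behavior
    how: Optional[str]  # what the vehicle will do (may be omitted)
    modality: str       # 'visual', 'auditory', or 'visual+auditory'

SCENARIOS = {
    "construction_site": VehicleMessage(
        why="Construction site ahead",
        how="Decreasing driving speed",
        modality="visual+auditory",
    ),
    "why_only_example": VehicleMessage(  # hypothetical no-'How' condition
        why="Obstacle detected ahead",
        how=None,
        modality="visual",
    ),
}

def render(msg: VehicleMessage) -> str:
    """Compose the notification text actually shown/spoken to the driver."""
    parts = [msg.why] + ([msg.how] if msg.how else [])
    return f"({msg.modality}) " + ". ".join(parts)

if __name__ == "__main__":
    for name, msg in SCENARIOS.items():
        print(name, "->", render(msg))
```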

  • Result

    • The driver's attitude index is low when the 'How' message is not provided → negative emotions and low credibility

    • The 'How' message shows the most positive response when it is given as a visual + auditory message

    • Eye tracking is less active despite the larger amount of information (visual > visual + auditory) → positive attitude results are obtained without much additional visual load (one way to quantify gaze activity is sketched below)
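
One plausible way to quantify "eye-tracking activity" is sketched below by counting large gaze shifts between consecutive samples. The metric, threshold, and sample format are assumptions, since the study's exact measure is not described here.

```python
# Minimal sketch of one way 'eye-tracking activity' could be quantified:
# count gaze shifts whose displacement exceeds a threshold (a rough proxy
# for saccades). The metric, threshold, and sample format are assumptions.
from typing import List, Tuple

Gaze = Tuple[float, float]  # normalized (x, y) gaze position on the scene

def gaze_shift_count(samples: List[Gaze], threshold: float = 0.05) -> int:
    """Count consecutive-sample gaze displacements larger than `threshold`."""
    shifts = 0
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        if ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 > threshold:
            shifts += 1
    return shifts

if __name__ == "__main__":
    steady = [(0.5, 0.5)] * 30                         # gaze resting on one spot
    scanning = [((i * 0.1) % 1.0, 0.5) for i in range(30)]  # gaze sweeping the scene
    print(gaze_shift_count(steady), gaze_shift_count(scanning))  # -> 0 and 29
```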

Case Study 3: Multi-modal NUI (Natural User Interfaces) with a Large-scale AR HUD in Vehicles (HCIK 2018, KETI)

Development of a multi-modal NUI (Natural User Interface) based on voice commands and touch gestures to improve the user experience in the vehicle

  • Procedure

    • Preliminary online survey of 158 people (109 male, 49 female)

      • Controlling the air conditioner and the music player were chosen as in-vehicle infotainment tasks

    • Prototyped a steering wheel with touch gesture pads placed at the center of the wheel and at the thumb positions of both hands

      • Gestures: center double tap (voice recognition), center single long tap (selection: play, on, off), thumb-pad swipe on either side (choice: previous, next, temperature level); a gesture-mapping sketch follows this list

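A minimal sketch of the gesture-to-command mapping described above. The event names and dispatch logic are illustrative assumptions rather than the prototype's actual software interface.

```python
# Minimal sketch of the gesture-to-command mapping described above. The
# gesture event names and the command strings are illustrative; the
# prototype's actual software interface is not shown here.
GESTURE_COMMANDS = {
    "center_double_tap": "start_voice_recognition",
    "center_long_tap": "select",       # play / on / off for the current item
    "thumb_swipe_left": "previous",    # previous track / lower temperature level
    "thumb_swipe_right": "next",       # next track / higher temperature level
}

def handle_gesture(gesture: str, target: str) -> str:
    """Resolve a touch-pad gesture into an infotainment command string."""
    command = GESTURE_COMMANDS.get(gesture)
    if command is None:
        return f"ignored unknown gesture: {gesture}"
    return f"{target}: {command}"

if __name__ == "__main__":
    print(handle_gesture("center_double_tap", "music_player"))
    print(handle_gesture("thumb_swipe_right", "air_conditioner"))
```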

  • Result

    • The motion platform and virtual driving simulation were used to build a realistic driving environment and evaluate the usability of the developed interfaces

    • Workload measured with NASA-TLX and analyzed with a paired t-test showed that participants' subjective workload was significantly lower for the prototype than for the central console (an analysis sketch follows this list)

      • This verifies the usability of vehicle interfaces that support voice commands and touch-gesture-based multi-modal interaction.

    • Usability data analysis showed that overall usability was high
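
A minimal sketch of the workload comparison, assuming one overall NASA-TLX score per participant and condition; the scores below are synthetic placeholders, not study data (requires SciPy).

```python
# Minimal sketch of the workload comparison: a paired t-test on NASA-TLX
# scores for the same participants using the prototype vs. the central
# console. The scores below are synthetic placeholders, not study data.
from scipy import stats

# One overall NASA-TLX score per participant and condition (0-100 scale).
prototype_tlx = [32, 41, 28, 35, 30, 44, 38, 29, 36, 33]
console_tlx   = [55, 62, 47, 58, 49, 66, 60, 51, 57, 54]

t_stat, p_value = stats.ttest_rel(prototype_tlx, console_tlx)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < .05) would indicate significantly lower
# subjective workload for the prototype, as reported above.
```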
