Multi-sensory vehicle-mounted interaction method and system based on multi-modal analysis

An interactive method and system technology, applied in neural learning methods, vehicle components, and character and pattern recognition, which addresses problems such as prior approaches failing to take human body state factors into account, and achieves a self-optimizing effect.

Active Publication Date: 2022-05-27
SHANGHAI JIAO TONG UNIV
7 Cites · 0 Cited by

AI Technical Summary

Problems solved by technology

The prior technology does not consider human body state factors and therefore cannot fully and accurately reflect the characteristics of the human body state.

[0004] The Chinese patent with application number CN201480019946.0 discloses a "control method for vehicle functional components used to generate different multi-sensory environments in vehicles". This technology can control the sound, lighting, and fragrance in the car, but it only controls each component according to preset scenes, without real-time active interaction or personalized adjustment functions.

[Figure: Multi-sensory vehicle-mounted interaction method and system based on multi-modal analysis]

Embodiment Construction

[0050] The present invention will be described in detail below with reference to specific embodiments. The following examples will help those skilled in the art to further understand the present invention, but do not limit the present invention in any form. It should be noted that, for those skilled in the art, several changes and improvements can be made without departing from the inventive concept. These all belong to the protection scope of the present invention.

[0051] Referring to figure 1, the invention discloses a multi-sensory vehicle-mounted interaction method based on multi-modal analysis, comprising the following steps:

[0052] Step S1: Capture and store the multi-modal source data of the driver in real time through the on-board camera, microphone, and temperature and humidity sensor.

[0053] The vehicle-mounted camera captures:

[0054] - Eye data: including the number of blinks and the distance between the upper and lower eyelids;

[0055] - Facial expressions: includi...
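
The embodiment text is truncated above, but step S1 amounts to capturing one synchronized record per time window from the camera, microphone, and temperature and humidity sensor. A minimal sketch of such a record follows; the class and field names are illustrative assumptions, since the patent names only the sensors and the eye features listed above.

```python
from dataclasses import dataclass

@dataclass
class MultiModalFrame:
    """One synchronized sample of driver source data (step S1).

    All names here are illustrative assumptions; the patent specifies
    only the sensors (camera, microphone, temperature/humidity sensor)
    and the eye features listed above.
    """
    blink_count: int        # number of blinks in the sampling window
    eyelid_distance: float  # distance between upper and lower eyelids
    face_image: bytes       # face crop for expression analysis
    audio_chunk: bytes      # microphone samples for speech/emotion cues
    temperature_c: float    # cabin temperature
    humidity_pct: float     # cabin relative humidity
```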

Abstract

The invention provides a multi-sensory vehicle-mounted interaction method and system based on multi-modal analysis. The method comprises the following steps: capturing and storing multi-modal source data of a driver in real time through a vehicle-mounted camera, a microphone, and a temperature and humidity sensor; analyzing the multi-modal source data in real time and extracting feature vectors from it; splicing the feature vectors and converting them to the same dimension to obtain processed data; inputting the processed data into a BP neural network for training and judging the real-time state of the driver; and actively providing a corresponding interaction service to the driver according to that real-time state. By adopting multi-modal information processing, the real-time state of the driver can be comprehensively judged, active interaction services can be provided, and the accuracy of understanding the user's emotion and intention is improved; self-optimization of the vehicle-mounted interaction system is realized through the BP neural network training model.
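
Read as a pipeline, the abstract describes late fusion followed by a classifier: per-modality feature vectors are mapped to a common dimension and spliced into one vector (one plausible reading of the abstract's "converting to the same dimension" and "splicing"), which then feeds a backpropagation-trained network (the "BP neural network") that outputs the driver's real-time state. The sketch below illustrates that flow under stated assumptions: the modality dimensions, layer sizes, and number of driver states are all invented for illustration, as the abstract does not specify them.

```python
import torch
import torch.nn as nn

class BPStateClassifier(nn.Module):
    """Late-fusion MLP trained by backpropagation ("BP neural network").

    All dimensions and the number of driver states are assumptions;
    the patent abstract does not disclose them.
    """

    def __init__(self, modal_dims=(64, 128, 8), common_dim=32, n_states=4):
        super().__init__()
        # One linear projection per modality converts each feature
        # vector to the same dimension before splicing.
        self.projections = nn.ModuleList(
            nn.Linear(d, common_dim) for d in modal_dims
        )
        self.mlp = nn.Sequential(
            nn.Linear(common_dim * len(modal_dims), 64),
            nn.ReLU(),
            nn.Linear(64, n_states),  # logits over driver states
        )

    def forward(self, features):
        # features: one tensor per modality (vision, audio, climate).
        common = [proj(f) for proj, f in zip(self.projections, features)]
        fused = torch.cat(common, dim=-1)  # splice into one vector
        return self.mlp(fused)

# Usage with dummy feature vectors for a single driver sample.
model = BPStateClassifier()
vision = torch.randn(1, 64)   # e.g. eye/expression features
audio = torch.randn(1, 128)   # e.g. speech/emotion features
climate = torch.randn(1, 8)   # e.g. temperature/humidity features
state_logits = model([vision, audio, climate])
```

Training such a network with backpropagation against labeled driver states (e.g. with a cross-entropy loss) is what would give the system the self-optimization property the abstract claims.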

Description

Technical field

[0001] The invention relates to the technical field of human-computer interaction, and in particular to a multi-sensory vehicle-mounted interaction method and system based on multi-modal analysis.

Background technique

[0002] In the stage of human-machine co-driving, multi-sensory channel fusion and interaction will establish a new interactive experience between humans and machines. By collecting and analyzing human expressions, voice, temperature, humidity, and other signals, the user's emotional state and intention can be comprehensively judged, and passive interaction can be transformed into active interaction. At the same time, communicating with users through sight, voice, smell, and touch can significantly improve the driving experience.

[0003] The Chinese patent with application number CN201910764559.4 discloses "an adaptive multi-sensory sleep assistance system based on artificial intelligence", which can judge the sleep state and environmental in...

Application Information

Patent Type & Authority: Application (China)
IPC(8): B60W40/08; B60W50/08; G06V20/59; G06K9/62; G06N3/04; G06N3/08
CPC: B60W40/08; B60W50/08; G06N3/04; G06N3/084; B60W2540/22; G06F18/25
Inventors: 冯捷, 张峻玮, 孙雪雯, 张兴国, 董占勋, 李亚鸿
Owner: SHANGHAI JIAO TONG UNIV