107 results about "Behavioral state" patented technology

System and Method for Visual Analysis of Emotional Coherence in Videos

A computer-implemented method and system for processing a video signal. The method comprises the steps of: detecting a human face displayed in the video signal and extracting physiological, biological, or behavior state information from the displayed face at a first level of granularity of the video signal; processing any two or more of: (i) a script derived from or associated with the video signal to extract language tone information from said script at a first level of granularity of the script; (ii) an audio signal derived from or associated with the video signal to derive behavior state information from said audio signal at a first level of granularity of the audio signal; (iii) a video image derived from the video signal to detect one or more human gestures of the person whose face is displayed in the video signal; and merging said physiological, biological, or behavior state information extracted from the displayed face in the video signal with any two or more of: (i) the language tone information extracted from the script; (ii) the behavior state information derived from the audio signal; and (iii) the one or more human gestures derived from the video image, wherein the merging step is based on behavior state categories and/or levels of granularity.
Owner:BLUE PLANET TRAINING INC
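A minimal sketch of the merging step described above, assuming per-modality behavior-state scores aligned on shared state categories and combined with fixed weights. The category names, weights, and function names are illustrative assumptions, not the patent's actual method:

```python
# Illustrative merge of per-modality behavior-state scores.
# Categories, weights, and names are assumptions for this sketch.

CATEGORIES = ["joy", "anger", "neutral"]

def merge_states(face, script, audio, weights=(0.5, 0.25, 0.25)):
    """Each input maps category -> score at a common granularity level.
    Returns the dominant merged behavior state and the merged scores."""
    merged = {}
    for cat in CATEGORIES:
        merged[cat] = (weights[0] * face.get(cat, 0.0)
                       + weights[1] * script.get(cat, 0.0)
                       + weights[2] * audio.get(cat, 0.0))
    return max(merged, key=merged.get), merged

state, scores = merge_states(
    face={"joy": 0.8, "neutral": 0.2},
    script={"joy": 0.4, "anger": 0.1},
    audio={"neutral": 0.9},
)
```

In this sketch the face modality is weighted highest, reflecting the abstract's emphasis on the displayed face as the primary source that the other modalities are merged into.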

Matching method of real-time three-dimensional visual virtual monitoring for intelligent manufacturing

Active | CN106157377A | Improve the real-time performance of actions | Increase or decrease the speed of movement | Image data processing | Real-time data | Behavioral state
The invention discloses a matching method of real-time three-dimensional visual virtual monitoring for intelligent manufacturing. According to the method, three-dimensional virtual monitoring of an intelligent manufacturing workshop is realized by virtue of behavior state matching under compound feed-forward/feedback control and by virtue of real-time data. The method comprises the following steps: first decomposing the action sequences of the various monitored intelligent devices in a workshop, ranking the action sequences according to an operational logic, predicting the next action of a virtual model by virtue of a feed-forward control method, and carrying out optimization matching of the state data fed back by the intelligent device in real time against the predicted action, thus obtaining the current movement instruction and driving the three-dimensional model. By adopting the method, the real-time performance and accuracy of model actions in three-dimensional visual virtual monitoring can be effectively improved, the virtual scene is optimized, and the real-time performance and real-scene feeling of three-dimensional virtual monitoring of the manufacturing workshop are improved.
Owner:NANJING UNIV OF AERONAUTICS & ASTRONAUTICS
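The feed-forward/feedback compound matching described above can be sketched as follows. The action names and the matching rule (advance only when feedback confirms the current action has completed) are illustrative assumptions:

```python
# Sketch of feed-forward prediction reconciled with real-time feedback.
# Sequence contents and the matching rule are assumptions for illustration.

def next_command(action_sequence, step, feedback_state):
    """Predict the next action in the ranked sequence (feed-forward), then
    keep it only if the device's fed-back state confirms the current action
    has completed; otherwise re-drive the current action."""
    predicted = action_sequence[min(step + 1, len(action_sequence) - 1)]
    if feedback_state == action_sequence[step]:   # current action confirmed
        return predicted                          # advance the virtual model
    return action_sequence[step]                  # hold until feedback matches

seq = ["clamp", "feed", "cut", "release"]
```

For example, with the device reporting that "feed" has completed at step 1, the model is driven to "cut"; with stale feedback it stays on "feed", which is how the feedback path keeps the virtual model from running ahead of the real device.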

Fault-recording-based protection action information analyzing method

The invention relates to a fault-recording-based protection action information analyzing method, which comprises the following steps: a configuration model is built, taking primary equipment as the basis, that associates the primary equipment, the secondary equipment related to it, and the protection device; the protection action behavior states and sequential relationships of the protection device, the primary equipment, and the related secondary equipment are determined, and the protection action behavior states of the protection device are described with time as the axis; on the basis of these behavior states and sequential relationships, in combination with protection action behavior logic and the related setting-value information from the actual process, the protection action behavior of the protection device is judged and the fault process is reconstructed; the results are comprehensively analyzed, and a protection action report is formed. The method obtains its data from fault recording, provides a new means and method for protection action analysis in a power grid, and provides a reference basis for judging protection action behavior after an accident.
Owner:STATE GRID ANHUI ELECTRIC POWER +2
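The time-axis description and sequence judgment in the abstract can be sketched as below. The device names, timestamps, and expected order are illustrative assumptions, not values from any real relay standard or recording:

```python
# Sketch: order recorded protection actions on a time axis and check them
# against an expected action sequence. All event data are assumptions.

def check_sequence(events, expected_order):
    """events: list of (time_ms, device, action) taken from fault recording.
    Returns the events sorted on the time axis and a pass/fail verdict on
    whether the devices acted in the expected order."""
    timeline = sorted(events, key=lambda e: e[0])
    actions = [device for _, device, _ in timeline]
    return timeline, actions == expected_order

events = [(35, "breaker", "trip"),
          (20, "protection", "pickup"),
          (60, "reclose", "attempt")]
timeline, ok = check_sequence(events, ["protection", "breaker", "reclose"])
```

A mismatch between the time-ordered actions and the expected sequence would flag the protection behavior for the comprehensive analysis step that produces the report.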

Method for verifying Cache coherence protocol and multi-core processor system

The invention provides a method for verifying a Cache coherence protocol and a multi-core processor system. The method comprises the following steps: a plurality of queues are arranged in a monitor; each queue comprises a plurality of units; the units are used for recording all primary requests that are not yet completely processed; all requests relevant to the same address are stored in order in the units of the same queue, according to the sequence in which the requests enter the coherence processing element; and each unit independently tracks the execution status of the request it records. Because requests to the same memory-access address are processed in sequence under the Cache coherence protocol, the monitor can accurately track the protocol-level behavior of the Cache coherence processing element, and the behavior of every request packet can be monitored accurately. By adjusting the content of the monitor, the method is suitable for verifying various coherence protocols.
Owner:JIANGNAN INST OF COMPUTING TECH
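A minimal sketch of the monitor structure described above: one FIFO queue per address holds outstanding requests in arrival order, and each entry tracks its own status, so an out-of-order completion for the same address is caught as a protocol violation. Class and method names are illustrative assumptions:

```python
# Sketch of a per-address FIFO monitor for same-address ordering.
# Names and the violation check are assumptions for illustration.

from collections import defaultdict, deque

class CoherenceMonitor:
    def __init__(self):
        self.queues = defaultdict(deque)   # address -> requests in arrival order

    def issue(self, addr, req_id):
        """Record a primary request entering the coherence processing element."""
        self.queues[addr].append({"id": req_id, "done": False})

    def complete(self, addr, req_id):
        """Same-address requests must retire in order; anything else is a
        protocol-level violation."""
        q = self.queues[addr]
        if not q or q[0]["id"] != req_id:
            raise AssertionError(f"out-of-order completion at {addr:#x}")
        q.popleft()

mon = CoherenceMonitor()
mon.issue(0x40, "r1")
mon.issue(0x40, "r2")
mon.complete(0x40, "r1")   # in order: accepted
```

Requests to different addresses land in different queues, so only same-address ordering is constrained, matching the abstract's per-address sequencing property.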

Virtual reality system and method for animal experiment

The invention discloses a virtual reality system and method for animal experiments. The system comprises a local driving module, a local information acquisition module, a mobile driving module, a mobile information acquisition module and a server module, wherein the local driving module is used for applying a stimulus to an experimental animal; the local information acquisition module is used for acquiring the physiological and behavior states of the experimental animal; the mobile driving module is used for driving a mobile platform to move; the mobile information acquisition module is used for acquiring a surrounding signal; and the server module is used for generating a control instruction for the mobile driving module according to the physiological and behavior states of the experimental animal so as to drive the mobile platform to move, and for generating a control instruction for the local driving module according to the surrounding signal so as to apply the stimulus to the experimental animal. According to the system disclosed by the embodiment of the invention, a behavioral experimental paradigm that effectively and rapidly fixes an animal is created; the functions of existing measurement instruments can be economically and efficiently expanded; and the system can measure physiological parameters of a moving mouse with high accuracy.
Owner:TSINGHUA UNIV
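The closed loop run by the server module can be sketched as below. The state fields, signal names, and stimulus choices are illustrative assumptions, not the system's actual signals:

```python
# Sketch of the server module's closed loop: animal state -> platform
# command; surrounding signal -> local stimulus. Names are assumptions.

def server_step(behavior_state, surrounding_signal):
    """One control cycle: map the animal's behavior state to a mobile-platform
    command, and the platform's surroundings to a local stimulus (or None)."""
    platform_cmd = "move_forward" if behavior_state.get("running") else "stop"
    stimulus = "visual_flash" if surrounding_signal.get("obstacle") else None
    return platform_cmd, stimulus

cmd, stim = server_step({"running": True}, {"obstacle": True})
```

The two halves of the loop are deliberately crossed, as in the abstract: the animal's state drives the platform, while the platform's environment drives the stimulus delivered back to the animal.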

Behavior prediction processing method and device and electronic equipment

The invention provides a behavior prediction processing method and device and electronic equipment, and relates to the technical field of computer data processing. The method comprises the following steps: acquiring a plurality of monitoring images corresponding to different moments from a monitoring video, and the flight information corresponding to the time periods of the monitoring video; determining, from the plurality of monitoring images, the macroscopic features and microscopic features of the persons in a preset scene model, wherein the macroscopic features comprise the position information of the persons in the preset scene model, the number of persons in the preset scene model, the crowd density, and the movement speed and movement direction of the persons, and the microscopic features comprise texture information in the monitoring images; and inputting the distribution information, together with the flight information, into a preset probability transfer model, and obtaining the prediction result output by the preset probability transfer model. Prediction of the passenger behavior state is thereby achieved, which can solve the prior-art technical problem that abnormal conditions cannot be prevented and processed in time via a monitoring video.
Owner:THE SECOND RES INST OF CIVIL AVIATION ADMINISTRATION OF CHINA
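A probability transfer model of the kind named above can be sketched as a Markov-style transition step conditioned on flight status. The passenger states, transition probabilities, and flight-status keys are illustrative assumptions, not the patent's trained model:

```python
# Sketch of one probability-transfer step: multiply the current crowd-state
# distribution by a flight-status-conditioned transition matrix.
# All states and probabilities below are illustrative assumptions.

STATES = ["queueing", "walking", "crowding"]

# TRANSITION[status][i][j] = P(next state j | current state i); rows sum to 1.
TRANSITION = {
    "delay":   [[0.5, 0.2, 0.3], [0.3, 0.4, 0.3], [0.2, 0.2, 0.6]],
    "on_time": [[0.6, 0.3, 0.1], [0.2, 0.7, 0.1], [0.3, 0.4, 0.3]],
}

def predict(distribution, flight_status):
    """One prediction step: new_j = sum_i dist_i * T[i][j]."""
    t = TRANSITION[flight_status]
    return [sum(distribution[i] * t[i][j] for i in range(len(STATES)))
            for j in range(len(STATES))]

dist = predict([0.5, 0.3, 0.2], "delay")
```

Conditioning the transition matrix on flight information is one plausible reading of "based on the flight information"; a rise in the predicted "crowding" mass under a delayed flight is the kind of output that would flag an abnormal condition early.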