
Impression degree extraction apparatus and impression degree extraction method

A technology relating to impression degree extraction devices and methods, applied in psychological devices, TV, sports accessories, etc.

Status: Inactive · Publication Date: 2011-05-25
PANASONIC CORP

AI Technical Summary

Problems solved by technology

However, when such long-term shooting is performed, selecting the parts that are important to the user from the large amount of recorded video data becomes a major problem.



Examples


Embodiment 1

[0043] FIG. 1 is a block diagram of a content editing device including the impression degree extracting device according to Embodiment 1 of the present invention. This embodiment is an example of a device suitable for shooting video with a body-worn camera at an amusement park or tourist site and editing the captured video (hereinafter simply referred to as "experience video content").

[0044] As shown in FIG. 1, the content editing device 100 is roughly divided into an emotion information generating unit 200, an impression degree extracting unit 300, and an experience video content obtaining unit 400.

[0045] The emotion information generating unit 200 generates, from the user's biometric information, emotion information representing the emotion that has arisen in the user. Here, "emotion" refers not only to feelings such as joy, anger, sorrow, and pleasure, but to the user's whole mental state, including states such as relaxation. E...
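
As a rough illustration of the structure described above, the following sketch is hypothetical: the class names, the biometric fields, and the mapping from biometric signals to emotion values are all assumptions for illustration, not taken from the patent text. It only shows the general idea of an emotion information generating unit deriving emotion information from biometric measurements.

```python
from dataclasses import dataclass

@dataclass
class EmotionInformation:
    """Hypothetical emotion information; the field names are assumptions."""
    arousal: float   # intensity of the emotion, roughly 0.0 .. 1.0
    valence: float   # pleasantness of the emotion, roughly -1.0 .. 1.0

class EmotionInformationGeneratingUnit:
    """Sketch of unit 200: maps biometric information to emotion information."""

    def generate(self, heart_rate_bpm: float, skin_conductance_us: float) -> EmotionInformation:
        # Crude illustrative mapping: higher heart rate -> higher arousal.
        arousal = min(1.0, max(0.0, (heart_rate_bpm - 60.0) / 80.0))
        # Placeholder valence estimate; a real system would use richer signals.
        valence = min(1.0, max(-1.0, 0.5 - skin_conductance_us / 20.0))
        return EmotionInformation(arousal=arousal, valence=valence)

# Example: a calm reading vs. an excited reading.
unit_200 = EmotionInformationGeneratingUnit()
print(unit_200.generate(heart_rate_bpm=65, skin_conductance_us=2))    # low arousal
print(unit_200.generate(heart_rate_bpm=130, skin_conductance_us=12))  # high arousal
```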

Embodiment 2

[0230] As Embodiment 2 of the present invention, a case will be described in which the present invention is applied, on a wearable game terminal, to game content in which selective actions are performed. The wearable game terminal includes the impression degree extracting device of this embodiment.

[0231] FIG. 19 is a block diagram of a game terminal including the impression degree extracting device according to Embodiment 2 of the present invention, and corresponds to FIG. 1 of Embodiment 1. Parts identical to those in FIG. 1 are denoted by the same reference numerals, and their descriptions are omitted.

[0232] In FIG. 19, the game terminal 100a has a game content executing unit 400a in place of the experience video content obtaining unit 400 of FIG. 1.

[0233] The game content executing unit 400a executes game content in which selective actions are performed. The game content described here is a game in which the user virtually raises a pet, an...
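
Purely as an illustration, and not taken from the patent text, a selective action in such game content could branch on the impression degree; the threshold, function name, and action names below are assumptions.

```python
# Hypothetical sketch: branching a selective action in pet-raising game content
# on the impression degree; the threshold and action names are assumptions.
def select_pet_action(impression_degree: float, threshold: float = 1.0) -> str:
    if impression_degree >= threshold:
        return "pet_celebrates_with_user"   # strong impression -> special reaction
    return "pet_idles"                      # weak impression -> default behaviour

print(select_pet_action(1.4))  # pet_celebrates_with_user
print(select_pet_action(0.3))  # pet_idles
```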

Embodiment 3

[0242] As Embodiment 3 of the present invention, a case will be described in which the present invention is applied to editing the standby screen of a mobile phone. The mobile phone includes the impression degree extracting device of this embodiment.

[0243] FIG. 21 is a block diagram of a mobile phone including the impression degree extracting device according to Embodiment 3 of the present invention, and corresponds to FIG. 1 of Embodiment 1. Parts identical to those in FIG. 1 are denoted by the same reference numerals, and their descriptions are omitted.

[0244] In FIG. 21, the mobile phone 100b has a mobile phone unit 400b in place of the experience video content obtaining unit 400 of FIG. 1.

[0245] The mobile phone unit 400b realizes the functions of a mobile phone, including display control of the standby screen of a liquid crystal display (not shown). The mobile phone unit 400b has a screen design storage unit 410b and a screen design change unit ...
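
As a hedged sketch of how a screen design change unit might use the impression degree to switch the standby screen, the stored design names, the threshold, and the selection rule below are assumptions for illustration, not the patent's actual mechanism.

```python
# Hypothetical sketch of a standby-screen change driven by the impression degree;
# the stored designs and the threshold are assumptions for illustration only.
SCREEN_DESIGN_STORAGE = {            # stands in for screen design storage unit 410b
    "calm": "plain_background.png",
    "vivid": "highlight_photo_background.png",
}

def change_screen_design(impression_degree: float, threshold: float = 1.0) -> str:
    """Pick a vivid standby screen after strongly impressive periods, else a calm one."""
    key = "vivid" if impression_degree >= threshold else "calm"
    return SCREEN_DESIGN_STORAGE[key]

print(change_screen_design(1.4))  # highlight_photo_background.png
print(change_screen_design(0.2))  # plain_background.png
```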



Abstract

An impression degree extraction apparatus which precisely extracts an impression degree without imposing a particular strain on the user is disclosed. A content editing apparatus (100) comprises a measured emotion property acquiring section (341), which acquires measured emotion properties showing an emotion that occurred in the user during a measurement period, and an impression degree calculating part (340), which calculates the impression degree, a degree showing how strongly the user was impressed during the measurement period, by comparing the measured emotion properties with reference emotion properties showing an emotion that occurred in the user during a reference period. The impression degree calculating part (340) calculates the impression degree to be higher as the difference between the first emotion properties and the second emotion properties increases, with the second emotion properties serving as the reference.
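
To make the comparison described in the abstract concrete, here is a minimal sketch; the property names and the use of a simple absolute difference are assumptions for illustration, not the patent's actual formula. The impression degree simply grows with the deviation of the measured emotion properties from the reference emotion properties.

```python
# Minimal sketch of the comparison described above; the property names and the
# absolute-difference measure are assumptions for illustration only.
def impression_degree(measured: dict, reference: dict) -> float:
    """Higher the further the measured emotion deviates from the reference baseline."""
    return sum(abs(measured[k] - reference[k]) for k in reference)

reference_period = {"arousal": 0.2, "valence": 0.1}    # e.g. everyday baseline
measurement_period = {"arousal": 0.9, "valence": 0.8}  # e.g. during a ride

print(impression_degree(measurement_period, reference_period))  # 1.4 -> strong impression
```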

Description

Technical Field

[0001] The present invention relates to an impression degree extracting device and an impression degree extraction method for extracting an impression degree indicating the strength of the impression received by the user.

Background Art

[0002] When selecting and saving images from a large number of photographed images, or when performing selective operations in a game, the user often makes the selection based on the strength of the impression received. However, when there are many objects to choose from, this selection operation becomes a burden on the user.

[0003] For example, a wearable camera, which has attracted attention in recent years, can easily perform continuous shooting over a long time, such as an entire day. However, when such long-time shooting is performed, how to select the parts that are important to the user from the large amount of recorded video data becomes a big problem. What is important to the user should be determined based on the user's subject...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06Q50/00; A61B5/16; H04N5/91; A63F13/00
CPC: H04N7/163; H04N21/44218; G11B27/034
Inventors: 张文利, 江村恒一, 浦中祥子
Owner: PANASONIC CORP