
Neural network-based human-computer interaction feature point annotation system and using method thereof

A human-computer interaction and neural network technology, applied in the field of human-computer interaction feature point labeling systems. The design addresses problems such as the limited operating space of the computer platform, inconvenient screen heat dissipation and cleaning, and dust reducing image clarity; it aims to ensure good usability and heat dissipation, and to make data labeling, management, and maintenance convenient.

Status: Inactive · Publication Date: 2021-02-12
王冬
Cites: 0 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0002] Many neural network models require well-labeled data, but current labeling equipment rarely includes a machine dedicated to processing such data. Computer screens are also small, while the amount of operational content that must be displayed during labeling grows correspondingly; dual-display computer screens therefore appeared in order to enlarge the computer's operable platform.
[0003] An existing dual-screen display is usually fixed on a bracket and cannot conveniently be folded and stored when not in use, so it takes up space. Moreover, the static electricity generated by the screen during use attracts dust that adheres to it, which on the one hand reduces image clarity and on the other hand makes the screen difficult to cool and clean. In view of this, we propose a neural network-based human-computer interaction feature point labeling system and its method of use.




Embodiment Construction

[0038]The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative efforts fall within the protection scope of the present invention.

[0039] Referring to Figure 1 to Figure 7, the present invention provides a technical solution:

[0040] A neural network-based human-computer interaction feature point labeling system,

[0041] It includes an input unit, a receiving unit, a label selection unit, a label withdrawal unit, a label moving unit, and a label calibration unit;

[0042] The input unit receives instructions from users...
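The unit breakdown above (input, receiving, selection, withdrawal, moving, calibration) can be sketched as a minimal annotation session. This is an illustrative sketch only; the class and method names (`AnnotationSession`, `Point`, `select`, `withdraw`, `move`, `calibrate`) are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Point:
    """A single feature-point annotation (hypothetical structure)."""
    label: str
    x: float
    y: float

class AnnotationSession:
    """Sketch of the unit pipeline: select (add), withdraw (undo), move, calibrate."""

    def __init__(self):
        self.points = []   # current annotations (annotation selection unit output)
        self.history = []  # insertion order, supporting the withdrawal unit

    def select(self, label, x, y):
        # annotation selection unit: place a labeled feature point
        p = Point(label, x, y)
        self.points.append(p)
        self.history.append(p)
        return p

    def withdraw(self):
        # annotation withdrawal unit: undo the most recent annotation
        if self.history:
            p = self.history.pop()
            self.points.remove(p)

    def move(self, point, dx, dy):
        # annotation moving unit: reposition an existing annotation
        point.x += dx
        point.y += dy

    def calibrate(self, ground_truth):
        # annotation calibration unit: snap each point to the real value for its label
        for p in self.points:
            if p.label in ground_truth:
                p.x, p.y = ground_truth[p.label]

session = AnnotationSession()
session.select("left_elbow", 10.0, 20.0)
session.select("noise", 0.0, 0.0)
session.withdraw()                                # removes the mistaken "noise" point
session.move(session.points[0], 1.0, -1.0)        # nudge to (11.0, 19.0)
session.calibrate({"left_elbow": (12.0, 18.0)})   # snap to the real value
```

The undo history here is a simple stack; the patent does not specify how withdrawal is implemented, so this is one plausible design choice.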


PUM

No PUM

Abstract

The invention relates to the field of neural network data annotation, in particular to a neural network-based human-computer interaction feature point annotation system and a method of using it. The system comprises an input unit, a receiving unit, an annotation selection unit, an annotation withdrawal unit, an annotation moving unit, and an annotation calibration unit. The input unit accepts an instruction entered by a user, and the receiving unit receives that instruction to annotate a picture. The user makes selections with the annotation selection unit and withdraws unwanted annotations with the annotation withdrawal unit; during the annotation process, the annotation moving unit is used to move annotations, and the annotations are calibrated against the real values of the data. Through the design of the computer screen, data is annotated; the system is mainly used for annotating pictures of human body postures and achieves a very good annotation effect.
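The calibration step described above (adjusting annotations "against the real values of the data") can be illustrated with a simple error metric between annotated and ground-truth pose keypoints. The function name `mean_keypoint_error` and the dictionary layout are illustrative assumptions, not part of the patent.

```python
import math

def mean_keypoint_error(annotated, ground_truth):
    """Average Euclidean distance between annotated and true keypoints
    that share a label (an illustrative calibration metric)."""
    shared = [k for k in annotated if k in ground_truth]
    if not shared:
        return 0.0
    total = 0.0
    for k in shared:
        ax, ay = annotated[k]
        tx, ty = ground_truth[k]
        total += math.hypot(ax - tx, ay - ty)
    return total / len(shared)

# Two keypoints of a human-pose annotation vs. their real values.
annotated = {"head": (50.0, 10.0), "left_knee": (30.0, 80.0)}
truth = {"head": (53.0, 14.0), "left_knee": (30.0, 80.0)}
print(mean_keypoint_error(annotated, truth))  # prints 2.5
```

A metric like this lets the calibration unit report how far a user's annotations drift from ground truth before correction; the patent does not specify the metric it uses.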

Description

Technical field

[0001] The invention relates to the field of human-computer labeling, in particular to a neural network-based human-computer interaction feature point labeling system and a method for using the same.

Background technique

[0002] Many neural network models require well-labeled data, but current labeling equipment rarely includes a machine dedicated to processing such data. Computer screens are also small, while the amount of operational content that must be displayed during labeling grows correspondingly; dual-display computer screens therefore appeared so as to enlarge the computer's operable platform.

[0003] An existing dual-screen display is usually fixed on a bracket and cannot conveniently be folded and stored when not in use, so it takes up space. Moreover, the static electricity generated by the screen during use attracts dust that adheres to it, which on the one hand reduces image clarity and on the other hand makes the screen difficult to cool and clean.

Claims


Application Information

Patent Timeline: no application data
Patent Type & Authority: Application (China)
IPC(8): G06F1/16; F16M11/10; F16M11/16; F16M11/18; F16M11/22; F16N7/38
CPC: F16M11/10; F16M11/16; F16M11/18; F16M11/22; F16N7/38; G06F1/1601
Inventor: 王冬
Owner: 王冬