
Student experiment classroom behavior identification method based on top vision

A technology for recognizing student behavior during experiments, applied in biometrics, character and pattern recognition, computer components, etc. It addresses problems such as poor targeting, the small number of usable cameras, and incomplete capture of behavior, with the effects of avoiding the collection of students' facial information, improving privacy safety, and reducing positioning deviation.

Pending Publication Date: 2022-01-28
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0007] First, the experiment platform is blocked by the experimental equipment, so the original pictures taken by ordinary-angle cameras mounted in the corners of the room contain many occluded objects, and the student behaviors they capture are few and incomplete. In addition, the training images contain the students' facial information, so if the database storing the image data leaks due to uncontrollable factors, the students' privacy is violated.
[0008] Second, because the objects in the imaging area are complex, only a small number of cameras can be used and their installation positions and angles are limited, so the target behavior cannot be captured in isolation. Extracting background objects from the picture causes positioning deviation, which increases the difficulty of training the system network. The method therefore suffers from poor targeting, a large amount of model inference computation, poor generalization of the system model, and low recognition accuracy.


Examples


Embodiment Construction

[0021] Embodiments of the present invention will be further described in detail below in conjunction with the accompanying drawings.

[0022] Referring to figure 1, the implementation steps of the present invention are as follows:

[0023] Step 1: Install the cameras in the laboratory.

[0024] Referring to figure 2, install a camera vertically above the tabletop of each experiment table, and adjust its shooting angle so that it is aimed at the area of the tabletop where the students' hands move.

[0025] Step 2: Build the dataset.

[0026] 2.1) Use all the cameras simultaneously to record video of the students' hand behavior during the experiment, and extract frames from the video streams to obtain the original pictures;

[0027] 2.2) For each original picture, apply any of horizontal flipping, vertical flipping, proportional scaling, cropping, expansion, or rotation to augment the data and increase the number of pictures, then annotate them to obtain a data ...
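
As a concrete illustration of sub-steps 2.1) and 2.2), the following sketch uses OpenCV for frame extraction and a few of the listed geometric augmentations. The patent does not prescribe a library, sampling stride, file layout, or augmentation parameters, so the paths, the every-30th-frame stride, and the transform settings below are assumptions for illustration only.

```python
# Sketch of dataset preparation (sub-steps 2.1 and 2.2), assuming OpenCV.
# Paths, the frame-sampling stride, and the augmentation settings are
# illustrative choices, not prescribed by the patent.
import os
import cv2

def extract_frames(video_path, out_dir, every_n=30):
    """Sub-step 2.1: save every n-th frame of one camera's video stream."""
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    idx = saved = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % every_n == 0:
            cv2.imwrite(os.path.join(out_dir, f"frame_{saved:06d}.jpg"), frame)
            saved += 1
        idx += 1
    cap.release()
    return saved

def augment(image):
    """Sub-step 2.2: a few of the geometric augmentations listed above."""
    h, w = image.shape[:2]
    variants = {
        "hflip": cv2.flip(image, 1),                        # horizontal flip
        "vflip": cv2.flip(image, 0),                        # vertical flip
        "scaled": cv2.resize(image, None, fx=0.8, fy=0.8),  # proportional scaling
        "cropped": image[h // 10 : 9 * h // 10, w // 10 : 9 * w // 10],  # crop
    }
    # rotation by a small angle around the image centre
    M = cv2.getRotationMatrix2D((w / 2, h / 2), 15, 1.0)
    variants["rotated"] = cv2.warpAffine(image, M, (w, h))
    return variants

if __name__ == "__main__":
    n = extract_frames("camera01.mp4", "frames/camera01", every_n=30)
    print(f"saved {n} original pictures")
```

Each augmented picture would still need to be annotated (for example, with hand bounding boxes) before it could be used to train the target detection network described in the abstract.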



Abstract

The invention provides a student experiment classroom behavior identification method based on top vision, which mainly solves the high workload and poor timeliness caused by manual information extraction in traditional behavior recognition. In the implementation scheme, a camera is installed vertically above the desktop of each experiment table; the students' behavior during experiments is video-sampled, and the sampled video streams are frame-extracted and annotated to obtain a data set; a target detection network is trained with the data set to obtain a trained target detection model; a student experiment classroom video is input into the trained model to obtain pictures in which the students' hands are framed; and all pictures framing the students' hands are classified with a deep learning classification model to generate the recognition result of the students' hand behaviors. The invention reduces the amount of data processing and computation, can accurately obtain and store pictures of the students' hand behaviors and actions in a laboratory with dense equipment and scattered students, and the pictures can be retrieved by teachers and used for experiment teaching.
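
The abstract above describes a two-stage pipeline: a trained target detection model frames the students' hands in each sampled frame, and a deep learning classification model then labels each cropped hand picture with a behavior. The patent does not name the specific networks, so the sketch below uses a torchvision Faster R-CNN detector and a ResNet-18 classifier purely as illustrative stand-ins; the behavior classes and the score threshold are likewise assumptions.

```python
# Illustrative two-stage inference sketch: detect hand regions, crop them,
# then classify each crop. The architectures (Faster R-CNN, ResNet-18), the
# behaviour classes, and the threshold are stand-ins, not the patent's choices.
import torch
import torch.nn as nn
import torchvision
from torchvision.transforms import functional as F
from PIL import Image

BEHAVIOUR_CLASSES = ["operating_equipment", "writing", "idle"]  # assumed labels

# Stage 1: a detector with two classes (background + hand); in practice its
# fine-tuned weights would be loaded here.
detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights=None, num_classes=2)
detector.eval()

# Stage 2: a classifier over the cropped hand pictures.
classifier = torchvision.models.resnet18(weights=None)
classifier.fc = nn.Linear(classifier.fc.in_features, len(BEHAVIOUR_CLASSES))
classifier.eval()

@torch.no_grad()
def recognise(frame: Image.Image, score_thresh: float = 0.5):
    """Return one behaviour label per detected hand region in a video frame."""
    tensor = F.to_tensor(frame)                 # HWC uint8 -> CHW float in [0, 1]
    detections = detector([tensor])[0]          # dict with boxes, labels, scores
    results = []
    for box, score in zip(detections["boxes"], detections["scores"]):
        if score < score_thresh:
            continue
        x1, y1, x2, y2 = box.int().tolist()
        crop = frame.crop((x1, y1, x2, y2)).resize((224, 224))
        # normalisation omitted for brevity
        logits = classifier(F.to_tensor(crop).unsqueeze(0))
        results.append(BEHAVIOUR_CLASSES[int(logits.argmax(dim=1))])
    return results
```

In practice both models would first be loaded with the weights obtained during training, and the per-frame labels could be aggregated over the sampled classroom video to form the recognition result stored for the teacher.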

Description

Technical field

[0001] The invention belongs to the technical field of behavior recognition, and in particular relates to a behavior recognition method for students' experimental classrooms. It can be used for experimental teaching.

Background technique

[0002] With the development of industry and the progress of society, various industries need a large number of high-quality talents. To meet the needs of society, the field of education, which is responsible for producing these talents, has carried out a series of educational reforms aimed at improving the quality of teaching and of talent cultivation. Among the many indicators used to evaluate these reforms, students' classroom behavior is an important reference reflecting the effectiveness of teaching.

[0003] In traditional behavior observation, the school installed a small number of surveillance cameras in each classroom to record students' behavior in class, so that teachers c...


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06V10/82, G06V20/40, G06V20/52, G06V40/10, G06V10/774, G06K9/62, G06N3/04
CPC: G06N3/045, G06F18/214
Inventor: 袁晓光, 任爱锋, 刘诗若, 胡振勇, 龙璐岚
Owner: XIDIAN UNIV