
Pull-up number calculation method based on machine vision

A machine-vision-based pull-up counting technology, applied in the field of machine-learning visual recognition. It addresses the high misjudgment rate and high labor cost of counting the pull-up test item, and achieves strong applicability, high counting accuracy, and strong robustness.

Active Publication Date: 2021-06-22
安徽一视科技有限公司

AI Technical Summary

Problems solved by technology

[0005] To overcome the shortcomings of the above prior art, the present invention provides a machine-vision-based method for calculating the number of pull-ups. It solves the problems of a high misjudgment rate and high labor cost in counting the pull-up test item, and automates the counting work in pull-up physical fitness testing, thereby improving recognition accuracy and counting efficiency.



Examples


Embodiment Construction

[0029] In this embodiment, a method for calculating the number of pull-ups based on machine vision is applied in a collection environment where a camera is arranged above the horizontal bar, and proceeds in the following steps:

[0030] Step 1: Face and hand collection:

[0031] Use the camera to collect a front-face image and a forehand-grip image of the current tester as the training set, where figure 1 is a schematic diagram of the position of the head relative to the horizontal bar. After labeling the faces and hands in the training set, feed them into the YOLO target detection model for iterative training to obtain corrected human-hand and human-face images;
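The patent names the YOLO detector but gives no annotation details. As a hedged illustration (the class-id mapping below is an assumption, not from the patent), YOLO-format training labels store each marked face or hand as a class id plus a box center and size normalized to the image dimensions:

```python
def to_yolo_label(class_id, box, img_w, img_h):
    """Convert a pixel-space box (x_min, y_min, x_max, y_max) into a
    YOLO-format label line: 'class x_center y_center width height',
    with all four coordinates normalized to [0, 1]."""
    x_min, y_min, x_max, y_max = box
    xc = (x_min + x_max) / 2 / img_w   # normalized box center x
    yc = (y_min + y_max) / 2 / img_h   # normalized box center y
    w = (x_max - x_min) / img_w        # normalized box width
    h = (y_max - y_min) / img_h        # normalized box height
    return f"{class_id} {xc:.6f} {yc:.6f} {w:.6f} {h:.6f}"

# e.g. a hand box in a 640x480 frame; class 1 = "hand" is an assumed mapping
print(to_yolo_label(1, (100, 200, 200, 300), 640, 480))
```

One such line per labeled object, in a text file per image, is the conventional input for iterative YOLO training.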

[0032] Step 2: On the corrected human-hand image, detect whether the hand is gripping the bar. If the arm is close to 180°, that is, the angle satisfies |X₁ − X₀| > G, the hand is judged to have gripped the bar correctly; otherwise, return to step 1, where X₁ denotes the angle of the upper arm and X₀ denotes the ang...
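The "arm close to 180°" test can be sketched as follows. The patent does not specify how the angles are obtained; this sketch assumes 2-D shoulder, elbow, and wrist keypoints (hypothetical names) and computes the elbow angle via the dot product:

```python
import math

def elbow_angle(shoulder, elbow, wrist):
    """Angle in degrees at the elbow between the upper arm and the
    forearm, computed from 2-D keypoints via the dot product."""
    ux, uy = shoulder[0] - elbow[0], shoulder[1] - elbow[1]
    vx, vy = wrist[0] - elbow[0], wrist[1] - elbow[1]
    dot = ux * vx + uy * vy
    norm = math.hypot(ux, uy) * math.hypot(vx, vy)
    # clamp to [-1, 1] to guard against floating-point drift in acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def arm_is_straight(shoulder, elbow, wrist, threshold_deg=170.0):
    """True when the arm is close to fully extended (angle near 180°),
    the condition used to confirm a correct grip. The 170° threshold
    is an assumed stand-in for the patent's unspecified G."""
    return elbow_angle(shoulder, elbow, wrist) >= threshold_deg

# a fully extended arm hanging straight down
print(arm_is_straight((0, 0), (0, 1), (0, 2)))  # True
```

A right-angle bend at the elbow, e.g. `arm_is_straight((0, 0), (1, 1), (2, 0))`, fails the check.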



Abstract

The invention discloses a pull-up number calculation method based on machine vision, comprising the following steps. A standard pull-up action is judged by three indexes, which respectively inspect the testee's grip posture, head height, and arm bending degree; the grip posture and head height are detected with a YOLO target detection model, the arm posture is identified, and the pull-up count is calculated. First, it is judged whether the bar is gripped correctly. Then the line connecting the midpoints of the two hand recognition frames is taken as the pull-up recognition reference line; at this moment the included angle between the upper arm and the forearm is required to be close to 180 degrees. Meanwhile, the testee's head position is recognized to judge whether the head passes the line, and finally it is judged whether the elbow joint angle is qualified. If every index meets the requirements, the action is judged to be a standard pull-up. The invention is simple in structure, high in recognition accuracy, and more efficient than manual counting.
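The counting logic in the abstract can be sketched as a small state machine. All names below are assumptions for illustration; the sketch uses image coordinates (y grows downward), takes the reference line through the midpoints of the two hand boxes, and counts one repetition each time the head rises above that line and returns below it:

```python
class PullUpCounter:
    """Minimal sketch of the counting logic described in the abstract:
    one repetition per full up-and-down head crossing of the bar line.
    Image y grows downward, so 'above the bar' means head_y < line_y."""

    def __init__(self, left_hand_mid, right_hand_mid):
        # Reference line connects the midpoints of the two hand boxes;
        # for a level bar its height is the mean of the two y values.
        self.line_y = (left_hand_mid[1] + right_hand_mid[1]) / 2
        self.above = False
        self.count = 0

    def update(self, head_y):
        if head_y < self.line_y and not self.above:
            self.above = True      # head passed over the reference line
        elif head_y >= self.line_y and self.above:
            self.above = False     # head back down: one full repetition
            self.count += 1
        return self.count

counter = PullUpCounter((100, 50), (300, 50))
for head_y in [120, 40, 120, 45, 130]:  # two up-down cycles
    counter.update(head_y)
print(counter.count)  # 2
```

A faithful implementation would additionally gate each repetition on the elbow-angle check, which this sketch omits for brevity.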

Description

Technical Field

[0001] The invention relates to the field of machine-learning visual recognition, and in particular to a method for calculating the number of pull-ups based on machine vision.

Background Technique

[0002] As a branch of artificial intelligence, machine vision means that a machine simulates the human visual cognitive process: it transmits captured image information to an image processing system, mines and processes the low-level digital data in images and video without human intervention, and translates it into high-level information output, so as to approximate the human ability to understand visual signals. The technology originated in the 1960s but, owing to the limitations of image acquisition and computing power, did not see explosive growth until 2010. It is currently used in intelligent video surveillance, content-based video retrieval, medical fields, and man-machine ... There are broad application prospe...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/00, G06N3/04, G06N3/08
CPC: G06N3/04, G06N3/08, G06V40/23, G06V40/28, G06V40/161, G06V40/168
Inventor: 唐义平, 汪斌, 祖慈, 张雪松, 李帷韬
Owner: 安徽一视科技有限公司