
Object recognition and real-time translation method and device

A technology for object recognition and real-time translation devices, applied in neural learning methods, character and pattern recognition, and electrically operated teaching aids, with the effect of improving the learning experience.

Pending Publication Date: 2019-11-19
GUANGDONG UNIV OF TECH

AI Technical Summary

Problems solved by technology

At present, early-childhood education courses are widely available on the market. However, current early-education courses lean toward exam-oriented rather than interest-oriented teaching, and such dull, rigid exam-oriented courses cannot genuinely raise learners' interest in learning English.
[0003] The enlightenment English learning software and devices currently on the market are limited to recognizing fixed flat cards, so they suffer from serious homogeneity and limited content.
In addition, existing enlightenment English learning software and devices are based mainly on traditional literacy cards and storybooks, so they can only provide a simple cognitive experience and cannot flexibly incorporate physical objects into learning.



Examples

Embodiment 1

[0039] Figure 1 shows a flow chart of the object recognition and real-time translation method of this embodiment.

[0040] This embodiment proposes a method for object recognition and real-time translation, including the following steps:

[0041] S1: Acquire, in real time through the camera, an image of the object in front of the device.
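As a minimal illustration of this capture step (not taken from the patent text), a single frame could be grabbed with OpenCV; the library choice and camera index are assumptions:

```python
import cv2  # assumed library; the patent does not name an implementation

def capture_frame(camera_index: int = 0):
    """Grab one real-time frame from the front-facing camera; returns a BGR array or None."""
    cap = cv2.VideoCapture(camera_index)
    try:
        ok, frame = cap.read()
        return frame if ok else None
    finally:
        cap.release()
```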

[0042] S2: Input the image into a convolutional neural network model to extract depth feature information of the image.

[0043] In this step, the convolutional neural network model includes a convolutional layer and a pooling layer for extracting depth feature information of the input image.
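For concreteness, a minimal sketch of such a convolution-and-pooling feature extractor, assuming PyTorch, a 224x224 RGB input, and arbitrarily chosen layer sizes (none of these choices are specified in this embodiment):

```python
import torch
import torch.nn as nn

class FeatureExtractor(nn.Module):
    """Convolution + pooling stack mapping an RGB frame to a depth-feature vector."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 224 -> 112
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 112 -> 56
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),              # global pooling -> 128-d descriptor
        )

    def forward(self, x):
        return self.features(x).flatten(1)        # depth feature information, shape (batch, 128)

depth_features = FeatureExtractor()(torch.randn(1, 3, 224, 224))  # -> shape (1, 128)
```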

[0044] S3: Input the extracted depth feature information into the image recognition model to recognize the category of the object, and output the recognized category of the object. The specific steps are as follows:

[0045] S3.1: Input the depth feature information corresponding to the feature points of the object into the image recognition model and perfo...
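Because paragraph [0045] is truncated above, the recognition step can only be sketched under assumptions: a single linear scoring head over the 128-dimensional depth features from the previous sketch, and a hypothetical three-word label set:

```python
import torch
import torch.nn as nn

CATEGORIES = ["apple", "cup", "book"]   # hypothetical label set, for illustration only

class RecognitionHead(nn.Module):
    """Scores each object category from the depth features (stand-in for the image recognition model)."""
    def __init__(self, feature_dim: int = 128, num_classes: int = len(CATEGORIES)):
        super().__init__()
        self.fc = nn.Linear(feature_dim, num_classes)

    def forward(self, features):
        return self.fc(features)        # raw class scores (logits)

def recognize(features: torch.Tensor, head: nn.Module) -> str:
    """Return the name of the most probable category for one feature vector."""
    probs = torch.softmax(head(features), dim=1)
    return CATEGORIES[int(probs.argmax(dim=1))]
```

The head is untrained here; in practice its weights would come from training the image recognition model on labelled object images.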

Embodiment 2

[0054] This embodiment proposes an object recognition and real-time translation device that applies the object recognition and real-time translation method of the above embodiment. Figures 2-4 show schematic diagrams of the object recognition and real-time translation device of this embodiment.

[0055] The object recognition and real-time translation device of this embodiment includes a central processing unit 1, an image acquisition unit 2, a display screen 3, a camera 4, a device housing 5, a key unit 6, a distance sensor 7, a light sensor 8, and a speaker 9. The camera 4 is arranged on one side of the device housing 5 and the display screen 3 on the other side; the central processing unit 1 and the image acquisition unit 2 are integrated inside the device housing 5; and the key unit 6, the distance sensor 7, the light sensor 8, and the speaker 9 are each arranged on the device housing 5. Specifical...
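For orientation only, the hardware units enumerated in paragraph [0055] can be collected into a simple structure; the reference numerals and placements follow the paragraph, while the field names and the use of a Python dataclass are purely illustrative:

```python
from dataclasses import dataclass

@dataclass
class DeviceLayout:
    """Hardware units of the device, with reference numerals and stated placement."""
    central_processing_unit: str = "1 (inside the housing)"
    image_acquisition_unit: str = "2 (inside the housing)"
    camera: str = "4 (one side of the housing)"
    display_screen: str = "3 (other side of the housing)"
    device_housing: str = "5"
    key_unit: str = "6 (on the housing)"
    distance_sensor: str = "7 (on the housing)"
    light_sensor: str = "8 (on the housing)"
    speaker: str = "9 (on the housing)"
```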



Abstract

The invention relates to the technical field of image processing, recognition, and automatic translation, and provides an object recognition and real-time translation method comprising the following steps: acquiring, in real time through a camera, an image of the object in front of the device; inputting the image into a convolutional neural network model and extracting the depth feature information of the image; inputting the extracted depth feature information into an image recognition model to recognize the category of the object, and outputting the recognized category; and translating the object category into a target language through a translation algorithm and outputting the result. The invention further provides a device applying the method, which comprises a central processing unit, an image acquisition unit, a display screen, a camera, and a device housing; the camera is arranged on one side surface of the device housing, the display screen is arranged on the other side surface, and the central processing unit and the image acquisition unit are integrated inside the device housing. According to the invention, the object currently in front of the device can be recognized and translated, and the learning experience of the user is improved.
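A hypothetical end-to-end sketch of the four steps in this abstract: `extract_features` and `classify` stand in for the convolutional and recognition models sketched under Embodiment 1, and a small lookup table stands in for the translation algorithm (the dictionary entries and target language are illustrative only):

```python
import cv2
import torch

TRANSLATIONS = {"apple": "苹果", "cup": "杯子", "book": "书"}   # illustrative target-language dictionary

def recognize_and_translate(frame, extract_features, classify):
    """frame: BGR camera image; returns (recognized category, translated word)."""
    rgb = cv2.cvtColor(cv2.resize(frame, (224, 224)), cv2.COLOR_BGR2RGB)
    tensor = torch.from_numpy(rgb).permute(2, 0, 1).float().unsqueeze(0) / 255.0
    category = classify(extract_features(tensor))            # recognition steps
    return category, TRANSLATIONS.get(category, category)    # translation step
```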

Description

technical field

[0001] The present invention relates to the technical field of image processing, recognition, and automatic translation, and more specifically to a method for object recognition and real-time translation, and to a device for object recognition and real-time translation.

Background technique

[0002] Infancy and toddlerhood are the period in which a child's nervous system develops fastest and language development is most critical, making it a good time for language education. At present, early-childhood education courses are widely available on the market; however, current early-education courses lean toward exam-oriented rather than interest-oriented teaching, and such dull, rigid exam-oriented courses cannot genuinely raise learners' interest in learning English.

[0003] At present, some enlightenment English learning software or devices on the market are only limited to identifying fixed flat cards, so ther...


Application Information

IPC(8): G06K9/00; G06N3/04; G06N3/08; G09B5/06
CPC: G06N3/08; G09B5/065; G06V20/20; G06N3/045
Inventors: 于兆勤, 韦怡婷, 王惠, 卢汝铭, 麦雪莹, 刘浩诚
Owner: GUANGDONG UNIV OF TECH