
Method for projecting notes in real time based on book position

A book-position and projection technology, applied in the field of projection display, which can solve problems such as the difficulty of projecting notes to the correct position on a book

Active Publication Date: 2021-09-14
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

Stretching deformation of the book within the plane can be handled by a perspective transformation, so notes can basically be projected to the correct position. However, if the book is also deformed in the three-dimensional (out-of-plane) direction, it is difficult to project the notes to the correct position.



Examples


Embodiment 2

[0125] For reading the coordinates of the book in step 2, the following method can be used to extract the outline of the book (a minimal code sketch follows the list):

[0126] 1) Use the camera to take a picture containing a complete book page.

[0127] 2) Use Photoshop to sample 5 to 7 representative color points within the book-page region of the picture, read the RGB values of these points, and determine the RGB range of the book-page color from these values.

[0128] 3) According to the RGB range in the previous step, use the inRange function in the CV2 library to extract the corresponding color range from the image captured by the camera to obtain a binary image.

[0129] 4) Gaussian blur the binarized picture.

[0130] 5) Then proceed to step (26), and the subsequent steps are the same.
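
A minimal Python/OpenCV sketch of this color-based extraction is given below. The camera index and the PAGE_LOWER/PAGE_UPPER bounds are assumptions: the bounds stand in for the RGB range sampled in Photoshop (note that OpenCV stores pixels in BGR order, not RGB).

import cv2
import numpy as np

# Hypothetical color range of the book page, standing in for the values
# determined in steps 1)-2); OpenCV stores channels in BGR order.
PAGE_LOWER = np.array([180, 180, 180], dtype=np.uint8)   # lower bound (B, G, R)
PAGE_UPPER = np.array([255, 255, 255], dtype=np.uint8)   # upper bound (B, G, R)

def extract_page_mask(frame):
    """Binarize the camera frame by page color (step 3) and blur it (step 4)."""
    mask = cv2.inRange(frame, PAGE_LOWER, PAGE_UPPER)     # pixels inside the color range -> 255
    return cv2.GaussianBlur(mask, (5, 5), 0)              # Gaussian blur of the binarized picture

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                             # camera viewing the book page
    ok, frame = cap.read()
    if ok:
        cv2.imwrite("page_mask.png", extract_page_mask(frame))
    cap.release()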

Embodiment 3

[0132] For reading the coordinates of the book in step 2, the following method can be used to calculate the coordinates of the four corner points of the book (a code sketch follows the list):

[0133] 1) Examine the polygon obtained in step (27). If the polygon has exactly 4 corner points (that is, a quadrilateral has been extracted), the outline of the book is considered to have been extracted, and the coordinates of the four corner points of the polygon are (x1, y1), (x2, y2), (x3, y3), (x4, y4).

[0134] 2) Substitute (x1, y1), (x2, y2), (x3, y3), (x4, y4) obtained in the above step into step (32), and the subsequent steps are the same.
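
A hedged sketch of this quadrilateral test is shown below. It assumes the polygon of step (27) is obtained by fitting cv2.approxPolyDP to the largest contour of the binary mask from the previous embodiment, which may differ in detail from the patent's own contour step.

import cv2
import numpy as np

def find_book_corners(mask):
    """Return the four corner points (x1, y1)...(x4, y4), or None if no quadrilateral is found."""
    # OpenCV 4.x signature: findContours returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    contour = max(contours, key=cv2.contourArea)          # assume the largest region is the page
    eps = 0.02 * cv2.arcLength(contour, True)             # tolerance for the polygon fit
    poly = cv2.approxPolyDP(contour, eps, True)           # the polygon of step (27)
    if len(poly) != 4:                                    # step 1): require exactly 4 corners
        return None
    # Corner coordinates as a 4x2 array, ready to be substituted into step (32).
    return poly.reshape(4, 2).astype(np.float32)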

Embodiment 4

[0136] To obtain the relationship between the projector and camera coordinates in step 1, make the following changes:

[0137] (1) Place a square piece of cardboard with a side length of 10 cm on the base (hereinafter referred to as the paper sheet). The paper sheet provides an accurate reference for the subsequent calibration.

[0138] (2) Obtain an image containing the square paper sheet through the camera.

[0139] (3) According to the projection size of the projector, generate a black picture that exactly fills the projector screen and contains a red square at an arbitrary position (hereinafter referred to as the red square). The midpoint of the red square is located at (α0, β0); camera distortion is considered here, so the width and height of the square are not necessarily equal and are recorded as W0 and H0. The picture, named src, is projected by the projector (a sketch of generating this picture follows the list).

[0140] (4) Mark the center point of the paper in the image acquired by the camera and obtain the pixel c...
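
A sketch of generating the src picture of step (3) is given below; the projector resolution, the square's midpoint (α0, β0) and its size W0 × H0 are placeholder values, not values taken from the patent.

import cv2
import numpy as np

PROJ_W, PROJ_H = 1920, 1080        # assumed projector resolution
ALPHA0, BETA0 = 960, 540           # midpoint (α0, β0) of the red square (placeholder)
W0, H0 = 100, 120                  # width and height of the red square (need not be equal)

def make_calibration_image():
    """Black picture that exactly fills the projector screen, with one red square (named src)."""
    src = np.zeros((PROJ_H, PROJ_W, 3), dtype=np.uint8)   # black canvas
    x0, y0 = ALPHA0 - W0 // 2, BETA0 - H0 // 2
    src[y0:y0 + H0, x0:x0 + W0] = (0, 0, 255)             # red square in BGR order
    return src

if __name__ == "__main__":
    cv2.imwrite("src.png", make_calibration_image())      # this image is then sent to the projector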



Abstract

The invention discloses a book-position-based real-time note projection method, which adopts a recognition system consisting of a camera, a projector and an image processing and recognition module. The method comprises the following steps: 1) acquire the relationship between the coordinates of the projector and the camera; 2) read the book coordinates through functions in the CV2 library; 3) judge whether the book contour has been extracted; if so, execute the next step, otherwise return to step 2; 4) perform the actual projection according to the projection coordinates. The system can project notes in real time to assist students in learning. The outline of the book is extracted through edge detection, which largely avoids inaccurate image recognition caused by ambient light and is more broadly applicable than color recognition. The edges of the book are completed according to the shape of the book, which prevents inaccurate recognition caused by occlusion by the hand.
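
As a rough illustration of step 4, the sketch below warps a rectangular note image onto four book corners expressed in projector coordinates. The corner ordering (top-left, top-right, bottom-right, bottom-left) and the projector frame size are assumptions, and cv2.getPerspectiveTransform is used as a generic stand-in for the patent's own coordinate conversion.

import cv2
import numpy as np

def warp_note_to_book(note, proj_corners, proj_w=1920, proj_h=1080):
    """Map the rectangular note image onto the book's four corners in projector space."""
    h, w = note.shape[:2]
    # Corners of the note image, assumed to match proj_corners in order
    # (top-left, top-right, bottom-right, bottom-left).
    src_pts = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst_pts = np.float32(proj_corners)
    H = cv2.getPerspectiveTransform(src_pts, dst_pts)      # 3x3 homography
    return cv2.warpPerspective(note, H, (proj_w, proj_h))  # projector-sized frame to display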

Description

Technical Field

[0001] The invention relates to the technical field of projection display, and in particular to a method for real-time projection of notes based on book position.

Background Technique

[0002] Online education is receiving more and more attention, but traditional online education only allows students to obtain knowledge from the screen in one direction. This approach does not work well in many teaching scenarios: students are easily distracted, or simply listen without taking notes. Instead, a new form of online education can be adopted, in which notes are projected directly onto the books, so that students can take notes while reading them in their own books, which deepens the impression of learning. This method involves how to locate the book, how to convert the position of the book from camera coordinates into projector coordinates, and how the note picture is transformed and finally projected onto the book. At the same time, in order to improve the robustness of th...
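
The following end-to-end sketch ties the pieces above together under stated assumptions: extract_page_mask, find_book_corners and warp_note_to_book are the helpers sketched earlier, CAM_TO_PROJ is a placeholder for the camera-to-projector homography obtained in step 1 (an identity matrix here), and a full-screen OpenCV window stands in for the projector output.

import cv2
import numpy as np

CAM_TO_PROJ = np.eye(3, dtype=np.float32)   # placeholder for the calibrated step-1 homography

def run(note_path="note.png", proj_w=1920, proj_h=1080):
    note = cv2.imread(note_path)
    cap = cv2.VideoCapture(0)
    cv2.namedWindow("projector", cv2.WINDOW_NORMAL)
    cv2.setWindowProperty("projector", cv2.WND_PROP_FULLSCREEN, cv2.WINDOW_FULLSCREEN)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        corners = find_book_corners(extract_page_mask(frame))   # steps 2-3: locate the book
        if corners is None:
            continue                                            # contour not found: back to step 2
        # Convert the book corners from camera coordinates to projector coordinates.
        proj_corners = cv2.perspectiveTransform(corners.reshape(-1, 1, 2), CAM_TO_PROJ).reshape(4, 2)
        cv2.imshow("projector", warp_note_to_book(note, proj_corners, proj_w, proj_h))  # step 4
        if cv2.waitKey(1) == 27:                                # Esc quits the loop
            break
    cap.release()
    cv2.destroyAllWindows()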


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06T7/13, G06T7/181, G06T7/80, G06T3/40, G03B29/00
CPC: G06T7/13, G06T7/181, G06T3/4007, G06T7/80, G03B29/00
Inventor: 杨俊曦, 陈安, 刘丞
Owner: SOUTH CHINA UNIV OF TECH