A large-screen interaction system based on laser radar positioning

A lidar-based interactive system in the field of human-computer interaction. It addresses the problems that existing lidar output data is voluminous, the data decoding algorithm is complex, and the raw data cannot be applied directly, achieving real-time, accurate collection of interaction information, strong reusability, and efficient, accurate positioning.

Active Publication Date: 2019-05-31
合肥金诺数码科技股份有限公司


Problems solved by technology

[0006] In view of the problems that existing lidar output data is voluminous, the data decoding algorithm is complex, and the raw data cannot be direc...



Examples


Embodiment 1

[0064] A large-screen interactive system based on lidar positioning, as shown in Figure 1. A YDLIDAR G4 lidar is used to collect data, and the interactive system includes the following modules:

[0065] (1) A data acquisition and processing module, used to collect and process the lidar serial-port data; effective coordinates are obtained by first classifying the radar serial-port data and then clustering. The specific process, shown in Figure 2, is as follows:

[0066] S11. Start a background thread to obtain the radar serial-port data in real time, and decode the data as it arrives;

[0067] S12. Check and parse the obtained radar serial-port data to obtain the scanning coordinates (x, y):

[0068] Angle analysis: Angle_i = (AngleS – AngleE) × (i – 1) + Angle_1

[0069] Distance analysis: Distance_i = ((Dis_q2_LL << 8) + Dis_q2_HH) / 4

[0070] x = Distance_i × cos(π × Angle_i / 180)

[0071] y = Distance_i × sin(π × Angle_i / 180)
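The decoding steps above (per-sample angle, distance assembly from two raw bytes, polar-to-Cartesian conversion) can be sketched as follows. This is a minimal sketch, not the patent's implementation: the function names are illustrative, and the angle formula is assumed to be a linear interpolation between the packet's start and end angles over the sample count, since the excerpt's angle equation is partially garbled.

```python
import math

def sample_angle(i, angle_s, angle_e, lsn):
    """Angle of the i-th sample (1-based), assumed to be linearly
    interpolated between the packet start angle (AngleS) and end
    angle (AngleE) over lsn samples. Interpolation form is an
    assumption; the excerpt's formula is partially garbled."""
    if lsn <= 1:
        return angle_s
    return angle_s + (angle_e - angle_s) * (i - 1) / (lsn - 1)

def sample_distance(dis_ll, dis_hh):
    """Distance from the two raw bytes, per the excerpt:
    Distance_i = ((Dis_q2_LL << 8) + Dis_q2_HH) / 4."""
    return ((dis_ll << 8) + dis_hh) / 4

def to_cartesian(distance, angle_deg):
    """Polar (distance, angle in degrees) -> scan-plane (x, y),
    matching equations [0070] and [0071]."""
    rad = math.pi * angle_deg / 180.0
    return distance * math.cos(rad), distance * math.sin(rad)
```

For example, a sample at 90° maps to a point on the positive y-axis at the measured distance.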

Embodiment 2

[0101] The scheme of Embodiment 2 is basically the same as that of Embodiment 1; the only difference is the specific operating process of the coordinate conversion module, which is as follows, as shown in Figure 3:

[0102] S21. Through the mapping relationship, convert the effective coordinates (x_3, y_3) into screen coordinates (x_4, y_4);

[0103] Start screen calibration. According to the screen resolution of the particular system, obtain calibration data for the coordinates of the four screen corners (X_0, Y_0), (X_0, Y_1), (X_1, Y_0), (X_1, Y_1). Compute the four-corner offset-correction parameters and the mapping relationship between physical coordinates and screen coordinates: the software obtains the screen resolution, the ratio of screen coordinates to physical coordinates is used to recalculate the offset parameters for the four corner coordinates of the screen, and effective coordinate data of the four screen corners yields the calibrat...
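The physical-to-screen mapping described above can be sketched as a simple linear fit from the calibrated corner positions. This is an illustrative, axis-aligned approximation under assumed corner ordering; the patent's four-corner offset-correction procedure may be more elaborate (e.g., handling rotation or keystone distortion).

```python
def make_mapper(top_left, top_right, bottom_left, screen_w, screen_h):
    """Build a physical->screen coordinate mapper from calibration data.

    top_left, top_right, bottom_left: physical (lidar-plane) coordinates
    measured while touching those screen corners. An axis-aligned linear
    map is assumed; names and ordering are illustrative.
    """
    x0, y0 = top_left
    x1, _ = top_right
    _, y1 = bottom_left
    sx = screen_w / (x1 - x0)   # pixels per physical unit, horizontal
    sy = screen_h / (y1 - y0)   # pixels per physical unit, vertical

    def to_screen(x, y):
        """Map an effective coordinate (x, y) to screen pixels."""
        return (x - x0) * sx, (y - y0) * sy

    return to_screen
```

Usage: with corners calibrated at (0, 0), (100, 0), and (0, 50) on a 1920x1080 screen, the physical midpoint (50, 25) maps to the screen center (960, 540).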

Embodiment 3

[0118] The scheme of Embodiment 3 is basically the same as that of Embodiment 2; the only difference is the specific operating process of the touch module, which is as follows, as shown in Figure 4:

[0119] S31. Judge whether the screen coordinates obtained by the coordinate conversion module support multi-touch. When they do, run the multi-touch data algorithm; when they do not, run the single-touch data algorithm. Through this analysis, touch data for single touch and multi-touch are obtained;

[0120] Touch recognition: double-buffer the screen-coordinate data structure, use two frames of data to compute the distance between coordinate points, and compare the timestamps of the data packets to determine whether a touch is continuous and valid. The touch data are then tracked, and a tracking ID is assigned to mark each screen-coordinate packet.

[0121] S32. Then determine whether the tou...
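The touch-recognition step in [0120] can be sketched as a two-frame ("double-buffered") tracker: each new frame's points are matched to the previous frame's by nearest distance within a threshold, matched points keep their tracking ID, and unmatched points receive a new ID. The threshold value, greedy matching strategy, and class layout are illustrative assumptions, not details from the patent text.

```python
import math

class TouchTracker:
    """Two-frame touch tracking sketch with ID assignment.

    max_jump: largest per-frame movement (in screen units) still
    treated as the same touch; value is illustrative.
    """
    def __init__(self, max_jump=50.0):
        self.max_jump = max_jump
        self.prev = {}       # tracking ID -> (x, y) from the previous frame
        self.next_id = 0

    def update(self, points):
        """Match this frame's points against the previous frame.

        Returns a dict mapping tracking IDs to coordinates; points
        with no close predecessor get fresh IDs.
        """
        assigned = {}
        unmatched = dict(self.prev)
        for (x, y) in points:
            best_id, best_d = None, self.max_jump
            for tid, (px, py) in unmatched.items():
                d = math.hypot(x - px, y - py)
                if d < best_d:
                    best_id, best_d = tid, d
            if best_id is None:
                best_id = self.next_id       # new touch
                self.next_id += 1
            else:
                del unmatched[best_id]       # continue existing touch
            assigned[best_id] = (x, y)
        self.prev = assigned                 # swap buffers
        return assigned
```

A touch that moves a few pixels between frames keeps its ID, while a point appearing far from all previous touches starts a new track; IDs of touches that disappear are simply dropped.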



Abstract

The invention discloses a large-screen interaction system based on laser radar positioning, belonging to the field of human-computer interaction. The system comprises the following modules: a data acquisition and processing module, which collects and processes laser radar serial-port data and obtains effective coordinates by first classifying the radar serial-port data and then clustering; a coordinate conversion module, which converts the effective coordinates into screen coordinates; a touch module, which outputs and identifies touches from screen coordinates; a positioning module, which outputs screen coordinates and realizes interaction; and a local area network control module, which handles data transmission and broadcasting. The invention applies laser radar touch technology to large-screen interaction and solves the technical problem that raw radar data cannot be applied directly. Accurate effective data are obtained by classifying first and then clustering, which improves software development efficiency and stability. Reusability is high, software and data are highly independent, multiple radars can be superimposed, and the system is easy to expand.

Description

technical field

[0001] The invention belongs to the technical field of human-computer interaction, and in particular relates to a large-screen interactive system based on laser radar positioning.

Background technique

[0002] Large screens such as wall curtains and floor curtains use advanced computer technology to create a fantastic and dynamic interactive experience, producing many novel effects such as animation, rotation, and fluttering. Their flexible expressive techniques, novel and fashionable visuals, and combination of sound and animation effectively attract the audience's attention, increase the return rate and average viewing time in crowded, high-traffic places, enliven the atmosphere, raise the technological content and popularity of the venue, draw pedestrians to stop and watch, and greatly improve brand awareness and recall. However, at present, most large-scale floor curtains and wall curtai...

Claims


Application Information

IPC(8): G06F3/042
Inventor: 田地
Owner: 合肥金诺数码科技股份有限公司