
Sound control method and system based on human-computer interaction

A technology combining human-computer interaction and acoustic tweezers, applied in the field of acoustic manipulation methods and systems based on human-computer interaction. It addresses the problems of low precision, deviating and non-smooth paths, low spatial resolution of acoustic tweezers, and the inability to achieve real-time control, thereby reducing the difficulty of manipulation.

Pending Publication Date: 2022-04-05
SHENZHEN INST OF ADVANCED TECH
Cites: 0 | Cited by: 0

AI Technical Summary

Problems solved by technology

[0005] 1. Acoustic manipulation based on standing waves: this technique can capture groups of particles and sort particles, but the multiplicity of nodes and antinodes in space prevents selective trapping of a specific target. The frequency and phase of the wave sources must be adjusted to match the size and position of the target, artificial auxiliary structures often need to be designed, and real-time arbitrary manipulation cannot be achieved.
[0006] 2. Acoustic manipulation based on acoustic streaming: because bubble- and microstructure-based streaming phenomena are nonlinear and the microstructures must be designed in advance, the sound field cannot be changed in real time, and acoustic tweezers based on this technique suffer from low spatial resolution.
However, deep-learning-based approaches require a large number of training samples, which are costly to collect, and they suffer from low precision, deviating and non-smooth paths, and large time overhead.




Embodiment Construction

[0049] In order to make the purpose, technical solution and advantages of the present application clearer, the present application will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present application, not to limit the present application.

[0050] See Figure 1, which is a flow chart of the sound manipulation method based on human-computer interaction according to an embodiment of the present application. The method includes the following steps:

[0051] S10: Preparing an acoustic tweezers device through multiple pairs of interdigital transducers;

[0052] In this step, the acoustic tweezers device is a SAW (Surface Acoustic Wave) microfluidic chip, and multiple pairs of interdigital transducers are bonded to t...
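The resonant frequency of each interdigital transducer (IDT) pair follows from the SAW substrate's acoustic velocity and the electrode period. The sketch below is illustrative and not from the patent: the substrate, velocity value, and electrode pitch are assumptions (a common choice is 128° Y-X lithium niobate).

```python
# Sketch (not from the patent): an IDT resonates when the electrode
# period equals one acoustic wavelength, so f = v / lambda.

SAW_VELOCITY_LINBO3 = 3990.0  # m/s, approximate SAW speed on 128 deg Y-X LiNbO3

def idt_resonant_frequency(wavelength_m: float,
                           velocity_m_s: float = SAW_VELOCITY_LINBO3) -> float:
    """Resonant frequency f = v / lambda for an IDT whose electrode
    period equals one acoustic wavelength."""
    return velocity_m_s / wavelength_m

# A 100-micrometre electrode period gives a resonance near 39.9 MHz.
f = idt_resonant_frequency(100e-6)
print(f"{f / 1e6:.1f} MHz")  # -> 39.9 MHz
```

Orthogonal IDT pairs driven this way set up standing waves along each axis of the chamber, which is what makes multi-pair layouts useful for two-dimensional trapping.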



Abstract

The invention relates to a sound manipulation method and system based on human-computer interaction. The method comprises the following steps: injecting particles into the chamber of an acoustic tweezers device; capturing a chamber image and transmitting it synchronously to a display terminal for display; acquiring the desired displacement coordinates of a target particle through the display terminal and transmitting them to a control terminal; using a control algorithm at the control terminal to calculate the excitation signal the interdigital transducers require to move the target particle to the desired displacement coordinates; exciting the interdigital transducers with this signal to generate the corresponding sound field; and acoustically manipulating the target particle in the chamber. The method can accurately manipulate particles in complex environments in arbitrary scenarios, reducing the manipulation difficulty caused by environmental complexity.
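The abstract does not specify the control algorithm, but one standard way to realize "desired coordinate in, excitation signal out" for a standing-wave trap is phase control: shifting the relative drive phase of two opposing IDTs by dphi translates every pressure node by dx = dphi / (2k). The sketch below is a hypothetical illustration under that assumption; the function names, the proportional gain, and the use of a simple P-controller are all assumptions, not the patent's method.

```python
import math

# Hypothetical sketch of the closed-loop idea: in a 1D standing wave
# between two counter-propagating IDTs, a relative drive-phase change
# dphi shifts the pressure nodes by dx = dphi / (2k), with k = 2*pi/lambda.

def phase_shift_for_displacement(dx_m: float, wavelength_m: float) -> float:
    """Relative phase change (radians) that translates the pressure
    nodes of a counter-propagating standing wave by dx."""
    k = 2.0 * math.pi / wavelength_m
    return 2.0 * k * dx_m

def step_toward_target(current_m: float, target_m: float,
                       wavelength_m: float, gain: float = 0.5) -> float:
    """One proportional-control step: phase command that moves the
    trapped particle a fraction `gain` of the remaining position error."""
    error = target_m - current_m
    return phase_shift_for_displacement(gain * error, wavelength_m)

# Example: 10 um of remaining error at lambda = 100 um, gain 0.5,
# asks for a 5 um node shift, i.e. a 0.2*pi rad phase command.
cmd = step_toward_target(0.0, 10e-6, 100e-6)
print(f"{cmd:.4f} rad")
```

In the system described by the abstract, the "current" position would come from the camera image of the chamber and the "target" from the coordinates the user selects on the display terminal.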

Description

technical field [0001] The present application belongs to the technical field of acoustic manipulation, and in particular relates to a sound manipulation method and system based on human-computer interaction. Background technique [0002] Acoustic tweezers are a cutting-edge technology that uses the principle of acoustic radiation force to capture and control tiny particles, enabling precise manipulation of objects such as cells over a wide size range. The technology exploits the interaction between a sound field and an object within it: the object absorbs or scatters the field, transferring energy, so that the object experiences an Acoustic Radiation Force (ARF). In biomedicine, owing to the versatility and biocompatibility of acoustic tweezers, the technology has many applications in cell and particle handling, including cell sorting, cell patterning, blood cell separation, cell or particle transport, enrichment of rare cells...
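For context on the ARF mentioned above, the textbook Gor'kov result gives the force on a small compressible sphere (radius a much smaller than the wavelength) in a 1D standing wave. This formula is standard acoustofluidics background, not taken from the patent; the material values in the example are approximate.

```python
import math

# Gor'kov radiation force on a small sphere in a 1D standing wave:
#   F = 4*pi * Phi * k * a^3 * E_ac * sin(2*k*x)
# with acoustic contrast factor
#   Phi = (5*rho_t - 2) / (3*(2*rho_t + 1)) - kappa_t / 3
# where rho_t and kappa_t are the particle/fluid density and
# compressibility ratios. Phi > 0: particle collects at pressure nodes.

def contrast_factor(rho_ratio: float, kappa_ratio: float) -> float:
    return (5.0 * rho_ratio - 2.0) / (3.0 * (2.0 * rho_ratio + 1.0)) \
        - kappa_ratio / 3.0

def radiation_force(a_m: float, k: float, e_ac: float,
                    x_m: float, phi: float) -> float:
    """Axial radiation force (N) at position x in the standing wave."""
    return 4.0 * math.pi * phi * k * a_m**3 * e_ac * math.sin(2.0 * k * x_m)

# Approximate values for polystyrene beads in water: Phi is about 0.19,
# so the beads migrate to the pressure nodes.
phi_ps = contrast_factor(1050.0 / 998.0, 2.16e-10 / 4.48e-10)
print(f"{phi_ps:.2f}")
```

Because the sign of the contrast factor decides whether a particle collects at nodes or antinodes, cells and rigid beads (positive contrast) and bubbles (negative contrast) behave oppositely in the same field.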


Application Information

Patent Type & Authority: Application (China)
IPC (8): G05B19/042
Inventors: Zheng Hairong, Meng Long, Chen Weixing, Zhang Wenjun, Liu Xiufang
Owner SHENZHEN INST OF ADVANCED TECH