Automatic labeling and mutual learning method for multi-person, multi-modal perception data
A technology for automatic labeling and data perception, applied in the field of cross-domain perception. It addresses the limited perception ability of single-modality data and the difficulty of labeling data manually, and achieves the effects of improving recognition accuracy, strengthening perception capability, and increasing the number of recognizable categories.
Embodiment Construction
[0017] The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the specific content of the present invention. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention. Content that is not described in detail in the embodiments of the present invention belongs to the prior art known to those skilled in the art.
[0018] As shown in Figure 1, an embodiment of the present invention provides a method for the automatic labeling and mutual learning of multi-person, multi-modal data, which can automatically segment and label multi-modal data streams and then have the modalities learn from each other. The method includes:
[0019]Step 1. Data preprocessing: including cloc...
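For illustration only, the sketch below shows one way the "auto-label, then learn from each other" idea could look in practice: two classifiers, one per modality, are seeded with a small manually labeled set, and in each round the most confident automatic labels from either model are added to a shared pool that both models retrain on. The synthetic features, the scikit-learn logistic-regression classifiers, and the co-training-style loop are assumptions made for this sketch, not the algorithm specified by the patent.

```python
# Hypothetical sketch of cross-modal automatic labeling and mutual learning.
# Modality A and modality B stand in for two time-aligned perception streams
# (e.g. camera features vs. wireless/IMU features for the same activity).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins for time-aligned feature windows from two modalities.
n_labeled, n_unlabeled, d = 50, 500, 16
n = n_labeled + n_unlabeled
X_a = rng.normal(size=(n, d))                                        # modality A features
X_b = X_a @ rng.normal(size=(d, d)) + 0.5 * rng.normal(size=(n, d))  # correlated modality B features
y_true = (X_a[:, 0] + X_b[:, 1] > 0).astype(int)                     # latent activity label

# Only a small seed set carries manual labels; the rest starts unlabeled (-1).
y_pool = np.full(n, -1)
y_pool[:n_labeled] = y_true[:n_labeled]

model_a = LogisticRegression(max_iter=1000)
model_b = LogisticRegression(max_iter=1000)

for _ in range(10):
    labeled = np.flatnonzero(y_pool >= 0)
    unlabeled = np.flatnonzero(y_pool < 0)
    # Each model is (re)trained on the current shared label pool of its own modality.
    model_a.fit(X_a[labeled], y_pool[labeled])
    model_b.fit(X_b[labeled], y_pool[labeled])
    if unlabeled.size == 0:
        break
    # Each model auto-labels the windows it is most confident about; those
    # pseudo-labels enter the shared pool, so they also train the other modality.
    for model, X in ((model_a, X_a), (model_b, X_b)):
        conf = model.predict_proba(X[unlabeled]).max(axis=1)
        top = unlabeled[np.argsort(-conf)[:25]]
        y_pool[top] = model.predict(X[top])

print("accuracy, modality A model:", model_a.score(X_a, y_true))
print("accuracy, modality B model:", model_b.score(X_b, y_true))
```

In a deployed system the feature windows would come from the preprocessed, time-synchronized sensor streams mentioned in Step 1, and the confidence threshold or top-k selection would control how aggressively one modality's predictions are trusted as labels for the other.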