A Saliency Detection Method for Infrared Images Based on Global and Local Interaction
An infrared-image saliency detection technology, applicable to computer components, character and pattern recognition, and computation. It addresses problems such as erroneous estimation results, with the effects of improving accuracy, improving operating efficiency, and reducing noise interference.
Embodiment Construction
[0053] The invention provides an infrared image saliency detection method based on global and local interaction. The technical solution of the present invention is described in detail below with reference to the accompanying drawings, so that it is easier to understand and apply.
[0054] A method for saliency detection in infrared images based on global and local interaction, as shown in Figure 1, comprises the following steps:
[0055] Step 1: Compute the structural local adaptive recursive kernel.
[0056] Let F denote the feature image extracted from the input image I, as given by formula (1):
[0057] F(x, y) = Γ(I, x, y) (1)
[0058] where Γ(·) denotes a multi-dimensional function for extracting image features.
[0059] The position, gradient, brightness, LBP, and HOG information of the infrared image are used as the image features. In the feature image F, each pixel is thus described by a 7-dimensional vector, as given by formula (2):
[0060]
[0061]...
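Since formula (2) is not reproduced in this excerpt, the exact composition of the 7-dimensional per-pixel descriptor is not specified here. The sketch below is a minimal, hypothetical reading of paragraph [0059]: position (x, y), gradient (Gx, Gy), brightness, an 8-neighbour LBP code, and a single HOG-style gradient-orientation cue, giving seven values per pixel. All function names and the choice of LBP variant and orientation quantisation are assumptions, not the patent's definitive implementation.

```python
import numpy as np

def local_binary_pattern(img):
    """8-neighbour LBP code per pixel (hypothetical variant; the patent
    does not specify which LBP formulation is used). Borders wrap via
    np.roll as a simplification."""
    lbp = np.zeros(img.shape, dtype=np.uint8)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
        lbp |= (shifted >= img).astype(np.uint8) << bit
    return lbp

def extract_features(img):
    """Gamma(I, x, y) from formula (1): one 7-D descriptor per pixel,
    assumed here to be (x, y, Gx, Gy, brightness, LBP, orientation bin)."""
    img = img.astype(np.float64)
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)      # position
    gy, gx = np.gradient(img)                           # gradient
    lbp = local_binary_pattern(img).astype(np.float64)  # texture
    # Single HOG-style cue per pixel: gradient orientation quantised
    # into 9 bins (a stand-in for the full block-based HOG descriptor).
    hog_bin = np.floor((np.arctan2(gy, gx) + np.pi) / (2 * np.pi) * 9)
    hog_bin = np.clip(hog_bin, 0, 8)
    # Stack into the feature image F of shape (h, w, 7).
    return np.stack([xs, ys, gx, gy, img, lbp, hog_bin], axis=-1)
```

In this reading, the per-pixel vector keeps spatial coordinates alongside appearance cues, which is what later lets a recursive kernel adapt to local structure; the real descriptor in formula (2) may weight or normalise these components differently.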