Method for detecting indoor and outdoor scenes

An indoor and outdoor scene detection technology, applied in the field of communication, which addresses the prior-art problems of strict input-accuracy requirements, low universality, and poor user experience, and which achieves high detection accuracy, reduces the possibility of misjudgment, and improves the user experience.

Active Publication Date: 2018-12-04
INST OF COMPUTING TECH CHINESE ACAD OF SCI +1

AI Technical Summary

Problems solved by technology

Specifically, the prior art recognizes indoor and outdoor scenes in one of three ways. The first approach deploys additional auxiliary equipment to collect dedicated data; deployment is expensive and preparation is complicated, which greatly limits universal application and makes large-scale promotion difficult. The second approach collects fingerprints of environmental information through mobile smart terminals and judges the user's scene by matching against a fingerprint model; a central server is required to store the model data and respond to user requests in a timely fashion, so when the user's network condition is poor it is difficult to judge the user's actual environment, the user experience suffers, and universality is low. The third approach distinguishes indoor and outdoor scenes with pattern recognition and image processing; it requires labeled pictures, imposes strict requirements on the accuracy of user input, has high computational complexity, and is difficult to promote on a large scale.
[0004] Therefore, the prior-art methods detect indoor and outdoor scene switching with high time delay and low detection accuracy, and the existing technology needs to be improved to provide a method that can accurately and effectively detect indoor and outdoor scenes.

Method used




Embodiment Construction

[0030] In order to make the purpose, technical solution, design method, and advantages of the present invention clearer, the present invention is further described in detail below through specific embodiments in conjunction with the accompanying drawings. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it.

[0031] According to an embodiment of the present invention, a method for detecting indoor and outdoor scenes for a mobile terminal is provided. In short, the method judges whether the mobile terminal has switched between indoor and outdoor scenes according to the indoor positioning result, satellite positioning information, and historical detection results. As shown in figure 1, the detection method of the present invention specifically comprises the following steps (an illustrative sketch of this decision flow is given after the step listing below):

[0032] Step S110, obtaining the indoor positioning result of the mobile terminal

[0033] Indoor positioning...
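
The decision flow outlined in paragraph [0031] can be illustrated with a short sketch. This is a minimal, hypothetical Python rendering, not the patented procedure: the field names (`matched`, `confidence`, `visible_satellites`, `mean_snr_db`) and all thresholds are assumptions introduced for illustration only.

```python
# Minimal sketch of the decision flow outlined in [0031]: combine the indoor
# positioning result (step S110), satellite positioning information, and the
# historical detection result to decide the current scene. All field names
# and thresholds below are illustrative assumptions, not the patent's values.

from dataclasses import dataclass
from typing import Optional


@dataclass
class IndoorFix:
    matched: bool       # whether an indoor positioning match was found
    confidence: float   # match confidence in [0, 1]


@dataclass
class SatelliteInfo:
    visible_satellites: int   # number of satellites currently tracked
    mean_snr_db: float        # average carrier-to-noise ratio


def detect_scene(indoor: Optional[IndoorFix],
                 satellite: Optional[SatelliteInfo],
                 previous_scene: str) -> str:
    """Return 'indoor' or 'outdoor', falling back to the historical result."""
    # Strong, plentiful satellite signals are a reliable outdoor indicator.
    if satellite and satellite.visible_satellites >= 6 and satellite.mean_snr_db >= 35.0:
        return "outdoor"
    # A confident indoor positioning match suggests an indoor scene.
    if indoor and indoor.matched and indoor.confidence >= 0.8:
        return "indoor"
    # Ambiguous evidence: keep the historical result to reduce the
    # possibility of a spurious indoor/outdoor switch.
    return previous_scene
```

For example, a confident indoor match with no usable satellite data would yield "indoor", while weak evidence on both sides simply preserves the previous detection result.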



Abstract

The invention provides a method for detecting indoor and outdoor scenes. The method includes: obtaining satellite positioning information about a mobile terminal by detecting a satellite positioning system; and determining whether the mobile terminal is currently in an indoor environment or an outdoor environment based on the satellite positioning information and a historical detection result that records whether the mobile terminal was previously in an indoor or outdoor scene. The method of the invention can improve the detection accuracy of indoor and outdoor scenes.
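
The abstract's use of a historical detection result alongside the satellite information can be illustrated with a simple debouncing sketch. This is only one plausible way to exploit the history, given as an assumption; the `switch_after` threshold and the per-epoch outdoor test are placeholders, not values from the patent.

```python
# Illustrative sketch (not the patented method): use the historical detection
# result to stabilize a per-epoch satellite-based judgement, so that a single
# noisy observation does not flip the reported scene. The switch_after
# threshold of 3 consecutive contradicting samples is an assumed placeholder.

class SceneTracker:
    def __init__(self, initial_scene: str = "outdoor", switch_after: int = 3):
        self.scene = initial_scene      # historical detection result
        self.switch_after = switch_after
        self._contrary = 0              # consecutive contradicting samples

    def update(self, satellite_says_outdoor: bool) -> str:
        observed = "outdoor" if satellite_says_outdoor else "indoor"
        if observed == self.scene:
            self._contrary = 0          # evidence agrees with history
        else:
            self._contrary += 1
            if self._contrary >= self.switch_after:
                self.scene = observed   # accept the indoor/outdoor switch
                self._contrary = 0
        return self.scene
```

Requiring several consecutive contradicting observations before switching is what would let a historical result reduce the possibility of misjudgment when satellite reception fluctuates briefly, for example near a window or doorway.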

Description

technical field

[0001] The invention relates to the field of communication technology, and in particular to a method for detecting indoor and outdoor scenes.

Background technique

[0002] With the rapid development of mobile smart terminals and wireless communication technologies, location-based services (LBS) have increasingly become an indispensable value-added service in the field of wireless mobile communications and a focus of attention and research. Correct environment perception can improve the accuracy of location determination. For example, the positioning methods used in indoor and outdoor scenarios are usually different, so if the user's indoor or outdoor environment can be correctly perceived, the corresponding positioning technology can be activated, thereby providing better location services for the user.

[0003] However, at present, there are still some problems in indoor and outdoor scene recognition o...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01S19/48
CPC: G01S19/48
Inventor: 罗海勇, 朱奕达, 阎硕, 赵方
Owner: INST OF COMPUTING TECH CHINESE ACAD OF SCI