Lens focus method and device, and mobile device

A lens focusing method and related technology in the electronic field. It addresses the problems that image sharpness is difficult to detect and that lens focusing takes too long, achieving a more accurate focusing distance and a more enjoyable photographing experience.

Inactive Publication Date: 2017-01-11
XIAOMI INC
Cites: 5 | Cited by: 22

AI-Extracted Technical Summary

Problems solved by technology

The focusing method of the related art is affected by the environment, the aperture size of the lens itself, and the sensing capability of the image sensor. ...


Abstract

The invention relates to a lens focusing method and device, and a mobile device, for improving focusing speed. The method comprises: determining a first distance between a lens of an image acquisition device and a photographed subject; determining, according to the first distance, the amount of movement the lens requires from its initial position; and adjusting the image distance between the image sensor of the image acquisition device and the lens according to that amount of movement. With this technical scheme, the focusing distance is more accurate on the basis of rapid focusing, and the image acquisition device can obtain a clearer image.

Examples

  • Experimental program (1)

Example Embodiment

[0077] The exemplary embodiments will be described in detail here, and examples thereof are shown in the accompanying drawings. When the following description refers to the accompanying drawings, unless otherwise indicated, the same numbers in different drawings represent the same or similar elements. The implementation manners described in the following exemplary embodiments do not represent all implementation manners consistent with the present invention. Rather, they are merely examples of devices and methods consistent with some aspects of the present invention as detailed in the appended claims.
[0078] Figure 1A is a flowchart of a lens focusing method according to an exemplary embodiment, and Figure 1B is a scene diagram of lens focusing according to an exemplary embodiment. The lens focusing method can be applied to a mobile device with a camera or video function (for example, a smart phone, a tablet computer, or a camera). As shown in Figure 1A, the lens focusing method includes the following steps S101-S103:
[0079] In step S101, the first distance between the lens of the image capture device and the object is determined.
[0080] In an embodiment, the first distance between the lens and the subject can be detected by an optical device already present on the mobile device (for example, an existing distance sensor); reusing this optical device reduces the number of openings on the surface of the mobile device and improves its appearance design.
[0081] In another embodiment, as shown in Figure 1B, an optical module 11 for determining the first distance can be added to the mobile device. The optical module 11 includes a light-emitting sub-module 111, a reflected-light receiving sub-module 112, and a distance determining sub-module 113. Determining the first distance with the optical module 11 makes the first distance more accurate. For details on how the optical module 11 determines the first distance, refer to the following embodiments.
[0082] In step S102, the movement amount of the lens that needs to be moved from the initial position is determined according to the first distance.
[0083] In one embodiment, the calculation module 12 shown in Figure 1B can determine the amount of movement the lens needs from the initial position. The amount of movement can be determined from the imaging formula 1/f = 1/u + 1/v, where f is the focal length of the lens, u is the object distance of the photographed subject (the first distance in this disclosure), and v is the distance between the center of the lens and the center of the image sensor (also called the image distance). For example, when the lens is in the initial position, the distance between the lens and the image sensor is l1; if that distance needs to be adjusted from l1 to v, the amount of movement the lens must make from the initial position is l1 - v, where the positive or negative sign indicates the direction in which the lens moves relative to the image sensor.
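To make the calculation in paragraph [0083] concrete, the following is a minimal Python sketch that solves the imaging formula for the image distance and returns the signed movement; the function name, units, and example values are illustrative assumptions, not values from the patent.

```python
def lens_movement(f_mm: float, u_mm: float, l1_mm: float) -> float:
    """Solve 1/f = 1/u + 1/v for the image distance v, then return the
    signed movement l1 - v the lens needs from its initial position."""
    if u_mm <= f_mm:
        raise ValueError("object distance must exceed the focal length")
    v_mm = 1.0 / (1.0 / f_mm - 1.0 / u_mm)  # image distance from the imaging formula
    return l1_mm - v_mm  # sign encodes the direction of movement

# Illustrative example: f = 4 mm lens, subject 500 mm away, lens currently
# 4.5 mm from the image sensor -> the lens must move about 0.468 mm.
print(f"move {lens_movement(4.0, 500.0, 4.5):.3f} mm")
```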
[0084] In step S103, the image distance between the image sensor and the lens of the image capture device is adjusted according to the amount of movement.
[0085] In one embodiment, as shown in Figure 1B, the calculation module 12 sends the amount of movement to the focusing module 13; the driving sub-module 131 in the focusing module 13 converts the amount of movement into a corresponding driving current and outputs it to the focusing sub-module 132, so that the focusing sub-module 132 drives the lens nearer to or farther from the image sensor according to the driving current, thereby achieving focusing.
[0086] In this embodiment, the amount of movement the lens needs from the initial position is determined according to the first distance between the lens of the image acquisition device and the subject, and the image distance between the image sensor and the lens is adjusted according to that amount of movement. Compared with the related art, which focuses by recognizing sharpness information in the image, this makes the focusing distance more accurate while keeping focusing fast, so the image acquisition device obtains clearer pictures and the user's enjoyment of taking photos is greatly improved.
[0087] In an embodiment, determining the first distance between the lens of the image capture device and the object may include:
[0088] Controlling the image acquisition device to emit the first emitted light of the set light intensity to the object;
[0089] Detecting the first reflected light that the subject reflects back to the image acquisition device in response to the first emitted light;
[0090] The first distance between the lens of the image acquisition device and the object is determined according to the intensity of the first reflected light.
[0091] In an embodiment, determining the first distance between the lens of the image capture device and the object may include:
[0092] Controlling the image acquisition device to emit the second emission light of the first set wavelength to the object;
[0093] When the second reflected light that the subject reflects back to the image acquisition device in response to the second emitted light is detected, counting the number of wavelengths of the second emitted light emitted by the image acquisition device;
[0094] The first distance between the lens of the image acquisition device and the object is determined according to the first set wavelength and the number of wavelengths.
[0095] In an embodiment, determining the first distance between the lens of the image capture device and the object may include:
[0096] Controlling the image acquisition device to emit a third emission light with a set amplitude and a second set wavelength to the photographed object;
[0097] When the third reflected light that the subject reflects back to the image acquisition device in response to the third emitted light is detected, determining the amplitude of the third reflected light relative to the third emitted light at the time of emission, and the phase corresponding to that amplitude;
[0098] The first distance between the lens of the image acquisition device and the object is determined according to the set amplitude, the period of the third emitted light, the amplitude and the phase corresponding to the amplitude.
[0099] In an embodiment, determining the amount of movement that the lens needs to move from the initial position according to the first distance may include:
[0100] Determine the image distance of the subject through the focal length of the lens and the first distance;
[0101] Determine the amount of movement the lens needs to move from the initial position according to the image distance.
[0102] In an embodiment, adjusting the image distance between the image sensor and the lens of the image acquisition device according to the amount of movement may include:
[0103] Determine the drive current used to drive the lens according to the amount of movement;
[0104] The lens is driven according to the driving current, so that the image distance between the lens and the image sensor of the image acquisition device conforms to the imaging formula.
[0105] In an embodiment, the method may further include:
[0106] Determine whether the first distance of the subject has changed;
[0107] If the first distance has changed, determining the changed second distance between the lens and the subject and the image distance of the focused lens;
[0108] The lens is refocused according to the second distance and the image distance.
[0109] For details on how to focus the lens, please refer to the subsequent embodiments.
[0110] So far, the above method provided by the embodiments of the present disclosure, compared with the related art that focuses by recognizing sharpness information in the image, makes the focusing distance more accurate on the basis of rapid focusing, so that the image acquisition device obtains clearer pictures and the user's enjoyment of taking photos is greatly improved.
[0111] Specific embodiments are used below to illustrate the technical solutions provided by the embodiments of the present disclosure.
[0112] Figure 2A is a flowchart of a lens focusing method according to exemplary embodiment one, and Figure 2B is a schematic curve of light intensity versus object distance according to exemplary embodiment one. This embodiment uses the above method provided by the embodiments of the present disclosure, taking as an example how to determine the first distance from the light intensities of the emitted and reflected light, and is described with reference to Figure 1B. As shown in Figure 2A, the method includes the following steps:
[0113] In step S201, the image capture device is controlled to emit the first emission light of the set light intensity to the object.
[0114] In step S202, the first reflected light that the subject reflects back to the image acquisition device in response to the first emitted light is detected.
[0115] In step S203, the first distance between the lens of the image acquisition device and the object is determined according to the intensity of the first reflected light.
[0116] In steps S201 to S203, the light-emitting sub-module 111 of the mobile device may emit the first emitted light with the set light intensity toward the subject; when the reflected-light receiving sub-module 112 detects the first reflected light reflected by the subject, the distance determining sub-module 113 judges the distance of the subject from the intensity of the first reflected light. In one embodiment, the stronger the light intensity received by the reflected-light receiving sub-module 112, the closer the first distance between the subject and the lens; the weaker the received light intensity, the farther that first distance. The curve relating the intensity of the first reflected light to the first distance is shown in Figure 2B: when the subject is detected at point A, the light intensity of the first reflected light received by the reflected-light receiving sub-module 112 is x1 and the first distance between the subject and the lens is L, and the formula relating the first distance to the intensity of the first reflected light is:
[0117] L = 1 + log_a(x1)  (0 < a < 1)   Formula (1)
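A minimal sketch of Formula (1), assuming the truncated condition is 0 < a < 1 (so that stronger reflected light maps to a shorter distance) and that the base a is a per-sensor calibration constant; both assumptions go beyond what the text states explicitly.

```python
import math

def distance_from_intensity(x1: float, a: float = 0.5) -> float:
    """Formula (1): L = 1 + log_a(x1).  With 0 < a < 1 the distance L
    shrinks as the received intensity x1 grows.  The base a = 0.5 is a
    hypothetical calibration constant, not a value from the patent."""
    if not (0.0 < a < 1.0) or x1 <= 0.0:
        raise ValueError("require 0 < a < 1 and x1 > 0")
    return 1.0 + math.log(x1, a)
```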
[0118] In step S204, the image distance of the subject is determined by the focal length of the lens and the first distance.
[0119] In step S205, the amount of movement required to move the lens from the initial position is determined according to the image distance.
[0120] For the description of step S204 and step S205, please refer to the description of step S102, which will not be described in detail here.
[0121] In step S206, the drive current for driving the lens is determined according to the amount of movement.
[0122] In step S207, the lens is driven according to the driving current, so that the image distance between the lens and the image sensor of the image acquisition device conforms to the imaging formula.
[0123] In steps S204 and S205, in one embodiment, a correspondence table between the amount of movement and the driving current can be established; for example, a movement of 1 mm corresponds to a driving current of 1 mA, and a movement of 1.5 mm corresponds to 1.2 mA. After the amount of movement is determined, the required driving current can then be looked up in the table instead of being calculated from the amount of movement, reducing the computational load on the mobile device. In an embodiment, the correspondence table can be determined by experiments on different types of lenses and different image sensors.
[0124] In another embodiment, a mapping between the driving current and the amount of movement may be established and used to determine the magnitude of the driving current, for example I = K/ΔL, where I is the driving current, ΔL is the amount of movement, and K is a fixed conversion coefficient.
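A sketch combining the two embodiments of paragraphs [0123] and [0124]: consult the correspondence table first, and fall back to the mapping I = K/ΔL. The value of K and the fallback strategy itself are illustrative assumptions, not part of the patent.

```python
# Correspondence table from paragraph [0123]: movement (mm) -> current (mA).
# Only these two entries appear in the text; further entries would come
# from experiments on the specific lens and image sensor.
MOVEMENT_TO_CURRENT_MA = {1.0: 1.0, 1.5: 1.2}

def drive_current_ma(movement_mm: float, k: float = 1.2) -> float:
    """Look the driving current up in the table to avoid run-time
    calculation; otherwise fall back to the mapping I = K / dL from
    paragraph [0124].  K = 1.2 is a hypothetical conversion coefficient."""
    delta = abs(movement_mm)
    if delta in MOVEMENT_TO_CURRENT_MA:
        return MOVEMENT_TO_CURRENT_MA[delta]
    return k / delta
```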
[0125] In this embodiment, the first reflected light that the subject reflects back to the image acquisition device in response to the first emitted light is detected, and the first distance between the lens of the image acquisition device and the subject is determined from the intensity of the first reflected light. Compared with the related art, which focuses by recognizing sharpness information in the image, this makes the focusing distance more accurate while keeping focusing fast, so the image acquisition device obtains clearer pictures and the user's enjoyment of taking photos is greatly improved.
[0126] Figure 3A is a flowchart of a lens focusing method according to exemplary embodiment two, and Figure 3B is a schematic diagram of the distance and amplitude of the emitted light according to exemplary embodiment two. This embodiment uses the above method provided by the embodiments of the present disclosure, taking as an example how to determine the first distance by counting wavelengths of the emitted light, and is described with reference to Figure 1B. As shown in Figure 3A, the method includes the following steps:
[0127] In step S301, the image capture device is controlled to emit the second emission light of the first set wavelength to the object.
[0128] In step S302, when the second reflected light that the subject reflects back to the image acquisition device in response to the second emitted light is detected, the number of wavelengths of the second emitted light emitted by the image acquisition device is counted.
[0129] In step S303, the first distance between the lens of the image capture device and the object is determined according to the first set wavelength and the number of wavelengths.
[0130] In steps S301 to S303, the first distance between the lens and the subject can be calculated by counting the light waves of the second emitted light emitted by the light-emitting sub-module 111. The light-emitting sub-module 111 starts counting from the first wavelength of the second emitted light; when the reflected-light receiving sub-module 112 detects the second emitted light reflected back to it by the subject during transmission, the counting stops, giving the number N of wavelengths of the second emitted light (of wavelength λ) emitted by the light-emitting sub-module 111 during this period. As shown in Figure 3B, the first distance L between the lens and the subject can then be calculated; the formula relating the first distance, the wavelength λ, and the number of wavelengths N is:
[0131] L = N × λ / 2   Formula (2)
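Formula (2) follows because the N counted wavelengths span the round trip from the lens to the subject and back, so the one-way distance is half. A minimal sketch, with purely illustrative example values:

```python
def distance_from_wave_count(n_wavelengths: int, wavelength_m: float) -> float:
    """Formula (2): L = N * wavelength / 2.  The N counted wavelengths cover
    the round trip lens -> subject -> lens, so the one-way distance is half."""
    return n_wavelengths * wavelength_m / 2.0

# Illustrative example: 1,200,000 wavelengths of 850 nm light counted before
# the echo returns gives a distance of about 0.51 m.
print(distance_from_wave_count(1_200_000, 850e-9))
```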
[0132] In step S304, the image distance of the subject is determined by the focal length of the lens and the first distance.
[0133] In step S305, the amount of movement required to move the lens from the initial position is determined according to the image distance.
[0134] In step S306, the image distance between the image sensor and the lens of the image acquisition device is adjusted according to the amount of movement.
[0135] For the description of step S304 to step S306, refer to the description of step S102 to step S103 above, which will not be described in detail here.
[0136] In this embodiment, when the second reflected light that the subject reflects back to the image acquisition device in response to the second emitted light is detected, the first distance between the lens of the image acquisition device and the subject is determined from the first set wavelength and the counted number of wavelengths of the second emitted light. Compared with the related art, which focuses by recognizing sharpness information in the image, this makes the focusing distance more accurate while keeping focusing fast, so the image acquisition device obtains clearer pictures and the user's enjoyment of taking photos is greatly improved.
[0137] Figure 4A is a flowchart of a lens focusing method according to exemplary embodiment three, and Figure 4B is a schematic diagram of the phase and amplitude of the emitted light according to exemplary embodiment three. This embodiment uses the above method provided by the embodiments of the present disclosure, taking as an example how to determine the first distance from the phase of the reflected light relative to the emitted light, and is described with reference to Figure 1B. As shown in Figure 4A, the method includes the following steps:
[0138] In step S401, the image capture device is controlled to emit a third emission light with a set amplitude and a second set wavelength to the subject.
[0139] In step S402, when the third reflected light that the subject reflects back to the image acquisition device in response to the third emitted light is detected, the amplitude of the third reflected light relative to the third emitted light at the time of emission, and the phase corresponding to that amplitude, are determined.
[0140] In step S403, the first distance between the lens of the image acquisition device and the object is determined according to the set amplitude, the period of the third emitted light, the amplitude and the phase corresponding to the amplitude.
[0141] In steps S401 to S403, the first distance can be calculated from the phase-shift information of the light-emitting sub-module 111. The light-emitting sub-module 111 can emit a third emitted light with a fixed amplitude E, a period T, a speed v, and a wavelength λ; the wavelength is relatively long, and the distance of the detected subject is judged from the phase difference within the half period T/2. As shown in Figure 4B, the light-emitting sub-module 111 may start emitting the third emitted light at its highest amplitude E, and stop timing when the reflected-light receiving sub-module 112 receives an amplitude y at point A. In an embodiment, the phase angle x2 at A can be obtained from y = E sin x2. The flight time from emission to reception at the reflected-light receiving sub-module 112 is then t = (x2/2π)·T, from which the first distance between the lens and the subject is L = v·t/2.
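A sketch of the phase-shift computation, under the reconstruction above that the flight time is the fraction x2/(2π) of one period T and that the one-way distance halves the round trip; the original formulas were lost in extraction, so this reading is an assumption consistent with the surrounding definitions. The arcsine step is only valid while the echo returns within the first quarter period.

```python
import math

def distance_from_phase(y: float, e: float, period_s: float,
                        v_m_s: float = 3.0e8) -> float:
    """Recover the phase angle x2 from y = E*sin(x2), convert it to a
    flight time t = (x2 / 2*pi) * T, and halve the round trip:
    L = v * t / 2.  Assumes the echo returns within the first quarter
    period, so asin is unambiguous."""
    x2 = math.asin(y / e)                  # phase angle at point A
    t = (x2 / (2.0 * math.pi)) * period_s  # flight time within one period
    return v_m_s * t / 2.0                 # one-way distance
```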
[0142] In step S404, the movement amount of the lens that needs to be moved from the initial position is determined according to the first distance.
[0143] In step S405, the image distance between the image sensor and the lens of the image acquisition device is adjusted according to the amount of movement.
[0144] The description of step S404 and step S405 can refer to the description of step S102 and step S103 above, which will not be described in detail here.
[0145] In step S406, it is determined whether the first distance of the subject has changed.
[0146] In step S407, if the first distance has changed, the changed second distance between the lens and the subject and the image distance of the focused lens are determined.
[0147] In step S408, the lens is focused again according to the second distance and the image distance.
[0148] In an embodiment, the light-emitting sub-module 111 may keep emitting the third emitted light, and the reflected-light receiving sub-module 112 may keep receiving the third reflected light reflected by the subject, determining from it whether the first distance has changed. When the first distance changes, the changed second distance between the lens and the subject and the image distance of the focused lens are determined through the above embodiments, and the lens is refocused according to the second distance and the image distance using the focusing method of the above embodiments. Those skilled in the art can understand that after focusing the lens according to the embodiments of Figure 2A and Figure 3A, real-time focusing of the lens can likewise be achieved by executing steps S406 to S408.
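Steps S406 to S408 amount to a monitoring loop. A hypothetical sketch follows, where measure_distance() and focus_to() stand in for the optical module 11 and focusing module 13; neither name, nor the tolerance and polling values, comes from the source.

```python
import time

def track_focus(measure_distance, focus_to,
                tol_mm: float = 1.0, poll_s: float = 0.05) -> None:
    """Keep measuring the subject distance and refocus when it drifts,
    mirroring steps S406-S408.  measure_distance() and focus_to() are
    hypothetical stand-ins for the optical module 11 and focusing module 13."""
    first = measure_distance()       # initial first distance (step S101)
    focus_to(first)
    while True:
        second = measure_distance()  # S406: has the first distance changed?
        if abs(second - first) > tol_mm:
            focus_to(second)         # S407/S408: refocus to the new distance
            first = second
        time.sleep(poll_s)
```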
[0149] In this embodiment, the first distance between the lens of the image acquisition device and the subject is determined from the set amplitude, the period of the third emitted light, the amplitude, and the phase corresponding to that amplitude. Compared with the related art, which focuses by recognizing sharpness information in the image, this makes the focusing distance more accurate while keeping focusing fast, so the image acquisition device obtains clearer pictures and the user's enjoyment of taking photos is greatly improved. In addition, after it is determined that the first distance of the subject has changed, the lens is refocused according to the second distance and the image distance, ensuring that the lens can still be focused in real time after focus is lost due to movement of the mobile device or the subject, improving the user experience.
[0150] Figure 5 is a block diagram of a lens focusing device according to an exemplary embodiment. As shown in Figure 5, the lens focusing device includes:
[0151] The first determining module 51 is configured to determine the first distance between the lens of the image acquisition device and the subject;
[0152] The second determining module 52 is configured to determine the amount of movement that the lens needs to move from the initial position according to the first distance determined by the first determining module 51;
[0153] The adjustment module 53 is configured to adjust the image distance between the image sensor and the lens of the image acquisition device according to the amount of movement determined by the second determination module 52.
[0154] Figure 6 is a block diagram of another lens focusing device according to an exemplary embodiment. As shown in Figure 6, on the basis of the embodiment shown in Figure 5, the first determining module 51 may include:
[0155] The first control sub-module 511 is configured to control the image acquisition device to emit the first emitted light with the set light intensity to the object;
[0156] The detection sub-module 512 is configured to detect the first reflected light that the subject reflects back to the image acquisition device in response to the first emitted light controlled by the first control sub-module 511;
[0157] The first determining sub-module 513 is configured to determine the first distance between the lens of the image acquisition device and the object according to the intensity of the first reflected light detected by the detecting sub-module 512.
[0158] In an embodiment, the first determining module 51 may include:
[0159] The second control sub-module 514 is configured to control the image acquisition device to emit the second emission light of the first set wavelength to the object;
[0160] The statistics sub-module 515 is configured to count the number of wavelengths of the second emitted light emitted by the image acquisition device when the second reflected light that the subject reflects back to the image acquisition device, in response to the second emitted light controlled by the second control sub-module 514, is detected;
[0161] The second determining sub-module 516 is configured to determine the first distance between the lens of the image acquisition device and the object according to the first set wavelength and the number of wavelengths counted by the counting sub-module 515.
[0162] In an embodiment, the first determining module 51 may include:
[0163] The third control sub-module 517 is configured to control the image acquisition device to emit the third emission light of the set amplitude and the second set wavelength to the object;
[0164] The third determining sub-module 518 is configured to, when the third reflected light that the subject reflects back to the image acquisition device in response to the third emitted light controlled by the third control sub-module 517 is detected, determine the amplitude of the third reflected light relative to the third emitted light at the time of emission and the phase corresponding to that amplitude;
[0165] The fourth determining sub-module 519 is configured to determine the first distance between the lens of the image acquisition device and the subject according to the set amplitude, the period of the third emitted light, and the amplitude and corresponding phase determined by the third determining sub-module 518.
[0166] In an embodiment, the second determining module 52 may include:
[0167] The fifth determining sub-module 521 is configured to determine the image distance of the subject through the focal length of the lens and the first distance;
[0168] The sixth determining sub-module 522 is configured to determine the amount of movement of the lens that needs to be moved from the initial position according to the image distance determined by the fifth determining sub-module 521.
[0169] In an embodiment, the adjustment module 53 may include:
[0170] The seventh determining sub-module 531 is configured to determine a driving current for driving the lens according to the amount of movement;
[0171] The driving sub-module 532 is configured to drive the lens according to the driving current determined by the seventh determining sub-module 531 so that the image distance between the lens and the image sensor of the image acquisition device conforms to the imaging formula.
[0172] In an embodiment, the device may further include:
[0173] The third determining module 54 is configured to determine whether the first distance of the subject determined by the first determining module 51 has changed;
[0174] The fourth determining module 55 is configured to, if the third determining module 54 determines that the first distance has changed, determine the changed second distance between the lens and the subject and the image distance of the focused lens;
[0175] The adjustment module 53 is also configured to refocus the lens according to the second distance and the image distance determined by the fourth determination module 55.
[0176] Regarding the device in the foregoing embodiment, the specific manner in which each module performs operation has been described in detail in the embodiment of the method, and detailed description will not be given here.
[0177] Figure 7 is a block diagram of a mobile device according to an exemplary embodiment. For example, the mobile device 700 may be a mobile phone, a computer, a digital broadcasting terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, and the like.
[0178] Referring to Figure 7, the mobile device 700 may include one or more of the following components: a processing component 702, a memory 704, a power supply component 706, a multimedia component 708, an audio component 710, an input/output (I/O) interface 712, a sensor component 714, and a communication component 716.
[0179] The processing component 702 generally controls the overall operations of the mobile device 700, such as operations associated with display, phone calls, data communications, camera operations, and recording operations. The processing component 702 may include one or more processors 720 to execute instructions to complete all or part of the steps of the foregoing method. In addition, the processing component 702 may include one or more modules to facilitate the interaction between the processing component 702 and other components. For example, the processing component 702 may include a multimedia module to facilitate the interaction between the multimedia component 708 and the processing component 702.
[0180] The memory 704 is configured to store various types of data to support operations in the mobile device 700. Examples of such data include instructions for any application or method operating on the mobile device 700, contact data, phone book data, messages, pictures, videos, etc. The memory 704 can be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
[0181] The power component 706 provides power to various components of the mobile device 700. The power component 706 may include a power management system, one or more power supplies, and other components associated with the generation, management, and distribution of power for the mobile device 700.
[0182] The multimedia component 708 includes a screen that provides an output interface between the mobile device 700 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touch, sliding, and gestures on the touch panel. The touch sensor may not only sense the boundary of the touch or slide action, but also detect the duration and pressure related to the touch or slide operation. In some embodiments, the multimedia component 708 includes a front camera and/or a rear camera. When the mobile device 700 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera can be a fixed optical lens system or have focusing and optical zoom capability.
[0183] The audio component 710 is configured to output and/or input audio signals. For example, the audio component 710 includes a microphone (MIC). When the mobile device 700 is in an operation mode, such as a call mode, a recording mode, and a voice recognition mode, the microphone is configured to receive external audio signals. The received audio signal can be further stored in the memory 704 or sent via the communication component 716. In some embodiments, the audio component 710 further includes a speaker for outputting audio signals.
[0184] The I/O interface 712 provides an interface between the processing component 702 and a peripheral interface module. The above-mentioned peripheral interface module may be a keyboard, a click wheel, a button, and the like. These buttons may include but are not limited to: home button, volume button, start button, and lock button.
[0185] The sensor component 714 includes one or more sensors for providing the mobile device 700 with various aspects of state assessment. For example, the sensor component 714 can detect the open/closed state of the mobile device 700 and the relative positioning of components, such as the display and keypad of the mobile device 700; the sensor component 714 can also detect a change in the position of the mobile device 700 or one of its components, the presence or absence of contact between the user and the mobile device 700, the orientation or acceleration/deceleration of the mobile device 700, and a change in the temperature of the mobile device 700. The sensor component 714 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 714 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 714 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
[0186] The communication component 716 is configured to facilitate wired or wireless communication between the mobile device 700 and other devices. The mobile device 700 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 716 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 716 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
[0187] In an exemplary embodiment, the mobile device 700 may be implemented by one or more application-specific integrated circuits (ASIC), digital signal processors (DSP), digital signal processing devices (DSPD), programmable logic devices (PLD), field-programmable gate arrays (FPGA), controllers, microcontrollers, microprocessors, or other electronic components, to perform the above methods.
[0188] In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium including instructions, such as the memory 704 including instructions, which may be executed by the processor 720 of the mobile device 700 to complete the foregoing method. For example, the non-transitory computer-readable storage medium may be ROM, random access memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
[0189] After considering the specification and practicing the disclosure herein, those skilled in the art will readily conceive of other embodiments of the present disclosure. This application is intended to cover any variations, uses, or adaptations of the present disclosure that follow its general principles and include common knowledge or conventional technical means in the technical field not disclosed herein. The specification and the embodiments are to be regarded as exemplary only, with the true scope and spirit of the present disclosure indicated by the following claims.
[0190] It should be understood that the present disclosure is not limited to the precise structure that has been described above and shown in the drawings, and various modifications and changes can be made without departing from its scope. The scope of the present disclosure is only limited by the appended claims.