Image processing method and image processing device
An image processing method and device, applied in the field of image processing, solving the problems that a window cannot be given a frosted glass effect, that the image it occludes cannot be vaguely seen through it, and that little information is displayed, and achieving the effect of increasing the amount of displayed information
Active Publication Date: 2017-04-19
GUANGZHOU KUGOU TECH
8 Cites 7 Cited by
AI-Extracted Technical Summary
Problems solved by technology
[0003] However, a terminal running the Windows XP (Experience) system cannot process an image into an image with a frosted glass effect
Therefore, when the terminal of the Windows XP system displays the window of the application program, since the terminal cannot process the image in t...
Method used
In an embodiment of the present invention, when the terminal detects that the window of the application program changes, it obtains the target image of the area occluded by the current first window of the application program; performs Gaussian blur processing on the target image to obtain a Gaussian blurred image; and draws the Gaussian blurred image on the first window as the base map of the first window, so that the first window achieves a frosted glass effect. The terminal not only displays the image in the first window, but also displays the Gaussian blurred image corresponding to the target image, so that the user can vaguely see the target image in the area occluded by the first window through the first window, thereby increasing the amount of information displayed by the first window.
Abstract
The invention discloses an image processing method and an image processing device, and belongs to the technical field of image processing. The method comprises the following steps: when detecting that a current window of an application changes, acquiring a target image in a first target area, wherein the first target area is an area blocked by a current first window of the application; performing Gaussian blurring on the target image to get a Gaussian blur image; drawing the Gaussian blur image on the first window as the base drawing of the first window to achieve an effect of frosted glass on the first window. The device comprises an acquiring module, a processing module, and a drawing module. In the invention, Gaussian blurring is performed on a target image to get a Gaussian blur image, and the Gaussian blur image is drawn on the first window as the base drawing of the first window to achieve an effect of frosted glass on the first window. The amount of information displayed via the first window is increased.
Application Domain
Image enhancement, Image analysis +2
Technology Topic
Frosted glass, Pattern recognition +3
Examples
- Experimental program(1)
Example Embodiment
[0038] In order to make the objectives, technical solutions and advantages of the present invention clearer, the embodiments of the present invention will be described in further detail below in conjunction with the accompanying drawings.
[0039] The embodiment of the present invention provides an image processing method. The execution subject of the method may be a terminal of a designated system, where the designated system may be Windows XP, Windows 2003, or the like. Referring to Figure 1, the method includes:
[0040] Step 101: When it is detected that the current window of the application program has changed, a target image of a first target area is acquired, where the first target area is an area occluded by the current first window of the application program.
[0041] Step 102: Perform Gaussian blur processing on the target image to obtain a Gaussian blurred image.
[0042] Step 103: Draw the Gaussian blurred image as the base map of the first window on the first window, so that the first window achieves a frosted glass effect.
[0043] Optionally, the method further includes:
[0044] Taking a preset time length as the screenshot period, take screenshots of the target area on the terminal screen, and when two adjacent captured images are not the same, determine that the current window of the application has changed.
[0045] Optionally, acquiring the target image of the first target area includes:
[0046] A first frame of image obtained by taking a screenshot of the first target area at the current time is acquired, and the target image is determined according to the first frame of image.
[0047] Optionally, determining the target image according to the first frame of image includes:
[0048] Determine the first frame of image as the target image; or,
[0049] Acquire a second frame of image obtained by taking a screenshot of a second target area on the terminal screen in the previous cycle, where the second target area is the area occluded by the second window displayed by the application in the previous cycle; obtain the overlapping part of the first frame of image and the second frame of image, and determine the image in the first frame of image other than the overlapping part as the target image.
[0050] Optionally, performing Gaussian blur processing on the target image to obtain a Gaussian blurred image includes:
[0051] Obtain the frosted glass effect level selected by the user, and determine the parameter value of the Gaussian blur processing according to the frosted glass effect level;
[0052] According to the parameter value of the Gaussian blur processing, Gaussian blur processing is performed on the target image to obtain the Gaussian blurred image.
[0053] In the embodiment of the present invention, when the terminal detects that the window of the application program has changed, it acquires the target image of the area occluded by the current first window of the application program, performs Gaussian blur processing on the target image to obtain a Gaussian blurred image, and draws the Gaussian blurred image on the first window as the base map of the first window, so that the first window achieves a frosted glass effect. The terminal not only displays the image in the first window, but also displays the Gaussian blurred image corresponding to the target image, so that the user can vaguely see the target image in the area occluded by the first window through the first window, which increases the amount of information displayed in the first window.
[0054] The embodiment of the present invention provides an image processing method. The execution subject of the method may be a terminal of a designated system, where the designated system may be Windows XP, Windows 2003, or the like. Referring to Figure 2, the method includes:
[0055] Step 201: When the terminal detects that the current window of the application program has changed, it acquires a target image of a first target area, where the first target area is an area occluded by the current first window of the application program.
[0056] When the terminal detects that the current window of the application program has changed, it processes the current window of the application program to have a frosted glass effect according to the image processing method provided in the embodiment of the present invention. Therefore, in this step, the terminal needs to detect in real time whether the current window of the application has changed. The step for the terminal to detect whether the current window of the application program has changed may be:
[0057] The terminal takes the preset time length as the screenshot period and takes screenshots of the target area on the terminal screen; the terminal determines whether two adjacent captured images are the same; when the two adjacent captured images are not the same, the terminal determines that the current window of the application has changed; when the two adjacent captured images are the same, the terminal determines that the current window of the application has not changed.
[0058] The application program may be any application program installed on the terminal, for example a video playback application, an audio playback application, or a social application. The preset duration can be set and changed according to user needs, and the embodiment of the present invention does not specifically limit it; for example, the preset duration can be 1 second, 3 milliseconds, 0.1 milliseconds, and so on. If the image of the target area on the terminal screen is a dynamic image, the preset duration can also be set and changed according to the minimum change interval between every two adjacent frames of the dynamic image; for example, the preset duration may be less than that minimum change interval. In this way, even if the image of the target area is a dynamic image, the terminal can capture every frame included in the dynamic image.
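As a minimal sketch of this change-detection loop, assuming the occluded region's bounding box is already known and using Pillow's ImageGrab and NumPy (neither library is named in the text; the helper name is illustrative):

```python
import time

import numpy as np
from PIL import ImageGrab


def watch_for_window_change(target_bbox, period_s=0.1):
    """Capture target_bbox every period_s seconds and yield (current, previous)
    whenever two adjacent captures differ, i.e. the occluded content changed."""
    previous = None
    while True:
        current = np.asarray(ImageGrab.grab(bbox=target_bbox))  # screenshot of the area
        if previous is not None and not np.array_equal(previous, current):
            yield current, previous  # the window content changed between captures
        previous = current
        time.sleep(period_s)
```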
[0059] The target area is the area on the terminal screen that is occluded by the current window of the application. The step for the terminal to take a screenshot of the target area may be: the terminal obtains the handle of the application, and identifies, according to the handle, the target area where the application is located; the terminal obtains the window tree of all windows on the current terminal screen, which records the levels of all windows displayed on the current terminal screen; the terminal detects that the current window of the application is at a first level in the window tree, and according to the first level, obtains the second levels above the first level in the window tree; according to the second levels, obtains the windows corresponding to the second levels; and according to the target area, captures the images of the windows corresponding to the second levels within the target area.
[0060] For example, the window tree of all windows on the current terminal screen is B0, B1, B2; the first level of the current window of the application in the window tree is B2; the second levels above B2 in the window tree are the B0 and B1 layers. The terminal obtains the windows corresponding to the B0 and B1 layers, and captures the images of the windows corresponding to the B0 and B1 layers within the target area where the application program is located.
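A simplified sketch of walking such a window tree, using plain data structures in place of real window handles (the record layout and helper names are illustrative assumptions, not the Win32 API):

```python
from typing import List, NamedTuple, Optional, Tuple


class Win(NamedTuple):
    name: str
    level: int                       # index in the window tree (B0 = 0, B1 = 1, ...)
    rect: Tuple[int, int, int, int]  # (left, top, right, bottom) on screen


def windows_above(window_tree: List[Win], app_level: int) -> List[Win]:
    """Windows at the levels above the application's own level in the tree,
    e.g. B0 and B1 when the application's window sits at B2."""
    return [w for w in window_tree if w.level < app_level]


def clip_to_target(rect: Tuple[int, int, int, int],
                   target: Tuple[int, int, int, int]) -> Optional[Tuple[int, int, int, int]]:
    """Intersect a window rectangle with the occluded target area (None if disjoint);
    the clipped rectangle is the part of that window whose image would be captured."""
    left, top = max(rect[0], target[0]), max(rect[1], target[1])
    right, bottom = min(rect[2], target[2]), min(rect[3], target[3])
    return (left, top, right, bottom) if left < right and top < bottom else None
```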
[0061] In this step, the image obtained by taking a screenshot of the first target area at the current time is called the first frame of image, and the image obtained by taking a screenshot of the second target area on the terminal screen in the previous cycle is called the second frame of image, where the second target area is the area occluded by the second window displayed by the application in the previous cycle. The first frame of image and the second frame of image are then the two adjacent captured images, and the step for the terminal to determine whether the two adjacent captured images are the same can be implemented in the following first way or second way. Of course, in order to improve accuracy, the terminal can also combine the first way and the second way.
[0062] For the first implementation manner, the step for the terminal to determine whether two adjacent images obtained by interception are the same may be:
[0063] The terminal determines whether the size of the first frame of image is the same as the size of the second frame of image, and whether the position of the first frame of image on the terminal screen is the same as the position of the second frame of image on the terminal screen. If the size of the first frame of image is the same as the size of the second frame of image, and the position of the first frame of image on the terminal screen is the same as the position of the second frame of image on the terminal screen, the terminal determines that the first frame of image and the second frame of image are the same; if the size of the first frame of image is different from the size of the second frame of image, or the position of the first frame of image on the terminal screen is different from the position of the second frame of image on the terminal screen, the terminal determines that the first frame of image and the second frame of image are different.
[0064] For the second implementation manner, the step for the terminal to determine whether two adjacent images obtained by interception are the same may be:
[0065] The terminal obtains the image data of a first pixel in the first frame of image and the image data of a second pixel in the second frame of image; the terminal determines whether the image data of the first pixel is the same as the image data of the second pixel. If the image data of the first pixel is the same as the image data of the second pixel, the terminal determines that the first frame of image and the second frame of image are the same; if the image data of the first pixel is different from the image data of the second pixel, the terminal determines that the first frame of image and the second frame of image are different. The first pixel is any pixel in the first frame of image, and the second pixel is the pixel in the second frame of image at the same position as the first pixel.
[0066] In this step, the image data of the first pixel may be the RGB (Red Green Blue) value of the first pixel, or may be the gray value of the first pixel. The embodiment of the present invention does not specifically limit the way of representing image data. The image data of the second pixel point is represented in the same manner as the image data of the first pixel point, and will not be repeated here.
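A hedged sketch combining the two ways (the size/position check, then the per-pixel image-data check); representing each frame as a NumPy array together with its capture position is an illustrative choice, not something the text prescribes:

```python
import numpy as np


def frames_identical(frame1: np.ndarray, pos1, frame2: np.ndarray, pos2) -> bool:
    """Return True when two adjacent captures are considered the same."""
    # First way: a different size or on-screen position already means "changed".
    if frame1.shape != frame2.shape or pos1 != pos2:
        return False
    # Second way: compare the image data (RGB or gray values) pixel by pixel.
    return bool(np.array_equal(frame1, frame2))
```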
[0067] Further, if the first frame image and the second frame image are different, that is, when the terminal detects that the current window of the application program has changed, the terminal determines the target image of the first target area. Wherein, the step of determining the target image of the first target area by the terminal may be: the terminal obtains the first frame image obtained by taking a screenshot of the first target area at the current time, and determines the target image according to the first frame image.
[0068] In this step, the terminal can directly use the entire content of the first frame of image as the target image; the terminal can also use the part of the first frame of image that is different from the second frame of image as the target image. Accordingly, the step of determining the target image can be implemented by the following first implementation manner or second implementation manner.
[0069] For the first implementation manner, the step of determining the target image by the terminal according to the first frame of image may be: the terminal determines the first frame of image as the target image.
[0070] In the second implementation manner, the step of determining the target image by the terminal according to the first frame of image may be: the terminal obtains the second frame of image obtained by taking a screenshot of the second target area on the terminal screen in the previous cycle, obtains the overlapping part of the first frame of image and the second frame of image, and determines the image in the first frame of image other than the overlapping part as the target image.
[0071] In this step, the overlapping part of the first frame of image and the second frame of image includes the area occupied by all pixels of the first frame of image whose image data is the same as the image data at the same position in the second frame of image. The terminal obtains the area occupied by all pixels of the first frame of image other than the overlapping part, and uses the image of that area as the target image.
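A sketch of this second manner, under the assumption that both frames cover the same region and are H x W x C arrays of identical shape (the mask-based formulation is an illustrative simplification):

```python
import numpy as np


def non_overlapping_part(frame1: np.ndarray, frame2: np.ndarray) -> np.ndarray:
    """Keep only the pixels of frame1 whose image data differs from frame2."""
    changed = np.any(frame1 != frame2, axis=-1)   # True where the pixel data differs
    target = np.zeros_like(frame1)
    target[changed] = frame1[changed]             # copy only the changed (non-overlapping) pixels
    return target
```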
[0072] It should be noted that the application is installed on the terminal of the designated system, and the terminal will process the window of the application to have a frosted glass effect according to the method provided in the embodiment of the present invention only when the frosted glass effect of the application is enabled. Therefore, before detecting whether the current window of the application has changed, the terminal needs to determine whether the frosted glass effect of the application is in the on state; when the terminal determines that the frosted glass effect of the application is in the on state, it detects in real time whether the current window of the application has changed.
[0073] There is a transparency button in the application, and when the transparency set by the transparency button is greater than zero, the frosted glass effect of the application is in the on state. Therefore, the step for the terminal to determine whether the frosted glass effect of the application is in the on state can be: the terminal uses the preset duration as a cycle to obtain the transparency set by the transparency button, and detects whether the transparency is greater than zero. When the terminal detects that the transparency is greater than zero, the terminal determines that the frosted glass effect of the application is in the on state; when the terminal detects that the transparency is not greater than zero, the terminal determines that the frosted glass effect of the application is in the off state.
[0074] The frosted glass effect of the application program is the effect whereby the user can see, through the current window of the application program, the image of the area on the terminal screen that is occluded by the current window. The greater the transparency, the more clearly the user sees the image of the occluded area through the current window of the application; the smaller the transparency, the more blurred the image of the occluded area appears through the current window of the application.
[0075] Optionally, when the terminal detects that the frosted glass effect of the application is on, the terminal can start a timer with the preset duration, and the timer is used to remind the terminal to take screenshots of the target area: the timer sends a screenshot instruction to the terminal every preset duration, and the terminal receives the screenshot instruction sent by the timer and takes a screenshot of the target area on the terminal screen according to the screenshot instruction.
[0076] Step 202: The terminal performs Gaussian blur processing on the target image to obtain a Gaussian blurred image.
[0077] It should be noted that, in the embodiment of the present invention, a Gaussian blur method is used to blur the target image to achieve the frosted glass effect of the window. Therefore, this step can be implemented through the following steps 2021-2022.
[0078] Step 2021: The terminal obtains the frosted glass effect level selected by the user, and determines the parameter value of the Gaussian blur processing according to the frosted glass effect level.
[0079] The terminal stores the correspondence between the frosted glass effect level and the parameter value of the Gaussian blur processing. Correspondingly, this step can be: the terminal obtains the frosted glass effect level selected by the user, and according to the frosted glass effect level, obtains the parameter value of the Gaussian blur processing corresponding to that level from the correspondence between the frosted glass effect level and the parameter value of the Gaussian blur processing.
[0080] The frosted glass effect level can be expressed by transparency. The higher the transparency, the more clearly the user sees the area occluded by the first window through the current first window of the application, and the higher the corresponding frosted glass effect level; the lower the transparency, the more blurred the area occluded by the first window appears through the current first window of the application, and the lower the corresponding frosted glass effect level.
[0081] The parameter of the Gaussian blur processing can be expressed by the Gaussian radius. The higher the transparency, the higher the frosted glass effect level and the smaller the Gaussian radius, that is, the smaller the parameter value of the Gaussian blur; the lower the transparency, the lower the frosted glass effect level and the larger the Gaussian radius, that is, the greater the parameter value of the Gaussian blur.
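An illustrative lookup for this correspondence; the actual level/radius pairs are not given in the text, so the numbers below are invented purely to show the inverse relationship between effect level and radius:

```python
# Hypothetical frosted-glass-effect level -> Gaussian-radius correspondence; a higher
# level (more transparency) maps to a smaller radius, as described above.
LEVEL_TO_RADIUS = {5: 2, 4: 4, 3: 8, 2: 12, 1: 16}


def gaussian_radius_for_level(level: int) -> int:
    """Look up the Gaussian-blur parameter value stored for the selected effect level."""
    return LEVEL_TO_RADIUS[level]
```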
[0082] Step 2022: The terminal performs Gaussian blur processing on the target image according to the parameter value of the Gaussian blur processing to obtain the Gaussian blurred image.
[0083] In this step, for each pixel in the target image, the terminal extracts the first image data of the pixel, and determines the weight matrix of the pixel through the Gaussian function according to the parameter value of the Gaussian blur processing; the terminal determines the second image data of the pixel according to the first image data of the pixel and the weight matrix; and, in the target image, the terminal modifies the first image data of the pixel to the second image data, thereby generating the Gaussian blurred image.
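A minimal sketch of this weighting step, assuming NumPy and SciPy are available: the normalized Gaussian weight matrix is built from the radius and convolved with each channel of the target image. The radius-to-sigma choice is an assumption, not something the text specifies:

```python
import numpy as np
from scipy.ndimage import convolve


def gaussian_kernel(radius: int, sigma: float = None) -> np.ndarray:
    """Normalized Gaussian weight matrix of size (2*radius+1) x (2*radius+1)."""
    sigma = sigma if sigma is not None else max(radius / 2.0, 1e-6)  # assumed mapping
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    return kernel / kernel.sum()                                     # weights sum to 1


def gaussian_blur(image: np.ndarray, radius: int) -> np.ndarray:
    """Blur an H x W x C uint8 image channel by channel using the weight matrix."""
    kernel = gaussian_kernel(radius)
    channels = [convolve(image[..., c].astype(np.float64), kernel, mode="nearest")
                for c in range(image.shape[-1])]
    return np.stack(channels, axis=-1).clip(0, 255).astype(np.uint8)
```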
[0084] Further, the terminal stores the Gaussian blurred image in the memory of the terminal, and stores the correspondence between the storage path of the Gaussian blurred image and its image identifier, so that the terminal can subsequently use the image identifier of the Gaussian blurred image to obtain the storage path of the Gaussian blurred image from the correspondence between storage paths and image identifiers, and obtain the Gaussian blurred image from the memory of the terminal according to the storage path.
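A small sketch of such an identifier-to-path cache; the file layout, naming, and use of Pillow for saving are all assumptions for illustration:

```python
import os

import numpy as np
from PIL import Image

BLUR_CACHE = {}  # image identifier -> storage path of the blurred frame


def store_blurred_image(image_id: str, blurred: np.ndarray, cache_dir: str = "blur_cache") -> str:
    """Save the blurred frame and record its storage path under its identifier."""
    os.makedirs(cache_dir, exist_ok=True)
    path = os.path.join(cache_dir, f"{image_id}.png")
    Image.fromarray(blurred).save(path)
    BLUR_CACHE[image_id] = path
    return path


def load_blurred_image(image_id: str) -> np.ndarray:
    """Fetch a previously stored blurred frame by its identifier."""
    return np.asarray(Image.open(BLUR_CACHE[image_id]))
```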
[0085] Step 203: The terminal draws the Gaussian blurred image as the base map of the first window on the first window, so that the first window achieves a frosted glass effect.
[0086] The terminal obtains the image identifier of the Gaussian blurred image, obtains the storage path corresponding to the Gaussian blurred image from the correspondence between the storage path of the Gaussian blurred image and the image identifier, and obtains the Gaussian blurred image from the memory of the terminal according to the storage path.
[0087] It can be seen from step 201 that the target image may be the first frame image, or may be an image in the first frame image except for the overlapping portion of the first frame image and the second frame image. Therefore, the step of drawing the Gaussian blurred image on the first window by the terminal as the base map of the first window can be implemented by the following first implementation manner and the second implementation manner.
[0088] For the first implementation manner, the target image is the first frame of image, and after acquiring the Gaussian blurred image, the terminal directly draws the Gaussian blurred image as the base map of the first window on the first window of the application.
[0089] For the second implementation manner, the target image is the image in the first frame of image other than the overlapping part of the first frame of image and the second frame of image. Because the terminal stores the image identifier of each frame of Gaussian blurred image and the corresponding storage path, this step may be: the terminal acquires the first area in the first window corresponding to the image other than the overlapping part, and draws the Gaussian blurred image as the base map of the first area into the first area; the terminal acquires the second area in the first window corresponding to the overlapping part of the first frame of image and the second frame of image, obtains the Gaussian blurred image of the second window displayed by the application in the previous cycle, extracts the Gaussian blurred image corresponding to the image in the overlapping part of the first frame of image and the second frame of image, and draws that Gaussian blurred image as the base map of the second area into the second area.
[0090] Further, the terminal obtains the background image of the first window set by the user, and superimposes the background image on the current Gaussian blurred image of the first window, so that the first window achieves the frosted glass effect.
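A hedged sketch of this superimposing step; the exact compositing rule is not specified in the text, so a simple alpha blend driven by the selected transparency is assumed:

```python
import numpy as np


def compose_window(base_blur: np.ndarray, background: np.ndarray, alpha: float) -> np.ndarray:
    """Blend the user-set background over the blurred base map; alpha in [0, 1]
    is the weight given to the background image."""
    mixed = alpha * background.astype(np.float64) + (1.0 - alpha) * base_blur.astype(np.float64)
    return mixed.clip(0, 255).astype(np.uint8)
```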
[0091] Optionally, the user can adjust the frosted glass effect level, that is, the transparency, as needed at any time; the terminal detects the frosted glass effect level selected by the user and re-determines the parameter value of the Gaussian blur processing, that is, the value of the Gaussian radius, according to the frosted glass effect level.
[0092] Further, in step 201, the terminal can also detect in real time whether the frosted glass effect level of the application program has changed; when the terminal detects that the frosted glass effect level of the application program has changed, the terminal obtains the changed frosted glass effect level and, according to the changed frosted glass effect level, performs Gaussian blur processing on the target image according to steps 2021-2022 to obtain a changed Gaussian blurred image.
[0093] The terminal obtains the image data of all pixels of the current Gaussian blurred image of the first window, regenerates the Gaussian blurred image according to the value of the Gaussian radius, draws the regenerated Gaussian blurred image as the base map of the first window on the first window, and superimposes the background image of the first window set by the user on the Gaussian blurred image of the first window. In this way, the terminal achieves the frosted glass effect of the first window according to the transparency currently selected by the user.
[0094] In the embodiment of the present invention, when the terminal detects that the window of the application program has changed, it acquires the target image of the area occluded by the current first window of the application program, performs Gaussian blur processing on the target image to obtain a Gaussian blurred image, and draws the Gaussian blurred image on the first window as the base map of the first window, so that the first window achieves a frosted glass effect. The terminal not only displays the image in the first window, but also displays the Gaussian blurred image corresponding to the target image, so that the user can vaguely see the target image in the area occluded by the first window through the first window, which increases the amount of information displayed in the first window.
[0095] The embodiment of the present invention provides an image processing device, which can be applied to a terminal of a designated system to execute the steps in the above-mentioned image processing method, where the designated system may be Windows XP, Windows 2003, or the like. Referring to Figure 3, the device includes:
[0096] The obtaining module 301 is configured to obtain a target image of a first target area when it is detected that the current window of the application program has changed, where the first target area is an area occluded by the current first window of the application program;
[0097] The processing module 302 is configured to perform Gaussian blur processing on the target image to obtain a Gaussian blurred image;
[0098] The drawing module 303 is configured to draw the Gaussian blurred image as the base map of the first window on the first window, so that the first window achieves a frosted glass effect.
[0099] Optionally, the device further includes:
[0100] The determining module is used to take screenshots of the target area on the terminal screen with the preset time length as the screenshot period, and to determine that the current window of the application program has changed when two adjacent captured images are not the same.
[0101] Optionally, the obtaining module 301 includes:
[0102] The acquiring unit is configured to acquire the first frame of image obtained by taking a screenshot of the first target area at the current time;
[0103] The first determining unit is configured to determine the target image according to the first frame image.
[0104] Optionally, the first determining unit is further configured to determine the first frame of image as the target image; or,
[0105] The first determining unit is also used to obtain a second frame of image obtained by taking a screenshot of a second target area on the terminal screen in the previous cycle, where the second target area is the area occluded by the second window displayed by the application in the previous cycle; to obtain the overlapping part of the first frame of image and the second frame of image; and to determine the image in the first frame of image other than the overlapping part as the target image.
[0106] Optionally, the processing module 302 includes:
[0107] The second determining unit is used to obtain the frosted glass effect level selected by the user, and determine the parameter value of the Gaussian blur processing according to the frosted glass effect level;
[0108] The processing unit is configured to perform Gaussian blur processing on the target image according to the parameter value of the Gaussian blur processing to obtain the Gaussian blurred image.
[0109] In the embodiment of the present invention, when the terminal detects that the window of the application program has changed, it acquires the target image of the area occluded by the current first window of the application program, performs Gaussian blur processing on the target image to obtain a Gaussian blurred image, and draws the Gaussian blurred image on the first window as the base map of the first window, so that the first window achieves a frosted glass effect. The terminal not only displays the image in the first window, but also displays the Gaussian blurred image corresponding to the target image, so that the user can vaguely see the target image in the area occluded by the first window through the first window, which increases the amount of information displayed in the first window.
[0110] It should be noted that the image processing device provided in the above embodiment only uses the division of the above functional modules for illustration during image processing. In actual applications, the above functions can be allocated by different functional modules as needed. That is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above. In addition, the image processing apparatus provided in the foregoing embodiment and the image processing method embodiment belong to the same concept, and the specific implementation process is detailed in the method embodiment, and will not be repeated here.
[0111] Figure 4 is a schematic structural diagram of a terminal provided by an embodiment of the present invention. The terminal can be used to implement the functions performed by the terminal in the image processing method shown in the foregoing embodiments. Specifically:
[0112] The terminal 400 may include an RF (Radio Frequency) circuit 110, a memory 120 including one or more computer-readable storage media, an input unit 130, a display unit 140, a sensor 150, an audio circuit 160, a transmission module 170, a processor 180 having one or more processing cores, a power supply 190, and other components. Those skilled in the art can understand that the terminal structure shown in Figure 4 does not constitute a limitation on the terminal, and the terminal may include more or fewer components than shown in the figure, combine certain components, or use a different component arrangement. Among them:
[0113] The RF circuit 110 can be used for receiving and sending signals during information transmission or communication. In particular, after receiving downlink information from the base station, it hands the information to the one or more processors 180 for processing; in addition, it sends uplink data to the base station. Generally, the RF circuit 110 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a subscriber identity module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like. In addition, the RF circuit 110 can also communicate with the network and other terminals through wireless communication. The wireless communication can use any communication standard or protocol, including but not limited to GSM (Global System of Mobile communication), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), email, SMS (Short Messaging Service), etc.
[0114] The memory 120 may be used to store software programs and modules, such as the software programs and modules corresponding to the terminal shown in the above exemplary embodiments. The processor 180 executes various functional applications and data processing, such as video-based interaction, by running the software programs and modules stored in the memory 120. The memory 120 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playback function, an image playback function, etc.), and the like; the data storage area may store data created through the use of the terminal 400 (such as audio data, a phone book, etc.), and the like. In addition, the memory 120 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device. Correspondingly, the memory 120 may further include a memory controller to provide the processor 180 and the input unit 130 with access to the memory 120.
[0115] The input unit 130 may be used to receive input digital or character information, and to generate keyboard, mouse, joystick, optical, or trackball signal input related to user settings and function control. Specifically, the input unit 130 may include a touch-sensitive surface 131 and other input terminals 132. The touch-sensitive surface 131, also called a touch screen or a touchpad, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch-sensitive surface 131 using a finger, a stylus, or any other suitable object or accessory), and drive the corresponding connected device according to a preset program. Optionally, the touch-sensitive surface 131 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 180, and can receive and execute the commands sent by the processor 180. In addition, the touch-sensitive surface 131 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch-sensitive surface 131, the input unit 130 may also include other input terminals 132. Specifically, the other input terminals 132 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), a trackball, a mouse, and a joystick.
[0116] The display unit 140 may be used to display information input by the user or information provided to the user, as well as various graphical user interfaces of the terminal 400. These graphical user interfaces may be composed of graphics, text, icons, videos, and any combination thereof. The display unit 140 may include a display panel 141. Optionally, the display panel 141 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like. Further, the touch-sensitive surface 131 may cover the display panel 141. When the touch-sensitive surface 131 detects a touch operation on or near it, it transmits the operation to the processor 180 to determine the type of the touch event, and then the processor 180 provides a corresponding visual output on the display panel 141 according to the type of the touch event. Although in Figure 4 the touch-sensitive surface 131 and the display panel 141 are used as two independent components to realize the input and output functions, in some embodiments the touch-sensitive surface 131 and the display panel 141 may be integrated to realize the input and output functions.
[0117] The terminal 400 may also include at least one sensor 150, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display panel 141 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 141 and/or the backlight when the terminal 400 is moved to the ear. As a kind of motion sensor, the gravity acceleration sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used for applications that identify the posture of the mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as a pedometer or tapping), and so on. As for other sensors that may also be configured in the terminal 400, such as a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, they will not be repeated here.
[0118] The audio circuit 160, the speaker 161, and the microphone 162 can provide an audio interface between the user and the terminal 400. The audio circuit 160 can transmit the electrical signal converted from the received audio data to the speaker 161, which converts it into a sound signal for output; on the other hand, the microphone 162 converts the collected sound signal into an electrical signal, which is received by the audio circuit 160 and converted into audio data. After the audio data is processed by the processor 180, it is sent via the RF circuit 110 to, for example, another terminal, or is output to the memory 120 for further processing. The audio circuit 160 may also include an earphone jack to provide communication between a peripheral earphone and the terminal 400.
[0119] The terminal 400 can help users send and receive e-mails, browse web pages, and access streaming media through the transmission module 170, which provides users with wireless or wired broadband Internet access. Although Figure 4 shows the transmission module 170, it is understandable that it is not a necessary component of the terminal 400 and can be omitted as required without changing the essence of the invention.
[0120] The processor 180 is the control center of the terminal 400. It uses various interfaces and lines to connect the various parts of the entire mobile phone, and performs the various functions of the terminal 400 and processes data by running or executing the software programs and/or modules stored in the memory 120 and calling the data stored in the memory 120, so as to monitor the mobile phone as a whole. Optionally, the processor 180 may include one or more processing cores; preferably, the processor 180 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may not be integrated into the processor 180.
[0121] The terminal 400 also includes a power source 190 (such as a battery) for supplying power to various components. Preferably, the power source may be logically connected to the processor 180 through a power management system, so that functions such as charging, discharging, and power consumption management can be managed through the power management system. The power supply 190 may also include one or more DC or AC power supplies, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and any other components.
[0122] Although not shown, the terminal 400 may also include a camera, a Bluetooth module, and the like, which will not be repeated here. Specifically, in this embodiment, the display unit of the terminal is a touch screen display, and the terminal also includes a memory and one or more programs, where the one or more programs are stored in the memory and configured to be executed by one or more processors; the one or more programs include instructions for performing the above image processing method.
[0123] Those of ordinary skill in the art can understand that all or part of the steps in the foregoing embodiments can be implemented by hardware, or by a program instructing related hardware to be completed. The program can be stored in a computer-readable storage medium. The storage medium mentioned can be a read-only memory, a magnetic disk or an optical disk, etc.
[0124] The above descriptions are only preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.