Virtual button control method, terminal and computer storage medium
A virtual button control technology in the computer field, which addresses problems such as the inability to effectively control virtual keys and the resulting poor intelligence of terminals, and achieves the effect of improving terminal intelligence
Inactive Publication Date: 2018-11-06
NUBIA TECHNOLOGY CO LTD
AI-Extracted Technical Summary
Problems solved by technology
[0004] In view of this, the embodiments of the present invention are expected to provide a virtual button control method, terminal and computer storage medium, which solve the problem in the prior art that virtual buttons cannot be effectively controlled, resulting in poor terminal intelligence.
Abstract
The embodiment of the invention discloses a virtual button control method. The method comprises the steps of: determining a running scenario where a terminal is located, and detecting whether or not the running scenario conforms to a preset scenario; if yes, determining the virtual button associated with the running scenario; and changing the touch control mode of the virtual button. The embodiment of the invention further discloses the terminal and a computer storage medium, which solve the problem in the prior art that the virtual button cannot be effectively controlled and the intelligence of the terminal is therefore poor, thereby improving the intelligence of the terminal.
Application Domain
Video games; Input/output processes for data processing
Technology Topic
Control mode; Computer engineering
Examples
Experimental program (1)
Example Embodiment
[0043] The technical solutions in the embodiments of the present invention will be clearly and completely described below in conjunction with the drawings in the embodiments of the present invention.
[0044] It should be understood that the specific embodiments described herein are only used to explain the present invention, but not to limit the present invention.
[0045] In the following description, suffixes such as "module", "part" or "unit" used to indicate elements are adopted only to facilitate the description of the present invention and have no specific meaning in themselves. Therefore, "module", "part" and "unit" may be used interchangeably.
[0046] The terminal can be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as mobile phones, tablet computers, notebook computers, palmtop computers, personal digital assistants (Personal Digital Assistant, PDA), portable media players (Portable Media Player, PMP), navigation devices, wearable devices, smart bracelets and pedometers, and fixed terminals such as digital TVs and desktop computers.
[0047] The following description will take a mobile terminal as an example. Those skilled in the art will understand that, in addition to elements specifically used for mobile purposes, the construction according to the embodiments of the present invention can also be applied to fixed-type terminals.
[0048] See FIG. 1, which is a schematic diagram of the hardware structure of a mobile terminal implementing each embodiment of the present invention. The mobile terminal 100 may include: a radio frequency (RF) unit 101, a Wi-Fi module 102, an audio output unit 103, an A/V (Audio/Video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art can understand that the structure of the mobile terminal shown in FIG. 1 does not constitute a limitation on the mobile terminal, and the mobile terminal may include more or fewer components than shown in the figure, or combine some components, or arrange components differently.
[0049] Each component of the mobile terminal is specifically introduced below in conjunction with FIG. 1:
[0050] The radio frequency unit 101 can be used for receiving and sending signals during information transmission or communication. Specifically, after receiving downlink information from a base station, it passes the information to the processor 110 for processing; in addition, it sends uplink data to the base station. Generally, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with the network and other devices through wireless communication. The above-mentioned wireless communication can use any communication standard or protocol, including but not limited to Global System of Mobile Communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access 2000 (CDMA2000), Wideband Code Division Multiple Access (WCDMA), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Frequency Division Duplexing-Long Term Evolution (FDD-LTE) and Time Division Duplexing-Long Term Evolution (TDD-LTE), etc.
[0051] Wi-Fi is a short-range wireless transmission technology. Through the Wi-Fi module 102, the mobile terminal can help users send and receive emails, browse web pages, and access streaming media, providing users with wireless broadband Internet access. Although FIG. 1 shows the Wi-Fi module 102, it is understandable that it is not a necessary component of the mobile terminal and can be omitted as needed without changing the essence of the invention.
[0052] When the mobile terminal 100 is in the call signal receiving mode, call mode, recording mode, voice recognition mode, broadcast receiving mode, etc., the audio output unit 103 can convert audio data received by the radio frequency unit 101 or the Wi-Fi module 102, or stored in the memory 109, into an audio signal and output it as sound. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and so on.
[0053] The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a graphics processing unit (Graphics Processing Unit, GPU) 1041 and a microphone 1042. The graphics processing unit 1041 processes the image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processing unit 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the Wi-Fi module 102. The microphone 1042 can receive sound (audio data) in operation modes such as a telephone call mode, a recording mode and a voice recognition mode, and can process such sound into audio data. In the case of the telephone call mode, the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 for output. The microphone 1042 may implement various types of noise cancellation (or suppression) algorithms to eliminate (or suppress) noise or interference generated in the process of receiving and transmitting audio signals.
[0054] The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications that identify the posture of the mobile phone (such as horizontal/vertical screen switching, related games, and magnetometer posture calibration) and for vibration-recognition related functions (such as a pedometer or percussion detection); other sensors that may be configured on the mobile phone, such as a fingerprint sensor, pressure sensor, iris sensor, molecular sensor, gyroscope, barometer, hygrometer, thermometer and infrared sensor, will not be described here.
[0055] The display unit 106 is used to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
[0056] The user input unit 107 may be used to receive input numeric or character information, and to generate key signal input related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also called a touch screen, can collect touch operations by the user on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory), and drive the corresponding connection device according to a preset program. The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and then sends them to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 can be implemented in multiple types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may also include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), a trackball, a mouse, a joystick, etc., which are not specifically limited here.
[0057] Further, the touch panel 1071 can cover the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in FIG. 1 the touch panel 1071 and the display panel 1061 are shown as two independent components to realize the input and output functions of the mobile terminal, in some embodiments the touch panel 1071 and the display panel 1061 can be integrated to realize the input and output functions of the mobile terminal, which is not limited here.
[0058] The interface unit 108 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, a headphone port, etc. The interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the mobile terminal 100, or can be used to transfer data between the mobile terminal 100 and an external device.
[0059] The memory 109 can be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playback function, an image playback function, etc.), and the like; the data storage area may store data (such as audio data, a phone book, etc.) created according to the use of the mobile phone. In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
[0060] The processor 110 is the control center of the mobile terminal. It uses various interfaces and lines to connect the various parts of the entire mobile terminal, and performs various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, so as to monitor the mobile terminal as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, etc., and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may also not be integrated into the processor 110.
[0061] The mobile terminal 100 may also include a power source 111 (such as a battery) for supplying power to various components. Preferably, the power source 111 may be logically connected to the processor 110 through a power management system, so as to manage charging, discharging, power consumption and other functions through the power management system.
[0062] Although not shown in FIG. 1, the mobile terminal 100 may also include a Bluetooth module, etc., which will not be repeated here.
[0063] In order to facilitate the understanding of the embodiments of the present invention, the communication network system on which the mobile terminal of the present invention is based is described below.
[0064] See FIG. 2, which is an architecture diagram of a communication network system provided by an embodiment of the present invention. The communication network system is an LTE system of universal mobile communication technology. The LTE system includes a user equipment (User Equipment, UE) 201, an evolved UMTS terrestrial radio access network (Evolved UMTS Terrestrial Radio Access Network, E-UTRAN) 202, an evolved packet core network (Evolved Packet Core, EPC) 203, and an operator's IP service 204, which are connected in sequence.
[0065] Specifically, the UE 201 may be the aforementioned terminal 100, which will not be repeated here.
[0066] The E-UTRAN 202 includes an eNodeB 2021 and other eNodeBs 2022, etc. The eNodeB 2021 can be connected to the other eNodeBs 2022 through a backhaul (for example, an X2 interface), the eNodeB 2021 is connected to the EPC 203, and the eNodeB 2021 can provide the UE 201 with access to the EPC 203.
[0067] The EPC 203 may include a mobility management entity (Mobility Management Entity, MME) 2031, a home subscriber server (Home Subscriber Server, HSS) 2032, other MMEs 2033, a serving gateway (Serving GateWay, SGW) 2034, a packet data network gateway (PDN Gate Way, PGW) 2035, a Policy and Charging Rules Function (PCRF) 2036, and so on. Among them, the MME 2031 is a control node that processes signaling between the UE 201 and the EPC 203, and provides bearer and connection management. The HSS 2032 is used to provide some registers to manage functions such as a home location register (not shown in the figure), and saves some user-specific information about service features, data rates, etc. All user data can be sent through the SGW 2034. The PGW 2035 can provide IP address allocation and other functions for the UE 201. The PCRF 2036 is the policy and charging control policy decision point for service data flows and IP bearer resources; it selects and provides available policy and charging control decisions for a policy and charging enforcement function unit (not shown in the figure).
[0068] The IP service 204 may include the Internet, an intranet, an IP Multimedia Subsystem (IMS), or other IP services.
[0069] Although the LTE system is described above as an example, those skilled in the art should know that the present invention is not only applicable to LTE systems, but also applicable to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems, which is not limited here.
[0070] Based on the foregoing mobile terminal hardware structure and communication system, various embodiments of the present invention are proposed.
[0071] The embodiment of the present invention provides a method for controlling virtual buttons, which is applied to a terminal, and the terminal may be the mobile terminal in the above-mentioned embodiment. As shown in FIG. 3, the method includes the following steps:
[0072] Step 301: Determine the operating scenario where the terminal is located, and detect whether the operating scenario meets a preset scenario.
[0073] Among them, the terminal in the embodiment of the present invention may be a terminal used on the move, which in a broad sense includes a mobile phone, a notebook, a tablet computer, and even a vehicle-mounted computer; in most cases, however, it refers to a mobile phone, or a smart phone or tablet with multiple application functions. With the rapid development of integrated circuit technology, terminals already have strong processing capability, which opens up a broader development space for them. The above-mentioned terminals can access the Internet, are usually equipped with various operating systems, and can have various functions customized according to user needs.
[0074] The terminal determines the operating scenario in which it is located; for example, the terminal may obtain the operating mode in which it is located, and the operating mode may be the display mode of the terminal.
[0075] The terminal determining the operating scenario in which it is located may be the terminal obtaining the running scene of an application installed on it, and determining the running scene of the application as its own operating scenario. The applications in the terminal may include chat applications, camera applications, shopping applications, office applications, video applications, game applications, etc. When the terminal starts an application, the application runs in the foreground of the terminal, and at this time the terminal can obtain the running scene of the application. During the running of the application, the running scene obtained by the terminal may include the content displayed during the running of the application, the switching of the display mode during the running of the application, and the virtual keys corresponding to the display mode during the running of the application.
[0076] Exemplarily, when the application started by the terminal is a chat application, the running scene may be the chat content displayed on the terminal display interface during the running of the chat application. When the application started by the terminal is a camera application, the running scene may be a virtual button displayed on the terminal display interface during the running of the camera application, such as a virtual button for taking a photo. When the application started by the terminal is a video application, the running scene may be the multimedia content displayed on the terminal display interface during the running of the video application and the virtual buttons used by the user to control the display effect of the multimedia content. When the application started by the terminal is a game application, the running scene may be the switching of the display mode during the running of the game application, such as the application interface of the game application switching from a first display mode to a second display mode. The first display mode may be a portrait display mode and the second display mode may be a landscape display mode, where the portrait display mode refers to the mode in which the short side of the terminal screen is parallel to the horizontal plane, and the landscape display mode refers to the mode in which the long side of the terminal screen is parallel to the horizontal plane.
[0077] In the embodiment of the present invention, the preset scene may include preset content, preset display mode switching conditions, preset virtual keys corresponding to the display mode, and so on.
[0078] Step 302: If the running scene matches the preset scene, determine the virtual key associated with the running scene.
[0079] Exemplarily, the preset scene may be a preset switching situation of the display mode, such as switching from the portrait display mode to the landscape display mode. After the terminal obtains the running scene of the application, it compares the running scene with the preset scene. Assuming that the application currently running in the foreground of the terminal is a game application and the running scene obtained by the terminal indicates that the game application has switched from the portrait display mode to the landscape display mode, the terminal determines that the running scene matches the preset scene.
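The comparison between the obtained running scene and the preset scene can be sketched as follows. This is a minimal illustration only: the dictionary-based scene representation and all names are assumptions for the sketch, not part of the patent.

```python
# Hypothetical sketch of matching a running scene against a preset scene.
# The scene representation and names below are illustrative assumptions.

PRESET_SCENE = {"display_mode_switch": ("portrait", "landscape")}

def matches_preset_scene(running_scene):
    """Return True if the running scene conforms to the preset scene."""
    switch = running_scene.get("display_mode_switch")
    return switch == PRESET_SCENE["display_mode_switch"]

# A game application whose interface just switched from portrait to landscape:
game_scene = {"app": "game", "display_mode_switch": ("portrait", "landscape")}
chat_scene = {"app": "chat"}  # no display-mode switch observed
```

In this sketch, only a scene that reports the preset portrait-to-landscape switch is considered to match.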
[0080] It should be noted that the virtual key associated with the running scene can be a virtual key used to control the display effect of the content of the application interface during the running of the application, or it can be a virtual menu button, a virtual home button, or a virtual back button displayed on the display interface of the terminal during the running of the application; of course, the virtual keys associated with the running scene may also include several of the above-mentioned virtual keys.
[0081] Step 303: Change the touch mode of the virtual button.
[0082] Wherein, when the terminal determines that the running scene matches the preset scene, it then determines the virtual key associated with the running scene and changes the touch mode of the virtual key.
[0083] The terminal changes the touch mode of the virtual button; that is, the virtual button is changed from a first touch mode to a second touch mode, where the first touch mode and the second touch mode correspond to different touch parameters. In this way, the touch mode of the virtual button is adjusted intelligently, which improves the intelligence of the terminal.
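The change from a first touch mode to a second touch mode with different touch parameters can be sketched as below; the class, its fields, and the `taps_required` parameter are illustrative assumptions, not prescribed by the patent.

```python
# Illustrative sketch: a virtual button whose touch mode is a set of touch
# parameters; changing the mode remembers the previous one for a later restore.

class VirtualButton:
    def __init__(self, name, touch_mode):
        self.name = name
        self.touch_mode = touch_mode      # e.g. {"taps_required": 1}
        self.previous_mode = None         # remembered for a later restore

    def change_touch_mode(self, new_mode):
        """Switch from the first touch mode to the second touch mode."""
        self.previous_mode = self.touch_mode
        self.touch_mode = new_mode

back_button = VirtualButton("back", {"taps_required": 1})
back_button.change_touch_mode({"taps_required": 3})
```

After the change, the button responds according to the second touch mode while the first remains recorded.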
[0084] In the embodiment of the present invention, after the terminal changes the touch mode of the virtual buttons, the following steps may be performed:
[0085] Generate and output a prompt message indicating that the touch mode of the virtual key has been changed.
[0086] Among them, the prompt information includes description information of the changed touch mode. The terminal can display the prompt information on the current running interface of the application, so that the user immediately learns the adjusted touch mode of the virtual button on the terminal, and then touches the virtual button based on the new touch mode.
[0087] In another embodiment of the present invention, the running scene of the terminal is the running scene of the application running in the foreground of the terminal. After the terminal changes the touch mode of the virtual buttons, the following steps may be performed:
[0088] If it is detected that the application is switched from the foreground of the terminal to the background of the terminal, the touch mode of the virtual button is switched to the touch mode before the change.
[0089] Among them, after the terminal changes the touch mode of the virtual button, the user touches the virtual button using the changed touch mode. When the user exits the application from the foreground, the application runs in the background of the terminal; at this time, the terminal switches the touch mode of the virtual button back to the touch mode before the change, so that the user can use the original touch mode corresponding to the virtual button to control other applications or perform setting operations on the terminal.
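The restore-on-background behavior described above can be sketched as a hypothetical event handler; the dictionary-based button state and the handler name are assumptions made for illustration.

```python
# Illustrative sketch: restore a virtual button's original touch mode when the
# associated application is switched from the foreground to the background.

def on_app_switched_to_background(button_state):
    """Switch the touch mode back to the mode in effect before the change."""
    if button_state.get("previous_mode") is not None:
        button_state["mode"] = button_state["previous_mode"]
        button_state["previous_mode"] = None
    return button_state

# Changed mode (three taps) reverts to the original mode (one tap):
state = {"mode": {"taps_required": 3}, "previous_mode": {"taps_required": 1}}
state = on_app_switched_to_background(state)
```

The handler is idempotent: once restored, a second background event leaves the state unchanged.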
[0090] In another embodiment of the present invention, after the terminal changes the touch mode of the virtual buttons, the following steps may be performed:
[0091] First, if an application message is detected, the priority of the application message is obtained.
[0092] Wherein, the application message may be a message of any application in the terminal, such as a call message, a short message, or an email message. When the terminal detects the application message, it obtains the priority of the application message. It should be noted that the terminal can set the priority of the application message corresponding to an application based on the usage frequency of that application; of course, the terminal can also set the priority of the application message based on other methods, which is not specifically limited in the embodiment of the present invention.
[0093] Secondly, if the priority is greater than the preset priority, the feature information of the application message is acquired, and the feature information is displayed on the display interface of the preset application.
[0094] Wherein, when the terminal determines that the priority of the application message is greater than the preset priority, it acquires the feature information of the application message and displays the feature information on the display interface of the preset application. For example, when the user is playing a game and the terminal obtains an application message of the SMS application, if the terminal determines that the priority of the application message is greater than the preset priority, it obtains the keywords of the application message and displays the keywords in the game interface.
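The priority check and keyword display can be sketched as below. The priority table, the threshold value, and the naive keyword extraction are all assumptions for illustration; the patent does not prescribe how priorities or feature information are derived.

```python
# Illustrative sketch: show feature information (keywords) of a high-priority
# application message on the display interface of the running application.

PRESET_PRIORITY = 5
APP_PRIORITY = {"call": 9, "sms": 7, "email": 3}  # e.g. derived from usage frequency

def handle_app_message(app, text, overlay):
    """Append the message's keywords to the overlay if its priority is high enough."""
    if APP_PRIORITY.get(app, 0) > PRESET_PRIORITY:
        keywords = " ".join(text.split()[:3])  # naive "feature information" extraction
        overlay.append(keywords)
    return overlay

overlay = handle_app_message("sms", "Meeting moved to 3pm", [])
overlay = handle_app_message("email", "Weekly newsletter issue 12", overlay)
```

Here the SMS message (priority 7 > 5) is surfaced in the game interface, while the lower-priority email message is suppressed.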
[0095] In the virtual button control method provided by the embodiment of the present invention, when the terminal starts an application in the terminal, the terminal obtains the running scene of the application and detects whether the running scene meets a preset scene; if the running scene meets the preset scene, the terminal determines the virtual button associated with the running scene and changes the touch mode of the virtual button. That is, in the embodiment of the present invention, when the terminal determines that the operating scene it is in conforms to the preset scene, it can modify the touch mode of the virtual buttons associated with the operating scene. This solves the problem in the prior art that the virtual buttons cannot be effectively controlled, which leads to poor intelligence of the terminal, realizes effective control of the virtual buttons, and improves the intelligence of the terminal.
[0096] Based on the foregoing embodiment, the embodiment of the present invention provides a method for controlling virtual buttons, which is applied to a terminal. As shown in FIG. 4, the method includes the following steps:
[0097] Step 401: Determine the operating scenario where the terminal is located, and detect whether the operating scenario meets the preset scenario.
[0098] Step 402: If the running scene indicates that the terminal switches from a first display mode to a second display mode, determine that the running scene meets the preset scene, and determine the virtual key associated with the running scene.
[0099] Wherein, the layouts of the interface elements in the first display mode and the second display mode are different. Interface elements may include displayed characters, virtual keys, etc., and the layout of the interface elements represents the relative positional relationship between the interface elements.
[0100] Exemplarily, the first display mode may be a portrait display mode, and the second display mode may be a landscape display mode. The portrait display mode refers to the mode where the short side of the terminal screen is parallel to the horizontal plane, and the landscape display mode refers to the mode where the long side of the terminal screen is parallel to the horizontal plane. When the user is playing a game, if the terminal detects that the game interface is switched from the portrait display mode to the landscape display mode, the interface elements that were originally on the same horizontal plane in the portrait display mode are rearranged in the landscape display mode.
[0101] In the embodiment of the present invention, after step 402 in which, if the running scene indicates that the terminal switches from the first display mode to the second display mode, it is determined that the running scene conforms to the preset scene and the virtual key associated with the running scene is determined, either step 403 or step 404 may be performed:
[0102] Step 403: Obtain the target configuration file of the virtual button, and change the touch mode of the virtual button according to the target configuration file.
[0103] The virtual keys include virtual keys arranged in the first area of the display screen of the terminal. The target configuration file is used to adjust the correspondence between the touch operation of the virtual button and the touch response.
[0104] Exemplarily, as shown in FIG. 5, which is a schematic diagram of a game application interface, the first area can be the area where the first virtual button 51, the second virtual button 52 and the third virtual button 53 are located in FIG. 5 (the first area is the area marked with a dashed frame 50 in FIG. 5). It should be noted that the virtual buttons in the first area are used to control interface jumps. For example, the first virtual button 51 is used to jump to the previous interface of the current interface, the second virtual button 52 is used to jump to the system interface of the terminal, and the third virtual button 53 is used to jump to a thumbnail interface including all applications running in the terminal.
[0105] Changing the touch mode of the virtual buttons in the first area refers to changing the correspondence between the touch operation of a virtual button and the touch response. For example, before the change, if the user touches the first virtual button 51 once, the terminal executes a jump instruction to jump to the previous interface of the current interface; after the change, the terminal executes the jump instruction to jump to the previous interface of the current interface only when the user touches the first virtual button 51 three times.
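The changed correspondence between touch operation and touch response — requiring three touches instead of one before the jump fires — can be sketched as follows; the handler function and its return value are illustrative assumptions only.

```python
# Illustrative sketch: a tap handler that fires the jump response only after
# the number of taps required by the target configuration file.

def make_tap_handler(taps_required):
    """Build a handler whose touch response needs `taps_required` touches."""
    state = {"count": 0}
    def on_tap():
        state["count"] += 1
        if state["count"] >= taps_required:
            state["count"] = 0
            return "jump_to_previous_interface"
        return None  # not enough taps yet, no touch response
    return on_tap

tap = make_tap_handler(3)  # after the change: three touches trigger the jump
results = [tap(), tap(), tap()]
```

With `taps_required=1` this reproduces the behavior before the change; the target configuration file would simply supply the new parameter.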
[0106] Step 404: Map the function of the virtual button to a physical button, and do not display the virtual button in the running scene.
[0107] The virtual keys include virtual keys set in the second area of the display screen of the terminal, and the terminal includes physical keys, and the physical keys are set in an area other than the display screen of the terminal. The second area is different from the first area.
[0108] Exemplarily, as shown in FIG. 5, FIG. 6 and FIG. 7, the second area may be the area of the display interface in FIG. 5 and FIG. 6 other than the first area. The physical button of the terminal may be the fingerprint recognition button 60 on the back of the terminal in FIG. 6, and the virtual buttons in the second area include the first virtual selection button 71, the second virtual selection button 72, the third virtual selection button 73 and the fourth virtual selection button 74 in FIG. 7. The four virtual selection buttons in the second area are used to select a game character: for example, the first virtual selection button 71 is used to select character 1, the second virtual selection button 72 to select character 2, the third virtual selection button 73 to select character 3, and the fourth virtual selection button 74 to select character 4. The terminal can map the functions of the first virtual selection button 71, the second virtual selection button 72, the third virtual selection button 73 and the fourth virtual selection button 74 to the fingerprint recognition button 60, so that the virtual selection buttons are not displayed in the running scene; the four virtual selection buttons in FIG. 7 are drawn with dotted lines to indicate that, after the terminal maps their functions to the physical button, they are no longer displayed in the running scene. Further, the user can slide clockwise or counterclockwise on the fingerprint recognition button 60 to move among the four characters, and select a character by staying on it for a preset time, such as 1 minute; the bold border of character 4 in FIG. 7 indicates that this is the character the user has selected through the fingerprint recognition button 60.
In this way, related operations can be performed on the objects contained in the game interface without blocking the display interface of the terminal, avoiding the poor interaction effect caused by occlusion.
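The example above can be sketched in code. The following is a minimal illustration only: the class name `FingerprintKeyMapper`, the dwell timing, and the string labels are assumptions, since the embodiment describes behavior rather than an implementation.

```python
class FingerprintKeyMapper:
    """Sketch: map four virtual selection buttons onto one physical
    fingerprint recognition button that supports rotary sliding and
    dwell-time selection (hypothetical names and API)."""

    def __init__(self, characters, dwell_threshold=60.0):
        self.characters = list(characters)      # e.g. ["character 1", ...]
        self.index = 0                          # currently highlighted character
        self.dwell_threshold = dwell_threshold  # preset stay time, e.g. 60 s
        self.virtual_keys_visible = True

    def map_to_physical_key(self):
        # After the mapping, the virtual selection buttons are no
        # longer displayed in the running scene (dotted in FIG. 7).
        self.virtual_keys_visible = False

    def slide(self, direction):
        # Sliding clockwise moves forward through the characters,
        # counterclockwise moves backward; the selection wraps around.
        step = 1 if direction == "clockwise" else -1
        self.index = (self.index + step) % len(self.characters)
        return self.characters[self.index]

    def dwell(self, seconds):
        # Staying on a character for the preset time selects it.
        if seconds >= self.dwell_threshold:
            return self.characters[self.index]
        return None
```

For instance, sliding counterclockwise once from character 1 highlights character 4, and staying on it for the preset time selects it.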
[0109] It should be noted that, for the description of the same steps and the same content in this embodiment as those in other embodiments, reference may be made to the description in other embodiments, which will not be repeated here.
[0110] Based on the foregoing embodiments, an embodiment of the present invention provides a virtual button control method applied to a terminal. As shown in FIG. 8, the method includes the following steps:
[0111] Step 801: Determine the running scene where the terminal is located, and detect whether the running scene conforms to the preset scene.
[0112] Step 802: If the content displayed in the running scene conforms to the preset content and the virtual buttons are used to control the content display process, determine that the running scene conforms to the preset scene, and determine the virtual buttons associated with the running scene.
[0113] Here, assuming that the preset content is multimedia content, if the terminal detects that the content displayed in its running scene is multimedia content and that the virtual buttons are used to control the content display process, it determines that the running scene conforms to the preset scene and determines the virtual buttons associated with the running scene.
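Under the assumption that the preset content is multimedia content, the check in step 802 might be sketched as below; the dictionary fields and function names are illustrative, not from the source.

```python
def scene_matches_preset(scene, preset_content="multimedia"):
    """Step 802 sketch: the running scene conforms to the preset scene
    when the displayed content matches the preset content AND the
    virtual buttons are used to control the content display process."""
    return (scene.get("content_type") == preset_content
            and scene.get("keys_control_display", False))


def associated_virtual_keys(scene):
    """Return the virtual buttons associated with a conforming scene,
    e.g. the playback-control buttons of a multimedia player."""
    return scene.get("virtual_keys", []) if scene_matches_preset(scene) else []
```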
[0114] In the embodiment of the present invention, after it is determined in step 802 that the running scene conforms to the preset scene and the virtual buttons associated with the running scene are determined, either step 803 or step 804 may be performed:
[0115] Step 803: Obtain the target configuration file of the virtual button, and change the touch mode of the virtual button according to the target configuration file.
[0116] The virtual keys include virtual keys arranged in the first area of the display screen of the terminal.
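Step 803 could be sketched as loading a per-scene configuration file and applying it to the buttons. The JSON layout, the default mode `"tap"`, and the function names are assumptions; the source does not specify the file format.

```python
import json

def load_target_profile(path):
    """Read the target configuration file of the virtual buttons."""
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)

def apply_touch_mode(keys, profile):
    """Change each virtual button's touch mode (e.g. from a tap to a
    long press) according to the target configuration file."""
    modes = profile.get("touch_modes", {})
    return {key: modes.get(key, "tap") for key in keys}
```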
[0117] Step 804: Map the function of the virtual button to the physical button, and the virtual button is not displayed in the running scene.
[0118] The virtual keys include virtual keys set in the second area of the display screen of the terminal, and the terminal includes physical keys, and the physical keys are set in an area other than the display screen of the terminal.
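A generic sketch of step 804 follows; the `KeyRouter` class and its handler registry are invented for illustration, as the source describes the mapping only at the behavioral level.

```python
class KeyRouter:
    """Route the function of a virtual button to a physical button
    (step 804), hiding the virtual button in the running scene."""

    def __init__(self):
        self.handlers = {}     # button name -> function it triggers
        self.hidden = set()    # virtual buttons hidden from the scene

    def register_virtual(self, name, handler):
        self.handlers[name] = handler

    def map_to_physical(self, virtual_name, physical_name):
        # The physical button now triggers the virtual button's
        # function, and the virtual button is no longer displayed.
        self.handlers[physical_name] = self.handlers[virtual_name]
        self.hidden.add(virtual_name)

    def press(self, name):
        return self.handlers[name]()
```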
[0119] It should be noted that, for the description of the same steps and the same content in this embodiment as those in other embodiments, reference may be made to the description in other embodiments, which will not be repeated here.
[0120] Based on the foregoing embodiments, an embodiment of the present invention provides a virtual button control method applied to a terminal. As shown in FIG. 9, the method includes the following steps:
[0121] Step 901: Determine the running scene in which the terminal is located, and detect whether the running scene conforms to a preset scene.
[0122] Step 902: If the running scene indicates that the terminal has switched from the first display mode to the second display mode and virtual keys are displayed in the second display mode, determine that the running scene conforms to the preset scene, and determine the virtual keys displayed in the second display mode as the virtual keys associated with the running scene.
[0123] Here, when the terminal determines that the running scene conforms to the preset scene, it determines the virtual keys in the second display mode as the virtual keys associated with the running scene; that is, the terminal changes the touch mode only for the virtual keys displayed in the second display mode after the switch.
[0124] Step 903: Map the function of the virtual button to the physical button, and the virtual button is not displayed in the running scene.
[0125] Wherein, the virtual keys include virtual keys arranged in the first area and the second area of the display screen of the terminal, the terminal includes physical keys, and the physical keys are arranged in an area other than the display screen of the terminal.
[0126] It should be noted that, for the description of the same steps and the same content in this embodiment as those in other embodiments, reference may be made to the description in other embodiments, which will not be repeated here.
[0127] Based on the foregoing embodiments, an embodiment of the present invention provides a terminal 10, which can be applied in the virtual button control methods provided in the embodiments corresponding to FIGS. 3-4 and FIGS. 8-9. As shown in FIG. 10, the terminal includes: a processor 1001, a memory 1002, and a communication bus 1003, where:
[0128] The communication bus 1003 is used to implement a communication connection between the processor 1001 and the memory 1002;
[0129] The processor 1001 is configured to execute a program for controlling virtual keys in the memory 1002 to implement the following steps:
[0130] Determine the operating scene where the terminal is located, and check whether the operating scene meets the preset scene;
[0131] If the running scene matches the preset scene, determine the virtual button associated with the running scene;
[0132] Change the touch method of virtual keys.
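The three steps above can be tied together in a minimal control flow; everything below, including the scene representation and the placeholder mode `"changed"`, is an assumption for illustration.

```python
def control_virtual_keys(terminal, preset_scenes):
    """Core flow sketch: determine the running scene, check whether it
    conforms to a preset scene, and if so change the touch mode of the
    virtual buttons associated with that scene."""
    scene = terminal["running_scene"]
    if scene["name"] not in preset_scenes:
        return None                      # scene does not conform; do nothing
    keys = scene["virtual_keys"]         # buttons associated with the scene
    for key in keys:
        terminal["touch_modes"][key] = "changed"
    return keys
```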
[0133] In other embodiments of the present invention, when the processor 1001 is configured to execute the step of determining a virtual button associated with the running scene in the memory 1002 if the running scene matches the preset scene, the following steps may also be implemented:
[0134] If the running scene indicates that the terminal has switched from the first display mode to the second display mode, determine that the running scene conforms to the preset scene, and determine the virtual keys associated with the running scene; here, the first display mode and the second display mode have different interface element layouts.
[0135] In other embodiments of the present invention, when the processor 1001 is configured to execute the step of determining a virtual button associated with the running scene in the memory 1002 if the running scene matches the preset scene, the following steps may also be implemented:
[0136] If the content displayed in the running scene meets the preset content, and the virtual button is used to control the content display process, it is determined that the running scene meets the preset scene, and the virtual button associated with the running scene is determined.
[0137] In other embodiments of the present invention, when the processor 1001 is configured to execute the step of determining a virtual button associated with the running scene in the memory 1002 if the running scene matches the preset scene, the following steps may also be implemented:
[0138] If the running scene indicates that the terminal has switched from the first display mode to the second display mode and virtual keys are displayed in the second display mode, determine that the running scene conforms to the preset scene, and determine the virtual keys displayed in the second display mode as the virtual keys associated with the running scene.
[0139] In other embodiments of the present invention, the virtual keys include virtual keys arranged in the first area of the display screen of the terminal. When the processor 1001 executes the step of changing the touch mode of the virtual keys in the memory 1002, the following steps may also be implemented:
[0140] Obtain the target configuration file of the virtual button;
[0141] According to the target profile, change the touch mode of the virtual buttons.
[0142] In other embodiments of the present invention, the virtual keys include virtual keys arranged in the second area of the display screen of the terminal, the terminal includes physical keys, and the physical keys are arranged in an area other than the display screen of the terminal. When the processor 1001 executes the step of changing the touch mode of the virtual keys in the memory 1002, the following steps may also be implemented:
[0143] The function of the virtual key is mapped to the physical key, and the virtual key is not displayed in the running scene.
[0144] In other embodiments of the present invention, after the processor 1001 is configured to execute the step of changing the touch mode of the virtual button in the memory 1002, the following steps may be further implemented:
[0145] Generate and output prompt information for prompting that the touch mode of the virtual key has been changed; where the prompt information includes description information of the changed touch mode.
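The prompt information of paragraph [0145] might be generated as in this sketch; the wording of the message is invented.

```python
def build_prompt(key_name, mode_description):
    """Generate prompt information indicating that the touch mode of a
    virtual button has been changed, including a description of the
    changed touch mode."""
    return (f"The touch mode of '{key_name}' has been changed; "
            f"it now responds to {mode_description}.")
```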
[0146] In other embodiments of the present invention, the running scene of the terminal is the running scene of the application running in the foreground of the terminal. After the processor 1001 executes the step of changing the touch mode of the virtual buttons in the memory 1002, the following steps may also be implemented:
[0147] If it is detected that the application is switched from the foreground of the terminal to the background of the terminal, the touch mode of the virtual button is switched to the touch mode before the change.
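The foreground-to-background revert in paragraph [0147] could be sketched with a manager that remembers the pre-change modes; the class and its fields are hypothetical.

```python
class TouchModeManager:
    """Remember the original touch mode of each virtual button so it
    can be restored when the application leaves the foreground."""

    def __init__(self, modes):
        self.modes = dict(modes)   # current touch mode per button
        self.saved = {}            # touch modes before the change

    def change(self, key, new_mode):
        self.saved.setdefault(key, self.modes[key])
        self.modes[key] = new_mode

    def on_app_backgrounded(self):
        # Switch each changed button back to its pre-change touch mode.
        self.modes.update(self.saved)
        self.saved.clear()
```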
[0148] It should be noted that, for the specific implementation process of the steps executed by the processor in this embodiment, reference may be made to the implementation processes of the virtual button control methods provided in the embodiments corresponding to FIGS. 3-4 and FIGS. 8-9, which will not be repeated here.
[0149] It should be noted that, in the embodiment of the present invention, the processor 1001 in FIG. 10 corresponds to the processor 110 in FIG. 1, and the memory 1002 in FIG. 10 corresponds to the memory 109 in FIG. 1.
[0150] Based on the foregoing embodiments, the embodiments of the present invention provide a computer storage medium, and the computer storage medium stores one or more programs, and the one or more programs can be executed by one or more processors to implement the following steps:
[0151] Determine the operating scene where the terminal is located, and check whether the operating scene meets the preset scene;
[0152] If the running scene matches the preset scene, determine the virtual button associated with the running scene;
[0153] Change the touch method of virtual keys.
[0154] In other embodiments of the present invention, the one or more programs may be executed by one or more processors. If the running scene matches the preset scene, when determining the virtual key associated with the running scene, the following steps may be further implemented:
[0155] If the running scene indicates that the terminal has switched from the first display mode to the second display mode, determine that the running scene conforms to the preset scene, and determine the virtual keys associated with the running scene; here, the first display mode and the second display mode have different interface element layouts.
[0156] In other embodiments of the present invention, the one or more programs may be executed by one or more processors. If the running scene matches the preset scene, when determining the virtual key associated with the running scene, the following steps may be further implemented:
[0157] If the content displayed in the running scene meets the preset content, and the virtual button is used to control the content display process, it is determined that the running scene meets the preset scene, and the virtual button associated with the running scene is determined.
[0158] In other embodiments of the present invention, the one or more programs may be executed by one or more processors. If the running scene matches the preset scene, when determining the virtual key associated with the running scene, the following steps may be further implemented:
[0159] If the running scene indicates that the terminal has switched from the first display mode to the second display mode and virtual keys are displayed in the second display mode, determine that the running scene conforms to the preset scene, and determine the virtual keys displayed in the second display mode as the virtual keys associated with the running scene.
[0160] In other embodiments of the present invention, the virtual keys include virtual keys arranged in the first area of the display screen of the terminal. When the one or more programs are executed by one or more processors to change the touch mode of the virtual keys, the following steps may also be implemented:
[0161] Obtain the target configuration file of the virtual button;
[0162] According to the target profile, change the touch mode of the virtual buttons.
[0163] In other embodiments of the present invention, the virtual keys include virtual keys arranged in the second area of the display screen of the terminal, the terminal includes physical keys, and the physical keys are arranged in an area other than the display screen of the terminal. When the one or more programs are executed by one or more processors to change the touch mode of the virtual keys, the following steps may also be implemented:
[0164] The function of the virtual key is mapped to the physical key, and the virtual key is not displayed in the running scene.
[0165] In other embodiments of the present invention, after the one or more programs can be executed by one or more processors to change the touch mode of the virtual buttons, the following steps can also be implemented:
[0166] Generate and output prompt information for prompting that the touch mode of the virtual key has been changed; where the prompt information includes description information of the changed touch mode.
[0167] In other embodiments of the present invention, the running scene of the terminal is the running scene of the application running in the foreground of the terminal. After the one or more programs are executed by one or more processors to change the touch mode of the virtual buttons, the following steps may also be implemented:
[0168] If it is detected that the application is switched from the foreground of the terminal to the background of the terminal, the touch mode of the virtual button is switched to the touch mode before the change.
[0169] It should be noted that, for the specific implementation process of the steps executed by the processor in this embodiment, reference may be made to the implementation processes of the virtual button control methods provided in the embodiments corresponding to FIGS. 3-4 and FIGS. 8-9, which will not be repeated here.
[0170] It should be noted that the above-mentioned computer storage medium may be a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Ferromagnetic Random Access Memory (FRAM), a Flash Memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM), among other memories; it may also be any of a variety of electronic devices including one or any combination of the above memories, such as a mobile phone, a computer, a tablet device, a personal digital assistant, and the like.
[0171] It should be noted that, herein, the terms "comprise", "include" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article or device that includes the element.
[0172] The sequence numbers of the foregoing embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
[0173] Through the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform; they can, of course, also be implemented by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes a number of instructions to enable a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present invention.
[0174] The present invention is described with reference to flowcharts and/or block diagrams of the methods, devices (systems), and computer program products according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions can be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or other programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
[0175] These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus, which implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
[0176] These computer program instructions can also be loaded onto a computer or other programmable data processing device, so that a series of operation steps are executed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
[0177] The above are only preferred embodiments of the present invention and do not limit the scope of the present invention. Any equivalent structure or equivalent flow transformation made using the contents of the description and drawings of the present invention, or any direct or indirect application in other related technical fields, is likewise included within the scope of patent protection of the present invention.