[0020] The mobile terminal user interface adjustment system and the adjustment method thereof disclosed in the present application will be described in detail below with reference to the accompanying drawings. For the sake of brevity, in the description of the embodiments of the present application, the same or similar reference signs denote the same or similar devices.
[0021] Figure 1 shows a schematic diagram of a mobile terminal user interface adjustment system 1000 according to an embodiment of the present application. As shown in Figure 1, the user interface adjustment system 1000 includes a grip posture recognition unit 100 and an interface adjustment unit 200.
[0022] The grip posture recognition unit 100 can recognize the posture in which the user grasps the mobile terminal. The interface adjustment unit 200 can adjust the user interface currently displayed on the mobile terminal according to the recognized grasping posture. In this way, the user interface adjustment system of the present application can automatically adjust the user interface according to the posture in which the user grasps the mobile terminal, without requiring the user to select the user interface through manual settings, which facilitates the user's use and improves the user experience.
[0023] Figure 2 shows a schematic diagram of a mobile terminal user interface adjustment system according to another embodiment of the present application. As shown in the figure, the user interface adjustment system includes a grip posture recognition unit 100 for recognizing the posture in which the user grasps the mobile terminal, and an interface adjustment unit 200 capable of adjusting the user interface according to the recognized grasping posture.
[0024] As shown in Figure 2, the grip posture recognition unit 100 includes a sensing device 110 and an analysis device 120. The sensing device 110 may be arranged on the left and right sides of the body of the mobile terminal to sense the contact points where the user grasps the mobile terminal. Because the distribution of contact points differs depending on whether the user grasps the mobile terminal with the left hand or the right hand, the user's grasping posture can be recognized by analyzing the distribution of the contact points.
[0025] According to an embodiment, the sensing device 110 may be a multi-touch screen. For example, in order to sense how the user grasps the mobile terminal, multi-touch screens may be provided on the left and right side panels of the body of the mobile terminal. When the user grasps the mobile terminal with the left hand (or right hand), the hand comes into contact with the multi-touch screens provided on the left and right sides of the body, so that touch point data can be sampled through the multi-touch screens to sense the contact points of the user's left hand (or right hand).
[0026] According to another embodiment, the sensing device 110 may be light sensors provided on the left and right side panels of the mobile terminal. When the user grasps the mobile terminal with the left hand (or right hand), the light reaching the sensors arranged on the left and right sides of the body is blocked by the grasp, so that light sensor data can be sampled to sense the contact points of the user's left hand (or right hand).
[0027] The analysis device 120 can analyze the distribution pattern of the contact points sensed by the sensing device 110 and identify from that pattern whether the user is operating with the left hand or the right hand. When operating with the left hand, the palm of the left hand usually touches the left side panel of the body as a whole, while the fingers of the left hand usually touch the right side panel in a separated state; when operating with the right hand, the palm of the right hand usually touches the right side panel as a whole, while the fingers of the right hand usually touch the left side panel in a separated state.
[0028] According to an embodiment, when the analysis device 120 finds that the contact points form a discrete dot pattern on the right side and a continuous strip pattern on the left side, the user can be identified as operating with the left hand. When the contact points form a discrete dot pattern on the left side and a continuous strip pattern on the right side, the user can be identified as operating with the right hand.
[0029] Referring to Figure 3, according to another embodiment, the analysis device 120 may further include a determination module 121, a calculation module 122, and an identification module 123. The determination module 121 can determine each touch area formed by the contact points sensed by the sensing device 110; methods of determining touch areas from contact points are widely used in the prior art. The calculation module 122 can calculate the number M of touch areas on the left side and the number N of touch areas on the right side from the determined touch areas. For example, the calculation module 122 may calculate the number M of touch areas sensed by the touch screen provided on the left side panel and the number N of touch areas sensed by the touch screen provided on the right side panel. The identification module 123 can compare the number M of touch areas on the left side with the number N of touch areas on the right side. When M<N, the user is recognized as operating with the left hand; when M>N, the user is recognized as operating with the right hand. In addition, when M=N, the identification module 123 may consider the current sensing result of the sensing device 110 invalid and perform no identification. In this case, the user interface of the mobile terminal continues to use the current interface without modification.
[0030] In addition, since the palm of a grasping hand usually forms no more than two contact areas on the left or right side panel of the body, the identification module 123 can further be set so that, when M>2 and N>2, the current sensing result of the sensing device 110 is considered invalid and no identification is performed.
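The recognition rule of paragraphs [0029] and [0030] (compare M and N, and treat M=N, or both counts exceeding two, as an invalid sample) can be sketched as follows. This is an illustrative sketch only; the function name and return values are not part of the application:

```python
def recognize_grip(m, n):
    """Return 'left', 'right', or None (invalid sample) from touch-area counts.

    m: number of touch areas on the left side panel
    n: number of touch areas on the right side panel
    """
    # A palm normally forms at most two contact areas, so if both sides
    # report more than two areas the sensing result is treated as invalid.
    if m > 2 and n > 2:
        return None
    # Equal counts are ambiguous: keep the current interface unchanged.
    if m == n:
        return None
    # Fewer areas = palm side (continuous contact);
    # more areas = finger side (separated contacts).
    return "left" if m < n else "right"
```

When the function returns `None`, the caller simply leaves the current interface in place, matching the "no identification" behavior described above.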
[0031] Referring again to Figure 2, the interface adjustment unit 200 includes an interface database 210 and an interface calling device 220.
[0032] According to the present application, the interface database 210 stores corresponding right-hand operation interfaces and left-hand operation interfaces provided for various styles of interfaces. The right-hand operation interface is designed according to the ergonomic characteristics of right-hand operation, with a menu layout that facilitates one-handed operation with the right hand; the left-hand operation interface is designed according to the ergonomic characteristics of left-hand operation, with a menu layout that facilitates one-handed operation with the left hand. For example, during right-hand operation a large number of operations occur in the upper right or lower left of the screen, so in the design of the right-hand operation interface frequently used touch buttons can be placed in the upper right or lower left area to offer comfort during right-hand operation. During left-hand operation a large number of operations occur in the upper left or lower right of the screen, so in the design of the left-hand operation interface frequently used touch buttons can be placed in the upper left or lower right area to provide convenience for left-hand operation.
[0033] After the analysis device 120 analyzes the distribution of the contact points and recognizes the user's grasping posture, the interface calling device 220 can call the corresponding left-hand or right-hand operation interface according to the recognized posture. For example, when the analysis device 120 recognizes that the user is operating with the left hand, the interface calling device 220 calls the left-hand operation interface stored in the interface database 210; when the user is recognized as operating with the right hand, the interface calling device 220 calls the right-hand operation interface stored in the interface database 210.
[0034] Referring to Figure 3, according to an embodiment, the interface calling device 220 may further include a judgment module 221 and an interface switching module 222.
[0035] The judgment module 221 can determine whether the recognized posture is consistent with the currently displayed user interface. For example, when the analysis device 120 recognizes that the user's grasping posture is a left-hand (or right-hand) operation, and the user interface currently displayed on the mobile terminal is a right-hand (or left-hand) operation interface, the recognized posture is considered inconsistent with the currently displayed user interface. Otherwise, the recognized posture is considered consistent with the currently displayed user interface.
[0036] The interface switching module 222 can switch the currently displayed user interface according to the judgment result. When the judgment module 221 determines that the posture and the interface are inconsistent, the currently displayed user interface is switched to the left-hand or right-hand operation interface corresponding to the recognized posture. That is, when the user's grasping posture is a left-hand operation, the currently displayed right-hand operation interface is switched to the left-hand operation interface; when the user's grasping posture is a right-hand operation, the currently displayed left-hand operation interface is switched to the right-hand operation interface. When the judgment module 221 determines that they are consistent, the currently displayed user interface is not switched.
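The combined behavior of the judgment module 221 and the interface switching module 222 amounts to a simple comparison. A minimal sketch, with illustrative names only, might be:

```python
def choose_interface(recognized_posture, current_style):
    """Return the interface style ('left' or 'right') that should be displayed.

    recognized_posture: 'left', 'right', or None (invalid sensing sample)
    current_style:      style of the interface currently shown
    """
    if recognized_posture is None:
        # Invalid sample: keep the current interface unchanged.
        return current_style
    if recognized_posture == current_style:
        # Judgment module: posture and interface are consistent, no switch.
        return current_style
    # Switching module: inconsistent, switch to the matching interface.
    return recognized_posture
```

The returned style would then be used to look up and display the corresponding interface stored in the interface database 210.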
[0037] According to this application, it is possible to automatically recognize whether the user is operating with the left hand or the right hand and then automatically adjust the interface of the mobile phone, ensuring that the user obtains a convenient and comfortable operating experience whether operating with the left hand or the right hand.
[0038] Figure 4 shows a flowchart of a method for adjusting a user interface of a mobile terminal according to an embodiment of the present application.
[0039] As shown in Figure 4, in step 410, the posture in which the user grasps the mobile terminal is recognized; and in step 420, the user interface currently displayed on the mobile terminal is adjusted according to the recognized posture. In this way, the user interface adjustment method of the present application can automatically adjust the user interface according to the posture in which the user grasps the mobile terminal, without requiring the user to select the user interface through manual settings, which facilitates the user's use and improves the user experience.
[0040] Figure 5 shows a flowchart of a method for adjusting a user interface of a mobile terminal according to another embodiment of the present application.
[0041] As shown in Figure 5, in step 510, the contact points where the user grasps the mobile terminal are sensed.
[0042] In step 520, the distribution pattern of the sensed contact points is analyzed to identify whether the user is operating with the left hand or the right hand. For example, when the contact points form a discrete dot pattern on the right side and a continuous strip pattern on the left side, the user can be identified as operating with the left hand. When the contact points form a discrete dot pattern on the left side and a continuous strip pattern on the right side, the user can be identified as operating with the right hand.
[0043] In step 530, the corresponding left-hand or right-hand operation interface can be called according to the recognized posture. For example, the interface database of the mobile terminal of the present application stores corresponding right-hand and left-hand operation interfaces provided for various styles of interfaces. When the analysis device recognizes that the user is operating with the left hand, the left-hand operation interface stored in the mobile terminal is called. When the analysis device recognizes that the user is operating with the right hand, the right-hand operation interface stored in the mobile terminal is called.
[0044] Figure 6 shows an embodiment of the step, illustrated in Figure 5, of analyzing the distribution of the sensed contact points to identify whether the user is operating with the left hand or the right hand. As shown in Figure 6, the method of analyzing the distribution of the contact points to recognize the user's posture may further include:
[0045] In step 610, each touch area formed by the contact points is determined. Methods of determining touch areas from contact points are widely used in the prior art.
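One common prior-art approach is to cluster nearby contact points into touch areas, for example by merging points whose spacing falls below a gap threshold. The sketch below assumes one-dimensional point coordinates along a side panel and an illustrative threshold value; neither assumption comes from the application itself:

```python
def count_touch_areas(points, gap=15.0):
    """Group 1-D contact-point coordinates along one side panel into
    touch areas: points closer than `gap` belong to the same area.
    Returns the number of distinct areas."""
    if not points:
        return 0
    points = sorted(points)
    areas = 1
    # A new area starts wherever two consecutive points are farther
    # apart than the gap threshold.
    for prev, cur in zip(points, points[1:]):
        if cur - prev > gap:
            areas += 1
    return areas
```

Applied to each side panel, this yields the counts M and N used in the following steps; a palm produces one long run of close points (one area), while separated fingers produce several.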
[0046] In step 620, the number M of touch areas on the left and the number N of touch areas on the right are calculated.
[0047] In step 630, when M<N, the user is recognized as operating with the left hand; in step 640, when M>N, the user is recognized as operating with the right hand. In addition, if M=N, no recognition is performed, and the user interface of the mobile terminal continues to use the current interface without modification.
[0048] Figure 7 shows an embodiment of the step, illustrated in Figure 5, of calling the corresponding left-hand or right-hand operation interface according to the recognized posture. As shown in Figure 7, the method may further include:
[0049] In step 710, it is determined whether the recognized posture is consistent with the currently displayed user interface. The currently displayed user interface can then be switched according to the result of this judgment.
[0050] When the result of the judgment is that they are inconsistent, in step 720, the currently displayed user interface is switched to the left-hand or right-hand operation interface corresponding to the recognized posture. When the result is that they are consistent, in step 730, the currently displayed user interface is not switched.
[0051] According to this application, it is possible to automatically recognize whether the user is operating with the left hand or the right hand and then automatically adjust the interface of the mobile phone, ensuring that the user obtains a convenient and comfortable operating experience whether operating with the left hand or the right hand.
[0052] The exemplary embodiments of the present application have been described above with reference to the drawings. Those skilled in the art should understand that the above-mentioned embodiments are merely examples for illustrative purposes and are not intended to be limiting. Any modification, equivalent replacement, and the like made within the teachings of this application and the scope of the claims shall fall within the scope of protection claimed by this application.