Interaction method and device of touch screen interface

A touch-screen interface-interaction technology, applied in the field of instruments, electrical digital data processing, and input/output processes of data processing, which addresses problems such as the cumbersome manipulation of graphic objects.

Inactive Publication Date: 2016-07-06
SHENZHEN AOTO ELECTRONICS
Cites: 7 · Cited by: 5

AI-Extracted Technical Summary

Problems solved by technology

[0007] The present invention provides a touch screen interface interaction method and device, aiming to solve t...


Abstract

The invention provides a touch-screen interface interaction method and device, belonging to the field of interface interaction. First, a plurality of graphic objects are displayed on the touch-screen display. When a single-finger touch on the display is detected and a subsequent movement of that contact is detected, a translation of the viewing angle is determined. When a single-finger click occurs on the display, the graphic object corresponding to the clicked area is selected. When a single-finger touch is detected in the area of a graphic object and the contact then moves, a movement of the graphic object corresponding to the finger's movement is determined. The method and device thereby simplify the manipulation of graphic objects.

Application Domain

Input/output processes for data processing

Technology Topic

Finger touch · Display device · +4


Examples

  • Experimental program (2)

Example Embodiment

[0030] Embodiment one:
[0031] Figure 1 shows the implementation flow of the touch-screen interface interaction method provided by this embodiment. For convenience of description, only the parts relevant to this embodiment are shown, as follows:
[0032] In step 101, a plurality of graphic objects are displayed on the touch-screen display. The display shows a portion of the system view at a scaled ratio.
[0033] In step 102, when a single-finger touch on the touch-screen display is detected and a subsequent movement of the contact is detected, a translation of the viewing angle is determined.
[0034] In specific implementation, step 102 can be divided into the following steps:
[0035] A. Obtain the first coordinates, where the single finger lifts off the touch-screen display, and the second coordinates, where it touches down.
[0036] B. Determine the target area from the first and second coordinates and display it.
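Steps A and B above can be sketched as a pure function that derives the new viewport from the two pan coordinates. This is only an illustration under assumed names (`Point`, `Viewport`, `panViewport`); the patent does not specify an API, and the convention that content follows the finger (viewport moves opposite to the finger's travel) is an assumption.

```typescript
interface Point { x: number; y: number; }
interface Viewport { x: number; y: number; width: number; height: number; }

// Translate the viewport opposite to the finger's travel so the displayed
// content appears to follow the finger (assumed convention).
function panViewport(view: Viewport, touchDown: Point, liftOff: Point): Viewport {
  const dx = liftOff.x - touchDown.x;
  const dy = liftOff.y - touchDown.y;
  return { ...view, x: view.x - dx, y: view.y - dy };
}
```

For example, a finger travelling 20 px right and 40 px down moves the viewport 20 px left and 40 px up, which shifts the displayed target area accordingly.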
[0037] In step 103, when a single-finger click is detected on the touch-screen display, the graphic object corresponding to the clicked area is selected.
[0038] In step 104, after a single-finger touch is detected in the area of a graphic object on the touch-screen display and a subsequent movement of the contact is detected, a movement of the graphic object corresponding to the finger's movement on the display is determined.
[0039] In specific implementation, step 104 can be divided into the following steps:
[0040] A. Obtain the third coordinates, where the single finger first touches the touch-screen display, and the fourth coordinates of the contact as it moves.
[0041] B. Determine the position of the graphic object in real time from the third and fourth coordinates.
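A minimal sketch of steps A and B, under assumed names (`Point`, `moveObject`): the object's position tracks the contact, offset by the displacement from the touch-down point, so the object does not jump to the fingertip. Calling this for each new contact coordinate gives the real-time position update of step B.

```typescript
interface Point { x: number; y: number; }

// Shift the object's original position by the finger's displacement
// (current contact minus touch-down point).
function moveObject(objectPos: Point, touchDown: Point, current: Point): Point {
  return {
    x: objectPos.x + (current.x - touchDown.x),
    y: objectPos.y + (current.y - touchDown.y),
  };
}
```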
[0042] Further, as shown in Figure 2, a step 104-2 is additionally included after step 103.
[0043] In step 104-2, after a single-finger touch on an anchor point of a graphic object on the touch-screen display is detected and a subsequent movement of the contact is detected, a change in the shape and size of the graphic object corresponding to the finger's movement on the display is determined.
[0044] In specific implementation, step 104-2 can be divided into the following steps:
[0045] A. Obtain the fifth coordinates, where the single finger lifts off the touch-screen display.
[0046] B. Determine the shape and size of the graphic object from the fifth coordinates.
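One way to realize steps A and B, sketched under the assumption that the dragged anchor is a corner of a rectangular graphic object: the opposite corner stays fixed, and the dragged corner is set to the fifth (lift-off) coordinates. The names `Rect` and `resizeByAnchor` are illustrative, not from the patent.

```typescript
interface Rect { x: number; y: number; width: number; height: number; }
interface Point { x: number; y: number; }

// Rebuild the rectangle from the fixed corner and the lift-off point of
// the dragged anchor; min/abs keep width and height non-negative even
// when the finger crosses over the fixed corner.
function resizeByAnchor(fixedCorner: Point, liftOff: Point): Rect {
  return {
    x: Math.min(fixedCorner.x, liftOff.x),
    y: Math.min(fixedCorner.y, liftOff.y),
    width: Math.abs(liftOff.x - fixedCorner.x),
    height: Math.abs(liftOff.y - fixedCorner.y),
  };
}
```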
[0047] In this embodiment, a plurality of graphic objects are first displayed on the touch-screen display. When a single-finger touch and a subsequent movement of the contact are detected, a translation of the viewing angle is determined. When a single-finger click occurs on the display, the graphic object corresponding to the clicked area is selected. After a single-finger touch is detected in the area of a graphic object and the contact then moves, a movement of the graphic object corresponding to the finger's movement on the display is determined. The manipulation of graphic objects is thereby simplified.

Example Embodiment

[0048] Embodiment two:
[0049] Embodiment two of the present invention provides a touch-screen interface interaction device. As shown in Figure 3, a touch-screen interface interaction device 30 includes a display module 310, a translation determination module 320, a selection module 330 and a movement determination module 340.
[0050] The display module 310 is configured to display multiple graphic objects on the touch screen display.
[0051] The translation determination module 320 is configured to determine the translation of the viewing angle when a single finger touch on the touch screen display is detected and then a movement of a single finger contact on the touch screen display is detected.
[0052] The selection module 330 is configured to select the graphic object corresponding to the clicked area when a single-finger click is detected on the touch-screen display.
[0053] The movement determination module 340 is configured to determine, after a single-finger touch in the area of a graphic object on the touch-screen display is detected and a subsequent movement of the contact is detected, a movement of the graphic object corresponding to the finger's movement on the display.
[0054] Further, as shown in Figure 4, a touch-screen interface interaction device 40 further includes a change determination module 350.
[0055] The change determination module 350 is configured to determine, after a single-finger touch on an anchor point of a graphic object on the touch-screen display is detected and a subsequent movement of the contact is detected, a change in the shape and size of the graphic object corresponding to the finger's movement on the display.
[0056] As shown in Figure 5, the change determination module 350 includes a fifth coordinate acquisition unit 351 and a shape and size determination unit 352.
[0057] The fifth coordinate acquisition unit 351 is configured to obtain the fifth coordinates, where the single finger lifts off the touch-screen display.
[0058] The shape and size determination unit 352 is configured to determine the shape and size of the graphic object from the fifth coordinates.
[0059] As shown in Figure 6, the translation determination module 320 includes a first coordinate acquisition unit 321 and a target area determination unit 322.
[0060] The first coordinate acquisition unit 321 is configured to obtain the first coordinates, where the single finger lifts off the touch-screen display, and the second coordinates, where it touches down.
[0061] The target area determination unit 322 is configured to determine the target area from the first and second coordinates and display it.
[0062] As shown in Figure 7, the movement determination module 340 includes a third coordinate acquisition unit 341 and a position determination unit 342.
[0063] The third coordinate acquisition unit 341 is configured to obtain the third coordinates, where the single finger first touches the touch-screen display, and the fourth coordinates of the contact as it moves.
[0064] The position determination unit 342 is configured to determine the position of the graphic object in real time from the third and fourth coordinates.
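The way these modules might cooperate can be sketched as a small dispatcher: a hit test decides whether a single-finger gesture lands on a graphic object, and the gesture is then routed to selection, movement, or viewing-angle translation. All names (`GraphicObject`, `hitTest`, `classifyGesture`) are assumptions for illustration, not the device's actual interfaces.

```typescript
interface Point { x: number; y: number; }
interface GraphicObject { id: number; x: number; y: number; width: number; height: number; }

// Return the graphic object (if any) whose bounding box contains the point.
function hitTest(objects: GraphicObject[], p: Point): GraphicObject | undefined {
  return objects.find(o =>
    p.x >= o.x && p.x <= o.x + o.width &&
    p.y >= o.y && p.y <= o.y + o.height);
}

// Route a single-finger gesture: a click on an object selects it
// (selection module 330); a drag starting on an object moves it
// (movement determination module 340); a drag on empty space pans
// the viewing angle (translation determination module 320).
function classifyGesture(objects: GraphicObject[], touchDown: Point, moved: boolean): string {
  const hit = hitTest(objects, touchDown);
  if (!moved) return hit ? "select" : "none";
  return hit ? "move" : "pan";
}
```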
[0065] For example, as shown in Figure 8, the touch-screen display shows a partial area of the system view (the touch-screen interactive interface) at a scaled ratio, with graphic objects 1 and 2 displayed. After a single-finger touch on the display is detected and the contact then moves, a translation of the viewing angle is determined and the viewing angle is translated to the target area.
[0066] As another example, as shown in Figure 9, the touch-screen display shows a partial area of the system view (touch-screen interactive interface 3) at a scaled ratio, with graphic objects 3 and 4 displayed. When a single-finger click is detected on the display, graphic object 3, corresponding to the clicked area, is selected. After a single-finger touch in the area of the graphic object is detected and the contact then moves, a movement of graphic object 3 corresponding to the finger's movement on the display is determined.
[0067] To sum up, in this embodiment, a plurality of graphic objects are first displayed on the touch-screen display. When a single-finger touch and a subsequent movement of the contact are detected, a translation of the viewing angle is determined. When a single-finger click is detected, the graphic object corresponding to the clicked area is selected. After a single-finger touch in the area of a graphic object is detected and the contact then moves, a movement of the graphic object corresponding to the finger's movement on the display is determined. The manipulation of graphic objects is thereby simplified.
[0068] The serial numbers of the above embodiments are for description only and do not indicate the relative merits of the embodiments.

