A human-computer interaction testing method and system for mobile terminals

A mobile terminal human-computer interaction technology, applied in user/computer interaction input/output, mechanical mode conversion, computer parts and other fields. It addresses problems such as reduced test accuracy, the lack of a unified evaluation standard, and degraded picture clarity, and achieves improved analysis accuracy, greater comfort of use, and a better test effect.

Active Publication Date: 2021-12-14
KINGFAR INTERNATIONAL INC
13 Cites, 0 Cited by

AI Technical Summary

Problems solved by technology

[0002] The human-machine interface is the most direct layer of interaction between software and users, and its quality determines the user's first impression of the software. Good interface design is receiving more and more attention from system analysts and designers, but there is no unified standard for how to test a human-machine interface and give it an objective, fair evaluation.
[0003] At present, human-computer interaction testing typically uses a camera to capture the screen image while detecting the focus of the eyes, projects the captured image together with the eye movement track, and traces the path of the user's gaze across the screen image, thereby realizing the human-computer interaction test. In this method, however, the captured screen image is saved in a picture format, and that format degrades the clarity of the picture and thereby affects the test result.
[0004] At the same time, because the position of the eye tracker is fixed while the position of the user's eyes changes with the person's height and head movement, the clarity of the eye track information collected by the eye tracker is affected, which degrades the eye track information and reduces the test precision.



Examples


Specific Embodiment 1

[0032] As shown in figures 1 and 2, a human-computer interaction test system for mobile terminals of the present application includes a mobile terminal bracket sub-device 1, a support base 2, an eye-tracking sub-device 3, a booster device, and an arm support sub-device; the arm support sub-device includes a support plate 4 and an arm support frame 5; the booster device includes a clamping device 6 and a support frame 7.

[0033] The mobile terminal bracket sub-device 1 is fixed at one end of the support base 2 and the eye-tracking sub-device 3 is fixed at the other end, so that the eye-tracking sub-device 3 and the mobile terminal bracket sub-device 1 are arranged on the two sides of the support base 2 and are each fixedly connected to it.

[0034] The eye-tracking sub-device 3 is used to collect the convergence point of the user's gaze on the mobile terminal, and the mobile terminal bracket sub-device 1 is used to support the mobile terminal.

[0035] Specifically, t...

Specific Embodiment 2

[0074] A human-computer interaction testing system for a mobile terminal of the present application differs from the first embodiment in that it also includes an electric adjustment sub-device. Stepper motors are arranged respectively at the joint of the support base 2, at the joint between the eye-tracking sub-device 3 and the support base 2, at the joint between the arm support frame 5 and the support frame 7, and at the joint between the arm support frame 5 and the arm support fixing platform 54; by controlling the stepping state of each motor, the relative position and angle between the components can be controlled.
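Controlling a joint in this way amounts to turning a desired relative angle into a number of stepper pulses. The following is a minimal sketch of that conversion; the constants and function names are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch: converting a desired joint angle into stepper-motor steps.
# STEPS_PER_REV, MICROSTEPPING and GEAR_RATIO are assumed values, not from the patent.

STEPS_PER_REV = 200      # full steps per motor revolution (typical 1.8-degree motor)
MICROSTEPPING = 16       # driver microstep setting (assumed)
GEAR_RATIO = 5.0         # joint gearbox reduction (assumed)

def angle_to_steps(delta_angle_deg: float) -> int:
    """Convert a relative joint rotation (degrees) into microsteps."""
    steps_per_degree = STEPS_PER_REV * MICROSTEPPING * GEAR_RATIO / 360.0
    return round(delta_angle_deg * steps_per_degree)

def move_joint(current_deg: float, target_deg: float) -> tuple[int, int]:
    """Return (direction, steps) for the motor driver: +1 = CW, -1 = CCW."""
    delta = target_deg - current_deg
    direction = 1 if delta >= 0 else -1
    return direction, abs(angle_to_steps(delta))

# Example: raise an arm-support joint from 30 degrees to 42.5 degrees
direction, steps = move_joint(30.0, 42.5)
print(direction, steps)   # 1, 556 microsteps with the assumed constants
```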

[0075] The electric adjustment sub-device adjusts the eye tracking system according to the position information of the eye tracking system and adjusts the mobile terminal according to the position information of the mobile terminal, thereby realizing automatic adjustment of the positions of the eye tracking system and the mobile terminal; it adjusts the arm support according to the height...
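This automatic adjustment can be pictured as a simple closed loop: the eye tracking system reports how far the eyes sit from the centre of its tracking window, and the adjustment sub-device steps the corresponding motors until the offset is within tolerance. The sketch below assumes hypothetical get_eye_position_in_window and step_motor interfaces and illustrative tolerances; it is not the patent's implementation.

```python
# Hypothetical closed-loop adjustment: step the tracker's pan/tilt motors until the
# eyes appear near the centre of the tracking window.
# get_eye_position_in_window() and step_motor() are assumed interfaces, not from the patent.

TOLERANCE_PX = 10        # acceptable offset from window centre, pixels (assumed)
PX_PER_STEP = 0.5        # how far the eye image shifts per motor step (assumed)

def center_eye_tracker(get_eye_position_in_window, step_motor, max_iters: int = 50) -> bool:
    """Iteratively adjust pan/tilt motors until the eyes sit near the window centre."""
    for _ in range(max_iters):
        dx, dy = get_eye_position_in_window()    # offset of eyes from centre, in pixels
        if abs(dx) <= TOLERANCE_PX and abs(dy) <= TOLERANCE_PX:
            return True                          # eye track image meets the requirement
        step_motor("pan",  round(-dx / PX_PER_STEP))
        step_motor("tilt", round(-dy / PX_PER_STEP))
    return False                                 # could not converge within the budget
```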

Specific Embodiment 3

[0078] A human-computer interaction test method for a mobile terminal of the present application is based on the human-computer interaction test device for a mobile terminal and includes: exporting the screen information of the mobile terminal and converting it into screen video information; obtaining the trajectory of the convergence point of the user's gaze and converting it into eye track video information; collecting the screen video information and the eye track video information and, through coordinate transformation, superimposing the two simultaneously on the same screen to obtain the running trajectory of the gaze convergence point on the screen of the mobile terminal, thereby obtaining the human-computer interaction test result.
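One way to picture the coordinate transformation and superposition step is to map each gaze sample from the eye tracker's coordinate system into screen pixel coordinates (here with an assumed calibration homography) and draw it onto the corresponding frame of the exported screen video. The file names, homography values, and the one-gaze-sample-per-frame pairing are illustrative assumptions, not details from the patent.

```python
# Minimal sketch of the superposition step: gaze points arrive in the eye tracker's
# coordinate system, and a calibration homography H maps them onto screen pixels.
import cv2
import numpy as np

H = np.array([[1.02, 0.01, -12.0],        # assumed calibration homography
              [0.00, 0.98,   5.0],
              [0.00, 0.00,   1.0]], dtype=np.float64)

def overlay_gaze(screen_video: str, gaze_points: list[tuple[float, float]], out_path: str) -> None:
    """Draw one gaze point per screen frame after mapping it into screen coordinates."""
    cap = cv2.VideoCapture(screen_video)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    out = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

    for gaze in gaze_points:
        ok, frame = cap.read()
        if not ok:
            break
        # coordinate transformation: eye tracker coordinates -> screen pixel coordinates
        pt = cv2.perspectiveTransform(np.array([[gaze]], dtype=np.float32), H)[0, 0]
        cv2.circle(frame, (int(pt[0]), int(pt[1])), 12, (0, 0, 255), 2)
        out.write(frame)

    cap.release()
    out.release()
```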

[0079] The screen information of the mobile terminal includes screen image information and touch screen information, and the touch screen information includes touch screen orientation inf...
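A hypothetical way to organise the exported screen information, covering both the screen image frames and the touch screen information mentioned above; every field name here is illustrative rather than taken from the patent.

```python
# Illustrative data structure for the exported mobile-terminal screen information:
# screen image frames plus touch-screen events (position and timing). Field names are assumed.
from dataclasses import dataclass, field

@dataclass
class TouchEvent:
    timestamp_ms: int
    x: float                 # touch position on screen, pixels
    y: float
    action: str              # "down", "move" or "up"

@dataclass
class ScreenRecord:
    width: int
    height: int
    orientation: str                                          # "portrait" or "landscape"
    frames: list[bytes] = field(default_factory=list)         # raw screen image frames
    touches: list[TouchEvent] = field(default_factory=list)   # touch-screen information

# Example: one touch at (540, 960) while recording a 1080x1920 portrait screen
record = ScreenRecord(width=1080, height=1920, orientation="portrait")
record.touches.append(TouchEvent(timestamp_ms=1200, x=540, y=960, action="down"))
```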



Abstract

The invention discloses a human-computer interaction testing method and system for a mobile terminal, comprising an eye tracking subsystem, a mobile terminal screen acquisition subsystem, and a control center, with the eye tracking subsystem and the mobile terminal screen acquisition subsystem each connected to the control center. The eye tracking subsystem obtains eye track information that meets the requirements by automatically adjusting the position of the eye tracker; the mobile terminal screen acquisition subsystem obtains screen video information of the mobile terminal by exporting real-time screen information from the mobile screen; the control center converts the eye track information into eye track video information, adjusts the position of the eye tracker according to the eye track image, performs data collection and coordinate transformation on the eye track video information and the mobile terminal screen images, and superimposes the screen video information and the eye track video information to obtain the human-computer interaction test results. This application adjusts automatically and accurately captures the convergence point of the user's gaze; exporting the screen image of the mobile terminal yields more accurate data and improves the test effect.
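As a rough architectural sketch of the arrangement described in the abstract, the two subsystems can be modelled as interfaces that the control center coordinates. All class and method names below are illustrative placeholders, and the frame-to-gaze pairing is deliberately simplified; this is not the patent's implementation.

```python
# Illustrative component sketch: two subsystems coordinated by a control center.
from typing import Protocol

class EyeTrackingSubsystem(Protocol):
    def adjust_position(self, dx_steps: int, dy_steps: int) -> None: ...
    def read_gaze_track(self) -> list[tuple[float, float]]: ...   # raw eye track information

class ScreenAcquisitionSubsystem(Protocol):
    def export_screen_frames(self) -> list[bytes]: ...            # real-time screen export

class ControlCenter:
    """Coordinates the two subsystems and pairs screen frames with gaze samples."""
    def __init__(self, tracker: EyeTrackingSubsystem, screen: ScreenAcquisitionSubsystem):
        self.tracker = tracker
        self.screen = screen

    def run_test(self) -> list[tuple[bytes, tuple[float, float]]]:
        frames = self.screen.export_screen_frames()
        gaze = self.tracker.read_gaze_track()
        # simplified pairing: frame i with gaze sample i (a real system would align by timestamp)
        return list(zip(frames, gaze))
```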

Description

Technical field

[0001] The present invention relates to the technical field of human-computer interaction testing, and in particular to a human-computer interaction testing method and system for mobile terminals.

Background technique

[0002] The human-machine interface is the most direct layer of interaction between software and users, and its quality determines the user's first impression of the software. Good interface design is receiving more and more attention from system analysts and designers, but there is no unified standard for how to test a human-machine interface and give it an objective, fair evaluation.

[0003] At present, human-computer interaction testing typically uses a camera to capture the screen image while detecting the focus of the eyes, projects the captured image together with the eye movement track onto the screen, and traces the path of the user's gaze across the screen image, thereby realizing the human-computer interaction test. Inter...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F3/01
CPC: G06F3/013
Inventor: 赵起超, 杨苒, 李召
Owner: KINGFAR INTERNATIONAL INC