Password dictionary generation method and computer-readable storage medium
A password-dictionary technology in the field of passwords. It addresses the problems that existing methods do not analyze password-setting rules and cannot expand the password dictionary, and it achieves great application value and an improved success rate.
Examples
Embodiment 1
[0137] Please refer to Figure 1. Embodiment 1 of the present invention is a password dictionary generation method. The method is based on a recurrent neural network, and the resulting password dictionary can be used for password recovery. The method comprises the following steps:
[0138] S1: Collect a password set that includes real passwords and virtual passwords. To prevent overfitting, the password set consists of two parts: the first part comes from real passwords in real websites or information-management-system databases, and the second part consists of virtual passwords composed of commonly used password keywords (such as admin) plus a randomly generated suffix. Further, the first part accounts for 70% of the total capacity of the password set and the second part for 30%, and the total capacity is not less than 10 million.
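The mixing procedure in step S1 can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: the keyword list, the suffix length range, and the alphanumeric suffix alphabet are assumptions; only the 70/30 split and the keyword-plus-random-suffix construction come from the text.

```python
import random
import string

# Illustrative subset of "commonly used password keywords" (assumed list).
COMMON_KEYWORDS = ["admin", "root", "password", "user"]

def make_virtual_password(rng: random.Random) -> str:
    """A common keyword plus a randomly generated suffix (assumed 2-6 chars)."""
    keyword = rng.choice(COMMON_KEYWORDS)
    suffix_len = rng.randint(2, 6)
    suffix = "".join(rng.choices(string.ascii_lowercase + string.digits, k=suffix_len))
    return keyword + suffix

def build_password_set(real_passwords, total_size, rng=None):
    """Mix real passwords (70%) with virtual ones (30%) to curb overfitting."""
    rng = rng or random.Random(0)
    n_real = int(total_size * 0.7)
    real_part = rng.sample(real_passwords, n_real)
    virtual_part = [make_virtual_password(rng) for _ in range(total_size - n_real)]
    mixed = real_part + virtual_part
    rng.shuffle(mixed)
    return mixed
```

In practice `total_size` would be at least 10 million per the embodiment; a toy size is used here only for illustration.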
[0139] S2: Generate a test set, the test set includes plaintext passwords; the total capacity of the test s...
Embodiment 2
[0156] Please refer to Figure 2. This embodiment is a further expansion of steps S3 and S7 in Embodiment 1. Taking step S3 as an example, step S3 includes the following steps:
[0157] S301: Construct a recurrent neural network model that includes an input layer, a hidden layer, and an output layer, where the hidden layer includes three GRU layers.
[0158] Specifically, a recurrent neural network is used as the benchmark structure of the model. To give the model the ability to remember the context of long information sequences, the long short-term memory (LSTM) network layer is used as its basic unit; to improve efficiency and reduce the amount of calculation, a variant of that network, the Gated Recurrent Unit (GRU), is further adopted. The GRU merges the forget gate and the input gate of the original LSTM into a single update gate, improving dictionary-generation speed while maintaining the ...
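The structure described in S301 can be sketched in NumPy. This is a toy forward pass under stated assumptions, not the patent's trained model: the hidden size, weight initialization, and one-hot input encoding are illustrative choices; only the input layer, three stacked GRU layers, and a softmax output over the character set come from the text. The update gate `z` below is the gate that, as the paragraph notes, takes over the combined role of the LSTM's forget and input gates.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell with random (untrained) illustrative weights."""
    def __init__(self, input_size, hidden_size, rng):
        def w(rows, cols):
            return rng.standard_normal((rows, cols)) * 0.1
        self.Wz, self.Uz = w(hidden_size, input_size), w(hidden_size, hidden_size)
        self.Wr, self.Ur = w(hidden_size, input_size), w(hidden_size, hidden_size)
        self.Wh, self.Uh = w(hidden_size, input_size), w(hidden_size, hidden_size)

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)              # update gate (merged forget+input)
        r = sigmoid(self.Wr @ x + self.Ur @ h)              # reset gate
        h_cand = np.tanh(self.Wh @ x + self.Uh @ (r * h))   # candidate state
        return (1.0 - z) * h + z * h_cand                   # new hidden state

class CharGRUModel:
    """Input layer (one-hot chars), three stacked GRU layers, softmax output."""
    def __init__(self, vocab_size, hidden_size=32, seed=0):
        rng = np.random.default_rng(seed)
        self.hidden_size = hidden_size
        self.layers = [GRUCell(vocab_size, hidden_size, rng),
                       GRUCell(hidden_size, hidden_size, rng),
                       GRUCell(hidden_size, hidden_size, rng)]
        self.W_out = rng.standard_normal((vocab_size, hidden_size)) * 0.1

    def forward(self, x_onehot, states):
        h = x_onehot
        new_states = []
        for cell, s in zip(self.layers, states):
            s = cell.step(h, s)
            new_states.append(s)
            h = s
        logits = self.W_out @ h
        exp = np.exp(logits - logits.max())
        return exp / exp.sum(), new_states                  # next-char probabilities
```

One forward step consumes one character and returns a probability distribution over the next character, which is exactly what the generation procedure of Embodiment 3 consumes.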
Embodiment 3
[0175] Please refer to Figure 3. This embodiment is a further expansion of steps S4 and S8 in the above embodiments. Taking step S4 as an example, step S4 includes the following steps:
[0176] S401: Obtain a character set, where the character set includes a-z, 0-9 and special characters.
[0177] S402: Randomly select a character from the character set as the first character of a password;
[0178] S403: Take that character as the current character and, according to the current character and the dictionary model, calculate the occurrence probability of each character through a cost function, that is, the probability that each character appears after the current character.
[0179] S404: Obtain the preset second number of characters with the highest probabilities, and randomly select one of those characters as the next character of the password. In this embodiment, the second number is 5, that is, a character is selected at random from the ...
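Steps S402 through S404 amount to top-k sampling over the model's next-character distribution, which can be sketched as follows. The function `next_char_probs` is a hypothetical stand-in for the trained dictionary model: given the current prefix, it returns each character's probability of appearing next. The password length used here is an assumption; k=5 and the random-first-character start come from the embodiment.

```python
import random

def generate_password(next_char_probs, charset, length=10, k=5, rng=None):
    """Sketch of S402-S404: random first character, then repeatedly keep the
    k most probable next characters and pick one of them at random."""
    rng = rng or random.Random(0)
    password = rng.choice(charset)                   # S402: random first character
    while len(password) < length:
        probs = next_char_probs(password)            # S403: P(char | current prefix)
        top_k = sorted(charset, key=lambda c: probs[c], reverse=True)[:k]  # S404
        password += rng.choice(top_k)                # pick one of the k best at random
    return password
```

Sampling among the top 5 rather than always taking the single most probable character is what lets repeated calls produce a diverse dictionary instead of one deterministic string.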