The invention provides a method for quantizing the PReLU activation function, comprising the following steps:

S1, data quantization: the data to be quantized is quantized according to formula (1) to obtain low-bit data. The variables of formula (1) are as follows: Wf is the array of full-precision data; Wq is the quantized data; maxw is the maximum value in the full-precision data Wf; minw is the minimum value in the full-precision data Wf; b is the quantization bit width.

S2, quantization of the PReLU activation function: the quantization formula is shown as formula (2). The variables of formula (2) are as follows: when the value of xi is greater than 0, xi is multiplied by the parameter q1; when the value of xi is less than 0, xi is multiplied by the parameter ac, where c is the channel in which xi is located. Specifically, x is a three-dimensional array {h, w, c}, where h, w, and c are respectively the height, width, and number of channels of the array; the parameter a is a one-dimensional array {c}, whose length equals the channel number c of x; q1 is the quantization of 1.0; ac is the value of the c-th channel of the parameter a.
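Formulas (1) and (2) are not reproduced in this text, so the following is only a minimal sketch of the two steps, assuming formula (1) is a standard min-max affine quantization onto b-bit integers (consistent with the variables Wf, Wq, maxw, minw, and b described above) and that q1 and the channel parameters a are quantized on a shared scale; the function names and these details are hypothetical, not taken from the patent.

```python
import numpy as np

def quantize(wf, b=8):
    # Hypothetical reading of formula (1): min-max affine quantization.
    # Maps full-precision values in [minw, maxw] onto b-bit integer codes.
    minw, maxw = wf.min(), wf.max()
    scale = (2 ** b - 1) / (maxw - minw)
    wq = np.round((wf - minw) * scale).astype(np.int64)
    return wq, minw, maxw

def quantized_prelu(x, a, b=8):
    # Formula (2) as described: xi > 0 -> xi * q1; xi < 0 -> xi * ac,
    # where c is the channel of xi. x has shape (h, w, c); a has shape (c,).
    # Assumption: quantize 1.0 together with a so q1 and ac share one scale.
    params = np.concatenate([a, [1.0]])
    pq, _, _ = quantize(params, b)
    aq, q1 = pq[:-1], pq[-1]
    # aq broadcasts over the channel (last) axis of x.
    return np.where(x > 0, x * q1, x * aq)
```

For example, with b=8 and a = {0.25, 0.5}, the concatenated parameters {0.25, 0.5, 1.0} quantize to the codes {0, 85, 255}, so positive activations are scaled by q1 = 255 and negative activations by the per-channel codes.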