The present application relates to a method for constructing a model inference network, a data processing method, an apparatus, computer equipment, and a storage medium. The method includes: acquiring a deep learning network, where the deep learning network includes a plurality of network layers; acquiring test data; compiling and testing each network layer against the test data to obtain compilation results and detection results; determining a resource allocation strategy for each network layer according to its compilation results and/or detection results; building a model inference network according to the resource allocation strategy of each network layer; and acquiring data to be processed. The data to be processed is input into the model inference network, where the network layers whose resource allocation strategy is the optimized configuration strategy and the network layers whose resource allocation strategy is the original configuration strategy each process the data to be processed, thereby obtaining a processing result. This improves the data processing efficiency of the entire network.
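The per-layer workflow described above can be sketched as follows. This is a minimal illustration only: the `Layer`, `compile_and_detect`, and `build_inference_network` names are hypothetical stand-ins invented for this sketch, not the patent's actual implementation. It assumes "compilation" means attempting to produce an optimized variant of a layer, and "detection" means checking that variant's outputs against the original on the test data.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

# Hypothetical stand-ins: each "layer" is a plain callable; an optimized
# variant may be absent (simulating a failed compilation).

@dataclass
class Layer:
    name: str
    original: Callable[[float], float]                  # original configuration
    optimized: Optional[Callable[[float], float]] = None  # may fail to compile
    strategy: str = "original"                          # resource allocation strategy

def compile_and_detect(layer: Layer, test_data: List[float], tol: float = 1e-6) -> None:
    """Assign a strategy per layer: use the optimized configuration only if
    it compiled and its outputs match the original on the test data."""
    if layer.optimized is None:  # compilation failed -> keep original strategy
        layer.strategy = "original"
        return
    ok = all(abs(layer.optimized(x) - layer.original(x)) <= tol for x in test_data)
    layer.strategy = "optimized" if ok else "original"

def build_inference_network(layers: List[Layer],
                            test_data: List[float]) -> Callable[[float], float]:
    """Build the model inference network from the per-layer strategies."""
    for layer in layers:
        compile_and_detect(layer, test_data)

    def run(x: float) -> float:
        # Dispatch each layer according to its assigned strategy.
        for layer in layers:
            fn = layer.optimized if layer.strategy == "optimized" else layer.original
            x = fn(x)
        return x
    return run

layers = [
    Layer("scale", original=lambda x: x * 2, optimized=lambda x: x + x),  # matches
    Layer("shift", original=lambda x: x + 1, optimized=lambda x: x + 2),  # mismatch
    Layer("square", original=lambda x: x * x, optimized=None),            # no compile
]
net = build_inference_network(layers, test_data=[0.0, 1.0, 2.5])
print([layer.strategy for layer in layers])  # ['optimized', 'original', 'original']
print(net(3.0))                              # ((3*2)+1)**2 = 49.0
```

The design point the sketch captures is the fallback: a layer whose optimized configuration fails to compile, or fails detection, runs under its original configuration, so the assembled inference network always produces valid results while using optimized resources wherever they passed verification.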