A Tensorflow-based training model storage method, driver, and computing server
A technology relating to computing servers and training models, applied in computing, instrumentation, and program control design, which can solve problems such as the failure to save training models
Embodiment Construction
[0019] The training model preservation method described in this application can be applied to the Tensorflow On Spark architecture shown in figure 2. Compared with figure 1, the Tensorflow On Spark architecture shown in figure 2 adds a storage system and improves the Spark Driver, the computing servers, and the parameter server, so that the computing server that saves the model and the parameter server run on the same computing device.
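The patent does not publish code for this arrangement; the following is a minimal Python sketch of the co-location idea, assuming a standard TensorFlow cluster specification. The host names, ports, and the helper tf_config_for are illustrative assumptions only; the point shown is that the model-saving worker and the parameter server resolve to the same computing device.

```python
# Illustrative sketch only: HOST_A, HOST_B, and the ports are assumptions used to
# show the model-saving worker and the parameter server sharing one computing device.
import json
import os

HOST_A = "10.0.0.1"   # assumed device hosting BOTH the parameter server and worker 0
HOST_B = "10.0.0.2"   # assumed device hosting an ordinary worker

cluster_spec = {
    "ps":     [f"{HOST_A}:2222"],    # parameter server on HOST_A
    "worker": [f"{HOST_A}:2223",     # worker 0 (the one that saves the model), also on HOST_A
               f"{HOST_B}:2223"],    # remaining workers on separate devices
}

def tf_config_for(job_name: str, task_index: int) -> str:
    """Build the TF_CONFIG string a task would export before starting TensorFlow."""
    return json.dumps({
        "cluster": cluster_spec,
        "task": {"type": job_name, "index": task_index},
    })

if __name__ == "__main__":
    # Because worker 0 and the parameter server both map to HOST_A, checkpoints
    # written by the model-saving worker stay on the same computing device as
    # the parameter server's state.
    os.environ["TF_CONFIG"] = tf_config_for("worker", 0)
    print(os.environ["TF_CONFIG"])
```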
[0020] The computing device shown in figure 2 is a device running an operating system, and may be a physical machine, a virtual machine, or a Docker container.
[0021] As shown in figure 3, the Tensorflow-based training model preservation method disclosed in the embodiment of the present application includes the following steps:
[0022] S301: After the Spark Driver schedules the training tasks to multiple computing servers, it obtains the Internet Protocol (IP) address of the parameter server and stores the IP address of the parameter server in...
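The text above is truncated, so the following is only a hedged sketch of step S301, assuming a PySpark environment: the Spark Driver collects the IP address of the host acting as parameter server after scheduling and records it in a shared location. The helper discover_ps_ip and the path /shared/ps_address are assumptions for illustration, not the patent's specified storage target.

```python
# Minimal sketch of S301 under the assumptions stated above.
import socket
from pyspark import SparkContext


def discover_ps_ip(_):
    """Runs on an executor; reports the IP address of the host acting as parameter server."""
    return socket.gethostbyname(socket.gethostname())


if __name__ == "__main__":
    sc = SparkContext(appName="collect-ps-ip")

    # One task stands in for the parameter server; after the training tasks have
    # been scheduled, its host IP is collected back to the driver.
    ps_ip = sc.parallelize([0], numSlices=1).map(discover_ps_ip).collect()[0]

    # Persist the address where the computing servers can read it
    # (e.g. a file on shared storage); the exact target is an assumption.
    with open("/shared/ps_address", "w") as f:
        f.write(ps_ip)

    sc.stop()
```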



