Target object processing method and device

A target object processing method and device, applied in the computer field, addressing the problems of high consumption of human resources, irregular delivery behavior, and low-quality garbage sorting, and achieving the effect of saving human resources and improving sorting quality.

Pending Publication Date: 2021-01-05
ALIPAY (HANGZHOU) INFORMATION TECH CO LTD

AI-Extracted Technical Summary

Problems solved by technology

[0002] With the development of society and the improvement of living standards, the types of domestic garbage keep increasing and the ways of disposing of it keep evolving; in particular, garbage sorting is already a model vigorously promoted by the state. ...


Abstract

The invention provides a target object processing method and device. The method comprises the steps of: receiving a user's processing request for a target object processing device, and starting the target object processing device based on the processing request; starting an image acquisition device of the target object processing device when it is detected that the user has placed a target object in the target object processing device; acquiring an image of the target object with the image acquisition device and performing category analysis on the image to determine the category of the target object; and putting the target object into the classification box of the target object processing device corresponding to that category, so that the target object is classified automatically without manual participation.

Application Domain

Discounts/incentives; character and pattern recognition (+2)

Technology Topic

Nuclear medicine; image acquisition (+2)


Examples

  • Experimental program(1)

Example Embodiment

[0067]In the following description, many specific details are set out to provide a thorough understanding of this specification. However, this specification can be implemented in many ways other than those described herein, and those skilled in the art can make similar extensions without departing from its essence. Therefore, this specification is not limited by the specific implementations disclosed below.
[0068]The terms used in one or more embodiments of this specification are only for the purpose of describing specific embodiments and are not intended to limit one or more embodiments of this specification. The singular forms "a", "said", and "the" used in one or more embodiments of this specification and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" used in one or more embodiments of this specification refers to and includes any or all possible combinations of one or more of the associated listed items.
[0069]It should be understood that although the terms first, second, etc. may be used to describe various information in one or more embodiments of this specification, the information should not be limited by these terms. These terms are only used to distinguish information of the same type from each other. For example, without departing from the scope of one or more embodiments of this specification, the first may also be referred to as the second, and similarly, the second may also be referred to as the first. Depending on the context, the word "if" as used herein can be interpreted as "when", "upon", or "in response to determining".
[0070]First of all, the terminology involved in one or more embodiments of this specification is explained.
[0071]SKU (Stock Keeping Unit): the basic unit for measuring stock in and out, which can be counted in pieces, boxes, pallets, and so on.
[0072]In this specification, a target object processing method is provided. This specification also relates to a target object processing device, a computing device, and a computer-readable storage medium, which are described in detail one by one in the following embodiments.
[0073]In the garbage-sorting scenario, in order to teach users correct garbage sorting and improve sorting quality, some trash bins have to be opened at fixed points and fixed times, and community staff have to be arranged to give sorting guidance. Although such manual intervention can help users improve sorting quality, it also increases labor costs, and fixed-point, fixed-time delivery is inconvenient for the many people who cannot drop off garbage in time because of work and daily life. On this basis, one or more embodiments of this specification provide a target object processing method to solve the above technical problems. In practical applications, the method is applicable not only to garbage-sorting scenarios but to any scenario that requires item classification, for example sorting in express-delivery logistics, and this application places no restriction on this.
[0074]For ease of understanding, the following describes in detail how the target object processing method is applied in the garbage-sorting scenario to process garbage dropped off by a user.
[0075]See Figure 1, which shows a flowchart of a target object processing method according to an embodiment of this specification; the method specifically includes the following steps.
[0076]Step 102: Receive a user's processing request for a target object processing device, and turn on the target object processing device based on the processing request.
[0077]Wherein, the target object is an object dropped off by the user at the target object processing device. In an actual application scenario, the target object is garbage that the user drops off at the target object processing device, without limitation on the type of garbage; for example, the garbage includes but is not limited to recyclable garbage, hazardous garbage, kitchen waste, and other garbage. The target object processing device is a device that processes the target objects dropped off by users; in an actual application scenario, it may be a sorting device, such as a garbage collection box, that processes the garbage delivered by users.
[0078]In practical applications, taking the target object processing device as a garbage collection box as an example, the garbage collection box receives the user's garbage-disposal request for the box and opens the insertion port of the box based on that request.
[0079]Further, the target object processing device receives the user's processing request for the target object, which can be specifically implemented in the following manner:
[0080]Before the receiving of the processing request from the user for the target object processing device, the method further includes:
[0081]Receive a request triggered by the user clicking the display interface of the target object processing device, generate an identification code, and display the identification code on the display interface of the target object processing device.
[0082]Wherein, the identification code can be understood as a code that the user scans with a user terminal to obtain information. In an actual application scenario, the identification code may be a two-dimensional (QR) code that carries information.
[0083]In a specific implementation, before the target object processing device receives the user's processing request, the user can send a request to the device by clicking on its display interface; an identification code is then generated and displayed on the display interface, where the identification code carries the processing information for the target object. In practical applications, the request is not limited to being sent by clicking; it can also be triggered by sliding in a sliding area of the display interface, which this application does not limit.
[0084]In practical applications, the garbage collection box is equipped with a display device whose display interface can show the identification code (for example a QR code). When the user clicks the corresponding area on the display interface of the garbage collection box, a QR code is generated and shown on the display interface.
[0085]In the target object processing method provided by this embodiment of the specification, before a processing request is sent to the target object processing device, an identification code is generated in response to a request triggered by the user's operation and is displayed on the display interface of the device. The user can then send a processing request for the target object to the device through the identification code, which not only reduces how much the user has to physically touch the device but also makes subsequent use much more convenient.
[0086]Furthermore, the receiving a processing request from the user for the target object processing device includes:
[0087]Receive a processing request for the target object processing device generated by the user scanning, through a user terminal, the identification code displayed on the display interface, wherein the processing request carries the attribute information of the user.
[0088]Specifically, when the user scans, through the user terminal, the identification code displayed on the display interface, a processing request for the target object processing device is sent, and the request carries the user's attribute information. The attribute information includes basic information such as name, gender, age, occupation, and home address, which the user fills in by registering when first using the target object processing device.
[0089]For example, in an actual garbage-sorting application, the garbage collection box generates a QR code in response to the user's click, and the user scans the QR code on the display interface of the box with a mobile phone to send a processing request to the box; the processing request generated by scanning the QR code carries the user's basic information.
[0090]In addition, in practical applications, the identification code can be given a validity period: if the user does not scan the identification code in time after it is generated by a click, the code becomes invalid once the validity period is exceeded, and scanning an invalid code will not send a garbage-disposal request to the garbage collection box. For example, if the validity period is set to one minute, the garbage collection box starts a one-minute countdown once the code is displayed on the display interface; within that minute the user can scan the code to send a processing request, and after the minute has passed the collection box returns a message to the user that the code has expired and prompts the user to click to generate a new one.
[0091]To better understand the process of displaying the identification code on the recycling box, refer to Figure 2, which shows a flowchart of the identification-code display process when the target object processing method provided in an embodiment of this specification is applied to the garbage-sorting scenario; it specifically includes the following steps.
[0092]Step 202: The recycling box sends a request for generating an identification code to the server.
[0093]Specifically, the recycling box sends a request for generating an identification code to the server based on the user's operation on the display interface.
[0094]Step 204: The server generates a time-limited QR code based on the request.
[0095]Specifically, the server generates a two-dimensional code with a certain validity period; if the validity period is exceeded, the user's scan will fail.
[0096]Step 206: The server returns the two-dimensional code, which is displayed on the display interface of the recycling box.
[0097]Specifically, the server sends the generated two-dimensional code to the recycling box, where it is displayed on the display interface so that the user can scan it with a mobile terminal.
[0098]In the target object processing method provided in this embodiment of the specification, an identification-code request based on the user's operation produces an identification code with a validity period, which the user scans to send a processing request to the target object processing device. Giving the code a validity period prevents the user from saving and reusing the same code; otherwise repeated scans of one code would make the user's processing requests for target objects impossible to trace.
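For illustration only, the following Python sketch shows one possible way to realize the validity-period check described above. The function names, the payload fields, and the one-minute constant are assumptions made for this sketch and are not specified by this specification.

```python
# Minimal sketch of a time-limited identification code (names are illustrative).
import secrets
import time

VALIDITY_SECONDS = 60  # the example above uses a one-minute validity period


def generate_identification_code():
    """Create a code payload that the collection box can render as a QR code."""
    return {
        "code": secrets.token_urlsafe(16),  # unique, non-guessable identifier
        "issued_at": time.time(),           # server-side issue timestamp
    }


def is_code_valid(payload, now=None):
    """A code is only accepted while its validity period has not elapsed."""
    now = time.time() if now is None else now
    return (now - payload["issued_at"]) <= VALIDITY_SECONDS


if __name__ == "__main__":
    payload = generate_identification_code()
    print(is_code_valid(payload))                                  # True right after issuing
    print(is_code_valid(payload, now=payload["issued_at"] + 61))   # False once expired
```

In such a sketch, a scan of an expired code would simply be rejected and the user prompted to generate a new code, mirroring the behavior described above.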
[0099]Step 104: When it is detected that the user has placed a target object in the target object processing device, start the image acquisition device of the target object processing device.
[0100]Wherein, the image acquisition device can be understood as a camera-equipped acquisition device included in the target object processing device.
[0101]Specifically, when the target object processing device detects that the user has placed a target object in it, it may activate its image acquisition device to photograph the target object and obtain an image of it.
[0102]Further, before the user places the target object in the target object processing device, the insertion port of the device is opened based on the user's processing request. Specifically, turning on the target object processing device may be implemented in the following manner:
[0103]The turning on the target object processing device based on the processing request includes:
[0104]The identity of the user is verified based on the attribute information of the user, and if the user passes identity verification, the target object processing device is turned on based on the processing request.
[0105]Specifically, after receiving the processing request that the user sends by scanning the identification code, the target object processing device verifies the user's identity against the attribute information carried in the request; if the user's identity information passes verification, the device is turned on based on the processing request.
[0106]In practical applications, before using the garbage collection box, the user must scan the identification code with the user terminal, enter the system, and register, entering basic information which may include the user's name, gender, age, occupation, home address, account level, and so on. After the garbage collection box receives the garbage-sorting request sent by the user scanning the QR code, the user attribute information carried in the request is authenticated; if it matches the user information in the system's user database, the user's identity information has already been entered in the system, identity verification passes, and the target object processing device can be turned on based on the processing request.
[0107]The target object processing method provided by this embodiment of the specification verifies the user's identity, and the target object processing device can be turned on only when the user's identity is verified, which ensures the operational security of the device.
[0108]Further, when the user's identity information does not pass verification, the target object processing device is turned on in the following manner:
[0109]The turning on the target object processing device based on the processing request includes:
[0110]In the case that the user's identity verification fails, return an authentication and authorization interface to the user;
[0111]In the case that the user's identity authorization is successful, the target object processing device is turned on based on the processing request.
[0112]Specifically, when verification of the received user identity information fails, the target object processing device presents an identity authentication and authorization interface on the user terminal used by the user. Once the user authorizes, the device can obtain the user's login state and basic information, and upon receiving the user's successful authorization of the identity information it opens the insertion port of the device based on the processing request sent by the user through the identification code.
[0113]In practical applications, if the user has never used the target object processing device, then after the user triggers a request at the device and scans the identification code it displays with the user terminal, the device will be unable to match the user's identity information in the system's user database and returns the authentication and authorization interface to the user's terminal. If the user authorizes successfully, the device obtains the user's login state and basic information and can then open its insertion port based on the user's processing request.
[0114]The target object processing method provided in the embodiments of this specification sends an authorization interface to a new user who has never used the target object processing device and, once the user authorizes, synchronizes the user's identity information and login state. This gives the user the convenience of automatic login and then turns on the device for the user, improving efficiency for new users.
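The branch described above can be illustrated with the following Python sketch: a registered user opens the device directly, while an unregistered user is first sent an authorization step and then has identity information synchronized. The in-memory user store and the function names are hypothetical and not part of this specification.

```python
# Illustrative sketch of the identity-verification / authorization fallback.
REGISTERED_USERS = {
    "user-123": {"name": "A", "age": 30},   # toy in-memory "user database"
}


def handle_processing_request(user_id):
    if user_id in REGISTERED_USERS:
        return "open_insertion_port"          # identity verified: open the device
    # identity not found: return the authentication/authorization interface
    return "return_authorization_interface"


def handle_authorization_result(user_id, profile, authorized):
    if not authorized:
        return "reject_request"
    REGISTERED_USERS[user_id] = profile       # synchronize identity info and login state
    return "open_insertion_port"


if __name__ == "__main__":
    print(handle_processing_request("user-123"))   # open_insertion_port
    print(handle_processing_request("user-999"))   # return_authorization_interface
    print(handle_authorization_result("user-999", {"name": "B"}, authorized=True))
```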
[0115]In addition, when turning on the target object processing device, in order to prevent a user from maliciously opening it and affecting processing efficiency, the user's real-time location information needs to be obtained before the device is turned on and matched against the location information of the device in order to decide whether to open it. The specific implementation is as follows:
[0116]The turning on the target object processing device based on the processing request includes:
[0117]Obtaining address information of the current user, and matching the address information with the address information of the target object processing device, to obtain a matching value of the user;
[0118]In the case where the matching value is greater than or equal to a preset threshold, turning on the target object processing device;
[0119]In the case that the matching value is less than the preset threshold value, a notification of failure to start is fed back to the user.
[0120]Wherein, the address information of the current user is the real-time location information of the current user's terminal, and the address information of the target object processing device is the address information of the device's location.
[0121]Specifically, when the target object processing device obtains the real-time address information of the user terminal from the user's terminal, that address information is matched against the address information of the device to obtain a matching value between the user and the device. When the matching value is greater than or equal to the preset threshold, the insertion port of the device is opened; when the matching value is less than the preset threshold, a message that opening failed is fed back to the user's terminal.
[0122]For example, in a scenario where a user drops garbage into a garbage collection box, after the user, standing next to garbage collection box S1 in community A, triggers an operation on S1, S1 receives the request sent by the user scanning the QR code. If the user's identity information passes verification, the box obtains the user's specific geographic location a in community A and matches it against the geographic location b of S1 to obtain a matching value W. Comparing W with the preset threshold U shows that W is greater than U, so the garbage collection box opens its insertion port for the user.
[0123]As another example, suppose the user is at location A and has obtained the QR code of garbage collection box S2, which is located at B. The user scans the QR code with the user terminal to send the user's identity information to the system of S2. When S2 passes the verification of the identity information, it obtains the user's specific geographic location a1 at A and matches it against the specific geographic location b1 of S2 to obtain a matching value W1. Comparing W1 with the preset threshold U1 shows that W1 is less than U1, so S2 feeds back to the user's terminal that opening the insertion port failed.
[0124]In practical applications, the user's real-time location information is obtained, and if it does not match the location information of the target object processing device, the user cannot open the insertion port for subsequent use. This avoids the situation in which the device is opened under one user's recognized identity while someone else actually places the target object into the insertion port, and it also prevents users from maliciously opening the device and lowering its normal utilization rate.
[0125]The target object processing method provided by this embodiment of the specification matches the user's real-time location information against the location information of the target object processing device and controls whether the device is turned on according to the matching value. This reduces the risk of users abnormally operating the device from another region, while guaranteeing operation rights for users whose location closely matches the device.
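This specification does not define how the matching value is computed; as one illustrative possibility, the sketch below converts the distance between the user terminal and the device into a score in [0, 1] and compares it with a threshold. The haversine-based scoring, the 100 m range, and the 0.5 threshold are assumptions for this sketch only.

```python
# One possible realization of the "matching value": distance-based scoring.
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in metres."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def matching_value(user_pos, device_pos, max_distance_m=100.0):
    """Score of 1.0 when the user stands at the device, falling to 0.0 beyond max_distance_m."""
    d = haversine_m(*user_pos, *device_pos)
    return max(0.0, 1.0 - d / max_distance_m)


def decide_open(user_pos, device_pos, threshold=0.5):
    return matching_value(user_pos, device_pos) >= threshold


if __name__ == "__main__":
    device = (30.2741, 120.1551)
    print(decide_open((30.2742, 120.1552), device))  # nearby user: True
    print(decide_open((30.3000, 120.2000), device))  # remote user: False
```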
[0126]To further understand the process of turning on the target object processing device, refer to Figure 3, which shows a flowchart of the process in which a user scans an identification code to open a garbage collection box when the target object processing method provided by an embodiment of this specification is applied to the garbage-sorting scenario; it specifically includes the following steps.
[0127]Step 302: The user terminal scans the identification code displayed on the display interface of the garbage collection box, and sends the user's identity information and the real-time location information of the user terminal to the collection box.
[0128]Step 304: The garbage collection box sends the user's identity information and the real-time location information of the user terminal to the server.
[0129]Step 306: The server verifies the identity information of the user.
[0130]Step 308: In the case where the user's identity information is verified, the server matches the real-time location information of the user terminal with the location information of the garbage collection bin.
[0131]Step 310: In the case that the matching value is greater than or equal to the preset threshold value, send an opening instruction to the garbage collection bin.
[0132]Step 312: The garbage collection box opens its insertion port based on the opening instruction.
[0133]Step 314: Return a successful opening message to the user terminal.
[0134]The target object processing method provided by this embodiment of the specification verifies the user's identity information and matches the current user's geographic location information before the user can turn on the target object processing device. Only when both checks pass can the device be turned on, which not only ensures the utilization rate of the device but also reduces the risk of users operating it remotely.
[0135]Step 106: Collect an image of the target object based on the image acquisition device, and perform category analysis on the image of the target object to determine the category of the target object.
[0136]Wherein, the image of the target object is an image of the target object placed on the target object processing device and collected by the image acquisition device.
[0137]Specifically, the image acquisition device collects the image of the target object placed in the target object processing device, and category analysis is performed on the obtained image to determine the category of the target object. In practical applications, the image acquisition device is built into the target object processing device to ensure that images of the target object are captured promptly and accurately.
[0138]Further, collecting the image of the target object with the image acquisition device includes:
[0139]Upon receiving the delivery-completion information generated by the user clicking the delivery-complete button on the display interface of the target object processing device, starting the image acquisition device of the target object processing device;
[0140]Photographing the target object to obtain an image of the target object.
[0141]In practical applications, take the garbage collection box as an example: after the user puts garbage into its insertion port and clicks the delivery-complete button on its display interface, the image acquisition device of the box is started and photographs the dropped garbage to obtain an image of it.
[0142]By building the image acquisition device into the target object processing device, the target object processing method provided by the embodiments of this specification can capture the image of the target object accurately, which facilitates accurate subsequent category analysis of the target object.
[0143]Furthermore, the performing category analysis on the image of the target object to determine the category of the target object includes:
[0144]Inputting the image of the target object collected by the image acquisition device into an image classification model to obtain a classification result for the target object;
[0145]The target object is classified based on the classification result.
[0146]Wherein, the image classification model is a machine-learning model for classifying images into categories, and the classification result corresponds to a category of classification box in the target object processing device.
[0147]Specifically, the image of the target object collected by the image acquisition device in the target object processing device is input into the image classification model, SKU-level recognition is performed, and a classification recognition result for the target object is obtained; the target object processing device then classifies the target object based on that result.
[0148]For example, in a scenario where a garbage collection box classifies garbage, user A puts an empty plastic bottle into the box's insertion port. The box activates the image acquisition device to photograph the empty plastic bottle, inputs the photo into the box's preset image classification model, and performs SKU-level recognition on it. The recognition result identifies the empty plastic bottle as recyclable garbage, and the garbage collection box classifies it as recyclable garbage based on that result.
[0149]In the target object processing method provided by the embodiments of this specification, inputting the collected image of the target object into the image classification model yields the classification result for the target object, and the target object is classified based on that result. The target object processing device thus classifies target objects automatically: no manual sorting is required, and an object dropped off by the user is classified automatically.
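As a rough illustration of the mapping from an SKU-level recognition result to a classification-box category, the following Python sketch stubs out the model call and looks the result up in a small table. The SKU names, the table, and the stubbed predict function are placeholders; this specification does not name a specific model.

```python
# Sketch: SKU-level recognition result -> classification-box category.
SKU_TO_CATEGORY = {
    "plastic_bottle_500ml": "recyclable",
    "battery_aa": "hazardous",
    "food_scraps": "kitchen",
}


def predict_sku(image_bytes):
    """Placeholder for the image classification model (e.g. a CNN served remotely)."""
    # A real system would run inference on the captured photo here.
    return "plastic_bottle_500ml"


def classify_target_object(image_bytes):
    """Map the SKU-level recognition result onto a classification-box category."""
    sku = predict_sku(image_bytes)
    return SKU_TO_CATEGORY.get(sku, "other")   # unknown SKUs fall back to "other"


if __name__ == "__main__":
    print(classify_target_object(b"<jpeg bytes>"))   # recyclable
```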
[0150]To further understand how the target object processing device recognizes and classifies the target object, refer to Figure 4, which shows a flowchart of the process in which a garbage collection box recognizes and classifies garbage in a garbage-sorting scenario according to an embodiment of this specification; it specifically includes the following steps.
[0151]Step 402: After receiving the garbage dropped by the user, the garbage collection box closes its insertion port and turns on the image acquisition device in the box to photograph the garbage and obtain images of it.
[0152]Step 404: The garbage collection bin uploads the image of the garbage to the server.
[0153]Step 406: The server inputs the image of the garbage into the image classification model, starts the visual recognition device in the server, and calculates the classification result corresponding to the garbage based on a preset algorithm.
[0154]Step 408: The server returns the category result to the garbage collection box.
[0155]Step 410: The garbage collection box updates the display information in the display interface of the garbage collection box based on the category result.
[0156]Step 412: The server returns the category result to the user terminal.
[0157]Step 414: The user terminal displays the category result by refreshing the user interface.
[0158]In the target object processing method provided by the embodiments of this specification, inputting the collected image of the target object into the image classification model yields the classification result, and the target object is classified based on that result, so that the target object processing device performs the classification automatically without manual sorting, which greatly saves human resources.
[0159]Further, after the classification of the target object based on the classification result, the method further includes:
[0160]According to the category of the target object, a first equity value corresponding to the category of the target object is issued to the user.
[0161]Wherein, the equity value can be understood as the reward the user obtains after delivering the target object to the target object processing device, including but not limited to account points, energy points, and related rights that meet the user's needs. Different first equity values can be preset for different application scenarios, which this application does not limit.
[0162]In practical applications, following the example above, user A drops an empty plastic bottle into the insertion port of the target object processing device, and based on the device's classification result the empty plastic bottle is classified as recyclable garbage. Under the device's preset rules the equity value corresponding to recyclable garbage is 1 point, so the device creates a points account for user A and adds one point to it because the empty plastic bottle is recyclable. The preset rules can differ across application scenarios and requirements, which this application does not limit.
[0163]In the target object processing method provided by this embodiment of the specification, after the target object processing device processes the target object, the user is issued the equity value corresponding to the object's category. On the one hand this ensures that different categories of target object earn different equity values; on the other hand it rewards the user for putting the target object into the device.
[0164]In addition, while a new user has a points account created and credited based on the classification result of the target object, an existing user who already has a delivery record with the target object processing device is issued a second equity value, different from the first equity value, according to the user's delivery record and the category of the target object. This is implemented as follows:
[0165]The issuing the equity value corresponding to the target object category to the user includes:
[0166]Acquiring a delivery record of the user delivering the target object;
[0167]Based on the delivery record and the category of the target object delivered by the user, a corresponding second equity value is issued to the user.
[0168]Wherein, the delivery record is a record of the user placing target objects into the target object processing device, and the second equity value is different from the first equity value.
[0169]Specifically, the target object processing device obtains the user's historical record of delivering target objects and issues the second equity value based on how many times the user has used the device and how many times this category of target object has been delivered. Unlike the first equity value, the second equity value takes the delivery record into account and is higher than the first equity value.
[0170]Following the example above, user A drops an empty plastic bottle into the insertion port of the target object processing device, and the device classifies it as recyclable garbage. After obtaining the classification result, the device also obtains user A's historical delivery record; from that record it determines that user A has delivered recyclable garbage 10 times, and based on the history, the delivery record, and the recyclable garbage just delivered, it issues the corresponding equity value by adding 5 points to user A's points account.
[0171]In addition, in practical applications, the target object processing device can be configured with an equity-value redemption mall: when the equity value in the user's points account reaches or exceeds a preset threshold, the user can use the equity value earned by delivering target objects to exchange for corresponding goods in the mall preset by the device. The concrete implementation can be configured differently according to users' needs, which this application does not limit.
[0172]The target object processing method provided by this embodiment of the specification obtains the user's historical usage record and, based on that record and the category of the target object, issues the user a second equity value higher than the first. This rewards the user for using the target object processing device, encouraging more frequent use and increasing the device's utilization rate.
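The points logic described above can be sketched as follows: a base award per category (the first equity value) and a higher award (the second equity value) once the user's delivery record for that category is long enough. All constants, thresholds, and names here are illustrative assumptions; the concrete rules are left to configuration.

```python
# Sketch of issuing first / second equity values based on the delivery record.
FIRST_EQUITY_VALUE = {"recyclable": 1, "hazardous": 2, "kitchen": 1, "other": 0}
BONUS_THRESHOLD = 10          # e.g. ten prior recyclable deliveries, as in the example above
SECOND_EQUITY_VALUE = 5       # higher award for users with a sufficient delivery record

accounts = {}                 # user id -> accumulated points
delivery_log = {}             # user id -> list of delivered categories


def issue_equity(user_id, category):
    history = delivery_log.setdefault(user_id, [])
    prior = history.count(category)
    if prior >= BONUS_THRESHOLD:
        award = SECOND_EQUITY_VALUE
    else:
        award = FIRST_EQUITY_VALUE.get(category, 0)
    history.append(category)
    accounts[user_id] = accounts.get(user_id, 0) + award
    return award


if __name__ == "__main__":
    print(issue_equity("user-A", "recyclable"))     # 1 (first equity value)
    delivery_log["user-A"] = ["recyclable"] * 10
    print(issue_equity("user-A", "recyclable"))     # 5 (second equity value)
```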
[0173]In addition, after determining the category of the target object, the method further includes:
[0174]Returning the classification result to the user terminal corresponding to the user, and starting the voice device of the target object processing device;
[0175]Broadcasting to the user the classification result determined by the target object processing device for the target object.
[0176]Specifically, the classification result of the target object delivered by the user is returned to the user's terminal, and the target object processing device activates its voice device to broadcast the classification result for the target object, improving the user's awareness of it.
[0177]The target object processing method provided by this embodiment of the specification not only feeds the classification result for the target object back to the user terminal but also, to address irregular delivery behavior, broadcasts the result through the voice device of the target object processing device. This reminds and educates users with irregular delivery behavior in a timely way, thereby improving the processing efficiency for target objects.
[0178]Step 108: Put the target object into the classification box of the target object processing device corresponding to the category of the target object.
[0179]Wherein, the classification box is a preset classification box set in the target object processing device.
[0180]Specifically, after the target object processing device determines the category of the target object delivered by the user, the target object is dropped, according to its category, into the classification box of the device corresponding to that category.
[0181]Further, the target object processing device includes at least two classification boxes;
[0182]Correspondingly, the placing the target object into the classification box of the target object processing device corresponding to the category of the target object includes:
[0183]Judging whether the classification box of the target object processing device corresponding to the category of the target object is full;
[0184]If yes, in a case where there is an empty classification box in the target object processing device, drop the target object into the empty classification box, and set the category of the empty classification box as the category of the target object;
[0185]If not, the target object is dropped into the classification box of the target object processing device corresponding to the category of the target object.
[0186]Wherein, the classification boxes can be understood as at least two different boxes preset according to the classification rules, whose categories together correspond to all the categories of target objects; an empty classification box can be understood as a box into which no target object has been placed. The categories of the classification boxes can be configured differently for specific usage scenarios, which is not limited here.
[0187]Specifically, the target object processing device includes at least two classification boxes. After the device determines the category of the target object dropped by the user, its detection module checks whether the classification box corresponding to that category is full. If it is full, the device checks whether there is an empty classification box; if there is, the target object is dropped into the empty box and the empty box's category is set to the category of the dropped target object. If the corresponding box is not full, the target object is dropped directly into the classification box corresponding to its category. The condition for judging whether the corresponding classification box is full can be changed as needed; for example, it can instead judge whether the load level of the classification box corresponding to the target object's category has reached a preset threshold, which this application does not limit.
[0188]In practical applications, take a user putting garbage into a garbage collection box as an example. The box contains three categories of classification bins: one bin A1 for recyclable garbage, two bins B1 and B2 for hazardous garbage, and three bins C1, C2, and C3 for other garbage. When user S puts two plastic bottles into the box and the box determines that the bottles are recyclable garbage, it must judge whether the bin holding recyclable garbage is full or whether its load has reached a threshold. If bin A1 is judged full or at the load threshold, the box checks the bins for an empty one; on detecting that hazardous-garbage bin B2 is empty, it sets B2 as a recyclable-garbage bin, relabels it A2, and puts the two plastic bottles dropped by user S into it. If the box determines that A1 is not full of recyclable garbage and its load has not reached the threshold, the two plastic bottles continue to be put into bin A1.
[0189]In the target object processing method provided in this embodiment of the specification, by judging whether the classification box corresponding to the target object is full or whether its load has reached a given condition, an empty classification box detected in the target object processing device is reassigned to the category of the target object and the target object is put into it. The categories of the classification boxes in the device can thus be adjusted dynamically according to how users actually deliver target objects, further improving space utilization.
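The bin-selection rule described above can be sketched in a few lines of Python: prefer a non-full bin already assigned to the category, and otherwise repurpose an empty bin by relabeling it. The Bin class, the capacity check, and the names are assumptions made for this sketch.

```python
# Sketch: choose a bin for a classified item; relabel an empty bin if the category's bin is full.
from dataclasses import dataclass, field


@dataclass
class Bin:
    name: str
    category: str
    capacity: int = 10
    items: list = field(default_factory=list)

    def is_full(self):
        return len(self.items) >= self.capacity

    def is_empty(self):
        return not self.items


def select_bin(bins, category):
    # First choice: a non-full bin already assigned to this category.
    for b in bins:
        if b.category == category and not b.is_full():
            return b
    # Fallback: repurpose an empty bin and relabel it with the new category.
    for b in bins:
        if b.is_empty():
            b.category = category
            return b
    return None   # no usable bin: the caller would trigger the alarm device


if __name__ == "__main__":
    bins = [Bin("A1", "recyclable", capacity=1), Bin("B2", "hazardous")]
    bins[0].items.append("bottle")            # A1 is now full
    chosen = select_bin(bins, "recyclable")
    print(chosen.name, chosen.category)       # B2 recyclable (relabeled, as in the example above)
```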
[0190]Further, while the target object processing device is classifying target objects, if the target object delivered by the user does not match any classification-box category preset in the device, the device uses a reserved classification box to place the object, which is implemented as follows:
[0191]The target object processing device includes at least one reserved classification box;
[0192]Correspondingly, the placing the target object into the classification box of the target object processing device corresponding to the category of the target object includes:
[0193]Judging whether the category of the target object matches the classification box of the target object processing device corresponding to the category of the target object;
[0194]If yes, drop the target object into the classification box of the target object processing device corresponding to the category of the target object;
[0195]If not, the target object is dropped into the reserved classification box of the target object processing device corresponding to the category of the target object.
[0196]Wherein, the reserved classification box can be understood as a classification box reserved in addition to the boxes assigned to the target-object categories; there is at least one such box, which is not limited herein.
[0197]Specifically, in practical applications and following the example above, the reserved classification bin can be used for a category different from the three mentioned above, for example kitchen waste. After the collection box determines that the garbage user S has put in is kitchen waste, it judges whether the kitchen-waste category matches any classification bin in the box; since no kitchen-waste bin is configured, the kitchen waste is put into the reserved bin corresponding to that category.
[0198]In addition, when the target object processing device receives a target object that does not match any classification-box category, it sends the processing result and a suggestion to the user, where the suggestion can be understood as guidance on the standard handling of the target object; in the garbage-sorting scenario this can be used to teach users about garbage sorting.
[0199]In the target object processing method provided in this embodiment of the specification, a reserved classification box is preset in the target object processing device so that even when the target object placed by the user cannot be matched to a corresponding classification box, it is still delivered, according to the classification result, into the reserved box. This improves the user's experience of the device and also improves the processing efficiency for target objects.
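The reserved-box fallback can be sketched as a simple routing function: a recognized category that matches a configured box goes there, and anything else goes to the reserved box together with a sorting suggestion for the user. The category set, box names, and message text are illustrative assumptions.

```python
# Sketch: route an item to its category box, or to the reserved box with a suggestion.
CONFIGURED_CATEGORIES = {"recyclable", "hazardous", "other"}
RESERVED_BIN = "reserved-1"


def route_item(category):
    if category in CONFIGURED_CATEGORIES:
        return "bin-" + category, "sorted normally"
    return RESERVED_BIN, "no '%s' box configured; see the sorting guide" % category


if __name__ == "__main__":
    print(route_item("recyclable"))   # ('bin-recyclable', 'sorted normally')
    print(route_item("kitchen"))      # ('reserved-1', "no 'kitchen' box configured; ...")
```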
[0200]Further, when a classification box of the target object processing device is full of target objects or its load of target objects reaches a preset threshold, the classification box is handled as follows:
[0201]After determining whether the classification box of the target object processing device corresponding to the category of the target object is full, the method further includes:
[0202]When it is determined that the classification box of the target object processing device corresponding to the target object category is full, an alarm device is triggered.
[0203]Specifically, when it is determined that the classification box of the target object processing device corresponding to the target object category is full, the alarm device of the target object processing device will be triggered to facilitate subsequent manual processing of the target object.
[0204]The target object processing method provided by this embodiment of the specification proposes an automatic garbage-sorting approach in which users' delivery behavior can be traced in the garbage-sorting scenario. Garbage delivered to the insertion port of the garbage collection box is classified automatically, and the classification bins are adjusted dynamically according to the determined garbage categories to improve the box's utilization. At the same time, the user's delivery behavior can be traced on the user terminal, enabling guidance and education about garbage delivery, which raises users' awareness of garbage sorting and addresses the poor sorting quality of ordinary recycling bins.
[0205]In addition, the target object processing method provided in this specification is applied to the garbage collection box service platform of a garbage collection box business system, which comprises a garbage collection box device, a user terminal, and the garbage collection box service platform. To further illustrate how the method is implemented in the garbage-sorting application scenario, the method is explained below with reference to Figure 5, which shows the structure of a garbage collection box business system provided by an embodiment of this specification.
[0206]The garbage collection box business system 500 includes a garbage collection box device 502, a user terminal 504, and a garbage collection box service platform 506; within the business system 500, the service platform 506 is connected to and communicates with the device 502 and the user terminal 504.
[0207]The garbage collection box device 502 is mainly used to drive the hardware device of the garbage collection box and provide a display interface for the user to interact with the garbage collection box device 502, and is connected to the user terminal 504 and the garbage collection box service platform 506 through the network.
[0208]Specifically, the hardware driver management in the garbage collection box device 502 drives and manages the hardware; for example, when the device 502 receives an instruction to open the insertion port, it opens the port and can also interact with sensors of the corresponding type in the device 502. The communication network management in the device 502 maintains stable communication with the user terminal 504 and the garbage collection box service platform 506, and also provides functions such as network-quality detection and network reconnection. The message management in the device 502 handles message exchange with the user terminal 504 and the service platform 506, including returning to the platform 506 the information that the insertion port was opened successfully and the detection result that garbage has been placed in the device 502. The box interaction system in the device 502 provides users with an operation interface on the device: for example, the user scans the identification code shown on the interface to send a processing request to the service platform 506, or the interface shows the user promotional pictures about garbage sorting.
[0209]The user terminal 504 serves as an interaction medium between the user and the garbage collection box device 502 and provides services for the user. The user opens the user terminal 504 and scans the identification code displayed on the display interface of the garbage collection box device 502; when the user's identity is verified and the geographical locations are detected to match, the garbage collection box device 502 can be opened. The user information authorization function in the user terminal 504 is used to synchronize the user's identity information and login status under the user's authorization; after the user has authorized, automatic login can be provided for the user's convenience. The equity function in the user terminal 504 manages the different categories of garbage delivered by the user and feeds back the equity values corresponding to the garbage categories; in practical applications, the equity value can be set in different forms according to requirements, which is not limited here, and the equity function also allows the user to view the relevant equity values obtained at any time. The geographic location function in the user terminal 504 obtains the real-time location information of the user terminal 504 and sends it to the garbage collection box service platform 506 to verify whether the geographic location of the user matches the location of the garbage collection box device 502, in order to reduce the risk of the user abnormally operating the garbage collection box device 502 across regions, and at the same time to ensure that the user has the operation authority for the garbage collection box device 502 whose geographic location matches.
[0210]The garbage collection bin service platform 506 serves as the main business platform for processing garbage classification. A target object processing method provided in this specification is applied to the garbage collection bin service platform 506, which provides business support for the garbage collection bin device 502 and the user terminal 504.
[0211]Specifically, the user management in the garbage collection box service platform 506 is used to manage users who deliver garbage; after the user scans the identification code displayed by the garbage collection box device 502, the user management saves the login status of the user terminal 504 on the garbage collection box service platform 506, and the platform also manages garbage collection box device information and controls the business status. The device control in the garbage collection box service platform 506 is used to interact with the garbage collection box device 502 and to trigger control instructions for the garbage collection box device 502. The message management in the garbage collection box service platform 506 is used for receiving and sending messages between the garbage collection box device 502 and the user terminal 504 and for managing various types of business data. The rule management in the garbage collection box service platform 506 is used to manage the behavior rules for users delivering garbage, for example, managing the equity values that users can obtain for delivering different categories of garbage. The delivery record in the garbage collection box service platform 506 is used to manage the user's historical delivery behavior; according to the user's historical delivery records, different equity values can be issued to the user. The visual recognition in the garbage collection box service platform 506 is used to identify the garbage delivered by the user at the SKU level and to convert the recognition result into a device control instruction, where the device control instruction includes classifying the garbage delivered by the user based on the recognition result and controlling the garbage collection box device 502 to put the garbage into the classification box corresponding to the classification result.
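As a hedged illustration of the visual recognition module described above, the following sketch converts an SKU-level recognition result into a device control instruction that routes the garbage to the classification box of the matching category. The category names, the CATEGORY_TO_BIN mapping, and the instruction format are assumptions for this example only.

# Illustrative mapping from SKU-level labels to classification-box categories.
CATEGORY_TO_BIN = {
    "plastic_bottle": "recyclable",
    "banana_peel": "kitchen_waste",
    "battery": "hazardous",
}


def build_control_instruction(sku_label: str, confidence: float) -> dict:
    """Map an SKU-level recognition result to a bin-routing control instruction."""
    category = CATEGORY_TO_BIN.get(sku_label, "other")
    return {
        "cmd": "route_to_bin",
        "category": category,
        "confidence": confidence,
    }


print(build_control_instruction("plastic_bottle", 0.97))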
[0212]The target object processing method provided in the embodiments of this specification is applied to the garbage collection box service platform in the garbage collection box business system. Through the interactive connection of the garbage collection box device, the user terminal, and the garbage collection box service platform, the garbage delivered by the user can be sorted automatically; in addition, when the user disposes of garbage with the garbage collection box device, the user's historical delivery behavior can be traced at the user terminal, so that the user's garbage delivery behavior can be guided and educated, thereby enhancing the user's awareness of garbage classification and solving the problem of the low classification quality of ordinary recycling bins.
[0213]Corresponding to the above method embodiment, this specification also provides an embodiment of a target object processing apparatus. Figure 6 shows a schematic structural diagram of a target object processing apparatus provided by an embodiment of this specification. As shown in Figure 6, the apparatus includes:
[0214]The receiving module 602 is configured to receive a processing request from a user for a target object processing device, and turn on the target object processing device based on the processing request;
[0215]The activation module 604 is configured to activate the image acquisition device of the target object processing device when it is detected that the user has placed a target object into the target object processing device;
[0216]The analysis and determination module 606 is configured to collect an image of the target object based on the image acquisition device, and perform category analysis on the image of the target object to determine the category of the target object;
[0217]The processing module 608 is configured to drop the target object into the classification box of the target object processing device corresponding to the category of the target object.
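For readers who want a concrete picture of how the four modules could fit together, the following minimal Python sketch walks one delivery through the receiving, activation, analysis-and-determination, and processing steps. The DemoDevice class and the lambda classifier are illustrative stand-ins, not part of this embodiment.

class DemoDevice:
    """Stand-in for the target object processing device used in this sketch."""

    def open(self, request):
        print("device opened for", request["user_id"])

    def capture_image(self):
        return "captured-image"

    def drop_into_bin(self, category):
        print("dropped into", category, "bin")


def process_delivery(request, device, classify):
    device.open(request)            # receiving module 602: open on a processing request
    image = device.capture_image()  # activation module 604: start image acquisition
    category = classify(image)      # analysis and determination module 606: classify
    device.drop_into_bin(category)  # processing module 608: route to the matching bin
    return category


process_delivery({"user_id": "user-42"}, DemoDevice(), classify=lambda img: "recyclable")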
[0218]Optionally, the processing module 608 is further configured to:
[0219]Judging whether the classification box of the target object processing device corresponding to the category of the target object is full;
[0220]If yes, in a case where there is an empty classification box in the target object processing device, drop the target object into the empty classification box, and set the category of the empty classification box as the category of the target object;
[0221]If not, the target object is dropped into the classification box of the target object processing device corresponding to the category of the target object.
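A minimal sketch of the fallback just described: if the classification box matching the detected category is full, an empty classification box is reused and relabeled with the category of the target object. The Bin data structure and its fill-level field are assumptions for illustration.

from dataclasses import dataclass


@dataclass
class Bin:
    category: str
    fill: float  # 0.0 means empty, 1.0 means full


def choose_bin(bins, category):
    """Return the bin the target object should be dropped into, or None if no space is left."""
    for b in bins:
        if b.category == category and b.fill < 1.0:
            return b
    # The matching bin is full: fall back to an empty bin and relabel it.
    for b in bins:
        if b.fill == 0.0:
            b.category = category
            return b
    return None  # no capacity left; the caller may trigger the alarm device


bins = [Bin("recyclable", 1.0), Bin("other", 0.0)]
print(choose_bin(bins, "recyclable"))  # the empty bin is relabeled to "recyclable"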
[0222]Optionally, the processing module 608 is further configured to:
[0223]Judging whether the category of the target object matches the classification box of the target object processing device corresponding to the category of the target object;
[0224]If yes, drop the target object into the classification box of the target object processing device corresponding to the category of the target object;
[0225]If not, the target object is dropped into the reserved classification box of the target object processing device corresponding to the category of the target object.
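The matching check above can be illustrated with a short sketch in which a detected category that has no matching configured classification box is routed to a reserved classification box. The names configured_bins and RESERVED_BIN are hypothetical.

RESERVED_BIN = "reserved"


def route(category: str, configured_bins: set) -> str:
    """Return the configured category if it matches a box, otherwise the reserved box."""
    return category if category in configured_bins else RESERVED_BIN


print(route("hazardous", {"recyclable", "kitchen_waste"}))  # -> "reserved"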
[0226]Optionally, the analysis and determination module 606 is further configured to:
[0227]When it is determined that the classification box of the target object processing device corresponding to the target object category is full, an alarm device is triggered.
[0228]Optionally, the analysis and determination module 606 is further configured to:
[0229]Inputting the image of the target object collected by the image acquisition device into an image classification model to obtain a classification result of the target object;
[0230]The target object is classified based on the classification result.
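As a hedged illustration of this classification step, the sketch below feeds the captured image into an image classification model and takes the highest-scoring class as the classification result. A generic torchvision ResNet stands in for the model; the label set, the untrained weights, and the preprocessing are assumptions, since this specification does not prescribe a particular model.

import torch
from torchvision import models, transforms
from PIL import Image

# Assumed garbage categories; the real model would be trained on such a label set.
GARBAGE_LABELS = ["recyclable", "kitchen_waste", "hazardous", "other"]

model = models.resnet18(weights=None, num_classes=len(GARBAGE_LABELS))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])


def classify(image_path: str) -> str:
    """Classify one captured image and return the highest-scoring garbage category."""
    img = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        logits = model(img)
    return GARBAGE_LABELS[int(logits.argmax(dim=1))]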
[0231]Optionally, the starting module 604 is further configured to:
[0232]In the case of receiving the delivery completion information generated when the user clicks the delivery completion button on the display interface of the target object processing device, start the image acquisition device of the target object processing device;
[0233]Performing photographing processing on the target object to obtain an image of the target object.
[0234]Optionally, the processing module 608 is further configured to:
[0235]According to the category of the target object, the first equity value corresponding to the category of the target object is issued to the user.
[0236]Optionally, the receiving module 602 is further configured to:
[0237]Receive the identification code generated when the user clicks the display interface of the target object processing device, and display the identification code on the display interface of the target object processing device.
[0238]Optionally, the receiving module 602 is further configured to:
[0239]Receive the identification code displayed by the user through the user terminal scanning the display interface, and generate a processing request for the target object processing device, wherein the processing request carries the attribute information of the user.
[0240]Optionally, the starting module 604 is further configured to:
[0241]The identity of the user is verified based on the attribute information of the user, and if the identity verification of the user is passed, the target object processing device is turned on based on the processing request.
[0242]Optionally, the starting module 604 is further configured to:
[0243]In the case that the user's identity verification fails, return an authentication and authorization interface to the user;
[0244]In the case that the user's identity authorization is successful, the target object processing device is turned on based on the processing request.
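A minimal sketch of this start-up flow, under the assumption that the processing request carries a user identifier: the user's identity is verified first, and when verification fails an authorization step is simulated before the device is opened. The KNOWN_USERS store and the request fields are illustrative.

# Hypothetical store of users who have already completed identity authorization.
KNOWN_USERS = {"user-42": {"authorized": True}}


def handle_processing_request(request: dict) -> str:
    """Open the device when identity verification passes, otherwise authorize first."""
    user_id = request.get("user_id")
    user = KNOWN_USERS.get(user_id)
    if user and user["authorized"]:
        return "device_opened"
    # Verification failed: an authentication/authorization interface would be returned
    # to the user here; a successful authorization is simulated for this sketch.
    KNOWN_USERS[user_id] = {"authorized": True}
    return "device_opened_after_authorization"


print(handle_processing_request({"user_id": "user-42"}))
print(handle_processing_request({"user_id": "user-7"}))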
[0245]Optionally, the starting module 604 is further configured to:
[0246]Obtaining address information of the current user, and matching the address information with the address information of the target object processing device, to obtain a matching value of the user;
[0247]In the case where the matching value is greater than or equal to a preset threshold, turning on the target object processing device;
[0248]In the case that the matching value is less than the preset threshold value, a notification of failure to start is fed back to the user.
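One plausible way to compute the matching value described above is to compare the user's reported position with the position of the target object processing device and map the distance to a score. The haversine distance, the 100-metre fall-off, and the threshold of 0.5 are assumptions chosen for this sketch.

from math import radians, sin, cos, asin, sqrt


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))


def match_value(user_pos, device_pos, max_distance_m=100.0):
    """Return 1.0 when co-located, falling to 0.0 at max_distance_m or beyond."""
    d = haversine_m(*user_pos, *device_pos)
    return max(0.0, 1.0 - d / max_distance_m)


PRESET_THRESHOLD = 0.5
mv = match_value((30.2741, 120.1551), (30.2744, 120.1553))
print("open device" if mv >= PRESET_THRESHOLD else "notify start failure")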
[0249]Optionally, the starting module 604 is further configured to:
[0250]Returning the classification result to the user terminal corresponding to the user, and starting the voice device of the target object processing device;
[0251]Broadcast to the user, by voice, the classification result obtained by the target object processing device for the target object.
[0252]Optionally, the processing module 608 is further configured to:
[0253]Acquiring a delivery record of the user delivering the target object;
[0254]Based on the delivery record and the category of the target object delivered by the user, a corresponding second equity value is issued to the user.
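As a hedged illustration of how the two equity values might be combined, the sketch below issues a base value per category (the first equity value) plus a bonus derived from the user's delivery record (the second equity value). The specific values and the bonus rule are assumptions, not taken from this specification.

# Assumed base equity values per garbage category (the "first equity value").
BASE_EQUITY = {"recyclable": 10, "kitchen_waste": 5, "hazardous": 8, "other": 1}


def issue_equity(category: str, delivery_record: list) -> int:
    """Combine the per-category base value with a history-based bonus."""
    first = BASE_EQUITY.get(category, 0)
    # Example bonus rule: +1 for every 10 correctly classified past deliveries.
    correct = sum(1 for r in delivery_record if r.get("correct"))
    second = correct // 10
    return first + second


record = [{"category": "recyclable", "correct": True}] * 23
print(issue_equity("recyclable", record))  # -> 12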
[0255]The foregoing is a schematic solution of a target object processing apparatus of this embodiment. It should be noted that the technical solution of the target object processing apparatus belongs to the same concept as the technical solution of the above-mentioned target object processing method. For details that are not described in detail in the technical solution of the target object processing apparatus, please refer to the description of the technical solution of the above-mentioned target object processing method.
[0256]Figure 7 shows a structural block diagram of a computing device 700 according to an embodiment of this specification. The components of the computing device 700 include, but are not limited to, a memory 710 and a processor 720. The processor 720 and the memory 710 are connected through a bus 730, and a database 750 is used to store data.
[0257]The computing device 700 also includes an access device 740 that enables the computing device 700 to communicate via one or more networks 760. Examples of these networks include a public switched telephone network (PSTN), a local area network (LAN), a wide area network (WAN), a personal area network (PAN), or a combination of communication networks such as the Internet. The access device 740 may include one or more of any type of wired or wireless network interface (for example, a network interface card (NIC)), such as an IEEE 802.11 wireless local area network (WLAN) wireless interface, a worldwide interoperability for microwave access (WiMAX) interface, an Ethernet interface, a universal serial bus (USB) interface, a cellular network interface, a Bluetooth interface, a near field communication (NFC) interface, and the like.
[0258]In an embodiment of this specification, the above-mentioned components of the computing device 700 and other components not shown in Figure 7 can also be connected to each other, for example via a bus. It should be understood that the structural block diagram of the computing device shown in Figure 7 is only for the purpose of example, rather than limiting the scope of this specification. Those skilled in the art can add or replace other components as needed.
[0259]The computing device 700 can be any type of stationary or mobile computing device, including a mobile computer or mobile computing device (for example, a tablet computer, a personal digital assistant, a laptop computer, a notebook computer, a netbook, etc.), a mobile phone (for example, a smart phone), a wearable computing device (for example, a smart watch, smart glasses, etc.), or another type of mobile device, or a stationary computing device such as a desktop computer or PC. The computing device 700 may also be a mobile or stationary server.
[0260]The processor 720 is configured to execute the following computer-executable instructions, where the steps of the target object processing method are implemented when the processor executes the computer-executable instructions.
[0261]The foregoing is a schematic solution of a computing device of this embodiment. It should be noted that the technical solution of the computing device belongs to the same concept as the above-mentioned technical solution of the target object processing method. For details of the technical solution of the computing device that are not described in detail, please refer to the description of the technical solution of the above-mentioned target object processing method.
[0262]An embodiment of the present specification also provides a computer-readable storage medium that stores computer instructions that, when executed by a processor, implement the steps of the target object processing method.
[0263]The foregoing is a schematic solution of a computer-readable storage medium of this embodiment. It should be noted that the technical solution of the storage medium belongs to the same concept as the above-mentioned technical solution of the target object processing method. For details that are not described in the technical solution of the storage medium, please refer to the description of the technical solution of the above-mentioned target object processing method.
[0264]The foregoing describes specific embodiments of this specification. Other embodiments are within the scope of the appended claims. In some cases, the actions or steps described in the claims may be performed in a different order than in the embodiments and still achieve desired results. In addition, the processes depicted in the drawings do not necessarily require the specific order or sequential order shown in order to achieve the desired results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
[0265]The computer instructions include computer program code, and the computer program code may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium. It should be noted that the content contained in the computer-readable medium can be appropriately added or deleted according to the requirements of legislation and patent practice in the jurisdiction. For example, in some jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electrical carrier signals and telecommunications signals.
[0266]It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but those skilled in the art should know that the embodiments of this specification are not limited by the described sequence of actions, because according to the embodiments of this specification, some steps can be performed in another order or simultaneously. Secondly, those skilled in the art should also know that the embodiments described in the specification are all preferred embodiments, and the involved actions and modules are not necessarily all required by the embodiments of this specification.
[0267]In the above-mentioned embodiments, the description of each embodiment has its own emphasis. For parts that are not detailed in an embodiment, reference may be made to related descriptions of other embodiments.
[0268]The preferred embodiments of this specification disclosed above are only used to help explain this specification. The optional embodiments do not describe all the details in detail, nor do they limit the invention to only the specific embodiments described. Obviously, many modifications and changes can be made according to the content of the embodiments of this specification. This specification selects and specifically describes these embodiments in order to better explain the principles and practical applications of the embodiments of this specification, so that those skilled in the art can understand and use this specification well. This description is only limited by the claims and their full scope and equivalents.
