Refrigerator food storage position recording method, device and terminal and refrigerator

A storage-location recording technology, applied to household refrigeration devices, household appliances, etc., which solves the problem of low efficiency when entering food storage-location information, thereby improving entry efficiency and ensuring accuracy.

Active Publication Date: 2014-09-24
HAIER GRP CORP +1
Cites: 7 | Cited by: 16

AI-Extracted Technical Summary

Problems solved by technology

[0004] Embodiments of the present invention provide a method, device, terminal, and refrigerator for recording food storage l...

Abstract

A recording method and device for a food storage location in a refrigerator, a terminal and a refrigerator. The recording method comprises: acquiring a food collection image obtained by photographing the internal storage space of the refrigerator; determining an object storage location mark present in the food collection image and the region range corresponding to that mark in the image; identifying food present in the region range corresponding to the mark; and recording that the identified food is stored in the refrigerator at the object storage location characterized by the mark. Using this recording method, the entry efficiency of storage-location information for food in the refrigerator can be increased.

Examples

  • Experimental program (5)

Example Embodiment

[0035] Example 1:
[0036] In Embodiment 1 of the present invention, the user photographs only one subspace, that is, the storage space corresponding to a single storage location. The user may photograph the whole subspace or only part of it, but the photograph must capture the subspace's storage location mark.
[0037] Figure 2 shows a flow chart of the method for recording food storage locations in a refrigerator provided in Embodiment 1 of the present invention, specifically including:
[0038] Step 201, acquire the captured food collection image.
[0039] Step 202: Determine a storage location mark existing in the food collection image, and determine that the range of the area corresponding to the storage location mark is the entire area in the food collection image.
[0040] The storage location mark can take many specific forms; preferably, a barcode is used. Furthermore, to distinguish the storage location mark from the one-dimensional barcodes carried on food in the refrigerator, the storage location mark can use a two-dimensional barcode.
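As a rough illustration of this two-symbology scheme, the routine below routes decoded symbols by their reported type. The type strings and the decoder that would produce them are assumptions; the embodiment only requires that location marks be two-dimensional barcodes and food barcodes one-dimensional.

```python
# Hypothetical symbology routing: 2-D codes mark storage locations,
# 1-D codes identify foods. Type names follow common decoder conventions
# (an assumption; the patent does not name a decoder).
LOCATION_MARK_TYPES = {"QRCODE"}
FOOD_BARCODE_TYPES = {"EAN13", "EAN8", "UPCA", "CODE128"}

def classify_symbol(symbol_type: str, data: str) -> tuple:
    """Return (role, data) for a decoded symbol under the mark scheme above."""
    if symbol_type in LOCATION_MARK_TYPES:
        return ("storage_location_mark", data)
    if symbol_type in FOOD_BARCODE_TYPES:
        return ("food_identification_barcode", data)
    return ("unknown", data)
```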
[0041] Step 203, identifying the food present in the food collection image.
[0042] The detailed flow of the food identification method provided by Embodiment 1 of the present invention is shown in Figure 3 and will be described in detail later.
[0043] Step 204, record that the identified food is stored in the storage location indicated by the storage location mark in the refrigerator.
[0044] After each food storage location is recorded, the food icon corresponding to the stored food in the refrigerator and the schematic diagram of the storage location in the refrigerator can also be displayed to the user. Preferably, enhanced display technology can be used to improve user experience.
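The four steps above can be sketched as one routine. Here `decode_location_mark`, `identify_foods` and the `store` mapping are hypothetical helpers standing in for steps 202-204; in Embodiment 1 the mark's region is the entire image, so every identified food is recorded at that single location.

```python
def record_single_location(image, decode_location_mark, identify_foods, store):
    """Steps 201-204 for a single photographed subspace (hypothetical helpers)."""
    mark = decode_location_mark(image)   # step 202: find the storage location mark
    foods = identify_foods(image)        # step 203: identify foods in the whole image
    for food in foods:
        store[food] = mark               # step 204: record food -> storage location
    return store
```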
[0045] Figure 3 shows a flow chart of the food identification method provided by Embodiment 1 of the present invention, specifically including:
[0046] Step 301, determine the images of the food to be recognized that exist within the area corresponding to the storage location mark.
[0047] This step includes preprocessing the area corresponding to the storage location mark in the food collection image, and performing contour detection and background segmentation on the food to be recognized. This part is prior art and is not described in detail here.
[0048] Step 302, judging whether there is a food identification barcode in the image of the food to be recognized.
[0049] When there is a food identification barcode in the image of the food to be recognized, proceed to step 303; when there is none, proceed to step 304.
[0050] Step 303: Determine that the food in the image of the food to be recognized is the food represented by the food identification barcode, and end the food recognition process.
[0051] For foods that carry an intact, readable food identification barcode, such as food purchased from a supermarket, the food can be identified directly through the barcode, and no subsequent steps are required.
[0052] For foods that carry no food identification barcode, or whose barcode is damaged, such as food purchased from a market, the following steps are used for identification.
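The barcode-first branch of steps 302-304 can be sketched as follows; `read_food_barcode` and `match_by_features` are hypothetical stand-ins for barcode decoding and the feature-matching fallback described next.

```python
def identify_food(food_image, read_food_barcode, match_by_features):
    """Barcode-first identification (steps 302-304).
    read_food_barcode returns the food name or None when no intact barcode
    is found; match_by_features is the image-feature fallback."""
    name = read_food_barcode(food_image)
    if name is not None:                   # intact identification barcode: done
        return name
    return match_by_features(food_image)   # otherwise fall back to features
```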
[0053] Step 304, extracting feature vectors of the image of the food to be recognized.
[0054] The method for extracting the feature vector of the image of the food to be recognized in this step should correspond to the method used to establish the preset correspondence between foods and feature vectors. In Embodiment 1 of the present invention, that preset correspondence can be established as follows:
[0055] Extract the color feature value, shape feature value and texture feature value of the sample images of multiple samples of the same food; based on these values, determine the feature vector corresponding to the food.
[0056] Specifically, when determining the feature vector corresponding to a food from the color, shape and texture feature values of the sample images of its multiple samples, first determine the average color feature value, the average shape feature value and the average texture feature value over those sample images; then use the three averages directly as the components of the feature vector corresponding to the food.
[0057] Preferably, a color feature value interval corresponding to the food can instead be determined from the color feature values of the sample images of its multiple samples, a shape feature value interval from their shape feature values, and a texture feature value interval from their texture feature values; the feature vector corresponding to the food is then determined from these three intervals.
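A minimal sketch of the averaging variant in [0056], assuming each sample image has already been reduced to a `(color, shape, texture)` triple of scalar feature values (how those scalars are computed is left open by the patent):

```python
def food_feature_vector(samples):
    """Preset feature vector for a food: the per-dimension mean of the
    (color, shape, texture) feature values over its sample images.
    samples: list of (color, shape, texture) triples, one per sample image."""
    n = len(samples)
    color = sum(s[0] for s in samples) / n
    shape = sum(s[1] for s in samples) / n
    texture = sum(s[2] for s in samples) / n
    return (color, shape, texture)
```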
[0058] Correspondingly, when extracting the feature vector of the food image to be recognized, the following steps are specifically included:
[0059] Extract the color feature value, shape feature value and texture feature value of the food image to be recognized; determine the feature vector of the food image to be recognized based on the color feature value, shape feature value and texture feature value of the food image to be recognized.
[0060] Specifically, when determining the feature vector of the image of the food to be recognized from its color, shape and texture feature values, the three values can be used directly as the components of the feature vector.
[0061] Preferably, it is also possible to first determine the preset color feature value interval in which the image's color feature value lies, the preset shape feature value interval in which its shape feature value lies, and the preset texture feature value interval in which its texture feature value lies; the feature vector of the image is then determined from these three preset intervals.
[0062] Step 305, judging whether, among the preset feature vectors respectively corresponding to different foods, there is a feature vector whose correlation with the feature vector of the image of the food to be recognized is greater than a preset value.
[0063] If such a feature vector exists, proceed to step 306; if no preset feature vector has a correlation with the feature vector of the image greater than the preset value, it is determined that the food cannot be recognized, and the food recognition process ends.
[0064] Step 306 : From the preset multiple feature vectors respectively corresponding to different foods, select the feature vector with the greatest correlation with the feature vector of the image of the food to be recognized as the matching feature vector.
[0065] The greater the correlation, the higher the reliability of the food identification result.
[0066] Step 307, determining that the food in the food image to be recognized is the food corresponding to the matching feature vector.
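Steps 305-307 can be sketched as below. The patent does not fix a correlation measure, so cosine similarity is used here purely as an assumed stand-in.

```python
import math

def cosine_correlation(u, v):
    """Hypothetical correlation measure (cosine similarity); the patent
    leaves the choice of measure open."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def match_food(query_vec, preset_vectors, threshold):
    """Steps 305-307: among preset per-food feature vectors, pick the food
    whose vector correlates most with the query, if above the threshold.
    Returns None when the food cannot be recognized."""
    best_food, best_corr = None, threshold
    for food, vec in preset_vectors.items():
        corr = cosine_correlation(query_vec, vec)
        if corr > best_corr:
            best_food, best_corr = food, corr
    return best_food
```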
[0067] The above food recognition method combines barcode recognition with image recognition and gives priority to barcode recognition. Alternatively, Embodiment 1 of the present invention can prioritize image recognition, or perform barcode recognition and image recognition simultaneously and determine the final identification result from both.

Example Embodiment

[0068] Example 2:
[0069] In Embodiment 2 of the present invention, the user shoots a plurality of subspaces, that is, storage spaces corresponding to a plurality of storage locations, and may photograph all of the plurality of subspaces or a part of the plurality of subspaces.
[0070] Figure 4 shows a flow chart of the method for recording food storage locations in a refrigerator provided by Embodiment 2 of the present invention, specifically including:
[0071] Step 401, acquire the captured food collection image.
[0072] Step 402: Determine the multiple storage location marks existing in the food collection image, and the area ranges corresponding to the multiple storage location marks in the food collection image.
[0073] Preferably, the storage location mark is a two-dimensional barcode.
[0074] When determining the area range corresponding to each storage location mark, a specified range in the food collection image, referenced to the storage location mark, can be taken as that mark's area range. The specified range depends on where the mark is located within the storage space corresponding to the storage location.
[0075] For example, when the storage location mark is located at the center of its storage space, the area corresponding to the mark can be determined as an area of specified length and width extending outward in all directions from the mark in the food collection image.
[0076] Moreover, the size of the specified range is positively correlated with the size of the storage location mark in the food collection image. When the storage position mark in the food collection image is larger, the specified range is larger; when the storage position mark in the food collection image is smaller, the specified range is smaller.
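A sketch of this region computation for a center-located mark, assuming the mark's center coordinates and size in the image are known. The `scale` factor is hypothetical, chosen only to satisfy the required positive correlation between mark size and region size.

```python
def region_for_mark(cx, cy, mark_w, mark_h, scale=4.0):
    """Area range for a storage-location mark centered at image coords
    (cx, cy): a rectangle centered on the mark, extended outward by lengths
    proportional to the mark's size in the image. Returns (x0, y0, x1, y1)."""
    half_w = mark_w * scale / 2
    half_h = mark_h * scale / 2
    return (cx - half_w, cy - half_h, cx + half_w, cy + half_h)
```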
[0077] Step 403: Identify the foods that exist in the areas corresponding to the plurality of storage location marks.
[0078] Specifically, this step can adopt the food identification method of step 203 in Embodiment 1 above, shown in Figure 3, and is not described in detail here.
[0079] Step 404, record that each identified food is stored in the refrigerator at the storage location represented by its corresponding storage location mark.
[0080] After each food storage location is recorded, the food icon corresponding to the stored food in the refrigerator and the schematic diagram of the storage location in the refrigerator can also be displayed to the user. Preferably, enhanced display technology can be used to improve user experience.
[0081] It can be seen that using the method for recording food storage locations in refrigerators provided by the embodiments of the present invention avoids manual input of food storage locations, improves the efficiency of entering food storage location information, and ensures the accuracy of information.

Example Embodiment

[0082] Example 3:
[0083] Based on the same inventive concept, and corresponding to the method for recording food storage locations in a refrigerator provided by the above embodiments, Embodiment 3 of the present invention also provides a device for recording food storage locations in a refrigerator, whose structural diagram is shown in Figure 5, including:
[0084] An acquisition unit 501, configured to acquire food collection images obtained by photographing the internal storage space of the refrigerator;
[0085] A determining unit 502, configured to determine the storage location mark existing in the food collection image, and the area range corresponding to the storage location mark in the food collection image;
[0086] An identification unit 503, configured to identify the food existing in the area corresponding to the storage location mark;
[0087] The recording unit 504 is configured to record that the identified food is stored in the refrigerator at the storage location represented by the storage location mark.
[0088] Further, the acquisition unit 501 is specifically configured to acquire a food collection image obtained by shooting a storage space corresponding to a storage location inside the refrigerator;
[0089] The determining unit 502 is specifically configured to determine a storage location mark existing in the food collection image, and determine that the range of the area corresponding to the storage location mark is the entire area in the food collection image.
[0090] Further, the determining unit 502 is specifically configured to determine a plurality of storage location marks existing in the food collection image, and area ranges corresponding to the multiple storage location marks in the food collection image.
[0091] Further, the determining unit 502 is specifically configured to, for each of the plurality of storage location marks, determine the area range corresponding to that mark as a specified range in the food collection image referenced to the mark, where the size of the specified range is positively correlated with the size of the mark in the food collection image.
[0092] Further, the identification unit 503 is specifically configured to: determine the image of the food to be recognized within the area corresponding to the storage location mark; when a food identification barcode is present in that image, determine the food to be the food represented by the barcode; when no food identification barcode is present, extract the feature vector of the image, select, from the preset feature vectors respectively corresponding to different foods, the feature vector whose correlation with the extracted feature vector is greater than a preset value and is the greatest, as the matching feature vector, and determine the food in the image to be the food corresponding to the matching feature vector.
[0093] Further, the identification unit 503 is specifically configured to extract the color feature value, shape feature value and texture feature value of the image of the food to be recognized, and determine the feature vector of that image based on those values;
[0094] The identification unit 503 is also specifically configured to extract the color feature value, shape feature value and texture feature value of the sample images of multiple samples of the same food, and determine the feature vector corresponding to that food based on those values.
[0095] The functions of the above units correspond to the processing steps in the flows shown in Figures 1-4 and are not repeated here.
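Units 501-504 can be sketched as one object. The `capture`, `decode_marks` and `identify_foods` callables are hypothetical stand-ins for the units' internals; `decode_marks` yields (mark, region) pairs as in Embodiment 2.

```python
class StorageLocationRecorder:
    """Sketch of units 501-504 composed into a single recorder."""
    def __init__(self, capture, decode_marks, identify_foods):
        self.capture = capture                 # acquisition unit 501
        self.decode_marks = decode_marks       # determining unit 502
        self.identify_foods = identify_foods   # identification unit 503
        self.records = {}                      # recording unit 504's store

    def record(self):
        image = self.capture()
        for mark, region in self.decode_marks(image):
            for food in self.identify_foods(image, region):
                self.records[food] = mark      # food -> storage location
        return self.records
```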
