Picture cleanup method, picture cleanup device and terminal device

A picture cleanup technology applied in the field of digital information, addressing problems such as easily missed pictures, low cleanup efficiency, and increased user workload, so as to achieve the effects of improved cleanup efficiency and reduced user workload.

Active Publication Date: 2015-07-29
BEIJING QIHOO TECH CO LTD
Cites: 3 | Cited by: 40

AI-Extracted Technical Summary

Problems solved by technology

Cleaning up pictures with this method requires the user to review all of the pictures, which increases the user's workload and results in low cleanup efficiency; in addition,...

Abstract

The invention relates to the field of digital information, and discloses a picture cleanup method, a picture cleanup device and a terminal device. The method comprises the following steps: scanning a storage device and acquiring the stored pictures; generating, corresponding to each picture, a fingerprint representing the characteristics of the picture image, calculating the similarity of different pictures according to the fingerprints of the pictures, and determining pictures whose similarity satisfies a preset condition as similar pictures; displaying the similar pictures in an interface; marking a selected picture in the interface after an inputted instruction for selecting pictures is received; and deleting the selected picture from the storage device after an instruction for deleting the picture is received. According to the technical scheme, the prior-art technical problems of missed pictures and low cleanup efficiency can be solved, and the technical effects of reducing the number of pictures forgotten during cleaning, alleviating the user's workload and improving the cleanup efficiency can be achieved.


Examples

  • Experimental program(1)

Example Embodiment

[0057]Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided so that the present disclosure will be more thoroughly understood, and will fully convey the scope of the present disclosure to those skilled in the art.
[0058] Figure 1 shows a flow chart of a picture cleanup method according to an embodiment of the present invention. The method is suitable for terminal devices such as smart phones, tablet computers, and personal computers. As shown in Figure 1, the method includes steps S110-S150.
[0059] In step S110, the memory is scanned to obtain the stored pictures.
[0060] For example, when scanning the storage, a specified directory may be scanned; for instance, the photo directories are scanned according to o_c_pss.dat (the configuration file of the photo space). The photo directories may include: DCIM/Camera, DCIM/100MEDIA, DCIM/100ANDRO, DCIM, Camera, Photo, My Camera, Camera/Photo, Camera, My Photo.
[0061] It is also possible to scan the entire storage disk of the terminal device, for example, to scan for files in jpeg and jpg formats; these files are the pictures to be acquired.
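For illustration only, the following is a minimal Python sketch of the two scanning modes described above (scanning known photo directories or walking the whole storage for jpeg/jpg files). The embodiment does not prescribe any library or function names, so the identifiers and the directory list below are assumptions, and the o_c_pss.dat configuration-file case is not modeled.

```python
import os
from pathlib import Path

# Illustrative photo directories, following the examples listed above.
PHOTO_DIRS = ["DCIM/Camera", "DCIM/100MEDIA", "DCIM/100ANDRO", "Camera", "Photo"]
PICTURE_EXTENSIONS = {".jpg", ".jpeg"}

def scan_pictures(storage_root, full_disk=False):
    """Return paths of stored pictures found under the storage root.

    If full_disk is False, only the known photo directories are scanned;
    otherwise the whole storage tree is walked for jpeg/jpg files.
    """
    root = Path(storage_root)
    bases = [root] if full_disk else [root / d for d in PHOTO_DIRS if (root / d).is_dir()]
    pictures = []
    for base in bases:
        for dirpath, _dirnames, filenames in os.walk(base):
            for name in filenames:
                if Path(name).suffix.lower() in PICTURE_EXTENSIONS:
                    pictures.append(Path(dirpath) / name)
    return pictures
```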
[0062] The scan can be started by clicking a preset cleanup scan button on the display screen of the terminal device, or by opening pre-installed one-key scan-and-clean software that supports data cleanup; either operation triggers the sending of the scan command.
[0063] When the user clicks the scan button on the display interface, the terminal device confirms receipt of the scan instruction, triggers the operation of scanning the storage space of the terminal device, and enters the garbage scan display interface. After the garbage scan is completed, the garbage scan completion interface is entered. Figure 3A is a schematic diagram of a garbage scan display interface according to an embodiment of the present invention, and Figure 3B is a schematic diagram of a garbage scan completion interface according to an embodiment of the present invention. The storage space of the terminal device may be a non-volatile storage space, for example, the storage space of a storage device such as an SD (Secure Digital) card or a micro SD card, or it may be the memory of the terminal device.
[0064] The terminal device scans the directory structure in the storage space according to the scan instruction, and obtains directory information of the directory in the storage space. The directory information of the directory may include: the path and name of the directory; in addition, the directory information of the directory may also include: the version number of the directory, the time stamp, and the size of the occupied space.
[0065] Through the above scanning process, the pictures stored in the terminal device are identified and acquired. In addition, during the process of scanning and cleaning data, other junk data can be cleaned.
[0066] In practical applications, during the process of scanning and cleaning data, the relevant information of the currently scanned directory (for example, its path, package name, and other information) can be matched against the directory information in a cleaning database downloaded in advance from a cloud server or stored locally; according to the cleaning strategy corresponding to the matched directory, the currently scanned directory is identified as data to be cleaned, and the cleaning strategy for cleaning that data is further determined. Furthermore, the cloud server can regularly update and upgrade the local cleaning database, keeping the directories recorded in the cleaning database and their corresponding cleaning strategies up to date.
[0067] The display interface is specifically the display interface of garbage scan results. In addition to the above picture information, it can also display "system disk garbage", "cache", "uninstall residue", "advertising", "installation package" and "large file", which respectively indicate the directories to be cleaned whose data categories are system disk junk data, cache junk data, uninstall residual data, advertisement data, installation package data and large file data. Moreover, corresponding to "system disk garbage", "cache", "uninstall residue", "advertising", "installation package" and "large file", the respective suggested sizes of data to be cleaned up are displayed: "100MB", "40MB", "20MB", "10MB", "10MB" and "0MB". The "200MB" displayed in this interface indicates the total recommended size of data to be cleaned across the directories to be cleaned.
[0068] In step S120, a fingerprint representing the picture features of each picture is generated corresponding to that picture, the similarity between pictures is calculated according to the fingerprints of the pictures, and pictures whose similarity meets a preset condition are determined to be similar pictures.
[0069] In one embodiment, generating a fingerprint representing the picture features corresponding to the picture in step S120 may include: extracting features from the picture, and calculating the extracted features with a preset algorithm to generate the fingerprint of the picture.
[0070] For example, the picture is reduced to an 8 × 8 size to obtain a picture containing 64 pixels. In this way, the reduced picture discards the detailed features of the original and retains its brightness and structural features; that is, the brightness and structural features of the picture are extracted. After that, the reduced picture is converted to 64-level grayscale, and the grayscale average of the 64 pixels is calculated. The gray level of each pixel is compared with the average value: if it is less than the average, it is recorded as 0; if it is greater than or equal to the average, it is recorded as 1. The grayscale comparison results of the pixels are combined to obtain a 64-bit fingerprint of the picture. This is only an example of one method for generating a picture fingerprint; any other method for generating a picture fingerprint may be adopted, and the present invention is not particularly limited in this respect.
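As a sketch of the 8 × 8 average-hash fingerprint just described, the following Python snippet uses Pillow; the library choice is an assumption, and Pillow's 8-bit grayscale is used in place of the 64-level grayscale mentioned above, which does not change which pixels fall above or below the mean.

```python
from PIL import Image

def average_hash(path, hash_size=8):
    """Compute a 64-bit fingerprint: shrink to 8x8, convert to grayscale,
    then record 1 for each pixel >= the mean gray level, else 0."""
    img = Image.open(path).resize((hash_size, hash_size)).convert("L")
    pixels = list(img.getdata())          # 64 gray values (0-255 here, a simplification)
    mean = sum(pixels) / len(pixels)
    bits = 0
    for value in pixels:
        bits = (bits << 1) | (1 if value >= mean else 0)
    return bits                           # integer encoding the 64-bit fingerprint
```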
[0071] In one embodiment, calculating the similarity between the pictures according to the fingerprints of the pictures in step S120 may include: calculating the Hamming distance between the fingerprints of the pictures, and calculating the similarity between the pictures according to the obtained Hamming distance.
[0072] The Hamming distance indicates the difference between two pictures. A Hamming distance of 0 indicates that the two pictures are 100% similar. The larger the Hamming distance, the lower the similarity between the pictures. The similarity between the two pictures can be calculated by using the formula (N-d)/N, where N is the total number of digits of the fingerprints of the pictures, and d is the Hamming distance between the fingerprints of the two pictures. For the above-mentioned 64-bit image fingerprint, formula (64-d)/64 is used to calculate the similarity between two images. When the Hamming distance is 0, the similarity between the two pictures is 100%; when the Hamming distance is 64, the similarity between the two pictures is 0. The similarity is compared with a preset threshold. For example, the first preset threshold is 92%. If the similarity between the two pictures is greater than 92%, it is determined that the two pictures are similar.
[0073] In addition, the Hamming distance can also be directly used to represent the similarity between the two pictures. For example, the second preset threshold is 5. When the Hamming distance between the two pictures is less than 5, it is determined that the two pictures are similar.
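The following sketch implements the two comparison options above on the 64-bit fingerprints: the similarity formula (N-d)/N with the 92% threshold, and the direct Hamming-distance threshold of 5. The function names are illustrative.

```python
def hamming_distance(fp_a, fp_b, bits=64):
    """Number of differing bits between two integer fingerprints."""
    return bin((fp_a ^ fp_b) & ((1 << bits) - 1)).count("1")

def similarity(fp_a, fp_b, bits=64):
    """Similarity (N - d) / N, as in the formula above."""
    return (bits - hamming_distance(fp_a, fp_b, bits)) / bits

def are_similar(fp_a, fp_b, threshold=0.92):
    # First preset threshold from the text: similarity greater than 92%.
    return similarity(fp_a, fp_b) > threshold

def are_similar_by_distance(fp_a, fp_b, max_distance=5):
    # Alternative: compare the Hamming distance itself with the second
    # preset threshold of 5 mentioned above.
    return hamming_distance(fp_a, fp_b) < max_distance
```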
[0074] In step S130, similar pictures are displayed in the interface.
[0075] For example, the determined similar pictures are displayed in the interface; the pictures may be displayed as thumbnails, with two or more similar pictures put into a group and displayed in that grouped manner.
[0076] In step S140, after receiving an instruction for selecting a picture input by the user, the selected picture in the interface is marked.
[0077] For example, when the user clicks on the touch screen or uses the mouse to click the thumbnail, an instruction input by the user is received, and the instruction indicates that the image corresponding to the clicked thumbnail is selected.
[0078] In step S150, after receiving the instruction to delete the picture input by the user, the selected picture is deleted from the memory.
[0079] For example, when the user clicks on the touch screen or clicks the delete button with a mouse, an instruction input by the user is received, and the instruction instructs to delete the selected picture, so the selected picture in step S140 is deleted from the memory. Further, before performing the deletion operation, a prompt that the picture will be deleted may be displayed, and after receiving the confirmation instruction input by the user, the selected picture may be deleted from the memory.
[0080] In addition, in an embodiment, the above method further includes the following steps D1-D2.
[0081] In step D1, a parameter value representing the picture quality of the picture is calculated, and a picture whose parameter value satisfies a preset picture quality unqualified condition is determined as a picture with unqualified picture quality.
[0082] In step D2, the determined unqualified picture is displayed on the interface.
[0083] For example, the sharpness parameter value of the picture is calculated. When the sharpness value of the picture is less than a preset sharpness value, it means that the picture is blurred, that is, its picture quality is unqualified, and the picture is displayed in the interface as an unqualified picture. Afterwards, as described in steps S140 and S150, after an instruction for selecting a picture input by the user is received, the selected blurred picture in the interface is marked; after an instruction to delete the picture input by the user is received, the selected blurred picture is deleted from the memory.
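The embodiment does not fix a particular sharpness formula, so the sketch below uses the variance of a Laplacian-filtered grayscale image as one common stand-in for the sharpness parameter; the metric, the threshold value, and the function names are assumptions.

```python
import numpy as np
from PIL import Image

def laplacian_sharpness(path):
    """Sharpness estimate: variance of a 4-neighbour Laplacian of the
    grayscale image (an assumed metric; the text does not fix one)."""
    gray = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
    lap = (-4 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return lap.var()

def is_blurred(path, sharpness_threshold=100.0):
    # Illustrative threshold: pictures whose sharpness falls below it are
    # treated as blurred, i.e. of unqualified picture quality.
    return laplacian_sharpness(path) < sharpness_threshold
```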
[0084] According to the technical solution in this embodiment, the pictures in the memory can be scanned, the similarity between pictures can be calculated, similar pictures can be determined according to the similarity, and the similar pictures can be displayed for the user to select and delete. In this way, all or most of the stored pictures are cleaned up, avoiding missed pictures or reducing their number; moreover, before display and deletion, the pictures are filtered according to their similarity, which reduces the number of pictures the user must view, reduces the user's workload, and improves cleanup efficiency. Therefore, the technical solution of the present invention solves the technical problems of missed pictures during cleaning and low cleanup efficiency, and achieves the beneficial effects of reducing the number of pictures missed during cleaning, reducing the user's workload and improving cleanup efficiency.
[0085] Figure 2 shows a flow chart of a picture cleanup method according to an embodiment of the present invention. The method is suitable for terminal devices such as smart phones, tablet computers, and personal computers. As shown in Figure 2, the method includes the following steps.
[0086] In step S202, the memory is scanned to obtain the stored pictures.
[0087] For an exemplary description of step S202, refer to the description in step S110, which will not be repeated here.
[0088] In step S204, the acquired pictures are classified according to the attribute information of the pictures.
[0089] The attribute information includes at least one item of the following information: name, storage path, and photographing time.
[0090] For example, the pictures are divided into three categories according to their attribute information: in descending order of priority, the beautification category, the continuous shooting category and the multi-shot category. When classifying a picture, it may first be judged whether the picture belongs to the beautification category; if not, whether it belongs to the continuous shooting category; and if not, whether it belongs to the multi-shot category. Among them, the pictures in the beautification category include original photos and photos after image editing; the pictures in the continuous shooting category include pictures taken in continuous shooting mode; the pictures in the multi-shot category include pictures of the same scene or subject taken in succession, whose content is similar.
[0091] For the beautification category, pictures with the same shooting time and the same subject name before the suffix in their names are classified into the beautification category; alternatively, pictures stored under the path of the same image processing application are classified into the beautification category. For example, the beautified pictures A1_1.jpg and A1_2.jpg are obtained after the picture A1.jpg is processed by an image processing application, and the shooting times of the three pictures are the same; or, the three pictures are all saved under the file storage path of the image processing application. In this way, by comparing the names and shooting times of the pictures, or their storage paths, the original picture and the beautified pictures can be preliminarily identified and classified into the beautification category. Further, the three pictures can be classified into one sub-category of the beautification category. If the beautified picture B1_1.jpg is obtained after the picture B1.jpg is processed by an image processing application, and the shooting times of the two pictures are the same, the two pictures are also classified into the beautification category; further, the two pictures can be classified into another sub-category of the beautification category.
[0092] For the continuous shooting category, pictures with the same shooting time and the same suffix in all or part of their names are classified into the continuous shooting category. For example, the two continuously shot pictures 20150106_191743_Burst01.jpg and 20150106_191743_Burst20.jpg both contain the string Burst in their names, and the two pictures were taken at the same time. In this way, multiple continuously shot pictures are classified into the continuous shooting category by comparing the names and shooting times of the pictures. Further, pictures with the same shooting time in the continuous shooting category can be classified into the same sub-category.
[0093] For multi-shot pictures, pictures whose shooting time interval is within a preset interval duration are classified into the multi-shot category. For example, pictures taken within 2 hours of each other are classified into the multi-shot category. Further, pictures whose shooting times fall in the same time period can be classified into the same sub-category of the multi-shot category. A classification sketch following these rules is given below.
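The following Python sketch checks the three categories in the stated priority order (beautification, then continuous shooting, then multi-shot). The name patterns, the Burst marker, and the 2-hour interval follow the examples above; the regular expression, data shapes, and function names are illustrative assumptions, not the patent's prescribed implementation.

```python
import re
from datetime import timedelta

def classify_picture(name, shot_time, all_pictures, interval=timedelta(hours=2)):
    """Assign one picture to a category, checked in priority order.

    `all_pictures` is a list of (name, shot_time) tuples for the other
    stored pictures."""
    stem = name.rsplit(".", 1)[0]

    # Beautification: same shooting time and same subject name before the
    # suffix, e.g. A1.jpg and its edited copies A1_1.jpg, A1_2.jpg.
    subject = re.sub(r"_\d+$", "", stem)
    for other_name, other_time in all_pictures:
        other_subject = re.sub(r"_\d+$", "", other_name.rsplit(".", 1)[0])
        if other_name != name and other_subject == subject and other_time == shot_time:
            return "beautification"

    # Continuous shooting: same shooting time and a shared marker such as
    # "Burst" in the name, e.g. 20150106_191743_Burst01.jpg.
    if "Burst" in stem:
        for other_name, other_time in all_pictures:
            if other_name != name and "Burst" in other_name and other_time == shot_time:
                return "continuous_shooting"

    # Multi-shot: shooting times within the preset interval (2 hours here).
    for other_name, other_time in all_pictures:
        if other_name != name and abs(other_time - shot_time) <= interval:
            return "multi_shot"

    return "uncategorized"
```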
[0094] In step S206, for each category obtained from the classification, a fingerprint of each picture in the category is generated, the similarity between the pictures in the category is calculated according to the fingerprints, the similarity is compared with the preset threshold corresponding to the category, and similar pictures are determined according to the comparison result.
[0095] For an exemplary description of the method for generating the fingerprints and calculating the similarity, please refer to the detailed description of step S120, which will not be repeated here. In step S206, for each sub-category within the beautification, continuous shooting, and multi-shot categories, the similarity of the pictures in the sub-category can be calculated and compared with the preset threshold of the corresponding category to determine the similar pictures in that sub-category. For example, for the Hamming distance, the preset threshold of the beautification category is 40, the preset threshold of the continuous shooting category is 40, and the preset threshold of the multi-shot category is 20. In a sub-category of the beautification category, pictures with a Hamming distance of less than 40 are determined to be similar pictures; in a sub-category of the continuous shooting category, pictures with a Hamming distance of less than 40 are determined to be similar pictures; and in a sub-category of the multi-shot category, pictures with a Hamming distance of less than 20 are determined to be similar pictures.
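Continuing the earlier sketches, the snippet below applies the per-category Hamming-distance thresholds from the example (40, 40, 20) within one sub-category; it reuses the assumed hamming_distance() function defined above, and the data layout is illustrative.

```python
from itertools import combinations

# Per-category Hamming-distance thresholds from the example above.
CATEGORY_THRESHOLDS = {
    "beautification": 40,
    "continuous_shooting": 40,
    "multi_shot": 20,
}

def similar_pairs_in_subcategory(fingerprints, category):
    """Return index pairs of pictures in one sub-category whose fingerprint
    Hamming distance is below the threshold of that category."""
    threshold = CATEGORY_THRESHOLDS[category]
    pairs = []
    for (i, fp_i), (j, fp_j) in combinations(enumerate(fingerprints), 2):
        if hamming_distance(fp_i, fp_j) < threshold:
            pairs.append((i, j))
    return pairs
```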
[0096] In step S208, the sharpness of the picture is calculated, and a picture whose sharpness value is less than a preset sharpness threshold is determined as a blurred picture.
[0097] In step S210, the blurred picture and similar pictures in each category are displayed on the interface.
[0098] As shown in Figure 4A, the interface displays the number of blurred pictures as well as the number of similar pictures in the beautification, multi-shot and continuous shooting categories (the continuous shooting category is not shown in Figure 4A). Clicking a category opens the interface corresponding to that category. For example, clicking the beautification category opens the beautification category interface shown in Figure 4B, in which the similar pictures in each sub-category of the beautification category are displayed.
[0099] In step S212, after receiving an instruction for selecting a picture input by the user, the selected picture in the interface is marked.
[0100] In step S214, after receiving the instruction to delete the picture input by the user, a prompt message is displayed, and after receiving the instruction of confirming the deletion from the user, the selected picture is deleted from the memory.
[0101] As shown in Figure 4C, after the user clicks the delete button, an instruction to delete the picture input by the user is received, and the prompt "After deletion, the picture cannot be retrieved. Confirm deletion?" is displayed. After the user clicks the delete button in the prompt, a confirmation-of-deletion instruction from the user is received. After that, the selected pictures in the beautification category are deleted from the memory.
[0102] The selection and deletion process of blurred pictures and pictures in the continuous shooting and multi-shot categories is the same as or similar to the selection and deletion process of pictures in the above-mentioned beautification category, and will not be repeated here.
[0103] According to the technical solution in this embodiment, pictures can be classified according to their attributes, and similarity between pictures is calculated in the classification, which reduces the amount of similarity calculation and further improves the efficiency of cleaning pictures.
[0104] The above-mentioned method of the embodiment of the present invention can clean up captured photos stored in the terminal device; other pictures in the terminal device that are not photos can be cleaned up using the following methods. When the user clicks the cleanup button on the display interface, the terminal device confirms that it has received the cleanup instruction, triggers the cleanup operation, and enters the garbage cleanup display interface. After the garbage cleanup is completed, it enters the garbage cleanup completion interface. Figure 5A is a schematic diagram of a garbage cleanup display interface according to an embodiment of the present invention, and Figure 5B is a schematic diagram of a garbage cleanup completion interface according to an embodiment of the present invention.
[0105] A. Application
[0106] When an application is uninstalled abnormally, some of its corresponding images still exist. Because the image cannot be recognized and cleaned, these images will always occupy the cache space. If there are too many such images, the performance of the terminal will be affected and the user experience will be degraded.
[0107] Therefore, for identifying and cleaning the invalid images of an application, the cache information is the name of the virtual machine cache package, that is, the package name of the virtual machine image, and the original file information is the package name of the installed file, usually the package name of each file in the list of installed files.
[0108] B. Pictures
[0109] Pictures are cached as thumbnails, and quick browsing of pictures can be achieved through thumbnails.
[0110] In the technical solution of the embodiment of the present invention, a database of directory lists is created on a server in the cloud, wherein the directory list records the directories to be cleaned and the cleaning policy matching each directory to be cleaned.
[0111] The cleaning policies for the directories recorded in the directory list can include the following:
[0112] Complete cleanup, that is, the directory and all directories and files in the directory are cleaned and deleted.
[0113] Validity cleaning, specifically, cleaning and deleting files in the directory that have exceeded their valid duration, that is, files that have lost their validity.
[0114] Careful cleaning, that is, prompting the user about the risk of performing the cleanup and, after receiving the user's instruction confirming the cleanup, cleaning and deleting all directories and files in the directory.
[0115] Partial cleaning, that is, cleaning and deleting the directories or files in the directory that are marked for complete cleaning; cleaning and deleting the directories or files in the directory that are marked for careful cleaning after receiving the user's instruction confirming the cleanup; and leaving the directories or files that are marked neither for complete cleaning nor for careful cleaning uncleaned and undeleted.
[0116] More preferably, in the directory list in the embodiment of the present invention, the data category of the directory to be cleaned is also recorded corresponding to the recorded directory to be cleaned. The data categories of the directory include: system disk junk data, cache junk data, advertisement data, installation package data, uninstall residual data and large file data.
[0117] Further, in this embodiment of the present invention, in order to enhance the visual interaction experience between the product and the user and to simplify user operation, after the data to be cleaned in the memory and non-volatile storage space has been scanned, cleaning can be performed based on a floating window, so that the user can easily manage the terminal device.
[0118] The floating-window-based cleaning method may specifically include the following steps: calling a second floating window according to a calling instruction generated by operating a first floating window; receiving a cleaning instruction generated by operating the second floating window; and cleaning up files according to the cleaning instruction. The display state of the called second floating window is either of the following: when the first floating window displays the memory occupancy rate, the second floating window displays a conventional interface; when the first floating window displays the memory occupancy rate and cleaning prompts, the second floating window displays the conventional interface with a cleanup prompt area added to it.
[0119] Wherein, the content displayed in the first floating window is: the current memory occupancy rate; or the current memory occupancy rate and detected cleaning prompts that need to be cleaned; the second floating window includes a memory acceleration interface, a cleaning interface, and a common interface.
[0120] In practical applications, the commonly used interface includes self-start management function controls, uninstall-preinstalled function controls, privacy cleaning function controls, game/video acceleration function controls, software uninstallation function controls, and timed cleaning function controls. The memory acceleration interface is the conventional interface of the second floating window, the cleanup prompt area is a control area, and it carries corresponding prompt text. The memory acceleration interface includes an initial sub-interface and a completion sub-interface. The initial sub-interface is used to display the current memory occupancy rate and is provided with a click-to-accelerate function control, which is used to call the completion sub-interface.
[0121] The cleaning interface includes a scanning sub-interface, a stop-scanning sub-interface, a cleaning sub-interface, and a cleaned-up sub-interface. The scanning sub-interface, the stop-scanning sub-interface, and the cleaning sub-interface are respectively provided with a scan function control for calling the stop-scanning sub-interface, a stop-scan function control for calling the cleaning sub-interface, and a one-key cleaning function control for calling the cleaned-up sub-interface.
[0122] The above is only an exemplary illustration of the image cleaning method of the present invention, and the present invention is not limited thereto. Any modification, equivalent replacement, improvement, etc. made within the spirit or principle of the present invention are included in the protection scope of the present invention.
[0123] Figure 6 shows a structural diagram of an apparatus for cleaning pictures according to an embodiment of the present invention. The apparatus is suitable for use in terminal equipment such as smart phones, tablet computers, and personal computers. As shown in Figure 6, the apparatus includes:
[0124] memory 110, suitable for storing pictures;
[0125] The processor 120 is adapted to scan the memory 110, obtain the stored pictures, generate a fingerprint representing the picture characteristics of the pictures corresponding to the pictures, calculate the similarity between the pictures according to the fingerprints of the pictures, and determine the pictures whose similarity meets the preset condition as similar pictures;
[0126] The display 130 is adapted to display the similar pictures determined by the processor 120 in the interface;
[0127] a receiver 140, adapted to receive an instruction input by a user;
[0128] The processor 120 is further adapted to mark the selected picture in the interface after the receiver 140 receives the instruction for selecting a picture input by the user, and to delete the selected picture from the memory 110 after the receiver 140 receives the instruction for deleting the picture input by the user.
[0129] For example, when the processor 120 scans the memory 110, it may scan a specified directory; for instance, the photo directories are scanned according to o_c_pss.dat (the configuration file of the photo space). The photo directories may include: DCIM/Camera, DCIM/100MEDIA, DCIM/100ANDRO, DCIM, Camera, Photo, My Camera, Camera/Photo, Camera, My Photo. The entire disk of the memory 110 may also be scanned, for example, to scan for files in jpeg and jpg formats; these files are the pictures to be acquired.
[0130] The scan can be started by clicking a preset cleanup scan button on the display screen of the terminal device, or by opening pre-installed one-key scan-and-clean software that supports data cleanup; either operation triggers the sending of the scan command.
[0131] When the user clicks the scan button on the display interface, the receiver 140 confirms receipt of the scan instruction, triggers the processor 120 to scan the memory 110 of the terminal device, and enters the garbage scan display interface. After the garbage scan is completed, the garbage scan completion interface is entered. Figure 3A is a schematic diagram of a garbage scan display interface according to an embodiment of the present invention, and Figure 3B is a schematic diagram of a garbage scan completion interface according to an embodiment of the present invention. The memory 110 may be a non-volatile storage space, for example, the storage space of a storage device such as an SD (Secure Digital) card or a micro SD card, or it may be the memory of the terminal device.
[0132] The processor 120 scans the directory structure in the memory 110 according to the scan instruction to obtain directory information of the directory in the storage space. The directory information of the directory may include: the path and name of the directory; in addition, the directory information of the directory may also include: the version number of the directory, the time stamp, and the size of the occupied space.
[0133] Through the above scanning process, the pictures stored in the terminal device are identified and acquired. In addition, the processor 120 may clean other junk data during the process of scanning and cleaning data.
[0134] In practical applications, during the process of scanning and cleaning data, the processor 120 can match the relevant information of the currently scanned directory (for example, its path, package name, and other information) against the directory information in a cleaning database downloaded in advance from a cloud server or stored locally; according to the cleaning strategy corresponding to the matched directory, the currently scanned directory is identified as data to be cleaned, and the cleaning strategy for cleaning that data can be further determined. Furthermore, the cloud server can regularly update and upgrade the local cleaning database, keeping the directories recorded in the cleaning database and their corresponding cleaning strategies up to date.
[0135] The display 130 can also display a garbage scan result interface. In addition to the above picture information, the interface can also display "system disk garbage", "cache", "uninstall residue", "advertising", "installation package" and "large file", which respectively indicate the directories to be cleaned whose data categories are system disk junk data, cache junk data, uninstall residual data, advertisement data, installation package data and large file data. Moreover, corresponding to "system disk garbage", "cache", "uninstall residue", "advertising", "installation package" and "large file", the respective suggested sizes of data to be cleaned up are displayed: "100MB", "40MB", "20MB", "10MB", "10MB" and "0MB". The "200MB" displayed in this interface indicates the total recommended size of data to be cleaned across the directories to be cleaned.
[0136] In one embodiment, the processor 120 is specifically adapted to extract features from a picture, and to calculate the extracted features with a preset algorithm to generate the fingerprint of the picture.
[0137] For example, the processor 120 reduces the picture to an 8×8 size to obtain a picture containing 64 pixels. In this way, the reduced picture discards the detailed features of the original and retains its brightness and structural features; that is, the brightness and structural features of the picture are extracted. Afterwards, the processor 120 converts the reduced picture into 64-level grayscale and calculates the grayscale average value of the 64 pixels. The gray level of each pixel is compared with the average value: if it is less than the average, it is recorded as 0; if it is greater than or equal to the average, it is recorded as 1. The processor 120 combines the grayscale comparison results of the pixels to obtain a 64-bit fingerprint of the picture. This is only an example of one method for generating a picture fingerprint; any other method for generating a picture fingerprint may be adopted, and the present invention is not particularly limited in this respect.
[0138] In one embodiment, the processor 120 is specifically adapted to calculate the Hamming distance between the fingerprints of the pictures, and calculate the similarity between the pictures according to the obtained Hamming distance.
[0139] The Hamming distance indicates the difference between two pictures. A Hamming distance of 0 indicates that the two pictures are 100% similar. The larger the Hamming distance, the lower the similarity between the pictures. The similarity between the two pictures can be calculated by using the formula (N-d)/N, where N is the total number of digits of the fingerprints of the pictures, and d is the Hamming distance between the fingerprints of the two pictures. The processor 120 uses the formula (64-d)/64 to calculate the similarity between the two pictures for the above-mentioned 64-bit picture fingerprint. When the Hamming distance is 0, the similarity between the two pictures is 100%; when the Hamming distance is 64, the similarity between the two pictures is 0. The processor 120 compares the similarity with a preset threshold. For example, the first preset threshold is 92%. If the similarity between the two pictures is greater than 92%, it is determined that the two pictures are similar.
[0140] In addition, the Hamming distance can also be used to directly represent the similarity between the two pictures. For example, the second preset threshold is 5. When the Hamming distance between the two pictures is less than 5, the processor 120 determines that the two pictures are similar.
[0141] The display 130 displays the determined similar pictures in the displayed interface, and may display the pictures in the form of thumbnails, and puts two or more similar pictures into a group, and displays the pictures in a grouped manner.
[0142] The receiver 140 may be a touch screen, or may be an input element such as a mouse or a keyboard. When the user clicks on the touch screen or uses the mouse to click the thumbnail, the receiver 140 receives an instruction input by the user, the instruction indicating that the image corresponding to the clicked thumbnail is selected.
[0143] When the user clicks on the touch screen or clicks the delete button with a mouse, the receiver 140 receives an instruction input by the user, the instruction instructs to delete the selected picture, and then the processor 120 deletes the selected picture from the memory. Further, the processor 120 may instruct the display 130 to display a prompt that the picture will be deleted before executing the delete operation, and after the receiver 140 receives the confirmation instruction input by the user, the processor 120 deletes the selected picture from the memory 110 .
[0144] In one embodiment, the processor 120 is further adapted to calculate a parameter value representing the picture quality of a picture, and to determine a picture whose parameter value satisfies a preset picture-quality-unqualified condition as a picture of unqualified quality; the display 130 is further adapted to display, in the interface, the pictures of unqualified quality determined by the processor.
[0145] For example, the processor 120 calculates the sharpness parameter value of the picture. When the sharpness value of the picture is smaller than a preset sharpness value, it means that the picture is blurred, that is, its picture quality is unqualified, and the display 130 displays the picture in the interface as an unqualified picture. After that, the processor 120 marks the selected blurred picture in the interface after the receiver 140 receives the instruction for selecting a picture input by the user, and deletes the selected blurred picture from the memory 110 after the receiver 140 receives the instruction for deleting the picture input by the user.
[0146] In one embodiment, the processor 120 is further adapted to classify the acquired pictures according to the attribute information of the pictures, and the attribute information includes at least one of the following information: name, storage path, and photographing time;
[0147] The processor 120 is specifically adapted to correspond to each category obtained from the classification, generate fingerprints of each picture in the category, and calculate the similarity between the pictures in the category according to the fingerprints of the pictures.
[0148] Further, the processor 120 is specifically adapted to classify the pictures with the same shooting time and the same subject name before the suffix into the beautification category; or classify the pictures stored under the same picture processing application path into the beautification category.
[0149] Further, the processor 120 is specifically adapted to classify the pictures with the same shooting time and the same suffix in all or part of the names into the continuous shooting category.
[0150] Further, the processor 120 is specifically adapted to classify the pictures whose shooting time interval is within the preset interval duration into the multi-shot category.
[0151] Further, the processor 120 is specifically adapted to compare the similarity with a preset threshold corresponding to the category to which it belongs, and determine the similar pictures.
[0152] For example, the pictures are divided into three categories according to their attribute information: in descending order of priority, the beautification category, the continuous shooting category and the multi-shot category. When classifying a picture, the processor 120 may first determine whether the picture belongs to the beautification category; if not, whether it belongs to the continuous shooting category; and if not, whether it belongs to the multi-shot category. Among them, the pictures in the beautification category include original photos and photos after image editing; the pictures in the continuous shooting category include pictures taken in continuous shooting mode; the pictures in the multi-shot category include pictures of the same scene or subject taken in succession, whose content is similar.
[0153] For the beautification category, the processor 120 classifies pictures with the same shooting time and the same subject name before the suffix in their names into the beautification category; alternatively, it classifies pictures stored under the path of the same image processing application into the beautification category. For example, the beautified pictures A1_1.jpg and A1_2.jpg are obtained after the picture A1.jpg is processed by an image processing application, and the shooting times of the three pictures are the same; or, the three pictures are all saved under the file storage path of the image processing application. In this way, the processor 120 can preliminarily identify the original picture and the beautified pictures by comparing the names and shooting times of the pictures, or their storage paths, and classify them into the beautification category. Further, the processor 120 may classify the three pictures into one sub-category of the beautification category. If the beautified picture B1_1.jpg is obtained after the picture B1.jpg is processed by an image processing application, and the shooting times of the two pictures are the same, the two pictures are also classified into the beautification category; further, the processor 120 may classify the two pictures into another sub-category of the beautification category.
[0154] For the continuous shooting category, the processor 120 classifies the pictures with the same shooting time and the same suffix in all or part of the names into the continuous shooting category. For example, two pictures taken continuously, 20150106_191743_Burst01.jpg and 20150106_191743_Burst20.jpg, both have the string Burst suffixed, and the two pictures were taken at the same time. In this way, by comparing the names of the pictures and the shooting time, the processor 120 classifies the plurality of pictures continuously shot into the continuous shooting category. Further, the processor 120 may classify pictures with the same shooting time in the continuous shooting category into the same sub-category.
[0155] For multi-shot pictures, the processor 120 classifies pictures whose shooting time interval is within the preset interval duration into a multi-shot category. For example, classify pictures taken within 2 hours into the multi-shot category. Further, the processor 120 may classify pictures whose shooting times are in the same time period into the same sub-category of the multi-shot category.
[0156] Next, for each category obtained from the classification, the processor 120 generates a fingerprint of each picture in the category, calculates the similarity between the pictures in the category according to the fingerprints, compares the similarity with the preset threshold corresponding to the category, and determines the similar pictures according to the comparison result.
[0157] The processor 120 may calculate the similarity of pictures in the sub-categories for each sub-category in the beautification, continuous shooting, and multi-shot categories, compare the similarity with a preset threshold of the corresponding category, and determine similar pictures in each sub-category. For example, for the Hamming distance, the preset threshold of the beautification category is 40, the preset threshold of the continuous shooting category is 40, and the preset threshold of the multi-shot category is 20. In a sub-category of the beautification category, the processor 120 determines that the pictures with the Hamming distance less than 40 are similar pictures; in a sub-category of the continuous shooting category, the processor 120 determines that the pictures with the Hamming distance less than 40 are similar pictures; In a sub-category of the multi-shot category, the processor 120 determines that pictures with a Hamming distance less than 20 are similar pictures.
[0158] In addition, the processor 120 may also calculate the sharpness of the picture, and determine a picture whose sharpness value is less than a preset sharpness threshold as a blurred picture.
[0159] The display 130 displays blurred pictures and similar pictures in various categories in the interface.
[0160] As shown in Figure 4A, the interface displays the number of blurred pictures as well as the number of similar pictures in the beautification, multi-shot and continuous shooting categories (the continuous shooting category is not shown in Figure 4A). Clicking a category opens the interface corresponding to that category. For example, clicking the beautification category opens the beautification category interface shown in Figure 4B, in which the similar pictures in each sub-category of the beautification category are displayed.
[0161] After the receiver 140 receives the instruction for selecting a picture input by the user, the processor 120 marks the selected picture in the interface; after the receiver 140 receives the instruction for deleting the picture input by the user, the processor 120 instructs the display 130 to display a prompt message, and after the receiver 140 receives the user's instruction confirming the deletion, the processor 120 deletes the selected picture from the memory.
[0162] As shown in Figure 4C, after the user clicks the delete button, the receiver 140 receives an instruction to delete the picture input by the user, and the display 130 displays the prompt "After deletion, the picture cannot be retrieved. Confirm deletion?". After the user clicks the delete button in the prompt, the receiver 140 receives the user's instruction confirming the deletion. Afterwards, the processor 120 deletes the selected pictures in the beautification category from the memory 110.
[0163] The process by which the processor 120 selects and deletes blurred pictures and pictures in the continuous shooting and multi-shot categories is the same as or similar to the above process for selecting and deleting pictures in the beautification category, and will not be repeated here.
[0164] According to the technical solution in this embodiment, pictures can be classified according to their attributes, and similarity between pictures is calculated in the classification, which reduces the amount of similarity calculation and further improves the efficiency of cleaning pictures.
[0165] By adopting the above solutions of the embodiments of the present invention, the captured photos stored in the terminal device can be cleaned up; other pictures in the terminal device that are not photos can be cleaned up using the following solutions. When the user clicks the cleanup button on the display interface, the receiver 140 confirms receipt of the cleanup instruction, triggers the processor 120 to perform the cleanup operation and enter the garbage cleanup display interface, and after the garbage cleanup is completed, the garbage cleanup completion interface is entered. Figure 5A is a schematic diagram of a garbage cleanup display interface according to an embodiment of the present invention, and Figure 5B is a schematic diagram of a garbage cleanup completion interface according to an embodiment of the present invention.
[0166] A. Application
[0167] When an application is uninstalled abnormally, some of its corresponding images still exist. Because the image cannot be recognized and cleaned, these images will always occupy the cache space. If there are too many such images, the performance of the terminal will be affected and the user experience will be degraded.
[0168] Therefore, for the processor 120 to identify and clean up the invalid images of an application, the cache information is the virtual machine cache package name, that is, the package name of the virtual machine image, and the original file information is the package name of the installed file, usually the package name of each file in the list of installed files.
[0169] B. Pictures
[0170] Pictures are cached as thumbnails, and quick browsing of pictures can be achieved through thumbnails.
[0171] In the technical solution of the embodiment of the present invention, a database of directory lists is created on a server in the cloud, wherein the directory list records the directories to be cleaned and the cleaning policy matching each directory to be cleaned.
[0172] The cleaning policies for the directories recorded in the directory list can include the following:
[0173] Complete cleanup, that is, the directory and all directories and files in the directory are cleaned and deleted.
[0174] Validity cleaning, specifically, cleaning and deleting files in the directory that have exceeded their valid duration, that is, files that have lost their validity.
[0175] Careful cleaning, that is, prompting the user about the risk of performing the cleanup and, after receiving the user's instruction confirming the cleanup, cleaning and deleting all directories and files in the directory.
[0176] Partial cleaning, that is, cleaning and deleting the directories or files in the directory that are marked for complete cleaning; cleaning and deleting the directories or files in the directory that are marked for careful cleaning after receiving the user's instruction confirming the cleanup; and leaving the directories or files that are marked neither for complete cleaning nor for careful cleaning uncleaned and undeleted.
[0177] More preferably, in the directory list in the embodiment of the present invention, the data category of the directory to be cleaned is also recorded corresponding to the recorded directory to be cleaned. The data categories of the directory include: system disk junk data, cache junk data, advertisement data, installation package data, uninstall residual data and large file data.
[0178] Further, in this embodiment of the present invention, in order to enhance the visual interaction experience between the product and the user and to simplify user operation, after the data to be cleaned in the memory and non-volatile storage space has been scanned, cleaning can be performed based on a floating window, so that the user can easily manage the terminal device.
[0179] The floating-window-based cleaning solution may specifically include: the processor 120 calls a second floating window according to a calling instruction generated by operating a first floating window, receives a cleaning instruction generated by operating the second floating window, and cleans up files according to the cleaning instruction. The display state of the called second floating window is either of the following: when the first floating window displays the memory occupancy rate, the second floating window displays a conventional interface; when the first floating window displays the memory occupancy rate and cleaning prompts, the second floating window displays the conventional interface with a cleanup prompt area added to it.
[0180] Wherein, the content displayed in the first floating window is: the current memory occupancy rate; or the current memory occupancy rate and detected cleaning prompts that need to be cleaned; the second floating window includes a memory acceleration interface, a cleaning interface, and a common interface.
[0181] In practical applications, the display 130 can also display the commonly used interface, which includes self-start management function controls, uninstall-preinstalled function controls, privacy cleaning function controls, game/video acceleration function controls, software uninstallation function controls, and timed cleaning function controls. The memory acceleration interface is the conventional interface of the second floating window, the cleanup prompt area is a control area, and it carries corresponding prompt text. The memory acceleration interface includes an initial sub-interface and a completion sub-interface. The initial sub-interface is used to display the current memory occupancy rate and is provided with a click-to-accelerate function control, which is used to call the completion sub-interface.
[0182] The cleaning interface includes a scanning sub-interface, a stop-scanning sub-interface, a cleaning sub-interface, and a cleaned-up sub-interface. The scanning sub-interface, the stop-scanning sub-interface, and the cleaning sub-interface are respectively provided with a scan function control for calling the stop-scanning sub-interface, a stop-scan function control for calling the cleaning sub-interface, and a one-key cleaning function control for calling the cleaned-up sub-interface.
[0183] The invention also discloses a picture cleaning terminal device, the terminal device includes any of the above devices. The terminal device may be an intelligent terminal, a tablet computer, or a personal computer, which is not particularly limited in the present invention.
[0184] It should be noted:
[0185] The algorithms and displays provided herein are not inherently related to any particular computer, virtual appliance, or other device. Various general-purpose devices may also be used with the teachings herein. The structure required to construct such a device is apparent from the above description. Furthermore, the present invention is not directed to any particular programming language. It should be understood that various programming languages may be used to implement the invention described herein, and the above descriptions of specific languages are intended to disclose the best mode of carrying out the invention.
[0186] In the description provided herein, numerous specific details are set forth. It will be understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
[0187] Similarly, it is to be understood that in the above description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together into a single embodiment, figure, or its description. This disclosure, however, should not be construed as reflecting an intention that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the Detailed Description are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of this invention.
[0188] Those skilled in the art will understand that the modules in the device of an embodiment can be adaptively changed and arranged in one or more devices different from that embodiment. The modules, units or components in the embodiments may be combined into one module, unit or component, and may further be divided into multiple sub-modules, sub-units or sub-assemblies. Except where at least some of such features and/or processes or elements are mutually exclusive, all features disclosed in this specification (including the accompanying claims, abstract and drawings) and all processes or units of any method or device so disclosed may be combined in any combination. Each feature disclosed in this specification (including the accompanying claims, abstract and drawings) may be replaced by an alternative feature serving the same, equivalent or similar purpose, unless expressly stated otherwise.
[0189] Furthermore, those skilled in the art will understand that although some embodiments described herein include certain features that are included in other embodiments and not others, combinations of features of different embodiments are intended to fall within the scope of the invention and to form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
[0190] The various component embodiments of the present invention may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art should understand that a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components of the apparatus for cleaning pictures according to the embodiments of the present invention. The present invention may also be implemented as a device or apparatus program (e.g., a computer program and a computer program product) for performing part or all of the methods described herein. Such a program implementing the present invention may be stored on a computer-readable medium, or may take the form of one or more signals. Such signals may be downloaded from an Internet site, provided on a carrier signal, or provided in any other form.
[0191] It should be noted that the above-described embodiments illustrate rather than limit the invention, and that alternative embodiments may be devised by those skilled in the art without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention can be implemented by means of hardware comprising several different elements and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.
[0192] The present invention discloses the following schemes (illustrative, non-limiting sketches of several of the schemes follow the list):
[0193] A1. A method for image cleaning, comprising:
[0194] Scan the memory to obtain the stored pictures;
[0195] Generate, corresponding to each picture, a fingerprint representing the image features of the picture, calculate the similarity between pictures according to the fingerprints of the pictures, and determine pictures whose similarity meets a preset condition as similar pictures;
[0196] displaying the similar picture in the interface;
[0197] After receiving a picture-selection instruction input by the user, mark the selected picture in the interface;
[0198] After receiving an instruction to delete pictures input by the user, delete the selected picture from the memory.
[0199] A2. The method according to A1, wherein the method further comprises:
[0200] Classify the acquired pictures according to the attribute information of the pictures, where the attribute information includes at least one of the following: name, storage path, and shooting time;
[0201] The generating, corresponding to the picture, of a fingerprint representing the image features of the picture and the calculating of the similarity between pictures according to the fingerprints of the pictures specifically include:
[0202] For each category obtained from the classification, generate the fingerprint of each picture in the category, and calculate the similarity between the pictures in the category according to the fingerprints of the pictures.
[0203] A3. The method according to A2, wherein the classification of the acquired pictures according to the attribute information of the pictures specifically includes:
[0204] Classify pictures with the same shooting time and the same main name before the suffix in their names into the beautification category; or
[0205] Classify the pictures stored in the same picture processing application path into the beautification category.
[0206] A4. The method according to A2, wherein the classification of the acquired pictures according to the attribute information of the pictures specifically includes:
[0207] Classify pictures that have the same shooting time and whose names, apart from the suffix, are wholly or partly the same into the burst category.
[0208] A5. The method according to A2, wherein the classification of the acquired pictures according to the attribute information of the pictures specifically includes:
[0209] Classify pictures whose shooting time intervals are within a preset interval length into the multi-shot category.
[0210] A6. The method according to A2, wherein determining the pictures whose similarity satisfies a preset condition as similar pictures includes:
[0211] Similar pictures are determined by comparing the similarity with the preset threshold corresponding to the category to which they belong.
[0212] A7. The method according to A1, wherein the method further comprises:
[0213] Calculate a parameter value representing the picture quality of each picture, and determine a picture whose parameter value meets a preset condition for unqualified picture quality as a picture with unqualified picture quality;
[0214] Display the determined unqualified pictures in the interface.
[0215] A8. The method according to A1, wherein the generating, corresponding to the picture, of a fingerprint representing the image features of the picture specifically includes:
[0216] Extract features from the image of the picture, calculate the extracted features using a preset algorithm, and generate the fingerprint of the picture.
[0217] A9. The method according to A1, wherein the calculating the similarity between pictures according to the fingerprints of the pictures specifically includes:
[0218] Calculate the Hamming distance between the fingerprints of the pictures, and calculate the similarity between the pictures according to the obtained Hamming distance.
[0219] B10. A device for cleaning pictures, the device comprising:
[0220] memory, suitable for storing pictures;
[0221] The processor is adapted to scan the memory, obtain the stored pictures, generate, corresponding to each picture, a fingerprint representing the image features of the picture, calculate the similarity between pictures according to the fingerprints of the pictures, and determine pictures whose similarity meets a preset condition as similar pictures;
[0222] a display, adapted to display the similar pictures determined by the processor in the interface;
[0223] a receiver, adapted to receive an instruction input by a user;
[0224] The processor is further adapted to mark the selected picture in the interface after the receiver receives the picture-selection instruction input by the user, and to delete the selected picture from the memory after the receiver receives the instruction to delete pictures input by the user.
[0225] B11. The device according to B10, wherein the processor is further adapted to classify the acquired pictures according to attribute information of the pictures, the attribute information including at least one of the following: name, storage path, and shooting time;
[0226] The processor is specifically adapted to generate, for each category obtained by the classification, the fingerprints of the pictures in the category, and to calculate the similarity between the pictures in the category according to the fingerprints of the pictures.
[0227] B12. The device according to B11, wherein the processor is specifically adapted to classify pictures with the same shooting time and the same main name before the suffix in their names into the beautification category, or to classify pictures stored under the same picture-processing application path into the beautification category.
[0228] B13. The device according to B11, wherein the processor is specifically adapted to classify pictures that have the same shooting time and whose names, apart from the suffix, are wholly or partly the same into the burst category.
[0229] B14. The device according to B11, wherein the processor is specifically adapted to classify pictures whose shooting time interval is within a preset interval duration into a multi-shot category.
[0230] B15. The apparatus according to B11, wherein the processor is specifically adapted to compare the similarity with a preset threshold corresponding to the category to which it belongs, and determine the similar pictures.
[0231] B16. The device according to B10, wherein the processor is further adapted to calculate a parameter value representing the picture quality of each picture, and to determine a picture whose parameter value meets a preset condition for unqualified picture quality as a picture with unqualified picture quality;
[0232] The display is further adapted to display the unqualified picture determined by the processor in the interface.
[0233] B17. The device according to B10, wherein the processor is specifically adapted to extract features from the image of a picture, calculate the extracted features using a preset algorithm, and generate the fingerprint of the picture.
[0234] B18. The device according to B10, wherein the processor is specifically adapted to calculate the Hamming distance between the fingerprints of the pictures, and to calculate the similarity between the pictures according to the obtained Hamming distance.
[0235] C19. A terminal device for image cleaning, wherein the terminal device includes the apparatus described in any one of B10-B18.
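Schemes A1, A8 and A9 leave the fingerprint algorithm open ("a preset algorithm"). As one hedged possibility only, the sketch below (Python, assuming the Pillow library is available) scans a directory for stored pictures, generates a simple average-hash fingerprint per picture, derives similarity from the Hamming distance between fingerprints, and deletes the pictures the user selected; the threshold value, helper names, and file-extension list are assumptions rather than values taken from the patent.

```python
# Illustrative end-to-end sketch of schemes A1, A8 and A9.
# Pillow (PIL) is assumed to be installed; all names and thresholds are examples only.
import os
from PIL import Image

SIMILARITY_THRESHOLD = 0.9   # hypothetical "preset condition" for similar pictures


def average_hash(path, hash_size=8):
    """Fingerprint representing coarse image features of a picture (scheme A8)."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    # Each bit records whether a thumbnail pixel is brighter than the mean.
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def similarity(fp_a, fp_b, bits=64):
    """Similarity derived from the Hamming distance between two fingerprints (scheme A9)."""
    hamming = bin(fp_a ^ fp_b).count("1")
    return 1.0 - hamming / bits


def scan_pictures(root):
    """Scan the storage directory for stored pictures (first step of scheme A1)."""
    exts = (".jpg", ".jpeg", ".png")
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(exts):
                yield os.path.join(dirpath, name)


def find_similar_pairs(root):
    """Return pairs of pictures whose similarity meets the preset condition."""
    paths = list(scan_pictures(root))
    fingerprints = {p: average_hash(p) for p in paths}
    pairs = []
    for i, a in enumerate(paths):
        for b in paths[i + 1:]:
            if similarity(fingerprints[a], fingerprints[b]) >= SIMILARITY_THRESHOLD:
                pairs.append((a, b))
    return pairs


def delete_selected(selected_paths):
    """Delete the pictures the user marked for deletion (last step of scheme A1)."""
    for path in selected_paths:
        os.remove(path)


if __name__ == "__main__":
    # Directory name is a placeholder; in the claimed flow the similar pairs would be
    # displayed in the interface, and only the user-marked pictures would be deleted.
    print(find_similar_pairs("DCIM/Camera"))
```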
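Schemes A2 through A5 classify pictures by attribute information (name, storage path, shooting time) before similarity is computed within each category. The sketch below shows one possible reading of those rules; the picture-processing application paths, the multi-shot interval, and the partial-name-matching rule for the burst category are all assumptions made for illustration.

```python
# Illustrative grouping by attribute information (schemes A2-A5); thresholds,
# paths, and matching rules below are assumptions for the sake of the example.
import os
from dataclasses import dataclass

# Hypothetical storage paths used by picture-processing (beautification) applications.
EDITOR_PATHS = ("Photo/Edited", "Camera/Beautify")
MULTI_SHOT_INTERVAL = 2.0   # seconds; stands in for the "preset interval length" of scheme A5


@dataclass
class Picture:
    path: str          # storage path of the picture
    shot_time: float   # shooting time as a POSIX timestamp


def main_name(path):
    """Part of the file name before the suffix, e.g. 'IMG_0001' for 'IMG_0001.jpg'."""
    return os.path.splitext(os.path.basename(path))[0]


def classify(pictures):
    """Group pictures into beautification, burst, and multi-shot categories (schemes A3-A5)."""
    categories = {"beautification": [], "burst": [], "multi-shot": []}
    pictures = sorted(pictures, key=lambda p: p.shot_time)
    for i, pic in enumerate(pictures):
        others = pictures[:i] + pictures[i + 1:]
        # A3: stored under a picture-processing application path, or same shooting time
        #     and same main name as another picture.
        if any(editor in pic.path for editor in EDITOR_PATHS) or any(
            o.shot_time == pic.shot_time and main_name(o.path) == main_name(pic.path)
            for o in others
        ):
            categories["beautification"].append(pic)
        # A4: same shooting time and partly matching main name (here: same prefix before '_').
        elif any(
            o.shot_time == pic.shot_time
            and main_name(o.path).split("_")[0] == main_name(pic.path).split("_")[0]
            for o in others
        ):
            categories["burst"].append(pic)
        # A5: shooting times within the preset interval length.
        elif any(abs(o.shot_time - pic.shot_time) <= MULTI_SHOT_INTERVAL for o in others):
            categories["multi-shot"].append(pic)
    return categories
```

In line with scheme A6, a different preset similarity threshold could then be applied to each of these categories before similar pictures are reported.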
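Scheme A7 only requires "a parameter value representing the picture quality" and a preset unqualified condition, without fixing the metric. As one commonly used stand-in, the sketch below uses the variance of grayscale pixel values as a rough sharpness/contrast indicator; the metric, the threshold, and the function names are assumptions, not details from the patent.

```python
# Illustrative quality check for scheme A7: grayscale variance as a rough indicator.
# The metric and threshold are assumptions; the patent only requires "a parameter value".
from PIL import Image

QUALITY_THRESHOLD = 100.0   # hypothetical "preset picture quality unqualified condition"


def quality_parameter(path):
    """Variance of grayscale pixel values; low variance suggests a blurry or blank picture."""
    pixels = list(Image.open(path).convert("L").getdata())
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)


def is_unqualified(path):
    return quality_parameter(path) < QUALITY_THRESHOLD


if __name__ == "__main__":
    print(is_unqualified("DCIM/Camera/IMG_0003.jpg"))   # path is a placeholder
```

A picture flagged in this way would then be displayed in the interface, as scheme A7 describes, so that the user can decide whether to delete it.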