
Methods for object recognition and related arrangements

A technology of object recognition and related arrangements, applied in the field of object recognition techniques and search space reduction techniques. It addresses problems that limit the accuracy of object recognition, so as to achieve fast device-side execution of object recognition.

Active Publication Date: 2015-12-01
DIGIMARC CORP
37 Cites · 4 Cited by
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Benefits of technology

[0014]4. Device, Local-Server, Global-Server Dynamics: Many retail, in-store applications will push key reference features directly onto the user's device, allowing fast device-side execution of object recognition, constrained by power consumption, memory and channel usage. Fluidity of “where” various recognition stages are actually executed provides a welcome design flexibility in the device-local-global continuum.
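The tiering described above can be pictured as a simple cascade: try the reference features already pushed onto the device, and only fall back to a local or global server when the on-device match is inconclusive. The sketch below is an illustration of that idea only, not the patent's implementation; the class name TieredRecognizer, the lookup callbacks, and the match threshold are all assumptions.

```python
# Minimal sketch of a device / local-server / global-server cascade.
# All names and thresholds are hypothetical, not taken from the patent.
import cv2


class TieredRecognizer:
    def __init__(self, device_cache, local_lookup=None, global_lookup=None):
        self.device_cache = device_cache      # {object_id: float32 descriptor array} pushed to the device
        self.local_lookup = local_lookup      # callable(descriptors) -> object_id or None
        self.global_lookup = global_lookup    # callable(descriptors) -> object_id or None
        self.matcher = cv2.BFMatcher(cv2.NORM_L2)

    def _match_on_device(self, query_desc, min_good=25):
        # Stage 1: match against the small on-device reference set -- cheap, no channel usage.
        best_id, best_count = None, 0
        for obj_id, ref_desc in self.device_cache.items():
            pairs = self.matcher.knnMatch(query_desc, ref_desc, k=2)
            good = [p[0] for p in pairs
                    if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
            if len(good) > best_count:
                best_id, best_count = obj_id, len(good)
        return best_id if best_count >= min_good else None

    def recognize(self, query_desc):
        obj_id = self._match_on_device(query_desc)
        if obj_id is not None:
            return obj_id, "device"
        # Stages 2 and 3: fall back to local, then global, servers if configured.
        if self.local_lookup is not None:
            obj_id = self.local_lookup(query_desc)
            if obj_id is not None:
                return obj_id, "local-server"
        if self.global_lookup is not None:
            obj_id = self.global_lookup(query_desc)
            if obj_id is not None:
                return obj_id, "global-server"
        return None, "unrecognized"
```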

Problems solved by technology

But they break down when trying to identify 3D objects.
But if the camera view is oblique, as in FIG. 1, conventional fingerprinting starts to have difficulty.
Even if a reference fingerprint (e.g., SIFT) were available for the entire package (e.g., in a flat configuration, before the box was glued into its 3D configuration), this reference fingerprint would be difficult to match with the FIG. 1 image, given its projective distortions in two opposing directions.
Still more difficult are objects that cannot readily be fingerprinted in a “flat” state (e.g., before a box is glued).
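To make the failure mode concrete, the sketch below simulates an oblique view of flat package art with a perspective warp and counts how many SIFT matches survive a standard ratio test. It is an illustration under assumed conditions (a hypothetical image file, an arbitrary warp), not a reproduction of FIG. 1 or the patent's experiments.

```python
# Sketch: SIFT matching of a "flat" reference against a simulated oblique view.
# "package_flat.png" is a hypothetical placeholder path; the warp is arbitrary.
import numpy as np
import cv2

ref = cv2.imread("package_flat.png", cv2.IMREAD_GRAYSCALE)  # flat reference art
h, w = ref.shape

# Simulate an oblique camera view: foreshorten the right edge, as if the
# panel were rotated away from the camera about its vertical axis.
src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
dst = np.float32([[0, 0], [w, 0.25 * h], [w, 0.75 * h], [0, h]])
H = cv2.getPerspectiveTransform(src, dst)
oblique = cv2.warpPerspective(ref, H, (w, h))

sift = cv2.SIFT_create()
kp_r, des_r = sift.detectAndCompute(ref, None)
kp_o, des_o = sift.detectAndCompute(oblique, None)

# Ratio-test matching; the surviving match count typically drops sharply
# as the simulated view becomes more oblique.
matcher = cv2.BFMatcher(cv2.NORM_L2)
pairs = matcher.knnMatch(des_r, des_o, k=2)
good = [p[0] for p in pairs if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
print(f"{len(kp_r)} reference keypoints, {len(good)} matches survive the oblique view")
```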

Method used


Examples

Experimental program
Comparison scheme
Effect test

Embodiment Construction

[0025]FIG. 2 shows that there are essentially four free parameters at play when a camera at a viewpoint 12 is pointed towards an object 14. These are two angles θ and Φ, which characterize the view direction of a vector that points from the viewpoint 12 towards the center of the object (the former being an azimuth angle—in the X-Y plane, and the latter being an elevation angle—shown as measured down from the Z-axis); a distance d of the vector; and an angle Ψ characterizing the rotation of the camera from a normal orientation (e.g., an orientation in which the bottom of the camera's image sensor is parallel to an equatorial plane through the object).

[0026]There are two further parameters that describe the offset of the “center” of the object, as imaged onto the image sensor, from the center of the image sensor. (The “center” is in quotation marks because even this notion is somewhat ambiguous when speaking of arbitrary three-dimensional objects.)
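A minimal numeric sketch of this six-parameter viewpoint model is given below, assuming the object center sits at the world origin. The look-at construction and variable names are illustrative conventions only, not taken from the patent.

```python
# Sketch of the six-parameter viewpoint model described in [0025]-[0026].
# Variable names (theta, phi, d, psi, du, dv) mirror the parameters in the text.
import numpy as np

def camera_pose(theta, phi, d, psi):
    """theta: azimuth in the X-Y plane; phi: elevation measured down from the Z-axis;
    d: distance to the object center; psi: camera roll about the view axis."""
    # Viewpoint position on a sphere of radius d around the object center (origin).
    pos = d * np.array([np.sin(phi) * np.cos(theta),
                        np.sin(phi) * np.sin(theta),
                        np.cos(phi)])
    forward = -pos / d                      # unit vector pointing toward the object center
    # "Normal" orientation: sensor bottom parallel to the object's equatorial plane.
    right = np.cross(forward, np.array([0.0, 0.0, 1.0]))
    right /= np.linalg.norm(right)          # degenerate only when looking straight down Z
    up = np.cross(right, forward)
    # Apply roll psi by rotating the in-plane basis (right, up) about the view axis.
    c, s = np.cos(psi), np.sin(psi)
    right_r = c * right + s * up
    up_r = -s * right + c * up
    R = np.stack([right_r, up_r, forward])  # rows are the camera axes in world coordinates
    return pos, R

# The remaining two parameters are the sensor-plane offsets (du, dv) between
# where the object "center" lands and the sensor center.
pos, R = camera_pose(theta=0.4, phi=1.1, d=2.5, psi=0.1)
du, dv = 12.0, -5.0   # pixels; illustrative values only
print(pos, R)
```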

[0027]It can be appreciated then that for som...


Abstract

Methods and arrangements involving portable user devices such as smartphones and wearable electronic devices are disclosed, as well as other devices and sensors distributed within an ambient environment. Some arrangements enable a user to perform an object recognition process in a computationally- and time-efficient manner. Other arrangements enable users and other entities to, either individually or cooperatively, register or enroll physical objects into one or more object registries on which an object recognition process can be performed. Still other arrangements enable users and other entities to, either individually or cooperatively, associate registered or enrolled objects with one or more items of metadata. A great variety of other features and arrangements are also detailed.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001]This application claims the benefit of U.S. Provisional Application No. 61/811,049, filed Apr. 11, 2013, U.S. Provisional Application No. 61/815,172, filed Apr. 23, 2013, U.S. Provisional Application No. 61/815,994, filed Apr. 25, 2013, and U.S. Provisional Application No. 61/838,165, filed Jun. 21, 2013, each of which is herein incorporated by reference.

TECHNICAL FIELD

[0002]The present technology generally concerns object recognition techniques, search space reduction techniques, processing techniques that may be described as contextual, anticipatory or intuitive, techniques for implementing location-based services, ubiquitous or crowd-sourced capture of imagery, sound or other data to support the above-mentioned techniques, and many other technologies.

BACKGROUND AND SUMMARY

[0003]Image fingerprinting (aka image signature technology) commonly involves deriving a set of 2D feature points from imagery, and then searching a set of reference i...
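As a rough illustration of the fingerprinting workflow this background paragraph describes (deriving 2D feature points from imagery and searching them against a set of reference fingerprints), the sketch below uses OpenCV SIFT descriptors and FLANN nearest-neighbour search. The file names, reference set, and ratio-test threshold are assumptions, not Digimarc's implementation.

```python
# Sketch: derive a 2D feature-point fingerprint and search a reference set.
# Image paths ("cereal_box.png", "query_frame.png", ...) are hypothetical.
import cv2

sift = cv2.SIFT_create()

def fingerprint(path):
    # A fingerprint here is simply the image's set of SIFT descriptors.
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, descriptors = sift.detectAndCompute(img, None)
    return descriptors

# Reference set: object id -> descriptor fingerprint.
references = {name: fingerprint(f"{name}.png") for name in ("cereal_box", "soup_can")}

# FLANN kd-tree index for approximate nearest-neighbour descriptor search.
flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5), dict(checks=50))

def identify(query_path, min_good=25):
    q = fingerprint(query_path)
    best, best_count = None, 0
    for name, ref in references.items():
        pairs = flann.knnMatch(q, ref, k=2)
        good = [p[0] for p in pairs
                if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
        if len(good) > best_count:
            best, best_count = name, len(good)
    return best if best_count >= min_good else None

print(identify("query_frame.png"))
```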

Claims


Application Information

Patent Type & Authority: Patents (United States)
IPC(8): G06K9/00; G06K9/62; G06F17/30; G06T17/00; G06K9/46
CPC: G06K9/6202; G06F17/30247; G06K9/00208; G06K9/4671; G06K9/6255; G06T17/00; G06K9/00
Inventor: RHOADS, GEOFFREY B.; BAI, YANG; RODRIGUEZ, TONY F.; ROGERS, ELIOT; SHARMA, RAVI K.; LORD, JOHN D.; LONG, SCOTT; MACINTOSH, BRIAN T.; STACH, JOHN; LYONS, ROBERT G.; EATON, KURT M.
Owner: DIGIMARC CORP