Title of Invention | "A DEVICE FOR PERFORMING A LINE BASED IMAGE MATCHING" |
---|---|
Abstract | A line based image matching method in which a model image indexed by similar shape descriptors to a query image is retrieved from an image database indexed by line based shape descriptors. The line based image matching method involves: collecting line information of a query image and model images; defining the binary relation between the lines of the query image and the lines of the model images; measuring the compatibility coefficients of the node-label pairs of the query and model images based on the binary relation; and measuring the similarity between the query and model images on the basis of continuous relaxation labeling using the compatibility coefficient. |
Full Text | FIELD OF THE INVENTION The present invention relates to a device for performing line based image matching. More particularly, the present invention relates to a line based image matching method for retrieving a model image, indexed by shape descriptors similar to those of a query image, from an image database indexed by line based shape descriptors. A shape descriptor, as a basic image pattern recognition descriptor, is based on automatically extractable lower abstraction level description. Recent approaches have been made on algorithms for describing the shape of a particular object from an image and for measuring a degree of matching accuracy or similarity based on the shape descriptors. These algorithms are sufficient to describe the shape of a particular object but not to recognize the shape of an arbitrary object. The shape descriptors suggested by MPEG-7 are obtained with feature points found through a variety of modifications of a given object. There are many kinds of shape descriptors. The eXperimentation Model (XM) of MPEG-7 suggests two types of shape descriptors: the Zernike moment shape descriptor and the curvature scale space shape descriptor. As for the Zernike moment shape descriptor, Zernike base functions are defined for a variety of shapes of an object to investigate the distribution of the shapes within an image. Then, the image of a predetermined size is projected over all the base functions, and the resultant values are used as the shape descriptors. As for the curvature scale space shape descriptor, the contour of a model image is extracted, and changes of curvature points of the image along the contour are expressed as peak values on a scaled space. Then, the locations of the peak values are expressed as a 2-dimensional vector. The extraction of the former shape descriptor can be applied only when each shape of the image and the base function have a similar scale; also, the size of the input image is restricted. On the other hand, a drawback of the latter shape descriptor extraction lies in that the shapes are extracted from only a single object. According to a conventional shape descriptor based image matching method, accurate extraction of an object from a query image is needed prior to retrieving a model image, indexed by shape descriptors similar to those of the query image, from a database. In other words, if the object cannot be accurately extracted, retrieval of a model image by the conventional image matching is impossible. Therefore, there is a need for a simple image matching method capable of retrieving a model image indexed by similar shape descriptors to a query image from an image database indexed by line based shape descriptors, without need for accurate object extraction. SUMMARY OF THE INVENTION It is an object of the present invention to provide a line based image matching method capable of retrieving a model image indexed by similar shape descriptors to a query image from a model image database indexed by line based shape descriptors, without need for accurate object extraction. 
The object of the present invention is achieved by a line based image matching method comprising: collecting line information of a query image and model images; defining the binary relation between the lines of the query image and the lines of the model images; measuring the compatibility coefficients of the node-label pairs of the query and model images based on the binary relation; and measuring the similarity between the query and model images on the basis of continuous relaxation labeling using the compatibility coefficients. It is preferable that the line information of each of the query and model images is expressed by shape descriptors. It is preferable that the model images are retrieved from a database indexed by shape descriptors determined by: extracting the skeleton of a model image; thinning the skeleton; concatenating corresponding pixels based on the extracted skeleton to obtain a set of lines; and normalizing the set of lines to determine the normalized set of lines as the shape descriptors. It is preferable that the binary relation is invariant with respect to rotations, scale changes and translations. In this case, the binary relation may include at least one of an angular difference between two lines, a ratio of the lengths, a relative location, and a relative distance. It is preferable that measuring the compatibility coefficients of the node-label pairs based on the binary relation comprises: measuring the binary relation, denoted by ξij, for two nodes i and j within the set of lines of the query image; measuring the binary relation, denoted by ξλλ′, for two labels λ and λ′ within the set of lines for each of the model images; and measuring the compatibility coefficients, denoted by rij(λ, λ′), for the node-label pairs of the query and each of the model images. The compatibility coefficients rij(λ, λ′), as a measure of the strength of compatibility between the node-label pairs, may have high values corresponding to compatibility and low values corresponding to incompatibility. The compatibility coefficients rij(λ, λ′) may be determined as 1 if the binary relation of a node pair (i, j) of the query image coincides with the binary relation of a label pair (λ, λ′). The compatibility coefficients rij(λ, λ′) can be expressed as: (Equation Removed) where (Equation Removed), "K" denotes the number of elements of a character vector for a defined binary relation, and "p" is a measure of the difference in compatibility between node-label pairs. It is preferable that, before measuring the similarity on the basis of the continuous relaxation labeling, the line based image matching method further comprises assigning a uniform initial probability to a predetermined number of upper node-label pairs in which the sums of the highest degree of support by each adjacent label for the nodes are within the range of an upper level, the initial probability being close to the final probability. After assigning the uniform initial probability to the upper node-label pairs, the line based image matching method may further comprise defining a probability update element for the continuous relaxation labeling as: (Equation Removed) where pj(k)(λ′) denotes the node-to-label correspondence probability, and k denotes the number of iterations needed. 
After defining the probability update element, the line based image matching method may further comprise updating the probability on the basis of Zucker's theory using (Equation Removed) where (Equation Removed). It is preferable that measuring the similarity between the query and model images comprises: calculating the sum of the distances between corresponding node-to-label pairs of the sets of lines for the query and each of the model images; and determining the reciprocal of the sum of the distances as the similarity between the corresponding two images. The distances may be measured using the Euclidean distance or the Hausdorff distance. BRIEF DESCRIPTION OF THE DRAWINGS The above object and advantages of the present invention will become more apparent by describing in detail preferred embodiments thereof with reference to the attached drawings in which: FIG. 1 is a flowchart illustrating the main steps of a line based image matching method according to a preferred embodiment of the present invention; FIG. 2A illustrates a difference in slope of two lines as an example of the binary relation between two lines; and FIG. 2B illustrates an angle between the line segment connecting the middle points of two lines, and one of the two lines, as another example of the binary relation between the two lines. STATEMENT OF INVENTION The present invention provides a device for performing a line based image matching comprising a characterizing information collecting unit for collecting characterizing information for a query image and a plurality of model images, and a determining unit for determining the similarity between the query image and each model image on the basis of said characterizing information, which comprises, for each image, relationships between paired lines, characterized by said determining unit determining said similarities on the basis of continuous relaxation labelling using compatibility coefficients for pairs comprising query image nodes and model image labels determined on the basis of the characterizing information. DETAILED DESCRIPTION OF THE INVENTION A preferred embodiment of a line based image matching method according to the present invention is illustrated in FIG. 1. Referring to FIG. 1, line information on a query image and model images is collected (step 102). Here, it is assumed that the model images are stored in a database and indexed by line based shape descriptors. These line based shape descriptors can be extracted by the shape descriptor extraction method which was filed on October 21, 2000, by the present applicant, assigned Korean Patent Application No. 00-62163, and is incorporated herein by reference. Shape descriptors can be extracted from a query image by the same shape descriptor extraction method. The shape descriptors extracted by the shape descriptor extraction method include a set of lines. Line information on the query image and model images can be collected from the set of lines. Binary relations between the lines of the query image and model images are defined (step 104). The intrinsic attributes of a single line, such as slope, length or location of the middle point, and binary relations between two lines, such as angular difference, ratio of the lengths, relative location, or relative distance between two lines, can be used as the features of the lines. In the present embodiment, the binary relations that are invariant to rotations, scale changes, and translations are used as the line features. FIG. 
2A illustrates a difference in slope between two lines as an example of the binary relation. As shown in FIG. 2A, the relative position of the two lines can be expressed by the difference (θ) in the slopes of the two lines. FIG. 2B illustrates an angle between the line segment connecting the middle points of the two lines, and one of the two lines, as another example of the binary relation between two lines. As shown in FIG. 2B, the relative location between the two lines can be expressed by this angle. Although not illustrated in the drawings, a ratio (r) of the lengths of two lines can also be used as a binary relation between the two lines. In the line based image matching method according to the present invention, the correspondence between a set of lines for the query image and a set of lines for the model images is determined based on continuous relaxation labeling. For the continuous relaxation labeling, the initial probability is updated on the basis of the compatibility coefficients and a more recent labeling status. Once the initial probability is assumed, the probability is updated depending on the compatibility coefficients. The compatibility coefficients are used in modeling possible matching conditions to reach a satisfactory labeling. Accurate labeling close to a real query image is important. Next, in step 106, the compatibility coefficients between node-label pairs of the query and model images are determined based on the binary relations defined in step 104. The procedures of step 106 are as follows. First, a binary relation value ξij between two nodes vi and vj within the set of lines for the query image is calculated based on their binary relations. Here, i and j denote the identifiers of the nodes of the query image, and are not identical to each other. A binary relation value ξλλ′ between two labels λ and λ′ within the set of lines for each model image is calculated based on their binary relations. Here, λ and λ′ denote the identifiers of the labels of the model image. The strength of compatibility between the node pairs of the query image and the label pairs of the model image is expressed as the compatibility coefficient rij(λ, λ′). The compatibility coefficient rij(λ, λ′) is determined to be between 0 and 1 based on the binary relations of the query image and model images. The compatibility coefficient rij(λ, λ′) measures the strength of compatibility between node-label pairs: high values correspond to compatibility and low values correspond to incompatibility. For example, in considering the compatibility of the node-label pair (vi, λ), if the compatibility of label λ1 for the node vi is higher than that of label λ2, the compatibility coefficient is defined to satisfy the following relation: (Equation Removed) (1) When the binary relation of a node pair (i, j) of the query image coincides with the binary relation of a label pair (λ, λ′), the coefficient rij(λ, λ′) is determined as 1. In the present embodiment, the compatibility coefficient rij(λ, λ′) is expressed as: (Equation Removed) (2) where (Equation Removed), and "K" denotes the number of elements of a character vector for a defined binary relation. In formula (2), "p" is a measure of the difference in compatibility between node-label pairs. Next, the similarity is determined based on the continuous relaxation labeling with the compatibility coefficients (step 108). 
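Purely as an illustration of steps 104 and 106 (the patent's own expressions are given only as "(Equation Removed)" above), the following Python sketch computes a rotation-, scale- and translation-invariant binary relation vector for a pair of line segments and a compatibility coefficient between 0 and 1 that equals 1 when the query and model relation vectors coincide. The function names, the relative-distance normalisation and the 1/(1 + rho) form of the coefficient are assumptions for illustration, not the claimed formulas.

```python
# Illustrative sketch only: the patent's actual formulas are "(Equation Removed)"
# above, so the relation-vector layout and the 1/(1 + rho) coefficient form are
# assumptions. Line segments are ((x0, y0), (x1, y1)) tuples.
import math

def line_features(seg):
    """Slope angle, length and midpoint of a single line segment."""
    (x0, y0), (x1, y1) = seg
    angle = math.atan2(y1 - y0, x1 - x0)
    length = math.hypot(x1 - x0, y1 - y0)
    midpoint = ((x0 + x1) / 2.0, (y0 + y1) / 2.0)
    return angle, length, midpoint

def binary_relation(seg_a, seg_b):
    """Rotation/scale/translation-invariant relation vector for two lines:
    angular difference (FIG. 2A), length ratio, relative location (angle of
    the midpoint-connecting segment against the first line, FIG. 2B) and
    relative distance (midpoint distance over the first line's length)."""
    ang_a, len_a, mid_a = line_features(seg_a)
    ang_b, len_b, mid_b = line_features(seg_b)
    d = abs(ang_a - ang_b) % math.pi
    d_ang = min(d, math.pi - d)                       # undirected slope difference
    ratio = min(len_a, len_b) / max(len_a, len_b)     # length ratio r
    link = math.atan2(mid_b[1] - mid_a[1], mid_b[0] - mid_a[0])
    rl = abs(link - ang_a) % math.pi
    rel_loc = min(rl, math.pi - rl)                   # relative location angle
    rel_dist = math.hypot(mid_b[0] - mid_a[0], mid_b[1] - mid_a[1]) / max(len_a, 1e-9)
    return (d_ang / math.pi, ratio, rel_loc / math.pi, rel_dist)

def compatibility(xi_query, xi_model):
    """Assumed r_ij(lambda, lambda'): equals 1 when the query and model
    relation vectors coincide and decays towards 0 as their mean absolute
    difference (averaged over the K vector elements) grows."""
    K = len(xi_query)
    rho = sum(abs(a - b) for a, b in zip(xi_query, xi_model)) / K
    return 1.0 / (1.0 + rho)
```

Any decreasing function of the relation-vector difference that is 1 at zero difference would satisfy the constraints stated above; the reciprocal form is used here only because it needs no tuning parameter.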
As previously described, in a continuous relaxation labeling process, the initial probability is updated on the basis of the compatibility coefficients and a more recent labeling status. Once the initial probability is assumed, the probability is updated depending on the compatibility coefficients. The compatibility coefficients are used in modeling possible matching conditions to reach a satisfactory labeling. Accurate labeling close to a real query image is important. Step 108, performed based on the continuous relaxation labeling using the compatibility coefficients, involves a probability updating step, a normalization step, and a similarity determination step. First, a uniform initial probability is assigned to a predetermined number of upper node-label pairs in which the sums of the highest degree of support by each adjacent label for the nodes are within the range of an upper level, wherein the initial probability must be close to the final probability. A probability update element for the continuous relaxation labeling is defined as (Equation Removed) where pj(k)(λ′) denotes the node-to-label correspondence probability, and k denotes the number of iterations needed. The probability is updated as follows on the basis of Zucker's theory, which ensures an efficient convergence result: (Equation Removed) (3) where (Equation Removed). The range of each element of the character vector does not coincide with that of the other elements, and thus there is a need for normalization. Information on the lines to be matched with a query image can be clearly obtained by the node-to-label correspondence based on the relaxation labeling. One of the lines from the set for the query image is selected, and the remainder of the lines are normalized with respect to the selected line in terms of size and location. In particular, an arbitrary line is selected from the set of lines for a query image. The length of the selected line is assigned 1, and a spatial transform function is defined such that the selected line is aligned on the X-axis of an orthogonal coordinate system. Then, all the lines for the query image are transformed by the spatial transform function. This spatial transformation is carried out for all the lines of the model images corresponding to the selected lines. Then, the similarity between the query image and the model images is determined. In the present embodiment, the similarity is determined by calculating the sum of the node-to-label distances of the query and each of the model images. A variety of techniques can be applied to determine the similarity. For example, if the node-to-label correspondence between the query image and the model images is apparent, and if the normalization with respect to rotations and scale changes is carried out for the corresponding lines, the distances between the nodes and labels can be directly measured using the Euclidean distance. Alternatively, the distances can be measured using the Hausdorff distance, which ensures a more accurate similarity result. In the present embodiment, the similarity between the query image and a model image is measured as follows. The distances between the corresponding nodes and labels are measured for the two line sets of the query image and each of the model images, using the Euclidean distance. Then, the sum of all the distances is calculated, and the similarity between the two images is determined as the reciprocal of the sum. A model image having the highest degree of similarity is retrieved (step 110). 
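Purely as a hedged illustration of step 108 (the patent's probability update element, the Zucker-based update and the seeding of the upper node-label pairs all appear only as "(Equation Removed)" here), the Python sketch below substitutes a classical Rosenfeld/Hummel/Zucker-style relaxation rule, starts from a uniform distribution over all labels rather than the patent's selected upper pairs, and computes the similarity as the reciprocal of the summed Euclidean distances between matched lines. All function names, the iteration count and the best-label assignment step are hypothetical, not taken from the patent.

```python
# Hedged stand-in for step 108: a classical Rosenfeld/Hummel/Zucker-style
# relaxation update (the patent's own update rule is "(Equation Removed)"),
# initialised uniformly over all labels instead of the patent's selected
# upper node-label pairs, followed by the reciprocal-of-distance similarity.
import math

def relaxation_labelling(r, n_nodes, n_labels, iterations=20):
    """r[(i, j, lam, lam_p)] -> compatibility coefficient in [0, 1].
    Returns p[i][lam], the node-to-label correspondence probabilities."""
    p = [[1.0 / n_labels] * n_labels for _ in range(n_nodes)]
    for _ in range(iterations):
        new_p = []
        for i in range(n_nodes):
            support = []
            for lam in range(n_labels):
                # q: degree of support for label lam at node i from all
                # neighbouring node-label assignments
                q = sum(r.get((i, j, lam, lam_p), 0.0) * p[j][lam_p]
                        for j in range(n_nodes) if j != i
                        for lam_p in range(n_labels))
                support.append(p[i][lam] * (1.0 + q))
            total = sum(support) or 1.0
            new_p.append([s / total for s in support])  # renormalise per node
        p = new_p
    return p

def similarity(query_lines, model_lines, p):
    """Reciprocal of the summed Euclidean endpoint distances between each
    query line and its most probable label line; both line sets are assumed
    already normalised for rotation and scale as described above."""
    total = 0.0
    for i, ((qx0, qy0), (qx1, qy1)) in enumerate(query_lines):
        lam = max(range(len(model_lines)), key=lambda m: p[i][m])
        (mx0, my0), (mx1, my1) = model_lines[lam]
        total += math.hypot(qx0 - mx0, qy0 - my0) + math.hypot(qx1 - mx1, qy1 - my1)
    return 1.0 / total if total > 0 else float("inf")
```

In this sketch a query line i is finally matched to the label maximising p[i][lam]; with the Hausdorff distance substituted for the Euclidean endpoint distances, the same reciprocal-of-sum construction applies.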
The present invention also provides a device for performing a line based image matching, said device comprising: (a) a first means for collecting line information of a query image and model images; (b) a second means for defining the binary relation between the lines of the query image and the lines of the model images; (c) a third means for measuring the compatibility coefficients of the node-label pairs of the query and model images based on the binary relation; and (d) a fourth means for measuring the similarity between the query and model images on the basis of continuous relaxation labeling using the compatibility coefficients. In an embodiment of the present invention, the line information of each of the query and model images is expressed by shape descriptors. In another embodiment of the present invention, the model images are retrieved from a database indexed by shape descriptors determined by: extracting the skeleton of a model image; thinning the skeleton; concatenating corresponding pixels based on the extracted skeleton to obtain a set of lines; and normalizing the set of lines to determine the normalized set of lines as the shape descriptors. In still another embodiment of the present invention, the binary relation is invariant with respect to rotations, scale changes and translations. In yet another embodiment of the present invention, the binary relation includes at least one of an angular difference between two lines, a ratio of the lengths, a relative location, and a relative distance. In still yet another embodiment of the present invention, the third means further comprises: (a) a fifth means for measuring the binary relation, denoted by ξij, for two nodes i and j within the set of lines of the query image; (b) a sixth means for measuring the binary relation, denoted by ξλλ′, for two labels λ and λ′ within the set of lines for each of the model images; and (c) a seventh means for measuring the compatibility coefficients, denoted by rij(λ, λ′), for the node-label pairs of the query and each of the model images. In yet still another embodiment of the present invention, the compatibility coefficients rij(λ, λ′) are a measure of the strength of compatibility between the node-label pairs. In a further embodiment of the present invention, high values of the compatibility coefficient correspond to compatibility and low values of the compatibility coefficient correspond to incompatibility. In one more embodiment of the present invention, the compatibility coefficients rij(λ, λ′) are determined as 1 if the binary relation of a node pair (i, j) of the query image coincides with the binary relation of a label pair (λ, λ′). In another embodiment of the present invention, the compatibility coefficients rij(λ, λ′) are expressed as: (Equation Removed) where (Equation Removed), "K" denotes the number of elements of a character vector for a defined binary relation, and "p" is a measure of the difference in compatibility between node-label pairs. In a further embodiment of the present invention, the fourth means further comprises an eighth means for assigning a uniform initial probability to a predetermined number of upper node-label pairs in which the sums of the highest degree of support by each adjacent label for the nodes are within the range of an upper level, the initial probability being close to the final probability. 
In an embodiment of the present invention, the eighth means further comprises a ninth means for defining a probability update element for the continuous relaxation labeling as: (Equation Removed) where pj(k)(λ′) denotes the node-to-label correspondence probability, and k denotes the number of iterations needed. In another embodiment of the present invention, the ninth means further comprises a tenth means for updating the probability on the basis of Zucker's theory using: (Equation Removed) where (Equation Removed). In still another embodiment of the present invention, the fourth means further comprises: (a) an eleventh means for calculating the sum of the distances between corresponding node-to-label pairs of the sets of lines for the query and each of the model images; and (b) a twelfth means for determining the reciprocal of the sum of the distances as the similarity between the corresponding two images. In yet another embodiment of the present invention, the distances are measured using the Euclidean distance or the Hausdorff distance. A simulation experiment was carried out based on the above-mentioned line based image matching method. Edges of a query image were extracted, and the extracted edges were approximated into lines to express the query image with the lines. Then, some of the lines of the query image were selected by the user and matched with model images using information on the selected lines. An example of a query image is shown in FIG. 3A. FIG. 3B illustrates the query image of FIG. 3A after the above-mentioned line approximation. FIG. 3C illustrates some lines selected by the user from the query image of FIG. 3B. FIG. 4A illustrates an example of a model image corresponding to the query image which was retrieved in the simulation test. FIG. 4B illustrates the model image of FIG. 4A after line approximation. As shown in FIG. 4B, the retrieved model image approximated with lines, as indicated by reference numeral 402, includes lines similar to those of the line approximated query image shown in FIG. 3C. FIG. 5A illustrates another example of a model image, and FIG. 5B illustrates the model image of FIG. 5A after line approximation. As shown in FIG. 5B, the line approximated model image, as indicated by reference numeral 502, also includes lines similar to those of the line approximated query image shown in FIG. 3C. As previously mentioned, in the line based image matching method according to the present invention, a model image with shape descriptors similar to those of a query image can be retrieved from an image database indexed by line based shape descriptors, without need for object extraction from the query image. The line based image matching method according to the present invention may be embodied as a program executable on a personal or server computer. A functional program, code and code segments, used to implement the present invention, can be derived by a skilled computer programmer from the description of the invention contained herein. The program can be stored in a computer usable medium, including but not limited to storage media such as magnetic storage media, optically readable media, and carrier waves. WE CLAIM: 1. 
A device for performing a line based image matching, said device being characterized in that it comprises: (a) a characterizing information collecting unit for collecting characterizing information for a query image and a plurality of model images; and (b) a determining unit for determining the similarity between the query image and each model image on the basis of said characterizing information, which comprises, for each image, relationships between paired lines, wherein said determining unit is configured to determine said similarities on the basis of continuous relaxation labelling using compatibility coefficients for pairs comprising query image nodes and model image labels determined on the basis of the characterizing information such as herein described. 2. The device for performing a line based image matching as claimed in claim 1, wherein said characterizing information collecting unit collects characterizing information from lines in the query image and model images, and defines said relationships between paired lines based on the binary relations between the lines of the query image and the lines of the model images. 3. The device for performing a line based image matching as claimed in claim 2, wherein the model image data are stored in a database means and the said data are retrieved from the same. 4. The device for performing a line based image matching as claimed in claim 3, wherein the said database means is connected to a shape descriptor unit for expressing line information of each of the query image and model images in a manner such as herein described. 5. The device for performing a line based image matching as claimed in claim 2, wherein the binary relation is invariant with respect to rotations, scale changes and translations, and includes at least one of an angular difference between two lines, a ratio of the lengths, a relative location, and a relative distance. 6. The device for performing a line based image matching as claimed in claim 1, wherein the determining unit comprises a compatibility coefficient measuring unit for measuring the compatibility coefficients of the node-label pairs based on the binary relation. 7. The device for performing a line based image matching as claimed in claim 6, wherein the compatibility coefficient measuring unit comprises: a first binary relation measuring unit for measuring the binary relation for two nodes within the set of lines of the query image; a second binary relation measuring unit for measuring the binary relation for two labels within the set of lines for each of the model images; and a measuring unit for measuring the compatibility coefficients for the node-label pairs of the query and each of the model images. 8. The device for performing a line based image matching as claimed in claim 7, wherein the compatibility coefficients thus determined by the measuring unit are a measure of the strength of compatibility between the node-label pairs, low values of the compatibility coefficients corresponding to incompatibility and high values corresponding to image coincidence. 9. The device for performing a line based image matching as claimed in claim 7, wherein the compatibility coefficients are determined as 1 by the measuring unit if the binary relation of a node pair of the query image coincides with the binary relation of a label pair. 10. 
The device for performing a line based image matching as claimed in claim 1, wherein the determining unit comprises an assigning unit for assigning a uniform initial probability to a predetermined number of upper node-label pairs in which the sums of the highest degree of support by each adjacent label for the nodes are within the range of an upper level, the initial probability being close to the final probability. 11. The device for performing a line based image matching as claimed in claim 10, wherein the determining unit comprises a defining unit for defining a probability update element for the continuous relaxation labeling and an updating unit for updating the probability in a manner such as herein described. 12. The device for performing a line based image matching as claimed in claim 1, wherein the determining unit comprises a distance calculating unit for calculating the sum of the distances between corresponding node-to-label pairs of the sets of lines for the query and each of the model images; and a similarity determining unit for determining the reciprocal of the sum of the distances as the similarity between the corresponding two images. 13. A device for performing a line based image matching substantially as herein described with reference to the accompanying drawings. |
---|
523-del-2001-correspondence-others.pdf
523-del-2001-correspondence-po.pdf
523-del-2001-description (complete).pdf
Patent Number | 231582 |
---|---|
Indian Patent Application Number | 523/DEL/2001 |
PG Journal Number | 13/2009 |
Publication Date | 27-Mar-2009 |
Grant Date | 06-Mar-2009 |
Date of Filing | 25-Apr-2001 |
Name of Patentee | SAMSUNG ELECTRONICS CO., LTD. |
Applicant Address | 416, MAETAN-DONG, YEONGTONG-GU, SUWON-SI, GYEONGGI-DO, 442-742, REPUBLIC OF KOREA. |
Inventors: | |
PCT International Classification Number | G06T 7/00 |
PCT International Application Number | N/A |
PCT International Filing date | |
PCT Conventions: | |