Title of Invention

METHOD AND DEVICE FOR MOTION ESTIMATION AND COMPENSATION FOR PANORAMA IMAGE

Abstract A device and a method for performing motion estimation and compensation on a panorama image with a 360 ° omni-directional view, based on the fact that the spatial relation between the left and right borders of the panorama image is very high. Accordingly, it is possible to improve image quality through effective and precise estimation and compensation of the motion of the panorama image, in particular at the left and right borders of the panorama image.
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10, rule 13)
"METHOD AND DEVICE FOR MOTION ESTIMATION AND COMPENSATION FOR PANORAMA IMAGE"
a) INDUSTRY ACADEMIC COOPERATION FOUNDATION KYUNGHEE UNIVERSITY, a non-profit academic entity of Korea, of 1, Seochun-ri, Kiheung-eup, Yongin-si, Gyeonggi-do, Republic of Korea
b) SAMSUNG ELECTRONICS CO., LTD., a Korean corporation of 416, Maetan-dong, Yeongtong-gu, Suwon-si, Gyeonggi-do 442-742, Republic of Korea
The following specification particularly describes the invention and the manner in which it is to be performed.

Description
METHOD AND DEVICE FOR MOTION ESTIMATION AND
COMPENSATION FOR PANORAMA IMAGE
Technical Field
[1] The present general inventive concept relates to motion estimation and com-
pensation for a panorama image, and more particularly, to a method and apparatus to estimate a motion of a panorama image containing 360 ° omni-directional image information, and a method and apparatus to compensate for the motion of the panorama image.
Background Art
[2] An omni-directional video camera system is capable of acquiring a 360 ° omni-
directional view from a single viewpoint. The omni-directional video camera system includes a camera to which a special mirror, such as a hyperboloid mirror, or a special lens, such as a fish-eye lens, is installed, or a plurality of cameras.
[3] Omni-directional video coding may be applied to three-dimensional (3D) realistic
broadcasting. As an example of a 3D realistic broadcasting service, a viewer's terminal receives all image information regarding scenes viewed from diverse viewpoints, such as the viewpoints of a pitcher, a catcher, a hitter, and an audience on the first base side in a baseball game, and the viewer can select a desired viewpoint to view one of the scenes from the desired viewpoint.
[4] An image captured by the omni-directional camera system has characteristics cor-
responding to a 3D cylindrical environment and thus is transformed into a two-dimensional (2D) plane image. In this case, the 2D plane image is a panorama image with a 360 ° omni-directional view, and omni-directional video coding is performed on the 2D panorama image.
[5] In a motion estimation technique, which is one of the image coding techniques, a
motion vector is computed by detecting a data unit in a previous frame that is most similar to a data unit in a current frame, using a predetermined evaluation function. The motion vector represents a position difference between the two data units. In general, 16x16 macro blocks are used as the data units, but the sizes of the data units are not limited thereto; for instance, the data units may be 16x8, 8x16, or 8x8 blocks.
[6] A conventional motion estimation technique using macro blocks will now be
described in greater detail. First, a motion vector of a current macro block of a current frame is predicted using a plurality of previous macro blocks of a previous frame adjacent to a position corresponding to the current macro block of the
current frame. FIG. 1 illustrates a plurality of previous macro blocks A, B, C, and D of
the previous frame used to estimate the motion vector of a current macro block X of
the current frame. The previous macro blocks A through D are encoded before coding of the current macro block X.
[7] However, some of the previous macro blocks adjacent to the current macro
block X may be unavailable for estimating the motion vector of the current macro block X, depending on the position of the current macro block X in the current frame. FIG. 2A illustrates a case where the previous macro blocks B, C, and D required for estimation of the motion vector of the current macro block X are not present. In this case, the motion vector of the current macro block X is set to 0.
[8] FIG. 2B illustrates a case where the previous macro blocks A and D are not present.
In this case, the motion vectors of the previous macro blocks A and D are set to 0, and the motion vector of the current macro block X is set to a median value of the motion vectors of the previous macro blocks A through D.
[9] FIG. 2C illustrates a case where the previous macro block C is not present. In this
case, the motion vector of the previous macro block C is set to 0, and the motion vector of the current macro block X is set to the median value of the motion vectors of the previous macro blocks A through D.
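By way of a non-limiting illustration of the prediction rule outlined in the preceding paragraphs, the following sketch computes a predicted motion vector for the current macro block X from the neighbouring blocks A through D (Python is used purely for illustration; the function and variable names are assumptions, and the substitution of D for an absent C follows the common convention of block-based codecs rather than an explicit statement of this specification):

    def predict_mv(mv_a, mv_b, mv_c, mv_d):
        """Illustrative median prediction for the current macro block X.

        Each argument is an (x, y) motion vector, or None when the
        corresponding previous macro block A, B, C or D of FIG. 1 is not
        present; absent blocks contribute the zero vector.
        """
        zero = (0, 0)
        if mv_b is None and mv_c is None and mv_d is None:
            return zero                      # FIG. 2A: no macro blocks above X
        if mv_c is None and mv_d is not None:
            mv_c = mv_d                      # assumed substitution of D for C
        a = mv_a if mv_a is not None else zero   # FIG. 2B: absent A (and D) -> 0
        b = mv_b if mv_b is not None else zero
        c = mv_c if mv_c is not None else zero   # FIG. 2C: absent C -> 0

        def median(u, v, w):
            return sorted((u, v, w))[1]

        return (median(a[0], b[0], c[0]), median(a[1], b[1], c[1]))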
[10] After predicting the motion vector of the current macro block X, the similarity
between each reference macro block in a reference frame indicated by the predicted motion vector and the current macro block X is computed using a predetermined evaluation function. Next, a reference macro block that is most similar to the current macro block X is detected from the reference frame within a predetermined search range. In general, a sum of absolute differences (SAD) function, a sum of absolute transformed differences (SATD) function, or a sum of squared differences (SSD) function is used as the predetermined evaluation function.
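As a concrete, non-limiting example of such an evaluation function, a SAD computation over two equally sized blocks may be sketched as follows (Python with NumPy is used for illustration only; the SATD and SSD functions differ only in the per-pixel cost term):

    import numpy as np

    def sad(current_block: np.ndarray, reference_block: np.ndarray) -> int:
        """Sum of absolute differences between two equally sized blocks."""
        diff = current_block.astype(np.int32) - reference_block.astype(np.int32)
        return int(np.abs(diff).sum())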
[11] During detection of the most similar reference macro block within the pre-
determined search range, some or all pixels of the reference macro blocks may be present outside the predetermined search range. In this case, as illustrated in FIG. 3, it is necessary to pad the values of the pixels on the left and right borders of the reference image to the outside of the left and right borders, respectively, to perform motion estimation and compensation. This motion estimation and compensation is referred to as motion estimation and compensation in an unrestricted motion vector (UMV) mode.
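The conventional padding of FIG. 3, which replicates the border pixels outward, may be sketched as follows (a simplified illustration that pads only in the horizontal direction and assumes the reference image is stored as a two-dimensional array; it is not the only possible implementation):

    import numpy as np

    def pad_replicate(reference: np.ndarray, pad: int) -> np.ndarray:
        """Conventional padding as in FIG. 3: the pixel values on the left
        border are replicated to the outside of the left border, and the
        pixel values on the right border to the outside of the right border."""
        return np.pad(reference, ((0, 0), (pad, pad)), mode="edge")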
[12] FIG. 4A illustrates a cylindrical image with a 360 ° omni-directional view, and FIG. 4B
illustrates a panorama image with a 360 ° omni-directional view, taken along a line X of the cylindrical image of FIG. 4A. Referring to FIG. 4B, a left side A and a right side B of a human-shaped object shown in FIG. 4A are positioned at the right and
left borders of the panorama image, respectively. That is, a spatial relation between the right and left borders of the panorama image with the 360 ° omni-directional view is very high.
[13] Thus, it is ineffective to perform the conventional motion estimation and com-
pensation on a panorama image with an omni-directional view without considering the characteristics of the panorama image. Accordingly, a method of effectively estimating and compensating for the motion of a panorama image with an omni-directional view is required.
Disclosure of Invention
Technical Solution
[14] The present general inventive concept provides a method and apparatus to ef-
fectively and precisely estimate a motion of a panorama image containing omni-directional image information.
[15] The present general inventive concept also provides a method and apparatus to ef-
fectively and precisely compensate for a motion of a panorama image containing omni-directional image information.
Advantageous Effects
[16] According to the present general inventive concept, motion estimation and com-
pensation are performed on a panorama image with a 360 ° omni-directional view based on the fact that the spatial relation between the right and left borders of the panorama image is very high, thereby increasing the efficiency and precision of motion estimation and compensation. Accordingly, it is possible to improve image quality, in particular, the image quality at the right and left borders of the panorama image.
Description of Drawings
[17] FIG. 1 is a diagram illustrating a plurality of previous macro blocks available for
conventional estimation of a motion vector for a current macro block;
[18] FIGS. 2A through 2C are diagrams illustrating cases where previous macro blocks
to be used for estimation of a motion vector of a current macro block are not present;
[19] FIG. 3 is a diagram illustrating a conventional method of padding a reference
image;
[20] FIG. 4A is a diagram illustrating a cylindrical image with a 360 ° omni-directional
view;
[21] FIG. 4B is a diagram illustrating a two-dimensional (2D) image corresponding to
the cylindrical image of FIG. 4A;
[22] FIG. 5 is a block diagram illustrating an encoding unit that encodes a motion vector
of a panorama image according to an embodiment of the present general inventive concept;
[23] FIGS. 6A and 6B are flowcharts illustrating a method of estimating the motion of a
panorama image according to an embodiment of the present general inventive concept;
[24] FIG. 7A is a diagram illustrating selection of previous macro blocks to be used for
estimation of a motion vector of a current macro block according to an embodiment of the present general inventive concept;
[25] FIG. 7B is a diagram illustrating selection of previous macro blocks to be used for
estimation of a motion vector of a current macro block according to another embodiment of the present general inventive concept;
[26] FIG. 8A is a diagram illustrating a case where a reference macro block partially
overlaps with a reference image;
[27] FIG. 8B is a diagram illustrating a case where a reference macro block is positioned
outside a reference image;
[28] FIG. 9 is a diagram illustrating a method of padding a reference image according to
an embodiment of the present general inventive concept;
[29] FIG. 10 is a diagram illustrating a motion vector of a current macro block;
[30] FIG. 11 is a block diagram illustrating a decoding unit that decodes a motion vector
of a panorama image according to an embodiment of the present general inventive concept; and
[31] FIG. 12 is a flowchart illustrating a method of compensating for a motion of a
panorama image according to an embodiment of the present general inventive concept.
Best Mode
[32] The foregoing and/or other aspects of the present general inventive concept may be
achieved by providing a method of estimating a motion of a current panorama image containing 360 ° omni-directional view information, the method comprising estimating a motion vector of a current data unit of the panorama image using motion vectors of a plurality of previous reference data units of a reference image adjacent to the current data unit; when one or more pixels of one of the reference data units indicated by the estimated motion vector are present outside one of left and right borders of the reference image, padding an image in a predetermined range from the other border of the reference image outside the one of the left and right borders; obtaining values of all pixels of the reference data unit from the padded reference image; and determining a similarity between the current data unit and the reference data unit using a predetermined evaluation function.
[33] The foregoing and/or other aspects of the present general inventive concept may
also be achieved by providing a method of estimating a motion of a current panorama image containing 360 ° omni-directional view information, the method comprising estimating a motion vector of a current data unit of the panorama image using motion vectors of a plurality of previous reference data units adjacent to the current data unit,
when one or more pixels of one of the reference data units indicated by the estimated motion vector are present outside one of left and right borders of the reference image,
obtaining values of all pixels of the one of the reference data units of the reference image from a cylindrical image which is obtained by connecting the left and right borders of the reference image when the reference image is the cylindrical image, and determining a similarity between the current data unit and the reference data unit using a predetermined evaluation function.
[34] The foregoing and/or other aspects of the present general inventive concept may
also be achieved by providing an apparatus to compensate for a motion of a panorama image containing 360 ° omni-directional view information, the apparatus comprising a memory to store a reference image to be used for motion estimation of the panorama image, and motion vectors of a plurality of previous reference data units adjacent to a current data unit of the panorama image, and a motion estimating unit to estimate a motion vector of the current data unit using the motion vectors of the plurality of the previous data units, when one or more pixels of one of the reference data units indicated by the estimated motion vector are present outside one of left and right borders of the reference image, to pad an image in a predetermined range from the other border of the reference image outside the one of the left and right borders, to obtain values of all pixels of the reference data unit from the padded reference image, and to determine a similarity between the current data unit and the reference data unit using a predetermined evaluation function.
[35] The foregoing and/or other aspects of the present general inventive concept may
also be achieved by providing an apparatus to estimate a motion of a panorama image containing 360 ° omni-directional view information, the apparatus comprising a memory to store a reference image to be used for motion estimation of the panorama image, and motion vectors of a plurality of previous reference data units of the reference image adjacent to a current data unit of the panorama image, and a motion estimating unit to estimate a motion vector of the current data unit using the motion vectors of the plurality of the previous data units, when one or more pixels of one of the reference data units indicated by the estimated motion vector are present outside one of left and right borders of the reference image, to obtain values of all pixels of the reference data unit from a cylindrical image obtained by connecting the left and right
borders of the reference image on an assumption that the reference image is the
cylindrical image, and to determine a similarity between the current data unit and the
reference data unit using a predetermined evaluation function.
[36] The foregoing and/or other aspects of the present general inventive concept may
also be achieved by providing a method of compensating for a motion of a panorama image containing 360 ° omni-directional view information, the method comprising
receiving a motion vector of a current data unit of the panorama image, when one or more pixels of one of reference data units of a reference image indicated by the motion
vector of the current data unit are present outside one of left and right borders of the reference image, padding the reference image in a predetermined range from the other border of the reference image outside the one of the left and right borders of the reference image, obtaining values of all pixels of the reference data unit from the padded reference image, and reproducing the current data unit using the values of the pixels of the reference data unit.
[37] The foregoing and/or other aspects of the present general inventive concept may
also be achieved by providing a method of compensating for a motion of a panorama image containing 360 ° omni-directional view information, the method comprising receiving a motion vector of a current data unit of the panorama image, when one or more pixels of one of reference data units of a reference image indicated by the motion vector of the current data unit are present outside one of left and right borders of the reference image, obtaining values of all pixels of the reference data unit from a cylindrical image which is obtained by connecting the left and right borders of the reference image when the reference image is the cylindrical image; and reproducing the current data unit using the values of the pixels of the reference data unit.
[38] The foregoing and/or other aspects of the present general inventive concept may
also be achieved by providing an apparatus to compensate for a motion of a panorama image containing 360 ° omni-directional view information, the apparatus comprising a memory to store a reference image to be used for motion estimation of the panorama image, and a motion compensating unit to receive a motion vector of a current data unit of the panorama image, when one or more pixels of one of reference data units of a reference image indicated by the motion vector of the current data unit are present outside one of left and right borders of the reference image, to pad the reference image in a predetermined range from the other border of the reference image outside the one of the left and right borders of the reference image, to obtain values of all pixels of the reference data unit from the padded reference image, and to reproduce the current data unit using the values of the pixels of the reference data unit.
[39] The foregoing and/or other aspects of the present general inventive concept may
also be achieved by providing an apparatus to compensate for the motion of a panorama image containing 360 ° omni-directional view information, the apparatus comprising a memory to store a reference image to be used for motion estimation of the panorama image; and a motion compensating unit to recajw amotion vector of a current data unit of the paaoraam image, when one or mare fixete of reference data units of a reference image indicated by me riwriOT vector of the current data unit are present outside one of left and right borders of the reference image, to obtain values of

WO 2006/016783 PCT/KR2005/002639
all pixels of the reference data unit from a cylindrical image which is obtained by connecting the left and rights borders of the reference image when the reference image is the cylindrical image; and reproducing the current data unit using the values of the pixels of the reference data unit.
[40] The foregoing and/or other aspects of the present general inventive concept may
also be achieved by providing an apparatus to estimate a motion vector of a panorama image containing 360 ° omni-directional view information, the apparatus comprising a memory to store a reference image having first and second borders and first and second reference data units disposed adjacent to the first border and the second border, respectively, within the reference image, and a motion estimating unit to receive a current data unit of a current image and the reference data units of the reference image from the memory, and to estimate a motion vector of the current data unit using one of the first and second reference data units of the reference image which is not included in a search area when the other one of the first and second reference data units is included in the search area.
[41 ] The foregoing and/or other aspects of the present general inventive concept may
also be achieved by providing an apparatus to generate a panorama image containing 360 ° omni-directional view information, the apparatus comprising a decoding unit to decode a bitstream having data corresponding to a current image and a reference image, and to generate a motion vector of a current data unit of the current image to correspond to a search area of the reference image which includes a first reference data unit disposed on a first border of the reference image, a panorama image motion compensating unit to generate a reference macro block of the first reference data unit of the reference image using a second reference data unit disposed on a second border of the reference image which is not included in the search area according to the motion vector, and an output unit to generate the current image according to the reference macro block and data corresponding to the decoded bitstream.
[42] The foregoing and/or other aspects of the present general inventive concept may
also be achieved by providing an apparatus having an encoder and a decoder to estimate a motion vector of a panorama image containing 360 ° omni-directional view information. The encoder comprises a memory to store a reference image having first and second borders and first and second reference data units disposed adjacent to the first border and the second border, respectively, within the reference image, a motion estimating unit to receive a current data unit of a current image and the reference data units of the reference image from the memory, and to estimate a motion vector of the current data unit using one of the first and second reference data units of the reference image which is not included in a search area when the other one of the first and second reference data units is included in the search area, a panorama image motion com-
pensating unit to generate a reference macro block according to the motion vector and the reference image, and a coding unit to generate a bitstream according to the current image and the reference macro block. The decoder comprises a decoding unit to decode the bitstream having data corresponding to the current image and the reference image, and to generate the motion vector of the current data unit of the current image to correspond to the search area of the reference image which includes the first reference data unit disposed on the first border of the reference image, a second panorama image motion compensating unit to generate the reference macro block of the first reference data unit of the reference image using the second reference data unit disposed on the second border of the reference image which is not included in the search area according to the motion vector, and an output unit to generate the current image according to the reference macro block and data corresponding to the decoded bitstream.
[43] The foregoing and/or other aspects of the present general inventive concept
may also be achieved by providing a method of estimating a motion vector of a panorama image containing 360 ° omni-directional view information, the method comprising storing a reference image having first and second borders and first and second reference data units disposed adjacent to the first border and the second border, respectively, within the reference image, and receiving a current data unit of a current image and the reference data units of the reference image from the memory, and estimating a motion vector of the current data unit using one of the first and second reference data units of the reference image which is not included in a search area when the other one of the first and second reference data units is included in the search area.
[44] The foregoing and/or other aspects of the present general inventive concept
may also be achieved by providing a method of generating a panorama image
containing 360 ° omni-directional view information, the method comprising decoding
a bitstream having data corresponding to a current image and a reference image,
generating a motion vector of a current data unit of the current image to correspond to a
search area of the reference image which includes a first reference data unit disposed
on a first border of the reference image, generating a reference macro block of the first
reference data unit of the reference image using a second reference data unit disposed
on a second border of the reference image which is not included in the search area
according to the motion vector, and generating the current image according to the
reference macro block and data corresponding to the decoded bitstream.
[45] The foregoing and/or other aspects of the present general inventive concept
may also be achieved by providing a method of estimating a motion vector of a
panorama image containing 360 ° omni-directional view information, the method comprising storing a reference image having first and second borders and first and
second reference data units disposed adjacent to the first border and the second border, respectively, within the reference image, receiving a current data unit of a current
image and the reference data units of the reference image from the memory, estimating a motion vector of the current data unit using one of the first and second reference data units of the reference image which is not included in a search area when the other one of the first and second reference data units is included in the search area, generating a reference macro block according to the motion vector and the reference image, generating a bitstream according to the current image and the reference macro block, decoding the bitstream having data corresponding to the current image and the reference image, generating the motion vector of the current data unit of the current image to correspond to the search area of the reference image which includes the first reference data unit disposed on the first border of the reference image, generating the reference macro block of the first reference data unit of the reference image using the second reference data unit disposed on the second border of the reference image which is not included in the search area according to the motion vector, and generating the current image according to the reference macro block and data corresponding to the decoded bitstream.
Mode for Invention
[46] Reference will now be made in detail to the embodiments of the present general
inventive concept, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. The embodiments are described below in order to explain the present general inventive concept while referring to the figures.
[47] FIG. 5 is a block diagram illustrating an encoding unit that encodes a motion vector
of a panorama image according to an embodiment of the presentgeneral inventive concept. Referring to FIG. 5, the encoding unit includes a transforming unit 110, a quantizing unit 115, an inverse quantizing unit 120, an inverse transforming unit 125, an adding unit 130, a clipping unit 140, a frame memory 150, a panorama image motion estimation unit 160, a panorama image motion compensation unit 170, a subtraction unit 180, and a variable-length coder (VLC) 190.
[48] The transforming unit 110 receives an input panorama image and transforms the
received panorama image through predetermined transformation to output transform coefficients. The input panorama image is a panorama image with a 360 ° omni-directional view as shown in FIG. 4B, taken along a line X of a cylindrical image shown in FIG. 4A. The predetermined transform performed by the transforming unit 110 may be a discrete cosine transform (DCT) in units of 8x8 blocks.
[49] The quantizing unit 115 quantizes the transform coefficients received from the
transforming unit 110. After the quantized transform coefficients are inversely
quantized by the inverse quantizing unit 120 and inversely transformed by the inverse transforming unit 125, the input panorama image is reproduced. The reproduced panorama image is normalized by the clipping unit 140 and stored in the frame memory 150. The panorama image stored in the frame memory 150 is used as a reference panorama image in motion estimation and compensation of a newly input panorama image. The adding unit 130 may have a predetermined value, receive the reproduced panorama image, modify the reproduced panorama image using the predetermined value, and output one of the reproduced panorama image and the modified panorama image to the clipping unit 140 and the panorama image motion compensation unit 170 as the reproduced panorama image. It is possible that the modified panorama image is the same as the reproduced panorama image according to the predetermined value.
[50] The panorama image motion estimation unit 160 performs motion estimation, using
the reference panorama image stored in the frame memory 150. Specifically, the panorama image motion estimation unit 160 receives information regarding the current panorama image, obtains a motion vector of the current panorama image by performing motion estimation on the current panorama image using the reference panorama image stored in the frame memory 150, and outputs the motion vector to the VLC 190. Motion estimation and compensation are performed in units of predetermined blocks referred to as data units. In this embodiment, the data units may be 16x16 macro blocks.
[51] The panorama image motion compensation unit 170 performs the motion com-
pensation. In detail, the panorama image motion compensation unit 170 receives the motion vector of a current macro block of the current panorama image from the panorama image motion estimation unit 160 and the reference panorama image of the frame memory 150, and outputs a reference macro block corresponding to the current macro block to the subtraction unit 180 using the motion vector of the current macro block of the current panorama image and the reference panorama image of the frame memory 150. The panorama image motion compensation unit 170 may use the reproduced panorama image and the motion vector to generate the reference macro block. The subtraction unit 180 outputs a residual signal between the current macro block and the reference macro block to the transforming unit 110. The residual signal is transformed by the transforming unit 110, quantized by the quantizing unit 115, and variable-length coded by the VLC 190. The motion vector of the current macro block generated by the panorama image motion estimation unit 160 is input directly to and variable-length coded by the VLC 190.
[52] The operation of the panorama image motion estimation unit 160 will now be
described in greater detail with reference to FIGS. 6A and 6B. FIGS. 6A and 6B are
flowcharts illustrating a method of estimating the motion of a panorama image according to an embodiment of the present general inventive concept. Referring
to FIGS. 5, 6A, and 6B, the panorama image motion estimation unit 160 estimates a motion vector of a current data unit using motion vectors of a plurality of previous data units adjacent to the current data unit (S310). As illustrated in FIG. 1, a data unit X is a current data unit, and the data units A, B, C and D are previous data units required for estimation of a motion vector of the current data unit X. In this embodiment, the data units may be 16x16 macro blocks. The current data unit X is included in a current frame, and the plurality of previous data units A, B, C and D are included in a previous frame.
[53] In detail, the panorama image motion estimation unit 160 detects the motion vectors
of the previous macro blocks A, B, C, and D stored in an internal memory (not shown). When all the previous macro blocks A through D are present, the motion vector of the current macro block X is estimated according to predetermined or conventional motion estimation, using the detected motion vectors.
[54] However, at least one of the previous macro blocks A through D may not be
present. FIG. 7A illustrates a case where the previous macro blocks A and D are not present in a panorama image, and thus, their motion vectors are unavailable for motion estimation of the current macro block X. FIG. 7B illustrates a case where the previous macro block C is not present in a panorama image, and thus, its motion vector is unavailable for motion estimation of the current macro block X.
[55] As described above, a spatial relation between the right and left borders of a
panorama image with a 360 ° omni-directional view is very high. That is, a distance between the right and left borders of the panorama image is substantially 0. According to this embodiment of the present general inventive concept, when one or more of the previous macro blocks A, C, and D required for estimation of the motion vector of the current macro block X are not present, the motion vectors of the previous macro blocks required for motion estimation are determined using the above characteristics of the panorama image. For instance, referring to FIG. 7A, a previous macro block D' at a right side of the panorama image and on a Y-axis on which the previous macro block D is positioned is substantially the same as the previous macro block D. Accordingly, a motion vector of the previous macro block D' is considered to be the same as that of the previous macro block D and can be used in estimation of the motion vector of the current macro block X. In contrast, the previous macro block at the right side of the panorama image and on the Y-axis on which the previous macro block A is positioned is encoded only after motion estimation of the current macro block X, so there is no available motion vector for it. Accordingly, the motion vector of the previous macro block A required for estimation of the motion
vector of the current macro block X is set to 0.
[56] Referring to FIG. 7B, a previous macro block C' at the left side of the panorama
image and on the Y-axis on which the previous macro block C is positioned is substantially the same as the previous macro block C. Thus, a motion vector of the previous macro block C' is considered to be the same as that of the previous macro block C and thus is used in estimation of the motion vector of the current macro block X.
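A minimal sketch of the neighbour selection described in the two preceding paragraphs is given below; it assumes that the motion vectors of already encoded macro blocks are kept in a dictionary keyed by block coordinates (the data structure and names are illustrative and do not appear in the specification):

    def neighbour_mv(bx, by, blocks_per_row, coded_mvs):
        """Motion vector of the neighbouring macro block at block coordinate
        (bx, by).  The horizontal index wraps around the left and right
        borders of the panorama (the blocks D' and C' of FIGS. 7A and 7B);
        a wrapped block that has not been encoded yet (the block A case of
        FIG. 7A) contributes the zero vector."""
        if by < 0:
            return (0, 0)                     # no macro block row above the frame
        wrapped = (bx % blocks_per_row, by)   # wrap across the left/right borders
        return coded_mvs.get(wrapped, (0, 0))

For example, for a current macro block in the leftmost column of blocks, the neighbour D at horizontal block index -1 maps to the last block of the same row, which corresponds to the block D' of FIG. 7A.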
[57] Referring back to FIGS. 6A and 6B, after the motion vector of the current macro
block X (or current data unit) is estimated in operation S310, the panorama image motion estimation unit 160 determines whether the reference macro block indicated by the estimated motion vector is present in a reference image (or reference panorama image) in operation S315. The reference image is stored in the frame memory 150.
[58] If all pixels of the reference macro block indicated by the motion vector of the
current macro block X are present in the reference image, the pixels of the reference macro block are fetched from the frame memory 150 (S335), and the similarity between the current macro block X and the reference macro block is determined using a predetermined evaluation function (S335).
[59] However, when some or all of the pixels of the reference macro block indicated by
the motion vector of the current macro block X are present outside one of right and left borders of the reference image, an image present in a predetermined range of the reference image from the other border is padded outside the one of the right and left borders (S320).
[60] FIG. 8A illustrates a case where the reference macro block is positioned at a border
of the reference image. FIG. 8B illustrates a case where the reference macro block is positioned outside the reference image.
[61] Referring to FIG. 3, conventional motion estimation and compensation are
performed after padding values of pixels at a left border of a reference image to the outside of the left border and pixels at a right border of the reference image to the outside of the right border. In contrast, the embodiment of the present general inventive concept is based on the fact that the spatial relation between the right and left borders of a panorama image with a 360 ° omni-directional view is very high. Referring to FIG. 9, an outside region 480 of a left border region 450 of a reference image 400 is padded with the values of pixels at a right border region 470 of the reference image 400. An outside region 460 of the right border region 470 is padded with the values of
pixels at the left border region 450.
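A minimal sketch of the padding of FIG. 9 is given below, assuming the reference image 400 is stored as a two-dimensional array whose second axis is the horizontal direction (the function name and the array layout are assumptions for illustration):

    import numpy as np

    def pad_panorama(reference: np.ndarray, pad: int) -> np.ndarray:
        """Pad the reference image as in FIG. 9: the region outside the left
        border is filled with the columns at the right border, and the region
        outside the right border with the columns at the left border, i.e.
        circular (wrap-around) padding along the horizontal axis."""
        return np.pad(reference, ((0, 0), (pad, pad)), mode="wrap")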
[62] Next, after padding the reference image in operation S320, the panorama image
motion estimation unit 160 fetches all pixel values of the reference macro block from the padded reference image in the frame memory 150 (S325). Thereafter, the similarity between the current macro block X and the reference macro block is evaluated using a
predetermined evaluation function (S335). In general, a sum of absolute differences (SAD) function, a sum of absolute transformed differences (SATD) function, or a sum
of squared differences (SSD) function is used as the predetermined evaluation function.
[63] Alternatively, when the reference image is a cylindrical image obtained by
connecting the right and left borders of the reference image, it is possible to obtain the values of all pixels of a reference data unit from the cylindrical image without padding the reference image. Specifically, the reference image is a two-dimensional (2D) plane image such as that shown in FIG. 4B, and the cylindrical image such as that shown in FIG. 4A is obtained by connecting the right and left borders of the 2D plane image. That is, when the reference image is a cylindrical image, the values of all pixels of the reference data unit can be obtained from the cylindrical image.
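Equivalently, the reference block may be fetched with wrap-around (modulo) addressing of the horizontal coordinate, so that no padded copy of the reference image is needed; the sketch below assumes a grayscale reference image and vertical coordinates that remain inside the image (names and defaults are illustrative):

    import numpy as np

    def fetch_block_cylindrical(reference: np.ndarray, x: int, y: int,
                                block_w: int = 16, block_h: int = 16) -> np.ndarray:
        """Fetch the block_w x block_h reference block whose top-left corner
        is at pixel position (x, y), treating the reference image as a
        cylinder: column indices are taken modulo the image width."""
        width = reference.shape[1]
        cols = np.arange(x, x + block_w) % width
        return reference[y:y + block_h][:, cols]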
[64] Next, the panorama image motion estimation unit 160 changes a position of the
reference macro block in a predetermined search range and determines the similarity between the changed reference macro block and the current macro block X (S340 and S345). After the evaluation of the similarity between the current macro block X and each of a plurality of reference macro blocks in the predetermined search range, the panorama image motion estimation unit 160 determines a reference macro block that is the most similar to the current macro block X from the plurality of reference macro blocks, and generates a motion vector of the determined reference macro block (S350).
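For illustration, the search of operations S340 through S350 may be sketched as an exhaustive search that reuses the cylindrical fetch and the SAD function sketched earlier; the search strategy, the parameter names, and the assumption that vertical candidates stay inside the image are simplifications, and the specification does not restrict the search to this form:

    def full_search(current_block, reference, cx, cy, search_range=16):
        """Exhaustive search around the pixel position (cx, cy) of the current
        block for the reference block with the lowest SAD; candidates crossing
        the left/right border are evaluated through the cylindrical fetch."""
        best_cost, best_mv = None, (0, 0)
        for dy in range(-search_range, search_range + 1):
            for dx in range(-search_range, search_range + 1):
                candidate = fetch_block_cylindrical(reference, cx + dx, cy + dy)
                cost = sad(current_block, candidate)
                if best_cost is None or cost < best_cost:
                    best_cost, best_mv = cost, (dx, dy)
        return best_mv, best_cost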
[65] FIG. 10 is a diagram illustrating a motion vector of a current macro block 510. In
FIG. 10, a reference numeral 530 denotes the macro block that is most similar to the current macro block 510 and present on the padded reference image, and a reference numeral 540 denotes the macro block that corresponds to the macro block 530 and is present on the non-padded image 500. When the macro block 530 is the most similar to the current macro block 510, a reference numeral 550 denotes the motion vector of the current macro block 510. When the reference macro block 540 is the most similar to the current macro block 510, a reference numeral 560 denotes the motion vector of the current macro block 510. That is, the motion vector of the current macro block 510 may be one of the motion vectors 550 and 560. However, a motion vector of a macro block that does not fall within a predetermined search range may not be transmitted to a decoder (not shown). Therefore, the motion vector 550 of the reference macro block 530 may be determined as the motion vector of the current macro block 510.
[66] A method and apparatus for compensating for the motion of a panorama image
according to an embodiment of the present general inventive concept will now be described.
[67] FIG. 11 is a block diagram of a decoding unit that decodes a motion vector of a
panorama image according to an embodiment of the present invention. Referring to
FIG. 11, the decoder includes a variable-length decoder (VLD) 710, an inverse quantizing unit 720, an inverse transforming unit 730, an adding unit 740, a panorama image motion compensating unit 750, a clipping unit 760, and a frame memory 770.
[68] The VLD 710 decodes an input bitstream using a variable-length coding/
decoding method. A motion vector and a residual signal between a macro block and a reference macro block output from the VLD 710 are input to the panorama image motion compensating unit 750 and the inverse quantizing unit 720, respectively.
[69] The frame memory 770 stores a reference panorama image obtained by sequentially
inputting the input bitstream to the inverse quantizing unit 720, the inverse transforming unit 730, and the clipping unit 760. The reference panorama image stored in the frame memory 770 is used for compensation for the motion of a newly input panorama image (current panorama image).
[70] The panorama image motion compensating unit 750 performs motion compensation
using the reference panorama image stored in the frame memory 770. In detail, the panorama image motion compensating unit 750 receives a motion vector of a current macro block of the panorama image from an encoder such as that shown in FIG. 5, reads a reference macro block of a previous frame corresponding to the current macro block in the frame memory 770, and outputs the read reference macro block to the adding unit 740. Then, the adding unit 740 receives the residual signal between the current macro block and the reference macro block, which has been inversely quantized by the inverse quantizing unit 720 and inversely transformed by the inverse transforming unit 730.
[71] The adding unit 740 reproduces the current macro block using the residual signal
between the current macro block and the reference macro block, and the reference macro block input from the panorama image motion compensating unit 750. The clipping unit 760 normalizes the reproduced current macro block output from the adding unit 740.
[72] The operation of the panorama image motion compensating unit 750 will now be
described in greater detail. FIG. 12 is a flowchart illustrating a method of compensating for the motion of a panorama image according to an embodiment of the present general inventive concept.
[73] Referring to FIGS. 11 and 12, the panorama image motion compensating unit 750
receives a motion vector of a current data unit on which motion compensation is to be performed from the VLD 710 (S910). In this embodiment, data units may be 16x16
macro blocks.
[74] Next, the panorama image motion compensating unit 750 determines whether a
reference macro block indicated by the motion vector of the current macro block is present in a reference image (S920). The reference image is stored in the frame memory 770.
[75] When pixels of the reference macro block indicated by the motion vector of the
current macro block are present in the reference image, the values of all pixels of the
reference macro block are read from the frame memory 770 (S950), and the current macro block is reproduced (S960). The adding unit 740 reproduces the current macro block, using the residual signal between the current macro block and the reference macro block output from the inverse transforming unit 730 and the reference macro block output from the panorama image motion compensating unit 750.
[76] However, as illustrated in FIG. 8A or 8B, when some or all of the pixels of the
reference macro block indicated by the motion vector of the current macro block are positioned outside one of left and right borders of the reference image, an image in a predetermined range from the other border of the reference image is padded outside the one of the left and right borders (S930). According to the present invention, as illustrated in FIG. 9, regions outside of the reference image are padded based on the fact that the spatial relation between the right and left borders of a panorama image with a 360 ° omni-directional view is very high.
[77] Next, after padding the reference image in operation S930, the panorama image
motion compensating unit 750 reads the values of all pixels of the reference macro block from the padded reference image from the frame memory 770 (S940).
[78] Alternatively, when the reference image is a cylindrical image obtained by
connecting the left and right borders of the reference image, it is possible to obtain the values of all pixels of the reference data unit from the cylindrical image without padding the reference image. More specifically, the reference image is a 2D plane image such as that shown in FIG. 4B, and the cylindrical image such as that shown in FIG. 4A is obtained by connecting the left and right borders of the 2D plane image. That is, if the reference image is the cylindrical image, the values of all pixels of the reference data unit can be obtained from the cylindrical image.
[79] Lastly, the adding unit 740 reproduces the current macro block using the residual
signal between the current macro block and the reference macro block and the reference macro block input from the panorama image motion compensating unit 750 (S960).
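By way of illustration, the reconstruction carried out for one macro block by the adding unit 740 and the clipping unit 760 may be sketched as follows, reusing the cylindrical fetch sketched earlier (the 8-bit pixel depth and the block-coordinate convention are assumptions, not requirements of the specification):

    import numpy as np

    def reconstruct_block(reference, residual, mv, bx, by, block=16):
        """Fetch the (possibly border-crossing) reference block indicated by
        the motion vector mv for the macro block at block coordinate (bx, by),
        add the decoded residual, and clip the result to the 8-bit pixel range."""
        ref_block = fetch_block_cylindrical(reference,
                                            bx * block + mv[0],
                                            by * block + mv[1],
                                            block, block)
        value = ref_block.astype(np.int32) + residual.astype(np.int32)
        return np.clip(value, 0, 255).astype(np.uint8)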
[80] The present general inventive concept may be embodied as computer readable code
in a computer readable medium. Here, the computer readable medium may be any recording apparatus capable of storing data that is read by a computer system, e.g., a
read-only memory (ROM), a random access memory (RAM), a compact disc (CD)-ROM, a magnetic tape, a floppy disk, an optical data storage device, and so on. Also, the computer readable medium may be a carrier wave that transmits data via the Internet, for example. The computer readable medium can be distributed among computer systems that are interconnected through a network, and the present invention
may be stored and implemented as a computer readable code in the distributed system.
[81] As described above, according to the present general inventive concept, motion
estimation and compensation are performed on a panorama image with a 360 ° omni-directional view based on the fact that the spatial relation between the right and left borders of the panorama image is very high, thereby increasing the efficiency and precision of motion estimation and compensation. Accordingly, it is possible to improve image quality, in particular, the image quality at the right and left borders of the panorama image.
[82] Although a few embodiments of the present general inventive concept have been
shown and described, it will be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the general inventive concept, the scope of which is defined in the appended claims and their equivalents.
Claims
[1] What is claimed is:
1. A method of estimating a motion of a panorama image containing 360 ° omni-directional view information, the method comprising: estimating a motion vector of a current data unit of the panorama image using motion vectors of a plurality of previous reference data units adjacent to the current data unit;
when one or more pixels of one of the reference data units indicated by the estimated motion vector are present outside one of left and right borders of the reference image, padding the reference image in a predetermined range from the other border of the reference image outside the one of the left and right borders; obtaining values of all pixels of the one of the reference data units from the padded reference image; and
determining a similarity between the current data unit and the reference data unit using a predetermined evaluation function.
[2] 2. The method of claim 1, wherein when the one or more of the plurality of the
previous data units is present outside one of the left and right borders of the panorama image, the estimating of the motion vector of the current data unit comprises:
determining the plurality of the previous data units from a cylindrical image which is obtained by connecting the left and right borders of the reference image when the reference image is the cylindrical image.
[3] 3. The method of claim 1, wherein the plurality of the previous data units
comprise:
a first data unit disposed adjacent to a position corresponding to a left side of the current data unit;
a second data unit disposed adjacent to a position corresponding to a top of the current data unit;
a third data unit disposed adjacent to a right side of the second data unit; and a fourth data unit disposed adjacent to both the first and second data units.
[4] 4. The method of claim 1, further comprising:
determining the one of the reference data units which is the most similar to the
current data unit in a predetermined search range; and
determining the motion vector representing the determined reference data unit.
[5] 5. A method of estimating a motion of a panorama image containing 360 ° omni-
directional view information, the method comprising: estimating a motion vector of a current data unit of the panorama image, using
motion vectors of a plurality of previous reference data units adjacent to the current data unit;
when one or more pixels of one of the reference data units indicated by the estimated motion vector are present outside one of left and right borders of the reference image, obtaining values of all pixels of one of the reference data units of the reference image from a cylindrical image which is obtained by connecting the left and right borders of the reference image when the reference image is the cylindrical image; and
determining a similarity between the current data unit and the reference data unit using a predetermined evaluation function.
[6] 6. The method of claim 5, wherein when at least one of the plurality of the
previous reference data units is present outside one of the left and right borders
of the panorama image, the estimating of the motion vector of the current data
unit comprises:
determining the plurality of the previous reference data units from a cylindrical image which is obtained by connecting the left and right borders of the reference image when the panorama image is the cylindrical image.
[7] 7. The method of claim 5, wherein the plurality of the previous data units
comprise:
a first data unit disposed adjacent to a position corresponding to a left side of the
current data unit;
a second data unit disposed adjacent to a position corresponding to a top of the
current data unit;
a third data unit disposed adjacent to a right side of the second data unit; and
a fourth data unit disposed adjacent to both the first and second data units.
[8] 8. The method of claim 5, further comprising:
determining the one of the reference data units which is the most similar to the
current data unit in a predetermined search range; and
determining the motion vector representing the determined reference data
unit.
[9] 9. An apparatus to compensate for a motion of a panorama image containing 360
° omni-directional view information, the apparatus comprising:
a memory to store a reference image to be used for motion estimation of a panorama image, and motion vectors of a plurality of previous reference data
units adjacent to a current data unit of the panorama image; and
a motion estimating unit to estimate a motion vector of the current data unit
using the motion vectors of the plurality of the reference data units, when one or
more pixels of one of the reference data units indicated by the estimated motion
vector are present outside one of left and right borders of the reference image, to pad the reference image in a predetermined range from the other border of the reference image outside the one of the left and right borders, to obtain values of all pixels of the reference data unit from the padded reference image, and to determine a similarity between the current data unit and the reference data unit using a predetermined evaluation function.
[10] 10. The apparatus of claim 9, wherein when the one of the plurality of the
reference data units is present outside one of the left and right borders of the panorama image, the motion estimating unit determines the plurality of the reference data units from a cylindrical image which is obtained by connecting the left and right borders of the panorama image when the panorama image is the cylindrical image.
[11] 11. The apparatus of claim 9, wherein the plurality of the reference data units
comprise:
a first data unit disposed adjacent to a position corresponding to a left side of the current data unit;
a second data unit disposed adjacent to a position corresponding to a top of the current data unit;
a third data unit disposed adjacent to a right side of the second data unit; and a fourth data unit disposed adjacent to both the first and second data units.
[12] 12. The apparatus of claim 9, wherein the motion estimating unit determines
the one of the reference data units which is the most similar to the current data unit in a predetermined search range, and determines the motion vector representing the determined reference data unit.
[13] 13. An apparatus for estimating a motion of a panorama image containing 360 °
omni-directional view information, the apparatus comprising: a memory to store a reference image to be used for motion estimation of a panorama image, and motion vectors of a plurality of previous reference data units adjacent to a current data unit of the panorama image; and a motion estimating unit to estimate a motion vector of the current data unit using the motion vectors of the plurality of the reference data units, when one or more pixels of the one of the reference data units indicated by the estimated motion vector are present outside one of left and right borders of the reference image, to obtain values of all pixels of the one of the reference data units from a cylindrical image obtained by connecting the left and right borders of the reference image when the reference image is the cylindrical image, and to determine a similarity between the current data unit and the reference data unit using a predetermined evaluation function.
[14] 14. The apparatus of claim 13, wherein when the one of the plurality of the
reference data units is present outside one of the left and right borders of the
panorama image, the motion estimating unit determines the plurality of the reference data units from a cylindrical image obtained by connecting the left and right borders of the panorama image when the panorama image is the cylindrical image.
[15] 15. The apparatus of claim 13, wherein the plurality of the reference data units
comprise:
a first data unit disposed adjacent to a position corresponding to a left side of the current data unit;
a second data unit disposed adjacent to a position corresponding to a top of the current data unit;
a third data unit disposed adjacent to a right side of the second data unit; and a fourth data unit disposed adjacent to both the first and second data units.
[16] 16. The apparatus of claim 13, wherein the motion estimating unit determines
one of the reference data units which is the most similar to the current data unit in a predetermined search range, and determines the motion vector representing the determined reference data unit.
[17] 17. A method of compensating for a motion of a panorama image containing 360
° omni-directional view information, the method comprising: receiving a motion vector of a current data unit of a panorama image; when one or more pixels of one of reference data units of a panorama reference image indicated by the motion vector of the current data unit are present outside one of left and right borders of the reference image, padding an image in a predetermined range from the other border of the reference image outside the one of the left and right borders of the reference image;
obtaining values of all pixels of the reference data unit from the padded reference image; and
reproducing the current data unit using the values of the pixels of the reference data unit.
[18] 18. A method of compensating for a motion of a panorama image containing 360
° omni-directional view information, the method comprising: receiving a motion vector of a current data unit of the panorama image; when one or more pixels of a reference data unit of a reference image indicated by the motion vector of the current data unit are present outside one of left and right borders of the reference image, obtaining values of all pixels of the one of the reference data units from a cylindrical image which is obtained by connecting the left and right borders of the reference image when the reference image is the
cylindrical image; and
reproducing the current data unit using the values of the pixels of the one of the
reference data units.
[19] 19. An apparatus to compensate for a motion of a panorama image containing
360 ° omni-directional view information, the apparatus comprising: a memory to store a reference image to be used for motion estimation of a panorama image; and
a motion compensating unit to receive a motion vector of a current data unit of the panorama image, when one or more pixels of one of reference data units of the reference image indicated by the motion vector of the current data unit are present outside one of left and right borders of the reference image, to pad the reference image in a predetermined range from the other border of the reference image outside the one of the left and right borders of the reference image, to obtain values of all pixels of the one of the reference data units from the padded reference image, and to reproduce the current data unit using the values of the pixels of the reference data unit.
[20] 20. An apparatus to compensate for the motion of a panorama image containing 360 ° omni-directional view information, the apparatus comprising:
a memory to store a reference image to be used for motion estimation of a current panorama image; and
a motion compensating unit to receive a motion vector of a current data unit of the panorama image, when one or more pixels of one of reference data units of the reference image indicated by the motion vector of the current data unit are present outside one of left and right borders of the reference image, to obtain values of all pixels of the one of the reference data units from a cylindrical image which is obtained by connecting the left and right borders of the reference image when the reference image is the cylindrical image, and to reproduce the current data unit using the values of the pixels of the reference data unit.
[21] 21. A computer readable medium having embodied thereon a program for executing a method of estimating a motion vector of a panorama image containing 360 ° omni-directional view information, the method comprising:
estimating a motion vector of a current data unit of a current panorama image using motion vectors of a plurality of previous reference data units adjacent to the current data unit;
when one or more pixels of one of reference data units of a reference image indicated by the estimated motion vector are present outside one of left and right borders of the reference image, padding the reference image in a predetermined range from the other border of the reference image outside the one of the left and right borders;
obtaining values of all pixels of the one of the reference data units from the padded reference image; and
determining a similarity between the current data unit and the reference data unit using a predetermined evaluation function.
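Illustrative sketch (not part of the claims): the program steps of claim 21 (padding, reading the candidate pixels from the padded image, and evaluating similarity) could be composed as below, with the sum of absolute differences standing in for the predetermined evaluation function; the names and the padding width are illustrative assumptions.

    import numpy as np

    def evaluate_padded_candidate(cur_block: np.ndarray, ref_frame: np.ndarray,
                                  top: int, left: int, pad: int = 16) -> int:
        """Pad the reference frame from the opposite borders, read the candidate
        reference block whose top-left corner is (top, left) in original image
        coordinates (left may be negative or exceed the width), and return its
        SAD against the current block."""
        padded = np.concatenate(
            [ref_frame[:, -pad:], ref_frame, ref_frame[:, :pad]], axis=1)
        bh, bw = cur_block.shape[:2]
        candidate = padded[top:top + bh, left + pad:left + pad + bw]
        return int(np.abs(candidate.astype(np.int64)
                          - cur_block.astype(np.int64)).sum())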
[22] 22. A computer readable medium having embodied thereon a program for executing a method of estimating the motion of a panorama image containing 360 ° omni-directional view information, the method comprising:
estimating a motion vector of a current data unit of a current panorama image using motion vectors of a plurality of previous reference data units adjacent to the current data unit;
when one or more pixels of one of reference data units of a reference image indicated by the estimated motion vector are present outside one of left and right borders of the reference image, obtaining values of all pixels of the one of the reference data units from a cylindrical image which is obtained by connecting the left and right borders of the reference image when the reference image is the cylindrical image; and
determining a similarity between the current data unit and the reference data unit using a predetermined evaluation function.
[23] 23. A computer readable medium having embodied thereon a program for executing a method of compensating for a motion of a panorama image containing 360 ° omni-directional view information, the method comprising:
receiving a motion vector of a current data unit of a current panorama image;
when one or more pixels of one of reference data units of a reference image indicated by the motion vector of the current data unit are present outside one of left and right borders of the reference image, padding the reference image in a predetermined range from the other border of the reference image outside the one of the left and right borders of the reference image;
obtaining values of all pixels of the reference data unit from the padded reference image; and
reproducing the current data unit using the values of the pixels of the reference data unit.
[24] 24. A computer readable medium having embodied thereon a program for executing a method of compensating for a motion of a panorama image containing 360 ° omni-directional view information, the method comprising:
receiving a motion vector of a current data unit of a panorama image;
when one or more pixels of one of reference data units of a reference image indicated by the motion vector of the current data unit are present outside one of left and right borders of the reference image, obtaining values of all pixels of the reference data unit from a cylindrical image which is obtained by connecting the left and right borders of the reference image when the reference image is the cylindrical image; and
reproducing the current data unit using the values of the pixels of the reference data unit.
[25] 25. An apparatus to estimate a motion vector of a panorama image containing 360 ° omni-directional view information, the apparatus comprising:
a memory to store a reference image having first and second borders and first and second reference data units disposed adjacent to the first border and the second border, respectively, within the reference image; and
a motion estimating unit to receive a current data unit of a current image and the reference data units of the reference image from the memory, and to estimate a motion vector of the current data unit using one of the first and second reference data units of the reference image which is not included in a search area when the other one of the first and second reference data units is included in the search area.
[26] 26. The apparatus of claim 25, wherein the reference image comprises a cylindrical image formed when the first and second borders are connected, and the first and second reference data units comprise first and second macro blocks, respectively, having a spatial relationship therebetween and disposed adjacent to each other in the cylindrical image.
[27] 27. The apparatus of claim 25, wherein the reference image and the current image comprise panorama images, and one of the first and second reference data units is disposed outside of a searching area of the motion vector of the current data unit while the other one of the first and second reference data units is disposed within the searching area.
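Illustrative sketch (not part of the claims): claim 27 describes a situation in which one of the two border-adjacent reference data units lies outside the nominal search area while the other lies inside it; on a conceptually cylindrical panorama, the search window can simply be extracted with horizontal wraparound, so reference data adjacent to the opposite border also becomes reachable. The names and sizes below are assumptions.

    import numpy as np

    def wrapped_search_window(ref_frame: np.ndarray, top: int, left: int,
                              block_size: int = 16, search_range: int = 16) -> np.ndarray:
        """Extract the search window surrounding the block at (top, left),
        wrapping horizontally so that a window anchored near one border also
        covers reference data adjacent to the other border."""
        h, w = ref_frame.shape[:2]
        rows = np.clip(np.arange(top - search_range,
                                 top + block_size + search_range), 0, h - 1)
        cols = np.arange(left - search_range,
                         left + block_size + search_range) % w
        return ref_frame[np.ix_(rows, cols)]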
[28] 28. The apparatus of claim 25, further comprising:
a panorama image motion compensating unit to generate a reference macro block according to the motion vector and the reference image; and
an encoding unit to generate a signal corresponding to the reference image according to the reference macro block and the current image.
[29] 29. The apparatus of claim 28, further comprising:
a second unit to generate the motion vector according to a coded signal corresponding to the quantized transform coefficients, and to generate a residual signal according to the coded signal;
a second panorama image motion compensating unit to generate the reference macro block according to the motion vector; and
a third unit to generate the current image according to the reference macro block and the residual signal.
[30] 30. An apparatus to generate a panorama image containing 360 ° omni-directional view information, the apparatus comprising:
a decoding unit to decode a bitstream having data corresponding to a current image and a reference image, and to generate a motion vector of a current data unit of the current image to correspond to a search area of the reference image which includes a first reference data unit disposed on a first border of the reference image;
a panorama image motion compensating unit to generate a reference macro block of the first reference data unit of the reference image using a second reference data unit disposed on a second border of the reference image which is not included in the search area according to the motion vector; and
an output unit to generate the current image according to the reference macro block and data corresponding to the decoded bitstream.
[31] 31. An apparatus to estimate a motion vector of a panorama image containing 360 ° omni-directional view information, the apparatus comprising:
an encoder comprising:
a memory to store a reference image having first and second borders and first and second reference data units disposed adjacent to the first border and the second border, respectively, within the reference image,
a motion estimating unit to receive a current data unit of a current image and the reference data units of the reference image from the memory, and to estimate a motion vector of the current data unit using one of the first and second reference data units of the reference image which is not included in a search area when the other one of the first and second reference data units is included in the search area,
a panorama image motion compensating unit to generate a reference macro block according to the motion vector and the reference image, and
a coding unit to generate a bitstream according to the current image and the reference macro block; and
a decoder comprising:
a decoding unit to decode the bitstream having data corresponding to the current
image and the reference image, and to generate the motion vector of the current
data unit of the current image to correspond to the search area of the reference image which includes the first reference data unit disposed on the first border of the reference image,
a second panorama image motion compensating unit to generate the reference macro block of the first reference data unit of the reference image using the second reference data unit disposed on the second border of the reference image which is not included in the search area according to the motion vector, and
an output unit to generate the current image according to the reference macro block and data corresponding to the decoded bitstream.
[32] 32. A method of estimating a motion vector of a panorama image containing 360 ° omni-directional view information, the method comprising:
storing a reference image having first and second borders and first and second reference data units disposed adjacent to the first border and the second border, respectively, within the reference image; and
receiving a current data unit of a current image and the reference data units of the reference image from the memory, and estimating a motion vector of the current data unit using one of the first and second reference data units of the reference image which is not included in a search area when the other one of the first and second reference data units is included in the search area.
[33] 33. The method of claim 32, further comprising:
generating a reference macro block according to the motion vector and the reference image; and
generating the reference image according to the reference macro block and the current image.
[34] 34. A method of generating a panorama image containing 360 ° omni-directional view information, the method comprising:
decoding a bitstream having data corresponding to a current image and a reference image, and generating a motion vector of a current data unit of the current image to correspond to a search area of the reference image which includes a first reference data unit disposed on a first border of the reference image;
generating a reference macro block of the first reference data unit of the reference image using a second reference data unit disposed on a second border of the reference image which is not included in the search area according to the motion vector; and
generating the current image according to the reference macro block and data corresponding to the decoded bitstream.
[35] 35. A method of estimating a motion vector of a panorama image containing 360 ° omni-directional view information, the method comprising:
storing a reference image having first and second borders and first and second
reference data units disposed adjacent to the first border and the second border,
respectively, within the reference image;
receiving a current data unit of a current image and the reference data units of the
reference image from the memory, and estimating a motion vector of the current
data unit using one of the first and second reference data units of the reference
image which is not included in a search area when the other one of the first and
second reference data units is included in the search area;
generating a reference macro block according to the motion vector and the
reference image;
generating a bitstream according to the current image and the reference macro
block;
decoding the bitstream having data corresponding to the current image and the
reference image, and generating the motion vector of the current data unit of the
current image to correspond to the search area of the reference image which
includes the first reference data unit disposed on the first border of the reference
image;
generating the reference macro block of the first reference data unit of the
reference image using the second reference data unit disposed on the second
border of the reference image which is not included in the search area according
to the motion vector, and
generating the current image according to the reference macro block and data
corresponding to the decoded bitstream.
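Illustrative sketch (not part of the claims): claim 35 strings the encoder-side and decoder-side steps together; the round trip below mirrors that flow for a single data unit, with the transform, quantization and entropy-coding stages omitted so that the reconstruction is exact. Every name and the choice of SAD as the matching criterion are assumptions made only for this sketch.

    import numpy as np

    def encode_decode_one_block(cur_block: np.ndarray, ref_frame: np.ndarray,
                                top: int, left: int, search_range: int = 8):
        """Round trip for one data unit: wrap-aware motion search on the encoder
        side, then wrap-aware motion compensation plus residual addition on the
        decoder side."""
        h, w = ref_frame.shape[:2]
        bh, bw = cur_block.shape[:2]
        cur = cur_block.astype(np.int64)

        def fetch(d):                                   # cylindrical block fetch
            dy, dx = d
            rows = np.clip(np.arange(top + dy, top + dy + bh), 0, h - 1)
            cols = np.arange(left + dx, left + dx + bw) % w
            return ref_frame[np.ix_(rows, cols)].astype(np.int64)

        # Encoder: motion estimation over the search range, then the residual
        # that would normally be transformed, quantized and entropy coded.
        displacements = [(dy, dx) for dy in range(-search_range, search_range + 1)
                                  for dx in range(-search_range, search_range + 1)]
        mv = min(displacements, key=lambda d: np.abs(fetch(d) - cur).sum())
        residual = cur - fetch(mv)

        # Decoder: regenerate the same reference macro block from the motion
        # vector and add the residual back to reproduce the current data unit.
        reconstructed = fetch(mv) + residual
        return mv, reconstructed.astype(cur_block.dtype)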


Dated this 03rd day of February, 2007



Patent Number: 225797
Indian Patent Application Number: 183/MUMNP/2007
PG Journal Number: 07/2009
Publication Date: 13-Feb-2009
Grant Date: 02-Dec-2008
Date of Filing: 05-Feb-2007
Name of Patentee: INDUSTRY ACADEMIC COOPERATION FOUNDATION KYUNGHEE UNIVERSITY
Applicant Address: 1, Seochun-ri, Kiheung-eup, Yongin-si, Gyeonggi-do
Inventors:
1. PARK, GWANG-HOON - B-302 Donga Villa, 45 Bundang-dong, Bundang-gu, Seongnam-si, Gyeonggi-do
2. SON, SUNG-HO - (401) 7-96 Gwangmyeong 1-dong, Gwangmyeong-si
PCT International Classification Number: H04N7/32
PCT International Application Number: PCT/KR2005/002639
PCT International Filing Date: 2005-08-12
PCT Conventions:
1. PCT Application Number: 60/601,137; Date of Convention Priority: 2004-08-13; Country: U.S.A.
2. PCT Application Number: 10-2004-0081353; Date of Convention Priority: 2004-10-12; Country: Republic of Korea