Embodiment 1
[0045]FIG. 1 illustrates a luminance measurement device according to the present embodiment. This luminance measurement device 1 measures the luminance of all pixels of an organic EL panel 2 (each pixel of the organic EL panel 2 is made up of R, G, and B subpixels, and it is assumed in the present invention that “pixel” encompasses such subpixels), and includes a control unit 3, a calculation unit 4, and a storage unit 5. The control unit 3 controls display of the organic EL panel 2 via a pattern generation device 6, and controls imaging by a monochrome solid-state imaging camera 7 that is disposed facing the organic EL panel 2. The calculation unit 4 performs various types of calculation based on the image that was imaged by the camera 7 or the like, and the storage unit 5 stores the imaging result of the camera 7 and the calculation results of the calculation unit 4.
[0046]As illustrated in FIG. 2, when the luminance measurement device 1 measures the luminance of pixels of the organic EL panel 2 while the pixels are being displayed in red, the control unit 3 first instructs the pattern generation device 6 to display an alignment pattern PA shown in FIG. 3 on the organic EL panel 2 (step 1 (in the drawing, this step is denoted with “S. 1”; hereinafter, the same applies to other steps)). The alignment pattern PA is a pattern in which red square dots D are arranged in a matrix, obtained by turning on predetermined pixels (R subpixels) located at specific positions on the organic EL panel 2.
[0047]The control unit 3 causes the camera 7 to image the organic EL panel 2 on which the alignment pattern PA is shown (step 2), and the calculation unit 4 detects, based on this imaged image, on which picture elements on an imaging surface of the camera 7 the image of the dots D is shown, and obtains a positional correspondence relationship between the pixels of the organic EL panel 2 and the picture elements of the camera 7 (step 3), and causes the storage unit 5 to store the correspondence relationship (step 4).
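The detection in step 3 can be sketched as follows, assuming the dot images D appear as well-separated bright blobs in the captured frame; the numpy/scipy helpers, the blob threshold, and the simple raster-order pairing are illustrative assumptions, not details of the embodiment.

```python
import numpy as np
from scipy import ndimage

def dot_centroids(frame, threshold=None):
    """Locate the dot images D of the alignment pattern PA in a captured frame
    and return their centroids (row, col) on the imaging surface, ordered
    top-to-bottom, left-to-right."""
    if threshold is None:
        threshold = 0.5 * frame.max()
    labels, n = ndimage.label(frame > threshold)
    centroids = ndimage.center_of_mass(frame, labels, list(range(1, n + 1)))
    # Coarse row bucketing (10 picture elements) keeps dots of the same
    # matrix row together even if their centroid rows differ slightly.
    return sorted(centroids, key=lambda rc: (round(rc[0] / 10.0), rc[1]))

def pixel_to_picture_element(frame, panel_dot_pixels):
    """Step 3 (sketch): pair each turned-on panel pixel (row, col) of the
    alignment pattern with the picture element on which its dot image is
    centered.  `panel_dot_pixels` must be listed in the same raster order in
    which the dots appear on the imaging surface."""
    centroids = dot_centroids(frame)
    assert len(centroids) == len(panel_dot_pixels)
    return {pix: (int(round(r)), int(round(c)))
            for pix, (r, c) in zip(panel_dot_pixels, centroids)}
```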
[0048]Next, the control unit 3 instructs the pattern generation device 6 to display an exposure factor measurement pattern PK1 shown in FIG. 4 on the organic EL panel 2 (step 5). The exposure factor measurement pattern PK1 is a pattern obtained by turning on the upper-left-most pixel of the organic EL panel 2, every fourth pixel rightward of that pixel, and every fourth pixel downward of those pixels, and the control unit 3 causes the camera 7 to defocus and image the exposure factor measurement pattern PK1 (step 6). Although the shape and size of pixel images on the imaging surface of the camera 7 vary depending on their positions on the imaging surface, it is here assumed that a pixel image generally ranges over 4×4 picture elements of the camera 7; for example, as shown in FIG. 5, a pixel image I ranges over picture elements 9a to 9p on the imaging surface 8 of the camera 7. Also, adjacent pixel images do not overlap each other on the imaging surface of the camera 7.
[0049]The calculation unit 4 obtains the luminance of the entire pixel image on the basis of the image imaged in step 6 (step 7). For example, the luminance of the entire pixel image I can be obtained from the outputs of the picture elements 9a to 9p. The obtained luminance of the entire pixel image of each turned-on pixel constituting the exposure factor measurement pattern PK1 is stored in the storage unit 5 (step 8).
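For instance, the luminance of the entire pixel image I in FIG. 5 can be taken as the sum of the outputs of the picture elements 9a to 9p. A minimal sketch under that reading (the array representation of the captured frame and the fixed 4×4 footprint are assumptions):

```python
import numpy as np

def whole_pixel_image_luminance(defocused_frame, top_left, footprint=(4, 4)):
    """Steps 6-7 (sketch): luminance of the entire (defocused) pixel image,
    taken as the sum of the outputs of every picture element it ranges over.
    `top_left` is the (row, col) of the upper-left picture element of the
    image (9a in FIG. 5); `footprint` is its extent on the imaging surface."""
    r, c = top_left
    h, w = footprint
    return float(defocused_frame[r:r + h, c:c + w].sum())
```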
[0050]Subsequently, the control unit 3 causes the camera 7 to focus and image the exposure factor measurement pattern PK1 (step 9). As described above, the shape and size of pixel images on the imaging surface of the camera 7 differ depending on the position of the pixel image on the imaging surface, but it is here assumed that the pixel image generally ranges over 3×3 picture elements of the camera 7. For example, as shown in FIG. 6, the pixel image I ranges over the picture elements 9a to 9c, 9e to 9g, and 9i to 9k on the imaging surface 8 of the camera 7. That is, the pixel image I is reduced in size compared to the case where it is defocused and imaged (FIG. 5) (in other words, the pixels are enlarged and imaged in step 6, in which the pixel image ranges over a larger number of picture elements than in step 9), and, at this time, adjacent pixel images of course do not overlap each other.
[0051]The calculation unit 4 obtains the luminance of the central part of the pixel image on the basis of the image imaged in step 9 (step 10), causes the storage unit 5 to store that obtained luminance (step 11), obtains the luminances in the eight peripheral parts of the pixel image (step 12), and causes the storage unit 5 to store those obtained luminances (step 13). Here, the luminance of the central part of the pixel image is assumed to be a luminance that depends on the output of the picture element at which the centroid of the pixel image is located, and the luminances in the eight peripheral parts of the pixel image are assumed to be luminances that depend on the outputs of the eight picture elements adjacent to the picture element at which the centroid is located. Therefore, in FIG. 6, the luminance of the central part of the pixel image I is obtained from the output of the picture element 9f, and the luminances in the first to eighth peripheral parts of the pixel image I are obtained from the outputs of the picture elements 9a, 9b, 9c, 9e, 9g, 9i, 9j, and 9k, respectively.
[0052]Moreover, the calculation unit 4 calculates the central exposure factor by dividing the luminance of the central part of the pixel image that was obtained in step 10 by the luminance of the entire pixel image that was obtained in step 7 (step 14), causes the storage unit 5 to store the calculated central exposure factor (step 15), calculates eight peripheral exposure factors by dividing the luminance in each of the eight peripheral parts of the pixel image that were obtained in step 12 by the luminance of the entire pixel image that was obtained in step 7 (step 16), and causes the storage unit 5 to store the calculated eight peripheral exposure factors (step 17).
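A minimal sketch of steps 10 to 16 for a single pixel image, assuming the focused image occupies the 3×3 block of picture elements centered on the picture element at which its centroid lies (9f in FIG. 6); the function name and the ordering of the eight peripheral values are illustrative.

```python
import numpy as np

def exposure_factors(focused_frame, center, whole_luminance):
    """Central and eight peripheral exposure factors of one pixel image.
    `center` is the (row, col) of the picture element on which the centroid
    of the pixel image lies (9f in FIG. 6); `whole_luminance` is the value
    obtained from the defocused image in step 7."""
    r, c = center
    block = focused_frame[r - 1:r + 2, c - 1:c + 2].astype(float)
    central_factor = block[1, 1] / whole_luminance                      # steps 10, 14
    # Eight peripheral values ordered 9a, 9b, 9c, 9e, 9g, 9i, 9j, 9k.
    peripheral_factors = np.delete(block.ravel(), 4) / whole_luminance  # steps 12, 16
    return central_factor, peripheral_factors  # nine exposure factors in total
```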
[0053]Then, the luminance measurement device 1 performs processing similar to that of steps 5 to 15 on exposure factor measurement patterns PK2 to PK16. The exposure factor measurement patterns PK2, PK3, and PK4 are obtained by turning on pixels that are respectively one, two, and three pixels rightward of the pixels turned on in the exposure factor measurement pattern PK1; the exposure factor measurement patterns PK5, PK6, PK7, and PK8 are obtained by turning on pixels that are one pixel downward of the pixels turned on in the exposure factor measurement patterns PK1, PK2, PK3, and PK4, respectively; the exposure factor measurement patterns PK9, PK10, PK11, and PK12 are obtained by turning on pixels that are two pixels downward of the pixels turned on in the exposure factor measurement patterns PK1, PK2, PK3, and PK4, respectively; and the exposure factor measurement patterns PK13, PK14, PK15, and PK16 are obtained by turning on pixels that are three pixels downward of the pixels turned on in the exposure factor measurement patterns PK1, PK2, PK3, and PK4, respectively. The luminance measurement device 1 then calculates, for each pixel, a central exposure factor and eight peripheral exposure factors (hereinafter collectively referred to as “exposure factors”), and causes the storage unit 5 to store the calculated exposure factors (step 18). Accordingly, nine exposure factors in total are calculated for every pixel of the organic EL panel 2.
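Because every pixel of the panel is turned on in exactly one of the sixteen patterns, PK1 to PK16 can be generated by shifting the 4-pixel-pitch grid of PK1 by zero to three pixels rightward and downward. A small sketch, assuming the patterns are represented as boolean masks over the panel (the mask representation and the panel dimensions are assumptions):

```python
import numpy as np

def exposure_factor_patterns(panel_rows, panel_cols, pitch=4):
    """Yield boolean masks for PK1 ... PK16 in the order of the embodiment:
    PK1 turns on the upper-left pixel and every fourth pixel rightward and
    downward of it; the following patterns shift that grid by (dy, dx)
    pixels with dy, dx in 0..3."""
    for dy in range(pitch):          # 0..3 pixels downward
        for dx in range(pitch):      # 0..3 pixels rightward
            mask = np.zeros((panel_rows, panel_cols), dtype=bool)
            mask[dy::pitch, dx::pitch] = True
            yield mask
```

Called with pitch=2, the same routine would yield the four luminance measurement patterns PB1 to PB4 described later, in which every other pixel is turned on.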
[0054]Meanwhile, the exposure factor varies, due to lens aberration of the camera 7 or the like, depending on the position of the pixel image on the imaging surface of the camera 7 (the position of the picture element at which the centroid of the pixel image is located) and its phase (the displacement between the centroid and the center of the picture element on which the centroid is located). That is, the exposure factor is determined by four parameters: the positions δx and δy of the pixel image and the phases αx and αy. However, the exposure factors of the individual pixels obtained in steps 5 to 18 cannot readily be used as they are in step 23 described later, or the like, since the respective pixels have different positions and phases. Therefore, the calculation unit 4 calculates, on the basis of the obtained exposure factors of the pixels, exposure factors for representative positions and phases for each area of the organic EL panel 2 (step 19), and causes the storage unit 5 to store the calculated exposure factors in an exposure factor table (step 20). The exposure factors for representative positions and phases are calculated for each area of the organic EL panel 2 for the following reason. The shape of a pixel image is substantially circular, elliptical, or the like, depending on the area of the organic EL panel 2 (that is, depending on the area of the imaging surface of the camera 7). Accordingly, if exposure factors are prepared for a number of representative positions and phases in the areas where the pixel image is substantially circular, and likewise in the areas where the pixel image is substantially elliptical, the exposure factor of a substantially circular pixel image can be obtained when needed by interpolating the representative exposure factors of the former areas, and the exposure factor of a substantially elliptical pixel image can be obtained when needed by interpolating the representative exposure factors of the latter areas.
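One possible realization of the table of steps 19 and 20 is to store, for each area of the organic EL panel 2, the nine exposure factors at a grid of representative positions and phases and to interpolate between them whenever the factors for an arbitrary position and phase are needed. The sketch below uses scipy's regular-grid interpolation over the four parameters; the class name, grid layout, and parameter names are assumptions rather than details of the embodiment.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

class ExposureFactorTable:
    """Exposure factors of one area of the panel as a function of the
    pixel-image position (x, y on the imaging surface) and phase
    (ax, ay: displacement of the centroid from the center of its picture
    element), stored at representative grid points and interpolated."""

    def __init__(self, x_grid, y_grid, ax_grid, ay_grid, factors):
        # `factors` has shape (len(x_grid), len(y_grid), len(ax_grid),
        # len(ay_grid), 9): the central factor plus the eight peripheral
        # factors measured at each representative position and phase.
        self._interps = [
            RegularGridInterpolator(
                (x_grid, y_grid, ax_grid, ay_grid), factors[..., i])
            for i in range(factors.shape[-1])
        ]

    def factors_at(self, x, y, ax, ay):
        """Nine interpolated exposure factors for an arbitrary pixel image
        belonging to this area."""
        point = np.array([[x, y, ax, ay]])
        return np.array([float(f(point)[0]) for f in self._interps])
```

A separate table of this kind would be held for each area, for example for the areas in which the pixel image is substantially circular and for those in which it is substantially elliptical.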
[0055]When the exposure factor table has been obtained, the control unit 3 instructs the pattern generation device 6 to display a luminance measurement pattern PB1 shown in FIG. 7 on the organic EL panel 2 (step 21). The luminance measurement pattern PB1 is a red image shown on the organic EL panel 2, obtained by simultaneously turning on the upper-left-most pixel of the organic EL panel 2 (R subpixel), every other pixel (R subpixel) moving rightward from that pixel, and every other pixel (R subpixel) moving downward from these pixels; with this pattern, adjacent pixel images on the imaging surface of the camera 7 overlap each other when the organic EL panel 2 has a high resolution.
[0056]The control unit 3 causes the camera 7 to image the organic EL panel 2 on which the luminance measurement pattern PB1 is shown (step 22), and the calculation unit 4 calculates, based on this imaged image, the luminance of the pixels constituting the luminance measurement pattern PB1 (step 23), and causes the storage unit 5 to store the calculated luminance (step 24).
[0057]Specifically, based on the correspondence relationship between the pixels of the organic EL panel 2 and the picture elements of the camera 7 that was obtained in step 3, and the exposure factor table that was generated in steps 19 and 20, the calculation unit 4 can recognize which pixel images of the organic EL panel 2 are shown on which picture elements of the camera 7. That is, since it is clear from the correspondence relationship that an arbitrary pixel image of the organic EL panel 2 is shown centering on a given picture element of the camera 7, and it is also clear from the exposure factor table over which picture elements on the periphery of that “given picture element” the “arbitrary pixel image” ranges, the calculation unit 4 can know how much each picture element of the camera 7 is influenced by any given pixel of the organic EL panel 2. Also, the output of each picture element of the camera 7 (the amount of light received by that picture element) is known from the measurement, whereas the luminance of each pixel of the organic EL panel 2 constituting the luminance measurement pattern PB1 is unknown. As shown in FIG. 8, for example, when the picture element 9f of the camera 7 corresponds to the central part of the pixel image I5 and to a peripheral part of each of the pixel images I1 to I4 and I6 to I9, the following equation holds:
B9f = k9·X1 + k8·X2 + k7·X3 + k6·X4 + k5·X5 + k4·X6 + k3·X7 + k2·X8 + k1·X9
[0058]where B9f denotes the luminance that corresponds to the output of the picture element 9f,
[0059]X5 denotes the luminance of the pixel image I5, and k5 denotes the central exposure factor of the pixel image I5,
[0060]X1 denotes the luminance of the pixel image I1, and k9 denotes the exposure factor of the peripheral part of the pixel image I1 that is located on the picture element 9f,
[0061]X2 denotes the luminance of the pixel image I2, and k8 denotes the exposure factor of the peripheral part of the pixel image I2 that is located on the picture element 9f,
[0062]X3 denotes the luminance of the pixel image I3, and k7 denotes the exposure factor of the peripheral part of the pixel image I3 that is located on the picture element 9f,
[0063]X4 denotes the luminance of the pixel image I4, and k6 denotes the exposure factor of the peripheral part of the pixel image I4 that is located on the picture element 9f,
[0064]X6 denotes the luminance of the pixel image I6, and k4 denotes the exposure factor of the peripheral part of the pixel image I6 that is located on the picture element 9f,
[0065]X7 denotes the luminance of the pixel image I7, and k3 denotes the exposure factor of the peripheral part of the pixel image I7 that is located on the picture element 9f,
[0066]X8 denotes the luminance of the pixel image I8, and k2 denotes the exposure factor of the peripheral part of the pixel image I8 that is located on the picture element 9f, and
[0067]X9 denotes the luminance of the pixel image I9, and k1 denotes the exposure factor of the peripheral part of the pixel image I9 that is located on the picture element 9f.
[0068]Since such an equation also holds for the other picture elements, the calculation unit 4 can obtain the luminance of the pixels constituting the luminance measurement pattern PB1 by solving the simultaneous linear equations constituted by these equations for the unknowns X.
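A compact sketch of step 23 under the following assumptions: the picture element on which each turned-on pixel image is centered is known from the step-3 correspondence, the nine exposure factors of each pixel come from the exposure factor table, and the unknown luminances X are recovered by solving the resulting sparse system in the least-squares sense. The function name, the dense 3×3 neighbour layout, and the choice of solver are illustrative.

```python
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import lsqr

def solve_pixel_luminances(frame, centers, factors_of):
    """Step 23 (sketch): recover the luminance X of each turned-on pixel of
    a luminance measurement pattern.

    frame       -- captured image (outputs of the picture elements)
    centers     -- (row, col) of the picture element on which the centroid
                   of each turned-on pixel image lies (from step 3)
    factors_of  -- factors_of(j) returns a 3x3 array of the j-th pixel's
                   exposure factors, with the central factor at [1, 1]
    """
    n_rows, n_cols = frame.shape
    n_pixels = len(centers)
    A = lil_matrix((n_rows * n_cols, n_pixels))
    b = frame.astype(float).ravel()

    for j, (r, c) in enumerate(centers):
        k = factors_of(j)                    # nine exposure factors of pixel j
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if 0 <= rr < n_rows and 0 <= cc < n_cols:
                    # picture element (rr, cc) receives k[dr+1, dc+1] * X_j
                    A[rr * n_cols + cc, j] += k[dr + 1, dc + 1]

    # Each row of A*X = b is one instance of the B9f equation above; with
    # overlapping pixel images the rows couple several unknowns.
    return lsqr(A.tocsr(), b)[0]
```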
[0069]Thereafter, the luminance measurement device 1 performs the same procedures as in steps 21 to 24 on a luminance measurement pattern PB2 obtained by turning on the pixels to the right of the turned-on pixels of the luminance measurement pattern PB1, a luminance measurement pattern PB3 obtained by turning on the pixels below the turned-on pixels of the luminance measurement pattern PB1, and a luminance measurement pattern PB4 obtained by turning on the pixels below the turned-on pixels of the luminance measurement pattern PB2, and calculates the luminance of each of the pixels constituting the luminance measurement patterns PB2, PB3, and PB4 (step 25). Thus, the luminance of all the pixels of the organic EL panel 2 is calculated.
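Assuming the sketches above (in particular solve_pixel_luminances), steps 21 to 25 amount to a loop over the four groups that merges the recovered luminances into one map for the whole panel; the four callables passed in are hypothetical helpers standing in for pattern display, imaging, the step-3 correspondence, and the exposure factor table.

```python
import numpy as np

def measure_all_pixels(panel_shape, groups, capture, pixels_of,
                       center_of, factors_of):
    """Steps 21-25 (sketch): measure the luminance of every pixel by turning
    on and imaging one group (PB1 ... PB4) at a time.

    capture(g)    -- displays group g on the panel and returns the frame
    pixels_of(g)  -- panel pixels (row, col) turned on in group g
    center_of(p)  -- central picture element of pixel p (from step 3)
    factors_of(p) -- 3x3 exposure factors of pixel p (from the table)
    """
    luminance = np.zeros(panel_shape)
    for g in groups:                                  # ("PB1", ..., "PB4")
        pixels = pixels_of(g)
        frame = capture(g)
        X = solve_pixel_luminances(                   # sketched above
            frame,
            [center_of(p) for p in pixels],
            lambda j, px=pixels: factors_of(px[j]))
        for p, x in zip(pixels, X):
            luminance[p] = x
    return luminance
```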
[0070]Although the above has described a method in which the luminance measurement device 1 measures the luminance of pixels of the organic EL panel 2 that are being displayed in red (the luminance of all R subpixels), the luminance measurement device 1 can also measure the luminance of pixels of the organic EL panel 2 that are being displayed in green (the luminance of all G subpixels), and the luminance of pixels of the organic EL panel 2 that are being displayed in blue (the luminance of all B subpixels).
[0071]In the luminance measurement method performed by the luminance measurement device 1 according to the present embodiment, pixels of the organic EL panel 2 are first imaged by the camera 7 such that the pixel images do not overlap each other on the imaging surface 8 of the camera 7, and the central exposure factor, which indicates the luminance of the central part of a pixel image as a ratio to the luminance of the entire pixel image, and the peripheral exposure factors, which indicate the luminances of the peripheral parts of the pixel image as ratios to the luminance of the entire pixel image, are calculated. All the pixels of the organic EL panel 2 are then sorted into a plurality of groups (the luminance measurement patterns PB1, PB2, PB3, and PB4), turned on one group at a time, and imaged by the camera 7, and the luminance of all the pixels of the organic EL panel 2 is calculated on the basis of the imaged images, the central exposure factor, and the peripheral exposure factors. Therefore, even when the pixel images of the organic EL panel 2 overlap each other on the imaging surface 8 of the camera 7, the luminance of each pixel can be measured accurately by eliminating the influence of adjacent pixel images.
[0072]Also, in the luminance measurement device 1, the correspondence relationship between the pixels of the organic EL panel 2 and the picture elements of the camera 7 is obtained by imaging, with the camera 7, the alignment pattern PA obtained by turning on predetermined pixels of the organic EL panel 2, and detecting on which picture elements of the camera 7 the image of the alignment pattern PA is shown; the use of this correspondence relationship increases the accuracy of the relationship between the imaging result and the pixel luminance. Also, since a pixel image is enlarged and imaged in step 6 and the luminance of the entire pixel image is then obtained in step 7, it is possible to suppress a reduction in measurement accuracy caused by the non-light-receiving portions disposed between picture elements on the imaging surface 8, and to accurately measure the luminance of each pixel.