Patent: Method And System For Subgrid Calibration Of A Display Device

Publication Number: 20200043201

Publication Date: 20200206

Applicants: Magic Leap

Abstract

A method for calibrating a wearable device includes displaying an image with a plurality of pixels for each of three primary colors using the wearable device, and determining RGB and XYZ values for each of the plurality of pixels. The method includes selecting a subset of the plurality of pixels to form a group of grid points, and dividing the image into a group of tile regions, with each tile region including a grid point. Grid XYZ values are determined for each grid point, based on averaging XYZ values of all pixels in a corresponding tile region, and a grid RGB-to-XYZ conversion matrix is determined for each grid point. The method also includes determining a correction matrix for each grid point by multiplying an inverse of the grid RGB-to-XYZ conversion matrix for the grid point with an sRGB-to-XYZ conversion matrix.

CROSS-REFERENCES TO RELATED APPLICATIONS

[0001] This application claims the benefit of priority to U.S. Provisional Patent Application No. 62/714,502, filed Aug. 3, 2018, entitled “METHOD AND SYSTEM FOR SUBGRID CALIBRATION OF A DISPLAY DEVICE,” the contents of which is hereby incorporated by reference in its entirety for all purposes.

BACKGROUND OF THE INVENTION

[0002] Modern computing and display technologies have facilitated the development of systems for so called “virtual reality” or “augmented reality” experiences, wherein digitally reproduced images or portions thereof are presented to a viewer in a manner wherein they seem to be, or may be perceived as, real. A virtual reality (VR) scenario typically involves presentation of digital or virtual image information without transparency to other actual real-world visual input; an augmented reality (AR) scenario typically involves presentation of digital or virtual image information as an augmentation to visualization of the actual world around the viewer.

[0003] Despite the progress made in these display technologies, there is a need in the art for improved methods and systems related to augmented reality systems, particularly, display systems.

SUMMARY OF THE INVENTION

[0004] Embodiments of the present invention are directed generally to augmented reality systems, particularly, display systems. Some embodiments of the present invention are directed to the calibration of a wearable display in a VR or AR device. As described herein, subgrid analysis is utilized in some embodiments to improve the chromaticity uniformity across a waveguide display.

[0005] According to some embodiments, a method for calibrating a wearable device includes displaying an image with a plurality of pixels for each of three primary colors using the wearable device, and determining RGB and XYZ values for each of the plurality of pixels. The method includes selecting a subset of the plurality of pixels to form a group of grid points, and dividing the image into a group of tile regions, with each tile region including a grid point. Grid XYZ values are determined for each grid point, based on averaging XYZ values of all pixels in a corresponding tile region, and a grid RGB-to-XYZ conversion matrix is determined for each grid point. The method also includes determining a correction matrix for each grid point by multiplying an inverse of the grid RGB-to-XYZ conversion matrix for the grid point with an sRGB-to-XYZ conversion matrix.
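As an illustration of the correction-matrix step above, the sketch below computes the per-grid-point correction as the inverse of the measured grid RGB-to-XYZ matrix multiplied by the standard sRGB-to-XYZ matrix. This is a minimal NumPy sketch; the function name and the use of the standard D65 sRGB matrix are assumptions for illustration, not taken from the patent.

```python
import numpy as np

# Standard sRGB (D65) to CIE XYZ conversion matrix (IEC 61966-2-1)
SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

def correction_matrix(grid_rgb_to_xyz):
    """Correction matrix for one grid point: the inverse of the measured
    grid RGB-to-XYZ conversion matrix multiplied by the sRGB-to-XYZ
    conversion matrix."""
    return np.linalg.inv(grid_rgb_to_xyz) @ SRGB_TO_XYZ
```

A grid point whose measured matrix already equals the sRGB-to-XYZ matrix needs no correction, so its correction matrix is the identity.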

[0006] In some embodiments of the above method, the method also includes forming a correction matrix for each pixel that is a grid point using the correction matrix for the grid point, and forming a correction matrix for each pixel that is not a grid point by interpolation from correction matrices of adjacent grid points. In some embodiments, the interpolation is performed using bilinear interpolation. In alternative embodiments, the interpolation is performed using barycentric interpolation.
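The element-wise bilinear interpolation of correction matrices can be sketched as follows. The helper below is hypothetical; (p, q) denotes the pixel's fractional position within the cell formed by its four adjacent grid points, and each of the nine matrix elements is interpolated independently.

```python
import numpy as np

def interpolate_correction(p, q, C00, C10, C01, C11):
    """Bilinearly interpolate a 3x3 correction matrix for a pixel at
    fractional position (p, q) within the cell formed by the four
    adjacent grid-point correction matrices C00, C10, C01, C11."""
    return ((1 - p) * (1 - q) * C00 + p * (1 - q) * C10
            + (1 - p) * q * C01 + p * q * C11)
```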

[0007] In some embodiments, the method also includes receiving color values for a pixel for an intended content as defined in a target color space, applying the correction matrix for each pixel to the received color values to generate corrected color values, and sending the corrected color values to the wearable device for displaying the intended content.

[0008] In some embodiments, the method also includes applying a gamma to the corrected color values. In some embodiments, the method also includes scaling down the corrected color values to improve intensity uniformity.
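One possible application of the correction and gamma steps is sketched below, assuming a simple power-law gamma; the actual transfer function used by the device is not specified in the patent, so the gamma value and function name here are illustrative assumptions.

```python
import numpy as np

def apply_correction(srgb, correction, gamma=2.2):
    """Correct one pixel: linearize the sRGB input with an assumed
    power-law gamma, apply the 3x3 correction matrix, clip to the
    displayable range, then re-apply gamma before sending to the display."""
    linear = np.asarray(srgb, dtype=float) ** gamma
    corrected = np.clip(correction @ linear, 0.0, 1.0)
    return corrected ** (1.0 / gamma)
```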

[0009] In some embodiments, the method can include receiving color values for a pixel for an intended content as defined in a target color space, applying the correction matrix for each grid point to the received color values to generate new color values, interpolating pixel color values for each pixel using the new color values for adjacent grid points, and sending the interpolated new color values to the wearable device for displaying the intended content.

[0010] In some embodiments, determining RGB and XYZ values for each of the plurality of pixels includes capturing RGB values using a digital camera and converting the RGB values to XYZ values.

[0011] In some embodiments, determining RGB and XYZ values for each of the plurality of pixels includes capturing RGB and XYZ values using a spectroradiometer.

[0012] In some embodiments of the above method, the method can include receiving color values for a pixel for an intended content as defined in a target color space, applying the correction matrix for each grid point to the received color values to generate new color values, interpolating pixel color values for each pixel using the new color values for adjacent grid points, and sending the interpolated new color values to the wearable device for displaying the intended content.

[0013] According to some embodiments, a method for calibrating a wearable device includes displaying an image for each of three primary colors by a wearable display device, and capturing the image in M by N pixels using a digital color camera, where M and N are integers, with the digital color camera providing RGB values for each pixel. The method also includes converting the RGB values for each of the pixels of the image to XYZ values using a camera conversion matrix that converts RGB values to corresponding XYZ values. The method also includes selecting K by L grid points from the M by N pixels, where K and L are integers smaller than M and N, and dividing the image into K by L tile regions, with each tile region including a grid point. The method further includes determining grid XYZ values for each grid point, based on averaging XYZ values of all pixels in a corresponding tile region, and determining a grid RGB-to-XYZ conversion matrix for each grid point that converts the RGB values at the grid point to the XYZ values of the grid point. The method also includes determining a correction matrix for each pixel by multiplying an inverse of the grid RGB-to-XYZ conversion matrix for the pixel with an sRGB-to-XYZ conversion matrix.
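The tile-averaging step described above can be sketched as follows, splitting an M-by-N XYZ image into K-by-L tile regions and averaging the XYZ values within each region. The array-splitting strategy (near-equal tiles) is an illustrative assumption.

```python
import numpy as np

def grid_xyz(xyz_image, K, L):
    """Average the XYZ values of all pixels in each of K-by-L tile
    regions of an M-by-N XYZ image, yielding one XYZ triple per grid
    point (output shape: K x L x 3)."""
    M, N, _ = xyz_image.shape
    out = np.zeros((K, L, 3))
    rows = np.array_split(np.arange(M), K)
    cols = np.array_split(np.arange(N), L)
    for i, r in enumerate(rows):
        for j, c in enumerate(cols):
            out[i, j] = xyz_image[np.ix_(r, c)].mean(axis=(0, 1))
    return out
```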

[0014] In some embodiments of the method, the correction matrix is an sRGB-to-display-RGB correction matrix that is configured to provide corrected display-RGB color values to the wearable device.

[0015] In some embodiments, the method also includes receiving color values for a pixel for an intended content as defined in a target color space, and determining if the pixel is a grid point. If the pixel is a grid point, the correction matrix for the grid point is used as the pixel correction matrix for the pixel. If the pixel is not a grid point, a pixel correction matrix for the pixel is determined by interpolation from correction matrices at adjacent grid points, wherein each matrix element of the correction matrix is interpolated from corresponding matrix elements in correction matrices at adjacent grid points. The method further includes applying the pixel correction matrix for the pixel to the received color values to generate new color values, and sending the corrected color values to the wearable device for displaying the intended content.

[0016] In some embodiments, the interpolation is performed using bilinear interpolation. In some embodiments, the interpolation is performed using barycentric interpolation.

[0017] In some embodiments, the method also includes linearizing received color values. In some embodiments, the method also includes applying a gamma to the new color values. In some embodiments, the method also includes scaling down the grid point color values to improve intensity uniformity.

[0018] According to some embodiments of the invention, a system for calibrating a wearable display device includes a digital color camera disposed to capture an image displayed by the wearable display device. The digital color camera is configured to provide RGB values for each of a plurality of pixels of the image. The system also has a first processor for converting the RGB values provided by the camera to XYZ values. The first processor is also configured for selecting a subset of the plurality of pixels to form a group of grid points, and dividing the image into a group of tile regions, each tile region including a grid point. The first processor is also configured for determining grid XYZ values for each grid point, based on averaging XYZ values of all pixels in a corresponding tile region, and determining a grid RGB-to-XYZ conversion matrix for each grid point. Further, the first processor is configured for determining a correction matrix for each pixel by multiplying an inverse of the grid RGB-to-XYZ conversion matrix for the pixel with an sRGB-to-XYZ conversion matrix.

[0019] In some embodiments, the system is further configured to form a correction matrix for each pixel that is not a grid point by interpolation from correction matrices of adjacent grid points.

[0020] In some embodiments, the system is further configured for receiving color values for a pixel for an intended content as defined in a target color space, applying the pixel correction matrix to the received color values to generate corrected color values, and sending the corrected color values to the wearable device for displaying the intended content.

[0021] According to some embodiments, a method for calibrating a wearable device can include displaying an image with a plurality of pixels for each of three primary colors using the wearable device, and determining RGB and XYZ values for each of the plurality of pixels. The method includes determining a pixel RGB-to-XYZ conversion matrix for each pixel. Further, the method can determine a correction matrix for each pixel by multiplying an inverse of the pixel RGB-to-XYZ conversion matrix for the pixel with an sRGB-to-XYZ conversion matrix. To apply correction to the display, the method includes receiving color values for each pixel for an intended content as defined in a target color space, applying the correction matrix for each pixel to the received color values to generate corrected color values, and sending the corrected color values to the wearable device for displaying the intended content.

[0022] For example, in some embodiments, a method for calibrating a wearable device can include displaying an image with a plurality of pixels for each of three primary colors using the wearable device, and determining RGB and XYZ values for each of the plurality of pixels. The method includes determining a pixel RGB-to-XYZ conversion matrix for each pixel. Further, the method can determine a correction matrix for each pixel by multiplying an inverse of the pixel RGB-to-XYZ conversion matrix for the pixel with an sRGB-to-XYZ conversion matrix. To apply correction to the display, the method includes receiving color values for each pixel for an intended content as defined in a target color space, applying the correction matrix for each pixel to the received color values to generate corrected color values, and sending the corrected color values to the wearable device for displaying the intended content.

[0023] Numerous benefits are achieved by way of the present invention over conventional techniques. For example, embodiments of the present invention provide methods and systems for using a low-cost digital color camera to calibrate a wearable display device. Embodiments of the present invention provide methods and systems for determining a conversion model from the digital camera specific RGB color space to the CIE XYZ color space. The conversion model can then be applied to digital color cameras for measuring absolute chromaticity and luminance of the virtual images. The calibrated low-cost digital cameras can be used in large quantities in a production environment for the calibration of a large number of wearable display devices. Further, embodiments of the present invention provide methods and systems for subgrid analysis to improve the chromaticity uniformity across a waveguide display. Further, image tiling techniques are provided that can reduce the complexity and cost of grid-by-grid correction.

[0024] These and other embodiments of the invention along with many of its advantages and features are described in more detail in conjunction with the text below and attached figures.

BRIEF DESCRIPTION OF THE DRAWINGS

[0025] FIG. 1 is a perspective view of exemplary augmented reality glasses according to some embodiments;

[0026] FIG. 2 is a top view of exemplary augmented reality glasses according to some embodiments;

[0027] FIG. 3 is a schematic diagram illustrating the light paths in a viewing optics assembly (VOA) that may be used to present a digital or virtual image to a viewer according to some embodiments of the present invention;

[0028] FIG. 4A is a diagram illustrating the CIE 1931 color space chromaticity diagram.

[0029] FIG. 4B is a simplified schematic diagram illustrating a method for calibrating a wearable device according to some embodiments of the present invention.

[0030] FIG. 5A is a simplified schematic diagram illustrating a system for calibrating a wearable device according to some embodiments of the present invention.

[0031] FIG. 5B is a simplified schematic diagram illustrating a system for calibrating a wearable device according to another embodiment of the present invention.

[0032] FIG. 5C is a simplified block diagram illustrating a system for characterizing a digital color camera according to some embodiments of the present invention.

[0033] FIG. 6 is a flowchart illustrating a method for performing color calibration of a wearable device according to some embodiments of the present invention.

[0034] FIG. 7A is a simplified diagram illustrating the pixels in a display image field according to some embodiments of the invention.

[0035] FIG. 7B is a simplified diagram illustrating tiled regions in a display image field according to some embodiments of the invention.

[0036] FIG. 7C is a simplified diagram illustrating a portion of a display image field according to some embodiments of the invention.

[0037] FIG. 8 is a flowchart illustrating a method of calibrating a display device according to some embodiments of the present invention.

[0038] FIG. 9 is a simplified diagram illustrating interpolation of the correction matrix for pixel points that are not grid points according to some embodiments of the present invention.

[0039] FIGS. 10 and 11 are diagrams illustrating an example of calibration results using methods described above according to some embodiments of the invention.

[0040] FIG. 12 is a simplified schematic diagram illustrating a computer system according to an embodiment described herein.

DETAILED DESCRIPTION OF THE SPECIFIC EMBODIMENTS

[0041] Embodiments of the present invention are directed to the characterization of a digital camera for use in the calibration of a wearable display device.

[0042] A display operated in native mode may not show the intended colors. Typically, a device outputs colors using three channels, or primary colors, of red, green, and blue. However, because the three primary colors used for those channels may differ from those used by the target color space (say, sRGB), the display device usually has a 3×3 matrix to colorimetrically transform the three sRGB values into three numbers for the display’s red, green, and blue, such that, when sent to the device, they produce the originally intended color. The LEDs used in the display device may not only have primary colors different from those of the target color space, but their colors may also shift in both chromaticity (x, y) and intensity (Y) after their spectra pass through the optics, so the color changes vary depending on the location on the display. As a result, the color displayed by a wearable device can be non-uniform. For example, in a wearable device that includes a waveguide display, the optical components can narrow the color spectrum and/or shift the color spectrum. Further, color shifts can occur from the user’s temple side to the nasal side. Color variations can also be caused by other factors. Therefore, it is desirable to apply pixel-by-pixel calibration and correction to obtain a uniform output color over the field of view.

[0043] FIG. 1 is a perspective view of an exemplary wearable display device 100 according to some embodiments. In this example, wearable display device 100 can be a pair of augmented reality glasses. As shown in FIG. 1, wearable display device 100 can include frames 110 supporting a left waveguide eyepiece 120L and a right waveguide eyepiece 120R. Each waveguide eyepiece 120L and 120R can include an input coupling grating (ICG) 121, an orthogonal pupil expander (OPE) 122, and an exit pupil expander (EPE) 123. The input coupling grating is also referred to as the input coupling port. The input coupling grating (ICG) 121, orthogonal pupil expander (OPE) 122, and exit pupil expander (EPE) 123 can be suitable diffractive optical elements (DOEs). For example, they can take the form of gratings formed on an optical waveguide. According to certain embodiments, rather than providing a single waveguide for each eyepiece, each eyepiece can have a stack of multiple optical waveguides, for different colors and with different optical power EPEs. The EPEs are configured to project images that can be viewed from the user eye positions 130.

[0044] In FIG. 1, incoming light, which can be image light or a scanning light beam, can be incident upon the ICG 121 of each eyepiece 120L, 120R. Each ICG 121 couples the incoming light into a guided mode propagating in a direction toward the OPE region 122. The eyepiece propagates the image light by total internal reflection (TIR). The OPE region 122 of each eyepiece 120L, 120R can also include a diffractive element that couples and redirects a portion of the image light propagating in the eyepiece 120L, 120R toward the EPE region 123. The EPE region 123 includes a diffractive element that couples and directs a portion of the light propagating in each eyepiece 120L, 120R in a direction outward from the plane of the eyepiece layer 120 toward the viewer’s eye positions 130. In this fashion, an image may be viewed by the viewer.

[0045] The incoming light may include light in the three primary colors, namely blue (B), green (G), and red (R).

[0046] In some applications, the eyepiece can accept collimated light which is scanned in two degrees of freedom. Each instantaneous angle of incidence (or small range of angle of incidence) corresponds to an angularly defined pixel. In some embodiments, the light can be configured to simulate a virtual object, which can appear to be some distance, e.g., half a meter to a meter, away from the viewer.

[0047] FIG. 2 is a top view of an exemplary wearable display device 200 according to some embodiments. In this example, wearable display device 200 can be a pair of augmented reality glasses. As shown in FIG. 2, wearable display device 200 can include frames 210 and eyepieces 220. Each eyepiece can be similar to eyepieces 120L and 120R in FIG. 1 and can include an ICG, an OPE, and an EPE, which are not visible in the top view. Wearable display device 200 can also include a scanner housing 230, which can include a scanning mirror for forming a virtual image (e.g., at infinity) from incoming light sources. In some embodiments, the ICGs are used as input ports for receiving light. The images formed by the eyepiece can be viewed from user eye positions 240. The augmented reality glasses can also have left and right speakers 250 and cameras 260.

[0048] As described above, the incoming light may include light in the three primary colors, namely blue (B), green (G), and red (R). In some embodiments, the light beams in the incoming light are combined in an optical combiner. For a wearable device for VR and AR applications, it is desirable that the optical combiners in the system are compact and lightweight.

[0049] FIG. 3 illustrates schematically the light paths in a viewing optics assembly (VOA) that may be used to present a digital or virtual image to a viewer, according to some embodiments of the present invention. The VOA includes a projector 301 and an eyepiece 300 that may be worn around a viewer’s eye. In some embodiments, the projector 301 may include a group of red LEDs, a group of green LEDs, and a group of blue LEDs. For example, the projector 301 may include two red LEDs, two green LEDs, and two blue LEDs according to an embodiment. The eyepiece 300 may include one or more eyepiece layers. Projector 301 can also include an LCOS (Liquid Crystal on Silicon) SLM (Spatial Light Modulator), and various optical elements such as a reflective collimator and a projector relay. In one embodiment, the eyepiece 300 includes three eyepiece layers, one eyepiece layer for each of the three primary colors, red, green, and blue. In another embodiment, the eyepiece 300 may include six eyepiece layers, i.e., one set of eyepiece layers for each of the three primary colors configured for forming a virtual image at one depth plane, and another set of eyepiece layers for each of the three primary colors configured for forming a virtual image at another depth plane. In other embodiments, the eyepiece 300 may include three or more eyepiece layers for each of the three primary colors for three or more different depth planes. Each eyepiece layer comprises a planar waveguide layer and may include an incoupling grating 307, an orthogonal pupil expander (OPE) region 308, and an exit pupil expander (EPE) region 309.

[0050] Still referring to FIG. 3, the projector 301 projects image light onto the incoupling grating 307 in eyepiece 300. The incoupling grating 307 couples the image light from the projector 301 into the planar waveguide layer propagating in a direction toward the OPE region 308. The waveguide layer propagates the image light in the horizontal direction by total internal reflection (TIR). The OPE region 308 of the eyepiece 300 also includes a diffractive element that couples and redirects a portion of the image light propagating in the waveguide layer toward the EPE region 309. The EPE region 309 includes a diffractive element that couples and directs a portion of the image light propagating in the waveguide layer in a direction approximately perpendicular to the plane of the eyepiece 300 toward a viewer’s eye 302. In this fashion, an image projected by projector 301 may be viewed by the viewer’s eye 302.

[0051] As described above, image light generated by the projector may include light in the three primary colors, namely blue (B), green (G), and red (R). Such image light can be separated into the constituent colors, so that image light in each constituent color may be coupled to a respective waveguide layer in the eyepiece.

[0052] In some embodiments, the color displayed by a wearable device can be calibrated by measuring the wearable output with an eye-proxy camera at the eye position and comparing it with target display values. The measurement can be carried out at a larger FOV (field of view), and can be close to the full FOV of the camera. Digital color cameras measure color in RGB space. To represent human perception of the color, data in the camera’s RGB space can be transformed from the camera’s RGB space to the eye’s XYZ space or other absolute color space, e.g., as defined in the CIE 1931 color space. Once the output of the wearable device can be described in the absolute color space, adjustments can be applied to the colors to obtain the desired virtual image, for example, a uniform white color over the entire image.

[0053] FIG. 4A is a diagram illustrating the CIE 1931 color space chromaticity diagram. The CIE XYZ color space encompasses all color sensations that are visible to a person with average eyesight. The CIE XYZ (tristimulus values) is a device-invariant representation of color. It serves as a standard reference against which many other color spaces are defined. In FIG. 4A, point 401 denotes D65 (or CIE Standard Illuminant D65), a commonly used standard illuminant defined by the International Commission on Illumination (CIE). It is used to portray standard open-air illumination conditions in different parts of the world. FIG. 4A also illustrates the sRGB color space 402. The sRGB (standard Red Green Blue) color space is an RGB color space that Hewlett Packard and Microsoft created cooperatively in 1996 to use on monitors, printers, and the Internet. It was subsequently standardized and is often the “default” color space for images. It is understood, however, that other color spaces can also be used with the embodiments described here.

[0054] FIG. 4B is a simplified schematic diagram illustrating a method for calibrating a wearable display device according to some embodiments of the present invention. In FIG. 4B, a wearable display device 400 is similar to the exemplary wearable display device 100 illustrated above in FIG. 1. In this example, wearable display device 400 can be a pair of glasses for augmented reality applications. As shown in FIG. 4B, wearable display device 400 can include frames 410 supporting a left waveguide eyepiece 420L and a right waveguide eyepiece 420R. Each waveguide eyepiece 420L and 420R can include an input coupling grating (ICG) 421, an orthogonal pupil expander (OPE) 422, and an exit pupil expander (EPE) 423. The input coupling grating may also be referred to as an input coupling port. The input coupling grating (ICG) 421, orthogonal pupil expander (OPE) 422, and exit pupil expander (EPE) 423 can be suitable diffractive optical elements (DOEs). For example, they can take the form of gratings formed on an optical waveguide. Each eyepiece can have a stack of multiple optical waveguides, for different colors and with different optical power EPEs. The EPEs are configured to project images that can be viewed from the user eye positions 430.

[0055] In embodiments of the invention, for the calibration of the wearable device 400, digital color cameras 431 and 432 are disposed or positioned at the user eye positions 430, that is, positioned where the users’ eyes would be located during use of the wearable device. In some embodiments, a spectroradiometer can be used to measure the displayed image light of a display device and determine the output color in an absolute color space, such as the CIE XYZ color space. However, spectroradiometers are often too bulky for the measurement of a wearable device as shown in FIG. 4B. Further, spectroradiometers are also expensive, limiting their use in large quantities in a production environment for the calibration of a large number of wearable display devices. Moreover, some methods only calibrate the white color using a white light input. Therefore, conventional systems and methods are not suitable for the calibration of wearable devices using field sequential color displays.

[0056] Accordingly, in some embodiments of the invention, the color displayed by a wearable device can be calibrated by measuring the wearable output with low-cost digital color cameras, also referred to as eye-proxy cameras, that are located at the eye position. Accordingly, eye-proxy cameras are used as color measurement devices in embodiments of the present invention, which can be implemented in a high-volume production environment. Digital color cameras provide the color measurements in RGB space. To represent human perception of the color, data in the camera’s RGB space is mapped from the camera’s RGB space to the eye’s XYZ space, e.g., as defined in CIE 1931 color space. Once the output of the wearable device is described in the absolute color space, adjustments can be applied to the input color data to obtain the desired colors in the virtual image.

[0057] FIG. 5A is a simplified schematic diagram illustrating a system for calibrating a wearable device according to some embodiments of the present invention. FIG. 5B is a simplified schematic diagram illustrating the calibration system 500 with certain functions illustrated according to another embodiment of the present invention. As shown in FIGS. 5A and 5B, a calibration system 500 is implemented for measuring one side of the wearable device, which is associated with one eye. The extension of the system to the testing of the second eye will be evident to one of skill in the art. The calibration system 500 can include an eye-proxy camera 510, a calibration workstation 520, and a GPU 530. The calibration workstation is also referred to as the first processor, and the GPU (graphics processing unit) is also referred to as the second processor. In some embodiments, the first and second processors can be part of the same computer system.

[0058] Referring to FIGS. 5A and 5B, wearable device 550 to be calibrated is optically coupled to eye-proxy camera 510. The output of the eye-proxy camera 510 is provided to calibration workstation 520. The output from the calibration workstation, which has a correction matrix TRA for each of the grid points, is provided as an input to GPU 530. GPU 530 is configured to receive color values for a pixel for an intended content as defined in a target color space, and apply the correction matrix for each grid point to the received color values to generate new color values. Further, the GPU can be configured to interpolate pixel color values for each pixel using the new color values for adjacent grid points. The GPU also sends the new color values to the wearable device for displaying the intended content.
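The order of operations described in this paragraph (applying each adjacent grid point's correction matrix to the pixel's color, then interpolating the corrected colors rather than the matrices) can be sketched as follows. This is a hypothetical NumPy helper, not the actual GPU implementation.

```python
import numpy as np

def correct_then_interpolate(color, C00, C10, C01, C11, p, q):
    """Apply each adjacent grid point's 3x3 correction matrix to the
    pixel's color, then bilinearly interpolate the four corrected
    colors at fractional position (p, q) within the grid cell."""
    color = np.asarray(color, dtype=float)
    v00, v10, v01, v11 = (C @ color for C in (C00, C10, C01, C11))
    return ((1 - p) * (1 - q) * v00 + p * (1 - q) * v10
            + (1 - p) * q * v01 + p * q * v11)
```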

[0059] Eye-proxy camera 510 can be a digital color camera. For example, low-cost, small commercial and industrial color cameras can be used. During calibration, the camera can be installed adjacent to a wearable device in a station. Two cameras can be used side by side to measure the wearable device’s display output for the left and right eyes concurrently or simultaneously, as illustrated in FIG. 4B. To simplify the illustration, only one camera 510 is shown positioned at a conjugate distance from a wearable device 550. The position of the wearable device 550 can be shifted to different positions relative to the camera to account for possible color shift with changes in eye position, inter-pupil distance, and movement of the user, etc. Merely as an example, the wearable device 550 is shown to be shifted in three lateral locations, at -3, 0, and +3 mm. In addition, the relative angles of the wearable device with respect to the camera can also be varied to provide additional calibration conditions.

[0060] The wearable device 550 can include one or more light sources (also referred to as image light sources) such as LEDs or lasers. In some embodiments, an LCOS projector can be used to provide the display images. The LCOS projector can be built into the wearable device 550. However, in FIGS. 5A and 5B, LCOS 552 is shown outside the wearable device 550 for purposes of illustration and clarity. During calibration, image light is projected by the wearable device 550 in field sequential color order, for example, in the sequence of red image, green image, and blue image. In a field-sequential color system, the primary color information is transmitted in successive images, which at least partially relies on the human vision system to fuse the successive images into a color picture. The eye-proxy camera 510 captures the images as pixels in the camera’s RGB color space and provides the data to the calibration workstation 520.

[0061] The color values for a pixel in the wearable device 550 for an intended content are defined in a target color space by, for example, a content provider. In the examples described here, the content is specified in color values in the sRGB (Standard Red Green Blue) color space. However, it is understood that other color spaces can be used to define the content, and the description provided here remains applicable.
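For reference, the standard linear-sRGB-to-XYZ conversion for the D65 white point can be written down directly (values from IEC 61966-2-1). This is background context rather than part of the patent's method; note that sRGB content values must be linearized (inverse gamma) before this matrix applies.

```python
import numpy as np

# Standard linear-sRGB -> CIE XYZ matrix, D65 white point (IEC 61966-2-1).
# Rows give the X, Y, Z contributions of linear R, G, and B.
SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])

# Linear white (R=G=B=1) maps to the D65 white point (~0.9505, 1.0, 1.0890).
white_xyz = SRGB_TO_XYZ @ np.ones(3)
```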

[0062] In the examples described here, calibration workstation 520 is configured to convert the image data in the camera’s RGB space to the CIE XYZ color space. In some embodiments, calibration workstation 520 can capture images in the camera’s RGB space and corresponding measurement values in the XYZ space of a spectroradiometer. The spectroradiometer can capture spectrum and/or CIE XYZ values of the light from, for example, an integrating sphere. In some embodiments, the spectroradiometer can measure the xyY values and derive the CIE XYZ values. The spectroradiometer can also convert the spectrum or XYZ values to CIE XYZ (CIE XYZ color space), CIE xyY (CIE xyY color space), CIELUV (CIE 1976 L*, u*, v* color space), CIELab (CIE L*a*b* color space), or LCh (CIE L*C*h* color space), or some other appropriate absolute color space.
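The derivation of CIE XYZ from measured xyY values mentioned above follows a standard colorimetric identity: X = xY/y and Z = (1 - x - y)Y/y. A minimal sketch:

```python
def xyY_to_XYZ(x, y, Y):
    """Convert CIE xyY chromaticity (x, y) plus luminance Y to CIE XYZ
    using the standard colorimetric relations X = x*Y/y and
    Z = (1 - x - y)*Y/y."""
    if y == 0:
        return (0.0, 0.0, 0.0)  # zero chromaticity denominator: no light
    X = x * Y / y
    Z = (1.0 - x - y) * Y / y
    return (X, Y, Z)

# Example: the D65 white chromaticity (x, y) ≈ (0.3127, 0.3290) at Y = 1
# recovers the familiar D65 white point (~0.9505, 1.0, 1.0891).
X, Y, Z = xyY_to_XYZ(0.3127, 0.3290, 1.0)
```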

[0063] FIG. 5C is a simplified block diagram illustrating a system for characterizing a digital color camera according to some embodiments of the present invention. As shown in FIG. 5C, calibration system 560 can include a light source 521, a spectral modification device 522, an integrating sphere 523, a digital color camera 524, a spectroradiometer 525, and a calibration work station 520. The integrating sphere 523 is an example of a display device, and other kinds of display devices can also be used. The spectroradiometer 525 is an example of a color-measurement device, and other types of color-measurement devices can be used instead. System 560 is configured to determine a conversion model between the RGB values of each displayed color captured by the digital color camera and the color-measurement values in an absolute color space, e.g., XYZ values. The conversion model includes conversion matrices or equations for each of the three primary colors used in a field sequential color virtual image.

[0064] Light source 521 and spectral modification device 522 are configured to illuminate the integrating sphere 523 with an input light beam of a primary color having spectral properties representative of a light beam in a virtual image in a wearable device. The digital color camera 524 captures an image of the display device and determines RGB values for the primary color from the image. The spectroradiometer 525 determines a color-measurement value, e.g., XYZ values, associated with each corresponding primary color at the display device, thereby acquiring a color-measurement value in an absolute color space. The calibration work station 520 is configured to determine a conversion model for each primary color using the RGB values and the color-measurement values determined from training samples representing different illumination conditions.

[0065] In FIG. 5C, light source 521 can include LEDs or lasers for generating light beams of primary colors. Alternatively, light beams of the primary colors can be generated from a monochromator, which includes a white light source that can be filtered to provide different light colors. In some embodiments, the LEDs can be configured to emit light at different brightness levels, or grey levels. In an embodiment, the LEDs are configured at the highest grey level, 255, in a scale from 0-255. In order to characterize the digital color camera at different operating conditions, a broad set of training samples with spectral diversity can be generated by using different LEDs. For example, LEDs from different bins of dies can be selected. Further, the light source 521 can include a controller that can vary the emission power of the LEDs, for example, at LED current levels of 10 mA, 50 mA, or 100 mA, etc., which can change the temperature of the LEDs. The controller can also modulate the LEDs with pulse width modulation (PWM) at various duty cycles, e.g., from 0 to 1.0. The PWM pulsing can change the thermal characteristic of the LEDs. Moreover, to add variety, outlier training samples can be generated by a combination of LEDs of different colors. For example, a training sample of a modified red color can be obtained with red light mixed with a small amount of green and blue. The color mixing can be achieved by PWM control, for example, with 1.0 PWM duty cycle for green and 0.05-0.2 PWM duty cycle for red and blue. As described further below, adding extra colors to a primary color can improve the stability of the computation process to determine the conversion matrix. Additional varieties of training samples can be obtained by leaving out a portion of a training set for a regression procedure, and leaving out a different portion for another regression procedure.

[0066] As described above, a wearable device can include various optical components, such as the waveguide and diffractive components. These optical components can modify the light beam spectrum. In order to provide input light beams that can emulate the output light in the wearable device, the spectral properties of the wearable device are characterized. The spectral modification device 522 is configured to receive light beams from light source 521 and generate a light beam that is representative of the spectrum of light in a virtual image of the wearable device. For example, the spectral modification device 522 can change the center emission wavelength and the bandwidth of the input light beam. Depending on the embodiments, the spectral modification device 522 can include lens systems, filters, and diffusers, etc. For example, dichroic filters can be used to further narrow the LED spectrum and increase saturation. Further, filters can be used to narrow the bandwidth of different colors. In some cases, rotating dichroic filters can be used to tune the spectral wavelength to better mimic the wearable output, for example, with the filter positioned at different tilt angles for different colors.

[0067] In FIG. 5C, the integrating sphere 523 is an example of a display device. Alternatively, other display devices, such as a projection screen, can also be used. An integrating sphere is an optical component consisting of a hollow spherical cavity with its interior covered with a diffuse white reflective coating, with small holes for input and output ports. Its relevant property is a uniform scattering or diffusing effect: an integrating sphere may be thought of as a diffuser which preserves power but destroys spatial information. As shown in FIG. 5C, System 560 uses an integrating sphere 523 as the display device. The integrating sphere can have an input port 527 for receiving the light beam from the spectral modification device 522, and an output port 528 to provide output light for measurement by digital color camera 524 and spectroradiometer 525.

[0068] The digital color camera 524 captures digital images in an array of N by M pixels. In a specific embodiment, the digital color camera 524 can have 1400 by 960 pixels. Color information in each pixel is represented in the camera’s RGB color space. For example, when a red light is emitted from the light source 521, the sensors in the digital camera 524 can sense light at the integrating sphere and can capture values of the red, green, and blue colors (Rr, Gr, Br) in each pixel in the camera’s RGB color space. Similarly, for an input of green color, RGB data (Rg, Gg, Bg) is captured for each pixel. Further, for an input of blue color, RGB data (Rb, Gb, Bb) is captured for each pixel.

[0069] In System 560, the calibration work station 520 is configured to assemble training samples from the captured images in the camera’s RGB space and the corresponding measurement values in the spectroradiometer’s XYZ space. The calibration work station 520 is coupled to spectroradiometer 525 and digital camera 524 through wired or wireless connections. As described further below, a diverse set of training samples can be gathered by varying the light source and spectrum. The calibration work station 520 then applies conversion methods to generate a transformation model from RGB training data to XYZ training data. Various conversion methods can be used, such as least squares, linear regression, polynomial regression, or neural networks, etc. In addition, methods such as k-fold cross validation and leave-one-out cross validation can also be used to further optimize conversion robustness and accuracy.
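The fitting and validation steps above can be sketched with a linear least-squares model and leave-one-out cross validation; the function names and synthetic data are assumptions for illustration, not the patent's implementation.

```python
import numpy as np

def fit_rgb_to_xyz(rgb, xyz):
    """Least-squares 3x3 conversion matrix M with xyz_sample ≈ M @ rgb_sample.
    rgb, xyz: (N, 3) arrays of camera RGB and spectroradiometer XYZ samples."""
    M, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)
    return M.T

def loo_cv_error(rgb, xyz):
    """Leave-one-out cross validation: refit with one sample held out,
    measure its prediction error, and average over all samples."""
    n = len(rgb)
    errs = []
    for i in range(n):
        keep = np.arange(n) != i
        M = fit_rgb_to_xyz(rgb[keep], xyz[keep])
        errs.append(np.linalg.norm(M @ rgb[i] - xyz[i]))
    return float(np.mean(errs))

# Synthetic check: data generated by a known linear model should fit,
# and cross-validate, essentially perfectly.
rng = np.random.default_rng(0)
M_true = np.array([[0.41, 0.36, 0.18],
                   [0.21, 0.72, 0.07],
                   [0.02, 0.12, 0.95]])
rgb_train = rng.random((12, 3))
xyz_train = rgb_train @ M_true.T
cv_err = loo_cv_error(rgb_train, xyz_train)
```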

[0070] For example, in a training sample for the red color, the camera captures the image in the camera’s RGB space in the form of (Rr, Gr, Br), representing the red, green, and blue light sensed by the camera. Depending on the embodiments, the RGB data from all pixels of the camera, or a subset of the pixels, can be included in the training set. In some embodiments, for each image, the RGB value averaged over all the pixels is used, since the output from the integrating sphere is substantially uniform. The corresponding measurement in the form of (Xr, Yr, Zr) in the XYZ space is generated by the spectroradiometer. A conversion matrix Mred can be derived from the training samples for the red color with the following property:

Mred(Rr, Gr, Br)^T=(Xr, Yr, Zr)^T
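The derivation of Mred can be sketched as a least-squares solve over the red training samples. The camera responses below are hypothetical values, and the XYZ measurements are generated from a stand-in matrix so the property holds exactly; real spectroradiometer data would carry measurement noise.

```python
import numpy as np

# Hypothetical red training samples: each row is the camera's (Rr, Gr, Br)
# response to one red sample (note the small green/blue components, as in
# the mixed training samples described above).
rgb_red = np.array([
    [0.95, 0.08, 0.02],
    [0.80, 0.06, 0.02],
    [0.90, 0.12, 0.04],
    [0.70, 0.05, 0.01],
])
M_true = np.array([[0.41, 0.36, 0.18],
                   [0.21, 0.72, 0.07],
                   [0.02, 0.12, 0.95]])   # stand-in ground truth
xyz_red = rgb_red @ M_true.T              # matching (Xr, Yr, Zr) rows

# Least-squares solve for M_red with M_red @ (Rr, Gr, Br) ≈ (Xr, Yr, Zr)
# for every red training sample.
M_red = np.linalg.lstsq(rgb_red, xyz_red, rcond=None)[0].T
residual = float(np.linalg.norm(rgb_red @ M_red.T - xyz_red))
```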
