What’s the difference between CRI and CCT?
Usually, two key measures are used to evaluate light sources:
Color Rendering Index (CRI)
Correlated Color Temperature (CCT)
CCT describes the color of a light source using Kelvin (K) temperature, which indicates the warmth or coolness of a lamp’s color appearance. The lower the Kelvin temperature (2700-3000 K), the warmer the light appears; the higher the temperature (3600-5500 K), the cooler. But CCT does not indicate how natural the colors of objects appear when illuminated by a light source. In fact, two lamps can have the same CCT yet render colors very differently, which is why we need another measure, like CRI, to capture that difference.
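The warm/cool ranges above can be sketched as a small helper. This is only an illustrative mapping of the Kelvin ranges quoted in this article; the function name and the "intermediate" label for values between the two ranges are assumptions, not an industry standard.

```python
def describe_cct(kelvin: float) -> str:
    """Classify a CCT value (in Kelvin) using the ranges quoted above.

    2700-3000 K reads as warm, 3600-5500 K as cool; anything else
    (including the 3000-3600 K gap) is labeled intermediate here.
    """
    if 2700 <= kelvin <= 3000:
        return "warm"
    if 3600 <= kelvin <= 5500:
        return "cool"
    return "intermediate"
```

For example, a typical 2700 K household lamp classifies as "warm", while a 5000 K daylight-style lamp classifies as "cool".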
Color Rendering Index (CRI) reflects how humans perceive color quality.
According to Wikipedia, the color rendering index (CRI) of a light source is a quantitative measure of its ability to reproduce the colors of various objects faithfully in comparison with an ideal or natural light source. In simpler terms, CRI measures a light source’s ability to show object colors “realistically” or “naturally” compared with a familiar reference source, such as incandescent light or daylight. The higher the CRI, the better the light source renders colors across the visible spectrum. To achieve what is generally considered good color rendering, a light source, for example an LED, must have a CRI above Ra 72 and should ideally reach Ra 90.
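The Ra thresholds mentioned above can likewise be expressed as a quick quality check. The bucket names here ("good", "excellent") are illustrative labels for this article's thresholds, not terms from a formal standard.

```python
def cri_quality(ra: float) -> str:
    """Bucket a CRI (Ra) value using the thresholds quoted above.

    Above Ra 72 is generally considered good rendering;
    Ra 90 or higher is the ideal target mentioned in the text.
    """
    if ra >= 90:
        return "excellent"
    if ra > 72:
        return "good"
    return "poor"
```

So an LED rated Ra 95 would bucket as "excellent", while one rated Ra 65 would fall short of good color rendering.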
Both CRI and CCT are important when considering whether LED lights are qualified to replace traditional incandescent or halogen lamps.