In our house, we name our pets after terms and objects in the entertainment industry. We've had pets named Pinspot and Patch, and currently have a cat named Lumen. She isn't all that bright (she IS a cat...), but who says the name has to be descriptive, right?
In some ways, projectors have the same problem. There are lots of projectors that SAY they have high lumen output, but then don't have nearly the same brightness as another projector with the same lumens. So can you really trust the manufacturer's spec when they talk about lumen ratings?
First, we need to understand the term Lumen. It is actually a medical term ("The interior space within a tubular structure, such as within a blood vessel, a duct, or the intestine."). However, for our purposes, we will use the optical version of the term - "The unit of luminous flux, equal to the luminous flux emitted within a unit solid angle (1 steradian) from a point source having a uniform intensity of 1 candela, or to the luminous flux received on a unit surface, all points of which are at a unit distance from such a source. Symbolized lm."
(above definitions are from McGraw-Hill Dictionary of Scientific & Technical Terms, 6E, Copyright © 2003 by The McGraw-Hill Companies, Inc.)
Huh? Well, why don't we just say that a Lumen is a unit of measure of light. Should be pretty standard, then, for us to use lumens as a measure of how much light is coming out of a projector, right? Well, not really. See, a projector, especially depending on the type of imaging device, doesn't always put out the same amount of light across the entire image, and in some cases, the brightness depends on how much of the screen is being used at full white.
For example, with the old CRT projectors, lumen ratings could vary widely among projectors that appeared equally bright when displaying the same image. The reason is that a CRT is often brighter when only a portion of the image is displaying white while the rest is showing black. If the same projector displayed a full white screen, the brightness could be substantially lower than if white was shown only in the center of the image. So manufacturers' lumen ratings could vary widely, depending on how much of the screen was displaying white during the measurement. This is why the old InfoComm projector shootouts were so important to buyers of projection technology. These measurements are termed "peak lumens" or "center lumens", since the measurement was taken at the center of the output, usually the brightest area of the projection. But it isn't a very useful measurement, since very few of us use only the center portion of the projection for our image - we need the whole screen to be bright! So a standardized method of measuring projector output was finally developed by the American National Standards Institute (ANSI). It is quite common for a CRT projector to have a manufacturer's center lumens rating that is 6 times its ANSI rating. Newer LCD technology is usually a bit brighter in the center than at the edges, but much more uniform. And DLP technology is extremely even from center to edge, with a variance often under 5%.
ANSI lumens measurement specs are pretty detailed, right down to the temperature of the room in which the measurement is taken (25 degrees Celsius). I won't get into all the specs, but the most important point is that nine illuminance measurements (in lux) are taken at various screen locations, averaged, and then multiplied by the screen area. This gives the ANSI lumens rating, which is a much more accurate way to compare brightness between projectors.
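The arithmetic above is simple enough to sketch out. Here's a minimal illustration - the nine lux readings and the screen area below are made-up example values, not measurements from any real projector:

```python
# ANSI lumens: average nine illuminance readings (in lux) taken at
# standardized points on the screen, then multiply by screen area (m^2).
# All numbers here are hypothetical, for illustration only.

lux_readings = [510, 495, 480, 500, 520, 505, 470, 465, 455]  # 9 screen points
screen_area_m2 = 1.5  # example screen area in square meters

avg_lux = sum(lux_readings) / len(lux_readings)
ansi_lumens = avg_lux * screen_area_m2

print(f"Average illuminance: {avg_lux:.1f} lux")
print(f"ANSI lumens rating:  {ansi_lumens:.0f} lm")
```

Notice that because the nine points are averaged, a projector that's blazing bright in the center but dim in the corners gets penalized - which is exactly the point of the ANSI method versus a single "center lumens" reading.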
So, we use ANSI lumens today to compare the outputs of projectors - that solves the issue, right? Not completely. You see, these measurements are taken by the manufacturer with new, optimized lamps at their brightest. It makes sense, if you are a manufacturer, to obtain the best lumen rating possible. But since most lamps and power supplies lose brightness as they age, the ANSI rating may not hold once a projector hits the field. Some lamps hold up better over time, while others lose a lot of brightness within the first few hours of use. So projector brightness in the field can depend on the type of lamp used and its age. Other factors come into play too, like the type of lens being used (some lenses can cut out nearly half of the light output!), how frequently the optical path has been cleaned, and whether the optical path is sealed.
But one thing that a lot of consumers don't understand is perceived brightness... how bright one projector appears next to a different unit. This is important when selling and installing boardroom-type projectors (but doesn't vary as much in larger venue systems). The reason for this has to do with contrast ratios. A contrast ratio is essentially the measure of the difference between the brightest and darkest parts of an image. A projector with a low contrast ratio has very little range between the brights and darks of an image, and can appear soft or gray when compared with another projector that has a higher contrast ratio. These ratios vary widely, from as little as 200:1 or 300:1 all the way up to 10,000:1. Although I find that the higher contrast ratios are rather difficult to tell apart, when you compare a 300:1 projector with a 1,000:1 unit, there is a substantial difference in the perceived brightness and clarity of the image. That is why you will find smaller projectors with similar ANSI lumen ratings but big differences in price or end-user ratings. The one with the higher contrast ratio APPEARS brighter, and has a much more pleasing image.
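To make the contrast ratio idea concrete: it's just the full-white luminance divided by the full-black luminance. Here's a minimal sketch with hypothetical luminance numbers (not from any real projector) showing how two units with the same white output can land at 300:1 versus 1,000:1:

```python
# Contrast ratio = luminance of full white / luminance of full black.
# The figures below are illustrative, not measured from real hardware.

def contrast_ratio(white_luminance, black_luminance):
    """Return the contrast ratio as a single number (the X in X:1)."""
    return white_luminance / black_luminance

# Projector A: bright white, but an elevated (grayish) black level.
a = contrast_ratio(white_luminance=300.0, black_luminance=1.0)

# Projector B: the SAME white level, but much deeper blacks.
b = contrast_ratio(white_luminance=300.0, black_luminance=0.3)

print(f"Projector A: {a:.0f}:1")  # 300:1
print(f"Projector B: {b:.0f}:1")  # 1000:1
```

Note that projector B didn't get any brighter at white - its blacks got darker. That wider range is what makes it look brighter and punchier side by side, even with an identical ANSI lumen rating.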
There are a lot of other factors that can affect brightness, but most of our world seems to be focused primarily on ANSI lumens ratings. Do yourself (and your clients) a favor and learn more about things like lamp types, contrast ratios, and optics. Then you can be the genius in the room when questions are asked about projection brightness!