The Idiots' Guide to Radiometry and Colorimetry
by Ian Mallett
Radiometry and Colorimetry are huge, poorly explained topics that are at the same time staggeringly important to understand when doing graphics. This is a brief summary of the key points, intended as a guide for beginners before they tackle more detailed sources. It assumes you know nothing about formalized rendering. You still need Calculus, though.
Radiometry:
Radiometry literally means "light measurement". All your graphics code should work with radiometric units. The important radiometric units are summarized below. Where an entry lists multiple labels, they are in descending order of use:
Radiant Energy
Symbol: \(Q\), \(J\)
Physical meaning: energy (perhaps expressed as a number of photons), in Joules
Description: Energy is a fundamental concept in Physics. For radiometry, it is usually expressed in some abstract way, but sometimes as a number of photons (especially higher-order packets of photons carrying their aggregate energy; the "photons" in "Photon Mapping" are of this type).
Radiant Flux, Radiant Power
Symbol: \(\Phi\)
Physical meaning: power (energy per second)
Description: The amount of energy per second. Contrary to popular misconception, radiant flux is the value we want to measure over a sensor pixel, not radiance; radiance tells you how bright something looks. In a digital camera, for instance, the electrical response from a pixel is a (simple) function of the incoming power reaching the innards of that pixel's sensor. Contributing to the confusion is probably the fact that an in-focus object has radiant flux proportional to radiance.
Irradiance/Radiant Flux Density, Radiant Exitance, Radiosity
Symbols: \(E\), \(M\), and \(B\), respectively, all equal to \(\frac{d\Phi}{dA}\)
Physical meaning: power per unit area
Description: The amount of energy per second per area. For irradiance, this is incident light arriving at a surface. For radiant exitance, it is emitted light leaving a surface. Radiosity is emitted plus reflected light leaving a surface. You usually only care about irradiance. These quantities fall off with the inverse-square law.
Note: other people have different definitions of radiosity, in particular. That's because they're wrong. It's also because radiometry developed from finite-element heat-transfer simulations in the Physics community. It doesn't really matter; you almost never care about anything but irradiance when rendering (and then, only in passing).
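A minimal Python sketch of that inverse-square falloff for an isotropic point source (the function and parameter names are my own invention, not standard notation):

```python
import math

def irradiance_from_point_source(flux_w, radius_m):
    """Irradiance (W/m^2) at a distance from an isotropic point source.

    The emitted power spreads over a sphere of area 4*pi*r^2, which is
    where the inverse-square falloff comes from.
    """
    return flux_w / (4.0 * math.pi * radius_m * radius_m)

# Doubling the distance quarters the irradiance:
e_near = irradiance_from_point_source(100.0, 1.0)
e_far = irradiance_from_point_source(100.0, 2.0)
```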

Radiant Intensity
Symbol: \(I\), equal to \(\frac{d\Phi}{d\omega}\)
Physical meaning: power per steradian
Description: The amount of energy per second per solid angle. This isn't used much, except in mathematically correct descriptions of point light sources, which don't make physical sense anyway.
Intensity is technically constant with distance, but since the solid angle an object subtends is a function of distance, remember that a fixed-size object covers less and less solid angle as you move away.
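That bookkeeping can be sketched as follows (names invented for illustration): a small detector subtends roughly its area over distance squared steradians, so the flux it collects still falls off even though intensity does not.

```python
def received_flux(intensity_w_per_sr, detector_area_m2, distance_m):
    """Flux (W) collected by a small detector facing the source.

    A small detector subtends approximately area / distance^2 steradians,
    so even though intensity is constant with distance, the collected
    flux falls off with the square of the distance.
    """
    solid_angle_sr = detector_area_m2 / (distance_m * distance_m)
    return intensity_w_per_sr * solid_angle_sr

f_near = received_flux(10.0, 1e-4, 1.0)
f_far = received_flux(10.0, 1e-4, 2.0)
```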
Radiance, Importance
Symbol: \(L\), equal to \(\frac{d^2\Phi}{d\omega \cdot dA \cdot \cos(\theta)} = \frac{d^2\Phi}{d\omega \cdot dA^\perp}\). No standard symbol for importance exists that I am aware of.
Physical meaning: power per steradian per projected area
Description: Radiance is the most important unit after energy and power. Intuitively, it represents the "brightness" of an object. It's hard to wrap your head around what it means physically, but I find the best way is with math:
Radiance is the amount of light coming from/arriving at a differential patch, within a differential set of directions. The cosine factor projects the differential patch to be perpendicular to the direction of light propagation.
Amazingly, radiance and importance are invariant along straight rays (that is, they don't attenuate with distance). This makes sense if you remember the intuitive definition: a lightbulb doesn't look brighter when you get closer to it. You receive more energy because it fills more of your field of view, but it doesn't look brighter. This distance invariance makes radiance extremely convenient for raytracing.
Importance is the adjoint quantity to radiance. It behaves the same way, and as far as I can tell, only a very few people (including me, oh so very much) care about it.
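Here's a toy numeric check of that invariance, with made-up numbers: the irradiance at a detector falls off with distance, but dividing by the solid angle the source subtends recovers the same radiance every time.

```python
# Toy check that radiance is distance-invariant even though received
# energy is not. Two small parallel patches face each other head-on, so
# both cosine factors are 1. All numbers here are made up.
L = 5.0        # radiance of the emitting patch, W/(sr*m^2)
a_src = 1e-4   # emitter area, m^2
a_det = 1e-4   # detector area, m^2

results = []
for r in (1.0, 2.0, 4.0):
    omega_det = a_det / (r * r)    # solid angle the detector subtends
    flux = L * a_src * omega_det   # power actually reaching the detector
    E = flux / a_det               # irradiance at the detector: ~1/r^2
    omega_src = a_src / (r * r)    # solid angle the source subtends
    results.append((r, E, E / omega_src))  # E/omega_src recovers L
```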

Tack "spectral" in front of any of these to make it per-wavelength. There are also other units. No one cares.
The spectral units are more fundamental: integrating a spectral unit over the spectrum (usually just the visible spectrum) yields the corresponding non-spectral unit. Example:
This graph is wrong. The \(y\)-axis should be labeled "Spectral Radiant Flux"; "Intensity" here is trying to refer to "signal strength". In any case, it is the spectral radiant flux that is graphed against wavelength (usually \(\lambda\)). The integral of that curve is just the plain "radiant flux" (no "spectral"). The entire graph is called a spectral power distribution, because it describes how radiant flux is distributed over wavelength.
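For example, a rough numeric sketch of that integral, with a made-up Gaussian curve standing in for a real measured spectrum:

```python
import math

def spectral_radiant_flux(lambda_nm):
    # A made-up spectrum peaked at 550 nm, in W/nm; a stand-in for real data.
    return math.exp(-((lambda_nm - 550.0) / 50.0) ** 2)

# Sample 360..830 nm at 1 nm steps and integrate with the trapezoidal rule.
wavelengths = [360.0 + i for i in range(471)]
values = [spectral_radiant_flux(w) for w in wavelengths]
radiant_flux = sum(0.5 * (values[i] + values[i + 1])
                   for i in range(len(values) - 1))
```

The result approaches the analytic value \(50\sqrt{\pi} \approx 88.6\) W for this particular toy spectrum.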
In fact, everything that you can see is due to a spectral power distribution entering your eye(s). This brings us to:
Colorimetry:
Fair disclosure: I hate colorimetry.
Colorimetry is the study of what these radiometric spectra look like when you look at them. Your eyes (usually) have three kinds of color receptors, which we call "red", "green", and "blue". In practice, they respond to different wavelengths of light by different amounts. Here's an approximate graph:
The \(x\)-axis is the wavelength of incoming light in nanometers, and the \(y\)-axis is the response of the three types of color receptors. This varies from person to person, and the three response curves have also been scaled vertically here to be dimensionless, with a maximum at \(1.0\).
The problem is this: how do you convert an incoming spectrum (the spectral power distribution) into the three scalar-valued responses of the human eye, using the response curves above? Moreover, how can we convert a radiometric image into a set of pixel values that will induce these desired responses? The best answer I've found is to do the following and hope for the best.
First: take your spectral power distribution \(\Phi(\lambda)\) and integrate it against the CIE XYZ color space's three "standard observer"/"color matching" functions \(\bar{x}(\lambda)\), \(\bar{y}(\lambda)\), and \(\bar{z}(\lambda)\) to get the \(X\), \(Y\), and \(Z\) CIE tristimulus values. Please note that the bar over these functions is not an arrow; that is, these functions are not vector-valued. I suspect the notation comes from Statistics, where a bar means "estimated" or "average".\[
X = \frac{1}{\int_{360 nm}^{830 nm} \bar{x}(\lambda) \cdot d \lambda} \int_{360 nm}^{830 nm} \Phi(\lambda) \cdot \bar{x}(\lambda) \cdot d \lambda \\
Y = \frac{1}{\int_{360 nm}^{830 nm} \bar{y}(\lambda) \cdot d \lambda} \int_{360 nm}^{830 nm} \Phi(\lambda) \cdot \bar{y}(\lambda) \cdot d \lambda \\
Z = \frac{1}{\int_{360 nm}^{830 nm} \bar{z}(\lambda) \cdot d \lambda} \int_{360 nm}^{830 nm} \Phi(\lambda) \cdot \bar{z}(\lambda) \cdot d \lambda
\]The standard observer functions were determined empirically, and can be found in their native sampled form here. They were measured once with a \(2^\circ\) and once with a \(10^\circ\) field of view. The latter is more typical of viewing cases, but it doesn't matter much which you use.
Notes:
- The division by the integral (clearly intended to normalize) does not appear in the original definition. I suspect this is because the original observer functions were expected to be normalized. However, the data (e.g. from the above link) are not normalized.
- Relatedly, the integral in the original seems to treat nanometers as the principal unit (that is, the limits are \(360\) to \(830\), not \(0.000000360\) to \(0.000000830\)). This distinction doesn't matter if the normalization term is present.
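Here is a rough Python sketch of those integrals as Riemann sums. The Gaussian curves below are crude stand-ins for the real tabulated observer functions and the input spectrum (the real \(\bar{x}\) is not even unimodal); in real code you would substitute the sampled CIE data.

```python
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Crude single-Gaussian stand-ins for the tabulated observer functions;
# substitute the sampled CIE data in real code.
def xbar(lam): return gauss(lam, 600.0, 40.0)
def ybar(lam): return gauss(lam, 555.0, 45.0)
def zbar(lam): return gauss(lam, 450.0, 30.0)

def spd(lam):
    # A made-up spectral power distribution.
    return gauss(lam, 550.0, 60.0)

def tristimulus(cmf, spd_fn, lo=360.0, hi=830.0, step=1.0):
    """Normalized integral of spd * cmf, per the formulas above."""
    n = int((hi - lo) / step) + 1
    lams = [lo + i * step for i in range(n)]
    numerator = sum(spd_fn(l) * cmf(l) for l in lams) * step
    denominator = sum(cmf(l) for l in lams) * step
    return numerator / denominator

X = tristimulus(xbar, spd)
Y = tristimulus(ybar, spd)
Z = tristimulus(zbar, spd)
```

With the normalization term, each tristimulus value ends up between zero and the spectrum's peak value, regardless of whether the observer data themselves were normalized.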
At this point, you have converted your radiometric spectrum into a set of abstract responses (representing, vaguely, the actual responses of the cones) within the human eye. Now, we need to induce those responses by coloring pixels certain colors.
There are lots of people who are wrong about this. Simply: start with sRGB. sRGB is a color space that defines the response functions of almost every display technology. By converting from the CIE XYZ color space into sRGB, we can figure out the pixel values we need. For brevity, I won't reproduce the transformation; it is well explained here. Notes:
- The transformation has tiny \(0^{th}\)- and \(1^{st}\)-derivative discontinuities. Usually no one cares, but you can tweak it to improve matters using the theory section a little below it.
- This is where gamma correction comes in. For a well-calibrated monitor (the defaults most manufacturers ship do a passable, if not perfect, job of this), the gamma correction will always be \(2.4\). Yes, that's not \(2.2\). The \(2.2\) is an approximation of the real curve, which is actually defined piecewise.
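A sketch of the full conversion, using the standard sRGB matrix and piecewise encoding from IEC 61966-2-1 (the hard clip of out-of-gamut values is my own crude shortcut, not part of the standard):

```python
def xyz_to_srgb(x, y, z):
    """Convert CIE XYZ (Y scaled to [0, 1]) to nonlinear sRGB in [0, 1].

    The 3x3 matrix and the piecewise encoding are the standard sRGB
    ones (IEC 61966-2-1); the clipping is a crude shortcut.
    """
    # Linear-light RGB.
    r = 3.2406 * x - 1.5372 * y - 0.4986 * z
    g = -0.9689 * x + 1.8758 * y + 0.0415 * z
    b = 0.0557 * x - 0.2040 * y + 1.0570 * z

    def encode(c):
        c = min(max(c, 0.0), 1.0)  # crude clip of out-of-gamut values
        if c <= 0.0031308:
            return 12.92 * c                      # linear toe near black
        return 1.055 * c ** (1.0 / 2.4) - 0.055   # the "2.4" gamma segment

    return tuple(encode(c) for c in (r, g, b))

# The D65 white point maps to (approximately) pure white:
white = xyz_to_srgb(0.9505, 1.0, 1.089)
```

Note that the exponent in the code really is \(2.4\); the linear toe near black is what makes the overall curve resemble a plain \(2.2\) power law.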
One last note: some different radiometric spectra are perceived by humans in exactly the same way. This can happen because each wavelength can take on a value independent of all the others, so (assuming everything is continuous, which is reasonable) there are as many possible spectra as there are continuous functions \(\mathbb{R}^+ \to \mathbb{R}^+\), but only \((\mathbb{R}^+)^3 \cong \mathbb{R}^+\) possible responses (one value each for the short, medium, and long cones)^{*}. This phenomenon is called "metamerism".
^{*}Yes, it does turn out that that's actually the same number, but your eye isn't making the necessary weird bijective maps between them; the integration with the observer functions is pretty clearly non-injective.
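A toy demonstration of metamerism with a one-channel "observer" (everything here is invented for illustration): two different spectra whose integrals against the observer function are equal produce identical responses, so the observer cannot tell them apart.

```python
# The integration against an observer function is not injective: a flat
# spectrum and a ramp with the same area give the same response.
def response(spd_fn, observer_fn, lams):
    return sum(spd_fn(l) * observer_fn(l) for l in lams)

wavelengths = list(range(400, 701))        # 400..700 "nm"
observer = lambda lam: 1.0                 # flat toy observer function

flat_spectrum = lambda lam: 1.0
ramp_spectrum = lambda lam: 2.0 * (lam - 400) / 300.0  # same area as flat

r_flat = response(flat_spectrum, observer, wavelengths)
r_ramp = response(ramp_spectrum, observer, wavelengths)
```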
