Circular dichroism (CD) is observed when optically active matter absorbs left- and right-handed circularly polarized light slightly differently. It is measured with a CD spectropolarimeter, which can measure accurately in the far UV at wavelengths down to 190-170 nm. The difference between the left- and right-handed absorbances is very small (usually of the order of 0.0001 absorbance units), corresponding to an ellipticity of a few hundredths of a degree. The CD spectrum is the wavelength dependence of this difference in absorption between the left- and right-handed components.
The units used in CD spectroscopy often cause confusion!
The Chirascan Plus (Applied Photophysics) gives its raw output as ellipticity, given the symbol θ and measured in millidegrees (mdeg). To compare your data with those of others, the ellipticity θ is usually converted to the molar ellipticity, given the symbol [θ] (units: deg·cm²·dmol⁻¹).
Thus: [θ] = θ / (10 × c × l), where θ is the observed ellipticity in mdeg, c is the molar concentration of the sample (mol/L) and l is the pathlength in cm.
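As a minimal sketch of this conversion (the function name and the example numbers are illustrative, not taken from the instrument software):

```python
def molar_ellipticity(theta_mdeg, c_molar, pathlength_cm):
    """Convert observed ellipticity theta (mdeg) to molar ellipticity
    [theta] in deg.cm^2.dmol^-1, using [theta] = theta / (10 * c * l),
    with c the molar concentration (mol/L) and l the pathlength (cm)."""
    return theta_mdeg / (10 * c_molar * pathlength_cm)

# Illustrative values: 15 mdeg observed for a 10 uM sample in a 1 mm cell
# gives a molar ellipticity of about 1.5e6 deg.cm^2.dmol^-1.
theta_molar = molar_ellipticity(15.0, 10e-6, 0.1)
```

Note the factor of 10 in the denominator absorbs both the mdeg-to-degree conversion (÷1000) and the mol-to-decimole conversion (×100).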
Alternatively, for far-UV CD measurements on proteins, the mean residue molar ellipticity is often used, reflecting the fact that the peptide bond is the absorbing species in this case.
The mean residue molar ellipticity is given the symbol [θ]MRW, and [θ]MRW = θ / (10 × cr × l), where cr is the mean residue molar concentration.
Here cr = n × c, where n is the number of peptide bonds in the protein or peptide and, of course, c = (1000 × cg)/Mr, where cg is the macromolecule concentration in g/mL and Mr is the molecular weight of the species. Equivalently, cr = (1000 × n × cg)/Mr.
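The chain of definitions above can be sketched as follows (again, the function name and the example numbers are illustrative assumptions, not from any particular package):

```python
def mean_residue_ellipticity(theta_mdeg, cg_g_per_ml, mr, n_bonds, pathlength_cm):
    """Convert observed ellipticity theta (mdeg) to mean residue molar
    ellipticity [theta]_MRW = theta / (10 * cr * l), where
    cr = n * c and c = 1000 * cg / Mr (mol/L)."""
    c = 1000 * cg_g_per_ml / mr   # molar concentration of the macromolecule, mol/L
    cr = n_bonds * c              # mean residue molar concentration
    return theta_mdeg / (10 * cr * pathlength_cm)

# Illustrative values: -20 mdeg for a 0.1 mg/mL protein of Mr 10000
# with 100 peptide bonds in a 1 mm cell.
mre = mean_residue_ellipticity(-20.0, 1e-4, 10000, 100, 0.1)
```

A value of around -20,000 deg·cm²·dmol⁻¹ at 222 nm, as this illustrative input yields, is the magnitude typically associated with highly helical proteins.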
Fortunately, the CD spectropolarimeter will do many of these conversions for you. However, if you intend to carry out CD fitting, the different packages available often need their input in different units. Make sure your data is in the correct units!