The clock rate typically refers to the frequency at which a CPU is running. It is measured in the SI unit hertz (Hz).
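Because clock rate is a frequency, the duration of a single clock cycle is simply its reciprocal. The following minimal sketch (the function name is illustrative, not from any real library) shows this relationship:

```python
# Illustrative helper: cycle time in seconds is the reciprocal of the
# clock rate in hertz.
def clock_period_seconds(frequency_hz: float) -> float:
    return 1.0 / frequency_hz

# A 3.5 GHz CPU completes one clock cycle roughly every 0.286 nanoseconds.
print(clock_period_seconds(3.5e9))  # ~2.857e-10 s
```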
The clock rate of a CPU is normally determined by the frequency of an oscillator crystal. Typically, a crystal oscillator produces a fixed sine wave, the frequency reference signal. Electronic circuitry translates that into a square wave at the same frequency for digital electronics applications (or, when using a CPU multiplier, at some fixed multiple of the crystal reference frequency). The clock distribution network inside the CPU carries that clock signal to all the parts that need it. An analog-to-digital converter has a "clock" pin driven by a similar system to set the sampling rate.
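The multiplier relationship described above can be expressed as a one-line calculation. The sketch below is purely illustrative; the function name and example values are assumptions, not taken from any real firmware or tool:

```python
# Effective core clock derived from the crystal reference and a CPU multiplier,
# as described above: core clock = reference frequency x multiplier.
def core_clock_hz(reference_hz: float, multiplier: float) -> float:
    return reference_hz * multiplier

# e.g. a 100 MHz reference clock with a 35x multiplier gives a 3.5 GHz core clock.
print(core_clock_hz(100e6, 35) / 1e9, "GHz")
```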
In this context, the word "speed" (which denotes physical movement) should not be confused with frequency or the corresponding clock rate. Thus, the terms "clock speed" and "processor speed" are, strictly speaking, misnomers.
CPU manufacturers typically charge premium prices for CPUs that operate at higher clock rates, a practice called binning. For a given design, the clock rate of each CPU is determined at the end of the manufacturing process by actually testing that CPU. Chips that pass the tests for a given set of standards may be labeled with a higher clock rate, e.g., 1.50 GHz, while those that fail the standards for the higher clock rate but pass the standards for a lesser clock rate may be labeled with the lesser rate, e.g., 1.33 GHz, and sold at a lower price. Chip manufacturers publish a "maximum clock rate" specification, and they test chips before selling them to make sure they meet that specification, even when executing the most complicated instructions with the data patterns that take the longest to settle, at the temperature and voltage that yield the lowest performance.
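A rough sketch of the binning logic described above, assuming each part has already been tested to find the highest rate at which it runs reliably (the bin values and function name are illustrative, not any manufacturer's actual flow):

```python
from typing import Optional

# Offered clock-rate grades in GHz, highest first (illustrative values).
BINS_GHZ = [1.50, 1.33]

def assign_bin(max_stable_ghz: float) -> Optional[float]:
    """Return the highest advertised clock rate the part qualifies for, or None."""
    for rate in BINS_GHZ:
        if max_stable_ghz >= rate:
            return rate
    return None  # part fails every grade and is not sold at these ratings

# A part stable up to 1.42 GHz fails the 1.50 GHz standard but passes 1.33 GHz.
print(assign_bin(1.42))  # -> 1.33
```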