What is the Difference Between a GHz and a Mg?
According to the International System of Units, the modernized form of the metric system, a GHz is a gigahertz, a unit of frequency equal to one billion cycles per second, while a Mg is a megagram, the official designation for the tonne (also called the metric ton), equal to 1000 kilograms. A GHz and a Mg therefore measure fundamentally different quantities: frequency and mass.

The unit GHz is commonly used in the study of electromagnetic spectra: light with a frequency of 1 GHz falls into the microwave portion of the spectrum, with a wavelength of about 30 cm. It is also used to measure the clock speed of computers, which is often a few GHz. Contrary to popular belief, the clock speed of a computer is not directly related to its computational performance; this misconception has been referred to as the Megahertz Myth. As of 2008, the best personal computers operate at a clock speed of about 3 GHz. This is sure to increase to 10 GHz and beyond in coming years as faster components are developed.