The moment magnitude scale (abbreviated as MMS; denoted as M_{w}) is used by seismologists to measure the size of earthquakes in terms of the energy released.^{[1]} The magnitude is based on the moment of the earthquake, which is equal to the rigidity of the Earth multiplied by the average amount of slip on the fault and the size of the area that slipped.^{[2]} The scale was developed in the 1970s to succeed the 1930s-era Richter magnitude scale (M_{L}). Even though the formulae are different, the new scale retains the familiar continuum of magnitude values defined by the older one. The MMS is now the scale used to estimate magnitudes for all modern large earthquakes by the United States Geological Survey.^{[3]}
The symbol for the moment magnitude scale is M_{w}, with the subscript w meaning mechanical work accomplished. The moment magnitude M_{w} is a dimensionless number defined by

M_{w} = (2/3) log_{10}(M_{0}) − 10.7,

where M_{0} is the seismic moment in dyne centimeters (10^{−7} N·m).^{[1]} The constant values in the equation are chosen to achieve consistency with the magnitude values produced by earlier scales, most importantly the local magnitude (or "Richter") scale.
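The definition can be sketched in a few lines of code. This is a minimal illustration, assuming the standard constants (a 2/3 slope and a 10.7 offset, with M_{0} given in dyne-centimeters); the function name is this example's own.

```python
import math

def moment_magnitude(m0_dyne_cm: float) -> float:
    """Moment magnitude Mw from seismic moment M0, with M0 in dyne-centimeters."""
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

# A seismic moment of 2e27 dyne-cm corresponds to roughly Mw 7.5
print(round(moment_magnitude(2e27), 1))  # 7.5
```

Because the relation is logarithmic, multiplying the seismic moment by 10^{1.5} raises the magnitude by exactly one unit.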
As with the Richter scale, an increase of 1 step on this logarithmic scale corresponds to a 10^{1.5} ≈ 31.6 times increase in the amount of energy released, and an increase of 2 steps corresponds to a 10^{3} = 1000 times increase in energy.
A closely related formula, obtained by solving the previous equation for M_{0}, allows one to assess the proportional difference f_{ΔE} in energy release between earthquakes of two different moment magnitudes, say m_{1} and m_{2}:

f_{ΔE} = 10^{1.5 (m_{1} − m_{2})}.
For example, Chile's magnitude-8.8 earthquake of February 2010 released roughly 500 times more energy than Haiti's magnitude-7.0 earthquake the month before.^{[4]} The comparison works out as follows:

f_{ΔE} = 10^{1.5 × (8.8 − 7.0)} = 10^{2.7} ≈ 501.
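The same comparison can be checked numerically. The sketch below implements the proportional energy difference between two magnitudes; `energy_ratio` is a hypothetical helper name.

```python
def energy_ratio(m1: float, m2: float) -> float:
    """Proportional difference in released energy between magnitudes m1 and m2."""
    return 10 ** (1.5 * (m1 - m2))

# Chile 2010 (magnitude 8.8) vs. Haiti 2010 (magnitude 7.0):
print(round(energy_ratio(8.8, 7.0)))      # 501, i.e. roughly 500 times more energy
# One full magnitude step:
print(round(energy_ratio(6.0, 5.0), 1))   # 31.6
```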
Potential energy is stored in the crust in the form of built-up stress. During an earthquake, this stored energy is transformed and results in cracks and deformation in rocks, heat, and radiated seismic energy.
The seismic moment M_{0} is a measure of the total amount of energy that is transformed during an earthquake. Only a small fraction of the seismic moment M_{0} is converted into radiated seismic energy E_{s}, which is what seismographs register. Using the estimate

E_{s} ≈ M_{0} / (2 × 10^{4}),

Choy and Boatwright defined in 1995 the energy magnitude

M_{e} = (2/3) log_{10}(E_{s}) − 2.9,

where E_{s} is given in N·m.
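These two relations can be combined in a short sketch. Both the estimate E_{s} ≈ M_{0}/(2 × 10^{4}) and the constants in the energy-magnitude formula are the commonly cited forms, assumed here rather than taken verbatim from this text; the function name is this example's own.

```python
import math

def energy_magnitude(m0_dyne_cm: float) -> float:
    """Energy magnitude Me from seismic moment M0 (dyne-cm), assuming the
    rough estimate Es ~= M0 / (2e4) and Me = (2/3) log10(Es) - 2.9, Es in N-m."""
    es_joules = (m0_dyne_cm * 1e-7) / 2e4  # dyne-cm -> N-m, then apply the estimate
    return (2.0 / 3.0) * math.log10(es_joules) - 2.9
```

Under this fixed ratio of radiated energy to moment, M_{e} tracks M_{w} closely, differing only by a constant offset.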
The energy released by nuclear weapons is traditionally expressed in terms of the energy stored in a kiloton or megaton of the conventional explosive trinitrotoluene (TNT).
A rule of thumb equivalence from seismology used in the study of nuclear proliferation asserts that a one-kiloton nuclear explosion creates a seismic signal with a magnitude of approximately 4.0.^{[5]} This in turn leads to the equation^{[6]}

M_{n} = (2/3) log_{10}(m_{TNT} / Mt) + 6,

where m_{TNT} is the mass of the explosive TNT that is quoted for comparison, expressed in megatons (Mt).
Such comparison figures are not very meaningful. As with earthquakes, during an underground explosion of a nuclear weapon, only a small fraction of the total amount of energy transformed ends up being radiated as seismic waves. Therefore, a seismic efficiency must be assumed for the bomb quoted in the comparison. Using the conventional specific energy of TNT (4.184 MJ/kg), the above formula implies the assumption that about 0.5% of the bomb's energy is converted into radiated seismic energy E_{s}.^{[7]} For real underground nuclear tests, the actual seismic efficiency achieved varies significantly and depends on the site and design parameters of the test.
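The rule of thumb above can be sketched numerically. The closed form used here (a 2/3 slope anchored at 1 kt ↔ magnitude 4.0, with yield in megatons) is this example's own reading of the relation, and the function name is hypothetical.

```python
import math

def tnt_to_magnitude(megatons: float) -> float:
    """Seismic magnitude implied by the '1 kiloton ~ magnitude 4.0' rule of thumb.
    Assumes the fixed ~0.5% seismic efficiency discussed in the text."""
    return (2.0 / 3.0) * math.log10(megatons) + 6.0

print(round(tnt_to_magnitude(0.001), 1))  # 1 kiloton  -> 4.0
print(round(tnt_to_magnitude(1.0), 1))    # 1 megaton  -> 6.0
```

Note that, by the energy-ratio relation above, each factor of 1000 in yield adds two magnitude units, which is why 1 kt maps to 4.0 and 1 Mt to 6.0.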
In 1935, Charles Richter and Beno Gutenberg developed the local magnitude (M_{L}) scale (popularly known as the Richter scale) with the goal of quantifying medium-sized earthquakes (between magnitude 3.0 and 7.0) in Southern California. This scale was based on the ground motion measured by a particular type of seismometer at a distance of 100 kilometres (62 mi) from the earthquake. Because of this, there is an upper limit on the highest measurable magnitude; all large earthquakes will have a local magnitude of around 7. The local magnitude's estimate of earthquake size is also unreliable for measurements taken at a distance of more than about 350 miles (600 km) from the earthquake's epicenter.^{[3]}
The moment magnitude (M_{w}) scale was introduced in 1979 by Caltech seismologists Thomas C. Hanks and Hiroo Kanamori to address these shortcomings while maintaining consistency. Thus, for medium-sized earthquakes, the moment magnitude values should be similar to Richter values. That is, a magnitude 5.0 earthquake will be about a 5.0 on both scales. This scale was based on the physical properties of the earthquake, specifically the seismic moment (M_{0}). Unlike other scales, the moment magnitude scale does not saturate at the upper end; there is no upper limit to the possible measurable magnitudes. However, this has the side-effect that the scales diverge for smaller earthquakes.^{[1]}
Moment magnitude is now the most common measure for medium to large earthquake magnitudes,^{[8]} but breaks down for smaller quakes. For example, the United States Geological Survey does not use this scale for earthquakes with a magnitude of less than 3.5, which is the great majority of quakes. For these smaller quakes, other magnitude scales are used. All magnitudes are calibrated to the M_{L} scale of Richter and Gutenberg.
Magnitude scales differ from earthquake intensity, which is the perceptible moving, shaking, and local damage experienced during a quake. The shaking intensity at a given spot depends on many factors, such as soil types, soil sublayers, depth, type of displacement, and distance from the epicenter (not counting the complications of building engineering and architectural factors). Magnitude scales, by contrast, are used to estimate only the total energy released by the quake.
The following table compares magnitudes towards the upper end of the Richter scale for major Californian earthquakes.^{[1]}
Date        M_{0} (10^{25} dyne·cm)  M_{L}  M_{w}
1933-03-11  2    6.3  6.2
1940-05-19  30   6.4  7.0
1941-07-01  0.9  5.9  6.0
1942-10-21  9    6.5  6.6
1946-03-15  1    6.3  6.0
1947-04-10  7    6.2  6.5
1948-12-04  1    6.5  6.0
1952-07-21  200  7.2  7.5
1954-03-19  4    6.2  6.4
The moment magnitude scale is a way to measure the power of earthquakes. The higher the number, the bigger the earthquake. It is based on the seismic moment of the earthquake, which is related to the total energy the earthquake releases. It is similar to the Richter scale.
The scale is logarithmic, with a base of ten, like the Richter scale.
Scale number   Earthquake effect
less than 3.5  Very weak earthquake. People do not feel it, but it is recorded by seismographs.
3.5–5.4        Generally felt by people, but it rarely causes damage.
5.4–6.0        Will not cause damage to well-designed buildings, but can damage or destroy small or poorly designed ones.
6.1–6.9        Can be destructive in areas up to about 100 kilometers across where people live.
7.0–7.9        Considered a "major earthquake" that causes a lot of damage.
8 or greater   Large and destructive earthquake that can destroy large cities.