Seismometer
A seismometer is an instrument that responds to ground displacement and shaking, such as that caused by earthquakes, volcanic eruptions, and explosions. It is usually combined with a timing device and a recording device to form a seismograph.[1] The output of such a device—formerly recorded on paper (see picture) or film, now recorded and processed digitally—is a seismogram. Such data is used to locate and characterize earthquakes, and to study the internal structure of Earth.
Basic principles
A simple seismometer, sensitive to up-down motions of the Earth, is like a weight hanging from a spring, both suspended from a frame that moves along with any motion detected. The relative motion between the weight (called the mass) and the frame provides a measurement of the vertical ground motion. A rotating drum is attached to the frame and a pen is attached to the weight, thus recording any ground motion in a seismogram.
Any movement from the ground moves the frame. The mass tends not to move because of its inertia, and by measuring the movement between the frame and the mass, the motion of the ground can be determined.
Early seismometers used optical levers or mechanical linkages to amplify the small motions involved, recording on soot-covered paper or photographic paper. Modern instruments use electronics. In some systems, the mass is held nearly motionless relative to the frame by an electronic negative feedback loop. The motion of the mass relative to the frame is measured, and the feedback loop applies a magnetic or electrostatic force to keep the mass nearly motionless. The voltage needed to produce this force is the output of the seismometer, which is recorded digitally.
In other systems the weight is allowed to move, and its motion induces a voltage in a coil attached to the mass as the coil moves through the magnetic field of a magnet attached to the frame. This design is often used in a geophone, which is used in exploration for oil and gas.
Seismic observatories usually have instruments measuring three axes: north–south (y-axis), east–west (x-axis), and vertical (z-axis). If only one axis is measured, it is usually the vertical because it is less noisy and gives better records of some seismic waves.[citation needed]
The foundation of a seismic station is critical.[2] A professional station is sometimes mounted on bedrock. The best mountings may be in deep boreholes, which avoid thermal effects, ground noise and tilting from weather and tides. Other instruments are often mounted in insulated enclosures on small buried piers of unreinforced concrete. Reinforcing rods and aggregates would distort the pier as the temperature changes. A site is always surveyed for ground noise with a temporary installation before pouring the pier and laying conduit. Originally, European seismographs were placed in a particular area after a destructive earthquake. Today, they are spread to provide appropriate coverage (in the case of weak-motion seismology) or concentrated in high-risk regions (strong-motion seismology).[3]
Nomenclature
The word derives from the Greek σεισμός, seismós, a shaking or quake, from the verb σείω, seíō, to shake; and μέτρον, métron, measure. It was coined by David Milne-Home in 1841, to describe an instrument designed by Scottish physicist James David Forbes.[4]
Seismograph is another Greek term from seismós and γράφω, gráphō, to draw. It is often used to mean seismometer, though it is more applicable to the older instruments in which the measuring and recording of ground motion were combined, than to modern systems, in which these functions are separated. Both types provide a continuous record of ground motion; this record distinguishes them from seismoscopes, which merely indicate that motion has occurred, perhaps with some simple measure of how large it was.[5]
The technical discipline concerning such devices is called seismometry,[6] a branch of seismology.
The concept of measuring the "shaking" of something means that the word "seismograph" might be used in a more general sense. For example, a monitoring station that tracks changes in electromagnetic noise affecting amateur radio waves has been presented as an "RF seismograph",[7] and helioseismology studies the "quakes" on the Sun.[8]
History
The first seismometer was made in China during the 2nd century.[9] It was invented by Zhang Heng, a Chinese mathematician and astronomer. The first Western description of the device comes from the French physicist and priest Jean de Hautefeuille in 1703.[10] The modern seismometer was developed in the 19th century.[3]
Seismometers were placed on the Moon starting in 1969 as part of the Apollo Lunar Surface Experiments Package. In December 2018, a seismometer was deployed on the planet Mars by the InSight lander, the first time a seismometer was placed onto the surface of another planet.[11]
Ancient era
In Ancient Egypt, Amenhotep, son of Hapu, invented a precursor of the seismometer: vertical wooden poles connected by wooden gutters on a central axis, which channelled water into a vessel; the vessel filling indicated an earthquake.
In AD 132, Zhang Heng of China's Han dynasty is said to have invented the first seismoscope (by the definition above), which was called Houfeng Didong Yi (translated as, "instrument for measuring the seasonal winds and the movements of the Earth"). The description we have, from the History of the Later Han Dynasty, says that it was a large bronze vessel, about 2 meters in diameter; at eight points around the top were dragon's heads holding bronze balls. When there was an earthquake, one of the dragons' mouths would open and drop its ball into a bronze toad at the base, making a sound and supposedly showing the direction of the earthquake. On at least one occasion, probably at the time of a large earthquake in Gansu in AD 143, the seismoscope indicated an earthquake even though one was not felt. The available text says that inside the vessel was a central column that could move along eight tracks; this is thought to refer to a pendulum, though it is not known exactly how this was linked to a mechanism that would open only one dragon's mouth. The first earthquake recorded by this seismoscope was supposedly "somewhere in the east". Days later, a rider from the east reported this earthquake.[9][12]
Early designs (1259–1839)
By the 13th century, seismographic devices existed in the Maragheh observatory (founded 1259) in Persia, though it is unclear whether these were constructed independently or based on the first seismoscope.[13] French physicist and priest Jean de Hautefeuille described a seismoscope in 1703,[10] which used a bowl filled with mercury which would spill into one of eight receivers equally spaced around the bowl, though there is no evidence that he actually constructed the device.[14] A mercury seismoscope was constructed in 1784 or 1785 by Atanasio Cavalli,[15] a copy of which can be found at the University Library in Bologna, and a further mercury seismoscope was constructed by Niccolò Cacciatore in 1818.[14] James Lind also built a seismological tool of unknown design or efficacy (known as an earthquake machine) in the late 1790s.[16]
Pendulum devices were being developed at the same time. Neapolitan naturalist Nicola Cirillo set up a network of pendulum earthquake detectors following the 1731 Puglia earthquake, where the amplitude was detected using a protractor to measure the swinging motion. Benedictine monk Andrea Bina further developed this concept in 1751, having the pendulum create trace marks in sand under the mechanism, providing both magnitude and direction of motion. Neapolitan clockmaker Domenico Salsano produced a similar pendulum which recorded using a paintbrush in 1783, labelling it a geo-sismometro, possibly the first use of a word similar to seismometer. Naturalist Nicolo Zupo devised an instrument to detect electrical disturbances and earthquakes at the same time (1784).[14]
The first moderately successful device for detecting the time of an earthquake was devised by Ascanio Filomarino in 1796, who improved upon Salsano's pendulum instrument, using a pencil to mark, and using a hair attached to the mechanism to inhibit the motion of a clock's balance wheel. This meant that the clock would only start once an earthquake took place, allowing determination of the time of incidence.[14]
After an earthquake on October 4, 1834, Luigi Pagani observed that the mercury seismoscope held at Bologna University had completely spilled over and did not provide useful information. He therefore devised a portable device that used lead shot to detect the direction of an earthquake, where the lead fell into four bins arranged in a circle, to determine the quadrant of earthquake incidence. He completed the instrument in 1841.[14]
Early Modern designs (1839–1880)
In response to a series of earthquakes near Comrie in Scotland in 1839, a committee was formed in the United Kingdom to produce better detection devices for earthquakes. The outcome was an inverted pendulum seismometer constructed by James David Forbes, first presented in a report by David Milne-Home in 1842, which recorded seismic activity using a pencil placed on paper above the pendulum. The designs provided did not prove effective, according to Milne's reports.[14] It was Milne who coined the word seismometer in 1841, to describe this instrument.[4] In 1843, the first horizontal pendulum was used in a seismometer, reported by Milne (though it is unclear if he was the original inventor).[17] After these inventions, Robert Mallet published an 1848 paper proposing ideas for seismometer design, arguing that such a device would need to register time, record amplitudes horizontally and vertically, and ascertain direction. His suggested design was funded, and construction was attempted, but his final design did not fulfill his expectations and suffered from the same problems as the Forbes design, being inaccurate and not self-recording.[17]
Karl Kreil constructed a seismometer in Prague between 1848 and 1850, which used a point-suspended rigid cylindrical pendulum covered in paper, drawn upon by a fixed pencil. The cylinder was rotated every 24 hours, providing an approximate time for a given quake.[14]
Luigi Palmieri, influenced by Mallet's 1848 paper,[17] invented a seismometer in 1856 that could record the time of an earthquake. This device used metallic pendulums which closed an electric circuit with vibration, which then powered an electromagnet to stop a clock. Palmieri seismometers were widely distributed and used for a long time.[18]
By 1872, a committee in the United Kingdom led by James Bryce expressed their dissatisfaction with the currently available seismometers, still using the large 1842 Forbes device located in Comrie Parish Church, and requested a seismometer which was compact, easy to install and easy to read. In 1875 they settled on a large example of the Mallet device, consisting of an array of cylindrical pins of various sizes installed at right angles to each other on a sand bed, where larger earthquakes would knock down larger pins. This device was constructed in 'Earthquake House' near Comrie, which can be considered the world's first purpose-built seismological observatory.[17] As of 2013, no earthquake has been large enough to cause any of the cylinders to fall in either the original device or replicas.
The first seismographs (1880-)
Seismographs proper were invented in the 1870s and 1880s. The first was produced by Filippo Cecchi in around 1875. A seismoscope would trigger the device to begin recording, and a recording surface would then trace a graphical record of the tremors (a seismogram) automatically. However, the instrument was not sensitive enough, and the first seismogram it produced came in 1887, by which time John Milne had already demonstrated his design in Japan.[19]

In 1880, the first horizontal pendulum seismometer was developed by the team of John Milne, James Alfred Ewing and Thomas Gray, who worked as foreign-government advisors in Japan from 1880 to 1895.[3] Milne, Ewing and Gray, all having been hired by the Meiji Government in the previous five years to assist Japan's modernization efforts, founded the Seismological Society of Japan in response to an earthquake that took place on February 22, 1880, at Yokohama (the Yokohama earthquake). Two instruments were constructed by Ewing over the next year, one being a common-pendulum seismometer and the other being the first seismometer using a damped horizontal pendulum. The innovative recording system allowed for a continuous record, the first to do so. The first seismogram was recorded on 3 November 1880 on both of Ewing's instruments.[19] Modern seismometers would eventually descend from these designs. Milne has been referred to as the 'Father of modern seismology'[20] and his seismograph design has been called the first modern seismometer.[21]
This produced the first effective measurement of horizontal motion. Gray would produce the first reliable method for recording vertical motion, which produced the first effective 3-axis recordings.[19]
An early special-purpose seismometer consisted of a large, stationary pendulum with a stylus on the bottom. As the earth started to move, the heavy mass of the pendulum had the inertia to stay still within the frame, so the stylus scratched a pattern corresponding to the Earth's movement. This type of strong-motion seismometer recorded on smoked glass (glass coated with carbon soot). While not sensitive enough to detect distant earthquakes, this instrument could indicate the direction of the pressure waves and thus help find the epicenter of a local quake. Such instruments were useful in the analysis of the 1906 San Francisco earthquake. Further analysis was performed in the 1980s, using these early recordings, enabling a more precise determination of the initial fault break location in Marin County and its subsequent progression, mostly to the south.
Later, professional suites of instruments for the worldwide standard seismographic network had one set of instruments tuned to oscillate at fifteen seconds, and the other at ninety seconds, each set measuring in three directions. Amateurs or observatories with limited means tuned their smaller, less sensitive instruments to ten seconds. The basic damped horizontal pendulum seismometer swings like the gate of a fence. A heavy weight is mounted on the point of a long (from 10 cm to several meters) triangle, hinged at its vertical edge. As the ground moves, the weight stays unmoving, swinging the "gate" on the hinge.
The advantage of a horizontal pendulum is that it achieves very low frequencies of oscillation in a compact instrument. The "gate" is slightly tilted, so the weight tends to return slowly to a central position. The pendulum is adjusted (before the damping is installed) to oscillate once per three seconds, or once per thirty seconds. The general-purpose instruments of small stations or amateurs usually oscillate once per ten seconds. A pan of oil is placed under the arm, and a small sheet of metal mounted on the underside of the arm drags in the oil to damp oscillations. The level of oil, position on the arm, and angle and size of the sheet are adjusted until the damping is "critical", that is, the arm returns to rest as quickly as possible without oscillating. The hinge is very low friction, often torsion wires, so the only friction is the internal friction of the wire. Small seismographs with low proof masses are placed in a vacuum to reduce disturbances from air currents.
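The effect of the critical adjustment can be illustrated with a toy free-decay simulation (illustrative values only: a ten-second pendulum released from rest): an underdamped arm overshoots and rings for many cycles, while a critically damped one returns to rest without ever crossing the center position.

```python
import numpy as np

def free_decay(zeta, f0=0.1, x0=1.0, dt=0.01, t_end=60.0):
    """Free response of a pendulum-like oscillator
    x'' + 2*zeta*w0*x' + w0^2*x = 0, released from rest at x0.
    f0 = 0.1 Hz mimics a ten-second instrument (illustrative)."""
    w0 = 2 * np.pi * f0
    x, v = x0, 0.0
    xs = []
    for _ in range(int(t_end / dt)):
        a = -2 * zeta * w0 * v - w0 ** 2 * x
        v += a * dt
        x += v * dt
        xs.append(x)
    return np.array(xs)

under = free_decay(zeta=0.2)     # too little oil: rings for many cycles
critical = free_decay(zeta=1.0)  # critical damping: no overshoot
```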
Zöllner described torsionally suspended horizontal pendulums as early as 1869, but developed them for gravimetry rather than seismometry.
Early seismometers had an arrangement of levers on jeweled bearings, to scratch smoked glass or paper. Later, mirrors reflected a light beam to a direct-recording plate or roll of photographic paper. Briefly, some designs returned to mechanical movements to save money. In mid-twentieth-century systems, the light was reflected to a pair of differential electronic photosensors called a photomultiplier. The voltage generated in the photomultiplier was used to drive galvanometers which had a small mirror mounted on the axis. The moving reflected light beam would strike the surface of the turning drum, which was covered with photo-sensitive paper. The expense of developing photo-sensitive paper caused many seismic observatories to switch to ink or thermal-sensitive paper.
After World War II, the seismometers developed by Milne, Ewing and Gray were adapted into the widely used Press-Ewing seismometer.
Modern instruments

Modern instruments use electronic sensors, amplifiers, and recording devices. Most are broadband covering a wide range of frequencies. Some seismometers can measure motions with frequencies from 500 Hz to 0.00118 Hz (1/500 = 0.002 seconds per cycle, to 1/0.00118 = 850 seconds per cycle). The mechanical suspension for horizontal instruments remains the garden-gate described above. Vertical instruments use some kind of constant-force suspension, such as the LaCoste suspension. The LaCoste suspension uses a zero-length spring to provide a long period (high sensitivity).[22][23] Some modern instruments use a "triaxial" or "Galperin" design, in which three identical motion sensors are set at the same angle to the vertical but 120 degrees apart on the horizontal. Vertical and horizontal motions can be computed from the outputs of the three sensors.
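The reconstruction of vertical and horizontal motion from a Galperin arrangement is a small linear-algebra exercise. In this sketch each sensor axis is inclined arccos(1/√3) ≈ 54.7° from vertical with azimuths 120° apart; real instruments also apply calibrated gains and orientation corrections.

```python
import numpy as np

# Rows of A are the three sensor axis unit vectors in
# (east, north, up) coordinates.
theta = np.arccos(1 / np.sqrt(3))      # ~54.7 deg from vertical
az = np.radians([0.0, 120.0, 240.0])   # azimuths 120 deg apart
A = np.array([[np.sin(theta) * np.cos(p),
               np.sin(theta) * np.sin(p),
               np.cos(theta)] for p in az])

def uvw_to_enz(u, v, w):
    """Map the three oblique sensor outputs back to east, north,
    vertical by inverting the geometry (calibration ignored)."""
    return np.linalg.solve(A, np.array([u, v, w]))

# Purely vertical motion excites all three sensors equally, each by
# cos(theta) = 1/sqrt(3); the inverse mapping recovers (0, 0, 1).
e, n, z = uvw_to_enz(1 / np.sqrt(3), 1 / np.sqrt(3), 1 / np.sqrt(3))
```

Because the three sensors are identical and symmetric, manufacturing and calibration are simpler than with dedicated vertical and horizontal units.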
Seismometers unavoidably introduce some distortion into the signals they measure, but professionally designed systems have carefully characterized frequency transforms.
Modern sensitivities come in three broad ranges: geophones, 50 to 750 V/m; local geologic seismographs, about 1,500 V/m; and teleseismographs, used for world survey, about 20,000 V/m. Instruments come in three main varieties: short-period, long-period and broadband. The short- and long-period instruments measure velocity and are very sensitive; however they 'clip' the signal or go off-scale for ground motion that is strong enough to be felt by people. A 24-bit analog-to-digital conversion channel is commonplace. Practical devices are linear to roughly one part per million.
Delivered seismometers come with two styles of output: analog and digital. Analog seismographs require analog recording equipment, possibly including an analog-to-digital converter. The output of a digital seismograph can be simply input to a computer. It presents the data in a standard digital format (often "SE2" over Ethernet).
Teleseismometers
The modern broadband seismograph can record a very broad range of frequencies. It consists of a small "proof mass", confined by electrical forces, driven by sophisticated electronics. As the earth moves, the electronics attempt to hold the mass steady through a feedback circuit. The amount of force necessary to achieve this is then recorded.
In most designs the electronics holds a mass motionless relative to the frame. This device is called a "force balance accelerometer". It measures acceleration instead of velocity of ground movement. Basically, the distance between the mass and some part of the frame is measured very precisely, by a linear variable differential transformer. Some instruments use a linear variable differential capacitor.
That measurement is then amplified by electronic amplifiers attached to parts of an electronic negative feedback loop. One of the amplified currents from the negative feedback loop drives a coil very like a loudspeaker. The result is that the mass stays nearly motionless.
Most instruments measure the ground motion directly using the distance sensor. The voltage generated in a sense coil on the mass by the magnet directly measures the instantaneous velocity of the ground. The current to the drive coil provides a sensitive, accurate measurement of the force between the mass and frame, thus directly measuring the ground's acceleration (via F = ma, where F is force, m is mass, and a is acceleration).
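A force-balance loop of this kind can be caricatured in a few lines. In the toy model below (all parameter values are invented for illustration), a displacement sensor reads the mass-frame offset and a feedback "coil" pushes back proportionally; once the loop settles, the recorded feedback signal tracks the ground acceleration.

```python
import numpy as np

def force_balance(ground_acc, dt, m=0.1, k=40.0, c=30.0, gain=5e3):
    """Toy force-balance accelerometer: a displacement sensor reads
    the mass-frame offset x, and a feedback coil pushes back with
    force gain*x (all values invented for illustration). The output,
    the feedback force per unit mass, tracks the ground acceleration
    once the loop settles."""
    x, v = 0.0, 0.0
    out = np.zeros_like(ground_acc)
    for i, a_g in enumerate(ground_acc):
        f_fb = gain * x                          # feedback coil force
        a = -a_g - (k * x + c * v + f_fb) / m    # mass-frame dynamics
        v += a * dt
        x += v * dt
        out[i] = -f_fb / m                       # recorded signal
    return out

dt = 1e-4
t = np.arange(0.0, 2.0, dt)
acc = 0.02 * np.sin(2 * np.pi * 2 * t)  # 2 Hz ground acceleration
rec = force_balance(acc, dt)
```

The high loop gain keeps the mass nearly motionless (x stays tiny), which is exactly the behavior the feedback electronics are designed to enforce.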
One of the continuing problems with sensitive vertical seismographs is the buoyancy of their masses. The uneven changes in pressure caused by wind blowing on an open window can easily change the density of the air in a room enough to cause a vertical seismograph to show spurious signals. Therefore, most professional seismographs are sealed in rigid gas-tight enclosures. For example, this is why a common Streckeisen model has a thick glass base that must be glued to its pier without bubbles in the glue.
It might seem logical to make the heavy magnet serve as a mass, but that subjects the seismograph to errors when the Earth's magnetic field moves. This is also why a seismograph's moving parts are constructed from materials that interact minimally with magnetic fields. A seismograph is also sensitive to changes in temperature, so many instruments are constructed from low-expansion materials such as nonmagnetic invar.
The hinges on a seismograph are usually patented, and by the time the patent has expired, the design has been improved. The most successful public domain designs use thin foil hinges in a clamp.
Another issue is that the transfer function of a seismograph must be accurately characterized, so that its frequency response is known. This is often the crucial difference between professional and amateur instruments. Most are characterized on a variable frequency shaking table.
Strong-motion seismometers
Another type of seismometer is a digital strong-motion seismometer, or accelerograph. The data from such an instrument is essential for understanding how an earthquake affects man-made structures, through earthquake engineering. The recordings of such instruments are crucial for the assessment of seismic hazard, through engineering seismology.
A strong-motion seismometer measures acceleration. This can be mathematically integrated later to give velocity and position. Strong-motion seismometers are not as sensitive to ground motions as teleseismic instruments but they stay on scale during the strongest seismic shaking.
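The integration step is straightforward numerically. This sketch applies trapezoidal integration to a synthetic record (real strong-motion processing would also remove the instrument response, baseline offsets and drift):

```python
import numpy as np

def integrate_record(acc, dt):
    """Trapezoidal integration of an accelerogram to velocity and
    displacement (a sketch; practical processing also detrends and
    filters the record first)."""
    vel = np.concatenate(([0.0], np.cumsum((acc[1:] + acc[:-1]) / 2) * dt))
    disp = np.concatenate(([0.0], np.cumsum((vel[1:] + vel[:-1]) / 2) * dt))
    return vel, disp

dt = 0.01
t = np.arange(0.0, 10.0, dt)
acc = np.sin(2 * np.pi * t)        # synthetic 1 Hz acceleration record
vel, disp = integrate_record(acc, dt)
```

Small baseline errors in the acceleration grow linearly in velocity and quadratically in displacement, which is why the detrending step matters in practice.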
Strong motion sensors are used for intensity meter applications.
Other forms
Accelerographs and geophones are often heavy cylindrical magnets with a spring-mounted coil inside. As the case moves, the coil tends to stay stationary, so the magnetic field cuts the wires, inducing current in the output wires. They receive frequencies from several hundred hertz down to 1 Hz. Some have electronic damping, a low-budget way to get some of the performance of the closed-loop wide-band geologic seismographs.
Strain-beam accelerometers constructed as integrated circuits are too insensitive for geologic seismographs (2002), but are widely used in geophones.
Some other sensitive designs measure the current generated by the flow of a non-corrosive ionic fluid through an electret sponge or a conductive fluid through a magnetic field.
Interconnected seismometers
Seismometers spaced in a seismic array can also be used to precisely locate, in three dimensions, the source of an earthquake, using the time it takes for seismic waves to propagate away from the hypocenter, the initiating point of fault rupture (see also Earthquake location). Interconnected seismometers are also used, as part of the International Monitoring System, to detect underground nuclear test explosions, as well as for earthquake early warning systems. These seismometers are often used as part of a large-scale governmental or scientific project, but some organizations, such as the Quake-Catcher Network, use residential-size detectors built into computers to detect earthquakes as well.
In reflection seismology, an array of seismometers images sub-surface features. The data are reduced to images using algorithms similar to tomography. The data-reduction methods resemble those of computer-aided tomographic medical imaging (CAT scans), or imaging sonars.
A worldwide array of seismometers can actually image the interior of the Earth in wave-speed and transmissivity. This type of system uses events such as earthquakes, impact events or nuclear explosions as wave sources. The first efforts at this method used manual data reduction from paper seismograph charts. Modern digital seismograph records are better adapted to direct computer use. With inexpensive seismometer designs and internet access, amateurs and small institutions have even formed a "public seismograph network".[24]
Seismographic systems used for petroleum or other mineral exploration historically used an explosive and a wireline of geophones unrolled behind a truck. Now most short-range systems use "thumpers" that hit the ground, and some small commercial systems have such good digital signal processing that a few sledgehammer strikes provide enough signal for short-distance refractive surveys. Exotic cross or two-dimensional arrays of geophones are sometimes used to perform three-dimensional reflective imaging of subsurface features. Basic linear refractive geomapping software (once a black art) is available off-the-shelf, running on laptop computers, using strings as small as three geophones. Some systems now come in an 18" (0.5 m) plastic field case with a computer, display and printer in the cover.
Small seismic imaging systems are now sufficiently inexpensive to be used by civil engineers to survey foundation sites, locate bedrock, and find subsurface water.
Fiber optic cables as seismometers
A technique for detecting earthquakes using fiber optic cables has been developed.[25] In 2016 a team of metrologists running frequency metrology experiments in England observed noise with a waveform resembling the seismic waves generated by earthquakes. This was found to match seismological observations of an Mw 6.0 earthquake in Italy, ~1400 km away. Further experiments in England, Italy, and with a submarine fiber optic cable to Malta detected additional earthquakes, including one 4,100 km away, and an ML 3.4 earthquake 89 km away from the cable.
Seismic waves are detectable because they cause micrometer-scale changes in the length of the cable. As the length changes so does the time it takes a packet of light to traverse to the far end of the cable and back (using a second fiber). Using ultra-stable metrology-grade lasers, these extremely minute shifts of timing (on the order of femtoseconds) appear as phase-changes.
The point of the cable first disturbed by an earthquake's p wave (essentially a sound wave in rock) can be determined by sending packets in both directions in the looped pair of optical fibers; the difference in the arrival times of the first pair of perturbed packets indicates the distance along the cable. This point is also the point closest to the earthquake's epicenter, which should be on a plane perpendicular to the cable. The difference between the P wave/S wave arrival times provides a distance (under ideal conditions), constraining the epicenter to a circle. A second detection on a non-parallel cable is needed to resolve the ambiguity of the resulting solution. Additional observations constrain the location of the earthquake's epicenter, and may resolve the depth.
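The two timing relations above reduce to simple formulas. In this sketch, the fiber refractive index and the crustal P and S wave speeds are typical assumed values, not measurements from the cited experiments:

```python
# Illustrative constants, not values from the cited experiments.
C_VACUUM = 299_792.458        # speed of light in vacuum, km/s
V_FIBER = C_VACUUM / 1.468    # group velocity in fiber (index ~1.468), km/s

def disturbance_position(delta_t_s, cable_length_km):
    """Packets launched simultaneously around the looped fiber pair
    pass the perturbed point at times differing by delta_t; the
    point therefore lies V_FIBER*delta_t/2 from the cable midpoint."""
    return cable_length_km / 2 + V_FIBER * delta_t_s / 2

def sp_distance(ts_minus_tp_s, vp=6.0, vs=3.5):
    """Classic S-minus-P distance estimate for assumed crustal
    P and S wave speeds vp, vs (km/s)."""
    return ts_minus_tp_s / (1 / vs - 1 / vp)

# A 50 microsecond packet-timing difference on a 100 km cable:
pos = disturbance_position(delta_t_s=50e-6, cable_length_km=100.0)   # km
# A 10 s S-minus-P delay constrains the epicenter to a circle:
dist = sp_distance(ts_minus_tp_s=10.0)                               # km
```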
This technique is expected to be a boon in observing earthquakes, especially the smaller ones, in vast portions of the global ocean where there are no seismometers, and at much lower cost than ocean-bottom seismometers.
Deep learning
Researchers at Stanford University created a deep-learning algorithm called UrbanDenoiser that can detect earthquakes, particularly in urban areas.[26] The algorithm filters background noise out of seismic data gathered in busy cities, making earthquakes easier to detect.[26][27]
Recording


Today, the most common recorder is a computer with an analog-to-digital converter, a disk drive and an internet connection; for amateurs, a PC with a sound card and associated software is adequate. Most systems record continuously, but some record only when a signal is detected, as shown by a short-term increase in the variation of the signal, compared to its long-term average (which can vary slowly because of changes in seismic noise)[citation needed], also known as a STA/LTA trigger.
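An STA/LTA trigger of this kind can be sketched as follows: a simplified symmetric-window version, with typical but illustrative window lengths and threshold (production detectors, such as those in ObsPy, use causal windows and more careful normalization).

```python
import numpy as np

def sta_lta(signal, dt, sta_win=1.0, lta_win=30.0, threshold=4.0):
    """Simplified STA/LTA detector: ratio of the short-term average
    of signal energy to the long-term average; a trigger is declared
    when the ratio exceeds `threshold`. Symmetric windows are used
    here for brevity; real detectors use causal ones."""
    e = signal.astype(float) ** 2
    ns, nl = int(sta_win / dt), int(lta_win / dt)
    sta = np.convolve(e, np.ones(ns) / ns, mode="same")
    lta = np.convolve(e, np.ones(nl) / nl, mode="same")
    ratio = sta / np.maximum(lta, 1e-20)
    return ratio, np.flatnonzero(ratio > threshold)

# Synthetic record: background noise with a burst arriving at t = 60 s.
rng = np.random.default_rng(0)
dt = 0.01
x = rng.normal(0.0, 1.0, int(120 / dt))
x[6000:6500] += rng.normal(0.0, 10.0, 500)  # the "event"
ratio, hits = sta_lta(x, dt)                # hits cluster at the event
```

Because the long-term average adapts to slow changes in background noise, the same threshold works across quiet and noisy periods.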
Prior to the availability of digital processing of seismic data in the late 1970s, records were made in several forms on different types of media. A "Helicorder" drum recorded data on photographic paper or on paper with ink. A "Develocorder" recorded data from up to 20 channels onto 16 mm film, which could be viewed with a machine; reading and measuring from these media was done by hand. After digital processing was introduced, archives of seismic data were recorded on magnetic tape. Due to the deterioration of older magnetic tape media, large numbers of waveforms from these archives are not recoverable.[28][29]
References
[edit]- ^ Agnew, Duncan Carr (2003). "Ch. 1: History of Seismology". International Handbook of Earthquake & Engineering Seismology. Vol. Part A. pp. 3–11. ISBN 978-0-12-440652-0. LCCN 2002103787.
- ^ Erhard Wielandt's 'Seismic Sensors and their Calibration' Archived 2010-09-24 at the Wayback Machine- Current (2002) reference by a widely consulted expert.
- ^ a b c Reitherman, Robert (2012). Earthquakes and Engineers: an International History. Reston, VA: ASCE Press. pp. 122–125. ISBN 978-0-7844-1071-4. Archived from the original on 2012-07-26.
- ^ a b Ben-Menahem, A. (2009). Historical Encyclopedia of Natural and Mathematical Sciences, Volume 1. Springer. p. 2657. ISBN 978-3-540-68831-0. Retrieved 28 August 2012.
- ^ Richter, C.F. (1958). Elementary Seismology. San Francisco: W.H. Freeman.
- ^ William H.K. Lee; Paul Jennings; Carl Kisslinger; Hiroo Kanamori (27 September 2002). International Handbook of Earthquake & Engineering Seismology. Academic Press. p. 283. ISBN 978-0-08-048922-3. Retrieved 29 April 2013.
- ^ "The RF Seismograph". nsarc.ca. Archived from the original on 1 December 2017. Retrieved 28 March 2018.
- ^ "The Singing Sun". solar-center.stanford.edu. Retrieved 28 March 2018.
- ^ a b Sleeswyk AW, Sivin N (1983). "Dragons and toads: the Chinese seismoscope of AD 132". Chinese Science. 6: 1–19.
- ^ a b Joseph Needham (1985). Science and Civilisation in China: Paper and Printing. Cambridge University Press. p. 122. ISBN 978-0-521-08690-5. Retrieved 16 April 2013.
- ^ Cook, Jia-Rui; Good, Andrew (19 December 2018). "NASA's InSight Places First Instrument on Mars". NASA. Retrieved 20 December 2018.
- ^ Needham, Joseph (1959). Science and Civilization in China, Volume 3: Mathematics and the Sciences of the Heavens and the Earth. Cambridge: Cambridge University Press. pp. 626–635. Bibcode:1959scc3.book.....N.
- ^ Szczepanski, Kallie. "The invention of the Seismoscope | The Asian Age Online, Bangladesh". The Asian Age. Retrieved 2022-10-12.
- ^ a b c d e f g Oldroyd, David; Amador, F.; Kozák, Jan; Carneiro, Ana; Pinto, Manuel (2007-01-01). "The Study of Earthquakes in the Hundred Years Following Lisbon Earthquake of 1755". Earth Sciences History. 26 (2): 321–370. Bibcode:2007ESHis..26..321O. doi:10.17704/eshi.26.2.h9v2708334745978.
- ^ Ferrari, Graziano (1997-01-01). "Cultural and scientific value of seismology's heritage in Europe: why and how to preserve". Cah. Cent. Europ. Geodyn. Seismol. 13: 1–21.
- ^ Hart, Scott de (2013-07-22). Shelley Unbound: Discovering Frankenstein's True Creator. Feral House. p. 39. ISBN 978-1-936239-64-1.
- ^ a b c d Musson, R. M. W. (2013-06-01). "A history of British seismology". Bulletin of Earthquake Engineering. 11 (3): 715–861. Bibcode:2013BuEE...11..715M. doi:10.1007/s10518-013-9444-5. ISSN 1573-1456. S2CID 110740854.
- ^ "Seismographen". Archived from the original on 2011-03-18. Retrieved 2011-02-18.
- ^ a b c Batlló, Josep (2021). "Historical Seismometer". In Beer, Michael; Kougioumtzoglou, Ioannis A.; Patelli, Edoardo; Siu-Kui Au, Ivan (eds.). Encyclopedia of Earthquake Engineering. Berlin, Heidelberg: Springer. pp. 1–31. doi:10.1007/978-3-642-36197-5_171-1. ISBN 978-3-642-36197-5. Retrieved 2022-10-17.
- ^ Herbert-Gustar, A. L.; Nott, Patrick A. (1980). John Milne: father of modern seismology.
- ^ "Who Invented the Seismograph?". Retrieved 2022-10-12.
- ^ "Physics of the Zero-Length Spring of Geoscience". physics.mercer.edu. Retrieved 28 March 2018.
- ^ "A Biography of Lucien LaCoste, inventor of the zero-length spring". Archived from the original on March 20, 2007.
- ^ "Redwood City Public Seismic Network". psn.quake.net. Archived from the original on 26 March 2018. Retrieved 28 March 2018.
- ^ Marra, Giuseppe; Clivati, Cecilia; Luckett, Richard; Tampellini, Anna; Kronjäger, Jochen; Wright, Louise; Mura, Alberto; Levi, Filippo; Robinson, Stephen; Xuereb, André; Baptie, Brian; Calonico, Davide (3 August 2018), "Ultrastable laser interferometry for earthquake detection with terrestrial and submarine cables", Science, 361 (6401): 486–490, doi:10.1126/science.aat4458, hdl:11696/59747, PMID 29903881.
- ^ a b Yang, Lei; Liu, Xin; Zhu, Weiqiang; Zhao, Liang; Beroza, Gregory C. (2022-04-15). "Toward improved urban earthquake monitoring through deep-learning-based noise suppression". Science Advances. 8 (15) eabl3564. Bibcode:2022SciA....8L3564Y. doi:10.1126/sciadv.abl3564. ISSN 2375-2548. PMC 9007499. PMID 35417238.
- ^ "A deep-learning algorithm could detect earthquakes by filtering out city noise". MIT Technology Review. Retrieved 2022-04-17.
- ^ Hutton, Kate; Yu, Ellen. "NEWS FLASH!! SCSN Earthquake Catalog Completed!!" (PDF). Seismological Laboratory, Caltech. Archived from the original (PDF) on 14 July 2014. Retrieved 4 July 2014.
- ^ Fogleman, Kent A.; Lahr, John C.; Stephens, Christopher D.; Page, Robert A. (June 1993). Earthquake Locations Determined by the Southern Alaska Seismograph Network for October 1971 through May 1989 (Report). United States Geological Survey.
External links
- The history of early seismometers
- The Lehman amateur seismograph, from Scientific American Archived 2009-02-04 at the Wayback Machine – not designed for calibrated measurement.
- Sean Morrisey's professional design of an amateur teleseismograph; also see Keith Payea's version (both accessed 2010-09-29). Morrissey was a professional seismographic instrument engineer. This superior design uses a zero-length spring to achieve a 60-second period, active feedback, and a uniquely convenient variable reluctance differential transducer, with parts scavenged from a hardware store. The frequency transform is carefully designed, unlike in most amateur instruments. Morrissey is deceased, but the site remains up as a public service.
- SeisMac is a free tool for recent Macintosh laptop computers that implements a real-time three-axis seismograph.
- The Development Of Very-Broad-Band Seismography: Quanterra And The Iris Collaboration Archived 2016-08-10 at the Wayback Machine discusses the history of development of the primary technology in global earthquake research.
- Video of seismograph at Hawaiian Volcano Observatory – on Flickr – retrieved on 2009-06-15.
- Seismoscope – Research References 2012
- Iris EDU – How Does A Seismometer Work?
- Seismometers, seismographs, seismograms – what's the difference? How do they work? – USGS
Fundamentals
Basic Principles of Operation
A seismometer is an instrument designed to measure ground motion caused by seismic waves, while a seismograph encompasses both the measurement device and a system for recording that motion.[9] The core operating principle relies on inertia: a suspended mass within the instrument tends to remain stationary relative to an inertial frame, even as the supporting structure—anchored to the ground—moves with seismic vibrations, thereby converting ground displacement into a measurable relative motion of the mass.[10] This relative displacement is given by z(t) = u(t) − x(t), where u(t) represents the displacement of the instrument frame (following ground motion) and x(t) is the absolute displacement of the mass.[11] Seismometers detect both horizontal and vertical components of ground motion. Horizontal motion is typically measured using pendulum-based systems, where gravity provides the restoring force to the suspended mass, while vertical motion employs spring-mass systems to counteract gravitational effects and isolate the inertial response.[9] To prevent excessive oscillations and ensure accurate measurement, damping mechanisms—such as viscous fluids or electromagnetic devices—are incorporated to dissipate energy from the mass's motion.
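The inertial principle and the role of damping can be illustrated with a small numerical sketch of a mass-spring-damper seismometer; all parameter values here are illustrative, not taken from any particular instrument:

```python
import math

# Illustrative seismometer model: z'' + 2*zeta*w0*z' + w0**2 * z = -u''(t),
# where z is the mass displacement relative to the frame and u is the
# ground displacement carried by the frame.
f0 = 1.0                        # natural frequency (Hz), hypothetical
w0 = 2 * math.pi * f0
zeta = 0.7                      # damping ratio near the commonly chosen ~0.7
f_g, A = 20.0, 1e-3             # 20 Hz ground motion, 1 mm amplitude
w = 2 * math.pi * f_g

dt, T = 1e-4, 3.0
z, v = 0.0, 0.0
peak = 0.0
for i in range(int(T / dt)):
    t = i * dt
    u_ddot = -A * w**2 * math.sin(w * t)      # ground acceleration
    a = -u_ddot - 2 * zeta * w0 * v - w0**2 * z
    v += a * dt                               # semi-implicit Euler step
    z += v * dt
    if t > 2.0:                               # after the transient has decayed
        peak = max(peak, abs(z))

# Well above the natural frequency the mass stays nearly fixed in space, so the
# relative displacement amplitude approaches the ground displacement amplitude.
print(peak / A)   # close to 1
```

For ground motion well below the natural frequency the same model would show the mass riding along with the frame and the relative displacement shrinking, which is why instrument designers push the natural period as long as practical.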
The damping ratio ζ, which quantifies this effect, is defined as ζ = c / (2√(km)), where c is the damping coefficient, k is the spring constant, and m is the mass; a value near 0.7 yields a nearly flat response without overshoot.[11] The frequency response of a seismometer is governed by its natural frequency f₀, which sets the range of seismic wave frequencies to which the instrument is sensitive—lower values enhance detection of long-period waves, while higher values suit short-period motions.[11] This sensitivity is crucial for capturing various seismic waves, including primary (P) waves that propagate as compressional longitudinal motions, secondary (S) waves as transverse shear motions, and slower surface waves that travel along the Earth's exterior.[12]
Nomenclature and Distinctions
The term "seismometer" originates from the Greek words seismos (σεισμός), meaning "earthquake" or "shaking," and metron (μέτρον), meaning "measure."[13] The word first appeared in English in the 1840s, with the earliest documented use in 1841 by James David Forbes, a Scottish physicist, in reference to instruments designed to detect earth tremors.[14] Although Irish engineer Robert Mallet advanced the field of seismology in the 1850s through his experimental work on seismic wave propagation and instrument design, the specific term "seismometer" predates his contributions but aligns with his efforts to standardize earthquake measurement terminology. A key distinction exists between a seismometer and related instruments: a seismometer is the sensor that detects and measures ground motion, typically in terms of displacement, velocity, or acceleration, while a seismograph refers to the complete system that includes the seismometer coupled with a recording mechanism, either mechanical or digital.[1] These terms are often used interchangeably in modern contexts, but the precise differentiation highlights the seismometer's role as the core transducer.[1] In contrast, an accelerometer measures acceleration directly (e.g., changes in velocity over time) rather than displacement, making it suitable for high-amplitude, short-period events but less ideal for long-period seismic waves where relative motion is key.[15] Seismometers are broadly classified by their response mechanism: inertial types, which rely on a suspended mass that resists motion due to inertia (per Newton's first law), dominate traditional designs and measure absolute ground displacement or velocity relative to this inertial reference.[2] Strainmeters, however, measure relative ground strain—the deformation or differential displacement between two fixed points—rather than absolute motion, providing complementary data for tectonic strain accumulation and slow deformation processes.[16] Essential performance 
terms include the natural period, which is the oscillation time of the instrument's internal mass-spring system (often seconds to minutes for broadband models), defining its frequency response and ability to faithfully record waves of varying wavelengths.[17] Sensitivity denotes the smallest detectable ground motion, typically on the order of nanometers for displacement in high-precision instruments.[16] Dynamic range quantifies the span of amplitudes the instrument can measure without clipping or noise dominance, often exceeding 120 dB in modern broadband seismometers to capture signals from micro-vibrations to strong shaking.[18] Measurements are expressed in standardized units: displacement in microns (μm) or nanometers (nm), velocity in millimeters per second (mm/s) or nanometers per second (nm/s), and acceleration in multiples of g (where 1 g ≈ 9.81 m/s²).[19] A common misconception is that seismometers are exclusively for detecting earthquakes; in reality, they record a wide array of ground vibrations, including microseisms generated by ocean waves and human-induced sources such as traffic, industrial activity, or construction, which help in environmental monitoring and noise characterization.[9]
Historical Development
Ancient and Early Designs
The earliest known attempt to instrumentally detect earthquakes originated in ancient China during the Han dynasty. In 132 AD, the polymath Zhang Heng presented the Houfeng Didong Yi, a bronze seismoscope shaped like a large urn approximately 2 meters in diameter, to the imperial court.[20] The device featured eight dragon heads positioned around the urn's rim, each holding a bronze ball in its mouth, with corresponding open-mouthed toads or frogs placed below on the ground. When seismic waves from a distant earthquake reached the instrument, an internal pendulum or mechanical trigger would dislodge a ball from one dragon's mouth, causing it to fall into a toad's mouth, thereby indicating the direction of the tremor.[21] Historical records describe its remarkable sensitivity, as it reportedly detected a quake in Longxi (modern Gansu province) over 400 kilometers away, alerting officials days before the shaking was felt locally.[22] Zhang's design relied on an inertial mechanism, where the ground's motion relative to a suspended component produced the directional signal, a principle that would echo in later seismometers.[23] In ancient Greece and Rome, earthquakes were extensively documented through observational accounts rather than mechanical devices, with scholars noting potential precursors to aid prediction. 
Aristotle, in his work Meteorologica around 350 BC, compiled historical data on earthquakes from across the Mediterranean, attributing them to subterranean winds and describing precursors such as unusual animal behaviors, atmospheric disturbances like fogs or clouds, and audible rumbles before the shaking.[24] Similarly, Pliny the Elder in his Natural History (circa 77 AD) cataloged earthquake varieties, causes, and effects, including reports of eerie sounds, swelling earth, and marine disturbances as warning signs, drawing from eyewitness testimonies and earlier Greek sources.[25] These descriptions emphasized qualitative observations of omens—such as restless animals or premonitory noises—rather than instrumental detection, reflecting a cultural view of quakes as divine portents requiring interpretation.[26] Early seismic detection efforts, including Zhang Heng's device, were inherently limited to qualitative indicators of occurrence and direction, lacking the ability to measure intensity, duration, or precise timing. Without recording mechanisms like paper or ink traces, these designs depended on immediate human observation to interpret signals, rendering them unsuitable for scientific analysis or remote verification. In cultural contexts across ancient China, Greece, and Rome, such instruments or observations often served divinatory purposes, interpreted as heavenly warnings of governmental misconduct or societal imbalance rather than purely natural phenomena. For instance, Zhang Heng's seismoscope was valued not only for practical alerts but as a tool to discern auspicious or inauspicious directions tied to cosmology and omens.[27] These rudimentary approaches laid conceptual groundwork but transitioned toward more mechanical innovations only in later centuries.
19th-Century Innovations
The 19th century represented a transformative period in seismology, as engineers and scientists developed mechanical instruments capable of quantifying ground motion, building on earlier qualitative designs to establish empirical foundations for the field. Irish civil engineer Robert Mallet conducted groundbreaking experiments starting in the late 1840s, detonating controlled explosions at sites like Killiney Beach to measure seismic wave propagation speeds in sand and rock, achieving velocities such as 825 feet per second in sand. These efforts, motivated by Irish earthquakes and structural engineering concerns, involved rudimentary pendulums and tiltmeters to detect tilts and oscillations, marking the inception of experimental seismology. Mallet also coined the term "seismology" in his 1857 report on the Neapolitan earthquake, formalizing the scientific study of earthquakes. The term "seismometer" itself was introduced earlier by Scottish geologist David Milne-Home in 1841, referring to devices like James David Forbes's pendulum-based motion detector. Early pendulum concepts, first proposed by French physicist Jean de Hautefeuille in 1703 as a mercury-filled bowl to indicate ground tilt via spillage, were refined in the 19th century into more precise inertial systems. By the 1880s, German astronomer Ernst von Rebeur-Paschwitz advanced horizontal pendulum designs, installing sensitive instruments in Wilhelmshaven and Potsdam that used a nearly free-swinging mass to register horizontal motions; his 1889 setup captured the first teleseismic recording of a Japanese earthquake on April 18, spanning over 8,000 kilometers. Italian scientists made seminal contributions to recording technology amid frequent volcanic and tectonic activity. 
In 1856, physicist Luigi Palmieri, director of the Vesuvius Observatory, invented an electromagnetic seismometer consisting of U-shaped mercury tubes aligned to cardinal directions, connected to a galvanometer that electromagnetically inscribed earthquake timings on smoked paper, allowing for the first automated recordings of local events. This device, influenced by Mallet's wave studies, was installed across Naples and proved vital during the 1857 earthquake. Building on such innovations, physicist Filippo Cecchi developed the era's first true continuous-recording seismograph in 1875, an electromagnetic pendulum that traced ground displacements on a rotating drum, offering unprecedented detail on motion amplitude and duration despite its limited sensitivity. These mechanical advancements, however, were hampered by significant technical limitations. The proliferation of varied designs—from simple pendulums to mercury systems—resulted in a lack of standardization, complicating data comparison across observatories and hindering the establishment of uniform protocols. Instruments were also highly susceptible to non-seismic noise, including wind-induced vibrations that could mimic minor tremors and microseisms from distant ocean swells, often overwhelming signals from small earthquakes and requiring isolated installations for reliable operation. The deployment of these early seismometers profoundly impacted the analysis of global events, enabling instrumental verification of distant seismic activity. For instance, records from Palmieri's and similar devices contributed to studies of the 1868 Arica earthquake in Chile, which generated trans-Pacific tsunamis and was assessed through tilt and pendulum observations for its intensity and propagation. 
Similarly, the 1883 Krakatoa eruption's explosive phases produced seismic waves detected at over a dozen stations worldwide, including von Rebeur-Paschwitz's pendulums in Europe, providing the first multi-site dataset for a major volcanic event and underscoring the feasibility of international seismic networks.
20th-Century Advancements
The early 20th century marked a pivotal shift in seismometer design, transitioning from purely mechanical systems to electromagnetic and electrical recording mechanisms that enhanced sensitivity and global deployment capabilities. In 1906, Prince Boris Galitzin developed the first electromagnetic seismometer, which employed a pendulum suspended by a wire and coupled to a galvanometer for recording ground motion as electrical signals rather than mechanical traces. This innovation allowed for more precise detection of weak seismic waves, overcoming limitations of friction in mechanical levers, and was instrumental in advancing observatory-based monitoring. A few years earlier, in 1903, Emil Wiechert had introduced the inverted pendulum seismometer, featuring a heavy mass supported above its pivot point to achieve exceptional stability and sensitivity for distant earthquakes. This design minimized external disturbances and could register teleseismic events from thousands of kilometers away, leading to widespread installations in European and international observatories by the 1910s. Wiechert's instrument, often paired with a mechanical recording drum, exemplified the era's focus on robust, high-gain systems for long-period waves. John Milne, a key figure in seismology, refined portable seismometer designs in the early 1900s, drawing from his earlier work in Japan following the 1891 Mino earthquake. His horizontal pendulum-based instruments were lightweight and field-deployable, facilitating rapid assessments in earthquake-prone regions and contributing to the establishment of seismic networks in Asia. Post-World War I efforts emphasized standardization to support international collaboration.
The Press-Ewing seismograph, developed in the 1950s by Frank Press and Maurice Ewing, integrated horizontal and vertical components into a single, electrically recorded unit using electromagnetic transducers.[28] This design improved data comparability across stations and was widely adopted for both research and hazard mitigation. Significant deployments followed major events, such as the 1923 Great Kantō earthquake in Tokyo, which prompted the installation of standardized seismometers across Japan and influenced global network expansions. By the 1950s, these advancements played a crucial role in detecting underground nuclear tests, with long-period instruments distinguishing explosions from natural quakes through waveform analysis. A broader technological shift occurred from mechanical to photoelectric recording methods in the mid-20th century, where light beams and photographic paper captured amplified signals for greater accuracy. This era also introduced the distinction between short-period instruments, optimized for high-frequency local motions, and long-period ones for global wave propagation, enabling differentiated applications in monitoring.
Late 20th- and 21st-Century Developments
The late 20th century marked a pivotal shift in seismometer technology toward digital systems, building on analog foundations to achieve unprecedented fidelity in recording seismic signals. The 1970s saw the introduction of digital recording, revolutionizing seismology with improved dynamic range and enabling the creation of large data archives. This transition included the establishment of the Global Digital Seismograph Network (GDSN) in 1976, which incorporated digital upgrades to existing stations like the World Wide Standardized Seismograph Network (WWSSN), facilitating global monitoring and analysis of seismic events.[29] In the early 1980s, the Streckeisen STS-1 emerged as the first broadband seismometer, featuring a frequency response from 360 seconds to 10 Hz and a dynamic range of 144 dB, enabling the capture of both weak teleseismic waves and stronger local events with minimal distortion.[30] This instrument's force-feedback design and high sensitivity revolutionized global monitoring by providing stable, low-noise data across a wide amplitude spectrum, far surpassing the limitations of earlier mechanical systems.[31] Parallel to these hardware advances, the establishment of the Global Seismographic Network (GSN) in the late 1980s and 1990s facilitated the integration of digital seismometers into a coordinated international framework. 
Comprising over 150 stations distributed worldwide, the GSN emphasized real-time data acquisition and sharing through the Incorporated Research Institutions for Seismology (IRIS) Data Management Center, allowing rapid analysis of global seismic events and improving earthquake location accuracy.[32] By the mid-1990s, this network had deployed STS-1 and similar broadband sensors at key sites, supporting advancements in plate tectonics research and early warning capabilities.[33] The 1990s also saw the introduction of Micro-Electro-Mechanical Systems (MEMS) technology to seismometry, offering miniaturized, low-cost alternatives to traditional accelerometers for strong-motion detection. These devices, leveraging silicon-based fabrication, provided compact three-axis sensing with sufficient sensitivity for engineering applications, revolutionizing the automotive sector before adapting to seismic networks for affordable, deployable units in the late 1990s.[34] Their small size and reduced power needs enabled broader installations in urban and remote areas, though initial models focused on higher-frequency ground motions rather than ultra-low periods.[35] Entering the 21st century, high-resolution digitizers became standard in seismometer systems post-2000, with effective resolutions exceeding 130 dB at sampling rates up to 1000 samples per second, enhancing the digitization of broadband signals without aliasing.[32] Major earthquakes, such as the 2004 Sumatra-Andaman event (Mw 9.1), underscored the value of dense deployments; the GSN's broadband recordings revealed the rupture's extent, prompting the rapid installation of over 20 ocean-bottom seismometers for aftershock monitoring and influencing subsequent global network expansions.[36] Similarly, the 2011 Tohoku-Oki earthquake (Mw 9.0) highlighted gaps in offshore coverage, leading to upgrades in Japan's seismic networks, including denser arrays of broadband stations and improved real-time integration for early warning 
systems.[37] These events accelerated the shift toward hybrid networks combining permanent and temporary high-density installations to better resolve rupture dynamics.[38] To ensure data reliability, the Incorporated Research Institutions for Seismology (IRIS) and the U.S. Geological Survey (USGS) developed standardized calibration protocols in the 2000s, focusing on verifying frequency response through controlled signal injection and shake-table testing.[39] These guidelines, part of the Advanced National Seismic System, require instruments to maintain sensitivity within 1% across operational bands, with regular on-site calibrations using electrical or mechanical inputs to confirm transfer functions.[40] Early experiments with fiber-optic seismometers in the pre-2010 era were confined primarily to laboratory settings, exploring interferometric designs for rotational and translational sensing. Pioneering efforts, such as the GS-13P prototype in 1998 using a 380-meter high-birefringence fiber coil, achieved sensitivities around 3.5 × 10⁻³ rad/s in controlled tests with lock-in amplifiers.[41] Subsequent lab iterations, like the FORS-I (2001) and FORS-II (2003) models based on Sagnac interferometers, improved resolution to 4.2 × 10⁻⁸ rad/s with longer single-mode fibers up to 11 km, demonstrating potential for low-frequency detection but limited by noise and environmental stability outside lab conditions.[41] These tests laid groundwork for future field applications, though practical deployments remained scarce before 2010.[41]
Traditional Seismometer Types
Teleseismometers
Teleseismometers are seismometers engineered for high sensitivity to low-amplitude, long-period seismic waves originating from distant earthquakes, or teleseisms, which occur at epicentral distances greater than approximately 30 degrees (more than 3000 km). These instruments excel at recording body waves, such as primary (P) and secondary (S) waves, that travel through Earth's mantle, facilitating studies of deep planetary structure and global seismic propagation. By focusing on periods of 20 to 100 seconds, teleseismometers capture subtle signals that shorter-period devices might overlook, making them essential for mantle tomography and earthquake source characterization.[42][43] The design centers on long-period pendulums, where a heavy inertial mass is suspended to resist motion during ground shaking, allowing measurement of relative displacement between the mass and the instrument frame. Feedback servo mechanisms, typically electromagnetic force-balance systems, actively counteract any mass deflection to keep it nearly stationary, converting ground motion into precise electrical signals with minimal mechanical wear. This approach enables natural periods far exceeding those of traditional pendulums, often achieving near-infinite free periods through zero-length spring suspensions. A seminal example is the LaCoste-Romberg model, developed in the 1940s based on Lucien LaCoste's 1934 zero-length spring innovation, which was later upgraded with digital electronics for enhanced stability and low drift. Another key example is the Streckeisen STS-2, a triaxial force-feedback sensor with a response from 120 seconds to 50 Hz, prized for its low self-noise in teleseismic applications within global networks.[44][45][46] These instruments offer advantages in fidelity for weak body waves from afar but are disadvantaged by saturation during strong local events, as their high gain amplifies only low accelerations effectively. 
For velocity transducers common in teleseismometers, the magnification factor at low frequencies (below the natural frequency) is M = f / f₀, where f is the signal frequency and f₀ is the natural frequency; this linear increase with frequency underscores their suitability for gradually amplifying distant, low-frequency signals into detectable outputs. To maximize performance, teleseismometers are installed in thermally stable underground vaults, shielding them from environmental noise and achieving displacement sensitivities around 1 nm, which is critical for resolving minute mantle-induced motions.[47]
Strong-Motion Seismometers
Strong-motion seismometers are specialized instruments engineered to capture high-amplitude ground accelerations during intense seismic events, such as nearby earthquakes, without signal saturation. These devices function primarily as accelerometers, measuring linear accelerations in three orthogonal components to record the forceful shaking that can damage structures. Unlike instruments focused on distant or weak motions, strong-motion seismometers prioritize robustness and dynamic range to handle peak accelerations exceeding 1 g, providing critical data for assessing immediate hazards.[48] The core design of strong-motion seismometers relies on force-balance accelerometers, which incorporate a high-stiffness suspension to maintain stability under extreme forces. In this configuration, an inertial mass is suspended within a frame, and ground motion induces a displacement that is detected by capacitive or optical transducers. A servo feedback loop then applies an electromagnetic force to return the mass to its null position, ensuring linear response across the operational bandwidth. This setup yields a frequency response typically spanning 0.1 to 50 Hz, suitable for capturing short-period vibrations dominant in near-source shaking. The high stiffness prevents resonance at low frequencies, focusing instead on the rapid, high-frequency components of strong ground motion.[49][50] In engineering seismology, strong-motion seismometers play a pivotal role in evaluating structural integrity and informing building code development by quantifying the intensity of ground shaking. They are deployed on or within buildings, bridges, dams, and other infrastructure to measure real-time responses during events, enabling post-earthquake damage assessments and retrofitting recommendations. 
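The force-balance loop described above can be sketched as a toy discrete-time simulation; every parameter value below is hypothetical, chosen only to make the feedback stiffness dominate the mechanical spring:

```python
import math

# Toy force-balance accelerometer: a servo force proportional to mass
# displacement holds the mass near its null position, and the restoring
# force required is read out as the ground acceleration.
m, k, c = 0.1, 100.0, 45.0    # mass (kg), spring (N/m), damping (N*s/m)
Kp = 1e4                      # feedback stiffness (N/m), dominates the spring

dt = 1e-5
z, v = 0.0, 0.0
worst = 0.0
for i in range(int(0.5 / dt)):
    t = i * dt
    a_g = math.sin(2 * math.pi * 5 * t)       # 5 Hz ground acceleration (m/s^2)
    F_fb = -Kp * z                            # servo force nulling the mass
    a_rel = (-m * a_g - c * v - k * z + F_fb) / m
    v += a_rel * dt                           # semi-implicit Euler step
    z += v * dt
    a_est = -((k + Kp) * z + c * v) / m       # acceleration recovered from the loop
    if t > 0.2:                               # skip the start-up transient
        worst = max(worst, abs(a_est - a_g))

print(worst)   # small compared with the 1 m/s^2 input
```

Because the stiff feedback keeps the mass displacement tiny, the readout stays linear over a wide amplitude range, which is the design trade highlighted in the text.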
A key application is the determination of peak ground acceleration (PGA), which represents the maximum ground acceleration observed and serves as a fundamental parameter for seismic hazard mapping and design standards. For instance, PGA data from these instruments help calibrate attenuation models used in probabilistic seismic hazard analysis.[51][52] Notable examples include early mechanical designs from the 1950s, such as those pioneered by Teledyne Geotech, which transitioned to digital force-balance systems in modern applications. Contemporary instruments like the Kinemetrics EpiSensor employ force-balance technology for high-fidelity recording, while Nanometrics' Titan accelerometer offers a Class-A dynamic range exceeding 155 dB with low self-noise. These digital models integrate seamlessly with data loggers for continuous monitoring.[53][54][55] A primary advantage of strong-motion seismometers is their ability to withstand and accurately measure accelerations beyond 1 g—often up to 2 g or more—without clipping, making them indispensable for hazard-prone regions. However, their sensitivity to weak signals is inherently lower compared to broadband or teleseismic instruments, as the design trade-off favors amplitude range over noise floor minimization, potentially limiting utility for microseismicity studies. To derive higher-order motion parameters, acceleration time series are numerically integrated: velocity is obtained by integrating acceleration over time, and displacement by double integration, though baseline corrections are applied to mitigate drift errors.[56] The clipping level, typically around 2 g for many commercial models, defines the upper limit of measurable acceleration before signal distortion occurs, ensuring reliable data during moderate-to-large events. This metric is crucial for site-specific deployments, where expected PGA informs instrument selection.
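The integration of an acceleration record into velocity and displacement can be sketched with cumulative trapezoidal sums and a simple mean-removal baseline correction; the signal here is synthetic, and real records need more careful baseline fitting:

```python
import numpy as np

# Synthetic record: a smooth velocity pulse and its corresponding acceleration.
dt = 1e-3
t = np.arange(0.0, 4.0, dt)
v_true = np.exp(-((t - 2.0) ** 2) / 0.1)      # velocity pulse (m/s)
acc = np.gradient(v_true, dt)                 # acceleration trace (m/s^2)

def cumtrapz(y, dt):
    """Cumulative trapezoidal integral starting at zero."""
    return np.concatenate(([0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * dt)))

acc = acc - acc[:200].mean()                  # baseline correction from a pre-event window
vel = cumtrapz(acc, dt)                       # first integration: velocity
disp = cumtrapz(vel, dt)                      # second integration: displacement

print(np.max(np.abs(vel - v_true)))           # small integration error
```

Without the baseline step, even a tiny constant offset in the acceleration grows linearly in velocity and quadratically in displacement, which is the drift the text refers to.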
The underlying principle follows Newton's second law in the feedback loop, where the output acceleration a is proportional to the balancing force F applied to the inertial mass m: a = F / m. Here, the servo system measures the force needed to nullify mass displacement, directly yielding acceleration without mechanical resonance issues.[49][57]
Broadband and Other Specialized Forms
Broadband seismometers are versatile instruments designed to capture a wide range of seismic frequencies, typically providing a flat velocity response from 120 seconds to 50 Hz, enabling the detection of both long-period surface waves and short-period body waves in a single device.[58] A prominent example is the Guralp CMG-3T, which employs three orthogonal galvanic force-feedback sensors to measure ground velocity in north-south, east-west, and vertical components, ensuring high-fidelity recordings across this spectrum.[59] These instruments build on principles from teleseismometers for low-frequency sensitivity and strong-motion sensors for higher-frequency response, but integrate them into one unit for broader applicability. In applications, broadband seismometers facilitate comprehensive earthquake monitoring by resolving source mechanisms, wave propagation, and crustal structure from local to teleseismic distances.[32] They are particularly valuable in volcano seismology, where they detect subtle long-period events associated with magma movement alongside higher-frequency tremor signals.[60] Other specialized forms include borehole seismometers, which are deployed in deep installations—often 100 meters or more—to minimize surface-generated noise such as wind or cultural interference, achieving significantly lower noise levels than surface instruments.[61] Ocean-bottom seismometers (OBS) extend this capability to marine environments, typically incorporating a three-component seismometer alongside a hydrophone to record both shear waves in the seafloor and compressional waves in the water column, supporting studies of subduction zones and mid-ocean ridges.[62] Tiltmeters and strainmeters serve as complementary tools in seismic networks, measuring ground tilt and volumetric strain respectively to capture deformation signals that seismometers alone might miss, as they directly sense changes in orientation or rock dilation rather than particle motion.[63] These 
instruments enhance interpretation of slow events like postseismic relaxation or volcanic inflation when co-located with broadband sensors.[64] The primary advantages of broadband and specialized forms lie in their ability to record multiple seismic wave types—from microseisms to large teleseisms—with one instrument, reducing deployment complexity and enabling detailed analyses in plate boundary studies, such as imaging fault ruptures along the San Andreas system.[32] However, these systems face limitations, including higher costs due to advanced feedback mechanisms and low-noise electronics, as well as elevated noise floors in urban areas from anthropogenic sources like traffic, which can obscure weak signals.[65][66]

Emerging Technologies
Fiber Optic Distributed Sensing
Fiber optic distributed sensing, particularly through distributed acoustic sensing (DAS), utilizes existing fiber optic cables to create continuous, high-resolution seismic monitoring networks by detecting ground vibrations as strain perturbations along the cable length. The core principle involves injecting coherent laser pulses into the optical fiber and analyzing the Rayleigh backscattered light, where microscopic imperfections in the fiber cause phase shifts proportional to longitudinal strain induced by seismic waves or other vibrations.[67] This backscattering enables spatial resolution of 1-10 meters over distances up to tens of kilometers, transforming the cable into thousands of virtual strain sensors without requiring additional hardware deployment.[68] Unlike traditional point-based seismometers, DAS provides a dense, linear array that captures wave propagation in real time, though it measures relative strain changes rather than absolute particle displacement.[69] Significant developments in DAS for seismology accelerated between 2020 and 2025, building on earlier proofs-of-concept to integrate with telecommunications infrastructure. 
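The mapping from interrogator parameters to a virtual-sensor array, and the standard phase-to-strain conversion, can be sketched as below. This is an illustrative sketch only: the 20 km cable length, the 10 m channel spacing and gauge length, the 1550 nm wavelength, and the refractive-index and photoelastic values are assumed example numbers, not specifications of any particular DAS interrogator.

```python
import math

def das_virtual_channels(cable_length_m: float, channel_spacing_m: float) -> int:
    """Number of virtual strain sensors along the fiber."""
    return int(cable_length_m // channel_spacing_m)

def phase_to_strain(dphi_rad: float, wavelength_m: float,
                    gauge_length_m: float, n_fiber: float = 1.47,
                    xi: float = 0.78) -> float:
    """Convert an optical phase change over one gauge length to axial strain.

    Uses the common approximation dphi = (4*pi*n*xi/lambda) * eps * L_g,
    where xi is the photoelastic scaling factor (~0.78 for silica fiber).
    """
    return dphi_rad * wavelength_m / (4 * math.pi * n_fiber * xi * gauge_length_m)

# A 20 km telecom cable interrogated at 10 m channel spacing
# yields 2000 virtual sensors.
channels = das_virtual_channels(20_000, 10)

# A 1 rad phase change over a 10 m gauge at 1550 nm corresponds to
# roughly 10 nanostrain of axial deformation.
eps = phase_to_strain(1.0, 1550e-9, 10.0)
```

Because the fiber senses axial strain rather than particle motion, recovering ground velocity from such measurements requires additional assumptions (for example, an apparent phase velocity for the passing wave), which is one source of the conversion uncertainty discussed for DAS.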
A landmark 2025 demonstration by Lawrence Livermore National Laboratory (LLNL) researchers repurposed buried fiber optic cables in the San Francisco Bay Area into over 8,000 virtual seismometers, achieving unprecedented resolution for imaging urban seismic hazards during a month-long field experiment.[70] This advance leveraged improved interrogator devices to process backscattered signals at high sampling rates, enabling detection of microseismic events that traditional networks often miss.[71] Earlier 2020s work focused on submarine cables for offshore monitoring, but the LLNL effort highlighted scalability for onshore telecom lines, marking a shift toward practical, large-scale deployment.[72]

In seismic applications, DAS excels in urban earthquake early warning systems by providing dense coverage in densely populated areas where installing discrete sensors is challenging.[73] For instance, urban DAS arrays have been used to track traffic-induced noise for ambient noise tomography and to detect P-wave arrivals for rapid magnitude estimation.[74] In pipeline monitoring, it detects seismic-like vibrations from leaks or intrusions along linear infrastructure routes, enhancing safety in energy transport.[75] Subduction zone imaging benefits from submarine DAS on existing ocean-bottom cables, as demonstrated in 2025 studies across the North Anatolian Fault, where it resolved fault slip and aftershock patterns with kilometer-scale arrays.[76]

DAS offers key advantages for seismometer applications, including cost-effectiveness by repurposing existing fiber optic infrastructure without the need for new sensor installations, potentially reducing deployment expenses by orders of magnitude compared to traditional arrays.[77] Its high spatial density enables km-scale continuous sampling, far surpassing the sparse spacing of conventional seismometers, and supports real-time data acquisition at rates up to 10 kHz for dynamic event capture.[69] These features make it ideal for
scaling seismic networks in remote or urban settings, where it can complement point sensors for enhanced wavefield imaging.

Despite these benefits, DAS faces challenges such as polarization-induced noise, where variations in light polarization along the fiber degrade signal quality and require advanced compensation algorithms.[78] Additionally, its sensitivity to strain limits direct measurement of absolute ground motion, necessitating conversion models that introduce uncertainties in velocity or acceleration estimates, particularly for weak or distant events.[79] Low signal-to-noise ratios in noisy environments further complicate low-magnitude detection without preprocessing.[80] A notable 2025 advancement involves fusion algorithms that integrate DAS data with traditional seismometer recordings, yielding improved earthquake detection catalogs through hybrid workflows that leverage DAS density for event localization and conventional sensors for absolute motion validation.[72] These methods have demonstrated enhanced sensitivity to small earthquakes, with automated processing achieving up to 30% more detections in integrated offshore arrays compared to standalone systems.[81]

Quantum-Based Seismometers
Quantum-based seismometers leverage quantum mechanical principles such as atom interferometry and spin squeezing to measure ground accelerations and gravity variations with sensitivities surpassing classical limits, enabling detection of seismic signals at the nanoscale. In atom interferometry, clouds of ultracold atoms are split into matter waves using laser pulses, and the phase difference accumulated due to acceleration is measured upon recombination; this approach achieves precisions down to 10 nm/s² or better, far exceeding traditional mechanical sensors. The phase shift in such interferometers is given by Δφ = k_eff·a·T², where a is the acceleration, T is the interrogation time, and k_eff is the effective wavevector (k_eff = 4π/λ for Raman two-photon processes, with λ the laser wavelength). Spin squeezing, another key technique, entangles atomic spins to reduce quantum noise below the standard quantum limit, approaching the Heisenberg limit for enhanced signal-to-noise ratios. These methods extend classical inertial sensing by exploiting wave-particle duality and superposition, providing unprecedented stability for long-term monitoring.[82]

Key developments include the 2023 work at the University of Colorado Boulder, where researchers demonstrated entanglement-enhanced quantum sensors using spin-squeezed states of calcium and strontium ions, achieving over 2-fold noise reduction in measurements relevant to gravitational sensing. In 2024, the FLEET Centre in Australia proposed the Quantum Earthquake Detector (QED), a tunneling-based device that exploits quantum tunneling currents across nanoscale gaps to detect vibrations at room temperature, potentially offering higher sensitivity and lower cost than conventional seismometers. These innovations build on earlier quantum gravimetry prototypes, focusing on portability and robustness for field deployment.
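To give a sense of scale for the phase relation Δφ = k_eff·a·T², the short sketch below evaluates it numerically. The 780 nm laser wavelength and 100 ms interrogation time are illustrative assumptions for the example, not parameters taken from the experiments described above.

```python
import math

def raman_phase_shift(a: float, T: float, wavelength: float) -> float:
    """Phase shift of a Mach-Zehnder atom interferometer:
    dphi = k_eff * a * T**2, with the two-photon Raman effective
    wavevector k_eff = 4*pi / wavelength."""
    k_eff = 4 * math.pi / wavelength
    return k_eff * a * T ** 2

# Illustrative values: 780 nm laser, 100 ms interrogation time, and an
# acceleration of 10 nm/s^2 (the sensitivity scale quoted above).
# The resulting phase is on the order of a milliradian -- tiny, but
# within reach of modern interferometric phase readout.
dphi = raman_phase_shift(a=1e-8, T=0.1, wavelength=780e-9)
```

The quadratic dependence on T is why long interrogation times, achieved through tall drop towers or microgravity, are so valuable in atom interferometry.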
Applications of quantum-based seismometers include monitoring fault stress through precise gravity gradient measurements, which can identify pre-seismic strain buildup by detecting microgal-level changes indicative of tectonic shifts. They also enable early earthquake warning systems with sensitivities on the order of nm/s², allowing detection of precursor P-waves seconds before destructive S-waves arrive. Inertial principles from traditional seismometers are enhanced here by quantum effects to achieve such granularity without mechanical components. Advantages stem from fundamental quantum effects: entanglement via spin squeezing reduces phase noise to the Heisenberg limit, theoretically scaling as 1/N (where N is the number of entangled particles) rather than 1/√N for independent measurements, enabling faster and more accurate readings. Compared to microelectromechanical systems (MEMS) seismometers, quantum sensors offer up to 10-fold higher sensitivity in controlled environments, with potential for broadband response from DC to kHz frequencies. Challenges persist, including the need for cryogenic cooling to produce ultracold atomic ensembles in interferometric designs, which complicates field portability and increases power demands. Scalability remains an issue, as integrating large atom numbers or arrays into compact devices requires advanced vacuum and laser systems. NASA's 2025 initiation of development for a space-based quantum gravity gradiometer addresses some terrestrial limitations by planning tests of these sensors in orbit for earthquake-related gravity mapping, paving the way for hybrid space-Earth applications.[83]

AI and Machine Learning Integration
The integration of artificial intelligence (AI) and machine learning (ML) into seismometer data analysis has revolutionized the interpretation of seismic signals, enabling more accurate earthquake detection, prediction, and noise reduction. These techniques process vast datasets from seismometers to identify subtle patterns that traditional methods often miss, particularly in real-time scenarios. By leveraging algorithms trained on historical seismic records, AI enhances the reliability of earthquake catalogs and supports proactive hazard mitigation.[84]

Key applications include the automated generation of earthquake detection catalogs, which compile comprehensive lists of seismic events from continuous seismometer recordings. For instance, ML models have produced detailed catalogs for major events, such as the magnitude 7.4 earthquake in Taiwan in April 2024, revealing thousands of aftershocks previously undetected by manual analysis.[85] In prediction efforts, researchers at the University of Texas at Austin developed an AI algorithm that achieved 70% accuracy in forecasting earthquakes one week in advance during a seven-month trial in China, using patterns from five years of seismometer data.[86] Ground-motion forecasting, which estimates shaking intensity for engineering applications, has also benefited, with ML models like graph neural networks predicting maximum intensity measures from seismometer arrays with improved spatial resolution.[87]

Prominent techniques involve convolutional neural networks (CNNs) for phase picking, where the algorithm identifies P- and S-wave arrivals in seismograms with high precision, outperforming classical methods in noisy environments.[88] For anomaly detection in microseisms—low-amplitude ambient seismic noise—ML approaches such as isolation forests or deep learning classifiers isolate unusual signals indicative of precursor activity or hidden events.[89] Recent developments underscore the field's momentum, including a 2024 review
by the Seismological Society of America on ML for seismicity analysis, which highlights its role in catalog development and event association across dense networks.[90] These integrations offer significant advantages, such as real-time processing of seismometer streams for early warnings and efficient handling of big data from dense arrays, reducing analysis time from days to seconds.[91] However, challenges persist, including training data biases that can skew predictions toward underrepresented regions or event types, and limited explainability of black-box models, which hinders trust in critical applications.[92][93] A pivotal advancement is the integration of ML with distributed acoustic sensing (DAS) systems, where hybrid models combine fiber-optic data with traditional seismometer inputs; 2025 studies demonstrate enhanced event detection in volcanic and tectonic settings using recurrent neural networks on DAS streams.[94]

Networks and Data Handling
Interconnected Seismometer Arrays
Interconnected seismometer arrays form the backbone of modern global and regional seismic monitoring, enabling the coordinated deployment of multiple instruments to capture comprehensive data on earthquake activity worldwide. These networks integrate diverse seismometer types, such as broadband stations, to provide real-time insights into seismic events, enhancing detection accuracy and response capabilities. By sharing data across international boundaries, they support collaborative research and hazard mitigation efforts.

At the global scale, the Incorporated Research Institutions for Seismology (IRIS) Global Seismographic Network (GSN) operates approximately 150 very broadband stations distributed worldwide, delivering open-access data for studying Earth's seismic structure and global events.[95] The Federation of Digital Seismograph Networks (FDSN) facilitates international data exchange among 93 member organizations, standardizing formats and promoting the dissemination of high-fidelity seismic waveforms from observatories spanning national and global installations.[96] Regionally, the U.S.
Geological Survey's (USGS) ShakeAlert system in the western United States employs more than 1,500 sensors, with expansion targeting over 2,000 stations by the end of 2025, to deliver earthquake early warnings across California, Oregon, and Washington.[97] In Europe, the European-Mediterranean Seismological Centre (EMSC) aggregates data from more than 70 member institutes as of 2025 to provide rapid earthquake parameters and impact assessments for the Euro-Mediterranean region.[98]

These arrays rely on advanced real-time telemetry technologies, including satellite and internet connections, to transmit data from remote stations without delay, ensuring continuous monitoring even in isolated areas.[99] In the 2020s, Internet of Things (IoT) integration has enabled dense urban arrays, such as low-cost sensor networks in city centers like Catania, Italy, for high-resolution local monitoring and noise reduction through on-board processing.[100]

Applications of these networks span critical hazard domains, including tsunami warnings through integration with sea-level data for rapid event forecasting, nuclear explosion monitoring via the International Monitoring System's seismic components that overlap with GSN stations, and volcanic hazard assessment by detecting precursory swarms and eruptions.[33][101][102] By 2025, dense networks augmented by distributed acoustic sensing (DAS) have advanced swarm detection capabilities, transforming existing fiber-optic infrastructure into high-resolution seismic arrays for enhanced temporal and spatial coverage in volcanic and tectonic settings.[68] For instance, real-time DAS processing frameworks, such as modular software for integrating DAS data into operational systems, now support immediate event analysis.[103] The primary benefits of these interconnected arrays include precise epicenter triangulation using arrival-time differences from multiple stations and improved magnitude estimation through waveform analysis across the
network, reducing uncertainties in location and intensity assessments.[104][105]

Recording and Data Processing Methods
Seismometer recording methods evolved from analog systems, which dominated from the late 1800s until the 1970s, to modern digital approaches. Early analog recordings captured ground motion using mechanical devices that traced signals on photographic paper or ink-on-paper drums, providing visual seismograms for manual analysis.[106][107] These systems were limited by physical media and susceptibility to environmental interference, but they formed the foundation of global seismic monitoring networks.[108]

The transition to digital recording began in the late 1970s with the advent of analog-to-digital converters (ADCs), enabling automated capture and storage of seismic signals.[109] Digital seismometers sample signals at rates typically between 100 and 200 Hz for broadband and regional monitoring, ensuring capture of seismic frequencies up to 50-100 Hz without aliasing.[110] This adheres to the Nyquist-Shannon sampling theorem, which requires the sampling frequency to exceed twice the maximum frequency of interest: f_s > 2·f_max.[111] Data are stored in standardized formats such as SEED (Standard for the Exchange of Earthquake Data), an international protocol developed for efficient archival and interchange of time-series seismic information, including metadata on station response and timing.[112][113]

Initial data processing transforms raw seismometer outputs into interpretable seismic information.
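The Nyquist-Shannon criterion just described can be illustrated with a short, self-contained sketch. The 100 Hz sampling rate mirrors the lower end of the rates quoted above, while the 80 Hz test tone is an arbitrary example of an undersampled signal.

```python
import math

def nyquist_ok(fs: float, f_max: float) -> bool:
    """Check the Nyquist-Shannon criterion fs > 2 * f_max."""
    return fs > 2 * f_max

def sample(freq: float, fs: float, n: int) -> list[float]:
    """Sample a unit-amplitude sine of the given frequency at rate fs."""
    return [math.sin(2 * math.pi * freq * k / fs) for k in range(n)]

fs = 100.0                         # 100 Hz sampling rate
assert nyquist_ok(fs, 40.0)        # 40 Hz: safely below the 50 Hz Nyquist limit
assert not nyquist_ok(fs, 80.0)    # 80 Hz: undersampled

# An undersampled 80 Hz sine is indistinguishable from a sign-flipped
# 20 Hz sine at 100 Hz sampling: the sample sequences agree to rounding error.
hi = sample(80.0, fs, 32)
alias = [-x for x in sample(20.0, fs, 32)]
max_diff = max(abs(a - b) for a, b in zip(hi, alias))
```

In practice, digitizers place an analog anti-alias filter ahead of the ADC so that out-of-band energy cannot fold back into the seismic band in the first place.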
Bandpass filtering removes noise outside the seismic band (e.g., 0.01-50 Hz), enhancing signal clarity by attenuating low-frequency cultural noise and high-frequency instrument artifacts.[114] For strong-motion accelerometers, integration converts acceleration to velocity and displacement traces, often using numerical methods like trapezoidal integration after baseline correction.[115] Three-component data undergo rotation to standard orientations, such as vertical-north-east (ZNE) or fault-parallel/perpendicular, to align with geological features and facilitate phase identification.[115] Software tools like ObsPy, an open-source Python library, support these processing steps by providing functions for reading SEED files, applying filters, performing integrations, and rotating components.[116][117] ObsPy also enables real-time streaming analysis, allowing immediate event detection during ongoing recordings.[118]

Seismic networks generate vast data volumes, often reaching terabytes per day from global arrays, necessitating efficient compression and storage solutions.[119] Clock synchronization poses another challenge, as precise timing is essential for correlating signals across stations; GPS provides sub-millisecond accuracy but can fail in remote or temporary deployments, leading to drift errors that require post hoc corrections.[120][121] By 2025, cloud-based processing has advanced early warning systems, enabling scalable, real-time analysis of streaming data from distributed networks for rapid hazard alerts.[122][119]

References
- https://wiki.seg.org/wiki/Analog_versus_digital_signal
