Mar01: Volcano Monitoring

When real-time data acquisition and analysis can save lives

Steve is a geologist specializing in volcanology, seismic monitoring, and software development. He can be contacted at [email protected].


Volcanoes are complex and destructive. Because entire populations live in volcanic areas, geologists study volcanoes to understand why, how, and when they erupt. The goal is straightforward: reduce losses and damage.

The internal behavior of a volcano produces many physical phenomena, including external signs that instruments can measure when the magma moves, when its chemical composition changes, or when its pressure or temperature varies. Expertise in interpreting these variations, coupled with a good knowledge of the volcano's history, provides an understanding of its behavior. Based on the collected data, volcanologists can then draw risk and hazard maps, try to predict when an eruption may occur (or compute a probability), and, if required, advise local authorities about evacuating certain areas.

The advent of personal computers and progress in electronics in the early 1980s helped scientists with these tasks and introduced new techniques, such as frequency-domain calculations using FFTs, for processing the data. In short, technology advances have been invaluable to geologists in the field.

Instrumental monitoring of volcanoes involves remote data acquisition, usually with automatic data processing and interpretation, linked to alarm systems. Computers make it dramatically easier for volcanologists to decipher the data. In essence, a network of instruments placed in the field works as a stethoscope, giving scientists an idea of what's going on inside the volcano. For instance, instruments usually record all earthquakes occurring in the volcano, since quakes provide a means of finding out whether the magma is moving in the volcano's plumbing (an indicator of an imminent eruption). Gas collection through drills informs geologists about variations in the magma's chemical nature. The pressure of the gas or steam affects the shape of the volcano, which can be measured with tiltmeters. Satellites can also be used to observe the stress or temperature of the edifice.

Since each volcano exhibits its own individual behavior, generic monitoring is a challenge. Some volcanoes erupt every 15 minutes, whereas others, such as Mt. Pinatubo in the Philippines, may erupt only once in several centuries without ever being extinct. Because dormant volcanoes seem quiet most of the time, it is useful to check whether the process leading to an eruption is unfolding at a slow pace. Given enough data, you may catch patterns of behavior that can't be distinguished by short-term observation, including cycles with a yearly period.

Clearly, volcano surveillance involves collecting, transmitting, and analyzing large amounts of data recorded from the physical indicators just described. If you want to sample one datum every minute and analyze your measurements over the last three months (or 20 years), you need a fast, optimized application because you're going to process a lot of data. As a number cruncher of geophysical data, especially for volcano-monitoring early-warning systems, I have developed several applications to process data in real time.

Data Collection

As Figure 1 illustrates, instruments record the activity of a volcano in real time. The collected data can either be stored for long-term analysis or analyzed on the fly. Everything depends on the type of measurement you focus on.

Seismic activity is usually recorded in real time at a high sampling rate, something like 100 to 200 samples per second. Since this represents a lot of data to store, the computer normally ignores the data as long as they remain within boundaries (called "trigger levels") determined by the volcanologists. If a seismic event exceeds those limits, the computer stores the data into files that can be retrieved and processed later. An event typically represents about one minute's worth of data, up to 12,000 points, or roughly a 30-KB file. Since you measure the event along the x-, y-, and z-axes and on a minimum of four recorders, you end up with at least 144,000 points per event.
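To give a feel for how such trigger-based capture might work, here is a minimal C++ sketch. It is not the actual acquisition code: the single-channel text input, the 0.8 trigger threshold, the five-second pre-event buffer, and the event file names are all hypothetical.

#include <cstdio>
#include <cmath>
#include <deque>
#include <vector>

int main()
{
    const int   kSampleRate   = 200;                 // samples per second (100 to 200 in practice)
    const float kTriggerLevel = 0.8f;                // hypothetical limit set by the volcanologists
    const int   kEventSamples = 60 * kSampleRate;    // about one minute's worth of data

    std::deque<float>  preBuffer;                    // a few seconds of pre-event context
    std::vector<float> event;
    int remaining = 0;                               // samples still to capture for the current event
    int eventId   = 0;

    FILE* in = std::fopen("seismic_stream.dat", "r"); // hypothetical single-channel input
    if (!in) return 1;

    float sample;
    while (std::fscanf(in, "%f", &sample) == 1) {
        if (remaining > 0) {                         // an event is being captured
            event.push_back(sample);
            if (--remaining == 0) {                  // event complete: store it for later processing
                char name[32];
                std::sprintf(name, "event%03d.dat", eventId++);
                if (FILE* out = std::fopen(name, "wb")) {
                    std::fwrite(event.data(), sizeof(float), event.size(), out);
                    std::fclose(out);
                }
                event.clear();
            }
        } else if (std::fabs(sample) > kTriggerLevel) {  // trigger level exceeded: start an event
            event.assign(preBuffer.begin(), preBuffer.end());
            event.push_back(sample);
            remaining = kEventSamples;
        } else {                                     // quiet period: keep only a short rolling buffer
            preBuffer.push_back(sample);
            if (preBuffer.size() > 5u * kSampleRate) preBuffer.pop_front();
        }
    }
    std::fclose(in);
    return 0;
}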

For measurements taking place at a slow pace (tilt variations or battery levels, for instance), you typically record one datum every minute, every 10 minutes, or even only twice a day. Usually, those data are stored in files for later interpretation, as they serve as a long-term image of the volcano's behavior over days or years. For example, the Fuego and Pacaya (in Guatemala) and the Cerro Negro (in Nicaragua) volcanoes are monitored, among other things, by a method called "RSAM" (see Figure 2), which quantifies seismic data when they are most needed: during the eruption, when the instruments are usually saturated. Only one datum is stored every 10 minutes, feeding the computer with just 144 points a day. But if you take a look at the volcano's behavior for, say, the last 10 years, you are confronted with about half a million points.
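As a rough illustration of this kind of reduction, the sketch below averages the absolute seismic amplitude over each 10-minute window and emits a single value per window, 144 points a day. The file names, the sample rate, and the use of a plain average are assumptions for illustration, not the observatories' actual RSAM implementation.

#include <cstdio>
#include <cmath>

int main()
{
    const int kSampleRate    = 100;                    // samples per second
    const int kWindowSamples = 10 * 60 * kSampleRate;  // one 10-minute window

    FILE* in  = std::fopen("seismic_stream.dat", "r"); // hypothetical file names
    FILE* out = std::fopen("rsam.dat", "w");
    if (!in || !out) return 1;

    double sum   = 0.0;
    long   count = 0;
    float  sample;
    while (std::fscanf(in, "%f", &sample) == 1) {
        sum += std::fabs(sample);                      // accumulate absolute amplitude
        if (++count == kWindowSamples) {               // window full: emit a single datum
            std::fprintf(out, "%f\n", sum / count);
            sum   = 0.0;
            count = 0;
        }
    }
    std::fclose(in);
    std::fclose(out);
    return 0;
}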

Data Processing

In both cases, there comes a time when users need to work with the data and display them in charts. This requires powerful computers and techniques to turn the collected data into something useful. If you have to wait hours for the computer to process the data every time you want to look at the results, the data are useless. Information taken from the instruments has to be quickly accessible.

Since we usually record only one point every 10 minutes or so, it would obviously save time if we could process each datum immediately upon acquisition. Why, then, do we store all of the data and process them in one huge bulk only when we need to display them? Because most of the acquisition equipment is located on the volcano itself, a highly corrosive and hostile environment. Instruments are powered by batteries connected to solar panels that can be covered with ash, feeding a low-voltage output to the recorders. External noise (wind, rain, cows, human activity, or whatever) can also influence the recorded data. All of these conditions affect data reliability, which can become lousy, and the information may seem to drift from reality.

It is also critical that users be able to modify the way data are processed, so they can apply correction factors. This is often problematic since monitoring systems can be located in developing countries where observatory staff may have limited computing skills and no resources for software development. Consequently, it isn't reasonable to have people in the observatories modify the formulas in the source code itself, recompile it, and launch the application again to see the new results. The only way to go is to let them customize the formulas in dialog boxes, as in Figure 3. This means that formulas have to be parsed and interpreted at run time. Interpreting a formula is an elaborate task, and it is redone for each data point you pass to the parser. In our case, processing half a million points can become a nightmare, even though computers are designed for such duties.
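The much-simplified sketch below shows why the parsing step matters. Purely for illustration, it assumes the user's correction is restricted to the form "a*x+b"; the point is that the formula text is parsed once, outside the data loop, so evaluating half a million points costs only arithmetic. The real application accepts arbitrary formulas and gets the same effect from UCalc, as shown later in Listing One.

#include <cstdio>
#include <vector>

struct LinearCorrection { double a, b; };          // the "compiled" form of the formula

// Parse the user's text once, outside the data loop.
bool parseCorrection(const char* text, LinearCorrection& out)
{
    return std::sscanf(text, "%lf*x+%lf", &out.a, &out.b) == 2;
}

int main()
{
    const char* userFormula = "0.98*x+1.5";          // hypothetical dialog-box input
    LinearCorrection f;
    if (!parseCorrection(userFormula, f)) return 1;

    std::vector<double> raw = { 12.1, 12.4, 11.9 };  // stand-in for the stored raw values
    for (double x : raw) {
        double corrected = f.a * x + f.b;            // evaluation is now plain arithmetic
        std::printf("%f\n", corrected);
    }
    return 0;
}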

For instance, in my volcano monitoring activities, I've developed an application (see Figure 4) to gather data from multiple instruments (gas sampling, temperatures, drill oxidation, battery levels, and the like) placed on the Kelud volcano in East Java, Indonesia (see Figure 5). Data are acquired at a low pace and always stored as raw values, without being processed. If you want to interpret your data or apply a correction to compensate for an artifact such as battery corrosion, you have to apply it to each datum you have recorded. I built my applications using Borland Delphi and Borland C++ Builder. The first versions used a freeware parser component, which was convenient but slow, and it showed its limitations as the data set grew in size. For six months' worth of samples, processing the data used to take 50 minutes on a 75-MHz Pentium. If the correction factor was wrong and I wanted to change it, that meant another 50 minutes waiting for the information to be displayed!

Also, the resulting information I needed was a combination of different parameters, and each parameter is a distinct variable in the parser. This meant that I needed as many parser components as I had variables (seven in my case). Luckily, I came across the UCalc Fast Math Parser (http://www.ucalc.com/), which lets me process a whole set of data in just 27 seconds per instrument (compared to 50 minutes per instrument previously). Another nice thing about UCalc is its ability to parse the formula once, then keep it in an optimized form so you can directly pass your stream of data in a loop (see Listing One) without reparsing the formula for each point. Thanks to UCalc, we no longer hesitate to clean, calibrate, or correct data immediately, and we no longer wait 50 minutes to do it.

The other benefit of the speed provided by UCalc is that it gives scientists the ability to play with the data. Intuition and guesswork are important factors in research, and I've seen volcanologists tweak the data by adjusting parameters in the formulas, thereby isolating information that was actually hidden in the measurements.

Once the data are processed, they are displayed as charts like Figure 6. Users can customize the charts and focus on the time windows of interest. For example, the Kelud volcano has been equipped with geophones placed directly in the crater lake. The crater works as an amplifier of the sonic signals generated by the rupture of the rocks. Since this lake is supplied only with rain water, there is no disturbance coming from a river or local life (the water temperature is about 113F/45C), thus avoiding any turbulence that could be recorded as noise.

Acoustic monitoring in this case provided reliable information about what was taking place inside the volcano at shallow depths and was also a good indicator of gas bubbling activity. It proved precious as a precursor detector for an eruption in February 1990, when the equipment started to record precursors three months beforehand. The same thing happened at the Taal volcano in the Philippines, where variations in the ultrasonic signals were detected one month before its eruption in March 1994.

By coupling those results with automated warning systems, you can then set up a reliable system to help prevent volcanic disasters. If the gas composition suddenly and dramatically changes, for instance, the computer can detect it and send a message to a pager, send e-mail, or turn on a siren. Volcanologists can then start working with the competent authorities to increase the safety of the active volcanic areas.
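A minimal sketch of such an alarm check appears below. The rate-of-change test, the gas-flux threshold, the sample readings, and the notification hook are all assumptions made for illustration, not a description of an observatory's actual alert chain.

#include <cstdio>
#include <cmath>

struct AlarmCheck {
    double lastValue;
    double maxDelta;     // largest change between readings considered "normal"
    bool   primed;       // true once at least one reading has been seen

    // Returns true when the new datum departs abruptly from the previous one.
    bool update(double value)
    {
        bool alarm = primed && std::fabs(value - lastValue) > maxDelta;
        lastValue = value;
        primed = true;
        return alarm;
    }
};

// Placeholder: a real system would page the duty volcanologist,
// send e-mail, or switch on a siren here.
void notifyObservatory(const char* message)
{
    std::printf("ALARM: %s\n", message);
}

int main()
{
    AlarmCheck so2 = { 0.0, 50.0, false };                  // hypothetical gas-flux limit
    const double readings[] = { 310, 322, 318, 540, 530 };  // made-up SO2 measurements
    for (double r : readings)
        if (so2.update(r))
            notifyObservatory("sudden change in gas composition");
    return 0;
}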

The Tiltmeters Application

Tiltmeters is a 32-bit Windows 95/NT application developed specifically for a team from the Institut de Physique du Globe (Grenoble) and the University of Chambéry (France). Its goal is to collect several different volcanic measurements (gas, tilt variations, fumarole temperatures, pressures, and oxidation) for a volcano in Indonesia. Since there are many different measurements, it was important to bring them all into one consistent application, letting users display the ones they want, group them by type of information, or apply different formulas to the raw data.

The user interface was designed to ease the manipulation of the information users want to display. They access the desired charts with simple button clicks, and everything is configurable directly from within a few dialog boxes. The main application window (Figure 4) displays all the data at once, grouped by similar types of information: all temperatures, all tilt values, and so on. Users can also change how charts are displayed (stacked or tiled) so that they can compare the values horizontally. Each chart can be maximized for use with tools such as zooming, and users can modify the way data are handled by editing the formulas applied to each raw value or by clicking scale controls (log scale, automatic scaling, and the like). Finally, data from each station's batteries can be checked at any time.

Conclusion

Instruments will never replace the expertise of the volcanologists in charge of the surveillance, but they can definitely help them make swift decisions during a crisis such as an eruption. Most of the daily routine work consists of studying volcano behavior, but when a volcano suddenly heads toward a potential disaster, the specialists have to be supported by computers so they can focus on essential tasks: directing authorities, deciding which areas should be evacuated, evaluating possible risks and losses, studying the evolution of the crisis, and so on. Such a stressful situation can last for days and can drain all of your energy.

Having quick access to all the data is an invaluable help to the scientists. Columns of numbers have never impressed anyone, but maps and charts can convince decision makers to act and save populations. In volcanology, "a picture can be worth a thousand lives." In this case, UCalc helped me put an early-warning system in place that, hopefully, will make a difference the day the volcano wakes up.

For More Information

The Surveillance and Prediction of Volcanic Activity: A Review of Methods and Techniques. UNESCO, 1971.

Ewert, J.W. and D.A. Swanson. Monitoring Volcanoes: Techniques and Strategies Used by the Staff of the Cascades Volcano Observatory, 1980-90. U.S.G.P.O., 1992.

Halbwachs, M. and J.C. Sabroux. "Capteurs d'ultrasons destinés à la prévision des tremblements de terre" ("Ultrasonic sensors for earthquake prediction"), 1994.

DDJ

Listing One

// variables to be used

float vbat, ta, ut, tf0, e1, po2, coef, incl ;
long vbatPtr, taPtr, utPtr, tf0Ptr, e1Ptr, po2Ptr, coefPtr, inclPtr ;

// variables holding the formulas
String vbatF, taF, utF, tf0F, e1F, po2F, coefF, inclF ;
long vbatFPtr, taFPtr, utFPtr, tf0FPtr, e1FPtr, po2FPtr, coefFPtr, inclFPtr ;

// Get the formulas from the dialog box
vbatF = VBatFormula->Text ;
taF   = TaFormula->Text   ;
utF   = UtFormula->Text   ;
tf0F  = Tf0Formula->Text  ;
e1F   = E1Formula->Text   ;
po2F  = Po2Formula->Text  ;
coefF = CoefFormula->Text ;
inclF = InclFormula->Text ;

// Define the variables
vbatPtr  = ucDefineVariable( "vbat" ) ;
taPtr    = ucDefineVariable( "ta"   ) ;
utPtr    = ucDefineVariable( "ut"   ) ;
tf0Ptr   = ucDefineVariable( "tf0"  ) ;
e1Ptr    = ucDefineVariable( "e1"   ) ;
po2Ptr   = ucDefineVariable( "po2"  ) ;
coefPtr  = ucDefineVariable( "coef" ) ;
inclPtr  = ucDefineVariable( "incl" ) ;

// Prepare the parser for each formula
vbatFPtr = ucParse( vbatF.c_str() ) ;
taFPtr   = ucParse( taF.c_str()   ) ;
utFPtr   = ucParse( utF.c_str()   ) ;
tf0FPtr  = ucParse( tf0F.c_str()  ) ;
e1FPtr   = ucParse( e1F.c_str()   ) ;
po2FPtr  = ucParse( po2F.c_str()  ) ;
coefFPtr = ucParse( coefF.c_str() ) ;
inclFPtr = ucParse( inclF.c_str() ) ;

// Now loop through the file "inputFile" and read all points. Most of them
// are then sent to the appropriate chart by chartNNN->AddPoint(time,value);
long timeVar   = 0 ;
long deltaTime = 600 ;   // time between samples (here, one datum every 10 minutes)
float dataVBat, dataTa, dataUt, dataTf0, dataE1, dataPo2, dataCoef, dataIncl ;

// read all eight raw values for one time step; stop at the end of the file
while( fscanf( inputFile, "%f %f %f %f %f %f %f %f", &dataVBat,  &dataTa,
   &dataUt, &dataTf0, &dataE1, &dataPo2, &dataCoef, &dataIncl ) == 8 ) {

   // set the variables values
   ucSetVariableValue( vbatPtr, dataVBat ) ;
   ucSetVariableValue( taPtr,   dataTa   ) ;
   ucSetVariableValue( utPtr,   dataUt   ) ;
   ucSetVariableValue( tf0Ptr,  dataTf0  ) ;
   ucSetVariableValue( e1Ptr,   dataE1   ) ;
   ucSetVariableValue( po2Ptr,  dataPo2  ) ;
   ucSetVariableValue( coefPtr, dataCoef ) ;
   ucSetVariableValue( inclPtr, dataIncl ) ;

   // evaluate each parsed formula on the current raw values
   // and pass the corrected values to the charts
   chartVBat->AddPoint( timeVar, ucEvaluate( vbatFPtr ) ) ;
   chartTa->AddPoint(   timeVar, ucEvaluate( taFPtr   ) ) ;
   ut  = ucEvaluate( utFPtr  ) ;    // kept in variables rather than charted
   chartTf0->AddPoint(  timeVar, ucEvaluate( tf0FPtr  ) ) ;
   e1  = ucEvaluate( e1FPtr  ) ;
   po2 = ucEvaluate( po2FPtr ) ;
   chartCoef->AddPoint( timeVar, ucEvaluate( coefFPtr ) ) ;
   chartIncl->AddPoint( timeVar, ucEvaluate( inclFPtr ) ) ;
   timeVar += deltaTime ;
}
// Now we are done with all the calculations
fclose( inputFile ) ;

ucReleaseExpr() ;
