Catastrophic events share characteristic nonlinear behaviors that are
often generated by cross-scale interactions and feedbacks among system
elements. These events result in surprises that cannot easily be predicted
based on information obtained at a single scale. Progress on catastrophic
events has focused on one of the following two areas: nonlinear dynamics
through time without an explicit consideration of spatial connectivity, or
spatial connectivity and the spread of contagious processes without a consideration
of cross-scale interactions and feedbacksref.
These approaches rarely have ventured beyond traditional disciplinary boundaries.
An interdisciplinary, conceptual, and general mathematical framework for
understanding and forecasting nonlinear dynamics through time and across
space has been proposed. Decisions that minimize the likelihood of catastrophic
events must be based on cross-scale interactions, and such decisions will
often be counterintuitive. Given the continuing challenges associated with
global change, approaches that cross disciplinary boundaries to include
interactions and feedbacks at multiple scales are needed to increase our
ability to predict catastrophic events and develop strategies for minimizing
their occurrence and impactsref.
Dilley and colleagues at the World
Bank divided most of the globe into 8 million grid cells of
about 25 km2 each. They then mapped the risks of human and economic
damage from 6 types of disaster, such as cyclones and landslides, on to
each one and built up a picture of the world's most exposed placesref.
The world's most vulnerable countries include Bangladesh, Nepal, Burundi,
Haiti and Taiwan. In these places, > 90% of people are at 'high risk' of
death from 2 or more types of disaster. The researchers define high-risk
areas as having the top 30% of risk compared with other areas of the world.
Although many of these areas were already known to be in danger, the report
provides a more sophisticated way to compare risks across countries and
regions, allowing governments and aid agencies to prioritize their resources.
Much of the damage and death that disasters cause is preventable: by building
earthquake-proof structures, for example. But repeated hits lock many of
the world's developing countries into a cycle that makes it difficult to
fund changes, especially as much aid goes into immediate relief efforts.
The World Bank plans to use its hotspot map to identify those countries
most in need and help them implement a preventive, rather than reactive,
approach to disasters. Its approach is already affecting homeowners in
Turkey, who must weather frequent earthquakes. When providing aid, the
World Bank requires them to buy insurance for their homes. This shifts
the responsibility for safe buildings from the government to the individual
and private-sector insurance companies. The World Bank also intends to
encourage governments to invest in measures such as flood embankments and
cyclone shelters by granting loans to countries who plan for disasters.
Countries have already started to request money specifically for risk management,
indicating that the message is getting through. We must stop making it
more complicated than it is : if you want to reduce problems after disasters,
you just have to protect people by giving them better housing, better education
and better health services.
volcanic eruptions :
History :
the eruption of the volcano Krakatau in the
Netherlands East Indies (Indonesia) in 1883 had worldwide impact. This
was perceived in the 3 quite different types of global propagation that
occurred after the eruption :
a rapid pressure wave, noticeable only to measuring instruments
followed a few hours later by the spread of the news of the event
succeeded by a slowly expanding optical phenomenon that lasted for a couple
of years.
Krakatau was the first natural catastrophe of global magnitude that was
almost immediately recognized as such throughout the world, largely thanks
to the recently installed worldwide telegraphic networkref.
It was associated with a tsunami in the Indian Ocean.
Assuming that most of the man-made energy, which is used for everything
from residential lighting and heating to manufacture and transport, will
end up in the form of heat, in 2001, the amount of heat energy produced
by volcanoes was 1000 times less than the energy consumed by the USA. Over
2001 and 2002, these volcanoes kicked out about 5 x 10^16 J/yr
- enough to power New York city for a few months. Single eruptions of one
or two volcanoes can make up a large part of the year's heat budget. Nyamuragira
in the Democratic Republic of Congo and Mount Etna in Italy contributed
about 40% of the volcanic energy total for 2001. When Mount St Helens in the northwest
USA erupted on 18 May 1980, killing 57 people, it released > 10^18
J of heat at once - about 20 times the total heat flow from all the volcanoes
studied in 2001ref
: that blast removed the mountaintop, created the biggest landslide in
recorded history, and deposited a blanket of ash as far away as Montana.
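The comparison above is straightforward arithmetic. A minimal sketch of it, using the figures quoted in the text plus two values that are assumptions of mine rather than from the source (the USA's total annual energy consumption and New York City's average power demand) :

# Rough arithmetic behind the volcanic-heat comparison in the text.
# Values marked ASSUMED are illustrative, not from the source.
SECONDS_PER_YEAR = 3.15e7
volcanic_heat = 5e16       # J/yr, total from the volcanoes surveyed (from the text)
st_helens_1980 = 1e18      # J, released by the 18 May 1980 blast (from the text)
us_energy_2001 = 1e20      # J/yr, ASSUMED order of magnitude for US consumption
nyc_power = 6e9            # W, ASSUMED average power demand of New York City

print(f"US consumption / volcanic heat : about {us_energy_2001 / volcanic_heat:.0f} times")
print(f"St Helens blast / yearly volcanic total : about {st_helens_1980 / volcanic_heat:.0f} times")
months = volcanic_heat / (nyc_power * SECONDS_PER_YEAR / 12)
print(f"5 x 10^16 J would power New York City for about {months:.1f} months")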
St Helens experienced a series of very small earthquakes that began
on 23 September 2004 : these tiny shudders are thought to be caused by
the underground movement of pressurized steam or gases released from rising
magma. Since the 1980 eruption, a dome of lava nearly 300 metres high has
built up within the crater at the mountain's summit. This forms a plug
that seals off the vent to the magma chamber below. Now magma seems to
be rising in the vent. It's possible that the hot gases it will release
might blow debris and ash out of the dome. There have been no sightings
of gas being released from the dome, for example, which would be expected
if magma is rising. There could be landslides and flows of debris from
the crater, or even flows of molten lava. The region around the volcano,
in the Cascades mountain range, is mostly uninhabited.
Surveillance :
remote sensing of seismic tremors
geodetic monitoring : global positioning system (GPS) measurements of changes
in ground level from satellites
satellite measurements of heat changes as rising magma warms a volcano's
crater; this is still an imperfect science that involves subjective judgements
After spending more than a year assessing 169 active volcanoes in the USA
and the Mariana Islands, experts have identified the volcanoes that pose
the greatest threat to people and property. The US Geological Survey (USGS)
report, issued on 29 April 2005, also pins down what steps should be taken
to fill the gaps in the current monitoring systems for those volcanoes.
We know that many of the most damaging effects of eruptions can be mitigated
if proper monitoring is in place. Unlike experts on earthquakes and many
other natural disasters, volcanologists can do a fairly good job of predicting
when disaster will strike, provided they have studied the volcano in question.
The flanks of a volcano may swell up because of the magma inside, or small
tremors may be provoked, and these hints can be used to forecast volcanic
activity. The eruptions of volcanoes that have been heavily studied for
years, such as Italy's Mount Etna, can be reliably predicted to within
hours or even minutes. But for volcanoes that have not been studied and
are not being monitored, researchers have little hope of predicting an
eruption. We know there are many under-monitored volcanoes in the USA.
The USGS report ranks each volcano based on the characteristics of its
eruption (whether it would spew fast- or slow-moving lava, for example),
how often it tends to erupt, whether there are major developments in the
area, how close the nearby airports are, and how many passengers fly over
it each day. The results include a top-10 list of the most dangerous volcanoes
: the ranking shown below reflects both the destructive
capacity of each volcano and the people and property at risk.
Kilauea, Hawaii : erupting now
St Helens, Washington : erupting now
Rainier, Washington
Hood, Oregon
Shasta, California
South Sister, Oregon
Lassen Volcanic Center, California
Mauna Loa, Hawaii : signs of unrest
Redoubt, Alaska
Crater Lake, Oregon
The volcanoes highlighted as priorities include five volcanoes that are
currently erupting or showing unrest, such as Mount St Helens in Washington;
13 very threatening volcanoes that are inadequately monitored, most of
which are in the Cascade Range on the west coast of North America; and
19 volcanoes that pose a great threat to aviation safety but have no real-time
sensors for volcanic activity, many of which are in Alaska. Guffanti notes
that people often rush to monitor a volcano only after it has erupted and
proved that it is dangerous. But this is a bad idea : it is difficult and
risky to play catch-up with a volcano. In 2002, for example, hundreds of
thousands of people were forced to evacuate and dozens of people died in
the eruption of Nyiragongo volcano near Goma in the Democratic Republic of Congo. Scientists
scrambled to improve monitoring equipment at that volcano only after the
eruption took place. Japan, on the other hand, is known for its proactive
approach to monitoring volcanoes and for applying the most advanced technology
to that quest. We have a lot to gain by looking at what's going on in Japan.
In order to address the gaps in monitoring identified by its report, the
USGS suggests setting up a National Volcano Early Warning System. Among
other things, the system would enhance the monitoring of the most dangerous
volcanoes, establish a watch office that would run 24 hours a day, seven
days a week, and aid the sharing of data between observatories. "This is
the most thorough and quantitative effort I've seen for focusing resources
on the volcano hazards problem." The USGS plans to polish the warning-system
proposal with input from other federal agencies, local officials and businesses.
The report's authors hope that the framework will be phased in over the
next 8-10 years.
Top 10 volcanoes by heat output 2001-2002
(x 10^15 J/yr) :
Kilauea, Hawaii : 13
Nyamuragira, Democratic Republic of Congo : 9
Etna, Italy : 6
Fuego, Guatemala : 3
Soufrière Hills, Montserrat, West Indies : 3
Shiveluch, Kamchatka : 2
Erta Ale, Ethiopia : 2
Ambrym, Vanuatu, South Pacific : 2
Piton de la Fournaise, Reunion, Indian Ocean : 2
Nyiragongo, Democratic Republic of Congo : 2
An eruption of Mount Vesuvius, near Naples in Italy, could kill as many
as the Indonesian tsunami, but lessons learned from the 1997 eruption on
the Caribbean island of Montserrat could reduce the damage. Vesuvius is
one of the most serious problems facing Europe. The activity at the Soufrière
Hills Volcano in Montserrat began in 1995, but the worst damage happened
in 1997, when fast-flowing streams of hot gas and ash, called pyroclastic
flows, struck several villages. The same kind of eruption devastated Pompeii
in 79 AD. A pyroclastic flow on 26 December 1997, already hot enough to
ignite furniture and other materials, became even more damaging as it picked
up debris and hurled it at buildingsref.
This volcanic flow was not the "turbulent mess" that has often been assumed.
The pressure of the hot gas was surprisingly low, and it caused much of
its damage by entering buildings through open windows and doors. These
streams became highly focused. Like gusts of wind, they come in one window
and go out another. A person cowering in a corner might escape the scorching
material untouched. People have assumed that pyroclastic flows would batter
buildings down, but the Montserrat study shows that heat-resistant coverings
on windows and doors could greatly reduce damage to buildings. The Neapolitan
authorities plan a mass evacuation if Vesuvius threatens to blow, as one
day it surely will, but it could take at least 5 days to clear the area,
and pyroclastic flows might strike sooner than that. Protective barriers
for windows might save lives. The Montserrat data will be used to develop
a computer model of an eruption of Vesuvius. A new study of the effects
of such flows on the Roman city of Pompeii might aid their effortsref.
The temperatures in Pompeii were estimated from the amount of original magnetism
left unerased by heating in rocks and building fragments. By mapping the temperatures
throughout the remains, the researchers could see how the shapes and arrangements
of buildings and streets set up turbulence that could have cooled the flow
in some places. The findings paint a bleak picture. The town, 9 km southeast
of Vesuvius, was smothered in about 2.5 m of ash even before the 300 °C
pyroclastic flows struck, choking life and caving in roofs. And changes
in the flow swirling over walls didn't seem to reduce the temperature below
about 100 °C, so survivors of the ash would have burned to death. Civil
engineers are interested in the temperature measurements at Pompeii because
they offer clues about how soon emergency vehicles could drive into a town
struck by pyroclastic flows without their tyres melting.
A cluster of tiny underwater volcanoes off the northeastern coast of
Japan has demonstrated that the Earth's upper mantle may not be as solid
as was once thought. Regions of the mantle contain molten material that
can leak out on to the surface through cracks in the plate aboveref.
Most volcanoes on the planet arise where tectonic plates either part or
collide — as around the 'Pacific ring of fire', for example, where volcanoes
sprout up on the edges of the Pacific Ocean plate. Other volcanoes, such
as Hawaii, poke through the middle of a plate thanks to hot material seeping
up from deep below. The prevailing theory is that these volcanoes are fed
by a 'plume' of heat coming from so deep within the planet that the hot
spot stays relatively still while the tectonic plate slides past above
it, creating a chain of volcanic islands on the surface. Now researchers
have investigated, in depth, a class of tiny volcanoes that pepper the
seafloor, which they suspect may be created a different way. Clues of the
volcanoes' existence came a few years ago when a robotic submarine recovered
young basalt rock, which is characteristic of volcanoes, in an area some
600 km from where the Pacific plate dips below Japan — not close enough
to be characterized as a volcano fed by a plate boundary (Hirano, N. et
al. Geophys. Res. Lett. 28, 2719-2722 (2001)). To learn more, Naoto
Hirano of the Tokyo Institute of Technology, Japan, and his colleagues
deployed the 26-ton, three-person submersible, SHINKAI (Japanese for 'deep
sea'), to the region. After descending nearly 6 km below sea level, the
crew of the SHINKAI traversed lava fields littered with large black lava
rocks to travel up and down the sides of the volcanoes, which were about
1 kilometre in diameter, taking samples of rock as they went. Analysis
of the rock composition and the structural features of the nearby crust
suggested that the lava source was molten material from the upper-mantle
region called the asthenosphere — the 'plastic' part of the mantle, some
90-400 kilometres below the surface, upon which tectonic plates move about.
The researchers point out that the Pacific plate bends sharply near where
the mini-volcanoes are found, as it dips down under Japan. Perhaps this
stress caused nearby areas of the plate to crack, they suggest, allowing
hot asthenosphere material to seep up. If the hypothesis is correct, then
we should see this type of volcano in lots of places on the Earth, particularly
near subduction zones with similar plate flexing. Lavas from some Hawaiian
volcanoes, as well as some western Samoan islands, carry a similar chemical
composition to the mini Japanese volcanoes. There are tens of thousands
of small volcanoes on the Pacific seafloor, some along the spreading zone
where new plate is being formed and volcanism would be expected, but many
others elsewhere. Sometimes you see these little pimples. It seems obvious,
he adds, that these aren't on a plate boundary and are too small to be
made by 'plumes', so something else must be responsible. Scientists proposed
decades ago that parts of the asthenosphere may be molten, but they ran
into trouble explaining how such a region could sustain itself. We started
to make calculations and found that if it was really molten, the melts
would creep upwards and it would dry up very quickly. If it leaks, at some
point you reach the stage at which the low-melting-point rock is depleted,
and there's nothing left that can melt. So scientists largely turned away
from the molten-asthenosphere hypothesis. But the new data from Hirano
and his colleagues cause the old question to resurface: how does the asthenosphere
survive if it is constantly leaking? One possible explanation is that these
areas are fed by molten material created by a deep-sourced plume of heat.
A supply of material from deeper in the mantle could prevent the asthenosphere
from slowly wasting away.
Volcanoes can cool the planet by keeping methane-producing bacteria
at bay. Scientists already knew that a major volcanic eruption can cool
the Earth for a couple of years, because the particles and chemicals thrown
into the air make clouds that reflect sunlight back out into space. But
the sulphur dioxide in volcanic plumes also affects the climate indirectly.
Falling in acid rain, these fumes can feed sulphur-loving bacteria in wetlands,
allowing them to out-compete methane-generating bacteria, and so reducing
the amount of methane emitted into the atmosphere. About 50% of the world's
methane - a greenhouse gas stronger than carbon dioxide - comes from bacteria
in peat bogs and rice paddy fields. The sulphurous effects of a volcanic
eruption were simulated by adding the mineral sodium sulphate to a peat
bog in northeast Scotland : the area was fertilized in 1998, and methane
emissions were 40% lower 2 years later. The bog still contained elevated
levels of sulphur; it would take 5-10 years from time of fertilisation
for methane emissions to return to normalref.
The experiment was intended to recreate the acid-rain fallout from the
enormous 1783 eruption of the volcano Laki in Iceland, which emitted >
120 million tonnes of sulphur dioxide, 10 times more than western European
industry produces each year. Iceland has events like this every few hundred
years, and there are volcanoes that continuously pump out sulphur for decades.
But simulating Laki's impact on a small area of Scotland requires a fair
amount of guesswork : we have really good estimates of how much sulphur
came out of Laki, but we don't know exactly where it rained out as acid
rain. Volcanic sulphur might have a bigger climatic impact than industrial
emissions, simply because most wetlands are closer to volcanoes than to
factories or power plants. Further research should pin down how much global
cooling volcanoes can trigger in this way. But in pre-industrial times,
when volcanoes were virtually the sole source of acid rain, these sudden
fertilization events could have driven very rapid climate shifts. 50 million
years ago, the warm greenhouse climate of the day was due, in large part,
to methane from the extensive wetlands that covered the Earth : during
that time, large volcanic eruptions could have been real agents of rapid
climate change through this mechanism. Some scientists have suggested that
fertilizing paddy fields with sulphur could slow down climate change. Industry
and volcanoes may already produce enough acid rain to maximize the bug-boosting
effect.
Web resources :
Exploris
: a European project which is studying the human risks of eruptions and
how to mitigate them. As well as Vesuvius and Montserrat, Exploris is looking
at hazards posed by Mount Teide in the Canary Islands, Sete Cidades in
the Azores, and others.
Stromboli Online, hosted
by Italian researchers Roberto Carniel and Marco Fulle and Swiss teacher
Jürg Alean
storm / tempest : an individual low-pressure
disturbance, complete with winds, clouds, and precipitation. The name is
associated with destructive or unpleasant weather
wind :
Names of winds :
abroholos : a squall frequent from May through August between Cabo
de Sao Tome and Cabo Frio on the coast of Brazil.
austru : an east or southeast wind in Romania. It is cold
in winter and may be a local name for a foehn wind
Bali wind : a strong east wind at the eastern end of Java.
Barat : a heavy northwest squall in Manado Bay on the north coast
of the island of Celebes, prevalent from December to February.
Barber : a strong wind carrying damp snow or sleet and spray that
freezes upon contact with objects, especially the beard and hair.
Bayamo : a violent wind blowing from the land on the south coast
of Cuba, especially near the Bight of Bayamo.
Bentu de Soli : an east wind on the coast of Sardinia.
Bora : a cold, northerly wind blowing from the Hungarian basin into
the Adriatic Sea
Borasco : a thunderstorm or violent squall, especially in the Mediterranean.
Boreas : an ancient Greek name for north winds. The term may originally
have meant "wind from the mountains" and thus the present term BORA
Brickfielder : a wind from the desert in Southern Australia. Precedes
the passage of a frontal zone of a low passing by. Has the same dusty character
as the Harmattan
Brisa, Briza : a northeast wind which blows on the coast of South
America or an east wind which blows on Puerto Rico during the trade wind
season. 2. The northeast monsoon in the Philippines.
Brisote : the northeast trade wind when it is blowing stronger than
usual on Cuba.
Brubu : a name for a squall in the East Indies.
bull's eye squall : a squall forming in fair weather, characteristic
of the ocean off the coast of South Africa. It is named for the peculiar
appearance of the small isolated cloud marking the top of the invisible
vortex of the storm.
Cape doctor : the strong southeast wind which blows on the South
African coast
Caver, Kaver : a gentle breeze in the Hebrides.
Chubasco : a violent squall with thunder and lightning, encountered
during the rainy season along the west coast of Central America.
Churada : a severe rain squall in the Mariana Islands during the
northeast monsoon. They occur from November to April or May, especially
from January through March.
Contrastes : winds a short distance apart blowing from opposite
quadrants, frequent in the spring and fall in the western Mediterranean.
Cordonazo : the "Lash of St. Francis." Name applied locally to southerly
hurricane winds along the west coast of Mexico. It is associated with tropical
cyclones in the southeastern North Pacific Ocean. These storms may occur
from May to November, but ordinarily affect the coastal areas most severely
near or after the Feast of St. Francis, October 4.
Coromell : a night land breeze prevailing from November to May at
La Paz, near the southern extremity of the Gulf of California.
Diablo : Northern California version of Santa Ana winds. These winds
occur below canyons in the East Bay hills (Diablo range) and in extreme
cases can exceed 60 mph. They develop due to high pressure over Nevada
and lower pressure along the central California coast
Doctor : a cooling sea breeze in the Tropics. 2. See HARMATTAN.
3. The strong SE wind which blows on the south African coast. Usually called
Cape Doctor
Elephanta : a strong southerly or southeasterly wind which blows
on the Malabar coast of India during the months of September and October
and marks the end of the southwest monsoon.
Etesian : a refreshing northerly summer wind of the Mediterranean,
especially over the Aegean Sea.
Euros : the Greek name for the rainy, stormy southeast wind
Foehn : a warm dry wind on the lee side of a mountain range, whose
temperature is increased as the wind descends the slope. It is created
when air flows downhill from a high elevation, raising the temperature
by adiabatic compression (a rough worked example follows the examples below).
Classified as a katabatic wind. Examples include
Chinook / snow eater : a type of foehn wind. Refers to the warm
downslope wind in the Rocky Mountains that may occur after an intense cold
spell when the temperature could rise by 20 to 40°F in a matter of
minutes
Santa Ana : a strong, hot, dry wind blowing out into San Pedro Channel
from the southern California desert through Santa Ana Pass.
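As a rough worked example of the adiabatic warming described under Foehn above : assuming the standard dry-adiabatic rate of about 9.8 °C of warming per kilometre of descent (the elevations and starting temperature below are invented for illustration) :

# Toy illustration of foehn / chinook warming by dry-adiabatic descent.
# The lapse rate is the standard dry-adiabatic value; the height drop and
# starting temperature are made-up examples, not from the source.
DRY_ADIABATIC_LAPSE = 9.8   # °C of warming per km of descent

def descent_warming(start_temp_c, drop_km):
    """Temperature of the air after descending drop_km, ignoring moisture effects."""
    return start_temp_c + DRY_ADIABATIC_LAPSE * drop_km

# Example : air at -5 °C spilling over a 2.5 km ridge down to the plains
print(descent_warming(-5.0, 2.5))   # about 19.5 °C at the foot of the slope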
Fremantle Doctor : a cooling sea breeze in Western Australia, often
noted during hot summer-time cricket matches
Gregale : a strong northeast wind of the central Mediterranean.
Haboob : a strong wind and sandstorm (or duststorm) in the northern
and central Sudan, especially around Khartoum, where the average number
is about 24 per year. The name comes from the Arabic word, "habb", meaning
wind
Harmattan : the dry, dusty trade wind blowing off the Sahara Desert
across the Gulf of Guinea and the Cape Verde Islands. Sometimes called
the DOCTOR, because of its supposed healthful properties.
Knik wind : a strong southeast wind in the vicinity of Palmer, Alaska,
most frequent in the winter.
Kona storm : a storm over the Hawaiian Islands, characterized by
strong southerly or southwesterly winds and heavy rains.
Leste : a hot, dry, easterly wind of the Madeira and Canary Islands.
Levanter : a strong easterly wind of the Mediterranean, especially
in the Strait of Gibraltar, attended by cloudy, foggy, and sometimes rainy
weather especially in winter.
Levantera : a persistent east wind of the Adriatic, usually accompanied
by cloudy weather.
Levanto : a hot southeasterly wind which blows over the Canary Islands.
Leveche : a warm wind in Spain, either a foehn or a hot southerly
wind in advance of a low pressure area moving from the Sahara Desert. Called
a SIROCCO in other parts of the Mediterranean area.
Maestro : a northwesterly wind with fine weather which blows, especially
in summer, in the Adriatic. It is most frequent on the western shore. This
wind is also found on the coasts of Corsica and Sardinia.
Maria : a fictional wind popularized in "Paint Your Wagon" (Lerner
and Lowe, 1951) and by the Kingston Trio (1959), whose name may have originated
with the 1941 book "Storm" by George R. Stewart. (Jan Null, Golden Gate
Weather Services)
Matanuska wind : a strong, gusty, northeast wind which occasionally
occurs during the winter in the vicinity of Palmer, Alaska.
Mistral / Cierzo : a cold, dry wind blowing from the north over
the northwest coast of the Mediterranean Sea, particularly over the Gulf
of Lions. See also FALL WIND.
monsoon : a seasonal wind that brings rain
to many places of the world, for example India and Southern Asia. The Indian
monsoon, which waters India's agriculture, could run dry because of human
impacts on the environment, a team of climate researchers has warned. Kirsten
Zickfeld and her colleagues at the Potsdam
Institute for Climate Impact Research say that the monsoon has two
major settings: on, as at present, and off, when it produces very little
rainfall. A switch-off would be catastrophic for India's main crop, rice,
which depends on heavy monsoon rains. Even a minor change in monsoon timing
or intensity can have a big impact. If the rains are delayed by just a
few days, that affects the agricultural yields. The monsoon's disappearance
would wreak havoc, probably requiring Indian farmers to completely change
their crops and methods. Changes in land use and air pollution on the Indian
subcontinent are pushing conditions towards the off state. They don't know
if or when it might happen, but they say there is cause for concern. The
monsoon is driven by an air-pressure difference between the land and the
Indian Ocean. Usually, the hot season creates low-pressure zones over the
warm continent. Air rushes in from the higher-pressure zone over the water,
bringing rain to the land. Anything that reduces this pressure difference
- such as cooler land temperatures - can weaken the monsoon. And once the
weakening exceeds a certain threshold, the climate switches into a new
state in which moist air over the ocean is no longer carried inlandref
(a toy sketch of this switch appears at the end of this entry).
In India and southeast Asia, several factors are causing less sunlight
to warm the ground. There are more aerosols, because of industrial growth
and greater vehicle use, which reflect light back into space. And clearing
forests for farmland is replacing dark, light-absorbing treetops with lighter,
more reflective soil. "This raises a red flag", says Hans Joachim
Schellnhuber, Zickfeld's co-worker and director of the Tyndall
Centre for Climate Change Research in Norwich, UK. "If we continue
to change land cover, and at the same time aerosols increase, we're moving
towards the 'off' point." There are signs that the Chinese monsoon is weakening,
perhaps for the same reasons. Their model is crude, so it can't predict
when, or even if, current trends will trigger a change in the state of
the monsoon. But their work's simplicity also has its advantages. Complicated
computer models have suggested that the monsoon might change, but it has
been hard to pick out the crucial causes. Global
warming,
caused by rising levels of atmospheric carbon dioxide, might make the monsoon
more intense, increasing rainfall. That could be equally bad for the area,
as illustrated by the 1,000 people killed in Mumbai last week in floods
due to abnormally heavy monsoon rains. The worst case would be what Zickfeld's
team calls a 'roller-coaster effect': drying of the monsoon, followed by
the return of an even more intense wet monsoon as aerosol emissions are
cleaned up but carbon dioxide goes on increasing. Such a series of changes
would seriously challenge the adaptive capabilities of India's rural society.
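The on/off switch described in the monsoon entry above can be caricatured as a threshold on the land-sea pressure contrast; the sketch below is a toy illustration with invented numbers, not the Potsdam group's actual model :

# Toy illustration of a monsoon-like threshold switch. All numbers invented.
PRESSURE_THRESHOLD = 2.0   # hPa land-sea pressure difference below which the circulation collapses

def monsoon_rainfall(land_sea_pressure_diff_hpa):
    """Relative rainfall : 1.0 is the present-day 'on' state, near 0 is the 'off' state."""
    if land_sea_pressure_diff_hpa < PRESSURE_THRESHOLD:
        return 0.05                                       # 'off' state : very little rain
    return min(1.5, land_sea_pressure_diff_hpa / 4.0)     # 'on' state scales with the contrast

# Cooling the land (more aerosols, brighter cleared ground) shrinks the contrast :
for diff in (4.0, 3.0, 2.5, 1.9):
    print(diff, "hPa ->", monsoon_rainfall(diff))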
Nashi, N'aschi : a northeast wind which occurs in winter on the
Iranian coast of the Persian Gulf, especially near the entrance to the
gulf, and also on the Makran coast. It is probably associated with an outflow
from the central Asiatic anticyclone which extends over the high land of
Iran. It is similar in character but less severe than the BORA.
Norte : a strong cold northeasterly wind which blows in Mexico and
on the shores of the Gulf of Mexico. It results from an outbreak of cold
air from the north. It is the Mexican extension of a norther.
Nor'easter / Northeaster : a northeast wind, particularly a strong
wind or gale; an unusually strong storm preceded by northeast winds
off the coast of New England
Nor'wester : this is a very warm wind which can blow for days on
end in the province of Canterbury New Zealand. The effect is especially
felt in the city of Christchurch. The wind comes in from the Tasman Sea,
dries as it rises over the Southern Alps, heats as it descends, crosses the
Canterbury Plains, then blows through Christchurch
(Blue) Norther : a cold strong northerly wind in the Southern Plains
of the United States, especially in Texas, which results in a drastic drop
in air temperatures
Ostria : a warm southerly wind on the Bulgarian coast; considered
a precursor of bad weather
Pali : a local name for strong winds which blow through the Pali
Pass above Honolulu, HI
Pampero : a west or southwest wind in Southern Argentina. This wind
often picks up violently during the passage of a cold front of an active
low
Papagayo : a violent northeasterly fall wind on the Pacific coast
of Nicaragua and Guatemala. It consists of the cold air mass of a norte
which has overridden the mountains of Central America
Shamal : a summer northwesterly wind blowing over Iraq and the Persian
Gulf, often strong during the day, but decreasing at night.
Sharki : a southeasterly wind which sometimes blows in the Persian
Gulf.
simoon : dry, hot, dust-laden wind, which blows
in the Sahara, Palestine, Syria and the desert of Arabia.
Sirocco : a warm wind of the Mediterranean area, either a foehn
or a hot southerly wind in advance of a low pressure area moving from the
Sahara or Arabian deserts. Called Leveche in Spain.
Squamish : a strong and often violent wind occurring in many of
the fjords of British Columbia. Squamishes occur in those fjords oriented
in a northeast-southwest or east-west direction where cold polar air can
be funneled westward. They are notable in Jervis, Toba, and Bute inlets
and in Dean Channel and Portland Canal. Squamishes lose their strength
when free of the confining fjords and are not noticeable 15 to 20 miles
offshore.
Suestado : a storm with southeast gales, caused by intense cyclonic
activity off the coasts of Argentina and Uruguay, which affects the southern
part of the coast of Brazil in the winter.
Sumatra : a squall with violent thunder, lightning, and rain, which
blows at night in the Malacca Straits, especially during the southwest
monsoon. It is intensified by strong mountain breezes.
Sundowner : warm downslope winds that periodically occur along a
short segment of the Southern California coast in the vicinity of Santa
Barbara. The name refers to their typical onset (on the populated coastal
plain) in the late afternoon or early evening, though they can occur at
any time of the day. In extreme cases, wind speeds can be of gale force
or higher, and temperatures over the coastal plain and even at the coast
itself can rise significantly above 37.8°C (100°F)
Taku wind : a strong, gusty, east-northeast wind, occurring in the
vicinity of Juneau, Alaska, between October and March. At the mouth of
the Taku River, after which it is named, it sometimes attains hurricane
force.
Tehuantepecer : a violent squally wind from north or north-northeast
in the Gulf of Tehuantepec (south of southern Mexico) in winter. It originates
in the Gulf of Mexico as a norther which crosses the isthmus and blows
through the gap between the Mexican and Guatemalan mountains. It may be
felt up to 100 miles out to sea
Tramontana : a northeasterly or northerly winter wind off the west
coast of Italy. It is a fresh wind of the fine weather mistral type.
Vardar / Vardarac : a cold fall wind blowing from the northwest
down the Vardar valley in Greece to the Gulf of Salonica. It occurs when
atmospheric pressure over eastern Europe is higher than over the Aegean
Sea, as is often the case in winter
warm braw : a foehn wind in the Schouten Islands north of New Guinea.
white squall : a sudden, strong gust of wind coming up without warning,
noted by whitecaps or white, broken water; usually seen in whirlwind form
in clear weather in the tropics.
Williwaw : a sudden blast of wind descending from a mountainous
coast to the sea, in the Strait of Magellan or the Aleutian Islands.
Willy-willy : a tropical cyclone (with winds > 33 knots) in Australia,
especially in the southwest. More recent common usage is for dust-devils.
Zephyros : the ancient Greek name for the west wind, which is generally
light and beneficial. It has evolved into "zephyr" which denotes
a soft gentle breeze
Stevenson screen : a white box with ventilated sides that is used
to house weather instruments and protect them from direct sunlight
Beaufort wind speed scale
: a scale of numbers representing different wind speeds and a description
of their effects on land or sea. It was invented by Admiral Beaufort in
the early 19th century.
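A commonly quoted empirical fit to the scale is v = 0.836 x B^(3/2) m/s; inverting it gives a rough Beaufort force number from a measured wind speed. This is an approximation, not the official descriptive table :

# Approximate conversion from wind speed to Beaufort force number,
# using the empirical relation v = 0.836 * B**1.5 (m/s). Approximation only.
def beaufort_from_speed(speed_ms):
    b = round((speed_ms / 0.836) ** (2.0 / 3.0))
    return max(0, min(12, b))   # the classical scale stops at force 12 (hurricane)

for v in (0.5, 5.0, 15.0, 33.0):   # m/s
    print(f"{v:5.1f} m/s -> force {beaufort_from_speed(v)}")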
anemometer: an instrument that measures the speed or force of the
wind.
barometer : an instrument for determining the pressure of the atmosphere
isobar : the line drawn on a weather map connecting points of equal
barometric pressure
closest point of approach : point where hurricane eye makes closest
contact to shore without actually making landfall
cold front : the leading edge of an advancing cold air mass that
is underrunning and displacing the warmer air in its path. Generally, with
the passage of a cold front, the temperature and humidity decrease, the
pressure rises, and the wind shifts (usually from the southwest to the
northwest in the Northern Hemisphere). Precipitation is generally at and/or
behind the front, and with a fast-moving system, a squall line may develop
ahead of the front. See occluded front and warm front
convergence : wind movement that results in a horizontal net inflow
of air into a particular region. Convergent winds at lower levels are associated
with upward motion. Contrast with divergence
depression : in meteorology, it is another name for an area of low
pressure, a low, or trough. It also applies to a stage of tropical cyclone
development and is known as a tropical depression to distinguish it from
other synoptic features.
filling : used in describing the history of a low-pressure system
or an area of cyclonic circulation, it means an increase in the central
pressure of the system. Although it usually describes the action of a pressure
system on a constant pressure chart, it also means a surface low is decreasing
in cyclonic circulation and losing its characteristics
rain : precipitation in the form of liquid water
droplets greater than 0.5 mm. If widely scattered, the drop size may be
smaller. It is reported as "R" in an observation and on the METAR. The
intensity of rain is based on rate of fall. "Very light" (R--) means that
the scattered drops do not completely wet a surface. "Light" (R-) means
it is greater than a trace and up to 0.10 inch an hour. "Moderate" (R)
means the rate of fall is between 0.11 to 0.30 inch per hour. "Heavy" (R+)
means over 0.30 inch per hour
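The intensity codes above reduce to a lookup on the rate of fall; a minimal sketch using the thresholds exactly as given in this entry :

# Rain intensity code from rate of fall (inches per hour), per the thresholds above.
# "Very light" is reported when scattered drops do not completely wet a surface,
# which a rate alone cannot capture, so it is flagged separately here.
def rain_intensity(rate_inch_per_hr, scattered_drops=False):
    if scattered_drops:
        return "R--"            # very light
    if rate_inch_per_hr <= 0.10:
        return "R-"             # light : a trace up to 0.10 inch per hour
    if rate_inch_per_hr <= 0.30:
        return "R"              # moderate : 0.11 to 0.30 inch per hour
    return "R+"                 # heavy : over 0.30 inch per hour

print(rain_intensity(0.05), rain_intensity(0.2), rain_intensity(0.5))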
drought : a prolonged period with very little
or no rain
deepening : used in describing the history of a low-pressure system
or an area of cyclonic circulation, it means a decrease in the central
pressure of the system. Although it usually describes the action of a pressure
system on a constant pressure chart, it also means a surface low is increasing
in cyclonic circulation and acquiring more energy
explosive deepening : a decrease in the minimum sea-level pressure
of a tropical cyclone of 2.5 mb/hr for at least 12 hours or 5 mb/hr for
at least 6 hours.
rapid deepening : a decrease in the minimum sea-level pressure of
a tropical cyclone of 1.75 mb/hr or 42 mb for 24 hours.
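The explosive- and rapid-deepening definitions above are rate tests on central pressure; a minimal sketch, assuming hourly pressure readings in mb (the example storm track is invented) :

# Flag explosive or rapid deepening from hourly central-pressure readings (mb),
# using the rate thresholds given in the definitions above.
def deepening_rate(pressures_mb, hours):
    """Average pressure fall in mb/hr over the last `hours` hours of readings."""
    if len(pressures_mb) <= hours:
        return 0.0
    return (pressures_mb[-1 - hours] - pressures_mb[-1]) / hours

def classify_deepening(pressures_mb):
    if deepening_rate(pressures_mb, 12) >= 2.5 or deepening_rate(pressures_mb, 6) >= 5.0:
        return "explosive deepening"
    if deepening_rate(pressures_mb, 24) >= 1.75:
        return "rapid deepening"
    return "no rapid deepening"

# Hypothetical storm dropping 3 mb per hour for 13 hours :
track = [990 - 3 * h for h in range(14)]
print(classify_deepening(track))   # -> explosive deepening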
disturbance : this has several applications. It can apply to a low
or cyclone that is small in size and influence. It can also apply to an
area that is exhibiting signs of cyclonic development. It may also apply
to a stage of tropical cyclone development and is known as a tropical disturbance
to distinguish it from other synoptic features
front : the boundary between two dissimilar air masses
occluded front : the front formed by a cold front overtaking a warm
or stationary front and lifting the warm air above the earth's surface
warm front : the leading edge of an advancing warm air mass that
is replacing a retreating relatively colder air mass. Generally, with the
passage of a warm front, the temperature and humidity increase, the pressure
rises, and although the wind shifts (usually from the southwest to the
northwest in the Northern Hemisphere), it is not as pronounced as with
a cold frontal passage. Precipitation, in the form of rain, snow, or drizzle,
is generally found ahead of the surface front, as well as convective showers
and thunderstorms. Fog is common in the cold air ahead of the front. Although
clearing usually occurs after passage, some conditions may produce fog
in the warm air
jet stream : relatively strong winds concentrated within a narrow
current in the atmosphere.
nautical mile : a unit of length used in marine navigation that
is equal to a minute of arc of a great circle on a sphere. One international
nautical mile is equivalent to 1,852 meters or 1.151 statute miles. Refer
to a sea mile.
knot : a unit for the measurement of speed in the nautical system.
It is equal to one nautical mile per hour.
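Both definitions are fixed conversion factors; a minimal sketch :

# Unit conversions from the two definitions above :
# 1 international nautical mile = 1,852 m (about 1.151 statute miles); 1 knot = 1 nautical mile per hour.
METERS_PER_NMI = 1852.0
STATUTE_MILES_PER_NMI = 1.151

def knots_to_mph(knots):
    return knots * STATUTE_MILES_PER_NMI

def knots_to_ms(knots):
    return knots * METERS_PER_NMI / 3600.0

print(knots_to_mph(64))   # about 73.7 mph, the hurricane threshold quoted later
print(knots_to_ms(34))    # about 17.5 m/s, the tropical-storm threshold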
North Atlantic basin : the Atlantic Ocean north of the equator,
the Caribbean Sea, and the Gulf of Mexico.
satellite : used in reference to the manufactured objects that orbit
the earth, either in a geostationary or a polar manner. Some of the information
that is gathered by weather satellites, such as GOES-9, includes upper air
temperatures and humidity, recording the temperatures of cloud tops, land,
and ocean, monitoring the movement of clouds to determine upper level wind
speeds, tracing the movement of water vapor, monitoring the sun and solar
activity, and relaying data from weather instruments around the world
satellite pictures : pictures taken by a weather satellite, such
as GOES-9, that reveal information, such as the flow of water vapor, the
movement of frontal systems, and the development of a tropical system.
Looping individual pictures aids meteorologists in forecasting. One way
a picture can be taken is as a visible shot, which is best during times
of visible light (daylight). Another way is as an IR (infrared) shot, which
reveals cloud temperatures and can be used day or night
spiral rainbands : bands of thunderstorms that spiral inward towards
the center, where they wrap themselves around the eye
squall : a sudden increase of wind speed by at least 18 miles per
hour (16 knots) and rising to 25 miles per hour (22 knots) or more and
lasting for at least one minute.
storm surge : an abnormal rise in sea level accompanying a hurricane
or other intense storm, and whose height is the difference between the
observed level of the sea surface and the level that would have occurred
in the absence of the cyclone. Storm surge is usually estimated by subtracting
the normal or astronomic high tide from the observed storm tide. Note:
waves on top of the storm surge will create an even greater high-water
mark
storm tide : the actual level of seawater resulting from the astronomic
tide combined with the storm surge. If the storm comes ashore during astronomical
low tide, the surge will be decreased by the amount of the low tide. If
the storm makes landfall during astronomical high tide, the surge will
be that much higher.
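As the two definitions above state, surge is the observed water level minus the predicted astronomical tide; a minimal sketch with invented heights :

# Storm surge from an observed storm tide and the predicted astronomical tide,
# per the definitions above. Heights are metres above a common datum; the numbers are invented.
def storm_surge(observed_storm_tide_m, astronomical_tide_m):
    return observed_storm_tide_m - astronomical_tide_m

# Landfall at high tide (tide +1.0 m) with an observed water level of 4.2 m :
print(storm_surge(4.2, 1.0))   # 3.2 m of surge; waves on top reach higher still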
synoptic scale : the size of migratory high and low pressure systems
in the lower troposphere that cover a horizontal area of several hundred
miles or more such as hurricanes. Contrasts with macroscale, mesoscale,
and storms.
trade winds : the wind system, occupying most of the tropics, which
are northeasterly in the Northern Hemisphere and southeasterly in the Southern
Hemisphere
tropical disturbance : a discrete system of clouds, showers, and
thunderstorms (organized convection) that originate in the tropics. Generally
100 to 300 miles in diameter and originating in the tropics or subtropics,
disturbances have a nonfrontal migratory character, and maintain their
identity for 24 hours or more. It may or may not be associated with a detectable
perturbation of the wind field. An upper-level area of low pressure often causes this
to occur. Approximately 100 of these types of events occur annually during
hurricane season.
thunder : the sound that follows a flash of
lightning and is caused by sudden expansion of the air in the path of the
electrical discharge
thunderstorm : a local storm produced
by a cumulonimbus cloud, always with lightning and thunder, and usually
accompanied by strong gusts of wind, heavy rain, and sometimes hail
Symptoms & signs : thunderstorms have
often been linked to epidemics of tracheobronchial
asthma,
especially during the grass flowering season; however, the precise mechanisms
explaining this phenomenon are unknown. Evidence of high respirable allergen
loadings in the air associated with specific meteorologic events combined
with an analysis of pollen physiology suggests that rupture of airborne
pollen can occur. Strong downdrafts and dry, cold outflows distinguish
thunderstorm rain from frontal rain. The weather system of a mature thunderstorm
likely entrains grass pollen into the cloud base, where pollen rupture
would be enhanced, then transports the respirable-sized fragments of pollen
debris to ground level where outflows distribute them ahead of the rain.
The conditions occurring at the onset of a thunderstorm might expose susceptible
people to a rapid increase in concentrations of pollen allergens in the
air that can readily deposit in the lower airways and initiate asthmatic
reactionsref.
vortex : any circular or rotary flow in the atmosphere that possesses
vorticity
vorticity : the measurement of the rotation of a small air parcel.
It has vorticity when the parcel spins as it moves along its path. Although
the axis of the rotation can extend in any direction, meteorologists are
primarily concerned with the rotational motion about an axis that is perpendicular
to the earth's surface. If it does not spin, it is said to have zero vorticity.
In the Northern Hemisphere, the vorticity is positive when the parcel has
a counterclockwise, or cyclonic, rotation. It is negative when the parcel
has clockwise, or anticyclonic, rotation.
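The rotation about a vertical axis described above is the relative vorticity, zeta = dv/dx - du/dy. A minimal finite-difference sketch on a small grid; the wind field is an invented solid-body rotation chosen so the result comes out positive, i.e. cyclonic in the Northern Hemisphere :

# Relative (vertical-axis) vorticity zeta = dv/dx - du/dy by centred differences.
# u, v are eastward / northward wind components (m/s) on a uniform grid;
# rows run south -> north (index 0 = southernmost), columns run west -> east.
# The tiny rotating wind field below is invented for illustration.
def vorticity(u, v, dx):
    """Vorticity at the centre point of a 3x3 grid with spacing dx (metres)."""
    dvdx = (v[1][2] - v[1][0]) / (2 * dx)
    dudy = (u[2][1] - u[0][1]) / (2 * dx)
    return dvdx - dudy

dx = 10000.0   # 10 km grid spacing
u = [[ 1.0,  1.0,  1.0],    # southern row : wind blowing east
     [ 0.0,  0.0,  0.0],
     [-1.0, -1.0, -1.0]]    # northern row : wind blowing west
v = [[-1.0, 0.0, 1.0],      # western column southward, eastern column northward
     [-1.0, 0.0, 1.0],
     [-1.0, 0.0, 1.0]]
print(vorticity(u, v, dx))  # 2e-4 s^-1, positive : counterclockwise / cyclonic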
cyclone / low-pressure system : a storm of
rotating and converging winds associated with atmospheric low pressure
region or air mass. The cyclone winds spin around a common center with
a relative pressure minimum, counter-clockwise in the northern hemisphere,
or clockwise in the southern hemisphere. It usually has a diameter of 2000
to 3000 km. When developing, a cyclone typically consists of a warm front
pushing northward and a cold front pushing southward with the center of
low pressure (cyclone center) located at the junction of the 2 fronts.
An area of a relative pressure minimum that has converging winds and rotates
in the same direction as the earth.
extratropical cyclone : cyclone
in the middle and high latitudes often being 2000 km in diameter and usually
containing a cold front that extends toward the equator for hundreds of
kilometers. These cyclones form outside the tropics; the center of the storm
is colder than the surrounding air, they have fronts, and their strongest winds
are in the upper atmosphere.
subtropical cyclone : a low pressure
system that develops over subtropical waters that initially has a non-tropical
circulation but in which some elements of tropical cyclone cloud structure
are present. Subtropical cyclones can evolve into tropical cyclones
subtropical depression : subtropical
cyclone in which the maximum sustained surface wind speed (using the U.S.
1-minute average) is < 38 mph (33 knots)
subtropical storm : a subtropical
cyclone in which the maximum sustained surface wind speed (using the U.S.
1-minute average) is > 39 mph (34 knots).
tropical cyclone : a general term
for all cyclone circulations originating over tropical waters. Its characteristics
include a warm-core, non-frontal pressure system of synoptic scale that
originates over the tropical or subtropical waters and has a definite organized
surface circulation. Used to describe wind circulations rotating around an atmospheric
low, which include tropical depressions, tropical storms, and hurricanes. The
strongest winds of this cyclone are near the Earth's surface. The basic
element of Lighthill's "sandwich model" of tropical cyclones is
the existence of "ocean spray," a layer intermediate between air and sea
made up of a cloud of droplets that can be viewed as a "third fluid." A
mathematical model of the flow in the ocean spray based on a semiempirical
turbulence theory has been proposed and it has been demonstrated that the
availability of the ocean spray over the waves in the ocean can explain
the tremendous acceleration of the wind as a consequence of the reduction
of the turbulence intensity by droplets. This explanation complements the
thermodynamic arguments proposed by Lighthillref.
Fujiwhara effect : a binary interaction where tropical cyclones
within a certain distance (300-750 nautical miles depending on the sizes
of the cyclones) of each other begin to rotate about a common midpoint
center: the vertical axis or core of a tropical cyclone.
It is usually determined by cloud vorticity patterns, wind, and/or pressure
distributions.
eye : the center of a tropical storm or hurricane characterized
by a roughly circular area of light winds and rain-free skies and the lowest
pressure. An eye will usually develop when the maximum sustained wind speeds
exceed 78 mph. It can range in size from as small as 5 miles to up to 60
miles (20-50 km) but the average size is 20 miles. In general, when the
eye begins to shrink in size, the storm is intensifying.
eye wall : an organized band of convection surrounding the eye,
or center, of a tropical cyclone. It contains cumulonimbus clouds, severest
thunderstorms, heaviest precipitation and strongest wind
feeder bands : in tropical parlance, the lines or bands of thunderstorms
that spiral into and around the center of a tropical system. Also known
as outer convective bands, a typical hurricane may have three or more of
these bands. They occur in advance of the main rain shield and are usually
40 to 80 miles apart. In thunderstorm development, they are the lines or
bands of low-level clouds that move or feed into the updraft region of
a thunderstorm
probability of tropical cyclone conditions : the probability, in
percent, that the cyclone center will pass within 50 miles to the right
or 75 miles to the left of the listed location within the indicated time
period when looking at the coast in the direction of the cyclone's movement.
tropical storm (TS) : a tropical
cyclone in which the maximum sustained surface wind speed (1 minute average)
is within the range of 39 to 73 mph (34 to 63 knots). At this point, the
system is given a name to identify and track it. In the Atlantic/Caribbean/Gulf
of Mexico basin, the names start with "A" each season.
tropical depression (TD) :
a tropical cyclone in which the maximum sustained surface winds (1 minute
average) are < 38 mph (33 knots). Characteristically having one or more
closed isobars, it may form slowly from a tropical disturbance or an easterly
wave, which has continued to organize. At this point, it gets a cyclone
number, starting with "TD01" at the beginning of each storm season.
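The wind-speed thresholds in these two definitions, together with the roughly 74 mph (64 knot) hurricane threshold quoted in the next entry, amount to a simple classification rule; a minimal sketch using 1-minute sustained winds in knots :

# Classify a tropical cyclone by maximum 1-minute sustained wind (knots),
# using the thresholds above : depression <= 33 kt, storm 34-63 kt, hurricane >= 64 kt.
def classify_tropical_cyclone(max_wind_kt):
    if max_wind_kt < 34:
        return "tropical depression"
    if max_wind_kt < 64:
        return "tropical storm"
    return "hurricane"

for wind in (25, 45, 80):
    print(wind, "kt ->", classify_tropical_cyclone(wind))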
hurricanes : a tropical cyclone in the
Northern Hemisphere with sustained winds > 74 mph (64 knots) in the North
Atlantic Ocean, Caribbean Sea or Gulf of Mexico and Eastern Pacific. The
word is believed to originate from the Caribbean Indian storm god "Huracan".
These winds blow in a large spiral around a relatively calm center of extremely
low pressure known as the eye. Around the rim of the eye, winds may gust
to > 200 mph. The entire storm, which can be up to 340 miles (550 km) in diameter,
dominates the ocean surface and lower atmosphere over tens of thousands
of square miles. Hurricanes draw their energy from the warm surface water
of the tropics (usually above 27°C) and latent heat of condensation,
which explains why hurricanes dissipate rapidly once they move over cold
water or large land masses. Most North Atlantic hurricanes start over the
middle of the ocean, and move west towards the Caribbean. Some continue
west through the Gulf of Mexico, but others move up the east coast of Florida.
It is extremely difficult to say whether the storms will move over land
or back out over the sea. However, forecasters can usually predict a hurricane's
path about 3 days in advance, with enough accuracy to give residents time
to evacuate if necessary. This means hurricanes cause far fewer deaths
today compared with 50 years ago. Although forecasters are increasingly
adept at predicting the path of a hurricane, they can still be taken by
surprise when it comes to guessing its strength. 'Hurricane-hunter' aircraft
are equipped with wind, temperature and pressure gauges. These planes have
a dual function: they help forecasters by observing the storm in real time
and gain valuable readings with which researchers can construct models
of hurricane behaviour. To get right into the heart of the storm, the planes
deploy 'dropsondes', packages of meteorological instruments dropped by
parachute, but it is difficult to get the instruments in the right place.
Climatologists are insisting the battering of hurricanes in the USA in 2004
(Charley, Frances, Ivan, Jeanne) is pure chance. Global warming is probably
having little or no effect on the number of hurricanes, and there have
been no more hurricanes in 2004 than last year. In 2003, however, most of the
storms blew themselves out over the sea. In 2004, many of them have headed
straight for populated areas, which is simply bad luck.
Immense waves capable of sinking the largest ships might not be freaks
of nature, but a common result of hurricanes. That's the implication of
new evidence that Hurricane Ivan, which whirled across the Gulf of Mexico
in September 2004, probably generated waves > 40 metres high from crest
to trough. The water pressures measured by their array of seafloor sensors,
60 to 90 m below the ocean off the northeast coast of the gulf, indicate
the passage of waves nearly 30 m high. But the waves near the eye of the
hurricane, which unfortunately passed over the sensors while they were
not taking measurements, would probably have topped 40 m. These are the
largest wave heights ever recorded with instruments in US waters : they're
larger than we ever thought they would be. Ivan wasn't even a particularly
large hurricane. Terrible walls of water are a staple of nautical lore.
But oceanographers have only recently come to accept them, as simple statistics
suggest that such extreme events should almost never happen. But there's
no fundamental reason why ocean waves can't grow to immense sizes. Nobody
knows what the upper limit might be. The largest waves to hit shore come
from tsunamis - waves generated most often by seafloor movements during
major earthquakes. But while at sea these waves are just a few centimetres
high, only growing as they reach shallow waters. The Indian Ocean tsunami
of December 2004 reached 30 metres above sea level in the hardest hit areas.
There are eyewitness accounts of similarly enormous waves at sea. In 1995,
the Queen Elizabeth 2 liner survived an encounter with one about 30 m high
in the North Atlantic, and 6 years later a similar wave smashed windows
on the cruise ship Bremen in the South Atlantic and nearly sank it. It
is now widely suspected that such rogue waves, generated by wind and currents,
might explain the mysterious, regular disappearance of large ships at sea.
One, the German supertanker München, vanished in 1978. Such waves
are generally ascribed to unusual circumstances, such as fast-moving storms.
But the new findings suggest that waves even larger than these might romp
across the oceans whenever a hurricane hits. The latest hurricane models
predict such waves should occur very often. The findings are consistent
with less accurate measurements made from ships. The study may spell bad
news for ships and coastal defences. If hurricane activity increases, as
some expect to happen along with climate change, such giant waves could
become more frequentref.
Global warming
may be contributing to making hurricanes stronger. The storms have been
getting more destructive over the past 3 decades. Global warming might
increase the effect of hurricanes further still in the coming years. Previous
studies have looked at the effects of rising global average temperatures
on tropical cyclones, commonly called hurricanes or typhoons. But these
past studies have tended to focus on whether the events have become more
frequent. No such trends have been clearly observed. But researchers looked
instead at the question of whether cyclones have got more intense, that
is, whether they hit harder and last longer. Theories and computer
simulations of climate indicate that warming should indeed generate an
increase in storm intensity. Emanuel analysed records of tropical cyclones
since the middle of the last century, and found that the amount of energy
released in these events in both the North Atlantic and the North Pacific
oceans increased markedly since the mid-1970s. Both the duration of the
cyclones and the largest wind speeds they produced have increased by about
half over the past 50 years. These increases in storm intensity are mirrored
by increases in the average temperature at the surface of the tropical
oceans, suggesting that this warming—some of which can be ascribed to global
warming—is responsible for the greater power of the cyclones. As the human
population in coastal regions gets ever denser, the damage and casualties
produced by more intense storms could increase considerably in the future.
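Emanuel's measure of how hard storms hit and how long they last is an index of this kind : the cube of the maximum sustained wind, integrated over a storm's lifetime (the power dissipation index) and summed over all storms in a season. The sketch below computes such an index for a single storm from a 6-hourly wind record; the record itself is invented :

# Sketch of a power-dissipation-style intensity index : the cube of the maximum
# sustained wind integrated over the storm's lifetime. The 6-hourly record is invented.
def power_dissipation_index(winds_ms, interval_hours=6.0):
    """Sum of v_max**3 * dt (units m**3 s**-2) over a single storm's track."""
    dt = interval_hours * 3600.0
    return sum(v ** 3 for v in winds_ms) * dt

# Hypothetical storm : spin-up, about two days near 55 m/s, then decay
track_winds = [20, 30, 40, 50, 55, 55, 55, 55, 55, 55, 55, 55, 45, 30, 20]
print(f"{power_dissipation_index(track_winds):.2e}")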
hurricane eye : the relatively calm area near the center of the
storm. In this area, winds are light and the sky is often partly covered
by clouds
landfall : the term used to describe where the hurricane eye actually
passes over land, usually used to describe the continental States rather
than islands in the Caribbean
hurricane eye landfall : when the eye, or physical center of the
hurricane, reaches the coastline from the hurricane's approach over water
hurricane path or track : line of movement (propagation) of the
eye through an area
hurricane season : the portion of the year having a relatively high
incidence of hurricanes. The hurricane season in the Atlantic, Caribbean,
and Gulf of Mexico runs from June 1 to November 30. The hurricane season
in the Eastern Pacific basin runs from May 15 to November 30. The hurricane
season in the Central Pacific basin runs from June 1 to November 30
swath : the width of the path of the hurricane. Usually this path
area is about 125 miles wide with 75 miles to the right of the eye and
50 miles to the left of the eye
intertropical convergence zone (ITCZ) : the axis dividing the southeast
trades from the northeast trades, toward which the surface winds tend to
converge. The easterly trade winds of both hemispheres converge at an area
near the equator called the "Intertropical Convergence Zone (ITCZ)", producing
a narrow band of clouds and thunderstorms that encircles portions of the
globe
maximum envelope of water (MEOW) : describes the predicted areas
inundated and amount of storm surge for a particular area during the landfall
of a hurricane. Used in the SLOSH Model
maximum envelope of wind (MEOW) : describes the predicted areas affected
and the maximum winds expected for a particular area during the landfall
of a hurricane. Used in the Inland Wind Model.
typhoons : hurricanes that occur in the
western Pacific, in the region of the Philippines or the China Sea. The word is
believed to originate from the Chinese word "ty-fung". When there's a cyclone
around, you want to know where it is headed. But that may be harder to
predict than we thought. 2 simultaneous tropical cyclones can influence
one another in ways that are subtle and difficult to forecast, even if
they are nearly 2,000 km apart. Weather forecasters currently use computer
methods to predict the course of a cyclone - a fierce storm system that
gathers around a region of low atmospheric pressure. But when there is
another cyclone in the vicinity, small differences in the initial meteorological
conditions used in the model can significantly affect the prediction of
the first cyclone's path and behaviour 2 days later. The distance over
which cyclones can affect each other is surprising. Previous observations
of twin cyclones had led experts to suspect that they could not affect
one another if they were > 1,500 km apart. A cyclone-prediction model developed
by the US Navy was used to forecast the trajectories of 2 real, coexisting
cyclones, Ketsana and Parma, that formed in the western Pacific Ocean in
late October 2003. The predicted courses corresponded fairly closely to
those of the real storms over a 2-day period. But a careful look at the
results showed that the forecast for Parma depended on that for Ketsana,
despite the fact that the 2 storms were > 1,500 km apart for most of their
lives. (Their closest approach was 1,333 km.) Parma did not, in turn, seem
to affect Ketsana, which was the stronger of the 2 cyclones. What's more,
for cyclone Ketsana, the researchers showed that the predictions would
have been altered significantly if the model's initial conditions 500 km
from the cyclone's centre were changed slightly, showing how surprisingly
sensitive these storms can be. Although the forecasts in these cases were
good, the tools highlighted the potential for initial errors to grow. 2
coexisting cyclones are not uncommon - it happens about 3 times every
2 years in the western North Pacific Ocean, and about once every 3 years
in the Atlantic. Despite the potential problems for cyclone forecasting,
Peng and Reynolds say that their results have a positive spin too. The
new technique they used for analysing cyclone interactions, called singular
vector diagnostics, seems to give them a better handle on the problem
than standard methods. It may tell us how much confidence we can have in
our forecastsref
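The singular vector diagnostics mentioned here can be illustrated with a toy calculation: if the growth of small initial errors over the forecast interval is approximated by a linear propagator matrix, its leading singular vectors are the initial perturbation patterns that grow fastest, and its singular values give their amplification factors. The sketch below is idealized; operational centres build the propagator from the tangent-linear and adjoint versions of the full forecast model and use physically motivated norms, whereas the random matrix here is only a stand-in:

    import numpy as np

    rng = np.random.default_rng(0)
    M = rng.normal(size=(50, 50))      # toy linear propagator: initial perturbation -> 48-h perturbation

    U, s, Vt = np.linalg.svd(M)        # singular value decomposition of the propagator
    fastest_growing_pattern = Vt[0]    # initial perturbation direction with the largest growth
    growth_factor = s[0]               # how much that perturbation is amplified over the interval
    print(growth_factor)

    # If the fastest-growing pattern projects strongly onto a region (say, 500 km from a
    # cyclone's centre), small analysis errors there can substantially change that cyclone's forecast.
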
The Saffir-Simpson
damage-potential scale is a 5-category wind speed / storm surge
classification scale developed in the early 1970s by Herbert Saffir, a
consulting engineer, and Robert Simpson, then Director of the National
Hurricane Center, to measure the intensity of an Atlantic hurricane from
1 to 5. The strongest SUSTAINED hurricane wind speeds correspond to a strong
F3 (Severe Tornado) or possibly a weak F4 (Devastating Tornado) value.
The scale categorizes potential damage based on barometric pressure, wind
speeds, and storm surge. Scale numbers are available to public safety officials
when a hurricane is within 72 hours of landfall. Scale assessments are
revised regularly as new observations are made. Public safety organizations
are kept informed of new estimates of the hurricane's disaster potential.
In practice, sustained surface wind speed (1-minute average) is the parameter
that determines the category since storm surge is strongly dependent on
the slope of the continental shelf. Whereas the highest wind gusts in Category
5 hurricanes correspond to moderate F4 tornado values, F5 tornado wind
speeds are not reached in hurricanes.
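Because the 1-minute sustained surface wind is what fixes the category in practice, the classification can be stated compactly in code. The sketch below uses the mile-per-hour thresholds commonly quoted for the original scale (74, 96, 111, 131 and 156 mph); treat the exact cut-offs as an assumption to be checked against the National Hurricane Center's current definitions:

    def saffir_simpson_category(sustained_wind_mph):
        # Category 1-5 from the 1-minute sustained surface wind; 0 means below hurricane strength.
        # Thresholds follow the commonly quoted original scale; verify against current NHC values.
        for lower_bound, category in [(156, 5), (131, 4), (111, 3), (96, 2), (74, 1)]:
            if sustained_wind_mph >= lower_bound:
                return category
        return 0

    print(saffir_simpson_category(140))  # winds like the 1900 Galveston storm's estimate -> category 4
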
Those living on the hurricane-buffeted east coast of the USA, or those
who sell them insurance, will probably take notice of a model that predicts
the annual damage done by such storms. The model, based on wind patterns
over the land and sea in July, can predict whether damages will be greater
or less than average in the storm season from August to October. Previous
models have been able to forecast the number of hurricanes forming at sea,
based on variables such as ocean temperature, or fluctuations in air pressure
over the North Atlantic Ocean. But even with a completely accurate forecast
of hurricane formation it is difficult to say how many will make it to
land. A model using data from 1950 to 2003 about winds in the troposphere,
which extends about 10 kilometres up from the surface of the Earth,
can say whether a year would see damages that were above or below average
in 74% of cases. Persistent wind systems, dubbed 'steering winds', have
been identified in 6 key regions. Together, these winds seem to determine
whether the coming season will be a bad one for US residents. Near Bermuda,
for example, unusually high pressures in July can cause onshore winds that
consistently blow towards the US coast for the remainder of the summer.
Hurricane systems tend to form in the tropical North Atlantic before drifting
westwards. If they encounter a steering wind pointing away from the coast
they will swerve harmlessly back out to the ocean. But if the steering
winds are pointing towards land, the storms are funnelled on to the shore.
The researchers are not sure why the steering winds remain consistent for
the entire summer. There's lots of day-to-day variation, but overall the
winds do seem to persist for around three months. This will be really important
for insurance companies : when faced with the possibility of a catastrophic
event, insurers need to decide whether they will buy more coverage or not.
Over time, insurers could save 30% on their returns and premiums using
this model. In fact, they had a chance to do so in 2004, when a devastating
sequence of 4 hurricanes cost insurance firms billions. The researchers
used an early version of their model to forecast a worse-than-average seasonref. When hurricane Katrina
hit New Orleans on 29 August 2005, the city thought it had escaped
the worst. The category-5 hurricane had weakened slightly to category 4
before making landfall, and residents were confident that they had avoided
disaster. But the following day, after the storm itself had passed, a 100-m
section of the levees protecting the area from the flood waters was breached,
along with at least 2 other smaller stretches, inundating some 80% of the
city. By 31 August, estimates of the ultimate death toll were in the thousands,
and all this with > 2 months to go until the end of hurricane season. Did
experts know this might happen? Yes. New Orleans is protected by a series
of flood walls called levees that help to hold back nearby Lake Pontchartrain,
which in turn is connected to the Gulf of Mexico. Parts of the city sit
several metres below sea level. And the system's 565 km of walls were built
to withstand only category-3 hurricanes. So a direct strike from a severe
storm has long been anticipated as one of the worst natural disasters that
could befall the mainland USAref.
Could something have been done to prevent this? Yes. The levees could have
been higher. The New York Times has reported that the estimated cost of
protecting against a category-5 hurricane, the highest on the scale, is
$2.5 billion. The USA had had a really long run of good luck with hurricanes.
Lots of building decisions were made thinking we would continue to have
the benign conditions of the 1970s and 80s. The natural marshlands that
protect New Orleans from surrounding waters could also have been protected
from degradation. A 30-year restoration plan, called Coast 2050, was published
in 1998, but it put the bill at a staggering $14 billion. Damages from
the current flooding are expected to run to tens of billions of dollars.
Part of the problem is that planners did not take into account the recent
upswing in hurricane incidence. Unfortunately, 'Don't worry, be happy'
is not a very good philosophy for dealing with this kind of thing. How
exactly did the levees fail? It is still unclear why the 100-metre section
of levee along the 17th Street canal was the one to break. It had recently
been upgraded, and was constructed of concrete several feet thick, unlike
the earthen structures elsewhere in the city. Experts point out that Lake
Pontchartrain was sloshing around in the wake of the storm, which might
have caused water to tip over the edge of nearby levees. This water may
have eaten away at the foundations of the wall, ultimately causing it to
topple. How long will it take to repair the damage? All of the 20-odd pumping
stations surrounding the 17th Street canal, the main route by which water
is normally pumped out of the city, have been knocked out by the flood.
Keeping New Orleans free of water was a daily challenge even before Katrina
struck. With almost the entire area between Lake Pontchartrain and the
Mississippi River under water, clearing the flood will take at least a
month. For now, helicopters and barges are dropping sandbags and concrete
highway construction barriers into the largest levee break in an attempt
to plug the hole. As this story went to press, the waters were slowly starting
to recede. How many people will be affected? Nearby towns already fear
death tolls in the hundreds, and the overall number is expected to be in
the thousands. It is potentially the worst natural disaster on US soil
since the 1906 San Francisco earthquake, which claimed up to 6,000 lives.
Besides this, some 11,000 National Guard troops have been assigned to the
region to distribute food supplies, rescue those stranded, and quell the
looting that has sprung up in New Orleans. Is climate change to blame?
It is impossible to say for certain. There is evidence that hurricanes
are becoming more intense, but this may be due to natural variation. New
Orleans was last hit by a hurricane in 1969, marking the end of a particularly
violent couple of decades. This was followed by a relatively quiet patch
in Atlantic hurricanes, lasting until 1995. Since then, storms have been
heating up again. Hurricanes tend to be stronger when sea surface temperatures
in the Atlantic are higher. But data on these temperatures only stretch
back a couple of decades, since satellites began to be used to monitor
the oceans. And computer models for climate change cannot predict small-scale,
individual events such as hurricanes. Nevertheless, sea surface temperatures
are predicted to rise by a few degrees by 2100, meaning that devastating
hurricanes may become more frequent. Whether these will make landfall or
veer out to sea, however, is not known. What can we expect from the rest
of this hurricane season? The Atlantic hurricane season traditionally lasts
until November, so there could be more in store. So far this year, the
region has produced 11 tropical storms, four of which have become hurricanes.
The final tally could be around 20 storms with 10 hurricanes. There's no
reason to suggest it won't carry on as it has done. The past decade has
seen a sudden switch to high activity. Monitoring such storms and evacuating
people where necessary remains the best form of defence. Of course, not
all hurricanes will home in on major cities to such devastating effect.
Katrina probably picked the worst place to come ashore, with the possible
exception of Miami. But this week's events may well be a wake-up call.
When the armed forces are mobilized in the response to the horrifying conditions
along the Gulf coast, there will be a very small number of soldiers who
are unaware that they have returned from overseas harboring malaria. There
is also an alarming proximity of dengue and yellow fever in Central America.
With all the standing water left behind by Hurricane Katrina there will
be lots of mosquitoes, and a very small human reservoir of active hosts
is all that is required for a large vector population to infect a large
susceptible population. It wasn't so long ago that these diseases were
endemic along the Gulf coast. But whereas many may think that with such
vast areas of standing water, mosquito populations will increase explosively,
this is not necessarily the case. In many parts of the world, excessive
rainfall or flooding often flushes out larval habitats and can lead to
decreased numbers of adults emerging and biting. Mosquitoes have a great
diversity of breeding places. Some prefer organically polluted waters (for
example contaminated with feces, cadavers, or rotting vegetation); other
species breed in domestic containers, drains, water-filled tree-holes,
in flooded meadows, swamps and marshes; while still others prefer, or at
least tolerate, brackish waters. Often mosquito breeding increases some
time after flooding when the water recedes, either naturally or through
pumping, as this tends to increase the number of smaller bodies of water,
which are often more suitable larval habitats than larger ones. With different
mosquito species colonising such ecologically different larval sites, it
is difficult to predict which -- if any -- species will proliferate. The
CDC believes that there might be mosquito problems and that the potential
exists for outbreaks of WNV, St Louis encephalitis, and dengue. For this
to happen, however, there must be infected hosts in the area, or later
entering the area, which in the case of West Nile includes birds, especially
corvids (incidentally, Louisiana is presently a hot spot for WNV), while
St Louis encephalitis is also maintained in birds (mainly passeriformes
and columbiformes). But there is little information on the effect hurricane
Katrina has had, and might have, on local bird populations. In the USA
and most of the rest of the world, dengue viruses have no animal reservoir
hosts. I do not think that the hurricane will increase the likelihood of
yellow fever, as this would require infected people to be in the area,
and such people will be viremic only for a very short period -- about 3-4
days. I might point out that yellow fever is not endemic along the Gulf
coast, but is endemic in many parts of South America. Nor do I believe
there is much chance of malaria transmission from infected troops, because
I imagine any soldier with malarious symptoms (such as high intermittent
fever) returning from overseas would have been treated with antimalarial
drugs. Moreover, infected people would have to be gametocyte carriers.
That is because gametocytes are the forms of the parasite found in humans
that have to be ingested by a female anopheline mosquito for the mosquito
to become infected; then after some 10-14 days, the mosquito becomes infective
and thus able to transmit malaria to humans. I understand that past experience
has generally shown that there are not epidemics of mosquitoborne infections
after hurricanes. For example, I believe that there were no major epidemics
of vector-borne infections after the tsunami that hit Asia in 2004. The
American Mosquito Control Association is well aware of the catastrophic
damage done to parts of Louisiana, Mississippi, and Alabama and the very
severe limitations this has imposed on local authorities who normally routinely
control mosquitoes in populated areas. Whether or not we believe that vectorborne
disease epidemics will appear, I believe control agencies should be as
prepared as possible to deal with any emergencies. Mosquito experience
with Hurricane Camille in 1969 showed that the hurricane completely wiped
out mosquito populations for at least a month. One can be sure the same
thing is happening along the Gulf coast. Eastern North Carolina had the
same predictions of mosquitoes emerging from the flood waters of hurricane
Floyd in the fall of 1999. Although a considerable amount of aerial spraying
was done immediately following the flood (thanks to federal resources),
I am not convinced that this prevented any significant arboviral disease
outbreaks. Much like the concerns about cholera, typhoid, hepatitis A,
etc. during the earlier parts of the flooding, these fears were unfounded.
During these large flood events, there is so much water that any microbe,
insect or human is easily overwhelmed and often becomes hard to find; but
that doesn't mean we shouldn't keep looking. There were no serious mosquito-borne
outbreaks post Hurricane Floyd, but there probably would not have been any
even if there had been no aerial spraying. I might add that
the Vector Control Research Centre, Pondicherry, India reported that the
Asian tsunami created new mosquito larval habitats in the Nicobar and Andaman
Islands, when sea water was left behind post flooding. Such saltwater habitats
were colonized by a local malaria vector, Anopheles sundaicus. This
mosquito, which is widely distributed in Asia, lays her eggs in both fresh
and saline waters. It is stated that there are indications that this mosquito
increase has had a significant impact on human health, with a 14-fold increase
in malaria infection in some regionsref1,
ref2.
Vibrio
vulnificus
might have been picked up by people with open wounds who were forced to
wade through polluted floodwaters for a long time : 7 cases and 5 deaths
have been recorded as of 8 Sep. It is possible you could have a significant
outbreak of West Nile virus (WNV)
but we won't likely know for weeks. When hurricane Camille barreled into
the eastern USA in 1969, heavy rains and the storm surge wiped out mosquitoes
for about a month. Katrina is expected to have a similar effect. Once the
mosquitoes' breeding cycle is restored, and eggs begin to hatch in standing
water, residents in the Gulf area could be at greater risk for both WNV
and St. Louis encephalitis
(SLE),
another mosquito-borne illness. Widespread spraying of the area might curtail
an outbreak. Whether toxic chemicals and oil in the water might hamper
mosquito breeding is not known. Neither WNV nor St. Louis encephalitis
is passed from person to person. Flooding has probably flushed out mosquito
larval habitats, and it is only when water recedes that smaller and discrete
water bodies will be formed that are likely to be more attractive to mosquito
breeding. Along the Gulf coast, the primary vectors of SLE are Culex
pipiens and Culex quinquefasciatus, mosquitoes that often breed
in waters that are contaminated with feces or other organic matter, habitats
which could become plentiful. Vectors of WNV comprise several mosquito
species, including the Cx. pipiens complex. Widespread insecticidal
spraying is a possible course of action. This could involve larviciding
aquatic waters suspected, or better still known, to be larval breeding
places, or adulticiding. Both can be done from ground-based equipment (or
boats) or by aerial applications. Depending on where adult mosquitoes are
resting, thermal fogs or aerosols can be directed at outdoor resting
places, such as at vegetation, or indoors (houses and other buildings)
where one might find adult females of the Cx. pipiens complex. To
spray or not to spray can be a difficult question. It is likely to be costly,
and the effects of it may be inconclusive. I wonder what effect the hurricane
has had on the local bird populations, remembering that passeriformes and
columbiformes are reservoir hosts of SLE, while corvids are reservoir hosts
for WNV. It is helpful to refer to Komar et al., "Experimental Infection
of North American Birds with the New York 1999 Strain of West Nile Virus",
in which it is concluded, with respect to West Nile virus, that "An analysis
of our data shows that passerine birds, charadriiform birds, and at least
2 species of raptors (American Kestrel and Great Horned Owl) are more competent
than species evaluated from the following orders: Anseriformes, Columbiformes,
Galliformes, Gruiformes, Piciformes and Psittaciformes. Indeed, many birds
of the latter orders were found to be incompetent for transmission"ref.
Table 10, "West Nile virus reservoir competence index values derived for
25 species of birds"ref,
is particularly instructive. Larval mosquitoes that get flushed into large
bodies of deep water become food for assorted predators or die from the
increased water salinity. The adult mosquitoes of the Culex
species that are most often associated with the amplification and transmission
of WNV typically shelter in low, enclosed areas (e.g., storm sewers, crawl
spaces, sheds, hollow logs, culverts, etc.), and they would likely be drowned
by the flood waters that came with the storm. It is infected adult mosquitoes
(not birds) that are actually the local reservoirs of WNV, because these
mosquitoes remain infected until the end of their lives. Thus, if
these older adult mosquitoes are the local source of WNV infection, and
they are mostly wiped out by the storm, the local transmission cycle would
be broken. The birds that serve as temporary reservoirs of WNV actually
include many members of the order Passeriformes, not just crows. Furthermore,
birds are not really thought to be reservoirs of the virus; they are amplifying
hosts. When birds are infected with WNV, they remain infected during a
small window (approximately 4-5 days) during which individuals of some
species either remain healthy or become sick or die. During that
window of time, the infected birds may have sufficient viremia to infect
mosquitoes. However, by the end of that period, the bird's immune system
has either eliminated the virus from the bloodstream, or the birds have
died from the virus. In either case, they would no longer be infectious
to mosquitoes. If the hurricane kills most of the local Culex mosquito
population, it may take time for new Culex to recolonize the area.
By that time, local birds would no longer be infectious to mosquitoes.
Therefore, it seems unlikely that there would be much WNV circulating in
mosquitoes in the hardest-hit areas of the Gulf Coast during the weeks
following the hurricane. I agree that adult Culex mosquitoes such
as the Culex pipiens complex that can be vectors of WNV often
shelter in many types of man-made constructions, including cellars, but
I am not convinced flooding would cause their death by drowning. I would
imagine that many would have flown away before their resting sites
became submerged in water. But this is speculation. Regarding birds. Most
publications regard birds as reservoir hosts of WNV despite the fact
that -- in common with most rodents and birds infected with an arbovirus
-- viremia is short-lived, usually a matter of a few days.
If you go to most textbooks and publications, WNV is said to have birds
as reservoir hosts. Similarly most websites cite birds as reservoirs,
including the CDC websites. I agree that infective mosquitoes will remain
infective until they die, but the duration of their infective life may be
very short. However, there is evidence that WNV can survive in some hibernating
mosquitoes. Ticks are much longer-lived than mosquitoes and certain species
seem to be capable of transmitting WN virus amongst bird populations,
and in such situations ticks as well as birds could be considered reservoir
hosts. Yes, many other bird species besides corvids are infected, or have
the potential to be infected, with WNV and some species appear to be more
important in the transmission cycle than corvids, but this will depend
on their local distribution. It is difficult to guess exactly what the
repercussions will be in the weeks and months post hurricane Katrina. There
are many published papers showing that flaviviruses, such as WNV, Japanese
encephalitis virus (JEV), St. Louis encephalitis virus (SLEV), tick-borne
encephalitis virus (TBEV), etc. (and other arboviruses), can be isolated
from healthy non-viremic birds, and indeed, from other species, such as
rodents and bats. This is well known to many arbovirologists globally.
Therefore, birds can carry the virus at levels that appear to be below
the threshold for easy detection by conventional methods. Whether or not
these apparently non-viremic birds reactivate the virus to higher titres
under stress or for some other reason has not yet been formally established.
In this sense, birds and other species could be acting as reservoirs. Alternatively,
in theory at least, only one hungry/thirsty infected mosquito may be needed
to survive for the virus to start an epidemicref.
Many human flaviviral diseases, such as WNV, JEV and SLE, require
a 2-host life cycle (i.e. bird-mosquito and vice versa) in which humans
are secondary victims. The cycle between birds and mosquitoes is considered
to be dependent mutually on the 2 species. Among these 2 hosts, which one
serves as the reservoir depends on the disease, and sometimes, on one's
point of view. Theoretically, only one hungry/thirsty infected mosquito
may be required to survive for the virus to start an epidemic. After a
major hurricane, such as Katrina, field evidence showed that there were
few adult female mosquitoes and an eerie absence of birds. For normally
ubiquitous mosquitoes to recover their full population, it may take a month
or so, as mentioned by many entomologists. But for the warm-blooded and
brainy birds to return to their normal habitats, I wonder how long it will
take. I am not an expert in ornithology. Colleagues from this field are
welcome to share their views with us. I overheard that Montserrat Island
was totally silent, without birds or bird song, right after hurricane Hugo
struck in September 1989. With or without a lessened population of birds
(migratory or terrestrial) and fewer surviving infective adult female mosquitoes,
one would not expect an unusual surge of outbreaks of endemic flaviviral
infections in the hurricane stricken area, such as New Orleans and Louisiana
more broadly. 2 Air Force planes were scheduled to spray insecticide over
about 150 000 acres (about 234 square miles) east and south of New Orleans
on Monday night. An area about the same size is to be sprayed tonight north
of the city. Mosquito numbers started exploding about 3 days ago. One Louisiana
mosquito-control district reported that the number of trapped mosquitoes
has increased 800% over pre-Katrina levels. Many of the newly hatched mosquitoes
are "floodwater" species that don't normally pose a serious disease threat.
Mosquitoes that usually live in tree canopies have been forced to the ground
by storm damage, and populations of salt-marsh mosquitoes are booming.
They're very aggressive biters, and they can get to such numbers that the
power linemen can't work -- even with repellent on. They can't function
because there are so many mosquitoes. It's unclear how the number of human
WNV cases will be affected. We've never had a hurricane hit when there
was a lot of WNV activity happening at the time, so we're not quite sure
what's going to happen. Areas hit hardest by Katrina might see little increase
in the number of WNV cases. In those places, WNV-infected mosquitoes were
either killed or blown away by the storm. And many of the wild birds that
serve as a reservoir for the virus are gone as well. It's outside the hardest-hit
areas that we would expect the WNV that's in the area to continue. Mosquito
populations are exploding on the Gulf Coast at a time when the risk of
human exposure to the blood-suckers is very high. Rescue and recovery teams
spend all day outside, and some of them camp in tents at night. Some displaced
residents are living outside. In storm-damaged houses that remain habitable,
lack of air conditioning due to ongoing power outages has forced occupants
outside to escape the heat.
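The arguments above about vector survival and the 10-14 day extrinsic incubation period can be made concrete with the classical Ross-Macdonald vectorial capacity, C = m a^2 p^n / (-ln p), where m is the number of vector mosquitoes per host, a the daily human-biting rate, p the daily survival probability and n the extrinsic incubation period in days. A minimal sketch, with purely illustrative parameter values rather than Gulf coast measurements, shows why daily survival through the incubation period matters far more than sheer mosquito numbers:

    import math

    def vectorial_capacity(m, a, p, n):
        # Ross-Macdonald vectorial capacity: potential new infective bites arising per
        # infectious host per day. m = vectors per host, a = daily human-biting rate,
        # p = daily survival probability, n = extrinsic incubation period in days.
        return m * a ** 2 * p ** n / (-math.log(p))

    # Illustrative values only: doubling density while lowering daily survival still
    # reduces transmission potential, because few vectors live through the 12-day incubation.
    print(vectorial_capacity(m=10, a=0.3, p=0.9, n=12))   # about 2.4
    print(vectorial_capacity(m=20, a=0.3, p=0.8, n=12))   # about 0.55
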
During 29 Aug - 11 Sep 2005, surveillance identified 22 new cases of
Vibrio
illness with 5 deaths in persons who had resided in 2 states. These illnesses
were caused by V. vulnificus, V.
parahaemolyticus,
and nontoxigenic Vibrio cholerae.
These organisms are acquired from the environment and are unlikely to cause
outbreaks from person-to-person transmission. No cases of toxigenic V.
cholerae serogroups O1 or O139, the causative agents of cholera, were
identified. This report summarizes the investigation by state and local
health departments and CDC, describes 3 illustrative cases, and provides
background information on Vibrio illnesses. Results of the investigation
underscore the need for heightened clinical awareness, appropriate culturing
of specimens from patients, and empiric treatment of illnesses (particularly
those associated with wound infections) caused by Vibrio species.
No confirmed cases of illness have been identified with onset after 5 Sep
2005; additional Vibrio cases are under investigation. A case of
post-hurricane Vibrio infection was defined as clinical illness
in a person who had resided in a state struck by Hurricane Katrina (i.e.,
Alabama, Louisiana, or Mississippi) with illness onset and reporting during
29 Aug-11 Sep 2005, where a Vibrio species was isolated from a wound, blood,
or stool culture. Among cases, a wound-associated Vibrio case was defined
as an illness that likely resulted from infection of a wound or abrasion
acquired before or during immersion in floodwaters. Wound-Associated Illnesses:
18 wound-associated Vibrio cases were reported, in residents of
Mississippi (7) and Louisiana (5); in persons displaced from Louisiana
to Texas (2), Arkansas (2), and Arizona (1); and in a person displaced
from Mississippi to Florida (1). Speciation was performed in clinical laboratories
for 17 of the wound-associated cases; 14 (82%) were V. vulnificus,
and 3 (18%) were V. parahaemolyticus. 5 (28%) patients with wound-associated
Vibrio infections died; 3 deaths were associated with V. vulnificus
infection, and 2 were associated with V. parahaemolyticus infection.
Age of patients with wound-associated illnesses ranged from 31 to 89 years
(median: 73 years). 15 (83%) were male. The majority of patients were hospitalized;
admission dates ranged from 29 Aug to 5 Sep 2005. Not all patients were
initially hospitalized because of their wounds. An underlying condition
that might have increased risk for severe Vibrio illness was reported in
13 (72%) of the patients with wounds; these conditions included
heart
disease (7 patients), diabetes mellitus (4), renal disease (3), alcoholism
(3), liver disease (2), peptic ulcer disease (1), immunodeficiency (1),
and malignancy (1). Non-Wound-Associated Illnesses: 4 persons were reported
with non-wound-associated Vibrio infections (2 in Mississippi, 1
in Louisiana, and 1 displaced from Louisiana to Arizona). Information on
the Vibrio species and clinical illness was available for 2 of these
patients; the species were nontoxigenic V. cholerae isolated from
patients with gastroenteritis. One of the infections occurred in a boy
aged 2 months with diarrhea whose stool culture yielded both Salmonella
group C2 and V. cholerae non-O1, non-O139. He was hospitalized
for 2 days in Mississippi. The other V. cholerae non-O1, non-O139
isolate was from a stool specimen from an adult who was not hospitalized.
No deaths were associated with the non-wound cases. After natural disasters
such as Hurricane Katrina, the risk for illness related to infectious diseases
is a public health concern. The findings in this report describe illnesses
caused by Vibrio species, including wound infections resulting from
post-hurricane exposure of wounds to flood waters. These findings underscore
the need for prompt recognition and management of Vibrio wound infections
by healthcare providers. When the number of illnesses from infectious diseases
increases after a natural disaster, they usually are caused by infectious
agents normally present in the community or local environment (Blake PA:
Communicable disease control. In: Gregg MB, ed. The public health consequences
of disasters. Atlanta, GA: US Department of Health and Human Services,
CDC; 1989:7-12). Nationwide, an average of 412 cases of noncholeragenic
Vibrio
(all Vibrio species other than toxigenic V. cholerae O1 or
O139) illnesses were reported each year during 2000-2004, including an
average of 146 cases reported from the 5 Gulf Coast statesref.
The most frequently reported Vibrio species are V. parahaemolyticus,
V.
vulnificus, and nontoxigenic V. cholerae. Vibrio illnesses in
the USA are seasonal and peak during the summer. During 2000-2004,
in the month of September, an average of 14 (range: 11-18) noncholeragenic
Vibrio infections were reported from Gulf Coast states; an average
of 7 cases (range: 4-8) were wound-associated. Except for toxigenic V.
cholerae O1 or O139, Vibrio illnesses are not nationally notifiable
in the USA, and the actual number of noncholeragenic Vibrio illnesses is
likely greater than the number reported. Cholera is a severe diarrheal
illness caused by V. cholerae serogroups O1 or O139, which produce
cholera toxin (i.e., toxigenic V. cholerae O1 or O139). A small
endemic focus of toxigenic V. cholerae O1 exists in the Gulf of
Mexicoref.
During 2000-2004, a total of 16 cases of cholera were reported in the USA,
and 13 (81%) of these infections were acquired during overseas travel or
by consumption of imported seafood. Only 3 (19%) infections were acquired
in the Gulf Coast states, all in the year 2000. Therefore, the risk for
acquiring cholera associated with Hurricane Katrina is extremely low. Since
2000, at least 7 noncholeragenic Vibrio species (V. vulnificus,
V.
parahaemolyticus, nontoxigenic V. cholerae, V. alginolyticus,
V.
fluvialis, V. mimicus, and V. hollisae) have been reported
as causing illness each year in the USA. Although these organisms and those
that cause cholera are grouped together under the genus Vibrio, they cause
distinctly different illnesses. In the USA, noncholeragenic Vibrio usually
are either foodborne (e.g., resulting from eating raw or undercooked shellfish,
particularly oysters, or other contaminated foods) or wound-associated
(e.g., resulting from exposure to seawater or brackish waters where the
organism naturally occurs). The incubation period for noncholeragenic Vibrio
infection usually is 12-72 hours but can be as long as 1 weekref.
Noncholeragenic Vibrio illnesses are not transmitted easily from person
to person. Outbreaks, which are rare, usually are the result of consuming
contaminated shellfish. The most frequently reported post-hurricane Vibrio
illnesses
were V. vulnificus and V. parahaemolyticus wound infections.
These cases represent an increase over the normal reported incidence of
Vibrio wound infections in Gulf Coast states and are consistent with exposure
after hurricane landfall. Although precise exposure histories are not yet
available for all patients, the infections caused by V. vulnificus
likely resulted from wounds exposed to flood waters among persons with
medical conditions that predisposed them to Vibrio infections. No
evidence has been found of increased Vibrio gastrointestinal illness.
V.
vulnificus wound infections can begin as redness and swelling at the
site of the wound and rapidly progress in patients at high risk to cause
systemic illness, including sepsis. Whether acquired through wound infection
or ingestion, V. vulnificus typically causes a severe and life-threatening
illness characterized by fever and chills, decreased blood pressure (septic
shock), and blood-tinged blistering skin lesions (hemorrhagic bullae).
Persons with chronic liver disease or immunocompromising conditions are
particularly at risk for severe V. vulnificus infectionsref
(Tuttle J, Kellerman S, Tauxe RV: The risks of raw shellfish: what every
transplant patient should know. J Transpl Coord 1994; 4:60-3). V. parahaemolyticus
typically causes gastroenteritis after consumption of contaminated shellfish.
Less frequently, V. parahaemolyticus causes wound infections that
are generally less severe than V. vulnificus wound infections. However,
in persons with liver disease or immunocompromising conditions, V. parahaemolyticus
wound infections can lead to death. Nontoxigenic V. cholerae causes
primarily gastroenteritis, but unlike toxigenic V. cholerae O1 or
O139, nontoxigenic V. cholerae do not cause epidemics. Illness caused by
this organism ranges in severity from mild diarrhea to severe watery diarrhea.
Fever and bloody diarrhea are not typically observed. Immunocompromised
persons and persons with liver disease can experience a more severe illness,
including fever, chills, and septic shock. This organism has rarely been
reported to cause wound infections. Vibrio infections are diagnosed
by culture of wound, blood, or stool specimens. For stool specimens, a
selective medium of thiosulfate-citrate-bile salts-sucrose agar (TCBS)
is recommended. If clinical suspicion of enteric Vibrio infection
exists, the microbiology laboratory should be notified so that TCBS media
will be used. Clinical laboratories should send all Vibrio isolates
to state public health laboratories for confirmation. CDC continues to
work with local and state public health officials to investigate post-Katrina
Vibrio
illnesses. Persons working in hurricane-damaged areas, especially in areas
with standing brackish water, should wear boots and other protective gear
to prevent wounds and to prevent exposure of broken skin to contaminated
water. To prevent Vibrio infections, persons with open wounds or
broken skin should avoid contact with brackish water or seawater, especially
if they have preexisting liver disease or other immunocompromising conditions.
Injury prevention is especially important for persons in these high-risk
populations. Healthy persons are at much lower risk for Vibrio infection.
In areas where flood waters have receded and surfaces are dry, Vibrio
should not be a concern because the organism is killed rapidly by drying
(In: Mitscherlich E, Marth EH, eds. Microbial survival in the environment.
New York, NY: Springer-Verlag; 1984:515-34). To reduce the risk for Vibrio
wound infection, persons should wash all wounds that have been exposed
to sea or brackish waters with soap and clean water thoroughly as soon
as possible and seek medical care for any wound that appears infected.
Clinicians should be vigilant for Vibrio infection in hurricane
evacuee populations, particularly in patients with infected wounds and
especially if the patients are in a high-risk group. If V. vulnificus
is suspected, antimicrobial therapy should be initiated immediately; prompt
treatment can improve survival. Antimicrobials effective against Vibrio
infections include doxycycline, 3rd-generation cephalosporins (e.g., ceftazidime),
fluoroquinolones, and aminoglycosides (Daniels NA, Evans MC, Griffin PM:
Noncholera Vibrios. In: Scheld WM, Craig WA, Hughes JM, eds. Emerging infections
4. Washington, DC: ASM Press; 2000:137-47). Wound infections also should
be treated with aggressive attention to the wound site; amputation of the
infected limb is sometimes necessaryref.
3 documented V. cholerae infections (all non-O1, non-O139) related
to the hurricane have occurredref.
Vibrio
cholerae is not endemic in Louisiana, but more pathogenic non-cholera
Vibrios are, and they have already killed and will continue to do so. The
arboviral diseases will return soon, principally West Nile virus -- worse than
in Mississippi -- and spraying will be required later when the floodwaters
recede. Current mild diarrheal diseases are viral and secondary to poor
sanitation and endemic RNA enteric viruses. More serious dysenteric disease
outbreaks could follow among those who are not evacuating flooded areas
and who are consuming contaminated food and water. Dysenteric diseases
should not be a problem in well-run shelters. Baton Rouge will probably
have a hepatitis A outbreak in 4-6 weeks so get vaccinated for HAV now,
because there are inadequate stocks of IgG and HAV vaccine to respond to
an outbreak. As we enter flu season, viral URIs will become a problem among
the elderly in shelters and may result in community acquired pneumonias.
Since many TB+ homeless persons and ex-prisoners may have been
sheltered with the elderly, infants, and the immunosuppressed, MDR-TB could
be transmitted to these susceptible populations. Reactivation of non-MDR
TB in the elderly is usually more of a problem, especially among older
immigrants from countries where TB is endemic (Viet Nam, Mexico, etc.)
and primary infections were acquired during childhood. The elderly often
babysit the infants, who are highly susceptible to TB. TB should not be
as much of a problem as viral URIs and secondary pneumonias. In the past,
there have been outbreaks in southern Louisiana of V. cholerae "El
Tor" of a specific phage type common from Apalachicola Bay to Galveston
Bay, and usually south of Interstate highway 10 (I-10). We used it in a
trial involving frog legs: see Sang, F.C.; Hugh-Jones, M.E.; Hagstad, H.V.
1987. A Research Note: Viability of Vibrio cholerae O1 on Frog Legs under
Frozen and Refrigerated Conditions and Low Dose Radiation Treatment. Journal
of Food Protection, 50:662-664. We obtained the culture from the NO PH
Laboratory which had recovered it from some prior outbreak. It was never
very virulent in the various outbreaks, and its recovery depended more
on clinician awareness than anything else. I haven't seen it reported for
some time now. Those outbreaks followed a lack of rain and thus presumably
salt-water intrusion into the oyster bays and blue crab trapping areas.
It was largely south of the I-10, presumably because of the Cajun propensity
to under cook their crabs. V. cholerae non-O1 strains are still
recovered occasionally. But, in reality, Jim is absolutely correct about
the present status of cholera in Louisiana.
In the wake of Hurricane Katrina, experts are hoping that the natural
marshland that buffers the Gulf coast can be restored. The lack of a healthy
ecology in the area increased the damage from the August storm, and should
serve as a lesson for other areas that need protection. It has long been
known that the sandy barrier islands and marshy bayous of the Louisiana
coast are capable of acting as a wave-deflecting and energy-absorbing one-two
punch. These natural features are in decline, however; more than 60 km2
of land erodes each year. And there is little mud and silt filtering into the
area to replace the eroded material: modern agriculture holds on to silt,
as do dams, so the muddy Mississippi is no longer quite so murky. Frequent
low-intensity storms have battered the area over time, washing away much
of the remaining land. Several of the barrier islands off the Louisiana
coast have been washed away altogether by Katrina. Researchers are not
sure that things would be different in New Orleans had the delta still
had its Delaware-sized swathe of wetlands. But the communities along the
Gulf coast would have been much better off. Smaller storms on the same
areas 50 years ago: you've got people who as kids never saw water in their
yard, and now there's water in the yard and covering the road. Without
a doubt, our coastal landscape used to protect us from storms. Similar
protection from natural landscapes has been noted elsewhere. Healthy bands
of coral reefs or mangroves, for example, saved some areas from the damage
wreaked by the tsunami in the Indian Ocean in December 2004; degraded areas
fared worse. Experts caution that coastal protection in that area is vital.
Bangladesh is losing its coasts much as we are, and they have typhoons
equivalent to our hurricanes. A 1998 document called Coast
2050, written by state officials, called for significant investment
in wetlands restoration, not only for the good of fisheries and endangered
species, but for buffering hurricane storm surges. The state's Department
of Natural Resources says it will act on the top priorities of the plan,
depending on funds awaiting approval or appropriation in Congress in 2005.
Funding is likely to be far short of the $14-billion estimated cost of
the whole plan; observers say they are more likely to get $1.9 billion
over 10 years. Ecosystem services are a hard sell in Congress, let's face
it. It is not clear whether the disaster will focus attention on the area
and liberate funds, or if money will be spent instead on immediate rebuilding.
We need sustained funding for our chronic erosion problem : the scientists
are really frustrated that it has been ignored. I think in many ways you
need a catastrophe for the world to sit up.
Hurricane
Rita, with its winds of 270 km/hr, battered Texas in September 2005 : up
to 1 million people were evacuated from Houston and Galveston County. Many
oil rigs and refineries around the coast have been abandoned, threatening
to send US oil prices soaring. Meanwhile, NASA has closed the Johnson Space
Center in Houston, and transferred International Space Station operations
to Russian mission control in Korolev, near Moscow. After Hurricane Katrina,
which devastated New Orleans and its environs on 29 August, the 6-man team
of specialist forecasters at the National Hurricane Center in Miami, Florida,
are under more pressure than ever to predict where Rita will make landfall.
They're working harder than the normal 40 hours a week, and doing extra
shifts. Their shifts are 8-9 hours, and because we provide 24-hour coverage
many of them work overnight. How do they come up with the forecast?
When the specialists start their shifts they're briefed by their predecessors,
and then start collecting data from ships and buoys in the Gulf of Mexico.
They also collate information from aircraft reconnaissance flights, and
get updated satellite images every 30 minutes. All of this gives them a
wind speed and direction, sea heights and atmospheric pressure. They also
have a conference call with other meteorologists around the Gulf to discuss
the data that will go into computer models that predict the hurricane's
path. The specialist uses at least a dozen hurricane models, and sometimes
results diverge. The models can take a system north or west, every which
way. The skill of the specialist is to pick the most likely outcome. From
start to finish it's about a 3-hour process to produce a forecast. The
latest track has it hitting the coast of central Texas, about 160 km below
Houston or so. We're predicting landfall by 7 am local time (11 am GMT)
on Sat 24 Sep 2005. Typically our forecasts are very accurate 3 days in
advance, although right now the cone of uncertainty covers most of the
Texas coast. Is there any chance the storm could peter out before it reaches
the coast? Rita looks like a category 5 at the moment. Although storms
don't tend to maintain that intensity for long, it's definitely going to
strike land as a hurricane. Even if it goes down to a category 1, with
winds > 118 km/hr, that's certainly strong enough to do damage. Katrina
was a category 1 when it hit southeastern Florida, and it uprooted trees
and damaged cars and houses here. How certain can you be? Forecasting the
tracks of hurricanes has become a lot more accurate over the past few years.
For Katrina we got it almost exactly right. Direction forecasting has been
improved by better computing power, more data from ships and buoys to put
into the models, and advances made in those models by researchers. Intensity
forecasting is another matter. At the moment it's difficult to predict
what category hurricanes will be when they hit land. We're trying to improve
that by getting the National Oceanic and Atmospheric Administration and
Air Force airplanes that fly over the storms to make observations, to take
pressure readings and also drop sounders into the hurricane to get measurements
from inside. Has this hurricane season seen record activity? I think this
is the fourth most active Atlantic season on record. The hurricane season
ends on 30 November, but activity will peak around 20 October, so it's
more than likely there will be 2 or 3 more named systems by the end of
it. Is it unusual for Texas to be in the firing line? It's been a while
since a major hurricane reached Texas. The last one was Hurricane Carla,
which hit in 1961. They're certainly infrequent. [Hurricane Carla was a
category-4 storm that caused > 30 deaths. Arguably the worst US natural
disaster was from a hurricane that hit Galveston, Texas, in 1900, killing
at least 8,000 people.] What do the forecasters do when the hurricane season
ends? We issue forecasts for all Atlantic and eastern Pacific tropical
storms year round, but after November they tend to be marine warnings for
shipping. Late in Aug 2005, a Houston medical school finally got a US$50-million
insurance payout for damages it sustained during a tropical storm 4 years
ago. The money, ironically, comes just in time for the Texas Medical Center
to face up to another storm: Hurricane Rita is due to strike this weekend.
At least this time they'll be better prepared, recalling lessons they learned
in 2001 from Tropical Storm Allison. That storm flooded
the multi-institutional centre, drowning research animals in basement labs
and causing priceless tissue and cell samples to thaw and be destroyed.
All told, Allison wreaked $2 billion in losses. The affected institutions
are only just starting to recover this money. Since that disaster, the
medical centre has spent millions of dollars to prevent future storm damage.
Flood gates have been installed and auxiliary power generators moved to
higher floors. The measures may well come into play during the evening
of Friday 23 September and the following morning, when Rita is expected
to slam into the Texas coast. By Friday morning, most of the low-lying
parts of Houston had been evacuated. Traffic jams snarled the interstates
leading north from the city, away from the oncoming storm. At the Baylor
College of Medicine, part of the Texas Medical Center, researchers and
students had cleared out. Susan Berget, vice-president of academic planning,
was one of several officials who planned to hunker down on the campus to
ride out the storm and assess the damage afterwards. Following the 2001
storm, Berget learned firsthand how ill-prepared the US Federal Emergency
Management Agency (FEMA) was for research emergencies. They had never thought
of how to deal with a research loss. To them, transgenic mice were a foreign
concept. Houston scientists have long talked of the consequences of tropical
storms, but before Allison they had taken little action. When oncologist
Kent Osborne moved from San Antonio to Baylor in 1999, taking with him
the world's largest bank of breast-cancer tumour specimens, he was concerned
about flooding. The collection contained about 100,000 specimens and was
later valued between $100 million and $300 million. When Baylor proposed
putting the bank's 30 massive freezers in the flood-vulnerable basement,
we put up a fight. But the freezers weighed so much they couldn't go on
higher floors. They stayed down below until Allison's flood waters tossed
them around like toys. Research losses are usually only partly covered
by insurance. After a natural disaster, FEMA pays 75% of certain losses;
under such terms, it contributed $4.5 million for Baylor to buy 2,000 breast-cancer
specimens to help rebuild its bank. Allison prompted other Houston universities
to make major changes. We have nothing that can be destroyed by water on
our lower floors. Their preparations will undoubtedly be tested this weekend
by Rita. In New Orleans, the lessons of Allison apparently failed to sink
in. After Hurricane Katrina flooded the city on 30 August, the medical
centres at Louisiana State and Tulane universities lost their auxiliary
generators. The backup power supplies had been located on the ground floors
and were easily knocked out by flooding; countless research samples were
lost. Allison was a wakeup call. Hurricane Rita's approach toward the Texas
coast has raised the specter of a long-ago killer storm, which struck before
meteorologists were able to track hurricanes with radar and weather satellites.
On September 8, 1900, a powerful hurricane buried the thriving port city
of Galveston in a storm surge of almost 16 feet (5 m). Conservative estimates
put the death toll at 6,000, but 8,000 or more probably died. The unnamed
hurricane that nearly scraped Galveston off the map 105 years ago was very
similar to Hurricane Rita. The 1900 hurricane is thought to have packed
winds of 130-140 miles/hour (210-225 km/hr), which would make it a Category
4 hurricane on today's Saffir-Simpson scale. As of 5 a.m. today, Hurricane
Rita's strongest winds were blowing at 140 miles/hour. Over the past 2
days, millions of residents along the Gulf Coast have been streaming inland
to avoid Rita, which is expected to make landfall Saturday near the Texas-Louisiana
border. Unlike today's evacuees, however, Galveston residents in 1900 had
no idea what was in store for them as the storm drew near their island
city of 37,000. Ida Smith Austin, who survived the hurricane, said she
knew bad weather was expected, but she and her husband didn't alter their
plans. "A storm had been predicted for Friday night of September 7, but
so little impression did it make on my mind that a most beautiful and well
attended moonlight fete was given at our home Oak Lawn that night," Austin
wrote in a letter dated November 6, 1900. The following day, as the hurricane
drew nearer, the storm surge began to cover Galveston. "In a few minutes,
we heard the lapping of the salt water against the sidewalk, and then it
slowly crept into the yard," Austin wrote. "In an incredibly short time,
the water surged over the gallery driven by a furiously blowing wind."
The storm surge crushed buildings and pushed them into a huge pile. The
debris became like a giant bulldozer blade, knocking down more buildings
as the surge moved across the island. After the storm, Galveston residents
were determined to rebuild their city and prevent a recurrence of the tragedy.
The buildings that survived were raised, and sand from the Gulf of Mexico
was pumped onto the island to lift it 8 feet (2.5 meters) above sea level.
A 17-foot-tall (5-m-tall) seawall also was built to protect Galveston from
storm surges. Galveston leaders a century ago probably thought this dramatic
effort would protect the city from whatever the Gulf of Mexico could throw
at it. But they didn't envision a storm like Rita. Depending on where Rita
makes landfall, hurricane forecasters fear the storm could bring a surge
of more than 20 feet (6 m) into Galveston Bay, which would easily overtop
the city's seawall.
Cold War nuclear-bomb testing may have a beneficial legacy: helping
forensic experts to estimate more accurately the age of dead bodies. Radiocarbon
from the many blasts detonated during the 1950s and 60s is present
in the tooth enamel of people all over the world, allowing examiners to
work out when the teeth were formed. The method, developed by researchers
at the Karolinska Institute
in Stockholm, Sweden, has already helped Swedish police to identify 6 of
that country's victims of the 2004 Asian tsunami. And it may help with the
task of putting names to the victims of Hurricane Katrina. The technique
works because a burst of nuclear testing in the 1950s and early 1960s, chiefly by the USA
and the Soviet Union, released huge amounts of the radioactive isotope carbon-14
into the atmosphere. This was subsequently taken up by crops and other
plants, thus entering the food chain. This bomb carbon is not dangerous,
but it is traceable. Since the signing of the Partial Test Ban Treaty in
1963, which banned above-ground nuclear detonations, atmospheric levels
of C-14 have been dropping at a known rate. And because tooth enamel is
not replenished after a certain age, the carbon in that enamel bears the
hallmarks of the time when it was made. Someone in middle age, whose teeth
were formed in the 1960s, will for example have higher levels of C-14 in
their enamel than a teenager, whose teeth were made when there was less
carbon-14 around. The dating method can judge a person's age to within
1.6 yearsref.
This is an improvement on existing methods, such as examining skeletal
remains, which is thought to be accurate to only 5-10 years. What's more,
the new method gives an actual birth date for the body, rather than simply
an age at death. This could be helpful in situations when it is unclear
how long the victim in question has been deceased. An accurate estimate
of a body's birth date could be a great help in identifying victims of
disasters such as the Asian tsunami or the genocide in the former Yugoslavia.
The alternative method of DNA profiling requires exhaustive analysis of
tissue samples, many of which simply no longer exist. Teeth are far easier
to preserve for later examination. Investigators struggling to identify
bodies left behind by Hurricane Katrina may also adopt the method. Victims
of drowning can sometimes be hard to identify thanks to water damage, and
the flooding may have destroyed some records traditionally used for identification.
In situations like New Orleans they will be having huge problems. In many
cases dental records will have been washed away. As the flurry of Cold
War testing recedes into the past, atmospheric carbon-14 levels will eventually
tail off to a level where the method will no longer work, but that is several
decades away. (1963
nuclear testing treaty)
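A minimal sketch, in Python, of how such bomb-pulse dating works, assuming an illustrative (invented) post-1963 atmospheric carbon-14 table and an assumed average age at which the sampled tooth's enamel finished forming; the real method uses published atmospheric calibration curves and tooth-specific enamel-formation ages:

# Illustrative bomb-pulse dating of tooth enamel. The calibration table is a
# rough, invented stand-in for the real post-1963 atmospheric 14C curve.
ATMOSPHERIC_F14C = {   # year -> assumed fraction-modern 14C of the atmosphere
    1965: 1.70, 1970: 1.55, 1975: 1.40, 1980: 1.30,
    1985: 1.22, 1990: 1.16, 1995: 1.11, 2000: 1.08, 2005: 1.06,
}

def year_of_enamel_formation(measured_f14c):
    # linear interpolation on the falling limb of the bomb curve
    years = sorted(ATMOSPHERIC_F14C)
    for y0, y1 in zip(years, years[1:]):
        f0, f1 = ATMOSPHERIC_F14C[y0], ATMOSPHERIC_F14C[y1]
        if f1 <= measured_f14c <= f0:
            return y0 + (measured_f14c - f0) * (y1 - y0) / (f1 - f0)
    raise ValueError("measurement outside the assumed bomb-pulse range")

def estimated_birth_year(measured_f14c, enamel_lag_years=12):
    # subtract the (assumed) age at which this tooth's enamel was completed
    return year_of_enamel_formation(measured_f14c) - enamel_lag_years

print(estimated_birth_year(1.35))   # enamel formed ~1977.5, so birth year ~1965.5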
Possible links between hurricane formation and global warming are a
contentious issue in climate policy. And last week's Hurricane Katrina
in the United States has fanned the flames. The depth of the divide between
supporters and sceptics became apparent in January, when US meteorologist
Chris Landsea resigned from the Intergovernmental Panel on Climate Change
(IPCC). He was protesting against statements made by his panel colleague,
Kevin Trenberth, who had supported a link between warming and storms in
a press conference. Trenberth, head of the climate analysis section at
the National Center for Atmospheric Research in Boulder, Colorado, tried
to set the record straight in an article published in Science in Juneref.
Owing to large natural variability, trends in hurricane frequency are indeed
difficult to attribute to climate change, but human influences on the environment
probably affect hurricane intensity and rainfall. Trenberth's view is supported
by the most recent and solid analysis of hurricane destructiveness over
the past 30 years, by leading US hurricane researcher Kerry Emanuel of
the Massachusetts Institute of Technology in Cambridge, Massachusetts.
In his Nature paperref,
Emanuel concludes that "future warming may lead to an upward trend in tropical
cyclone destructive potential, and ... a substantial increase in hurricane-related
losses in the twenty-first century." The main cause of all this may be
increased sea surface temperatures; these are the most important variable
affecting hurricane formation. Extra warmth means there is more energy
available to a storm, and more water is likely to be sucked up into the
clouds. North Atlantic surface temperatures have been significantly above
average for at least a decade, a trend many scientists agree must be associated
with global warming. In early August, the Gulf of Mexico was a striking
2-3 °C warmer than it usually is at that time of year. Katrina sucked
out so much heat energy from the Gulf that water temperatures dropped dramatically,
in some regions from 30 °C to 26 °C, after the storm had passed.
Sea temperatures are likely to remain high until October 2005. The IPCC
came to the tentative conclusion that hurricanes might be on the rise in
their most recent assessment report, published in 2001. It stated: "There
is some evidence that regional frequencies of tropical cyclones may change....
There is also evidence that the peak intensity may increase by 5% to 10%
and precipitation rates may increase by 20% to 30%." But the panel adds
that the certainty of these statements is low: lower than our understanding
of air temperature changes, for example. There is a need for much more
work in this area to provide more robust results. Naturally, the bulk of
the media are less reserved. "The hurricane that struck Louisiana was nicknamed
Katrina by the National Weather Service. Its real name is global warming."
Such statements are not pure invention, but most scientists are made uncomfortable
by this specific kind of attribution. Even in the presence of a statistically
robust trend, it is unscientific to attribute a discrete atmospheric event
to climate change. No matter what one's opinion is on global warming and
hurricanes, one shouldn't score cheap points by turning a scientific question
into a political one. Attempts to use hurricanes to justify energy policies
to mitigate climate change may prove counterproductive. This would only
provide a great opening for criticism of the underlying scientific reasoning.
As director of the Center for the Study of Public Health Impacts of
Hurricanes at Louisiana State University in Baton Rouge, where has your
research been focused? We have had 3 parallel activities over the past
four years. One is to develop a model to understand the physical dimensions
and impact of a hurricane on New Orleans. A second is to try to get an
understanding of the attitudes of the people of New Orleans about how they
would react and where they would go. Tied to that, we have transport engineers
who look at evacuation issues, who worked closely with the state to develop
the rather successful counterflow evacuation this year. Another important
thing is to develop a database of all the infrastructure in New Orleans
and construct a very secure Geographical Information Systems (GIS) database
with > 70 layers of information. Did this work help officials respond to
Katrina? We started using our StormSurge model five days before the storm
arrived; those outputs were given to the state emergency operation centre.
The model predicted that the surge levels would be high enough that not
only were they going to be topping the levees, but that there was a high
probability of breaching. [This led to a call to evacuate the city.] We
got the go-ahead this morning to go into full operational mode with our
GIS database, so we're working closely with state and federal agencies
to integrate new layers of information [such as satellite images of the
flood waters]. We've just migrated the GIS on to a very large computer
platform with a very wide broadband so people can access it easily, because
we're getting inundated. What research are you conducting in the aftermath?
We have a team that went out to assess wind damage on 30 August 2005. They
can't get in to New Orleans but they went to the north shore of Lake Pontchartrain.
There's a team doing aerial reconnaissance tomorrow of the coastlines in
the New Orleans area. We've had individual people going out to look at
specific things, such as trying to measure water levels on the rivers,
where all the gauges are gone, and of course there's the massive GIS operation.
And as soon as it's safe we'll send a group into New Orleans to start sampling
the water for bacteria and other contaminants. Your model predicted that
many people would not evacuate. During the disaster, the stranded were
told to go to the Superdome, where there was no electricity, and not enough
food or medical aid. Some were attacked by gangs, others died from injuries.
Can you explain why this happened? The problem is there are 120,000 people
in New Orleans who don't own motor vehicles. There are some hard-headed
individuals who won't evacuate, and there is a section of the population
who didn't want to evacuate because of the fear of having their homes looted.
The majority of those who didn't leave have few resources, so they're the
most dependent. I think the city was doing the best it could in terms of
utilizing the Superdome. So do you think more money should have been
spent to prepare for those left behind? Yes. What are you most concerned
about now? There are > 400,000 refugees here in Baton Rouge, and you can
feel the desperation in the air. The federal government really needs to
be expropriating properties and building refugee camps, because desperate
people do desperate things. People here in Baton Rouge are trying their
best to help. They're taking in families; I'm taking in a family. My wife's
business is putting ads in the papers to hire refugees, to try and do what
they can to help.
There are fewer hurricanes today than there were a decade ago, but
they are stronger and more destructive, according to an analysis of satellite
images of storms from 1970 to 2004ref.
But it is not known whether the changes are associated with global warming.
With the exception of the North Atlantic, the number of storms and stormy
days has recently decreased in all oceans warm enough to form hurricanes,
or typhoons as they are called in the Pacific. Globally, the number of
stormy days per year reached a peak of 870 days in 1995; it has since decreased
by 25% to around 600. The number of storms that reached the highest categories
4 and 5, which have wind speeds of 200 kilometres per hour or more, has
almost doubled over the past 35 years, the team found. In the 1970s, an
average of ten category-4 and -5 storms occurred each year. The average
number of such extreme storms has since risen to 18, although only a fraction
of these eventually hit land. These results agree with recent evidence
that hurricanes have become significantly more destructive over the past
decade. The financial loss that a storm causes is roughly proportional
to the cube of its wind speed : this was used to calculate a 'power dissipation
index'. This index has sharply increased since 1990. Sea surface temperatures
are the dominant variable in hurricane formationref.
A 0.5 °C rise between 1970 and 2004 in sea surface temperatures is
most likely to be an effect of global warming. But scientists are puzzled
as to why, globally, warmer oceans do not lead to uniformly changed hurricane
patterns. The shift towards stronger storms has been much less pronounced
in the North Atlantic, for example, than in other ocean basins. There must
be a complex natural mechanism at play that we don't yet fully understand.
Different ocean basins seem to display different patterns of decadal natural
variability. But with so few long-term cycles to look at, it is hard to
pin down the differences. The relationship between global warming and hurricane
behaviour is hotly debated. Some scientists argue that links must exist;
others say warming influences must be tiny when compared with natural swings.
In either case, the dramatic increase in economic losses from storms is
mostly due to societal changes in hurricane-prone regions.
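A rough sketch of how a power-dissipation-style index of the kind mentioned above is built up: the cube of each storm's maximum sustained wind is summed over the storm's lifetime and then over all storms in a season. The 6-hourly sampling mirrors typical best-track records, but the wind values below are invented for illustration:

# Power-dissipation-style index: sum of v_max^3 over each storm's lifetime,
# then over all storms in the season (illustrative values only).
def storm_pdi(max_winds_ms, dt_hours=6.0):
    dt_seconds = dt_hours * 3600.0
    return sum(v ** 3 for v in max_winds_ms) * dt_seconds   # units: m^3 s^-2

storms = [
    [18, 25, 33, 45, 52, 48, 36, 20],   # hypothetical storm 1, 6-hourly winds in m/s
    [20, 28, 35, 40, 38, 30],           # hypothetical storm 2
]
annual_pdi = sum(storm_pdi(w) for w in storms)
print(f"seasonal power dissipation index ~ {annual_pdi:.2e} m^3 s^-2")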
Web resources :
tornado / whirlwind / twister : a violently
rotating column of air in contact with and extending between a convective
cloud and the surface of the earth. Tornadoes are the most destructive of all
storm-scale atmospheric phenomena. They can occur anywhere in the world given
the right conditions, especially after the landfall of hurricanes.
Wind speeds in tornadoes range from values below that of hurricane speeds
to > 300 miles per hour. Unlike hurricanes, which produce wind speeds of
similar values over relatively widespread areas (when compared to tornadoes),
the maximum winds in tornadoes are often confined to extremely small areas,
and vary tremendously over very short distances, even within the funnel
itself. The tales of complete destruction of one house next to one that
is totally undamaged are true and well documented. In 1971, Dr. T. Theodore
Fujita of the University of Chicago devised a scale to classify U.S. tornadoes
into 6 intensity categories, named F0-F5. These categories
are based upon the estimated maximum winds occurring within the funnel.
The Fujita Tornado Scale (or the "F Scale") has subsequently become
the definitive scale for estimating wind speeds within tornadoes based
upon the damage done to buildings and structures. It is used extensively
by the National Weather Service in investigating tornadoes (all tornadoes
are now assigned an F scale), and by engineers in correlating damage to
building structures and techniques with different wind speeds caused by
tornadoes. The Fujita Scale bridges the gap between the Beaufort
wind speed scale and Mach numbers (ratio of the speed of
an object to the speed of sound) by connecting Beaufort Force 12 with Mach
1 in 12 steps. The equation relating the wind velocities (V in mph) with
the F scale (F) is V = 14.1 × (F + 2)^1.5.
F1 on the Fujita Scale is equal to B12 (73 mph) on the Beaufort scale,
which is the minimum windspeed required to upgrade a tropical storm to
a hurricane. F12 on the Fujita Scale is equal to Mach 1 (738 mph). Though
the Fujita Scale itself ranges up to F12, the strongest
tornadoes max out in the F5 range (261 to 318 mph).
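The conversion formula quoted above is easy to check; F1 comes out at about 73 mph (the threshold for hurricane-force wind) and F5 at about 261 mph, matching the bands in the table below:

# Fujita-scale wind-speed formula quoted above: V = 14.1 * (F + 2)^1.5 (mph)
def fujita_wind_mph(f):
    return 14.1 * (f + 2) ** 1.5

for f in (1, 5, 12):
    print(f"F{f}: {fujita_wind_mph(f):.0f} mph")   # F1 ~73, F5 ~261, F12 ~739 (Mach 1)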
F scale | wind speed | equivalent hurricane category | typical damage
F0 | 40-72 mph (35-62 kt) | n/a | Gale Tornado. Light damage: some damage to chimneys; breaks twigs and branches off trees; pushes over shallow-rooted trees; damages signboards; some windows broken. (Hurricane wind speed begins at 73 mph.)
F1 | 73-112 mph (63-97 kt) | Cat 1/2/3 | Moderate Tornado. Moderate damage: peels surfaces off roofs; mobile homes pushed off foundations or overturned; outbuildings demolished; moving autos pushed off the roads; trees snapped or broken.
F2 | 113-157 mph (98-136 kt) | Cat 3/4/5 | Significant Tornado. Considerable damage: roofs torn off frame houses; mobile homes demolished; frame houses with weak foundations lifted and moved; boxcars pushed over; large trees snapped or uprooted; light-object missiles generated.
F3 | 158-206 mph (137-179 kt) | Cat 5 | Severe Tornado. Severe damage: roofs and some walls torn off well-constructed houses; trains overturned; most trees in forests uprooted; heavy cars lifted off the ground and thrown; weak pavement blown off roads.
F4 | 207-260 mph (180-226 kt) | Cat 5 | Devastating Tornado. Devastating damage: well-constructed homes leveled; structures with weak foundations blown off some distance; cars thrown and disintegrated; large missiles generated; trees in forest uprooted and carried some distance away.
F5 | 261-318 mph (227-276 kt) | n/a | Incredible Tornado. Incredible damage: strong frame houses lifted off foundations and carried considerable distance to disintegrate; automobile-sized missiles fly through the air in excess of 300 ft (100 m); trees debarked; incredible phenomena will occur.
F6-F12 | > 319 mph (> 277 kt) | n/a | The maximum wind speeds of tornadoes are not expected to reach the F6 range.
anticyclones / high-pressure system :
an area of relative pressure maximum that has diverging winds and a rotation
opposite to the earth's rotation. This is clockwise in the Northern Hemisphere
and counterclockwise in the Southern Hemisphere
subtropical high : semi-permanent
high-pressure region near 30 degrees latitude.
mass movement : a downhill movement
of soil or fractured rock under the force of gravity.
detritus : loose rock or mineral material that is dislodged from
bedrock by mechanical means and transported from its place of origin.
earthflow : a detachment of soil and broken
rock and its subsequent downslope movement at slow or moderate rates in
a stream- or tongue like form.
mudflow : a flow of water-saturated earth
material possessing a high degree of fluidity during movement. A less-saturated
flowing mass is often called a debris flow. A mudflow originating
on the flank of a volcano is properly called a lahar.
avalanche : a large mass of material falling
or sliding rapidly due to the force of gravity. In many cases, water acts
as a catalyst and/or lubricant. Avalanches often are classified by what
is moving, such as a snow, ice, soil, or rock avalanche. A mixture of these
materials is commonly called a debris flow.
debris avalanche : a flow of unsorted masses of rock and other material
downslope under the influence of gravity. Water is commonly involved as
a catalyst and/or lubricant. For example, a rapid mass movement that included
fragmented cold and hot volcanic rock, water, snow, glacial ice, trees
and other debris and hot pyroclastic material was associated with the May
18 1980 eruption of Mt. St. Helens. Most of the deposits in the upper valley
of the North Fork Toutle River and in the vicinity of Spirit Lake are from
the debris avalanche resulting from the eruption.
landslide : an abrupt movement of soil and
bedrock downhill in response to gravity. Landslides can be triggered by
an earthquake or other natural causes. Undersea landslides can cause tsunamis.
debris flow : a type of landslide made up of a mixture of water-saturated
rock debris and soil with a consistency similar to wet cement. Debris flows
move rapidly downslope under the influence of gravity. Sometimes referred
to as earth flows or mud flows.
rockslide : a landslide involving mainly large blocks of detached
bedrock with little or no soil or sand.
slump : a type of landslide in which a mass of rock breaks away
along a curved surface and rotates more or less intact downslope. The sliding
mass of rock is called a slump block.
A study of Native American tales of a 2-headed serpent spirit has hinted
at the potential impact of a fault that lies directly beneath Seattle.
Researchers have been studying stories from the Salish people of the North
American west coast about a'yahos, a spirit associated with shaking of
the ground and rushing, muddy water. They say the tales are strongly linked
to a quake that occurred in AD 900. By tracing the tales to specific locations,
they have found evidence of ancient landslides. Another earthquake in the
now densely populated area could trigger a similar slide, with potentially
disastrous effects. I don't think people fully appreciate the severity
of the landslides : such work can help to educate residents about the risks
of an earthquake. The Seattle fault line, which lies directly beneath that
city in Washington state, last produced a major earthquake in AD 900. That
earthquake, of estimated magnitude 7.4 on the Richter scale, caused landslides,
major ground upheavals and a tsunami in the adjacent Puget Sound. Many
Native American settlements were destroyed, although the area was quickly
repopulated. Today, nearly 4 million people live in the area, but seismologists
know relatively little about the last earthquake, or when another might
occur. Ludwin and her colleagues collected stories about a'yahos in an
attempt to discover more. A dozen appearances of this spirit occurred at
locations close to small faults connected to the larger Seattle fault line,
with five appearances occurring on or very close to the main fault. One
story tells of a 'spirit boulder' associated with the appearance of a'yahos
on a beach in west Seattle, and Ludwin tracked it down with the help of
local Salish people. "If you hadn't heard the stories you would never pick
it out," she says. By using a technique called light detection and ranging
(LIDAR) to image the ground beneath and around the rock, they found that
it lies at the bottom of a landslide near land that is known to have been
pushed upwards by the AD 900 earthquake. At another site in greater Seattle,
stories say that a'yahos demolished a cliff side when people disturbed
him. LIDAR pictures again found evidence of a large and previously unknown
landslide (Ludwin, R. S., et al. Seismol. Res. Lett. 76, 431-436 (2005)).
The work is interesting, but conventional Earth science studies may be
more helpful in predicting the impacts of future quakes. Human records
do give the event a reality that geological evidence can't ever give, but
such studies can probably tell us more about anthropology than earthquakes.
The Eiger, Switzerland's
most infamous mountain, is traditionally best known for the challenge of
ascending its north face. But it gained fame in another way last week when
a huge chunk fell from its eastern flank, triggering claims that climate
change has been implicated in yet another high-profile natural event. Geologists
predicted the rockfall
in June 2006, after noting that large cracks were working their way into
the mountain's side. The crash finally came on 13 July: Swiss rescue services
estimate that some 700,000 m3 of rock, about the size of a large
skyscraper, has tumbled into the valley below. Tourists have flocked to
the area, hoping to witness the mountain shedding even more rock. Geologists
reckon that a further 1.3 million m3 may be at risk, given the
growing cracks. Such a massive collapse is certainly dramatic. But the
fervour with which tourists have seized on the Eiger landslide probably
owes more to the mountain's fame than to the rarity of such events, or
to the possibility that climate change is affecting the mountains. Climate
change can melt the 'glue' that holds rock together in cold environments.
So melting ice can, in theory, release that hold and allow gravity to peel
away chunks of rock. And the glaciers in the Alps do seem to be retreating.
But there's nothing to indicate that unusually warm weather is specifically
responsible for this collapse on the Eiger. And geologists are at pains
to point out that such rockfalls are a natural part of the process by which
mountains rise and ultimately fall. This is just part of the mountain cycle
— it would have happened anyway. Mountains are fundamentally unstable,
otherwise they would just get taller and taller. The point does come where
they get too tall for their own good. Globally, 5-10 large rockfalls occur
each decade. And that's just the ones geologists know about — the real
figure may be as high as 50, because so many of the world's mountain ranges
are all but inaccessible. The Alps were hit by a similar rockfall just
two years ago, when a 75-m-tall chunk of rock came loose in northern Italy,
blocking a hikers' trail. And in 1963, a catastrophic collapse on Italy's
Mount Toc wiped out the village of Longarone, claiming some 2,000 lives.
The Swiss even have a name, sturzstrom, for massive rockfalls that tumble
down mountains and spread far across the terrain below. But that doesn't
mean these events are entirely humdrum. One puzzling feature of these types
of rockfall is their ability to travel several kilometres from the point
of impact under the crumbling mountainside — farther than expected from
simple models. Some geologists argue that the sprawling effects of a sturzstrom
occur because, as the rocks break apart, they trap a layer of air under
them, causing a 'hovercraft' effect that carries the blanket of boulders
along. Another theory is that the impact releases such enormous amounts
of energy that water in the ground becomes steam, which carries the rocks.
But this is little more than speculation. The Eiger event is at the bottom
end of the size range where the effect can be seen. Fortunately, even if
all of the rock that threatens to tumble does fall, it is unlikely to cause
harm — the valley beneath is empty of villages, so the mysterious ability
of sturzstroms to sweep great distances isn't expected to cause a problem
here
Web resources :
rockfall : falling, bouncing, and rolling of debris down slope
Prevention : tiny tremors can signal a collapsing
cliff at least 2 hours before the rocks start to tumble. A team of French
researchers led by David Amitrano of the National Polytechnic Institute
of Lorraine in Nancy and colleagues spotted seismic signals in one seaside
cliff before it collapsed, and the same technique should work on other
unstable pieces of land. The research has persuaded the French government
to consider installing early-warning systems in other vulnerable sites.
There is special concern about cliff collapse in France simply because
many of its coastal towns are built along the clifftops. A Europe-wide
project to study erosion is called PRediction
Of The Erosion of Cliffed Terrains (PROTECT). Other nations have similar
problems. In Denmark, which is one of the major players in the PROTECT
scheme, cliffs on the Baltic coast are particularly prone to crumbling:
a recent collapse killed a woman and dog walking on the beach below. The
chalk cliffs of Dover and Beachy Head on the south coast of England also
experience regular rock falls. Current warning systems, such as rockslide
monitors on roads in the Alps, rely mostly on detecting the noise of a
fall after it starts. So far Amitrano's system is untested, so the team
doesn't yet know how reliable or precise it could be. The chalk cliff at
Mesnil-Val in Haute Normandie is eroding fast, retreating by up to a metre
a year. It is particularly hard to detect seismic tremors in chalk because
the soft rock damps the quivers before they travel far. If it works in
chalk, it'll work in anything. They placed 5 seismic detectors in boreholes
over a 60-m section of the 50-m-high cliff face. Vibrations in the rock
were picked up by the sensors and converted to radio signals. These were
then broadcast, using the French telephone network, to a data-analysis
system in Amitrano's lab at Nancy that was continually monitoring the array's
output. 2 hours before the collapse on 23 June 2002, 1 of the 5 sensors
picked up activity that increased at an exponential rate until at least
a thousand m3 of rock sheared away and fell on to the beach.
Similar precursors should occur in a predictable fashion for other collapses.
Detecting a precursor to a collapse event is great news, but the real test
will be whether this can be done in real time before the collapse, rather
than retrospectively. False alarms of rumbling without a collapse
could cause unnecessary panic and evacuations, and it's not clear whether
it would be feasible to install arrays of sensors over long sections of
coastlineref.
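One simple way to look for the kind of accelerating precursory activity described above is to check whether the logarithm of the small-event rate climbs steadily with time, since exponential growth plots as a straight line in log space. The hourly counts and the alarm threshold in this Python sketch are invented for illustration and are not the French team's actual algorithm:

# Crude precursor check: does log(hourly event count) rise at a steady rate?
import math

hourly_counts = [2, 2, 3, 3, 5, 8, 14, 25, 44]   # assumed counts from one sensor

logs = [math.log(c) for c in hourly_counts]
n = len(logs)
t_mean = (n - 1) / 2.0
l_mean = sum(logs) / n
num = sum((i - t_mean) * (l - l_mean) for i, l in enumerate(logs))
den = sum((i - t_mean) ** 2 for i in range(n))
slope = num / den                      # per-hour growth rate of log(count)

if slope > 0.3:                        # arbitrary illustrative alarm threshold
    print(f"event rate roughly doubling every {math.log(2) / slope:.1f} h: possible precursor")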
easterly or tropical wave :
an area of relatively low pressure (trough) moving westward through the
trade wind easterlies. Generally, it is associated with extensive cloudiness
and showers, and may be associated with possible tropical cyclone development.
An inverted, migratory wave-like disturbance or trough in the tropical
region that moves from east to west, generally creating only a shift in
winds and rain. The low-level convergence and associated convective weather
occur on the eastern side of the wave axis. Normally, it moves slower than
the atmospheric current in which it is embedded and is considered a weak
trough of low pressure. It is often associated with possible tropical cyclone
development and is also known as a tropical wave.
earthquakes (a.k.a. seaquakes when
the epicenter is located at sea) are the result of forces deep within the
Earth's interior that continuously affect the surface of the Earth. The
energy from these forces is stored in a variety of ways within the rocks.
When this energy is released suddenly, for example by shearing movements
along faults in the crust of the Earth, an earthquake results. The area
of the fault where the sudden rupture takes place is called the focus or
hypocenter
of the earthquake. The point on the Earth's surface directly above the
focus is called the epicenter of the earthquake. 90% of the world's
earthquakes occur in specific areas that are the boundaries of the Earth's
major crustal plates.
[Map: epicenter locations of earthquakes of magnitude > 4.5 that occurred from 1978 through 1987]
The Earth undergoes continuous oscillations, and free oscillation peaks
have been consistently identified in seismic records in the frequency range
2-7 mHz, on days without significant earthquakes. The level of daily excitation
of this 'hum' is equivalent to that of magnitude 5.75 to 6.0 earthquakes,
which cannot be explained by summing the contributions of small earthquakes.
As slow or silent earthquakes have been ruled out as a source for the hum
(except in a few isolated cases), turbulent motions in the atmosphere or
processes in the oceans have been invoked as the excitation mechanism.
An array-based method to detect and locate sources of the excitation of
the hum has been developed and demonstrates that the Earth's hum originates
mainly in the northern Pacific Ocean during Northern Hemisphere winter,
and in the Southern oceans during Southern Hemisphere winter. The Earth's
hum is generated by the interaction between atmosphere, ocean and sea floor,
probably through the conversion of storm energy to oceanic infragravity
waves that interact with seafloor topographyref.
Nonvolcanic tremor activity (i.e., long-duration seismic signals
with no clear P or S waves) has been discovered within a transform plate
boundary zone along the San Andreas Fault (SAF) near Cholame, California,
the inferred epicentral region of the moment magnitude (M) ~7.8 1857 Fort
Tejon earthquake. The tremors occur between 20 and 40 km depth, below the
seismogenic zone (i.e., the upper ~15 km of Earth's crust where earthquakes
occur), and their activity rates may correlate with variations in local
earthquake activityref.
People with mental disorders, people with moderate physical disabilities,
and people who had been hospitalized just prior to the earthquake are the
most vulnerable. The degree of vulnerability increased with decreasing
monthly wage (measured in New Taiwanese dollars (NT$), across brackets
such as NT$20,000-NT$39,999). The significant associations of both prequake health status
and socioeconomic status with earthquake death suggest that earthquake
death did not occur randomlyref.
The strongest Earth tides can trigger shallow thrust fault earthquakesref.
Frequency of earthquakes worldwide :
descriptor | magnitude | annual average
Great | >= 8 | 1 (based on observations since 1900)
Major | 7–7.9 | 17 (based on observations since 1990)
Strong | 6–6.9 | 134 (based on observations since 1990)
Moderate | 5–5.9 | 1,319 (based on observations since 1990)
Light | 4–4.9 | c. 13,000
Minor | 3–3.9 | c. 130,000
Very minor | 2–2.9 | c. 1,300,000
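The annual counts in the table fall off roughly tenfold for each one-unit rise in magnitude, the classic Gutenberg-Richter pattern log10(N) = a - b*M with b close to 1; a quick check of that slope against the table's own figures:

# Estimate the Gutenberg-Richter b-value from the annual counts tabulated above.
import math

annual_counts = {8: 1, 7: 17, 6: 134, 5: 1319, 4: 13000, 3: 130000, 2: 1300000}

mags = sorted(annual_counts, reverse=True)
slopes = [math.log10(annual_counts[m_lo] / annual_counts[m_hi])
          for m_hi, m_lo in zip(mags, mags[1:])]        # per-unit-magnitude log10 steps
print(f"average b-value ~ {sum(slopes) / len(slopes):.2f}")   # comes out close to 1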
History : the largest (in terms of magnitude) earthquakes recorded since 1900 :
rank | location | date | moment magnitude
1. | Chile | May 22, 1960 | 9.5
2. | on the seafloor near Aceh in northern Indonesia | Dec 26, 2004, 0:59 GMT | 9.3
3. | Prince William Sound, Alaska | March 28, 1964, 03:36:14 UT (March 27, 5:36 P.M. local time) | 9.2
4. | Andreanof Islands, Aleutian Islands | March 9, 1957 | 9.1
5. | Kamchatka | Nov. 4, 1952 | 9.0
6. | off the coast of Ecuador | Jan. 31, 1906 | 8.8
7. | Rat Islands, Aleutian Islands | Feb. 4, 1965 | 8.7
7. | around 200 km west of northern Sumatra, Indonesia | Mar 28, 2005 | 8.7
9. | India-China border | Aug. 15, 1950 | 8.6
10. | Kamchatka | Feb. 3, 1923 | 8.5
11. | Banda Sea, Indonesia | Feb. 1, 1938 | 8.5
12. | Kuril Islands | Oct. 13, 1963 | 8.5
The largest earthquakes in the USA :
rank | magnitude | date | location
1. | 9.2 | March 28, 1964, 03:36:14 UT (March 27, 5:36 P.M. local time) | Prince William Sound, Alaska
2. | 8.8 | March 9, 1957 | Andreanof Islands, Alaska
3. | 8.7 | Feb. 4, 1965 | Rat Islands, Alaska
4. | 8.3 | Nov. 10, 1938 | East of Shumagin Islands, Alaska
4. | 8.3 | July 10, 1958 | Lituya Bay, Alaska
6. | 8.2 | Sept. 10, 1899 | Yakutat Bay, Alaska
6. | 8.2 | Sept. 4, 1899 | Near Cape Yakataga, Alaska
8. | 8.0 | May 7, 1986 | Andreanof Islands, Alaska
9. | 7.9 | Feb. 7, 1812 | New Madrid, Missouri
9. | 7.9 | Jan. 9, 1857 | Fort Tejon, California
9. | 7.9 | April 3, 1868 | Ka'u District, Island of Hawaii
9. | 7.9 | Oct. 9, 1900 | Kodiak Island, Alaska
9. | 7.9 | Nov. 30, 1987 | Gulf of Alaska
9. | 7.9 | Nov. 3, 2002 | Central Alaska
15. | 7.8 | March 26, 1872 | Owens Valley, California
15. | 7.8 | Feb. 24, 1892 | Imperial Valley, California
17. | 7.7 | Dec. 16, 1811 | New Madrid, Missouri area
17. | 7.7 | April 18, 1906 | San Francisco, California
17. | 7.7 | Oct. 3, 1915 | Pleasant Valley, Nevada
20. | 7.6 | Jan. 23, 1812 | New Madrid, Missouri
20. | 7.6 | June 28, 1992 | Landers, California
22. | 7.5 | July 21, 1952 | Kern County, California
23. | 7.3 | Nov. 4, 1927 | West of Lompoc, California
23. | 7.3 | Dec. 16, 1954 | Dixie Valley, Nevada
23. | 7.3 | Aug. 18, 1959 | Hebgen Lake, Montana
23. | 7.3 | Oct. 28, 1983 | Borah Peak, Idaho
Deadliest earthquakes on record (> 50,000 deaths) :
date | location | deaths | magnitude
Jan. 23, 1556 | Shansi, China | 830,000 | n.a.
July 27, 1976 | Tangshan, China | 255,000 (official; estimated death toll as high as 655,000) | 8.0
Aug. 9, 1138 | Aleppo, Syria | 230,000 | n.a.
May 22, 1927 | near Xining, China | 200,000 | 8.3
Dec. 22, 856 | Damghan, Iran | 200,000 | n.a.
Dec. 16, 1920 | Gansu, China | 200,000 | 8.6
March 23, 893 | Ardabil, Iran | 150,000 | n.a.
Sept. 1, 1923 | Kwanto, Japan | 143,000 | 8.3
Oct. 5, 1948 | Ashgabat, Turkmenistan, USSR | 110,000 | 7.3
Dec. 28, 1908 | Messina, Italy | 70,000–100,000 (estimated) | 7.5
Sept. 1290 | Chihli, China | 100,000 | n.a.
Nov. 1667 | Shemakha, Caucasia | 80,000 | n.a.
Nov. 18, 1727 | Tabriz, Iran | 77,000 | n.a.
Nov. 1, 1755 | Lisbon, Portugal | 70,000 | 8.7
Dec. 25, 1932 | Gansu, China | 70,000 | 7.6
May 31, 1970 | Peru | 66,000 | 7.8
1268 (no date available) | Cilicia, Asia Minor | 60,000 | n.a.
Jan. 11, 1693 | Sicily, Italy | 60,000 | n.a.
May 30, 1935 | Quetta, Pakistan | 30,000–60,000 | 7.5
Feb. 4, 1783 | Calabria, Italy | 50,000 | n.a.
June 20, 1990 | Iran | 50,000 | 7.7
28.12.1908, 5:20 CET, 30 seconds duration, Messina and Reggio Calabria,
Italy (associated with tsunami)
Bam, Iran : analysis of surface deformation indicates that most of the
seismic moment release along the 20-km-long strike-slip rupture occurred
at a shallow depth of 4-5 km, yet the rupture did not break the surface.
The Bam event may therefore represent an end-member case of the 'shallow
slip deficit' model, which postulates that coseismic slip in the uppermost
crust is systematically less than that at seismogenic depths (4-10 km).
The InSAR-derived surface displacement data from the Bam and other large
shallow earthquakes suggest that the uppermost section of the seismogenic
crust around young and developing faults may undergo a distributed failure
in the interseismic period, thereby accumulating little elastic strainref
26.12.2004, 0059 GMT, a 9.3 magnitude earthquake occurred on the seafloor
near Aceh in northern Indonesia. The quake, the world's biggest for > 40
years and the 4th largest since 1900, has literally redrawn the map, moving
some islands by several metres. A buildup of pressure caused the Indian
Ocean floor to lurch some 15 m towards Indonesia, burrowing under its tectonic
plate and triggering the ferocious swells that smashed the surrounding
shores. The movement is likely to have altered the geography of neighbouring
islands such as Simeulue, the Nicobar and Andaman islands, and Sumatra
itself. The quake follows almost 2 centuries of tension during which the
Indian plate has been pressing against the Burma microplate, which carries
Sumatra and the other islands. The 2 move against one another at an average
rate of around 6 cm a year - about the speed at which fingernails grow.
But this movement does not occur smoothly. There has not been a significant
quake along this fault since 1833, hence the huge force of this one. The
Indian plate's jarring slide released the Burma microplate from its tension,
causing it to spring violently upwards. This bulge could have altered the
elevations, as well as the positions, of the nearby islands. The Andaman
and Nicobar islands may have been raised by the quake; meanwhile, slightly
further from the fault line itself, water levels indicate that the Indonesian
city of Banda Aceh has been left lower than before. Californian seismologists
will use the Global Positioning System to work out how the region's maps
will need to be redrawn. Quakes of this type, called subduction earthquakes,
are commonplace throughout the world but rarely strike with such force.
The length of the rupture was 1,200 kilometres. The earthquake actually
consisted of 3 events within seconds of each other. The initial slip, which
occurred to the west of Sumatra's northern tip, triggered 2 further slips
to the north. The total force released was enough to jolt the entire planet,
and has been estimated as equivalent to almost 200 million tonnes of TNT.
The seafloor bulge unleashed a wave that surged throughout the Indian Ocean,
hitting Indonesia and Thailand within an hour, Sri Lanka and India within
4 hours, and ultimately causing deaths as far away as East Africa. The
swell would have been imperceptible in the open ocean, but as it entered
shallower coastal waters it reared up to several metres in height, destroying
seafront villages and resorts, and washing through farms up to 5 km inland.
Coastlines nearest to the quake's epicentre would have suffered the heaviest
blow. As the wave spreads its energy is fanned out over a wider area, and
as it moves it is gradually tamed by friction. The aftershocks can be expected
to last for several weeks, although experts are confident that there will
not be another tsunami. But the disaster has highlighted the sheer unpredictability
of earthquakes, despite huge research efforts in Japan and the USA. Still
nobody has predicted an earthquake successfully in my opinion. The first
pictures of the earthquake zone that triggered the Asian tsunami show how
the undersea landscape has been transformed by the clashing of tectonic
plates. It is the first time that the sea floor has been observed so soon
after such a serious earthquake. A British Geological Survey team, along
with marine geologists at the Southampton
Oceanography Centre, UK, joined the Royal Navy's survey ship HMS Scott
when it docked in Singapore in late January 2005. They used the craft's
sonar imaging system to map the deformed sea bed about 150 km west of the
Sumatran coast. Coloured contour maps show that the earthquake threw up
large ridges up to 1.5 km high, some of which have collapsed to create
landslides several kilometres wideref.
From this they hope to understand better the geological processes that
produced the earthquake. Their research may ultimately help to assess the
risks from future earthquakes. The data will also be used to update navigational
charts of the area. The earthquake started about 40 km below the sea floor
on 26 December 2004, caused by a sudden movement between the India and
Burma tectonic plates. Where the plates meet, the India plate is steadily
being forced downwards. The earthquake was about twice as energetic as
geologists initially thought : the earthquake's magnitude was 9.3, not
9.0, on the moment magnitude scaleref1,
ref2.
As the scale is logarithmic, an increase of 0.3 in magnitude translates to a
roughly 2.8-fold increase in the energy released.
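Using the standard definition of moment magnitude, the seismic moment (and, to a rough approximation, the radiated energy) scales as 10^(1.5 × M), so the revision works out as:
moment ratio = 10^(1.5 × (9.3 - 9.0)) = 10^0.45 ≈ 2.8
which is consistent with the "about twice as energetic" and "about 3 times larger" figures quoted elsewhere in this entry.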
As fears spread about impending aftershocks and further monstrous waves,
efforts to tackle the devastation and to help survivors are being hampered
in India and Sri Lanka as people flee the shore again. Experts are keen
to reassure people that such tremors are unlikely to worsen the situation
seriously. Strong aftershocks are likely to continue for weeks, months
or even a year, but they're not large enough to generate additional tsunamis.
The largest aftershock measured so far occurred about 3 hours after the
initial quake, had a magnitude of 7.1 and was classified as a major quake
in its own right. And as of 30 December, around 85 aftershocks had been
recorded and categorized as being either strong or moderate. But although
violent, quakes such as these are unlikely to cause more large tsunamis,
which were responsible for the vast majority of the destruction after the initial
9.0-magnitude quake. Smaller aftershocks, of magnitude 8 or less, lack
the power to trigger killer waves and are likely just to cause land tremors
on Indonesian islands close to their epicentre. Aftershocks are occurring
because the stress built up between the neighbouring India and Burma tectonic
plates, off the Sumatran coast, was not all released in the original earthquake.
There are subsequent smaller adjustments of the plates as they settle into
a new stable position. Before the current disaster, the last quake to top
9 in magnitude was in Alaska in 1964. Aftershocks there continued for >
1 year.
It was so powerful that it has accelerated the Earth's rotation, shortening
the length of the day by some 3 microseconds (millionths of a second).
The change was caused by a shift of mass towards the planet's centre, as
the Indian Ocean's heavy tectonic plate lurched underneath Indonesia's
one. This caused the globe to rotate faster, in the same way that a spinning
figure-skater accelerates by tucking in her arms. The blast literally rocked
the world on its axis : the Earth's axis is estimated to have shifted by about
2.5 cm in the wake of the jolt. Though tiny, the change will nonetheless be relevant to physicists
charged with keeping the world's official time, which since 1967 has been
based on a battery of around 250 highly accurate atomic clocks in 60 labs
throughout the world. These labs report to the International Bureau of
Weights and Measures near Paris, which sets the Coordinated Universal Time
(UTC). UTC has to be kept as close as possible to the time period of Earth's
rotation, which can fluctuate in response to unpredictable events such
as large earthquakes. Overall, the Earth's rotation tends to slow down
as the Moon's gravity pulls on its seas and continents, causing bulges
that give the opposite effect of the compacting quake. Because of this
trend, physicists have slipped in 22 separate 'leap
seconds' since 1972, each one delaying UTC until the Earth's rotation
catches up. They are inserted either as an extra final second on 31 December,
or at the very end of June. The most recent was at the end of 1998. The
change caused by the Indian Ocean quake, at just a few millionths of a
second, is too slight to need correcting as it runs counter to Earth's
general trend of slowing down. It has never been necessary to subtract
a second from UTC to make up for a speedily spinning Earth. This is partly
because, when the atomic clock system was adopted in 1967, physicists chose
1900 as the year with the best average data on how fast the Earth spins.
This meant that, because of Earth's natural slowing, the atomic clock was
already running fast when it was set up. On average, just one leap second
is needed every year. That makes the fact that we haven't had one since
1998 something of a surprise. Since the last leap second the Earth's rotation
seems to have been changing at a very slow rate.
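As a rough sketch of the bookkeeping involved: UTC is kept within 0.9 s of UT1 (the time defined by the Earth's actual rotation), and a leap second is scheduled for the end of June or December when the accumulated difference approaches that limit. The 0.9 s tolerance is the real rule; the scheduling margin below is an invented simplification:

# Decide (crudely) whether a leap second should be scheduled.
def leap_second_needed(ut1_minus_utc_seconds, limit=0.9, margin=0.3):
    # schedule one before |UT1 - UTC| can drift past the 0.9 s limit
    return abs(ut1_minus_utc_seconds) >= limit - margin

print(leap_second_needed(-0.40))   # False: still comfortably inside the limit
print(leap_second_needed(-0.65))   # True: time to schedule a leap second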
As the human drama of the Aceh–Andaman earthquake and tsunami unfolded
in the last days of 2004, laymen and scientists began scrambling to understand
what had caused these gigantic disturbances of Earth's crust and seas.
One of the earliest clues was the delineation, within just hours of the
mainshock, of a band of large aftershocks arcing 1,300 km from northern
Sumatra almost as far as Myanmar (Burma)ref.
This seemed to signal that about 25% of the Sunda megathrust, the great
tectonic boundary along which the Australian and Indian plates begin their
descent beneath Southeast Asia, had ruptured. In less than a day, however,
analyses of seismic 'body' wavesref
were indicating that the length of the rupture was only about 400 km. This
early controversy about the length of the megathrust rupture created a
gnawing ambiguity about future dangers to populations around the Bay of
Bengal. If only 400 km of the great fault had ruptured, large unfailed
sections might be poised to deliver another tsunami. If, on the other hand,
most of the submarine fault had broken, then the chances of such a disaster
were much smaller. Early analyses grossly underestimated the rupture length;
a new analysis of high-frequency (2–4 Hz) seismic signals
clearly shows northward propagation of the rupture for a distance
of about 1,200 km. Early estimates of the magnituderef1,
ref2
were far too low. Using extremely long-period seismic 'normal mode' waves,
researchers calculate that the earthquake's magnitude was 9.3, about 3 times larger
than initial estimates of 9.0 (given the logarithmic nature of earthquake-magnitude
scales)ref.
This much larger size is consistent with slip averaging about 13 m along
a 1,200-km rupture, assuming that much of the slippage occurred too slowly
to be seen in shorter-wavelength seismograms. Thus, they claim the long-versus-short
rupture controversy is solved and that there is no need to worry about
another giant earthquake and tsunami originating along this long section
of the fault. This is destined to be one of the most important earthquakes
of the century. Over the next year or 2, figuring out what happened will
be a showcase both of what modern observations and analysis can do and
of the multidisciplinary nature of modern earthquake science (Committee on the
Science of Earthquakes, Living on an Active Earth: Perspectives on Earthquake
Science, Natl Academies Press, Washington DC, 2002).
In the months ahead, much more will be learned about this giant event.
Satellite imagery and field measurements of dramatically uplifted and submerged
coastlinesref1,
ref2
and the movement of Global Positioning System geodetic stationsref, as well
as tsunami records, will all add constraints on the areal extent of the
rupture and the magnitude and sequencing of slip: these, in turn, will
be essential to understanding the tsunami. If all of the megathrust between
northernmost Sumatra and Myanmar has produced its once-a-millennium giant
earthquake, why should we have any immediate concern about another giant
quake or tsunami in the Bay of Bengal? McCloskey et al.ref
offered one answer by estimating the stresses imposed by the giant 2004
rupture on the two big faults farther south. It seems that the section
of the Sunda megathrust immediately to the south, off the coast of northern
Sumatra, is now closer to failure. Likewise for the nearest portion of
the great San Andreas-like Sumatran fault, which runs through Banda Aceh
and down the backbone of the Sumatran mainland. The critical question is
how close to failure the 2004 rupture has moved these two big faults. This
will be moot until more is known about the history of their past ruptures.
It will be necessary to learn how the Sumatran parts of the megathrust
are segmented structurally, and how they have behaved in the past. Immediately
south of the 2004 rupture, for example, it appears from the historical
record that there were very large earthquakes in 1861 and 1907ref.
Where on the megathrust were these ruptures, and how often and how regularly
do they recur? Palaeoseismic data are available only for a 700-km-long
section farther away, from about 1° to 5° south of the Equator.
Giant earthquakes and tsunamis occur there about every 200–230 years, sometimes
as a single giant earthquake, sometimes as two in relatively quick succession,
as happened in 1797 and 1833ref.
Big faults on the northern flank of the 2004 rupture also pose a hazard;
the northern extension of the 2004 rupture continues for another 1,000
km, up the west coast of Myanmar, well past Bangladesh to the eastern end
of the Himalayas. Too little is known of its long-term history to provide
a meaningful assessment of its future behaviour. Moreover, long sections
of the enormous thrust fault along which India is diving down beneath the
Himalayas have not failed for centuries and are only one to three fault-lengths
away from the 2004 rupture. It is sobering to realize that big earthquakes
sometimes occur in clusters (for example, seven of the ten giant earthquakes
of the twentieth century occurred between 1950 and 1965, and five of these
occurred around the northern Pacific margin). Because many of the giant
faults in the Aceh–Andaman neighbourhood have been dormant for a very long
time, it is quite plausible that the recent giant earthquake and tsunami
may not be the only disastrous twenty-first-century manifestation of the
Indian plate's unsteady tectonic journey northwardref.
On 17 March 2005, a team of seismologists published a paper in Nature
that showed that the huge earthquake that caused the Indian Ocean tsunami
has increased the stress on neighbouring fault lines. The increased strain
on the neighbouring Sumatra fault, which runs through the island itself,
makes Sumatra the most likely place for a subsequent earthquake. The December
earthquake shunted almost 250,000 km2 of the Indian plate under
the Burma microplate's western edge : this has increased the pressure on
the Sumatra fault on the plate's eastern edge, which runs close to the
already ravaged city of Banda Aceh. The most immediate threat is probably
an earthquake of magnitude 7-7.5 on the island. This region has produced
such earthquakes in the past, although not for > 1 century, meaning that
the fault may now be close to breaking point. Also under increased strain
is the Sunda trench, to the southeast of the region where December's earthquake
struck. Like its neighbour, it is also an undersea 'subduction zone' with
a history of destruction: in 1833 and 1861, earthquakes on the Sunda fault
triggered fatal tsunamis. The model shows how the stresses have moved through
the region in the wake of December's slip. Pressure on a 50-kilometre stretch
of the Sunda trench has increased by up to 5 bars, and a 300-kilometre
segment of the Sumatra fault is now under an extra 9 bars of strainref.
It is hard to tell whether this will trigger an earthquake at either location.
The problem is that we don't know the failure stress for any of these
structures. And the calculations are based on a crude estimate of the amount
of slip that occurred on 26 December, which does not take account of any
subsequent shifts in the region. But seismologists have used similar information
to 'telegraph' earthquakes before : the earthquake that devastated Izmit,
Turkey, in 1999 was preceded by a 2-bar increase in pressure that seismologists
spotted some 18 months before the quake. A tsunami triggered by the Sunda
trench would mostly dissipate in the Southern Ocean. But Indonesia, particularly
Sumatra, would almost certainly be struck again, and the wave would probably
wash up on southern African shores too. By showing how one earthquake can
trigger another nearby, the researchers have underlined yet again the case
for installing a tsunami warning system in the Indian Ocean. People think
'well you've had your bad luck', but earthquakes cluster in time and in
space. A warning system will be most useful to people on distant shores,
rather than the Sumatran population. Sumatrans will most likely have ample
warning of an impending tsunami given by the severe shaking of the earthquakeref.
11 days after the publication of the paper, a magnitude-8.7
earthquake struck just after 11:00 pm local time on 28 March, around
200 km west of northern Sumatra : it is classified as a 'great earthquake'
and occurred south of the earthquake that hit the region on 26 December
2004, but researchers are scratching their heads over why this one did
not trigger a tsunami. There are reports of significant damage and around
1,000 deaths on the nearby Indonesian island of Nias, although the final
figures are not yet clear. Initial reports from the region suggest however
that this earthquake did not generate vast waves. By contrast, last year's
earthquake, now thought to be magnitude 9.3, triggered tsunamis that spread
as far as Africa and killed an estimated 300,000 people. The earthquake
released only 25% of the energy of its predecessor (the scale on which
magnitude is measured is logarithmic); nonetheless, it is one of the eight
most powerful earthquakes measured since 1900. Lesser earthquakes in 1861
and 1833 in the same region did trigger tsunamis. One possible explanation
for the absence of tsunamis could be that the latest tremor occurred much
deeper in the fault line that slices through the Indian Ocean : this might
have avoided shifts in the sea bed that can displace water and prompt a
tsunami. Seismologists will be studying the position and depth of the earthquake
to try and answer these questions. The tremor is not classified as an aftershock
of the December event, because it did not occur in exactly the same area,
but it was probably triggered by last year's earthquake. Researchers had
predicted that the earlier judder, which increased stress on neighbouring
fault lines, would prompt more earthquakes. Monday's event seems to have
occurred in the Sunda trench, one of the places pegged as likely candidates
for a quake as a result of stress changes caused by December's monster.
Even if a tsunami warning system in the Indian Ocean is not yet up and
running, some countries did issue alerts and evacuated coastal areas
more promptly because of lessons learned from the earlier catastrophe.
It doesn't look as if the size of the stress has increased, but the extent
has : the first earthquake affected some 300 km of the Sumatra fault; the
latest event could have elongated the stressed section to some 600 km.
The extent is important as it increases the chance of there being a weak
patch.
Data collected at 60 Global Positioning System (GPS) sites in southeast
Asia show the crustal deformation caused by the 26 December 2004 Sumatra-Andaman
earthquake at an unprecedented large scale. Small but significant co-seismic
jumps are clearly detected > 3,000 km from the earthquake epicentre. The
nearest sites, still > 400 km away, show displacements > 10 cm. The rupture
plane for this earthquake must have been > 1,000 km long, and non-homogeneous
slip is required to fit the large displacement gradients revealed by the
GPS measurements. This kinematic analysis of the GPS recordings indicates
that the centroid of released deformation is located > 200 km north of
the seismological epicentre. It also provides evidence that the rupture
propagated northward sufficiently fast for stations in northern Thailand
to have reached their final positions < 10 min after the earthquake,
hence ruling out the hypothesis of a silent slow aseismic ruptureref.
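In the simplest terms, a co-seismic jump like those just described is estimated by differencing a station's mean position before and after the event; the daily east-component values in this sketch are invented, and real GPS processing also removes steady interseismic plate motion and post-seismic transients:

# Crude estimate of a co-seismic GPS offset from daily positions (mm, east component).
east_mm = [1.8, 2.1, 1.9, 2.3, 2.0,        # days before the earthquake
           14.6, 14.9, 15.1, 14.7, 15.0]   # days after the earthquake
n_before = 5

before, after = east_mm[:n_before], east_mm[n_before:]
offset = sum(after) / len(after) - sum(before) / len(before)
print(f"estimated co-seismic jump: {offset:.1f} mm east")   # ~12.8 mm in this toy example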
Web resources :
Earthquakes in rich countries such as the USA or Japan do not tend to cause
huge numbers of casualties, because buildings are designed to withstand
the shaking. But in poorer areas such as parts of Iran, Turkey or the slums
of Mexico, cheaply built housing can collapse almost instantly during an
earthquake, killing thousands and stripping families of their only asset.
It is not the masonry blocks that fail under the onslaught, but rather
the mortar that holds them together : cement manufacturers should issue pamphlets
that tell builders exactly how to mix the perfect mortar.
The fault line that runs beneath Tokyo is kilometres closer to the
surface than seismologists realized. That could be worrying news to residents
waiting for the next 'big one' to hit the city, as quakes occurring at
a shallow depth tend to cause more shaking and damage to buildings than
deeper ones. Tokyo is among the most densely populated earthquake zones
in the world, with some 33 million people living and working in the conurbation.
In 1923, around 105,000 people in the city were killed by a huge magnitude
7.9 quake. Occasional quakes continue to plague the city, thanks to the
fact that it sits on a fault line where 2 rocky plates meet: one holds
Honshu Island, while another - the Philippine Sea plate - slides underneath
it. As a result of the seismic activity, the area is perhaps the best-monitored
earthquake region in the world, with an intricate network of global positioning
sensors that detect tiny seismic shifts. But before now, seismologists
had not accurately determined the depth of Tokyo's fault line. This could
be because jolts occurring at the interface where the 2 plates grind together
are quite rare. Many of the big quakes in the region instead have epicentres
embedded more deeply in the lower plate, which led researchers to surmise
that the fault line itself was deeper than it is. A picture of the Tokyo
fault line was built up by sending vibrations into the ground from trucks,
air guns and explosives. They examined the pattern of the waves that bounced
back to determine the composition of the rocks beneath. The fault line
itself lies between 4 and 26 km below the city. Previous estimates had
put the depth at between 20 and 40 km. This could mean that an earthquake
occurring at the fault line could be more damaging than previously thought.
It is expected that shallower geometry produces more intense shaking. However,
ground motion is also controlled by factors such as soil type and the precise
direction in which the rock slips. Other experts also caution that the
study will have to be validated. It could be that Sato saw waves bouncing
off an offshoot of the fault running closer to the surface, for example,
rather than the main fault itself. But knowing more about the fault will
certainly help in predicting the effects of future quakes. Before this
survey, in spite of the sophisticated monitoring system, the basic structure
was not well determined. Now we know its precise geometry, we will be able
to better analyse its behaviour. The study may even help to predict the
timing of the next large jolt. By calculating how far the fault slipped
in the 1923 event, and comparing this to the Philippine plate's average
speed of advance and rate of stress build-up, researchers could estimate
the average frequency of quakesref.
How to map an earthquake : when an earthquake
has happened, usually the first question people ask is, 'How big was it?'
A seismologist will answer this question by quoting a value called the
magnitude – this is a number representing the amount of energy released
by the earthquake. It's this number that is popularly referred to as the
'Richter scale'. An increase of one unit represents a thirty-fold increase
in energy, so an earthquake like the one that ruined Kobe in Japan in 1995
(magnitude nearly 7) was about 900 times as powerful as the earthquake
felt in England and Wales in 1990 (magnitude about 5). However, there is
another question that can be asked: 'How strong was it?' The strength with
which earthquake shaking can be felt is very different from the magnitude,
as it varies with distance from the epicentre of the earthquake. An earthquake
may cause destruction near its epicentre, but as you go further away the
damage becomes less and less. After a certain distance there will be no
damage at all, but people will still feel the vibration, and even further
away the shaking will be so weak people will not even feel it. Seismologists
have always needed a way to measure how strong the earthquake was at different
places, and this measure is called intensity. An interesting thing about
intensity is that it's much easier to measure than magnitude. To calculate
the magnitude of an earthquake, seismologists take measurements from a
seismogram, which is a recording made by a measuring instrument, a seismometer.
Although it is possible for amateurs with a knowledge of electronics to
build their own seismometers as a hobby, professional instruments are expensive
to buy and run. But measuring intensity is something that anyone can do.
Intensity is measured by an intensity scale. The one used in the UK is
called the European macroseismic scale. Macroseismic means 'to do with
the effects of earthquakes', as opposed to microseismic, which means 'to
do with instrumental records of earthquakes (seismograms)'. An intensity
scale is simply a classification of the sort of effects that might be observed
during an earthquake, with numbers increasing with severity. To find out
the intensity value in a town during an earthquake, you just collect descriptions
of what happened, compare them to the descriptions in the intensity scale,
and choose the number that gives you the best match. The principle is exactly
the same as that used for the 'Beaufort scale' for measuring wind speeds.
After an earthquake has happened, a seismologist will attempt to collect
people's experiences and compare them to the intensity scale. Much of this
information is collected using questionnaires, which can be used for doorstep
interviews, or left for people to collect from libraries or post offices,
or even published in a newspaper. If the earthquake has caused damage,
a trip has to be made to the area where the damage occurred to record how
bad it was. All the information then has to be grouped by place once it
has been gathered. One person's experience can vary greatly from another
person's experience, even within the same house, so it is necessary to
collect as many accounts as possible to see what the general experience
was in that area. Typically one would give a single intensity value for
all the accounts from a single village, but for a town or city, it might
be necessary to give different values to the different suburbs. These could
be quite different, because as well as the strength of shaking being affected
by the distance from the earthquake, it's also affected by the local geology.
If part of a town is in a river valley and part is on a rocky hill, the
houses on the soft ground by the river will usually be shaken more than
those on the hill.
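The grouping step just described can be sketched in a few lines of Python. This is only an informal illustration, not an official procedure: it assumes each questionnaire account has already been matched to a level of the intensity scale, and it simply takes the most common value reported in each place (the place names and values below are invented).

    from collections import Counter

    def intensity_by_place(accounts):
        # accounts: (place, intensity) pairs, one per questionnaire reply that has
        # already been compared against the macroseismic scale
        by_place = {}
        for place, intensity in accounts:
            by_place.setdefault(place, []).append(intensity)
        # take the most common value as the representative intensity for each place
        return {place: Counter(values).most_common(1)[0][0]
                for place, values in by_place.items()}

    # hypothetical questionnaire returns for three villages
    accounts = [("Ashby", 4), ("Ashby", 4), ("Ashby", 3),
                ("Milford", 3), ("Milford", 3),
                ("Norton", 5), ("Norton", 4), ("Norton", 5)]
    print(intensity_by_place(accounts))   # {'Ashby': 4, 'Milford': 3, 'Norton': 5}

In practice a seismologist would also weigh the quality of each account and, in a large town, report separate values for different suburbs, as noted above.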
The severity of an earthquake can be expressed
in terms of both intensity and magnitude. However, the two terms are quite
different, and they are often confused.
intensity is based on the observed effects of ground shaking on
people, buildings, and natural features. It varies from place to place
within the disturbed region depending on the location of the observer with
respect to the earthquake epicenter. The effect of an earthquake on the
Earth's surface is called the intensity. The intensity scale consists of
a series of certain key responses such as people awakening, movement of
furniture, damage to chimneys, and finally--total destruction. Although
numerous intensity scales have been developed over the last several hundred
years to evaluate the effects of earthquakes, the one currently used in
the United States is the Modified Mercalli (MM) intensity
scale. It was developed in 1931 by the American seismologists Harry
Wood and Frank Neumann. This scale, composed of 12 increasing levels of
intensity that range from imperceptible shaking to catastrophic destruction,
is designated by Roman numerals. It does not have a mathematical basis;
instead it is an arbitrary ranking based on observed effects. The Modified
Mercalli Intensity value assigned to a specific site after an earthquake
is a more meaningful measure of severity to the nonscientist than the
magnitude because intensity refers to the effects actually experienced
at that place. After the occurrence of widely-felt earthquakes, the Geological
Survey mails questionnaires to postmasters in the disturbed area requesting
the information so that intensity values can be assigned. The results of
this postal canvass and information furnished by other sources are used
to assign an intensity value, and to compile isoseismal maps that show
the extent of various levels of intensity within the felt area. The maximum
observed intensity generally occurs near the epicenter. The lower numbers
of the intensity scale generally deal with the manner in which the earthquake
is felt by people. The higher numbers of the scale are based on observed
structural damage. Structural engineers usually contribute information
for assigning intensity values of VIII or above. The following is an abbreviated
description of the 12 levels of Modified Mercalli intensity.
1 : not felt except by a very few under especially favorable conditions.
2 : felt only by a few persons at rest, especially on upper floors of buildings.
Delicately suspended objects may swing.
3 : felt quite noticeably by persons indoors, especially on upper floors
of buildings. Many people do not recognize it as an earthquake. Standing
motor cars may rock slightly. Vibration similar to the passing of a truck.
Duration estimated.
4 : felt indoors by many, outdoors by few during the day. At night, some
awakened. Dishes, windows, doors disturbed; walls make cracking sound.
Sensation like heavy truck striking building. Standing motor cars rocked
noticeably.
5 : felt by nearly everyone; many awakened. Some dishes, windows broken.
Unstable objects overturned. Pendulum clocks may stop.
6 : felt by all, many frightened. Some heavy furniture moved; a few instances
of fallen plaster. Damage slight.
7 : damage negligible in buildings of good design and construction; slight
to moderate in well-built ordinary structures; considerable damage in poorly
built or badly designed structures; some chimneys broken.
8 : damage slight in specially designed structures; considerable damage
in ordinary substantial buildings with partial collapse. Damage great in
poorly built structures. Fall of chimneys, factory stacks, columns, monuments,
walls. Heavy furniture overturned.
9 : damage considerable in specially designed structures; well-designed
frame structures thrown out of plumb. Damage great in substantial buildings,
with partial collapse. Buildings shifted off foundations.
10 : some well-built wooden structures destroyed; most masonry and frame
structures destroyed with foundations. Rails bent.
11 : few, if any (masonry) structures remain standing. Bridges destroyed.
Rails bent greatly.
12 : damage total. Lines of sight and level are distorted. Objects thrown
into the air.
Another measure of the relative strength of an earthquake is the size of
the area over which the shaking is noticed. This measure has been particularly
useful in estimating the relative severity of historic shocks that were
not recorded by seismographs or did not occur in populated areas. The extent
of the associated felt areas indicates that some comparatively large earthquakes
have occurred in the past in places not considered by the general public
to be regions of major earthquake activity. For example, the three shocks
in 1811 and 1812 near New Madrid, Mo., were each felt over the entire eastern
United States. Because there were so few people in the area west of New
Madrid, it is not known how far it was felt in that direction. The 1886
Charleston, S.C., earthquake was also felt over a region of about 2 million
square miles, which includes most of the eastern United States.
magnitude is related to the amount of seismic energy released at the hypocenter
of the earthquake. It is based on the amplitude of the earthquake waves
recorded on instruments which have a common calibration. The magnitude
of an earthquake is thus represented by a single, instrumentally determined
value. Seismic waves are the vibrations from earthquakes that travel through
the Earth; they are recorded on instruments called seismographs. Seismographs
record a zig-zag trace that shows the varying amplitude of ground oscillations
beneath the instrument. Sensitive seismographs, which greatly magnify these
ground motions, can detect strong earthquakes from sources anywhere in
the world. The time, location, and magnitude of an earthquake can be determined
from the data recorded by seismograph stations. The Richter
magnitude scale was developed in 1935 by Charles F. Richter of the
California Institute of Technology in Pasadena as a mathematical device
to compare the size of earthquakes. The magnitude of an earthquake is determined
from the logarithm of the amplitude of waves recorded by seismographs.
Adjustments are included in the magnitude formula to compensate for the
variation in the distance between the various seismographs and the epicenter
of the earthquakes. On the Richter Scale, magnitude is expressed in whole
numbers and decimal fractions. For example, a magnitude of 5.3 might be
computed for a moderate earthquake, and a strong earthquake might be rated
as magnitude 6.3. Because of the logarithmic basis of the scale, each whole
number increase in magnitude represents a 10-fold increase in measured
amplitude; as an estimate of energy, each whole number step in the magnitude
scale corresponds to the release of about 31 times more energy than the
amount associated with the preceding whole number value. At first, the
Richter Scale could be applied only to the records from instruments of
identical manufacture. Now, instruments are carefully calibrated with respect
to each other. Thus, magnitude can be computed from the record of any calibrated
seismograph. Earthquakes with magnitude of < 2.0 are usually called
microearthquakes;
they are not commonly felt by people and are generally recorded only on
local seismographs. Events with magnitudes > 4.5 --there are several thousand
such shocks annually--are strong enough to be recorded by sensitive seismographs
all over the world. Great earthquakes, such as the 1964 Good Friday earthquake
in Alaska, have magnitudes > 8.0. On the average, one earthquake of such
size occurs somewhere in the world each year. Although the Richter Scale
has no upper limit, it saturates for the very largest shocks, whose measured
magnitudes have topped out in the 8.8 to 8.9 range. Recently, another scale called the moment
magnitude scale has been devised for more precise study of great
earthquakes. This roughly agrees with numbers from the Richter scale, but
it better estimates the total energy released by an earthquake, and is
better for discriminating between the sizes of very powerful quakes. The
Richter Scale is not used to express damage. An earthquake in a densely
populated area which results in many deaths and considerable damage may
have the same magnitude as a shock in a remote area that does nothing more
than frighten the wildlife. Large-magnitude earthquakes that occur beneath
the oceans may not even be felt by humans.
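The two logarithmic rules of thumb quoted above - a 10-fold increase in recorded amplitude and roughly a 31-fold increase in released energy per whole magnitude unit - can be written down directly. The short sketch below is only an illustration of that arithmetic, not a magnitude-determination procedure.

    def amplitude_ratio(m1, m2):
        # recorded wave amplitude grows 10-fold per whole magnitude unit
        return 10 ** (m2 - m1)

    def energy_ratio(m1, m2):
        # released energy grows by about 10 ** 1.5 (roughly 31.6-fold) per unit
        return 10 ** (1.5 * (m2 - m1))

    print(amplitude_ratio(5.3, 6.3))   # 10.0
    print(energy_ratio(5.3, 6.3))      # about 31.6
    print(energy_ratio(5.0, 7.0))      # about 1,000 - cf. the Kobe/UK comparison above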
Once all the intensity numbers have been found, they can be plotted on
a map. The Ordnance Survey 1:625 000 scale series makes excellent base
maps for doing this in the UK (one can overlay the intensity values using
tracing paper). The resulting map gives a complete picture of the effects
of the earthquake. For more detailed studies of small areas, 1:50 000
scale maps can be used. Usually the seismologist will then want to draw
some contours for the different intensity values. This is not always simple,
as the numbers are often jumbled about on the map; partly due to the changes
in geology over short distances. Each contour is drawn in such a way as
to enclose as many places as possible that have the same intensity, while
not including too many that have a lower intensity, and trying to keep
the curves smooth. The resulting lines are called isoseismals (lines of
equal shaking). Usually these lines tend to form ellipses. An example of
an isoseismal map for a recent British earthquake is shown above. Such
maps are useful for two reasons. Firstly, they can be used to work out
what sort of effects can be expected from earthquakes in the future, when
one knows how rapidly the shaking decreases with distance (this is called
attenuation). Secondly, you can actually work out some of the other statistics
about the earthquake from the isoseismals. For example, measure the area
enclosed by the isoseismal for intensity 3 (in km2) for a British
earthquake. Take the logarithm of this number. The result is equal to the
magnitude (more or less)! And the position of the epicentre can be found
by taking the centre of the innermost isoseismal. Using this, it's possible
to find the magnitude and epicentre of earthquakes in historical times
that happened way before the invention of recording instruments (which
was a little before 1900, but it was not until the 1960s that they became
sensitive enough to record small British earthquakes). The procedure is
much the same; collect as much data as you can about the earthquake (old
local newspaper reports are a good source of information), then sort the
data by place, work out the intensities, make an isoseismal map and take
your measurements. This was the method used for constructing the historical
part of the national catalogue of earthquakes in the UK. A map of this
catalogue is shown here; you can see that some parts of the UK are much
more active than other parts. This sort of study makes an ideal school
project, should there happen to be an earthquake in your district. If the
earthquake is small, you may be able to collect data for the whole of the
area affected by it; even if this is not possible because the felt effects
cover too wide an area, it is still interesting to collect data for the
area around the school and see how much variation in effects occurred. A
special form to collect this data is available. Can you see a decrease
of strength of shaking from one side of the area to the other? Can you
see variations that are a result of local geology? Any data collected by
schools can also be passed on to the national investigation of earthquake
effects, which in Britain is organised by the British Geological Survey.
Alternatively, if you do not want to wait for the next British earthquake
– the ones large enough to be felt over a reasonably wide area happen perhaps
once a year – it's possible to make a study of some earthquake that happened
in the past that affected your area; you may find enough historical data
in your local library, especially if it has files of the local newspapers.
An isoseismal map is like a snapshot of a whole earthquake, summarising
in one picture everything that happened over a wide area. Such maps are
very helpful to seismologists, who use them frequently; yet their construction
requires no special equipment and can be done by anyone with an interest.
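The rule of thumb mentioned above for British earthquakes - that the magnitude roughly equals the logarithm of the area, in km2, enclosed by the intensity-3 isoseismal - is easy to apply once the map has been drawn. A minimal sketch, with an invented area value:

    import math

    def magnitude_from_felt_area(area_km2):
        # rough magnitude estimate from the area (km2) enclosed by the
        # intensity-3 isoseismal, using the log10(area) rule of thumb above
        return math.log10(area_km2)

    # e.g. an intensity-3 isoseismal enclosing about 30,000 km2
    print(round(magnitude_from_felt_area(30000), 1))   # roughly 4.5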
Earthquake experts studying a large tremor that struck California in
2004 think they may have found a good way to forecast the size and location
of future earthquakes. Seismologists had long noted that there is a regular
pattern of earthquakes in the region of Parkfield, California, where they
strike on average every 22 years. And a magnitude-6 earthquake that struck
on 28 September 2004 in part fulfilled their predictions. But in some ways
this earthquake caught researchers by surprise. They had anticipated that
it would follow a measurable foreshock; it did not. They also thought it
would start rupturing at the northern end of the fault, as it had done
in the past; it ruptured from the south. And it was quite late, arriving
38 years after the last one. Danijel Schorlemmer and Stefan Wiemer at the
Swiss
Seismological Service in Zurich argue that these characteristics could
have been anticipated by looking at the history of the region differently.
Using an analysis of the small earthquakes that occurred in the area over
the preceding three decades, the pair retrospectively forecast the tremor's
unusual behaviour. The same technique could help identify other at-risk
regions around the world. The technique is not expected to pinpoint the
timing of earthquakes to within days. But if they get it down to within
maybe 10 years, it begins to be important information. The size of earthquakes
is inversely related to how often they occur, so small earthquakes are
much more common than large ones, but the extent to which they are more
common varies from place to place. This variation has been shown to occur
even along the length of a single fault, and it can be mapped.
By looking at the past distribution of such small events, it is possible
to build up a picture of areas that are likely to be under most stress.
This means they can say which parts are most likely to slip in the future.
When they looked at data from Parkfield, Schorlemmer and Wiemer found that
the southern tip of the rupture zone was under greatest stress, which may
explain why the earthquake started thereref.
The entire area seems to have more of the 'large, infrequent' tremors than
experts had previously suspected. This could be the reason why the September
earthquake was 'late'. This type of analysis should prove useful. Strain
levels in fault lines have also been used by other researchers to forecast the location and
scale of earthquakes, although the methods differ. One such analysis broadly anticipated
the 8.7-magnitude tremor that followed the tsunami earthquake of 26 December
2004. But strain studies do not give the whole picture : how and where
an earthquake occurs also depends on the strength of the rock, and therefore
how likely a fault, or a region of a fault, is to give way.
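The frequency-size relation referred to here - small earthquakes greatly outnumber large ones, to a degree that varies from place to place and even along a single fault - is commonly described by the Gutenberg-Richter law, log10 N = a - bM, and analyses of this kind typically exploit spatial variation in the b-value. As a rough illustration only (not Schorlemmer and Wiemer's actual procedure), the b-value of a small catalogue can be estimated with the standard maximum-likelihood formula:

    import math

    def b_value(magnitudes, completeness_mag):
        # Aki's maximum-likelihood estimator: b = log10(e) / (mean(M) - Mc),
        # using only events at or above the completeness magnitude Mc
        m = [x for x in magnitudes if x >= completeness_mag]
        return math.log10(math.e) / (sum(m) / len(m) - completeness_mag)

    # invented mini-catalogue of small-event magnitudes, complete above M 1.5
    catalogue = [1.6, 1.7, 1.9, 2.0, 2.1, 2.3, 2.6, 3.0, 3.4]
    print(round(b_value(catalogue, 1.5), 2))   # about 0.55 for this toy sample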
It is commonly thought that the longer the time since last earthquake,
the larger the next earthquake's slip will be. But this logical predictor
of earthquake size, unsuccessful for large earthquakes on a strike-slip
fault, fails also with the giant 1960 Chile earthquake of magnitude 9.5.
Although the time since the preceding earthquake spanned 123 years, the
estimated slip in 1960, which occurred on a fault between the Nazca and
South American tectonic plates, equalled 250-350 years' worth of the plate
motion. Thus the average interval between such giant earthquakes on this
fault should span several centuries. Such long intervals were indeed typical
of the last 2 millennia. Buried soils and sand layers were used as records
of tectonic subsidence and tsunami inundation at an estuary midway along
the 1960 rupture. In these records, the 1960 earthquake ended a recurrence
interval that had begun almost four centuries before, with an earthquake
documented by Spanish conquistadors in 1575. 2 later earthquakes, in 1737
and 1837, produced little if any subsidence or tsunami at the estuary and
they therefore probably left the fault partly loaded with accumulated plate
motion that the 1960 earthquake then expendedref.
After an earthquake, numerous smaller shocks are triggered over distances
comparable to the dimensions of the mainshock fault rupture, although they
are rare at larger distances. Dynamic deformations (the stresses and strains
associated with seismic waves) can cause further earthquakes at any distance
if their amplitude exceeds several microstrain, regardless of their frequency
content. These triggering requirements are remarkably similar to those
measured in the laboratory for inducing dynamic elastic nonlinear behaviour,
which suggests that the underlying physics is similarref.
Southern California could be in
line for a serious quake along the infamous San Andreas fault, seismologists
have found. New measurements suggest that the region close to Los Angeles,
the traditional earthquake location in Hollywood disaster movies, could
feel the effects of a real-life tremor within the next few years. The southern
part of the San Andreas seems to be building up a considerable amount of
strain, the work suggests. And because no significant earthquake has ruptured
this portion of the fault for at least the past 250 years, it could be
primed to cause a devastating event. It could be tomorrow; it could be
10 years from now. But it appears unlikely to accumulate another few hundred
years of strain. The San Andreas marks a major geological boundary, where
the North American plate of Earth's crust grinds alongside the massive
Pacific plate. In the San Francisco area, the fault has unleashed several
killer quakes, including the 1906 earthquake that left the nascent gold-rush
city in smouldering ruins. But the southern San Andreas has not ruptured
for centuries. Some scientists had suggested that the San Andreas could
be releasing strain slowly over time, by 'creeping' along its length, or
by transferring some of the strain on to neighbouring faults. But the new
study suggests that neither of these processes is occurring to a significant
extent. Yuri Fialko of the Scripps Institution of Oceanography gathered 8 years' worth of radar data from European Space
Agency satellites that measure in detail how the ground movesref.
He also added 20 years' worth of data from global-positioning measurements
on the ground. Taken together, the measurements suggest that the two plates
either side of the southern San Andreas are slipping past each other at
around 25 mm per year. Without a recent earthquake to alleviate that strain,
the fault line itself, which has remained essentially static for centuries,
has built up between 5.5 and 7 metres of 'slip deficit'. If released all
at once, that could result in a magnitude-8.0 earthquake, he says, roughly
the size of the devastating 1906 quake in San Francisco. Such a powerful
event might threaten even those buildings constructed to earthquake specifications.
Surprisingly, that strain is building up differently on each side of the
San Andreas. The rock on the North American side seems to be moving more
flexibly, whereas the Pacific side acts more rigidly. That asymmetry is
just now being recognized as an important phenomenon. In the past, we've
sort of thought that elastic properties of the rock don't vary much around
faults. This is showing that there are quite large variations. Another
fault that branches off the San Andreas, the San Jacinto, poses a greater
earthquake risk than previously thought. The San Jacinto runs through southern
California west of the San Andreas, near the communities of Riverside,
Hemet and Borrego Springs. Fialko's calculations suggest that strain is
building up along this fault at a rate twice that previously expected.
Residents of southern California rarely forget that they live in earthquake
country. But the new work will help to pin down where the problems are.
The southern San Andreas is fully loaded for the next event.
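The 'slip deficit' quoted above follows directly from the slip rate and the time the fault has stayed locked; the sketch below simply reproduces that arithmetic with the numbers given in the text.

    def slip_deficit_m(slip_rate_mm_per_yr, years_locked):
        # accumulated slip deficit in metres, assuming the fault stays fully
        # locked while the plates keep moving at the long-term rate
        return slip_rate_mm_per_yr * years_locked / 1000.0

    # about 25 mm/yr across the southern San Andreas, locked for at least 250 years
    print(slip_deficit_m(25, 250))   # 6.25 m, within the 5.5-7 m range quoted above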
Web resources :
Earthquakes at the British
Geological Survey (BGS)
Earthquakes news
tidal wave
flooding : a general and temporary condition
of partial or complete inundation of normally dry land areas from the
overflow of inland or tidal waters, or the rapid accumulation or runoff of surface
waters from any source
flood plain : any land area susceptible
to being inundated by water from any source. Normally the regulatory flood
plain is characterized by the 100-year flood, meaning there is a 1% chance of
flooding in any given year. The flood plain is often referred to as flood-prone
areas
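The '1% chance per year' definition implies a much larger chance of seeing at least one such flood over a longer period; a small sketch of that arithmetic (the 30- and 100-year horizons are only examples):

    def prob_at_least_one_flood(annual_prob, years):
        # probability of at least one exceedance over the given number of
        # independent years
        return 1 - (1 - annual_prob) ** years

    print(round(prob_at_least_one_flood(0.01, 30), 2))    # about 0.26 over 30 years
    print(round(prob_at_least_one_flood(0.01, 100), 2))   # about 0.63 over 100 years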
tsunami : a wave train, or series of waves,
generated in a body of water by an impulsive disturbance that vertically
displaces the water column. Earthquakes, landslides, volcanic eruptions,
explosions, and even the impact of cosmic bodies, such as meteorites, can
generate tsunamis. Tsunamis can savagely attack coastlines, causing devastating
property damage and loss of life. Tsunami is a Japanese word with the English
translation, "harbor wave." Represented by two characters, the top character,
"tsu," means harbor, while the bottom character, "nami," means "wave."
In the past, tsunamis were sometimes referred to as "tidal waves" by the
general public, and as "seismic sea waves" by the scientific community.
The term "tidal wave" is a misnomer; although a tsunami's impact upon a
coastline is dependent upon the tidal level at the time a tsunami strikes,
tsunamis are unrelated to the tides. Tides result from the imbalanced,
extraterrestrial, gravitational influences of the moon, sun, and planets.
The term "seismic sea wave" is also misleading. "Seismic" implies an earthquake-related
generation mechanism, but a tsunami can also be caused by a nonseismic
event, such as a landslide or meteorite impact. Tsunamis are unlike wind-generated
waves, which many of us may have observed on a local lake or at a coastal
beach, in that they are characterized as shallow-water waves, with long
periods and wave lengths. The wind-generated swell one sees at a California
beach, for example, spawned by a storm out in the Pacific and rhythmically
rolling in, one wave after another, might have a period of about 10 seconds
and a wave length of 150 m. A tsunami, on the other hand, can have a wavelength
in excess of 100 km and a period on the order of one hour. As a result of
their long wave lengths, tsunamis behave as shallow-water waves. A wave
becomes a shallow-water wave when the ratio between the water depth and
its wave length gets very small. Shallow-water waves move at a speed that
is equal to the square root of the product of the acceleration of gravity
(9.8 m/s/s) and the water depth - let's see what this implies: In the Pacific
Ocean, where the typical water depth is about 4000 m, a tsunami travels
at about 200 m/s, or over 700 km/hr. Because the rate at which a wave loses
its energy is inversely related to its wave length, tsunamis not only propagate
at high speeds, they can also travel great, transoceanic distances with
limited energy losses. This animation
shows the propagation of the earthquake-generated 1960 Chilean tsunami
across the Pacific. Note the vastness of the area across which the tsunami
travels - Japan, which is over 17,000 km away from the tsunami's source
off the coast of Chile, lost 200 lives to this tsunami. Also note how the
wave crests bend as the tsunami travels - this is called refraction. Wave
refraction is caused by segments of the wave moving at different speeds
as the water depth along the crest varies. Please note that the vertical
scale has been exaggerated in this animation - tsunamis are only about
a meter high at the most in the open ocean. Tsunamis can be generated when
the sea floor abruptly deforms and vertically displaces the overlying water.
Tectonic earthquakes are a particular kind of earthquake that are associated
with the earth's crustal deformation; when these earthquakes occur beneath
the sea, the water above the deformed area is displaced from its equilibrium
position. Waves are formed as the displaced water mass, which acts under
the influence of gravity, attempts to regain its equilibrium. When large
areas of the sea floor elevate or subside, a tsunami can be created. Large
vertical movements of the earth's crust can occur at plate boundaries.
Plates interact along these boundaries called faults. Around the margins
of the Pacific Ocean, for example, denser oceanic plates slip under continental
plates in a process known as subduction. Subduction earthquakes are particularly
effective in generating tsunamis. This simulation
of the 1993 Hokkaido earthquake-generated tsunami, developed by Takeyuki
Takahashi of the Disaster Control Research Center, Tohoku University, Japan,
shows the initial water-surface profile over the source area and the subsequent
wave propagation away from the source. Areas in blue represent a water
surface that is lower than the mean water level, while areas in red represent
an elevated water surface. The initial water-surface profile, as shown
in this image, reflects a large, long uplifted area of the sea floor lying
to the west (left) of Okushiri Island, with a much smaller subsided area
immediately adjacent to the southwest corner of Okushiri. A tsunami can
be generated by any disturbance that displaces a large water mass from
its equilibrium position. In the case of earthquake-generated tsunamis,
the water column is disturbed by the uplift or subsidence of the sea floor.
Submarine landslides, which often accompany large earthquakes, as well
as collapses of volcanic edifices, can also disturb the overlying water
column as sediment and rock slump downslope and are redistributed across
the sea floor. Similarly, a violent submarine volcanic eruption can create
an impulsive force that uplifts the water column and generates a tsunami.
Conversely, supermarine landslides and cosmic-body impacts disturb the
water from above, as momentum from falling debris is transferred to the
water into which the debris falls. Generally speaking, tsunamis generated
from these mechanisms, unlike the Pacific-wide tsunamis caused by some
earthquakes, dissipate quickly and rarely affect coastlines distant from
the source area.
This image shows Lituya Bay, Alaska, after a huge, landslide-generated
tsunami occurred on July 9, 1958. The earthquake-induced rockslide, shown
in the upper right-hand corner of this image, generated a 525 m splash-up immediately
across the bay, and razed trees along the bay and across La Chaussee Spit
before leaving the bay and dissipating in the open waters of the Gulf of
Alaska. As a tsunami leaves the deep water of the open ocean and travels
into the shallower water near the coast, it transforms. A tsunami travels
at a speed that is related to the water depth - hence, as the water depth
decreases, the tsunami slows. The tsunami's energy flux, which is dependent
on both its wave speed and wave height, remains nearly constant. Consequently,
as the tsunami's speed diminishes as it travels into shallower water, its
height grows. Because of this shoaling effect, a tsunami, imperceptible
at sea, may grow to be several meters or more in height near the coast.
When it finally reaches the coast, a tsunami may appear as a rapidly rising
or falling tide, a series of breaking waves, or even a bore. As a tsunami
approaches shore, it begins to slow and grow in height, as described above. Just
like other water waves, tsunamis begin to lose energy as they rush onshore
- part of the wave energy is reflected offshore, while the shoreward-propagating
wave energy is dissipated through bottom friction and turbulence. Despite
these losses, tsunamis still reach the coast with tremendous amounts of
energy. Tsunamis have great erosional potential, stripping beaches of sand
that may have taken years to accumulate and undermining trees and other
coastal vegetation. Capable of inundating, or flooding, hundreds of meters
inland past the typical high-water level, the fast-moving water associated
with the inundating tsunami can crush homes and other coastal structures.
Tsunamis may reach a maximum vertical height onshore above sea level, often
called a runup height, of 10, 20, and even 30 meters. This numerical simulation
shows the 1923 Kanto tsunami attacking a Japanese village. A longer
version of this animation is also available. Note that the structures
in this model are rigid - in a real-life tsunami, coastal structures often
are destroyed.
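The shallow-water relations described above - speed equal to the square root of g times the water depth, and wave height growing as the depth shrinks because the energy flux stays roughly constant (the standard Green's-law approximation) - can be put into a few lines. This is an idealised sketch only; real forecasting relies on full numerical models like the simulations mentioned above, and the 1,500 km distance used here is purely illustrative.

    import math

    G = 9.8   # gravitational acceleration, m/s^2

    def tsunami_speed(depth_m):
        # shallow-water wave speed, sqrt(g * depth), in m/s
        return math.sqrt(G * depth_m)

    def shoaled_height(height_m, depth_from_m, depth_to_m):
        # Green's law: with energy flux roughly constant, height scales as depth ** -0.25
        return height_m * (depth_from_m / depth_to_m) ** 0.25

    speed = tsunami_speed(4000)                       # about 198 m/s, i.e. over 700 km/h
    print(round(speed), round(speed * 3.6))
    print(round(1500e3 / speed / 3600, 1))            # about 2.1 h to cross 1,500 km of 4,000 m ocean
    print(round(shoaled_height(1.0, 4000, 10), 1))    # a 1 m open-ocean wave grows to about 4.5 m in 10 m of water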
Devastating tsunamis are known in historical times to have affected
the populated coasts of Papua New Guinea, Japan, Hawaii, Crete, Sicily
and the Crimea — to name just a few. In the Pacific region, where 80% of
all tsunamis occur, a 1947 analysis indicated that seismic sea waves higher
than 7.5 metres occur on average every 15 years (Heck, N. H. Bull. Seismol.
Soc. Am. 37, 269-286 (1947)). Outside the Pacific, tsunami frequencies
have been studied in some detail only for the Aegean and Black Sea regions.
Records there reveal that the coastal and surrounding areas of Turkey have
been affected by > 90 tsunamis over the past 3,000 yearsref.
For most other areas, information concerning the return periods of tsunamis
is scarce. A rough comparison of tsunami frequencies in different parts
of the globe was done in 2000 by the London-based Benfield
Hazard Research Centre (BHRC), as part of its Tsunami
Risks Project. The resulting risk analysis estimates the return periods
of 10-metre waves to be about 1,000 years for the North Atlantic and Indian
oceans, southern Japan and the Caribbean, 500 years for the Philippines
and the Mediterranean Sea, 250 years for Alaska, South America and Kamchatka
in eastern Siberia, and < 200 years for Hawaii and the southwest Pacific
684 BC : 4 Pacific tsunamis higher than 30 metres.
1755, 1 November : a fire following an earthquake destroyed two-thirds
of Lisbon, Portugal. In panic, the population sought shelter near the shoreline,
only to be hit by waves said to be as high as houses. More than 60,000
people died.
2004, Dec 26 : the Indian Ocean tsunami occurred at about 01:00 GMT, when
the Indian tectonic plate moved underneath the neighbouring Burma microplate
(earthquake moment magnitude : 8.9), raising it by about 10 metres along
a length of more than 1,000 km and sending a wave propagating through the
full depth of the overlying ocean at high speed. With wavelengths much
larger than the depth of the ocean, such waves propagate across the great
distances of the open sea without much surface perturbation and with very
little energy loss, until shallower coastal shelves slow the wave and increase
its amplitude — resulting, in this case, in a calamity of biblical proportions.
The tsunami displaced in the region of a trillion tonnes of water, driving
it towards southeast Asia's coastline (Indonesia, Sri Lanka, Thailand,
India, Maldives, and Somalia) in a long, low-amplitude wave travelling
at up to 900 km/h across the Indian Ocean
> 200,000 deaths (80,000 in Indonesia, 10,000 in India, > 5000 in Thailand),
500,000 injured, 5 million homeless without food and water in an area spanning
over 5,000 km. Photographs were captured by a range of Earth-observation
satellites including IKONOS,
SPOT
2, SPOT 5 and
RADARSAT-1,
and by the Indian National Remote Sensing Agency (available from the Centre
for Remote Imaging, Sensing and Processing (CRISP)) : the pairs of
images show devastated regions before and after the tsunami
Banda Aceh, Sumatra : 10 January 2003 (before) / 29 December 2004 (after)
Trinkat Island, Nicobar Islands : 21 December 2004 (before) / 26 December 2004 (after)
Katchall Island, Nicobar Islands : 10 July 2004 (before) / 28 December 2004 (after)
Akkaraipattu, Sri Lanka : 8 February 2002 (before) / 2 January 2005 (after)
Lhoknga, Indonesia : 10 January 2003 (before) / 29 December 2004 (after)
2 international tsunami warning bodies exist under UNESCO's
Intergovernmental Oceanographic Commission (IOC): the International
Coordination Group for the Tsunami Warning System in the Pacific (ITSU),
and the International
Tsunami Information Center (ITIC) based in Hawaii. They get by on annual
budgets from the IOC of about US$40,000 and $80,000, respectively, which
are supplemented by grants from nations on the Pacific rim. To predict
a tsunami with any useful time advantage, researchers say, data on small
changes in sea level and pressure have to be collected directly from the
floor and surface of the ocean. The strength of the event depends on the
displacement of the ocean floor, not on the strength of the earthquake.
Some buoys that could provide such data are already in place in the Indian
Ocean. A network of seabed pressure sensors and seismographs can detect
Pacific Ocean tsunamis within minutes. It wasn't concerns about issuing
a false alarm, however, that prevented scientists in India, Sri Lanka,
and the Maldives from alerting authorities to the tsunami threat. Instead,
researchers say, the reason was near-total ignorance. At the National Geophysical
Research Institute (NGRI) in the south Indian city of Hyderabad, for example,
seismologists knew of the earthquake within minutes after it struck but
didn't consider the possibility of a tsunami until it was too late. In
fact, at about 8 a.m., 1 hour after the tsunami had already begun its assault
on Indian territory by pummeling the islands of Andaman and Nicobar some
200 km northwest of the epicenter, institute officials were reassuring
the media that the Sumatran event posed no threat to the Indian subcontinent.
About the same time, in neighboring Sri Lanka, scientists at the country's
only seismic monitoring station, in Kandy, reached a similar conclusion.
Within 20 minutes of the earthquake, at least 3 monitoring stations in
the United States had detected it, initially estimating its magnitude to
be around 8, which is not exceptional for submarine quakes and is a full
magnitude unit smaller than the eventual value of 9 that made this the world's
largest seismic event for 40 years. The United States Geological Survey
(USGS) circulated the information to about 100 people, mostly its own researchers
and senior officials, within 16 minutes, and sent a more detailed bulletin
to a list of external contacts, including the US Department of State, after
1 hour. The USGS has no responsibility for tsunami monitoring and its statement
did not mention the risk of such an event. The Hawaii-based Pacific Tsunami
Warning Center (PTWC), meanwhile, sent out a bulletin to its regular circulation
list, noting that the event presented no tsunami risk in the Pacific. At
2:04 GMT, the PTWC put out another bulletin revising the quake up to magnitude
8.5. Because there was no information about sea levels in the area, the
existence of a tsunami was merely hypothetical, but staff were worried
enough to begin looking for numbers to call in Asia. According to Kong,
the team tried and failed to reach colleagues in Indonesia. Australia was
contacted, although to little avail, as that country experienced only 0.5
m waves. It was not until 3:30 that the team in Hawaii saw news reports
on the Internet of casualties in Sri Lanka. Without a predetermined communication
plan, warning efforts were doomed from the start. The PTWC will in future
directly contact the US state department, which can communicate risks to
any nation, at any time. Indonesian seismologists initially underestimated
the strength of the earthquake, according to local news reports. And although
officials there had very little time in which to act, a monitoring station
that could have provided early warning of the devastating Indian Ocean
tsunamis lacked the telephone connection needed to relay news of the impending
disaster. A seismograph designed to detect the earthquakes that cause tsunamis
was installed on the Indonesian island of Java in 1996, but the data it
collects is not sent to the central government in Jakarta because the telephone
line has been disconnected since an office move in 2000. Officials in Jakarta
were alerted to the earthquake that caused the giant waves by readings
from the country's other 60 or so seismographs, but a lack of data from
the specialized Java station prevented them from issuing a tsunami warning.
Countries such as Sri Lanka and India, which suffered thousands of casualties,
could potentially have been warned some 2 hours before the waves completed
the 1,500-km journey from the earthquake's epicentre off Indonesia. India's
government and scientific establishment have been heavily criticized for
failing to provide warning of a tsunami that drowned at least 12,000 people
on the nation's eastern coast. Newspapers and opposition spokesmen have
asked why a country with India's scientific resources couldn't better prepare
for such an event. Ministers immediately pledged up to US$29 million to
build a tsunami-monitoring system, and promised to seek more cooperation
with the Pacific Tsunami Warning Center in Hawaii. There are records of
tsunamis hitting the Indian coastline in 1833 and 1883. India now plans
to install a network of 10 to 12 seafloor pressure sensors to be imported
from the United States, as well as several floating sensors on ocean buoys,
linked to an Indian geostationary satellite. Critics say that the tragedy
exposed a major weakness in the current system, which authorizes only the
Indian
Meteorological Department to put out hazard alerts. "Data were pouring
into our lab but we cannot issue alerts even if we can analyse the data
for tsunami potential," says one researcher at the National
Geophysical Research Institute (NGRI) in Hyderabad. They also want
to know why the Indian air force, whose base in Car Nicobar Island was
submerged by the waves 1 hour before they hit the mainland, failed to
provide any public warning. The tsunami spared India's main rocket launch
site at Sriharikota Island, 80 km north of Chennai. But it damaged cooling
water pumps at a nuclear power station at Kalpakkam, leaving staff with
very little time to shut down the plant safely.
Efforts over the years to get an Indian Ocean warning system
in place have made little progress in the face of national governments'
reluctance to invest in them. A master plan prepared in 1999 by ITSU, one
of the international organizations that plans for the monitoring of tsunamis,
stated: "Tsunami hazards exist on both sides of the Atlantic Ocean, in
the eastern Indian Ocean, and in the Mediterranean, Caribbean, and Black
Seas. Efforts to establish warning centers in those areas should be encouraged."
In 2003, a working group on the Tsunami Warning System in the Southwest
Pacific and Indian Ocean was established within ITSU. But the first chair
of the group, a representative from Indonesia, left soon after his appointment
and the group then split into 2 according to region. However, the
most important differentiating factor has been the readiness of 'Pacific
rim' nations such as Japan, Australia and the United States to support
a cheap but potentially effective system for monitoring and for educating
the public about an infrequent risk. India, Indonesia and the other nations
on the Indian Ocean's rim are relatively poor countries with needs that
seemed more pressing than that of planning against the remote — but nonetheless
inevitable — prospect of a tsunami. UNESCO will now make an observation
system in the Indian Ocean a priority. The first thing we will do is send
out a survey team in January or February and then we want to set up a conference
in the area. Needless to say, there is little reluctance now to accept
the need for the system. The UN International Strategy for Disaster Reduction
has also said that one should be built within 1 year. And the Indian government,
under intense domestic pressure for its failure to warn people on its eastern
coast, said it would spend up to US$29 million to build a system itself.
Early warning systems could watch for other natural disaster risks, as
well as tsunamis : there has been an enormous amount of focus on tsunamis,
but we need to take a multihazard approach. The need for a similar system
in the Indian Ocean has been discussed at regular intervals by the IOC
since at least 1999. The most recent meeting of the Commission to discuss
the threat was in June 2004, although no direct action was decided upon
because it has been difficult to raise the millions of dollars needed.
Indonesian authorities could have worked out from their seismograph data
that the quake would generate tsunamis, but you need a set of protocols
about how to warn people, such as sirens on the streets. That's more critical
than the science. The lack of such protocols in Indian Ocean nations is
due in part to the infrequency of tsunamis in the region. If there hasn't
been an event in living memory, people don't think it will happen again.
The Pacific monitoring system was set up in 1965, following tsunamis generated
by quakes in Chile in 1960 and Alaska in 1964. But the last event on a
similar scale in the Indian Ocean, caused by the eruption of Krakatau,
took place in 1883. Fewer than half a dozen big ones have been recorded
in the past 250 years. The issue of a tsunami warning system in the Indian
Ocean will be discussed at a summit of affected countries to be held in
Jakarta, Indonesia on 6 January and will feature at the UN World Conference
on Disaster Reduction to be held in Kobe, Japan, later this month. In order
to build a new warning system, several key components must be put in place.
the region needs an extensive network of seismographs, which pick up the
tremors from underwater earthquakes. Although some such sensors already
exist, they need to be upgraded and their numbers increased.
regional centres must be established to process and interpret the seismographic
information in real time, and predict the likely impact and location of
subsequent tsunamis.
communication systems must be set up that can relay swift warnings internationally,
regionally and then to local communities. This might involve, for example,
messages sent to mosques that can sound an alarm siren. The most important
component of such preparation is public education, so that local inhabitants
are aware, for example, of the fact that a dramatic recession of the ocean
is in itself a warning of an impending event. The next most important component
is the construction of a simple network that will quickly convey warning
information from the seismological stations to some central point (such
as the Pacific Tsunami Warning Center in Hawaii) and back out again to
local radio and television channels, perhaps using siren systems in regions
that can afford them.
locals must be taught in advance how to act in the event of a warning and,
if necessary, be provided with the means to escape to safe ground.
It is the education and social coordination that will be the hardest to
implement, but experts are hopeful that a rudimentary warning system based
on existing seismographs and upgraded communication systems might be up
and running by the end of 2005. Most attention is focused on the Indian
Ocean at the moment, but researchers warn that other regions at risk of
tsunamis, including the Caribbean, the coasts of Central and South America
and the Mediterranean, also lack adequate warning systems.
Rescuing the ravaged coral reefs and mangrove swamps will also be vital
for rebuilding coastal communities. News reports and pictures have revealed
giant coral heads stranded inland and vanished shorelines. United
Nations Environment Program (UNEP) announced on 30 December that it
would commit US$1 million to the task, and officials say they hope more
money will follow. One of the biggest concerns for ecologists is that fragile
coral reefs and mangrove swamps from Sri Lanka to Sumatra may have been
uprooted by surging waters or smothered by washed out mud and debris. This
could prove particularly damaging to reefs already rendered fragile by
climate change and pollution. Damage to the plants and animals in these
ecosystems could have knock-on effects for the human communities that depend
on them. Coral reefs and mangrove swamps are vital feeding and breeding
grounds for fish, so their destruction could cut local fishing and food
supplies over the longer term, and leave coastlines more exposed to erosion
and storm damage. Swamped beaches may also have wiped out nesting sites
for sea turtles. And researchers are keen to assess whether mangrove swamps
cleared prior to the tsunami, to make way for hotels or other coastal developments,
may have flattened a natural wave barrier, leaving the coastline more vulnerable.
The UNEP project will assess all these aspects and others, such as whether
oil, chemical and other industrial plants along the coast might have spewed
out pollution, and whether freshwater reservoirs have been contaminated
by the seawater. To survey the damage, experts will initially focus on
satellite images that reveal the reach of floodwaters before and after
the tsunami or changes in submerged coral reefs. Assessments on the ground
will be ramped up later on. But cataloguing the ecological damage is just
the first step. After this, researchers and policy-makers must figure out
how to rebuild these shattered ecosystems. When reconstruction begins,
marine reserves and national parks will be among the priorities, because
it is these that will be most resilient to future human and natural impacts.
In some areas, including important national parks, the wave has encouraged
the spread of alien invasive species, such as prickly pears and salt-tolerant
mesquite. A mesquite is a spiny leguminous tree or shrub found chiefly
in the southern USA : the shrub has sweet pods that are eaten by livestock.
Prickly pears, a type of cactus, are most prevalent in Mexico, where many
of the 100 different species are cultivated as a food source. Neither species
is native to Sri Lanka, but they existed in small numbers in limited coastal
areas. Local plants and animals have not co-evolved with these alien plants
so when alien plants dominate in the ecosystem they will reduce the diversity
of the local fauna and flora. In those areas with healthy coral reefs and
mangroves, the impact of the devastating tsunami was significantly
reduced. Mangrove forests along shorelines in Sri Lanka and other Asian
nations hit by the tsunami are considered critical to halting erosion,
but much of this growth has fallen victim in recent years to intense coastal
development. Well over 500 million kilograms (1,100 million pounds) of
rubble were created by the tsunami, posing an enormous challenge to the
solid waste management system in Sri Lanka. Resettlement and reconstruction
are now placing a huge burden on natural resources, especially through the
location of new settlements in or near ecologically sensitive locations,
and increased demand for sand and wood for reconstruction and firewood
for brick-making. These activities are thought to have the potential to
cause more irreversible damage to Sri Lanka's environment than did the
tsunami itself
Risks :
the most immediate fear is that without a supply of clean drinking water,
many thousands of those who survived the waves will die from infectious
illnesses such as
orofecal contamination : the floods contaminated drinking water and disrupted
sewage systems. As time passes, faeces start to contaminate the water supply
and cholera will become a threat. The most urgent priority is to supply
communities with clean drinking water : from Dec 30, Oxfam plans to send
out emergency water tanks that can hold 11,000 litres each. The first 6
of these shipments will go to Sri Lanka and to the province of Aceh, in
Indonesia.
contaminated water-associated enteric disease
cholera (Vibrio cholerae)
: outbreaks typically begin to appear one to two weeks after a catastrophe.
Although doctors had seen several hundred cases of diarrhoea in the affected
countries, no one had reported a cholera outbreak as of Friday 7 January.
The number of people suffering from diarrhoea does not exceed the typical
level of the illness usually found in the region. 20 cholera cases were
being treated in a refugee camp in Nagan Raya, in the Meulaboh area, Aceh.
2 of the cases were serious, with the patients severely dehydrated from
constant diarrhea. One 12-year-old child had contracted the disease, while
the other cases were adults. With proper treatments and sanitation efforts,
there is a chance that outbreaks of cholera might not materialize. Despite
these rumors, a WHO official in Aceh denied the presence of cholera patients
: suspected cases were reported, but they proved negative on testing.
As of 24 Jan 2005, there have been no confirmed outbreaks of cholera in
any part of the province. Rumours of cholera in an area adjacent to Meulaboh
have not been confirmed, but there were cases of uncomplicated diarrhea,
without deaths or severe dehydration
shigellosis has been diagnosed in 10 patients. Speciated specimens were
all Shigella flexneri.
No Shigella dysenteriae
has been found. Water and sanitation have been strengthened in the area
most affected, and there has been no further bloody diarrhea there since
19 Jan 2005. Bloody diarrhea has also been reported from other areas of
the province.
cases of typhoid and jaundice may not start appearing for another month
or so as they have a long incubation period. The coming few months would
also be crucial for monitoring the health of expectant mothers and babies
and antenatal care would be stepped up.
2 patients with suspected cases are hospitalized in Meulaboh. Takuapa Hospital
in Phangnga, Southern Thailand, was one of the major referral centers for
people injured during the tsunami, and saw approximately 1000 patients
on 26 Dec 2004. On 30 Dec 2004, a patient who had suffered aspiration developed
severe pneumonia, and 2 blood cultures taken on that day were positive
for Burkholderia pseudomallei. A further 4 patients have since been
diagnosed with B. pseudomallei pneumonia. All had aspirated during
the tsunami. Their ages ranged from 24 to 65 years. 2 individuals had known
risk factors for melioidosis (diabetes mellitus). All patients are currently
responding to antimicrobial treatment. During the last 6 years, a total
of 9 cases of melioidosis have been diagnosed at Takuapa Hospital (none
of which were during the preceding 12 months). This raises the possibility
that individuals affected by the tsunami are at increased risk of melioidosis.
A melioidosis surveillance team has been formed, and active clinical and
serological surveillance is now underway in the area. Previous soil sampling
has demonstrated that a small proportion of sites contain B. pseudomallei.
A longitudinal study of soil sampling is now underway to examine the presence
of B. pseudomallei in tsunami-affected and -unaffected soil in the
Phangnga area. All individuals affected by the tsunami may be at risk of
developing melioidosis. Given the normal mode of acquisition of the disease,
people at highest risk are likely to be those who sustained aspiration
or lacerations, particularly if the wound was contaminated with soil. The
variable and sometimes extended period of incubation prior to clinical
symptoms means that those affected by the tsunami in Thailand may have
a higher long-term risk of disease. Case reports of melioidosis in returning
travelers or indigenous peoples from Indonesia, Sri Lanka, and India indicate
that people affected by the tsunami in these areas should also be considered
at increased risk. Many cases of pneumonic melioidosis are related to a
primary cutaneous introduction of the bacterium, but aspiration of the
organism directly into the respiratory tract can also cause the disease.
It is not stated whether any of the affected individuals had extrapulmonary
manifestations of melioidosis.
2 European tsunami victims from whom Burkholderia pseudomallei has
been isolated. One, who sustained extensive wounds in Phuket, grew B.
pseudomallei as part of a mixed growth from a wound swab, but has no
other evidence of melioidosis. The island of Phuket is not an area of high
incidence under normal circumstances, and only a handful of cases in the
world literature are known to have been acquired in the other areas
affected by the tsunami, such as Sri Lanka or Indonesia. It may be that
this episode will also point to areas where melioidosis is currently under-recognized.
The other, who was exposed in Khao Lak, developed respiratory insufficiency
while being treated for soft tissue injuries, and grew B. pseudomallei
from pleural fluid
an Australian from Brisbane, who was holidaying in Sri Lanka (where melioidosis
has rarely been reported) at the time and was swept inland a considerable
distance by the tsunami, had a pure growth of B. pseudomallei from
a leg injury that resulted in abscess formation.
The pathology report indicated "light growth", with resistance to amoxicillin,
cephalothin, gentamicin, and tobramycin. However, the reporting pathologist
indicated that sensitivity testing may be unreliable with slow-growing
organisms such as the one in question
3 isolates in Finnish tourists who were visiting Khao Lak on the southwest
coast of Thailand : 2 strains were isolated from wound swabs and one from
a blood culture. 1 patient was from Peijas Hospital, Vantaa (a town in
the metropolitan Helsinki area), and to my knowledge had no symptoms characteristic
of melioidosis. The bacterium was found only in the first sample after
the patient's return 31 Dec 2004 from Asia. The 2 other patients were diagnosed
with pneumonia. One was from Turku (a city on the Southwest coast) and
B.
pseudomallei was isolated from the patient's blood in Turku University
Hospital. The other finding was from a wound treated in Keski-Suomen Keskussairaala
(a hospital in Central Finland). The infection is very rare in Europe,
and the only case to have been previously reported in Finland was in a
man who traveled to Thailand in 2000ref.
The 1st case was in a 17-year-old woman with a deep wound in her lower
leg. B. pseudomallei was isolated from a wound swab. She had been
treated in a hospital in Bangkok for 3 days before returning to Finland.
On arrival in Finland, her left foot was red and swollen, and a swab taken
before revision of the wound grew B. pseudomallei. Consecutive swabs
remained negative, and further plastic surgery procedures were carried
out a week later. The patient was treated with clindamycin and ciprofloxacin.
She did not develop any clinical symptoms of melioidosis and has fully
recovered. The 2nd case was a 47-year-old male. He had several superficial
wounds, some of which had been surgically treated in Khao Lak. On arrival
in Finland, he had a fever, but his general health was good, and vital
signs were normal. He had a deeper wound in his right elbow and a small
abscess in the corner of his left eye. Aspiration pneumonia was suspected,
because he had breathed in muddy water, and his chest x-ray showed bilateral
changes. B. pseudomallei was isolated from 2 blood cultures taken
during his 1st day in the hospital. This patient is considered to have
had a confirmed case of melioidosis. He was treated with broad-spectrum
intravenous antibiotics (ceftriaxone, clindamycin and levofloxacin, followed
by meropenem and ciprofloxacin after the results of the sensitivity testing
were obtained). His fever continued for 10 consecutive days, but he has
now recovered. He is still on doxycycline and trimethoprim/sulfamethoxazole
therapy, which is to be continued for 20 weeks. The 3rd case was a 54-year-old
man who had a wound infection and was sent to the hospital by a general
practitioner one day after returning to Finland. 2 of his wounds had been
sutured in Thailand. After admission to the hospital in Finland, he developed
septic shock and was treated in an intensive care unit (ICU) for 3 days.
He did not have pneumonia and was treated with meropenem and ciprofloxacin.
All blood cultures remained negative. A wound swab taken during wound revision
3 days after the patient was released from the ICU grew B. pseudomallei.
A diagnosis of melioidosis is presumed. The patient was treated in the hospital
for 29 days and recovered fully. His antibiotic treatment has been discontinued.
water pools may provide breeding sites for the mosquito vectors of
diseases
malaria (Plasmodium spp., including Plasmodium cynomolgi, Plasmodium
falciparum, Plasmodium malariae, Plasmodium ovale, and Plasmodium vivax).
Risk of outbreaks of malaria and dengue remains high, due to a substantial
increase in the potential breeding sites for Anopheles sundaicus,
Aedes
aegypti and Ae. albopictus. A comprehensive program of vector
control, early diagnosis and treatment, and training for management of
complicated malaria has commenced
overcrowding at refugee camps has intensified the spread of
pneumonia,
and health officials fear that respiratory diseases may claim many lives
in the already devastated region. Rainy weather and lack of shelter in
hard-hit countries such as Sri Lanka have left people more vulnerable to
this type of illness
the WHO on Mon 10 Jan 2005 confirmed 2 unconnected cases of measles
in tsunami survivors in separate villages outside the regional capital
of Banda Aceh on the Indonesian island of Sumatra. UNICEF already was in
the midst of a campaign to vaccinate 600,000 people in devastated Sumatra
against the disease when the cases occurred, and it immediately inoculated
a ring of 1200 people living around the villages where each case was confirmed.
Malnutrition in refugee camps has been known to push the death rate as
high as 30%. Measles circulates in Indonesia, and there have been several
outbreaks during the last few years, because most children have not been
vaccinated. WHO recommends immunizing > 90% of children to protect the
population from outbreaks. In Aceh province, only about 25% of the children
were vaccinated before the earthquake. At least 13 cases of measles have
been reported among children in relief camps in Andamans and 15 in Tamil
Nadu. There have been sporadic cases of measles. A province-wide program
of measles immunization (from ages 6 months to 15 years) and vitamin A
distribution (6 months to 5 years) is in process, with priority given to
the areas where the surveillance has detected cases.
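The > 90% target reflects the classical herd immunity threshold for a highly
transmissible virus. A minimal illustrative sketch of the arithmetic in Python,
assuming the commonly quoted measles basic reproduction number (R0) range of
12-18 (an assumption, not a figure taken from these reports) :

# Herd immunity threshold H = 1 - 1/R0 : the fraction of a population that must
# be immune to interrupt sustained transmission. R0 values are assumed from the
# commonly cited range for measles.
for r0 in (12, 15, 18):
    print(f"R0 = {r0}: ~{1 - 1 / r0:.0%} immunity needed")
# With only ~25% of children vaccinated in Aceh before the earthquake, coverage
# falls far short of any of these thresholds.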
serious wound infections : GeoSentinel
sites and network members are increasingly reporting them. All such wounds
that appear infected should be treated with intravenous therapy that provides
more than routine antistreptococcal and antistaphylococcal coverage while appropriate
cultures are pending.
Clostridium tetani
: many inhabitants who survived tsunamis had not been immunized against
the potentially fatal infection. Indeed, it is not surprising that tetanus
would occur after the tsunami, given that many people were injured by the
debris the waves picked up, even if only with minor cuts, and ended up
lying in the dirty water. People are becoming infected when they search
for corpses or useful objects in the rubble left by the tsunami. Wounds
on their arms and legs can become infected by the tetanus bacteria when
they walk through the mud. Although the lay public tends to associate tetanus
with injuries such as stepping on a rusty nail, the injury itself can be
quite minor, and no signs of local infection or injury may be present.
Postinjury prophylaxis uses tetanus immunoglobulin and tetanus toxoid booster.
In a mass injury situation, this is difficult to accomplish : Australia's
entire stock of specific tetanus immunoglobulin, which amounted to 15 vials
(a sign of how few cases the West now suffers), was sent to Indonesia.
There is also a shortage of tetanus immune globulin in the area. 67 cases
of the disease: 45 in Banda Aceh (20 deaths), 15 in Meulaboh (5 deaths)
and 6 in Sigli, at either end of the Indonesian disaster zone in Sumatra.
They are almost certainly being replicated in the cut-off towns and villages
along the coast in between. There is a high mortality rate of about 25%.
Since the disease has an incubation period of between 2 and 60 days, most
cases are only starting to show up now. As of 24 Jan 2005, > 90 patients
at hospitals in Banda Aceh, Meulaboh (west coast), and Sigli (northeast
coast) were diagnosed with tetanus. All seem to be the result of injuries
sustained on the day of the tsunami, with the peak being passed on 12 Jan
2005. Hospitals are equipped to manage cases, and there is a preventive
program for workers involved in the clean-up campaign (protective clothing
and tetanus toxoid immunization). New cases are on the decline.
Of 2 patients who have returned from Phuket, the 1st had multiple skin
lacerations, 3 of which were very deep. Swabs from 2 of these grew Aeromonas
hydrophila resistant to amoxicillin, amoxicillin-clavulanate, and cephalexin,
and susceptible to ceftriaxone, the carbapenems, aminoglycosides, and ciprofloxacin.
The 3rd deep laceration cultured Pseudomonas
aeruginosa,
which was susceptible to all standard anti-pseudomonals. The 2nd patient
had extensive superficial areas of skin disruption. Some of these were
cellulitic, and Aeromonas
hydrophila
was cultured. Susceptibilities were identical to the above patient. Patient
also had some degree of aspiration pneumonia and was treated with meropenem,
then changed to ciprofloxacin, and is doing well. A 56-year-old patient
from Sri Lanka was injured in the tsunami while in a beach hut in Tangalle
and was washed 1 km inland to a rice paddy. The patient suffered multiple
superficial lacerations and 2 deeper soft tissue injuries, to the right
leg and left iliac crest, which were debrided "locally" and at Colombo
hospital. The microbiology and histopathology to date are as follows: probable
disseminated mucormycosis arising from the deep leg wound; multiresistant
E. coli sensitive to chloro and meropenem only; a multiresistant Gram-negative
rod (GNR), likely Acinetobacter sp., sensitive to meropenem; a GNR, most
closely resembling Achromobacter piechaudii, sensitive to ceftazidime, meropenem,
piperacillin, and Timentin; Pseudomonas aeruginosa, fully sensitive to antipseudomonal
drugs; MRSA from an initial swab of one superficial ulcer, sensitive to vancomycin;
probable Aeromonas sp.; Proteus sp.; some things on the vibrio plates still
to be identified; no Burkholderia yet ! In large-scale emergencies, there is often
immense pressure from staff, patients, and their families to deal with
wounds in one single surgical procedure, avoiding the need for planning
a follow-up surgical procedure of delayed primary or secondary closure.
Unfortunately, suturing wounds that are contaminated during one single
procedure will inevitably lead to more severe infection, often with life-
or limb-threatening consequences. Knowledge of the adequacy of wound surgery
allows better interpretation of bacteriological results from such wounds.
Education regarding wound infection and wound healing directed toward staff
operating in the field encourages good (and often simple) tetanus cover
and surgical technique (such as debridement and wound cleaning), and encourages
leaving the wound open, allowing it to drain and covering it with an appropriate
dressing, to be closed at a later stage if infection is not present. Reassuringly,
some of the reports specifically mention debridement, which together with
wound cleaning and irrigation is the correct initial management of these
potentially contaminated wounds, which may include devitalized tissue.
However, wound management experience from most disasters and wars in many
countries suggests that this basic principle has to be relearned again
and again, often regardless of technical levels of skill or facilities
available. In this respect, some of the affected countries will already
have considerable wound management expertise that can be shared
at least 3 Australian survivors of the Indian Ocean tsunami have developed
cutaneous mucormycosis,
which is fatal in 80% of cases if left untreated. Found in soil and rotting plants,
it aggressively infects wounds and kills flesh with necrotizing lesions.
It is impossible to estimate how many survivors in tsunami-affected areas
were infected, because it was hard to detect even in hospitals with advanced
laboratories. Cutaneous mucormycosis might develop in survivors, but this
disease can be difficult to diagnose and even harder to treat, particularly
in those remaining in affected regions. The fungi can be found in any soil
or decaying matter; they are usually not a major problem for fit, healthy
people, but can take hold in people with deep wounds and run-down immune
systems.
a man injured in the tsunami of Dec 26, 2004, returned to Sydney for management
of his soft-tissue injuries. Despite broad-spectrum antibiotics, surgical
wound debridement, and vigilant wound care, his condition worsened. Muscle
and fat necrosis developed in a previously debrided thigh wound, and necrotising
lesions arose from previous abrasions. Histological analysis showed mucormycosis
in 3 non-contiguous sites, and Apophysomyces
elegans
was isolated from excised wound tissue. Wound infections, both bacterial
and fungal, will undoubtedly add to the morbidity and mortality already
recorded in tsunami-affected areasref.
The man had been staying in a beach hut at the Sri Lankan town of Tangalle
when the tsunami struck. The force of the water drove him through the
wall of the hut, causing deep flesh wounds that became infected with the
fungus. Kocecny said the man was being treated in an isolation unit, although
mucormycosis is not communicable. Treatment involves surgically removing
all dead and infected tissue and administering a course of antifungal therapy.
In a field outside Banda Aceh, the Indonesian town devastated by the tsunami
in the Indian Ocean, > 1,000 dead bodies
were unceremoniously bulldozed into a mass grave at the end of December.
The indignity of such burial methods adds to the suffering of the survivors
and potentially robs them of a chance to identify the bodies of relatives
and friends. But there often seems to be no alternative: mass graves have
already been used in the wake of the tsunami in Indonesia, Sri Lanka and
Thailand, for fear that the bodies would otherwise cause epidemics of disease.
In
Thailand, Red Cross officials have been told to prepare a grave for 10,000
bodies (which is twice the current death toll announced by the Thai government).
Whether this is practical for the vast numbers of fatalities in areas worst
hit by the tsunami is another matter. But the report claims that identifying
large numbers of corpses is a "technical challenge that can be met, regardless
of the number of victims, if the authorities act in accordance with specific
procedures". Not all their guidelines can be followed in regions with many
fatalities. There are some areas where the resources are not available,
but, in some places, efforts are under way to take fingerprints or collect
DNA samples from dead bodies, so that relatives may at least get confirmation
that a family member has died.
Children, especially those left orphaned, and the elderly are thought to
face the greatest risk.
the floods destroyed many hospitals
the hundreds of thousands of people sleeping in overcrowded temporary camps
will aid the spread of infectious diseases.
the ability to deliver relief is also hampered by the remoteness of affected
areas such as the Maldives : there are islands that haven't been reached
yet
in Sri Lanka, landmines left over from the country's civil war are a serious
concern as people try to return to their homes. The water has displaced
these explosives, which are bound to cause serious injuries.
around 2 million people in 12 countries could face food shortages according
to FAO. Many people are unable
to buy food because they have lost their income and livelihoods. And local
food production took a hit when boats and fisheries were destroyed and
agricultural land flooded. Because emergency food packages are just a temporary
solution, the FAO is recommending that international aid be directed, as
soon as possible, towards longer term goals. Such assistance might involve
replacing fishing equipment and boats, draining blocked irrigation systems
and fields, and supplying seed. The FAO launched an appeal earlier this
month for US$26 million to finance these types of project, but the rehabilitation
will take years. In 1 of 2 reports published (Impacts
of the Tsunami on Fisheries, Aquaculture and Coastal Livelihoods report),
FAO estimates damage to the fishing industries in the Indian Ocean. It
notes that Aceh province in northern Sumatra, for example, lost around
70% of its canoe-based fishing fleet as well as its ponds and hatcheries
for crabs, shrimp and fish. Loss of rice crops, which are the staple food
for all countries in the region, will also have an impact, according to
the second report (Impact
of the Tsunami Disaster on Food Availability and Food security in the Affected
Countries report). In Indonesia and Sri Lanka, for example, the main
paddy crop had been planted just before the tsunami struck. On a positive
note, the report says that there is enough surplus food in the region,
from affected or neighbouring countries, to cover immediate needs. The
main problem will be lack of good transportation to distribute the food.
FAO is starting a more comprehensive survey of the types of aid needed
to secure food supplies, such as provision of transportation or cold storage,
in coordination with the United Nations and other aid agencies
there is the possibility that these events may lead to a much larger pandemic
avian
influenza
problem. The relief effort has brought a number of people from around the
globe to a region not only affected by the tsunami but also affected by
avian influenza. It is possible that the local population in the area might
have some innate immunity due to repeated exposure to avian influenza.
However, with the influx of immune-naive foreign aid workers, there seems
to be a potential for spread into people who may be much more susceptible.
It is a safe assumption that hygienic conditions in the area are going
to be lacking for some time. In addition, many of these workers might almost
suspect that they will come down with an illness because of the circumstances,
and may simply shrug off the 1st signs and symptoms. As they return to
their countries of origin, they may unwittingly depart during the prodromal
phase of illness only to act as the index cases of pandemic flu in their
countries. It would be prudent for federal, state, and provincial public
health departments to set up surveillance systems to monitor the health
of individuals who traveled to help with the tsunami recovery efforts.
In addition, the established AI surveillance and control systems/activities
in the poultry sector might suffer from the aftermath of the current disaster.
In Thailand, 5 provinces have been affected by the tsunami: Phuket, Krabi,
Koh Phi Phi, Phang Nga and Koh Lanta. Of these, only Phang Nga has reported
any HPAI outbreaks, and that was during February 2004. All other Thai provinces
that have reported avian influenza outbreaks in the past 12 months are
outside the area hit by the tsunami. In Indonesia, Sumatra was hit by the
tsunami, mainly in the north. Only South Sumatra province has reported
HPAI outbreaks, one with onset around 11 Dec 2003 and one with onset at
the end of May 2004. In Malaysia, the only state to report HPAI outbreaks
is Kelantan State, which is in the northeast corner of peninsular Malaysia,
i.e., the opposite side to that hit by the tsunami. Other tsunami-affected
areas, including Bangladesh, Myanmar, India and Sri Lanka confirmed to
the OIE between February and July 2004 that they are (or were at the time)
free of HPAIref.
This geographic observation is important not only in terms of evaluating
the risk of avian influenza infection during the relief efforts, but also
in terms of assessing the likelihood that locals have some innate immunity
against avian influenza due to previous exposure. Although it is encouraging
to see that the affected regions differ, we cannot dismiss the pandemic
influenza concern. It is possible that unidentified avian influenza (AI)
outbreaks have occurred in the tsunami-affected regions. It is also likely
that national resources will be diverted to disaster relief, potentially
compromising the ongoing avian influenza surveillance and control. I echo
the sentiment about the importance of human health surveillance for relief
workers and for local populations. As well, we should stress the importance
of immunization and protection not just against the usual suspects (cholera,
malaria, etc.) but also against human influenza. Such immunization will
reduce the risk of co-infections with avian and human influenza viruses.
Relief workers should, therefore, be immunized against human influenza
as well as against other travel-related diseases. The unsanitary conditions
might make human close contact with infected birds a larger problem, or
unnoticed avian outbreaks may occur as public health resources are overwhelmed
in the area. And, certainly medical aid workers have a higher chance of
contact with persons with respiratory illnesses. But I would think there
has been a larger influx of tourists from around the world than there will
be an influx of aid workers now. If we were going to see a lack of natural
immunity in foreigners, increasing the risk of potential human avian influenza
cases, wouldn't we have already seen it? As dehydration, weight loss from
inadequate caloric intake, and waterborne illnesses take their toll, a
population of unhealthy humans living in difficult and crowded conditions
will develop. If avian influenza A (H5N1) jumps into
this population, there is the potential for rapid evolution and amplification
and then dissemination of a virus capable of causing a human pandemic.
The pandemic may not initially be detected due to the inability to clinically
identify influenza among a population with limited medical access -- who
will likely battle many different febrile illnesses. In addition, detection
of avian influenza A (H5N1) in domesticated and wild
birds will not be a priority. We need immediate surveillance for avian
influenza A (H5N1) in sick people in this region
as well as in domesticated and wild birds. Teams must be mobilized quickly
by appropriate public health organizations and sent to sites in and around
the disaster areas.
Interpol called for foreign forensic teams working to identify victims
of the Asian tsunami to adhere to the same standard of forensic methods
for disaster victim identification.
The international police organization said that DNA profiles, fingerprints,
and other data like tattoos and birthmarks could be processed on a standardized
data sheet and entered into a centralized storage point. Teams in Thailand
alone (where an estimated 2000 foreign tourists are said to have perished
in the tsunami waves that struck on December 26) make up the largest disaster
victim identification event in recent history. If they have to go through
the full work of using all the methods available on all the bodies, it
could take a year. Beh added, however, that the process might be faster
than anticipated because experts could skip the traditional methods of
looking at dental records and comparing X-rays. Outrage over reports that
foreigners had been bagged and buried in unmarked mass graves began to
subside. Thai government spokesmen said that unidentified foreigners were
being stored in refrigerated containers. There is no doubt the mass graves
do exist, but Thai officials said no foreigners had been buried there.
Unidentified Thai disaster victims are being placed in what officials are
calling temporary graves along with electronic chips that identify any
personal information known to match the corpse.
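As a rough illustration of what such a standardized, centrally stored identification
record could contain, here is a hypothetical sketch in Python; the field names
are illustrative assumptions, not Interpol's actual DVI data sheet :

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DVIRecord:
    """Hypothetical disaster victim identification record (illustrative only)."""
    body_id: str                          # tag assigned at recovery
    recovery_site: str
    fingerprints: Optional[str] = None    # reference to a fingerprint file
    dna_profile: Optional[str] = None     # reference to an STR profile
    dental_chart: Optional[str] = None
    distinguishing_marks: list[str] = field(default_factory=list)  # tattoos, birthmarks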
The villagers of Kirinda in southeast Sri Lanka lost almost everything
in the Indian Ocean tsunami. Now, New York-based Architecture
for Humanity hopes to make this tiny fishing community a model for
reconstruction efforts in the regions affected by the disaster. The biggest
fear is that unscrupulous builders will throw up unplanned structures in
dangerous locations. The Sri Lankan government recommends that coastal
villages re-establish themselves further from the shore. But this isn't
what the locals want, so the architects are likely to go along with the
villagers' views in this case. Communities won't be moving an inch : people
even want to pitch their tents where their home was. However, the architects
will fight against the cobbling together of shoreline shacks by fishermen. Neighbouring
villages that had sand dunes, rather than sea-front buildings, suffered
far less at the hands of the deadly wave. The architects are also keen
to preserve Kirinda's environmental resources, including the bird sanctuary
and national parkland that flank the village. They plan to ensure that
the construction effort uses local resources, employing local workers and
building with wood and earth, rather than concrete. It is often local economics
rather than local planning that dictates how towns grow in the developing
world. Fishing villages owe their livelihood to the sea, so communities
will end up colonizing the sea front. But if people do not want to move,
he warns that they will need to make themselves more resilient in the face
of future tsunamis. One way to do this is to ensure that key infrastructure
elements such as hospitals and police stations are located in protected
spots. Those overseeing construction efforts also need to be aware of how
their actions might precipitate other disasters, such as landslides. The
easiest way to avoid tsunamis is to keep buildings at a higher elevation,
but you need to check the slopes are stable, especially when vegetation
is removed. Hong Kong island, for example, has been destabilized by heavy
construction. For poor communities in the regions affected by the Indian
Ocean tsunami, awareness may be the best defence. Learning the warning
signs, and having a well-rehearsed escape plan, could save countless lives
next time around.
Coral buffeted by the Indian Ocean tsunami in December 2004 withstood
the onslaught, according to an investigation of reefs in the Andaman Sea.
The discovery contradicts anecdotal reports suggesting that the region's
reefs had been decimated by the disaster. Last week, Coral
Cay Conservation, an international organization based in London, published
an assessment of the damage to the coral reefs of the Surin Islands Marine
National Park, a group of 5 jungle-covered, granite islands about 60 km
off the western coast of Thailand. Coral Cay's team of marine scientists,
led by researcher James Comley, measured damage to 28 km of reef in February
and March 2005. They reported instances of upturned or broken coral, coral
covered with sediment and coral displaced by the collapse of a reef. But
while some sites suffered severe damage, overall, only 8% of the coral
coverage before the tsunami will ultimately have been lost, even if all
of the tsunami-damaged coral dies. This exceptionally low statistic was
a surprise to the survey team. In other areas of Thailand, the proportion
of live coral damaged by the tsunami is many times higher. However, the
damage to reefs elsewhere may be less severe than feared because the researchers
also observed early signs of regrowth in broken and upturned corals. It
would appear that a healthy coral reef system may be capable of regenerating
rapidly even in the aftermath of a natural event as momentous as a tsunami.
The relatively slight damage to coral around the Surin Islands is probably
due to the geography of the area, the researchers add. Many of the sites
surveyed are isolated, surrounded by cliffs and steep forested land rather
than the settlements and beaches next to some of the hardest-hit reefs.
Much of the physical damage caused to these less fortunate coral reef systems
was brought about by debris swept onto the reef by the receding tsunami
waters, much of which is of human origin. The studyref,
one of the most comprehensive assessments of biological systems since the
tsunami, illustrates the buffering capacity of healthy coral reefs. Perhaps
this tragic event will add further fuel to the fire for the conservation
of these highly valuable and critically important natural resources
The huge influx of aid did help to stave off disease, but also had
its downsides. Doctors and nurses from around the world arrived in great
numbers, but some of them made demands on the already stretched local health
infrastructures that actually did more harm than good. And other well-meaning
aid workers brought unnecessary supplies. The tsunami was limited to a
very narrow strip of land, sparing hospitals and infrastructures farther
inland, so most of the field hospitals flown in were actually underused.
Hundreds of agencies have been involved in post-tsunami relief activities,
but they haven't always communicated with each other in the best way. In
some of the most severely damaged areas, such as Indonesia's Aceh province
and southern Sri Lanka, fewer but better-prepared aid workers might have
achieved more. Still, the WHO has been monitoring diseases characteristic
of refugee areas, such as cholera, dysentery or malaria, and it has not
seen these increase hugely in the Indian Ocean region. Although there have
been some cases of measles, diarrhoea and chest infections, these were
common diseases before the disaster. International aid has had a very positive
effect in keeping such problems in check. Experts at the meeting called
for the organization of pre-planned packages of assistance, so that countries
and aid agencies will know what to do in the event of a major disaster.
Parcelling out responsibilities for different kinds of aid (doctors, food,
housing) to various parties ahead of time should prevent confusion. The
WHO offered to coordinate such activities. Speakers also called for money
to be spent on reducing countries' vulnerability to natural disasters.
At least 10 to 20% of the estimated US$5-billion pot for tsunami aid should
be spent on education, establishing warning systems, and ensuring that
buildings are being constructed properly.
Web resources : Coral
Reef Conservation Program
The International Coordination Group
for the Tsunami Warning System in the Pacific, composed of 26 participating
international Member
States, has the functions of monitoring seismological and tidal stations
throughout the Pacific Basin to evaluate potentially tsunamigenic earthquakes
and disseminating tsunami warning information. The Pacific Tsunami Warning
Center (PTWC) is the operational center of the Pacific TWS. Located near
Honolulu, Hawaii, PTWC provides tsunami warning information to national
authorities in the Pacific Basin. As part of an international cooperative
effort to save lives and protect property, the National
Oceanic and Atmospheric Administration's (NOAA) National
Weather Service (NWS) operates 2 tsunami warning centers. The Alaska
Tsunami Warning Center (ATWC) in Palmer, Alaska, serves as the regional
Tsunami Warning Center for Alaska, British Columbia, Washington, Oregon,
and California. The Pacific Tsunami Warning Center in Ewa Beach, Hawaii,
serves as the regional Tsunami Warning Center for Hawaii and as a national/international
warning center for tsunamis that pose a Pacific-wide threat. This international
warning effort became a formal arrangement in 1965 when PTWC assumed the
international warning responsibilities of the Pacific Tsunami Warning System
(PTWS). The PTWS is composed of 26 international Member States that are
organized as the International Coordination Group for the Tsunami Warning
System in the Pacific. The objective of the PTWS is to detect, locate,
and determine the magnitude of potentially tsunamigenic earthquakes occurring
in the Pacific Basin or its immediate margins. Earthquake information is
provided by seismic stations operated by PTWC, ATWC, the U.S. Geological
Survey's National Earthquake Information
Center and international sources. If the location and magnitude of
an earthquake meet the known criteria for generation of a tsunami, a tsunami
warning is issued to warn of an imminent tsunami hazard. The warning includes
predicted tsunami arrival times at selected coastal communities within
the geographic area defined by the maximum distance the tsunami could travel
in a few hours. A tsunami watch with additional predicted tsunami arrival
times is issued for a geographic area defined by the distance the tsunami
could travel in a subsequent time period. If a significant tsunami is detected
by sea-level monitoring instrumentation, the tsunami warning is extended
to the entire Pacific Basin. Sea-level (or tidal) information is provided
by NOAA's National Ocean
Service (NOS), PTWC, ATWC, university monitoring networks and other
participating nations of the PTWS. The International
Tsunami Information Center (ITIC), part of the Intergovernmental Oceanographic
Commission, monitors and evaluates the performance and effectiveness of
the Pacific Tsunami Warning System. This effort encourages the most effective
data collection, data analysis, tsunami impact assessment and warning dissemination
to all TWS participants. Tsunami watch, warning, and information bulletins
are disseminated to appropriate emergency officials and the general public
by a variety of communication methods.
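The predicted arrival times in these bulletins exploit the fact that, in the
open ocean, a tsunami travels at roughly the shallow-water wave speed, which
depends only on water depth. A minimal illustrative sketch in Python; the 4,000 m
depth and 1,000 km distance are assumed example values, not figures used by
the warning centres :

import math

G = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed(depth_m: float) -> float:
    """Shallow-water wave speed c = sqrt(g * h), in m/s."""
    return math.sqrt(G * depth_m)

def travel_time_hours(distance_km: float, depth_m: float) -> float:
    """Rough travel time over a path of approximately constant depth."""
    return distance_km * 1000 / tsunami_speed(depth_m) / 3600

print(f"speed ~ {tsunami_speed(4000) * 3.6:.0f} km/h")        # ~713 km/h
print(f"arrival in ~ {travel_time_hours(1000, 4000):.1f} h")  # ~1.4 h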
tsunami watch, warning and information bulletins issued by PTWC and ATWC
are disseminated to local, state, national and international users as well
as the media. These users, in turn, disseminate the tsunami information
to the public, generally over commercial radio and television channels.
the NOAA Weather Radio System, based on a large number of VHF transmitter
sites, provides direct broadcast of tsunami information to the public.
the US Coast Guard also
broadcasts urgent marine warnings and related tsunami information to coastal
users equipped with medium frequency (MF) and very high frequency (VHF)
marine radios.
local authorities and emergency managers are responsible for formulating
and executing evacuation plans for areas under a tsunami warning. The public
should stay tuned to the local media for evacuation orders should a tsunami
warning be issued. And, the public should not return to low-lying areas
until the tsunami threat has passed and the "all clear" is announced by
the local authorities.
The goal is to be able to predict, for any given coast with a given topography,
which areas are most vulnerable and thus in greatest need of evacuation.
Such predictions would be easier to make if ocean basins resembled swimming
pools and continents were rectangular-shaped slabs with perfect edges.
But the uneven contours of sea floors and the jagged geometry of coastlines
make tsunami modeling a complex engineering problem in the real world.
Exactly how a tsunami will travel through the ocean depends on factors
including the intensity of the earthquake and the shape of the basin; how
the waves will hit depends, among other factors, on the lay of the land
at the shore. What makes tsunami warnings even more complicated is that
undersea quakes of magnitudes as great as 7.5 can often fail to generate
a damaging tsunami, producing waves < 5 cm tall.
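One reason the lay of the land matters so much is shoaling : as the water
shallows, the wave slows and its amplitude grows. A back-of-the-envelope sketch
using Green's law (amplitude scaling with depth to the power -1/4); the depths
and the 0.5 m open-ocean amplitude are assumed values, and real run-up also
depends on wavelength, reflection and coastal shape :

def greens_law_amplitude(a_deep_m: float, h_deep_m: float, h_shallow_m: float) -> float:
    """Green's law: A_shallow = A_deep * (h_deep / h_shallow) ** 0.25."""
    return a_deep_m * (h_deep_m / h_shallow_m) ** 0.25

# A 0.5 m wave over an assumed 4,000 m deep basin, arriving in 10 m of water.
print(f"{greens_law_amplitude(0.5, 4000, 10):.1f} m near the shore")  # ~2.2 m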
In general, if you think a tsunami may be coming, if the ground shakes
under your feet, or if you hear there is a warning, tell your relatives
and friends, and move quickly to higher ground.
Important facts to know about tsunamis :
tsunamis that strike coastal locations in the Pacific Ocean Basin are almost
always caused by earthquakes. These earthquakes might occur far away or
near where you live.
some tsunamis can be very large. In coastal areas their height can be as
great as 30 feet or more (100 feet in extreme cases), and they can move
inland several hundred feet.
all low-lying coastal areas can be struck by tsunamis.
a tsunami consists of a series of waves. Often the first wave may not be
the largest. The danger from a tsunami can last for several hours after
the arrival of the first wave.
tsunamis can move faster than a person can run.
sometimes a tsunami causes the water near the shore to recede, exposing
the ocean floor.
the force of some tsunamis is enormous. Large rocks weighing several tons
along with boats and other debris can be moved inland hundreds of feet
by tsunami wave activity. Homes and other buildings are destroyed. All
this material and water move with great force and can kill or injure people.
tsunamis can occur at any time, day or night.
tsunamis can travel up rivers and streams that lead to the ocean.
If you are on land :
be aware of tsunami facts. This knowledge could save your life! Share this
knowledge with your relatives and friends. It could save their lives!
if you are in school and you hear there is a tsunami warning, you should
follow the advice of teachers and other school personnel.
if you are at home and hear there is a tsunami warning, you should make
sure your entire family is aware of the warning. Your family should evacuate
your house if you live in a tsunami evacuation zone. Move in an orderly,
calm and safe manner to the evacuation site or to any safe place outside
your evacuation zone. Follow the advice of local emergency and law enforcement
authorities.
if you are at the beach or near the ocean and you feel the earth shake,
move immediately to higher ground, DO NOT wait for a tsunami warning to
be announced. Stay away from rivers and streams that lead to the ocean
as you would stay away from the beach and ocean if there is a tsunami.
A regional tsunami from a local earthquake could strike some areas before
a tsunami warning could be announced.
tsunamis generated in distant locations will generally give people enough
time to move to higher ground. For locally-generated tsunamis, where you
might feel the ground shake, you may only have a few minutes to move to
higher ground.
high, multi-story, reinforced concrete hotels are located in many low-lying
coastal areas. The upper floors of these hotels can provide a safe place
to find refuge should there be a tsunami warning and you cannot move quickly
inland to higher ground. Local Civil Defense procedures may, however, not
allow this type of evacuation in your area. Homes and small buildings located
in low-lying coastal areas are not designed to withstand tsunami impacts.
Do not stay in these structures should there be a tsunami warning.
offshore reefs and shallow areas may help break the force of tsunami waves,
but large and dangerous waves can still be a threat to coastal residents
in these areas. Staying away from all low-lying areas is the safest advice
when there is a tsunami warning.
If you are on a boat:
since tsunami wave activity is imperceptible in the open ocean, do not
return to port if you are at sea and a tsunami warning has been issued
for your area. Tsunamis can cause rapid changes in water level and unpredictable
dangerous currents in harbors and ports.
if there is time to move your boat or ship from port to deep water (after
a tsunami warning has been issued), you should weigh the following considerations
:
most large harbors and ports are under the control of a harbor authority
and/or a vessel traffic system. These authorities direct operations during
periods of increased readiness (should a tsunami be expected), including
the forced movement of vessels if deemed necessary. Keep in contact with
the authorities should a forced movement of vessels be directed. Smaller
ports may not be under the control of a harbor authority. If you are aware
there is a tsunami warning and you have time to move your vessel to deep
water, then you may want to do so in an orderly manner, in consideration
of other vessels. Owners of small boats may find it safest to leave their
boat at the pier and physically move to higher ground, particularly in
the event of a locally-generated tsunami. Concurrent severe weather conditions
(rough seas outside of safe harbor) could present an even more hazardous situation
to small boats, so physically moving yourself to higher ground may be the
only option.
damaging wave activity and unpredictable currents can affect harbors for
a period of time following the initial tsunami impact on the coast. Contact
the harbor authority before returning to port, making sure to verify that
conditions in the harbor are safe for navigation and berthing.
18 months after a deadly tsunami killed > 200,000 people in the region,
a tsunami early warning system for the Indian Ocean was up and running.
But the downstream flow of information from local authorities to the populations
and communities most at risk is still unresolved, so the system may not
yet be capable of saving lives. The state-of-the-art IndoTsunami
system, set up and coordinated by the UNESCO, consists of a seismic network,
a set of buoys deployed throughout the Indian Ocean, and several deep-ocean
pressure centres that measure the power and propagation of waves. All data
are transmitted in real time to the existing tsunami warning centres in
Japan and Hawaii, which have traditionally focussed on the Pacific. If
a potentially tsunami-generating quake occurs in the Indian Ocean, these
centres will issue a warning to the authorities in 24 countries around
the Indian Ocean. Four countries, Somalia, Yemen, Saudi Arabia and the
United Arab Emirates, are not yet linked into the system. But even the countries
that are linked still need to find the best way to get the information out
to the people.
If a tsunami were to strike tonight, millions living near beaches around
the Indian Ocean would still not be alerted in time. It is the last and
crucial mile which is still missing. Some countries, such as Australia,
India and Malaysia, are more active than others. But we are not sure how
many do actually have the capacity of warning people before a tsunami hits
the beach. Coastal populations could be warned by sirens, for example.
Even in countries where dissemination systems are in place, they may not
work. In a recent test of the Pacific tsunami warning system, Thailand
found that an attempt to alert people by sending text-messages to their
mobile phones crashed the telephone system. The new systems need to be
tested in real situations. New communication tools are vulnerable to saturation
when they are most needed. New siren systems may be heard on one side of
a bay but not on the other. Even a timely and 100% accurate and precise
warning will not provide any protection if people do not know how to respond
to the emergency. Efficient warning systems, as exist in Hawaii, for example,
also require sufficient and well-signposted escape routes and a high level
of preparedness among people at risk. Some people have been known to head
to the beach to watch the spectacle when an alert is issued. Building national
preparedness is the most difficult part of establishing early warning systems.
An intergovernmental coordination group will discuss this and other remaining
problems for the Indian Ocean system at a meeting in Bali from 31 July
to 2 August.
A local-scale tsunami killed at least 5 people in Indonesia. The recently
installed warning system did issue an appropriate alert, authorities say,
but it is unclear how many lives this warning saved. A tsunami warning
was issued at 11.07 GMT on Jul ? 2006 (17.07 local time), just 4 minutes
after a quake measuring 7.2 on the Richter scale occurred several hundred
kilometres south of Jakarta, Indonesia, in the Sunda fault. Gauges deployed
in the Indian Ocean measured a 7-centimetre sea-level rise, indicating
that the quake had generated a tsunami. Coastlines on Indonesia's island
of Java shortly thereafter were struck by a 2-metre-high wave, according
to eye-witness reports. The Indonesian president later confirmed on the
radio that large waves had killed at least five people and caused severe
destruction. Indonesia has only recently been linked to the tsunami early
warning system for the Indian Ocean, which UNESCO is setting up together
with nations in the region. The instruments in the Indian Ocean were only
just activated in June 2006. These send data to the pre-existing Pacific
Tsunami Warning Center, which has its main centres in Hawaii and Japan.
In this instance, the warning for Indonesia originated in Japan. Authorities
in Jakarta did receive that warning in real time, but it is as yet unclear
whether they were able to convey the information to the local population.
They would not have had much time to do so. Although reports vary, experts
calculate that the waves probably arrived at the coast only 15 to 20 minutes
after the earthquake. Experts suspect that fresh memories of the Boxing
Day tsunami from 2004 may have led many people to run inland on feeling
the tremor. This simple response could have saved many lives. The good
news is that the system has responded and seems to work fine. It was a
successful first test, but it is too early to say exactly how successful.
The earthquake itself was not abnormal in location or size. This was a
pretty standard quake, nothing unusual. The tsunami was generated as vertical
motion lifted up the seafloor. The risk of another ocean-wide tsunami,
such as the one that killed more than 200,000 on 26 December 2004, was
very small this time. Seismologists expect the next big quake in the region
to occur farther north than where this one struck, but they note there
is a large earthquake potential everywhere along the Indonesian subduction
zone. How large was the quake? After the quake came a number of aftershocks,
but only the big one sparked a tsunami. The quake was initially reported
as having a magnitude of 7.1 or 7.2, depending on the source. This was
upped to 7.7 in the following days. Is that magnitude unusual? As Java's
stunned residents have been reminded, major quakes are a fact of life in
Indonesia. The archipelago is part of the Pacific 'ring of fire' — a vast,
40,000-kilometre horseshoe-shaped array of plate boundaries that also includes
the US west coast. The network produces 90% of the world's earthquakes,
and more than 80% of major ones, defined as those with a magnitude of more
than 7. It's a perfectly normal event for that part of the world. It's
a major quake but nonetheless par for the course. If quakes are that common,
are tsunamis too? Not all quakes cause tsunamis — the sea floor has to
move far and fast enough to spark a wave. But tidal waves are certainly
not unheard of. In June 1994, a quake-sparked tsunami reported to be as
high as 15 metres killed some 200 coastal residents in east Java. In the
devastating December 2004 event, which killed some 200,000 people, water
marks were reportedly found 20 m above sea level. Although eye-witness
accounts tend to be unreliable, reports from the most recent event tell
of waves from 2 to 10 metres tall. Scientists have estimated that the Indian
Ocean might see an ocean-wide, 10-metre wave event (such as the one in
2004) only once every 1,000 years. Is Indonesia prepared for this? A tsunami
warning system has recently been installed in the Indian Ocean, with instrumentation
on buoys coming online this May. That system did successfully issue an
alert. Lives were lost, experts agree, because of a 'last-mile syndrome'
— although the global networks that keep a watch for deadly quakes and
tsunamis are functioning, there is no way yet to efficiently get that information
from officials out to people on the beach. The Indonesian government was
not planning to install sirens on Java until 2007, after first providing
them for Sumatra, the island hit hardest by the 2004 tsunami. There's not
enough money and manpower to do everything at once, so they took the decision
— probably rightly — to do Sumatra first. The US Geological Survey's National
Earthquake Information Center (NEIC) in Denver, Colorado, monitors earthquakes
around the world and relays warnings to some 54,000 contact points in the
event of a large tremor. Are we getting better at detecting such threats?
Spurred in part by the 2004 tsunami, the NEIC recently replaced its 25-year-old
computer system. The latest system is fed with data, via satellite, from
some 475 seismic stations around the globe. Whereas before the 2004 quake
it would have taken an hour to identify the size and location of a large
quake, the task can now be accomplished in 12 to 13 minutes. That said,
magnitude estimates produced by these computers for large quakes still
need to be adjusted by an expert who can take various factors into account,
including the size and depth of the fault rupture. Once you get into the
sevens or bigger, you need techniques that are not automated yet. Researchers
are also working on ways to use Global Positioning Systems to more quickly
calculate how far the Earth moved during a quake, to get a better, faster
estimate of its true sizeref.
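For context on why such revisions matter, moment magnitude is logarithmic in
seismic moment, so a change of half a magnitude unit corresponds to a several-fold
change in released energy. A minimal sketch using the standard textbook relations
(not NEIC code); the 7.2 and 7.7 values echo the initial and revised estimates
quoted above :

def moment_from_magnitude(mw: float) -> float:
    """Seismic moment M0 (N*m) from moment magnitude: Mw = (2/3) * (log10(M0) - 9.1)."""
    return 10 ** (1.5 * mw + 9.1)

def energy_ratio(mw_high: float, mw_low: float) -> float:
    """Radiated energy scales roughly as 10 ** (1.5 * Mw)."""
    return 10 ** (1.5 * (mw_high - mw_low))

print(f"M0 at Mw 7.2: {moment_from_magnitude(7.2):.1e} N*m")               # ~7.9e+19
print(f"Mw 7.7 releases ~{energy_ratio(7.7, 7.2):.0f}x the energy of 7.2") # ~6x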
rogue waves : the towering walls of water that, some experts suspect,
sink tens of ships every year
CO2-rich lake bottoms : researchers involved in an unusual
project to prevent natural disasters in 2 African lakes say the first few
years of the attempt have gone well, but have not yet made things safe.
The project aims to remove
dangerous levels of carbon dioxide from the bottom of 2 lakes sitting
over volcanic sites in Cameroon. One of these, Lake Nyos, exploded
in 1986, suffocating > 1,700 people in the surrounding area with a plume
of CO2. After the explosion, scientists realized that a landslide
or some other event had caused water rich in CO2 to rise from
the safety of the deeps. As its pressure decreased, the rising water released
the gas dissolved within it, in the same way that bubbles form in champagne
once the bottle is opened. As a result, the water shot to the surface,
suddenly releasing fatal amounts of CO2 into the atmosphere.
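A rough way to see why depth keeps the gas harmless and why rising water erupts :
dissolved CO2 stays in solution only while its gas pressure remains below the
local hydrostatic pressure. A minimal illustrative sketch in Python; the 4-atmosphere
dissolved-gas value is an assumption for the example, not a measurement from
Lake Nyos :

P_ATM = 101_325.0   # Pa
RHO_WATER = 1000.0  # kg/m^3
G = 9.81            # m/s^2

def hydrostatic_pressure(depth_m: float) -> float:
    """Total pressure (Pa) at a given depth in the lake."""
    return P_ATM + RHO_WATER * G * depth_m

def exsolves(dissolved_gas_pressure_pa: float, depth_m: float) -> bool:
    """Bubbles can nucleate once dissolved gas pressure exceeds local pressure."""
    return dissolved_gas_pressure_pa > hydrostatic_pressure(depth_m)

# Water holding an assumed 4 atm of dissolved CO2 stays put at 80 m and 40 m,
# but erupts if lifted to 20 m depth.
gas_p = 4 * P_ATM
for depth in (80, 40, 20):
    print(depth, "m:", "erupts" if exsolves(gas_p, depth) else "stays dissolved")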
To solve the problem, an international team decided to pipe out the CO2-rich
water in a controlled way. The first pipe was installed in 2001 in Lake
Nyos. Another pipe was deployed in 2003 in nearby Lake Monoun, which
killed about 40 people when it exploded in 1984. Some worried that the
pipes would destabilize the lakes and cause another dangerous release of
CO2. This prompted the installation of a webcam to monitor the
Lake Nyos pipe, and a system to shut down the pipe remotely in case of
problems. In both lakes, a warm, light layer of water stays on the surface,
acting as a cap for the gas-heavy water below. This keeps the carbon dioxide-rich
water locked up in depths below about 40 m. Although gas continues to seep
into the lowest waters owing to volcanic activity, it stays dissolved thanks
to the high pressures. Thankfully the pipes have not altered this situationref.
It now looks as though nothing is likely to happen, but I still feel the
lake should be treated with great care. The single pipes have reduced the
lakes' CO2 content by 12-14% compared with levels right before
they were installed, but that's not enough. Right now, there is thought
to be more CO2 in the lakes than was released 20 years ago.
Only a reduction of 80-90% will ensure that a similar explosion won't happen
again, but the pipes become less efficient as the gas pressure decreases.
The modelling shows that in another 5 years with 4 more pipes Lake Nyos
would be degassed to safe levels. In Monoun it would take 2 years with
1 more pipe. Safe levels are reached once the gas pressure is the same
as the atmospheric pressure at the lake surface. The research team had
always planned to install more pipes, but that will cost between US$1 million
and $2 million. And Lake Nyos faces further problems. A dam at the lake's
edge is weakening; if it gives, the leak could lower the lake's water level
by 40 m, causing a flood that would affect 5,000 people. The lake could
be drained to prevent this, but the CO2 would have to be safely
siphoned off first.
wildfires : homes > 700 m from the adjacent
bushland are generally safe, and the percentage of houses burned down decreases
linearly with distance from the edge of the bush. Around 60% of the homes
within the first 50 m are destroyed. Most of the houses are set ablaze
by wind-borne embers rather than by radiant heat or direct contact with
the fire.
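Read literally, those figures imply a simple linear decline in risk with distance
from the bush edge. A hedged sketch of that relationship in Python; the 60%
figure within 50 m and the ~0% figure at 700 m come from the sentences above,
but treating the decline as exactly linear between those points is an assumption
made only for illustration :

def destruction_probability(distance_m: float) -> float:
    """Approximate chance a home is destroyed, interpolated linearly between
    ~60% within the first 50 m of the bush edge and ~0% at 700 m."""
    if distance_m <= 50:
        return 0.60
    if distance_m >= 700:
        return 0.0
    return 0.60 * (700 - distance_m) / (700 - 50)

for d in (25, 100, 350, 700):
    print(f"{d:>4} m from the bush edge: ~{destruction_probability(d):.0%} destroyed")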
Are fiercer wildfires driven by climate change or poor forest management? Scientists and the media have been debating this burning question
for decades. Now a large study of recent western US forest fires shows
that, for the Rockies at least, climate is to blame. Every year, forest
fires burn hundreds of homes, severely damage natural resources, and attract
more than US$1 billion in fire-fighting costs across the USA. As the fires
rage more strongly, so does the debate about their causes. Some explanations
cite changing land-use patterns: livestock grazing and extensive logging
in the early 20th century was followed by forest regrowth and accumulation
of burnable matter, which has increased the ferocity and spread of wildfires.
But growing amounts of scientific research have indicated that local climate
changes may also be to blame. A large-scale study of western US forest
fires shows what many had suspected: forest-fire activity has dramatically
increased in recent years, particularly in the northern Rockies, in step
with local climate change. Most people studying this are ecologists, not
climatologists. We looked at things on a much larger scale; only then do
you start seeing patterns. The researchers looked at 1,166 large forest
wildfires in the western USA over 34 years. They found a dramatic jump
in fire behaviour after 1987. In the 17-year period following that date,
wildfire frequency was nearly four times higher, and area burned six times
higher, than in the previous 17 years. The team compared their year-on-year
forest fire data with the corresponding spring and summer temperatures
and seasonal timing of snowmelt. Forest fires in the northern Rockies,
which accounted for most of the wildfire increase, were strongly associated
with increased temperatures, when snows melted earlier and dry seasons
lasted longer. This is one of the first big indicators of climate-change
impacts in the continental USA. Lots of people think climate change and
the ecological responses are 50 to 100 years away. But it's happening now
in forest ecosystems through fire. The study has implications for current
US fire suppression policies, which focus on removing shrubs and selectively
logging trees to create open forests. Because local climate is the driving
factor behind fires in the Rockies, this vegetation management may not
work. There is always a tendency to have a 'one size fits all' policy, 'selectively
log everywhere', for example, but that would be inappropriate in places
such as the Rockies. In the Rockies, lodgepole pine and spruce-fir forests
are densely packed. But trees in the ponderosa pine forests of the southwest
of the study region are more sparsely scattered. Fires in these open forests
are generally carried in kindling near the forest floor. Even so, the team
found that climate oscillations between wet and dry periods (creating lush
plants that can later burn) were a factor in these wildfires too. For the
areas most affected by climate change, it looks like things can only get
worse. All climate model projections suggest that hotter springs and summers
will be coming to the region in the next few decades if global warming
trends continue. The researchers suggest that more large wildfires could
burn enough biomass to make western US forests produce more carbon dioxide
than they soak up. It is alarming that forests in the West and elsewhere may
become important carbon sources in the greenhouse-gas story, something
that is not included in current climate projections. One test of the Westerling
findings will be to examine fire-climate linkages on longer time scales,
looking back to tree-ring and lake-sediment records when climate and greenhouse
gases were quite differentref.
icebergs : the plodding course of the Long
Island-sized iceberg, which scientists predicted would crash into an Antarctic
glacier by 15 Jan 2005, has been halted indefinitely. The iceberg, called
B-15A,
has run aground about 4 km from the Drygalski Ice Tongue and is
just jiggling back and forth. It will probably stay like this for some
time. Some scientists are concerned that B-15A will inhibit the movement
of ships in McMurdo Sound, and force penguins to travel farther for food.
But there is no reason for concern. B-15A is nowhere near the penguin colonies
that are being studied. Ships are not currently being affected, and we
have no reason to think that this will change. In March 2000, the longest
iceberg ever seen broke off the Ross Ice Shelf, a floating mass of ice
east of McMurdo Station. In a few weeks the iceberg, named B-15, split
into several pieces, the largest of these being B-15A. As snow accumulates
on the mainland, it compacts and forms ice. As ice builds up, it flows
outward into floating shelves and is sloughed off the front of these as
an iceberg. The rate of iceberg formation depends on how much snow falls.
Global warming has increased the number of icebergs breaking away from
the Antarctic mainland, but B-15's calving is still thought to be a natural
occurrence.
Natural catastrophes that can lead to loss of the ED itself are often overlooked
in disaster plans and drills. Plans should address evacuation of patients,
including transportation and equipment to take, and drills should include
these scenarios. You should have alternatives to cope with loss of power
and communications, such as oxygen-powered backup generators. To prepare
for a deluge of individuals coming to the ED after a disaster, decide how
to utilize volunteers, and limit access by using disaster identification
vests. New Environment of Care standards from the Joint Commission on Accreditation
of Healthcare Organizations require you to perform a hazard and vulnerability
analysisref.
Cadaver management
: sometimes, natural or man-made disasters may mean that the normal disposal
procedures cannot be followed. Under these circumstances, care must be
taken to ensure that the disposal of human remains does not expose an already
stressed population to further risks. Ideally bodies should be cremated,
but if this is not possible, burial with at least 1 metre of earth over
the cadavers (to prevent access by scavengers and pests) is a satisfactory
alternative. Religious and social practices should be followed as far as
possible. Burial sites must be chosen to avoid the risk of contaminating
water sources. The major hazard facing emergency service
personnel is spilt blood and any risk can be greatly reduced by preventing
contact with blood (use of gloves, face and eye protection, and protective
clothing where necessary). Bodies that have been decaying for some time
present little risk. The organisms likely to be present are their own body
flora and water or environmental organisms. The use of proper protective
clothing will protect personnel handling such material. Bodies should always
be transported to mortuary facilities in waterproof body bags or cleanable,
fluid retentive (e.g. fiberglass) temporary coffins. A report
issued last September by the Pan American Health Organization (PAHO) addressed
the management of dead bodies in disaster situations. Its findings suggest
that the corpses from the 26 December catastrophe pose no serious risk
of spreading infection and disease. Bodies should always be buried in a
way that allows for later exhumation. The use of common graves should be
avoided in all circumstances. The report calls mass burials "a violation
of the human rights of the surviving family members". It is widely believed
that swift burial is the only way to prevent the spread of diseases such
as cholera. But that is a myth, the PAHO report reveals. Cholera does not
appear spontaneously in the body of a person who did not have it to begin
with. And although harmful bacteria or viruses in a corpse can in theory
be spread by rats, flies, fleas and other animals, that doesn't tend to
happen in practice. The temperature of a body falls rapidly after death,
so even the most resistant bacteria and viruses die quickly once their host
has died. Past experience shows that unburied dead bodies pose a negligible
risk to those who do not come into physical contact with them. Handling
of bodies by relief workers does, of course, require protective clothing.
The report recommends that bodies be carefully reported and tagged before
being placed in individual body bags. Changing beliefs about the need for
mass graves will be a slow business. It took 2-3 decades for people in
the Western world to understand that it is best not to move a person injured
in a street accident, for example. To change ideas at a global level takes
a lot of time. Therefore, the utmost priority from a public health point
of view is not to bury all the dead immediately (which is what people on
TV seem to be telling us), but to restore the safety of the water supply.
In fact, for many of the affected countries, it is not a matter of "restoring"
but of establishing a safe source of water. It will also be important
to pay attention to environmental issues like vector control, waste disposal,
and hygiene education.
Disaster
victim
identification
: when the devastating tsunami struck the Indian Ocean in December 2004,
it was not DNA that identified most of the victims but traditional forensic
methods such as comparing dental records. In Thailand, for example, DNA
techniques put names to < 1% of the victims. The scale of the disaster
made the detection effort particularly difficult. Teams were dealing with
thousands of bodies in a hot, wet climate, where roads and other infrastructure
had been destroyed and lab facilities were virtually non-existent. In other
recent disasters, such as 9/11 and the massacres in the former Yugoslavia,
DNA identification proved to be the most useful tool. But in Thailand neither
the time nor the facilities were available. The public perception is that DNA would be the be-all and
end-all of the identification process. To get 1500 identifications with
DNA you would need to process as many as 8000 samples, because of the need
to profile several relatives for each corpse. That is where the dentists
came in. In Thailand some 75% of bodies were identified using dental records,
10% by fingerprints and just 0.5% using DNA profiling. For the remainder
a combination of techniques was used. One advantage teeth have over DNA
is that they can be easily stored to be compared with dental records later.
Without refrigeration, DNA samples would have quickly degraded. Forensic
odontologists were still working in Phuket months after the disaster. By
February, they had identified > 400 people with dental records. By April
> 1200 had been identified, and by July the number had reached 1700. By
September 2200 had been identified, with 3300 bodies still unidentified.
Dental records could not be found for all the Thai victims. In these cases
the investigators made tentative identifications from photos of the victim
smiling. The successes were achieved despite some foreign forensic teams'
practice of concentrating on their own nationals. A few pathologists who
spoke to New Scientist complained that teams from some countries sorted
through the piles of bodies looking for their own citizens. Some foreign
teams ignored the corpses of local victims. There were 1-2 who never reformed
and genuinely couldn't or wouldn't acknowledge that they were doing anything
wrong... They were parasitising on the rest of us. By contrast, Clement
singles out the New Zealand team for praise. New Zealand only lost 3 people,
yet their dental teams were there in numbers for a very long time helping
everyone else quite selflessly.
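A rough sketch of the arithmetic behind the DNA figures quoted above may help; the split between a victim sample and relative reference profiles is an inference from the 8000-samples-for-1500-identifications estimate, not a figure reported by the forensic teams:

    # Why DNA identification consumes far more samples than it yields names:
    # each victim needs a post-mortem sample plus reference profiles from
    # several relatives. Figures are from the text; the per-victim breakdown
    # below is an inference, not a reported number.
    identifications_targeted = 1500
    samples_required = 8000
    samples_per_identification = samples_required / identifications_targeted
    print(f"~{samples_per_identification:.1f} samples per DNA identification")
    # i.e. roughly one victim sample plus 4-5 relative reference profiles

    # Reported method breakdown for identifications in Thailand:
    methods = {"dental records": 75.0, "fingerprints": 10.0, "DNA profiling": 0.5}
    for method, share in methods.items():
        print(f"{method}: {share}% of identifications")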
Web resources :
The headline concern for the tsunami victims, living and dead, will be
with us for a few weeks; by contrast, their need for assistance and reform
will stretch over decades. Huge sums have been pledged in aid and a bold
commitment made to build an Indian Ocean tsunami warning system. But, if
past is precedent, only a fraction of recent pledges will materialise and
the already overdue warning system will remain a pipe dream for the affected
communities. The headlines rightly applaud the compassionate outpouring
of the public around the world but fail to question the logic of promoting
one-off giving from individuals rather than sustained involvement by governments.
Disasters are part of normality, and if we are to have a long-lasting effect
we need to rethink the way aid is delivered and invest in development to
help minimise the effects of natural phenomena.
The pledging of $5bn (£2.6bn; €3.8bn) for survivors of the tsunami
only 3 weeks after the event is an impressive expression of global concern.
The track record of delivering on such commitments, however, is anything
but reassuring:
Bam earthquake, Iran (data from E Mansilla, Universidad Nacional Autónoma
de México): $1bn pledged in January 2004; only $116m had materialised by
December 2004.
Pledges fail to materialise for a host of reasons, some rational but some
indefensible. Countries affected by disaster may have difficulty in absorbing
huge aid flows quickly. Disasters destroy the infrastructure vital to delivering
aid. Many of the civil servants, technicians, and local leaders who would
normally carry out rehabilitation work may be killed in the disaster, as
they were in the Rwandan genocide and the tsunami. Donors may attach conditions
that make contributions difficult to use. These conditions may reflect
a policy agenda (such as requiring the recipient country to respect human
rights) or express a commercial objective (such as mandating purchase of
goods and services from a given donor government). Funds pledged may pay
for items that never quite make it into the disbursement ledger, such as
the costs of military flights, shipping, and human resources for disaster
response. Some of what gets counted as assistance actually takes the form
of loans, further increasing poor countries' external debt. The dispassionate
observer may be forgiven for wondering whether donors are motivated by
the need to be viewed as pledging or by the desire to make a real difference
in the lives and livelihoods of affected populations. The present funding
and aid effort around the Indian Ocean is likely to meet most of the vital
needs for survival. We should be less optimistic, however, about the medium
and long term. The logic of disaster response tells us that funds should
be pledged for identified needs. Needs are identified by local communities,
affected governments, non-governmental organisations, and the UN. The UN
produces, for most major disasters, a consolidated appeal that pulls together
the needs identified by all its agencies in collaboration with the affected
communities. The requests are, however, rarely met in fullref
:
Funds requested by UN for disasters and % not met, 2000-2004

                                                     2000    2001    2002    2003    2004
Funds appealed for (total of all appeals, $bn)        1.9     2.6     4.4     5.2     3.4
Average % of appeal not met                          40.8    44.7    23.5    24.2    40.0
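To make the table more concrete, a minimal Python sketch (illustrative only, and assuming the yearly average shortfall percentage can be applied to the whole appeal total, which is a simplification) converts the percentages into approximate unfunded amounts:

    # Approximate the unmet portion of UN consolidated appeals, 2000-2004,
    # using the appeal totals ($bn) and average % of appeal not met above.
    appeals_bn = {2000: 1.9, 2001: 2.6, 2002: 4.4, 2003: 5.2, 2004: 3.4}
    pct_not_met = {2000: 40.8, 2001: 44.7, 2002: 23.5, 2003: 24.2, 2004: 40.0}

    for year in sorted(appeals_bn):
        shortfall = appeals_bn[year] * pct_not_met[year] / 100
        print(f"{year}: roughly ${shortfall:.1f}bn of the ${appeals_bn[year]}bn appealed for went unfunded")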
The UN appeals are not wish lists; they represent real and immediate needs,
and usually minimum needs at that. Any shortfall in funding should be seen
as a failure on the part of the international community. Sadly, aid for
long term needs rarely represents new money. Edward Clay, senior research
associate at the Overseas Development Institute, London, noted immediately
after the tsunami: "The research evidence is that the immediate response
to natural disasters involves some new money, but that rehabilitation needs
are often met by switching aid money between uses rather than increasing
total aid to the countries affected."ref
His speculation is proving correct. The United States has pledged $350m,
and in early January had committed $44.6m of that to actual programmingref.
Yet US Agency for International Development officials have acknowledged
on national news programmes that these funds will come from existing aid
budgets rather than new money. This means that those funds cannot now be
spent on other priorities—for example, in Sudan or Bam. However, the US
administration has so far resisted the suggestion that supplemental funding
be sought to respond to the emergency. The present system of tallying aid
flows distorts the apparent generosity of different parts of the world,
playing up that of Western governments and playing down that of the general
public and smaller nations. The figures for funds pledged
and spent usually derive from the UN's financial tracking system and, for
historical data, from the Organisation for Economic Cooperation and Development's
development assistance committee. These sources systematically underestimate
global humanitarian giving. Public donations to non-governmental organisations
are not reflected fully. Estimates suggest that Western nongovernmental
organisations may raise as much as $2bn annually from the public. Analyst
Abby Stoddard estimates that globally some 75% of the humanitarian resources
of non-governmental organisations come from the general publicref.
Funds from Islamic agencies and Islamic government-to-government funding
also tend to be minimised in normal aid reporting. Another undercounted
source is the considerable funding from diaspora groups around the world.
In early 2001, after the earthquake in Gujarat, expatriate Gujaratis provided
hundreds of millions of dollars above and beyond their normal remittances
from overseasref.
Finally the investment made by disaster survivors themselves in rebuilding
their own communities is rarely calculated. One of the promising and noteworthy
developments from the tsunami response has been the contributions from
middle class people in India to relief efforts there and in neighbouring
countriesref.
On the government side, the first ever contribution by the government of
China to an international relief mobilisation also bodes well for more
inclusive and participatory international arrangements. The present system,
however, will not necessarily capture these figures. The focus during the
days after the tsunami, at least in the United States, has been on promoting
private donations. The US president, joined by the past two presidents,
has been appealing on television for citizens to dig deep and help out.
Although the response from the American public has been overwhelming, such
appeals hint at responsibility shifting. The emotional response to high
profile suffering ensures a generous public response. That motivation,
however, is insufficient as a source of funding for rehabilitation, prevention
and reduction of future disasters, and tackling long term vulnerability.
These more complex efforts should be seen as the common responsibility
of states. Governments should not be allowed to shed this responsibility
by appealing for private donations. Donor agencies reinforce the problem
of longer term redevelopment by putting a premium on rapid disbursement
of emergency funds. They have comparatively few funding lines available
for rehabilitation and redevelopment activities. Redevelopment should include
measures to mitigate the effects of a given disaster and to prepare to
respond to the next disaster. In most development planning, however, disaster
resilience, particularly for the poorest and most vulnerable communities,
isn't on the agenda. Changes in approaches are needed that take people's
livelihood strategies, knowledge, motivations, and other resources seriously,
acknowledging the reality of disaster hazards and building disaster resilience
into national development plans. Past experience suggests that little of
the $5bn pledged for the tsunami is likely to go to the excellent but unheralded
work of local non-governmental organisations that have been trying to provide
increased security for the poorest households through microcredit, literacy
training, and citizen based identification of hazards. An example of the
small but high impact schemes that are often overlooked by major donors
is provided by the US non-governmental organisation, Grassroots International,
which has a partnership with the Sri Lankan union of small-scale fishermen
to help repair and replace boats and nets. Disaster proofing also involves
a great deal of risk judgment. What if the disaster never happens? Ninety
per cent of all recorded tsunamis occur in the Pacific. Until now, governments
have viewed the extension of the Pacific tsunami warning system (which
has existed since 1949) into other ocean basins as a poor use of money
on the basis of costs and benefits. This narrow view, however, ignores
the possibility of merging cyclone and tsunami warning systems. Cyclones
are much more common than tsunamis in the Indian Ocean, and a combined
system would make sense. Geologists, seismologists, and tsunami experts
have failed to interact with cyclone experts, and governments have been
slow to take up their various suggestions. In 2003, an Australian geologist
sought to get the UN's international coordination group for the tsunami
warning system in the Pacific to include the Indian Ocean. In a decision
that allowed governments to appear to be acting without really doing so,
the group voted "to establish a sessional working group to prepare a recommendation
to establish an intersessional working group that will study the establishment
of a regional warning system for the southwest Pacific and Indian Ocean".
Calls from Indian nongovernmental organisations for improvements in the
warning system and the extension of cyclone warning to include tsunamis
after the supercyclone disaster in Orissa state in 1999 were also ignored
(K A Aryal, Disaster and Development Centre, Northumbria University, personal
communication). When the smoke and mirrors clear, the minutes of such international
groups reveal a timidity among governments unwilling to take the necessary
preventive steps. We hope that the discussions on reducing natural disasters
at the World Conference on Disaster Reduction in Kobe on 18-22 January
(www.unisdr.org/wcdr) will be animated by urgency and energy born of recent
events. The global picture on natural disasters is changing. Data compiled
by the Centre for Research on the Epidemiology of Disasters at Louvain
University, Brussels, paints a disturbing picture. Over the past two decades,
the annual deaths from disaster have fallen by around 30% whereas the number
of people affected by disaster has gone up by 59%. The data also show that
the number of disasters, almost all meteorological in nature, is steadily
rising year on year, from an average of 150 a year in 1980 to over 450
a year today. Although some of this increase may be due to better reporting,
a substantial part represents a very real phenomenon. Even more striking
is the relation between development and disaster vulnerability. Analysng
2557 disasters between 1991 and 2000, the International Red Cross and Red
Crescent Federation found that in countries with high human development
indices there were 23 deaths per disaster, whereas in countries with low
indices 1052 people died per disaster. In other words, development is an
investment in disaster mitigation. The fall in deaths from disasters is
partly explained by better warning systems and preparedness for predictable
flooding. The data also signal the reluctance of governments to sustain
high death tolls in disasters. In the past (for example, Ethiopia in the
1970s), high death rates in natural disasters have led directly to uprisings
and the overthrow of regimes. Organised attempts to reduce death rates
immediately after disasters have characterised relief work over the past
30 years. The rise in the number of people affected, and therefore the
numbers needing to rebuild and facing future disasters, is concerning.
The increase may reflect three factors. The first is climate change, which
is producing more frequent extreme meteorological events and an inexorable
rise in sea level. The second is the global trend of urbanisation, with
most growth being in shantytowns and marginalised areas of cities and most
large cities being located in coastal areas. Thirdly, the complexity of
the development process tends to increase vulnerability to disaster. Miserable
local and national governance (as seen in Haiti in last year's floods and
landslides), debt policies, the structure of donor aid, the dynamics of
population growth and forced displacement, and the growing effect of economic
globalisation all have their effect. What is needed is a completely transformed
arrangement for responding to large scale disasters. We propose a new system
with three main features based on the premise that disasters are here to
stay and will radically affect development. Firstly, UN relief agencies
should be funded by assessed contributions from member countries rather
than having to appeal for money after each disaster. UN peacekeeping operations,
equally vital to saving lives, are already successfully funded on this
basis. Such a funding mechanism would still leave room for governments
to make supplementary voluntary payments, but it would also allow agencies
to build reserve funds, to invest in training, and to act more quickly
and save more lives. Secondly, the system for tracking humanitarian pledges,
donations, and spending should be greatly refined and extended. It is morally
inexcusable for pledges of funds to fade away as the cameras move on and
for agreed rehabilitation and disaster proofing programmes to be halted,
half funded. States and the global public need a financial tracking system
that can trumpet when such abuses are taking place. The system also needs
to be adapted to capture the funding provided by the public, which is often
of the same order of magnitude as state funding. Thirdly, the mindset
of economic development in areas prone to disaster needs to be radically
changed. In northern countries, all new development programmes examine
the risk from disaster and seek to protect infrastructure and economic
processes from their worst effects. But in the south, where the economic
costs of disaster are lower but human costs higher, no such attitude prevails.
This is the most fundamental challenge. States, corporations, and donor
agencies need to ensure that vulnerability to disaster is taken into account
in economic development strategies and incorporate local knowledge and
capacity for self protection. If they have the will to succeed, the
next natural hazard could remain just that—a hazard, not a devastating
disaster that combines with profound vulnerability to leave massive
death and destruction in its wakeref.
Inequity in the scale of response poses other problems. Over the past decade,
about half of the $2bn (£1bn; €1.5bn) committed to the Inter Agency
Standing Committee Consolidated Appeals went to high profile crises such
as those occurring in Bosnia, Afghanistan, and Kosovo. Other countries
affected by chronic conflict, such as Liberia and Somalia, received much
less per person affected, although their needs are at least as great. Rapid
onset disasters can also trigger a series of responses that are influenced
more by emotions or political motives than by evidence based assessments
of needsref.
Some of these responses are harmful and can add to the suffering or chaos,
such as rapid burials in mass graves because of unwarranted fear of epidemicsref.
It is harder to raise funds for disaster prevention, preparedness, and
mitigation than for response, even though there is strong evidence that
response costs several times more to achieve the same effect. A dollar
spent on constructing hospitals and houses to withstand natural hazards,
such as hurricanes or floods, can save an estimated $5-7 on rebuilding
after severe damageref.
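As a back-of-the-envelope illustration of that cost-benefit estimate (only the $5-7 multiplier comes from the text; the project cost below is hypothetical):

    # Hypothetical $10m spent on hazard-resistant hospitals and housing,
    # multiplied by the $5-7 saved per $1 spent estimate cited above.
    mitigation_spend = 10_000_000  # hypothetical figure, not from the source
    low_multiplier, high_multiplier = 5, 7
    print(f"Estimated avoided rebuilding costs: "
          f"${mitigation_spend * low_multiplier:,} to ${mitigation_spend * high_multiplier:,}")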
We agree with the authors' plea that funding arrangements should be revisited,
but recognise that change will be incremental. We must make use of windows
of opportunity for improvement, including learning from the tsunami experience.
Donors are responding to criticism on inequitable funding allocations through
the Good Humanitarian Donorship Initiative, in which they endorsed a set
of principles and good practices to improve their role in humanitarian
interventionsref.
The initiative also commits to "strengthen the capacity of affected countries
and local communities to prevent, prepare for, mitigate and respond to
humanitarian crises, with the goal of ensuring that governments and local
communities are better able to meet their responsibilities and co-ordinate
effectively with humanitarian partners." The tragedy of 26 December 2004
is providing much evidence to support the rationale behind this goal. Thanks
to strong institutions, and supported by civil society and non-governmental
organisations, the national authorities of countries affected by the tsunami
have been active in health action and coordination since the first
hours of relief. Within four weeks of the disaster occurring, the response
is already shifting to rehabilitation. It is essential that available funds
are also used for a long term strategy, to reduce the vulnerability to
future disasters. Last week's world conference on disaster reduction in
Kobe proposed that 10% of funds for responding to disasters be invested
in strengthening preparednessref.
Much expertise exists in this field, and programmes like the Leaders courses
can enhance capacity to develop disaster reduction programmes by strengthening
the strategic management, leadership, and analytical skills of participantsref.
Governments in Latin America now help their neighbours when they are hit
by hurricanes and other natural disasters. Similar capacity is urgently
needed in other parts of the world. Disasters undermine and reverse development,
particularly in fragile states, where some indicators for the millennium
development goals are in declineref.
Adequate funding is needed to make progress and we should encourage donor
countries to achieve the Monterrey consensus of providing 0.7% of their
gross domestic product as development aid to the poorest countriesref.
New ways must be found to coordinate and invest development and humanitarian
funds in fragile states, so that communities at risk can receive adequate
support to improve their livelihoods and reduce their vulnerabilitiesref.
Benfield Hazard Research
Centre (BHRC) at University College London comprises 3 groups: Geological
Hazards, Meteorological Hazards & Seasonal Forecasting, and Disaster
Studies & Management.
Sphere Project was launched
in 1997 by a group of humanitarian NGOs and the Red Cross and Red Crescent
movement. Humanitarian Charter and Minimum Standards in Disaster Response