
Emergency measures in relieving famine primarily include providing the micronutrients in which sufferers are deficient, such as vitamins and minerals, through fortified sachet powders or directly through supplements. Aid groups have begun to use a famine relief model based on giving cash or cash vouchers to the hungry to pay local farmers, rather than buying food from donor countries, as the latter distorts local food markets.
Long-term measures include investment in modern agriculture techniques, such as fertilizers and irrigation, which largely eradicated hunger in the developed world. World Bank strictures restrict government subsidies for farmers, and increasing use of fertilizers is opposed by some environmental groups because of its unintended consequences: adverse effects on water supplies and habitat.
Food shortages in a population are caused either by a lack of food or by difficulties in food distribution; they may be worsened by natural climate fluctuations and by extreme political conditions related to oppressive government or warfare. One of the proportionally largest historical famines was the Great Famine in Ireland, which began in 1845 because of potato disease and occurred even as food was being shipped from Ireland to England, because only the English could afford to pay the higher prices. Historians have recently revised their assessments of how much control the English could have exercised in reducing the famine, finding that they did more to try to help than is generally understood.

The conventional explanation for the cause of famines until 1981 was the food availability decline (FAD) hypothesis: the assumption that the central cause of all famines was a decline in food availability. However, FAD could not explain why only certain sections of the population, such as agricultural laborers, were affected by famines while others were insulated from them. Based on studies of recent famines, the decisive role of FAD has been questioned, and it has been suggested that the causal mechanism precipitating starvation involves many variables beyond a decline in food availability. According to this view, famines result from failures of entitlements, a theory known as the "failure of exchange entitlements" (FEE). A person may own various commodities that can be exchanged in a market economy for the other commodities he or she needs; the exchange can happen through trade, production, or a combination of the two, and these entitlements are called trade-based or production-based entitlements. On this view, famines are precipitated by a breakdown in a person's ability to exchange his or her entitlements. An example of a famine due to FEE is the inability of an agricultural laborer to exchange his primary entitlement, labor, for rice when his employment becomes erratic or is eliminated entirely.
Some elements make a particular region more vulnerable to famine.
In certain cases, such as the Great Leap Forward in China (which produced the largest famine in absolute numbers), North Korea in the mid-1990s, or Zimbabwe in the early 2000s, famine can occur as an unintentional result of government policy. Malawi ended its famine by subsidizing farmers against the strictures of the World Bank. In France, the Hundred Years' War, crop failures and epidemics reduced the population by two-thirds.
The failure of a harvest or change in conditions, such as drought, can create a situation whereby large numbers of people continue to live where the carrying capacity of the land has temporarily dropped radically. Famine is often associated with subsistence agriculture. The total absence of agriculture in an economically strong area does not cause famine; Arizona and other wealthy regions import the vast majority of their food, since such regions produce sufficient economic goods for trade.
Famines have also been caused by volcanism. The 1815 eruption of the Mount Tambora volcano in Indonesia caused crop failures and famines worldwide, producing the worst famine of the 19th century. The current consensus of the scientific community is that aerosols and dust released into the upper atmosphere cause cooler temperatures by preventing the sun's energy from reaching the ground. The same mechanism is theorized to follow very large meteorite impacts, to the extent of causing mass extinctions.
Beginning in the 20th century, nitrogen fertilizers, new pesticides, desert farming, and other agricultural technologies began to be used to increase food production, in part to combat famine. Between 1950 and 1984, as the Green Revolution influenced agriculture, world grain production increased by 250%. Much of this gain is non-sustainable. Such agricultural technologies temporarily increased crop yields, but as early as 1995 there were signs that they may be contributing to the decline of arable land (e.g. persistence of pesticides leading to soil contamination and a decline in the area available for farming). Developed nations have shared these technologies with developing nations facing famine, but there are ethical limits to pushing such technologies on less developed countries, often because of the association of inorganic fertilizers and pesticides with a lack of sustainability.
David Pimentel, professor of ecology and agriculture at Cornell University, and Mario Giampietro, senior researcher at the National Research Institute on Food and Nutrition (INRAN), place the maximum U.S. population for a sustainable economy at 200 million in their study Food, Land, Population and the U.S. Economy. To achieve a sustainable economy and avert disaster, the study says, the United States must reduce its population by at least one-third, and world population will have to be reduced by two-thirds. The authors believe that this agricultural crisis will only begin to affect us after 2020, and will not become critical until 2050. The coming peak of global oil production (and subsequent decline), along with the peak of North American natural gas production, will very likely precipitate this agricultural crisis much sooner than expected.
Geologist Dale Allen Pfeiffer claims that coming decades could see spiraling food prices without relief and massive starvation on a global scale never experienced before. Water deficits, which are already spurring heavy grain imports in numerous smaller countries, may soon do the same in larger countries such as China or India. Water tables are falling in scores of countries (including northern China, the US, and India) owing to widespread overpumping with powerful diesel and electric pumps; other affected countries include Pakistan, Iran, and Mexico. This will eventually lead to water scarcity and cutbacks in the grain harvest. Even with the overpumping of its aquifers, China has developed a grain deficit, contributing to the upward pressure on grain prices. Most of the three billion people projected to be added worldwide by mid-century will be born in countries already experiencing water shortages.
After China and India, there is a second tier of smaller countries with large water deficits: Algeria, Egypt, Iran, Mexico, and Pakistan. Four of these already import a large share of their grain; only Pakistan remains marginally self-sufficient, but with a population expanding by 4 million a year, it too will soon turn to the world market for grain. According to a UN climate report, the Himalayan glaciers that are the principal dry-season water sources of Asia's biggest rivers (the Ganges, Indus, Brahmaputra, Yangtze, Mekong, Salween and Yellow) could disappear by 2035 as temperatures and human demand rise. It was later revealed that the source used by the UN climate report actually stated 2350, not 2035. Approximately 2.4 billion people live in the drainage basins of the Himalayan rivers. India, China, Pakistan, Afghanistan, Bangladesh, Nepal and Myanmar could experience floods followed by severe droughts in the coming decades. In India alone, the Ganges provides water for drinking and farming for more than 500 million people.
Relief technologies, including immunization, improved public health infrastructure, general food rations and supplementary feeding for vulnerable children, have provided temporary mitigation of the mortality impacts of famines, while leaving their economic consequences unchanged and not solving the underlying issue of too large a regional population relative to food production capability. Humanitarian crises may also arise from genocide campaigns, civil wars, refugee flows and episodes of extreme violence and state collapse, creating famine conditions among the affected populations.
Despite repeated stated intentions by the world's leaders to end hunger and famine, famine remains a chronic threat in much of Africa and Asia. In July 2005, the Famine Early Warning Systems Network labelled Niger with emergency status, as well as Chad, Ethiopia, South Sudan, Somalia and Zimbabwe. In January 2006, the United Nations Food and Agriculture Organization warned that 11 million people in Somalia, Kenya, Djibouti and Ethiopia were in danger of starvation due to the combination of severe drought and military conflicts. In 2006, the most serious humanitarian crisis in Africa was in Sudan's Darfur region.
Some believed that the Green Revolution was an answer to famine in the 1970s and 1980s. The Green Revolution began in the 20th century with hybrid strains of high-yielding crops. Between 1950 and 1984, as the Green Revolution transformed agriculture around the globe, world grain production increased by 250%. Some criticize the process, stating that these new high-yielding crops require more chemical fertilizers and pesticides, which can harm the environment. However, it was an option for developing nations suffering from famine, since high-yielding crops make it technically possible to feed more people. There are, however, indications that regional food production has peaked in many parts of the world, owing to certain strategies associated with intensive agriculture such as groundwater overdrafting and overuse of pesticides and other agricultural chemicals.
Frances Moore Lappé, later co-founder of the Institute for Food and Development Policy (Food First) argued in Diet for a Small Planet (1971) that vegetarian diets can provide food for larger populations, with the same resources, compared to omnivorous diets.
Noting that modern famines are sometimes aggravated by misguided economic policies, political design to impoverish or marginalize certain populations, or acts of war, political economists have investigated the political conditions under which famine is prevented. Amartya Sen states that the liberal institutions that exist in India, including competitive elections and a free press, have played a major role in preventing famine in that country since independence. Alex de Waal has developed this theory to focus on the "political contract" between rulers and people that ensures famine prevention, noting the rarity of such political contracts in Africa, and the danger that international relief agencies will undermine such contracts by removing the locus of accountability for famines from national governments.
Famine is also accompanied by lower fertility. Famines therefore leave the reproductive core of a population, adult women, less affected than other population categories, and post-famine periods are often characterized by a "rebound" of increased births. Even though the theories of Thomas Malthus would predict that famines reduce the size of the population commensurately with available food resources, in fact even the most severe famines have rarely dented population growth for more than a few years. The mortality in China in 1958–61, Bengal in 1943, and Ethiopia in 1983–85 was made up by population growth within just a few years. Of greater long-term demographic impact is emigration: Ireland was chiefly depopulated after the 1840s famines by waves of emigration.
In modern times, governments and non-governmental organizations that deliver famine relief have limited resources with which to address the multiple situations of food insecurity occurring simultaneously. Various methods of categorizing the gradations of food security have thus been used in order to allocate food relief most efficiently. One of the earliest was the Indian Famine Codes devised by the British in the 1880s. The Codes listed three stages of food insecurity: near-scarcity, scarcity and famine, and were highly influential in the creation of subsequent famine warning or measurement systems. The early warning system developed to monitor the region inhabited by the Turkana people in northern Kenya also has three levels, but links each stage to a pre-planned response to mitigate the crisis and prevent its deterioration, as illustrated in the sketch below.
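As a rough illustration of how such a staged system links each level of food insecurity to a pre-planned response, the following Python sketch maps the three stages of the Indian Famine Codes to example responses; the responses themselves are hypothetical, since the actual interventions used in the Turkana system are not described in this text.

```python
# Illustrative sketch only: the three stage names come from the Indian Famine
# Codes described above; the paired responses are hypothetical examples, not
# the actual interventions of the Turkana early warning system.
STAGE_RESPONSES = {
    "near-scarcity": "intensify monitoring and pre-position food stocks",
    "scarcity": "begin targeted food or cash distribution",
    "famine": "launch full-scale emergency relief",
}

def respond(stage: str) -> str:
    """Return the pre-planned response for a given food-insecurity stage."""
    if stage not in STAGE_RESPONSES:
        raise ValueError(f"unknown stage: {stage!r}")
    return STAGE_RESPONSES[stage]

print(respond("scarcity"))  # begin targeted food or cash distribution
```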
The experiences of famine relief organizations throughout the world over the 1980s and 1990s resulted in at least two major developments: the "livelihoods approach" and the increased use of nutrition indicators to determine the severity of a crisis. Individuals and groups in food-stressed situations will attempt to cope by rationing consumption, finding alternative means to supplement income, and so on, before taking desperate measures such as selling off plots of agricultural land. When all means of self-support are exhausted, the affected population begins to migrate in search of food or falls victim to outright mass starvation. Famine may thus be viewed partially as a social phenomenon, involving markets, the price of food, and social support structures. A second lesson drawn was the increased use of rapid nutrition assessments, in particular of children, to give a quantitative measure of the famine's severity.
Since 2004, many of the most important organizations in famine relief, such as the World Food Programme and the U.S. Agency for International Development, have adopted a five-level scale measuring intensity and magnitude. The intensity scale uses both livelihood measures and measurements of mortality and child malnutrition to categorize a situation as food secure, food insecure, food crisis, famine, severe famine, or extreme famine. The number of deaths determines the magnitude designation, with under 1,000 fatalities defining a "minor famine" and over 1,000,000 deaths defining a "catastrophic famine".
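To make the magnitude thresholds concrete, the following Python sketch classifies a death toll using only the two cut-offs stated above; the handling and naming of intermediate values are assumptions for illustration, not part of the organizations' actual scale.

```python
def famine_magnitude(deaths: int) -> str:
    """Classify a famine's magnitude from its death toll.

    Only the two cut-offs stated in the text are encoded: under 1,000
    deaths is a "minor famine" and over 1,000,000 deaths is a
    "catastrophic famine". The bounds of the intermediate categories
    are not specified here, so they are reported generically.
    """
    if deaths < 1_000:
        return "minor famine"
    if deaths > 1_000_000:
        return "catastrophic famine"
    return "intermediate magnitude (bounds not specified in this text)"

# Example classifications:
print(famine_magnitude(500))        # minor famine
print(famine_magnitude(250_000))    # intermediate magnitude (...)
print(famine_magnitude(2_000_000))  # catastrophic famine
```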
The World Bank and some rich nations press nations that depend on them for aid to cut back or eliminate subsidized agricultural inputs such as fertilizer, in the name of privatization, even as the United States and Europe extensively subsidize their own farmers. Many, if not most, of the farmers are too poor to afford fertilizer at market prices. Fortified foods such as peanut butter sachets (see Plumpy'Nut) and Spirulina have revolutionized emergency feeding in humanitarian emergencies because they can be eaten directly from the packet, do not require refrigeration or mixing with scarce clean water, can be stored for years and, vitally, can be absorbed by extremely ill children. The United Nations World Food Conference of 1974 declared Spirulina 'the best food for the future', and its ready harvest every 24 hours makes it a potent tool to eradicate malnutrition. Additionally, supplements such as vitamin A capsules or zinc tablets to cure diarrhea in children are used.
There is a growing realization among aid groups that giving cash or cash vouchers instead of food is a cheaper, faster, and more efficient way to deliver help to the hungry, particularly in areas where food is available but unaffordable. The UN's World Food Programme, the biggest non-governmental distributor of food, announced that it will begin distributing cash and vouchers instead of food in some areas, which Josette Sheeran, the WFP's executive director, described as a "revolution" in food aid. The aid agency Concern Worldwide is piloting a method through a mobile phone operator, Safaricom, which runs a money transfer program that allows cash to be sent from one part of the country to another. US law, which requires buying food at home rather than where the hungry live, is inefficient because approximately half of what is spent goes on transport. Fred Cuny further pointed out that "studies of every recent famine have shown that food was available in-country — though not always in the immediate food deficit area" and that "even though by local standards the prices are too high for the poor to purchase it, it would usually be cheaper for a donor to buy the hoarded food at the inflated price than to import it from abroad."
Ethiopia has been pioneering a program that has now become part of the World Bank's prescribed recipe for coping with a food crisis and has been seen by aid organizations as a model of how best to help hungry nations. Through the country's main food assistance program, the Productive Safety Net Program, Ethiopia has been giving rural residents who are chronically short of food a chance to work for food or cash. Foreign aid organizations like the World Food Programme were then able to buy food locally from surplus areas to distribute in areas with a shortage of food.
Historians of African famine have documented repeated famines in Ethiopia. Possibly the worst episode occurred in 1888 and succeeding years, as the epizootic rinderpest, introduced into Eritrea by infected cattle, spread southwards, ultimately reaching as far as South Africa. In Ethiopia it was estimated that as much as 90 percent of the national herd died, rendering rich farmers and herders destitute overnight. This coincided with drought associated with an El Niño oscillation, human epidemics of smallpox, and, in several countries, intense war. The great famine that afflicted Ethiopia from 1888 to 1892 cost it roughly one-third of its population. In Sudan the year 1888 is remembered as the worst famine in its history, on account of these factors and also the exactions imposed by the Mahdist state. Colonial "pacification" efforts often caused severe famine, as for example with the repression of the Maji Maji revolt in Tanganyika in 1906. The introduction of cash crops such as cotton, and forcible measures to impel farmers to grow them, also impoverished the peasantry in many areas, such as northern Nigeria, contributing to greater vulnerability to famine when severe drought struck in 1913.
However, for the middle part of the 20th century, agriculturalists, economists and geographers did not consider Africa to be famine prone (they were much more concerned about Asia). There were notable counter-examples, such as the famine in Rwanda during World War II and the Malawi famine of 1949, but most famines were localized and brief. The specter of famine recurred only in the early 1970s, when Ethiopia and the West African Sahel suffered drought and famine. The Ethiopian famine of that time was closely linked to the crisis of feudalism in that country, and in due course helped to bring about the downfall of the Emperor Haile Selassie. The Sahelian famine was associated with the slowly growing crisis of pastoralism in Africa, which has seen livestock herding decline as a viable way of life over the last two generations.
Since then, African famines have become more frequent, more widespread and more severe. Many African countries are not self-sufficient in food production, relying on income from cash crops to import food. Agriculture in Africa is susceptible to climatic fluctuations, especially droughts, which can reduce the amount of food produced locally. Other agricultural problems include soil infertility, land degradation and erosion, swarms of desert locusts, which can destroy whole crops, and livestock diseases. The Sahara reportedly spreads at a rate of up to 30 miles a year. The most serious famines have been caused by a combination of drought, misguided economic policies, and conflict. The 1983–85 famine in Ethiopia, for example, was the outcome of all three of these factors, made worse by the Communist government's censorship of the emerging crisis. In Sudan at the same date, drought and economic crisis, combined with denials of any food shortage by the then-government of President Gaafar Nimeiry, created a crisis that killed perhaps 250,000 people and helped bring about a popular uprising that overthrew Nimeiry.
Numerous factors make the food security situation in Africa tenuous, including political instability, armed conflict and civil war, corruption and mismanagement in handling food supplies, and trade policies that harm African agriculture. An example of a famine created by human rights abuses is the 1998 Sudan famine. AIDS is also having long-term economic effects on agriculture by reducing the available workforce, and is creating new vulnerabilities to famine by overburdening poor households. Conversely, on quite a few occasions in the modern history of Africa, famines have acted as a major source of acute political instability. If current trends of population growth and soil degradation continue, the continent might be able to feed just 25% of its population by 2025, according to UNU's Ghana-based Institute for Natural Resources in Africa.
In China, when the stressed Qing monarchy shifted from state management and direct shipments of grain to monetary charity in the mid-nineteenth century, the system broke down. Thus the 1867–68 famine under the Tongzhi Restoration was successfully relieved, but the Great North China Famine of 1877–78, caused by drought across northern China, was a catastrophe. The province of Shanxi was substantially depopulated as grains ran out, and desperately starving people stripped forests, fields, and their very houses for food. Estimated mortality is 9.5 to 13 million people (Mike Davis, Late Victorian Holocausts).
The largest famine of the 20th century, and almost certainly of all time, was the 1958–61 Great Leap Forward famine in China. Its immediate causes lay in Mao Zedong's ill-fated attempt to transform China from an agricultural nation into an industrial power in one huge leap. Communist Party cadres across China insisted that peasants abandon their farms for collective farms and begin to produce steel in small foundries, often melting down their farm implements in the process. Collectivisation undermined incentives for the investment of labor and resources in agriculture; unrealistic plans for decentralized metal production sapped needed labor; unfavorable weather conditions compounded the problem; and communal dining halls encouraged overconsumption of available food (see Chang, G, and Wen, G (1997), "Communal dining and the Chinese Famine 1958-1961"). Such was the centralized control of information and the intense pressure on party cadres to report only good news, such as production quotas met or exceeded, that information about the escalating disaster was effectively suppressed. When the leadership did become aware of the scale of the famine, it did little to respond, and continued to ban any discussion of the cataclysm. This blanket suppression of news was so effective that very few Chinese citizens were aware of the scale of the famine, and the greatest peacetime demographic disaster of the 20th century only became widely known twenty years later, when the veil of censorship began to lift.
The 1958–61 famine is estimated to have caused excess mortality of about 36 to 45 million, with a further 30 million births cancelled or delayed. It was only when the famine had wrought its worst that Mao was forced to reverse the agricultural collectivisation policies, which were effectively dismantled in 1978. China has not experienced a famine of the proportions of the Great Leap Forward since 1961 (Woo-Cumings, 2002), although there is still widespread malnutrition in many rural areas of China.
The observations of the Famine Commission of 1880 support the notion that food distribution is more to blame for famines than food scarcity. They observed that each province in British India, including Burma, had a surplus of foodgrains, and the annual surplus was 5.16 million tons (Bhatia, 1970). At that time, annual export of rice and other grains from India was approximately one million tons.
In 1966, there was a close call in Bihar, when the United States allocated 900,000 tons of grain to fight the famine. Three years of drought in India resulted in an estimated 1.5 million deaths from starvation and disease.
The Great Persian Famine of 1870–1871 is believed to have caused the death of 1.5 million persons in Persia (present–day Iran), which would represent 20–25 percent of Persia's estimated total population of 6–7 million.
Lebanon became increasingly dependent on food imports from abroad, making the country extremely vulnerable to famine during World War I. By the end of the war, an estimated 100,000 people out of Lebanon's population of 450,000 had died of famine.
During the 17th century, continuing the trend of previous centuries, there was an increase in market-driven agriculture. Farmers, that is, people who rented land in order to make a profit from its produce and who employed wage labour, became increasingly common, particularly in western Europe. It was in their interest to produce as much as possible on their land in order to sell it to areas that demanded that product, and they aimed to produce guaranteed surpluses of their crop every year if they could. Farmers paid their labourers in money, increasing the commercialization of rural society. This commercialization had a profound impact on the behaviour of peasants: farmers were interested in increasing labour input into their lands, not decreasing it as subsistence peasants were inclined to do.
Subsistence peasants were also increasingly forced to commercialize their activities because of rising taxes. Taxes that had to be paid to central governments in money forced peasants to produce crops to sell. Sometimes they produced industrial crops, but they would find ways to increase their production in order to meet both their subsistence requirements and their tax obligations. Peasants also used the new money to purchase manufactured goods. The agricultural and social developments encouraging increased food production took place gradually throughout the sixteenth century, but were spurred on more directly by the adverse conditions for food production that Europe faced in the early seventeenth century: a general cooling trend in the Earth's temperature had begun at the end of the sixteenth century.
The 1590s saw the worst famines in centuries across all of Europe, except in certain areas, notably the Netherlands. Famine had been relatively rare during the 16th century; the economy and population had grown steadily, as subsistence populations tend to do during an extended period of relative peace. Subsistence peasant populations will almost always increase when possible, since peasants will try to spread the work among as many hands as possible. Although peasants in areas of high population density, such as northern Italy, had learned to increase the yields of their lands through techniques such as promiscuous culture, they were still quite vulnerable to famines, forcing them to work their land ever more intensively.
Famine is a very destabilizing and devastating occurrence. The prospect of starvation led people to take desperate measures. When scarcity of food became apparent to peasants, they would sacrifice long-term prosperity for short-term survival. They would kill their draught animals, leading to lowered production in subsequent years. They would eat their seed corn, sacrificing next year's crop in the hope that more seed could be found. Once those means had been exhausted, they would take to the road in search of food. They migrated to the cities, where merchants from other areas would be more likely to sell their food, since cities had stronger purchasing power than rural areas. Cities also administered relief programs and bought grain for their populations so that they could keep order. Amid the confusion and desperation of the migrants, crime often followed, and many peasants resorted to banditry in order to acquire enough to eat.
One famine would often lead to difficulties in the following years because of lack of seed stock, disruption of routine, or perhaps a shortage of available labour. Famines were often interpreted as signs of God's displeasure: they were seen as the removal, by God, of His gifts to the people of the Earth, and elaborate religious processions and rituals were held to avert God's wrath in the form of famine.
The great famine of the 1590s began the period of famine and decline of the 17th century. The price of grain all over Europe was high, as was the population. Various types of people were vulnerable to the succession of bad harvests that occurred throughout the 1590s in different regions. The increasing number of wage labourers in the countryside were vulnerable because they had no food of their own, and their meager wages were not enough to purchase the expensive grain of a bad-crop year. Town labourers were also at risk because their wages would be insufficient to cover the cost of grain, and, to make matters worse, they often received less money in bad-crop years since the disposable income of the wealthy was spent on grain. Often, unemployment resulted from the rise in grain prices, leading to ever-increasing numbers of urban poor.
All areas of Europe were badly affected by the famine in this period, especially rural areas. The Netherlands was able to escape most of its damaging effects, though the 1590s were still difficult years there; actual famine did not occur, because the Amsterdam grain trade with the Baltic guaranteed that there would always be something to eat in the Netherlands, although hunger was prevalent.
The Netherlands had the most commercialized agriculture in all of Europe at this time, growing many industrial crops, such as flax, hemp, and hops. Agriculture became increasingly specialized and efficient. As a result, productivity and wealth increased, allowing the Netherlands to maintain a steady food supply. By the 1620s, the economy was even more developed, so the country was able to avoid the hardships of that period of famine with even greater impunity.
The years around 1620 saw another period of famines sweep across Europe. These famines were generally less severe than the famines of twenty-five years earlier, but they were nonetheless quite serious in many areas. Perhaps the worst famine since 1600, the great famine in Finland in 1696, killed one-third of the population.
Two massive famines struck France between 1693 and 1710, killing over two million people. In both cases the impact of harvest failure was exacerbated by wartime demands on the food supply.
As late as the 1690s, Scotland experienced famine, which reduced the population of parts of the country by at least 15%.
The famine of 1695–96 killed roughly 10% of Norway's population. At least nine severe harvest failures were recorded in the Scandinavian countries between 1740 and 1800, each resulting in a substantial rise of the death rate.
The period 1740–43 saw frigid winters and summer droughts, which led to famine across Europe and a major spike in mortality (cited in Davis, Late Victorian Holocausts, 281). The freezing winter of 1740–41, which led to widespread famine in northern Europe, may owe its origins to a volcanic eruption.
The Great Famine of 1770–1771 killed about one tenth of the Czech lands' population, or 250,000 inhabitants, and radicalized the countryside, leading to peasant uprisings.
Other areas of Europe have known famines much more recently. France saw famines as recently as the nineteenth century. Famine still occurred in eastern Europe during the 20th century.
The frequency of famine can vary with changes in climate. For example, during the Little Ice Age, from the 15th century to the 18th century, European famines grew more frequent than they had been during previous centuries.
Because of the frequency of famine in many societies, it has long been a chief concern of governments and other authorities. In pre-industrial Europe, preventing famine, and ensuring timely food supplies, was one of the chief concerns of many governments, which employed various tools to alleviate famines, including price controls, purchasing stockpiles of food from other areas, rationing, and regulation of production. Most governments were concerned by famine because it could lead to revolt and other forms of social disruption.
Famine returned to the Netherlands during World War II in what was known as the Hongerwinter. It was the last famine in western Europe, during which approximately 30,000 people died of starvation. Some other areas of Europe also experienced famine at the same time.
In northern Italy, a report of 1767 noted that there had been famine in 111 of the previous 316 years (i.e. the period 1451-1767) and only sixteen good harvests. During the terrible famine of 1680, some 80,000 persons, out of a total population of 250,000, are said to have died in Sardinia.
In 1783 the volcano Laki in south-central Iceland erupted. The lava caused little direct damage, but ash and sulfur dioxide spewed out over most of the country, causing three-quarters of the island's livestock to perish. In the following famine, around ten thousand people died, one-fifth of the population of Iceland. [Asimov, 1984, 152-153]
Iceland was also hit by a potato famine between 1862 and 1864. Lesser known than the Irish potato famine, the Icelandic potato famine was caused by the same blight that ravaged most of Europe during the 1840s. About 5 percent of Iceland's population died during the famine.
Droughts and famines in Imperial Russia are known to have happened every 10 to 13 years, with average droughts happening every 5 to 7 years. Eleven major famines scourged Russia between 1845 and 1922, one of the worst being the famine of 1891–92. Famines continued in the Soviet era, the most notorious being the famine of 1932–1933, which struck various parts of the country, especially the Volga region, the Ukrainian SSR (where it is known as the Holodomor), and the northern Kazakh SSR. The Soviet famine of 1932–1933 is nowadays reckoned to have cost an estimated 6 million lives. The last major famine in the USSR happened in 1947, owing to severe drought and the mismanagement of grain reserves by the Soviet government.
The 872 days of the Siege of Leningrad (1941–1944) caused unparalleled famine in the Leningrad region through disruption of utilities, water, energy and food supplies. This resulted in the deaths of about one million people.
Brazil's 1877–78 Grande Seca (Great Drought), the most severe ever recorded in Brazil, caused approximately half a million deaths. The drought of 1915 was also devastating.
This text is licensed under the Creative Commons CC-BY-SA License. This text was originally published on Wikipedia and was developed by the Wikipedia community.