Did humans transform the Sahara from lush Eden to desert?

David K Wright, Seoul National University

Once upon a time, the Sahara was green. There were vast lakes. Hippos and giraffes lived there, and large human populations of fishers foraged for food along the lakeshores.

The “African Humid Period” or “Green Sahara” was a time between 11,000 and 4,000 years ago when significantly more rain fell across the northern two-thirds of Africa than it does today.

The vegetation of the Sahara was highly diverse and included species commonly found on the margins of today’s rainforests along with desert-adapted plants. It was a highly productive and predictable ecosystem in which hunter-gatherers appear to have flourished.


These conditions stand in marked contrast to the current climate of northern Africa. Today, the Sahara is the largest hot desert in the world. It lies in the subtropical latitudes dominated by high-pressure ridges, where the atmospheric pressure at the Earth’s surface is greater than in the surrounding environment. These ridges inhibit the flow of moist air inland.

How the Sahara became a desert

The stark difference between 10,000 years ago and now largely exists due to changing orbital conditions of the earth – the wobble of the earth on its axis and within its orbit relative to the sun.

But this period ended erratically. In some areas of northern Africa, the transition from wet to dry conditions occurred slowly; in others it seems to have happened abruptly. This pattern does not conform to expectations of changing orbital conditions, since such changes are slow and linear.

The most commonly accepted theory about this shift holds that devegetation of the landscape meant that more light reflected off the ground surface (a property known as albedo), helping to create the high-pressure ridge that dominates today’s Sahara.

But what caused the initial devegetation? That’s uncertain, in part because the area to be studied is so vast. But my recent paper presents evidence that the areas where the Sahara dried out quickly are the same areas where domesticated animals first appeared. Where there is evidence to show it, we can see that the vegetation changed from grassland into scrubland at this time.

Scrub vegetation dominates the modern Saharan and Mediterranean ecosystems and has a significantly higher albedo than grassland.

If my hypothesis is correct, the initial agents of change were humans, who initiated a process that cascaded across the landscape until the region crossed an ecological threshold. This worked in tandem with orbital changes, which pushed ecosystems to the brink.

Historical precedent

There’s a problem with testing my hypothesis: datasets are scarce. Combined ecological and archaeological research across northern Africa is rarely undertaken.

But well-tested comparisons abound in prehistoric and historic records from across the world. Early Neolithic farmers of northern Europe, China and southwestern Asia are documented as significantly deforesting their environments.

In the case of East Asia, nomadic herders are believed to have grazed the landscape so intensively 6,000 years ago that they reduced evapotranspiration – the transfer of moisture from land to atmosphere that feeds cloud formation – from the grasslands, which weakened monsoon rainfall.

Their burning and land-clearance practices were so unprecedented that they triggered significant alterations to the relationship between the land and the atmosphere that were measurable within hundreds of years of their introduction.

Similar dynamics occurred when domesticated animals were introduced to New Zealand and North America upon initial settlement by Europeans in the 1800s – only in these instances they were documented and quantified by historical ecologists.


New Zealand’s colonial pastoralists transformed the country’s landscape.
William Allsworth

Ecology of fear

Landscape burning has been occurring for millions of years. Old World landscapes have hosted humans for more than a million years and wild grazing animals for more than 20 million years. Orbitally induced changes in the climate are as old as the earth’s climate systems themselves.

So what made the difference in the Sahara? A theory called the “ecology of fear” may contribute something to this discussion. Ecologists recognise that the behaviour of predatory animals toward their prey has a significant impact on landscape processes. For example, deer will avoid spending significant time in open landscapes because it makes them easy targets for predators (including humans).

If you remove the threat of predation, the prey behave differently. In Yellowstone National Park, the absence of predators is argued to have changed grazers’ habits. Prey felt more comfortable grazing alongside the exposed riverbanks, which increased the erosion in those areas. The re-introduction of wolves into the ecosystem completely shifted this dynamic and forests regenerated within several years. By altering the “fear-based ecology”, significant changes in landscape processes are known to follow.

The introduction of livestock to the Sahara may have had a similar effect. Landscape burning has a deep history in the few places in which it has been tested in the Sahara. But the primary difference between pre-Neolithic and post-Neolithic burning is that the ecology of fear was altered.

Most grazing animals will avoid landscapes that have been burned, not only because the food resources there are relatively low, but also because of exposure to predators. Scorched landscapes present high risks and low rewards.

But with humans guiding them, domesticated animals are not subject to the same dynamics between predator and prey. They can be led into recently burned areas where the grasses will be preferentially selected to eat and the shrubs will be left alone. Over the succeeding period of landscape regeneration, the less palatable scrubland will grow faster than succulent grasslands – and, thus, the landscape has crossed a threshold.

It can be argued that early Saharan pastoralists changed the ecology of fear in the area, which in turn enhanced scrubland at the expense of grasslands in some places, which in turn enhanced albedo and dust production and accelerated the termination of the African Humid Period.

I tested this hypothesis by correlating the occurrences and effects of early livestock introduction across the region, but more detailed paleoecological research is needed. If proven, the theory would explain the patchy nature of the transition from wet to dry conditions across northern Africa.

Lessons for today

Although more work remains, the potential of humans to profoundly alter ecosystems should send a powerful message to modern societies.

More than 35% of the world’s population lives in dryland ecosystems, and these landscapes must be carefully managed if they are to sustain human life. The end of the African Humid Period is a lesson for modern societies living on drylands: if you strip the vegetation, you alter the land-atmosphere dynamics, and rainfall is likely to diminish.

This is precisely what the historic records of rainfall and vegetation in the south-western desert of the United States demonstrate, though the precise causes remain speculative.

In the meantime, we must balance economic development against environmental stewardship. Historical ecology teaches us that when an ecological threshold is crossed, we cannot go back. There are no second chances, so the long-term viability of 35% of humanity rests on maintaining the landscapes where they live. Otherwise we may be creating more Sahara Deserts, all around the world.

David K Wright, Associate Professor, Department of Archaeology and Art History, Seoul National University

This article was originally published on The Conversation. Read the original article.

Broader approach needed for health-related SDGs

By Ochieng’ Ogodo

[WINDHOEK, NAMIBIA] African governments should invest more in research and development (R&D) to achieve health-related Sustainable Development Goals (SDGs) and universal health coverage (UHC), a forum has heard.

According to the WHO Regional Office for Africa, which organised the forum, African health systems have seen significant improvements in health indicators but still suffer from challenges such as reduced health financing, inadequate health workforce and poor information and monitoring systems.

“African governments need to dedicate [reasonable] amounts of money to research that answers critical questions.”

Emmanuel Ankrah Odame, Ghana’s Ministry of Health

Emmanuel Ankrah Odame, director of policy planning, monitoring and evaluation in Ghana’s Ministry of Health, said during the 1st Regional Forum on Strengthening Health Systems for the SDGs and UHC last month (12-13 December) in Namibia that African governments need to fund R&D mainly from their own sources rather than largely depending on external donors.

“R&D is key and communicating research is important,” said Odame. “We need to do research that involves and better influences our health systems. African governments need to dedicate [reasonable] amounts of money to research that answers critical questions. Researchers … and public servants need to work [together] and not in silos if we want to make any progress in SDGs.”

Odame added that African academics should focus on improving the conditions of the people, especially the poor, instead of publishing mainly for personal rewards such as promotions and titles.

Odame urged African countries to urgently analyse human capacity in health against demands for healthcare services and address existing gaps in providing universal health care for their populations.

He noted that human capacity in health goes beyond doctors and nurses. “We need a broader approach that includes even anthropologists and geographers, to be able to understand and appreciate the health needs of our diverse societies fully,” he explained, adding that members of the public should be educated on the SDGs so they can put pressure on policymakers to address issues that impact positively on their lives.

According to Andreas Mwoombola, permanent secretary of Namibia’s Ministry of Health and Social Services, the forum came up with practical approaches such as a framework for strategic health systems actions, mechanisms for monitoring systems strengthening efforts of countries and the region, and a framework for strategic health investments at country and inter-country levels.

“It is easy to put things on paper but to implement them is a very tough situation that will require looking at the resources available, including human and technological ones,” said Mwoombola. “This will call for careful prioritisation based on what is critically urgent.”

Martin Ekeke Monono, an adviser on violence and injury prevention and disability for the WHO Regional Office for Africa, told SciDev.Net that countries need to organise meetings involving all relevant individuals and institutions to raise awareness and translate the commitments into tangible actions that affect people’s lives.

This piece was produced by SciDev.Net’s Sub-Saharan Africa English desk.


This article was originally published on SciDev.Net. Read the original article.

South Africa should follow Portugal to decriminalise drug use

Monique Marks, Durban University of Technology

In 2015 I started joining police in the South African port city of Durban on their “drug operations”. Most of my journeys were at night. They focused mainly on the policing of street-level drug users.

The brief of the police was to increase arrests for drug possession, drug use and drug dealing. They were well aware that the easiest way to achieve these targets – critical to their performance management – was to arrest street-level drug users they had observed engaging in drug transactions.

So began a night of witnessing “buy and busts”. Random searches generally proved to be successful. Predominantly young black men were searched. Small amounts of illicit drugs were found in their possession. They were detained and thrust into the back of the police van where I was sitting, observing and interacting, in my ethnographic mode.

The majority of those apprehended were either in possession of, or using, a drug called “whoonga”. Whoonga – like “sugars”, “nyaope”, “unga” – is essentially poor grade heroin mixed with a variety of bulking agents (some very toxic).


Young people using the infamous ‘nyaope’ drug in Johannesburg.
Moeletsi Mabe/ The Times


The visible effect of whoonga was a deep feeling of relaxation and even sleepiness. Not surprisingly, the whoonga users who were arrested didn’t resist the police. If anything, they were submissive.

Whoonga users, I realised, are the low-hanging fruit that the police target to ensure that their arrest rates look good. But I had a second and more significant realisation as I spoke with the whoonga users – that South Africa falls very short of having the correct approach to dealing with drug use disorders. Arrests and strong-arm law enforcement play no role in curtailing whoonga use.

On the contrary, they push drug use and drug markets further underground, making it almost impossible to design programmes to reduce the harms associated with drug use. The winners, it’s clear, are the big-time dealers who are able to capitalise on dark networks to continue operating.

Other countries have shown that there are better ways of managing the problem. One example is Portugal.

Ineffective war on drugs

The war on drugs in South Africa, as in the US, has in no way reduced the supply or the demand of drugs. And without a doubt it’s led to an increase in the harms associated with drugs, as users, once incarcerated and left with a criminal record, become increasingly marginalised.

Criminalisation results in reduced possibilities for people who use drugs to normalise their lives and to reintegrate. Endless punishment, rather than support, has fundamentally harmful consequences for individual drug users, their families and the broader community.

Fear of arrest and stigmatisation prevents the problems that underlie problematic drug use from being talked about, leaving users and their families isolated, hopeless and vulnerable. Failure to see the people behind the drugs, and the real problems that lie beneath drug use, has devastating outcomes.

Another approach

So what should South Africans be talking about in this context? They should be talking about bringing drug use and the markets into the open, in much the same way as has been done in Portugal since 2001. This means moving away from the senseless and unproductive war on drugs to the possibility of decriminalisation which would allow proper support to be provided to drug users and their families, and would dramatically decrease the power of dark networks.

This sounds radical and counter-intuitive, but the evidence from Portugal speaks for itself.

Since the introduction of decriminalisation of drug use and possession, heroin use in the country has decreased dramatically, as has the number of overdoses. By contrast, in the US drug use disorders and drug markets are growing and spreading.

Aside from decriminalisation South Africans should also be talking about the treatment that’s available to people who use drugs, particularly those who have limited financial resources. The country’s public health system offers no proper treatment for drug use disorders, and it has very few public “rehabilitation” centres. Those that are operational have very low retention and success rates. In Durban, for example, there’s one public rehabilitation centre. The waiting list to gain entry is very long and it lacks the medication to assist heroin users through withdrawal and ongoing medical maintenance.

Moralistic narratives

South Africa has found itself stuck in conservative moralistic narratives about drug use that do little to reduce the harms associated with drugs. Those with heroin use disorders are well aware that existing public health and social development facilities are ineffective and inadequate.

Users on the streets talked about the need for opioid substitution therapy. The word Methadone came up constantly – a medicine viewed as the only hope for detox and for long-term maintenance.

Methadone and other opioid substitute medications represent the only hope for many with heroin use disorders. But these medications are currently not available in the public sector in South Africa, other than in one hospital in Cape Town for a limited period of time and for a limited number of beneficiaries. This is despite the use of government-issued Opioid Substitution Therapy (OST) in more than 80 countries worldwide, some for more than 30 years. This list includes countries on the continent such as Tanzania, Mauritius, Kenya and Senegal.

It’s now fallen on universities and NGOs to establish low threshold OST Demonstration Projects. The first project begins in Durban in April 2017, run by the Urban Futures Centre at the Durban University of Technology, together with TB/HIV Care Association.

The Durban OST Demonstration Project has support from both the KwaZulu-Natal and National Department of Health, although not at a financial level. This OST Demonstration Project will use Methadone supplied by Equity Pharmaceuticals and will have a cohort of 50 beneficiaries, all of whom will be from very low income circumstances. This project is guided by very comprehensive protocols which have received ethical clearance from both the KZN Department of Health and the Durban University of Technology.

There have also been public debates, dialogues with police and robust engagements with government officials particularly from the departments of health and social development. As a result views on drug use disorders are slowly shifting, at least in the minds of some key players.

History teaches us time and again that prohibition and silencing seldom have good results. The moment is here to be bold and to ensure that the rights of the most vulnerable are protected and that they are provided the scaffolding (medical and otherwise) to lead productive and connected lives.

Monique Marks, Head of Urban Futures Centre, Durban University of Technology

This article was originally published on The Conversation. Read the original article.

Alliances in cassava R&D will aid food security

By Gilbert Nakweya

[YAOUNDE, CAMEROON] Strengthening collaborations among institutions and small-scale cassava farmers could help Central Africa reduce hunger and foster nutrition security, experts say.

Research scientists from academic institutions and policymakers say that collaborations in research and development would promote innovation to address the challenges of nutrition insecurity such as stunting in children.

The experts were speaking at a forum on cassava that brought together delegates and smallholder farmers in Cameroon to discuss the challenges and opportunities in cassava farming in Cameroon, Central African Republic, Chad, Congo, Democratic Republic of Congo and Gabon.

“Nutrition insecurity is a real problem in Central Africa affecting many households.”

Judith Francis, CTA

The forum was organised by the Netherlands-headquartered Technical Centre for Agricultural and Rural Cooperation (CTA) in partnership with the Food and Agriculture Organization of the United Nations and the Nigeria-headquartered International Institute of Tropical Agriculture.

Bringing together the main actors in the cassava value chain in Central Africa to have a common agenda to be addressed by researchers, policy makers and financiers was the key objective of the forum that took place in Cameroon early this month (6-9 December). The forum also provided a platform for trade between smallholders and potential buyers of cassava.

The experts were concerned that despite efforts to achieve food security in the region, little is being done to address undernutrition, especially stunting and a rise in non-communicable diseases.

They noted that understanding the nexus between agriculture, food and nutrition has become a research and development priority.

“Nutrition insecurity is a real problem in Central Africa affecting many households,” says Judith Francis.

Francis, a senior programmes coordinator for science and technology policy at CTA, tells SciDev.Net that well-coordinated collaborations among organisations are necessary to help address nutritional challenges.

She adds that cassava has a lot of nutritive value, with its leaves rich in vitamin A and minerals such as potassium, which could help address stunting problems and the increasing burden of non-communicable diseases including diabetes.

“But we need accurate and timely data on nutrition to be collected in Central Africa to help address the challenge,” explains Francis, adding that the forum has helped raise the profile of cassava as an important nutrition security crop.

Ben Bennet, the director of UK-headquartered Natural Resources Institute, adds that there is a need to continue educating people on the nutritional values of cassava.

Bennet urges African governments to play a central role in coordinating individuals and institutions to drive the nutrition security agenda in Central Africa.

This piece was produced by SciDev.Net’s Sub-Saharan Africa English desk.


This article was originally published on SciDev.Net. Read the original article.

Cholera – African countries fail to do even the basics

Samuel Kariuki, Kenya Medical Research Institute

There has been a spate of cholera outbreaks in a number of sub-Saharan countries, including South Sudan, Mozambique and Malawi. Cholera can move from one country to another, killing hundreds of people in its wake. The Conversation Africa’s Health and Medicine Editor Joy Wanja Muraya spoke to Sam Kariuki on how to facilitate prevention, detection and better responses to public health threats associated with the disease.

Why are cholera outbreaks common and deadly in Africa?

Cholera is an acute diarrhoeal disease that can kill within hours if left untreated.

Each year 1.3 to 4.0 million cases of the illness occur around the world, leading to between 21,000 and 143,000 deaths. About two-thirds of these are in developing countries, mostly in sub-Saharan Africa.


Cholera is usually transmitted through contaminated water or food.
Ahmed Saad/Reuters


Cholera is caused by a gram-negative bacterium called Vibrio cholerae, usually transmitted through contaminated water or food in areas with poor sanitation and a lack of clean drinking water.

Cholera is referred to as a disease of poverty because of the lack of social development in the areas in which it occurs.

The constant threat of natural catastrophes such as flooding, and man-made ones including civil unrest, makes the management and prevention of cholera a huge challenge in most of Africa.

Several conditions on the continent make it fertile ground for the emergence and rapid spread of cholera. These include:

  • Inadequate access to clean water and sanitation facilities, especially in peri-urban slums, where basic infrastructure isn’t available.
  • Camps for internally displaced persons or refugees, where minimum requirements of clean water and sanitation have not been met. Crowded camps are fertile ground for a cholera outbreak.
  • Other humanitarian crises including flooding and earthquakes, civil unrest or war that causes disruption of water and sanitation systems.

What has been the progress of cholera outbreak responses in Africa?

Providing communal toilets, water vending points and improved sewage disposal in urban informal settlements has borne fruit in Kenya and Ghana.

But generally the lack of comprehensive programmes to improve general public health, especially for vulnerable populations like refugees and informal settlement residents, remains a challenge.

African countries have not achieved nearly enough. This is true when it comes to detecting primary cases and then isolating and treating them to arrest further transmission. This is particularly the case in refugee camps.

Very often efforts to provide clean drinking water, safe disposal of sewage and improved housing are poorly coordinated, halfhearted and mediocre.

The increase in population, especially in urban informal settlements, has been exponential over the last two decades, posing a major challenge for public health as more people flock to the cities in search of jobs.

On top of this, a lack of political maturity in many African countries, as well as greed for political power, has led to civil unrest and chaos, which in turn has resulted in the internal displacement of huge populations.

There are recommended vaccines that can minimise the spread of cholera. But they are rarely used, as for most governments this is not a priority.

Cost matters! Unlike the cholera vaccine, most Expanded Programme on Immunisation (EPI) vaccines are usually provided free through the GAVI initiative. Hence my suggestion that the cholera vaccine be made part of the EPI initiative for endemic areas and regions.

Are there reasons for optimism?

Vaccines can prevent up to 65% of vulnerable populations from getting cholera. They also help keep away other food-borne diseases such as typhoid, dysentery, E. coli infection and diarrhoea.

Currently there are three WHO pre-qualified oral cholera vaccines: Dukoral®, Shanchol™, and Euvichol® and they all require two doses for full protection.

Dukoral® is mainly used for travellers and provides approximately 65% protection against cholera for two years. Shanchol™ and Euvichol® are essentially the same vaccine produced by two different manufacturers. The current cost for a two-dose regimen is US$3.70 for Shanchol and Euvichol and US$10.50 for Dukoral. It may be high time these vaccines were placed on the same level of importance as EPI-supported vaccines, especially for endemic areas, in order to increase affordability and speed up roll-out.

There must be a minimum two-week delay between the first dose and the booster dose of these vaccines.

But apart from better use and distribution of available vaccines, much more needs to be done. Investment is needed especially for vulnerable and at risk populations living in slums and refugee camps.

Surveillance and mapping of cholera hot spots are critical to enable prompt treatment and control measures when a primary case is identified.

Samuel Kariuki, Researcher Microbiology/Infectious Diseases, Kenya Medical Research Institute

This article was originally published on The Conversation. Read the original article.