Appendix 1: Literature Review

Mitigation of diffuse agricultural pollution using buffer zones, ditches, ponds and subsurface bioreactors – Review

Jane Hawkins and Martin Blackwell

Rothamsted Research, North Wyke, Okehampton, Devon, EX20 2SB, UK

Introduction

Agricultural land is widely acknowledged as a major source of environmental contaminants such as nutrients (especially nitrate (NO3) and phosphorus (P)), pathogens, pesticides and sediment, which contribute to diffuse water pollution from agriculture (DWPA) and can lead to contamination of surface water bodies. In order to reach the target ecological status for UK waters outlined in the Water Framework Directive (WFD) (2000/60/EC), a major requirement is the reduction or mitigation of DWPA. To assist with this, the UK government Department for Environment, Food and Rural Affairs (DEFRA) has established the Catchment Sensitive Farming (CSF) initiative, the principal aim of which is to raise awareness of DWPA and to encourage voluntary action by farmers to adopt measures that reduce the transport of pollutants. A total of 44 mitigation methods to control DWPA have been identified by Cuttle et al. (2007). One of these is the establishment of riparian buffer zones to intercept the transfer of pollutants to watercourses. These have been shown to be the most cost-effective approach for reducing P transfer from agricultural land to surface waters, compared with a range of other mitigation methods (Haygarth et al., 2009).

During the past few decades the quantity of published literature on buffer zones and their functioning and design has steadily increased, but much of the research has failed to adopt a multi-functional approach, instead focussing on specific issues such as sediment and P in isolation (Dorioz et al., 2006; Owens et al., 2007). Furthermore, there is still inadequate understanding of many of the basic mechanisms and processes involved in buffer zone functioning, and in particular of the compatibility of the different processes that control the various forms of DWPA which buffers can mitigate. This has resulted in their inappropriate and inefficient application, and buffer zones have been found not to deliver their anticipated benefits to UK water quality (Leeds-Harrison et al., 1996). Although protocols for the implementation of buffer zones have been described (Environment Agency, 1996; MAFF, 1997), and although buffer zones are included in current environmental management schemes, no strategic implementation policy for their establishment to protect surface waters has been developed. One failing of the protocols described by these organisations has been their 'broad-brush' approach, which is often impracticable in a country where buffer strip width is largely constrained by field and farm size compared with, for example, North America. No consideration has been given to individual location characteristics, which has often resulted in inefficient buffer zones being established. What is required is a more strategic, targeted and integrated approach, which would deliver not only better water quality benefits but also more efficient use of land and landscape features. One of the main problems in the UK with regard to buffer zone efficiency is the hydrological by-passing or short-circuiting of buffer zones via agricultural runoff, channelized flow and sub-surface drainage (Leeds-Harrison et al., 1996).

Little work has been carried out on the potential role of other complementary edge-of-field mitigation methods such as managed ditches and ponds, which may improve pollutant removal, especially if used either alongside or instead of the more conventional buffer zones. By their very nature, ditches are well placed to interact with a large proportion of the water moving from agricultural land to rivers and lakes. Simple techniques can be applied that enable ditches to act as linear wetlands, providing water quality improvement and pollution control together with hydrological regulation (Posthumus et al., 2008). Evidence suggests that farm ponds could also be effective at reducing the nutrient loading of surface waters draining from agricultural land, and can also help reduce the velocity of runoff waters, thereby reducing downstream sediment losses while at the same time attenuating flood peaks (Hawkins and Scholefield, 2002; Heathwaite et al., 2005). Subsurface bioreactors are effectively ditches containing materials with high hydraulic conductivity, and often a high carbon (C) content, placed below the soil surface so as to intercept contaminated groundwater. As the water passes through the reactive media, a range of contaminants including NO3 (Robertson et al., 2005; Schipper and Vojvodic-Vukovic, 1998) and P (Baker et al., 1998) are transformed into environmentally benign forms or immobilised. If combined with conventional buffer zones that are subject to by-passing by subsurface drainage, such bioreactors could potentially enhance the nutrient removal capacity of a buffer zone.

This review firstly identifies the main natural processes occurring in edge-of-field mitigation features that can be exploited to mitigate the environmental impact of pollutants. Secondly, it reflects on the current use and reported efficiencies of buffer zones, subsurface permeable reactive barriers, ditches and ponds in controlling DWPA. Finally, the main gaps in our knowledge of how to successfully and optimally implement these measures are identified.

Processes mitigating DWPA

Although modern farming practices can often lead to the pollution of surface waters, there are many generic natural processes occurring within landscape features such as buffer zones, ditches and ponds which can be exploited to mitigate the effects of DWPA (Table 1).

Table 1. Natural processes for the amelioration of DWPA (from Blackwell et al., 2002)

Agriculturally derived pollutant | Associated water quality problems | Natural processes for amelioration
Nutrients (especially N and P) | Eutrophication; toxicity | Denitrification; precipitation; plant uptake; adsorption; sediment deposition/retention
Pesticides/herbicides | Toxicity | Adsorption; plant uptake
Sediment | Eutrophication; silting of gravels | Sediment deposition/retention; hydrological regulation
Pathogens | Disease | Sediment deposition/retention; adsorption; predation
Heavy metals | Toxicity | Plant uptake; sediment deposition/retention; precipitation; sorption
BOD and COD | De-oxygenation | Adsorption; sediment deposition/retention; oxidation/mineralisation

A brief description and method of mitigation action for these processes is as follows:

A. Denitrification and Nitrification

Denitrification involves the dissimilative reduction of oxidised forms of nitrogen (N), in particular NO3, to gaseous forms of N, particularly nitrous oxide (N2O) and di-nitrogen (N2) (Blackmer and Bremner, 1977). Denitrification is carried out mainly by facultative anaerobic bacteria belonging to a number of genera, which inhabit nearly all known environments (Groffman, 1994). However, purely chemical mechanisms may result in a similar reduction of oxidised N-compounds (Brady, 1990; Buresh and Moraghan, 1976). An essential requirement of the process is anaerobicity, as such conditions stimulate denitrifying organisms to use NO3 as an electron acceptor in the absence of free oxygen (O2). Consequently the potential for the process to occur is usually greatest in wet environments. Most denitrifying bacteria are of a heterotrophic nature and therefore require a supply of easily oxidisable C as a respiratory substrate (Alexander, 1961). Coupled nitrification/denitrification (see oxidation below) can be an important process in some situations e.g. saturated soils or ponds (Reddy et al., 1989). The particular significance of this process in terms of nutrient removal is that it involves total export of N from the system to the atmosphere, rather than temporary storage by other processes such as plant uptake or organic matter accumulation in the soil.
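The reduction sequence and the overall heterotrophic reaction described above can be summarised as follows (standard textbook stoichiometry, not taken from the source; CH2O represents a generic labile carbon substrate):

```latex
% Stepwise dissimilatory reduction of nitrate; N2O is released if the
% final reduction step does not go to completion.
\mathrm{NO_3^- \;\longrightarrow\; NO_2^- \;\longrightarrow\; NO \;\longrightarrow\; N_2O \;\longrightarrow\; N_2}

% Overall heterotrophic denitrification, with organic carbon (CH2O)
% acting as the electron donor:
\mathrm{5\,CH_2O + 4\,NO_3^- + 4\,H^+ \;\longrightarrow\; 2\,N_2 + 5\,CO_2 + 7\,H_2O}
```

The overall reaction makes explicit why an easily oxidisable C supply and the absence of free O2 are both prerequisites for the process.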

Denitrification is also an important process for a variety of other reasons. The farming community is interested in the process because it can lead to large losses of expensive NO3 fertilisers, with some estimates claiming that up to 30% of applied fertiliser-N can be lost via the process (Averill and Tiedje, 1981). Environmental managers view the process from two different perspectives. Firstly, it can perform a useful function by removing NO3 from diffuse agricultural run-off, and so maintain or improve the quality of surface water bodies and prevent eutrophication (Blackwell et al., 1999; Woodward et al., 2009; Zaman et al., 2008). Secondly, it can be viewed as a potentially problematic process, by which the radiatively active gas N2O is produced, sometimes in large quantities, depending on whether partial reduction to N2O or complete reduction to N2 takes place (van den Heuvel et al., 2009).

In contrast to denitrification, nitrification is the transformation of ammonium (NH4) to NO3, and is dependent on sufficient O2 levels in soil and sediments. Although highly reduced conditions are favourable for denitrification, anoxic conditions inhibit nitrifiers to the extent that NO3 supply becomes limiting. A reduction in nitrification may also lead to an accumulation of NH4 in ponded water bodies. Thus, maintenance of conditions favourable for the nitrifier population is important, especially where loading of NH4 to waters is already sizeable.
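The two-step oxidation underlying nitrification can be written as (standard stoichiometry, not from the source):

```latex
% Ammonia oxidation (e.g. by Nitrosomonas spp.):
\mathrm{NH_4^+ + \tfrac{3}{2}\,O_2 \;\longrightarrow\; NO_2^- + H_2O + 2\,H^+}

% Nitrite oxidation (e.g. by Nitrobacter spp.):
\mathrm{NO_2^- + \tfrac{1}{2}\,O_2 \;\longrightarrow\; NO_3^-}
```

The explicit O2 requirement of both steps illustrates why anoxic conditions halt nitrification and can cause NH4 to accumulate.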

B. Volatilisation

Ammonia volatilisation is a process whereby ammonia (NH3) is produced from ammoniacal N in solution and is returned to the atmosphere in a gaseous form. The rate of volatilisation is primarily driven by the free NH3 concentration and temperature (Craggs, 2008). Although losses of N due to volatilisation are thought to be relatively small (Hargreaves, 1998), losses ranging from 10% to 27.4% of N have been reported (Banerjee et al., 1990; Gross et al., 2000; Paez-Osuna et al., 1997). This is an undesirable N removal process, since NH3 is an atmospheric pollutant that can have harmful effects on terrestrial and aquatic environments via wet and dry deposition (Asman, 1994).
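The dependence of volatilisation on free NH3 concentration noted above follows from the ammonium–ammonia equilibrium (standard aquatic chemistry, not from the source):

```latex
% Equilibrium between ammonium and free (volatile) ammonia:
\mathrm{NH_4^+ \;\rightleftharpoons\; NH_3 + H^+},
\qquad \mathrm{p}K_a \approx 9.25 \text{ at } 25\,^{\circ}\mathrm{C}

% Fraction of total ammoniacal N present as free NH3 at a given pH:
f_{\mathrm{NH_3}} = \frac{1}{1 + 10^{\,\mathrm{p}K_a - \mathrm{pH}}}
```

Because the pKa falls as temperature rises, both higher pH and higher temperature shift the balance towards free NH3 and so increase volatilisation losses.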

C. Sedimentation

Agricultural land can be a major source of sediment, which in its own right can act as a pollutant (Owens et al., 2005). Deposition of this material into river bed gravels can cause deoxygenation with subsequent loss of invertebrate habitats, and failure of spawning by salmonids and other fish (Harrod et al., 2002). Nutrients, especially P, as well as pesticides, heavy metals and pathogens can all be sorbed onto, or associated with, sediment particles, while high biochemical oxygen demand (BOD) is often related to the presence of organic particles (Horowitz, 1991; van der Perk and Jetten, 2006). There are two main processes by which sediment can be removed from agricultural runoff:

1. Flow velocity reduction

This can result from a reduction in slope gradient, discharge into ponded water bodies, infiltration of water into soils, or increased friction and detention of surface water by vegetation, causing surface deposition of particulate matter from suspension (Dillaha et al., 1989). Densely packed, fine vegetation such as grass offers the most resistance to flow at shallow depths, i.e. it has the greatest roughness coefficient (Hook, 2003; Mitsch and Gosselink, 2000; Munoz-Carpena et al., 1999), though its impact can be reduced in deeper flows (Hammer, 1992). The greater the reduction in flow velocity or the longer the period of ponding, the greater the amount of sediment deposited, in particular of finer particles (Dillaha et al., 1989). The importance of this process for P retention is demonstrated by Mitsch et al. (1979), who reported P retention by sedimentation of 3.6 g P m-2 per year in a riparian buffer zone. This was estimated as being eighteen times that of all other P retention mechanisms.
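The effect of flow velocity reduction on particle deposition can be related to settling velocity via Stokes' law (a standard approximation for small particles in laminar flow, not from the source):

```latex
% Stokes settling velocity v_s for a spherical particle of diameter d:
% rho_p = particle density, rho_f = fluid density,
% g = gravitational acceleration, mu = dynamic viscosity of water.
v_s = \frac{(\rho_p - \rho_f)\, g\, d^2}{18\,\mu}
```

Since settling velocity scales with d^2, fine silts and clays settle orders of magnitude more slowly than sands, which is why longer ponding or greater velocity reduction is needed to deposit the finer particle fractions.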

2. Filtration by vegetation

Vegetation can act as a filter, depending upon its structure and density. Larger soil particles can become trapped within litter layers or among stems and leaves (Dillaha and Inamdar, 1997).

D. Chemical precipitation

Phosphorus readily forms precipitates with aluminium (Al) and iron (Fe) under acidic conditions, while combinations with calcium (Ca) and magnesium (Mg) are more usual under alkaline conditions (Nriagu, 1972). In low energy environments these precipitates will settle and become stored. However, anaerobic conditions can have the reverse effect and result in the mobilisation of previously precipitated P (Patrick and Khalid, 1974), as can alterations in pH (Richardson, 1989). The relative importance of precipitation processes in preventing P transfer is not clear, because particulate matter of this kind has been shown to be transferred along various sub-soil pathways (Haygarth et al., 1998), and therefore may simply be transported in a different form.
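Illustrative precipitation reactions of the kind described above are (standard textbook chemistry, not from the source):

```latex
% Under acidic conditions, P precipitates with Al and Fe:
\mathrm{Al^{3+} + PO_4^{3-} \;\longrightarrow\; AlPO_4\,(s)}
\qquad
\mathrm{Fe^{3+} + PO_4^{3-} \;\longrightarrow\; FePO_4\,(s)}

% Under alkaline conditions, Ca phosphates such as hydroxyapatite form:
\mathrm{5\,Ca^{2+} + 3\,PO_4^{3-} + OH^- \;\longrightarrow\; Ca_5(PO_4)_3OH\,(s)}
```

The reversibility of these reactions under reducing or shifting-pH conditions, particularly the reductive dissolution of Fe(III) phases, underlies the remobilisation of previously precipitated P noted above.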

E. Oxidation

In otherwise anaerobic soil environments, some plants can release large amounts of O2 from their roots, and consequently provide aerobic pockets (Jaynes and Carpenter, 1986). This can be important for processes such as the coupling of nitrification and denitrification (Lloyd, 1993; Reddy et al., 1989), optimising the removal of N from the soil.

F. Plant and bacterial uptake

Plant roots remove nutrients from soils by uptake of dissolved solutes in soil water, although direct exchange between nutrient ions adsorbed onto soil particles and hydrogen (H+) ions on root surfaces can also take place (Reddy et al., 1989). Where plant growth is strongly seasonal, the process of nutrient removal and storage by this means is correspondingly limited to the growing season. During dormant periods, the senescence and decomposition of above-ground plant material results in the release of nutrients, which are recycled into the system. To avoid this, harvesting and removal of the vegetation is required to achieve nutrient removal (Koopmans et al., 2004). Plant species, age, root architecture, size and stage of development are important determinants of nutrient uptake (Fohse et al., 1988). In addition, heavy metals can become incorporated into plant material, as either short- or long-term storage, depending on plant types and conditions (Klopatek, 1978). Soil bacterial populations can greatly increase under favourable conditions, resulting in the assimilation of large quantities of nutrients. However, the storage time can vary considerably, and unfavourable conditions can result in rapid reductions in bacterial numbers with the release of large quantities of nutrients (Groffman, 1994).

G. Adsorption

Adsorption of dissolved materials onto soil particles through ionic bonding can be a significant process for reducing the concentrations of various nutrients (Leinweber et al., 2002). Since the capacity for adsorption depends on the amount of soil surface available, the small size and relatively large surface area of clay particles mean that clay soils generally have the highest adsorption capacity of mineral soils. The type and amount of ions adsorbed also depend on the anion and cation exchange capacities of a soil, which are influenced largely by the mineralogy of the clay particles (greatest in swelling clays such as smectite) and the pH of the soil (White and Zelazny, 1986). Cations are adsorbed more strongly than anions, but at low pH anion adsorption capacity increases. In addition to clays, humified soil organic material also has a high adsorptive capacity (Stevenson, 1994).
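Sorption of this kind is commonly summarised with a Langmuir isotherm (a standard empirical model, not taken from the source):

```latex
% Langmuir isotherm: q = mass of solute sorbed per unit mass of soil,
% C = equilibrium solution concentration,
% q_max = sorption maximum, b = binding-strength coefficient.
q = \frac{q_{\max}\, b\, C}{1 + b\, C}
```

The finite sorption maximum q_max makes explicit why adsorption capacity, unlike a transformation process such as denitrification, is exhaustible.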

H. Additional significant processes

Improvements in the quality of water draining from wetlands can result from the inactivation and predation of pathogenic bacteria and viruses. Protozoa, which are often abundant in wetlands, can consume large quantities of bacteria, while bacteriophages (viruses that infect bacteria) can reduce numbers of susceptible bacteria by fatal infection (Kadlec and Knight, 1996; Nuttall et al., 1997).

Different combinations of these processes occur to different degrees in the various features considered here. It is important to recognise that each type of feature itself may vary considerably with regard to properties such as slope, soil type, hydrology and vegetation, and consequently so will the processes acting within them. Here we consider the key types of mitigation features that can be used to control DWPA.

Buffer zones

‘Buffer zone’ is a generic term referring to naturally or semi-naturally vegetated areas typically situated between agricultural land and a surface water body, although some buffer zones can be distal from water bodies (e.g. contour buffer strips – see below). They can be effective in protecting water bodies and other habitats from harmful impacts such as high nutrient, pesticide or sediment loadings resulting from land use practices (Blackwell et al., 1999). Additionally, they act as a physical barrier restricting the spreading of fertilisers and sprays in the proximity of features along which they are situated. The degree to which protection is provided by them depends on the processes that can operate in a specific buffer zone, and this is dependent upon several factors including size (Gergel et al., 2005), location (Pinay and Burt, 2001), hydrology (Correll, 1997), vegetation (Abu-Zreig et al., 2004) and soil type (Leeds-Harrison et al., 1996) of the buffer zone, as well as the nature of the pollution against which it is mitigating. Considerable confusion often arises from the terminology associated with buffer zones, as many different expressions are employed to describe a wide range of landscape features used for the mitigation of DWPA. A comprehensive definition and classification of the various terms associated with buffer zones, such as buffer strips, contour strips, riparian buffer zones, vegetated filter strips, etc. is provided by Owens et al. (2007).

The key processes operating in buffer zones vary depending on whether they comprise freely or poorly draining soils. In buffer zones with poorly draining, wet soils, processes such as denitrification occur, acting to remove N from the system (Gilliam et al., 1997). The ability of buffer zones to remove NO3 from agricultural runoff via denitrification has been reported by many researchers, including Peterjohn and Correll (1984), Cooper (1990), Blackwell et al. (1999) and Hefting (2006), all of whom report NO3 concentration reductions in surface or ground water of 75% or more of the original concentration in water discharging into the buffer zones studied. On the other hand, while P dynamics in buffer zones can be complex, as described by Dorioz et al. (2006), removal tends to occur primarily through the retention of sediment to which P is bound, and is consequently more prevalent in buffer zones with freely draining soils through which infiltration of runoff occurs, leaving sediment trapped at the soil surface (Dillaha and Inamdar, 1997; Dillaha et al., 1989). Figures for P removal by buffer zones are generally impressive, with reported retention of total P ranging from 40% to 90% (Borin et al., 2005; Duchemin and Madjoub, 2004; Schmitt et al., 1999). However, several researchers report that for dissolved P, buffer zones can sometimes be net emitters, with retention ranging from -80% to +95% (Duchemin and Madjoub, 2004; Uusi-Kamppa et al., 2000). This variability, and the ability to act as a source, is largely attributed to plant uptake of dissolved P during summer and its subsequent release due to litterfall in winter. As a consequence of the different conditions required for optimal performance of the two processes, it is unlikely that N and P removal will occur concurrently to any great extent in buffer zones. The key processes associated with N removal usually involve some form of transformation and export (e.g. via gaseous N emission following denitrification), meaning N removal is effectively sustainable. For P removal, though, concerns exist about the sustainability of the effectiveness of buffer zones, especially with regard to sediment-associated P, and questions are arising over the effective life-span of buffer zones for P removal.