Appendix A.

To illustrate the steps linking monitoring and data collection to decision making, we provide an example of a habitat conservation problem. For simplicity, assume there is a population of a species of concern with an initial abundance of N_0 = 100 animals, and that this population has an intrinsic rate of growth r_0 = -0.2 that will lead to eventual extinction in the absence of intervention. Suppose we are charged with saving this population and have a proposed management action (a) (say, habitat improvement). Next assume that we have three conceptual models (models 1-3) that relate the effectiveness of this action to the population in question. The first model supposes that on average our management action increases the growth potential of a population linearly, so that (ignoring random effects) we have:

N_{t+1} = N_t (1 + r_0 + a),    (A1)

where N_t is population size at time t, r_0 is the per-capita growth rate given no increase in habitat quality (r_0 = -0.2), and a >= 0 represents improvements in habitat quality induced by management, relative to the unmanaged baseline. This simple model predicts that the population's finite rate of increase will change as a linear function of the decision (a) (Figure A1a).

The second model postulates a threshold response to our management action, in which increases in a beyond a certain point (here a = 0.5) have no further impact on population growth, so that (ignoring random effects) we have:

N_{t+1} = N_t (1 + r_0 + min(a, 0.5))    (A2)

(Figure A1b). The third model relating our management action to population response postulates no relationship between our action and the population, and can be expressed as:

N_{t+1} = N_t (1 + r_0)    (A3)

(Figure A1c). Clearly, belief in each of these alternative models has differing implications for our conservation decision, and a multiple-hypothesis paradigm that retains each prediction (possibly weighted by the evidence in its favor) provides the following composite model:

N_{t+1} = Σ_{i=1}^{3} p_i f_i(N_t, a),    (A4)

where i indexes the alternative models, f_i denotes the model-specific transition of eqs. A1-A3, and p_i is the evidentiary weight on model i.
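For concreteness, the three alternative models (eqs. A1-A3) can be sketched in code. The parameter values (r_0 = -0.2, a threshold at 0.5) follow the text; the function names are ours:

```python
# Sketch of the three alternative population models (eqs. A1-A3).
R0 = -0.2          # per-capita growth rate with no habitat improvement
THRESHOLD = 0.5    # saturation point under the threshold model (model 2)

def model1(n, a):
    """Linear response: growth increases linearly with habitat effort a."""
    return n * (1 + R0 + a)

def model2(n, a):
    """Threshold response: effort beyond the threshold has no further effect."""
    return n * (1 + R0 + min(a, THRESHOLD))

def model3(n, a):
    """Null response: management has no effect on growth (a is ignored)."""
    return n * (1 + R0)

# One-step predictions from N_t = 200 with a = 0.6
preds = [m(200.0, 0.6) for m in (model1, model2, model3)]
```

Evaluating the models at N_t = 200 and a = 0.6 reproduces the one-step predictions of 280, 260, and 160 used later in this appendix.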

Now assume that we have a 5-year time frame to investigate how our management decision influences the population of concern, so t ∈ [0, 5], where t = 0 is the current year and t = 5 represents the system after 5 years have elapsed. Finally, consider an objective (J) that seeks to maximize population size at the end of the 5-year time frame while also taking into account the costs of management. Therefore,

J = N_5 − C, subject to C ≤ C_max,    (A5)

where C is the cumulative cost of the action over [0, 5] and C_max is an upper limit to cost. For this example, assume that costs are $100 per unit of habitat improvement per year, so that C = 100 Σ_{t=0}^{4} a_t, and that C_max = $700; thus our objective function would be:

J(a_0, …, a_4) = N_5 − 100 Σ_{t=0}^{4} a_t, subject to 100 Σ_{t=0}^{4} a_t ≤ 700,    (A6)

and the decision problem is to find the array of decisions (a_0, …, a_4) that maximizes this function, subject to initial conditions (current abundance N_0 = 100) and our model of system dynamics (models 1-3, or the composite model of eq. A4). In principle the habitat decisions could differ each year, which results in a very large (and complex) decision space. For simplicity, assume that our decision is held constant over each of the 5 years prior to t = 5, so that a_t = a for t = 0, …, 4 (a decision taken at t = 5 would have its influence beyond the end of the time frame of interest). Under these assumptions the cost constraint reduces to 500a ≤ 700, i.e., a ∈ [0, 1.4], and we can graphically evaluate the objective outcome over this range of feasible actions under each of the 3 alternative models (Figure A2a,b,c) and, by visual inspection, estimate the value of a that provides the maximum for eq. A6. Clearly, the optimal value of a depends strongly on the underlying dynamics assumed for the system: the optimum lies at the cost-constrained maximum (a = 1.4) under the first model (Figure A1a, A2a), at the threshold (a = 0.5) under the second (Figure A1b, A2b), and at a = 0 under the third (A1c, A2c). The radically different choices for an optimal decision, and the large differences in resulting objective value (see the vertical axes in Figure A2a,b,c), drive home the point that uncertainty about system dynamics can have profound implications for decision making. Conversely, if we could reduce this uncertainty via information from monitoring, field studies, or experiments, that information would have value to us as measured by the defined resource objective.
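The grid evaluation behind Figure A2 can be sketched as follows, assuming (as in the reconstruction of eq. A6 above) that cumulative cost, at $100 per unit per year, is subtracted from terminal abundance; the grid resolution and helper names are illustrative:

```python
# Grid evaluation of the objective J(a) = N_5 - 500*a under each model,
# with a held constant over the 5 years and feasible range [0, 1.4].
R0, N0, HORIZON, UNIT_COST = -0.2, 100.0, 5, 100.0

def project(n0, a, growth_increment):
    """Project abundance over the horizon under a constant decision a."""
    n = n0
    for _ in range(HORIZON):
        n *= 1 + R0 + growth_increment(a)
    return n

increments = {
    "model 1 (linear)":    lambda a: a,
    "model 2 (threshold)": lambda a: min(a, 0.5),
    "model 3 (null)":      lambda a: 0.0,
}

grid = [i / 100 for i in range(141)]   # feasible a in [0, 1.4]
optima = {}
for name, inc in increments.items():
    J = [project(N0, a, inc) - UNIT_COST * HORIZON * a for a in grid]
    optima[name] = grid[max(range(len(grid)), key=J.__getitem__)]
```

Under this reconstruction the grid search recovers an optimum at the cost ceiling for model 1, at the threshold for model 2, and at zero investment for model 3.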

Linking monitoring to decision making

Take our simple case of habitat management, and now assume a current population state of N_t = 200 with the same intrinsic rate of growth r_0 = -0.2, but with the decision to conserve habitat at the rate of a = 0.6. Under models 1-3 we would predict the most likely value for N_{t+1} as 280, 260, and 160, respectively. Now suppose that in year t+1 our monitoring program provides an estimate N̂_{t+1} of N_{t+1}. In general, our estimate will need to take into account environmental stochasticity, partial management control, and other sources of variation. In addition, we will undoubtedly have statistical uncertainty (sampling variance) in our monitoring estimate of N_{t+1}. For simplicity, assume E(N̂_{t+1}) = N_{t+1}, that is, that N̂_{t+1} is an unbiased estimate of N_{t+1}. Also assume that all the sources of variation (environmental, management control, and statistical) are known and together can be quantified by a constant standard deviation, in this case σ = 50. Finally, assuming that the statistical model for the observation N̂_{t+1} is normal, we can compute statistical likelihoods under each of the 3 competing hypotheses that relate our action to population response as

L_i(N̂_{t+1}) = (1 / (σ √(2π))) exp( −(N̂_{t+1} − N^{(i)}_{t+1})² / (2σ²) ),    (A7)

where N^{(i)}_{t+1} is the prediction for N_{t+1} under alternative model i. The likelihoods of the observed value under the respective predictions are {0.00794, 0.00763, 0.00131} (Figure A3). Finally, these values can be used, together with Bayes' Theorem, to provide posterior updates of the model evidentiary weights:

p_i(t+1) = p_i(t) L_i(N̂_{t+1}) / Σ_{j=1}^{3} p_j(t) L_j(N̂_{t+1}).    (A8)

This procedure requires prior evidentiary weights, which we take as initially equal (1/3 for each model), resulting in posterior weights of 0.470, 0.452, and 0.078, respectively, for the 3 models. Note that a decrease in uncertainty to σ = 10 (e.g., through improved monitoring and management control) results in a clearer separation of the alternative models; the same observation now yields posterior weights of 0.731, 0.269, and 0.000 (Figure A3b). This illustrates the importance of reducing those aspects of uncertainty that are controllable, so as to achieve more rapid learning.
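The likelihood calculation (eq. A7) and weight update (eq. A8) can be sketched as follows. The observed estimate N̂ = 275 is our assumption for illustration (the text does not report the observation itself); the model predictions and standard deviations are those of the text:

```python
import math

def gaussian_likelihood(obs, pred, sd):
    """Normal density of the observed estimate given a model's prediction (eq. A7)."""
    return math.exp(-(obs - pred) ** 2 / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

def update_weights(priors, obs, preds, sd):
    """Bayes' Theorem update of the model evidentiary weights (eq. A8)."""
    joint = [p * gaussian_likelihood(obs, pred, sd) for p, pred in zip(priors, preds)]
    total = sum(joint)
    return [j / total for j in joint]

preds = [280.0, 260.0, 160.0]   # model-specific predictions of N_{t+1}
obs = 275.0                     # assumed observed estimate (illustrative)
w50 = update_weights([1 / 3] * 3, obs, preds, sd=50.0)
w10 = update_weights([1 / 3] * 3, obs, preds, sd=10.0)   # sharper monitoring
```

With sd = 10 this assumed observation reproduces the 0.731/0.269/0.000 split reported in the text; the sd = 50 weights depend on the exact observed value, which the text does not give.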

Finally, although updating of model evidence through monitoring (last paragraph) and of model parameters (penultimate paragraph) are often thought of as distinct enterprises, they are in actuality closely linked, as is made clear by considering the joint distribution of data, system states, and model evidence following decision making at time t and subsequent monitoring at t + 1:

[N̂_{t+1}, N_{t+1}, θ, M | a_t].    (A9)

By sequential conditioning we have

[N̂_{t+1}, N_{t+1}, θ, M | a_t] = [N̂_{t+1} | N_{t+1}] [N_{t+1} | θ, M, a_t] [θ | M] [M],    (A10)

where the 4 components represent, respectively, the statistical model for the monitoring data, the model-specific predictions for population state, the uncertainty in model parameters given a specified model, and the prior probabilities for the alternative models. Application of Bayes' Theorem leads to posterior inference on each of the unknown quantities (N_{t+1}, θ, M) via marginal and conditional distributions of their joint posterior distribution (typically obtained by procedures such as MCMC). Thus, in principle, a single analytical procedure can simultaneously use monitoring information to inform the system state (and its value), update parameter values under each model, and change the evidentiary weights on alternative models.

How does information gained from monitoring then enter into decision making? To illustrate, return to our simple habitat management decision problem outlined earlier. Note that the objective value for any set of decisions (a_0, …, a_4) is given by equation A5, which in turn depends on the outcome (N_5). However, because of structural uncertainty we have a different prediction for this value, for a given set of decisions, depending on which hypothesis about population response we subscribe to. Essentially, we have three objective functions, one for each model:

J_i(a) = N^{(i)}_5(a) − 500a,  i = 1, 2, 3,    (A11)

where N^{(i)}_5(a) is the value of N_5 under model i. An approach for making a decision in the face of this uncertainty is to select a so as to maximize the weighted average

J̄(a) = Σ_{i=1}^{3} p_i J_i(a),    (A12)

where the p_i are the evidentiary weights for each model. Given equal weights, in this example the decision would still favor the maximum feasible habitat investment, essentially the decision that would be selected under full faith in model 1 (Figure A4a). Suppose, however, that monitoring feedback leads to a change in the evidentiary weights to (0.1, 0.3, 0.6) for the three models. This now suggests a much smaller optimal investment (Figure A4b). We leave aside here the details of how to obtain a set of management actions {a_t} so as to maximize the objective, noting that for recurrent decisions in stochastic systems this will normally require approaches such as dynamic programming.
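The model-weighted decision analysis of eqs. A11-A12 can be sketched in the same way. The weight sets follow Figure A4; the objective form is the reconstruction used above (terminal abundance minus cumulative cost), so the exact optima are ours, not the text's:

```python
# Maximize the weighted-average objective (eq. A12) over a decision grid,
# using the reconstructed model-specific objectives J_i(a) = N_5 - 500*a.
R0, N0, HORIZON, UNIT_COST = -0.2, 100.0, 5, 100.0

def objective(a, increment):
    n = N0
    for _ in range(HORIZON):
        n *= 1 + R0 + increment(a)
    return n - UNIT_COST * HORIZON * a

INCREMENTS = [lambda a: a, lambda a: min(a, 0.5), lambda a: 0.0]

def best_decision(weights, grid):
    """Return the grid point maximizing the model-averaged objective."""
    def j_bar(a):
        return sum(w * objective(a, inc) for w, inc in zip(weights, INCREMENTS))
    return max(grid, key=j_bar)

grid = [i / 100 for i in range(141)]                 # a in [0, 1.4]
a_equal = best_decision([1 / 3, 1 / 3, 1 / 3], grid) # equal evidentiary weights
a_updated = best_decision([0.1, 0.3, 0.6], grid)     # weights revised by monitoring
```

Under equal weights the model-1 term dominates and the maximum feasible investment is chosen; shifting most weight to the null model drives the optimal investment to zero.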

Figure A1. Alternative models of population response to habitat augmentation: (a) linear increase in population growth rate, (b) threshold saturation in population growth rate, (c) null relation to population growth rate.

Figure A2. Objective functions under alternative models of the impact of habitat augmentation on population growth rate. (a) linear increase in population growth rate, (b) threshold saturation in population growth rate, (c) null relation to population growth rate.

Figure A3. Statistical likelihoods under the 3 alternative models for a range of potential observations of population state: (a) sd = 50, (b) sd = 10.

Figure A4. Weighted average of objectives under 3 models with equal weights (A4a) and with weights of 0.1, 0.3, and 0.6 on models 1, 2, and 3 (A4b).

Appendix B. Adaptive updating based on observed system states

For simplicity, assume E(N̂_{t+1}) = N_{t+1}, that is, that N̂_{t+1} is an unbiased estimate of N_{t+1}. Also assume that all sources of variation (environmental, management control, and statistical) are known and together can be quantified by a constant standard deviation (σ). Finally, assuming the statistical model for the observation is normal, we can compute statistical likelihoods under each competing hypothesis as

L_m(N̂_{t+1}) = (1 / (σ √(2π))) exp( −(N̂_{t+1} − N^{(m)}_{t+1})² / (2σ²) ),    (B1)

where N^{(m)}_{t+1} is the prediction for N_{t+1} under alternative model m, evaluated against the observed value N̂_{t+1}. Finally, these values can be used, together with Bayes' Theorem, to provide posterior updates of the evidentiary weight for each model m, given the number of models (k):

p_m(t+1) = p_m(t) L_m(N̂_{t+1}) / Σ_{j=1}^{k} p_j(t) L_j(N̂_{t+1}).    (B2)

Note that a decrease in uncertainty, i.e., a reduction in σ (e.g., through improved monitoring and management control), would result in a clearer separation of the alternative models. This illustrates the importance of reducing those aspects of uncertainty that are controllable, so as to achieve more rapid learning.
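Because eq. B2 applies at every monitoring occasion, updating proceeds recursively: each year's posterior weights become the next year's priors. A sketch with an assumed two-year sequence (the year-1 estimate of 275 is an assumption; each model's year-2 prediction simply propagates that estimate forward, e.g. 275 × 1.4 = 385 under model 1):

```python
import math

def update(priors, obs, preds, sd):
    """One application of eq. B2 for k models; the Normal density constant
    cancels in the ratio, so an unnormalized Gaussian kernel suffices."""
    joint = [p * math.exp(-(obs - m) ** 2 / (2 * sd ** 2))
             for p, m in zip(priors, preds)]
    total = sum(joint)
    return [j / total for j in joint]

# Recursive updating: observations falling near model 1's predictions
# progressively concentrate the evidentiary weight on model 1.
weights = [1 / 3, 1 / 3, 1 / 3]
for obs, preds in [(275.0, [280.0, 260.0, 160.0]),
                   (380.0, [385.0, 357.5, 220.0])]:
    weights = update(weights, obs, preds, sd=50.0)
```

After two consistent observations the null model has essentially no remaining weight, while the two response models still require further monitoring to separate.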

Finally, although updating of model evidence and of model parameters through monitoring are often thought of as distinct enterprises, they are in fact closely linked, as is made clear by considering the joint distribution of data, system states, and model evidence following decision making at time t and subsequent monitoring at t + 1:

[N̂_{t+1}, N_{t+1}, θ, M | a_t].    (B3)

By sequential conditioning we have

[N̂_{t+1}, N_{t+1}, θ, M | a_t] = [N̂_{t+1} | N_{t+1}] [N_{t+1} | θ, M, a_t] [θ | M] [M],    (B4)

where the 4 components represent, respectively, the statistical model for the monitoring data, the model-specific predictions for population state, the uncertainty in model parameters given a specified model, and the prior probabilities for the alternative models. Application of Bayes' Theorem leads to posterior inference on each of the unknown quantities (N_{t+1}, θ, M) via marginal and conditional distributions of their joint posterior distribution (typically obtained by procedures such as MCMC). Thus, in principle, a single analytical procedure can simultaneously use monitoring information to inform the system state (and its value), update parameter values under each model, and change the evidentiary weights on alternative models.

How does information gained from monitoring then enter into decision making? Note that the objective value for any set of decisions (a_0, …, a_{T−1}) is given by J = N_T − C, subject to C ≤ C_max, where C is the cumulative cost of the actions over [0, T] and C_max is an upper limit to cost; this value in turn depends on the outcome (N_T). However, because of structural uncertainty we have a different prediction for this value, for a given set of decisions, depending on which hypothesis about population response we subscribe to. Essentially, we have a unique objective function J_m for each model, and an approach for making a decision under uncertainty is to select the actions so as to maximize

J̄ = Σ_{m=1}^{k} p_m J_m,    (B5)

where the p_m are the evidentiary weights for each model m.

Appendix C

To take the case of site-specific abundance for a given species i at site j within region k, a general model of state transition is

N_{ijk,t+1} = f(N_{i·k,t}, X_{jk,t}, a_{jk,t}, ε_{jk,t}),    (C1)

where N_{ijk,t} is the abundance of species i on site j within region k at time t, X_{jk,t} are local climate drivers at time t, a_{jk,t} are management actions, and ε_{jk,t} are stochastic factors; the vector N_{i·k,t} represents the fact that N_{ijk,t+1} may depend not only on initial abundance at site j but also on individuals potentially colonizing from other sites in the region. The function f involves survival, reproduction, and movement parameters; these parameters are, in turn, hierarchically specified as functions of X_{jk,t}, a_{jk,t}, and ε_{jk,t}, depending on the hypothesis being entertained. For example, under H2a*b (Table 2), per-capita reproduction rates at a specific site are predicted to be a function of local climate drivers (influenced in part by elevation), management (e.g., supplemental feeding), and random factors, e.g.,

log(R_{ijk,t}) = β_0 + β_1 X_{jk,t} + β_2 a_{jk,t} + ε_{jk,t},    (C2)

where β_1 and β_2 represent, respectively, the influence of climate drivers and of supplemental feeding on reproduction rates. The above relationship could easily be modified under alternative hypotheses; e.g., under H2a*c, the influence of climate drivers would be principally at the regional scale:

log(R_{ijk,t}) = β_0 + β_1 X_{k,t} + β_2 a_{jk,t} + ε_{jk,t}.    (C3)

Cross-scale, hierarchical linkages between models are easily developed. For example, a sensible hypothesis is that local climate drivers, management, and random factors influence reproduction rates, but that these effects are constrained by the manifestation of these factors at regional scales:

log(R_{ijk,t}) = β_{0k} + β_{1k} X_{jk,t} + β_{2k} a_{jk,t} + ε_{jk,t},    (C4)

where the region-specific coefficients are drawn from distributions centered on regional (or global) means:

β_{lk} ~ Normal(β_l, σ_β²), l = 0, 1, 2.
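The cross-scale constraint of eq. C4 can be sketched as a two-level simulation in which region-specific coefficients are drawn around global means; all parameter values and names here are invented for illustration:

```python
import random

random.seed(1)

# Global (top-level) coefficient means and the tightness of the regional
# constraint; both are illustrative assumptions, not values from the text.
GLOBAL_BETA = {"intercept": 0.1, "climate": -0.3, "feeding": 0.5}
SIGMA_REGION = 0.05   # how tightly regional coefficients track the global mean

def regional_coefficients():
    """Draw region-specific coefficients around the global means
    (the 'constraint' level of the hierarchy in eq. C4)."""
    return {k: random.gauss(mu, SIGMA_REGION) for k, mu in GLOBAL_BETA.items()}

def log_reproduction(beta, climate, feeding, sd_resid=0.1):
    """Site-level log reproduction rate given regional coefficients (eq. C2 form)."""
    return (beta["intercept"] + beta["climate"] * climate
            + beta["feeding"] * feeding + random.gauss(0.0, sd_resid))

beta_region = regional_coefficients()            # one region's coefficients
lr_site = log_reproduction(beta_region, climate=0.2, feeding=1.0)
```

Each region thus has its own climate and feeding effects, but those effects are shrunk toward the shared means, which is exactly the hierarchical pooling the equation expresses.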

This general approach potentially extends to include community structure,

N_{ijk,t+1} = f(N_{·jk,t}, N_{i·k,t}, X_{jk,t}, a_{jk,t}, ε_{jk,t}),    (C5)

with f now taking into account the potential influence of other species at the site (N_{·jk,t}; e.g., resource competition). Likewise, we can form a similar dynamic model for the reduced state space of local species occupancy as

z_{ijk,t+1} ~ Bernoulli(π_{ijk,t+1}),    (C6)

where

π_{ijk,t+1} = z_{ijk,t} (1 − e_{jk,t}) + (1 − z_{ijk,t}) γ_{jk,t},

and the model would involve hierarchical specification of the local extinction (e) and colonization (γ) probabilities given the hypotheses under consideration. For instance, under H1a local site colonization and extinction probabilities are predicted to be functions of local (e.g., elevation) climate gradients, possibly mitigated by management:

logit(γ_{jk,t}) = α_0 + α_1 X_{jk,t} + α_2 a_{jk,t},   logit(e_{jk,t}) = δ_0 + δ_1 X_{jk,t} + δ_2 a_{jk,t};    (C7)

cross-scale, hierarchical relationships can be formed in a manner similar to eq. C4. Further hierarchical integration occurs by linking the species-specific models together into a community model, which then allows for competitive and other interactions among the species involved.
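The occupancy dynamics of eqs. C6-C7 can be sketched with logit-linear colonization and extinction probabilities; the coefficient values, and the assumed directions of the climate and management effects, are ours for illustration:

```python
import math
import random

def inv_logit(x):
    return 1.0 / (1.0 + math.exp(-x))

def colonization(climate, action):
    """H1a-style hypothesis: colonization increases with favorable local
    climate and with management (illustrative coefficients)."""
    return inv_logit(-1.0 + 0.8 * climate + 0.6 * action)

def extinction(climate, action):
    """Extinction decreases with favorable climate and management
    (illustrative coefficients)."""
    return inv_logit(-0.5 - 0.6 * climate - 0.4 * action)

def step_occupancy(z, climate, action, rng=random):
    """One transition of site occupancy state z in {0, 1} (eq. C6):
    occupied sites persist with probability 1 - e, empty sites are
    colonized with probability gamma."""
    p = (1.0 - extinction(climate, action)) if z else colonization(climate, action)
    return 1 if rng.random() < p else 0
```

Hierarchical versions follow by drawing the α and δ coefficients per region around global means, exactly as sketched for the reproduction rates above.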