833 results for Multi-model inference
Using life strategies to explore the vulnerability of ecosystem services to invasion by alien plants
Abstract:
Invasive plants can have different effects on ecosystem functioning and on the provision of ecosystem services, ranging from strongly deleterious impacts to positive effects. The nature and intensity of such effects depend on the service and ecosystem being considered, but also on features of the invaders' life strategies that determine their invasiveness as well as their effect on key processes of the receiving ecosystems. To address the combined effect of these various factors, we developed a robust and efficient methodological framework for identifying areas of possible conflict between ecosystem services and invasive alien plants, considering interactions between landscape invasibility and species invasiveness. Our framework combines the statistical robustness of multi-model inference, efficient techniques for mapping ecosystem services, and life strategies as a functional link between invasion, functional changes, and the potential provision of services by invaded ecosystems. The framework was applied to a test region in Portugal, for which we successfully predicted the current patterns of plant invasion, of ecosystem service provision, and finally of probable conflict (expressing concern for negative impacts, and value for positive impacts on services) between alien species richness (total and per plant life strategy) and the potential provision of selected services. Potential conflicts were identified for all combinations of plant strategy and ecosystem service, with an emphasis on conflicts with carbon sequestration, water regulation, and wood production. Lower levels of conflict were obtained between invasive plant strategies and the habitat-for-biodiversity supporting service.
The added value of the proposed framework for landscape management and planning is discussed from the perspective of anticipating conflicts, mitigating negative impacts, and potentiating the positive effects of plant invasions on ecosystems and their services.
Abstract:
We describe a novel dissimilarity framework to analyse spatial patterns of species diversity and illustrate it with alien plant invasions in Northern Portugal. We used this framework to test the hypothesis that patterns of alien invasive plant species richness and composition are differently affected by differences in climate, land use, and landscape connectivity (i.e. geographic distance as a proxy, and vectorial objects that facilitate dispersal, such as roads and rivers) between pairs of localities at the regional scale. We further evaluated possible effects of plant life strategies (Grime's C-S-R) and residence time. Each locality consisted of a 1 km² landscape mosaic in which all alien invasive species were recorded by visiting all habitat types. Multi-model inference revealed that dissimilarity in species richness is more influenced by environmental distance (particularly climate), whereas geographic distance (a proxy for dispersal limitation) is more important in explaining dissimilarity in species composition, with a prevailing role for ecotones and roads. However, only minor differences were found in the responses of the three C-S-R strategies. Some effect of residence time was found, but only for dissimilarity in species richness. Our results also indicated that environmental conditions (e.g. climate) limit the number of alien species invading a given site, but that the presence of dispersal corridors determines the paths of invasion and therefore the pool of species reaching each site. The fact that geographic distances (e.g. along ecotones and roads) tend to explain invasion at our regional scale highlights the need to consider the management of alien invasions in the context of integrated landscape planning. Alien species management should include (but not be limited to) the mitigation of dispersal pathways along linear infrastructures.
Our results therefore highlight potentially useful applications of the novel multi-model framework to the anticipation and management of plant invasions. (C) 2013 Elsevier GmbH. All rights reserved.
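The framework's core distinction between dissimilarity in species richness and dissimilarity in species composition can be sketched with a minimal example. The localities and species lists below are invented for illustration, not taken from the study:

```python
# Minimal sketch: richness vs. composition dissimilarity between two
# hypothetical localities (species lists invented for illustration).
site_a = {"Acacia dealbata", "Ailanthus altissima", "Arundo donax"}
site_b = {"Acacia dealbata", "Oxalis pes-caprae"}

# Dissimilarity in species richness: absolute difference in species counts.
richness_dissim = abs(len(site_a) - len(site_b))

# Dissimilarity in composition: Jaccard dissimilarity (1 - shared/total),
# which can be large even when richness is nearly identical.
shared = len(site_a & site_b)
total = len(site_a | site_b)
jaccard_dissim = 1 - shared / total

print(richness_dissim)           # 1
print(round(jaccard_dissim, 2))  # 0.75
```

Two sites can thus have almost equal richness (low richness dissimilarity) while sharing few species (high compositional dissimilarity), which is why the two facets can respond to different predictors.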
Big Decisions and Sparse Data: Adapting Scientific Publishing to the Needs of Practical Conservation
Abstract:
The biggest challenge in conservation biology is bridging the gap between research and practical management. A major obstacle is the fact that many researchers are unwilling to tackle projects likely to produce sparse or messy data because the results would be difficult to publish in refereed journals. The obvious solution to sparse data is to build up results from multiple studies. Consequently, we suggest that there needs to be greater emphasis in conservation biology on publishing papers that can be built on by subsequent research rather than on papers that produce clear results individually. This building approach requires: (1) a stronger theoretical framework, in which researchers attempt to anticipate models that will be relevant in future studies and incorporate expected differences among studies into those models; (2) use of modern methods for model selection and multi-model inference, and publication of parameter estimates under a range of plausible models; (3) explicit incorporation of prior information into each case study; and (4) planning management treatments in an adaptive framework that considers treatments applied in other studies. We encourage journals to publish papers that promote this building approach rather than expecting papers to conform to traditional standards of rigor as stand-alone papers, and believe that this shift in publishing philosophy would better encourage researchers to tackle the most urgent conservation problems.
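The model-selection machinery referred to in point (2) is typically AIC-based. A minimal sketch of Akaike weights, the quantity that multi-model inference uses to weight parameter estimates across plausible models, follows; the model names, log-likelihoods, and parameter counts are invented for illustration:

```python
import math

# Hypothetical candidate models: name -> (log-likelihood, number of parameters).
models = {"null": (-120.3, 2), "habitat": (-112.8, 4), "habitat+climate": (-111.9, 6)}

# AIC = -2*logLik + 2k; smaller is better.
aic = {name: -2 * ll + 2 * k for name, (ll, k) in models.items()}

# Akaike weights: relative likelihood of each model, normalised to sum to 1.
best = min(aic.values())
rel = {name: math.exp(-0.5 * (a - best)) for name, a in aic.items()}
total = sum(rel.values())
weights = {name: r / total for name, r in rel.items()}

# Multi-model inference then averages each parameter estimate across models,
# weighted by these values, instead of committing to a single "best" model.
print({name: round(w, 3) for name, w in weights.items()})
```

Publishing the per-model estimates alongside these weights, as the abstract recommends, is what lets later studies re-weight and combine results.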
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
The allometric growth of two groups of Nassarius vibex on beds of the bivalve Mytella charruana on the northern coast of the State of Sao Paulo was evaluated between September 2006 and February 2007 in the bed on Camaroeiro Beach, and from March 2007 to June 2007 at Cidade Beach. The shells from Camaroeiro were longer and wider and had a smaller shell aperture than those from Cidade; a principal components analysis also confirmed different morphometric patterns between the areas. The allometric growth of the two groups showed great variation in the development of individuals. The increase of shell width and height in relation to shell length did not differ between the two areas. Shell aperture showed a contrasting growth pattern, with individuals from Camaroeiro having smaller apertures. The methodology based on Kullback-Leibler information theory and multi-model inference showed, for N. vibex, that classic linear allometric growth was not the most suitable explanation for the observed morphometric relationships. The patterns of relative growth observed in the two groups of N. vibex may be a consequence of different growth and variation rates, which modify the development of the individuals. Other factors such as food resource availability and environmental parameters, which might also differ between the two areas, should also be considered.
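The Kullback-Leibler/multi-model approach mentioned above amounts to fitting competing allometric models and comparing them with an information criterion such as AICc (the small-sample-corrected AIC). A sketch on simulated shell measurements (not the study's data, and with invented coefficients) shows how a non-linear alternative can beat the classic linear allometry:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated log shell lengths and a second log measurement following a
# quadratic (i.e. non-linear) allometry -- stand-ins for real N. vibex data.
x = np.log(rng.uniform(5, 20, 60))
y = 0.3 + 0.8 * x + 0.15 * x**2 + rng.normal(0, 0.02, 60)

def aicc(y, yhat, k):
    # AICc for least-squares fits: n*log(RSS/n) + 2k, plus the
    # small-sample correction term 2k(k+1)/(n-k-1).
    n = len(y)
    rss = np.sum((y - yhat) ** 2)
    return n * np.log(rss / n) + 2 * k + 2 * k * (k + 1) / (n - k - 1)

# Model 1: classic linear allometry (a straight line in log-log space).
b1 = np.polyfit(x, y, 1)
# Model 2: quadratic departure from linear allometry.
b2 = np.polyfit(x, y, 2)

a1 = aicc(y, np.polyval(b1, x), k=3)  # slope, intercept, error variance
a2 = aicc(y, np.polyval(b2, x), k=4)
print(a2 < a1)  # the non-linear model is better supported here
```

The model with the lower AICc is the better Kullback-Leibler approximation to the data-generating process, which is the sense in which the study rejects classic linear allometry.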
Abstract:
Model predictive control (MPC) is usually implemented as a control strategy where the system outputs are controlled within specified zones, instead of fixed set points. One strategy to implement the zone control is by means of the selection of different weights for the output error in the control cost function. A disadvantage of this approach is that closed-loop stability cannot be guaranteed, as a different linear controller may be activated at each time step. A way to implement a stable zone control is by means of the use of an infinite horizon cost in which the set point is an additional variable of the control problem. In this case, the set point is restricted to remain inside the output zone and an appropriate output slack variable is included in the optimisation problem to assure the recursive feasibility of the control optimisation problem. Following this approach, a robust MPC is developed for the case of multi-model uncertainty of open-loop stable systems. The controller is devoted to maintain the outputs within their corresponding feasible zone, while reaching the desired optimal input target. Simulation of a process of the oil refining industry illustrates the performance of the proposed strategy.
Abstract:
Computed Tomography (CT) is the standard imaging modality for tumor volume delineation in radiotherapy treatment planning of retinoblastoma, despite some inherent limitations. CT is very useful in providing information on physical density for dose calculation and morphological volumetric information, but presents low sensitivity in assessing tumor viability. On the other hand, 3D ultrasound (US) allows a highly accurate definition of the tumor volume thanks to its high spatial resolution, but it is not currently integrated into treatment planning, being used only for diagnosis and follow-up. Our ultimate goal is the automatic segmentation of the gross tumor volume (GTV) in 3D US, the segmentation of the organs at risk (OAR) in CT, and the registration of both modalities. In this paper, we present some preliminary results in this direction. We present a 3D active contour-based segmentation of the eye ball and the lens in CT images; the presented approach incorporates prior knowledge of the anatomy by using a 3D geometrical eye model. The automated segmentation results are validated by comparison with manual segmentations. Then, we present two approaches for the fusion of 3D CT and US images: (i) landmark-based transformation, and (ii) object-based transformation that makes use of eye ball contour information in CT and US images.
Abstract:
Many climate models have problems simulating Indian summer monsoon rainfall and its variability, resulting in considerable uncertainty in future projections. Problems may relate to many factors, such as local effects of the formulation of physical parametrisation schemes, while common model biases that develop elsewhere within the climate system may also be important. Here we examine the extent and impact of cold sea surface temperature (SST) biases developing in the northern Arabian Sea in the CMIP5 multi-model ensemble, where such SST biases are shown to be common. Such biases have previously been shown to reduce monsoon rainfall in the Met Office Unified Model (MetUM) by weakening moisture fluxes incident upon India. The Arabian Sea SST biases in CMIP5 models consistently develop in winter, via strengthening of the winter monsoon circulation, and persist into spring and summer. A clear relationship exists between Arabian Sea cold SST bias and weak monsoon rainfall in CMIP5 models, similar to effects in the MetUM. Part of this effect may also relate to other factors, such as forcing of the early monsoon by spring-time excessive equatorial precipitation. Atmosphere-only future time-slice experiments show that Arabian Sea cold SST biases have potential to weaken future monsoon rainfall increases by limiting moisture flux acceleration through non-linearity of the Clausius-Clapeyron relationship. Analysis of CMIP5 model future scenario simulations suggests that, while such effects are likely small compared to other sources of uncertainty, models with large Arabian Sea cold SST biases suppress the range of potential outcomes for changes to future early monsoon rainfall.
Abstract:
A necessary condition for a good probabilistic forecast is that the forecast system is shown to be reliable: forecast probabilities should equal observed probabilities verified over a large number of cases. As climate change trends are now emerging from the natural variability, we can apply this concept to climate predictions and compute the reliability of simulated local and regional temperature and precipitation trends (1950–2011) in a recent multi-model ensemble of climate model simulations prepared for the Intergovernmental Panel on Climate Change (IPCC) fifth assessment report (AR5). With only a single verification time, the verification is over the spatial dimension. The local temperature trends appear to be reliable. However, when the global mean climate response is factored out, the ensemble is overconfident: the observed trend is outside the range of modelled trends in many more regions than would be expected by the model estimate of natural variability and model spread. Precipitation trends are overconfident for all trend definitions. This implies that for near-term local climate forecasts the CMIP5 ensemble cannot simply be used as a reliable probabilistic forecast.
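Reliability in the sense used here can be checked with a simple counting sketch: bin the cases by forecast probability and compare each bin's observed event frequency to its mean forecast probability. The data below are synthetic (a perfectly reliable toy system), not CMIP5 output:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Synthetic forecasts: each case gets a forecast probability p, and the
# event then occurs with exactly that probability -- a reliable system
# by construction.
p = rng.uniform(0, 1, n)
occurred = rng.uniform(0, 1, n) < p

# Verification over many cases: within each forecast-probability bin,
# the observed frequency should match the mean forecast probability.
bins = np.linspace(0, 1, 11)
idx = np.digitize(p, bins) - 1
for b in range(10):
    mask = idx == b
    print(round(p[mask].mean(), 2), round(occurred[mask].mean(), 2))
```

An overconfident ensemble would instead show observed frequencies pulled toward the climatological rate relative to the forecast probabilities; the paper's verification replaces the many-cases dimension with the spatial dimension, since only one trend period is available.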
Abstract:
The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns in consistencies of these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed based on wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.
Abstract:
Simulated multi-model “diversity” in aerosol direct radiative forcing estimates is often perceived as a measure of aerosol uncertainty. However, current models used for aerosol radiative forcing calculations vary considerably in model components relevant for forcing calculations and the associated “host-model uncertainties” are generally convoluted with the actual aerosol uncertainty. In this AeroCom Prescribed intercomparison study we systematically isolate and quantify host model uncertainties on aerosol forcing experiments through prescription of identical aerosol radiative properties in twelve participating models. Even with prescribed aerosol radiative properties, simulated clear-sky and all-sky aerosol radiative forcings show significant diversity. For a purely scattering case with globally constant optical depth of 0.2, the global-mean all-sky top-of-atmosphere radiative forcing is −4.47 W m−2 and the inter-model standard deviation is 0.55 W m−2, corresponding to a relative standard deviation of 12%. For a case with partially absorbing aerosol with an aerosol optical depth of 0.2 and single scattering albedo of 0.8, the forcing changes to 1.04 W m−2, and the standard deviation increases to 1.01 W m−2, corresponding to a significant relative standard deviation of 97%. However, the top-of-atmosphere forcing variability owing to absorption (subtracting the scattering case from the case with scattering and absorption) is low, with absolute (relative) standard deviations of 0.45 W m−2 (8%) for clear-sky and 0.62 W m−2 (11%) for all-sky. Scaling the forcing standard deviation for a purely scattering case to match the sulfate radiative forcing in the AeroCom Direct Effect experiment demonstrates that host model uncertainties could explain about 36% of the overall sulfate forcing diversity of 0.11 W m−2 in the AeroCom Direct Radiative Effect experiment.
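The relative standard deviations quoted above can be reproduced directly from the figures in the abstract, as the inter-model standard deviation divided by the magnitude of the global-mean forcing:

```python
# Relative standard deviation = inter-model std / |global-mean forcing|,
# using the (std, mean forcing in W m^-2) pairs quoted in the abstract.
cases = {
    "scattering only": (0.55, -4.47),
    "scattering + absorption": (1.01, 1.04),
}
for name, (std, mean) in cases.items():
    print(name, f"{std / abs(mean):.0%}")
# scattering only 12%
# scattering + absorption 97%
```

The absorbing case looks far more uncertain in relative terms mainly because its mean forcing is small, not because the absolute spread grows by that factor.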
Abstract:
Instrumental observations, palaeo-proxies, and climate models suggest significant decadal variability within the North Atlantic subpolar gyre (NASPG). However, a poorly sampled observational record and a diversity of model behaviours mean that the precise nature and mechanisms of this variability are unclear. Here, we analyse an exceptionally large multi-model ensemble of 42 present-generation climate models to test whether NASPG mean state biases systematically affect the representation of decadal variability. Temperature and salinity biases in the Labrador Sea co-vary and influence whether density variability is controlled by temperature or salinity variations. Ocean horizontal resolution is a good predictor of the biases and the location of the dominant dynamical feedbacks within the NASPG. However, we find no link to the spectral characteristics of the variability. Our results suggest that the mean state and mechanisms of variability within the NASPG are not independent. This represents an important caveat for decadal predictions using anomaly-assimilation methods.
Abstract:
Differences-in-Differences (DID) is one of the most widely used identification strategies in applied economics. However, how to draw inferences in DID models when there are few treated groups remains an open question. We show that the usual inference methods used in DID models might not perform well when there are few treated groups and errors are heteroskedastic. In particular, we show that when there is variation in the number of observations per group, inference methods designed to work when there are few treated groups tend to (under-) over-reject the null hypothesis when the treated groups are (large) small relative to the control groups. This happens because larger groups tend to have lower variance, generating heteroskedasticity in the group × time aggregate DID model. We provide evidence from Monte Carlo simulations and from placebo DID regressions with the American Community Survey (ACS) and the Current Population Survey (CPS) datasets to show that this problem is relevant even in datasets with large numbers of observations per group. We then derive an alternative inference method that provides accurate hypothesis testing in situations where there are few treated groups (or even just one) and many control groups in the presence of heteroskedasticity. Our method assumes that we know how the heteroskedasticity is generated, which is the case when it is generated by variation in the number of observations per group. With many pre-treatment periods, we show that this assumption can be relaxed. Instead, we provide an alternative application of our method that relies on assumptions about stationarity and convergence of the moments of the time series. Finally, we consider two recent alternatives to DID when there are many pre-treatment groups. We extend our inference method to linear factor models when there are few treated groups.
We also propose a permutation test for the synthetic control estimator that provided a better heteroskedasticity correction in our simulations than the test suggested by Abadie et al. (2010).
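The core mechanism behind the paper's result, group-level aggregation generating heteroskedasticity, can be sketched directly: a group-by-time cell mean of n i.i.d. observations has variance σ²/n, so small groups produce far noisier aggregates than large ones. The group sizes below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 1.0
n_small, n_large = 20, 2000   # observations per group (invented sizes)
reps = 5000

# Simulate many group x time cell means: each cell averages n
# individual-level errors, so its variance is sigma^2 / n.
small_means = rng.normal(0, sigma, (reps, n_small)).mean(axis=1)
large_means = rng.normal(0, sigma, (reps, n_large)).mean(axis=1)

# Small groups yield much noisier aggregates than large ones -- this is
# the heteroskedasticity in the aggregated DID model whenever group
# sizes vary, even with homoskedastic individual-level errors.
print(small_means.var() > 10 * large_means.var())  # True
```

This is why inference methods that assume equal cell-level variances over-reject when the treated groups are small (their cells are noisier than the pooled variance suggests) and under-reject when they are large.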
Abstract:
This work addresses the analysis and development of multivariable predictive controllers based on bilinear multi-models. Monovariable and multivariable linear Generalized Predictive Control (GPC) are presented, highlighting their properties, key features, and industrial applications. Bilinear GPC, the basis for the development of this thesis, is presented through a time-step quasilinearization approach. Some results using this controller are presented in order to show its better performance compared to linear GPC, since bilinear models better represent the dynamics of certain processes. Because time-step quasilinearization is an approximation, it introduces a prediction error that limits the performance of this controller as the prediction horizon increases. To minimize this error, bilinear GPC with iterative compensation is presented, seeking better performance than classic bilinear GPC, and results of the iterative compensation algorithm are shown. The use of multi-models is then discussed, in order to correct the deficiency of controllers based on a single model when they are applied over large operating ranges. Methods for measuring the distance between models, also called metrics, are the main contribution of this thesis. Several applications to simulated distillation columns, which closely reproduce the behaviour of real ones, are presented, with satisfactory results.
Abstract:
Postsurgical hypertension may occur as a complication in cardiac patients. To decrease the chance of complications it is necessary to reduce elevated blood pressure as soon as possible. Continuous infusion of vasodilator drugs, such as sodium nitroprusside (Nipride), quickly lowers the blood pressure in most patients. However, each patient has a different sensitivity to Nipride infusion. The parameters and time delays of the system are initially unknown; moreover, the parameters of the transfer function associated with a particular patient are time varying. The objective of this study is to develop a procedure for blood pressure control in the presence of parameter uncertainty and considerable time delays. A multi-model methodology was therefore developed, in which a predictive controller is designed a priori for each model. An adaptive mechanism is then needed to decide which controller should be dominant for a given plant.
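The switching mechanism described, running a bank of candidate models in parallel and letting the best predictor's pre-designed controller dominate, can be sketched as follows. The static candidate gains are invented for illustration; the real system would use patient-specific dynamic sensitivity models with time delays:

```python
import numpy as np

# Bank of candidate models for drug sensitivity: blood-pressure drop per
# unit of infusion (gains invented for illustration).
candidate_gains = [0.5, 1.0, 2.0, 4.0]
true_gain = 2.0            # the (unknown) patient's actual sensitivity

rng = np.random.default_rng(7)
errors = np.zeros(len(candidate_gains))

# Feed the same infusion inputs to every model in the bank; accumulate each
# model's squared prediction error against the measured response.
for _ in range(50):
    u = rng.uniform(0.5, 2.0)                        # infusion rate
    measured_drop = true_gain * u + rng.normal(0, 0.05)
    for i, g in enumerate(candidate_gains):
        errors[i] += (g * u - measured_drop) ** 2

# The adaptive mechanism: the model that predicts best becomes dominant,
# and its a-priori-designed predictive controller takes over.
dominant = int(np.argmin(errors))
print(candidate_gains[dominant])  # 2.0
```

Because each controller in the bank is designed offline for its own model, the online computation reduces to this prediction-error bookkeeping and a switch, which is what makes the scheme attractive under parameter uncertainty.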