994 results for causal modeling


Relevance:

20.00%

Publisher:

Abstract:

Modeling the distributions of species, especially of invasive species in non-native ranges, involves multiple challenges. Here, we developed some novel approaches to species distribution modeling aimed at reducing the influences of such challenges and improving the realism of projections. We estimated species-environment relationships with four modeling methods run with multiple scenarios of (1) sources of occurrences and geographically isolated background ranges for absences, (2) approaches to drawing background (absence) points, and (3) alternate sets of predictor variables. We further tested various quantitative metrics of model evaluation against biological insight. Model projections were very sensitive to the choice of training dataset. Model accuracy was much improved by using a global dataset for model training, rather than restricting data input to the species’ native range. AUC score was a poor metric for model evaluation and, if used alone, was not a useful criterion for assessing model performance. Projections away from the sampled space (i.e. into areas of potential future invasion) were very different depending on the modeling methods used, raising questions about the reliability of ensemble projections. Generalized linear models gave very unrealistic projections far away from the training region. Models that efficiently fit the dominant pattern, but exclude highly local patterns in the dataset and capture interactions as they appear in data (e.g. boosted regression trees), improved generalization of the models. Biological knowledge of the species and its distribution was important in refining choices about the best set of projections. A post-hoc test conducted on a new Parthenium dataset from Nepal validated excellent predictive performance of our “best” model. We showed that vast stretches of currently uninvaded geographic areas on multiple continents harbor highly suitable habitats for Parthenium hysterophorus L. (Asteraceae; parthenium).
However, discrepancies between model predictions and parthenium invasion in Australia indicate successful management for this globally significant weed. This article is protected by copyright. All rights reserved.
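The abstract's caution about AUC can be made concrete: AUC is a pure ranking statistic, invariant under any monotone transformation of the predicted suitabilities, so two models with identical AUC can project very different absolute suitability surfaces outside the training region. A minimal pure-Python sketch with toy scores (not the study's data):

```python
def auc(pos, neg):
    """Rank-based AUC: P(score_pos > score_neg), with ties counting 0.5."""
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Model A: well-separated, plausibly calibrated suitability scores.
pos_a = [0.9, 0.8, 0.7]
neg_a = [0.3, 0.2, 0.1]

# Model B: identical ranking but compressed near 1.0 (e.g. after an
# uncontrolled extrapolation) -- AUC cannot tell the two apart.
pos_b = [0.99, 0.98, 0.97]
neg_b = [0.96, 0.95, 0.94]

print(auc(pos_a, neg_a))  # 1.0
print(auc(pos_b, neg_b))  # 1.0 -- same AUC, very different projections
```

This is why the abstract pairs quantitative metrics with biological insight: the metric alone says nothing about the absolute suitabilities projected into unsampled regions.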

Relevance:

20.00%

Publisher:

Abstract:

In visual object detection and recognition, classifiers have two interesting characteristics: accuracy and speed. Accuracy depends on the complexity of the image features and classifier decision surfaces. Speed depends on the hardware and the computational effort required to use the features and decision surfaces. When attempts to increase accuracy lead to increases in complexity and effort, it is necessary to ask how much we are willing to pay for increased accuracy. For example, if increased computational effort implies quickly diminishing returns in accuracy, then those designing inexpensive surveillance applications cannot aim for maximum accuracy at any cost. It becomes necessary to find trade-offs between accuracy and effort. We study efficient classification of images depicting real-world objects and scenes. Classification is efficient when a classifier can be controlled so that the desired trade-off between accuracy and effort (speed) is achieved and unnecessary computations are avoided on a per input basis. A framework is proposed for understanding and modeling efficient classification of images. Classification is modeled as a tree-like process. In designing the framework, it is important to recognize what is essential and to avoid structures that are narrow in applicability. Earlier frameworks are lacking in this regard. The overall contribution is two-fold. First, the framework is presented, subjected to experiments, and shown to be satisfactory. Second, certain unconventional approaches are experimented with. This allows the separation of the essential from the conventional. To determine if the framework is satisfactory, three categories of questions are identified: trade-off optimization, classifier tree organization, and rules for delegation and confidence modeling. Questions and problems related to each category are addressed and empirical results are presented.
For example, related to trade-off optimization, we address the problem of computational bottlenecks that limit the range of trade-offs. We also ask if accuracy versus effort trade-offs can be controlled after training. For another example, regarding classifier tree organization, we first consider the task of organizing a tree in a problem-specific manner. We then ask if problem-specific organization is necessary.
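The delegation idea above can be sketched as a minimal two-stage cascade. Everything here is a hypothetical toy (the `Stage` interface, the confidence rule, the costs are invented, not the thesis's actual classifier tree); the point is only that a confidence threshold chosen after training controls the accuracy-versus-effort trade-off on a per-input basis:

```python
class Stage:
    """A classifier stage with a fixed evaluation cost."""
    def __init__(self, cost, predict):
        self.cost = cost
        self.predict = predict  # callable returning (label, confidence)

def cascade_classify(x, stages, threshold):
    """Evaluate stages cheapest-first; stop as soon as one is confident.
    Delegation rule: pass x onward while confidence < threshold.
    Returns (label, total_cost_spent)."""
    total = 0.0
    label = None
    for stage in stages:
        total += stage.cost
        label, conf = stage.predict(x)
        if conf >= threshold:
            break  # confident enough; skip the expensive stages
    return label, total

# Toy stages: a cheap one whose confidence grows with |x|, and a
# costly one that is always confident.
cheap = Stage(1.0, lambda x: ("pos" if x > 0 else "neg", min(abs(x), 1.0)))
costly = Stage(10.0, lambda x: ("pos" if x > 0 else "neg", 1.0))

print(cascade_classify(2.0, [cheap, costly], threshold=0.8))   # ('pos', 1.0)
print(cascade_classify(0.1, [cheap, costly], threshold=0.8))   # ('pos', 11.0)
```

Raising the threshold delegates more inputs to the expensive stage (more accuracy, more effort); lowering it does the opposite, without retraining either stage.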

Relevance:

20.00%

Publisher:

Abstract:

Major infrastructure and construction (MIC) projects are those with significant traffic or environmental impact, of strategic and regional significance and high sensitivity. The decision making process of schemes of this type is becoming ever more complicated, especially with the increasing number of stakeholders involved and their growing tendency to defend their own varied interests. Failing to address and meet the concerns and expectations of stakeholders may result in project failures. Avoiding this necessitates a systematic participatory approach to facilitate decision-making. Though numerous decision models have been established in previous studies (e.g. ELECTRE methods, the analytic hierarchy process and analytic network process), their applicability in the decision process during stakeholder participation in contemporary MIC projects is still uncertain. To resolve this, the decision rule approach is employed for modeling multi-stakeholder multi-objective project decisions. Through this, the result is obtained naturally according to the “rules” accepted by any stakeholder involved. In this sense, consensus is more likely to be achieved since the process is more convincing and the result is easier for all concerned to accept. Appropriate “rules”, comprehensive enough to address multiple objectives while straightforward enough to be understood by multiple stakeholders, are set for resolving conflict and facilitating consensus during the project decision process. The West Kowloon Cultural District (WKCD) project is used as a demonstration case and a focus group meeting is conducted in order to confirm the validity of the model established. The results indicate that the model is objective, reliable and practical enough to cope with real world problems. Finally, a suggested future research agenda is provided.
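A minimal sketch of the decision-rule idea, with invented options, attributes and stakeholder rules (hypothetical illustration only, not the WKCD case data): an option survives only if every stakeholder's rule accepts it, so whatever survives is a consensus by construction:

```python
# Hypothetical candidate schemes; all attributes are invented.
options = {
    "scheme_a": {"cost": 80, "traffic_impact": 3, "green_space": 7},
    "scheme_b": {"cost": 60, "traffic_impact": 5, "green_space": 4},
    "scheme_c": {"cost": 95, "traffic_impact": 2, "green_space": 9},
}

# Each stakeholder contributes a simple, understandable accept/reject rule.
rules = {
    "government":  lambda o: o["cost"] <= 90,
    "residents":   lambda o: o["traffic_impact"] <= 4,
    "green_group": lambda o: o["green_space"] >= 5,
}

def consensus(options, rules):
    """Keep only options that every stakeholder's rule accepts."""
    return sorted(name for name, attrs in options.items()
                  if all(rule(attrs) for rule in rules.values()))

print(consensus(options, rules))  # ['scheme_a']
```

The appeal of the rule form is that each stakeholder can verify their own constraint directly, which is the transparency the abstract credits for making the outcome easier to accept.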

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we describe our investigation of the cointegration and causal relationships between energy consumption and economic output in Australia over a period of five decades. The framework used in this paper is the single-sector aggregate production function, which is the first comprehensive approach used in an Australian study of this type to include energy, capital and labour as separate inputs of production. The empirical evidence points to a cointegration relationship between energy and output and implies that energy is an important variable in the cointegration space, as are conventional inputs capital and labour. We also find some evidence of bidirectional causality between GDP and energy use. Although the evidence of causality from energy use to GDP was relatively weak when using the thermal aggregate of energy use, once energy consumption was adjusted for energy quality, we found strong evidence of Granger causality from energy use to GDP in Australia over the investigated period. The results are robust, irrespective of the assumptions of linear trends in the cointegration models, and are applicable for different econometric approaches.
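The intuition behind a Granger-causality test can be sketched with two OLS fits: if adding lagged energy use to an autoregression of GDP sharply reduces the residual sum of squares, that is informal evidence of causality from energy to GDP. The numpy sketch below uses synthetic series and deliberately skips the formal F-test and lag selection that a real analysis (e.g. statsmodels' `grangercausalitytests`) would apply:

```python
import numpy as np

def granger_rss(y, x, lags=1):
    """RSS of a restricted model (y on its own lags) vs an unrestricted
    one (y on lags of y and of x). A large RSS drop when x's lags are
    added is informal evidence that x Granger-causes y."""
    n = len(y)
    Y = y[lags:]
    ones = np.ones(n - lags)
    y_lags = np.column_stack([y[lags - k - 1 : n - k - 1] for k in range(lags)])
    x_lags = np.column_stack([x[lags - k - 1 : n - k - 1] for k in range(lags)])

    def rss(X):
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        resid = Y - X @ beta
        return float(resid @ resid)

    restricted = rss(np.column_stack([ones, y_lags]))
    unrestricted = rss(np.column_stack([ones, y_lags, x_lags]))
    return restricted, unrestricted

# Synthetic data in which x genuinely drives y with a one-period lag.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()

r, u = granger_rss(y, x)
print(u < r)  # True: x's lag sharply improves the fit of y
```

A bidirectional finding, as in the abstract, corresponds to the RSS drop being significant in both directions (y on x's lags and x on y's lags).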

Relevance:

20.00%

Publisher:

Abstract:

The problem of ‘wet litter’, which occurs primarily in grow-out sheds for meat chickens (broilers), has been recognised for nearly a century. Nevertheless, it is an increasingly important problem in contemporary chicken-meat production as wet litter and associated conditions, especially footpad dermatitis, have developed into tangible welfare issues. This is only compounded by the market demand for chicken paws and compromised bird performance. This review considers the multidimensional causal factors of wet litter. While many causal factors can be listed it is evident that the critical ones could be described as micro-environmental factors and chief amongst them is proper management of drinking systems and adequate shed ventilation. Thus, this review focuses on these environmental factors and pays less attention to issues stemming from health and nutrition. Clearly, there are times when related avian health issues of coccidiosis and necrotic enteritis cannot be overlooked and the development of efficacious vaccines for the latter disease would be advantageous. Presently, the inclusion of phytate-degrading enzymes in meat chicken diets is routine and, therefore, the implication that exogenous phytases may contribute to wet litter is given consideration. Opinion is somewhat divided as how best to counter the problem of wet litter as some see education and extension as being more beneficial than furthering research efforts. However, it may prove instructive to assess the practice of whole grain feeding in relation to litter quality and the incidence of footpad dermatitis. Additional research could investigate the relationships between dietary concentrations of key minerals and the application of exogenous enzymes with litter quality.

Relevance:

20.00%

Publisher:

Abstract:

Hydrologic impacts of climate change are usually assessed by downscaling the General Circulation Model (GCM) output of large-scale climate variables to local-scale hydrologic variables. Such an assessment is characterized by uncertainty resulting from the ensembles of projections generated with multiple GCMs, which is known as intermodel or GCM uncertainty. Ensemble averaging with the assignment of weights to GCMs based on model evaluation is one of the methods to address such uncertainty and is used in the present study for regional-scale impact assessment. GCM outputs of large-scale climate variables are downscaled to subdivisional-scale monsoon rainfall. Weights are assigned to the GCMs on the basis of model performance and model convergence, which are evaluated with the Cumulative Distribution Functions (CDFs) generated from the downscaled GCM output (for both 20th Century [20C3M] and future scenarios) and observed data. The ensemble averaging approach, with the assignment of weights to GCMs, is characterized by the uncertainty caused by partial ignorance, which stems from nonavailability of the outputs of some of the GCMs for a few scenarios (in Intergovernmental Panel on Climate Change [IPCC] data distribution center for Assessment Report 4 [AR4]). This uncertainty is modeled with imprecise probability, i.e., the probability being represented as an interval gray number. Furthermore, the CDF generated with one GCM is entirely different from that with another and therefore the use of multiple GCMs results in a band of CDFs. Representing this band of CDFs with a single valued weighted mean CDF may be misleading. Such a band of CDFs can only be represented with an envelope that contains all the CDFs generated with a number of GCMs. Imprecise CDF represents such an envelope, which not only contains the CDFs generated with all the available GCMs but also to an extent accounts for the uncertainty resulting from the missing GCM output.
This concept of imprecise probability is also validated in the present study. The imprecise CDFs of monsoon rainfall are derived for three 30-year time slices, 2020s, 2050s and 2080s, with A1B, A2 and B1 scenarios. The model is demonstrated with the prediction of monsoon rainfall in Orissa meteorological subdivision, which shows a possible decreasing trend in the future.
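The envelope construction can be sketched in a few lines of numpy: compute each model's empirical CDF on a common grid, then take pointwise minima and maxima across models. The three "GCM" samples below are synthetic stand-ins, not AR4 output:

```python
import numpy as np

def empirical_cdf(sample, grid):
    """P(X <= g) at each grid point, estimated from a sample."""
    s = np.sort(sample)
    return np.searchsorted(s, grid, side="right") / len(s)

# Synthetic downscaled rainfall from three hypothetical GCMs.
rng = np.random.default_rng(1)
gcm_samples = [rng.normal(loc=m, scale=1.0, size=200) for m in (9.5, 10.0, 10.8)]

grid = np.linspace(5, 15, 101)
cdfs = np.array([empirical_cdf(s, grid) for s in gcm_samples])

# The imprecise CDF: a lower/upper envelope bounding every model's CDF.
lower = cdfs.min(axis=0)
upper = cdfs.max(axis=0)

# Every individual CDF lies inside the band, unlike a single weighted mean.
assert np.all((cdfs >= lower) & (cdfs <= upper))
```

The width of the band at each grid point then expresses the intermodel (and, per the abstract, partial-ignorance) uncertainty directly, rather than collapsing it into one weighted mean curve.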

Relevance:

20.00%

Publisher:

Abstract:

Many species inhabit fragmented landscapes, resulting either from anthropogenic or from natural processes. The ecological and evolutionary dynamics of spatially structured populations are affected by a complex interplay between endogenous and exogenous factors. The metapopulation approach, simplifying the landscape to a discrete set of patches of breeding habitat surrounded by unsuitable matrix, has become a widely applied paradigm for the study of species inhabiting highly fragmented landscapes. In this thesis, I focus on the construction of biologically realistic models and their parameterization with empirical data, with the general objective of understanding how the interactions between individuals and their spatially structured environment affect ecological and evolutionary processes in fragmented landscapes. I study two hierarchically structured model systems, which are the Glanville fritillary butterfly in the Åland Islands, and a system of two interacting aphid species in the Tvärminne archipelago, both being located in South-Western Finland. The interesting and challenging feature of both study systems is that the population dynamics occur over multiple spatial scales that are linked by various processes. My main emphasis is in the development of mathematical and statistical methodologies. For the Glanville fritillary case study, I first build a Bayesian framework for the estimation of death rates and capture probabilities from mark-recapture data, with the novelty of accounting for variation among individuals in capture probabilities and survival. I then characterize the dispersal phase of the butterflies by deriving a mathematical approximation of a diffusion-based movement model applied to a network of patches. I use the movement model as a building block to construct an individual-based evolutionary model for the Glanville fritillary butterfly metapopulation. 
I parameterize the evolutionary model using a pattern-oriented approach, and use it to study how the landscape structure affects the evolution of dispersal. For the aphid case study, I develop a Bayesian model of hierarchical multi-scale metapopulation dynamics, where the observed extinction and colonization rates are decomposed into intrinsic rates operating specifically at each spatial scale. In summary, I show how analytical approaches, hierarchical Bayesian methods and individual-based simulations can be used individually or in combination to tackle complex problems from many different viewpoints. In particular, hierarchical Bayesian methods provide a useful tool for decomposing ecological complexity into more tractable components.

Relevance:

20.00%

Publisher:

Abstract:

Purpose: This paper aims to investigate how direct mail consumption contributes to brand relationship quality. Store flyers and other direct mailings continue to play a significant role in many companies’ communication strategies. Research on this topic predominantly investigates driving store traffic and sales. Less is known regarding the consumer side, such as the value that consumers may derive from the consumption of direct mailings and the effects of such a value on brand relationship quality. To address this limitation, this paper tests a causal model of the contribution of direct mail value to brand commitment, drawing on a value framework that integrates social theory of engagement regimes and literature on experiential customer value.

Design/methodology/approach: The empirical work of this paper is based on a rigorous four-study mixed-methods design, involving a qualitative study, confirmatory factor analysis and partial least squares structural modeling.

Findings: The authors develop two second-order formatively designed scales – familiar value and planned value – that illustrate the role of engagement regimes in consumer behavior. Although both types of value contribute equally to direct mail attachment, they exert contrasting effects on other mediational consumer responses, such as reading and gratitude. Finally, the proposed theoretical model appears to be robust in predicting customers’ brand commitment.

Research limitations/implications: This study provides new insights into the research on consumer value and brand relational communication.

Originality/value: This study is the first to consider consumer benefits from the social perspective of engagement regimes.

Relevance:

20.00%

Publisher:

Abstract:

Solidification processes are complex in nature, involving multiple phases and several length scales. The properties of solidified products are dictated by the microstructure, the macrostructure, and various defects present in the casting. These, in turn, are governed by the multiphase transport phenomena occurring at different length scales. In order to control and improve the quality of cast products, it is important to have a thorough understanding of the various physical and physicochemical phenomena occurring at various length scales, preferably through predictive models and controlled experiments. In this context, the modeling of transport phenomena during alloy solidification has evolved over the last few decades due to the complex multiscale nature of the problem. Despite this, a model accounting for all the important length scales directly is computationally prohibitive. Thus, in the past, single-phase continuum models have often been employed with respect to a single length scale to model solidification processing. However, continuous development in understanding the physics of solidification at various length scales on one hand and the phenomenal growth of computational power on the other have allowed researchers to use increasingly complex multiphase/multiscale models in recent times. These models have allowed greater understanding of the coupled micro/macro nature of the process and have made it possible to predict solute segregation and microstructure evolution at different length scales. In this paper, a brief overview of the current status of modeling of convection and macrosegregation in alloy solidification processing is presented.

Relevance:

20.00%

Publisher:

Abstract:

Two oxazolidine-2-thiones, thio-analogs of linezolid, were synthesized and their antibacterial properties evaluated. Unlike oxazolidinones, the thio-analogs did not inhibit the growth of Gram-positive bacteria. A molecular modeling study has been carried out to aid understanding of this unexpected finding.

Relevance:

20.00%

Publisher:

Abstract:

Platelet endothelial cell adhesion molecule 1 (PECAM-1) has many functions, including its roles in leukocyte extravasation as part of the inflammatory response and in the maintenance of vascular integrity through its contribution to endothelial cell−cell adhesion. PECAM-1 has been shown to mediate cell−cell adhesion through homophilic binding events that involve interactions between domain 1 of PECAM-1 molecules on adjacent cells. However, various heterophilic ligands of PECAM-1 have also been proposed. The possible interaction of PECAM-1 with glycosaminoglycans (GAGs) is the focus of this study. The three-dimensional structure of the extracellular immunoglobulin (Ig) domains of PECAM-1 was constructed using homology modeling and threading methods. Potential heparin/heparan sulfate-binding sites were predicted on the basis of their amino acid consensus sequences and a comparison with known structures of sulfate-binding proteins. Heparin and other GAG fragments have been docked to investigate the structural determinants of their protein-binding specificity and selectivity. The modeling has predicted two regions in PECAM-1 that appear to bind heparin oligosaccharides. A high-affinity binding site was located in Ig domains 2 and 3, and evidence for a low-affinity site in Ig domains 5 and 6 was obtained. These GAG-binding regions were distinct from regions involved in PECAM-1 homophilic interactions.