951 results for Probabilities
Abstract:
Deforestation often occurs in temporal waves and in localized fronts termed 'deforestation hotspots', driven by economic pulses and population pressure. Of particular concern for conservation planning are 'biodiversity hotspots', where high concentrations of endemic species undergo rapid loss and fragmentation of habitat. We investigate the deforestation process in Caqueta, a biodiversity hotspot and major colonization front of the Colombian Amazon, using multi-temporal satellite imagery for the periods 1989-1996-1999-2002. The probabilities of deforestation and regeneration were modeled against soil fertility, accessibility and neighborhood terms using logistic regression analysis. Deforestation and regeneration patterns and rates were highly variable across the colonization front. The regional average annual deforestation rate was 2.6%, but varied locally between -1.8% (regeneration) and 5.3%, with maximum rates in landscapes with 40-60% forest cover and the highest edge densities, a pattern analogous to the spread of disease. Soil fertility and forest and secondary-vegetation neighbors showed positive and significant relationships with the probability of deforestation. For forest regeneration, soil fertility had a significant negative effect, while the other parameters were marginally significant. The logistic regression models across all periods showed high discrimination power for both deforestation and forest regeneration, with ROC values > 0.80. We document the effect of policies and institutional changes on the land-clearing process, such as the failed peace process between the government and guerrillas in 1999-2002, which redirected the spread of deforestation and increased forest regeneration. The implications for conservation in biologically rich areas, such as Caqueta, are discussed. (c) 2005 Elsevier B.V. All rights reserved.
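The annualised rates quoted above (2.6% average, ranging from -1.8% to 5.3%) can be reproduced with the standard compound (logarithmic) change-rate formula; a minimal sketch, noting that the exact formula is an assumption since the abstract does not specify it:

```python
import math

def annual_change_rate(area_start, area_end, years):
    """Compound annual rate of forest loss: positive for deforestation,
    negative for net regeneration, matching the sign convention above."""
    return -math.log(area_end / area_start) / years

# A landscape losing forest: 100 km^2 -> 85 km^2 over 6 years
loss = annual_change_rate(100.0, 85.0, 6)
# A regenerating landscape: 100 km^2 -> 112 km^2 over 6 years
gain = annual_change_rate(100.0, 112.0, 6)
print(round(100 * loss, 1), round(100 * gain, 1))  # 2.7 -1.9 (percent per year)
```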
Abstract:
Effective detection of population trend is crucial for managing threatened species. Little theory exists, however, to assist managers in choosing the most cost-effective monitoring techniques for diagnosing trend. We present a framework for determining the optimal monitoring strategy by simulating a manager collecting data on a declining species, the Chestnut-rumped Hylacola (Hylacola pyrrhopygia parkeri), to determine whether the species should be listed under the IUCN (World Conservation Union) Red List. We compared the efficiencies of two strategies for detecting trend, abundance surveys and presence-absence surveys, under financial constraints. One might expect the abundance surveys to be superior under all circumstances because more information is collected at each site. Nevertheless, presence-absence data can be collected at more sites because the surveyor is not obliged to spend a fixed amount of time at each site. The optimal strategy for monitoring was highly dependent on the budget available. Under some circumstances, presence-absence surveys outperformed abundance surveys for diagnosing the IUCN Red List categories cost-effectively. Abundance surveys were best if the species was expected to be recorded more than 16 times/year; otherwise, presence-absence surveys were best. The relationship between the strategies we investigated is likely to be relevant for many comparisons of presence-absence or abundance data. Managers of any cryptic or low-density species who hope to maximize their chances of estimating trend should find an application for our results.
Abstract:
The effects of substance P (SP) on nicotinic acetylcholine (ACh)-evoked currents were investigated in parasympathetic neurons dissociated from neonatal rat intracardiac ganglia using standard whole cell, perforated patch, and outside-out recording configurations of the patch-clamp technique. Focal application of SP onto the soma reversibly decreased the peak amplitude of the ACh-evoked current with half-maximal inhibition occurring at 45 μM and complete block at 300 μM SP. Whole cell current-voltage (I-V) relationships obtained in the absence and presence of SP indicate that the block of ACh-evoked currents by SP is voltage independent. The rate of decay of ACh-evoked currents was increased sixfold in the presence of SP (100 μM), suggesting that SP may increase the rate of receptor desensitization. SP-induced inhibition of ACh-evoked currents was observed following cell dialysis and in the presence of either 1 mM 8-Br-cAMP, a membrane-permeant cAMP analogue, 5 μM H-7, a protein kinase C inhibitor, or 2 mM intracellular AMP-PNP, a nonhydrolyzable ATP analogue. These data suggest that a diffusible cytosolic second messenger is unlikely to mediate SP inhibition of neuronal nicotinic ACh receptor (nAChR) channels. Activation of nAChR channels in outside-out membrane patches by either ACh (3 μM) or cytisine (3 μM) indicates the presence of at least three distinct conductances (20, 35, and 47 pS) in rat intracardiac neurons. In the presence of 3 μM SP, the large conductance nAChR channels are preferentially inhibited. The open probabilities of the large conductance classes activated by either ACh or cytisine were reversibly decreased by 10- to 30-fold in the presence of SP. The single-channel conductances were unchanged, and mean apparent channel open times for the large conductance nAChR channels only were slightly decreased by SP.
Given that individual parasympathetic neurons of rat intracardiac ganglia express a heterogeneous population of nAChR subunits represented by the different conductance levels, SP appears to preferentially inhibit those combinations of nAChR subunits that form the large conductance nAChR channels. Since ACh is the principal neurotransmitter of extrinsic (vagal) innervation of the mammalian heart, SP may play an important role in modulating autonomic control of the heart.
Abstract:
The Tree Augmented Naïve Bayes (TAN) classifier relaxes the sweeping independence assumptions of the Naïve Bayes approach by taking account of conditional probabilities. It does this in a limited sense, by incorporating the conditional probability of each attribute given the class and (at most) one other attribute. The method of boosting has previously proven very effective in improving the performance of Naïve Bayes classifiers and in this paper, we investigate its effectiveness on application to the TAN classifier.
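The TAN factorisation described above can be made concrete with a hand-built example; the tree structure (attribute a2 taking a1 as its single extra parent) and all probability tables below are invented for illustration, not taken from the paper:

```python
# TAN factorisation: p(c, a1, a2) = p(c) * p(a1 | c) * p(a2 | c, a1),
# where a2's one extra parent is a1 ("at most one other attribute").
prior = {"pos": 0.6, "neg": 0.4}
p_a1 = {("pos", 1): 0.8, ("pos", 0): 0.2,
        ("neg", 1): 0.3, ("neg", 0): 0.7}
p_a2 = {("pos", 1, 1): 0.9, ("pos", 1, 0): 0.1,
        ("pos", 0, 1): 0.5, ("pos", 0, 0): 0.5,
        ("neg", 1, 1): 0.2, ("neg", 1, 0): 0.8,
        ("neg", 0, 1): 0.4, ("neg", 0, 0): 0.6}

def tan_posterior(a1, a2):
    """Class posterior p(c | a1, a2) under the TAN factorisation above."""
    joint = {c: prior[c] * p_a1[(c, a1)] * p_a2[(c, a1, a2)] for c in prior}
    z = sum(joint.values())
    return {c: v / z for c, v in joint.items()}

print(tan_posterior(1, 1))  # both attributes 'on' strongly favour "pos"
```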
Abstract:
Biologists are increasingly conscious of the critical role that noise plays in cellular functions such as genetic regulation, often in connection with fluctuations in small numbers of key regulatory molecules. This has inspired the development of models that capture the fundamentally discrete and stochastic nature of cellular biology, most notably the Gillespie stochastic simulation algorithm (SSA). The SSA simulates a temporally homogeneous, discrete-state, continuous-time Markov process, in which the probabilities and numbers of each molecular species must of course remain positive. While accurately serving this purpose, the SSA can be computationally inefficient due to very small time steps, so faster approximations such as the Poisson and binomial τ-leap methods have been suggested. This work places these leap methods in the context of numerical methods for the solution of stochastic differential equations (SDEs) driven by Poisson noise. This allows analogues of the Euler-Maruyama, Milstein and even higher-order methods to be developed through Itô-Taylor expansions, as well as similar derivative-free Runge-Kutta approaches. Numerical results demonstrate that these novel methods compare favourably with existing techniques for simulating biochemical reactions, capturing crucial properties such as the mean and variance more accurately.
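As a toy illustration of the basic Poisson τ-leap idea mentioned above (not the authors' higher-order schemes), the following sketch simulates a single decay reaction and compares the sample mean against the analytic value; the reaction, rate constant and step size are illustrative assumptions:

```python
import math, random

def poisson_knuth(lam, rng):
    """Knuth's Poisson sampler; adequate for the small per-step means here."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def tau_leap_decay(x0, c, t_end, tau, rng):
    """Poisson tau-leap for the decay reaction X -> 0 (propensity c*x):
    each step fires Poisson(c*x*tau) reaction events, clamped so the
    molecule count stays non-negative, as the abstract requires."""
    x, t = x0, 0.0
    while t < t_end - 1e-12:
        x -= min(x, poisson_knuth(c * x * tau, rng))
        t += tau
    return x

rng = random.Random(42)
runs = [tau_leap_decay(100, 0.5, 2.0, 0.05, rng) for _ in range(2000)]
mean = sum(runs) / len(runs)
print(mean)  # close to the analytic mean 100*exp(-0.5*2.0) ~ 36.8, up to a small stepping bias
```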
Abstract:
Traditionally, machine learning algorithms have been evaluated in applications where assumptions can be reliably made about class priors and/or misclassification costs. In this paper, we consider the case of imprecise environments, where little may be known about these factors and they may well vary significantly when the system is applied. Specifically, the use of precision-recall analysis is investigated and compared to better-known performance measures such as error rate and the receiver operating characteristic (ROC). We argue that while ROC analysis is invariant to variations in class priors, this invariance in fact hides an important factor of the evaluation in imprecise environments. Therefore, we develop a generalised precision-recall analysis methodology in which variation due to prior class probabilities is incorporated into a multi-way analysis of variance (ANOVA). The increased sensitivity and reliability of this approach is demonstrated in a remote sensing application.
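The point that ROC invariance can hide prior-dependent behaviour is easy to make concrete: the precision implied by a fixed ROC operating point shifts with the class prior. A minimal sketch with illustrative numbers:

```python
def precision_at(tpr, fpr, prior_pos):
    """Precision implied by a fixed ROC operating point (tpr, fpr) under a
    given positive-class prior.  The ROC point is prior-invariant; the
    precision it implies is not."""
    tp = tpr * prior_pos
    fp = fpr * (1.0 - prior_pos)
    return tp / (tp + fp)

print(precision_at(0.9, 0.1, 0.5))   # balanced classes: 0.9
print(precision_at(0.9, 0.1, 0.01))  # rare positives: same ROC point, ~0.083
```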
Abstract:
The XSophe computer simulation software suite, consisting of a daemon, the XSophe interface and the computational program Sophe, is a state-of-the-art package for the simulation of electron paramagnetic resonance spectra. The Sophe program performs the computer simulation and includes a number of new technologies: the SOPHE partition and interpolation schemes, a field segmentation algorithm, homotopy, parallelisation and spectral optimisation. The SOPHE partition and interpolation scheme, along with the field segmentation algorithm, greatly increases the speed of simulations for most systems. Multidimensional homotopy provides an efficient method for accurately tracing energy levels, and hence transitions, in the presence of energy level anticrossings and looping transitions, and allows computer simulations in frequency space. Recent enhancements to Sophe include the generalised treatment of distributions of orientational parameters, termed the mosaic misorientation linewidth model, and a faster, more efficient algorithm for the calculation of resonant field positions and transition probabilities. For complex systems the parallelisation enables their simulation on a parallel computer, and the optimisation algorithms in the suite provide the experimentalist with the possibility of finding the spin Hamiltonian parameters in a systematic manner rather than by trial and error. The XSophe software suite has been used to simulate multifrequency EPR spectra (200 MHz to 600 GHz) from isolated spin systems (S ≥ 1/2) and coupled centres (Si, Sj ≥ 1/2). Griffin, M.; Muys, A.; Noble, C.; Wang, D.; Eldershaw, C.; Gates, K.E.; Burrage, K.; Hanson, G.R. "XSophe, a Computer Simulation Software Suite for the Analysis of Electron Paramagnetic Resonance Spectra", 1999, Mol. Phys. Rep., 26, 60-84.
Abstract:
In this paper we propose a fast adaptive Importance Sampling method for the efficient simulation of buffer overflow probabilities in queueing networks. The method comprises three stages. First we estimate the minimum Cross-Entropy tilting parameter for a small buffer level; next, we use this as a starting value for the estimation of the optimal tilting parameter for the actual (large) buffer level; finally, the tilting parameter just found is used to estimate the overflow probability of interest. We recognize three distinct properties of the method which together explain why the method works well; we conjecture that they hold for quite general queueing networks. Numerical results support this conjecture and demonstrate the high efficiency of the proposed algorithm.
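The final stage described above, estimating the rare-event probability under a tilted measure, can be sketched on a toy birth-death walk (a caricature of a single queue, not the paper's adaptive Cross-Entropy algorithm); swapping the up/down probabilities plays the role of the tilting parameter, a classical choice for this toy model:

```python
import random

def overflow_prob_is(p_up, level, n_samples, seed=1):
    """Importance-sampling estimate of the probability that a birth-death
    walk started at 1 reaches `level` (buffer overflow) before emptying
    to 0, sampled under a tilted walk with up/down probabilities swapped.
    Every path that reaches `level` has the same likelihood ratio,
    (p_up/q_up)**(level - 1), so the estimator has very low variance."""
    rng = random.Random(seed)
    q_up = 1.0 - p_up                      # tilted up-probability
    lr = (p_up / q_up) ** (level - 1)      # likelihood ratio of a successful path
    hits = 0
    for _ in range(n_samples):
        x = 1
        while 0 < x < level:
            x += 1 if rng.random() < q_up else -1
        hits += (x == level)
    return lr * hits / n_samples

est = overflow_prob_is(0.3, 15, 20000)
r = 0.7 / 0.3
exact = (r - 1.0) / (r ** 15 - 1.0)        # gambler's-ruin closed form for comparison
print(est, exact)
```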
Abstract:
This study seeks to identify the extent to which social class membership shapes the meanings and perspectives that young Brazilians attach to spectacle football, and to understand the social mechanisms behind the decision to invest in a professional sporting career rather than an extended school trajectory. We begin with a sociological approach: a critical analysis of the diffusion, massification and professionalisation of football in the context of post-industrial society. We focus on the conflicted transformation of the sport, initially an elite practice serving social distinction, into a mass sport idealised as a route of social ascent for the working class; on its dependence on the media; on the reasons it came to be characterised as a product of the cultural industry; and, finally, on the transformation of football into a commodity subject to the laws and logic of consumer society. We then trace the relations of power, alliance and competition among the social agents in the complex field of sporting practices, and the channels through which young people come into contact with spectacle sport. Special attention is given to the media, which disseminates the ideology of an open capitalist society, reinforcing the idea of sport as a route of social ascent for low-income individuals while concealing in its discourse the real probabilities of achieving sporting success. Field research was carried out in two schools in the municipality of São Bernardo do Campo (SP), one in the state public system and one private. We thus formed two groups of students from different social classes, composed of male first-year secondary-school students, aged 15, who played football in Physical Education classes. Methodologically, we used participant observation and interviews as data-collection instruments.
In line with the study's objectives, we organised the analysis of the collected material into six categories, articulating questions about continuing in school and work, tendencies towards professional sporting practice, social representations of football, uses and customs of free time, and expectations of sporting practice shaped by family cultural heritage. As the theoretical framework for the analysis we drew on Pierre Bourdieu's concepts of field, habitus, strategy, and economic, social and, above all, cultural capital, starting from the hypothesis that the cultural level of the students and their families influences the meanings and forms of appropriation of sport. Among other conclusions, we found that a professional sporting trajectory was oriented towards students from the lower social class, in opposition to the long school trajectory strategically adopted by students from the upper social class.
Abstract:
Minimization of a sum-of-squares or cross-entropy error function leads to network outputs which approximate the conditional averages of the target data, conditioned on the input vector. For classification problems, with a suitably chosen target coding scheme, these averages represent the posterior probabilities of class membership, and so can be regarded as optimal. For problems involving the prediction of continuous variables, however, the conditional averages provide only a very limited description of the properties of the target variables. This is particularly true for problems in which the mapping to be learned is multi-valued, as often arises in the solution of inverse problems, since the average of several correct target values is not necessarily itself a correct value. In order to obtain a complete description of the data, for the purposes of predicting the outputs corresponding to new input vectors, we must model the conditional probability distribution of the target data, again conditioned on the input vector. In this paper we introduce a new class of network models obtained by combining a conventional neural network with a mixture density model. The complete system is called a Mixture Density Network, and can in principle represent arbitrary conditional probability distributions in the same way that a conventional neural network can represent arbitrary functions. We demonstrate the effectiveness of Mixture Density Networks using both a toy problem and a problem involving robot inverse kinematics.
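The mixture head of a Mixture Density Network can be sketched as follows; the output-to-parameter mappings (softmax for mixing coefficients, exponential for widths) follow the usual MDN construction, while the raw outputs are hard-coded, invented numbers standing in for what the network would produce for a given input:

```python
import math

def mdn_density(raw_outputs, y):
    """Evaluate p(y|x) for a Gaussian Mixture Density Network head: the raw
    network outputs are mapped to mixing coefficients via softmax, means
    directly, and standard deviations via exp to keep them positive."""
    raw_pi, mus, raw_sigma = raw_outputs
    exps = [math.exp(a) for a in raw_pi]
    pis = [e / sum(exps) for e in exps]           # softmax mixing coefficients
    sigmas = [math.exp(s) for s in raw_sigma]     # positive widths
    return sum(
        p * math.exp(-0.5 * ((y - mu) / sg) ** 2) / (sg * math.sqrt(2 * math.pi))
        for p, mu, sg in zip(pis, mus, sigmas)
    )

# Two components: a multi-valued target (e.g. two inverse-kinematics solutions)
raw = ([0.0, 0.0], [-1.0, 1.0], [math.log(0.2), math.log(0.2)])
print(mdn_density(raw, -1.0), mdn_density(raw, 0.0))  # bimodal: high at each mode, low between
```

Note that the conditional average (here 0.0) falls exactly in the low-density trough between the two modes, which is the failure of conditional averages the abstract describes.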
Abstract:
Radial Basis Function networks with linear outputs are often used in regression problems because they can be substantially faster to train than Multi-layer Perceptrons. For classification problems, the use of linear outputs is less appropriate as the outputs are not guaranteed to represent probabilities. We show how RBFs with logistic and softmax outputs can be trained efficiently using the Fisher scoring algorithm. This approach can be used with any model which consists of a generalised linear output function applied to a model which is linear in its parameters. We compare this approach with standard non-linear optimisation algorithms on a number of datasets.
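A Fisher scoring update for a logistic output applied to fixed basis functions can be sketched as follows; the toy dataset, the single RBF centre and the tiny ridge term are illustrative assumptions, not the paper's experiments:

```python
import math

def rbf(x, centre, width=1.0):
    return math.exp(-((x - centre) ** 2) / (2.0 * width ** 2))

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy 1-D problem (invented): class 1 inside |x| < 1, class 0 outside,
# modelled as a logistic output on a bias plus a single RBF centred at 0 --
# a model that is linear in its parameters, as the abstract requires.
xs = [-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0]
ts = [1.0 if abs(x) < 1.0 else 0.0 for x in xs]
phi = [(1.0, rbf(x, 0.0)) for x in xs]

w = [0.0, 0.0]
for _ in range(4):
    ys = [sigmoid(w[0] * f0 + w[1] * f1) for f0, f1 in phi]
    # Fisher scoring / IRLS step: w += (Phi' R Phi)^-1 Phi' (t - y),
    # with R = diag(y(1 - y)); a tiny ridge keeps the 2x2 solve stable.
    g0 = sum((t - y) * f0 for (f0, f1), t, y in zip(phi, ts, ys))
    g1 = sum((t - y) * f1 for (f0, f1), t, y in zip(phi, ts, ys))
    h00 = sum(y * (1 - y) * f0 * f0 for (f0, _), y in zip(phi, ys)) + 1e-8
    h01 = sum(y * (1 - y) * f0 * f1 for (f0, f1), y in zip(phi, ys))
    h11 = sum(y * (1 - y) * f1 * f1 for (_, f1), y in zip(phi, ys)) + 1e-8
    det = h00 * h11 - h01 * h01
    w[0] += (h11 * g0 - h01 * g1) / det
    w[1] += (h00 * g1 - h01 * g0) / det

preds = [1.0 if sigmoid(w[0] * f0 + w[1] * f1) > 0.5 else 0.0 for f0, f1 in phi]
print(preds == ts)  # True: the fitted model separates the toy classes
```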
Abstract:
This report outlines the derivation and application of a Gaussian process with a non-zero mean and a polynomial-exponential covariance function, which forms the prior wind field model used in 'autonomous' disambiguation. It is used principally because the non-zero mean permits the computation of realistic local wind vector prior probabilities, required when applying the scaled-likelihood trick, as the marginals of the full wind field prior. As the full prior is multivariate normal, these marginals are very simple to compute.
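The closing remark, that the marginals of a multivariate normal are simple to compute, amounts to selecting the corresponding mean entries and covariance sub-block; a sketch with invented numbers:

```python
# Illustrative 3-variable Gaussian prior (numbers invented); the marginal
# over any subset of variables keeps the matching mean entries and
# covariance block unchanged.
mean = [2.0, -1.0, 0.5]
cov = [[1.0, 0.3, 0.1],
       [0.3, 2.0, 0.4],
       [0.1, 0.4, 1.5]]

def mvn_marginal(mean, cov, idx):
    """Mean vector and covariance matrix of the marginal over `idx`."""
    return ([mean[i] for i in idx],
            [[cov[i][j] for j in idx] for i in idx])

m, c = mvn_marginal(mean, cov, [0, 2])
print(m, c)  # [2.0, 0.5] [[1.0, 0.1], [0.1, 1.5]]
```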
Abstract:
In many problems in spatial statistics it is necessary to infer a global problem solution by combining local models. A principled approach to this problem is to develop a global probabilistic model for the relationships between local variables and to use this as the prior in a Bayesian inference procedure. We show how a Gaussian process with hyper-parameters estimated from Numerical Weather Prediction Models yields meteorologically convincing wind fields. We use neural networks to make local estimates of wind vector probabilities. The resulting inference problem cannot be solved analytically, but Markov Chain Monte Carlo methods allow us to retrieve accurate wind fields.
Abstract:
Radial Basis Function networks with linear outputs are often used in regression problems because they can be substantially faster to train than Multi-layer Perceptrons. For classification problems, the use of linear outputs is less appropriate as the outputs are not guaranteed to represent probabilities. In this paper we show how RBFs with logistic and softmax outputs can be trained efficiently using algorithms derived from Generalised Linear Models. This approach is compared with standard non-linear optimisation algorithms on a number of datasets.