910 results for Bayesian maximum entropy
Abstract:
We studied the distribution of Palearctic green toads (Bufo viridis subgroup), an anuran species group with three ploidy levels, inhabiting the Central Asian Amudarya River drainage. Several approaches (one-way and multivariate analyses, variance components analysis, and maximum entropy modelling) were used to estimate the effect of altitude, precipitation, temperature and land vegetation cover on the distribution of the toads. It is usually assumed that polyploid species occur in regions with harsher climatic conditions (higher latitudes, elevations, etc.), but for the green toad complex we revealed a more intricate situation. The diploid species (Bufo shaartusiensis and Bufo turanensis) inhabit the arid lowlands (44 to 789 m a.s.l.), while the tetraploid Bufo pewzowi was recorded in mountainous regions (340-3492 m a.s.l.) with generally lower temperatures and higher precipitation than the region inhabited by the diploid species. The triploid species Bufo baturae was found in the Pamirs (Tajikistan) at the highest altitudes (2503-3859 m a.s.l.), under the harshest climatic conditions.
Abstract:
We estimated the geographic distributions of triatomine species in the Central-West Region of Brazil (CW) and analysed the climatic factors influencing their occurrence. A total of 3,396 records of 27 triatomine species were analysed. Using the maximum entropy method, ecological niche models were produced for the eight species occurring in at least 20 municipalities, based on 13 climatic variables and elevation. Triatoma sordida and Rhodnius neglectus were the species with the broadest geographic distributions in CW Brazil. The Cerrado areas in the state of Goiás were found to be more suitable for the occurrence of synanthropic triatomines than the Amazon forest areas in the northern part of the state of Mato Grosso. The variable that best explains the evaluated models is temperature seasonality. The results indicate that almost the entire region presents climatic conditions appropriate for at least one triatomine species; it is therefore recommended that entomological surveillance be reinforced in CW Brazil.
Abstract:
To understand the geographic distribution of visceral leishmaniasis (VL) in the state of Mato Grosso do Sul (MS), Brazil, both the climatic niches of Lutzomyia longipalpis and VL cases were analysed. Distributional data were obtained from 55 of the 79 counties of MS between 2003 and 2012. Ecological niche models (ENM) of Lu. longipalpis and VL cases were produced using the maximum entropy algorithm based on eight climatic variables. Lu. longipalpis showed a wide distribution in MS. The highest climatic suitability for Lu. longipalpis was observed in southern MS. Temperature seasonality and annual mean precipitation were the variables that most influenced these models. Two areas of high climatic suitability for the occurrence of VL cases were predicted: one near Aquidauana and another encompassing several municipalities in the southeast region of MS. As expected, a large overlap between the models for Lu. longipalpis and VL cases was detected. Northern and northwestern areas of MS were suitable for the occurrence of cases but did not show high climatic suitability for Lu. longipalpis. ENMs of vectors and human cases provided a greater understanding of the geographic distribution of VL in MS, which can be applied to the development of future surveillance strategies.
Abstract:
The properties and cosmological importance of a class of non-topological solitons, Q-balls, are studied. Aspects of Q-ball solutions and Q-ball cosmology discussed in the literature are reviewed. Q-balls are particularly considered in the Minimal Supersymmetric Standard Model with supersymmetry broken by a hidden sector mechanism mediated by either gravity or gauge interactions. Q-ball profiles, charge-energy relations and evaporation rates for realistic Q-ball profiles are calculated for general polynomial potentials and for the gravity mediated scenario. In all cases, the evaporation rates are found to increase with decreasing charge. Q-ball collisions are studied by numerical means in the two supersymmetry breaking scenarios. It is noted that the collision processes can be divided into three types: fusion, charge transfer and elastic scattering. Cross-sections are calculated for the different types of processes in the different scenarios. The formation of Q-balls from the fragmentation of the Affleck-Dine condensate is studied by numerical and analytical means. The charge distribution is found to depend strongly on the initial energy-charge ratio of the condensate. The final state is typically noted to consist of Q- and anti-Q-balls in a state of maximum entropy. By studying the relaxation of excited Q-balls, the rate at which excess energy can be emitted is calculated in the gravity mediated scenario. The Q-ball is also found to withstand excess energy well without significant charge loss. The possible cosmological consequences of these Q-ball properties are discussed.
Abstract:
A wide range of modelling algorithms is used by ecologists, conservation practitioners, and others to predict species ranges from point locality data. Unfortunately, the amount of data available is limited for many taxa and regions, making it essential to quantify the sensitivity of these algorithms to sample size. This is the first study to address this need by rigorously evaluating a broad suite of algorithms with independent presence-absence data from multiple species and regions. We evaluated predictions from 12 algorithms for 46 species (from six different regions of the world) at three sample sizes (100, 30, and 10 records). We used data from natural history collections to run the models, and evaluated the quality of model predictions with the area under the receiver operating characteristic curve (AUC). With decreasing sample size, model accuracy decreased and variability increased across species and between models. Novel modelling methods that incorporate both interactions between predictor variables and complex response shapes (i.e. GBM, MARS-INT, BRUTO) performed better than most methods at large sample sizes but not at the smallest sample sizes. Other algorithms were much less sensitive to sample size, including an algorithm based on maximum entropy (MAXENT) that had among the best predictive power across all sample sizes. Relative to other algorithms, a distance metric algorithm (DOMAIN) and a genetic algorithm (OM-GARP) had intermediate performance at the largest sample size and among the best performance at the lowest sample size. No algorithm predicted consistently well with small sample sizes (n < 30); this should encourage highly conservative use of predictions based on small samples, restricting them to exploratory modelling.
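The AUC metric used above can be computed, equivalently, as the Mann-Whitney rank statistic: the probability that a randomly chosen presence site receives a higher suitability score than a randomly chosen absence site. A minimal sketch, with invented suitability scores (not data from the study):

```python
# AUC as the Mann-Whitney rank statistic: probability that a random
# presence site scores higher than a random absence site (ties = 0.5).
def auc(scores_presence, scores_absence):
    wins = 0.0
    for p in scores_presence:
        for a in scores_absence:
            if p > a:
                wins += 1.0
            elif p == a:
                wins += 0.5
    return wins / (len(scores_presence) * len(scores_absence))

# Hypothetical suitability scores from a fitted distribution model:
presence = [0.9, 0.8, 0.75, 0.4]   # sites where the species was found
absence  = [0.6, 0.3, 0.2, 0.1]    # sites where it was not
print(auc(presence, absence))       # 0.9375: good discrimination
```

A value of 0.5 means no better than chance; 1.0 means perfect ranking of presences above absences, which is why AUC is a convenient common currency across the 12 algorithms compared.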
Abstract:
The inversion problem concerning the windowed Fourier transform is considered. It is shown that, out of the infinite solutions that the problem admits, the windowed Fourier transform is the "optimal" solution according to a maximum-entropy selection criterion.
Abstract:
This correspondence studies the formulation of members of the Cohen-Posch class of positive time-frequency energy distributions. Minimization of cross-entropy measures with respect to different priors, and the case of no prior or maximum entropy, were considered. It is concluded that, in general, the information provided by the classical marginal constraints is very limited, and thus the final distribution heavily depends on the prior distribution. To overcome this limitation, joint time and frequency marginals are derived based on a "direction invariance" criterion on the time-frequency plane that are directly related to the fractional Fourier transform.
Abstract:
Maximum entropy modeling (Maxent) is a widely used algorithm for predicting species distributions across space and time. Properly assessing the uncertainty in such predictions is non-trivial and requires validation with independent datasets. Notably, model complexity (number of model parameters) remains a major concern in relation to overfitting and, hence, transferability of Maxent models. An emerging approach is to validate the cross-temporal transferability of model predictions using paleoecological data. In this study, we assess the effect of model complexity on the performance of Maxent projections across time using two European plant species (Alnus glutinosa (L.) Gaertn. and Corylus avellana L.) with an extensive late Quaternary fossil record in Spain as a study case. We fit 110 models with different levels of complexity under present time and tested model performance using AUC (area under the receiver operating characteristic curve) and AICc (corrected Akaike Information Criterion) through the standard procedure of randomly partitioning current occurrence data. We then compared these results to an independent validation by projecting the models to mid-Holocene (6,000 years before present) climatic conditions in Spain to assess their ability to predict fossil pollen presence-absence and abundance. We find that calibrating Maxent models with default settings results in overly complex models. While model performance increased with model complexity when predicting current distributions, it was highest at intermediate complexity when predicting mid-Holocene distributions. Hence, models of intermediate complexity offered the best trade-off for predicting species distributions across time. Reliable temporal model transferability is especially relevant for forecasting species distributions under future climate change. Consequently, species-specific model tuning should be used to find the best modeling settings to control for complexity, notably with paleoecological data to independently validate model projections. For cross-temporal projections of species distributions for which paleoecological data are not available, models of intermediate complexity should be selected.
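The AICc criterion used above penalizes complexity more strongly as the parameter count approaches the sample size, which is exactly what discourages overly complex Maxent fits. A minimal sketch of the standard formulas, with invented log-likelihood values:

```python
def aic(log_likelihood, k):
    """Akaike Information Criterion: 2k - 2 ln L."""
    return 2 * k - 2 * log_likelihood

def aicc(log_likelihood, k, n):
    """Small-sample corrected AIC; the extra penalty 2k(k+1)/(n-k-1)
    blows up as the number of parameters k approaches sample size n."""
    return aic(log_likelihood, k) + 2 * k * (k + 1) / (n - k - 1)

# Hypothetical fits at n = 50 occurrences: the complex model gains a
# little likelihood but pays a much larger complexity penalty.
simple   = aicc(log_likelihood=-120.0, k=5,  n=50)
complex_ = aicc(log_likelihood=-115.0, k=20, n=50)
print(simple, complex_)   # the simpler model has the lower (better) AICc
```

Lower AICc is better, so under this criterion the model of intermediate complexity would be selected even though the more complex one fits the calibration data slightly better.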
Abstract:
Coherent anti-Stokes Raman scattering (CARS) is a powerful method of laser spectroscopy in which significant successes have been achieved. However, the non-linear nature of CARS complicates the analysis of the recorded spectra. The objective of this thesis is to develop a new phase retrieval algorithm for CARS. It utilizes the maximum entropy method and a new wavelet approach for spectroscopic background correction of the phase function. The method was designed to be easily automated and applied to large numbers of spectra of different substances. The algorithm was successfully tested on experimental data.
Abstract:
In this dissertation, active galactic nuclei (AGN) are discussed as they are seen with the high-resolution radio-astronomical technique called Very Long Baseline Interferometry (VLBI). This observational technique provides very high angular resolution (~10⁻³ arcseconds = 1 milliarcsecond). VLBI observations performed at different radio frequencies (multi-frequency VLBI) allow one to penetrate deep into the core of an AGN and reveal the otherwise obscured inner part of the jet and the vicinity of the AGN's central engine. Multi-frequency VLBI data are used to scrutinize the structure and evolution of the jet, as well as the distribution of the polarized emission. These data can help to derive the properties of the plasma and the magnetic field, and to constrain the jet composition and the parameters of the emission mechanisms. VLBI data can also be used to test possible physical processes in the jet by comparing observational results with numerical simulations. The work presented in this thesis contributes to different aspects of AGN physics, as well as to the methodology of VLBI data reduction. In particular, Paper I reports evidence that the optical and radio emission of AGN comes from the same region in the inner jet. This result was obtained via simultaneous observations of linear polarization of a sample of AGN in the optical and, using the VLBI technique, in the radio. Papers II and III describe in detail the jet kinematics of the blazar 0716+714, based on multi-frequency data, and reveal a peculiar kinematic pattern: plasma in the inner jet appears to move substantially faster than that in the large-scale jet. This peculiarity is explained in Paper III by bending of the jet. Paper III also presents a test of a new imaging technique for VLBI data, the Generalized Maximum Entropy Method (GMEM), with observed (not simulated) data, and compares its results with conventional imaging.
Papers IV and V report the results of observations of circularly polarized (CP) emission in AGN at small spatial scales. In particular, Paper IV presents values of the core CP for 41 AGN at 15, 22 and 43 GHz, obtained with the help of the standard gain transfer (GT) method, previously developed by D. Homan and J. Wardle for the calibration of multi-source VLBI observations. That method was designed for long multi-source observations, in which many AGN are observed in a single VLBI run. In contrast, in Paper V an attempt is made to apply the GT method to single-source VLBI observations. In such observations the object list includes only a few sources, a target source and two or three calibrators, and the run is much shorter than a multi-source experiment. For the CP calibration of a single-source observation, one of the calibrators must be a source with zero or known CP. If archival observations included such a source among their calibrators, the GT method could also be applied to archival data, enlarging the list of AGN with known CP at small spatial scales. Paper V also quantifies the contributions of different sources of error to the uncertainty of the final result, and presents the first results for the blazar 0716+714.
Abstract:
This work presents the application of a computer package for generating projection data for neutron computerized tomography and, in its second part, discusses an application of neutron tomography, using projection data obtained by the Monte Carlo technique, to the detection and localization of light materials, such as those containing hydrogen, concealed by heavy materials such as iron and lead. For tomographic reconstruction of the simulated samples, only six equally spaced projection angles between 0º and 180º were used, with reconstruction performed by an algorithm (ARIEM) based on the principle of maximum entropy. With neutron tomography it was possible to detect and locate polyethylene and water hidden behind 1 cm thick lead and iron. This demonstrates that thermal neutron tomography is a viable test method that can provide important information about the interior of test components, and is thus extremely useful in routine industrial applications.
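ARIEM itself is not described in the abstract, so purely as an illustration of the maximum-entropy reconstruction principle it invokes, here is a toy multiplicative update in the style of MART (Multiplicative Algebraic Reconstruction Technique): each measured ray sum rescales the pixels it crosses, and among all images consistent with the data the iteration converges to the maximum-entropy one. The 2x2 image and ray geometry are invented:

```python
# Toy MART-style maximum-entropy reconstruction (illustration only; not
# the ARIEM algorithm). Each ray sum multiplicatively rescales the
# pixels it crosses until all measurements are matched.
def mart(rays, measurements, n_pixels, n_iter=200):
    """rays[i] is the list of pixel indices crossed by ray i."""
    x = [1.0] * n_pixels                   # flat (maximum-entropy) start
    for _ in range(n_iter):
        for ray, p in zip(rays, measurements):
            s = sum(x[j] for j in ray)     # current ray sum
            factor = p / s
            for j in ray:
                x[j] *= factor             # multiplicative correction
    return x

# 2x2 image with pixels 0 1 / 2 3, measured by row sums and column sums
# (an underdetermined system, so maximum entropy picks the solution).
rays = [[0, 1], [2, 3], [0, 2], [1, 3]]
measurements = [3.0, 1.0, 2.0, 2.0]
img = mart(rays, measurements, 4)
print([round(v, 3) for v in img])   # → [1.5, 1.5, 0.5, 0.5]
```

With only row and column sums the system is underdetermined, and the multiplicative iteration lands on the maximum-entropy image: the outer product of row and column totals, mirroring how only six projection angles can still yield a usable reconstruction.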
Abstract:
A linear prediction procedure is one of the established numerical methods of signal processing. In optical spectroscopy it is used mainly for extrapolating known parts of an optical signal, either to obtain a longer signal or to deduce missing samples. The former is needed particularly when narrowing spectral lines for the purpose of extracting spectral information. In the present paper, coherent anti-Stokes Raman scattering (CARS) spectra were investigated. The spectra were significantly distorted by the presence of a nonlinear nonresonant background, and the line shapes were far from Gaussian/Lorentzian profiles. To overcome these disadvantages, the maximum entropy method (MEM) was used for phase spectrum retrieval. The resulting broad MEM spectra then underwent linear prediction analysis in order to be narrowed.
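The linear prediction step can be sketched generically: each new sample is modelled as a fixed linear combination of the previous p samples, with coefficients fit by least squares on the known part of the signal. This is a minimal order-2 sketch on a synthetic sinusoid, not the authors' implementation:

```python
# Minimal order-2 forward linear prediction: fit coefficients on the
# known signal, then extrapolate sample by sample.
import math

def fit_lp2(x):
    """Least-squares order-2 predictor: x[n] ~ a1*x[n-1] + a2*x[n-2]."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for n in range(2, len(x)):
        s11 += x[n-1] * x[n-1]
        s12 += x[n-1] * x[n-2]
        s22 += x[n-2] * x[n-2]
        b1  += x[n] * x[n-1]
        b2  += x[n] * x[n-2]
    det = s11 * s22 - s12 * s12          # solve the 2x2 normal equations
    return ((b1 * s22 - b2 * s12) / det, (s11 * b2 - s12 * b1) / det)

def extrapolate(x, steps):
    a1, a2 = fit_lp2(x)
    y = list(x)
    for _ in range(steps):
        y.append(a1 * y[-1] + a2 * y[-2])
    return y

# A pure sinusoid obeys an exact order-2 recurrence, so the fitted
# predictor continues the oscillation essentially perfectly.
signal = [math.sin(0.3 * n) for n in range(40)]
extended = extrapolate(signal, 10)
```

Real spectral lines need higher prediction orders (and, as the abstract notes, background removal first), but the extrapolation mechanism is the same.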
Abstract:
Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance, and the thermodynamic relationship between heat dissipation and energy storage is altered by ingestion of a high-energy diet. Core temperature records in humans and rodents exhibit time-series characteristics, such as autoreference and stationarity, that are well suited to stochastic analysis. To identify this change we used, for the first time, a stochastic autoregressive model, whose concepts match those of the physiological systems involved, applied to male HFD rats compared with age-matched male controls on a standard diet (n = 7 per group). By analyzing the recorded temperature time series, we were able to identify when thermal homeostasis was affected by the new diet. The autoregressive time-series model (AR model) was used to predict the occurrence of thermal homeostasis and proved very effective in distinguishing this physiological disorder. We therefore infer from our results that maximum entropy distributions, as a means of stochastic characterization of temperature time-series records, may become an important early tool to aid in the diagnosis and prevention of metabolic diseases, owing to their ability to detect small variations in the thermal profile.
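The autoregressive idea can be sketched with the simplest case, an AR(1) model x[t] - mu = phi*(x[t-1] - mu) + noise, fit by least squares: a homeostatic series reverts quickly to its mean (small |phi|), while a drifting series looks highly persistent (phi near 1). The temperature series below are invented for illustration, not data from the study:

```python
# AR(1) fit by least squares; a shift in (mu, phi) between recording
# periods would flag a change in thermal regulation.
def fit_ar1(x):
    """Returns (mu, phi) for x[t] - mu = phi*(x[t-1] - mu) + noise."""
    mu = sum(x) / len(x)
    num = sum((x[t] - mu) * (x[t-1] - mu) for t in range(1, len(x)))
    den = sum((x[t-1] - mu) ** 2 for t in range(1, len(x)))
    return mu, num / den

# Hypothetical core-temperature registers (deg C): a stable baseline
# versus a series drifting upward after a diet change.
baseline = [37.0, 37.1, 36.9, 37.0, 37.2, 36.8, 37.0, 37.1, 36.9, 37.0]
shifted  = [37.0, 37.2, 37.3, 37.5, 37.6, 37.8, 37.9, 38.1, 38.2, 38.4]

mu0, phi0 = fit_ar1(baseline)
mu1, phi1 = fit_ar1(shifted)
print(round(phi0, 2), round(phi1, 2))
```

The baseline's fluctuations around 37 °C yield a small (here negative) phi, while the trending series yields phi near 1; tracking such parameter shifts over time is the kind of early signal the abstract describes.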
Abstract:
Thesis digitized by the Division de la gestion de documents et des archives, Université de Montréal.
Abstract:
In Fillmore's frame semantics, words take their meaning from the eventive or situational context in which they occur. FrameNet, a lexical resource for English, defines roughly 1,000 conceptual frames, covering most possible contexts. Within a conceptual frame, a predicate calls for arguments to fill the various semantic roles associated with the frame (for example: Victim, Manner, Recipient, Speaker). We seek to annotate these semantic roles automatically, given the semantic frame and the predicate. To this end, we train a machine learning algorithm on arguments whose role is known, in order to generalize to arguments whose role is unknown. In particular, we use lexical properties of semantic proximity between the most representative words of the arguments, notably through vector representations of the words of the lexicon.
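The kind of lexical-proximity feature described above can be sketched with cosine similarity between word vectors: an unseen argument's head word is compared with head words seen in training for each role. The tiny 3-dimensional vectors and the nearest-role rule below are invented for illustration; a real system would use learned embeddings inside a trained classifier:

```python
# Cosine similarity between (toy) word vectors as a semantic-proximity
# feature for role assignment. All vectors are invented.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) *
                  math.sqrt(sum(b * b for b in v)))

# Hypothetical embeddings: "victim" and "injured" are close in vector
# space, "quickly" is not.
vectors = {
    "victim":  [0.9, 0.1, 0.0],
    "injured": [0.8, 0.2, 0.1],
    "quickly": [0.0, 0.1, 0.9],
}

def nearest_role(word, role_examples):
    """Pick the role whose training head words are most similar on average."""
    def score(examples):
        return sum(cosine(vectors[word], vectors[w]) for w in examples) / len(examples)
    return max(role_examples, key=lambda role: score(role_examples[role]))

roles = {"Victim": ["victim"], "Manner": ["quickly"]}
print(nearest_role("injured", roles))   # → Victim
```

This shows why vector representations help generalization: an argument head word never seen with a role can still be assigned to it via its proximity to words that were.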