962 results for: Estimation par maximum de vraisemblance (maximum likelihood estimation)


Relevance: 30.00%

Abstract:

The thermal limits of individual animals were originally proposed as a link between animal physiology and thermal ecology. Although this link is valid in theory, the evaluation of physiological tolerances involves several problems that are the focus of this study. One rationale was that heating rates should influence upper critical limits, so ecologically meaningful thermal limits need to take experimental heating rates into account. In addition, if thermal limits are not surpassed during experiments, subsequent tests of the same individual should yield similar results or produce evidence of hardening. Finally, uncontrolled variables such as time under experimental conditions and handling procedures may affect the results. To analyze these issues we conducted an integrative study of upper critical temperatures in a single species, the ant Atta sexdens rubropilosa, an animal model that provides large numbers of individuals of diverse sizes but similar genetic makeup. Our specific aims were to test 1) the influence of heating rates on the experimental evaluation of upper critical temperature, 2) the assumptions of absence of physical damage and of reproducibility, and 3) sources of variance often overlooked in the thermal-limits literature; and 4) to introduce experimental approaches that may help researchers separate physiological from methodological issues. Upper thermal limits were influenced by both heating rate and body mass. In the latter case, the effect was physiological rather than methodological. The critical temperature decreased during subsequent tests performed on the same individual ants, even one week after the initial test. Accordingly, upper thermal limits may have been overestimated by our (and typical) protocols. Heating rates, body mass, procedures independent of temperature, and other variables may affect the estimation of upper critical temperatures. Therefore, based on our data, we offer suggestions to enhance the quality of measurements and recommendations for authors aiming to compile and analyze databases from the literature.
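
Since the abstract links upper critical temperature to heating rate and body mass without giving the statistical model, below is a minimal, purely illustrative sketch of the kind of linear fit that could quantify both effects; the data and variable names are hypothetical, not taken from the study.

```python
import numpy as np

# Hypothetical example data: upper critical temperature (deg C),
# experimental heating rate (deg C/min) and body mass (mg) for a few ants.
ct_max = np.array([46.1, 45.3, 47.0, 44.8, 46.5, 45.9])
heating_rate = np.array([1.0, 0.5, 1.5, 0.25, 1.0, 0.5])
body_mass = np.array([12.0, 8.5, 20.0, 6.0, 15.0, 10.0])

# Design matrix: intercept, heating rate and log body mass.
X = np.column_stack([np.ones_like(ct_max), heating_rate, np.log(body_mass)])

# Ordinary least-squares fit, one simple way to quantify both effects.
coef, *_ = np.linalg.lstsq(X, ct_max, rcond=None)
intercept, b_rate, b_mass = coef
print(f"CTmax ~ {intercept:.2f} + {b_rate:.2f}*rate + {b_mass:.2f}*log(mass)")
```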

Relevance: 30.00%

Abstract:

Defining pharmacokinetic parameters and depletion intervals for antimicrobials used in fish provides important guidelines for the future regulation, by Brazilian agencies, of the use of these substances in fish farming. This article presents a depletion study of oxytetracycline (OTC) in tilapias (Oreochromis niloticus) farmed under tropical conditions during the winter season. A high-performance liquid chromatography method with fluorescence detection for the quantitation of OTC in tilapia fillets and medicated feed was developed and validated. The depletion study with fish was carried out under monitored environmental conditions. OTC was administered in the feed for five consecutive days at a daily dosage of 80 mg/kg body weight. Groups of ten fish were slaughtered at 1, 2, 3, 4, 5, 8, 10, 15, 20, and 25 days after medication. From the 8th day post-treatment onward, OTC concentrations in the tilapia fillets were below the limit of quantitation (13 ng/g) of the method. Linear regression of the mathematical model used for data analysis presented a coefficient of 0.9962. The elimination half-life of OTC in tilapia fillet and the withdrawal period were 1.65 and 6 days, respectively, considering the 99th percentile with 95% confidence and a maximum residue limit of 100 ng/g. Even though the study was carried out in the winter under practical conditions in which the water temperature varied, the results are similar to those of studies conducted under controlled temperature.
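
The half-life and withdrawal period follow from a log-linear fit of residue concentration against time. The sketch below shows that calculation on hypothetical data, using a simplified normal-theory tolerance factor in place of the exact regression-based 99th-percentile tolerance limit used in regulatory guidelines.

```python
import numpy as np
from scipy import stats

# Hypothetical depletion data: days post-treatment and OTC residue (ng/g).
days = np.array([1, 2, 3, 4, 5])
residue = np.array([950.0, 610.0, 410.0, 250.0, 160.0])

# First-order depletion: ln(C) = a + b*t, so half-life = ln(2)/|b|.
slope, intercept, r, _, _ = stats.linregress(days, np.log(residue))
half_life = np.log(2) / abs(slope)

# Upper one-sided tolerance limit for the 99th percentile with 95% confidence,
# approximated here with a rough normal-theory factor (a simplification of the
# regression-based tolerance limit used in regulatory guidance).
n = len(days)
resid = np.log(residue) - (intercept + slope * days)
s = np.sqrt(np.sum(resid**2) / (n - 2))
k = stats.norm.ppf(0.99) + stats.norm.ppf(0.95) / np.sqrt(n)  # rough factor

mrl = 100.0  # maximum residue limit, ng/g
t = np.arange(0, 30, 0.1)
upper = intercept + slope * t + k * s
withdrawal = t[np.argmax(np.exp(upper) <= mrl)]
print(f"half-life ~ {half_life:.2f} d, withdrawal ~ {withdrawal:.1f} d")
```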

Relevance: 30.00%

Abstract:

In this study we analyzed the phylogeographic pattern and historical demography of an endemic Atlantic forest (AF) bird, Basileuterus leucoblepharus, and tested the influence of the last glacial maximum (LGM) on its effective population size using coalescent simulations. We address two main questions: (i) Does B. leucoblepharus present population genetic structure congruent with the patterns observed for other AF organisms? (ii) How did the LGM affect the effective population size of B. leucoblepharus? We sequenced 914 bp of the mitochondrial gene cytochrome b and 512 bp of the nuclear intron 5 of beta-fibrinogen from 62 individuals from 15 localities along the AF. Both molecular markers revealed no genetic structure in B. leucoblepharus. Neutrality tests based on both loci indicated significant demographic expansion. The extended Bayesian skyline plot showed that the species appears to have experienced demographic expansion starting around 300,000 years ago, during the late Pleistocene. This date does not coincide with the LGM, and the population-size dynamics indicate stability during the LGM. To further test the effect of the LGM on this species, we simulated seven demographic scenarios to explore whether populations suffered specific bottlenecks. The scenarios most congruent with our data were population stability during the LGM, with bottlenecks older than this period. This is the first example of an AF organism that does not show phylogeographic breaks caused by vicariant events associated with climate change and geotectonic activity in the Quaternary. Differences in ecological and environmental tolerances and in habitat requirements possibly underlie the different evolutionary histories of these organisms. Our results show that the history of organism diversification in this megadiverse Neotropical forest is complex.
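
The neutrality tests mentioned here (for example Tajima's D) compare the number of segregating sites with the mean pairwise difference; a strongly negative value is one signal of demographic expansion. Below is a minimal, illustrative implementation assuming a small gap-free alignment; the toy sequences are not the study's data.

```python
import itertools
import numpy as np

def tajimas_d(seqs):
    """Tajima's D from a list of aligned sequences (equal length, no gaps)."""
    n = len(seqs)
    length = len(seqs[0])
    # Segregating sites: columns with more than one allele.
    cols = [set(seq[i] for seq in seqs) for i in range(length)]
    S = sum(1 for c in cols if len(c) > 1)
    if S == 0:
        return 0.0
    # Mean number of pairwise differences (pi).
    pairs = list(itertools.combinations(seqs, 2))
    pi = np.mean([sum(a != b for a, b in zip(s1, s2)) for s1, s2 in pairs])
    # Tajima (1989) constants.
    a1 = sum(1.0 / i for i in range(1, n))
    a2 = sum(1.0 / i**2 for i in range(1, n))
    b1 = (n + 1) / (3.0 * (n - 1))
    b2 = 2.0 * (n**2 + n + 3) / (9.0 * n * (n - 1))
    c1 = b1 - 1.0 / a1
    c2 = b2 - (n + 2) / (a1 * n) + a2 / a1**2
    e1 = c1 / a1
    e2 = c2 / (a1**2 + a2)
    return (pi - S / a1) / np.sqrt(e1 * S + e2 * S * (S - 1))

# Toy alignment; a negative D suggests an excess of rare variants (expansion).
print(tajimas_d(["ACGTACGT", "ACGTACGA", "ACGAACGT", "ACGTACGT"]))
```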

Relevance: 30.00%

Abstract:

[EN] The aim of this work is to propose a new method for estimating the backward flow directly from the optical flow. We assume that the optical flow has already been computed and we need to estimate the inverse mapping. This mapping is not bijective due to the presence of occlusions and disocclusions, so it is not possible to estimate the inverse function over the whole domain, and values in these regions have to be guessed from the available information. We propose an accurate algorithm to calculate the backward flow solely from the optical flow, using a simple relation. Occlusions are filled by selecting the maximum motion, and disocclusions are filled with two different strategies: a min-fill strategy, which fills each disoccluded region with the minimum value around the region, and a restricted min-fill approach, which selects the minimum value in a close neighborhood. In the experimental results, we show the accuracy of the method and compare the results obtained with these two strategies.
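
As a rough illustration of the idea (not the paper's algorithm), the sketch below inverts a forward flow by nearest-neighbour splatting, keeps the largest motion where several pixels map to the same target, and fills unreached pixels with the minimum value found in a small window; all function names and parameters are illustrative.

```python
import numpy as np

def invert_flow(flow):
    """Estimate a backward flow by pushing each forward-flow vector to its
    target pixel (nearest-neighbour splatting). Illustrative sketch only."""
    h, w, _ = flow.shape
    back = np.full((h, w, 2), np.nan)
    mag = np.full((h, w), -np.inf)
    for y in range(h):
        for x in range(w):
            u, v = flow[y, x]
            tx, ty = int(round(x + u)), int(round(y + v))
            if 0 <= tx < w and 0 <= ty < h:
                m = u * u + v * v
                if m > mag[ty, tx]:          # occlusion: keep the maximum motion
                    mag[ty, tx] = m
                    back[ty, tx] = (-u, -v)
    # Disocclusions (pixels never reached): fill with the minimum value in a
    # small neighbourhood, a crude stand-in for the min-fill strategies.
    for y, x in zip(*np.where(np.isnan(back[..., 0]))):
        patch = back[max(0, y - 2):y + 3, max(0, x - 2):x + 3]
        valid = patch[~np.isnan(patch[..., 0])]
        if valid.size:
            back[y, x] = valid.min(axis=0)
    return back

# Example: a constant flow of (+1, 0) inverts to roughly (-1, 0).
flow = np.zeros((4, 4, 2)); flow[..., 0] = 1.0
print(invert_flow(flow)[:, :, 0])
```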

Relevance: 30.00%

Abstract:

[ES] In this work we propose a new model for disparity computation and 3-D reconstruction from a stereo system composed of two color images. The disparity model is based on an energy criterion, and the minima of this energy functional are computed using the associated Euler-Lagrange partial differential equation. The model is an extension to color images of the model developed in "L. Alvarez, R. Deriche, J. Sánchez and J. Weickert, Dense disparity map estimation respecting image discontinuities: A PDE and Scale-Space Based Approach. INRIA Rapport de Recherche Nº 3874, 2000", with some changes in the strategy to avoid falling into local minima of the energy. Finally, we present numerical experiments on the 3-D reconstruction obtained with this method for several stereo pairs of real images.
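
To make the energy-minimization idea concrete, here is a deliberately simplified 1-D sketch: a quadratic data term plus a smoothness term is minimized by explicit gradient descent instead of solving the Euler-Lagrange PDE of the colour model, and the signals are synthetic; everything below is illustrative only.

```python
import numpy as np

# Illustrative 1-D "stereo" signals: the right signal is the left one
# shifted by a constant disparity of 3 samples.
x = np.arange(200, dtype=float)
shift = 3.0
left = np.sin(x / 7.0) + 0.3 * np.sin(x / 3.0)
right = np.sin((x - shift) / 7.0) + 0.3 * np.sin((x - shift) / 3.0)

# Energy: (left(x) - right(x - d(x)))^2 + lam * |d'(x)|^2, minimized by
# explicit gradient descent (a crude stand-in for solving the associated
# Euler-Lagrange PDE of the colour model described in the abstract).
d = np.zeros_like(x)
lam, step = 5.0, 0.05
right_grad = np.gradient(right)
for _ in range(2000):
    warped = np.interp(x - d, x, right)
    warped_grad = np.interp(x - d, x, right_grad)
    data_grad = 2.0 * (left - warped) * warped_grad          # d/dd of data term
    smooth_grad = -2.0 * lam * np.gradient(np.gradient(d))   # d/dd of |d'|^2
    d -= step * (data_grad + smooth_grad)

print("mean estimated disparity:", round(d[20:-20].mean(), 2))  # ~3 expected
```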

Relevance: 30.00%

Abstract:

Despite the scientific achievements of recent decades in astrophysics and cosmology, the majority of the energy content of the Universe is still unknown. A potential solution to the “missing mass problem” is the existence of dark matter in the form of WIMPs. Due to the very small cross section for WIMP-nucleon interactions, the number of expected events is very limited (about 1 event/tonne/year), thus requiring detectors with a large target mass and a low background level. The aim of the XENON1T experiment, the first tonne-scale LXe-based detector, is to be sensitive to WIMP-nucleon cross sections as low as 10^-47 cm^2. To investigate whether such a detector can reach this goal, Monte Carlo simulations are mandatory to estimate the background. To this end, the GEANT4 toolkit has been used to implement the detector geometry and to simulate the decays from the various background sources, electromagnetic and nuclear. From the analysis of the simulations, the background level has been found to be fully acceptable for the purposes of the experiment: about 1 background event in a 2 tonne-year exposure. Using the Maximum Gap method, the XENON1T sensitivity has been evaluated, and the minimum of the WIMP-nucleon cross-section sensitivity has been found at 1.87 x 10^-47 cm^2, at 90% CL, for a WIMP mass of 45 GeV/c^2. The results have been independently cross-checked with the Likelihood Ratio method, which confirmed them with agreement to within a factor of two, a level of agreement that is entirely acceptable considering the intrinsic differences between the two statistical methods. Thus, this PhD thesis shows that the XENON1T detector will be able to reach its design sensitivity, lowering the limits on the WIMP-nucleon cross section by about two orders of magnitude with respect to current experiments.
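
The Maximum Gap method (Yellin, Phys. Rev. D 66, 032303, 2002) sets an upper limit on the signal expectation from the largest gap between observed events. A minimal sketch is given below; the C0 formula is quoted from memory of Yellin's paper and the code is only an illustration, not the XENON1T analysis.

```python
import math

def c0(x, mu):
    """Probability that the maximum gap is smaller than x when the total
    signal expectation is mu (Yellin's C0, rewritten to avoid a 0/0 case)."""
    total = 0.0
    for k in range(int(mu // x) + 1):
        a = k * x - mu
        term = a**k - (k * a**(k - 1) if k > 0 else 0.0)
        total += math.exp(-k * x) / math.factorial(k) * term
    return total

def max_gap_upper_limit(gap_fraction, cl=0.90):
    """Upper limit on the signal expectation mu when the largest gap contains
    a fraction `gap_fraction` of the total expected signal (bisection in mu)."""
    lo, hi = 1e-6, 100.0
    for _ in range(200):
        mu = 0.5 * (lo + hi)
        if c0(gap_fraction * mu, mu) < cl:
            lo = mu
        else:
            hi = mu
    return 0.5 * (lo + hi)

# With zero observed events the whole exposure is a single gap
# (gap_fraction = 1) and the limit reduces to the Poisson value of ~2.3.
print(round(max_gap_upper_limit(1.0), 2))
```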

Relevance: 30.00%

Abstract:

This doctoral thesis focused on the investigation of enantiomeric and non-enantiomeric biogenic volatile organic compound (BVOC) emissions at both the leaf and canopy scales in different environments. In addition, the anthropogenic compounds benzene, toluene, ethylbenzene, and xylenes (BTEX) were studied. BVOCs are emitted into the lower troposphere in large quantities (ca. 1150 Tg C yr-1), approximately an order of magnitude more than anthropogenic VOCs, and are particularly important in tropospheric chemistry because of their impact on ozone production and on secondary organic aerosol formation and growth. The BVOCs examined in this study were isoprene, (-)/(+)-α-pinene, (-)/(+)-β-pinene, Δ-3-carene, (-)/(+)-limonene, myrcene, eucalyptol and camphor, as these were the most abundant BVOCs observed both in the leaf cuvette study and in the ambient measurements. In the laboratory cuvette studies, the sensitivity of the enantiomeric enrichment of leaf emissions was examined as a function of light (0-1600 PAR) and temperature (20-45°C). Three typical Mediterranean plant species (Quercus ilex L., Rosmarinus officinalis L., Pinus halepensis Mill.), with more than three individuals of each, were investigated using a dynamic enclosure cuvette. The terpenoid emission rates were found to be directly linked either to both light and temperature (e.g. Quercus ilex L.) or mainly to temperature (e.g. Rosmarinus officinalis L., Pinus halepensis Mill.). However, the enantiomeric signature showed no clear trend in response to either light or temperature; moreover, a large variation in enantiomeric enrichment was found during the experiments. This enantiomeric signature was also used to distinguish chemotypes beyond the usual achiral chemical-composition method: nineteen Quercus ilex L. individuals screened under standard conditions (30°C and 1000 PAR) showed four different chemotypes, whereas the traditional classification showed only two. A branch enclosure cuvette set-up was applied in a natural boreal forest environment to four chemotypes of Scots pine (Pinus sylvestris) and one chemotype of Norway spruce (Picea abies), and the direct emissions were compared with ambient air measurements above the canopy during the HUMPPA-COPEC 2010 summer campaign. The chirality of α-pinene emitted by Scots pine was dominated by the (+)-enantiomer, while for Norway spruce the chirality was opposite (i.e. enriched in the (-)-enantiomer), becoming increasingly enriched in the (-)-enantiomer with light. Field measurements over a Spanish stone pine forest were performed to examine the extent of seasonal changes in enantiomeric enrichment (DOMINO 2008). These showed clear differences in the chirality of monoterpene emissions: in wintertime the monoterpene (-)-α-pinene was found in slight enantiomeric excess over (+)-α-pinene at night, but by day the measured ratio was closer to one, i.e. racemic. Samples taken the following summer at the same location showed much higher monoterpene mixing ratios and revealed a strong enantiomeric excess of (-)-α-pinene. This indicates a strong seasonal variation in the enantiomeric emission ratio that was not manifested in the day/night temperature cycles in wintertime. A clear diurnal cycle of enantiomeric enrichment in α-pinene was also found over a French oak forest and the boreal forest; however, while in the boreal forest (-)-α-pinene enrichment increased around the time of maximum light and temperature, the French forest showed the opposite tendency, with (+)-α-pinene being favored. For the two field campaigns (DOMINO 2008 and HUMPPA-COPEC 2010), the BTEX compounds were also investigated. For the DOMINO campaign, mixing ratios of the xylene isomers (meta- and para-) and ethylbenzene, which are all well resolved on the β-cyclodextrin column, were exploited to estimate average OH radical exposures of VOCs from the Huelva industrial area. These were compared with empirical estimates of OH based on J(NO2) measured at the site, and the deficiencies of each estimation method are discussed. For the HUMPPA-COPEC campaign, benzene and toluene mixing ratios clearly defined the air masses influenced by the biomass-burning pollution plume from Russia.
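
The OH-exposure estimate from the xylene/ethylbenzene ratios relies on the hydrocarbon-clock idea: two compounds emitted in a known ratio decay with different OH rate constants, so the measured ratio encodes the integrated OH exposure. A minimal sketch of that calculation is given below; the rate constants and ratios are illustrative placeholders, not the campaign values.

```python
import math

# Hydrocarbon-clock estimate of the integrated OH exposure from the ratio of
# two compounds with different OH rate constants. Numbers are illustrative.
k_mp_xylene = 1.9e-11     # cm^3 molecule^-1 s^-1, rough literature value
k_ethylbenzene = 7.0e-12  # cm^3 molecule^-1 s^-1, rough literature value

ratio_source = 2.5        # assumed (m+p)-xylene / ethylbenzene at emission
ratio_measured = 1.4      # hypothetical measured ratio at the site

# [X]/[E] = ([X]/[E])_0 * exp(-(k_X - k_E) * OH_exposure)
oh_exposure = math.log(ratio_source / ratio_measured) / (k_mp_xylene - k_ethylbenzene)
print(f"OH exposure ~ {oh_exposure:.2e} molecule cm^-3 s")

# With an assumed mean OH concentration, this converts to a photochemical age.
mean_oh = 2.0e6           # molecule cm^-3, assumed
print(f"photochemical age ~ {oh_exposure / mean_oh / 3600:.1f} h")
```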

Relevance: 30.00%

Abstract:

A large number of proposals for estimating the bivariate survival function under random censoring have been made. In this paper we discuss nonparametric maximum likelihood estimation and the bivariate Kaplan-Meier estimator of Dabrowska. We show how these estimators are computed, present their intuitive background, and compare their practical performance under different levels of dependence and censoring, based on extensive simulation results, which leads to practical advice.
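
Both estimators discussed here generalize the univariate Kaplan-Meier product-limit estimator. As background (not the bivariate Dabrowska construction itself), a minimal sketch of the univariate estimator on made-up right-censored data follows.

```python
import numpy as np

def kaplan_meier(times, events):
    """Univariate Kaplan-Meier survival estimate.
    times:  observed times (event or censoring), assumed untied for simplicity.
    events: 1 if the event was observed, 0 if right-censored."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = len(times)
    surv, t_out, s_out = 1.0, [], []
    for i, (t, d) in enumerate(zip(times, events)):
        at_risk = n - i                  # subjects still under observation at t
        if d == 1:
            surv *= 1.0 - 1.0 / at_risk  # product-limit update at an event time
            t_out.append(t)
            s_out.append(surv)
    return np.array(t_out), np.array(s_out)

# Made-up right-censored sample (0 marks a censored observation).
t, s = kaplan_meier([2, 3, 4, 5, 8, 9, 12], [1, 1, 0, 1, 0, 1, 1])
print(np.column_stack([t, s]))
```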

Relevance: 30.00%

Abstract:

We investigate the interplay of smoothness and monotonicity assumptions when estimating a density from a sample of observations. The nonparametric maximum likelihood estimator of a decreasing density on the positive half line attains a rate of convergence at a fixed point if the density has a negative derivative. The same rate is obtained by a kernel estimator, but the limit distributions are different. If the density is both differentiable and known to be monotone, then a third estimator is obtained by isotonization of a kernel estimator. We show that this again attains the rate of convergence, and we compare the limit distributions of the three types of estimators. It is shown that both isotonization and smoothing lead to a more concentrated limit distribution, and we study the dependence on the proportionality constant in the bandwidth. We also show that isotonization does not change the limit behavior of a kernel estimator with a larger bandwidth, in the case where the density is known to have more than one derivative.
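
The NPMLE of a decreasing density is the Grenander estimator, the left derivative of the least concave majorant of the empirical distribution function. A minimal sketch that computes it with a concave-majorant scan is shown below; the sample is simulated and the implementation is only illustrative.

```python
import numpy as np

def grenander(sample):
    """Grenander estimator (NPMLE of a decreasing density on [0, inf)):
    slopes of the least concave majorant of the empirical CDF.
    Assumes distinct sample values. Returns breakpoints and step heights."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    # Points of the empirical CDF, including the origin.
    px = np.concatenate([[0.0], x])
    py = np.arange(n + 1) / n
    # Least concave majorant via a monotone-chain style scan: keep only the
    # vertices whose consecutive slopes are decreasing.
    hull = [0]
    for i in range(1, n + 1):
        while len(hull) >= 2:
            j, k = hull[-2], hull[-1]
            s1 = (py[k] - py[j]) / (px[k] - px[j])
            s2 = (py[i] - py[k]) / (px[i] - px[k])
            if s2 >= s1:      # slope increased: point k lies below the majorant
                hull.pop()
            else:
                break
        hull.append(i)
    # The density estimate is the slope between consecutive hull vertices.
    bx = px[hull]
    slopes = np.diff(py[hull]) / np.diff(bx)
    return bx, slopes

rng = np.random.default_rng(0)
bx, dens = grenander(rng.exponential(scale=1.0, size=200))
print(dens[:5])   # decreasing step heights near the origin
```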

Relevance: 30.00%

Abstract:

This paper discusses estimation of the tumor incidence rate, the death rate given that a tumor is present, and the death rate given that a tumor is absent, using a discrete multistage model. The model was originally proposed by Dewanji and Kalbfleisch (1986), and the maximum likelihood estimate of the tumor incidence rate was obtained using the EM algorithm. In this paper, we use a reparametrization to simplify the estimation procedure. The resulting estimates are not always the same as the maximum likelihood estimates but are asymptotically equivalent. In addition, explicit expressions for the asymptotic variance and bias of the proposed estimators are derived. These results can be used to compare the efficiency of different sacrifice schemes in carcinogenicity experiments.
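
The abstract mentions a reparametrization and the asymptotic variance of the resulting estimators. A generic way to carry a variance through a reparametrization is the delta method, sketched below with a hypothetical two-parameter example; this is not the multistage model of the paper.

```python
import numpy as np

# Delta method: if theta_hat is asymptotically normal with covariance Sigma,
# then g(theta_hat) has asymptotic variance  grad(g)' Sigma grad(g).

def delta_method_var(grad_g, sigma):
    """Asymptotic variance of g(theta_hat), given the gradient of g at theta
    and the asymptotic covariance matrix of theta_hat."""
    grad_g = np.asarray(grad_g, dtype=float)
    return grad_g @ np.asarray(sigma, dtype=float) @ grad_g

# Hypothetical example: theta = (p, q), quantity of interest g(theta) = p / q.
p, q = 0.2, 0.5
sigma = np.array([[0.010, 0.002],
                  [0.002, 0.008]])       # assumed covariance of (p_hat, q_hat)
grad = np.array([1.0 / q, -p / q**2])    # gradient of p/q at (p, q)
print("asymptotic variance of p/q:", delta_method_var(grad, sigma))
```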

Relevance: 30.00%

Abstract:

Latent class regression models are useful tools for assessing associations between covariates and latent variables. However, evaluation of key model assumptions cannot be performed using methods from standard regression models, because the latent outcome variables are unobserved. This paper presents graphical diagnostic tools to evaluate whether latent class regression models adhere to two standard assumptions of the model: conditional independence and non-differential measurement. An integral part of these methods is the use of a Markov chain Monte Carlo (MCMC) estimation procedure. Unlike standard maximum likelihood implementations of latent class regression model estimation, the MCMC approach allows us to calculate posterior distributions and point estimates of any functions of the parameters; it is this convenience that makes the diagnostic methods we introduce possible. As a motivating example we present an analysis of the association between depression and socioeconomic status, using data from the Epidemiologic Catchment Area study. We consider a latent class regression analysis in which the latent variable depression is regressed on education and income indicators, in addition to age, gender, and marital status variables. While the fitted latent class regression model yields interesting results, the parameter estimates are found to be invalid due to violations of the model assumptions, and these violations are clearly identified by the presented diagnostic plots. The methods can be applied to standard latent class and latent class regression models, and the general principle can be extended to evaluate model assumptions in other types of models.
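
The key point exploited here is that MCMC output makes the posterior of any function of the parameters directly available: apply the function to each draw and summarize. A minimal sketch with simulated posterior draws (not the ECA analysis, and with illustrative parameter names) follows.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for MCMC output: posterior draws of two regression coefficients
# (in a real analysis these would come from the sampler).
beta_education = rng.normal(loc=0.8, scale=0.2, size=4000)
beta_income = rng.normal(loc=-0.5, scale=0.25, size=4000)

# Any function of the parameters gets a posterior "for free": apply it draw
# by draw. Example: an odds-ratio-style contrast of the two effects.
derived = np.exp(beta_education - beta_income)

print("posterior mean:", derived.mean())
print("95% credible interval:", np.percentile(derived, [2.5, 97.5]))
```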

Relevance: 30.00%

Abstract:

The degree of polarization of a reflected field under active laser illumination can be used for object identification and classification. The goal of this study is to investigate methods for estimating the degree of polarization of reflected fields under active laser illumination, which involves the measurement and processing of two orthogonal field components (complex amplitudes), two orthogonal intensity components, and the total field intensity. We propose to replace interferometric optical apparatuses with a computational approach that estimates the degree of polarization from two orthogonal intensity measurements or from total intensity measurements. Cramer-Rao bounds for each of the three sensing modalities with various noise models are computed. Algebraic estimators and maximum-likelihood (ML) estimators are proposed, and an active-set algorithm and an expectation-maximization (EM) algorithm are used to compute the ML estimates. The performance of the estimators is compared with each other and with the corresponding Cramer-Rao bounds. Estimators for four-channel polarimeter (intensity interferometer) sensing perform better than orthogonal-intensity estimators and total-intensity estimators. Processing the four intensity channels from the polarimeter, however, requires complicated optical devices, careful alignment, and four CCD detectors, whereas processing orthogonal-intensity or total-intensity data requires only one or two detectors and a computer. The bounds and estimator performance demonstrate that reasonable estimates may still be obtained from orthogonal-intensity or total-intensity data, making computational sensing a promising way to estimate the degree of polarization.
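
For orientation, the four-channel case admits a simple algebraic estimate via the linear Stokes parameters. The sketch below shows that standard relation on hypothetical noisy intensities; it yields the degree of linear polarization and does not reproduce the noise models or the ML/EM estimators of the study.

```python
import numpy as np

def degree_of_linear_polarization(i0, i45, i90, i135):
    """Algebraic estimate of the degree of linear polarization from a
    four-channel polarimeter (intensities behind 0, 45, 90 and 135 degree
    analyzers), using the standard Stokes-vector relations."""
    s0 = i0 + i90          # total intensity
    s1 = i0 - i90          # 0/90 degree linear component
    s2 = i45 - i135        # 45/135 degree linear component
    return np.sqrt(s1**2 + s2**2) / s0

# Hypothetical noisy measurements of a partially polarized field.
rng = np.random.default_rng(7)
true = (40.0, 30.0, 10.0, 20.0)                      # noiseless intensities
meas = [t + rng.normal(scale=1.0, size=1000) for t in true]
print("estimated DoLP:", degree_of_linear_polarization(*meas).mean())
```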

Relevance: 30.00%

Abstract:

This thesis develops high-performance real-time signal processing modules for direction-of-arrival (DOA) estimation in localization systems. It proposes highly parallel algorithms for subspace decomposition and polynomial rooting, which are traditionally implemented with sequential algorithms, addressing the emerging need for real-time localization in a wide range of applications. As the antenna array size increases, the complexity of the signal processing algorithms grows, making it increasingly difficult to satisfy real-time constraints. This thesis addresses real-time implementation by proposing parallel algorithms that offer considerable improvement over traditional algorithms, especially for systems with a larger number of antenna array elements. Singular value decomposition (SVD) and polynomial rooting are the two computationally complex steps and act as the bottleneck to achieving real-time performance. The proposed algorithms are suitable for implementation on field-programmable gate arrays (FPGAs), single instruction multiple data (SIMD) hardware, or application-specific integrated circuits (ASICs), which offer large numbers of processing elements that can be exploited for parallel processing. The designs proposed in this thesis are modular, easily expandable, and easy to implement. First, the thesis proposes a fast-converging SVD algorithm. The proposed method reduces the number of iterations needed to converge to the correct singular values, thus coming closer to real-time performance. A general algorithm and a modular system design are provided, making it easy for designers to replicate and extend the design to larger matrix sizes. Moreover, the method is highly parallel, which can be exploited on the hardware platforms mentioned earlier. A fixed-point implementation of the proposed SVD algorithm is presented; the FPGA design is pipelined to the maximum extent to increase the maximum achievable operating frequency, the system was developed with the objective of achieving high throughput, and various modern cores available in FPGAs were used to maximize performance, with the details of these modules presented. Finally, a parallel polynomial rooting technique based on Newton's method, applicable exclusively to root-MUSIC polynomials, is proposed. Unique characteristics of the complex dynamics of the root-MUSIC polynomial were exploited to derive this rooting method. The technique exhibits parallelism and converges to the desired roots within a fixed number of iterations, making it suitable for rooting polynomials of large degree. We believe this is the first time that the complex dynamics of the root-MUSIC polynomial have been analyzed to propose an algorithm. In all, the thesis addresses two major bottlenecks in a direction-of-arrival estimation system by providing simple, high-throughput, parallel algorithms.
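
The following sketch is not the thesis's parallel hardware design; it only illustrates the underlying idea that Newton iterations launched independently (hence parallelizable) from points on the unit circle, where root-MUSIC signal roots lie, converge to the roots of a polynomial. The polynomial and all parameters are illustrative.

```python
import numpy as np

def newton_roots(coeffs, n_starts=64, iters=30):
    """Newton's method for polynomial roots, run independently from several
    starting points on the unit circle. Each start is independent, so the
    loop over starts is trivially parallelizable."""
    p = np.asarray(coeffs, dtype=complex)      # highest degree first
    dp = np.polyder(p)
    starts = np.exp(2j * np.pi * np.arange(n_starts) / n_starts)
    roots = []
    for z in starts:
        for _ in range(iters):
            fz, dfz = np.polyval(p, z), np.polyval(dp, z)
            if abs(dfz) < 1e-14:
                break
            z = z - fz / dfz                   # Newton update
        if abs(np.polyval(p, z)) < 1e-8:
            roots.append(z)
    # Deduplicate the converged roots.
    unique = []
    for r in roots:
        if all(abs(r - u) > 1e-6 for u in unique):
            unique.append(r)
    return np.array(unique)

# Toy polynomial with known roots close to the unit circle.
true_roots = np.array([np.exp(0.4j), np.exp(-0.4j), 0.9 * np.exp(1.1j)])
print(np.sort_complex(newton_roots(np.poly(true_roots))))
```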