55 results for Standard error


Relevance: 30.00%

Abstract:

We estimate the conditions for detectability of two planets in a 2/1 mean-motion resonance from radial velocity data, as a function of their masses, the number of observations, and the signal-to-noise ratio. Even for a data set of the order of 100 observations and standard deviations of the order of a few meters per second, we find that Jovian-size resonant planets are difficult to detect if the masses of the planets differ by a factor larger than ~4. This is consistent with the present population of real exosystems in the 2/1 commensurability, most of which have resonant pairs with similar minimum masses, and could indicate that many other resonant systems exist but are currently beyond the detectability limit. Furthermore, we analyze the error distribution in the masses and orbital elements of orbital fits from synthetic data sets for resonant planets in the 2/1 commensurability. For various mass ratios and numbers of data points we find that the eccentricity of the outer planet is systematically overestimated, although the inner planet's eccentricity suffers a much smaller effect. If the initial conditions correspond to small-amplitude oscillations around stable apsidal corotation resonances, the amplitudes estimated from the orbital fits are biased toward larger amplitudes, in accordance with results found in real resonant extrasolar systems.
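
As a toy illustration of the detectability question, the sketch below generates ~100 noisy radial velocity points for a 2/1 pair and refits the model to see how well the smaller semi-amplitude is constrained. It is a minimal stand-in, assuming circular orbits instead of the full Keplerian fits used in the study, and all parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

# Toy two-planet RV model: circular orbits only (the study fits full
# Keplerian orbits); all parameter values below are hypothetical.
def rv_two_planets(t, k1, p1, phi1, k2, p2, phi2):
    return (k1 * np.sin(2 * np.pi * t / p1 + phi1)
            + k2 * np.sin(2 * np.pi * t / p2 + phi2))

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 1500, 100))            # ~100 epochs (days)
true = (30.0, 600.0, 0.3, 8.0, 300.0, 1.1)        # 2/1 period ratio, K1/K2 ~ 4
sigma = 3.0                                       # a few m/s of noise
v = rv_two_planets(t, *true) + rng.normal(0, sigma, t.size)

popt, pcov = curve_fit(rv_two_planets, t, v, p0=true,
                       sigma=np.full(t.size, sigma), absolute_sigma=True)
print("fitted params:", popt)
print("formal errors:", np.sqrt(np.diag(pcov)))   # how well K2 is constrained
```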

Relevance: 20.00%

Abstract:

The evaluation of the coefficient of variation (CV) as a measure of experimental precision has been carried out for several crops, animal species, and forages, through studies proposing classification ranges for CV values based on the mean, the standard deviation, and the distribution of the CV values of the various response variables involved in the experiments. This work aimed to study the distribution of CV values from experiments with the common bean crop, proposing ranges to guide researchers in evaluating their studies for each variable. The data were obtained from a review of journals that publish scientific articles on the common bean crop. The variables considered were: yield, number of pods per plant, number of grains per pod, 100-grain weight, final stand, plant height, and harvest index. CV ranges were obtained for each variable based on the normal distribution, also using the distribution of sample quantiles and the median and pseudo-sigma, classifying the values as low, medium, high, and very high. The statistical calculations for checking the normality of the data were implemented as a function in the free statistical software R. The results indicated that the CV ranges differed among the variables, showing wide variation and justifying the need for a variable-specific evaluation range.
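
A minimal sketch of the robust banding idea follows (in Python rather than the paper's R routine). The pseudo-sigma is the classical IQR/1.349; the m-s / m+s / m+2s cutoffs follow a common classification convention and are an assumption here, not necessarily the paper's exact rule.

```python
import numpy as np

def cv_bands(cv_values):
    """Classify CV values as low/medium/high/very high from the median m
    and pseudo-sigma s = IQR / 1.349 (robust analogue of the std. dev.)."""
    cv = np.asarray(cv_values, dtype=float)
    m = np.median(cv)
    q1, q3 = np.percentile(cv, [25, 75])
    s = (q3 - q1) / 1.349           # IQR of a normal distribution is 1.349*sigma
    return {"low": f"CV <= {m - s:.1f}",
            "medium": f"{m - s:.1f} < CV <= {m + s:.1f}",
            "high": f"{m + s:.1f} < CV <= {m + 2 * s:.1f}",
            "very high": f"CV > {m + 2 * s:.1f}"}

# e.g. CVs (%) for yield collected from published bean trials (hypothetical)
print(cv_bands([8.2, 11.5, 13.0, 14.8, 16.1, 19.4, 22.7, 30.3]))
```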

Relevance: 20.00%

Abstract:

A compact frequency standard based on an expanding cold 133Cs cloud is under development in our laboratory. In a first experiment, cold Cs atoms were prepared by a magneto-optical trap in a vapor cell, and a microwave antenna was used to transmit the radiation for the clock transition. The signal obtained from the fluorescence of the expanding cold-atom cloud is used to lock a microwave chain, and in this way the overall system stability is evaluated. A theoretical model based on a two-level system interacting with two microwave pulses enables interpretation of the observed features, especially the poor contrast of the Ramsey fringes.
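
For context, the two-pulse model mentioned above is the standard Ramsey treatment of a two-level atom. The sketch below computes the ideal fringe pattern (pulse parameters are hypothetical, chosen so that the Rabi pulses are pi/2 pulses), against which a reduced experimental contrast can be compared.

```python
import numpy as np

def ramsey_probability(delta, omega_r, tau, T):
    """Standard two-level Ramsey formula: transition probability after two
    pulses of duration tau separated by free evolution time T.
    delta is the detuning and omega_r the Rabi frequency (rad/s)."""
    w = np.sqrt(omega_r**2 + delta**2)              # generalized Rabi frequency
    a = (omega_r / w) * np.sin(w * tau / 2)
    return 4 * a**2 * (np.cos(w * tau / 2) * np.cos(delta * T / 2)
                       - (delta / w) * np.sin(w * tau / 2)
                       * np.sin(delta * T / 2))**2

delta = 2 * np.pi * np.linspace(-2e3, 2e3, 1001)    # detuning sweep (rad/s)
p = ramsey_probability(delta, omega_r=2 * np.pi * 250, tau=1e-3, T=10e-3)
print(f"fringe contrast: {p.max() - p.min():.2f}")  # ideal model: contrast ~1
```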

Relevance: 20.00%

Abstract:

Background: Genome-wide association studies (GWAS) are becoming the approach of choice to identify genetic determinants of complex phenotypes and common diseases. The astonishing amount of generated data and the use of distinct genotyping platforms with variable genomic coverage are still analytical challenges. Imputation algorithms combine information from directly genotyped markers with the haplotypic structure of the population of interest to infer poorly genotyped or missing markers, and are considered a near-zero-cost approach to allow the comparison and combination of data generated in different studies. Several reports have stated that imputed markers have an overall acceptable accuracy, but no published report has performed a pairwise comparison of imputed and empirical association statistics for a complete set of GWAS markers. Results: In this report we identified a total of 73 imputed markers that yielded a nominally statistically significant association at P < 10^-5 for type 2 diabetes mellitus and compared them with results obtained from empirical allelic frequencies. Interestingly, despite their overall high correlation, association statistics based on imputed frequencies were discordant in 35 of the 73 (47%) associated markers, considerably inflating the type I error rate of imputed markers. We comprehensively tested several quality thresholds, the haplotypic structure underlying imputed markers, and the use of flanking markers as predictors of inaccurate association statistics derived from imputed markers. Conclusions: Our results suggest that association statistics from imputed markers that fall within specific MAF (minor allele frequency) ranges, are located in weak linkage disequilibrium blocks, or deviate strongly from local patterns of association are prone to inflated false positive association signals. The present study highlights the potential of imputation procedures and proposes simple procedures for selecting the best imputed markers for follow-up genotyping studies.
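
The kind of pairwise comparison described can be sketched with a textbook allelic chi-square test (not necessarily the study's exact statistic): compute the association once from empirically genotyped allele counts and once from counts derived from imputed dosages, and flag markers whose results disagree. All counts below are hypothetical.

```python
from scipy.stats import chi2_contingency

def allelic_test(case_alleles, control_alleles):
    """Allele-count (2x2) association test for one marker.
    Running it on empirical and on imputation-derived counts for the
    same marker exposes discordant association statistics."""
    chi2, p, _, _ = chi2_contingency([list(case_alleles), list(control_alleles)])
    return chi2, p

# hypothetical (minor, major) allele counts for the same marker
print("empirical:", allelic_test((410, 1590), (320, 1680)))
print("imputed:  ", allelic_test((455, 1545), (320, 1680)))
```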

Relevance: 20.00%

Abstract:

Hardy-Weinberg equilibrium (HWE) is an important genetic property that populations should exhibit whenever they are not subject to adverse conditions such as a complete lack of panmixia, excess mutation, or excess selection pressure. HWE has been evaluated for decades; both frequentist and Bayesian methods are in use today. While historically the HWE formula was developed to examine the transmission of alleles in a population from one generation to the next, HWE concepts are now also used in human disease studies to detect genotyping error and disease susceptibility (association); see Ryckman and Williams (2008). Most analyses focus on answering the question of whether a population is in HWE; they do not try to quantify how far from equilibrium the population is. In this paper, we propose the use of a simple disequilibrium coefficient for a locus with two alleles. Based on the posterior density of this disequilibrium coefficient, we show how one can conduct a Bayesian analysis to verify how far from HWE a population is. Other coefficients have been introduced in the literature; the advantage of the one introduced in this paper is that, just like standard correlation coefficients, its range is bounded and it is symmetric around zero (equilibrium) when comparing positive and negative values. To test the hypothesis of equilibrium, we use a simple Bayesian significance test, the Full Bayesian Significance Test (FBST); see Pereira, Stern and Wechsler (2008) for a complete review. The proposed disequilibrium coefficient provides an easy and efficient way to carry out the analyses, especially with Bayesian statistics. A routine in R (R Development Core Team, 2009) that implements the calculations is provided for the readers.
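
The posterior-based workflow can be sketched in a few lines (Python here rather than the paper's R routine). The paper defines its own bounded, symmetric coefficient; the sketch uses the classical disequilibrium coefficient D = p_AA - p_A^2 as a stand-in, with a Dirichlet(1,1,1) prior on the genotype probabilities and hypothetical counts.

```python
import numpy as np

# Genotype counts (n_AA, n_Aa, n_aa); Dirichlet(1,1,1) prior -> posterior
n = np.array([60, 35, 5])
rng = np.random.default_rng(1)
post = rng.dirichlet(n + 1, size=100_000)          # samples of (p_AA, p_Aa, p_aa)

p_a = post[:, 0] + post[:, 1] / 2                  # allele-A frequency
D = post[:, 0] - p_a**2                            # classical disequilibrium coeff.
print("posterior mean:", np.mean(D))
print("95% credible interval:", np.percentile(D, [2.5, 97.5]))  # far from 0?
```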

Relevance: 20.00%

Abstract:

Background Data: Photodynamic therapy (PDT) involves the photoinduction of cytotoxicity using a photosensitizer agent, a light source of the proper wavelength, and the presence of molecular oxygen. A model for tissue response to PDT based on the photodynamic threshold dose (Dth) has been widely used: cells exposed to doses below Dth survive, while at doses above Dth necrosis takes place. Objective: This study evaluated the light Dth values using two different methods of determination, one based on the depth of necrosis and the other on the width of superficial necrosis. Materials and Methods: Using normal rat liver, we investigated the depth and width of necrosis induced by PDT when a laser with a Gaussian intensity profile is used. Different light doses, photosensitizers (Photogem, Photofrin, Photosan, Foscan, Photodithazine, and Radachlorin), and concentrations were employed. Each experiment was performed on five animals, and averages and standard deviations were calculated. Results: A simple model of the depth and width of necrosis allows the threshold dose to be determined from both depth and surface measurements. Comparison shows that both measurements provide the same value within experimental error. Conclusion: This work demonstrates that, by knowing the extent of the superficial necrotic area of a target tissue irradiated by a Gaussian light beam, it is possible to estimate the threshold dose. This technique may find application where Dth must be determined without cutting the tissue.
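
A minimal way to see why depth and width carry the same information, assuming a Gaussian surface fluence and simple exponential attenuation with penetration depth \(\delta\) (an idealization; scattering-dominated light transport in tissue is more complex):

\[ F(r,z) = F_0\, e^{-2r^2/w^2}\, e^{-z/\delta}. \]

Setting \(F = D_{\mathrm{th}}\) at the necrosis boundary gives the on-axis depth \(z_n = \delta \ln(F_0/D_{\mathrm{th}})\) and the surface half-width \(r_n\) through \(D_{\mathrm{th}} = F_0\, e^{-2 r_n^2/w^2}\); either measurement therefore determines \(D_{\mathrm{th}}\) once \(F_0\) and the beam parameters are known.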

Relevance: 20.00%

Abstract:

In this work we investigate the duality linking homogeneous and isotropic cosmologies driven by standard and by tachyon scalar fields in N + 1 dimensions. We determine the transformation between standard and tachyon scalar fields, and between their associated potentials, corresponding to the same background evolution. We show that, in general, the duality is broken at the perturbative level, when deviations from a homogeneous and isotropic background are taken into account. However, we find that for slow-rolling fields the duality is still preserved at the linear level. We illustrate our results with specific examples of cosmological relevance, where the correspondence between standard and tachyon scalar field models can be calculated explicitly.
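
As a reminder of the familiar 3+1-dimensional flat FRW case (units with \(8\pi G = 1\); the paper works out the N + 1 generalization, which these relations do not capture), the background correspondence follows from \(\rho + p\):

\[ \dot\phi^2 = -2\dot H, \qquad \dot T^2 = -\frac{2\dot H}{3H^2}, \]

so a given expansion history \(H(t)\) fixes both field evolutions, with potentials \(V_\phi = 3H^2 + \dot H\) for the standard field and \(V_T = 3H^2\sqrt{1 + 2\dot H/(3H^2)}\) for the tachyon field.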

Relevance: 20.00%

Abstract:

The optimal discrimination of nonorthogonal quantum states with minimum error probability is a fundamental task in quantum measurement theory as well as an important primitive in optical communication. In this work, we propose and experimentally realize a new and simple quantum measurement strategy capable of discriminating two coherent states with smaller error probabilities than can be obtained using the standard measurement devices: the Kennedy receiver and the homodyne receiver.
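
The benchmark receivers mentioned above have textbook error probabilities for discriminating the binary coherent states \(|\alpha\rangle\) and \(|-\alpha\rangle\) with equal priors and ideal detectors; the sketch below evaluates them alongside the Helstrom (quantum optimal) bound, which any improved receiver must sit between.

```python
import numpy as np
from scipy.special import erfc

# Error probabilities for |alpha> vs |-alpha>, equal priors, ideal detectors.
n = np.array([0.05, 0.1, 0.2, 0.4, 0.8])          # mean photon number |alpha|^2
alpha = np.sqrt(n)

p_hom = 0.5 * erfc(np.sqrt(2) * alpha)            # ideal homodyne receiver
p_ken = 0.5 * np.exp(-4 * n)                      # ideal Kennedy receiver
p_hel = 0.5 * (1 - np.sqrt(1 - np.exp(-4 * n)))   # Helstrom (quantum) bound

for row in zip(n, p_hom, p_ken, p_hel):
    print("n=%.2f  homodyne=%.3e  Kennedy=%.3e  Helstrom=%.3e" % row)
```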

Relevance: 20.00%

Abstract:

We study massless scalar, Dirac, and electromagnetic fields propagating on a 4D brane embedded in a higher-dimensional Gauss-Bonnet space-time. We calculate, in the time domain, the fundamental quasinormal modes of a spherically symmetric black hole for such fields. Using the WKB approximation, we study quasinormal modes in the large-multipole limit. We also observe a universal behavior, independent of the field and of the value of the Gauss-Bonnet parameter, at asymptotically late times.
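
For orientation, the lowest-order WKB formula of Schutz and Will relates the quasinormal frequencies to the peak of the effective potential (this is the general framework; higher WKB orders are used in practice, and the Gauss-Bonnet corrections enter through the potential):

\[ \omega^2 \approx V_0 - i\,\left(n + \tfrac{1}{2}\right)\sqrt{-2\,V_0''}, \]

where \(V_0\) is the height of the potential barrier, \(V_0''\) its second derivative with respect to the tortoise coordinate at the maximum, and \(n\) the overtone number.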

Relevance: 20.00%

Abstract:

This paper describes a new and simple method to determine the molecular weight of proteins in dilute solution, with an error smaller than ~10%, by using the experimental data of a single small-angle X-ray scattering (SAXS) curve measured on a relative scale. This procedure does not require the measurement of SAXS intensity on an absolute scale and does not involve a comparison with another SAXS curve determined from a known standard protein. The proposed procedure can be applied to monodisperse systems of proteins in dilute solution, in either a monomeric or a multimeric state, and it has been successfully tested on SAXS data experimentally determined for proteins with known molecular weights. The molecular weights determined by this procedure deviate from the known values by less than 10% in each case, and the average error for the test set of 21 proteins was 5.3%. Importantly, this method allows for an unambiguous determination of the multimeric state of proteins with known molecular weights.
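
One route from a relative-scale curve to a molecular weight goes through the Porod invariant, in which the unknown intensity scale cancels; the sketch below shows that general idea, not the paper's exact calibration, and the density-based conversion factor is an assumption.

```python
import numpy as np

def mw_from_relative_saxs(q, i_q, i0):
    """Porod-invariant sketch: q in 1/Angstrom, i_q the measured relative
    intensities, i0 = I(0) from a Guinier fit on the same relative scale."""
    Q = np.trapz(q**2 * i_q, q)        # Porod invariant (relative scale cancels)
    V = 2 * np.pi**2 * i0 / Q          # Porod volume in Angstrom^3
    # ~0.81 kDa per nm^3 from an average protein density of 1.35 g/cm^3
    # (assumed constant; real pipelines calibrate this factor)
    return 0.81 * V / 1000.0           # Angstrom^3 -> nm^3 -> kDa
```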

Relevance: 20.00%

Abstract:

Background: There are several studies in the literature describing measurement error in gene expression data, and several others about regulatory network models. However, only a small fraction combine measurement error with mathematical regulatory networks and show how to identify these networks under different noise levels. Results: This article investigates the effects of measurement error on the estimation of the parameters in regulatory networks. Simulation studies indicate that, in both time series (dependent) and non-time series (independent) data, measurement error strongly affects the estimated parameters of regulatory network models, biasing them as predicted by the theory. Moreover, when testing the parameters of regulatory network models, p-values computed by ignoring measurement error are not reliable, since the rate of false positives is not controlled under the null hypothesis. To overcome these problems, we present an improved version of the ordinary least squares estimator for independent (regression models) and dependent (autoregressive models) data when the variables are subject to noise. Measurement error estimation procedures for microarrays are also described. Simulation results show that both corrected methods perform better than the standard ones (i.e., those ignoring measurement error). The proposed methodologies are illustrated using microarray data from lung cancer patients and mouse liver time series data. Conclusions: Measurement error seriously affects the identification of regulatory network models, so it must be reduced or taken into account to avoid erroneous conclusions. This could be one of the reasons for the high biological false positive rates observed in actual regulatory network models.
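
The bias and its correction are easy to demonstrate in the simplest regression setting. The sketch below shows the classical errors-in-variables correction of the OLS slope (a minimal illustration of the general idea; the paper extends it to autoregressive models, and here the error variance is assumed known, whereas the paper estimates it from the microarray data).

```python
import numpy as np

rng = np.random.default_rng(2)
n, beta, s2_u = 2000, 1.5, 0.5           # sample size, true slope, error variance
x = rng.normal(0, 1, n)                  # true (unobserved) regressor
w = x + rng.normal(0, np.sqrt(s2_u), n)  # observed regressor with measurement error
y = beta * x + rng.normal(0, 0.3, n)

naive = np.cov(w, y)[0, 1] / np.var(w, ddof=1)               # attenuated toward 0
corrected = np.cov(w, y)[0, 1] / (np.var(w, ddof=1) - s2_u)  # error-corrected OLS
print(f"naive OLS: {naive:.3f}   corrected: {corrected:.3f}   true: {beta}")
```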

Relevance: 20.00%

Abstract:

An analytical procedure for multiple standard additions of arsenic species using sequential injection analysis (SIA) is proposed for their quantification in seafood extracts. SIA offered the flexibility to generate standards of multiple species at the ng mL^-1 concentration level by adding different volumes of As(III), As(V), monomethylarsonic acid (MMA), and dimethylarsinic acid (DMA) to the sample. The mixed sample-plus-standard solutions were delivered from the SIA system to fill the HPLC injection loop. The As species were then separated by HPLC and analyzed by atomic fluorescence spectrometry (AFS). The proposed system comprised two independently controlled modules, with the HPLC loop acting as the intermediary device. The analytical frequency was enhanced by combining the actions of both modules: while the spiked sample was flowing through the chromatographic column towards the detection system, the SIA program started performing the standard additions to another sample. The proposed method was applied to spoiled seafood extracts. Detection limits based on 3σ for As(III), As(V), MMA and DMA were 0.023, 0.39, 0.45 and 1.0 ng mL^-1, respectively.
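
The quantification step in any standard-additions method is a linear extrapolation: the response is fit against the added concentration, and the sample concentration is read off where the line crosses zero signal. A minimal sketch with hypothetical numbers:

```python
import numpy as np

# added standard concentrations (ng/mL) and measured responses (hypothetical)
c_add = np.array([0.0, 5.0, 10.0, 20.0])
signal = np.array([12.1, 19.0, 25.8, 39.6])

slope, intercept = np.polyfit(c_add, signal, 1)  # signal = k * (C0 + c_add)
c0 = intercept / slope                           # magnitude of the x-intercept
print(f"sample concentration C0 ~ {c0:.2f} ng/mL")
```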

Relevance: 20.00%

Abstract:

For environmental quality assessment, INAA has been applied to determine chemical elements in small (200 mg) and large (200 g) samples of leaves from 200 trees. By applying Ingamells' sampling constant, the expected percent standard deviation was estimated at 0.9-2.2% for the 200 mg samples. For the composite (200 g) samples, in contrast, the expected standard deviation varied from 0.5 to 10%, in spite of analytical uncertainties ranging from 2 to 30%. The results thereby suggest expressing the degree of representativeness as a source of uncertainty, which increases the reliability of environmental studies, mainly in the case of composite samples.
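
Ingamells' sampling constant \(K_s\) links the sample mass \(w\) to the expected relative standard deviation \(R\) of a single determination (a standard relation; the worked number below simply reuses one of the abstract's values for illustration):

\[ K_s = R^2\, w \quad\Longrightarrow\quad R = \sqrt{K_s / w}. \]

For instance, \(R = 2.2\%\) at \(w = 0.2\) g implies \(K_s \approx 0.97\ \%^2\,\mathrm{g}\); for a fixed \(K_s\), increasing the sample mass by a factor of 100 reduces the expected sampling standard deviation tenfold, so the larger spread seen for the composite samples reflects their different heterogeneity rather than the mass alone.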

Relevance: 20.00%

Abstract:

Objective: We carry out a systematic assessment of a suite of kernel-based learning machines on the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. Methods and materials: The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data, whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values for the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered, as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. Results: We first quantitatively assess the impact of the choice of wavelet basis on the quality of the extracted features; four wavelet basis functions were considered in this study. We then provide the average accuracy (cross-validation) values delivered by 252 kernel machine configurations; in particular, 40%/35% of the best-calibrated models of the standard and least squares SVMs reached a 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations, from which one can visually inspect their sensitivity to the type of feature and to the kernel function/parameter value. Conclusions: Overall, the results show that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs prevailing more consistently. The choice of kernel function and parameter value, as well as the choice of feature extractor, are critical decisions, although the choice of wavelet family seems less relevant. The statistical values calculated over the Lyapunov exponents were good sources of signal representation, but not as informative as their wavelet counterparts. Finally, a typical sensitivity profile emerged across all types of machines, involving regions of stability separated by zones of sharp variation, with some kernel parameter values clearly associated with better accuracy rates (zones of optimality).
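
The experimental protocol (kernel-radius sweep with cross-validated accuracy) can be sketched for the standard SVM, one of the six machines studied; the feature matrix below is a random placeholder standing in for the wavelet/Lyapunov features extracted from the EEG segments.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: feature vectors per EEG segment (e.g. wavelet statistics), y: 0 = normal,
# 1 = epileptic -- random placeholders for the study's actual features
X = np.random.default_rng(3).normal(size=(200, 21))
y = np.repeat([0, 1], 100)

for gamma in [0.01, 0.1, 1.0]:                    # kernel radius sweep
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma=gamma))
    acc = cross_val_score(clf, X, y, cv=10).mean()
    print(f"gamma={gamma:5.2f}  CV accuracy={acc:.3f}")
```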