848 results for Random Variable


Relevance:

40.00%

Publisher:

Abstract:

The reliability of measurement refers to unsystematic error in observed responses. Investigations of the prevalence of random error in stated estimates of willingness to pay (WTP) are important to an understanding of why tests of validity in contingent valuation (CV) can fail. However, published reliability studies have tended to adopt empirical methods that have practical and conceptual limitations when applied to WTP responses. This contention is supported by a review of contingent valuation reliability studies that demonstrates important limitations of existing approaches to WTP reliability. It is argued that empirical assessments of the reliability of contingent values may be better dealt with by using multiple indicators to measure the latent WTP distribution. This latent variable approach is demonstrated with data obtained from a WTP study for stormwater pollution abatement. Attitude variables were employed as a way of assessing the reliability of open-ended WTP (with benchmarked payment cards) for stormwater pollution abatement. The results indicated that participants' decisions to pay were reliably measured, but not the magnitude of the WTP bids. This finding highlights the need to better discern what is actually being measured in WTP studies. (C) 2003 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Efficient automatic protein classification is of central importance in genomic annotation. As an independent way to check the reliability of the classification, we propose a statistical approach to test if two sets of protein domain sequences coming from two families of the Pfam database are significantly different. We model protein sequences as realizations of Variable Length Markov Chains (VLMC) and we use the context trees as a signature of each protein family. Our approach is based on a Kolmogorov-Smirnov-type goodness-of-fit test proposed by Balding et al. [Limit theorems for sequences of random trees (2008), DOI: 10.1007/s11749-008-0092-z]. The test statistic is a supremum over the space of trees of a function of the two samples; its computation grows, in principle, exponentially fast with the maximal number of nodes of the potential trees. We show how to transform this problem into a max-flow problem over a related graph, which can be solved using the Ford-Fulkerson algorithm in time polynomial in that number. We apply the test to 10 randomly chosen protein domain families from the seed of the Pfam-A database (high-quality, manually curated families). The test shows that the distributions of context trees coming from different families are significantly different. We emphasize that this is a novel mathematical approach to validate the automatic clustering of sequences in any context. We also study the performance of the test via simulations on Galton-Watson related processes.
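The paper's statistic is a supremum over a space of context trees, which is what forces the max-flow reduction. As a hedged illustration of the same Kolmogorov-Smirnov idea in its classical form (empirical distributions on the real line rather than on trees), a minimal sketch:

```python
import numpy as np

def ks_two_sample(a, b):
    """Classical two-sample Kolmogorov-Smirnov statistic:
    sup_x |F_a(x) - F_b(x)| between the two empirical CDFs."""
    a, b = np.sort(np.asarray(a)), np.sort(np.asarray(b))
    grid = np.concatenate([a, b])   # the supremum is attained at a sample point
    F_a = np.searchsorted(a, grid, side="right") / len(a)
    F_b = np.searchsorted(b, grid, side="right") / len(b)
    return np.max(np.abs(F_a - F_b))
```

On the real line the supremum is a cheap maximum over sample points; on the space of trees it is not, which is why the authors need the graph reformulation.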

Relevance:

30.00%

Publisher:

Abstract:

The adaptive process in motor learning was examined in terms of effects of varying amounts of constant practice performed before random practice. Participants pressed five response keys sequentially, the last one coincident with the lighting of a final visual stimulus provided by a complex coincident timing apparatus. Different visual stimulus speeds were used during the random practice. 33 children (M age = 11.6 yr.) were randomly assigned to one of three experimental groups: constant-random, constant-random 33%, and constant-random 66%. The constant-random group practiced constantly until they reached a criterion of performance stabilization: three consecutive trials within 50 msec of error. The other two groups had additional constant practice amounting to 33% and 66%, respectively, of the number of trials needed to achieve the stabilization criterion. All three groups performed 36 trials under random practice; in the adaptation phase, they practiced at a visual stimulus speed different from the one adopted in the stabilization phase. Global performance measures were absolute, constant, and variable errors, and movement pattern was analyzed by relative timing and overall movement time. There was no group difference in relation to global performance measures and overall movement time. However, differences between the groups were observed in movement pattern, since the constant-random 66% group changed its relative timing performance in the adaptation phase.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To evaluate the impact of atypical retardation patterns (ARP) on detection of progressive retinal nerve fiber layer (RNFL) loss using scanning laser polarimetry with variable corneal compensation (VCC). DESIGN: Observational cohort study. METHODS: The study included 377 eyes of 221 patients with a median follow-up of 4.0 years. Images were obtained annually with the GDx VCC (Carl Zeiss Meditec Inc, Dublin, California, USA), along with optic disc stereophotographs and standard automated perimetry (SAP) visual fields. Progression was determined by the Guided Progression Analysis software for SAP and by masked assessment of stereophotographs by expert graders. The typical scan score (TSS) was used to quantify the presence of ARPs on GDx VCC images. Random coefficients models were used to evaluate the relationship between ARP and RNFL thickness measurements over time. RESULTS: Thirty-eight eyes (10%) showed progression over time on visual fields, stereophotographs, or both. Changes in TSS scores from baseline were significantly associated with changes in RNFL thickness measurements in both progressing and nonprogressing eyes. Each 1-unit increase in TSS score was associated with a 0.19 μm decrease in RNFL thickness measurement (P < .001) over time. CONCLUSIONS: ARPs had a significant effect on detection of progressive RNFL loss with the GDx VCC. Eyes with large amounts of atypical patterns, great fluctuations on these patterns over time, or both may show changes in measurements that can appear falsely as glaucomatous progression or can mask true changes in the RNFL. (Am J Ophthalmol 2009;148:155-163. (C) 2009 by Elsevier Inc. All rights reserved.)

Relevance:

30.00%

Publisher:

Abstract:

Analyzing the relationship between the baseline value and subsequent change of a continuous variable is a frequent matter of inquiry in cohort studies. These analyses are surprisingly complex, particularly if only two waves of data are available. It is unclear for non-biostatisticians where the complexity of this analysis lies and which statistical method is adequate. With the help of simulated longitudinal data of body mass index in children, we review statistical methods for the analysis of the association between the baseline value and subsequent change, assuming linear growth with time. Key issues in such analyses are mathematical coupling, measurement error, variability of change between individuals, and regression to the mean. Ideally, it is better to rely on multiple repeated measurements at different times, and a linear random effects model is a standard approach if more than two waves of data are available. If only two waves of data are available, our simulations show that Blomqvist's method - which consists in adjusting the estimated regression coefficient of observed change on baseline value for the measurement error variance - provides accurate estimates. The adequacy of the methods to assess the relationship between the baseline value and subsequent change depends on the number of data waves, the availability of information on measurement error, and the variability of change between individuals.
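One of the key issues named above can be shown in a few lines: when the baseline is measured with error, the observed baseline appears in the observed change with opposite sign (mathematical coupling), inducing a spurious negative baseline-change association even when true change is independent of true baseline. A minimal simulation, with hypothetical BMI-like numbers; this illustrates the problem, not Blomqvist's correction itself:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
true_baseline = rng.normal(20.0, 2.0, size=n)   # hypothetical BMI-like values
true_change = rng.normal(0.5, 1.0, size=n)      # change independent of baseline
err0 = rng.normal(0.0, 1.0, size=n)             # measurement error at baseline
err1 = rng.normal(0.0, 1.0, size=n)             # measurement error at follow-up

obs_baseline = true_baseline + err0
obs_change = (true_baseline + true_change + err1) - obs_baseline

r_true = np.corrcoef(true_baseline, true_change)[0, 1]  # ~ 0 by construction
r_obs = np.corrcoef(obs_baseline, obs_change)[0, 1]     # spuriously negative
```

The spurious correlation comes entirely from `err0` entering `obs_change` with a minus sign, which is exactly the variance that a measurement-error adjustment has to remove.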

Relevance:

30.00%

Publisher:

Abstract:

Summary points:
- The bias introduced by random measurement error will be different depending on whether the error is in an exposure variable (risk factor) or outcome variable (disease)
- Random measurement error in an exposure variable will bias the estimates of regression slope coefficients towards the null
- Random measurement error in an outcome variable will instead increase the standard error of the estimates and widen the corresponding confidence intervals, making results less likely to be statistically significant
- Increasing sample size will help minimise the impact of measurement error in an outcome variable but will only make estimates more precisely wrong when the error is in an exposure variable
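The slope-related points above can be checked with a short simulation, assuming an illustrative true slope of 2.0 and unit-variance errors (so the attenuation factor for exposure error is about one half):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
beta = 2.0                              # assumed true slope (illustrative)
x = rng.normal(size=n)                  # true exposure
y = beta * x + rng.normal(size=n)       # true outcome

def ols_slope(u, v):
    # OLS slope of v regressed on u
    return np.cov(u, v)[0, 1] / np.var(u, ddof=1)

x_noisy = x + rng.normal(size=n)        # random error in the exposure
y_noisy = y + rng.normal(size=n)        # random error in the outcome

b_clean = ols_slope(x, y)        # ~ beta
b_xerr = ols_slope(x_noisy, y)   # attenuated towards the null (~ beta/2 here)
b_yerr = ols_slope(x, y_noisy)   # still ~ beta, but with a larger standard error
```

Increasing `n` shrinks the scatter of `b_yerr` around `beta`, but `b_xerr` just converges more precisely to the attenuated value, which is the "precisely wrong" point in the last bullet.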

Relevance:

30.00%

Publisher:

Abstract:

We study the behavior of the random-bond Ising model at zero temperature by numerical simulations for a variable amount of disorder. The model is an example of systems exhibiting a fluctuationless first-order phase transition similar to some field-induced phase transitions in ferromagnetic systems and the martensitic phase transition appearing in a number of metallic alloys. We focus on the study of the hysteresis cycles appearing when the external field is swept from positive to negative values. By using a finite-size scaling hypothesis, we analyze the disorder-induced phase transition between the phase exhibiting a discontinuity in the hysteresis cycle and the phase with the continuous hysteresis cycle. Critical exponents characterizing the transition are obtained. We also analyze the size and duration distributions of the magnetization jumps (avalanches).
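A hedged sketch of the kind of zero-temperature hysteresis sweep described above, using a 2-D random-field Ising model as a hypothetical stand-in for the random-bond model (uniform couplings, disorder placed in local fields); spins flip whenever they oppose their local field while the external field is swept downward:

```python
import numpy as np

def descending_branch(L=16, disorder=1.0, dH=0.05, seed=0):
    """Descending branch of the zero-temperature hysteresis loop for a 2-D
    random-field Ising model (illustrative stand-in for the random-bond
    model; `disorder` is the assumed random-field standard deviation)."""
    rng = np.random.default_rng(seed)
    h = rng.normal(0.0, disorder, size=(L, L))   # quenched random fields
    s = np.ones((L, L))                          # start fully magnetized
    Hs = np.arange(3.0, -3.0 - dH / 2, -dH)      # sweep the field downwards
    ms = []
    for H in Hs:
        while True:   # relax at fixed H: flip every unstable spin
            nb = (np.roll(s, 1, 0) + np.roll(s, -1, 0)
                  + np.roll(s, 1, 1) + np.roll(s, -1, 1))
            unstable = s * (nb + h + H) < 0
            if not unstable.any():
                break
            s[unstable] *= -1.0
        ms.append(s.mean())
    return np.array(Hs), np.array(ms)
```

Recording the number of spins flipped per relaxation would give the avalanche sizes; sweeping `disorder` moves the loop between the discontinuous and continuous regimes discussed in the abstract.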

Relevance:

30.00%

Publisher:

Abstract:

The decay of an unstable state under the influence of external colored noise has been studied by means of analog experiments and digital simulations. For both fixed and random initial conditions, the time evolution of the second moment ⟨x²(t)⟩ of the system variable was determined and then used to evaluate the nonlinear relaxation time. The results obtained are found to be in excellent agreement with the theoretical predictions of the immediately preceding paper [Casademunt, Jiménez-Aquino, and Sancho, Phys. Rev. A 40, 5905 (1989)].

Relevance:

30.00%

Publisher:

Abstract:

All derivations of the one-dimensional telegrapher's equation based on the persistent random walk model assume a constant speed of signal propagation. We generalize the model here to allow for a variable propagation speed and study several limiting cases in detail. We also show the connections of this model with anomalous diffusion behavior and with inertial dichotomous processes.
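A minimal sketch of the underlying constant-speed persistent random walk (the baseline model; the variable-speed generalization studied in the paper is not implemented here), with velocity reversals at an assumed Poisson rate:

```python
import numpy as np

def persistent_walk(n_steps, flip_rate, speed=1.0, dt=0.01, rng=None):
    """1-D persistent random walk at constant speed: the velocity keeps
    its sign between reversals, which occur at Poisson rate `flip_rate`
    (a variable-speed generalization would let `speed` change along the
    path)."""
    if rng is None:
        rng = np.random.default_rng()
    v = speed * rng.choice([-1.0, 1.0])
    x, xs = 0.0, np.empty(n_steps)
    for i in range(n_steps):
        if rng.random() < flip_rate * dt:   # velocity reversal
            v = -v
        x += v * dt
        xs[i] = x
    return xs
```

The walker's position is confined to the light cone |x| ≤ speed·t, the finite-propagation-speed property that distinguishes the telegrapher's equation from ordinary diffusion; at times long compared with 1/flip_rate the spread becomes diffusive.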

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces a nonlinear measure of dependence between random variables in the context of remote sensing data analysis. The Hilbert-Schmidt Independence Criterion (HSIC) is a kernel method for evaluating statistical dependence. HSIC is based on computing the Hilbert-Schmidt norm of the cross-covariance operator of mapped samples in the corresponding Hilbert spaces. The HSIC empirical estimator is very easy to compute and has good theoretical and practical properties. We exploit the capabilities of HSIC to explain nonlinear dependences in two remote sensing problems: temperature estimation and chlorophyll concentration prediction from spectra. Results show that, when the relationship between random variables is nonlinear or when few data are available, the HSIC criterion outperforms other standard methods, such as the linear correlation or mutual information.
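The empirical estimator mentioned above fits in a few lines. A hedged sketch for 1-D samples, using Gaussian (RBF) kernels with an assumed bandwidth and the biased estimator tr(KHLH)/n² (normalization conventions vary across references):

```python
import numpy as np

def rbf_kernel(z, sigma=1.0):
    # Gaussian kernel matrix for a 1-D sample (sigma is an assumed bandwidth)
    d2 = (z[:, None] - z[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    """Biased empirical HSIC: tr(K H L H) / n^2, where H is the centering
    matrix; zero (in expectation) iff x and y are independent for
    characteristic kernels such as the Gaussian."""
    n = len(x)
    K, L = rbf_kernel(x, sigma), rbf_kernel(y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n ** 2
```

Unlike linear correlation, this estimate grows for purely nonlinear dependence (e.g. y = x² with x symmetric), which is the behavior the remote sensing experiments exploit.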

Relevance:

30.00%

Publisher:

Abstract:

Variation in queen number alters the genetic structure of social insect colonies, which in turn affects patterns of kin-selected conflict and cooperation. Theory suggests that shifts from single- to multiple-queen colonies are often associated with other changes in the breeding system, such as higher queen turnover, more local mating, and restricted dispersal. These changes may restrict gene flow between the two types of colonies and it has been suggested that this might ultimately lead to sympatric speciation. We performed a detailed microsatellite analysis of a large population of the ant Formica selysi, which revealed extensive variation in social structure, with 71 colonies headed by a single queen and 41 by multiple queens. This polymorphism in social structure appeared stable over time, since little change in the number of queens per colony was detected over a five-year period. Apart from queen number, single- and multiple-queen colonies had very similar breeding systems. Queen turnover was absent or very low in both types of colonies. Single- and multiple-queen colonies exhibited very small but significant levels of inbreeding, which indicates a slight deviation from random mating at a local scale and suggests that a small proportion of queens mate with related males. For both types of colonies, there was very little genetic structuring above the level of the nest, with no sign of isolation by distance. These similarities in the breeding systems were associated with a complete lack of genetic differentiation between single- and multiple-queen colonies, which provides no support for the hypothesis that change in queen number leads to restricted gene flow between social forms. 
Overall, this study suggests that the higher rates of queen turnover, local mating, and population structuring that are often associated with multiple-queen colonies do not appear when single- and multiple-queen colonies still coexist within the same population, but build up over time in populations consisting mostly of multiple-queen colonies.

Relevance:

30.00%

Publisher:

Abstract:

Stimuli-responsive polymers have been widely studied in recent years, notably with biomedical applications in view. These polymers can change their solubility in response to variations in pH or temperature. The aim of this thesis is the synthesis and study of new diblock copolymers composed of two random copolymers. The polymers were obtained by RAFT (reversible addition-fragmentation chain-transfer) controlled radical polymerization. The block copolymers are formed from methacrylate and/or acrylamide monomers whose polymers are known to be thermosensitive and pH-sensitive. First, random block copolymers of the type AnBm-b-ApBq were synthesized from N-n-propylacrylamide (nPA) and N-ethylacrylamide (EA), A and B respectively, by RAFT polymerization. The copolymerization kinetics of poly(nPAx-co-EA1-x)-block-poly(nPAy-co-EA1-y) and their composition were studied in order to characterize and evaluate the physicochemical properties of random block copolymers with a low polydispersity index. Their thermosensitive character was studied in aqueous solution by UV-Vis spectroscopy, turbidimetry, and dynamic light scattering (DLS) analysis. The observed cloud points (CP) of the individual blocks and of the resulting copolymers show well-defined phase transitions on heating. A large number of natural macromolecules respond to external stimuli such as pH and temperature. Accordingly, a third monomer, 2-diethylaminoethyl methacrylate (DEAEMA), was added to the synthesis to form block copolymers of the form AnBm-b-ApCq, offering a dual (pH and temperature) response that is tunable in solution. This multi-stimuli polymer, of the form poly(nPAx-co-DEAEMA1-x)-block-poly(nPAy-co-EA1-y), was likewise synthesized by RAFT polymerization.
The results indicate random block copolymers with physicochemical properties different from those of the first diblocks, notably their solubility under variations of pH and temperature. Finally, the change in hydrophobicity of the copolymers was studied by varying the length of the block sequences. The relative length of the blocks is known to affect the aggregation mechanisms of an amphiphilic copolymer. Thus, under different pH and/or temperature stimuli, experiments performed on random block copolymers of different lengths show interesting aggregation behaviors, evolving through various micellar, aggregate, and vesicle forms.

Relevance:

30.00%

Publisher:

Abstract:

We analyze a finite horizon, single product, periodic review model in which pricing and production/inventory decisions are made simultaneously. Demands in different periods are random variables that are independent of each other and their distributions depend on the product price. Pricing and ordering decisions are made at the beginning of each period and all shortages are backlogged. Ordering cost includes both a fixed cost and a variable cost proportional to the amount ordered. The objective is to find an inventory policy and a pricing strategy maximizing expected profit over the finite horizon. We show that when the demand model is additive, the profit-to-go functions are k-concave and hence an (s,S,p) policy is optimal. In such a policy, the period inventory is managed based on the classical (s,S) policy and price is determined based on the inventory position at the beginning of each period. For more general demand functions, i.e., multiplicative plus additive functions, we demonstrate that the profit-to-go function is not necessarily k-concave and an (s,S,p) policy is not necessarily optimal. We introduce a new concept, symmetric k-concavity, and apply it to provide a characterization of the optimal policy.
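As a hedged sketch of how the inventory side of such an (s,S,p) policy operates (the price component is held fixed here, and all cost parameters and the Poisson demand are hypothetical), each period the inventory position is raised to S whenever it falls below s:

```python
import numpy as np

def avg_cost_sS(s, S, periods=2000, K=10.0, c=1.0, h=0.5, b=2.0,
                mean_demand=5.0, seed=0):
    """Average per-period cost of a classical (s,S) policy with backlogging.
    K: fixed ordering cost, c: variable unit cost, h: holding cost,
    b: backlog penalty per unit (all parameter values are hypothetical)."""
    rng = np.random.default_rng(seed)
    inv, cost = 0.0, 0.0
    for _ in range(periods):
        if inv < s:                        # raise inventory position to S
            cost += K + c * (S - inv)
            inv = float(S)
        inv -= rng.poisson(mean_demand)    # demand; shortages are backlogged
        cost += h * max(inv, 0.0) + b * max(-inv, 0.0)
    return cost / periods
```

In the full (s,S,p) policy the price set at the start of each period would shift the demand distribution, and s and S come out of the k-concavity argument rather than being chosen by hand as here.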

Relevance:

30.00%

Publisher:

Abstract:

We analyze an infinite horizon, single product, periodic review model in which pricing and production/inventory decisions are made simultaneously. Demands in different periods are identically distributed random variables that are independent of each other and their distributions depend on the product price. Pricing and ordering decisions are made at the beginning of each period and all shortages are backlogged. Ordering cost includes both a fixed cost and a variable cost proportional to the amount ordered. The objective is to maximize expected discounted, or expected average profit over the infinite planning horizon. We show that a stationary (s,S,p) policy is optimal for both the discounted and average profit models with general demand functions. In such a policy, the period inventory is managed based on the classical (s,S) policy and price is determined based on the inventory position at the beginning of each period.

Relevance:

30.00%

Publisher:

Abstract:

The Synapsing Variable Length Crossover (SVLC) algorithm provides a biologically inspired method for performing meaningful crossover between variable length genomes. In addition to providing a rationale for variable length crossover, it also provides a genotypic similarity metric for variable length genomes, enabling standard niche formation techniques to be used with variable length genomes. Unlike other variable length crossover techniques, which consider genomes to be rigid, inflexible arrays and in which some or all of the crossover points are randomly selected, the SVLC algorithm considers genomes to be flexible and chooses non-random crossover points based on the common parental sequence similarity. The SVLC algorithm recurrently "glues" or synapses homogeneous genetic sub-sequences together. This is done in such a way that common parental sequences are automatically preserved in the offspring, with only the genetic differences being exchanged or removed, independent of the length of such differences. In a variable length test problem the SVLC algorithm is shown to outperform current variable length crossover techniques. The SVLC algorithm is also shown to work in a more realistic robot neural network controller evolution application.
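A loose sketch of the idea (not the authors' exact recursive algorithm): align the two parents on their common sub-sequences, treat each shared run as a "synapse" preserved in both children, and exchange alternate differing segments between them. Python's `difflib` stands in for the paper's synapsing alignment:

```python
from difflib import SequenceMatcher

def synapsing_crossover(p1, p2):
    """Illustrative variable-length crossover: common sub-sequences
    ("synapses") are kept in both children, while the differing segments
    between synapses are exchanged alternately, so genomes of different
    lengths can still cross over at meaningful points."""
    blocks = SequenceMatcher(None, p1, p2).get_matching_blocks()
    child1, child2 = [], []
    i = j = swap = 0
    for b in blocks:
        seg1, seg2 = p1[i:b.a], p2[j:b.b]   # differing material before synapse
        if swap:
            seg1, seg2 = seg2, seg1         # exchange alternate segments
        swap = 1 - swap
        child1 += seg1 + p1[b.a:b.a + b.size]   # synapse kept in both children
        child2 += seg2 + p2[b.b:b.b + b.size]
        i, j = b.a + b.size, b.b + b.size
    return child1, child2
```

Crossover points fall only at boundaries of shared material, mirroring the SVLC property that common parental sequences survive in the offspring while only the differences are exchanged.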