803 results for "Probability sample"
Abstract:
A generalized or tunable-kernel model is proposed for probability density function estimation based on an orthogonal forward regression procedure. Each stage of the density estimation process determines a tunable kernel, namely, its center vector and diagonal covariance matrix, by minimizing a leave-one-out test criterion. The kernel mixing weights of the constructed sparse density estimate are finally updated using the multiplicative nonnegative quadratic programming algorithm to ensure the nonnegativity and unity constraints, and this weight-updating process additionally has the desired ability to further reduce the model size. The proposed tunable-kernel model has advantages, in terms of model generalization capability and model sparsity, over the standard fixed-kernel model that restricts kernel centers to the training data points and employs a single common kernel variance for every kernel. On the other hand, it does not optimize all the model parameters together and thus avoids the problems of high-dimensional ill-conditioned nonlinear optimization associated with the conventional finite mixture model. Several examples are included to demonstrate the ability of the proposed tunable-kernel model to construct an accurate and very compact density estimate.
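As an illustration of the final weight-updating step, the sketch below applies a multiplicative nonnegative update with renormalization to mixture weights, assuming a quadratic cost of the form (1/2)wᵀBw − rᵀw with nonnegative B and r; the exact MNQP variant and the matrices used in the paper may differ, and the function name mnqp_weights is illustrative.

```python
import numpy as np

def mnqp_weights(B, r, n_iter=200, tol=1e-10):
    """Multiplicative nonnegative update for min 0.5*w'Bw - r'w, w >= 0,
    followed by renormalisation so that the weights sum to one.
    Assumes B and r have nonnegative entries (true for Gaussian-kernel
    Gram-type matrices), so positivity is preserved at every step."""
    M = B.shape[0]
    w = np.full(M, 1.0 / M)                # start from uniform mixing weights
    for _ in range(n_iter):
        w_new = w * r / (B @ w + 1e-12)    # multiplicative NQP-style update
        w_new /= w_new.sum()               # enforce the unity (sum-to-one) constraint
        if np.max(np.abs(w_new - w)) < tol:
            w = w_new
            break
        w = w_new
    return w

# Toy usage: B could play the role of the kernel cross-Gram matrix and r the
# kernel/target inner products produced by the forward-regression stage.
rng = np.random.default_rng(0)
G = np.abs(rng.normal(size=(5, 5)))
B = G @ G.T                                # symmetric, nonnegative entries
r = np.abs(rng.normal(size=5))
w = mnqp_weights(B, r)
print(np.round(w, 4), w.sum())             # some weights may shrink towards zero
```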
Abstract:
BACKGROUND: Enriching poultry meat with long-chain n-3 polyunsaturated fatty acids (LC n-3 PUFA) can increase low population intakes of LC n-3 PUFA, but fishy taints can spoil reheated meat. This experiment determined the effect of different amounts of LC n-3 PUFA and vitamin E in the broiler diet on the fatty acid composition and sensory characteristics of the breast meat. Ross 308 broilers (120) were randomly allocated to one of five treatments from 21 to 42 days of age. Diets contained (g kg⁻¹) 0, 9 or 18 LC n-3 PUFA (0LC, 9LC, 18LC), and 100, 150 or 200 mg DL-α-tocopherol acetate kg⁻¹ (E). The five diets were 0LC100E, 9LC100E, 18LC100E, 18LC150E, 18LC200E, with four pens per diet, except 18LC100E (eight pens). Breast meat was analysed for fatty acids (uncooked) and subjected to sensory analysis by R-index (reheated). RESULTS: LC n-3 PUFA content (mg kg⁻¹ meat) was 514 (0LC100E) and 2236 (9LC and 18LC). Compared with 0LC100E, meat from 18LC100E and 18LC150E tasted significantly different, while 23% of panellists detected fishy taints in 9LC100E and 18LC200E. CONCLUSION: Chicken meat can be enriched with nutritionally meaningful amounts of LC n-3 PUFA, but > 100 mg DL-α-tocopherol acetate kg⁻¹ broiler diet is needed to protect reheated meat from oxidative deterioration. Copyright © 2010 Society of Chemical Industry
Abstract:
A two-locus match probability is presented that incorporates the effects of within-subpopulation inbreeding (consanguinity) in addition to population subdivision. The usual practice of calculating multi-locus match probabilities as the product of single-locus probabilities assumes independence between loci. There are a number of population genetics phenomena that can violate this assumption: in addition to consanguinity, which increases homozygosity at all loci simultaneously, gametic disequilibrium will introduce dependence into DNA profiles. However, in forensics the latter problem is usually addressed in part by the careful choice of unlinked loci. Hence, as is conventional, we assume gametic equilibrium here, and focus instead on between-locus dependence due to consanguinity. The resulting match probability formulae are an extension of existing methods in the literature, and are shown to be more conservative than these methods in the case of double homozygote matches. For two-locus profiles involving one or more heterozygous genotypes, results are similar to, or smaller than, the existing approaches.
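The two-locus formulae themselves are not reproduced in the abstract. As background, the sketch below gives the widely used single-locus match probabilities with the subpopulation correction θ (the Balding–Nichols / NRC II-style expressions) and the conventional product across loci, whose between-locus independence assumption the paper's consanguinity-aware formulae relax. Function names and numerical values are illustrative.

```python
def match_prob_homozygote(p, theta):
    """Conditional match probability for a homozygous genotype A_iA_i,
    with allele frequency p and subpopulation correction theta."""
    return ((2 * theta + (1 - theta) * p) * (3 * theta + (1 - theta) * p)) / \
           ((1 + theta) * (1 + 2 * theta))

def match_prob_heterozygote(p_i, p_j, theta):
    """Conditional match probability for a heterozygous genotype A_iA_j."""
    return 2 * (theta + (1 - theta) * p_i) * (theta + (1 - theta) * p_j) / \
           ((1 + theta) * (1 + 2 * theta))

# Conventional two-locus value: the product of single-locus probabilities,
# i.e. exactly the between-locus independence assumption discussed above.
theta = 0.03
locus1 = match_prob_homozygote(0.10, theta)   # double-homozygote case
locus2 = match_prob_homozygote(0.20, theta)
print(locus1 * locus2)  # naive product; the paper's formula is more conservative here
```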
Abstract:
A new sparse kernel probability density function (pdf) estimator based on a zero-norm constraint is constructed using the classical Parzen window (PW) estimate as the target function. The so-called zero-norm of the parameters is used in order to achieve enhanced model sparsity, and it is suggested to minimize an approximate function of the zero-norm. It is shown that, under certain conditions, the kernel weights of the proposed pdf estimator based on the zero-norm approximation can be updated using the multiplicative nonnegative quadratic programming algorithm. Numerical examples are employed to demonstrate the efficacy of the proposed approach.
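For reference, a minimal implementation of the classical Parzen window estimate that serves as the target function; the zero-norm approximation and the MNQP weight update themselves are not reproduced here, and the kernel width h is an assumed input.

```python
import numpy as np

def parzen_window_pdf(x_query, x_train, h):
    """Classical Parzen window (kernel density) estimate with an isotropic
    Gaussian kernel of width h; this dense estimate is the target that a
    sparse estimator would be fitted to."""
    x_query = np.atleast_2d(x_query)        # (Q, d)
    x_train = np.atleast_2d(x_train)        # (N, d)
    N, d = x_train.shape
    diff = x_query[:, None, :] - x_train[None, :, :]
    sq_dist = np.sum(diff ** 2, axis=-1)    # squared distances, shape (Q, N)
    norm = (2 * np.pi * h ** 2) ** (d / 2)  # Gaussian normalising constant
    return np.exp(-0.5 * sq_dist / h ** 2).sum(axis=1) / (N * norm)

rng = np.random.default_rng(1)
x_train = rng.normal(size=(500, 1))
x_grid = np.linspace(-4, 4, 9).reshape(-1, 1)
print(np.round(parzen_window_pdf(x_grid, x_train, h=0.3), 3))
```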
Abstract:
Recent research has suggested that relatively cold UK winters are more common when solar activity is low (Lockwood et al 2010 Environ. Res. Lett. 5 024001). Solar activity during the current sunspot minimum has fallen to levels unknown since the start of the 20th century (Lockwood 2010 Proc. R. Soc. A 466 303–29), and records of past solar variations inferred from cosmogenic isotopes (Abreu et al 2008 Geophys. Res. Lett. 35 L20109) and geomagnetic activity data (Lockwood et al 2009 Astrophys. J. 700 937–44) suggest that the current grand solar maximum is coming to an end and hence that solar activity can be expected to continue to decline. Combining cosmogenic isotope data with the long record of temperatures measured in central England, we estimate how solar change could influence the future probability of UK winters that are cold relative to the hemispheric mean temperature, assuming all other factors remain constant. Global warming is taken into account only through the detrending using mean hemispheric temperatures. We show that some predictive skill may be obtained by including the solar effect.
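A toy sketch of the kind of conditional-probability estimate described, on entirely synthetic numbers: detrend winter temperatures against the hemispheric mean, label the coldest quartile of residuals as relatively cold winters, and compare their frequency in low- and high-solar-activity years. The thresholds, the synthetic data, and the variable names are assumptions, not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 150
solar = rng.uniform(0, 1, n)                          # proxy for the solar-activity record
hemis = 0.01 * np.arange(n) + rng.normal(0, 0.2, n)   # hemispheric mean (warming trend)
cet = 1.0 * hemis - 0.8 * (solar < 0.3) + rng.normal(0, 1.0, n)  # synthetic winter CET

# Remove the hemispheric-mean (global warming) signal by linear regression
slope, intercept = np.polyfit(hemis, cet, 1)
resid = cet - (slope * hemis + intercept)

cold = resid < np.quantile(resid, 0.25)               # relatively cold winters
low_solar = solar < np.quantile(solar, 0.25)          # low-solar-activity years

print(f"P(cold | low solar)  ~ {cold[low_solar].mean():.2f}")
print(f"P(cold | high solar) ~ {cold[~low_solar].mean():.2f}")
```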
Abstract:
The application of automatic segmentation methods in lesion detection is desirable. However, such methods are restricted by intensity similarities between lesioned and healthy brain tissue. Using multi-spectral magnetic resonance imaging (MRI) modalities may overcome this problem, but it is not always practicable. In this article, a lesion detection approach requiring a single MRI modality is presented, which is an improved method based on a recent publication. This new method assumes that a low similarity should be found in the regions of lesions when the likeness between an intensity-based fuzzy segmentation and location-based tissue probabilities is measured. The use of a normalized similarity measurement enables the current method to fine-tune the threshold for lesion detection, thus maximizing the possibility of reaching high detection accuracy. Importantly, an extra cleaning step is included in the current approach which removes enlarged ventricles from detected lesions. The performance investigation using simulated lesions demonstrated that not only were the majority of lesions well detected but also normal tissues were identified effectively. Tests on images acquired in stroke patients further confirmed the strength of the method in lesion detection. When compared with the previous version, the current approach showed a higher sensitivity in detecting small lesions and produced fewer false positives around the ventricles and the edge of the brain.
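A minimal sketch of the underlying idea, assuming one tissue class and a simple absolute-difference similarity; the paper's actual normalized similarity measure, threshold tuning, and cleaning rules may differ, and all names here are illustrative.

```python
import numpy as np

def detect_lesions(fuzzy_seg, tissue_prob, ventricle_mask, threshold=0.5):
    """Flag voxels where an intensity-based fuzzy membership map and a
    location-based tissue probability map disagree.

    fuzzy_seg, tissue_prob : arrays in [0, 1], same shape (one tissue class)
    ventricle_mask         : boolean array marking (enlarged) ventricles
    """
    # Normalised voxel-wise similarity in [0, 1]; low values suggest lesion
    similarity = 1.0 - np.abs(fuzzy_seg - tissue_prob)
    candidates = similarity < threshold
    # Extra cleaning step: drop candidates falling inside the ventricles
    return candidates & ~ventricle_mask

# Toy 2D example with a simulated lesion
rng = np.random.default_rng(3)
shape = (64, 64)
tissue_prob = np.clip(rng.normal(0.7, 0.1, shape), 0, 1)
fuzzy_seg = tissue_prob.copy()
fuzzy_seg[20:26, 20:26] = 0.05                       # intensity disagrees here
ventricle_mask = np.zeros(shape, dtype=bool)
ventricle_mask[40:48, 40:48] = True
lesion_mask = detect_lesions(fuzzy_seg, tissue_prob, ventricle_mask)
print(lesion_mask.sum(), "candidate lesion voxels")
```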
Abstract:
We consider the finite sample properties of model selection by information criteria in conditionally heteroscedastic models. Recent theoretical results show that certain popular criteria are consistent in that they will select the true model asymptotically with probability 1. To examine the empirical relevance of this property, Monte Carlo simulations are conducted for a set of non-nested data generating processes (DGPs) with the set of candidate models consisting of all types of model used as DGPs. In addition, not only is the best model considered but also those with similar values of the information criterion, called close competitors, thus forming a portfolio of eligible models. To supplement the simulations, the criteria are applied to a set of economic and financial series. In the simulations, the criteria are largely ineffective at identifying the correct model, either as best or a close competitor, the parsimonious GARCH(1, 1) model being preferred for most DGPs. In contrast, asymmetric models are generally selected to represent actual data. This leads to the conjecture that the properties of parameterizations of processes commonly used to model heteroscedastic data are more similar than may be imagined and that more attention needs to be paid to the behaviour of the standardized disturbances of such models, both in simulation exercises and in empirical modelling.
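A sketch of this style of information-criterion comparison with close competitors, using the third-party arch package for the candidate conditionally heteroscedastic models; the candidate set, the BIC margin, and the synthetic return series are illustrative assumptions rather than the paper's simulation design.

```python
import numpy as np
from arch import arch_model   # assumes the `arch` package is installed

rng = np.random.default_rng(4)
returns = rng.standard_t(df=8, size=1000)             # placeholder return series

candidates = {
    "GARCH(1,1)":     dict(vol="GARCH",  p=1, o=0, q=1),
    "GJR-GARCH(1,1)": dict(vol="GARCH",  p=1, o=1, q=1),
    "EGARCH(1,1)":    dict(vol="EGARCH", p=1, o=1, q=1),
    "ARCH(1)":        dict(vol="GARCH",  p=1, o=0, q=0),
}

# Fit each candidate and record its BIC
bic = {}
for name, spec in candidates.items():
    res = arch_model(returns, mean="Constant", **spec).fit(disp="off")
    bic[name] = res.bic

best = min(bic, key=bic.get)
margin = 2.0                                           # illustrative closeness threshold
close = [m for m, v in bic.items() if v - bic[best] <= margin and m != best]
print("best:", best, "| close competitors:", close)
```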
Abstract:
Consideration is given to a standard CDMA system and to the determination of the density function of the interference, with and without Gaussian noise, using sampling theory concepts. The derived formula provides fast and accurate results and is a simple, useful alternative to other methods.
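The sampling-theory formula itself is not given in the abstract. As an empirical counterpart, the sketch below Monte Carlo-simulates the multiple-access interference of a simple synchronous DS-CDMA model with random spreading codes, with and without Gaussian noise, so its histogram could be compared against any analytic density. All parameters are assumptions, and this is not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(5)
K, N, trials = 10, 31, 200_000         # users, chips per symbol, Monte Carlo trials

# Interfering users' data bits
bits = rng.choice([-1.0, 1.0], size=(trials, K - 1))

# Normalised cross-correlation of two independent random +/-1 codes: the number
# of agreeing chips is Binomial(N, 0.5), so the correlation is (2*count - N)/N.
cross = (2.0 * rng.binomial(N, 0.5, size=(trials, K - 1)) - N) / N

mai = (bits * cross).sum(axis=1)                       # interference, no noise
mai_noisy = mai + rng.normal(0.0, 0.1, size=trials)    # plus additive Gaussian noise

hist, edges = np.histogram(mai, bins=60, density=True) # empirical density of the MAI
print("interference std (no noise / with noise):",
      round(mai.std(), 4), round(mai_noisy.std(), 4))
```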
Abstract:
Recent research in social neuroscience proposes a link between the mirror neuron system (MNS) and social cognition. The MNS has been proposed to be the neural mechanism underlying action recognition and intention understanding and, more broadly, social cognition. The pre-motor MNS has been suggested to modulate the motor cortex during action observation. This modulation results in an enhanced cortico-motor excitability reflected in increased motor evoked potentials (MEPs) at the muscle of interest during action observation. Anomalous MNS activity has been reported in the autistic population, whose social skills are notably impaired. It is still an open question whether traits of autism in the normal population are linked to MNS functioning. We measured TMS-induced MEPs in normal individuals with high and low traits of autism, as measured by the autistic quotient (AQ), while they observed videos of hand or mouth actions, static images of a hand or mouth, or a blank screen. No differences were observed between the two groups while they observed a blank screen. However, participants with low traits of autism showed significantly greater MEP amplitudes during observation of hand/mouth actions relative to static hand/mouth stimuli. In contrast, participants with high traits of autism did not show such a MEP amplitude difference between observation of actions and static stimuli. These results are discussed with reference to MNS functioning.
Abstract:
This paper presents practical approaches to the problem of sample size re-estimation in the case of clinical trials with survival data when proportional hazards can be assumed. When data on a full range of survival experiences across the recruited patients are readily available at the time of the review, it is shown that, as expected, performing a blinded re-estimation procedure is straightforward and can help to maintain the trial's pre-specified error rates. Two alternative methods for dealing with the situation where limited survival experiences are available at the time of the sample size review are then presented and compared. In this instance, extrapolation is required in order to undertake the sample size re-estimation. Worked examples, together with results from a simulation study, are described. It is concluded that, as in the standard case, use of either extrapolation approach successfully protects the trial error rates. Copyright © 2012 John Wiley & Sons, Ltd.
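The abstract does not state the formulas used. The sketch below shows the standard Schoenfeld required-events calculation under proportional hazards and how a (possibly blinded) re-estimated pooled event probability would translate events into patients; the helper names and numbers are illustrative, not the paper's procedure.

```python
from math import ceil, log
from scipy.stats import norm   # used only for the normal quantiles

def required_events(hazard_ratio, alpha=0.05, power=0.9, alloc=0.5):
    """Schoenfeld's approximation to the number of events needed for a
    two-arm log-rank comparison under proportional hazards."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return ceil((z_a + z_b) ** 2 / (alloc * (1 - alloc) * log(hazard_ratio) ** 2))

def required_patients(events, pooled_event_prob):
    """Translate required events into patients using a (re-estimated) pooled
    probability of observing an event by the analysis time."""
    return ceil(events / pooled_event_prob)

d = required_events(hazard_ratio=0.75, alpha=0.05, power=0.9)
# At an interim review the pooled event probability might be re-estimated
# blindly from accumulating (or extrapolated) data; 0.6 is a placeholder.
print(d, "events,", required_patients(d, pooled_event_prob=0.6), "patients")
```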