22 results for Random Number of Ancestors

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Abstract:

Random number generation (RNG) is a functionally complex process that is highly controlled and therefore dependent on Baddeley's central executive. This study addresses this claim by investigating whether key predictions from this framework are compatible with empirical data. In Experiment 1, the effect of increasing task demands by increasing the rate of paced generation was comprehensively examined. As expected, faster rates affected performance negatively because central resources were increasingly depleted. Next, the effect of participants' exposure was manipulated in Experiment 2 by providing increasing amounts of practice on the task. There was no improvement over 10 practice trials, suggesting that the high level of strategic control required by the task was constant and not amenable to any automatization gain with repeated exposure. Together, the results demonstrate that RNG performance is a highly controlled and demanding process that is sensitive to additional demands on central resources (Experiment 1) and unaffected by repeated performance or practice (Experiment 2). These features render the easily administered RNG task an ideal and robust index of executive function that is highly suitable for repeated clinical use.
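
Performance on such generation tasks is usually scored with a randomness index computed from the produced sequence. As a minimal, hedged sketch (the abstract does not list the study's dependent measures; Evans' RNG score is simply one widely used choice):

```python
import math
from collections import Counter

def evans_rng_index(responses):
    """Evans' RNG score: digram redundancy divided by single-response
    redundancy. Values near 0 suggest random-looking output; values
    near 1 indicate a highly stereotyped sequence."""
    singles = Counter(responses)
    digrams = Counter(zip(responses, responses[1:]))
    numerator = sum(n * math.log(n) for n in digrams.values() if n > 1)
    denominator = sum(n * math.log(n) for n in singles.values() if n > 1)
    return numerator / denominator if denominator else 0.0

# A counting-like, stereotyped sequence scores close to 1:
print(evans_rng_index([1, 2, 3, 1, 2, 3, 1, 2, 3, 1, 2, 3]))
```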

Relevance:

100.00%

Abstract:

A parallel hardware random number generator for use with a VLSI genetic algorithm processing device is proposed. The design uses a systolic array of mixed congruential random number generators. The generators are constantly reseeded with the outputs of the preceding generators to avoid significant bias in the randomness of the array, which would result in longer times for the algorithm to converge to a solution.

1 Introduction

In recent years there has been a growing interest in developing hardware genetic algorithm devices [1, 2, 3]. A genetic algorithm (GA) is a stochastic search and optimization technique which attempts to capture the power of natural selection by evolving a population of candidate solutions through a process of selection and reproduction [4]. In keeping with the evolutionary analogy, the solutions are called chromosomes, with each chromosome containing a number of genes. Chromosomes are commonly simple binary strings, the bits being the genes.
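
The reseeding idea translates directly into software. Below is a minimal Python sketch, not the paper's hardware: the LCG constants, array size, and XOR reseeding rule are illustrative assumptions standing in for the systolic dataflow of the proposed design.

```python
# Illustrative LCG constants (Numerical Recipes); the paper's parameters,
# array size, and exact reseeding rule are hardware design choices not shown here.
A, C, M = 1664525, 1013904223, 2**32

class MixedCongruential:
    """Mixed (c != 0) linear congruential generator: x' = (a*x + c) mod m."""
    def __init__(self, seed):
        self.state = seed % M

    def next(self):
        self.state = (A * self.state + C) % M
        return self.state

class GeneratorArray:
    """Linear array of LCGs in which each generator is reseeded from its
    predecessor's output after every step, so the parallel streams do not
    remain correlated."""
    def __init__(self, size, seed=12345):
        self.gens = [MixedCongruential(seed + 9973 * i) for i in range(size)]

    def step(self):
        outputs = [g.next() for g in self.gens]
        for i in range(1, len(self.gens)):
            self.gens[i].state ^= outputs[i - 1]   # reseed from predecessor
        return outputs

array = GeneratorArray(8)
print(array.step())
```

Without the reseeding step, generators initialised with nearby seeds would emit strongly correlated streams, which is the bias the design aims to avoid.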

Relevance:

100.00%

Abstract:

This paper investigates random number generators in stochastic iteration algorithms that require infinite uniform sequences. We take a simple model of the general transport equation and solve it with the application of a linear congruential generator, the Mersenne Twister, the Mother-of-All generator, and a true random number generator based on quantum effects. With this simple model we show that, for reasonably contractive operators, sequences that are theoretically not infinite-uniform also perform well. Finally, we demonstrate the power of stochastic iteration for the solution of the light transport problem.
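
As a toy analogue of such stochastic iteration (a one-dimensional contraction rather than the paper's transport model, with Python's built-in Mersenne Twister standing in for the generators compared), the sketch below estimates the fixed point of x = b + qx by averaging random walks whose length is a geometrically distributed random number of steps:

```python
import random

def estimate_fixed_point(b=1.0, q=0.5, samples=100_000, uniform=random.random):
    """Monte Carlo estimate of x* = b / (1 - q), the fixed point of the
    contractive equation x = b + q*x: each sample is a random walk that
    survives with probability q at every step and adds b each time."""
    total = 0.0
    for _ in range(samples):
        contribution = b
        while uniform() < q:          # continue the walk with probability q
            contribution += b
        total += contribution
    return total / samples

print(estimate_fixed_point())  # should be close to b / (1 - q) = 2.0
```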

Relevance:

100.00%

Abstract:

The problem of calculating the probability of error in a DS/SSMA system has been extensively studied for more than two decades. When random sequences are employed, some conditioning must be done before the application of the central limit theorem is attempted, leading to a Gaussian distribution. The authors seek to characterise the multiple access interference as a random walk with a random number of steps, for random and deterministic sequences. Using results from random-walk theory, they model the interference as a K-distributed random variable and use it to calculate the probability of error in the form of a series, for a DS/SSMA system with a coherent correlation receiver and BPSK modulation under Gaussian noise. The asymptotic properties of the proposed distribution agree with other analyses. This is, to the best of the authors' knowledge, the first attempt to propose a non-Gaussian distribution for the interference. The modelling can be extended to consider multipath fading and general modulation.
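
For intuition only, a K-distributed amplitude is commonly simulated through its compound representation: a Rayleigh (speckle) variate whose local power is Gamma-distributed (texture). The sketch below uses that textbook construction with illustrative parameters; it is not drawn from the paper:

```python
import math
import random

def sample_k_amplitude(shape=1.5, mean_power=1.0, rng=random):
    """One draw from a K-distributed amplitude via the compound model:
    a Gamma-distributed local power (texture) modulating a Rayleigh
    variate (speckle). Parameter values here are assumptions."""
    power = rng.gammavariate(shape, mean_power / shape)        # texture
    rayleigh = math.sqrt(-2.0 * math.log(1.0 - rng.random()))  # unit Rayleigh
    return math.sqrt(power / 2.0) * rayleigh

print([round(sample_k_amplitude(), 3) for _ in range(5)])
```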

Relevance:

100.00%

Abstract:

None of the current surveillance streams monitoring the presence of scrapie in Great Britain provides a comprehensive and unbiased estimate of the prevalence of the disease at the holding level. Previous work to estimate the under-ascertainment adjusted prevalence of scrapie in Great Britain applied multiple-list capture–recapture methods. The enforcement of new control measures on scrapie-affected holdings in 2004 has stopped the overlap between surveillance sources and, hence, the application of multiple-list capture–recapture models. Alternative methods, still within the capture–recapture methodology but relying on repeated entries in one single list, have been suggested for these situations. In this article, we apply one-list capture–recapture approaches to data held on the Scrapie Notifications Database to estimate the undetected population of scrapie-affected holdings with clinical disease in Great Britain for the years 2002, 2003, and 2004. To do so, we develop a new diagnostic tool for the indication of heterogeneity as well as a new understanding of the Zelterman and Chao lower bound estimators to account for potential unobserved heterogeneity. We demonstrate that the Zelterman estimator can be viewed as a maximum likelihood estimator for a special, locally truncated Poisson likelihood equivalent to a binomial likelihood. This understanding allows the extension of the Zelterman approach by means of logistic regression to include observed heterogeneity in the form of covariates; in the case studied here, the holding size and country of origin. Our results confirm the presence of substantial unobserved heterogeneity, supporting the application of our two estimators. The total scrapie-affected holding population in Great Britain is around 300 holdings per year. None of the covariates appear to inform the model significantly.
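
Both estimators named above have simple closed forms driven by the frequencies of repeated notifications. A minimal sketch of the standard textbook forms, with hypothetical counts rather than data from the article:

```python
import math

def zelterman(n_observed, f1, f2):
    """Zelterman's estimator: a Poisson rate is fitted from the counts of
    holdings notified exactly once (f1) and exactly twice (f2); the observed
    count is then inflated by the estimated probability of detection."""
    lam = 2.0 * f2 / f1
    return n_observed / (1.0 - math.exp(-lam))

def chao_lower_bound(n_observed, f1, f2):
    """Chao's lower bound for the total population size."""
    return n_observed + f1 ** 2 / (2.0 * f2)

# Hypothetical counts (not from the article): 160 distinct holdings notified,
# of which 120 appeared once and 25 twice.
print(zelterman(160, 120, 25), chao_lower_bound(160, 120, 25))
```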

Relevance:

100.00%

Abstract:

Field experiments were carried out to assess the effects of nitrogen fertilization and seed rate on the Hagberg falling number (HFN) of commercial wheat hybrids and their parents. Applying nitrogen (200 kg N ha⁻¹) increased HFN in two successive years. The HFN of the hybrid Hyno Esta was lower than either of its parents (Estica and Audace), particularly when nitrogen was not applied. Treatment effects on HFN were negatively associated with α-amylase activity. Phadebas grain blotting suggested two populations of grains with different types of α-amylase activity: Estica appeared to have a high proportion of grains with low levels of late maturity endosperm α-amylase activity (LMEA); Audace had a few grains showing high levels of germination amylase; and the hybrid, Hyno Esta, combined the sources from both parents to show heterosis for α-amylase activity. Applying nitrogen reduced both apparent LMEA and germination amylase. The effects on LMEA were associated with the size and disruption of the grain cavity, which was greater in Hyno Esta and Estica and in zero-nitrogen treatments. External grain morphology failed to explain much of the variation in LMEA and cavity size, but there was a close negative correlation between cavity size and protein content. Applying nitrogen increased post-harvest dormancy of the grain. Dormancy was greatest in Estica and least in Audace. It is proposed that effects of seed rate, genotype and nitrogen fertilizer on HFN are mediated through factors affecting the size and disruption of the grain cavity and therefore LMEA, and through factors affecting dormancy and therefore germination amylase. © 2004 Society of Chemical Industry.

Relevance:

100.00%

Abstract:

This paper reviews Bayesian procedures for phase 1 dose-escalation studies and compares different dose schedules and cohort sizes. The methodology described is motivated by the situation of phase 1 dose-escalation studies in oncology, that is, a single dose administered to each patient, with a single binary response ("toxicity" or "no toxicity") observed. It is likely that a wider range of applications of the methodology is possible. In this paper, results from 10,000-fold simulation runs conducted using the software package Bayesian ADEPT are presented. Four designs were compared under six scenarios. The simulation results indicate that there are slight advantages in having more dose levels and smaller cohort sizes.
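
For a flavour of how such comparisons are run, here is a deliberately bare-bones Bayesian dose-escalation simulation. It is not the ADEPT implementation: the one-parameter power model, grid prior, skeleton, scenario, target, and absence of dose-skipping rules are all simplifying assumptions.

```python
import random

SKELETON = [0.05, 0.12, 0.25, 0.40, 0.55]     # assumed prior toxicity guesses
TRUE_TOX = [0.03, 0.10, 0.22, 0.45, 0.60]     # one assumed "truth" scenario
TARGET = 0.25                                 # assumed target toxicity rate
THETAS = [0.2 + 0.05 * i for i in range(60)]  # grid of model exponents

def run_trial(cohort_size=3, n_patients=24):
    """Simulate one trial: treat cohorts, update a grid posterior over the
    power-model exponent theta (p_tox = skeleton ** theta), and give the
    next cohort the dose whose modelled toxicity is nearest the target.
    No dose-skipping restriction is imposed, for brevity."""
    dose, data = 0, []
    while len(data) < n_patients:
        for _ in range(cohort_size):
            data.append((dose, random.random() < TRUE_TOX[dose]))
        weights = []
        for theta in THETAS:                  # flat prior, grid likelihood
            like = 1.0
            for d, tox in data:
                p = SKELETON[d] ** theta
                like *= p if tox else 1.0 - p
            weights.append(like)
        z = sum(weights)
        theta_hat = sum(t * w / z for t, w in zip(THETAS, weights))
        dose = min(range(len(SKELETON)),
                   key=lambda d: abs(SKELETON[d] ** theta_hat - TARGET))
    return dose  # recommended dose level (0-indexed)

print(run_trial(cohort_size=3))
```

Repeating run_trial over many simulated trials and scenarios, and varying cohort_size and the number of dose levels, reproduces the kind of design comparison the paper reports.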

Relevance:

100.00%

Abstract:

Background: Severe malarial anaemia is a major complication of malaria infection and is multifactorial, resulting from the loss of circulating red blood cells (RBCs) through parasite replication as well as from immune-mediated mechanisms. An understanding of the causes of severe malarial anaemia is necessary to develop and implement new therapeutic strategies to tackle this syndrome of malaria infection. Methods: Using analysis of variance, this work investigated whether parasite destruction of RBCs always accounts for the severity of malarial anaemia during infections of the rodent malaria model Plasmodium chabaudi in mice of a BALB/c background. Differences in anaemia between two different clones of P. chabaudi were also examined. Results: Circulating parasite numbers were not correlated with the severity of anaemia in either BALB/c mice or, under more severe conditions of anaemia, in BALB/c RAG2-deficient mice (lacking T and B cells). Mice infected with P. chabaudi clone CB suffered more severe anaemia than mice infected with clone AS, but this was not correlated with the number of parasites in the circulation. Instead, the peak percentage of parasitized RBCs was higher in CB-infected animals than in AS-infected animals and was correlated with the severity of anaemia, suggesting that the availability of uninfected RBCs was impaired in CB-infected animals. Conclusion: This work shows that parasite numbers are a more relevant measure of parasite levels in P. chabaudi infection than % parasitaemia, a measure that does not take anaemia into account. The lack of correlation between parasite numbers and the drop in circulating RBCs in this experimental model of malaria supports a role for the host response in the impairment or destruction of uninfected RBCs in P. chabaudi infections, and thus in the development of acute anaemia in this malaria model.
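
The distinction between the two measures is simple arithmetic: % parasitaemia divides parasite numbers by circulating RBCs, so anaemia shrinks the denominator and inflates the percentage even when the absolute parasite number is unchanged. A toy illustration with assumed counts:

```python
# Assumed illustrative counts, not data from the study.
parasitized = 1.0e6                                   # parasitized RBCs per uL
total_rbc = {"non-anaemic": 8.0e6, "anaemic": 4.0e6}  # circulating RBCs per uL

for condition, rbc in total_rbc.items():
    print(f"{condition}: {100.0 * parasitized / rbc:.1f}% parasitaemia")
# Same absolute parasite number, but the anaemic host shows twice the
# % parasitaemia because the denominator has halved.
```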

Relevance:

100.00%

Abstract:

This paper describes a method for reconstructing 3D frontier points, contour generators and surfaces of anatomical objects or smooth surfaces from a small number, e.g. 10, of conventional 2D X-ray images. The X-ray images are taken from different viewing directions with full prior knowledge of the X-ray source and sensor configurations. Unlike previous works, we empirically demonstrate that if the viewing directions are uniformly distributed around the object's viewing sphere, then the reconstructed 3D points automatically cluster closely on highly curved parts of the surface and are widely spread on smooth or flat parts. The advantage of this property is that the reconstructed points along a surface or a contour generator are not under-sampled or under-represented, because surfaces or contours should be sampled or represented more densely where their curvature is high. The more complex the contour's shape, the greater the number of points required, and the greater the number of points automatically generated by the proposed method. Given that the number of viewing directions is fixed and the viewing directions are uniformly distributed, the number and distribution of the reconstructed points depend on the shape or the curvature of the surface, regardless of the size of the surface or the size of the object. The technique may be used not only in medicine but also in industrial applications.
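
One standard way to spread a small number of viewing directions near-uniformly over a sphere is the Fibonacci (golden-angle) lattice; the sketch below is an assumed stand-in for however the paper actually placed its directions:

```python
import math

def fibonacci_sphere_directions(n=10):
    """Spread n unit viewing directions near-uniformly over the sphere
    using the Fibonacci (golden-angle) lattice."""
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))
    directions = []
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n        # uniform spacing in z
        r = math.sqrt(1.0 - z * z)           # radius of the z-slice
        phi = golden_angle * i               # rotate by the golden angle
        directions.append((r * math.cos(phi), r * math.sin(phi), z))
    return directions

for d in fibonacci_sphere_directions(10):
    print(tuple(round(c, 3) for c in d))
```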

Relevance:

100.00%

Abstract:

A new approach is presented for identifying the number of incoming signals in antenna array processing. The new method exploits the inherent properties of the noise eigenvalues of the covariance matrix of the array output. A single threshold has been established that incorporates information about the signal and noise strength, data length, and array size. When subspace-based algorithms are adopted, the computational cost of the signal number detector is almost negligible. The performance of the threshold is robust against low SNR and short data length.
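
The general eigenvalue-thresholding idea can be sketched as follows; note that threshold_factor is an illustrative placeholder, whereas the paper derives a single threshold that folds in signal and noise strength, data length, and array size:

```python
import numpy as np

def estimate_num_signals(snapshots, threshold_factor=2.0):
    """Count sample-covariance eigenvalues that rise above a noise-level
    threshold. snapshots: (num_sensors, num_samples) complex array output.
    threshold_factor is an assumed placeholder, not the paper's threshold."""
    _, n = snapshots.shape
    R = snapshots @ snapshots.conj().T / n   # sample covariance matrix
    eigvals = np.linalg.eigvalsh(R)          # real, in ascending order
    noise_level = eigvals[0]                 # smallest ~ noise power
    return int(np.sum(eigvals > threshold_factor * noise_level))

# Synthetic check: two sources on an 8-sensor array, 200 snapshots.
rng = np.random.default_rng(0)
steering = rng.standard_normal((8, 2)) + 1j * rng.standard_normal((8, 2))
signals = rng.standard_normal((2, 200)) + 1j * rng.standard_normal((2, 200))
noise = 0.3 * (rng.standard_normal((8, 200)) + 1j * rng.standard_normal((8, 200)))
print(estimate_num_signals(steering @ signals + noise))  # expect 2
```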

Relevance:

100.00%

Abstract:

Background: Compared with the postprandial events after a single meal, different events occur when a second meal is ingested 4–6 h after a first meal. There is a rapid appearance of chylomicrons in the circulation carrying fat ingested with the first meal, with a peak 1 h after the second meal. Objective: Our goal was to examine whether different dietary oils have effects on the storage of triacylglycerol as a result of differences in their digestion, absorption, and incorporation into chylomicrons. Design: A single-blind, randomized, within-subject crossover design was used to study the effects of palm oil, safflower oil, a mixture of fish and safflower oil, and olive oil on postprandial apolipoprotein (apo) B-48, retinyl ester, and triacylglycerol in the Sf > 400 fraction with the use of a sequential meal protocol. Results: For triacylglycerol, retinyl ester, and apo B-48, the time to reach peak concentration was significantly earlier after the second meal than after the first meal (P < 0.005). This was apparent with each of the dietary oils. The pattern of the apo B-48 response differed significantly among the dietary oils, with olive oil resulting in higher concentrations after both meals (P = 0.003). The ratio of triacylglycerol to apo B-48 was significantly lower after olive oil feeding than after feeding with the other oils (P = 0.02). Conclusions: The rapid entry of chylomicrons after the ingestion of a second meal 5 h after a first meal was seen with all of the oils investigated. The short-term ingestion of olive oil produced more chylomicrons than did the other dietary oils, which may have been due to differences in the metabolic handling of olive oil within the gut.