932 results for Random sums


Relevance: 20.00%

Abstract:

Radiation metabolomics employing mass spectral technologies represents a plausible means of high-throughput, minimally invasive radiation biodosimetry. A simplified metabolomics protocol is described that employs ubiquitous gas chromatography-mass spectrometry and open-source software, including the random forests machine-learning algorithm, to uncover latent biomarkers of 3 Gy gamma radiation in rats. Urine was collected from six male Wistar rats and six sham-irradiated controls for 7 days: 4 before irradiation and 3 after. Water and food consumption, urine volume, body weight, and sodium, potassium, calcium, chloride, phosphate and urea excretion showed major effects of exposure to gamma radiation. The metabolomics protocol uncovered several urinary metabolites that were significantly up-regulated (glyoxylate, threonate, thymine, uracil, p-cresol) or down-regulated (citrate, 2-oxoglutarate, adipate, pimelate, suberate, azelaate) as a result of radiation exposure. Thymine and uracil were shown to derive largely from thymidine and 2'-deoxyuridine, which are known radiation biomarkers in the mouse. The radiation metabolomic phenotype in rats appeared to derive from oxidative stress and effects on kidney function. Gas chromatography-mass spectrometry is a promising platform on which to develop the field of radiation metabolomics further and to assist in the design of instrumentation for detecting the biological consequences of environmental radiation release.

Abstract:

Monte Carlo simulation was used to evaluate properties of a simple Bayesian MCMC analysis of the random-effects model for single-group Cormack-Jolly-Seber capture-recapture data. The MCMC method is applied to the model via a logit link, so the parameters p and S are on a logit scale, where logit(S) is assumed to have, and is generated from, a normal distribution with mean μ and variance σ². Marginal prior distributions on logit(p) and μ were independent normal with mean zero and standard deviation 1.75 for logit(p) and 100 for μ, hence minimally informative. The marginal prior distribution on σ² was placed on τ² = 1/σ² as a gamma distribution with α = β = 0.001. The study design has 432 points spread over 5 factors: occasions (t), new releases per occasion (u), p, μ, and σ. At each design point 100 independent trials were completed (hence 43,200 trials in total), each with sample size n = 10,000 from the parameter posterior distribution. At 128 of these design points, comparisons are made to previously reported results from a method-of-moments procedure. We looked at properties of point and interval inference on μ and σ based on the posterior mean, median, and mode and the equal-tailed 95% credibility interval. Bayesian inference did very well for the parameter μ, but under the conditions used here, MCMC inference performance for σ was mixed: poor for sparse data (i.e., only 7 occasions) or σ = 0, but good when there were sufficient data and σ was not small.
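The random-effects layer of this model can be sketched as a conjugate Gibbs sampler using the stated priors; as a simplifying assumption (not the study's actual MCMC), the capture-recapture likelihood is omitted and the logit-scale survival values are treated as directly observed:

```python
import numpy as np

rng = np.random.default_rng(1)
true_mu, true_sigma, n = 1.0, 0.5, 200
x = rng.normal(true_mu, true_sigma, n)        # "observed" logit(S) values

# Priors as in the study: mu ~ N(0, 100^2), precision ~ Gamma(0.001, rate 0.001)
a0, b0, prior_prec_mu = 0.001, 0.001, 1.0 / 100**2

mu, tau = 0.0, 1.0
draws = []
for it in range(5000):
    # mu | tau: conjugate normal update
    prec = n * tau + prior_prec_mu
    mu = rng.normal(tau * x.sum() / prec, prec ** -0.5)
    # tau | mu: conjugate gamma update (numpy's gamma takes scale = 1/rate)
    tau = rng.gamma(a0 + n / 2, 1.0 / (b0 + 0.5 * ((x - mu) ** 2).sum()))
    if it >= 1000:                             # discard burn-in
        draws.append((mu, tau ** -0.5))

post = np.array(draws)
print(post.mean(axis=0))   # posterior means of (mu, sigma)
```

With ample data the posterior concentrates near the generating values, consistent with the reported good performance for μ when data are not sparse.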

Abstract:

A physical random number generator based on the intrinsic randomness of quantum mechanics is described. The random events are realized by the choice of single photons between the two outputs of a beamsplitter. We present a simple device which minimizes the impact of the photon counters' noise, dead time and afterpulses.
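The beamsplitter choice can be mimicked classically to illustrate the principle; the von Neumann corrector shown here is a standard software debiasing trick (an assumption for illustration, not necessarily the authors' hardware approach to detector noise):

```python
import numpy as np

rng = np.random.default_rng(2)

def beamsplitter_bits(n, p_transmit=0.55):
    # Each photon exits one of two ports; unequal detector efficiencies
    # would bias p_transmit away from 0.5 (value here is hypothetical)
    return (rng.random(n) < p_transmit).astype(int)

def von_neumann(bits):
    # Debias: map bit pairs 01 -> 0 and 10 -> 1, discard 00 and 11;
    # both kept outcomes have probability p(1-p), so the output is unbiased
    pairs = bits[: len(bits) // 2 * 2].reshape(-1, 2)
    keep = pairs[:, 0] != pairs[:, 1]
    return pairs[keep, 0]

raw = beamsplitter_bits(100_000)
unbiased = von_neumann(raw)
print(raw.mean(), unbiased.mean())
```

The cost of debiasing is throughput: on average only 2p(1-p) of the raw bit pairs survive.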

Abstract:

In all European Union countries, chemical residues are required to be routinely monitored in meat. Good farming and veterinary practice can prevent the contamination of meat with pharmaceutical substances, resulting in a low detection rate of drug residues through random sampling. An alternative approach is to target-monitor farms suspected of treating their animals with antimicrobials. The objective of this project was to assess, using a stochastic model, the efficiency of these two sampling strategies. The model integrated data on Swiss livestock as well as expert opinion and results from studies conducted in Switzerland. Risk-based sampling showed an increase in detection efficiency of up to 100%, depending on the prevalence of contaminated herds. Sensitivity analysis of this model showed the importance of the accuracy of prior assumptions for conducting risk-based sampling. The resources gained by changing from random to risk-based sampling should be transferred to improving the quality of prior information.
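A minimal version of such a stochastic comparison, with entirely hypothetical prevalence, budget and risk-score assumptions (not the Swiss model's inputs), can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(3)
n_herds, prevalence = 10_000, 0.02
contaminated = rng.random(n_herds) < prevalence

# Hypothetical risk score: contaminated herds tend to score higher,
# reflecting imperfect prior information about antimicrobial use
risk = rng.beta(2, 5, n_herds) + 0.3 * contaminated

budget = 500  # herds that can be sampled under either strategy
random_sample = rng.choice(n_herds, budget, replace=False)
targeted_sample = np.argsort(risk)[-budget:]        # top-risk herds

hits_random = contaminated[random_sample].sum()
hits_targeted = contaminated[targeted_sample].sum()
print(hits_random, hits_targeted)
```

The gain from targeting depends directly on how informative the risk score is, mirroring the abstract's point about the accuracy of prior assumptions.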

Abstract:

We describe several simulation algorithms that yield random probability distributions with given values of risk measures. In the case of vanilla risk measures, the algorithms involve combining and transforming random cumulative distribution functions or random Lorenz curves obtained by simulating rather general random probability distributions on the unit interval. A new algorithm based on the simulation of a weighted barycentres array is suggested to generate random probability distributions with a given value of the spectral risk measure.
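One of the simpler ingredients — simulating a random discrete distribution on the unit interval and evaluating a spectral risk measure on it — might be sketched as follows (the grid approximation and the weight function φ(u) = 2u are illustrative choices, not the paper's barycentre algorithm):

```python
import numpy as np

rng = np.random.default_rng(4)

def random_distribution(k=100):
    # A random discrete distribution on [0, 1]:
    # sorted uniform atoms carrying Dirichlet(1, ..., 1) weights
    atoms = np.sort(rng.random(k))
    weights = rng.dirichlet(np.ones(k))
    return atoms, weights

def spectral_risk(atoms, weights, phi, grid=1000):
    # rho = integral over (0,1) of phi(u) * q(u) du, with q the quantile
    # function, approximated by a midpoint rule on a u-grid
    u = (np.arange(grid) + 0.5) / grid
    cdf = np.cumsum(weights)
    q = atoms[np.minimum(np.searchsorted(cdf, u), len(atoms) - 1)]
    return float(np.mean(phi(u) * q))

atoms, weights = random_distribution()
rho = spectral_risk(atoms, weights, phi=lambda u: 2.0 * u)  # phi integrates to 1
mean_ = spectral_risk(atoms, weights, phi=lambda u: np.ones_like(u))
print(mean_, rho)
```

Because an increasing φ overweights the upper tail of the quantile function, the spectral risk measure is never below the plain mean.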

Abstract:

Stochastic models for three-dimensional particles have many applications in the applied sciences. Lévy-based particle models are a flexible approach to particle modelling. The structure of the random particles is given by a kernel smoothing of a Lévy basis. The models are easy to simulate, but statistical inference procedures have not yet received much attention in the literature. The kernel is not always identifiable, and we suggest one approach to remedy this problem. We propose a method to draw inference about the kernel from data often used in local stereology and study the performance of our approach in a simulation study.
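A two-dimensional analogue of such a model — the radial function of a star-shaped particle obtained by kernel-smoothing a compound-Poisson basis on the circle — can be sketched as follows; the kernel shape, intensity and jump distribution are illustrative assumptions, not the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(5)

def particle_radius(theta, rate=20.0, kappa=8.0, base=1.0):
    # Radial function r(theta) of a star-shaped particle, built by
    # kernel-smoothing a compound-Poisson (Lévy) basis on the circle
    n = rng.poisson(rate)                      # number of basis points
    centres = rng.uniform(0, 2 * np.pi, n)     # basis point locations
    jumps = rng.exponential(0.05, n)           # positive jump sizes
    # periodic, von Mises-style kernel in the angular distance
    d = theta[:, None] - centres[None, :]
    k = np.exp(kappa * (np.cos(d) - 1.0))
    return base + k @ jumps

theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
r = particle_radius(theta)
print(r.min(), r.max())
```

Simulation is indeed straightforward; the inferential difficulty noted in the abstract is that different (kernel, basis) pairs can produce the same particle law.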

Abstract:

In this paper, we propose a fully automatic, robust approach for segmenting the proximal femur in conventional X-ray images. Our method is based on hierarchical landmark detection by random forest regression, where the detection results for 22 global landmarks are used for spatial normalization, and the detection results for 59 local landmarks serve as the image cue for instantiation of a statistical shape model of the proximal femur. To detect landmarks at both levels, we use multi-resolution HoG (Histogram of Oriented Gradients) features, which achieve better accuracy and robustness. The efficacy of the present method is demonstrated by experiments conducted on 150 clinical X-ray images. It was found that the present method could achieve an average point-to-curve error of 2.0 mm and that it was robust to low image contrast, noise and occlusions caused by implants.
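The regression step can be illustrated on a toy problem — predicting the position of an edge (a one-dimensional "landmark") from a minimal HoG-style descriptor. This is a sketch only: real HoG adds block normalisation, and the paper's hierarchical multi-resolution scheme is not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(6)

def hog_descriptor(patch, cell=4, n_bins=9):
    # Minimal HoG-style feature: per-cell orientation histograms
    # weighted by gradient magnitude
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx) % np.pi
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
    feats = []
    for i in range(0, patch.shape[0], cell):
        for j in range(0, patch.shape[1], cell):
            feats.append(np.bincount(bins[i:i + cell, j:j + cell].ravel(),
                                     weights=mag[i:i + cell, j:j + cell].ravel(),
                                     minlength=n_bins))
    return np.concatenate(feats)

def make_patch(offset):
    # Synthetic patch: a vertical edge at column 8 + offset is the "landmark"
    patch = np.zeros((16, 16))
    patch[:, 8 + offset:] = 1.0
    return patch + rng.normal(0.0, 0.05, patch.shape)

offsets = rng.integers(-4, 5, 300)
X = np.array([hog_descriptor(make_patch(int(o))) for o in offsets])
forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, offsets)
pred = forest.predict(hog_descriptor(make_patch(0)).reshape(1, -1))[0]
print(pred)   # should land near the true offset of 0
```

In the paper the same idea operates per landmark on real radiographs, with the predicted positions feeding the statistical shape model.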

Abstract:

Knowledge of landmarks and contours in anteroposterior (AP) pelvis X-rays is invaluable for computer-aided diagnosis, hip surgery planning and image-guided interventions. This paper presents a fully automatic and robust approach for landmarking and segmentation of both the pelvis and the femur in a conventional AP X-ray. Our approach is based on random forest regression and hierarchical sparse shape composition. Experiments conducted on 436 clinical AP pelvis X-rays show that our approach achieves an average point-to-curve error of around 1.3 mm for the femur and 2.2 mm for the pelvis, both with success rates of around 98%. Compared to existing methods, our approach exhibits better performance in both robustness and accuracy.
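The point-to-curve error used for evaluation can be computed as the mean distance from each predicted point to its nearest segment of the ground-truth contour polyline; a minimal implementation:

```python
import numpy as np

def point_to_curve_error(points, curve):
    # Mean distance from each point to the nearest segment of a polyline
    def point_to_segment(p, a, b):
        ab, ap = b - a, p - a
        t = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
        return np.linalg.norm(p - (a + t * ab))
    dists = [min(point_to_segment(p, curve[i], curve[i + 1])
                 for i in range(len(curve) - 1)) for p in points]
    return float(np.mean(dists))

# Toy check: two points lying 1 px above a horizontal contour
curve = np.array([[0.0, 0.0], [10.0, 0.0], [20.0, 0.0]])
points = np.array([[2.0, 1.0], [12.0, 1.0]])
print(point_to_curve_error(points, curve))  # → 1.0
```

In practice the same metric is evaluated in millimetres using the image's pixel spacing.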

Abstract:

Perceptual learning is a training-induced improvement in performance. Mechanisms underlying the perceptual learning of depth discrimination in dynamic random-dot stereograms were examined by assessing stereothresholds as a function of decorrelation. The inflection point of the decorrelation function was defined as the level of decorrelation corresponding to 1.4 times the threshold at 0% decorrelation. In general, stereothresholds increased with increasing decorrelation. Following training, stereothresholds and standard errors of measurement decreased systematically for all tested decorrelation values. Post-training decorrelation functions were reduced by a multiplicative constant (approximately 5), exhibiting changes in stereothresholds without changes in the inflection points. Disparity energy model simulations indicate that a post-training reduction in neuronal noise is sufficient to account for the perceptual learning effects. In two subjects, the learning effects were retained over a period of six months, which may have application for training stereo-deficient subjects.
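The claim that a purely multiplicative threshold change leaves the inflection point fixed can be checked numerically with a toy threshold-versus-decorrelation function (the functional form and parameter values below are hypothetical, not fitted to the study's data):

```python
import numpy as np

def threshold(decorr, t0=20.0, d0=60.0, n=2.0):
    # Toy stereothreshold (arcsec) as a function of decorrelation (%)
    return t0 * (1.0 + (decorr / d0) ** n)

def inflection(t_func):
    # Decorrelation level where the threshold reaches 1.4x its value at 0%
    d = np.linspace(0.0, 100.0, 100001)
    t = t_func(d)
    return d[np.argmax(t >= 1.4 * t[0])]

pre = inflection(threshold)
post = inflection(lambda d: threshold(d) / 5.0)  # learning: divide by ~5
print(pre, post)
```

Dividing the whole function by a constant rescales both the threshold and its 0% reference, so the 1.4x criterion is met at the same decorrelation level before and after training.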