928 results for workflow variance
Abstract:
Bounds on the expectation and variance of errors at the output of a multilayer feedforward neural network with perturbed weights and inputs are derived. It is assumed that errors in weights and inputs to the network are statistically independent and small. The bounds obtained are applicable to both digital and analogue network implementations and are shown to be of practical value.
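Not part of the paper, but the setting it describes can be illustrated numerically. Under the paper's assumptions (small, statistically independent errors in weights and inputs), the first-order variance of a single linear unit's output is the sum of the input-scaled weight-error variance and the weight-scaled input-error variance. The sketch below checks that approximation against Monte Carlo simulation; all sizes and noise levels are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=8)          # nominal weights of one linear unit
x = rng.normal(size=8)          # nominal inputs
sw, sx = 1e-3, 1e-3             # std dev of small weight/input errors

# First-order analytic variance of y = (w + dw) . (x + dx)
# under independent zero-mean errors dw, dx
var_analytic = np.sum(x**2) * sw**2 + np.sum(w**2) * sx**2

# Monte Carlo estimate of the same quantity
n = 200_000
dw = rng.normal(scale=sw, size=(n, 8))
dx = rng.normal(scale=sx, size=(n, 8))
y = ((w + dw) * (x + dx)).sum(axis=1)
var_mc = y.var()
print(var_analytic, var_mc)
```

The second-order cross term (dw . dx) contributes only O(sw^2 * sx^2), which is why the first-order bound is tight when perturbations are small.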
Abstract:
This paper presents a new metric, which we call the lighting variance ratio, for quantifying descriptors in terms of their variance to illumination changes. In many applications it is desirable to have descriptors that are robust to changes in illumination, especially in outdoor environments. The lighting variance ratio is useful for comparing descriptors and determining if a descriptor is lighting invariant enough for a given environment. The metric is analysed across a number of datasets, cameras and descriptors. The results show that the upright SIFT descriptor is typically the most lighting invariant descriptor.
Abstract:
Twin studies are a major research direction in imaging genetics, a new field that combines algorithms from quantitative genetics and neuroimaging to assess genetic effects on the brain. In twin imaging studies, it is common to estimate the intraclass correlation (ICC), which measures the resemblance between twin pairs for a given phenotype. In this paper, we extend the commonly used Pearson correlation to a more appropriate definition based on restricted maximum likelihood (REML) methods. We computed the proportion of phenotypic variance due to additive (A) genetic factors and common (C) and unique (E) environmental factors using a new definition of the variance components in the diffusion tensor-valued signals. We applied our analysis to a dataset of Diffusion Tensor Images (DTI) from 25 identical and 25 fraternal twin pairs. Differences between the REML and Pearson estimators were plotted for different sample sizes, showing that the REML approach avoids severe biases when samples are smaller. Measures of genetic effects were computed for scalar and multivariate diffusion tensor derived measures, including the geodesic anisotropy (tGA) and the full diffusion tensors (DT), revealing voxel-wise genetic contributions to brain fiber microstructure.
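Not from the paper, but for orientation: a common textbook estimator of the ICC for paired data is the one-way ANOVA form, ICC = (MSB - MSW) / (MSB + MSW) for groups of size two, where MSB and MSW are the between-pair and within-pair mean squares. The sketch below computes it on synthetic "twin pair" data with a known shared component; the sample sizes and variances are illustrative, not the paper's, and this is the simple ANOVA estimator, not the REML approach the paper develops.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pairs = 25
pair_effect = rng.normal(scale=1.0, size=n_pairs)              # shared (pair-level) component
data = pair_effect[:, None] + rng.normal(scale=0.5, size=(n_pairs, 2))

grand = data.mean()
# One-way ANOVA mean squares for k pairs of size 2
msb = 2 * ((data.mean(axis=1) - grand) ** 2).sum() / (n_pairs - 1)          # between pairs
msw = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / n_pairs      # within pairs
icc = (msb - msw) / (msb + msw)
print(icc)
```

With shared variance 1.0 and individual variance 0.25, the true ICC is 0.8; the estimate fluctuates around that with only 25 pairs, which is exactly the small-sample regime where the paper argues REML outperforms the naive estimator.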
Abstract:
Aim Simulation forms an increasingly vital component of clinical skills development in a wide range of professional disciplines. Simulation of clinical techniques and equipment is designed to better prepare students for placement by providing an opportunity to learn technical skills in a “safe” academic environment. In radiotherapy training over the last decade or so this has predominantly comprised treatment planning software and small ancillary equipment such as mould room apparatus. Recent virtual reality developments have dramatically changed this approach. Innovative new simulation applications and file processing and interrogation software have helped to fill in the gaps to provide a streamlined virtual workflow solution. This paper outlines the innovations that have enabled this, along with an evaluation of the impact on students and educators. Method Virtual reality software and workflow applications have been developed to enable the following steps of radiation therapy to be simulated in an academic environment: CT scanning using a 3D virtual CT scanner simulation; batch CT duplication; treatment planning; 3D plan evaluation using a virtual linear accelerator; quantitative plan assessment; patient setup with lasers; and image guided radiotherapy software. Results Evaluation of the impact of the virtual reality workflow system highlighted substantial time saving for academic staff as well as positive feedback from students relating to preparation for clinical placements. Students valued practice in the “safe” environment and the opportunity to understand the clinical workflow ahead of clinical department experience. Conclusion Simulation of most of the radiation therapy workflow and tasks is feasible using a raft of virtual reality simulation applications and supporting software. Benefits of this approach include time-saving, embedding of a case-study based approach, increased student confidence, and optimal use of the clinical environment. Ongoing work seeks to determine the impact of simulation on clinical skills.
Abstract:
Gene expression is arguably the most important indicator of biological function. Thus identifying differentially expressed genes is one of the main aims of high-throughput studies that use microarray and RNAseq platforms to study deregulated cellular pathways. There are many tools for analysing differential gene expression from transcriptomic datasets. The major challenge of this topic is to estimate gene expression variance due to the high amount of ‘background noise’ that is generated from biological equipment and the lack of biological replicates. Bayesian inference has been widely used in the bioinformatics field. In this work, we reveal that the prior knowledge employed in the Bayesian framework also helps to improve the accuracy of differential gene expression analysis when using a small number of replicates. We have developed a differential analysis tool that uses Bayesian estimation of the variance of gene expression for use with small numbers of biological replicates. Our method is more consistent when compared to the widely used cyber-t tool that successfully introduced the Bayesian framework to differential analysis. We also provide a user-friendly web-based graphical user interface for biologists to use with microarray and RNAseq data. Bayesian inference can compensate for the instability of variance caused when using a small number of biological replicates by using pseudo replicates as prior knowledge. We also show that our new strategy to select pseudo replicates will improve the performance of the analysis.
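Not the paper's tool, but the general idea of stabilizing a small-sample variance with pseudo replicates can be sketched with the moderated-variance formula popularized by cyber-t: the posterior variance is a weighted average of a prior variance (treated as d0 pseudo-replicates) and the observed sample variance. The prior value and pseudo-replicate count below are assumed for illustration.

```python
import numpy as np

def moderated_variance(x, s0_sq, d0):
    """Shrink a small-sample variance estimate toward a prior s0_sq,
    weighting the prior as if it came from d0 pseudo-replicates
    (cyber-t-style moderation)."""
    n = len(x)
    s_sq = np.var(x, ddof=1)                       # ordinary sample variance
    return (d0 * s0_sq + (n - 1) * s_sq) / (d0 + n - 1)

x = np.array([2.1, 2.4, 1.9])   # three biological replicates (illustrative)
prior = 0.5                      # pooled variance from similar genes (assumed)
v = moderated_variance(x, s0_sq=prior, d0=5)
print(v)
```

With only three replicates the raw sample variance is highly unstable; the moderated estimate sits between the noisy per-gene value and the pooled prior, which is the stabilizing effect the abstract describes.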
Abstract:
Experiments in spintronics necessarily involve the detection of spin polarization. The sensitivity of this detection becomes an important factor to consider when extending the low temperature studies on semiconductor spintronic devices to room temperature, where the spin signal is weaker. In pump-probe experiments, which optically inject and detect spins, the sensitivity is often improved by using a photoelastic modulator (PEM) for lock-in detection. However, spurious signals can arise if diode lasers are used as optical sources in such experiments, along with a PEM. In this work, we eliminated the spurious electromagnetic coupling of the PEM onto the probe diode laser by the double modulation technique. We also developed a test for spurious modulated interference in the pump-probe signal due to the PEM. In addition, an order-of-magnitude enhancement in the sensitivity of detection of spin polarization by Kerr rotation, to 3×10⁻⁸ rad, was obtained by using the concept of Allan variance to optimally average the time series data over a period of 416 s. With these improvements, we are able to experimentally demonstrate at room temperature photoinduced steady-state spin polarization in bulk GaAs. Thus, the advances reported here facilitate the use of diode lasers with a PEM for sensitive pump-probe experiments. They also constitute a step toward detection of spin-injection in Si at room temperature.
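Not from the paper, but the Allan variance idea it invokes is standard: for averaging interval τ (here, a bin of m samples), the non-overlapping Allan variance is half the mean squared difference between consecutive bin averages, and for white noise it falls as 1/m, telling the experimenter how long averaging keeps paying off. A minimal sketch on synthetic white noise:

```python
import numpy as np

def allan_variance(x, m):
    """Non-overlapping Allan variance of x for a bin size of m samples."""
    n = len(x) // m
    bins = x[: n * m].reshape(n, m).mean(axis=1)   # block averages
    return 0.5 * np.mean(np.diff(bins) ** 2)       # half mean-square of successive differences

rng = np.random.default_rng(2)
white = rng.normal(size=100_000)                   # unit-variance white noise
a1 = allan_variance(white, 10)                     # expected ~ 1/10
a2 = allan_variance(white, 100)                    # expected ~ 1/100
print(a1, a2)
```

On real instrument data the curve eventually flattens or rises as drift dominates; the bin size at the minimum is the optimal averaging time, which is how the authors arrive at an optimal averaging period of 416 s.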
Abstract:
A modeling paradigm is proposed for covariate, variance and working correlation structure selection for longitudinal data analysis. Appropriate selection of covariates is pertinent to correct variance modeling and selecting the appropriate covariates and variance function is vital to correlation structure selection. This leads to a stepwise model selection procedure that deploys a combination of different model selection criteria. Although these criteria find a common theoretical root based on approximating the Kullback-Leibler distance, they are designed to address different aspects of model selection and have different merits and limitations. For example, the extended quasi-likelihood information criterion (EQIC) with a covariance penalty performs well for covariate selection even when the working variance function is misspecified, but EQIC contains little information on correlation structures. The proposed model selection strategies are outlined and a Monte Carlo assessment of their finite sample properties is reported. Two longitudinal studies are used for illustration.
Abstract:
The approach of generalized estimating equations (GEE) is based on the framework of generalized linear models but allows for specification of a working matrix for modeling within-subject correlations. The variance is often assumed to be a known function of the mean. This article investigates the impact of misspecifying the variance function on estimators of the mean parameters for quantitative responses. Our numerical studies indicate that (1) correct specification of the variance function can improve the estimation efficiency even if the correlation structure is misspecified; (2) misspecification of the variance function affects estimators for within-cluster covariates much more than those for cluster-level covariates; and (3) if the variance function is misspecified, correct choice of the correlation structure may not necessarily improve estimation efficiency. We illustrate the impact of different variance functions using a real data set on cow growth.
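Not from the article, but finding (1) has a familiar special case that can be simulated directly: when the response variance grows with the mean, weighting by the correct variance function yields a more efficient slope estimator than an unweighted fit, even though both are unbiased. The model and all parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 200, 2000
x = np.linspace(1, 5, n)
b_ols, b_wls = [], []
for _ in range(reps):
    mu = 2.0 * x                                   # true mean, slope 2
    y = mu + rng.normal(scale=0.3 * mu)            # Var(y) grows with the mean
    b_ols.append((x @ y) / (x @ x))                # unweighted fit (constant-variance assumption)
    w = 1.0 / mu**2                                # weights from the correct variance function
    b_wls.append(((w * x) @ y) / ((w * x) @ x))    # correctly weighted fit
print(np.var(b_ols), np.var(b_wls))
```

Both estimators center on the true slope, but across replications the correctly weighted estimator has visibly smaller variance, the GEE analogue being a correctly specified variance function in the working model.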
Abstract:
Organisations are always focussed on ensuring that their business operations are performed in the most cost-effective manner, and that processes are responsive to ever-changing cost pressures. In many organisations, however, strategic cost-based decisions at the managerial level are not directly or quickly translatable to process-level operational support. A primary reason for this disconnect is the limited system-based support for cost-informed decisions at the process-operational level in real time. In this paper, we describe the different ways in which a workflow management system (WfMS) can support process-related decisions, guided by cost-informed considerations at the operational level, during execution. As a result, cost information is elevated from its non-functional attribute role to a first-class, fully functional process perspective. The paper defines success criteria that a WfMS should meet to provide such support, and discusses a reference implementation within the YAWL workflow environment that demonstrates how the various types of cost-informed decision rules are supported, using an illustrative example.
Abstract:
A recent theoretical model developed by Imparato et al. of the experimentally measured heat and work effects produced by the thermal fluctuations of single micron-sized polystyrene beads in stationary and moving optical traps has proved to be quite successful in rationalizing the observed experimental data. The model, based on the overdamped Brownian dynamics of a particle in a harmonic potential that moves at a constant speed under a time-dependent force, is used to obtain an approximate expression for the distribution of the heat dissipated by the particle at long times. In this paper, we generalize the above model to consider particle dynamics in the presence of colored noise, without passing to the overdamped limit, as a way of modeling experimental situations in which the fluctuations of the medium exhibit long-lived temporal correlations, of the kind characteristic of polymeric solutions, for instance, or of similar viscoelastic fluids. Although we have not been able to find an expression for the heat distribution itself, we do obtain exact expressions for its mean and variance, both for the static and for the moving trap cases. These moments are valid for arbitrary times and they also hold in the inertial regime, but they reduce exactly to the results of Imparato et al. in appropriate limits. DOI: 10.1103/PhysRevE.80.011118
Abstract:
Self-tuning is applied to the control of nonlinear systems represented by the Hammerstein model, wherein the nonlinearity is any odd-order polynomial. However, control costing is not feasible in general, so initial relay control is employed to contain the deviations.