11 results for Ratio of normal random variables

at Universidad Politécnica de Madrid


Relevance:

100.00%

Publisher:

Abstract:

Monte Carlo techniques, which require the generation of samples from some target density, are often the only alternative for performing Bayesian inference. Two classic sampling techniques to draw independent samples are the ratio of uniforms (RoU) and rejection sampling (RS). An efficient sampling algorithm is proposed combining the RoU and polar RS (i.e. RS inside a sector of a circle using polar coordinates). Its efficiency is shown in drawing samples from truncated Cauchy and Gaussian random variables, which have many important applications in signal processing and communications. In summary, an efficient method is obtained for generating some random variables in common use in signal processing and communications (for example, truncated Gaussian or Cauchy variables) by combining the ratio-of-uniforms and rejection sampling techniques.
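The RoU/polar-RS combination is the paper's contribution and is not reproduced here; as a simpler, generic illustration of rejection sampling for a truncated standard Gaussian (plain uniform proposal, not the proposed method), a minimal sketch:

```python
import random
import math

def truncated_gaussian(a, b, n=1000, seed=0):
    """Draw n samples from a standard Gaussian truncated to [a, b]
    via plain rejection sampling with a uniform proposal on [a, b]."""
    rng = random.Random(seed)
    # The Gaussian density is maximised on [a, b] at the point closest to 0.
    mode = min(max(0.0, a), b)
    fmax = math.exp(-0.5 * mode * mode)
    samples = []
    while len(samples) < n:
        x = rng.uniform(a, b)           # candidate from the proposal
        u = rng.uniform(0.0, fmax)      # vertical coordinate under the envelope
        if u <= math.exp(-0.5 * x * x): # accept if below the target density
            samples.append(x)
    return samples

draws = truncated_gaussian(1.0, 3.0)
print(min(draws) >= 1.0 and max(draws) <= 3.0)  # True: all samples in [1, 3]
```

The acceptance rate of this naive scheme degrades for narrow or far-tail truncation intervals, which is precisely the inefficiency the RoU/polar-RS combination addresses.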

Relevance:

100.00%

Publisher:

Abstract:

The mechanical properties of the aortic wall, both healthy and pathological, are needed in order to develop and improve diagnostic and interventional criteria, and for the development of mechanical models to assess arterial integrity. This study focuses on the mechanical behaviour and rupture conditions of the human ascending aorta and its relationship with age and pathologies. Fresh ascending aortic specimens harvested from 23 healthy donors, 12 patients with bicuspid aortic valve (BAV) and 14 with aneurysm were tensile-tested in vitro under physiological conditions. Tensile strength, stretch at failure and elbow stress were measured. The results showed that age causes a major reduction in the mechanical parameters of healthy ascending aortic tissue, and that no significant differences were found between the mechanical strength of aneurysmal or BAV aortic specimens and the corresponding age-matched control group. The physiological level of the stress in the circumferential direction was also computed to assess the physiological operation range of healthy and diseased ascending aortas. The mean physiological wall stress acting on pathologic aortas was found to be far from rupture, with factors of safety (defined as the ratio of tensile strength to the mean wall stress) larger than six. In contrast, the physiological operation of pathologic vessels lies in the stiff part of the response curve, so that they lose part of their function of damping the pressure waves from the heart.
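The factor of safety quoted above is a simple ratio; a minimal sketch with purely illustrative numbers (not values measured in the study):

```python
def factor_of_safety(tensile_strength_kpa, mean_wall_stress_kpa):
    """Factor of safety as defined in the study:
    tensile strength divided by the mean physiological wall stress."""
    return tensile_strength_kpa / mean_wall_stress_kpa

# Illustrative values only (not from the study):
fos = factor_of_safety(1200.0, 150.0)
print(fos > 6.0)  # True: above the threshold of six reported for pathologic aortas
```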

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study was to determine the effect of animal management and farm facilities on total feed intake (TFI), feed conversion ratio (FCR) and mortality rate (MORT) of grower-finishing pigs. In total, 310 batches from 244 grower-finishing farms, consisting of 454 855 Pietrain-sired pigs in six Spanish pig companies, were used. Data collection consisted of a survey on management practices (season of placement, split-sex pens, number of pig origins, water source on the farm, initial and final BW) and facilities (floor, feeder, ventilation and number of animals placed) during 2008 and 2009. Results indicated that batches of pigs placed between January and March had higher TFI (P=0.006), FCR (P=0.005) and MORT (P=0.03) than those placed between July and September. Moreover, batches of pigs placed between April and June had lower MORT (P=0.003) than those placed between January and March. Batches with split-sex pens had lower TFI (P=0.001) and better FCR (P<0.001) than those with mixed-sex pens; pigs fed with a single-space feeder with an incorporated drinker also had the lowest TFI (P<0.001) and best FCR (P<0.001) in comparison to single- and multi-space feeders without a drinker. Pigs placed in pens with <50% slatted floors presented an improvement in FCR (P<0.05) compared with pens with 50% or more slatted floors. Batches filled with pigs from multiple origins had higher MORT (P<0.001) than those from a single origin. Pigs housed in barns with manual ventilation control presented higher MORT (P<0.001) than those with automatic ventilation. The regression analysis also indicated that pigs that entered the grower-finisher facilities with a higher initial BW had lower MORT (P<0.05), and pigs sent to the slaughterhouse with a higher final BW presented higher TFI (P<0.001). The variables selected for each dependent variable explained 61.9%, 24.8% and 20.4% of the total variability for TFI, FCR and MORT, respectively.
This study indicates that farms can increase growth performance and reduce mortality by improving farm facilities and/or modifying management practices.

Relevance:

100.00%

Publisher:

Abstract:

This paper studies feature subset selection in classification using a multiobjective estimation of distribution algorithm. We consider six functions, namely area under the ROC curve, sensitivity, specificity, precision, F1 measure and Brier score, for the evaluation of feature subsets and as the objectives of the problem. One of the characteristics of these objective functions is the existence of noise in their values that should be appropriately handled during optimization. Our proposed algorithm consists of two major techniques which are specially designed for the feature subset selection problem. The first is a solution ranking method based on interval values to handle the noise in the objectives of this problem. The second is a model estimation method for learning a joint probabilistic model of objectives and variables, which is used to generate new solutions and advance through the search space. To simplify model estimation, l1-regularized regression is used to select a subset of problem variables before model learning. The proposed algorithm is compared with a well-known ranking method for interval-valued objectives and a standard multiobjective genetic algorithm. In particular, the effects of the two new techniques are experimentally investigated. The experimental results show that the proposed algorithm is able to obtain comparable or better performance on the tested datasets.
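The interval-based ranking and the joint probabilistic model are specific to the proposed EDA, but the l1-regularized pre-selection step can be illustrated generically. A minimal coordinate-descent lasso sketch (an illustration of the technique, not the authors' implementation):

```python
import numpy as np

def lasso_select(X, y, lam=0.1, iters=200):
    """Select variables via l1-regularised least squares solved by
    coordinate descent; keep features whose weight is non-zero."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        for j in range(d):
            r = y - X @ w + X[:, j] * w[j]   # residual excluding feature j
            rho = X[:, j] @ r
            z = X[:, j] @ X[:, j]
            # soft-thresholding update (lasso shrinkage)
            w[j] = np.sign(rho) * max(abs(rho) - lam * n, 0.0) / z
    return np.nonzero(w)[0]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + 0.01 * rng.normal(size=200)
sel = lasso_select(X, y)
print(sel)  # the informative features 0 and 2 should be selected
```

In the paper this pre-selection reduces the number of variables the joint model of objectives and variables has to cover, which is what keeps model estimation tractable.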

Relevance:

100.00%

Publisher:

Abstract:

This paper proposes a new multi-objective estimation of distribution algorithm (EDA) based on joint modeling of objectives and variables. This EDA uses the multi-dimensional Bayesian network as its probabilistic model. In this way it can capture the dependencies between objectives and between variables and objectives, in addition to the dependencies between variables learnt by other Bayesian network-based EDAs. This model leads to a problem decomposition that helps the proposed algorithm to find better trade-off solutions to the multi-objective problem. In addition to Pareto set approximation, the algorithm is also able to estimate the structure of the multi-objective problem. To apply the algorithm to many-objective problems, it includes four different ranking methods proposed in the literature for this purpose. The algorithm is applied to the set of walking fish group (WFG) problems, and its optimization performance is compared with an evolutionary algorithm and another multi-objective EDA. The experimental results show that the proposed algorithm performs significantly better on many of the problems and across different objective space dimensions, and achieves comparable results on the others.
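Pareto dominance underlies both the trade-off solutions and the ranking methods mentioned above; a minimal sketch of dominance and non-dominated filtering for minimisation (generic, not the paper's specific many-objective ranking methods):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(points):
    """Return the Pareto-optimal subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

pts = [(1, 4), (2, 2), (3, 3), (4, 1)]
print(nondominated(pts))  # [(1, 4), (2, 2), (4, 1)] — (3, 3) is dominated by (2, 2)
```

In many-objective settings almost every solution becomes non-dominated under this relation, which is why the paper resorts to dedicated ranking methods on top of it.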

Relevance:

100.00%

Publisher:

Abstract:

Uncertainty mapping

Relevance:

100.00%

Publisher:

Abstract:

A method based on experimental data is proposed to optimize the energy harvesting of a silicone-on-glass Fresnel-lens based CPV system. It takes into account the spectral variations along the year in a particular location as well as the thermal and spectral sensitivities of the optics and solar cell. In addition, different alternatives to tune the top/middle subcells current ratio in a CPV module are analyzed and their capacity to maximize the annually produced energy is quantified.
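One reason the top/middle subcell current ratio matters is that in a series-connected multijunction cell the output current is limited by the smaller subcell current (current matching); a minimal sketch with purely illustrative numbers (not measured data from this work):

```python
def module_current(top, middle):
    """In a series connection the string current is limited by the
    smaller of the two subcell photocurrents (current matching)."""
    return min(top, middle)

# Illustrative (top, middle) currents under three spectral conditions
# across the year; arbitrary relative units, not measured values.
seasons = [(0.95, 1.00), (1.00, 1.00), (1.05, 0.98)]
energy = sum(module_current(t, m) for t, m in seasons)
print(energy)
```

Summing the limited current over representative spectral conditions is the basic idea behind optimising the tuning for annual energy rather than for a single reference spectrum.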

Relevance:

100.00%

Publisher:

Abstract:

Background: Magnetoencephalography (MEG) provides a direct measure of brain activity with high combined spatiotemporal resolution. Preprocessing is necessary to reduce contributions from environmental interference and biological noise.

New method: The effect of different preprocessing techniques on the signal-to-noise ratio (SNR) is evaluated. The SNR was defined as the ratio between the mean signal amplitude (evoked field) and the standard error of the mean over trials.

Results: Recordings from 26 subjects obtained during an event-related visual paradigm with an Elekta MEG scanner were employed. Two methods were considered as first-step noise reduction: Signal Space Separation (SSS) and temporal Signal Space Separation (tSSS), which decompose the signal into components with origin inside and outside the head. Both algorithms increased the SNR by approximately 100%. Epoch-based methods, aimed at identifying and rejecting epochs containing eye blinks, muscular artifacts and sensor jumps, provided an SNR improvement of 5–10%. The decomposition methods evaluated were independent component analysis (ICA) and second-order blind identification (SOBI). The increase in SNR was about 36% with ICA and 33% with SOBI.

Comparison with existing methods: No previous systematic evaluation of the effect of the typical preprocessing steps on the SNR of the MEG signal has been performed.

Conclusions: The application of either SSS or tSSS is mandatory in Elekta systems; no significant differences were found between the two. While epoch-based methods have been routinely applied, the less often considered decomposition methods were clearly superior, and therefore their use seems advisable.
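The SNR definition above (mean evoked amplitude over the standard error of the mean across trials) can be sketched directly; the synthetic data here are purely illustrative, not MEG recordings:

```python
import numpy as np

def evoked_snr(trials):
    """SNR as defined above: mean amplitude over trials (evoked field)
    divided by the standard error of the mean across trials."""
    trials = np.asarray(trials)             # shape: (n_trials, n_samples)
    evoked = trials.mean(axis=0)            # trial-averaged evoked response
    sem = trials.std(axis=0, ddof=1) / np.sqrt(trials.shape[0])
    return np.abs(evoked) / sem

rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, np.pi, 100))            # synthetic evoked field
trials = signal + 0.5 * rng.normal(size=(30, 100))     # 30 noisy repetitions
snr = evoked_snr(trials)
print(snr.mean() > 1.0)  # True: averaging 30 trials reveals the evoked response
```

Because the standard error shrinks with the number of trials, rejecting artifact-laden epochs trades fewer trials against cleaner ones, which is why the epoch-based gains reported above are modest.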

Relevance:

100.00%

Publisher:

Abstract:

We derive by program transformation Pierre Crégut's full-reducing Krivine machine KN from the structural operational semantics of the normal-order reduction strategy in a closure-converted pure lambda calculus. We thus establish the correspondence between the strategy and the machine, and showcase our technique for deriving full-reducing abstract machines. The machine we obtain is in fact a slightly optimised version that can work with open terms and may be used in implementations of proof assistants.
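The derived KN machine itself is beyond a short sketch, but the normal-order strategy it starts from can be illustrated with a minimal de Bruijn-indexed reducer (an illustration of the strategy only, not the paper's closure-converted semantics or the machine):

```python
# Terms in de Bruijn notation: ('var', n), ('lam', body), ('app', f, a).

def shift(t, d, cutoff=0):
    """Shift the free variables of t by d (indices >= cutoff are free)."""
    if t[0] == 'var':
        return ('var', t[1] + d) if t[1] >= cutoff else t
    if t[0] == 'lam':
        return ('lam', shift(t[1], d, cutoff + 1))
    return ('app', shift(t[1], d, cutoff), shift(t[2], d, cutoff))

def subst(t, s, j=0):
    """Capture-avoiding substitution of s for variable j in t."""
    if t[0] == 'var':
        if t[1] == j:
            return shift(s, j)
        return ('var', t[1] - 1) if t[1] > j else t
    if t[0] == 'lam':
        return ('lam', subst(t[1], s, j + 1))
    return ('app', subst(t[1], s, j), subst(t[2], s, j))

def whnf(t):
    """Reduce to weak head normal form (contract head redexes only)."""
    while t[0] == 'app':
        f = whnf(t[1])
        if f[0] == 'lam':
            t = subst(f[1], t[2])
        else:
            return ('app', f, t[2])
    return t

def normalize(t):
    """Normal-order (leftmost-outermost) reduction to full normal form;
    unlike whnf, it also reduces under lambda abstractions."""
    if t[0] == 'var':
        return t
    if t[0] == 'lam':
        return ('lam', normalize(t[1]))
    f = whnf(t[1])
    if f[0] == 'lam':
        return normalize(subst(f[1], t[2]))
    return ('app', normalize(f), normalize(t[2]))

I = ('lam', ('var', 0))           # identity combinator
K = ('lam', ('lam', ('var', 1)))  # constant combinator
result = normalize(('app', ('app', K, I), ('lam', ('lam', ('var', 0)))))
print(result == I)  # True: K I Z reduces to I
```

Reducing under binders (the `lam` case of `normalize`) is what makes the strategy full-reducing, and it is exactly this behaviour that the KN machine implements in first-order, tail-recursive form.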

Relevance:

100.00%

Publisher:

Abstract:

Motivated by these difficulties, Castillo et al. (2012) made some suggestions on how to build consistent stochastic models, avoiding the selection of easy-to-use mathematical functions and replacing them with functions derived from a set of properties that the model must satisfy.

Relevance:

100.00%

Publisher:

Abstract:

In tunnel construction, as in every engineering work, decisions must usually be made with incomplete data. Nevertheless, consciously or not, the builder weighs the risks (even if only subjectively) in order to offer a cost. The objective of this paper is to recall the existence of a methodology for treating the uncertainties in the data, so that their effect on the output of the computational model can be seen and the failure probability or safety margin of a structure estimated. Within this scheme it is possible to include subjective knowledge of the statistical properties of the random variables and, using a numerical model with a degree of complexity appropriate to the problem at hand, to make rationally based decisions. As will be shown, the method makes it possible to quantify the relative importance of the random variables and, under certain conditions, to solve the inverse problem. It is therefore a method very well suited both to the design and to the control phases of tunnel construction.
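The methodology alluded to rests on propagating random inputs through the computational model and estimating a failure probability; a minimal Monte Carlo sketch with a purely illustrative limit-state function (not a tunnel model from the paper):

```python
import random
import math

def failure_probability(g, sample, n=20000, seed=0):
    """Monte Carlo estimate of P(g(x) < 0), where g is the limit-state
    function (g < 0 means failure) and `sample` draws one random input."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if g(sample(rng)) < 0)
    return failures / n

# Illustrative limit state: capacity R minus load S, both lognormal.
def sample(rng):
    return (math.exp(rng.gauss(1.0, 0.1)),   # capacity R
            math.exp(rng.gauss(0.5, 0.3)))   # load S

pf = failure_probability(lambda x: x[0] - x[1], sample)
print(0.0 < pf < 0.5)  # True: small but non-zero failure probability
```

Repeating the estimate while perturbing one input distribution at a time gives a crude sensitivity measure, i.e. the relative importance of the random variables mentioned above.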