996 results for normal probability


Relevance:

100.00%

Abstract:

This paper presents a method of voice activity detection (VAD) suitable for high-noise scenarios, based on the fusion of two complementary systems. The first system uses a proposed non-Gaussianity score (NGS) feature based on normal probability testing. The second system employs a histogram distance score (HDS) feature that detects changes in the signal by conducting a template-based similarity measure between adjacent frames. The decision outputs of the two systems are then merged using an opening-by-reconstruction fusion stage. The accuracy of the proposed method was compared to that of several baseline VAD methods on a database created using real recordings of a variety of high-noise environments.
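
As a rough illustration of the normality-testing idea behind an NGS-style feature (not the paper's exact definition), the sketch below scores each frame with 1 - W from the Shapiro-Wilk normality test, so Gaussian-noise frames score near zero while heavier-tailed, speech-like frames score higher; the framing parameters and test data are arbitrary.

```python
import numpy as np
from scipy import stats

def non_gaussianity_score(frame):
    """Score how far a frame departs from Gaussianity.

    Stand-in for the paper's NGS: 1 - W, where W is the Shapiro-Wilk
    statistic (W -> 1 for Gaussian noise, so 1 - W grows for
    non-Gaussian, speech-like frames).
    """
    w, _ = stats.shapiro(frame)
    return 1.0 - w

def frame_signal(x, frame_len=400, hop=160):
    """Split a 1-D signal into overlapping frames."""
    n = 1 + max(0, (len(x) - frame_len) // hop)
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n)])

# Toy usage: Gaussian noise vs. a heavier-tailed (speech-like) segment.
rng = np.random.default_rng(0)
noise = rng.normal(size=4000)
speech_like = rng.laplace(size=4000)   # heavier tails than Gaussian
for name, sig in [("noise", noise), ("speech-like", speech_like)]:
    scores = [non_gaussianity_score(f) for f in frame_signal(sig)]
    print(name, "mean NGS:", np.mean(scores))
```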

Relevance:

100.00%

Abstract:

EXTRACT (SEE PDF FOR FULL ABSTRACT): As part of a study of climatic influences on landslide initiation, a statistical analysis of long-term (>40 years) records of daily rainfall from 24 Pacific coastal stations, from San Diego to Cape Flattery, disclosed an unexpected result - the square root of the daily rainfall closely approximates a normal distribution function. ... This paper illustrates the use of the square-root-normal distribution to analyze variations in precipitation along the mainland United States Pacific Coast with examples of orographic enhancement, rain shadows, and increase in precipitation frequency with geographic latitude.
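
A minimal sketch of the paper's diagnostic, on synthetic data: if daily rainfall is square-root-normal, a normal probability plot of the square root of rainfall should be nearly linear, with a probability-plot correlation close to 1. The mean, spread, and sample size below are illustrative, not values from the paper.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for a long daily-rainfall record: if sqrt(rain) is
# normal, rain = z**2 with z drawn from that normal (wet days only).
rng = np.random.default_rng(1)
sqrt_rain = rng.normal(loc=2.0, scale=1.0, size=15000)
rain = np.clip(sqrt_rain, 0, None) ** 2     # keep the physical rain >= 0
wet = rain[rain > 0]

# Normal probability plot of sqrt(rain): a correlation r near 1 indicates
# the square-root-normal model fits, which is the paper's diagnostic.
(osm, osr), (slope, intercept, r) = stats.probplot(np.sqrt(wet), dist="norm")
print(f"probability-plot correlation r = {r:.4f}")
print(f"fitted mean ~ {intercept:.2f}, fitted std ~ {slope:.2f}")
```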

Relevance:

70.00%

Abstract:

Great emphasis is currently placed on seed meters that meet rigorous demands on the longitudinal distribution of seeds, as well as on the rates of skips in seed spacing, broken seeds, and doubles. Evaluating these variables demands considerable time and effort for data acquisition and processing. The objective of this work was to propose the use of normal probability plots, simplifying the treatment of the data and reducing processing time. The evaluation methodology consists of counting broken seeds, skips, and doubles through measurement of the spacing between seeds. Preliminary experiments were carried out with combinations of treatments whose factors of variation were the level of the seed reservoir, the leveling of the seed meter, the travel speed, and the seed dosage. The evaluation was carried out in two parts: first, preliminary experiments for constructing the normal probability plots, and later, experiments with larger samples to evaluate the influence of the most important factors. A rotating internal ring seed meter was evaluated, and the amount of data needed for the evaluation was greatly reduced by the normal probability plots, which made it possible to prioritize only the significant factors. Seed dosage (factor D) proved the most important factor, showing the greatest significance.
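
The screening use of normal probability plots described here resembles Daniel's classic plot of factorial effects: effects that are pure noise fall on a straight line through the origin, while significant ones stray from it. A hedged sketch with hypothetical effect values (only the prominence of factor D echoes the abstract's finding):

```python
import numpy as np
from scipy import stats

# Hypothetical effect estimates from a fractional factorial screening of the
# seed meter (factor names are illustrative; D = seed dosage per the abstract).
effects = {"A:reservoir": 0.3, "B:leveling": -0.5, "C:speed": 1.1,
           "D:dosage": 4.2, "AB": 0.2, "AC": -0.4, "AD": 0.6, "BC": 0.1}

names = list(effects)
vals = np.array([effects[n] for n in names])

# Daniel's plot idea: pair ordered effects with normal quantiles; noise-only
# effects follow a line through the origin, significant ones stray from it.
order = np.argsort(vals)
q = stats.norm.ppf((np.arange(1, len(vals) + 1) - 0.5) / len(vals))
noise_scale = np.median(np.abs(vals)) / stats.norm.ppf(0.75)  # rough MAD-style scale

for rank, i in enumerate(order):
    flag = "<-- significant" if abs(vals[i]) > 2.5 * noise_scale else ""
    print(f"{names[i]:12s} effect={vals[i]:5.1f}  quantile={q[rank]:5.2f} {flag}")
```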

Relevance:

70.00%

Abstract:

Many existing engineering works model the statistical characteristics of the entities under study as normal distributions. These models are eventually used for decision making, which in practice requires defining the classification region corresponding to the desired confidence level. Surprisingly, however, a great number of computer vision works using multidimensional normal models leave the confidence regions unspecified or fail to establish them correctly, owing to misconceptions about the features of Gaussian functions or to wrong analogies with the unidimensional case. The resulting regions incur deviations that can be unacceptable in high-dimensional models. Here we provide a comprehensive derivation of the optimal confidence regions for multivariate normal distributions of arbitrary dimensionality. To this end, we first derive the condition for region optimality of general continuous multidimensional distributions, and then apply it to the widespread case of the normal probability density function. The obtained results are used to analyze the confidence error incurred by previous works related to vision research, showing that the deviations caused by wrong regions may become unacceptable as dimensionality increases. To support the theoretical analysis, a quantitative example is given in the context of moving object detection by means of background modeling.
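
The paper's central point can be made concrete with a short sketch (an illustration of the standard result, not the paper's derivation): the optimal region for a d-variate normal is the Mahalanobis ellipsoid with a chi-square threshold, whereas the wrong one-dimensional analogy, a per-axis interval at the 1-D confidence level, covers ever less probability mass as d grows.

```python
import numpy as np
from scipy import stats

alpha = 0.95

for d in [1, 2, 5, 10]:
    # Correct: the optimal (highest-density) region {x : m(x) <= k} uses the
    # chi-square quantile, since the squared Mahalanobis distance is chi2(d).
    k_correct = stats.chi2.ppf(alpha, df=d)

    # Common mistake: reuse the 1-D rule per axis (e.g. +/- 1.96 sigma on each
    # of d independent coordinates), which covers only alpha**d of the mass.
    z = stats.norm.ppf(0.5 + alpha / 2)
    covered_by_box = alpha ** d

    print(f"d={d:2d}: Mahalanobis^2 threshold {k_correct:6.2f}  "
          f"naive +/-{z:.2f} sigma box covers only {covered_by_box:.3f}")
```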

Relevance:

60.00%

Abstract:

We report a Monte Carlo representation of the long-term inter-annual variability of monthly snowfall on a detailed (1 km) grid of points throughout the Southwest. An extension of the local climate model of the southwestern United States (Stamm and Craig 1992) provides spatially based estimates of the mean and variance of monthly temperature and precipitation. The mean is the expected value from a canonical regression using independent variables that represent controls on climate in this area, including orography. Variance is computed as the standard error of the prediction and provides site-specific measures of (1) natural sources of variation and (2) errors due to limitations of the data and poor distribution of climate stations. Simulation of monthly temperature and precipitation over a sequence of years is achieved by drawing from a bivariate normal distribution. The conditional expectation of precipitation, given temperature in each month, is the basis of a numerical integration of the normal probability distribution of log precipitation below a threshold temperature (3°C) to determine snowfall as a percentage of total precipitation. Snowfall predictions are tested at stations for which long-term records are available. At Donner Memorial State Park (elevation 1811 meters) a 34-year simulation - matching the length of the instrumental record - is within 15 percent of the observed mean annual snowfall. We also compute the resulting snowpack using a variation of the model of Martinec et al. (1983). This allows additional tests by examining spatial patterns of predicted snowfall and snowpack and their hydrologic implications.
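
A minimal sketch of the simulation step under stated assumptions: monthly temperature and log precipitation are drawn from a bivariate normal, and the snowfall fraction is taken as the share of simulated precipitation falling below the 3°C threshold. All site parameters below are illustrative, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical site/month parameters: mean and std of temperature (deg C)
# and log precipitation, plus their correlation (all values illustrative).
mu = np.array([1.5, 3.0])                 # [T, log P]
sd = np.array([3.0, 0.6])
rho = -0.4                                # colder months tend to be wetter here
cov = np.array([[sd[0]**2, rho * sd[0] * sd[1]],
                [rho * sd[0] * sd[1], sd[1]**2]])

# Draw many simulated months from the bivariate normal, as in the Monte Carlo
# step, then score the precipitation falling below the 3 deg C threshold.
T, logP = rng.multivariate_normal(mu, cov, size=200_000).T
P = np.exp(logP)
snow_fraction = P[T < 3.0].sum() / P.sum()
print(f"snowfall as a fraction of total precipitation: {snow_fraction:.2%}")
```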

Relevance:

60.00%

Abstract:

This paper compares the applicability of three ground survey methods for modelling terrain: one-man electronic tachymetry (TPS), real-time kinematic GPS (GPS), and terrestrial laser scanning (TLS). The vertical accuracy of digital terrain models (DTMs) derived from GPS, TLS, and airborne laser scanning (ALS) data is assessed. Point elevations acquired by the four methods represent two sections of a mountainous area in Cumbria, England, chosen so that the presence of non-terrain features was kept to a minimum. The vertical accuracy of the DTMs was addressed by subtracting each DTM from the TPS point elevations. The error was assessed using exploratory measures including summary statistics, histograms, and normal probability plots. The results showed that the internal measurement accuracy of TPS, GPS, and TLS was below a centimetre. TPS and GPS can be considered equally applicable alternatives for sampling terrain in areas accessible on foot. The highest DTM vertical accuracy was achieved with GPS data, both on sloped terrain (RMSE 0.16 m) and flat terrain (RMSE 0.02 m). TLS surveying was the most efficient overall, but the fidelity of the terrain representation suffered under dense vegetation cover. Consequently, the DTM accuracy was the lowest for the sloped area with dense bracken (RMSE 0.52 m), although it was the second highest on the flat, unobscured terrain (RMSE 0.07 m). ALS data represented the sloped terrain more realistically (RMSE 0.23 m) than the TLS; however, due to a systematic bias identified on the flat terrain, the DTM accuracy there was the lowest (RMSE 0.29 m), above the level stated by the data provider. The error distributions were more closely approximated by a normal distribution defined using the median and the normalized median absolute deviation, which supports the use of robust measures in DEM error modelling and its propagation. © 2012 Elsevier Ltd.
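
The closing point about robust measures can be illustrated with a short sketch: the normalized median absolute deviation (NMAD = 1.4826 × MAD) estimates the Gaussian standard deviation but shrugs off vegetation blunders that inflate the ordinary mean and standard deviation. The error magnitudes below are made up for illustration.

```python
import numpy as np

def nmad(x):
    """Normalized median absolute deviation: a robust stand-in for sigma.

    The 1.4826 factor makes NMAD estimate the standard deviation exactly
    when the errors really are Gaussian.
    """
    return 1.4826 * np.median(np.abs(x - np.median(x)))

# Synthetic DTM errors: mostly Gaussian, contaminated by vegetation blunders.
rng = np.random.default_rng(3)
errors = rng.normal(0.0, 0.05, size=9500)                          # 5 cm noise
errors = np.concatenate([errors, rng.normal(0.5, 0.2, size=500)])  # bracken

print(f"mean/std    : {errors.mean():.3f} m / {errors.std():.3f} m")
print(f"median/NMAD : {np.median(errors):.3f} m / {nmad(errors):.3f} m")
# The robust pair stays close to the true terrain noise (0, 0.05 m), which is
# why robust measures are recommended for DEM error modelling and propagation.
```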

Relevance:

60.00%

Abstract:

In multi-attribute utility theory, it is often not easy to elicit precise values for the scaling weights representing the relative importance of criteria. A very widespread approach is to gather incomplete information. A recent approach for dealing with such situations is to use information about each alternative's intensity of dominance, known as dominance measuring methods. Different dominance measuring methods have been proposed, and simulation studies have been carried out to compare these methods with each other and with other approaches, but only when ordinal information about weights is available. In this paper, we use Monte Carlo simulation techniques to analyse the performance of such methods and to adapt them to deal with weight intervals, weights fitting independent normal probability distributions, or weights represented by fuzzy numbers. Moreover, dominance measuring method performance is also compared with a widely used methodology for dealing with incomplete information on weights, stochastic multicriteria acceptability analysis (SMAA). SMAA is based on exploring the weight space to describe the evaluations that would make each alternative the preferred one.
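
A hedged sketch of one ingredient, the case of weights fitting independent normal probability distributions: weight vectors are sampled, clipped to be non-negative, and renormalized, then used to count how often each alternative comes out preferred (the kind of weight-space exploration SMAA performs). The means, uncertainties, and alternative scores are all hypothetical, and the cited methods may treat truncation differently.

```python
import numpy as np

rng = np.random.default_rng(4)

def sample_weights(mu, sigma, n_samples):
    """Draw weight vectors whose components fit independent normals, then
    clip to non-negative and renormalize so each vector sums to 1.
    (A simple treatment; the cited methods may handle truncation differently.)
    """
    w = rng.normal(mu, sigma, size=(n_samples, len(mu)))
    w = np.clip(w, 0.0, None)
    return w / w.sum(axis=1, keepdims=True)

# Three criteria with hypothetical elicited means and uncertainties.
mu = np.array([0.5, 0.3, 0.2])
sigma = np.array([0.10, 0.05, 0.05])
weights = sample_weights(mu, sigma, 10_000)

# Two hypothetical alternatives scored on the three criteria; count how often
# each would be preferred under the sampled weights.
scores = np.array([[0.8, 0.4, 0.9],    # alternative 1
                   [0.6, 0.9, 0.5]])   # alternative 2
utilities = weights @ scores.T
wins = np.bincount(utilities.argmax(axis=1), minlength=2) / len(weights)
print(f"P(alt 1 preferred) = {wins[0]:.2f}, P(alt 2 preferred) = {wins[1]:.2f}")
```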

Relevance:

60.00%

Abstract:

This thesis investigates the soil-pipeline interactions associated with the operation of large-diameter chilled gas pipelines in Britain: frost/pipe heave and ground cracking. The investigation focused on defining the mechanism of ground cracking and the parameters that influence its generation and subsequent development, especially its interaction with frost heave. The study involved a literature review, a questionnaire, a large-scale test, and small-scale laboratory model experiments. The literature review concentrated on soil-pipeline interactions and frost action; frost/pipe heave was often reported but ground cracking seldom was. A questionnaire was circulated within British Gas to gain further information on these interactions. The replies indicated that where frost/pipe heave was reported, ground cracking was also likely to be observed. These soil-pipeline interactions were recorded along 19% of the pipelines in the survey and were more likely along the larger-diameter, higher-flow pipelines. A large-scale trial along a 900 mm pipeline was undertaken to assess the soil thermal, hydraulic, and stress regimes, together with pipe and ground movements. Results indicated that cracking occurred intermittently along the pipeline during periods of rapid frost/pipe heave and ground movement, and that frozen annulus growth produced a ground surface profile that was approximated by a normal probability distribution curve. This curve indicates maximum tensile strain directly over the pipe centre. Finally, a small-scale laboratory model was operated to further define the ground cracking mechanism. Ground cracking was observed at small upward ground surface movements, and with continued movement the ground crack increased in width and depth. At the end of the experiments, internal soil failure planes slanting upwards and away from the frozen annulus were noted. The suggested mechanism for ground cracking involves frozen annulus growth producing tensile strain in the overlying unfrozen soil which, when sufficient, produces a crack.
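
One way to see the point about maximum strain, as a hedged aside (the amplitude h and spread σ are illustrative symbols, not the thesis's notation): if the surface bending strain scales with the curvature of the heave profile, a normal-curve profile concentrates it at the crest.

```latex
% Normal-curve heave profile of amplitude h and spread sigma (illustrative):
\[
  u(x) = h\, e^{-x^{2}/(2\sigma^{2})},
  \qquad
  u''(x) = \frac{h}{\sigma^{2}}
           \left(\frac{x^{2}}{\sigma^{2}} - 1\right)
           e^{-x^{2}/(2\sigma^{2})}.
\]
% |u''| is largest at x = 0 (value h/sigma^2): the convex crest sits directly
% over the pipe centre, putting the overlying unfrozen soil in tension there.
```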

Relevance:

60.00%

Abstract:

This thesis is set in the context of the industrial and economic optimization of UHPFRC (BFUP) structural elements, guaranteeing their ductility at the structural level while adjusting the fibre content and optimizing the fabrication process. The developed model explicitly describes the contribution of the fibre reinforcement in tension at the local level, chaining a strain-hardening phase followed by a softening phase. The constitutive law is a function of the fibre density, the orientation of the fibres with respect to the principal tensile directions, their slenderness, and other usual material parameters related to the fibres, the cementitious matrix, and their interaction. Fibre orientation is taken into account through a normal probability law in one or two variables, capable of reproducing any orientation obtained from a representative simulation of the casting of the fresh UHPFRC or measured experimentally on a prototype. Finally, the model reproduces the cracking of UHPFRC on the principle of smeared, rotating crack models. The constitutive law is integrated into a finite element structural analysis code, allowing it to be used as a predictive tool for the reliability and overall ductility of UHPFRC elements. Two experimental campaigns were carried out, one at Université Laval in Quebec City and the other at Ifsttar, Marne-la-Vallée. The first validates the model's ability to reproduce the overall behaviour under typical tensile and bending loads in simple structural elements for which the preferential fibre orientation was determined by tomography. The second campaign demonstrates the model's capabilities in an optimization process, for the fabrication of relatively complex ribbed plates of potential industrial interest, for which different fabrication methods and UHPFRC mixes with higher or lower fibre contents were considered. The distribution and orientation of the fibres were checked through mechanical tests on extracted samples. The model's predictions were compared with the overall structural behaviour and ductility observed experimentally. The model could thus be qualified against the usual analytical engineering methods, taking statistical variability into account. Avenues for improvement and further development were identified.
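
A minimal sketch of the fibre-orientation ingredient under stated assumptions: the in-plane fibre angle is drawn from a one-variable normal law, and a scalar orientation factor (the mean projection onto the principal tensile direction) summarizes the sample. The mean and spread are illustrative, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(5)

# One-variable normal law for the in-plane fibre angle (radians) relative to
# the principal tensile direction; mu and sigma would come from a flow
# simulation of the fresh UHPFRC or from tomography (values illustrative).
mu, sigma = 0.0, np.deg2rad(25.0)
theta = rng.normal(mu, sigma, size=100_000)

# A common scalar summary: the mean projection of fibres onto the tensile
# direction (1.0 = perfectly aligned, ~0.64 = uniformly random in 2-D).
orientation_factor = np.abs(np.cos(theta)).mean()
print(f"orientation factor along principal tension: {orientation_factor:.3f}")
```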

Relevance:

60.00%

Abstract:

Teamwork is generally assessed either solely by academic staff or by both academic staff and students themselves, confidentially as well as collaboratively. Peer and self-assessments have been used primarily to assess the teamwork process, and teacher assessment to assess the teamwork product. Peer and self-assessments are useful for eliciting team members' contributions to teamwork and for converting a team mark into individual marks, provided the scores are reliable (that is, consistent). However, not all peer and self-assessment scores are reliable. Anecdotal and literature evidence suggests several cases of inconsistencies in these scores. The individual contribution scores given by teammates to an assessee (including himself/herself) can sometimes vary significantly, for both intentional and unintentional reasons. Simply using total individual rating scores, without considering an assessor's reliability, to estimate individual contribution factors can sometimes result in unfair grades and become a hindrance to learning through teamwork.

PURPOSE
This study proposes an extended approach to adjust inconsistent and/or distorted minority peer and self-assessment scores of teamwork using the standard normal probability concept.

APPROACH
In order to adjust inconsistent and/or distorted minority peer and self-assessment scores of teamwork, an extended approach has been proposed. The approach assesses the reliability of an assessor's scores of an assessee using the standard normal probability curve. The extended approach is evaluated by comparing it with existing approaches on two case examples of peer and self-assessment of teamwork where minority team members' scores are inconsistent.

RESULTS
The evaluation shows that the proposed method is superior to the available approaches for adjusting inconsistent peer and self-assessment scores in the special case where the scores of minority team members are inconsistent. The extended approach helps both to automatically detect such scoring anomalies and to adjust the scores so that fairer contributions to the teamwork are obtained and utilised.

CONCLUSIONS
The extended approach is useful in that it helps both to automatically detect scoring anomalies and to devise methods to adjust them. However, the approach does not address scoring inconsistencies by a majority of team members, as it uses the average score as the basis for identifying inconsistencies. Moreover, the approach needs to be implemented in a real teamwork environment in order to identify the impact of these scoring adjustments on the teamwork process and product.
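
A hedged sketch of the idea (the paper's exact reliability formula is not reproduced here): each rating is z-scored against the other assessors' ratings of the same assessee, its two-sided standard normal tail probability is computed, and an assessor with implausibly extreme ratings is excluded before recomputing contributions. The score matrix is invented for illustration.

```python
import numpy as np
from scipy import stats

# Rows = assessors, columns = assessees; entries are contribution scores /10.
# The last assessor has distorted the scores (data are illustrative only).
scores = np.array([[8, 7, 6, 7],
                   [7, 7, 6, 8],
                   [8, 6, 7, 7],
                   [2, 2, 2, 10]], dtype=float)

n = scores.shape[0]
z = np.zeros_like(scores)
for a in range(n):
    others = np.delete(scores, a, axis=0)     # leave the assessor out
    z[a] = (scores[a] - others.mean(axis=0)) / others.std(axis=0, ddof=1)

# Standard normal tail probability of each rating; an implausibly extreme
# rating marks an unreliable assessor, whose ratings are then excluded.
tail = 2 * stats.norm.sf(np.abs(z))
reliable = (tail > 0.01).all(axis=1)

adjusted = scores[reliable].mean(axis=0)
print("flagged assessor rows:", np.where(~reliable)[0])
print("adjusted contribution scores:", adjusted)
```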

Relevance:

30.00%

Abstract:

Purpose
This study evaluated the impact of patient set-up errors on the probability of pulmonary and cardiac complications in the irradiation of left-sided breast cancer.

Methods and Materials
Using the NTCP algorithm of the CMS XiO Version 4.6 (CMS Inc., St Louis, MO) radiotherapy planning system and the Lyman-Kutcher-Burman (LKB) model, we calculated the DVH indices for the ipsilateral lung and heart and the resultant normal tissue complication probabilities (NTCP) for radiation-induced pneumonitis and excess cardiac mortality in 12 left-sided breast cancer patients.

Results
Isocenter shifts in the posterior direction had the greatest effect on the lung V20, heart V25, and the mean and maximum doses to the lung and the heart. Dose-volume histogram (DVH) results show that the ipsilateral lung V20 tolerance was exceeded in 58% of the patients after 1 cm posterior shifts. Similarly, the heart V25 tolerance was exceeded after 1 cm antero-posterior and left-right isocentric shifts in 70% of the patients. The baseline NTCPs for radiation-induced pneumonitis ranged from 0.73% to 3.4%, with a mean value of 1.7%. The maximum reported NTCP for radiation-induced pneumonitis was 5.8% (mean 2.6%) after a 1 cm posterior isocentric shift. The NTCP for excess cardiac mortality was 0% in all patients (n = 12) before and after set-up error simulations.

Conclusions
Set-up errors in left-sided breast cancer patients have a statistically significant impact on the lung NTCPs and DVH indices. However, with a central lung distance of 3 cm or less (CLD < 3 cm) and a maximum heart distance of 1.5 cm or less (MHD < 1.5 cm), the treatment plans could tolerate set-up errors of up to 1 cm without any change in the NTCP to the heart.
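
For reference, a minimal sketch of the LKB model named above: the dose-volume histogram is reduced to a generalized effective dose, which is mapped to an NTCP through the standard normal CDF (the "normal probability" step). The DVH bins and the pneumonitis parameters (TD50, m, n) below are illustrative, commonly cited values, not the study's.

```python
import numpy as np
from scipy import stats

def lkb_ntcp(doses, volumes, td50, m, n):
    """Lyman-Kutcher-Burman NTCP from a differential DVH.

    deff = generalized effective dose with volume parameter n,
    t    = (deff - td50) / (m * td50),
    NTCP = standard normal CDF of t.
    """
    v = np.asarray(volumes) / np.sum(volumes)     # fractional volumes
    deff = np.sum(v * np.asarray(doses) ** (1.0 / n)) ** n
    t = (deff - td50) / (m * td50)
    return stats.norm.cdf(t), deff

# Toy ipsilateral-lung DVH: dose bins (Gy) and relative volumes per bin.
doses = [2, 10, 20, 30, 45]
volumes = [40, 25, 15, 12, 8]

ntcp, deff = lkb_ntcp(doses, volumes, td50=24.5, m=0.18, n=0.87)
print(f"effective dose ~ {deff:.1f} Gy, pneumonitis NTCP ~ {ntcp:.1%}")
```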

Relevance:

30.00%

Abstract:

We derive a very general expression for the survival probability and the first passage time distribution of a particle executing Brownian motion in full phase space, with an absorbing boundary condition at a point in position space, which is valid irrespective of the statistical nature of the dynamics. The expression, together with Jensen's inequality, naturally leads to a lower bound on the actual survival probability and an approximate first passage time distribution. These are expressed in terms of the position-position, velocity-velocity, and position-velocity variances. Knowledge of these variances enables one to compute a lower bound on the survival probability and consequently the first passage distribution function. As examples, we compute these for a Gaussian Markovian process and, in the case of non-Markovian processes, with an exponentially decaying friction kernel and also with a power-law friction kernel. Our analysis shows that the survival probability decays exponentially at long times, irrespective of the nature of the dynamics, with an exponent equal to the transition state rate constant.
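
A quick numerical check of the exponential long-time decay, in a much simpler setting than the paper's (overdamped rather than full phase space, with an assumed harmonic potential): simulate many trajectories with an absorbing point and fit the tail of the survival probability.

```python
import numpy as np

rng = np.random.default_rng(6)

# Minimal stand-in for the paper's setting: an overdamped particle in a
# harmonic well centred at x = 1, absorbed the first time it reaches x = 0.
k, kT, dt = 4.0, 1.0, 1e-3          # stiffness, temperature, time step
n_traj, n_steps = 5000, 20_000

x = np.ones(n_traj)
alive = np.ones(n_traj, dtype=bool)
survival = np.empty(n_steps)

for step in range(n_steps):
    idx = np.flatnonzero(alive)
    x[idx] += -k * (x[idx] - 1.0) * dt + np.sqrt(2 * kT * dt) * rng.normal(size=idx.size)
    alive[idx] = x[idx] > 0.0        # absorbing condition at the origin
    survival[step] = alive.mean()

# The long-time tail is exponential: fitting log S(t) over a mid-range window
# recovers an escape rate constant, consistent with the paper's conclusion.
t = dt * np.arange(1, n_steps + 1)
window = (survival < 0.5) & (survival > 0.01)
rate = -np.polyfit(t[window], np.log(survival[window]), 1)[0]
print(f"fitted long-time decay rate ~ {rate:.2f} (exponential tail)")
```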