11 results for Distributions for Correlated Variables

in Aston University Research Archive


Relevance: 100.00%

Abstract:

Most of the common techniques for estimating conditional probability densities are inappropriate for applications involving periodic variables. In this paper we introduce two novel techniques for tackling such problems, and investigate their performance using synthetic data. We then apply these techniques to the problem of extracting the distribution of wind vector directions from radar scatterometer data gathered by a remote-sensing satellite.

Relevance: 100.00%

Abstract:

Most of the common techniques for estimating conditional probability densities are inappropriate for applications involving periodic variables. In this paper we apply two novel techniques to the problem of extracting the distribution of wind vector directions from radar scatterometer data gathered by a remote-sensing satellite.

Relevance: 100.00%

Abstract:

Most conventional techniques for estimating conditional probability densities are inappropriate for applications involving periodic variables. In this paper we introduce three related techniques for tackling such problems, and investigate their performance using synthetic data. We then apply these techniques to the problem of extracting the distribution of wind vector directions from radar scatterometer data gathered by a remote-sensing satellite.
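
One widely used way of modelling a conditional density over a periodic variable such as wind direction (offered here as a plausible illustration, not necessarily one of the techniques introduced in the paper) is a mixture of von Mises, i.e. circular normal, components whose mixing coefficients, mean directions and concentrations would normally be output by a model conditioned on the inputs. The Python sketch below evaluates such a mixture; all parameter values are hypothetical placeholders.

```python
import numpy as np
from scipy.special import i0  # modified Bessel function of order 0

def von_mises_pdf(theta, mu, kappa):
    """Density of a von Mises (circular normal) distribution on [-pi, pi)."""
    return np.exp(kappa * np.cos(theta - mu)) / (2.0 * np.pi * i0(kappa))

def mixture_density(theta, weights, mus, kappas):
    """Mixture of von Mises components: p(theta) = sum_k w_k * VM(theta; mu_k, kappa_k)."""
    weights = np.asarray(weights) / np.sum(weights)  # ensure mixing coefficients sum to 1
    comps = [w * von_mises_pdf(theta, m, k)
             for w, m, k in zip(weights, mus, kappas)]
    return np.sum(comps, axis=0)

# Hypothetical parameters for one value of the conditioning input; in the wind
# application these would come from a model fitted to the scatterometer data.
weights = [0.6, 0.4]    # mixing coefficients
mus     = [0.5, -2.0]   # mean directions (radians)
kappas  = [4.0, 2.0]    # concentrations (larger = more sharply peaked)

angles = np.linspace(-np.pi, np.pi, 361)
density = mixture_density(angles, weights, mus, kappas)
print("integral over the circle ≈", np.trapz(density, angles))  # should be close to 1
```

A multi-component mixture of this kind can represent the multimodal direction distributions that arise from directional ambiguity in scatterometer measurements.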

Relevance: 80.00%

Abstract:

Visualising data for exploratory analysis is a major challenge in many applications. Visualisation allows scientists to gain insight into the structure and distribution of the data, for example finding common patterns and relationships between samples as well as variables. Typically, visualisation methods like principal component analysis and multi-dimensional scaling are employed. These methods are favoured because of their simplicity, but they cannot cope with missing data and it is difficult to incorporate prior knowledge about properties of the variable space into the analysis; this is particularly important in the high-dimensional, sparse datasets typical in geochemistry. In this paper we show how to utilise a block-structured correlation matrix using a modification of a well-known non-linear probabilistic visualisation model, the Generative Topographic Mapping (GTM), which can cope with missing data. The block structure supports direct modelling of strongly correlated variables. We show that by including prior structural information it is possible to improve both the data visualisation and the model fit. These benefits are demonstrated on artificial data as well as a real geochemical dataset used for oil exploration, where the proposed modifications improved the missing data imputation results by 3 to 13%.
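
To make the idea of a block-structured correlation matrix concrete, here is a minimal Python/NumPy sketch, not the paper's GTM modification itself: variables are grouped into blocks, each block shares one within-block correlation, and the resulting matrix can act as a prior covariance over strongly correlated variables, for example in simple Gaussian conditional imputation of a missing value. The block sizes and correlations are assumed for illustration.

```python
import numpy as np

def block_correlation(block_sizes, block_rhos):
    """Build a block-diagonal correlation matrix: within a block every pair of
    variables shares the same correlation rho; between blocks correlation is 0."""
    dim = sum(block_sizes)
    C = np.zeros((dim, dim))
    start = 0
    for size, rho in zip(block_sizes, block_rhos):
        block = np.full((size, size), rho)
        np.fill_diagonal(block, 1.0)
        C[start:start + size, start:start + size] = block
        start += size
    return C

# Hypothetical grouping: e.g. three groups of strongly correlated geochemical variables.
C = block_correlation(block_sizes=[4, 3, 2], block_rhos=[0.8, 0.6, 0.9])

# Simple use: conditional-mean imputation of one missing variable under a
# zero-mean Gaussian with covariance C (illustrative only).
rng = np.random.default_rng(0)
observed_idx = np.arange(1, C.shape[0])
missing_idx = np.array([0])
x_obs = rng.standard_normal(len(observed_idx))           # stand-in for observed values
S_oo = C[np.ix_(observed_idx, observed_idx)]
S_mo = C[np.ix_(missing_idx, observed_idx)]
x_missing = S_mo @ np.linalg.solve(S_oo, x_obs)          # E[x_missing | x_observed]
print("imputed value:", x_missing)
```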

Relevance: 80.00%

Abstract:

This project represents the collaboration of Charta Mede Ltd and the Interdisciplinary Higher Degrees Scheme at the University of Aston. The aim of the project was to monitor the effects of the Civil Service's Executive Officer Qualifying Test Battery on minority group applicants. Prior to monitoring the EO Test Battery, however, an ethnic classification had to be developed which was reliable, acceptable to respondents and appropriate for monitoring. Three pilot studies were conducted to examine these issues, during which different classifications and different ways of asking the question were trialled. The results indicated that by providing more precise instructions as to the meanings of categories, it was possible to obtain classifications which were acceptable and reliable. However, there were also certain terms and expressions which should be avoided, such as those referring to colour and anthropological racial groups. Two classifications were used in the Executive Officer Study - one derived from an Office of Population Censuses and Surveys classification and one developed for this project - the MultiCultural British Classification. The results indicated that some minority groups (Asians, West Indians and Africans in particular) pass the tests in significantly lower proportions than the majority group and also score significantly less well on the tests. Factors which were significantly related to pass/fail and test scores included educational qualifications and age on entering the UK (the latter being negatively correlated). Using variables in this study, however, it was only possible to account for 5% of the variance in pass/fail rates and 11% of the variance in test scores. Analyses of covariance carried out indicated that the differences in test scores still remained even when the effects of significantly correlated variables were removed. Although indirect discrimination could not be inferred from the data, further research into differential validity and fairer methods of selection is needed.

Relevance: 80.00%

Abstract:

The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. An extension is made in this thesis to the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised models, two of them deterministic (sensitivity analysis and deterministic appraisal) and the third stochastic (risk simulation), have been developed to cope with the planning of productivity and productivity growth with reference to the changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution. The models are designed to be flexible and can be adjusted according to the available computer capacity, expected accuracy and presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and the existence of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required. The results of applying these measurement and planning models to the British motor vehicle manufacturing companies are presented and discussed.
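
As an illustration of the kind of stochastic "risk simulation" model described, the Python sketch below samples the component variables as independent normal variables, in line with the stated independence and normality assumptions, and builds up a distribution of an added-value productivity index. The variable names, means and standard deviations are invented for illustration; in the thesis the inputs would come from the polynomial forecasts.

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 100_000

# Hypothetical forecast means and standard deviations (in £ million) for one planning period.
sales            = rng.normal(900.0, 60.0, n_trials)
bought_in        = rng.normal(550.0, 45.0, n_trials)   # materials, bought-in parts and services
employment_costs = rng.normal(250.0, 20.0, n_trials)

added_value  = sales - bought_in                       # added value = sales minus bought-in items
productivity = added_value / employment_costs          # added-value productivity index

# Summarise the simulated productivity distribution for planning purposes.
print("mean index      :", round(productivity.mean(), 3))
print("5th percentile  :", round(np.percentile(productivity, 5), 3))
print("95th percentile :", round(np.percentile(productivity, 95), 3))
```

Reporting percentiles rather than a single point estimate is what distinguishes the risk-simulation model from the deterministic appraisal models described above.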

Relevance: 30.00%

Abstract:

We estimate the shape of the distribution of stock prices using data from options on the underlying asset, and test whether this distribution is distorted in a systematic manner each time a particular news event occurs. In particular we look at the response of the FTSE100 index to market-wide announcements of key macroeconomic indicators and policy variables. We show that the whole distribution of stock prices can be distorted on an event day. The shift in distributional shape happens whether the event is characterised as an announcement occurrence or as a measured surprise. We find that larger surprises have a proportionately greater impact, and that higher moments are more sensitive to events, however characterised.
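
The abstract does not state the estimation method, but a standard way of recovering an option-implied price distribution is the Breeden-Litzenberger relation, under which the risk-neutral density is the discounted second derivative of the call price with respect to the strike. The Python sketch below applies a finite-difference version of that relation; the option quotes are synthetic (generated from a normal, Bachelier-style price model around a hypothetical FTSE100 forward level), so this is an illustration rather than the paper's procedure.

```python
import numpy as np
from scipy.stats import norm

def implied_density(strikes, call_prices, rate, maturity):
    """Breeden-Litzenberger: f(K) = exp(r*T) * d^2 C / dK^2, approximated with
    central finite differences on a uniform strike grid."""
    dK = strikes[1] - strikes[0]
    d2C = (call_prices[2:] - 2 * call_prices[1:-1] + call_prices[:-2]) / dK**2
    return strikes[1:-1], np.exp(rate * maturity) * d2C

# Synthetic call quotes: discounted Bachelier prices around a hypothetical forward level.
rate, maturity = 0.04, 0.25
forward, sigma = 6500.0, 300.0
strikes = np.linspace(5500.0, 7500.0, 81)
d = (forward - strikes) / sigma
calls = np.exp(-rate * maturity) * ((forward - strikes) * norm.cdf(d) + sigma * norm.pdf(d))

k, density = implied_density(strikes, calls, rate, maturity)
print("recovered density integrates to ≈", round(np.trapz(density, k), 3))
```

Repeating such an estimate on event and non-event days is what allows shifts in the whole distributional shape, not just the mean, to be tested.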

Relevance: 30.00%

Abstract:

Pearson's correlation coefficient (‘r’) is one of the most widely used of all statistics. Nevertheless, care needs to be used in interpreting the results because with large numbers of observations, quite small values of ‘r’ become significant and the X variable may only account for a small proportion of the variance in Y. Hence, ‘r squared’ should always be calculated and included in a discussion of the significance of ‘r’. The use of ‘r’ also assumes that the data follow a bivariate normal distribution (see Statnote 17) and this assumption should be examined prior to the study. If the data do not conform to such a distribution, the use of a non-parametric correlation coefficient should be considered. A significant correlation should not be interpreted as indicating ‘causation’ especially in observational studies, in which the two variables may be correlated because of their mutual correlations with other confounding variables.
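
A small Python/SciPy sketch of the points made above, using synthetic data for illustration: compute r and its p-value, report r squared as the proportion of variance in Y accounted for by X, and fall back to Spearman's rank correlation as a non-parametric alternative when bivariate normality is doubtful.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 500                                    # with large n, even a small r becomes "significant"
x = rng.normal(size=n)
y = 0.2 * x + rng.normal(size=n)           # X explains only a small share of the variance in Y

r, p_value = stats.pearsonr(x, y)
print(f"Pearson r = {r:.3f}, p = {p_value:.4f}, r^2 = {r**2:.3f}")
# r can be highly significant while r^2 shows X accounts for only a few percent of the variance in Y.

# Non-parametric alternative if the bivariate-normality assumption is questionable:
rho, p_rho = stats.spearmanr(x, y)
print(f"Spearman rho = {rho:.3f}, p = {p_rho:.4f}")
```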

Relevance: 30.00%

Abstract:

The literature relating to haze formation, methods of separation, coalescence mechanisms, and models by which droplets <100 μm are collected, coalesced and transferred has been reviewed with particular reference to particulate bed coalescers. The separation of secondary oil-water dispersions was studied experimentally using packed beds of monosized glass ballotini particles. The variables investigated were superficial velocity, bed depth, particle size, and the phase ratio and drop size distribution of the inlet secondary dispersion. A modified pump loop was used to generate secondary dispersions of toluene or Clairsol 350 in water with phase ratios between 0.5 and 6.0 v/v%. Inlet drop size distributions were determined using a Malvern Particle Size Analyser; effluent, coalesced droplets were sized by photography. Single phase flow pressure drop data were correlated by means of a Carman-Kozeny type equation. Correlations were obtained relating the single and two phase pressure drops as (ΔP2/μc)/(ΔP1/μd) = k_p U^a L^b d_c^c d_p^d C_in^e. A flow equation was derived to correlate the two phase pressure drop data as ΔP2/(ρ_c U^2) = 8.64×10^7 [d_c/D]^-0.27 [L/D]^0.71 [d_p/D]^-0.17 [N_Re]^1.5 [e_1]^-0.14 [C_in]^0.26. In a comparison between functions to characterise the inlet drop size distributions, a modification of the Weibull function provided the best fit to the experimental data. The general mean drop diameter was correlated by d_qp^(q-p) = d_fr^(q-p) · α^((q-p)/β) · Γ((q-3)/β + 1) / Γ((p-3)/β + 1). The measured and predicted mean inlet drop diameters agreed within ±15%. Secondary dispersion separation depends largely upon drop capture within the bed. A theoretical analysis of drop capture mechanisms in this work indicated that indirect interception and London-van der Waals mechanisms predominate. Mathematical models of dispersed phase concentration in the bed were developed by considering drop motion to be analogous to molecular diffusion. The number of possible channels in a bed was predicted from a model in which the pores comprised randomly-interconnected passageways between adjacent packing elements and axial flow occurred in cylinders on an equilateral triangular pitch. An expression was derived for the length of service channels in a queuing system, leading to the prediction of filter coefficients. The insight provided into the mechanisms of drop collection and travel, and the correlations of operating parameters, should assist the design of industrial particulate bed coalescers.
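
As a worked illustration of how the dimensionless two-phase flow correlation above would be applied, the Python sketch below evaluates it for placeholder operating conditions. The symbol interpretations in the comments (d_c as mean inlet drop diameter, d_p as packing particle diameter, D as column diameter, e_1 as bed voidage, C_in as inlet dispersed-phase concentration) are assumptions, since the abstract does not define them, and the numerical inputs are not data from the study.

```python
def two_phase_pressure_drop(rho_c, U, d_c, d_p, L, D, N_Re, e1, C_in):
    """Evaluate the fitted two-phase flow correlation from the abstract:
    dP2 / (rho_c * U^2) = 8.64e7 * (d_c/D)^-0.27 * (L/D)^0.71 * (d_p/D)^-0.17
                          * N_Re^1.5 * e1^-0.14 * C_in^0.26."""
    group = (8.64e7 * (d_c / D) ** -0.27 * (L / D) ** 0.71 * (d_p / D) ** -0.17
             * N_Re ** 1.5 * e1 ** -0.14 * C_in ** 0.26)
    return group, rho_c * U ** 2 * group

# Placeholder operating conditions (SI units); output is illustrative only.
group, dP2 = two_phase_pressure_drop(
    rho_c=998.0,   # continuous (aqueous) phase density, kg/m^3
    U=0.005,       # superficial velocity, m/s
    d_c=50e-6,     # assumed: mean inlet drop diameter, m
    d_p=500e-6,    # assumed: packing particle diameter, m
    L=0.1,         # bed depth, m
    D=0.05,        # assumed: column diameter, m
    N_Re=0.02,     # Reynolds number
    e1=0.4,        # assumed: bed voidage
    C_in=0.02)     # assumed: inlet dispersed-phase concentration, v/v
print(f"dimensionless group = {group:.3g}, predicted two-phase pressure drop = {dP2:.3g} Pa")
```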

Relevance: 30.00%

Abstract:

The use of high-chromium cast irons for abrasive wear resistance is restricted due to their poor fracture toughness properties. An attempt was made to improve the fracture characteristics by altering the distribution, size and shape of the eutectic carbide phase without sacrificing their excellent wear resistance. This was achieved by additions of molybdenum or tungsten followed by high temperature heat treatments. The absence of these alloying elements, or their replacement with vanadium or manganese, did not show any significant effect, and the continuous eutectic carbide morphology remained the same after application of high temperature heat treatments. The fracture characteristics of the alloys with these metallurgical variables were evaluated for both sharp cracks and blunt notches. The results were used in conjunction with metallographic and fractographic observations to establish possible failure mechanisms. The fracture mechanism of the austenitic alloys was found to be controlled not only by the volume percent but was also greatly influenced by the size and distribution of the eutectic carbides. On the other hand, the fracture mechanism of the martensitic alloys was independent of the eutectic carbide morphology. The uniformity of the secondary carbide precipitation during hardening heat treatments was shown to be the reason that consistent fracture toughness results were obtained with this series of alloys although their eutectic carbide morphologies were different. The collected data were applied to a model which incorporated the microstructural parameters and correlated them with the experimentally obtained valid stress intensity factors. The stress intensity coefficients of different short-bar fracture toughness test specimens were evaluated from analytical and experimental compliance studies. The validity and applicability of this non-standard testing technique for determination of the fracture toughness of high-chromium cast irons were investigated. The results obtained correlated well with the valid results obtained from standard fracture toughness tests.

Relevance: 30.00%

Abstract:

Motor timing tasks have been employed in studies of neurodevelopmental disorders such as developmental dyslexia and ADHD, where they provide an index of temporal processing ability. Investigations of these disorders have used different stimulus parameters within the motor timing tasks, which are likely to affect performance measures. Here we assessed the effect of auditory and visual pacing stimuli on synchronised motor timing performance and its relationship with cognitive and behavioural predictors that are commonly used in the diagnosis of these highly prevalent developmental disorders. Twenty-one children (mean age 9.6 years) completed a finger tapping task in two stimulus conditions, together with additional psychometric measures. As anticipated, synchronisation to the beat (ISI 329 ms) was less accurate in the visually paced condition. Decomposition of timing variance indicated that this effect resulted from differences in the way that visually and auditorily paced tasks are processed by central timekeeping and associated peripheral implementation systems. The ability to utilise an efficient processing strategy on the visual task correlated with both reading and sustained attention skills. Dissociations between these patterns of relationship across task modality suggest that not all timing tasks are equivalent.
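
Decomposition of timing variance in paced tapping is commonly carried out with the Wing-Kristofferson two-level model, which splits inter-tap-interval variability into a central timekeeper component and a peripheral (motor implementation) component; whether this is exactly the decomposition used in this study is an assumption. A minimal Python sketch of that decomposition, applied to simulated inter-tap intervals around the 329 ms pacing interval:

```python
import numpy as np

def wing_kristofferson(intervals):
    """Wing-Kristofferson decomposition of inter-tap interval variance:
    motor (implementation) variance = -autocovariance at lag 1,
    clock (central timekeeper) variance = total variance + 2 * lag-1 autocovariance."""
    x = np.asarray(intervals, dtype=float)
    x = x - x.mean()
    total_var = np.mean(x * x)
    lag1_cov = np.mean(x[:-1] * x[1:])
    motor_var = max(-lag1_cov, 0.0)      # the model predicts a negative lag-1 autocovariance
    clock_var = total_var + 2.0 * lag1_cov
    return clock_var, motor_var

# Simulated data per the two-level model: observed interval = clock interval
# plus the difference of successive motor delays. All values are hypothetical.
rng = np.random.default_rng(7)
clock  = rng.normal(329.0, 12.0, 60)     # central timekeeper intervals, ms
delays = rng.normal(0.0, 5.0, 61)        # motor implementation delays, ms
taps = clock + np.diff(delays)

clock_var, motor_var = wing_kristofferson(taps)
print(f"clock variance ≈ {clock_var:.1f} ms^2, motor variance ≈ {motor_var:.1f} ms^2")
```

Comparing the two variance components across auditory and visual pacing conditions is what allows modality effects to be attributed to central timekeeping rather than peripheral implementation.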