915 results for Trivariate Normal Distribution


Relevance:

90.00%

Publisher:

Abstract:

In many of the Statnotes described in this series, the statistical tests assume the data are a random sample from a normal distribution. These Statnotes include most of the familiar statistical tests, such as the ‘t’ test, analysis of variance (ANOVA), and Pearson’s correlation coefficient (‘r’). Nevertheless, many variables exhibit a more or less ‘skewed’ distribution. A skewed distribution is asymmetrical, with a long tail extending either to the right (positive skew) or to the left (negative skew). If the mean of the distribution is low, the degree of variation large, and values can only be positive, a positively skewed distribution is usually the result. Many distributions potentially have a low mean and high variance, including the abundance of bacterial species on plants, the latent period of an infectious disease, and the sensitivity of certain fungi to fungicides. These positively skewed distributions are often fitted successfully by a variant of the normal distribution called the log-normal distribution. This Statnote describes fitting the log-normal distribution with reference to two scenarios: (1) the frequency distribution of bacterial numbers isolated from cloths in a domestic environment and (2) the sizes of lichenised ‘areolae’ growing on the hypothallus of Rhizocarpon geographicum (L.) DC.
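
The fitting procedure described here reduces to estimating a normal distribution on the log scale. A minimal sketch in Python, using simulated counts rather than the Statnote's cloth or areolae data:

```python
import math
import random

random.seed(42)

# Hypothetical positively skewed counts (log-normal by construction);
# illustrative only, not the Statnote's measured data.
counts = [math.exp(random.gauss(3.0, 1.0)) for _ in range(500)]

# Fitting a log-normal: take logs, then estimate the normal parameters.
logs = [math.log(x) for x in counts]
mu_hat = sum(logs) / len(logs)
sigma_hat = math.sqrt(sum((v - mu_hat) ** 2 for v in logs) / (len(logs) - 1))

# Back on the original scale, the fitted median is exp(mu) and the
# fitted mean is exp(mu + sigma^2/2); mean > median reflects the skew.
median_hat = math.exp(mu_hat)
mean_hat = math.exp(mu_hat + sigma_hat ** 2 / 2)
```

The log transform is the whole trick: once the data are on the log scale, every familiar normal-theory tool applies unchanged.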

Relevance:

90.00%

Publisher:

Abstract:

In previous Statnotes, many of the statistical tests described rely on the assumption that the data are a random sample from a normal or Gaussian distribution. These include most of the tests in common usage, such as the ‘t’ test, the various types of analysis of variance (ANOVA), and Pearson’s correlation coefficient (‘r’). In microbiology research, however, not all variables can be assumed to follow a normal distribution. Yeast populations, for example, are a notable feature of freshwater habitats, representatives of over 100 genera having been recorded. Most common are the ‘red yeasts’ such as Rhodotorula, Rhodosporidium, and Sporobolomyces and ‘black yeasts’ such as Aureobasidium pullulans, together with species of Candida. Despite the abundance of genera and species, the overall density of an individual species in freshwater is likely to be low and, hence, samples taken from such a population will contain very low numbers of cells. A rare organism living in an aquatic environment may be distributed more or less at random in a volume of water and, therefore, samples taken from such an environment may result in counts that are more likely to be distributed according to the Poisson than the normal distribution. The Poisson distribution was named after the French mathematician Siméon Poisson (1781-1840) and has many applications in biology, especially in describing rare or randomly distributed events, e.g., the number of mutations in a given sequence of DNA after exposure to a fixed amount of radiation or the number of cells infected by a virus given a fixed level of exposure. This Statnote describes how to fit the Poisson distribution to counts of yeast cells in samples taken from a freshwater lake.
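
Fitting a Poisson distribution needs only the sample mean, since the mean is the maximum likelihood estimate of the rate. A sketch with made-up cell counts (not the lake data):

```python
import math
from collections import Counter

# Hypothetical counts of yeast cells in 100 equal-volume water samples;
# illustrative only, not the Statnote's data.
cells = [0, 1, 0, 2, 1, 0, 0, 1, 3, 0] * 10

# The MLE of the Poisson rate is simply the sample mean.
lam = sum(cells) / len(cells)

def poisson_pmf(k, lam):
    # P(X = k) for X ~ Poisson(lam)
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Compare observed frequencies with those expected under Poisson(lam);
# a chi-square test on these counts would complete the goodness-of-fit check.
observed = Counter(cells)
expected = {k: poisson_pmf(k, lam) * len(cells) for k in sorted(observed)}
```

If the observed frequencies depart markedly from the expected ones (e.g. variance well above the mean), the cells are clumped rather than randomly distributed, and the Poisson model fails.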

Relevance:

90.00%

Publisher:

Abstract:

In this paper, using a modified deprivation (or poverty) function, we theoretically study the changes in poverty with respect to the 'global' mean and variance of the income distribution, using Indian survey data. We show that when the income obeys a log-normal distribution, a rising mean income generally indicates a reduction in poverty, while an increase in the variance of the income distribution increases poverty. This altruistic view for a developing economy, however, is no longer tenable once the poverty index is found to follow a Pareto distribution. Here, although a rising mean income indicates a reduction in poverty, due to the presence of an inflexion point in the poverty function there is a critical value of the variance below which poverty decreases with increasing variance, while beyond this value poverty undergoes a steep increase followed by a decrease with respect to higher variance. Identifying this inflexion point as the poverty line, we show that the Pareto poverty function satisfies all three standard axioms of a poverty index [N.C. Kakwani, Econometrica 43 (1980) 437; A.K. Sen, Econometrica 44 (1976) 219] whereas the log-normal distribution falls short of this requisite. Following these results, we make quantitative predictions to correlate a developing with a developed economy. © 2006 Elsevier B.V. All rights reserved.
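
In the log-normal case, the behaviour the abstract describes can be checked directly: the headcount of people below a poverty line z is Phi((ln z - mu)/sigma). A sketch with hypothetical parameter values, not the paper's Indian survey estimates:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def headcount(z, mu, sigma):
    # Fraction of the population with income below the poverty line z
    # when income ~ LogNormal(mu, sigma): Phi((ln z - mu) / sigma).
    return norm_cdf((math.log(z) - mu) / sigma)

z = 1.0                                   # hypothetical poverty line
base = headcount(z, mu=1.0, sigma=0.5)    # reference economy
richer = headcount(z, mu=1.2, sigma=0.5)  # higher mean income
wider = headcount(z, mu=1.0, sigma=0.8)   # higher income inequality
```

Provided the poverty line sits below the median income (ln z < mu), raising the mean lowers the headcount and raising the variance increases it, exactly the log-normal behaviour stated above; the Pareto inflexion-point effect is a separate calculation.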

Relevance:

90.00%

Publisher:

Abstract:

Evelina Ilieva Veleva - The Wishart distribution arises in practice as the distribution of the sample covariance matrix of observations from a multivariate normal distribution. Some marginal densities, obtained by integrating the density of the Wishart distribution, are derived. Necessary and sufficient conditions for the positive definiteness of a matrix are proved, which provide the required limits for the integration.
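
The connection stated in the abstract can be illustrated numerically: if the rows of X are i.i.d. N(0, Sigma), then XᵀX follows a Wishart distribution with mean n·Sigma, and every realisation is positive definite (for n at least the dimension). A small numpy sketch with an invented Sigma, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population covariance of a bivariate normal.
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
n, trials = 10, 5000

# Average X^T X over many draws; this approximates n * Sigma,
# the mean of the Wishart(n, Sigma) distribution.
S_mean = np.zeros((2, 2))
for _ in range(trials):
    X = rng.multivariate_normal(np.zeros(2), Sigma, size=n)
    S_mean += X.T @ X
S_mean /= trials
```

The positive-definiteness conditions the paper derives are exactly what bound the region over which the Wishart density must be integrated to obtain marginals.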

Relevance:

90.00%

Publisher:

Abstract:

The lognormal distribution has abundant applications in various fields. In the literature, most inferences on the two parameters of the lognormal distribution are based on Type-I censored sample data. However, exact measurements are not always attainable, especially when an observation falls below or above the detection limits, and only the numbers of measurements falling into predetermined intervals can be recorded instead; such data are known as grouped data. In this paper, we show the existence and uniqueness of the maximum likelihood estimators of the two parameters of the underlying lognormal distribution with Type-I censored data and grouped data. The proof is first established for the normal distribution and then extended to the lognormal distribution through the invariance property. The results are applied to estimate the median and mean of the lognormal population.
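
A minimal sketch of the Type-I censored case: observations below a detection limit contribute a CDF term to the likelihood, and on the log scale the problem reduces to a censored normal fit. The data, detection limit, and coarse grid search below are illustrative assumptions, not the paper's analytical method:

```python
import math
import random

random.seed(1)

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Hypothetical lognormal concentrations; values below the detection
# limit L are recorded only as "< L" (Type-I left censoring).
true_mu, true_sigma, L = 0.0, 1.0, 0.5
data = [math.exp(random.gauss(true_mu, true_sigma)) for _ in range(400)]
observed = [x for x in data if x >= L]
n_cens = sum(1 for x in data if x < L)

def neg_loglik(mu, sigma):
    # Censored log-likelihood on the log scale: a density term for each
    # detected value, and the CDF mass below log(L) for censored ones.
    ll = n_cens * math.log(max(norm_cdf((math.log(L) - mu) / sigma), 1e-300))
    for x in observed:
        z = (math.log(x) - mu) / sigma
        ll += -math.log(sigma) - 0.5 * z * z - 0.5 * math.log(2 * math.pi)
    return -ll

# Coarse grid search for the MLE; the paper proves this maximiser
# exists and is unique, so a grid is enough for a sketch.
grid = [(m / 20, s / 20) for m in range(-20, 21) for s in range(10, 41)]
mu_hat, sigma_hat = min(grid, key=lambda p: neg_loglik(*p))
```

The invariance property then carries the fit to the original scale: the estimated lognormal median is exp(mu_hat) and the mean is exp(mu_hat + sigma_hat²/2).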

Relevance:

80.00%

Publisher:

Abstract:

Background: The quality of stormwater runoff from ports is significant as it can be an important source of pollution to the marine environment. This is also a significant issue for the Port of Brisbane as it is located in an area of high environmental values. Therefore, it is imperative to develop an in-depth understanding of stormwater runoff quality to ensure that appropriate strategies are in place for quality improvement, where necessary. To this end, the Port of Brisbane Corporation aimed to develop a port-specific stormwater model for the Fisherman Islands facility. This need has to be considered in the context of the proposed future developments of the Port area. ----------------- The Project: The research project is an outcome of the collaborative Partnership between the Port of Brisbane Corporation (POBC) and Queensland University of Technology (QUT). A key feature of this Partnership is that it seeks to undertake research to assist the Port in strengthening the environmental custodianship of the Port area through ‘cutting edge’ research and its translation into practical application. ------------------ The project was separated into two stages. The first stage developed a quantitative understanding of the generation potential of pollutant loads in the existing land uses. This knowledge was then used as input for the stormwater quality model developed in the subsequent stage. The aim is to expand this model across the yet-to-be-developed port expansion area, in order to predict pollutant loads associated with stormwater flows from this area, with the longer term objective of contributing to the development of ecological risk mitigation strategies for future expansion scenarios. ----------------- Study approach: Stage 1 of the overall study confirmed that Port land uses are unique in terms of the anthropogenic activities occurring on them. 
This uniqueness in land use results in distinctive stormwater quality characteristics different to those of other conventional urban land uses. Therefore, it was not scientifically valid to consider the Port as belonging to a single land use category or to consider it as being similar to any typical urban land use. The approach adopted in this study was very different to conventional modelling studies where modelling parameters are developed using calibration. The field investigations undertaken in Stage 1 of the overall study helped to create fundamental knowledge on pollutant build-up and wash-off in different Port land uses. This knowledge was then used in computer modelling so that the specific characteristics of pollutant build-up and wash-off could be replicated. This meant that no calibration processes were involved due to the use of measured parameters for build-up and wash-off. ---------------- Conclusions: Stage 2 of the study was primarily undertaken using the SWMM stormwater quality model. It is a physically based model which replicates natural processes as closely as possible. The time step used and the catchment variability considered were adequate to accommodate the temporal and spatial variability of input parameters, and the parameters used in the modelling reflect the true nature of rainfall-runoff and pollutant processes to the best of currently available knowledge. In this study, the initial loss values adopted for the impervious surfaces are relatively high compared to values noted in the research literature. However, given the scientifically valid approach used for the field investigations, it is appropriate to adopt the initial losses derived from this study for future modelling of Port land uses. The relatively high initial losses will significantly reduce the runoff volume generated as well as the frequency of runoff events. Apart from initial losses, most of the other parameters used in SWMM modelling are generic to most modelling studies. 
Development of parameters for MUSIC model source nodes was one of the primary objectives of this study. MUSIC uses the mean and standard deviation of pollutant parameters based on a normal distribution. However, based on the values generated in this study, the variation of Event Mean Concentrations (EMCs) for Port land uses within the given investigation period does not fit a normal distribution. This is possibly due to the fact that only one specific location was considered, namely the Port of Brisbane, unlike in the case of the MUSIC model where a range of areas with different geographic and climatic conditions were investigated. Consequently, the assumptions used in MUSIC are not totally applicable to the analysis of water quality in Port land uses. Therefore, in using the parameters included in this report for MUSIC modelling, it is important to note that this may result in under- or over-estimation of annual pollutant loads. It is recommended that the annual pollutant load values given in the report be used as a guide to assess the accuracy of the modelling outcomes. A step-by-step guide for using the knowledge generated from this study for MUSIC modelling is given in Table 4.6. ------------------ Recommendations: The following recommendations are provided to further strengthen the cutting edge nature of the work undertaken: * It is important to further validate the approach recommended for stormwater quality modelling at the Port. Validation will require data collection in relation to rainfall, runoff and water quality from the selected Port land uses. Additionally, the recommended modelling approach could be applied to a soon-to-be-developed area to assess ‘before’ and ‘after’ scenarios. * In the modelling study, TSS was adopted as the surrogate parameter for other pollutants. This approach was based on other urban water quality research undertaken at QUT. The validity of this approach should be further assessed for Port land uses. 
* The adoption of TSS as a surrogate parameter for other pollutants, and the confirmation that the <150 μm particle size range was predominant in suspended solids for pollutant wash-off, give rise to a number of important considerations. The ability of the existing structural stormwater mitigation measures to remove the <150 μm particle size range needs to be assessed. The feasibility of introducing source control measures, as opposed to end-of-pipe measures, for stormwater quality improvement may also need to be considered.

Relevance:

80.00%

Publisher:

Abstract:

Introduction. In vitro spine biomechanical testing has been central to many advances in understanding the physiology and pathology of the human spine. Owing to the difficulty in obtaining sufficient numbers of human samples to conduct these studies, animal spines have been accepted as a substitute model. However, it is difficult to compare results from different studies, as they use different preparation, testing and data collection methods. The aim of this study was to identify the effect of repeated cyclic loading on bovine spine segment stiffness. It also aimed to quantify the effect of multiple freeze-thaw sequences, as many tests would be difficult to complete in a single session [1-3]. Materials and Methods. Thoracic spines from 6-8-week-old calves were used. Each spine was dissected and divided into motion segments including levels T4-T11 (n=28). These were divided into two equal groups. Each segment was potted in polymethylmethacrylate. An Instron Biaxial materials testing machine with a custom-made jig was used for testing. The segments were tested in flexion/extension, lateral bending and axial rotation at 37 degrees C and 100% humidity, using moment control to a maximum of ±1.75 Nm at a loading rate of 0.3 Nm per second. Group (A) were tested with continuous repeated cyclic loading for 500 cycles, with data recorded at cycles 3, 5, 10, 25, 100, 200, 300, 400 and 500. Group (B) were tested with 10 load cycles after each of 5 freeze-thaw sequences. Data were collected from the tenth load cycle after each sequence. Statistical analysis of the data was performed using paired samples t-tests, ANOVA and generalized estimating equations. Results. The data were confirmed as having a normal distribution. 1. There were significant reductions in mean stiffness in flexion/extension (-20%; P=0.001) and lateral bending (-17%; P=0.009) over the 500 load cycles. However, there was no statistically significant change in axial rotation (P=0.152). 2. 
There was no statistically significant difference in mean stiffness over the five freeze-thaw sequences in flexion/extension (p=0.879) and axial rotation (p=0.07). However, there was a significant reduction in stiffness in lateral bending (-26%; p=0.007). Conclusion. Biomechanical testing of immature bovine spine motion segments requires careful interpretation. The effect of the number of load cycles, as well as the number of freeze-thaw cycles, on the stiffness of the motion segments depends on the axis of main movement.

Relevance:

80.00%

Publisher:

Abstract:

Although transit travel time variability is essential for understanding the deterioration of reliability, optimising transit schedules and modelling route choice, it has not attracted enough attention in the literature. This paper proposes public transport-oriented definitions of travel time variability and explores the distributions of public transport travel time using Transit Signal Priority data. First, definitions of public transport travel time variability are established by extending the common definitions of variability in the literature and by using route and service data of public transport vehicles. Second, the paper explores the distribution of public transport travel time. A new approach for analysing the distributions, involving all transit vehicles as well as vehicles from a specific route, is proposed. The lognormal distribution is revealed as the descriptor of public transport travel time from the same route and service. The methods described in this study could be of interest to both traffic managers and transit operators for planning and managing transit systems.
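
Once travel times from a single route and service are accepted as lognormal, variability measures follow directly from the fitted parameters. A sketch with simulated times (the Transit Signal Priority dataset is not reproduced); the buffer index used here is one common variability definition, not necessarily the one the paper adopts:

```python
import math
import random

random.seed(7)

# Hypothetical link travel times (seconds) for one route and service,
# skewed to the right as lognormal data typically are.
times = [math.exp(random.gauss(5.0, 0.25)) for _ in range(300)]

# Fit the lognormal via the log transform.
logs = [math.log(t) for t in times]
mu = sum(logs) / len(logs)
sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / (len(logs) - 1))

def lognorm_quantile(p, mu, sigma):
    # Invert the normal CDF by bisection, then exponentiate.
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if 0.5 * (1 + math.erf(mid / math.sqrt(2))) < p:
            lo = mid
        else:
            hi = mid
    return math.exp(mu + sigma * (lo + hi) / 2)

# Buffer index: extra time (relative to the median) a passenger must
# budget to arrive on time 95% of the time.
p50 = lognorm_quantile(0.50, mu, sigma)
p95 = lognorm_quantile(0.95, mu, sigma)
buffer_index = (p95 - p50) / p50
```

Because the lognormal's quantile ratios depend only on sigma, a single fitted sigma per route and service summarises its travel time variability.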

Relevance:

80.00%

Publisher:

Abstract:

Railway is one of the most important, reliable and widely used means of transportation, carrying freight, passengers, minerals, grains, etc. Thus, research on railway tracks is extremely important for the development of railway engineering and technologies. The safe operation of a railway track is based on the railway track structure, which includes rails, fasteners, pads, sleepers, ballast, subballast and formation. Sleepers are very important components of the entire structure and may be made of timber, concrete, steel or synthetic materials. Concrete sleepers were first installed around the middle of the last century and are currently installed in great numbers around the world. Consequently, the design of concrete sleepers has a direct impact on the safe operation of railways. The "permissible stress" method is currently the most commonly used method to design sleepers. However, the permissible stress principle does not consider the ultimate strength of materials, the probabilities of actual loads, or the risks associated with failure, all of which could lead to cost-ineffectiveness and over-design of current prestressed concrete sleepers. Recently, the limit states design method, which appeared in the last century and has already been applied in the design of buildings, bridges, etc., has been proposed as a better method for the design of prestressed concrete sleepers. Limit states design has significant advantages compared to permissible stress design, such as the utilisation of the full strength of the member, and a rational analysis of the probabilities related to sleeper strength and applied loads. This research aims to apply ultimate limit states design to the prestressed concrete sleeper, namely to obtain the load factors of both static and dynamic loads for the ultimate limit states design equations. 
However, the sleepers in rail tracks require different safety levels for different types of tracks, which means that the different types of tracks have different load factors in the limit states design equations. Therefore, the core tasks of this research are to find the load factors of the static and dynamic components of loads on track, and the strength reduction factor for sleeper bending strength, for the ultimate limit states design equations for four main types of tracks, i.e., heavy haul, freight, medium speed passenger and high speed passenger tracks. To find those factors, multiple samples of static loads, dynamic loads and their distributions are needed. Of the four types of tracks, the heavy haul track has measured data from the Braeside Line (a heavy haul line in Central Queensland), and the distributions of both static and dynamic loads can be found from these data. The other three types of tracks have no measured data from sites, and experimental data are hardly available. In order to generate the data samples and obtain their distributions, computer-based simulations were employed, with the wheel-track impacts assumed to be induced by wheel flats of different sizes. A validated simulation package named DTrack was first employed to generate the dynamic loads for the freight and medium speed passenger tracks. However, DTrack is only valid for tracks which carry low or medium speed vehicles. Therefore, a 3-D finite element (FE) model was then established for the wheel-track impact analysis of the high speed track. This FE model has been validated by comparing its simulation results with the DTrack simulation results, and with the results from traditional theoretical calculations, based on the case of heavy haul track. Furthermore, the dynamic load data of the high speed track were obtained from the FE model and the distributions of both static and dynamic loads were extracted accordingly. 
All derived distributions of loads were fitted by appropriate functions. By extrapolating those distributions, the important parameters of the distributions of the static load induced sleeper bending moment and of the extreme wheel-rail impact force induced sleeper dynamic bending moments were obtained. The load factors were then obtained by limit states design calibration, based on reliability analyses with the derived distributions. After that, a sensitivity analysis was performed and the reliability of the achieved limit states design equations was confirmed. It has been found that limit states design can be effectively applied to railway concrete sleepers. This research contributes significantly to railway engineering and the track safety area. It helps to decrease failures and risks in track structures and accidents; better determines the load range for existing sleepers in track; better rates the strength of concrete sleepers to support larger impacts and loads on railway track; increases the reliability of concrete sleepers; and substantially reduces costs for the railway industry. Based on this research, much further research can be pursued in the future. Firstly, it has been found that the 3-D FE model is suitable for the study of track loadings and track structure vibrations. Secondly, equations for serviceability and damageability limit states can be developed based on the concepts underlying the limit states design equations for concrete sleepers obtained in this research, which are for the ultimate limit states.

Relevance:

80.00%

Publisher:

Abstract:

Many websites offer customers the opportunity to rate items and then use those ratings to generate item reputations, which can later be used by other users for decision-making purposes. The aggregated value of the ratings per item represents the reputation of that item. The accuracy of the reputation scores is important, as they are used to rank items. Most aggregation methods do not consider the frequency of distinct ratings, nor do they test how accurate their reputation scores are over datasets with different sparsity. In this work we propose a new aggregation method which can be described as a weighted average, where the weights are generated using the normal distribution. The evaluation results show that the proposed method outperforms state-of-the-art methods over datasets of different sparsity.
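
The idea of a weighted average with normal-distribution weights can be sketched as follows. The centring on the median, the sigma value, and the ratings are illustrative guesses, since the abstract does not specify the exact weighting scheme:

```python
import math

def normal_weighted_reputation(ratings, sigma=1.0):
    # Aggregate 1-5 star ratings using Gaussian weights centred on the
    # median rating, so ratings far from the bulk count for less.
    # (A sketch of the idea only; the paper's scheme may differ.)
    ratings = sorted(ratings)
    mid = ratings[len(ratings) // 2]  # median as the weight centre
    weights = [math.exp(-((r - mid) ** 2) / (2 * sigma ** 2)) for r in ratings]
    return sum(w * r for w, r in zip(weights, ratings)) / sum(weights)

# An outlier-heavy item: the plain mean is dragged down by the two
# 1-star ratings far more than the normal-weighted score is.
ratings = [5, 5, 5, 4, 5, 1, 1]
plain_mean = sum(ratings) / len(ratings)
score = normal_weighted_reputation(ratings)
```

Down-weighting ratings far from the consensus is what makes such a score more robust on sparse items, where a single outlier can otherwise dominate the plain mean.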

Relevance:

80.00%

Publisher:

Abstract:

This research has successfully developed a novel synthetic structural health monitoring system model that is cost-effective and flexible in sensing and data acquisition, and robust in structural safety evaluation, for the purpose of long-term and frequent monitoring of large-scale civil infrastructure during its service life. Not only did it establish a real-world structural monitoring test-bed right at the heart of the QUT Gardens Point Campus, but it can also facilitate reliable and prompt protection for any built infrastructure system as well as the user community involved.

Relevance:

80.00%

Publisher:

Abstract:

Purpose: The previous literature on Bland-Altman analysis only describes approximate methods for calculating confidence intervals for 95% Limits of Agreement (LoAs). This paper describes exact methods for calculating such confidence intervals, based on the assumption that differences in measurement pairs are normally distributed. Methods: Two basic situations are considered for calculating LoA confidence intervals: the first, where LoAs are considered individually (i.e. using one-sided tolerance factors for a normal distribution); and the second, where LoAs are considered as a pair (i.e. using two-sided tolerance factors for a normal distribution). The equations underlying the calculation of exact confidence limits are briefly outlined. Results: To assist in determining confidence intervals for LoAs (considered individually and as a pair), tables of coefficients have been included for degrees of freedom between 1 and 1000. Numerical examples showing the use of the tables for calculating confidence limits for Bland-Altman LoAs have been provided. Conclusions: Exact confidence intervals for LoAs can differ considerably from Bland and Altman’s approximate method, especially for sample sizes that are not large. There are better, more precise methods for calculating confidence intervals for LoAs than Bland and Altman’s approximate method, although even an approximate calculation of confidence intervals for LoAs is likely to be better than none at all. Reporting confidence limits for LoAs considered as a pair is appropriate for most situations; however, there may be circumstances where it is appropriate to report confidence limits for LoAs considered individually.
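
For context, the quantities involved can be computed in a few lines. The sketch below uses hypothetical paired measurements and shows Bland and Altman's approximate standard error for a limit, which is the step the exact tolerance-factor method replaces:

```python
import math

# Hypothetical paired measurements from two instruments (not data
# from the paper).
a = [10.2, 11.5, 9.8, 12.1, 10.9, 11.2, 10.5, 11.8, 10.1, 11.0]
b = [10.0, 11.9, 9.5, 12.4, 11.1, 11.0, 10.9, 11.5, 10.3, 11.4]

diffs = [x - y for x, y in zip(a, b)]
n = len(diffs)
mean_d = sum(diffs) / n
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))

# 95% limits of agreement.
loa_lower = mean_d - 1.96 * sd_d
loa_upper = mean_d + 1.96 * sd_d

# Bland and Altman's APPROXIMATE standard error of a limit,
# sqrt(3 s^2 / n); the paper's exact method uses tolerance factors
# for the normal distribution instead of this normal approximation.
se_loa = math.sqrt(3 * sd_d ** 2 / n)
approx_ci_upper_loa = (loa_upper - 1.96 * se_loa,
                       loa_upper + 1.96 * se_loa)
```

With n = 10 as here, the approximation and the exact tolerance-factor interval can differ noticeably, which is precisely the paper's point about small samples.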

Relevance:

80.00%

Publisher:

Abstract:

It is extremely important to ensure that people with disabilities can access information and cultural works on an equal basis with others. Access is fundamentally important to enable people with disabilities to fully participate in economic, social, and political life. This is both a pressing moral imperative and a legal requirement in international law. Australia should take clear steps to affirmatively redress the fundamental inequalities of access that people with disabilities face. This requires a fundamental shift in the way that we think about copyright and disability rights: the mechanisms for enabling access should not be a limited exception to normal distribution, but should instead be strong positive rights that are able to be routinely and practically exercised.

Relevance:

80.00%

Publisher:

Abstract:

Aim: Determining how ecological processes vary across space is a major focus in ecology. Current methods that investigate such effects remain constrained by important limiting assumptions. Here we provide an extension to geographically weighted regression in which local regression and spatial weighting are used in combination. This method can be used to investigate non-stationarity and spatial-scale effects using any regression technique that can accommodate uneven weighting of observations, including machine learning. Innovation: We extend the use of spatial weights to generalized linear models and boosted regression trees by using simulated data for which the results are known, and compare these local approaches with existing alternatives such as geographically weighted regression (GWR). The spatial weighting procedure (1) explained up to 80% deviance in simulated species richness, (2) optimized the normal distribution of model residuals when applied to generalized linear models versus GWR, and (3) detected nonlinear relationships and interactions between response variables and their predictors when applied to boosted regression trees. Predictor ranking changed with spatial scale, highlighting the scales at which different species–environment relationships need to be considered. Main conclusions: GWR is useful for investigating spatially varying species–environment relationships. However, the use of local weights implemented in alternative modelling techniques can help detect nonlinear relationships and high-order interactions that were previously unassessed. Therefore, this method not only informs us how location and scale influence our perception of patterns and processes, it also offers a way to deal with different ecological interpretations that can emerge as different areas of spatial influence are considered during model fitting.
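
The core of the spatial weighting procedure is a distance-decay kernel that turns any weighted regression into a local one. A simulated sketch (the coordinates, bandwidth and drifting coefficient are invented for illustration, not taken from the paper's simulations):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated sites with a slope that drifts west to east, i.e. a
# non-stationary species-environment relationship.
coords = rng.uniform(0, 10, size=(200, 2))
x = rng.normal(size=200)
b_true = 1.0 + 0.3 * coords[:, 0]
y = b_true * x + rng.normal(scale=0.1, size=200)

def local_slope(site, bandwidth=2.0):
    # Gaussian kernel weights by distance to the focal site; the same
    # weights could feed a GLM or boosted regression trees, since any
    # model accepting observation weights can be localised this way.
    d = np.linalg.norm(coords - site, axis=1)
    w = np.exp(-(d ** 2) / (2 * bandwidth ** 2))
    # Weighted least-squares slope through the origin.
    return np.sum(w * x * y) / np.sum(w * x * x)

west = local_slope(np.array([1.0, 5.0]))
east = local_slope(np.array([9.0, 5.0]))
```

Shrinking or growing the bandwidth is what probes spatial scale: narrow kernels recover local relationships, wide ones approach the single global fit.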

Relevance:

80.00%

Publisher:

Abstract:

Reconstructing 3D motion data is highly under-constrained due to several common sources of data loss during measurement, such as projection, occlusion, or miscorrespondence. We present a statistical model of 3D motion data, based on the Kronecker structure of the spatiotemporal covariance of natural motion, as a prior on 3D motion. This prior is expressed as a matrix normal distribution, composed of separable and compact row and column covariances. We relate the marginals of the distribution to the shape, trajectory, and shape-trajectory models of prior art. When the marginal shape distribution is not available from training data, we show how placing a hierarchical prior over shapes results in a convex MAP solution in terms of the trace-norm. The matrix normal distribution, fit to a single sequence, outperforms state-of-the-art methods at reconstructing 3D motion data in the presence of significant data loss, while providing covariance estimates of the imputed points.
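
The Kronecker-structured prior can be written down compactly: if vec(X) ~ N(vec(M), kron(C, R)), then samples are X = M + A Z Bᵀ with A Aᵀ = R and B Bᵀ = C. A small numpy sketch with invented row and column covariances (the paper fits these to motion data rather than fixing them by hand):

```python
import numpy as np

rng = np.random.default_rng(0)

# Matrix normal for motion data: row covariance R couples the spatial
# (shape) dimensions, column covariance C couples frames (trajectory).
p, t = 3, 4                       # e.g. 3 coordinates, 4 frames
M = np.zeros((p, t))              # mean motion matrix
R = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.5],
              [0.2, 0.5, 1.0]])
C = np.eye(t) + 0.3 * np.eye(t, k=1) + 0.3 * np.eye(t, k=-1)

# Sample via X = M + A Z B^T, where A A^T = R and B B^T = C.
A = np.linalg.cholesky(R)
B = np.linalg.cholesky(C)
Z = rng.normal(size=(p, t))
X = M + A @ Z @ B.T

# The full covariance of vec(X) (column-major) is kron(C, R), yet only
# p*p + t*t parameters are stored: the compactness the paper exploits.
full_cov = np.kron(C, R)
```

This separability is also what makes conditioning on observed entries tractable when imputing missing points, since the big covariance never has to be formed explicitly.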