39 results for Log-normal distribution


Relevance: 80.00%

Abstract:

The size frequency distributions of ß-amyloid (Aß) and prion protein (PrPsc) deposits were studied in Alzheimer's disease (AD) and the variant form of Creutzfeldt-Jakob disease (vCJD), respectively. All size distributions were unimodal and positively skewed. Aß deposits reached a greater maximum size and their distributions were significantly less skewed than the PrPsc deposits. All distributions were approximately log-normal in shape, but only the diffuse PrPsc deposits did not deviate significantly from a log-normal model. There were fewer larger classic Aß deposits than predicted, and the florid PrPsc deposits occupied a more restricted size range than predicted by a log-normal model. Hence, Aß deposits exhibit greater growth than the corresponding PrPsc deposits. Surface diffusion may be particularly important in determining the growth of the diffuse PrPsc deposits. In addition, there are factors limiting the maximum size of the Aß and florid PrPsc deposits.
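A check of this kind can be sketched with standard tools. The fragment below uses simulated stand-in values for the deposit diameters (the original measurements are not given here), fits a log-normal model with the location fixed at zero, and tests the deviation from the fitted model:

```python
# Sketch: fit a log-normal model to deposit sizes and test the deviation.
# The 'diameters' array is hypothetical (simulated) stand-in data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
diameters = stats.lognorm.rvs(s=0.6, scale=20.0, size=200, random_state=rng)  # µm

# Fit with the location fixed at zero (sizes cannot be negative).
shape, loc, scale = stats.lognorm.fit(diameters, floc=0)

# Kolmogorov-Smirnov test of deviation from the fitted model
# (note: fitting and testing on the same sample makes the test approximate).
stat, p = stats.kstest(diameters, 'lognorm', args=(shape, loc, scale))
print(f"log-normal fit: shape={shape:.3f}, scale={scale:.1f}, KS p={p:.3f}")
```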

Relevance: 80.00%

Abstract:

Different types of numerical data can be collected in a scientific investigation and the choice of statistical analysis will often depend on the distribution of the data. A basic distinction between variables is whether they are ‘parametric’ or ‘non-parametric’. When a variable is parametric, the data come from a symmetrically shaped distribution known as the ‘Gaussian’ or ‘normal distribution’ whereas non-parametric variables may have a distribution which deviates markedly in shape from normal. This article describes several aspects of the problem of non-normality including: (1) how to test for two common types of deviation from a normal distribution, viz., ‘skew’ and ‘kurtosis’, (2) how to fit the normal distribution to a sample of data, (3) the transformation of non-normally distributed data and scores, and (4) commonly used ‘non-parametric’ statistics which can be used in a variety of circumstances.
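As a rough illustration of points (1) to (3), the following sketch tests a hypothetical, positively skewed sample for skew and kurtosis, fits a normal distribution, and re-tests after a log transformation; the data are simulated, not taken from the article:

```python
# Sketch: test for skew and kurtosis, fit a normal distribution, and apply a
# log transform to a positively skewed sample. 'scores' is simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
scores = rng.lognormal(mean=2.0, sigma=0.5, size=100)   # positively skewed

print("skew test p     =", stats.skewtest(scores).pvalue)
print("kurtosis test p =", stats.kurtosistest(scores).pvalue)

# Fit a normal distribution, then compare normality before and after a log transform.
mu, sd = stats.norm.fit(scores)
log_scores = np.log(scores)
print(f"fitted normal: mean = {mu:.2f}, SD = {sd:.2f}")
print("normality of raw data p =", stats.normaltest(scores).pvalue)
print("normality of log data p =", stats.normaltest(log_scores).pvalue)
```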

Relevance: 80.00%

Abstract:

Pearson's correlation coefficient (‘r’) is one of the most widely used of all statistics. Nevertheless, care needs to be used in interpreting the results because with large numbers of observations, quite small values of ‘r’ become significant and the X variable may only account for a small proportion of the variance in Y. Hence, ‘r squared’ should always be calculated and included in a discussion of the significance of ‘r’. The use of ‘r’ also assumes that the data follow a bivariate normal distribution (see Statnote 17) and this assumption should be examined prior to the study. If the data do not conform to such a distribution, the use of a non-parametric correlation coefficient should be considered. A significant correlation should not be interpreted as indicating ‘causation’ especially in observational studies, in which the two variables may be correlated because of their mutual correlations with other confounding variables.
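A minimal sketch of this advice, using simulated paired data rather than any data from the article, computes r and r squared and falls back to Spearman's coefficient when a normality check fails:

```python
# Sketch: compute Pearson's r, report r squared, and fall back to Spearman's
# coefficient when the data look non-normal. x and y are simulated observations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.normal(size=60)
y = 0.3 * x + rng.normal(scale=1.0, size=60)

r, p = stats.pearsonr(x, y)
print(f"r = {r:.3f}, r^2 = {r**2:.3f}, p = {p:.4f}")   # r^2 = proportion of variance explained

# Rough check of the normality assumption for each variable.
if stats.normaltest(x).pvalue < 0.05 or stats.normaltest(y).pvalue < 0.05:
    rho, p_rho = stats.spearmanr(x, y)
    print(f"non-parametric alternative: Spearman rho = {rho:.3f}, p = {p_rho:.4f}")
```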

Relevance: 80.00%

Abstract:

This article explains, first, the reasons why a knowledge of statistics is necessary and describes the role that statistics plays in an experimental investigation. Second, the normal distribution is introduced, which describes the natural variability shown by many measurements in optometry and vision sciences. Third, the application of the normal distribution to some common statistical problems is described, including how to determine whether an individual observation is a typical member of a population and how to determine the confidence interval for a sample mean.
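The two applications mentioned in the third point can be sketched as follows; the population values and sample readings are hypothetical, chosen only to illustrate the calculations:

```python
# Sketch: (1) is an individual reading typical of a population?
#         (2) what is the confidence interval for a sample mean?
# All numbers below are hypothetical.
import numpy as np
from scipy import stats

# (1) Individual observation vs. a population with known mean and SD.
pop_mean, pop_sd, observation = 16.0, 3.0, 23.5
z = (observation - pop_mean) / pop_sd
p_more_extreme = 2 * (1 - stats.norm.cdf(abs(z)))      # two-tailed
print(f"z = {z:.2f}, P(more extreme) = {p_more_extreme:.4f}")

# (2) 95% confidence interval for a sample mean (t distribution, SD unknown).
sample = np.array([15.1, 17.2, 14.8, 16.9, 15.5, 16.2, 17.0, 15.8])
sem = stats.sem(sample)
ci = stats.t.interval(0.95, df=len(sample) - 1, loc=sample.mean(), scale=sem)
print(f"mean = {sample.mean():.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```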

Relevance: 80.00%

Abstract:

In this second article, statistical ideas are extended to the problem of testing whether there is a true difference between two samples of measurements. First, it will be shown that the difference between the means of two samples comes from a population of such differences which is normally distributed. Second, the 't' distribution, one of the most important in statistics, will be applied to a test of the difference between two means using a simple data set drawn from a clinical experiment in optometry. Third, in making a t-test, a statistical judgement is made as to whether there is a significant difference between the means of two samples. Before the widespread use of statistical software, this judgement was made with reference to a statistical table. Even if such tables are not used, it is useful to understand their logical structure and how to use them. Finally, the analysis of data, which are known to depart significantly from the normal distribution, will be described.
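A minimal sketch of the comparison described, on hypothetical data for two groups, followed by a rank-based alternative for data that depart markedly from normality:

```python
# Sketch: two-sample t-test on hypothetical data, plus a non-parametric
# alternative (Mann-Whitney U) for clearly non-normal data.
import numpy as np
from scipy import stats

group_a = np.array([5.2, 4.8, 6.1, 5.5, 5.9, 4.9, 5.4, 6.0])
group_b = np.array([4.1, 4.6, 4.9, 4.3, 5.0, 4.4, 4.7, 4.2])

# Two-sample t-test on the difference between the means.
t_stat, p = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p:.4f}")

# If the data depart markedly from normality, use a rank-based test instead.
u_stat, p_u = stats.mannwhitneyu(group_a, group_b, alternative='two-sided')
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_u:.4f}")
```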

Relevance: 80.00%

Abstract:

1. Pearson's correlation coefficient only tests whether the data fit a linear model. With large numbers of observations, quite small values of r become significant and the X variable may only account for a minute proportion of the variance in Y. Hence, the value of r squared should always be calculated and included in a discussion of the significance of r. 2. The use of r assumes that a bivariate normal distribution is present and this assumption should be examined prior to the study. If Pearson's r is not appropriate, then a non-parametric correlation coefficient such as Spearman's rs may be used. 3. A significant correlation should not be interpreted as indicating causation, especially in observational studies in which there is a high probability that the two variables are correlated because of their mutual correlations with other variables. 4. In studies of measurement error, there are problems in using r as a test of reliability and the 'intra-class correlation coefficient' should be used as an alternative. A correlation test provides only limited information as to the relationship between two variables. Fitting a regression line to the data using the method known as 'least squares' provides much more information and the methods of regression and their application in optometry will be discussed in the next article.
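Since the final point notes that fitting a least-squares regression line yields more information than the correlation alone, here is a brief sketch on simulated paired data reporting the slope, intercept and r squared:

```python
# Sketch: least-squares regression line for simulated paired data,
# reporting slope, intercept and r squared alongside the correlation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 40)
y = 2.0 + 0.8 * x + rng.normal(scale=1.5, size=x.size)

fit = stats.linregress(x, y)
print(f"slope = {fit.slope:.3f} ± {fit.stderr:.3f}")
print(f"intercept = {fit.intercept:.3f}")
print(f"r = {fit.rvalue:.3f}, r^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.4g}")
```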

Relevance: 80.00%

Abstract:

This thesis is concerned with the inventory control of items that can be considered independent of one another. The decisions of when to order, and in what quantity, are the controllable or independent variables in cost expressions which are minimised. The four systems considered are referred to as (Q,R), (nQ,R,T), (M,T) and (M,R,T). With (Q,R), a fixed quantity Q is ordered each time the order cover (i.e. stock in hand plus stock on order) equals or falls below R, the re-order level. With the other three systems, reviews are made only at intervals of T. With (nQ,R,T), an order for nQ is placed if on review the inventory cover is less than or equal to R, where n, an integer, is chosen at the time so that the new order cover just exceeds R. In (M,T), each order increases the order cover to M. Finally, in (M,R,T), when on review the order cover does not exceed R, enough is ordered to increase it to M. The (Q,R) system is examined at several levels of complexity, so that the theoretical savings in inventory costs obtained with more exact models could be compared with the increases in computational costs. Since the exact model was preferable for the (Q,R) system, only exact models were derived for the other three systems. Several methods of optimization were tried, but most were found inappropriate for the exact models because of non-convergence; however, one method did work for each of the exact models. Demand is considered continuous and, with one exception, the distribution assumed is the normal distribution truncated so that demand is never less than zero. Shortages are assumed to result in backorders, not lost sales. However, the shortage cost is a function of three items, one of which, the backorder cost, may be either a linear, quadratic or exponential function of the length of time of a backorder, with or without a period of grace. Lead times are assumed constant or gamma distributed. Lastly, the actual supply quantity is allowed to be distributed. All the sets of equations were programmed for a KDF 9 computer and the computed performances of the four inventory control procedures are compared under each assumption.
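As an illustration of the first of the four policies, the sketch below simulates a (Q,R) rule with demand drawn from a normal distribution truncated at zero, a constant lead time, and backordering; all parameter values are hypothetical and the cost terms are omitted:

```python
# Sketch: simulation of a (Q,R) re-order policy with truncated-normal demand,
# constant lead time and backorders. All parameter values are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
Q, R = 400, 150                    # order quantity, re-order level
lead_time = 5                      # periods
mu, sigma = 30.0, 10.0             # demand per period (truncated at zero)

stock, on_order = 300, []          # on_order: list of (arrival_period, qty)
for t in range(365):
    # Receive any orders due this period.
    stock += sum(q for (due, q) in on_order if due == t)
    on_order = [(due, q) for (due, q) in on_order if due > t]

    # Demand is never negative (normal distribution truncated at zero).
    demand = stats.truncnorm.rvs(-mu / sigma, np.inf, loc=mu, scale=sigma,
                                 random_state=rng)
    stock -= demand                # stock < 0 represents backorders

    # Order cover = stock in hand (or backordered) plus stock on order.
    cover = stock + sum(q for (_, q) in on_order)
    if cover <= R:
        on_order.append((t + lead_time, Q))

print(f"end-of-year stock position: {stock:.1f}")
```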

Relevance: 80.00%

Abstract:

Some of the problems arising from the inherent instability of emulsions are discussed. Aspects of emulsion stability are described and particular attention is given to the influence of the chemical nature of the dispersed phase on adsorbed film structure and stability. Emulsion stability has been measured by a photomicrographic technique. Electrophoresis, interfacial tension and droplet rest-time data were also obtained. Emulsions were prepared using a range of oils, including aliphatic and aromatic hydrocarbons, dispersed in a solution of sodium dodecyl sulphate. In some cases a small amount of alkane or alkanol was incorporated into the oil phase. In general the findings agree with the classical view that the stability of oil-in-water emulsions is favoured by a closely packed interfacial film and appreciable electric charge on the droplets. The inclusion of non-ionic alcohol leads to enhanced stability, presumably owing to the formation of a "mixed" interfacial film which is more closely packed and probably more coherent than that of the anionic surfactant alone. In some instances differences in stability cannot be accounted for simply by differences in interfacial adsorption or droplet charge. Alternative explanations are discussed and it is postulated that the coarsening of emulsions may occur not only by coalescence but also through the migration of oil from small droplets to larger ones by molecular diffusion. The viability of using the coalescence rates of droplets at a plane interface as a guide to emulsion stability has been researched. The construction of a suitable apparatus and the development of a standard testing procedure are described. Coalescence-time distributions may be correlated by equations similar to those presented by other workers, or by an analysis based upon the log-normal function. Stability parameters for a range of oils are discussed in terms of differences in film drainage and the nature of the interfacial film. Despite some broad correlations there is generally poor agreement between droplet and emulsion stabilities. It is concluded that hydrodynamic factors largely determine droplet stability in the systems studied. Consequently, droplet rest-time measurements do not provide a sensible indication of emulsion stability.

Relevance: 80.00%

Abstract:

Common approaches to IP-traffic modelling have featured the use of stochastic models, based on the Markov property, which can be classified into black box and white box models based on the approach used for modelling traffic. White box models are simple to understand, transparent, and have a physical meaning attributed to each of the associated parameters. To exploit this key advantage, this thesis explores the use of simple classic continuous-time Markov models based on a white box approach, to model not only the network traffic statistics but also the source behaviour with respect to the network and application. The thesis is divided into two parts. The first part focuses on the use of simple Markov and semi-Markov traffic models, starting from the simplest two-state model and moving upwards to n-state models with Poisson and non-Poisson statistics. The thesis then introduces the convenient-to-use, mathematically derived Gaussian Markov models, which are used to model the measured network IP traffic statistics. As one of the most significant contributions, the thesis establishes the significance of second-order density statistics, as it reveals that, in contrast to first-order density, they carry much more unique information on traffic sources and behaviour. The thesis then exploits the use of Gaussian Markov models to model these unique features and finally shows how the use of simple classic Markov models coupled with second-order density statistics provides an excellent tool for capturing maximum traffic detail, which in itself is the essence of good traffic modelling. The second part of the thesis studies the ON-OFF characteristics of VoIP traffic with reference to accurate measurements of the ON and OFF periods, made from a large multi-lingual database of over 100 hours' worth of VoIP call recordings. The impact of the language, prosodic structure and speech rate of the speaker on the statistics of the ON-OFF periods is analysed and relevant conclusions are presented. Finally, an ON-OFF VoIP source model with log-normal transitions is contributed as an ideal candidate to model VoIP traffic and the results of this model are compared with those of previously published work.
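A minimal sketch of the contributed model family, an ON-OFF source whose ON and OFF period durations are drawn from log-normal distributions, is given below; the parameter values are illustrative, not the values measured from the VoIP database:

```python
# Sketch: two-state ON-OFF source with log-normally distributed ON and OFF
# period durations. The parameters below are hypothetical, not measured values.
import numpy as np

rng = np.random.default_rng(5)
on_params = dict(mean=0.0, sigma=0.6)    # log of ON duration (seconds)
off_params = dict(mean=-0.5, sigma=0.8)  # log of OFF duration (seconds)

def generate_on_off(total_time: float):
    """Return a list of (state, duration) pairs covering total_time seconds."""
    t, state, periods = 0.0, "ON", []
    while t < total_time:
        params = on_params if state == "ON" else off_params
        duration = rng.lognormal(params["mean"], params["sigma"])
        periods.append((state, duration))
        t += duration
        state = "OFF" if state == "ON" else "ON"
    return periods

periods = generate_on_off(60.0)
on_fraction = sum(d for s, d in periods if s == "ON") / sum(d for _, d in periods)
print(f"{len(periods)} periods, ON fraction = {on_fraction:.2f}")
```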

Relevance: 80.00%

Abstract:

Deposition of ß-amyloid (Aß), a 'signature' pathological lesion of Alzheimer's disease (AD), is also characteristic of Down's syndrome (DS), and has been observed in dementia with Lewy bodies (DLB) and corticobasal degeneration (CBD). To determine whether the growth of Aß deposits was similar in these disorders, the size frequency distributions of the diffuse ('pre-amyloid'), primitive ('neuritic'), and classic ('dense-cored') Aß deposits were compared in AD, DS, DLB, and CBD. All size distributions had essentially the same shape, i.e., they were unimodal and positively skewed. Mean size of Aß deposits, however, varied between disorders. Mean diameters of the diffuse, primitive, and classic deposits were greatest in DS, DS and CBD, and DS, respectively, while the smallest deposits, on average, were recorded in DLB. Although the shape of the frequency distributions was approximately log-normal, the model underestimated the frequency of smaller deposits and overestimated the frequency of larger deposits in all disorders. A 'power-law' model fitted the size distributions of the primitive deposits in AD, DS, and DLB, and the diffuse deposits in AD. The data suggest: (1) similarities in size distributions of Aß deposits among disorders, (2) growth of deposits varies with subtype and disorder, (3) different factors are involved in the growth of the diffuse/primitive and classic deposits, and (4) log-normal and power-law models do not completely account for the size frequency distributions.
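The comparison of the two candidate models can be sketched as follows, fitting a log-normal and a power-law (Pareto-type) model by maximum likelihood to a simulated sample of deposit sizes and comparing log-likelihoods; the data and threshold choice are hypothetical:

```python
# Sketch: compare log-normal and power-law fits to a simulated sample of
# deposit diameters via maximum likelihood.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
sizes = stats.lognorm.rvs(s=0.7, scale=15.0, size=300, random_state=rng)

# Log-normal fit (location fixed at zero).
shape, loc, scale = stats.lognorm.fit(sizes, floc=0)
ll_lognorm = np.sum(stats.lognorm.logpdf(sizes, shape, loc, scale))

# Power-law fit above a threshold x_min: closed-form MLE for the exponent alpha,
# with density p(x) = ((alpha - 1) / x_min) * (x / x_min) ** (-alpha).
x_min = sizes.min()
alpha = 1.0 + sizes.size / np.sum(np.log(sizes / x_min))
ll_powerlaw = np.sum(np.log((alpha - 1) / x_min) - alpha * np.log(sizes / x_min))

print(f"log-likelihood: log-normal {ll_lognorm:.1f}, power-law {ll_powerlaw:.1f}")
```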

Relevance: 80.00%

Abstract:

Areolae of the crustose lichen Rhizocarpon geographicum (L.) DC., are present on the peripheral prothallus (marginal areolae) and also aggregate to form confluent masses in the centre of the thallus (central areolae). To determine the relationships between these areolae and whether growth of the peripheral prothallus is dependent on the marginal areolae, the density, morphology, and size frequency distributions of marginal areolae were measured in 23 thalli of R. geographicum in north Wales, UK using image analysis (Image J). Size and morphology of central areolae were also studied across the thallus. Marginal areolae were small, punctate, and occurred in clusters scattered over the peripheral prothallus while central areolae were larger and had a lobed structure. The size-class frequency distributions of the marginal and central areolae were fitted by power-law and log-normal models respectively. In 16 out of 23 thalli, central areolae close to the outer edge were larger and had a more complex lobed morphology than those towards the thallus centre. Neither mean width nor radial growth rate (RaGR) of the peripheral prothallus were correlated with density, diameter, or area fraction of marginal areolae. The data suggest central areolae may develop from marginal areolae as follows: (1) marginal areolae develop in clusters at the periphery and fuse to form central areolae, (2) central areolae grow exponentially, and (3) crowding of central areolae results in constriction and fragmentation. In addition, growth of the peripheral prothallus may be unrelated to the marginal areolae. © 2013 Springer Science+Business Media Dordrecht.

Relevance: 80.00%

Abstract:

Variation in lichen growth rates poses a significant challenge for the application of direct lichenometry, i.e. the construction of lichen dating curves from direct measurement of growth rates. To examine the magnitude and possible causes of within-site growth variation, radial growth rates (RaGRs) of thalli of the fast-growing foliose lichen Melanelia fuliginosa ssp. fuliginosa (Fr. ex Duby) Essl. and the slow-growing crustose lichen Rhizocarpon geographicum (L.) DC. were studied on two S-facing slate rock surfaces in north Wales, UK, using digital photography and an image analysis system (Image-J). RaGRs of M. fuliginosa ssp. fuliginosa varied from 0.44 to 2.63 mm yr-1 and of R. geographicum from 0.10 to 1.50 mm yr-1. Analysis of variance suggested no significant variation in RaGRs with vertical or horizontal location on the rock, thallus diameter, aspect, slope, light intensity, rock porosity, rock surface texture, distance to nearest lichen neighbour or distance to vegetation on the rock surface. The frequency distribution of RaGR did not deviate from a normal distribution. It was concluded that despite considerable growth rate variation in both species studied, growth curves could be constructed with sufficient precision to be useful for direct lichenometry. © 2014 Swedish Society for Anthropology and Geography.
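The two checks reported, a normality test on the pooled growth rates and an analysis of variance across a grouping factor, can be sketched as follows on simulated values (here grouped by a hypothetical aspect class):

```python
# Sketch: normality test on pooled radial growth rates (mm yr-1) and a
# one-way ANOVA across three hypothetical aspect classes. Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
groups = [rng.normal(loc=1.2, scale=0.4, size=20) for _ in range(3)]  # aspect classes

pooled = np.concatenate(groups)
print("normality test p =", stats.normaltest(pooled).pvalue)

f_stat, p = stats.f_oneway(*groups)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p:.3f}")
```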

Relevance: 80.00%

Abstract:

Fluorescence spectroscopy has recently become more common in clinical medicine. However, there are still many unresolved issues related to the methodology and implementation of instruments with this technology. In this study, we aimed to assess individual variability of fluorescence parameters of endogenous markers (NADH, FAD, etc.) measured by fluorescence spectroscopy (FS) in situ and to analyse the factors that lead to a significant scatter of results. Most studied fluorophores have an acceptable scatter of values (mostly up to 30%) for diagnostic purposes. Here we provide evidence that the level of blood volume in tissue impacts FS data with a significant inverse correlation. The distribution of the fluorescence intensity and the fluorescent contrast coefficient values follows a normal distribution for most of the studied fluorophores and the redox ratio. The effects of various physiological (different content of skin melanin) and technical (characteristics of optical filters) factors on the measurement results were additionally studied. The data on the variability of the measurement results in FS should be considered when interpreting the diagnostic parameters, as well as when developing new algorithms for data processing and FS devices.

Relevance: 80.00%

Abstract:

Data fluctuation in multiple measurements of Laser-Induced Breakdown Spectroscopy (LIBS) greatly affects the accuracy of quantitative analysis. A new LIBS quantitative analysis method based on the Robust Least Squares Support Vector Machine (RLS-SVM) regression model is proposed. The usual way to enhance the analysis accuracy is to improve the quality and consistency of the emission signal, such as by averaging the spectral signals or standardizing the spectrum over a number of laser shots. The proposed method focuses more on how to enhance the robustness of the quantitative analysis regression model. The proposed RLS-SVM regression model originates from the Weighted Least Squares Support Vector Machine (WLS-SVM) but has an improved segmented weighting function and residual error calculation according to the statistical distribution of measured spectral data. Through the improved segmented weighting function, the information on the spectral data in the normal distribution will be retained in the regression model while the information on the outliers will be restrained or removed. Copper concentration analysis experiments on 16 certified standard brass samples were carried out. The average value of relative standard deviation obtained from the RLS-SVM model was 3.06% and the root mean square error was 1.537%. The experimental results showed that the proposed method achieved better prediction accuracy and better modeling robustness compared with the quantitative analysis methods based on Partial Least Squares (PLS) regression, standard Support Vector Machine (SVM) and WLS-SVM. It was also demonstrated that the improved weighting function offered better overall performance in model robustness and convergence speed, compared with the four known weighting functions.
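The abstract does not give the improved segmented weighting function itself; as a generic illustration of the idea, the sketch below applies a Hampel-type segmented weight to standardized residuals, the usual building block of robust re-weighting in WLS/RLS-SVM models:

```python
# Sketch: a generic Hampel-type segmented weighting function on standardized
# residuals, illustrating the kind of re-weighting used in WLS/RLS-SVM models.
# This is NOT the specific function proposed in the abstract; the thresholds
# c1 and c2 are conventional illustrative values.
import numpy as np

def segmented_weights(residuals: np.ndarray, c1: float = 2.5, c2: float = 3.0) -> np.ndarray:
    """Weight = 1 for small residuals, taper linearly between c1 and c2,
    and a small floor beyond c2 so outliers are restrained or removed."""
    centred = residuals - np.median(residuals)
    s = 1.483 * np.median(np.abs(centred))        # robust scale estimate (MAD)
    z = np.abs(centred) / s
    w = np.ones_like(z)
    taper = (z > c1) & (z <= c2)
    w[taper] = (c2 - z[taper]) / (c2 - c1)
    w[z > c2] = 1e-4
    return w

residuals = np.array([0.1, -0.2, 0.15, -0.05, 2.5, -0.1, 0.3, -3.2])
print(segmented_weights(residuals))
```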

Relevance: 80.00%

Abstract:

The normal distribution is a useful tool for the statistician, but not everyone knows how to wield it. In an extract from his new book, Chancing It, Robert Matthews explains what can happen when things are far from normal.