63 results for non-parametric technique


Relevance:

80.00%

Publisher:

Abstract:

Fare, Grosskopf, Norris and Zhang developed a non-parametric productivity index, the Malmquist index, using data envelopment analysis (DEA). The Malmquist index is a measure of productivity progress (or regress) and can be decomposed into components such as 'efficiency catch-up' and 'technology change'. However, the Malmquist index and its components are based on only two periods of time and so capture only part of the impact of investment in long-lived assets; the effects of lags in the investment process on the capital stock are ignored in the current Malmquist index model. This paper extends the recent dynamic DEA models introduced by Emrouznejad and Thanassoulis and by Emrouznejad to a dynamic Malmquist index. The paper shows that the dynamic productivity results for Organisation for Economic Cooperation and Development (OECD) countries should reflect reality better than those based on the conventional model.
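
For reference, the conventional two-period Malmquist index of Fare, Grosskopf, Norris and Zhang and its standard decomposition can be written as below (a textbook formulation of the static index, not the dynamic extension proposed in this paper); D^t denotes the distance function measured against the period-t frontier:

    M(x^{t+1}, y^{t+1}, x^{t}, y^{t})
      = \left[ \frac{D^{t}(x^{t+1}, y^{t+1})}{D^{t}(x^{t}, y^{t})}
               \cdot \frac{D^{t+1}(x^{t+1}, y^{t+1})}{D^{t+1}(x^{t}, y^{t})} \right]^{1/2}
      = \underbrace{\frac{D^{t+1}(x^{t+1}, y^{t+1})}{D^{t}(x^{t}, y^{t})}}_{\text{efficiency catch-up}}
        \cdot
        \underbrace{\left[ \frac{D^{t}(x^{t+1}, y^{t+1})}{D^{t+1}(x^{t+1}, y^{t+1})}
                           \cdot \frac{D^{t}(x^{t}, y^{t})}{D^{t+1}(x^{t}, y^{t})} \right]^{1/2}}_{\text{technology change}}

Values greater than one indicate productivity progress; values below one indicate regress.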

Relevance:

80.00%

Publisher:

Abstract:

The efficiency literature, using both parametric and non-parametric methods, has focused mainly on cost efficiency analysis rather than on profit efficiency. In for-profit organisations, however, the measurement of profit efficiency and its decomposition into technical and allocative efficiency is particularly relevant. In this paper a newly developed method is used to measure profit efficiency and to identify the sources of any shortfall in profitability (technical and/or allocative inefficiency). The method is applied to a set of Portuguese bank branches, first assuming a long-run and then a short-run profit maximisation objective. In the long run, most of the scope for profit improvement of bank branches lies in becoming more allocatively efficient; in the short run, most of the profit gain can be realised through higher technical efficiency. © 2003 Elsevier B.V. All rights reserved.
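
The decomposition referred to follows the familiar Farrell logic. As a generic illustration (not the specific method developed in the paper), overall cost efficiency for an observed input vector x, input prices w and minimum cost C(y, w) factors into technical and allocative parts:

    \underbrace{\frac{C(y, w)}{w^{\top} x}}_{\text{overall efficiency}}
    \;=\;
    \underbrace{\frac{w^{\top} x^{E}}{w^{\top} x}}_{\text{technical efficiency}}
    \times
    \underbrace{\frac{C(y, w)}{w^{\top} x^{E}}}_{\text{allocative efficiency}}

where x^{E} is the radially contracted, technically efficient input vector; the analogous ratio of observed to maximum attainable profit underlies profit efficiency.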

Relevance:

80.00%

Publisher:

Abstract:

Traditionally, geostatistical algorithms are contained within specialist GIS and spatial statistics software. Such packages are often expensive, with relatively complex user interfaces and steep learning curves, and cannot be easily integrated into more complex process chains. In contrast, Service Oriented Architectures (SOAs) promote interoperability and loose coupling within distributed systems, typically using XML (eXtensible Markup Language) and Web services. Web services provide a mechanism for a user to discover and consume a particular process, often as part of a larger process chain, with minimal knowledge of how it works. Wrapping current geostatistical algorithms with a Web service layer would thus increase their accessibility, but raises several complex issues. This paper discusses a solution to providing interoperable, automatic geostatistical processing through the use of Web services, developed in the INTAMAP project (INTeroperability and Automated MAPping). The project builds upon Open Geospatial Consortium standards for describing observations, typically used within sensor webs, and employs Geography Markup Language (GML) to describe the spatial aspect of the problem domain. Thus the interpolation service is extremely flexible, being able to support a range of observation types, and can cope with issues such as change of support and differing error characteristics of sensors (by utilising descriptions of the observation process provided by SensorML). XML is accepted as the de facto standard for describing Web services, due to its expressive capabilities which allow automatic discovery and consumption by ‘naive’ users. Any XML schema employed must therefore be capable of describing every aspect of a service and its processes. However, no schema currently exists that can define the complex uncertainties and modelling choices that are often present within geostatistical analysis. We show a solution to this problem, developing a family of XML schemata to enable the description of a full range of uncertainty types. These types will range from simple statistics, such as the kriging mean and variances, through to a range of probability distributions and non-parametric models, such as realisations from a conditional simulation. By employing these schemata within a Web Processing Service (WPS) we show a prototype moving towards a truly interoperable geostatistical software architecture.
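
As an example of the 'simple statistics' such schemata must be able to carry, the simple-kriging predictor and its variance at an unsampled location s_0, given observations z with covariance matrix C, cross-covariance vector c_0, known mean \mu and prior variance \sigma^2, are (standard geostatistical results, not INTAMAP-specific notation):

    \hat{z}(s_0) = \mu + c_0^{\top} C^{-1} (z - \mu \mathbf{1}),
    \qquad
    \sigma^{2}_{SK}(s_0) = \sigma^{2} - c_0^{\top} C^{-1} c_0

The schemata would then carry \hat{z}(s_0) and \sigma^{2}_{SK}(s_0), or richer objects such as full probability distributions or simulated realisations.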

Relevance:

80.00%

Publisher:

Abstract:

This paper develops a productivity index applicable when producers are cost minimisers and input prices are known. The index is inspired by the Malmquist index as extended to productivity measurement. The index developed here is defined in terms of input cost rather than input quantity distance functions. Hence, productivity change is decomposed into overall efficiency change and cost technical change. Furthermore, overall efficiency change is decomposed into technical and allocative efficiency change, and cost technical change into a part capturing shifts of input quantities and a part capturing shifts of relative input prices. These decompositions provide a clearer picture of the root sources of productivity change. They are illustrated here using a sample of hospitals; results are computed using non-parametric mathematical programming. © 2003 Elsevier B.V. All rights reserved.
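
Schematically, and using the abstract's terminology rather than the paper's exact notation, the nesting of the decompositions is:

    \text{productivity change}
    = \underbrace{\text{overall efficiency change}}_{\text{technical eff. change}\;\times\;\text{allocative eff. change}}
      \times
      \underbrace{\text{cost technical change}}_{\text{input-quantity shift}\;\times\;\text{relative input-price shift}}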

Relevance:

80.00%

Publisher:

Abstract:

When testing the difference between two groups, if previous data indicate non-normality, then either transform the data (if they comprise percentages, integers or scores) or use a non-parametric test. If it is uncertain whether the data are normally distributed, deviations from normality are likely to be small when the data are measurements recorded to three significant figures. Unless there is clear evidence that the distribution is non-normal, it is more efficient to use the conventional t-tests. It is poor statistical practice to carry out both the parametric and non-parametric tests on a set of data and then choose the result that is most convenient to the investigator!
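
A minimal sketch of this advice in Python with scipy (hypothetical example data; the arcsine square-root transform shown is one common choice for percentages):

    import numpy as np
    from scipy import stats

    group_a = np.array([12.0, 15.5, 11.2, 14.8, 13.1])  # hypothetical percentages
    group_b = np.array([18.3, 17.9, 16.4, 19.2, 18.8])

    # Route 1 - percentages: transform towards normality, then the conventional t-test
    t_stat, p_t = stats.ttest_ind(np.arcsin(np.sqrt(group_a / 100.0)),
                                  np.arcsin(np.sqrt(group_b / 100.0)))

    # Route 2 - clearly non-normal data: the non-parametric Mann-Whitney U test
    u_stat, p_u = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

    # Decide on one route in advance; do not run both and report the more convenient result.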

Relevance:

80.00%

Publisher:

Abstract:

There may be circumstances where it is necessary for microbiologists to compare variances rather than means, e.g., in analysing data from experiments to determine whether a particular treatment alters the degree of variability, or in testing the assumption of homogeneity of variance prior to other statistical tests. All of the tests described in this Statnote have their limitations: Bartlett’s test may be too sensitive, but Levene’s and the Brown-Forsythe tests also have problems. We would recommend the use of the variance-ratio test to compare two variances and the careful application of Bartlett’s test if there are more than two groups. Given that these tests are not particularly robust, it should be remembered that the homogeneity of variance assumption is usually the least important of those considered when carrying out an ANOVA. If there is concern about this assumption, and especially if the other assumptions of the analysis are also unlikely to be met (e.g., lack of normality or non-additivity of treatment effects), then it may be better either to transform the data or to carry out a non-parametric test on the data.
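
A brief illustration with scipy on hypothetical data: the variance-ratio (F) test for two groups, Bartlett's test for several, and Levene's test with center="median", which corresponds to the Brown-Forsythe variant:

    import numpy as np
    from scipy import stats

    a = np.array([4.1, 5.2, 3.9, 4.8, 5.0, 4.4])  # hypothetical measurements
    b = np.array([6.3, 2.9, 7.1, 3.5, 5.8, 4.0])
    c = np.array([5.1, 4.7, 5.5, 4.9, 5.2, 5.0])

    # Variance-ratio (F) test for two groups (larger variance in the numerator)
    f_ratio = np.var(b, ddof=1) / np.var(a, ddof=1)
    p_f = 2 * stats.f.sf(f_ratio, len(b) - 1, len(a) - 1)  # two-sided

    # More than two groups: Bartlett's test (sensitive to departures from normality)
    bartlett_stat, p_bartlett = stats.bartlett(a, b, c)

    # Brown-Forsythe variant of Levene's test (median-centred, more robust)
    bf_stat, p_bf = stats.levene(a, b, c, center="median")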

Relevance:

80.00%

Publisher:

Abstract:

Pearson's correlation coefficient (‘r’) is one of the most widely used of all statistics. Nevertheless, care is needed in interpreting the results because, with large numbers of observations, quite small values of ‘r’ become significant and the X variable may account for only a small proportion of the variance in Y. Hence, ‘r squared’ should always be calculated and included in a discussion of the significance of ‘r’. The use of ‘r’ also assumes that the data follow a bivariate normal distribution (see Statnote 17) and this assumption should be examined prior to the study. If the data do not conform to such a distribution, the use of a non-parametric correlation coefficient should be considered. A significant correlation should not be interpreted as indicating ‘causation’, especially in observational studies, in which the two variables may be correlated because of their mutual correlations with other confounding variables.
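
A short sketch in Python with scipy (hypothetical data): compute r, square it to report the proportion of variance explained, and fall back to a non-parametric coefficient such as Spearman's if bivariate normality is doubtful:

    import numpy as np
    from scipy import stats

    x = np.array([1.2, 2.4, 3.1, 4.8, 5.5, 6.9, 7.2, 8.8])  # hypothetical data
    y = np.array([2.3, 2.9, 4.2, 5.1, 5.0, 7.4, 6.8, 9.1])

    r, p = stats.pearsonr(x, y)
    r_squared = r ** 2  # proportion of the variance in y accounted for by x
    rho, p_rho = stats.spearmanr(x, y)  # non-parametric alternative

    print(f"r = {r:.3f}, r^2 = {r_squared:.3f} (p = {p:.4f}); Spearman rho = {rho:.3f}")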

Relevance:

80.00%

Publisher:

Abstract:

1. Pearson's correlation coefficient only tests whether the data fit a linear model. With large numbers of observations, quite small values of r become significant and the X variable may account for only a minute proportion of the variance in Y. Hence, the value of r squared should always be calculated and included in a discussion of the significance of r.
2. The use of r assumes that a bivariate normal distribution is present and this assumption should be examined prior to the study. If Pearson's r is not appropriate, then a non-parametric correlation coefficient such as Spearman's rs may be used.
3. A significant correlation should not be interpreted as indicating causation, especially in observational studies in which there is a high probability that the two variables are correlated because of their mutual correlations with other variables.
4. In studies of measurement error, there are problems in using r as a test of reliability, and the ‘intra-class correlation coefficient’ should be used as an alternative.
A correlation test provides only limited information as to the relationship between two variables. Fitting a regression line to the data using the method known as ‘least squares’ provides much more information, and the methods of regression and their application in optometry will be discussed in the next article.
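
Following the closing remark on least-squares regression, a minimal sketch (hypothetical data) of fitting the line with scipy.stats.linregress, which also returns r directly:

    import numpy as np
    from scipy import stats

    x = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])  # hypothetical predictor
    y = np.array([1.1, 1.9, 3.2, 3.8, 5.1, 5.9])  # hypothetical response

    fit = stats.linregress(x, y)  # least-squares slope, intercept and r in one call
    print(f"y = {fit.slope:.2f}x + {fit.intercept:.2f}, "
          f"r = {fit.rvalue:.3f}, r^2 = {fit.rvalue ** 2:.3f}, p = {fit.pvalue:.4f}")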

Relevance:

80.00%

Publisher:

Abstract:

Tear component deposition onto contact lenses is termed ‘spoilation’ and occurs through the interaction of synthetic polymers with their biological fluid environment. Spoilation phenomena alter the physico-chemical properties of hydrophilic contact lenses, diminishing the optical properties of the lens and causing discomfort and complications for the wearer; eventually these alterations render the lens unwearable. The primary aim of this interdisciplinary study was to develop analytical techniques capable of analysing the minute quantities of biological deposition involved, in particular the lipid fraction; prior to this work such techniques were unavailable for single contact lenses. It is envisaged that these investigations will further the understanding of this biological interfacial conversion. Two main analytical techniques were developed: a high performance liquid chromatography (HPLC) technique and fluorescence spectrofluorimetry. The HPLC method allows analysis of a single contact lens and provided previously unavailable information about variations in the lipid profiles of deposited contact lenses and patient tear films. Fluorescence spectrofluorimetry is a sensitive, non-destructive technique for observing changes in the fluorescence intensity of biological components on contact lenses; the progression of tear material deposition can be monitored and assessed for both in vivo and in vitro spoiled lenses using this technique. An improved in vitro model, which is comparable to tears and chemically mimics ocular spoilation, was also developed; this model allows the controlled study of extrinsic factors and hydrogel compositions. These studies show that unsaturated tear lipids, probably unsaturated fatty acids, are involved in the interfacial conversion of hydrogel lenses, rendering them incompatible with the ocular microenvironment. Lipid interaction with the lens surface then facilitates secondary deposition of other tear components. Interaction, exchange and immobilisation (by polymerisation) of the lipid layer appear to occur before the final, rapid growth of more complex, insoluble discrete deposits, sometimes called ‘white spots’.

Relevance:

80.00%

Publisher:

Abstract:

This book is aimed primarily at microbiologists who are undertaking research and who require a basic knowledge of statistics to analyse their experimental data. Computer software employing a wide range of data analysis methods is widely available to experimental scientists; the availability of this software, however, makes it essential that investigators understand the basic principles of statistics. Statistical analysis of data can be complex, with many different methods of approach, each of which applies in a particular experimental circumstance. Hence, it is possible to apply an incorrect statistical method to data and to draw the wrong conclusions from an experiment. The purpose of this book, which has its origin in a series of articles published in the Society for Applied Microbiology journal ‘The Microbiologist’, is to present the basic logic of statistics as clearly as possible and thereby to dispel some of the myths that often surround the subject. The 28 ‘Statnotes’ deal with various topics that are likely to be encountered, including the nature of variables, the comparison of means of two or more groups, non-parametric statistics, analysis of variance, correlating variables, and more complex methods such as multiple linear regression and principal components analysis. In each case, the relevant statistical method is illustrated with examples drawn from experiments in microbiological research. The text incorporates a glossary of the most commonly used statistical terms, and there are two appendices designed to aid the investigator in the selection of the most appropriate test.

Relevance:

80.00%

Publisher:

Abstract:

This thesis reports on the development of a technique to evaluate hydraulic conductivities in a soil (Snowcal) subject to freezing conditions. The technique draws on three distinctly different disciplines, nuclear physics, soil physics and remote sensing, to provide a non-destructive and reliable evaluation of hydraulic conductivity throughout a freezing test. Thermal neutron radiography is used to provide information on local water/ice contents at any time during the test. The experimental test rig is designed so that the soil matrix can be irradiated by a neutron beam from a nuclear reactor to obtain radiographs. The radiographs can then be interpreted, following a process of remote sensing image enhancement, to yield information on relative water/ice contents. Interpretation of the radiographs is carried out using image analysis equipment capable of distinguishing between 256 shades of grey; remote sensing image enhancement techniques are then employed to develop false colour images which show the movement of water and the development of ice lenses in the soil. Instrumentation incorporated in the soil comprises psychrometer/thermocouples to record water potential, electrical resistance probes to enable ice and water to be differentiated on the radiographs, and thermocouples to record the temperature gradient. Water content determinations are made from the enhanced images and plotted against potential measurements to provide the moisture characteristic for the soil. Using the relevant mathematical theory, pore water distributions are obtained and combined with the water content data to give hydraulic conductivities. The values for hydraulic conductivity in the saturated soil and at the frozen fringe are compared with established values for silts and silty sands; the values are in general agreement and, with refinement, this non-destructive technique could afford useful information on a whole range of soils. The technique has value over other methods because ice lenses are actually seen forming in the soil, supporting the accepted theories of frost action. There are economic and experimental constraints to the work associated with the use of a nuclear facility; however, the technique is versatile, has been applied to the study of moisture transfer in porous building materials, and could be further developed for other research areas.

Relevance:

80.00%

Publisher:

Abstract:

Citation information: Armstrong RA, Davies LN, Dunne MCM & Gilmartin B. Statistical guidelines for clinical studies of human vision. Ophthalmic Physiol Opt 2011, 31, 123-136. doi: 10.1111/j.1475-1313.2010.00815.x ABSTRACT: Statistical analysis of data can be complex, and different statisticians may disagree as to the correct approach, leading to conflict between authors, editors, and reviewers. The objective of this article is to provide some statistical advice for contributors to optometric and ophthalmic journals, to provide advice specifically relevant to clinical studies of human vision, and to recommend statistical analyses that could be used in a variety of circumstances. In submitting an article in which quantitative data are reported, authors should describe clearly the statistical procedures that they have used and justify each stage of the analysis. This is especially important if more complex or 'non-standard' analyses have been carried out. The article begins with some general comments relating to data analysis, concerning sample size and 'power', hypothesis testing, parametric and non-parametric variables, 'bootstrap methods', one- and two-tail testing, and the Bonferroni correction. More specific advice is then given with reference to particular statistical procedures that can be used on a variety of types of data. Where relevant, examples of correct statistical practice are given with reference to recently published articles in the optometric and ophthalmic literature.
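
As one concrete example of the general points raised, the Bonferroni correction for multiple comparisons can be applied as in this minimal sketch using statsmodels on hypothetical p-values:

    from statsmodels.stats.multitest import multipletests

    p_values = [0.012, 0.049, 0.003, 0.200]  # hypothetical raw p-values from four tests

    reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="bonferroni")
    print(p_adjusted)  # each raw p-value multiplied by the number of tests (capped at 1)
    print(reject)      # which hypotheses survive the corrected 0.05 threshold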

Relevance:

80.00%

Publisher:

Abstract:

The research is concerned with the measurement of residents' evaluations of the environmental quality of residential areas, and reflects the increased attention being given to residents' values in planning decisions affecting the residential environment. The work was undertaken in co-operation with a local authority which was in the process of revising its housing strategy, and in particular its priorities for improvement action. The study critically examines the existing evidence on environmental values and their relationship to the environment and points to a number of methodological and conceptual deficiencies. The research strategy developed on the basis of this review was constrained by the need to keep any survey methods simple, so that they could easily be repeated, when necessary, by the sponsoring authority. A basic perception model was assumed, and a social survey carried out to measure residents' responses to different environmental conditions. The data were assumed to have only ordinal properties, necessitating the extensive use of non-parametric statistics. Residents' expressions of satisfaction with the component elements of the environment (ranging from convenience to upkeep and privacy) were successfully related to 'objective' measures of the environment; however, the survey evidence did not justify the use of the 'objective' variables as environmental standards. A method of using the social survey data directly as an aid to decision-making is discussed. Alternative models of the derivation of overall satisfaction with the environment are tested, and the values implied by the additive model are compared with residents' preferences as measured directly in the survey. Residents' overall satisfaction with the residential environment was most closely related to their satisfaction with the "Appearance" and the "Reputation" of their areas; by contrast, the most important directly measured preference was "Friendliness of area". The differences point to the need to define the concepts used in social research clearly in operational terms, and to take care in the use of values 'measured' by different methods.

Relevance:

80.00%

Publisher:

Abstract:

The problem of structured noise suppression is addressed by i) modelling the subspaces hosting the components of the signal conveying the information and ii) applying a nonlinear, non-extensive technique to effect the required separation. Although the approach is applicable to all situations satisfying the hypotheses of the proposed framework, this work is motivated by a particular scenario, namely the cancellation of low frequency noise in broadband seismic signals.
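
A generic numpy sketch of subspace-based separation (a least-squares projection illustration under assumed bases, not the nonlinear non-extensive technique of the paper): the columns of S span an assumed broadband signal subspace, the columns of N an assumed low-frequency noise subspace, and only the signal-subspace component of the record is retained:

    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 500)

    # Hypothetical bases: broadband signal atoms (S) and slow low-frequency atoms (N)
    S = np.column_stack([np.sin(2 * np.pi * f * t) for f in (25.0, 40.0, 60.0)])
    N = np.column_stack([np.sin(2 * np.pi * f * t) for f in (0.5, 1.0)])

    # Synthetic record: signal plus structured low-frequency noise plus a little jitter
    x = S @ np.array([1.0, 0.5, 0.8]) + N @ np.array([3.0, 2.0]) \
        + 0.05 * rng.standard_normal(t.size)

    # Fit both subspaces jointly, then keep only the signal-subspace component
    A = np.hstack([S, N])
    coeffs, *_ = np.linalg.lstsq(A, x, rcond=None)
    x_clean = S @ coeffs[:S.shape[1]]  # structured low-frequency noise removed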

Relevance:

80.00%

Publisher:

Abstract:

Purpose: The Nidek F-10 is a scanning laser ophthalmoscope capable of a novel fundus imaging technique, so-called ‘retro-mode’ imaging. The standard method of imaging drusen in age-related macular degeneration (AMD) is fundus photography. The aim of the study was to assess drusen quantification using retro-mode imaging. Methods: Stereoscopic fundus photographs and retro-mode images were captured in 31 eyes of 20 patients with varying stages of AMD. Two experienced masked retinal graders independently assessed the images for the number and size of drusen, using purpose-designed software. Drusen were further assessed in a subset of eight patients using optical coherence tomography (OCT) imaging. Results: Drusen observed by fundus photography (mean 33.5) were significantly fewer in number than subretinal deposits seen in retro-mode (mean 81.6; p < 0.001). The predominant deposit diameter was on average 5 µm smaller in retro-mode imaging than in fundus photography (p = 0.004). Agreement between graders for both types of imaging was substantial for number of deposits (weighted κ = 0.69) and moderate for size of deposits (weighted κ = 0.42). Retro-mode deposits corresponded to drusen on OCT imaging in all eight patients. Conclusion: The subretinal deposits detected by retro-mode imaging were consistent with the appearance of drusen on OCT imaging; however, a larger longitudinal study would be required to confirm this finding. Retro-mode imaging detected significantly more deposits than conventional colour fundus photography. Retro-mode imaging provides a rapid, non-invasive technique for monitoring subtle changes and progression of AMD, and may be useful in monitoring the response of drusen to future therapeutic interventions.
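
For context on the agreement statistic reported, a minimal sketch of computing a linearly weighted kappa between two graders' ordinal ratings with scikit-learn (hypothetical ratings, not the study data):

    from sklearn.metrics import cohen_kappa_score

    # Hypothetical ordinal grades (e.g. deposit-count categories) from two masked graders
    grader_1 = [0, 1, 2, 2, 3, 1, 0, 2, 3, 1]
    grader_2 = [0, 1, 2, 3, 3, 1, 1, 2, 2, 1]

    kappa = cohen_kappa_score(grader_1, grader_2, weights="linear")
    print(round(kappa, 2))  # values of 0.61-0.80 are conventionally read as 'substantial'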