891 results for Non-parametric methods


Relevance: 80.00%

Abstract:

This paper explores the use of the optimisation procedures in SAS/OR software with application to the measurement of efficiency and productivity of decision-making units (DMUs) using data envelopment analysis (DEA) techniques. DEA, originally introduced by Charnes et al. [J. Oper. Res. 2 (1978) 429], is a linear programming method for assessing the efficiency and productivity of DMUs. Over the last two decades, DEA has gained considerable attention as a managerial tool for measuring the performance of organisations and has been widely used to assess the efficiency of public and private sectors such as banks, airlines, hospitals, universities and manufacturers. As a result, new applications with more variables and more complicated models are being introduced. Following the successive development of DEA, a non-parametric productivity measure, the Malmquist index, was introduced by Fare et al. [J. Prod. Anal. 3 (1992) 85]. Employing the Malmquist index, productivity growth can be decomposed into technical change and efficiency change. SAS, in turn, is powerful software capable of solving various optimisation problems, such as linear programming with all types of constraints. To facilitate the use of DEA and the Malmquist index by SAS users, a SAS/MALM code was implemented in the SAS programming language. The SAS macro developed in this paper selects the chosen variables from a SAS data file and constructs sets of linear-programming models based on the selected DEA model. An example is given to illustrate how one could use the code to measure the efficiency and productivity of organisations.
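The envelopment form of the constant-returns DEA model referred to above can be sketched as one small linear programme per DMU. The following is a minimal illustration in Python with scipy, not the paper's SAS/MALM macro; the two-DMU data set is invented.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """Input-oriented CCR (constant returns) DEA efficiency scores.
    X: (n_dmus, n_inputs) input matrix; Y: (n_dmus, n_outputs) output matrix."""
    n = X.shape[0]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.r_[1.0, np.zeros(n)]                      # minimise theta
        # input rows:  sum_j lambda_j * x_ij <= theta * x_io
        A_in = np.hstack([-X[o][:, None], X.T])
        # output rows: sum_j lambda_j * y_rj >= y_ro, written in <= form
        A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                      bounds=(0, None))
        scores.append(res.fun)
    return np.array(scores)

# two hypothetical DMUs with one input and one output each
X = np.array([[2.0], [4.0]])
Y = np.array([[1.0], [1.0]])
scores = ccr_efficiency(X, Y)   # approximately [1.0, 0.5]
```

Under constant returns, the second DMU produces the same output from twice the input, so its efficiency is half that of the first.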

Relevance: 80.00%

Abstract:

The purpose of this paper is twofold: first, we compute quality-adjusted measures of productivity change for the three most important diagnostic technologies (i.e., the computerised tomography scan, electrocardiogram and echocardiogram) in the major Portuguese hospitals. We use the Malmquist–Luenberger index, which allows productivity growth to be measured while controlling for the quality of production. Second, using non-parametric tests, we analyse whether the implementation of the Prospective Payment System (PPS) may have had a positive impact on the movements of productivity over time. The results show that the PPS has helped hospitals to use these tools more efficiently and to improve their effectiveness.

Relevance: 80.00%

Abstract:

The purpose of this paper is to analyse the relationship between the corporate governance system and technical efficiency in Italian manufacturing. We use a non-parametric frontier technique (DEA) to derive technical efficiency measures for a sample of Italian firms taken from nine manufacturing industries. These measures are then related to the characteristics of the corporate governance system. Two of these characteristics turn out to have a positive impact on technical efficiency: the percentage of the company shares owned by the largest shareholder and the fact that a firm belongs to a pyramidal group. Interestingly, a trade-off emerges between these influences, in the sense that one is stronger in industries where the other is weaker. Copyright © 2007 John Wiley & Sons, Ltd.

Relevance: 80.00%

Abstract:

This paper analyses the relationship between industrial total factor productivity and public capital across the 20 Italian administrative regions. It adds to the existing literature in a number of ways: it analyses a longer period (1970-98); it allows for the role of human capital accumulation; it tests for the existence of a long-run relationship between total factor productivity and public capital (through previously suggested panel techniques) and for weak exogeneity of public capital; and it assesses the significance of public capital within a non-parametric set-up based on the Free Disposal Hull. The results confirm that public capital has a significant impact on the evolution of total factor productivity, particularly in the Southern regions. This impact is mainly ascribed to the core infrastructures (roads and airports, harbours, railroads, water and electricity, telecommunications). Also, core infrastructures are weakly exogenous. © 2005 Regional Studies Association.

Relevance: 80.00%

Abstract:

Data envelopment analysis (DEA) is a popular non-parametric technique for determining the efficiency of a homogeneous set of decision-making units (DMUs). In many practical cases, there is some doubt whether all the DMUs form a single group with a common efficiency distribution. The Mann-Whitney rank statistic has been used in DEA both to test whether two groups of DMUs come from a common efficiency distribution and to test whether the two groups have a common frontier, each of which is likely to have important but different policy implications for the management of the groups. In this paper it is demonstrated that where the Mann-Whitney rank statistic is used for the second of these purposes it is likely to overestimate programmatic inefficiency, particularly for the smaller group. A new non-parametric statistic is proposed for comparing the efficient frontiers of two groups, which overcomes the problems we identify in the use of the Mann-Whitney rank statistic for this purpose. © 2005 Operational Research Society Ltd. All rights reserved.
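The first use mentioned above, testing whether two groups of DMUs share a common efficiency distribution, is the standard Mann-Whitney test. A minimal sketch with invented efficiency scores follows (the paper's corrective statistic for frontier comparison is not reproduced here):

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)
# hypothetical DEA efficiency scores for two groups of DMUs
group_a = rng.uniform(0.7, 1.0, size=30)
group_b = rng.uniform(0.4, 0.9, size=30)

# H0: the two groups come from a common efficiency distribution
stat, p_value = mannwhitneyu(group_a, group_b, alternative='two-sided')
```

A small p-value here speaks to the distributions, not to the frontiers; as the abstract warns, reusing the same statistic for frontier comparison can overstate programmatic inefficiency.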

Relevance: 80.00%

Abstract:

The aim in this paper is to replicate and extend the analysis of visual technical patterns by Lo et al. (2000) using data on the UK market. A non-parametric smoother is used to model a nonlinear trend in stock price series. Technical patterns, such as the 'head-and-shoulders' pattern, that are characterised by a sequence of turning points are identified in the smoothed data. Statistical tests are used to determine whether returns conditioned on the technical patterns differ from random returns and, in an extension to the analysis of Lo et al. (2000), whether they can outperform a market benchmark return. For the stocks in the FTSE 100 and FTSE 250 indices over the period 1986 to 2001, we find that technical patterns occur at frequencies that differ from one another and from those found in the US market. Our extended statistical testing indicates that UK stock returns are less influenced by technical patterns than was the case for US stock returns.
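The smoothing-then-turning-points step described above can be sketched as follows. This is a generic Nadaraya-Watson smoother applied to an invented series, not Lo et al.'s exact estimator or bandwidth choice:

```python
import numpy as np

def kernel_smooth(prices, bandwidth):
    """Nadaraya-Watson regression of price on the time index (Gaussian kernel)."""
    t = np.arange(len(prices), dtype=float)
    smoothed = np.empty(len(prices))
    for i in range(len(prices)):
        w = np.exp(-0.5 * ((t - t[i]) / bandwidth) ** 2)
        smoothed[i] = np.dot(w, prices) / w.sum()
    return smoothed

def turning_points(s):
    """Indices where the smoothed series changes direction (local extrema)."""
    d = np.sign(np.diff(s))
    return np.where(d[:-1] != d[1:])[0] + 1

# invented example: a noisy cycle whose main extrema the smoother should recover
t = np.linspace(0, 4 * np.pi, 200)
rng = np.random.default_rng(0)
series = np.sin(t) + rng.normal(0.0, 0.1, t.size)
tp = turning_points(kernel_smooth(series, bandwidth=5.0))
```

A pattern such as head-and-shoulders would then be identified by matching the sequence of extrema heights against a template, which is the step the paper's statistical tests condition on.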

Relevance: 80.00%

Abstract:

Fare, Grosskopf, Norris and Zhang developed a non-parametric productivity index, the Malmquist index, using data envelopment analysis (DEA). The Malmquist index is a measure of productivity progress (or regress) and can be decomposed into components such as 'efficiency catch-up' and 'technology change'. However, the Malmquist index and its components are based on only two periods of time, which can capture only part of the impact of investment in long-lived assets. The effects of lags in the investment process on the capital stock have been ignored in the current model of the Malmquist index. This paper extends the recent dynamic DEA models introduced by Emrouznejad and Thanassoulis and by Emrouznejad to a dynamic Malmquist index. It shows that the dynamic productivity results for Organisation for Economic Cooperation and Development countries should reflect reality better than those based on the conventional model.
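The two-period decomposition that the dynamic model extends can be written down directly once the four distance-function values are available. In the sketch below they are supplied as illustrative numbers rather than computed from a DEA model:

```python
import math

def malmquist(d_tt, d_t_t1, d_t1_t, d_t1_t1):
    """Fare et al. decomposition of the two-period Malmquist index.
    d_ab denotes the distance function of period-b data measured
    against the period-a technology (values here are invented)."""
    efficiency_change = d_t1_t1 / d_tt                      # 'catch-up'
    technical_change = math.sqrt((d_t_t1 / d_t1_t1) *       # frontier shift
                                 (d_tt / d_t1_t))
    return efficiency_change * technical_change, efficiency_change, technical_change

m, ec, tc = malmquist(0.8, 1.0, 0.9, 0.9)
```

The product of the two components recovers the usual geometric-mean form of the index, sqrt((d_t_t1 * d_t1_t1) / (d_tt * d_t1_t)); values above one indicate productivity progress.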

Relevance: 80.00%

Abstract:

Recent animal studies highlighting the relationship between functional imaging signals and the underlying neuronal activity have revealed the potential capabilities of non-invasive methods. However, the valuable exchange of information between animal and human studies remains restricted by the limited evidence of direct physiological links between species. In this study we used magnetoencephalography (MEG) to investigate the occurrence of 30-70 Hz (gamma) oscillations in human visual cortex, induced by the presentation of visual stimuli of varying contrast. These oscillations, well described in the animal literature, were observed in retinotopically concordant locations of visual cortex and show striking similarity to those found in primate visual cortex using surgically implanted electrodes. The amplitude of the gamma oscillations increases linearly with stimulus contrast in strong correlation with the gamma oscillations found in the local field potential (LFP) of the macaque. We demonstrate that non-invasive magnetic field measurements of gamma oscillations in human visual cortex concur with invasive measures of activation in primate visual cortex, suggesting both a direct representation of underlying neuronal activity and a concurrence between human and primate cortical activity. © 2005 Elsevier Inc. All rights reserved.

Relevance: 80.00%

Abstract:

Traditionally, geostatistical algorithms are contained within specialist GIS and spatial statistics software. Such packages are often expensive, with relatively complex user interfaces and steep learning curves, and cannot be easily integrated into more complex process chains. In contrast, Service Oriented Architectures (SOAs) promote interoperability and loose coupling within distributed systems, typically using XML (eXtensible Markup Language) and Web services. Web services provide a mechanism for a user to discover and consume a particular process, often as part of a larger process chain, with minimal knowledge of how it works. Wrapping current geostatistical algorithms with a Web service layer would thus increase their accessibility, but raises several complex issues. This paper discusses a solution to providing interoperable, automatic geostatistical processing through the use of Web services, developed in the INTAMAP project (INTeroperability and Automated MAPping). The project builds upon Open Geospatial Consortium standards for describing observations, typically used within sensor webs, and employs Geography Markup Language (GML) to describe the spatial aspect of the problem domain. Thus the interpolation service is extremely flexible, being able to support a range of observation types, and can cope with issues such as change of support and differing error characteristics of sensors (by utilising descriptions of the observation process provided by SensorML). XML is accepted as the de facto standard for describing Web services, due to its expressive capabilities which allow automatic discovery and consumption by ‘naive’ users. Any XML schema employed must therefore be capable of describing every aspect of a service and its processes. However, no schema currently exists that can define the complex uncertainties and modelling choices that are often present within geostatistical analysis. 
We show a solution to this problem, developing a family of XML schemata to enable the description of a full range of uncertainty types. These types will range from simple statistics, such as the kriging mean and variances, through to a range of probability distributions and non-parametric models, such as realisations from a conditional simulation. By employing these schemata within a Web Processing Service (WPS) we show a prototype moving towards a truly interoperable geostatistical software architecture.

Relevance: 80.00%

Abstract:

This paper develops a productivity index applicable when producers are cost minimisers and input prices are known. The index is inspired by the Malmquist index as extended to productivity measurement. The index developed here is defined in terms of input cost rather than input quantity distance functions. Hence, productivity change is decomposed into overall efficiency and cost technical change. Furthermore, overall efficiency change is decomposed into technical and allocative efficiency change and cost technical change into a part capturing shifts of input quantities and shifts of relative input prices. These decompositions provide a clearer picture of the root sources of productivity change. They are illustrated here in a sample of hospitals; results are computed using non-parametric mathematical programming. © 2003 Elsevier B.V. All rights reserved.

Relevance: 80.00%

Abstract:

When testing the difference between two groups, if previous data indicate non-normality, then either transform the data (if they comprise percentages, integers or scores) or use a non-parametric test. If there is uncertainty about whether the data are normally distributed, then deviations from normality are likely to be small if the data are measurements to three significant figures. Unless there is clear evidence that the distribution is non-normal, it is more efficient to use the conventional t-tests. It is poor statistical practice to carry out both the parametric and non-parametric tests on a set of data and then choose the result that is most convenient to the investigator!
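The advice above can be expressed as a single decision rule committed to before looking at the results. The sketch below uses a Shapiro-Wilk normality check as the "clear evidence" criterion; that particular check and the threshold are choices of this illustration, not prescriptions from the text, and the data are invented:

```python
import numpy as np
from scipy import stats

def compare_two_groups(a, b, alpha=0.05):
    """Pick one test up front: a conventional t-test unless there is
    clear evidence of non-normality in either group (Shapiro-Wilk)."""
    looks_normal = (stats.shapiro(a).pvalue > alpha and
                    stats.shapiro(b).pvalue > alpha)
    if looks_normal:
        return 't-test', stats.ttest_ind(a, b).pvalue
    return 'mann-whitney', stats.mannwhitneyu(a, b, alternative='two-sided').pvalue

# strongly skewed (exponential) data should trigger the non-parametric branch
rng = np.random.default_rng(7)
name, p = compare_two_groups(rng.exponential(1.0, 100), rng.exponential(1.5, 100))
```

Because the rule is fixed in advance, only one p-value is ever produced per data set, which avoids the practice the Statnote warns against.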

Relevance: 80.00%

Abstract:

There may be circumstances where it is necessary for microbiologists to compare variances rather than means, e.g., in analysing data from experiments to determine whether a particular treatment alters the degree of variability, or in testing the assumption of homogeneity of variance prior to other statistical tests. All of the tests described in this Statnote have their limitations: Bartlett’s test may be too sensitive, and Levene’s and the Brown-Forsythe tests also have problems. We would recommend the use of the variance-ratio test to compare two variances and the careful application of Bartlett’s test if there are more than two groups. Considering that these tests are not particularly robust, it should be remembered that the homogeneity of variance assumption is usually the least important of those considered when carrying out an ANOVA. If there is concern about this assumption, and especially if the other assumptions of the analysis are also unlikely to be met (e.g., lack of normality or non-additivity of treatment effects), then it may be better either to transform the data or to carry out a non-parametric test on the data.
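The two-group variance-ratio (F) test recommended above is easy to state explicitly. A minimal sketch follows (the normality assumption noted in the text applies); scipy's `bartlett` and `levene` cover the multi-group cases:

```python
import numpy as np
from scipy import stats

def variance_ratio_test(a, b):
    """Two-sided F test of equality of two variances (assumes normality)."""
    v_a, v_b = np.var(a, ddof=1), np.var(b, ddof=1)
    # put the larger variance on top so the ratio F >= 1
    (v1, n1), (v2, n2) = sorted([(v_a, len(a)), (v_b, len(b))], reverse=True)
    f = v1 / v2
    p = 2 * stats.f.sf(f, n1 - 1, n2 - 1)
    return f, min(p, 1.0)

# for more than two groups: stats.bartlett(*groups) or stats.levene(*groups)
f, p = variance_ratio_test([1, 2, 3, 4, 5], [2, 3, 4, 5, 6])
```

These two invented samples have identical sample variances, so the ratio is 1 and the test gives no evidence of heterogeneity.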

Relevance: 80.00%

Abstract:

Pearson's correlation coefficient (‘r’) is one of the most widely used of all statistics. Nevertheless, care needs to be used in interpreting the results because with large numbers of observations, quite small values of ‘r’ become significant and the X variable may only account for a small proportion of the variance in Y. Hence, ‘r squared’ should always be calculated and included in a discussion of the significance of ‘r’. The use of ‘r’ also assumes that the data follow a bivariate normal distribution (see Statnote 17) and this assumption should be examined prior to the study. If the data do not conform to such a distribution, the use of a non-parametric correlation coefficient should be considered. A significant correlation should not be interpreted as indicating ‘causation’ especially in observational studies, in which the two variables may be correlated because of their mutual correlations with other confounding variables.
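The points above about 'r', 'r squared' and a non-parametric fallback can be shown in a few lines; the data are invented for illustration:

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])

r, p = stats.pearsonr(x, y)
r_squared = r ** 2            # proportion of variance in y accounted for by x

# non-parametric alternative if bivariate normality is doubtful
rho, p_rho = stats.spearmanr(x, y)
```

Reporting `r_squared` alongside `r` makes clear how much of the variance in y the linear relationship actually accounts for, which is the point the Statnote stresses for large samples; and as it also stresses, neither coefficient licenses a causal interpretation.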

Relevance: 80.00%

Abstract:

Distributed Brillouin sensing of strain and temperature works by making spatially resolved measurements of the position of the measurand-dependent extremum of the resonance curve associated with the scattering process in the weakly nonlinear regime. Typically, measurements of backscattered Stokes intensity (the dependent variable) are made at a number of predetermined fixed frequencies covering the design measurand range of the apparatus and combined to yield an estimate of the position of the extremum. The measurand can then be found because its relationship to the position of the extremum is assumed known. We present analytical expressions relating the relative error in the extremum position to experimental errors in the dependent variable. This is done for two cases: (i) a simple non-parametric estimate of the mean based on moments and (ii) the case in which a least squares technique is used to fit a Lorentzian to the data. The question of statistical bias in the estimates is discussed and in the second case we go further and present for the first time a general method by which the probability density function (PDF) of errors in the fitted parameters can be obtained in closed form in terms of the PDFs of the errors in the noisy data.
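The least-squares case (ii) above can be sketched with a generic nonlinear fit of a Lorentzian to noisy intensity-versus-frequency samples. The resonance parameters, frequency range and noise level below are invented for illustration, not taken from the paper's apparatus:

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(f, a, f0, gamma):
    """Lorentzian resonance: amplitude a, extremum position f0, half-width gamma."""
    return a * gamma ** 2 / ((f - f0) ** 2 + gamma ** 2)

# synthetic Brillouin gain curve sampled at fixed, predetermined frequencies
freqs = np.linspace(10.6, 11.0, 41)        # GHz (invented range)
rng = np.random.default_rng(1)
intensity = lorentzian(freqs, 1.0, 10.82, 0.03) + rng.normal(0.0, 0.01, freqs.size)

popt, pcov = curve_fit(lorentzian, freqs, intensity, p0=[1.0, 10.8, 0.05])
f0_estimate = popt[1]                # estimated extremum position
f0_stderr = np.sqrt(pcov[1, 1])      # its standard error from the fit covariance
```

The covariance matrix returned by the fit gives only a Gaussian-approximation error bar; deriving the full probability density function of the fitted parameters from the noise PDF is the contribution of the paper itself.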

Relevance: 80.00%

Abstract:

The aim of this project was to carry out a fundamental study to assess the potential of colour image analysis for use in investigations of fire damaged concrete. This involved: (a) quantification (rather than purely visual assessment) of colour change as an indicator of the thermal history of concrete; (b) quantification of the nature and intensity of crack development as an indication of the thermal history of concrete, supporting and in addition to colour change observations; (c) further understanding of changes in the physical and chemical properties of aggregate and mortar matrix after heating; and (d) an indication of the relationship between cracking and non-destructive methods of testing, e.g. UPV or Schmidt hammer. Results showed that colour image analysis could be used to quantify the colour changes found when concrete is heated. Development of red colour coincided with significant reduction in compressive strength. Such measurements may be used to determine the thermal history of concrete by providing information regarding the temperature distribution that existed at the height of a fire. The actual colours observed depended on the types of cement and aggregate that were used to make the concrete. With some aggregates it may be more appropriate to analyse only the mortar matrix. Petrographic techniques may also be used to determine the nature and density of cracks developing at elevated temperatures, and values of crack density correlate well with measurements of residual compressive strength. Small differences in crack density were observed with different cements and aggregates, although good correlations were always found with the residual compressive strength. Taken together these two techniques can provide further useful information for the evaluation of fire damaged concrete. This is especially so since petrographic analysis can also provide information on the quality of the original concrete, such as cement content and water/cement ratio.
Concretes made with blended cements tended to produce small differences in physical and chemical properties compared to those made with unblended cements. There is some evidence to suggest that a coarsening of pore structure in blended cements may lead to the onset of cracking at lower temperatures. The use of DTA/TGA was of little use in assessing the thermal history of concrete made with blended cements. Corner spalling and sloughing off, as observed in columns, was effectively reproduced in tests on small-scale specimens and the crack distributions measured. Relationships between compressive strength/cracking and non-destructive methods of testing are discussed, and an outline procedure for site investigations of fire damaged concrete is described.