948 results for Non-parametric methods
Abstract:
Different types of numerical data can be collected in a scientific investigation, and the choice of statistical analysis will often depend on the distribution of the data. A basic distinction between variables is whether they are ‘parametric’ or ‘non-parametric’. When a variable is parametric, the data come from a symmetrically shaped distribution known as the ‘Gaussian’ or ‘normal’ distribution, whereas non-parametric variables may have a distribution which deviates markedly in shape from normal. This article describes several aspects of the problem of non-normality, including: (1) how to test for two common types of deviation from a normal distribution, viz. ‘skew’ and ‘kurtosis’; (2) how to fit the normal distribution to a sample of data; (3) the transformation of non-normally distributed data and scores; and (4) commonly used ‘non-parametric’ statistics which can be used in a variety of circumstances.
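A minimal sketch of the checks this abstract describes, using SciPy: test a sample for skew and kurtosis, fit a normal distribution to it, and apply a log transformation to a skewed variable. The data below are simulated for illustration only.

```python
# Sketch: normality checks, normal fit, and a transformation (simulated data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.lognormal(mean=0.0, sigma=0.6, size=200)   # right-skewed data

# (1) Tests for skew and kurtosis
z_skew, p_skew = stats.skewtest(sample)
z_kurt, p_kurt = stats.kurtosistest(sample)
print(f"skewness test: z = {z_skew:.2f}, p = {p_skew:.4f}")
print(f"kurtosis test: z = {z_kurt:.2f}, p = {p_kurt:.4f}")

# (2) Fit a normal distribution (maximum-likelihood mean and sd)
mu, sd = stats.norm.fit(sample)
print(f"fitted normal: mean = {mu:.3f}, sd = {sd:.3f}")

# (3) A log transformation often brings right-skewed data closer to normality
transformed = np.log(sample)
print(f"skewness after log transform: {stats.skew(transformed):.3f}")
```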
Abstract:
If, in a correlation test, one or both variables are small whole numbers, scores based on a limited scale, or percentages, a non-parametric correlation coefficient should be considered as an alternative to Pearson’s ‘r’. Kendall’s τ and Spearman’s r_s are similar tests, but the former should be considered if the analysis is to be extended to include partial correlations. If the data contain many tied values, then gamma should be considered as a suitable test.
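A short sketch of computing the non-parametric correlations mentioned above with SciPy. The small integer scores are made up for illustration.

```python
# Non-parametric correlation coefficients on limited-scale scores (made-up data).
from scipy import stats

judge_a = [3, 1, 4, 2, 5, 5, 2, 3, 1, 4]
judge_b = [2, 1, 3, 2, 5, 4, 3, 3, 1, 5]

tau, p_tau = stats.kendalltau(judge_a, judge_b)
rho, p_rho = stats.spearmanr(judge_a, judge_b)
print(f"Kendall tau  = {tau:.3f} (p = {p_tau:.3f})")
print(f"Spearman r_s = {rho:.3f} (p = {p_rho:.3f})")
# Note: SciPy has no built-in Goodman-Kruskal gamma; with many ties it would
# need a separate implementation.
```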
Abstract:
Practitioners assess the performance of entities in increasingly large and complicated datasets. Even if non-parametric models such as Data Envelopment Analysis were ever considered simple push-button technologies, this is impossible when many variables are available or when data have to be compiled from several sources. This paper introduces the 'COOPER-framework', a comprehensive model for carrying out non-parametric projects. The framework consists of six interrelated phases: Concepts and objectives, On structuring data, Operational models, Performance comparison model, Evaluation, and Result and deployment. Each phase describes the necessary steps a researcher should examine for a well-defined and repeatable analysis. The COOPER-framework provides the novice analyst with guidance, structure and advice for a sound non-parametric analysis, while the more experienced analyst benefits from a checklist that ensures important issues are not forgotten. In addition, the use of a standardized framework makes non-parametric assessments more reliable, more repeatable, more manageable, faster and less costly. © 2010 Elsevier B.V. All rights reserved.
Abstract:
The increasing intensity of global competition has led organizations to utilize various types of performance measurement tools to improve the quality of their products and services. Data envelopment analysis (DEA) is a methodology for evaluating and measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. In conventional DEA, all data, including input and/or output ratios, are assumed to take the form of crisp numbers. However, the observed values in real-world problems are sometimes expressed as interval ratios. In this paper, we propose two new models: general and multiplicative non-parametric ratio models for DEA problems with interval data. The contributions of this paper are fourfold: (1) we consider input and output data expressed as interval ratios in DEA; (2) we address the gap in the DEA literature for problems that are not suitable or are difficult to model with crisp values; (3) we propose two new DEA models for evaluating the relative efficiencies of DMUs with interval ratios; and (4) we present a case study involving 20 banks with three interval ratios, where the traditional indicators are mostly financial ratios, to demonstrate the applicability and efficacy of the proposed models. © 2011 Elsevier Inc.
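For readers unfamiliar with DEA, the sketch below shows the standard input-oriented CCR model with crisp data, solved as a linear program with SciPy; it is only a baseline illustration, and the interval-ratio models proposed in the paper are not reproduced. The DMU data are hypothetical.

```python
# Input-oriented CCR DEA (crisp data) as an LP, for illustration only.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency of DMU `o`. X: (m, n) inputs, Y: (s, n) outputs, columns = DMUs."""
    m, n = X.shape
    s = Y.shape[0]
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    # Outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(0, None)] * (n + 1)      # theta >= 0, lambdas >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# Hypothetical example: 5 DMUs, 2 inputs, 1 output
X = np.array([[20., 30., 40., 20., 10.],
              [30., 10., 20., 40., 50.]])
Y = np.array([[1., 1., 1., 1., 1.]])
for j in range(X.shape[1]):
    print(f"DMU {j}: efficiency = {ccr_efficiency(X, Y, j):.3f}")
```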
Abstract:
Grape is one of the world's largest fruit crops, with approximately 67.5 million tonnes produced each year, and energy is an important element in modern grape production, which depends heavily on fossil and other energy resources. Efficient use of these energies is a necessary step toward reducing environmental hazards, preventing the destruction of natural resources and ensuring agricultural sustainability. Hence, identifying excessive use of energy and reducing energy inputs is the main focus of this paper, with the aim of optimizing energy consumption in grape production. In this study we use a two-stage methodology to relate energy efficiency and performance to farmers' specific characteristics. In the first stage, a non-parametric Data Envelopment Analysis is used to model efficiencies as an explicit function of human labor, machinery, chemicals, FYM (farmyard manure), diesel fuel, electricity and water-for-irrigation energies. In the second stage, farm-specific variables such as the farmers' age, gender, level of education and agricultural experience are used in a Tobit regression framework to explain how these factors influence the efficiency of grape farming. The results of the first stage show substantial inefficiency among the grape producers in the studied area, while the second stage shows that the main difference between efficient and inefficient farmers lay in the use of chemicals, diesel fuel and water for irrigation. Efficient farmers applied considerably smaller amounts of chemicals, such as insecticides, herbicides and fungicides, than inefficient ones. The results also revealed that more educated farmers are more energy efficient than their less educated counterparts. © 2013.
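A hedged sketch of the second stage described here: a Tobit (censored) regression of DEA efficiency scores, which are bounded above at 1, on farm characteristics. The variable names and simulated data are hypothetical; only the general two-stage idea is illustrated, with the censored log-likelihood maximised directly.

```python
# Second-stage Tobit regression of efficiency scores (censored at 1) on farm traits.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 200
age = rng.uniform(25, 70, n)
education = rng.integers(0, 16, n)          # years of schooling (hypothetical)
experience = rng.uniform(1, 40, n)
X = np.column_stack([np.ones(n), age, education, experience])

# Simulated latent efficiency, censored from above at 1
latent = 0.4 + 0.002 * age + 0.02 * education + 0.003 * experience + rng.normal(0, 0.1, n)
eff = np.minimum(latent, 1.0)

def neg_loglik(params):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    censored = eff >= 1.0
    ll_unc = norm.logpdf((eff - xb) / sigma) - np.log(sigma)   # uncensored density
    ll_cen = norm.logsf((1.0 - xb) / sigma)                    # P(latent >= 1)
    return -(np.sum(ll_unc[~censored]) + np.sum(ll_cen[censored]))

start = np.r_[np.zeros(X.shape[1]), np.log(0.1)]
res = minimize(neg_loglik, start, method="BFGS")
beta_hat, sigma_hat = res.x[:-1], np.exp(res.x[-1])
print("Tobit coefficients:", np.round(beta_hat, 4), "sigma:", round(sigma_hat, 4))
```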
Abstract:
This thesis considers non-perturbative methods in quantum field theory with applications to gravity and cosmology. In particular, there are chapters on black hole holography, inflationary model building, and the conformal bootstrap.
Abstract:
In marginal lands, Opuntia ficus-indica (OFI) could be used as an alternative fruit and forage crop. Plant vigour and biomass production were evaluated in Portuguese germplasm (15 individuals from 16 ecotypes) by non-destructive methods, two years after planting in a marginal soil under dryland conditions. Two Italian cultivars (Gialla and Bianca) were included in the study for comparison purposes. Biomass production and plant vigour were estimated by measuring the number and area of cladodes and the fresh (FW) and dry weight (DW) per plant. We selected linear models, using the biometric data from 60 cladodes, to predict the cladode area and the FW and DW per plant. Among ecotypes, significant differences were found in the studied biomass-related parameters, and several homogeneous groups were established. Four Portuguese ecotypes had higher biomass production than the others, 3.20 Mg ha−1 on average, a value not significantly different from that of the improved ‘Gialla’ cultivar, which averaged 3.87 Mg ha−1. Those ecotypes could be used to start a breeding program and to deploy material for animal feeding and fruit production.
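A minimal sketch of the kind of linear calibration described: predicting cladode area from simple biometric measurements. The predictors (length and width) and the data are hypothetical; the study's actual model selection is not reproduced.

```python
# Linear calibration of cladode area from biometric measurements (simulated data).
import numpy as np

rng = np.random.default_rng(1)
n = 60                                               # 60 cladodes, as in the abstract
length = rng.uniform(15, 40, n)                      # cm (hypothetical predictor)
width = rng.uniform(10, 25, n)                       # cm (hypothetical predictor)
area = 0.7 * length * width + rng.normal(0, 10, n)   # cm^2, simulated "measured" area

# Ordinary least squares: area ~ intercept + length + width
X = np.column_stack([np.ones(n), length, width])
coef, *_ = np.linalg.lstsq(X, area, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)
print("coefficients [intercept, length, width]:", np.round(coef, 3), "R^2 =", round(r2, 3))
```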
Abstract:
This paper proposes a template for modelling complex datasets that integrates traditional statistical modelling approaches with more recent advances in statistics and modelling through an exploratory framework. Our approach builds on the well-known and long-standing tradition of 'good practice in statistics' by establishing a comprehensive framework for modelling that focuses on exploration, prediction, interpretation and reliability assessment, a relatively new idea that allows individual assessment of predictions. The integrated framework we present comprises two stages. The first involves the use of exploratory methods to help visually understand the data and identify a parsimonious set of explanatory variables. The second encompasses a two-step modelling process, in which non-parametric methods such as decision trees and generalized additive models are used to identify important variables and their modelling relationship with the response before a final predictive model is considered. We focus on fitting the predictive model using parametric, non-parametric and Bayesian approaches. This paper is motivated by a medical problem in which interest focuses on developing a risk stratification system for morbidity among 1,710 cardiac patients, given a suite of demographic, clinical and preoperative variables. Although the methods are applied specifically to this case study, they can be applied in any field, irrespective of the type of response.
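A hedged sketch of the two-step idea sketched above: screen for important variables with a non-parametric learner (a decision tree standing in for the tree/GAM step), then fit a parametric predictive model (logistic regression) on the selected variables. The data are synthetic; the cardiac dataset from the paper is not available here.

```python
# Two-step modelling sketch: tree-based variable screening, then a parametric model.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=1000, n_features=20, n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Step 1: non-parametric screening via tree-based importances
tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_tr, y_tr)
selected = np.argsort(tree.feature_importances_)[::-1][:5]

# Step 2: parametric predictive model on the selected variables
logit = LogisticRegression(max_iter=1000).fit(X_tr[:, selected], y_tr)
auc = roc_auc_score(y_te, logit.predict_proba(X_te[:, selected])[:, 1])
print("selected features:", selected, "test AUC:", round(auc, 3))
```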
Abstract:
Master's dissertation in Corporate Finance
Abstract:
The objective of this paper is to analyse how bank performance has evolved and the effects that ownership type has on it in those Central and Eastern European countries that have most intensely experienced the European integration process in recent years. To this end, we analysed 242 banks from 12 countries (10 new EU members and 2 in the negotiation phase). To verify the existence of an ownership effect, we analyse the dimensions of banking efficiency, profitability, costs and intermediation by applying different techniques, both parametric and non-parametric. The results show the existence of certain effects derived from the type of ownership. Among the main findings, privatised banks tend to show higher levels of profitability than banks with other types of ownership, while foreign-owned banks are, on average, the ones with the lowest cost levels, although this difference is not statistically significant. We also analyse the importance of the presence of a strategic investor in bank ownership, finding an improvement that, while not significant for the profitability ratios, is significant with respect to general management expenses.
Comparison of two atmospheric sampling methodologies using a non-parametric statistical tool
Abstract:
In atmospheric aerosol sampling, it is inevitable that the air carrying the particles is in motion, as a result of both externally driven wind and the sucking action of the sampler itself. High or low air-flow sampling speeds may lead to significant particle-size bias. The objective of this work is the validation of measurements enabling the comparison of species concentrations obtained with the two air-flow sampling techniques. The presence of several outliers and the increase of the residuals with concentration become obvious, requiring non-parametric methods, which are recommended for handling data that may not be normally distributed. In this way, conversion factors are obtained for each of the species under study using Kendall regression.
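A sketch of a robust, Kendall-type regression for deriving a conversion factor between two samplers, using the Theil-Sen estimator (the line fit associated with Kendall's τ). Whether the paper used exactly this estimator is an assumption, and the concentration data below are made up.

```python
# Robust (Theil-Sen) slope as a conversion factor between two samplers (made-up data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
c_ref = rng.uniform(1, 50, 40)                        # reference sampler, ug/m^3
c_alt = 0.85 * c_ref + rng.normal(0, 2, 40)           # alternative sampler
c_alt[::10] += 15                                     # a few outliers

slope, intercept, lo, hi = stats.theilslopes(c_alt, c_ref, 0.95)
tau, p = stats.kendalltau(c_ref, c_alt)
print(f"conversion factor (slope): {slope:.3f}  95% CI [{lo:.3f}, {hi:.3f}]")
print(f"Kendall tau = {tau:.3f}, p = {p:.3g}")
```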
Abstract:
Thirty-seven patients were submitted to kidney transplantation after transfusion at 2-week intervals with 4-week stored blood from their potential donors. All patients and donors were typed for HLA-A-B and DR antigens. The patients were also tested for cytotoxic antibodies against donor antigens before each transfusion. The percentage of panel reactive antibodies (PRA) was determined against a selected panel of 30 cell donors before and after the transfusions. The patients were immunosuppressed with azathioprine and prednisone. Rejection crises were treated with methylprednisolone. The control group consisted of 23 patients who received grafts from an unrelated donor but who did not receive donor-specific pretransplant blood transfusion. The incidence and reversibility of rejection episodes, allograft loss caused by rejection, and patient and graft survival rates were determined for both groups. Non-parametric methods (chi-square and Fisher tests) were used for statistical analysis, with the level of significance set at P<0.05. The incidence and reversibility of rejection crises during the first 60 post-transplant days did not differ significantly between groups. The actuarial graft and patient survival rates at five years were 56% and 77%, respectively, for the treated group and 39.8% and 57.5% for the control group. Graft loss due to rejection was significantly higher in the untreated group (P = 0.0026), which also required more intense immunosuppression (P = 0.0001). We conclude that transfusions using stored blood have the immunosuppressive effect of fresh blood transfusions without the risk of provoking widespread antibody formation. In addition, this method permits a reduction of the immunosuppressive drugs during the process without impairing the adequate functioning of the renal graft.
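A minimal sketch of the kind of tests named in this abstract (chi-square and Fisher's exact test) applied to a 2x2 table of graft loss by group. The counts below are hypothetical, not the study's data.

```python
# Chi-square and Fisher's exact test on a hypothetical 2x2 contingency table.
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

#                  graft lost   graft not lost
table = np.array([[ 4,          33],    # treated (donor-specific transfusion)
                  [10,          13]])   # control

chi2, p_chi2, dof, expected = chi2_contingency(table)
odds_ratio, p_fisher = fisher_exact(table)
print(f"chi-square p = {p_chi2:.4f}, Fisher exact p = {p_fisher:.4f}")
```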
Abstract:
Background: Established in 1999, the Swedish Maternal Health Care Register (MHCR) collects data on pregnancy, birth, and the postpartum period for most pregnant women in Sweden. Antenatal care (ANC) midwives manually enter data into the Web application designed for the MHCR. The aim of this study was to investigate midwives' experiences, opinions and use of the MHCR. Method: A national, cross-sectional questionnaire survey, addressing all Swedish midwives working in ANC, was conducted from January to March 2012. The questionnaire included demographic data, preformed statements with six response options ranging from zero to five (0 = totally disagree and 5 = totally agree), and opportunities to add information or further clarification in the form of free-text comments. Parametric and non-parametric methods and logistic regression analyses were applied, and content analysis was used for the free-text comments. Results: The estimated response rate was 53.1%. Most participants were positive towards the Web application and the variables included in the MHCR. Midwives exclusively engaged in patient-related work tasks perceived the register as burdensome (70.3%), and 44.2% questioned the benefit of the register. The corresponding figures for midwives also engaged in administrative supervision were 37.8% and 18.5%, respectively. Direct electronic transfer of data from the medical records to the MHCR was emphasised as a significant future improvement. In addition, the midwives suggested that new variables of interest should be included in the MHCR, e.g., infertility, outcomes of previous pregnancy and birth, and complications of the index pregnancy. Conclusions: In general, the MHCR was valued positively, although perceived as burdensome. Direct electronic transfer of data from the medical records to the MHCR is a prioritised issue to facilitate the working situation for midwives. Finally, the data suggest that the MHCR is an underused source for operational planning and quality assessment in local ANC centres.
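A hedged illustration of the sort of parametric/non-parametric comparison such a survey could involve: comparing Likert-type agreement scores (0-5) between the two groups of midwives. The scores are simulated; this is not the study's analysis.

```python
# Parametric (Welch t-test) vs non-parametric (Mann-Whitney U) group comparison.
import numpy as np
from scipy.stats import mannwhitneyu, ttest_ind

rng = np.random.default_rng(3)
clinical = rng.integers(0, 6, 120)   # agreement scores, clinical-only midwives (simulated)
admin = rng.integers(0, 6, 60)       # midwives also engaged in administrative supervision

t, p_t = ttest_ind(clinical, admin, equal_var=False)                # parametric
u, p_u = mannwhitneyu(clinical, admin, alternative="two-sided")     # non-parametric
print(f"Welch t-test p = {p_t:.3f}, Mann-Whitney U p = {p_u:.3f}")
```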
Abstract:
In this thesis, we investigate some aspects of the interplay between economic regulation and the risk of the regulated firm. In the first chapter, the main goal is to understand the implications that a mainstream regulatory model (Laffont and Tirole, 1993) has for the systematic risk of the firm. We generalize the model to incorporate aggregate risk, and find that the optimal regulatory contract must be severely constrained in order to reproduce real-world systematic risk levels. We also consider the optimal profit-sharing mechanism, with an endogenous sharing rate, to explore the relationship between contract power and beta. We find results compatible with the available evidence that high-powered regimes impose more risk on the firm. In the second chapter, a joint work with Daniel Lima from the University of California, San Diego (UCSD), we start from the observation that regulated firms are subject to regulatory practices that potentially affect the symmetry of the distribution of their future profits. If these practices are anticipated by investors in the stock market, the pattern of asymmetry in the empirical distribution of stock returns may differ between regulated and non-regulated companies. We review some recently proposed asymmetry measures that are robust to the empirical regularities of return data and use them to investigate whether there are meaningful differences in the distribution of asymmetry between these two groups of companies. In the third and last chapter, three different approaches to the capital asset pricing model of Kraus and Litzenberger (1976) are tested with recent Brazilian data and estimated using the generalized method of moments (GMM) as a unifying procedure. We find that ex-post stock returns generally exhibit statistically significant coskewness with the market portfolio, and hence are sensitive to squared market returns. However, while the theoretical ground for the preference for skewness is well established and fairly intuitive, we did not find supporting evidence that investors require a premium for bearing this risk factor in Brazil.
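A simplified, hedged illustration of the coskewness sensitivity mentioned in the last chapter: regress an asset's excess return on the market excess return and its square (the quadratic market model underlying Kraus-Litzenberger tests). The thesis uses GMM; plain OLS on simulated returns is used here only to show which moment is being tested.

```python
# Quadratic market model: asset return on market return and squared market return.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
T = 500
mkt = rng.normal(0.005, 0.05, T)                                   # market excess returns
asset = 0.8 * mkt + 1.5 * (mkt - mkt.mean()) ** 2 + rng.normal(0, 0.02, T)

X = sm.add_constant(np.column_stack([mkt, (mkt - mkt.mean()) ** 2]))
fit = sm.OLS(asset, X).fit()
print("coefficients [const, market, market^2]:", np.round(fit.params, 3))
print("t-statistics:", np.round(fit.tvalues, 2))   # significance of the squared-market term
```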