999 results for statistical efficiency
Abstract:
This paper does two things. First, it presents alternative approaches to the standard methods of estimating productive efficiency using a production function. It favours a parametric approach (viz. the stochastic production frontier approach) over a nonparametric approach (e.g. data envelopment analysis), and, further, one that provides a statistical explanation of efficiency as well as an estimate of its magnitude. Second, it illustrates the favoured approach (i.e. the 'single-stage procedure') with estimates of two models of explained inefficiency, using data from the Thai manufacturing sector after the crisis of 1997. Technical efficiency is modelled as dependent on capital investment in three major areas (viz. land, machinery and office appliances): land is intended to proxy the effects of unproductive, speculative capital investment, while machinery and office appliances are intended to proxy the effects of productive, non-speculative capital investment. The estimates from these models cast new light on the five-year post-1997 crisis period in Thailand, suggesting a structural shift from relatively labour-intensive to relatively capital-intensive production in manufactures from 1998 to 2002.
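For readers unfamiliar with the stochastic frontier idea, the following is a minimal sketch (in Python, not the authors' code) of a normal/half-normal production frontier estimated by maximum likelihood on synthetic data. All variable names and parameter values are illustrative, and the paper's single-stage specification additionally makes the inefficiency term depend on the capital-investment covariates, which this sketch omits.

    # Sketch: normal/half-normal stochastic production frontier by ML (synthetic data).
    import numpy as np
    from scipy.stats import norm
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    n = 500
    x = rng.uniform(1, 5, size=(n, 2))               # log capital, log labour (synthetic)
    v = rng.normal(0, 0.2, n)                        # symmetric noise
    u = np.abs(rng.normal(0, 0.3, n))                # one-sided inefficiency
    y = 1.0 + 0.4 * x[:, 0] + 0.6 * x[:, 1] + v - u  # log output

    def neg_loglik(theta):
        b0, b1, b2, ln_sv, ln_su = theta
        sv, su = np.exp(ln_sv), np.exp(ln_su)
        sigma = np.hypot(sv, su)                     # sqrt(sv^2 + su^2)
        lam = su / sv
        eps = y - (b0 + b1 * x[:, 0] + b2 * x[:, 1])
        ll = (np.log(2) - np.log(sigma) + norm.logpdf(eps / sigma)
              + norm.logcdf(-eps * lam / sigma))
        return -ll.sum()

    res = minimize(neg_loglik, x0=[0, 0.5, 0.5, -1, -1], method="BFGS")
    print(res.x)  # frontier coefficients and log scale parameters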
Abstract:
The aim of this paper is to trace the origins of utility regulation in Spain and to analyse, from a microeconomic perspective, its characteristics and its impact on consumers and utilities. Madrid and the Madrilenian utilities are taken as a case study. The electric industry in the period studied was a natural monopoly: each of the three phases of production (generation, transmission and distribution) had natural monopoly characteristics, so the most efficient way to generate, transmit and distribute electricity was a monopoly, because one firm can produce a given quantity at a lower cost than the sum of the costs incurred by two or more firms. A problem arises because a single provider can charge prices above marginal cost, that is, monopoly prices. When a monopolist reduces the quantity produced, the price increases and consumers demand less than the economically efficient level, incurring a loss of consumer surplus. This loss is not completely captured by the monopolist, so there is a loss of social surplus, a deadweight loss. The main objective of regulation is to minimise this deadweight loss. Regulation is also needed because when the monopolist sets output where marginal revenue equals marginal cost, the resulting price creates an incentive for other firms to enter the market, generating inefficiency. The Madrilenian industry has been chosen because of the availability of statistical information on costs and production. The complex industry structure and the atomised demand add interest to the analysis. This study also sheds some light on the tariff regulation of the period, which has been little studied, and complements the literature on the regulation of US electric utilities, where a different type of regulation was implemented.
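As a worked illustration of the deadweight loss the abstract describes (hypothetical numbers, not taken from the Madrid data): with linear demand P = a - bQ and constant marginal cost c, the monopolist produces where marginal revenue equals marginal cost, and the welfare triangle between the demand curve and marginal cost is lost.

    # Illustrative numbers only: deadweight loss of a monopoly with linear
    # demand P = a - b*Q and constant marginal cost c.
    a, b, c = 100.0, 1.0, 20.0
    Q_comp = (a - c) / b            # efficient output, where price = marginal cost
    Q_mono = (a - c) / (2 * b)      # monopolist sets MR = a - 2*b*Q equal to MC
    P_mono = a - b * Q_mono
    dwl = 0.5 * (P_mono - c) * (Q_comp - Q_mono)   # triangle between demand and MC
    print(Q_comp, Q_mono, P_mono, dwl)             # 80.0 40.0 60.0 800.0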
Abstract:
Gumbel analyses were carried out on rainfall time series at 151 locations in Switzerland for four different 30-year periods in order to estimate daily extreme precipitation for a return period of 100 years. These estimates were compared with the maximum daily values measured during the last 100 years (1911-2010) to test the efficiency of the analyses. The comparison shows that the analyses provide good results for 50 to 60% of locations in the country when based on the 1961-1990 and 1980-2010 time series. On the other hand, daily precipitation with a return period of 100 years is underestimated at most locations when estimated from the 1931-1960 and especially the 1911-1940 series. This underestimation results from the increase in maximum daily precipitation recorded from 1911 to 2010 at 90% of locations in Switzerland.
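A minimal sketch of the Gumbel estimation described above, assuming SciPy and a synthetic 30-year series of annual daily maxima (the study's actual station data are not reproduced here):

    # Fit a Gumbel distribution to annual daily maxima and estimate the
    # 100-year return level.
    import numpy as np
    from scipy.stats import gumbel_r

    rng = np.random.default_rng(1)
    annual_max = gumbel_r.rvs(loc=60, scale=15, size=30, random_state=rng)  # mm/day

    loc, scale = gumbel_r.fit(annual_max)
    x100 = gumbel_r.ppf(1 - 1 / 100, loc=loc, scale=scale)  # 100-year daily precipitation
    print(round(x100, 1))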
Abstract:
The objective of this study is to analyse the technical or productive efficiency of the refuse collection services in 75 municipalities located in the Spanish region of Catalonia. The analysis was carried out using various techniques: first a deterministic parametric frontier, then a stochastic parametric frontier and, finally, various non-parametric approaches (DEA and FDH). The results naturally differ according to the technique used to approach the frontier. Nevertheless, they appear solid, at least with regard to the ordinal concordance among the efficiency indices obtained by the different approaches, as demonstrated by the statistical tests used. Finally, we looked for any relationship between efficiency and the form of management (public or private) of the services. No significant relationship was found between the type of management and the efficiency indices.
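The ordinal-concordance check mentioned above can be illustrated with rank-correlation statistics; the sketch below uses toy efficiency scores, not the study's estimates.

    # Rank correlation between efficiency scores from two frontier methods.
    from scipy.stats import spearmanr, kendalltau

    eff_sfa = [0.71, 0.85, 0.64, 0.92, 0.58]   # e.g. stochastic frontier scores
    eff_dea = [0.75, 0.88, 0.60, 0.95, 0.55]   # e.g. DEA scores, same municipalities

    rho, p_rho = spearmanr(eff_sfa, eff_dea)
    tau, p_tau = kendalltau(eff_sfa, eff_dea)
    print(rho, p_rho, tau, p_tau)              # high rho/tau = concordant rankings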
Abstract:
Statistical models allow the representation of data sets and the estimation and/or prediction of the behaviour of a given variable through its interaction with the other variables involved in a phenomenon. Among these models are autoregressive state-space (ARSS) models and linear regression (LR) models, which allow the quantification of the relationships among soil-plant-atmosphere system variables. To compare the quality of ARSS and LR models for modelling the relationships between soybean yield and soil physical properties, Akaike's Information Criterion (AIC), which provides a coefficient for the selection of the best model, was used in this study. The data were sampled in a Rhodic Acrudox soil along a spatial transect of 84 points spaced 3 m apart. At each sampling point, soybean samples were collected for yield quantification; soil penetration resistance was also measured, and soil samples were collected to determine soil bulk density in the 0-0.10 m and 0.10-0.20 m layers. Results showed autocorrelation and a cross-correlation structure between soybean yield and soil penetration resistance. Soil bulk density, however, was autocorrelated only in the 0-0.10 m layer and was not cross-correlated with soybean yield. By Akaike's Information Criterion, the autoregressive state-space models were more efficient than the equivalent simple and multiple linear regression models: their AIC values were lower for all combinations of explanatory variables.
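A minimal sketch of AIC-based model comparison in the spirit of the study (not its exact ARSS specification), assuming statsmodels: an ordinary regression and an AR(1) state-space model with the same explanatory variable are fitted to synthetic data and compared through their AIC, the lower value indicating the better model.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 84                                    # transect length used in the study
    pr = rng.normal(2.5, 0.5, n)              # soil penetration resistance (synthetic)
    yield_ = 3.0 - 0.4 * pr + np.cumsum(rng.normal(0, 0.05, n))  # spatially correlated yield

    ols = sm.OLS(yield_, sm.add_constant(pr)).fit()              # independent errors
    ar1 = sm.tsa.SARIMAX(yield_, exog=pr, order=(1, 0, 0)).fit(disp=False)  # AR(1) state space
    print(ols.aic, ar1.aic)                   # lower AIC indicates the better model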
Abstract:
Soil penetration resistance (PR) is a measure of soil compaction closely related to soil structure and plant growth. However, the variability of PR hampers statistical analyses. This study aimed to evaluate the effect of the variability of soil PR on the efficiency of parametric and nonparametric analyses in identifying significant effects of soil compaction, and to classify the coefficient of variation of PR as low, medium, high or very high. On six dates, the PR of a typical dystrophic Red Ultisol under continuous no-tillage for 16 years was measured. Three tillage and/or traffic conditions were established: (i) no chiseling or additional traffic, (ii) additional compaction, and (iii) chiseling. On each date, the nineteen PR readings (taken every 1.5 cm to a depth of 28.5 cm) were grouped into layers of different thicknesses. In each layer, the treatment effects were evaluated by analysis of variance (ANOVA) and by the Kruskal-Wallis test in a completely randomized design, and the coefficients of variation of all analyses were classified as low, medium, high or very high. The ANOVA performed better in discriminating the compaction effects, but the rejection rate of the null hypothesis decreased from 100 to 80% as the coefficient of variation increased from 15 to 26%. The values of 15 and 26% were the thresholds separating the low/medium and the high/very high coefficient of variation classes of PR in this Ultisol.
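A sketch of the two analyses being compared, applied to toy PR data for the three treatments, together with the coefficient of variation used for the classification; all values are illustrative.

    import numpy as np
    from scipy.stats import f_oneway, kruskal

    rng = np.random.default_rng(3)
    control = rng.normal(2.0, 0.3, 10)     # no chiseling, no extra traffic (MPa)
    compacted = rng.normal(2.6, 0.3, 10)   # additional compaction
    chiseled = rng.normal(1.5, 0.3, 10)    # chiseling

    print(f_oneway(control, compacted, chiseled))   # parametric ANOVA
    print(kruskal(control, compacted, chiseled))    # nonparametric Kruskal-Wallis

    pr = np.concatenate([control, compacted, chiseled])
    cv = 100 * pr.std(ddof=1) / pr.mean()           # CV (%); study thresholds: 15 and 26%
    print(round(cv, 1))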
Roadway Lighting and Safety: Phase II – Monitoring Quality, Durability and Efficiency, November 2011
Abstract:
This Phase II project follows a previous project titled Strategies to Address Nighttime Crashes at Rural, Unsignalized Intersections. Based on the results of the previous study, the Iowa Highway Research Board (IHRB) indicated interest in pursuing further research addressing the quality of lighting, rather than just the presence of light, with respect to safety. The research team supplemented the literature review from the previous study, specifically addressing how lighting level is measured, the relationship between light levels and safety, and lamp durability and efficiency. The Center for Transportation Research and Education (CTRE) teamed with a national research leader in roadway lighting, the Virginia Tech Transportation Institute (VTTI), to collect the data. Integral to the data collection effort was the creation of the Roadway Monitoring System (RMS), which allowed the research team to collect lighting data and approach information for each rural intersection identified in the previous phase. After data cleanup, the final data set contained illuminance data for 101 lighted intersections (of the 137 lighted intersections in the first study). Data analysis included a robust statistical analysis based on Bayesian techniques. Average illuminance, average glare, and average uniformity ratio values were used to classify the quality of lighting at the intersections.
Abstract:
In this study we evaluate the angular effects that alter the spectral response of land cover across multi-angle remote sensing image acquisitions. The shift in the statistical distribution of the pixels observed in an in-track sequence of WorldView-2 images is analyzed by means of a kernel-based measure of distance between probability distributions. The portability of supervised classifiers across the sequence is then investigated by looking at the evolution of classification accuracy as the observation angle changes. In this context, the efficiency of various physically and statistically based preprocessing methods in obtaining angle-invariant data spaces is compared and possible synergies are discussed.
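One common kernel-based distance between pixel distributions is the maximum mean discrepancy (MMD); the sketch below computes a biased MMD estimate with an RBF kernel on synthetic multispectral samples. The paper's exact measure may differ.

    import numpy as np

    def rbf(a, b, gamma):
        # RBF kernel matrix between two sample sets
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def mmd2(x, y, gamma=0.1):
        # biased estimate of the squared maximum mean discrepancy
        return rbf(x, x, gamma).mean() + rbf(y, y, gamma).mean() - 2 * rbf(x, y, gamma).mean()

    rng = np.random.default_rng(4)
    pixels_view1 = rng.normal(0.0, 1.0, (200, 8))   # 8-band spectra, first view angle
    pixels_view2 = rng.normal(0.3, 1.0, (200, 8))   # same class, later view angle
    print(mmd2(pixels_view1, pixels_view2))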
Abstract:
The objective of this study was to evaluate the efficiency of spatial statistical analysis in the selection of genotypes in a plant breeding program and, in particular, to demonstrate the benefits of the approach when experimental observations are not spatially independent. The basic material was a yield trial of soybean lines, with five check varieties (fixed effects) and 110 test lines (random effects), in an augmented block design. The spatial analysis used a random field linear model (RFML), with a covariance function estimated from the residuals of the analysis assuming independent errors. Results showed residual autocorrelation of significant magnitude and extension (range); applying the spatial analysis allowed a better discrimination among genotypes (greater power of the statistical tests, smaller standard errors of estimates and predictors, and a greater amplitude of predictor values). Furthermore, the spatial analysis led to a different ranking of the genetic materials than the non-spatial analysis, and yielded a selection less influenced by local variation effects.
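A rough sketch of the idea behind such a spatial analysis, assuming statsmodels: generalized least squares with an exponential spatial covariance over plot positions replaces the independent-errors assumption. The covariance range and all data here are illustrative, not the RFML fit used in the study.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(5)
    n = 60
    coords = np.arange(n, dtype=float)          # plot positions along the field
    X = sm.add_constant(rng.normal(size=n))     # a single fixed effect (illustrative)
    dist = np.abs(coords[:, None] - coords[None, :])
    sigma = np.exp(-dist / 5.0)                 # exponential covariance, range 5 plots
    y = X @ [4.0, 0.5] + rng.multivariate_normal(np.zeros(n), 0.2 * sigma)

    gls = sm.GLS(y, X, sigma=sigma).fit()       # spatial model
    ols = sm.OLS(y, X).fit()                    # independent-errors model
    print(gls.bse, ols.bse)                     # standard errors typically differ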
Abstract:
BACKGROUND: PCR has the potential to detect and precisely quantify specific DNA sequences, but it is not yet often used as a fully quantitative method. A number of data collection and processing strategies have been described for the implementation of quantitative PCR. However, they can be experimentally cumbersome, their relative performances have not been evaluated systematically, and they often remain poorly validated statistically and/or experimentally. In this study, we evaluated the performance of known methods and compared them with newly developed data processing strategies in terms of resolution, precision and robustness. RESULTS: Our results indicate that simple methods that do not rely on estimating the efficiency of PCR amplification may provide reproducible and sensitive data, but that they do not quantify DNA with precision. Other evaluated methods, based on sigmoidal or exponential curve fitting, generally had both poor resolution and poor precision. A statistical analysis of the parameters that influence efficiency indicated that efficiency depends mostly on the selected amplicon and, to a lesser extent, on the particular biological sample analyzed. We therefore devised various strategies based on individual or averaged efficiency values, which were used to assess the regulated expression of several genes in response to a growth factor. CONCLUSION: Overall, qPCR data analysis methods differ significantly in their performance, and this analysis identifies methods that provide DNA quantification estimates of high precision, robustness and reliability. These methods allow reliable estimation of relative expression ratios of two-fold or higher, and our analysis provides an estimate of the number of biological samples that must be analyzed to achieve a given precision.
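An efficiency-based relative expression ratio in the spirit of the strategies discussed above can be computed as in the Pfaffl method; the sketch below uses hypothetical efficiencies and Ct values, not data from the study.

    # Pfaffl-style relative expression ratio (illustrative values).
    E_target, E_ref = 1.95, 1.90           # amplification efficiencies per cycle
    dCt_target = 24.0 - 21.5               # Ct(control) - Ct(treated), target gene
    dCt_ref = 22.0 - 21.9                  # same difference for the reference gene

    ratio = (E_target ** dCt_target) / (E_ref ** dCt_ref)
    print(round(ratio, 2))                 # relative expression of target vs reference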
Abstract:
Methods used to analyze one type of nonstationary stochastic process, the periodically correlated process, are considered. Two methods of one-step-ahead prediction of periodically correlated time series are examined: an autoregression model and an artificial neural network with one hidden layer of neurons and a mechanism for adapting the network parameters in a moving time window, compared in terms of efficiency. The comparison showed that, for one-step-ahead prediction of time series of mean monthly water discharge, the simpler autoregression model is more efficient.
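A minimal sketch of the simpler of the two predictors, assuming statsmodels: an autoregressive model produces a one-step-ahead forecast of a synthetic monthly discharge series.

    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg

    rng = np.random.default_rng(6)
    t = np.arange(240)
    discharge = 100 + 30 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, 240)  # m^3/s

    model = AutoReg(discharge[:-1], lags=12).fit()       # hold out the last month
    forecast = model.predict(start=len(discharge) - 1, end=len(discharge) - 1)
    print(forecast[0], discharge[-1])                    # prediction vs actual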
Abstract:
The aim of this thesis is to examine the efficiency of the Chinese stock markets and the validity of the random walk hypothesis. A further aim is to determine whether a day-of-the-week anomaly exists in the Chinese stock markets. The data consist of the daily logarithmic returns of the Shanghai Stock Exchange A-share, B-share and composite indices and the Shenzhen composite index from 21 February 1992 to 30 December 2005, and of the Shenzhen Stock Exchange A-share and B-share indices from 5 October 1992 to 30 December 2005. Four statistical methods are used: the autocorrelation test, the nonparametric runs test, the variance ratio test, and the Augmented Dickey-Fuller unit root test. The day-of-the-week anomaly is examined using ordinary least squares (OLS) regression. The tests are performed both on the full sample and on three separate sub-periods. The empirical results of this thesis support earlier findings on the inefficiency of the Chinese stock markets. Except for the results of the unit root tests, the random walk hypothesis was rejected for both Chinese stock exchanges on the basis of the autocorrelation, runs and variance ratio tests. The results show that on both exchanges the behaviour of the B-share indices deviated considerably more from the random walk hypothesis than that of the A-share indices. Except for the B-share markets, the efficiency of both Chinese stock markets also appeared to improve after the market boom of 2001. The results also indicate a day-of-the-week anomaly on the Shanghai Stock Exchange, but not on the Shenzhen Stock Exchange, over the full sample period.
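Two of the four tests named above can be sketched as follows, assuming statsmodels and synthetic log returns in place of the index data: the Ljung-Box autocorrelation test and the Augmented Dickey-Fuller unit root test.

    import numpy as np
    from statsmodels.tsa.stattools import adfuller
    from statsmodels.stats.diagnostic import acorr_ljungbox

    rng = np.random.default_rng(7)
    log_returns = rng.normal(0, 0.01, 1000)          # i.i.d. returns: prices follow a random walk

    print(acorr_ljungbox(log_returns, lags=[10]))    # H0: no autocorrelation in returns
    print(adfuller(np.cumsum(log_returns))[:2])      # ADF statistic and p-value on log prices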
Abstract:
Due to increasing waterborne transportation in the Gulf of Finland, the risk of a hazardous accident increases, and therefore manifold preventive actions are needed. As the main legislative authority in the maritime community, the International Maritime Organization (IMO) has laid down laws and recommendations that are utilised, for example, in the safe operation of ships and in pollution prevention. One of these compulsory requirements, the ISM Code, requires a proactive attitude from both the top management and the operational workers in shipping companies. In this study, a cross-sectional approach was taken to analyse whether the ISM Code has actively enhanced maritime safety in the Gulf of Finland. The analysis included: 1) the performance of the ISM Code in Finnish shipping companies, 2) statistical measurements of maritime safety, 3) the influence of corporate top management on the safety culture, and 4) a comparison of safety management practices in the shipping companies and port operations of Finnish maritime and port authorities. The main finding was that the maritime safety culture has developed in the right direction since the launch of the ISM Code in the 1990s. However, this study does not exclusively prove that the improvements are a consequence of the ISM Code. Accident-prone ships can be recognised by their behaviour, and there are lessons to be learned from the safety culture of high-standard safety disciplines such as air traffic. In addition, the reporting of accidents and near-misses should be more widely used in the shipping industry. In conclusion, there is still much to be improved in the maritime safety culture of the Finnish shipping industry; for example, a "no blame culture" needs to be adopted.