887 results for Analysis and statistical methods
Abstract:
Applying programming techniques to detailed data for 406 rice farms in 21 villages, for 1997, produces inefficiency measures, which differ substantially from the results of simple yield and unit cost measures. For the Boro (dry) season, mean technical efficiency was 69.4 per cent, allocative efficiency was 81.3 per cent, cost efficiency was 56.2 per cent and scale efficiency 94.9 per cent. The Aman (wet) season results are similar, but a few points lower. Allocative inefficiency is due to overuse of labour, suggesting population pressure, and of fertiliser, where recommended rates may warrant revision. Second-stage regressions show that large families are more inefficient, whereas farmers with better access to input markets, and those who do less off-farm work, tend to be more efficient. The information on the sources of inter-farm performance differentials could be used by extension agents to help inefficient farmers. There is little excuse for such sub-optimal use of survey data, which are often collected at substantial cost.
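The reported components are consistent with the standard Farrell decomposition, in which cost efficiency is the product of technical and allocative efficiency. A minimal sketch in Python, assuming that decomposition and using the Boro-season figures quoted above:

```python
# Minimal sketch: the Farrell decomposition assumed here is
# cost efficiency = technical efficiency x allocative efficiency.
# Figures are the Boro-season means quoted in the abstract.

technical_efficiency = 0.694
allocative_efficiency = 0.813

cost_efficiency = technical_efficiency * allocative_efficiency
print(f"Implied cost efficiency: {cost_efficiency:.3f}")  # ~0.564, matching the reported 56.2%
```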
Abstract:
The reasons for the spectacular collapse of so many centrally planned economies are a source of ongoing debate. In this paper, we use detailed farm-level data to measure total factor productivity (TFP) changes in Mongolian grain and potato farming during the 14-year period immediately preceding the 1990 economic reforms. We measure TFP growth using stochastic frontier analysis (SFA) and data envelopment analysis (DEA) methods. Our results indicate quite poor overall performance, with an average annual TFP change of -1.7% in grain and 0.8% in potatoes over the 14-year period. However, the pattern of TFP growth changed substantially during this period, with TFP growth exceeding 7% per year in the latter half of the period. This suggests that the new policies of improved education, greater management autonomy, and improved incentives, which were introduced in the final two planning periods in the 1980s, were beginning to have a significant influence upon the performance of Mongolian crop farming.
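For readers unfamiliar with how a multi-year average annual TFP change is summarized, the sketch below annualizes a TFP index series as a geometric mean of year-on-year changes. This is a generic index-number calculation, not the paper's SFA/DEA (Malmquist-type) procedure, and the series shown is hypothetical:

```python
# Illustrative sketch only: annualising a TFP index series as a geometric mean
# of year-on-year changes. The paper itself uses SFA and DEA (Malmquist-type)
# methods; this just shows how a 14-year average annual change is summarised.
import numpy as np

def average_annual_tfp_change(tfp_index):
    """Geometric-mean annual change of a TFP index series (first year = base)."""
    tfp_index = np.asarray(tfp_index, dtype=float)
    years = len(tfp_index) - 1
    return (tfp_index[-1] / tfp_index[0]) ** (1.0 / years) - 1.0

# Hypothetical 15-year index series for illustration (not the paper's data).
grain_tfp = [1.00, 0.97, 0.95, 0.92, 0.90, 0.89, 0.87, 0.85,
             0.84, 0.83, 0.85, 0.88, 0.92, 0.96, 0.98]
print(f"Average annual TFP change: {average_annual_tfp_change(grain_tfp):+.1%}")
```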
Abstract:
In this paper we use sensor-annotated abstraction hierarchies (Reising & Sanderson, 1996, 2002a,b) to show that, unless appropriately instrumented, configural displays designed according to the principles of ecological interface design (EID) might be vulnerable to misinterpretation when sensors become unreliable or are unavailable. Building on foundations established in Reising and Sanderson (2002a), we use a pasteurization process control example to show how sensor-annotated AHs help the analyst determine the impact of different instrumentation engineering policies on a configural display that is part of an ecological interface. Our analyses suggest that configural displays showing higher-order properties of a system are especially vulnerable under some conservative instrumentation configurations. However, sensor-annotated AHs can be used to indicate where corrective instrumentation might be placed. We argue that if EID is to be effectively employed in the design of displays for complex systems, then the information needs of the human operator need to be considered while instrumentation requirements are being formulated. Rasmussen's abstraction hierarchy, and particularly its extension to the analysis of information captured by sensors and derived from sensors, may therefore be a useful adjunct to upstream instrumentation design.
Abstract:
In this paper we establish a foundation for understanding the instrumentation needs of complex dynamic systems if ecological interface design (EID)-based interfaces are to be robust in the face of instrumentation failures. EID-based interfaces often include configural displays, which reveal the higher-order properties of complex systems. However, concerns have been expressed that such displays might be misleading when instrumentation is unreliable or unavailable. Rasmussen's abstraction hierarchy (AH) formalism can be extended to include representations of sensors near the functions or properties about which they provide information, resulting in what we call a sensor-annotated abstraction hierarchy. Sensor-annotated AHs help the analyst determine the impact of different instrumentation engineering policies on higher-order system information by showing how the data provided by individual sensors propagate within and across levels of abstraction in the AH. The use of sensor-annotated AHs with a configural display is illustrated with a simple water reservoir example. We argue that if EID is to be effectively employed in the design of interfaces for complex systems, then the information needs of the human operator need to be considered at the earliest stages of system development, while instrumentation requirements are being formulated. In this way, Rasmussen's AH promotes a formative approach to instrumentation engineering.
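As an illustration of the idea, a sensor-annotated AH can be represented as nodes that carry both their direct sensors and the lower-level nodes they are derived from, so that the loss of a sensor can be traced to the higher-order properties it supports. The sketch below is an interpretive Python rendering under those assumptions, with a hypothetical water-reservoir fragment; it is not the authors' formalism or tooling:

```python
# Illustrative sketch only (not the authors' implementation): a node in an
# abstraction hierarchy is annotated with the sensors that directly measure it
# and with the lower-level nodes it is derived from. A property counts as
# observable if it has a working direct sensor or if all supporting nodes do.

from dataclasses import dataclass, field

@dataclass
class AHNode:
    name: str
    sensors: set = field(default_factory=set)          # sensors measuring this node directly
    derived_from: list = field(default_factory=list)   # lower-level AHNode dependencies

    def observable(self, failed_sensors):
        direct = any(s not in failed_sensors for s in self.sensors)
        derived = bool(self.derived_from) and all(
            child.observable(failed_sensors) for child in self.derived_from
        )
        return direct or derived

# Hypothetical water-reservoir fragment: mass balance derived from flow and level sensors.
inflow = AHNode("inflow rate", sensors={"FT-101"})
outflow = AHNode("outflow rate", sensors={"FT-102"})
level = AHNode("reservoir level", sensors={"LT-201"})
mass_balance = AHNode("mass balance (higher-order)", derived_from=[inflow, outflow, level])

print(mass_balance.observable(failed_sensors=set()))        # True
print(mass_balance.observable(failed_sensors={"FT-102"}))   # False: higher-order property unsupported
```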
Abstract:
In Part 1 of this paper a methodology for back-to-back testing of simulation software was described. Residuals with error-dependent geometric properties were generated. A set of potential coding errors was enumerated, along with a corresponding set of feature matrices, which describe the geometric properties imposed on the residuals by each of the errors. In this part of the paper, an algorithm is developed to isolate the coding errors present by analysing the residuals. A set of errors is isolated when the subspace spanned by their combined feature matrices corresponds to that of the residuals. Individual feature matrices are compared to the residuals and classified as 'definite', 'possible' or 'impossible'. The status of 'possible' errors is resolved using a dynamic subset testing algorithm. To demonstrate and validate the testing methodology presented in Part 1 and the isolation algorithm presented in Part 2, a case study is presented using a model for biological wastewater treatment. Both single and simultaneous errors deliberately introduced into the simulation code are correctly detected and isolated.
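The core comparison described here can be pictured as a subspace test: a residual is checked against the column span of each candidate error's feature matrix. The sketch below is one illustrative reading of the 'definite'/'possible'/'impossible' labels using orthogonal projection; the paper's actual criteria and its dynamic subset testing step are richer:

```python
# Illustrative sketch only: gauge how much of an observed residual lies in the
# column span of a candidate error's feature matrix, and label the candidate.
# The three-way split below is one interpretive reading of the abstract's
# 'definite'/'possible'/'impossible' labels, not the paper's exact criteria.
import numpy as np

def classify_error(residual, feature_matrix, tol=1e-8):
    """Project the residual onto span(feature_matrix) and label the candidate error."""
    Q, _ = np.linalg.qr(feature_matrix)      # orthonormal basis of the error's subspace
    inside = Q @ (Q.T @ residual)            # residual component explained by this error
    explained = np.linalg.norm(inside) / np.linalg.norm(residual)
    if explained < tol:
        return "impossible"                  # residual is orthogonal to this error's subspace
    if explained > 1.0 - tol:
        return "definite"                    # residual lies entirely in this error's subspace
    return "possible"                        # partially consistent; needs further subset testing

# Hypothetical 3-D example: the residual lies in the span of F1 but not of F2.
F1 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
F2 = np.array([[0.0], [0.0], [1.0]])
r = np.array([2.0, -1.0, 0.0])
print(classify_error(r, F1), classify_error(r, F2))   # definite impossible
```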
Abstract:
This paper analyzes the DNA code of several species from the perspective of information content. For that purpose, several concepts and mathematical tools are selected to establish a quantitative method without a priori distorting the alphabet represented by the sequence of DNA bases. The synergies of associating Gray code, histogram characterization and multidimensional scaling visualization lead to a collection of plots with a categorical representation of species and chromosomes.
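The pipeline described (Gray coding, histogram characterization, multidimensional scaling) can be sketched as follows. The base-to-code assignment, the word length k, and the toy sequences are assumptions made purely for illustration; the paper's exact encoding and distance measure may differ:

```python
# Illustrative pipeline sketch: Gray-coded DNA -> word histograms -> MDS.
# The base-to-code mapping and word length k are assumptions for illustration.
from collections import Counter
from itertools import product
import numpy as np
from sklearn.manifold import MDS

GRAY = {"A": "00", "C": "01", "G": "11", "T": "10"}   # 2-bit Gray code (assumed mapping)

def histogram(seq, k=3):
    """Normalised histogram of k-base Gray-coded words (sliding window, one base per step)."""
    bits = "".join(GRAY[b] for b in seq if b in GRAY)
    words = [bits[i:i + 2 * k] for i in range(0, len(bits) - 2 * k + 1, 2)]
    keys = ["".join(w) for w in product("01", repeat=2 * k)]
    counts = Counter(words)
    total = sum(counts.values()) or 1
    return np.array([counts[key] / total for key in keys])

# Hypothetical toy sequences standing in for species/chromosomes.
seqs = {"seq1": "ACGTACGTGGCAT" * 20, "seq2": "TTGACCAGTAGCA" * 20, "seq3": "GGGCCCAATTACG" * 20}
H = np.vstack([histogram(s) for s in seqs.values()])
coords = MDS(n_components=2, dissimilarity="euclidean", random_state=0).fit_transform(H)
print(dict(zip(seqs, coords.round(3).tolist())))      # 2-D categorical map of the sequences
```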
Abstract:
Electricity market players operating in a liberalized environment require access to an adequate decision support tool, allowing them to consider all the business opportunities and to take strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players. For this, the decision support tool must include ancillary market simulation. This paper proposes two different methods (Linear Programming and Genetic Algorithm approaches) for ancillary services dispatch. The methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A test case based on California Independent System Operator (CAISO) data, concerning the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services, is included in this paper.
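The linear programming variant of such a dispatch can be sketched as a cost-minimizing clearing of reserve offers against per-product requirements. The bids, requirements, and single-product offer structure below are hypothetical and are not CAISO data or MASCEM's implementation:

```python
# Illustrative LP dispatch sketch (hypothetical data, not MASCEM's implementation):
# minimise the offer cost of clearing four reserve products subject to requirement
# and per-bid capacity constraints. Each bid offers a single product here.
import numpy as np
from scipy.optimize import linprog

products = ["RegDown", "RegUp", "Spinning", "NonSpinning"]
requirement = {"RegDown": 50.0, "RegUp": 60.0, "Spinning": 80.0, "NonSpinning": 40.0}  # MW (assumed)

# (product, offered MW, price $/MW) -- hypothetical bids
bids = [("RegDown", 30, 5.0), ("RegDown", 40, 6.5),
        ("RegUp", 50, 7.0), ("RegUp", 30, 9.0),
        ("Spinning", 60, 8.0), ("Spinning", 50, 10.0),
        ("NonSpinning", 70, 4.0)]

cost = np.array([price for _, _, price in bids])
upper = [qty for _, qty, _ in bids]
# One >= requirement constraint per product, written as -sum(x) <= -req for linprog.
A_ub = np.array([[-1.0 if p == prod else 0.0 for p, _, _ in bids] for prod in products])
b_ub = np.array([-requirement[prod] for prod in products])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, u) for u in upper], method="highs")
for (prod, qty, price), x in zip(bids, res.x):
    print(f"{prod:12s} cleared {x:5.1f} of {qty} MW at ${price}/MW")
print(f"Total cost: ${res.fun:,.2f}")
```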
Abstract:
In real optimization problems, the analytical expression of the objective function and its derivatives are usually unknown, or they are too complex to use. In these cases it becomes essential to use optimization methods that do not require the calculation of derivatives, or the verification of their existence: direct search methods, also called derivative-free methods, are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, the choice of the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a biobjective problem. In this problem, a step is accepted if it reduces either the objective function or the constraint violation. This makes filter methods less parameter-dependent than penalty functions. In this work, we present a new direct search method for general constrained optimization that combines the features of simplex methods and filter methods. This method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behavior of the algorithm through some examples. The proposed methods were implemented in Java.
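The filter acceptance rule described above can be sketched as a non-domination test on pairs of objective value and aggregated constraint violation. The sketch below shows only that acceptance test on a made-up one-variable problem; it is not the paper's full simplex filter algorithm:

```python
# Illustrative sketch of a filter acceptance test (not the paper's full
# simplex-filter algorithm): a trial point (f, h), i.e. objective value and
# aggregated constraint violation, is accepted if no stored filter entry
# dominates it, meaning no entry is at least as good in both f and h.

def violation(constraints, x):
    """Aggregate constraint violation h(x) for constraints written as g(x) <= 0."""
    return sum(max(0.0, g(x)) for g in constraints)

def acceptable(trial, filter_set):
    f_t, h_t = trial
    return not any(f_k <= f_t and h_k <= h_t for f_k, h_k in filter_set)

def add_to_filter(trial, filter_set):
    """Add an accepted point and drop entries it now dominates."""
    f_t, h_t = trial
    kept = {(f, h) for f, h in filter_set if not (f_t <= f and h_t <= h)}
    kept.add(trial)
    return kept

# Hypothetical one-variable example: minimise f(x) = (x - 2)^2 subject to x <= 1.
f = lambda x: (x - 2.0) ** 2
cons = [lambda x: x - 1.0]            # g(x) <= 0  <=>  x <= 1
filter_set = set()
for x in [0.0, 3.0, 0.9, 0.5]:        # trial points a simplex search might generate
    trial = (f(x), violation(cons, x))
    if acceptable(trial, filter_set):
        filter_set = add_to_filter(trial, filter_set)
        print(f"accepted x={x}: f={trial[0]:.2f}, h={trial[1]:.2f}")
    else:
        print(f"rejected x={x}: dominated by the current filter")
```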
Abstract:
This paper presents a study of the photo-electronic properties of multilayer a-Si:H/a-SiC:H p-i-n-i-p structures. The study aims to give an insight into the internal electrical characteristics of such a structure in thermal equilibrium, under applied bias and under different illumination conditions. Taking advantage of this insight, it is possible to establish a relation among the electrical behavior of the structure, its geometry (i.e. the thickness of the light-absorbing intrinsic layers and of the internal n-layer) and the composition of the layers (i.e. the optical bandgap, controlled through the percentage of carbon dilution in the a-Si1-xCx:H layers). Showing an optical gain for low incident light power that is controllable by means of externally applied bias or structure composition, these structures are quite attractive for photo-sensing device applications, such as color sensors and large-area color image detectors. An analysis based on numerical ASCA simulations is presented to describe the behavior of different configurations of the device and is compared with experimental measurements (spectral response and current-voltage characteristics).
Abstract:
The handling of waste and compost that occurs frequently in composting plants (compost turning, shredding, and screening) has been shown to be responsible for the release of dust and airborne microorganisms and their compounds into the air. Thermophilic fungi, such as A. fumigatus, have been reported, and this kind of contamination in composting facilities has been associated with increased respiratory symptoms among compost workers. This study intended to characterize fungal contamination in a totally indoor composting plant located in Portugal. Besides conventional methods, molecular biology techniques were also applied to overcome their possible limitations.