57 results for R15 - Econometric and Input Output Models


Relevance:

100.00%

Publisher:

Abstract:

Formative measurement has seen increasing acceptance in organizational research since the turn of the 21st century. However, in more recent times, a number of criticisms of the formative approach have appeared. Such work argues that formatively measured constructs are empirically ambiguous and thus flawed in a theory-testing context. The aim of the present paper is to examine the underpinnings of formative measurement theory in light of theories of causality and ontology in measurement in general. In doing so, a thesis is advanced which draws a distinction between reflective, formative, and causal theories of latent variables. This distinction is shown to be advantageous in that it clarifies the ontological status of each type of latent variable, and thus provides advice on appropriate conceptualization and application. The distinction also reconciles, in part, both recent supportive and critical perspectives on formative measurement. In light of this, advice is given on how best to model formative composites in theory-testing applications, placing the onus on the researcher to make their conceptualization and operationalization clear.
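
The reflective/formative distinction the argument builds on is conventionally written as follows; this is textbook structural-equation notation rather than the paper's own, with \eta the latent variable, x_i its indicators, and \lambda_i, \gamma_i loadings and weights:

    x_i = \lambda_i \eta + \epsilon_i             (reflective: indicators are effects of the latent variable \eta)
    \eta = \sum_i \gamma_i x_i + \zeta            (formative: the construct is a weighted composite of its indicators, with disturbance \zeta)

Dropping \zeta yields a pure composite; how these specifications are interpreted ontologically is the issue the paper's three-way distinction between reflective, formative and causal latent variables addresses.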

Relevance:

100.00%

Publisher:

Abstract:

The predictive accuracy of competing crude-oil price forecast densities is investigated for the 1994–2006 period. Moving beyond standard ARCH-type models that rely exclusively on past returns, we examine the benefits of utilizing the forward-looking information that is embedded in the prices of derivative contracts. Risk-neutral densities, obtained from panels of crude-oil option prices, are adjusted to reflect real-world risks using either a parametric or a non-parametric calibration approach. The relative performance of the models is evaluated for the entire support of the density, as well as for regions and intervals that are of special interest for the economic agent. We find that non-parametric adjustments of risk-neutral density forecasts perform significantly better than their parametric counterparts. Goodness-of-fit tests and out-of-sample likelihood comparisons favor forecast densities obtained from option prices and non-parametric calibration methods over those constructed using historical returns and simulated ARCH processes. © 2010 Wiley Periodicals, Inc. Jrl Fut Mark 31:727–754, 2011
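
The risk-neutral densities referred to above are typically backed out of option prices through the Breeden-Litzenberger relation, f_Q(K) = e^{rT} \, \partial^2 C / \partial K^2. The sketch below illustrates that step numerically on synthetic Black-Scholes call prices; the strike grid, volatility and rate are made up, and the paper's own extraction and real-world calibration procedures are more elaborate than this.

    import numpy as np
    from scipy.stats import norm

    def risk_neutral_density(strikes, call_prices, r, T):
        """Approximate the risk-neutral density from a grid of call prices via the
        Breeden-Litzenberger relation f(K) = exp(rT) * d2C/dK2 (central differences)."""
        strikes = np.asarray(strikes, dtype=float)
        call_prices = np.asarray(call_prices, dtype=float)
        d2C = np.gradient(np.gradient(call_prices, strikes), strikes)
        return np.exp(r * T) * d2C

    # Illustrative use on synthetic Black-Scholes prices (hypothetical parameters, not market data).
    S0, r, T, sigma = 60.0, 0.03, 0.25, 0.35
    K = np.linspace(30.0, 100.0, 141)
    d1 = (np.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    C = S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)
    f_q = risk_neutral_density(K, C, r, T)
    print(np.trapz(f_q, K))    # the recovered density should integrate to roughly one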

Relevance:

100.00%

Publisher:

Abstract:

The increasing intensity of global competition has led organizations to utilize various types of performance measurement tools for improving the quality of their products and services. Data envelopment analysis (DEA) is a methodology for evaluating and measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. In conventional DEA, all data, including input and/or output ratios, are assumed to take the form of crisp numbers. However, the observed values of data in real-world problems are sometimes expressed as interval ratios. In this paper, we propose two new models: general and multiplicative non-parametric ratio models for DEA problems with interval data. The contributions of this paper are fourfold: (1) we consider input and output data expressed as interval ratios in DEA; (2) we address the gap in the DEA literature for problems not suitable or difficult to model with crisp values; (3) we propose two new DEA models for evaluating the relative efficiencies of DMUs with interval ratios; and (4) we present a case study involving 20 banks with three interval ratios, where the traditional indicators are mostly financial ratios, to demonstrate the applicability and efficacy of the proposed models. © 2011 Elsevier Inc.
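
For context, the crisp, input-oriented CCR model that these ratio models generalise reduces to a small linear program per DMU. The sketch below solves it with scipy on made-up data; it is the standard envelopment form only, not the interval-ratio models proposed in the paper.

    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, o):
        """Input-oriented CCR (constant returns to scale) efficiency of DMU o.
        X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs); decision vector is [theta, lambda_1..n]."""
        n, m = X.shape
        s = Y.shape[1]
        c = np.r_[1.0, np.zeros(n)]                  # minimise theta
        A_in = np.c_[-X[o], X.T]                     # sum_j lambda_j x_ij <= theta * x_io
        A_out = np.c_[np.zeros(s), -Y.T]             # sum_j lambda_j y_rj >= y_ro
        A_ub = np.r_[A_in, A_out]
        b_ub = np.r_[np.zeros(m), -Y[o]]
        bounds = [(None, None)] + [(0.0, None)] * n
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        return res.fun

    # Hypothetical data: five DMUs, two inputs, one (unit) output.
    X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])
    Y = np.ones((5, 1))
    print([round(ccr_efficiency(X, Y, o), 3) for o in range(5)])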

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a general methodology for estimating and incorporating uncertainty in the controller and forward models for noisy nonlinear control problems. Conditional distribution modeling in a neural network context is used to estimate uncertainty around the prediction of neural network outputs. The developed methodology circumvents the dynamic programming problem by using the predicted neural network uncertainty to localize the possible control solutions to consider. A nonlinear multivariable system with different delays between the input-output pairs is used to demonstrate the successful application of the developed control algorithm. The proposed method is suitable for redundant control systems and allows us to model strongly non-Gaussian distributions of the control signal as well as processes with hysteresis.
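
The key mechanism, predicting not just the plant's response but also how much that prediction can be trusted, and then restricting the search for control inputs to the trusted region, can be illustrated with a toy sketch. Here a small ensemble of networks stands in for the conditional distribution model described in the paper, and the plant, data and thresholds are all synthetic.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Synthetic forward plant y = f(u) + noise, observed only on part of the control range.
    u_train = rng.uniform(-2.0, 1.0, size=(200, 1))
    y_train = np.sin(2.0 * u_train[:, 0]) + 0.05 * rng.standard_normal(200)

    # Ensemble disagreement is used here as a proxy for predictive uncertainty.
    ensemble = [MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=k)
                .fit(u_train, y_train) for k in range(5)]

    def forward(u_grid):
        preds = np.stack([m.predict(u_grid) for m in ensemble])
        return preds.mean(axis=0), preds.std(axis=0)

    # Pick a control that reaches the target, searching only where the model is confident.
    u_grid = np.linspace(-3.0, 3.0, 601).reshape(-1, 1)
    mean, std = forward(u_grid)
    target = 0.5
    trusted = std < np.quantile(std, 0.5)            # localise the candidate control solutions
    best = u_grid[trusted][np.argmin(np.abs(mean[trusted] - target))]
    print(float(best[0]))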

Relevance:

100.00%

Publisher:

Abstract:

As part of the Managing Uncertainty in Complex Models (MUCM) project, research at Aston University will develop methods for dimensionality reduction of the input and/or output spaces of models, as seen within the emulator framework. Towards this end, this report describes a framework for generating toy datasets whose underlying structure is understood, to facilitate early investigations of dimensionality reduction methods and to gain a deeper understanding of the algorithms employed, both in terms of how effective they are for given types of models and situations, and in terms of their speed in applications and how this scales with various factors. The framework, which allows the evaluation of both screening and projection approaches to dimensionality reduction, is described. We also describe the screening and projection methods currently under consideration and present some preliminary results. The aim of this draft of the report is to solicit feedback from the project team on the dataset generation framework, the methods we propose to use, and suggestions for extensions that should be considered.
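
A minimal example of the kind of toy problem such a framework produces, a high-dimensional input whose response depends only on a known low-dimensional projection, together with naive screening and projection checks, is sketched below. It is purely illustrative and is not the report's actual generator; the sensitivity score and the gradient-based (active-subspace style) projection are generic choices.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy emulator-style data: a 10-dimensional input whose response depends only on
    # a 2-dimensional linear projection (a known low-dimensional structure).
    d, k, n = 10, 2, 2000
    W = np.linalg.qr(rng.standard_normal((d, k)))[0]     # true projection directions
    X = rng.standard_normal((n, d))

    def f(X):
        return np.sin(X @ W[:, 0]) + (X @ W[:, 1]) ** 2

    y = f(X)

    # Screening: rank each input variable by a crude finite-difference sensitivity.
    eps = 1e-3
    sens = np.array([np.mean(np.abs(f(X + eps * np.eye(d)[i]) - y)) / eps for i in range(d)])
    print("screening scores:", np.round(sens, 2))

    # Projection: eigen-decompose the average outer product of gradients
    # (the active-subspace construction); the leading eigenvectors should span W.
    G = np.cos(X @ W[:, 0])[:, None] * W[:, 0] + 2.0 * (X @ W[:, 1])[:, None] * W[:, 1]
    vals, vecs = np.linalg.eigh(G.T @ G / n)
    recovered = vecs[:, -k:]
    print("alignment with true subspace:",
          np.round(np.linalg.svd(recovered.T @ W, compute_uv=False), 3))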

Relevance:

100.00%

Publisher:

Abstract:

As we enter the 21st century, technologies originally developed for defense purposes, such as computers and satellite communications, appear to have become a driving force behind economic growth in the United States. Paradoxically, almost all previous econometric models suggest that the largely defense-oriented federal industrial R&D funding that helped create these technologies had no discernible effect on U.S. industrial productivity growth. This paper addresses this paradox by stressing that defense procurement as well as federal R&D expenditures were targeted at a few narrowly defined manufacturing sub-sectors that produced high-tech weaponry. Analysis employing data from the NBER Manufacturing Productivity Database and the BEA's Input-Output tables then demonstrates that defense procurement policies did have significant effects on the productivity performance of disaggregated manufacturing industries because of a process of procurement-driven technological change.
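
The kind of specification this implies, productivity growth in disaggregated manufacturing industries regressed on their exposure to defense procurement, might be written in stylised form (my own notation, not the paper's estimating equation) as

    \Delta \ln TFP_{it} = \alpha_i + \delta_t + \beta \, ProcShare_{it} + \gamma \, RD_{it} + \varepsilon_{it},

where ProcShare_{it} is the share of industry i's output absorbed by defense procurement (constructed from the Input-Output tables), RD_{it} is federal R&D intensity, and \alpha_i, \delta_t are industry and year effects. The paper's actual specification may differ.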

Relevance:

100.00%

Publisher:

Abstract:

Data envelopment analysis (DEA) is defined with reference to observed units, by finding the distance of each unit to the frontier of the estimated production possibility set (PPS). Convexity is one of the underlying assumptions of the PPS. This paper shows some difficulties of using standard DEA models in the presence of input ratios and/or output ratios. The paper defines a new convexity assumption for the case where the data include a ratio variable, and then proposes a series of modified DEA models which are capable of rectifying this problem.
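
To see the difficulty in standard notation (generic, not the paper's): the variable-returns-to-scale PPS estimated from observed units (x_j, y_j) is the convex set

    T = \{ (x, y) : x \ge \sum_j \lambda_j x_j, \ y \le \sum_j \lambda_j y_j, \ \sum_j \lambda_j = 1, \ \lambda_j \ge 0 \}.

If an output is itself a ratio r_j = a_j / b_j, the convex combination \sum_j \lambda_j r_j of the observed ratios is in general not equal to the ratio of the combined components, (\sum_j \lambda_j a_j) / (\sum_j \lambda_j b_j), so the linear \lambda-aggregation above no longer describes a unit that could actually be constructed. This is the usual way of seeing why a modified convexity assumption is needed for ratio data.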

Relevance:

100.00%

Publisher:

Abstract:

This paper re-assesses three independently developed approaches that are aimed at solving the problem of zero weights or non-zero slacks in Data Envelopment Analysis (DEA). The methods are weights-restricted, non-radial, and extended-facet DEA models. Weights-restricted DEA models are dual to envelopment DEA models with restrictions on the dual variables (DEA weights) aimed at avoiding zero values for those weights; non-radial DEA models are envelopment models which avoid non-zero slacks in the input-output constraints. Finally, extended-facet DEA models recognize that only projections on facets of full dimension correspond to well-defined rates of substitution/transformation between all inputs/outputs, which in turn correspond to non-zero weights in the multiplier version of the DEA model. We demonstrate how these methods are equivalent, not only in their aim but also in the solutions they yield. In addition, we show that the aforementioned methods modify the production frontier by extending existing facets or creating unobserved facets. Further, we propose a new approach that uses weight restrictions to extend existing facets. This approach has some advantages in computational terms, because extended-facet models normally make use of mixed-integer programming models, which are computationally demanding.
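
For reference, the multiplier (weights) form of the CCR model in which the zero-weight problem appears is, in standard notation,

    \max_{u, v} \ \sum_r u_r y_{ro}
    \text{subject to} \quad \sum_i v_i x_{io} = 1, \qquad \sum_r u_r y_{rj} - \sum_i v_i x_{ij} \le 0 \ \ \forall j, \qquad u_r, v_i \ge 0,

and weight restrictions replace the bare non-negativity conditions with constraints such as u_r \ge \epsilon or assurance-region bounds \alpha_{rs} \le u_r / u_s \le \beta_{rs}, which is what rules out the zero weights discussed above. This is the textbook formulation with generic bounds; the specific restrictions analysed in the paper may differ.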

Relevance:

100.00%

Publisher:

Abstract:

The appraisal and relative performance evaluation of nurses are very important and beneficial for both nurses and employers in an era of clinical governance, increased accountability and high standards of health care services. They enhance and consolidate the knowledge and practical skills of nurses by identification of training and career development plans as well as improvement in health care quality services, increase in job satisfaction and use of cost-effective resources. In this paper, a data envelopment analysis (DEA) model is proposed for the appraisal and relative performance evaluation of nurses. The model is validated on thirty-two nurses working at an Intensive Care Unit (ICU) at one of the most recognized hospitals in Lebanon. The DEA was able to classify nurses into efficient and inefficient ones. The set of efficient nurses was used to establish an internal best practice benchmark to project career development plans for improving the performance of the other, inefficient nurses. The DEA result confirmed the ranking of some nurses and highlighted injustice in other cases produced by the currently practiced appraisal system. Further, the DEA model is shown to be an effective talent management and motivational tool, as it can provide clear managerial plans related to promotion, training and development activities from the perspective of nurses, hence increasing their satisfaction, motivation and acceptance of appraisal results. Due to such features, the model is currently being considered for implementation at the ICU. Finally, the ratio of the number of DEA units to the number of input/output measures is revisited, with new suggested upper and lower limits that depend on the type of DEA model and the desired number of efficient units from a managerial perspective.
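
The rule of thumb that the final sentence revisits is usually stated, for n DMUs, m inputs and s outputs, as

    n \ge \max\{\, m \times s, \ 3(m + s) \,\},

the commonly cited guideline for preserving discriminatory power in DEA; the revised upper and lower limits proposed in the paper are not reproduced here.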

Relevance:

100.00%

Publisher:

Abstract:

In nonlinear and stochastic control problems, learning an efficient feed-forward controller is not amenable to conventional neurocontrol methods. For these approaches, estimating and then incorporating uncertainty in the controller and feed-forward models can produce more robust control results. Here, we introduce a novel inversion-based neurocontroller for solving control problems involving uncertain nonlinear systems which can also compensate for multi-valued systems. The approach uses recent developments in neural networks, especially in the context of modelling statistical distributions, which are applied to forward and inverse plant models. Provided that certain conditions are met, an estimate of the intrinsic uncertainty for the outputs of neural networks can be obtained using the statistical properties of networks. More generally, multicomponent distributions can be modelled by the mixture density network. Based on importance sampling from these distributions, a novel robust inverse control approach is obtained. This importance sampling provides a structured and principled approach to constrain the complexity of the search space for the ideal control law. The developed methodology circumvents the dynamic programming problem by using the predicted neural network uncertainty to localise the possible control solutions to consider. A nonlinear multi-variable system with different delays between the input-output pairs is used to demonstrate the successful application of the developed control algorithm. The proposed method is suitable for redundant control systems and allows us to model strongly non-Gaussian distributions of the control signal as well as processes with hysteresis. © 2004 Elsevier Ltd. All rights reserved.
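
A stripped-down illustration of the sampling step is given below: a Gaussian mixture fitted to controls that historically produced outputs near the target stands in for the mixture density network's conditional distribution, and candidates drawn from it are re-weighted by a learned forward model. The plant is synthetic and deliberately multi-valued (two control settings give the same output); none of this reproduces the paper's actual architecture.

    import numpy as np
    from sklearn.mixture import GaussianMixture
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(2)

    # Synthetic multi-valued plant: two different controls u give the same output y.
    u = rng.uniform(-3.0, 3.0, size=(3000, 1))
    y = np.sin(u[:, 0]) + 0.05 * rng.standard_normal(3000)

    forward = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0).fit(u, y)

    y_target = 0.4
    # Stand-in for the MDN's conditional p(u | y_target): a mixture fitted to the
    # controls whose observed output fell close to the target.
    mix = GaussianMixture(n_components=2, random_state=0).fit(u[np.abs(y - y_target) < 0.1])

    # Importance sampling: draw candidate controls from the mixture and weight them
    # by how well the forward model predicts they reach the target.
    cand, _ = mix.sample(500)
    err = (forward.predict(cand) - y_target) ** 2
    print("solution branches (mixture means):", np.round(mix.means_.ravel(), 2))
    best = cand[np.argmin(err), 0]
    print("best sampled control:", round(float(best), 2),
          "-> predicted output:", round(float(forward.predict([[best]])[0]), 2))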

Relevance:

100.00%

Publisher:

Abstract:

Solving many scientific problems requires effective regression and/or classification models for large high-dimensional datasets. Experts from these problem domains (e.g. biologists, chemists, financial analysts) have insights into the domain which can be helpful in developing powerful models, but they need a modelling framework that helps them to use these insights. Data visualisation is an effective technique for presenting data to, and obtaining feedback from, such experts. A single global regression model can rarely capture the full behavioural variability of a huge multi-dimensional dataset. Instead, local regression models, each focused on a separate area of input space, often work better, since the behaviour of different areas may vary. Classical local models such as Mixture of Experts segment the input space automatically, which is not always effective, and they do not involve the domain experts in guiding a meaningful segmentation of the input space. In this paper we address this issue by allowing domain experts to interactively segment the input space using data visualisation. The resulting segmentation is then used to develop effective local regression models.
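
The downstream step, fitting one regression model per segment of the input space, can be sketched as follows. A clustering of the input space is used here purely as a stand-in for the interactive, expert-driven segmentation the paper describes, and the data are synthetic.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(3)

    # Synthetic data whose behaviour differs between two regions of input space.
    X = rng.uniform(-1.0, 1.0, size=(1000, 2))
    y = np.where(X[:, 0] < 0.0, 3.0 * X[:, 1], -2.0 * X[:, 1] + X[:, 0])

    # Stand-in for the expert-guided segmentation (in the paper this comes from the
    # analyst interacting with a visualisation of the data).
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

    # One local regression model per segment.
    local_models = {k: LinearRegression().fit(X[labels == k], y[labels == k])
                    for k in np.unique(labels)}
    print({int(k): np.round(m.coef_, 2) for k, m in local_models.items()})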

Relevance:

100.00%

Publisher:

Abstract:

Typical performance of low-density parity-check (LDPC) codes over a general binary-input output-symmetric memoryless channel is investigated using methods of statistical mechanics. The binary-input additive-white-Gaussian-noise channel and the binary-input Laplace channel are considered as specific channel noise models.
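
Both channel models enter a message-passing decoder only through the channel log-likelihood ratios, and those LLRs take a simple closed form for BPSK signalling (standard results, separate from the statistical-mechanics analysis carried out in the paper):

    import numpy as np

    def llr_biawgn(y, sigma):
        """Channel LLR ln p(y|x=+1)/p(y|x=-1) for BPSK over additive white Gaussian noise."""
        return 2.0 * y / sigma**2

    def llr_laplace(y, b):
        """Channel LLR for BPSK over additive Laplace noise with density exp(-|n|/b) / (2b)."""
        return (np.abs(y + 1.0) - np.abs(y - 1.0)) / b

    rng = np.random.default_rng(4)
    x = 1.0 - 2.0 * rng.integers(0, 2, size=10)      # BPSK symbols
    # Hard decisions vs transmitted symbols (mostly True at this noise level).
    print(np.sign(llr_biawgn(x + 0.8 * rng.standard_normal(10), 0.8)) == np.sign(x))
    print(np.sign(llr_laplace(x + rng.laplace(scale=0.8, size=10), 0.8)) == np.sign(x))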

Relevance:

100.00%

Publisher:

Abstract:

Linear models reach their limitations in applications with nonlinearities in the data. In this paper, new empirical evidence is provided on the relative Euro inflation forecasting performance of linear and non-linear models. The well-established and widely used univariate ARIMA and multivariate VAR models are used as linear forecasting models, whereas neural networks (NN) are used as non-linear forecasting models. The level of subjectivity in the NN building process is kept to a minimum in an attempt to exploit the full potential of the NN. It is also investigated whether the historically poor performance of the theoretically superior measure of the monetary services flow, Divisia, relative to the traditional Simple Sum measure could be attributed, to a certain extent, to the evaluation of these indices within a linear framework. The results obtained suggest that non-linear models provide better within-sample and out-of-sample forecasts, and that linear models are simply a subset of them. The Divisia index also outperforms the Simple Sum index when evaluated in a non-linear framework. © 2005 Taylor & Francis Group Ltd.
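
A stripped-down version of the linear-versus-nonlinear comparison, an ARIMA benchmark against a neural network fed lagged values, is sketched below on a simulated series; the Euro-area inflation data and the Divisia/Simple Sum monetary aggregates used in the paper are not reproduced, and the lag orders and network size are arbitrary.

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(5)

    # Simulated series with a mild nonlinearity, standing in for an inflation series.
    n = 300
    e = 0.1 * rng.standard_normal(n)
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = 0.6 * x[t - 1] - 0.2 * x[t - 1] ** 2 + 0.3 * x[t - 2] + e[t]
    train, test = x[:250], x[250:]

    # Linear benchmark: rolling one-step-ahead ARIMA(2,0,0) forecasts over the hold-out.
    history, lin_fc = list(train), []
    for obs in test:
        lin_fc.append(ARIMA(history, order=(2, 0, 0)).fit().forecast(1)[0])
        history.append(obs)

    # Non-linear model: a small MLP on two lags.
    def lag_matrix(series, p=2):
        return (np.column_stack([series[p - i - 1:len(series) - i - 1] for i in range(p)]),
                series[p:])

    Xtr, ytr = lag_matrix(train)
    mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0).fit(Xtr, ytr)
    nn_fc = mlp.predict(np.column_stack([x[249:-1], x[248:-2]]))

    print("linear out-of-sample RMSE:", np.sqrt(np.mean((np.array(lin_fc) - test) ** 2)))
    print("NN out-of-sample RMSE:    ", np.sqrt(np.mean((nn_fc - test) ** 2)))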