880 results for Evaluation models
Abstract:
The primary goal of this research is to design and develop an educational technology to support learning in global operations management. The research implements a series of studies to determine the right balance among user requirements, learning methods and applied technologies, from a student-centred learning perspective. This research is multidisciplinary by nature, involving topics from various disciplines such as global operations management, curriculum and contemporary learning theory, and computer-aided learning. Innovative learning models that emphasise technological implementation are employed and discussed throughout this research.
Abstract:
As part of the Managing Uncertainty in Complex Models (MUCM) project, research at Aston University will develop methods for dimensionality reduction of the input and/or output spaces of models, as seen within the emulator framework. Towards this end, this report describes a framework for generating toy datasets whose underlying structure is understood, to facilitate early investigations of dimensionality reduction methods and to gain a deeper understanding of the algorithms employed, both in terms of how effective they are for given types of models and situations, and in terms of their speed in applications and how this scales with various factors. The framework, which allows the evaluation of both screening and projection approaches to dimensionality reduction, is described. We also describe the screening and projection methods currently under consideration and present some preliminary results. The aim of this draft of the report is to solicit feedback from the project team on the dataset generation framework, the methods we propose to use, and suggestions for extensions that should be considered.
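To make the evaluation setup concrete, the sketch below (a minimal illustration, not the MUCM framework itself; all sizes, functions and methods are assumed) generates a toy dataset whose outputs depend on only two of ten inputs and applies a simple screening check and a PCA projection to see whether that known structure is recovered.

```python
# Minimal sketch (assumed structure, not the MUCM framework itself): generate a toy
# dataset whose outputs depend on only a few "active" input dimensions, then check
# whether screening and a projection method (PCA here) recover that structure.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_samples, n_inputs = 500, 10

X = rng.uniform(-1.0, 1.0, size=(n_samples, n_inputs))
# Outputs driven by the first two inputs only; remaining inputs are inert.
y = np.column_stack([
    np.sin(3.0 * X[:, 0]) + 0.5 * X[:, 1],
    X[:, 0] * X[:, 1],
]) + 0.01 * rng.standard_normal((n_samples, 2))

# Screening check: correlation of each input with the first output dimension.
screen_scores = np.abs([np.corrcoef(X[:, j], y[:, 0])[0, 1] for j in range(n_inputs)])
print("screening scores:", np.round(screen_scores, 2))

# Projection check: how much output variance a low-dimensional projection explains.
pca = PCA(n_components=2).fit(y)
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
```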
Abstract:
Linear models reach their limitations in applications with nonlinearities in the data. In this paper, new empirical evidence is provided on the relative Euro inflation forecasting performance of linear and non-linear models. The well-established and widely used univariate ARIMA and multivariate VAR models are used as linear forecasting models, whereas neural networks (NN) are used as non-linear forecasting models. The level of subjectivity in the NN building process is kept to a minimum in an attempt to exploit the full potential of the NN. It is also investigated whether the historically poor performance of the theoretically superior measure of the monetary services flow, Divisia, relative to the traditional Simple Sum measure could be attributed, to a certain extent, to the evaluation of these indices within a linear framework. The results obtained suggest that non-linear models provide better within-sample and out-of-sample forecasts and that linear models are simply a subset of them. The Divisia index also outperforms the Simple Sum index when evaluated in a non-linear framework. © 2005 Taylor & Francis Group Ltd.
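The sketch below illustrates the kind of linear-versus-non-linear comparison described above on a synthetic series (not the paper's Euro inflation or Divisia data); the ARIMA order, network size and lag window are arbitrary choices for demonstration only.

```python
# Illustrative sketch only (synthetic data, not the paper's series): compare a linear
# ARIMA forecast with a simple neural-network forecast built on lagged values.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
t = np.arange(300)
series = 0.02 * t + np.sin(t / 6.0) ** 2 + 0.1 * rng.standard_normal(300)
train, test = series[:250], series[250:]

# Linear benchmark: ARIMA(2,1,1) fitted to the training sample.
arima_fc = ARIMA(train, order=(2, 1, 1)).fit().forecast(steps=len(test))

# Non-linear model: feed-forward NN on a window of lagged observations.
lags = 6
Xtr = np.array([train[i - lags:i] for i in range(lags, len(train))])
ytr = train[lags:]
nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(Xtr, ytr)

# One-step-ahead NN forecasts over the test period.
history = list(train)
nn_fc = []
for actual in test:
    nn_fc.append(nn.predict(np.array(history[-lags:]).reshape(1, -1))[0])
    history.append(actual)

def rmse(forecast):
    return np.sqrt(np.mean((np.asarray(forecast) - test) ** 2))

print(f"ARIMA RMSE: {rmse(arima_fc):.3f}  NN RMSE: {rmse(nn_fc):.3f}")
```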
Abstract:
Using the core aspects of five main models of human resource management (HRM), this article investigates the dominant HRM practices in the Indian manufacturing sector. The evaluation is conducted in the context of the recently liberalized economic environment. In response to ever-increasing levels of globalization of business, the article initially highlights the need for more cross-national comparative HRM research. It then briefly analyzes the five models of HRM (namely, the 'Matching model', the 'Harvard model', the 'Contextual model', the '5-P model' and the 'European model') and identifies the main research questions emerging from these that could be used to reveal and highlight the HRM practices in different national/regional settings. The findings of the research are based on a questionnaire survey of 137 large Indian firms and 24 in-depth interviews in as many firms. The examination helped to present not only the scenario of HRM practices in the Indian context but also the logic dictating the presence of such practices. The article contributes to the fields of cross-national HRM and industrial relations research. It also has key messages for policy makers and opens avenues for further research.
Abstract:
This article proposes a framework of alternative international marketing strategies, based on the evaluation of intra- and inter-cultural behavioural homogeneity for market segmentation. The framework developed in this study provides a generic structure for behavioural homogeneity, proposing consumer involvement as a construct with unique predictive ability for international marketing strategy decisions. A model-based segmentation process, using structural equation models, is implemented to illustrate the application of the framework.
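As a simplified stand-in for the model-based segmentation the article implements with structural equation models, the sketch below fits a finite mixture model to invented consumer-involvement scores and recovers two behaviourally homogeneous segments; the scales, sample sizes and segment structure are assumptions made for illustration.

```python
# Simplified stand-in only: the article uses structural equation models; this sketch
# uses a finite mixture model on invented consumer-involvement scores to illustrate
# model-based segmentation by behavioural homogeneity.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)
# Invented involvement-scale responses (three items) for two latent consumer segments.
low_involvement = rng.normal(2.5, 0.5, size=(150, 3))
high_involvement = rng.normal(4.2, 0.4, size=(150, 3))
responses = np.vstack([low_involvement, high_involvement])

gmm = GaussianMixture(n_components=2, random_state=0).fit(responses)
segments = gmm.predict(responses)
print("segment sizes:", np.bincount(segments))
print("segment mean involvement profiles:\n", np.round(gmm.means_, 2))
```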
Abstract:
Rhizome of cassava plants (Manihot esculenta Crantz) was catalytically pyrolysed at 500 °C using the analytical pyrolysis–gas chromatography/mass spectrometry (Py–GC/MS) method in order to investigate the relative effect of various catalysts on pyrolysis products. Selected catalysts expected to affect bio-oil properties were used in this study. These include zeolites and related materials (ZSM-5, Al-MCM-41 and Al-MSU-F type), metal oxide catalysts (zinc oxide, zirconium(IV) oxide, cerium(IV) oxide and copper chromite), proprietary commercial catalysts (Criterion-534 and alumina-stabilised ceria-MI-575) and natural catalysts (slate, char and ashes derived from char and biomass). The pyrolysis product distributions were monitored using models built with the principal component analysis (PCA) technique. The results showed that the zeolites, the proprietary commercial catalysts, copper chromite and biomass-derived ash were selective towards the reduction of most oxygenated lignin derivatives. The use of ZSM-5, Criterion-534 and Al-MSU-F catalysts enhanced the formation of aromatic hydrocarbons and phenols. No single catalyst was found to selectively reduce all carbonyl products. Instead, most of the carbonyl compounds containing a hydroxyl group were reduced by the zeolites and related materials, the proprietary catalysts and copper chromite. The PCA model for carboxylic acids showed that zeolite ZSM-5 and Al-MSU-F tend to produce significant amounts of acetic and formic acids.
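The sketch below shows how a PCA model of product distributions can be used to compare catalysts; the catalyst list is taken from the abstract, but the peak-area values are invented for illustration and are not the study's Py–GC/MS measurements.

```python
# Hedged sketch (made-up numbers, not the study's Py-GC/MS data): a PCA model of
# product distributions separating catalysts by their effect on compound classes.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

catalysts = ["none", "ZSM-5", "Al-MSU-F", "Criterion-534", "ZnO", "char ash"]
# Columns: peak-area fractions for aromatic hydrocarbons, phenols, carbonyls,
# carboxylic acids, lignin-derived oxygenates (illustrative values only).
peak_areas = np.array([
    [0.05, 0.10, 0.30, 0.15, 0.40],
    [0.30, 0.25, 0.15, 0.20, 0.10],
    [0.25, 0.22, 0.18, 0.22, 0.13],
    [0.28, 0.24, 0.16, 0.12, 0.20],
    [0.08, 0.12, 0.28, 0.14, 0.38],
    [0.10, 0.15, 0.25, 0.13, 0.37],
])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(peak_areas))
for name, (pc1, pc2) in zip(catalysts, scores):
    print(f"{name:>14}: PC1={pc1:+.2f}  PC2={pc2:+.2f}")
```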
Abstract:
The objective of this study has been to enable a greater understanding of the biomass gasification process through the development and use of process and economic models. A new theoretical equilibrium model of gasification is described using the operating condition called the adiabatic carbon boundary. This represents an ideal gasifier working at the point where the carbon in the feedstock is completely gasified. The model can be used as a 'target' against which the results of real gasifiers can be compared, but it does not simulate the results of real gasifiers. A second model has been developed which uses a stagewise approach in order to model fluid bed gasification, and its results have indicated that pyrolysis and the reactions of pyrolysis products play an important part in fluid bed gasifiers. Both models have been used in sensitivity analyses: the biomass moisture content and gasifying agent composition were found to have the largest effects on performance, whilst pressure and heat loss had lesser effects. Correlations have been produced to estimate the total installed capital cost of gasification systems and have been used in an economic model of gasification. This model has been used in a sensitivity analysis to determine the factors which most affect the profitability of gasification. The most important influences on gasifier profitability were found to be feedstock cost, product selling price and throughput. Given the economic conditions of late 1985, refuse gasification for the production of producer gas was found to be viable at throughputs of about 2.5 tonnes/h dry basis and above in the metropolitan counties of the United Kingdom. At this throughput and above, the largest element of product gas cost is the feedstock cost, the cost element which is most variable.
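A minimal sketch of the kind of one-at-a-time sensitivity analysis described above is given below; the profit function, cost figures and gas yield are invented assumptions, not the thesis's economic model or its 1985 cost data.

```python
# Illustrative sketch with assumed figures (not those of the thesis): a simple
# one-at-a-time sensitivity analysis of gasification profitability to feedstock cost,
# product selling price and throughput, the three dominant factors identified above.
def annual_profit(throughput_tph, feedstock_cost, gas_price,
                  gas_yield=2.0, hours=7000, fixed_cost=250_000):
    """Profit per year; gas_yield is energy units of product gas per dry tonne."""
    gas_revenue = throughput_tph * hours * gas_yield * gas_price
    feed_cost = throughput_tph * hours * feedstock_cost
    return gas_revenue - feed_cost - fixed_cost

base = dict(throughput_tph=2.5, feedstock_cost=20.0, gas_price=15.0)
print(f"base-case profit: {annual_profit(**base):,.0f}")

# Vary each factor by +/-20% while holding the others at their base values.
for name in base:
    for factor in (0.8, 1.2):
        case = dict(base, **{name: base[name] * factor})
        print(f"{name} x{factor}: {annual_profit(**case):,.0f}")
```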
Abstract:
The PC12 and SH-SY5Y cell models have been proposed as potentially realistic models with which to investigate neuronal cell toxicity. The effects of oxidative stress (OS) caused by both H2O2 and Aβ on both cell models were assessed by several methods. Cell toxicity was quantitated by measuring cell viability using the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium (MTT) viability assay, an indicator of the integrity of the electron transfer chain (ETC), and cell morphology by fluorescence and video microscopy, both of which showed OS to cause decreased viability and changes in morphology. Levels of intracellular peroxide production and changes in glutathione and carbonyl levels were also assessed, and OS was shown to cause increases in all three. Differentiated SH-SY5Y cells were also employed and were observed to exhibit the greatest sensitivity to toxicity. The neurotrophic factor nerve growth factor (NGF) was shown to protect against OS. Cells pre-treated with NGF showed higher viability after OS, generally less apoptotic morphology, fewer apoptotic nucleoids, generally lower levels of intracellular peroxides, and changes in gene expression. Brain-derived neurotrophic factor (BDNF) and ascorbic acid (AA) were also investigated. BDNF showed no specific neuroprotection; however, the preliminary data warrant further investigation. AA showed a 'Janus face', exhibiting either antioxidant action and neuroprotection or pro-oxidant action, depending on the situation. The results showed that the toxic effects of compounds such as Aβ and H2O2 are cell-type dependent, and that OS alters glutathione metabolism in neuronal cells. Following toxic insult, glutathione levels are depleted to low levels. It is herein suggested that this lowering triggers an adaptive response, causing alterations in glutathione metabolism as assessed by evaluation of the mRNA expression of glutathione biosynthetic enzymes and the subsequent increase in glutathione peroxidase (GPX) levels.
Abstract:
This thesis describes the procedure and results from four years' research undertaken through the IHD (Interdisciplinary Higher Degrees) Scheme at Aston University in Birmingham, sponsored by the SERC (Science and Engineering Research Council) and Monk Dunstone Associates, Chartered Quantity Surveyors. A stochastic networking technique, VERT (Venture Evaluation and Review Technique), was used to model the pre-tender costs of public health, heating, ventilating, air-conditioning, fire protection, lifts and electrical installations within office developments. The model enabled the quantity surveyor to analyse, manipulate and explore complex scenarios which previously had defied ready mathematical analysis. The process involved the examination of historical material costs, labour factors and design performance data. Components and installation types were defined and formatted. Data was updated and adjusted using mechanical and electrical pre-tender cost indices and location, selection of contractor, contract sum, height and site condition factors. Ranges of cost, time and performance data were represented by probability density functions and defined by constant, uniform, normal and beta distributions. These variables and a network of the interrelationships between services components provided the framework for analysis. The VERT program, in this particular study, relied upon Monte Carlo simulation to model the uncertainties associated with pre-tender estimates of all possible installations. The computer generated output in the form of relative and cumulative frequency distributions of current element and total services costs, critical path analyses and details of statistical parameters. From this data, alternative design solutions were compared, the degree of risk associated with estimates was determined, heuristics were tested and redeveloped, and cost-significant items were isolated for closer examination. The resultant models successfully combined cost, time and performance factors and provided the quantity surveyor with an appreciation of the cost ranges associated with the various engineering services design options.
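In the spirit of the Monte Carlo approach described above (not the VERT program itself), the sketch below samples element costs from uniform, normal, beta and constant distributions and reports percentiles of the total services cost; all figures and element names are invented.

```python
# Minimal sketch in the spirit of the VERT approach (not the original program): Monte
# Carlo sampling of element costs from uniform, normal and beta distributions to build
# a cumulative picture of total services cost. All figures are invented.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Element cost models (illustrative values, arbitrary currency units).
electrical = rng.uniform(90_000, 130_000, n)         # uniform range estimate
hvac = rng.normal(250_000, 25_000, n)                # symmetric uncertainty
lifts = 60_000 + 40_000 * rng.beta(2.0, 5.0, n)      # skewed beta range
fire_protection = np.full(n, 35_000)                 # constant estimate

total = electrical + hvac + lifts + fire_protection
for p in (10, 50, 90):
    print(f"P{p} total services cost: {np.percentile(total, p):,.0f}")
```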
Abstract:
This study is concerned with several proposals concerning multiprocessor systems and with the various possible methods of evaluating such proposals. After a discussion of the advantages and disadvantages of several performance evaluation tools, the author decides that simulation is the only tool powerful enough to develop a model which would be of practical use in the design, comparison and extension of systems. The main aims of the simulation package developed as part of this study are cost effectiveness, ease of use and generality. The methodology on which the simulation package is based is described in detail. The fundamental principles are that model design should reflect actual systems design, that measuring procedures should be carried out alongside design, that models should be well documented and easily adaptable, and that models should be dynamic. The simulation package itself is modular, and in this way reflects current design trends. This approach also aids documentation and ensures that the model is easily adaptable. It contains a skeleton structure and a library of segments which can be added to, or directly swapped with, segments of the skeleton structure to form a model which fits a user's requirements. The study also contains the results of some experimental work carried out using the model, the first part of which tests the model's capabilities by simulating a large operating system, the ICL George 3 system; the second part deals with general questions and some of the many proposals concerning multiprocessor systems.
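The sketch below illustrates the skeleton-plus-segments idea described above: a model is assembled from swappable segments that share a common interface. All class names and behaviour are hypothetical and are not taken from the original package.

```python
# Conceptual sketch only (the original package and its segment library are not
# available): a skeleton model built from swappable "segments", echoing the modular
# structure described above. All names are hypothetical.
class Segment:
    """A pluggable model component with a common interface."""
    def step(self, clock, state):
        raise NotImplementedError

class CpuSegment(Segment):
    def step(self, clock, state):
        state["jobs_done"] += state.get("cpus", 1)

class DiskSegment(Segment):
    def step(self, clock, state):
        state["io_waits"] += 1

class SkeletonModel:
    """Skeleton structure: holds segments and drives the simulated clock."""
    def __init__(self, segments):
        self.segments = list(segments)

    def swap(self, index, segment):
        self.segments[index] = segment   # directly swap in a library segment

    def run(self, ticks):
        state = {"jobs_done": 0, "io_waits": 0, "cpus": 2}
        for clock in range(ticks):
            for seg in self.segments:
                seg.step(clock, state)
        return state

model = SkeletonModel([CpuSegment(), DiskSegment()])
print(model.run(ticks=100))
```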
Abstract:
How are innovative new business models established if organizations constantly compare themselves against existing criteria and expectations? The objective is to address this question from the perspective of innovators and their ability to redefine established expectations and evaluation criteria. The research questions ask whether there are discernible patterns of discursive action through which innovators theorize institutional change and what role such theorizations play in mobilizing support and realizing change projects. These questions are investigated through a case study on a critical area of enterprise computing software, Java application servers. In the present case, business practices and models were already well established among incumbents, with critical market areas allocated to a few dominant firms. Fringe players started experimenting with a new business approach of selling services around freely available open-source application servers. While most new players struggled, one new entrant succeeded in leading incumbents to adopt and compete on the new model. The case demonstrates that innovative and substantially new models and practices are established in organizational fields when innovators are able to redefine expectations and evaluation criteria within an organizational field. The study addresses the theoretical paradox of embedded agency: actors who are embedded in prevailing institutional logics and structures find it hard to perceive potentially disruptive opportunities that fall outside existing ways of doing things. Changing prevailing institutional logics and structures requires strategic and institutional work aimed at overcoming barriers to innovation. The study addresses this problem through the lens of (new) institutional theory, using a discourse methodology to trace the process through which innovators were able to establish a new social and business model in the field.
Abstract:
This thesis describes research that has developed the principles of a modelling tool for the analytical evaluation of a manufacturing strategy. The appropriate process of manufacturing strategy formulation is based on mental synthesis, with formal planning processes supporting this role. Inherent to such processes is a stage where the effects of alternative strategies on the performance of a manufacturing system must be evaluated so that a choice of preferred strategy can be made. Invariably this evaluation is carried out by practitioners applying mechanisms of judgement, bargaining and analysis. This thesis makes a significant and original contribution to the provision of analytical support for practitioners in this role. The research programme commences by defining the requirements of analytical strategy evaluation from the perspective of practitioners. A broad taxonomy of models has been used to identify a set of potentially suitable techniques for the strategy evaluation task. Then, where possible, unsuitable modelling techniques have been identified on the basis of evidence in the literature and discarded from this set. The remaining modelling techniques have been critically appraised by testing representative contemporary modelling tools in an industrially based experimentation programme. The results show that individual modelling techniques exhibit various limitations in the strategy evaluation role, though some combinations do appear to provide the necessary functionality. On the basis of this comprehensive and in-depth knowledge, a modelling tool has been specifically designed for this task. Further experimental testing has then been conducted to verify the principles of this modelling tool. This research has bridged the fields of manufacturing strategy formulation and manufacturing systems modelling and makes two contributions to knowledge. Firstly, a comprehensive and in-depth platform of knowledge has been established about modelling techniques in manufacturing strategy evaluation. Secondly, the principles of a tool that supports this role have been formed and verified.
Abstract:
We discuss aggregation of data from neuropsychological patients and the process of evaluating models using data from a series of patients. We argue that aggregation can be misleading, but that not aggregating can also result in information loss. The basis for combining data needs to be theoretically defined, and the particular method of aggregation depends on the theoretical question and the characteristics of the data. We present examples, often drawn from our own research, to illustrate these points. We also argue that statistical models and formal methods of model selection are a useful way to test theoretical accounts using data from several patients in multiple-case studies or case series. Statistical models can often measure fit in a way that explicitly captures what a theory allows; the parameter values that result from model fitting often measure theoretically important dimensions and can lead to more constrained theories or new predictions; and model selection allows the strength of evidence for models to be quantified without forcing this into the artificial binary choice that characterizes hypothesis-testing methods. Methods that aggregate and then formally model patient data, however, are not automatically preferred to other methods. Which method is preferred depends on the question to be addressed, the characteristics of the data, and practical issues such as the availability of suitable patients, but case series, multiple-case studies, single-case studies, statistical models, and process models should be treated as complementary methods when guided by theory development.
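The sketch below illustrates formal model selection on a small case series using simulated data: two candidate models are fitted to hypothetical patient scores and compared by AIC rather than by a binary hypothesis test. The data, models and severity measure are invented for illustration.

```python
# Hedged sketch with simulated patient scores (not real neuropsychological data): fit
# two candidate models to a small case series and compare them by AIC, so the strength
# of evidence is quantified rather than reduced to a binary accept/reject decision.
import numpy as np

rng = np.random.default_rng(3)
severity = np.linspace(0.1, 1.0, 8)                      # hypothetical lesion severity
errors = 5 + 20 * severity**2 + rng.normal(0, 2, 8)      # per-patient error counts

def aic(residuals, k):
    """AIC for a least-squares fit with k free parameters."""
    n = len(residuals)
    return n * np.log(np.sum(residuals**2) / n) + 2 * k

# Model A: errors increase linearly with severity.
lin = np.polyfit(severity, errors, 1)
aic_lin = aic(errors - np.polyval(lin, severity), k=2)

# Model B: errors increase quadratically with severity.
quad = np.polyfit(severity, errors, 2)
aic_quad = aic(errors - np.polyval(quad, severity), k=3)

print(f"AIC linear: {aic_lin:.1f}  AIC quadratic: {aic_quad:.1f}")
```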
Abstract:
The appraisal and relative performance evaluation of nurses are very important and beneficial for both nurses and employers in an era of clinical governance, increased accountability and high standards of health care services. They enhance and consolidate the knowledge and practical skills of nurses through the identification of training and career development plans, as well as improvement in health care quality services, increase in job satisfaction and use of cost-effective resources. In this paper, a data envelopment analysis (DEA) model is proposed for the appraisal and relative performance evaluation of nurses. The model is validated on thirty-two nurses working at an Intensive Care Unit (ICU) at one of the most recognized hospitals in Lebanon. The DEA was able to classify nurses into efficient and inefficient ones. The set of efficient nurses was used to establish an internal best-practice benchmark and to project career development plans for improving the performance of the inefficient nurses. The DEA result confirmed the ranking of some nurses and highlighted injustice in other cases produced by the currently practiced appraisal system. Further, the DEA model is shown to be an effective talent management and motivational tool, as it can provide clear managerial plans related to promotion, training and development activities from the perspective of nurses, hence increasing their satisfaction, motivation and acceptance of appraisal results. Due to such features, the model is currently being considered for implementation at the ICU. Finally, the ratio of the number of DEA units to the number of input/output measures is revisited, with new suggested values for its upper and lower limits depending on the type of DEA model and the desired number of efficient units from a managerial perspective.
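A minimal sketch of an input-oriented CCR DEA model in multiplier form is shown below, solved with linear programming on invented nurse data; the actual inputs, outputs and thirty-two ICU nurses of the paper are not reproduced.

```python
# Sketch of an input-oriented CCR DEA model (multiplier form) using invented nurse
# data, not the paper's measures. Efficiency = 1.0 marks a nurse on the frontier.
import numpy as np
from scipy.optimize import linprog

# Rows = nurses; invented example measures.
inputs = np.array([[40.0, 5.0], [38.0, 3.0], [42.0, 6.0], [36.0, 4.0]])   # hours, errors
outputs = np.array([[30.0, 8.0], [32.0, 9.0], [28.0, 6.0], [31.0, 9.0]])  # patients, training

n, m, s = inputs.shape[0], inputs.shape[1], outputs.shape[1]
for o in range(n):
    c = np.concatenate([-outputs[o], np.zeros(m)])             # maximise weighted outputs
    A_eq = [np.concatenate([np.zeros(s), inputs[o]])]          # weighted inputs of o = 1
    A_ub = np.hstack([outputs, -inputs])                       # u*y_j - v*x_j <= 0 for all j
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    print(f"nurse {o + 1}: efficiency = {-res.fun:.3f}")
```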
Abstract:
This paper describes the work undertaken in the Scholarly Ontologies Project. The aim of the project has been to develop a computational approach to support scholarly sensemaking, through interpretation and argumentation, enabling researchers to make claims: to describe and debate their view of a document's key contributions and relationships to the literature. The project has investigated the technicalities and practicalities of capturing conceptual relations, within and between conventional documents, in terms of abstract ontological structures. In this way, we have developed a new kind of index to distributed digital library systems. This paper reports a case study undertaken to test the sensemaking tools developed by the Scholarly Ontologies project. The tools used were ClaiMapper, which allows the user to sketch argument maps of individual papers and their connections; ClaiMaker, a server on which such models can be stored and which provides interpretative services to assist the querying of argument maps across multiple papers; and ClaimFinder, a novice interface to the search services in ClaiMaker.
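The sketch below illustrates, with a hypothetical relation vocabulary, how claims connecting papers might be captured as typed triples and queried; it is not ClaiMaker's actual ontology or API.

```python
# Illustrative sketch only: claims linking documents captured as typed triples and
# queried, in the spirit of the argument maps described above. The relation names
# and claims are hypothetical, not ClaiMaker's ontology.
from collections import defaultdict

claims = [
    ("paper_A", "extends", "paper_B"),
    ("paper_A", "challenges", "paper_C"),
    ("paper_D", "uses_method_of", "paper_B"),
    ("paper_C", "provides_evidence_for", "paper_B"),
]

# Index claims by relation for simple interpretative queries across papers.
by_relation = defaultdict(list)
for source, relation, target in claims:
    by_relation[relation].append((source, target))

def support_for(paper):
    """Papers connected to `paper` by a positive relation."""
    positive = {"extends", "provides_evidence_for", "uses_method_of"}
    return [s for s, r, t in claims if t == paper and r in positive]

print("challenges:", by_relation["challenges"])
print("support for paper_B:", support_for("paper_B"))
```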