940 results for practical epistemology analysis
Abstract:
This study proposes an integrated analytical framework for effective management of project risks by combining a multiple-criteria decision-making technique with decision tree analysis. First, a conceptual risk management model was developed through a thorough literature review. The model was then applied through action research on a petroleum oil refinery construction project in central India in order to demonstrate its effectiveness. Oil refinery construction projects are risky because of technical complexity, resource unavailability, the involvement of many stakeholders and strict environmental requirements. Although project risk management has been researched extensively, a practical and easily adoptable framework is missing. In the proposed framework, risks are identified using a cause-and-effect diagram, analysed using the analytic hierarchy process (AHP), and responses are developed using a risk map. Additionally, decision tree analysis allows various options for risk response development to be modelled and optimises the selection of the risk-mitigation strategy. The proposed risk management framework can be easily adopted and applied in any project and integrated with other project management knowledge areas.
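As a rough illustration (not taken from the paper) of the kind of calculation such a framework combines, the Python sketch below derives AHP priority weights for three hypothetical risk criteria from an assumed pairwise comparison matrix and then selects a risk response by a simple decision-tree expected-cost comparison; all figures are invented.

```python
import numpy as np

# Hypothetical AHP pairwise comparison matrix for three risk criteria
# (e.g. technical complexity, resource availability, environmental compliance).
# Entry [i, j] states how much more important criterion i is than criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Principal-eigenvector method: priority weights are the normalised components
# of the eigenvector belonging to the largest eigenvalue.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (random index 0.58 for a 3x3 matrix).
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
cr = ci / 0.58
print("priority weights:", weights.round(3), "consistency ratio:", round(cr, 3))

# Decision-tree step: choose the risk response with the lowest expected cost.
# Each option is (mitigation cost, probability risk still occurs, impact cost).
options = {
    "transfer (insure)": (40, 0.05, 500),
    "mitigate (redesign)": (70, 0.10, 500),
    "accept": (0, 0.30, 500),
}
expected_cost = {name: c + p * impact for name, (c, p, impact) in options.items()}
best = min(expected_cost, key=expected_cost.get)
print("expected costs:", expected_cost, "-> preferred response:", best)
```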
Abstract:
The appraisal and relative performance evaluation of nurses are very important and beneficial for both nurses and employers in an era of clinical governance, increased accountability and high standards of health care services. They enhance and consolidate the knowledge and practical skills of nurses through the identification of training and career development plans, as well as improving the quality of health care services, increasing job satisfaction and making cost-effective use of resources. In this paper, a data envelopment analysis (DEA) model is proposed for the appraisal and relative performance evaluation of nurses. The model is validated on thirty-two nurses working at an Intensive Care Unit (ICU) at one of the most recognized hospitals in Lebanon. The DEA model was able to classify nurses as efficient or inefficient. The set of efficient nurses was used to establish an internal best-practice benchmark and to project career development plans for improving the performance of the inefficient nurses. The DEA results confirmed the ranking of some nurses and highlighted unfairness in other cases produced by the currently practiced appraisal system. Further, the DEA model is shown to be an effective talent management and motivational tool, as it can provide clear managerial plans related to promotion, training and development activities from the perspective of nurses, hence increasing their satisfaction, motivation and acceptance of appraisal results. Owing to these features, the model is currently being considered for implementation at the ICU. Finally, the ratio of the number of DEA units to the number of input/output measures is revisited, with new suggested values for its upper and lower limits depending on the type of DEA model and the desired number of efficient units from a managerial perspective.
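For readers unfamiliar with DEA, the sketch below shows a minimal input-oriented CCR (constant returns to scale) efficiency calculation on a tiny invented nurse dataset; it is not the authors' model or data, and the choice of inputs and outputs is purely illustrative.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: rows = nurses (DMUs), inputs = [hours worked, years of training],
# outputs = [patients cared for, quality-audit score]. Figures are illustrative only.
X = np.array([[40, 3], [38, 5], [45, 2], [42, 4]], dtype=float)      # inputs
Y = np.array([[30, 80], [28, 90], [25, 70], [33, 85]], dtype=float)  # outputs
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of DMU o (envelopment form).
    Decision vector z = [theta, lambda_1, ..., lambda_n]."""
    c = np.zeros(n + 1)
    c[0] = 1.0  # minimise theta
    A_ub, b_ub = [], []
    for i in range(m):   # sum_j lambda_j * x_ij <= theta * x_io
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):   # sum_j lambda_j * y_rj >= y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for o in range(n):
    print(f"nurse {o + 1}: efficiency = {ccr_efficiency(o):.3f}")
```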
Abstract:
Purpose – This paper attempts to seek answers to four questions. Two of these questions have been borrowed (but adapted) from the work of Defee et al.: RQ1. To what extent is theory used in purchasing and supply chain management (P&SCM) research? RQ2. What are the prevalent theories to be found in P&SCM research? Following on from these questions, an additional question is posed: RQ3. Are theory-based papers more highly cited than papers with no theoretical foundation? Finally, drawing on the work of Harland et al., the authors have added a fourth question: RQ4. To what extent does P&SCM meet the tests of coherence, breadth and depth, and quality necessary to make it a scientific discipline? Design/methodology/approach – A systematic literature review was conducted in accordance with the model outlined by Tranfield et al. for three journals within the field of “purchasing and supply chain management”. In total, 1,113 articles were reviewed. In addition, a citation analysis was completed covering 806 articles in total. Findings – The headline features of the results suggest that, nearly a decade and a half on from its development, the field still lacks coherence. Theory is absent from much of the work, and although theory-based articles achieved on average a higher number of citations than non-theoretical papers, there is no obvious contender as an emergent paradigm for the discipline. Furthermore, it is evident that P&SCM does not meet Fabian's test necessary to make it a scientific discipline and is still some way from being a normal science. Research limitations/implications – This study would have benefited from the analysis of further journals; however, the analysis of 1,113 articles from three leading journals in the field of P&SCM was deemed sufficient in scope. In addition, a further significant line of enquiry to follow is the rigour versus relevance debate. Practical implications – This article is of interest to both an academic and a practitioner audience as it highlights the use of theories in P&SCM. Furthermore, this article raises a number of important questions: should research in this area draw more heavily on theory and, if so, which theories are appropriate? Social implications – The broader social implications relate to the discussion of how a scientific discipline develops and builds on the work of Fabian and Amundson. Originality/value – The data set for this study is significant and builds on a number of previous literature reviews. This review is both greater in scope than previous reviews and broader in its subject focus. In addition, the citation analysis (not previously conducted in any of the reviews) and statistical test highlight that theory-based articles are more highly cited than non-theoretically based papers. This could indicate that researchers are attempting to build on one another's work.
Abstract:
The main purpose of this research is to develop and deploy an analytical framework for measuring the environmental performance of manufacturing supply chains. This work's theoretical bases combine and reconcile three major areas: supply chain management, environmental management and performance measurement. Researchers have suggested many empirical criteria for green supply chain (GSC) performance measurement and proposed both qualitative and quantitative frameworks. However, these are mainly operational in nature and specific to the focal company. This research develops an innovative GSC performance measurement framework by integrating supply chain processes (supplier relationship management, internal supply chain management and customer relationship management) with organisational decision levels (both strategic and operational). Environmental planning, environmental auditing, management commitment, environmental performance, economic performance and operational performance are the key level constructs. The proposed framework is then applied to three selected manufacturing organisations in the UK. Their GSC performance is measured and benchmarked by using the analytic hierarchy process (AHP), a multiple-attribute decision-making technique. The AHP-based framework offers an effective way to measure and benchmark organisations’ GSC performance. This study has both theoretical and practical implications. Theoretically, it contributes holistic constructs for designing a GSC and managing it for sustainability; practically, it helps industry practitioners to measure and improve the environmental performance of their supply chain. © 2013 Taylor and Francis Group, LLC. Corrigendum (DOI 10.1080/09537287.2012.751186): in the article ‘Green supply chain performance measurement using the analytic hierarchy process: a comparative analysis of manufacturing organisations’ by Prasanta Kumar Dey and Walid Cheffi, Production Planning & Control, DOI 10.1080/09537287.2012.666859, a third author, Breno Nunes, is added who was not included in the paper as it originally appeared.
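A minimal sketch of the final aggregation step such a framework might use is given below: assumed criterion weights (as would come from AHP pairwise comparisons) are combined with assumed local scores for three hypothetical organisations to produce overall GSC indices for benchmarking. Neither the weights nor the scores come from the paper.

```python
# Hypothetical AHP-style aggregation for benchmarking three organisations.
# Criterion weights (assumed to come from pairwise comparisons) and local
# scores per construct are illustrative, not the paper's data.
weights = {
    "environmental planning": 0.25,
    "environmental auditing": 0.15,
    "management commitment": 0.20,
    "environmental performance": 0.20,
    "economic performance": 0.10,
    "operational performance": 0.10,
}
scores = {
    "Org A": [0.7, 0.6, 0.8, 0.5, 0.6, 0.7],
    "Org B": [0.5, 0.7, 0.6, 0.7, 0.8, 0.6],
    "Org C": [0.8, 0.5, 0.7, 0.6, 0.5, 0.8],
}
overall = {
    org: sum(w * s for w, s in zip(weights.values(), vals))
    for org, vals in scores.items()
}
# Rank the organisations by their weighted overall GSC index.
for org, value in sorted(overall.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{org}: GSC index = {value:.3f}")
```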
Abstract:
Purpose: The purpose of this paper is to investigate and benchmark green operations initiatives in the automotive industry, as documented in the environmental reports of selected companies. The investigation maps out the main environmental initiatives taken by the world's three major car manufacturers and benchmarks them against each other. The categorisation of green operations initiatives provided in the paper can also help companies in other sectors to evaluate their green practices. Design/methodology/approach: The first part of the paper is based on the existing literature on green and sustainable operations and the "unsustainable" context of automotive production. The second part relates to the roadmap and benchmarking of green operations initiatives, based on an analysis of secondary data from the automotive industry. Findings: The findings show that the world's three major car manufacturers are pursuing various environmental initiatives involving the following green operations practices: green buildings, eco-design, green supply chains, green manufacturing, reverse logistics and innovation. Research limitations/implications: The limitations of this paper start with its selection of companies, which was made using production volume and country of origin as the principal criteria. There is ample evidence that other, smaller companies are pursuing more sophisticated and original environmental initiatives. Also, there might be a gap between what companies say they do in their environmental reports and what they actually do. Practical implications: This paper helps practitioners in the automotive industry to benchmark themselves against the major volume manufacturers on three different continents. Practitioners from other industries will also find it valuable to discover how the automotive industry is pursuing environmental initiatives that go beyond manufacturing, given that green operations practices broadly cover all the activities of the operations function. Originality/value: The originality of the paper lies in its up-to-date analysis of the environmental reports of automotive companies. The paper offers value for researchers and practitioners through its contribution to the green operations literature. For instance, the inclusion of green buildings as part of green operations practices has so far been neglected by most researchers and authors in the field of green and sustainable operations. © Emerald Group Publishing Limited.
Abstract:
Performance evaluation in conventional data envelopment analysis (DEA) requires crisp numerical values. However, the observed values of the input and output data in real-world problems are often imprecise or vague. These imprecise and vague data can be represented by linguistic terms characterised by fuzzy numbers in DEA to reflect the decision-makers' intuition and subjective judgements. This paper extends the conventional DEA models to a fuzzy framework by proposing a new fuzzy additive DEA model for evaluating the efficiency of a set of decision-making units (DMUs) with fuzzy inputs and outputs. The contribution of this paper is threefold: (1) we consider ambiguous, uncertain and imprecise input and output data in DEA, (2) we propose a new fuzzy additive DEA model derived from the α-level approach and (3) we demonstrate the practical aspects of our model with two numerical examples and show its comparability with five different fuzzy DEA methods in the literature. Copyright © 2011 Inderscience Enterprises Ltd.
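The paper's additive model is not reproduced here; the sketch below only illustrates the α-level idea it builds on: each triangular fuzzy input or output is replaced by its α-cut interval, and a crisp DEA model (such as the CCR sketch above) is then solved at the interval endpoints. The fuzzy numbers are invented.

```python
def alpha_cut(tri, alpha):
    """Alpha-cut interval [lower, upper] of a triangular fuzzy number (a, b, c)."""
    a, b, c = tri
    return (a + alpha * (b - a), c - alpha * (c - b))

# Hypothetical fuzzy observations: 'about 40' hours of input, 'about 30' cases of output.
fuzzy_input = (35.0, 40.0, 47.0)
fuzzy_output = (26.0, 30.0, 33.0)

for alpha in (0.0, 0.5, 1.0):
    lo_x, hi_x = alpha_cut(fuzzy_input, alpha)
    lo_y, hi_y = alpha_cut(fuzzy_output, alpha)
    # At each alpha level, a crisp DEA model would be solved with the most
    # favourable bounds (lowest input, highest output) for the DMU under
    # evaluation and the least favourable bounds for the other DMUs.
    print(f"alpha={alpha}: input in [{lo_x:.1f}, {hi_x:.1f}], output in [{lo_y:.1f}, {hi_y:.1f}]")
```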
Abstract:
We study soliton solutions of the path-averaged propagation equation governing the transmission of dispersion-managed (DM) optical pulses in the (practical) limit when residual dispersion and nonlinearity only slightly affect the pulse dynamics over one compensation period. In the case of small dispersion map strengths, the averaged pulse dynamics is governed by a perturbed form of the nonlinear Schrödinger equation; applying a perturbation theory – elsewhere developed – based on inverse scattering theory, we derive an analytic expression for the envelope of the DM soliton. This expression correctly predicts the power enhancement arising from the dispersion management. Theoretical results are verified by direct numerical simulations.
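For context, and in standard notation that may differ from the paper's, the path-averaged model referred to has the form of a nonlinear Schrödinger equation with a small perturbation collecting the residual effects of the dispersion map:

```latex
% Generic notation, not the paper's exact equations: local DM-NLS model and its
% path-averaged, weakly perturbed form valid for small map strength.
i\,\frac{\partial u}{\partial z} + \frac{d(z)}{2}\,\frac{\partial^2 u}{\partial t^2} + |u|^2 u = 0,
\qquad
i\,\frac{\partial U}{\partial z} + \frac{\langle d \rangle}{2}\,\frac{\partial^2 U}{\partial t^2} + |U|^2 U = \varepsilon\, R[U].
```

Here d(z) is the periodically varying local dispersion, ⟨d⟩ its path average, and εR[U] the small residual term; treating εR[U] by inverse-scattering perturbation theory is what yields the DM soliton envelope and its power enhancement over the soliton of the unperturbed ⟨d⟩ equation.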
Abstract:
Since its introduction in 1978, data envelopment analysis (DEA) has become one of the preeminent nonparametric methods for measuring the efficiency and productivity of decision-making units (DMUs). Charnes et al. (1978) provided the original DEA constant returns to scale (CRS) model, later extended to variable returns to scale (VRS) by Banker et al. (1984). These ‘standard’ models are known by the acronyms CCR and BCC, respectively, and are now employed routinely in areas that range from the assessment of public sectors, such as hospitals and health care systems, schools, and universities, to private sectors, such as banks and financial institutions (Emrouznejad et al. 2008; Emrouznejad and De Witte 2010). The main objective of this volume is to publish original studies that go beyond the two standard CCR and BCC models, with both theoretical and practical applications, using advanced models in DEA.
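In the usual notation (not spelled out in the editorial itself), the CCR model and the BCC extension can be written as follows; this is the textbook formulation rather than anything specific to this volume:

```latex
% Input-oriented CCR model in multiplier (ratio) form for DMU o, with inputs
% x_{ij} and outputs y_{rj}; textbook formulation (Charnes et al. 1978).
\max_{u,v}\ \frac{\sum_{r} u_r\, y_{ro}}{\sum_{i} v_i\, x_{io}}
\quad \text{s.t.} \quad
\frac{\sum_{r} u_r\, y_{rj}}{\sum_{i} v_i\, x_{ij}} \le 1 \ \ \forall j,
\qquad u_r \ge 0,\ v_i \ge 0.
% The BCC (VRS) model of Banker et al. (1984) adds the convexity constraint
% \sum_{j} \lambda_j = 1 in the dual (envelopment) form.
```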
Abstract:
We present a logical design of an all-optical processor that performs modular arithmetic. The overall design is based on a set of interconnected modules that use all-optical gates to perform simple logical functions. The all-optical logic gates are based on the semiconductor optical amplifier nonlinear loop. Simulation results are presented and some practical design issues are discussed.
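As a purely software-level analogy (the paper's gates are realised optically with SOA-based nonlinear loops, which is not modelled here), the sketch below interconnects simple Boolean gate functions into a ripple-carry adder whose dropped final carry performs addition modulo 2^n.

```python
# Gate-level sketch (software only) of how simple logic functions can be
# interconnected to perform modular arithmetic.

def XOR(a, b): return a ^ b
def AND(a, b): return a & b
def OR(a, b):  return a | b

def full_adder(a, b, cin):
    """One-bit full adder built only from the gate primitives above."""
    s = XOR(XOR(a, b), cin)
    cout = OR(AND(a, b), AND(cin, XOR(a, b)))
    return s, cout

def add_mod_2n(x, y, n):
    """(x + y) mod 2**n via an n-bit ripple-carry adder: dropping the final
    carry is exactly the reduction modulo 2**n."""
    carry, bits = 0, []
    for i in range(n):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        bits.append(s)
    return sum(bit << i for i, bit in enumerate(bits))

print(add_mod_2n(13, 7, 4), (13 + 7) % 16)  # both print 4
```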
Abstract:
In the nonparametric framework of Data Envelopment Analysis (DEA), the statistical properties of its estimators have been investigated and only asymptotic results are available. For DEA estimators, results of practical use have been proved only for the case of one input and one output. However, in real-world problems the production process is usually well described by many variables. In this paper, a machine learning approach to variable aggregation based on Canonical Correlation Analysis is presented. This approach is applied to the efficiency estimation of all the farms on Terceira Island in the Azorean archipelago.
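A hedged sketch of the general idea, on invented data rather than the Terceira Island farm dataset, is shown below: scikit-learn's CCA aggregates the input block and the output block into one canonical variable each, after which a simple one-input, one-output efficiency score can be computed. The shift to positive values and the exact efficiency ratio used here are illustrative choices, not necessarily the authors' procedure.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# Hypothetical farm data (not the Terceira Island dataset): several inputs
# (land, labour, feed cost) and outputs (milk, meat) per farm.
rng = np.random.default_rng(0)
n_farms = 30
X = rng.uniform(10, 100, size=(n_farms, 3))                    # inputs
Y = (X @ np.array([[0.5, 0.2], [0.3, 0.4], [0.2, 0.3]])
     + rng.normal(0, 5, size=(n_farms, 2)))                    # outputs, loosely tied to inputs

# Aggregate the input block and the output block into one canonical variable each.
cca = CCA(n_components=1)
cca.fit(X, Y)
x_agg, y_agg = cca.transform(X, Y)

# The aggregated scores are centred; shifting them to be strictly positive
# makes them usable as the single input and single output of a standard DEA
# model, the case for which the known statistical results hold.
x_pos = x_agg.ravel() - x_agg.min() + 1.0
y_pos = y_agg.ravel() - y_agg.min() + 1.0
ratio = y_pos / x_pos
efficiency = ratio / ratio.max()   # simple one-input/one-output (CRS) efficiency
print("first five efficiency scores:", np.round(efficiency[:5], 3))
```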
A simulation analysis of spoke-terminals operating in LTL Hub-and-Spoke freight distribution systems
Abstract:
Due to copyright restrictions, this thesis is available only for consultation at Aston University Library and Information Services, by prior arrangement.
The research presented in this thesis is concerned with Discrete-Event Simulation (DES) modelling as a method to facilitate logistical policy development within the UK Less-than-Truckload (LTL) freight distribution sector, which has been typified by “Pallet Networks” operating on a hub-and-spoke philosophy. Current literature relating to LTL hub-and-spoke and cross-dock freight distribution systems traditionally examines a variety of network and hub design configurations, each consistent with classical notions of creating process efficiency, improving productivity, reducing costs and generally creating economies of scale through bulk optimisation. While there is a growing abundance of papers discussing both the network design and hub operational components mentioned above, the overall analysis falls short when it comes to the “spoke-terminal” of hub-and-spoke freight distribution systems and its capability for handling the diverse and discrete customer profiles of freight that multi-user LTL hub-and-spoke networks typically handle over the “last mile” of the delivery, in particular a mix of retail and non-retail customers. A simulation study is undertaken to investigate the impact on operational performance when the current combined spoke-terminal delivery tours are separated by profile type (i.e. retail or non-retail). The results indicate that a potential improvement in delivery performance can be made by separating retail and non-retail delivery runs at the spoke-terminal, and that dedicated retail and non-retail delivery tours could be adopted in order to better meet customer delivery requirements and to adapt hub-deployed policies. The study also draws on key operator experiences to highlight the main practical challenges of implementing the simulation results in the real world. The study concludes that DES can be harnessed as an enabling device to develop a ‘guide policy’; this policy needs to be flexible and should be applied in stages, taking into account growing retail exposure.
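A minimal discrete-event sketch of the comparison investigated in the thesis is given below, using the SimPy library on invented arrival and service-time figures; it merely contrasts a shared vehicle fleet with profile-dedicated fleets at a single spoke terminal and is not the thesis model.

```python
import random
import simpy

# Minimal discrete-event sketch (not the thesis model): consignments of two
# profile types arrive at a spoke terminal and are delivered by a small vehicle
# fleet. All figures are invented; retail drops are assumed to take longer
# (booking-in windows, handballing).
SERVICE_MIN = {"retail": 40, "non-retail": 20}   # assumed minutes per delivery
ARRIVALS = 200

def delivery(env, fleet, profile, times):
    arrive = env.now
    with fleet.request() as req:              # wait for a vehicle of the right fleet
        yield req
        yield env.timeout(random.expovariate(1.0 / SERVICE_MIN[profile]))
    times[profile].append(env.now - arrive)   # arrival-to-completion time

def generator(env, fleets, times):
    for _ in range(ARRIVALS):
        yield env.timeout(random.expovariate(1.0 / 12))   # ~1 consignment per 12 min
        profile = "retail" if random.random() < 0.5 else "non-retail"
        env.process(delivery(env, fleets[profile], profile, times))

def run(dedicated):
    random.seed(42)                           # similar demand for both configurations
    env = simpy.Environment()
    if dedicated:                             # two vehicles reserved per profile type
        fleets = {"retail": simpy.Resource(env, 2), "non-retail": simpy.Resource(env, 2)}
    else:                                     # four vehicles shared across both profiles
        shared = simpy.Resource(env, 4)
        fleets = {"retail": shared, "non-retail": shared}
    times = {"retail": [], "non-retail": []}
    env.process(generator(env, fleets, times))
    env.run()
    return {p: round(sum(t) / len(t), 1) for p, t in times.items() if t}

print("combined tours :", run(dedicated=False))
print("dedicated tours:", run(dedicated=True))
```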
Abstract:
Krasimir Manev, Neli Maneva, Haralambi Haralambiev - The business rules (BR) approach was introduced at the end of the last century to make the specification of enterprise software easier and to allow such software to satisfy the needs of the corresponding business better. Today most of the goals of the approach have been achieved, but efforts, in both research and practice, to establish a formal basis for the reverse extraction of BR from existing systems continue. This article presents an approach for extracting BR from program code, based on methods for static code analysis. Some advantages and disadvantages of such an approach are pointed out.
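As a toy illustration of rule extraction by static analysis (not the approach or technology stack described in the article), the sketch below uses Python's ast module to report 'if' conditions that compare a variable against a constant as candidate business rules.

```python
import ast

# Toy sketch: statically scan Python source for 'if' statements whose test
# compares a name or attribute with a constant, and report them as candidate
# business rules. Requires Python 3.9+ for ast.unparse.
SOURCE = """
def approve_order(order):
    if order.total > 10000:
        return "requires manager approval"
    if order.customer_age < 18:
        return "reject"
    return "auto-approve"
"""

class RuleCollector(ast.NodeVisitor):
    def __init__(self):
        self.rules = []

    def visit_If(self, node):
        test = node.test
        if isinstance(test, ast.Compare) and isinstance(test.comparators[0], ast.Constant):
            self.rules.append(
                f"line {node.lineno}: IF {ast.unparse(test)} "
                f"THEN {ast.unparse(node.body[0]).strip()}"
            )
        self.generic_visit(node)

collector = RuleCollector()
collector.visit(ast.parse(SOURCE))
print("\n".join(collector.rules))
```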
Abstract:
Purpose – This paper aims to clarify what ‘narrative analysis’ may entail when it is assumed that interview accounts can be treated as (collections of) narratives. What is considered a narrative and how these may be analyzed is open to debate. After suggesting an approach of how to deal with narrative analysis, the authors critically discuss how far it might offer insights into a particular accounting case. Design/methodology/approach – After having explained what the authors’ view on narrative analysis is, and how this is linked with the extant literature, the authors examine the socialisation processes of two early career accountants that have been articulated in an interview context. Findings – The approach to narrative analysis set out in this paper could help to clarify how and why certain interpretations from an interview are generated by a researcher. The authors emphasise the importance of discussing a researcher’s process of discovery when an interpretive approach to research is adopted. Research limitations/implications – The application of any method, and what a researcher thinks can be distilled from this, depends on the research outlook he/she has. As the authors adopt an interpretive approach to research in this paper, they acknowledge that the interpretations of narratives, and what they deem to be narratives, will be infused by their own perceptions. Practical implications – The authors believe that the writing-up of qualitative research from an interpretive stance would benefit from an explicit acceptance of the equivocal nature of interpretation. The way in which they present and discuss the narrative analyses in this paper intends to bring this to the fore. Originality/value – Whenever someone says he/she engages in narrative analysis, both the “narrative” and “analysis” part of “narrative analysis” need to be explicated. The authors believe that this only happens every so often. This paper puts forward an approach of how more clarity on this might be achieved by combining two frameworks in the extant literature, so that the transparency of the research is enhanced.
Abstract:
The uncertainty of measurements must be quantified and considered in order to prove conformance with specifications and to make other meaningful comparisons based on measurements. While there is a consistent methodology for the evaluation and expression of uncertainty within the metrology community, industry frequently uses the alternative Measurement Systems Analysis (MSA) methodology. This paper sets out to clarify the differences between uncertainty evaluation and MSA, and presents a novel hybrid methodology for industrial measurement which enables a correct evaluation of measurement uncertainty while utilising the practical tools of MSA. In particular, the use of Gage R&R ANOVA and Attribute Gage studies within a wider uncertainty evaluation framework is described. This enables in-line measurement data to be used to establish repeatability and reproducibility, without time-consuming repeatability studies being carried out, while maintaining a complete consideration of all sources of uncertainty and therefore enabling conformance to be proven with a stated level of confidence. Such a rigorous approach to product verification will become increasingly important in the era of the Light Controlled Factory, with metrology acting as the driving force to achieve right-first-time and highly automated manufacture of high-value, large-scale products such as aircraft, spacecraft and renewable power generation structures.
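A hedged sketch of how MSA outputs might sit inside a GUM-style uncertainty budget is shown below; the contribution values, the coverage factor k = 2 and the guard-banded conformance check are illustrative assumptions, not figures or procedures from the paper.

```python
import math

# Illustrative uncertainty budget (made-up values, in millimetres) combining
# Gage R&R results with other contributions. Standard uncertainties are
# combined in quadrature and expanded with a coverage factor k = 2 (~95 %).
contributions = {
    "repeatability (from Gage R&R EV)":   0.012,
    "reproducibility (from Gage R&R AV)": 0.008,
    "calibration of reference standard":  0.010,
    "thermal expansion correction":       0.005,
}
u_c = math.sqrt(sum(u ** 2 for u in contributions.values()))
U = 2 * u_c
print(f"combined standard uncertainty u_c = {u_c:.4f} mm")
print(f"expanded uncertainty U (k=2)      = {U:.4f} mm")

# Conformance check: the feature is proven to conform only if the measured
# value lies inside the specification limits reduced by U on each side.
lower_spec, upper_spec, measured = 10.000, 10.100, 10.045
conforms = (lower_spec + U) <= measured <= (upper_spec - U)
print("conformance proven:", conforms)
```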
Abstract:
This study analyses the current role of police-suspect interview discourse in the England & Wales criminal justice system, with a focus on its use as evidence. A central premise is that the interview should be viewed not as an isolated and self-contained discursive event, but as one link in a chain of events which together constitute the criminal justice process. It examines: (1) the format changes undergone by interview data after the interview has taken place, and (2) how the other links in the chain – both before and after the interview – affect the interview-room interaction itself. It thus examines the police interview as a multi-format, multi-purpose and multi-audience mode of discourse. An interdisciplinary and multi-method discourse-analytic approach is taken, combining elements of conversation analysis, pragmatics, sociolinguistics and critical discourse analysis. Data from a new corpus of recent police-suspect interviews, collected for this study, are used to illustrate previously unaddressed problems with the current process, mainly in the form of two detailed case studies. Additional data are taken from the case of Dr. Harold Shipman. The analysis reveals several causes for concern, both in aspects of the interaction in the interview room, and in the subsequent treatment of interview material as evidence, especially in the light of s.34 of the Criminal Justice and Public Order Act 1994. The implications of the findings for criminal justice are considered, along with some practical recommendations for improvements. Overall, this study demonstrates the need for increased awareness within the criminal justice system of the many linguistic factors affecting interview evidence.