897 results for Analytical hierarchical process


Relevance:

30.00%

Publisher:

Abstract:

The main purpose of this research is to develop and deploy an analytical framework for measuring the environmental performance of manufacturing supply chains. The work's theoretical basis combines and reconciles three major areas: supply chain management, environmental management and performance measurement. Researchers have suggested many empirical criteria for green supply chain (GSC) performance measurement and proposed both qualitative and quantitative frameworks. However, these are mainly operational in nature and specific to the focal company. This research develops an innovative GSC performance measurement framework by integrating supply chain processes (supplier relationship management, internal supply chain management and customer relationship management) with organisational decision levels (both strategic and operational). Environmental planning, environmental auditing, management commitment, environmental performance, economic performance and operational performance are the key constructs at each level. The proposed framework is then applied to three selected manufacturing organisations in the UK. Their GSC performance is measured and benchmarked using the analytic hierarchy process (AHP), a multiple-attribute decision-making technique. The AHP-based framework offers an effective way to measure and benchmark organisations' GSC performance. This study has both theoretical and practical implications. Theoretically, it contributes holistic constructs for designing a GSC and managing it for sustainability; practically, it helps industry practitioners to measure and improve the environmental performance of their supply chains. © 2013 Taylor and Francis Group, LLC. CORRIGENDUM (DOI 10.1080/09537287.2012.751186): In the article 'Green supply chain performance measurement using the analytic hierarchy process: a comparative analysis of manufacturing organisations' by Prasanta Kumar Dey and Walid Cheffi, Production Planning & Control, DOI 10.1080/09537287.2012.666859, a third author, Breno Nunes, has been added who was not included in the paper as it originally appeared.
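As a minimal illustration of the AHP step the study relies on, the Python sketch below derives priority weights from a pairwise comparison matrix via its principal eigenvector and checks Saaty's consistency ratio. The matrix values are invented for illustration and are not taken from the study.

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix for three criteria
# (values invented, not from the study). A[i, j] says how much more
# important criterion i is than criterion j on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority weights = principal right eigenvector, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1); CR = CI / RI,
# where RI is Saaty's random index (0.58 for n = 3). CR < 0.1 is acceptable.
n = A.shape[0]
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)
cr = ci / 0.58
print("weights:", w.round(3), "CR:", round(cr, 3))
```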

Relevance:

30.00%

Publisher:

Abstract:

Projection of a high-dimensional dataset onto a two-dimensional space is a useful tool for visualising structures and relationships in the dataset. However, a single two-dimensional visualisation may not display all the intrinsic structure. Therefore, hierarchical/multi-level visualisation methods have been used to extract a more detailed understanding of the data. Here we propose a multi-level Gaussian process latent variable model (MLGPLVM). MLGPLVM works by segmenting data (with, e.g., K-means, a Gaussian mixture model or interactive clustering) in the visualisation space and then fitting a visualisation model to each subset. To measure the quality of multi-level visualisation (with respect to parent and child models), metrics such as trustworthiness, continuity, mean relative rank errors, visualisation distance distortion and the negative log-likelihood per point are used. We evaluate the MLGPLVM approach on the 'Oil Flow' dataset and a dataset of protein electrostatic potentials for the human 'Major Histocompatibility Complex (MHC) class I'. In both cases, visual observation and the quantitative quality measures show better visualisation at the lower levels.
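Of the quality metrics listed above, trustworthiness is available directly in scikit-learn; the sketch below scores a two-dimensional PCA projection of random stand-in data (not the 'Oil Flow' set). Continuity has no dedicated scikit-learn function, but it can be computed by swapping the roles of the data and the embedding in the same formula.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import trustworthiness

# Random high-dimensional data standing in for a real set such as 'Oil Flow'.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))

# A two-dimensional projection to score.
X2 = PCA(n_components=2).fit_transform(X)

# Trustworthiness: do neighbours in the 2-D plot reflect neighbours in the
# original space? 1.0 is perfect.
t = trustworthiness(X, X2, n_neighbors=12)

# Continuity is the symmetric counterpart: swap data and embedding.
c = trustworthiness(X2, X, n_neighbors=12)
print(f"trustworthiness={t:.3f} continuity={c:.3f}")
```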

Relevance:

30.00%

Publisher:

Abstract:

Recently, we have developed the hierarchical Generative Topographic Mapping (HGTM), an interactive method for visualization of large high-dimensional real-valued data sets. In this paper, we propose a more general visualization system by extending HGTM in three ways that allow the user to visualize a wider range of data sets and better support the model development process. 1) We integrate HGTM with noise models from the exponential family of distributions; the basic building block is the Latent Trait Model (LTM). This enables us to visualize data of an inherently discrete nature, e.g., collections of documents, in a hierarchical manner. 2) We give the user a choice of initializing the child plots of the current plot in either interactive or automatic mode. In the interactive mode, the user selects "regions of interest," whereas in the automatic mode an unsupervised minimum message length (MML)-inspired construction of a mixture of LTMs is employed. The unsupervised construction is particularly useful when high-level plots are covered with dense clusters of highly overlapping data projections, making the interactive mode difficult to use; such a situation often arises when visualizing large data sets. 3) We derive general formulas for magnification factors in latent trait models. Magnification factors are a useful tool for improving our understanding of the visualization plots, since they can highlight the boundaries between data clusters. We illustrate our approach on a toy example and evaluate it on three more complex real data sets. © 2005 IEEE.
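The automatic mode grows a mixture of LTMs under an MML-inspired criterion. As a loose analogue only, and not the paper's algorithm, the sketch below selects the number of child models for a toy latent space using scikit-learn Gaussian mixtures, with BIC standing in for a message-length score.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Toy latent-space projections standing in for points on a parent plot.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc, 0.3, size=(150, 2))
               for loc in ([-2, 0], [0, 2], [2, -1])])

# Pick the number of child models by a complexity-penalised score.
# BIC here is a simpler stand-in for the MML criterion used in the paper.
best_k, best_bic, best_gmm = None, np.inf, None
for k in range(1, 7):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(X)
    bic = gmm.bic(X)
    if bic < best_bic:
        best_k, best_bic, best_gmm = k, bic, gmm

# Each mixture component would seed one child visualization plot.
print("children:", best_k, "responsibilities:", best_gmm.predict_proba(X).shape)
```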

Relevance:

30.00%

Publisher:

Abstract:

Novel macroporous solid bases have been developed as alternative clean technologies to existing commercial homogeneous catalysts for the production of biodiesel from triglycerides; the latter suffer process disadvantages, including complex separation, associated saponification and engine corrosion, and are unsuitable for continuous operation. To this end, tuneable macroporous MgAl hydrotalcites have been prepared by an alkali-free route and characterised by TGA, XRD, SEM and XPS. The macropore architecture improves diffusion of bulky triglyceride molecules to the active base sites, increasing activity. Lamellar and macroporous hydrotalcites are compared for the transesterification of both model and plant oil feedstocks, and structure-reactivity relations are identified.

Relevance:

30.00%

Publisher:

Abstract:

Triggered biodegradable composites made entirely from renewable resources are urgently sought after to improve material recyclability or to divert materials from waste streams. Many biobased polymers and natural fibers usually display poor interfacial adhesion when combined in a composite material. Here we propose a way to modify the surfaces of natural fibers by utilizing bacteria (Acetobacter xylinum) to deposit nanosized bacterial cellulose around the fibers, which enhances their adhesion to renewable polymers. This paper describes the process of modifying large quantities of natural fibers with bacterial cellulose through their use as substrates for the bacteria during fermentation. The modified fibers were characterized by scanning electron microscopy, single fiber tensile tests, X-ray photoelectron spectroscopy, and inverse gas chromatography to determine their surface and mechanical properties. The practical adhesion between the modified fibers and the renewable polymers cellulose acetate butyrate and poly(L-lactic acid) was quantified using the single fiber pullout test. © 2008 American Chemical Society.

Relevance:

30.00%

Publisher:

Abstract:

Optimising the design, creation, operation and maintenance of expert systems is an important problem in artificial intelligence theory and in decision-making methods. In this paper, an approach to solving it is offered that uses a technology based on the methodology of systems analysis, the ontology of the subject domain, and the principles and methods of self-organisation. The realisation of this approach, which rests on constructing a correspondence between the hierarchical structure of the ontology and the sequence of questions in automated examination systems, is expounded.
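One way to picture the correspondence between the ontology hierarchy and the question sequence is a depth-first walk of the concept tree that emits one question per concept. The sketch below is a hypothetical illustration of that idea, with an invented toy ontology and question template; it is not the authors' system.

```python
# Hypothetical toy ontology: each concept maps to its sub-concepts.
ontology = {
    "databases": ["relational model", "transactions"],
    "relational model": ["normalisation"],
    "transactions": [],
    "normalisation": [],
}

def question_sequence(concept, depth=0):
    """Depth-first walk of the ontology: one exam question per concept,
    so the question order follows the hierarchical structure."""
    yield f"{'  ' * depth}Q: Explain the concept '{concept}'."
    for child in ontology.get(concept, []):
        yield from question_sequence(child, depth + 1)

for q in question_sequence("databases"):
    print(q)
```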

Relevance:

30.00%

Publisher:

Abstract:

This work was supported by the Bulgarian National Science Fund under grant BY-TH-105/2005.

Relevance:

30.00%

Publisher:

Abstract:

Usually, generalization is considered a function of learning from a set of examples. In the present work, on the basis of the recent neural network assembly memory model (NNAMM), a biologically plausible 'grandmother' model for vision is proposed in which each separate memory unit can itself generalize. For such generalization by computation through memory, analytical formulae and a numerical procedure are found to calculate exactly the generalization ability of a perfectly learned memory unit. The model's memory has a complex hierarchical structure, can be learned from one example by a one-step process, and may be considered semi-representational. A simple binary neural network for bell-shaped tuning is also described.
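As a hypothetical illustration of generalization by computation through memory, the sketch below treats a perfectly learned memory unit as a single stored binary template that accepts any input within a fixed Hamming radius, and counts exactly what fraction of the input space it generalizes over. The radius-based acceptance rule is an assumption made for illustration, not the NNAMM formulae.

```python
from math import comb

def generalization_fraction(n_bits: int, radius: int) -> float:
    """Exact fraction of all n-bit inputs lying within the given Hamming
    radius of a single stored template (one-example, one-step learning)."""
    accepted = sum(comb(n_bits, k) for k in range(radius + 1))
    return accepted / 2 ** n_bits

# A 16-bit memory unit that tolerates up to 3 flipped bits.
print(f"{generalization_fraction(16, 3):.4%} of the input space recognised")
```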

Relevance:

30.00%

Publisher:

Abstract:

Hierarchically structured Cu2O nanocubes have been synthesized by a facile and cost-effective one-pot, solution-phase process. Self-assembly of 5 nm Cu2O nanocrystallites, induced through reduction by glucose, affords a mesoporous 375 nm cubic architecture whose visible-light photocatalytic performance, in both methylene blue dye degradation and hydrogen production from water, is superior to that of conventional non-porous analogues. The hierarchical nanocubes offer more accessible surface active sites and improved optical/electronic properties, which act in concert to confer 200–300% rate enhancements for the photocatalytic decomposition of organic pollutants and the production of solar fuels.

Relevance:

30.00%

Publisher:

Abstract:

This dissertation provides an analytical framework for studying the political economy of policy reform in the Dominican Republic during the nineties. Based on a country study, I develop two theoretical models that replicate the mechanisms of policy approval in developing countries with weak democracies. The first model considers a pro-reform President who submits a tariff bill to an anti-reform Congress dominated by the opposition party. In between, two opposing lobbies try to get their favored policy approved. The lobbies act as Stackelberg leaders vis-à-vis a weak President. The behavior of the Congress is determined exogenously, while the lobbies act strategically, pursuing the approval of the reform bill and indirectly affecting the President's decision. I show that in such a setting external agents like the Press play an important role in the decision-making process of the political actors.

The second model presents a similar framework; however, the President, who is now a Stackelberg leader, is allowed only two choices, total reform or status quo. I show how a lobby reacts to an increase in its rival's or its own size; these reactions depend on the President's level of commitment to the reform. Finally, I discuss the effect of variations in the size of the lobbies on the President's choice. The model suitably explains real events that took place in the Dominican Republic in the mid-nineties.
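As a toy illustration of the Stackelberg timing described above, the sketch below lets a leader choose a reform level while anticipating the follower's best response. The quadratic payoffs are invented for illustration and are not the dissertation's specification.

```python
import numpy as np

# Toy quadratic payoffs (illustrative only). The leader moves first; the
# follower observes x and best-responds; the leader anticipates this.
def follower_best_response(x):
    # Follower maximises -(y - 0.5*x)**2, so y* = 0.5*x.
    return 0.5 * x

def leader_payoff(x):
    y = follower_best_response(x)
    return -(x - 1.0) ** 2 - 0.8 * y  # reform benefit minus lobbying cost

# Grid search over the leader's choice, as a stand-in for solving the
# first-order condition analytically.
grid = np.linspace(0.0, 2.0, 2001)
x_star = grid[np.argmax([leader_payoff(x) for x in grid])]
print(f"leader: {x_star:.3f}, follower: {follower_best_response(x_star):.3f}")
```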

Relevance:

30.00%

Publisher:

Abstract:

The total time a customer spends in a business process system, called the customer cycle-time, is a major contributor to overall customer satisfaction. Business process analysts and designers are frequently asked to design process solutions with optimal performance. Simulation models have been very popular for quantitatively evaluating business processes; however, simulation is time-consuming and requires extensive modeling experience. Moreover, simulation models neither provide recommendations nor yield optimal solutions for business process design. A queueing network model is a good analytical approach to business process analysis and design, and can provide a useful abstraction of a business process. However, existing queueing network models were developed for telephone systems or applied to manufacturing processes in which machine servers dominate the system. In a business process, the servers are usually people, and the characteristics of human servers, namely specialization and coordination, should be taken into account by the queueing model.

The research described in this dissertation develops an open queueing network model for quick analysis of business processes. Additionally, optimization models are developed to provide optimal business process designs. The queueing network model extends and improves upon existing multi-class open queueing network (MOQN) models so that customer flow in human-server-oriented processes can be modeled. The optimization models help business process designers find the optimal design of a business process with consideration of specialization and coordination.

The main findings of the research are as follows. First, parallelization can reduce the cycle-time for those customer classes that require more than one parallel activity. Second, under high server utilization the coordination time due to parallelization overwhelms the savings from parallelization, since the waiting time increases significantly and thus the cycle-time increases. Third, the level of industrial technology employed by a company and the coordination time needed to manage the tasks have the strongest impact on business process design: when the level of industrial technology is high, more division is required to improve the cycle-time; when the required coordination time is high, consolidation is required to improve the cycle-time.
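As a hedged illustration of the queueing viewpoint, the sketch below computes the expected cycle time at a single activity modeled as an M/M/c queue with c interchangeable human servers, using the standard Erlang C formula. This is one station in isolation, not the paper's multi-class open network.

```python
from math import factorial

def mmc_cycle_time(lam: float, mu: float, c: int) -> float:
    """Expected time in system (wait + service) for an M/M/c queue:
    one activity staffed by c interchangeable human servers."""
    a = lam / mu                      # offered load
    assert a < c, "utilisation must be below 1"
    # Erlang C: probability an arriving customer has to wait.
    top = (a ** c / factorial(c)) * (c / (c - a))
    erlang_c = top / (sum(a ** k / factorial(k) for k in range(c)) + top)
    wq = erlang_c / (c * mu - lam)    # mean wait in queue
    return wq + 1 / mu                # cycle time = wait + service

# 10 customers/hour, each task takes 15 minutes (mu = 4/hour), 3 servers.
print(f"cycle time: {mmc_cycle_time(10, 4, 3) * 60:.1f} minutes")
```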

Relevance:

30.00%

Publisher:

Abstract:

The present study, employing psychometric meta-analysis of 92 independent studies with sample sizes ranging from 26 to 322 leaders, examined the relationship between emotional intelligence (EI) and leadership effectiveness. Overall, the results supported a linkage between leader EI and effectiveness that was moderate in nature (ρ = .25). In addition, the positive manifold of the effect sizes presented in this study, ranging from .10 to .44, indicates that emotional intelligence has meaningful relations with myriad leadership outcomes, including effectiveness, transformational leadership, LMX, follower job satisfaction, and others. Furthermore, this paper examined potential process mechanisms that may account for the EI-leadership effectiveness relationship and showed that both transformational leadership and LMX partially mediate this relationship. However, while the predictive validities of EI were moderate in nature, path analysis and hierarchical regression suggest that EI contributes 1% or less of explained variance in leadership effectiveness once personality and intelligence are accounted for.
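The kind of computation behind a psychometric (Hunter-Schmidt "bare-bones") meta-analytic estimate such as ρ = .25 can be sketched directly; the study-level correlations and sample sizes below are invented, not the study's data.

```python
# Invented study-level correlations and sample sizes (illustrative only).
r = [0.10, 0.22, 0.31, 0.44, 0.18]
n = [26, 120, 322, 75, 210]

# Hunter-Schmidt bare-bones: sample-size-weighted mean correlation.
N = sum(n)
r_bar = sum(ni * ri for ni, ri in zip(n, r)) / N

# Observed variance of r across studies, weighted the same way.
var_obs = sum(ni * (ri - r_bar) ** 2 for ni, ri in zip(n, r)) / N

# Variance expected from sampling error alone (N/k = average sample size).
var_err = (1 - r_bar ** 2) ** 2 / (N / len(r) - 1)

print(f"mean r = {r_bar:.3f}, observed var = {var_obs:.4f}, "
      f"sampling-error var = {var_err:.4f}")
```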

Relevance:

30.00%

Publisher:

Abstract:

Stable isotope analysis has emerged as one of the primary means for examining the structure and dynamics of food webs, and numerous analytical approaches are now commonly used in the field. Techniques range from simple, qualitative inferences based on the isotopic niche, to Bayesian mixing models that can be used to characterize food-web structure at multiple hierarchical levels. We provide a comprehensive review of these techniques, and thus a single reference source to help identify the most useful approaches to apply to a given data set. We structure the review around four general questions: (1) what is the trophic position of an organism in a food web?; (2) which resource pools support consumers?; (3) what additional information does relative position of consumers in isotopic space reveal about food-web structure?; and (4) what is the degree of trophic variability at the intrapopulation level? For each general question, we detail different approaches that have been applied, discussing the strengths and weaknesses of each. We conclude with a set of suggestions that transcend individual analytical approaches, and provide guidance for future applications in the field.
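For the first of the four questions, the standard nitrogen-isotope estimator of trophic position (in the style of Post 2002) is simple enough to sketch; the δ15N values below are invented.

```python
def trophic_position(d15n_consumer: float, d15n_base: float,
                     lambda_base: float = 2.0, delta_n: float = 3.4) -> float:
    """Trophic position from nitrogen stable isotopes:
    TP = lambda + (d15N_consumer - d15N_base) / delta_n,
    where lambda is the trophic level of the baseline organism and
    delta_n is the per-trophic-level enrichment (~3.4 permil)."""
    return lambda_base + (d15n_consumer - d15n_base) / delta_n

# Invented values: a fish at 12.1 permil against a primary-consumer
# baseline (e.g., a snail) at 5.3 permil.
print(f"TP = {trophic_position(12.1, 5.3):.2f}")
```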

Relevance:

30.00%

Publisher:

Abstract:

This research investigates a new structural system utilising modular construction: five-sided boxes are cast on-site and stacked together to form a building. An analytical model of a typical building was created in each of two different finite element analysis programs, Robot Millennium and ETABS. The pros and cons of both Robot Millennium and ETABS are listed at several key stages in the development of an analytical model utilising this structural system. Robot Millennium was utilised initially but produced an analytical model too large to run successfully; the computational requirements exceeded the capacity of conventional computers. Robot Millennium was therefore abandoned in favour of ETABS, whose simpler algorithms and assumptions permitted running this large computational model. Tips are provided, and pitfalls signalled, throughout the process of modelling such complex buildings.

Under high seismic loading, the building required a new horizontal shear mechanism. This dissertation proposes a secondary floor that ties to the modular box through the use of gunwales and roughened surfaces with epoxy coatings. In addition, the vertical connections necessitated a new type of shear wall, consisting of waffled external walls tied through both reinforcement and a secondary concrete pour.

This structural system generates a building that is very rigid compared with a conventional structure. The proposed modular building exhibited a period of 1.27 seconds, about one-fifth that of a conventional building. The maximum lateral drift, 6.14 inches, occurs under seismic loading and is one-quarter of a conventional building's drift. The deflected shape and pattern of the interstorey drifts are consistent with those of a coupled shear wall building. In conclusion, the computer analysis indicates that this new structure exceeds current code requirements for both hurricane winds and high seismic loads, while providing a shortened construction time at reduced cost.

Relevance:

30.00%

Publisher:

Abstract:

The manner in which remains decompose has been and is currently being researched around the world, yet little is known about the generated scent of death. In fact, it was not until the Casey Anthony trial that research on the odor released from decomposing remains, and the compounds of which it is composed, was brought to light. The Anthony trial marked the first admission of human decomposition odor as forensic evidence in a court of law; however, it was not "ready for prime time", as scientific research on the scent of death is still in its infancy. This research employed solid-phase microextraction (SPME) with gas chromatography-mass spectrometry (GC-MS) to identify the volatile organic compounds (VOCs) released from decomposing remains and to assess the impact that different environmental conditions had on the scent of death. Using human cadaver analogues, it was discovered that the environment to which the remains were exposed dramatically affected the odors released, either by modifying the compounds of which the odor was composed or by enhancing or hindering the amount liberated. In addition, the VOCs released during the different stages of the decomposition process were evaluated for both human remains and analogues. Statistical analysis showed correlations between the stage of decay and the VOCs generated, such that each phase of decomposition was distinguishable based upon the type and abundance of compounds comprising the odor. This study has provided new insight into the scent of death and the factors that can dramatically affect it, specifically frozen, aquatic, and soil environments. Moreover, the results revealed that the different stages of decomposition were distinguishable based upon the type and total mass of each compound present. Based upon these findings, it is suggested that the training aids employed for human remains detection (HRD) canines should 1) be characteristic of remains that have undergone decomposition in different environmental settings, and 2) represent each stage of decay, to ensure that HRD canines are trained to the various odors they are likely to encounter in an operational situation.