906 results for machine-tool industry


Abstract:

Cloud data centres are critical business infrastructures and among the fastest growing service providers. Detecting anomalies in Cloud data centre operation is vital. Given the vast complexity of the data centre system software stack, applications and workloads, anomaly detection is a challenging endeavour. Current tools for detecting anomalies often use machine learning techniques, application instance behaviours or system metrics distribution, which are complex to implement in Cloud computing environments as they require training, access to application-level data and complex processing. This paper presents LADT, a lightweight anomaly detection tool for Cloud data centres that uses rigorous correlation of system metrics, implemented by an efficient correlation algorithm without the need for training or complex infrastructure setup. LADT is based on the hypothesis that, in an anomaly-free system, metrics from data centre host nodes and virtual machines (VMs) are strongly correlated. An anomaly is detected whenever this correlation drops below a threshold value. We demonstrate and evaluate LADT in a Cloud environment, showing that the hosting node's I/O operations per second (IOPS) are strongly correlated with the aggregated VM IOPS, but that this correlation vanishes when an application stresses the disk, indicating a node-level anomaly.
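
A minimal sketch of LADT's core test in Python, under assumed window and threshold values (the abstract does not give the paper's parameters): Pearson correlation between host IOPS and aggregated VM IOPS over a sliding window, with an anomaly flagged whenever the correlation drops below the threshold.

```python
import numpy as np

def ladt_anomalies(host_iops, vm_iops, window=30, threshold=0.5):
    """Flag windows where host IOPS and aggregated VM IOPS decorrelate.

    host_iops: 1-D array of host-level IOPS samples
    vm_iops:   2-D array (n_vms, n_samples) of per-VM IOPS samples
    window, threshold: assumed values, not the paper's.
    """
    agg = vm_iops.sum(axis=0)                  # aggregate VM-level IOPS
    flags = []
    for t in range(window, len(host_iops) + 1):
        h = host_iops[t - window:t]
        a = agg[t - window:t]
        r = np.corrcoef(h, a)[0, 1]            # Pearson correlation
        flags.append(r < threshold)            # anomaly if correlation drops
    return np.array(flags)
```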

Abstract:

Existing benchmarking methods are time-consuming processes, as they typically benchmark the entire Virtual Machine (VM) in order to generate accurate performance data, making them less suitable for real-time analytics. The research in this paper aims to surmount this challenge by presenting DocLite, a Docker Container-based Lightweight benchmarking tool. DocLite explores lightweight cloud benchmarking methods for rapidly executing benchmarks in near real time. DocLite is built on Docker container technology, which allows a user-defined portion of the VM (memory size and number of CPU cores) to be benchmarked. The tool incorporates two benchmarking methods: the first, referred to as the native method, employs containers to benchmark a small portion of the VM and generate performance ranks; the second uses historic benchmark data along with the native method, as a hybrid, to generate VM ranks. The proposed methods are evaluated on three use-cases and are observed to be up to 91 times faster than benchmarking the entire VM. In both methods, small containers provide the same quality of rankings as a large container. The native method generates ranks with over 90% and 86% accuracy for sequential and parallel execution of an application, respectively, compared against benchmarking the whole VM. The hybrid method did not improve the quality of the rankings significantly.
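
As a schematic illustration only (the scores and the equal weighting below are invented; the abstract does not specify how native and historic data are combined), the two ranking modes might be contrasted as follows:

```python
# Toy illustration of native vs. hybrid VM ranking. Benchmark scores and the
# 0.5/0.5 weighting are invented for the example.
native = {"vm-a": 62.0, "vm-b": 60.0, "vm-c": 71.3}    # container benchmark scores
historic = {"vm-a": 40.0, "vm-b": 70.0, "vm-c": 69.0}  # past whole-VM scores

hybrid = {vm: 0.5 * native[vm] + 0.5 * historic[vm] for vm in native}

def ranks(scores):
    # higher score = better rank (1 = best)
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {vm: i + 1 for i, vm in enumerate(ordered)}

print("native ranks:", ranks(native))  # historic data flips vm-a and vm-b
print("hybrid ranks:", ranks(hybrid))  # in the hybrid ranking
```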

Abstract:

This study aimed to develop a scientific and practical tool for assessing horse welfare after commercial transport over long journeys. A set of physical, behavioural and environmental measures was selected, covering welfare aspects of both transport and unloading procedures. The protocol was field-tested on 51 intra-EU commercial transports arriving at different sites in Italy. Univariate analysis was used to look for associations between the input variables (environmental hazards potentially affecting animal well-being during long transports) and the outcome variables (direct evaluation of the animals' condition). No severe welfare impairments were recorded (ie, dead on arrival, severe injuries, non-ambulatory animals), while milder ones were more frequent at unloading (eg, slipping, 36.7%; reluctance to move, 9.6%). Correlations emerged between ramp slope and falling; type of ramp floor and slipping; and fast gait and the presence of gaps between the ramp and the floor. The horses' behaviour was also related to the type of handling procedure used. The measures were repeatable and practical to apply and score during real-time unloading. This work provides a sound basis for a new and practical welfare assessment tool for horses travelling over long journeys. Careful and constant application of this protocol would give stakeholders the opportunity to track and monitor changes in the industry over time, as well as to identify high-risk areas in transport routines.
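
The univariate association tests could, for instance, take the form of a contingency-table analysis; the counts below are invented, and chi-square is only one plausible choice, as the abstract does not name the exact method used:

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: ramp slope (steep/shallow) vs. falling (yes/no).
# Counts are invented for illustration; the study's data are not reproduced here.
table = [[9, 17],    # steep ramp:   9 falls, 17 no falls
         [3, 22]]    # shallow ramp: 3 falls, 22 no falls
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}")  # small p suggests a slope-fall association
```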

Abstract:

This paper discusses the role of enterprise architecture representation, in the context of ERP (Enterprise Resource Planning) information systems, as an instrument for an organization to reflect on itself, develop its business strategies and align them with its Information Systems. The paper proposes a representation model of enterprise architecture as a tool for recommending good practices; it emerges from a case study undertaken in the context of an investigation into the advantages and limitations of ERP systems in the hospitality industry. The proposed approach is also inspired by other academic and market propositions suited to the objectives of the investigation. It consists of a set of items representing the steps that must be taken by top managers and IS managers.

Abstract:

Every can of tuna purchased by the consumer has taken a long journey before reaching the supermarket shelves: for each can bought there is a lengthy process from sea to shelf. A large proportion of the tuna cans purchased in the European Union come all the way from West Africa, a developing region with a high dependency on fisheries. Amidst ever-increasing demand for tuna products, the global tuna fisheries are set to continue expanding, apparently one of the last natural-resource-based industries in West Africa fit to do so. Tuna is the biggest fisheries export and dominates the fisheries sector in Ghana, a country situated in West Africa. This thesis aims to understand how this globally important industrial fishery functions in terms of procedures, practices, governance and finance. Socioeconomic influences, in the setting of a developing country, were also examined. For these purposes a Value Chain Analysis was employed. Value Chain Analysis is a tool commonly used to understand how different companies and organizations participate in a commodity chain, within a domestic policy environment, and how this shapes outcomes in the global economy. The analysis allows researchers to understand a commodity chain fully and hence identify realistic opportunities for consequential improvements. Interviews and questionnaires were employed in the field in Ghana, along with secondary data collection techniques. It was found that the fishery functions at the production level under the influence of large multinational companies and tends to operate with a certain degree of lawlessness. Governance over the value chain is well defined, but implementation is poor or non-existent. The processors, which are also dominated by multinationals, exert some control over the producers and their sales; however, the high-value links highlighted occur at the retail stage. Socioeconomic dynamics acting in the chain included the lack of communication between the public and private sectors, power imbalances amongst players at the production level, the role of local businesswomen as actors in the chain, and the general characteristics of the workers in the industry. Value addition and upgrading are needed most in governance over the chain, especially in Monitoring, Control and Surveillance. The results of the study provide a wealth of material about the components of a cost-heavy fishing industry in a developing country, an industry to which many eyes have recently turned due to illegal fishing activities. It highlights clearly where funding and future focus are needed. This value chain can be used as a guide by those who need to comprehend the financial complexities and real-life dynamics of the Ghanaian tuna fishing industry today.

Abstract:

Today more than 99% of plastics are petroleum-based, because of the availability and cost of the raw material. The durability of these plastics contributes to environmental problems once they are disposed of, and their persistence in the environment causes deleterious effects on the ecosystem. Environmental pollution awareness and the demand for green technology have drawn considerable attention from both academia and industry to biodegradable polymers. In this regard, green chemistry has the potential to provide a solution to this problematic issue. Laccase bio-grafting has recently been a focus of green chemistry due to growing environmental concerns, legal restrictions and the increasing availability of scientific knowledge. In the last several years, research covering various applications of laccases has increased rapidly, particularly in the field of grafting. In principle, laccase-assisted graft co-polymerization may impart a variety of new functionalities to a polymer. Polymers modified through grafting have a bright future, and their development is practically boundless. In the present work, novel biodegradable graft copolymers combining the advantages of a bacterial cellulose backbone and PHB side chains will be prepared by introducing an enzymatic grafting technique. The present research will be a first step in this biopolymer modification; to date no report has been found in the literature describing the enzymatic grafting of PHAs. The technique would also provide an efficient modulation approach to improve the biodegradability and biocompatibility of the graft copolymer. The newly grafted copolymers will exhibit unique functionalities with a wider range of potential applications, mainly in tissue engineering, biosensors, the pharmaceutical industry (drug delivery systems) and bioplastics.

Abstract:

The study of electricity market operation has been gaining importance in recent years, as a result of the new challenges produced by the restructuring of electricity markets. This restructuring increased the competitiveness of the market, but also its complexity. The growing complexity and unpredictability of the market's evolution in turn make decision making more difficult, so the participating entities are forced to rethink their behaviour and market strategies. A large amount of information concerning electricity markets is currently available. These data, covering numerous aspects of electricity market operation, are accessible free of charge and are essential for understanding and suitably modelling electricity markets. This paper proposes a tool able to handle, store and dynamically update such data. The proposed tool is expected to be of great importance in improving the comprehension of electricity markets and the interactions among the entities involved.
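
The abstract does not describe the tool's implementation; purely as a sketch of the idea (the schema, library choice and upsert policy are all assumptions), the "store and dynamically update" core could look like:

```python
import sqlite3

# Minimal sketch of a storage layer such a tool might use; the paper does not
# specify its implementation, so schema and update policy are assumptions.
db = sqlite3.connect("markets.db")
db.execute("""CREATE TABLE IF NOT EXISTS prices (
                market TEXT, period TEXT, price REAL,
                PRIMARY KEY (market, period))""")

def upsert(records):
    # Dynamic update: newer values replace older ones for the same key.
    db.executemany("INSERT OR REPLACE INTO prices VALUES (?, ?, ?)", records)
    db.commit()

upsert([("MIBEL", "2015-06-01T12:00", 52.4)])  # illustrative record
```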

Abstract:

The Janssen-Cilag proposal for a risk-sharing agreement regarding bortezomib received a welcome signal from NICE. The Office of Fair Trading report included risk-sharing agreements as an available tool for the National Health Service. Nonetheless, recent discussions have somewhat neglected the economic fundamentals underlying risk-sharing agreements. We argue here that risk-sharing agreements, although attractive due to the principle of paying by results, also entail risks. Too many patients may be put under treatment even with a low success probability. Prices are likely to be adjusted upward in anticipation of future risk-sharing agreements between the pharmaceutical company and the third-party payer. One available instrument is a verification cost per patient treated, which makes it possible to obtain the first-best allocation of patients to the new treatment under the risk-sharing agreement. Overall, the welfare effects of risk-sharing agreements are ambiguous, and care must be taken with their use.
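
The over-treatment risk and the role of verification can be made concrete with a stylized condition; the notation and simplification below are ours, not the paper's model:

```latex
% Stylized payer's decision under pay-by-results; notation is ours, not the
% paper's model. p: success probability, P: price paid only on success,
% B: monetized benefit of a success (B > P), v: verification cost per
% patient treated. The payer gains pB - pP - v from treating a patient,
% so it treats only when
\[
  pB \;\ge\; pP + v
  \quad\Longleftrightarrow\quad
  p \;\ge\; p^{*} \;=\; \frac{v}{B - P}.
\]
% A positive v therefore screens out low-probability patients; with v = 0,
% every patient with p > 0 appears worth treating, which is the
% over-treatment risk noted above.
```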

Abstract:

In this study, an attempt was made to measure and evaluate the eco-efficiency performance of a pultruded composite processing company. For this purpose the recommendations of the World Business Council for Sustainable Development (WBCSD) and the directives of the ISO 14031 standard were followed and applied. The main general indicators of eco-efficiency, as well as the specific indicators, were defined and determined. Based on the indicator figures, the value profile, the environmental profile and the pertinent eco-efficiency ratios were established and analyzed. In order to evaluate potential improvements in the company's eco-performance, new indicator values and eco-efficiency ratios were estimated taking into account the implementation of new procedures, both upstream and downstream of the production process, namely: i) adoption of a new heating system for the pultrusion die-tool in the manufacturing process, more effective and with lower heat losses; ii) a recycling approach, with partial reuse of scrap material derived from the manufacturing, cutting and assembly processes of GFRP profiles. These measures lead to significant improvements in the subsequently assessed eco-efficiency ratios of the present case study, yielding a more sustainable product and manufacturing process for pultruded GFRP profiles.
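
The eco-efficiency ratios referred to follow the general form recommended by the WBCSD; the specific value and influence indicators plugged into it are those defined by the study:

```latex
% General WBCSD form of an eco-efficiency ratio; the particular numerator
% and denominator indicators (e.g. production value; energy, material and
% water consumption; waste) are those the study defines.
\[
  \text{eco-efficiency} \;=\;
  \frac{\text{product or service value}}{\text{environmental influence}}
\]
```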

Abstract:

In this work project we discuss the advantages and disadvantages of social media as a marketing tool. Four international cases were analyzed to provide anecdotal evidence of how social and viral marketing have been used by four firms in very different industries. We reviewed empirical evidence on the topic to discuss the main components of viral marketing. We concluded that positive (electronic) word of mouth, short response time and seeding through high-network-value customers are the main drivers of the success of a viral marketing campaign. We also conducted a study of the Portuguese telecommunications industry, in particular the mobile segment. We found that the three main players operating in this market have been using social media successfully as a marketing tool in a strategic approach to the 14-25 year-old segment.

Abstract:

Tobacco smoke is an extremely complex aerosol consisting of thousands of compounds distributed between the particulate phase and the vapour phase. The toxicological effects of this smoke have been shown to be associated with compounds from both phases. Several biologically active compounds have been identified in tobacco smoke; however, no studies have demonstrated the relationship between the biological responses obtained in in vitro or in vivo tests and the compounds present in whole tobacco smoke. The aim of the present research is to develop reliable and robust smoke-fractionation methods using analytical separation and detection techniques combined with in vitro toxicological assays. A previous study by our collaborators, examining the combustion products of twelve major tobacco compounds, showed chlorogenic acid to be the most cytotoxic compound according to the in vitro micronucleus test. Accordingly, in this study a preparative liquid chromatography method was developed to fractionate the combustion products of chlorogenic acid. The fractions of these combustion products were then tested and the compounds responsible for the toxicity of chlorogenic acid were identified. The compound in the sub-fraction responsible for most of the cytotoxicity was identified as catechol, which was confirmed by liquid chromatography/time-of-flight mass spectrometry. Recent studies have demonstrated the toxicological effects of whole tobacco smoke and the specific involvement of the vapour phase, so our work then focused mainly on the analysis of whole smoke. The Borgwaldt RM20S® smoking machine, used with British American Tobacco cell exposure chambers, allows in vitro study of the exposure of cells to different concentrations of whole tobacco smoke. In vitro biological assays have a high degree of variability, so all other sources of variability must be taken into account to assess the toxicological endpoint of these assays accurately; however, the reliability of the machine's smoke generation had never been evaluated until now. We therefore determined the reliability of smoke generation and dilution (RSD between 0.7 and 12%) by quantifying two reference gases (CH4 by flame ionization detection and CO by infrared absorption) and a particulate-phase compound, solanesol (by high-performance liquid chromatography). The dose-dilution relationship of the vapour-phase compounds found in the cell exposure chamber was then characterized using a new extraction technique, Headspace Stir Bar Sorptive Extraction (HSSE), coupled with liquid chromatography/mass spectrometry. The repeatability of the method gave RSD values between 10 and 13% for five of the reference compounds identified in the vapour phase of cigarette smoke.
The response with the maximum area under the curve was obtained using the following experimental conditions: an exposure/desorption time interval of 10 ± 0.5 min, a desorption temperature of 200 °C for 2 min, and a cryofocussing temperature of -75 °C. The precision of the smoke dilution is linear and depends on analyte abundance and concentration (RSD of 6.2 to 17.2%) for amounts of 6 to 450 ng of the reference compounds. These results demonstrate that the Borgwaldt RM20S® smoking machine is a reliable tool for generating and delivering cigarette smoke repeatably and linearly to in vitro cell cultures. Our approach consists of developing a methodology for working with a single tobacco compound, which can subsequently be applied to more complex samples, e.g. the vapour phase of cigarette smoke. The methodology thus developed can potentially serve as a standardization method for instrument evaluation or product identification in the tobacco industry.
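
The RSD values quoted above are simply the sample standard deviation expressed as a percentage of the mean; a minimal computation with invented replicate measurements:

```python
import numpy as np

# RSD (relative standard deviation), the repeatability metric used above.
# Replicate values here are invented for illustration.
replicates = np.array([98.2, 101.5, 99.8, 103.1, 97.6])   # e.g. ng of solanesol
rsd = 100 * replicates.std(ddof=1) / replicates.mean()
print(f"RSD = {rsd:.1f}%")
```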

Abstract:

The dramatic growth of electronic securities trading holds great potential for investors, as well as for the securities industry in general. Given its particular risks, regulators face a significant challenge in the Internet as a new means of investing. Nevertheless, despite technological change, regulators' fundamental objectives and approach remain similar to current practice. This article analyzes the impact of the Internet on securities trading, focusing on the problems raised by the use of this new means of communication in the context of the secondary market. Its objective is therefore to portray typical investor complaints, as well as the fraudulent securities activities specific to cyberspace. The author synthesizes recent developments by analyzing the regulators' approach, doctrinal studies, case law and administrative decisions. The author wishes to thank Professor Raymonde Crête for her valuable comments and advice.

Abstract:

A growing body of research on Human-Computer Interaction (HCI) attempts fine-grained analyses of interaction in order to identify what influences user behaviour. In evaluating both user performance and user experience, particular attention is now paid to emotional and cognitive reactions during interaction. Standard qualitative approaches are limited, because they rely on observation and post-interaction interviews, which limits the precision of the diagnosis. Since user experience and emotional reactions are highly dynamic and contextualized, evaluation approaches must be equally so in order to allow a precise diagnosis of the interaction. This thesis presents a quantitative, dynamic evaluation approach that contextualizes users' reactions in order to identify their antecedents in the interaction with a system. The work is organized around three axes: 1) automatic recognition of the user's goals and task structure, using eye-tracking measures and activity in the environment, through machine learning; 2) inference of psychological constructs (arousal, emotional valence and cognitive load) through the analysis of physiological signals; 3) diagnosis of the interaction based on the dynamic coupling of the two preceding operations. The ideas behind our approach and its development are illustrated through their application in two experimental contexts: e-commerce and simulation-based learning. We also present the complete software tool that was implemented to allow evaluation professionals (e.g. ergonomists, game designers, trainers) to use the proposed approach for HCI evaluation. It is designed to facilitate the triangulation of the measurement devices involved in this work and to integrate with classical interaction-evaluation methods (e.g. questionnaires and coded observations).
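
As a toy illustration of axis 1 (recognizing the user's task from eye-tracking features by machine learning), with synthetic features and labels standing in for the thesis's unspecified feature set and learner:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for gaze-derived features; the thesis's actual features,
# windowing and classifier are not specified in the abstract.
rng = np.random.default_rng(1)
X = rng.random((200, 3))     # e.g. fixation count, mean fixation duration, saccade length
y = (X[:, 0] + X[:, 2] > 1.0).astype(int)   # synthetic task label

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X[:150], y[:150])
print("held-out accuracy:", clf.score(X[150:], y[150:]))
```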

Abstract:

To ensure the quality of machined products at minimum machining cost and maximum machining effectiveness, it is very important to select optimum parameters when metal cutting machine tools are employed. Traditionally, the experience of the operator plays a major role in the selection of optimum metal cutting conditions; however, attaining optimum values each time is difficult even for a skilled operator. The non-linear nature of the machining process has compelled engineers to search for more effective methods of optimization. The design objective preceding most engineering design activities is simply to minimize the cost of production or to maximize production efficiency. The main aim of the research work reported here is to build robust optimization algorithms by exploiting ideas that nature has to offer and using them to solve real-world optimization problems in manufacturing processes. In this thesis, after an exhaustive literature review, several optimization techniques used in various manufacturing processes were identified. The selection of optimal cutting parameters, such as depth of cut, feed and speed, is a very important issue for every machining process. Experiments were designed using the Taguchi technique, and dry turning of SS420 was performed on a Kirloskar Turnmaster-35 lathe. S/N and ANOVA analyses were performed to find the optimum level and the percentage contribution of each parameter, and the optimum machining parameters were obtained from the experiments using S/N analysis. Optimization algorithms begin with one or more design solutions supplied by the user and then iteratively explore new design solutions in the search space in order to reach the true optimum. A mathematical model for surface roughness was developed using response surface analysis, and the model was validated against published results from the literature. Optimization methodologies, namely Simulated Annealing (SA), Particle Swarm Optimization (PSO), a Conventional Genetic Algorithm (CGA) and an Improved Genetic Algorithm (IGA), were applied to optimize the machining parameters for dry turning of SS420. All the above algorithms were tested for efficiency, robustness and accuracy, and it was observed that they often outperform conventional optimization methods applied to difficult real-world problems. The SA, PSO, CGA and IGA codes were developed in MATLAB. For each evolutionary algorithmic method, optimum cutting conditions are provided to achieve better surface finish. The computational results using SA clearly demonstrated that the proposed solution procedure is quite capable of solving such complicated problems effectively and efficiently. PSO is a relatively recent heuristic search method whose mechanics are inspired by the swarming or collaborative behaviour of biological populations; from the results it was observed that PSO provides better results and is also more computationally efficient. Based on the results obtained using CGA and IGA for the optimization of the machining process, the proposed IGA provides better results than the conventional GA; the improved genetic algorithm incorporates a stochastic crossover technique and an artificial initial population scheme to provide a faster search mechanism.
Finally, a comparison among these algorithms was made for the specific example of dry turning of SS420 material, arriving at optimum machining parameters of feed, cutting speed, depth of cut and tool nose radius with minimum surface roughness as the criterion. To summarize, the research work fills in conspicuous gaps between research prototypes and industry requirements by simulating evolutionary procedures seen in nature that optimize its own systems.
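
As an illustration of how one of these methods searches the cutting-parameter space, here is a minimal PSO sketch in Python/NumPy; the response-surface coefficients, parameter bounds and PSO settings are all invented for the example and are not the thesis's fitted model or tuning.

```python
import numpy as np

rng = np.random.default_rng(0)

def surface_roughness(x):
    """Hypothetical second-order response-surface model Ra(speed, feed, depth).
    Coefficients are illustrative only, not the thesis's fitted model."""
    v, f, d = x[..., 0], x[..., 1], x[..., 2]   # m/min, mm/rev, mm
    return 2.0 - 0.008 * v + 12.0 * f + 0.5 * d + 30.0 * f**2 + 0.004 * f * v

lo = np.array([60.0, 0.05, 0.5])    # assumed lower bounds: speed, feed, depth
hi = np.array([180.0, 0.30, 2.0])   # assumed upper bounds

n, iters, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5      # swarm size and PSO constants
x = rng.uniform(lo, hi, (n, 3))                   # particle positions
vel = np.zeros_like(x)                            # particle velocities
pbest, pval = x.copy(), surface_roughness(x)      # personal bests
g = pbest[np.argmin(pval)]                        # global best position

for _ in range(iters):
    r1, r2 = rng.random((n, 3)), rng.random((n, 3))
    vel = w * vel + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
    x = np.clip(x + vel, lo, hi)                  # keep particles inside bounds
    f = surface_roughness(x)
    better = f < pval
    pbest[better], pval[better] = x[better], f[better]
    g = pbest[np.argmin(pval)]

print("optimum [speed, feed, depth]:", g, "predicted Ra:", pval.min())
```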

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools, running on a host machine, for the validation and optimization of embedded system code, which can help meet all of these goals; this can significantly improve software quality and is still a challenging field. This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, using static analysis of machine code to make debugging more effective at early detection of software bugs that are otherwise hard to detect. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae and their compliance is tested individually in all possible execution paths of the application programs.
An incorrect sequence of machine-code patterns is identified using slicing techniques on the control-flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum allocation of data to banked memory, resulting in the minimum number of bank-switching instructions in the embedded system software. A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code; instances of code redundancy are identified based on the stipulated rules for the target processor. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces state-space creation, contributing to improved model checking. Though the technique described is general, the implementation is architecture-oriented, and hence the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features when developing embedded systems.
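
A much-simplified sketch of the redundant bank-switching detection idea for the PIC16F87X (straight-line scan only; the instruction representation is assumed, and the dissertation's relation-matrix and control-flow-graph machinery is not reproduced):

```python
# Simplified sketch: flag redundant bank-select instructions in a linear
# PIC16F87X instruction stream. The dissertation works on a control-flow
# graph with a relation matrix; this toy version only scans straight-line
# code, and the (mnemonic, operand) format is an assumption for the example.

def redundant_bank_switches(instrs):
    """instrs: list of (mnemonic, operand) tuples, e.g. ('bsf', 'STATUS,RP0')."""
    bank_bits = {"STATUS,RP0": 0, "STATUS,RP1": 1}
    state = [None, None]          # RP0/RP1 unknown on entry
    redundant = []
    for idx, (op, arg) in enumerate(instrs):
        if op in ("bsf", "bcf") and arg in bank_bits:
            bit, val = bank_bits[arg], 1 if op == "bsf" else 0
            if state[bit] == val:          # bank bit already has this value
                redundant.append(idx)
            state[bit] = val
        elif op in ("call", "goto", "return"):
            state = [None, None]  # control transfer: bank state unknown again
    return redundant

code = [("bcf", "STATUS,RP0"), ("movwf", "PORTB"),
        ("bcf", "STATUS,RP0"),            # redundant: RP0 already clear
        ("bsf", "STATUS,RP0"), ("movwf", "TRISB")]
print(redundant_bank_switches(code))      # -> [2]
```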