28 results for Measurement and processing vibrations


Relevance:

100.00%

Publisher:

Abstract:

Measuring and compensating the pivot points of five-axis machine tools is challenging and very time-consuming. This paper presents a newly developed approach for the automatic measurement and compensation of pivot point positional errors on five-axis machine tools. Machine rotary axis errors are measured using a circular test. The method has been tested on five-axis machine tools with a swivel-table configuration. Results show that up to 99% of the positional error of the rotary axis can be compensated using this approach.
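The abstract does not spell out how the circular test data are reduced to a pivot error, but a common ingredient of such methods is a least-squares circle fit: the offset of the fitted centre from the nominal centre is the positional error to compensate. A minimal sketch under that assumption, with synthetic data (the fit method and numbers are illustrative, not taken from the paper):

```python
# Minimal sketch (not the paper's algorithm): estimating a rotary-axis
# pivot-point offset by least-squares circle fitting of circular-test data.
# The nominal pivot is assumed to be at the origin; all data are synthetic.
import numpy as np

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit.

    Solves x^2 + y^2 + a*x + b*y + c = 0 for (a, b, c); the fitted
    centre is (-a/2, -b/2) and the radius sqrt(a^2/4 + b^2/4 - c).
    """
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    center = np.array([-a / 2, -b / 2])
    radius = np.sqrt(center @ center - c)
    return center, radius

# Synthetic circular test: true pivot offset (0.05, -0.03) mm, radius 100 mm.
rng = np.random.default_rng(0)
theta = np.linspace(0, 2 * np.pi, 36, endpoint=False)
x = 0.05 + 100.0 * np.cos(theta) + rng.normal(0, 0.001, theta.size)
y = -0.03 + 100.0 * np.sin(theta) + rng.normal(0, 0.001, theta.size)

center, radius = fit_circle(x, y)
print(f"estimated pivot offset (mm): {center}")  # compensation = -center
```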

Relevance:

100.00%

Publisher:

Abstract:

Five-axis machine tools are becoming increasingly popular as customers demand more complex machined parts. In high-value manufacturing, machine tools are essential for producing high-accuracy products. High-accuracy manufacturing requires producing parts repeatably and in compliance with the defined design specifications. Machine tool performance is often affected by geometric errors arising from a variety of causes, including incorrect tool offsets, errors in the centres of rotation and thermal growth. As a consequence, it can be difficult to produce highly accurate parts consistently. It is therefore essential to verify machine tools in terms of their geometric and positioning accuracy. Once a machine tool has been verified, the resulting numerical values of positional accuracy and process capability can be used to define design-for-verification rules and algorithms, so that machined parts can be produced without scrap and with little or no post-process measurement. In this paper the benefits of machine tool verification are listed, and a case study is used to demonstrate the implementation of robust machine tool performance measurement and diagnostics using a ballbar system.
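As a hypothetical illustration of how verification results might feed such design-for-verification rules (this is not from the paper): measured positional errors can be turned into a standard process capability index against the part tolerance.

```python
# Hypothetical illustration (not from the paper): converting measured axis
# positioning errors into a standard process capability index (Cpk).
import numpy as np

def cpk(samples, lsl, usl):
    """Standard Cpk: min(USL - mean, mean - LSL) / (3 * sigma)."""
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Synthetic positioning errors in mm, design tolerance +/-0.010 mm.
rng = np.random.default_rng(0)
errors = rng.normal(loc=0.001, scale=0.002, size=50)
print(f"Cpk = {cpk(errors, -0.010, 0.010):.2f}")  # >= 1.33 often taken as capable
```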

Relevance:

100.00%

Publisher:

Abstract:

Incorporating the Material Balance Principle (MBP) into industrial and agricultural performance measurement systems with pollutant factors has been on the rise in recent years. Many conventional methods of performance measurement have proven incompatible with material flow conditions. This study addresses eco-efficiency measurement adjusted for pollution, taking into account material flow conditions and the MBP requirements, in order to provide 'real' measures of performance that can serve as guides when making policy. We develop a new approach that integrates a slacks-based measure into the Malmquist-Luenberger index, enhanced by a material balance condition reflecting the conservation of matter. This model is compared with a similar model, which incorporates the MBP via the trade-off approach, by measuring the productivity and eco-efficiency trends of power plants. The results are similar for both models, substantiating the robustness and applicability of the proposed model.
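For reference, and not reproduced from the paper itself: the Malmquist-Luenberger index of Chung et al. (1997), which such models extend, is built from directional distance functions that credit expansion of desirable outputs y and contraction of undesirable outputs b, while a generic material balance condition ties the pollutant flow to the material content of inputs and outputs.

```latex
% Standard Malmquist-Luenberger index (Chung et al., 1997) and a generic
% material balance condition; notation shown for reference only.
\[
ML^{t,t+1} =
\left[
\frac{1+\vec{D}^{\,t}(x^{t},y^{t},b^{t})}{1+\vec{D}^{\,t}(x^{t+1},y^{t+1},b^{t+1})}
\cdot
\frac{1+\vec{D}^{\,t+1}(x^{t},y^{t},b^{t})}{1+\vec{D}^{\,t+1}(x^{t+1},y^{t+1},b^{t+1})}
\right]^{1/2}
\]
\[
\text{Material balance:}\qquad
z = \sum_{i} a_{i}\,x_{i} - \sum_{j} r_{j}\,y_{j}
\]
% a_i, r_j: material (e.g. carbon) content per unit of input i and output j;
% z: the residual emitted as pollutant.
```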

Relevance:

100.00%

Publisher:

Abstract:

The solubility of telmisartan (form A) in nine organic solvents (chloroform, dichloromethane, ethanol, toluene, benzene, 2-propanol, ethyl acetate, methanol and acetone) was determined by a laser monitoring technique at temperatures from 277.85 to 338.35 K. The solubility of telmisartan (form A) in all nine solvents increased with temperature, and, except in chloroform and dichloromethane, the rate of increase also rose with temperature. The mole fraction solubility in chloroform is higher than that in dichloromethane, and both are an order of magnitude higher than those in the other seven solvents at the experimental temperatures. The solubility data were correlated with the modified Apelblat and λh equations. The results show that the λh equation agrees better with the experimental data than the Apelblat equation; the relative root mean square deviations (σ) of the λh equation range from 0.004 to 0.45 %. The dissolution enthalpies, entropies and Gibbs energies of telmisartan in these solvents were estimated from the van't Hoff and Gibbs equations. The melting point and fusion enthalpy of telmisartan were determined by differential scanning calorimetry.
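The correlation models named here are standard in solubility work. In their usual forms, with x the mole-fraction solubility, T the absolute temperature and T_m the melting temperature (A, B, C, λ and h are fitted parameters):

```latex
% Standard forms of the three models used (parameters fitted per solvent):
\[
\text{Modified Apelblat:}\qquad \ln x = A + \frac{B}{T} + C\ln T
\]
\[
\lambda h\ \text{(Buchowski):}\qquad
\ln\!\left[1+\frac{\lambda(1-x)}{x}\right]
= \lambda h\left(\frac{1}{T}-\frac{1}{T_{m}}\right)
\]
\[
\text{van't Hoff:}\qquad
\ln x = -\frac{\Delta H_{d}}{RT} + \frac{\Delta S_{d}}{R}
\]
```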

Relevance:

100.00%

Publisher:

Abstract:

Although atypical social behaviour remains a key characterisation of ASD, the presence of sensory and perceptual abnormalities has been given a more central role in recent classification changes. An understanding of the origins of such aberrations could thus prove a fruitful focus for ASD research. Early neurocognitive models of ASD suggested that the study of high-frequency activity in the brain, as a measure of cortical connectivity, might provide the key to understanding the neural correlates of sensory and perceptual deviations in ASD. As our review shows, the findings from subsequent research have been inconsistent, with a lack of agreement about the nature of any high-frequency disturbances in ASD brains. Based on the application of new techniques using more sophisticated measures of brain synchronisation, direction of information flow, and the coupling between high and low frequency bands, we propose a framework which could reconcile apparently conflicting findings in this area and would be consistent both with emerging neurocognitive models of autism and with the heterogeneity of the condition.
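One concrete instance of the cross-frequency measures referred to, shown purely as an illustration rather than as the review's own pipeline, is phase-amplitude coupling: quantifying how strongly the phase of a low-frequency band modulates the amplitude envelope of a high-frequency band. A sketch using the mean-vector-length measure on synthetic data:

```python
# Illustrative sketch (not from the review): phase-amplitude coupling (PAC)
# between a low-frequency phase and a high-frequency amplitude envelope,
# using the Canolty-style mean-vector-length measure.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(sig, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, sig)

def pac_mvl(sig, fs, phase_band=(4, 8), amp_band=(30, 80)):
    """Mean vector length PAC: |mean(A_high * exp(i * phi_low))|."""
    phase = np.angle(hilbert(bandpass(sig, *phase_band, fs)))
    amp = np.abs(hilbert(bandpass(sig, *amp_band, fs)))
    return np.abs(np.mean(amp * np.exp(1j * phase)))

# Synthetic signal: gamma bursts locked to the theta peak, plus noise.
rng = np.random.default_rng(0)
fs = 500
t = np.arange(0, 10, 1 / fs)
theta = np.sin(2 * np.pi * 6 * t)
gamma = (1 + theta) * np.sin(2 * np.pi * 50 * t)
sig = theta + 0.3 * gamma + 0.1 * rng.standard_normal(t.size)
print(f"PAC (mean vector length): {pac_mvl(sig, fs):.3f}")
```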

Relevance:

100.00%

Publisher:

Abstract:

In a series of experiments, we tested category-specific activation in normal participants using magnetoencephalography (MEG). Because MEG characterises neural activity on the order of milliseconds, our experiments explored the temporal dynamics of object processing: the time-course of object naming, and early and late differences both in processing living compared with nonliving objects and in processing objects at the basic compared with the domain level. In addition to the studies with normal participants, we used MEG to explore category-specific processing in a patient with a deficit for living objects. Our findings support the cascade model of object naming (Humphreys et al., 1988). The findings from normal participants also demonstrate early, category-specific perceptual differences, corroborated by the patient study. In assessing the time-course of category-specific effects, and in a separate analysis designed to measure semantic differences between living and nonliving objects, we found support for the sensory/motor model of object naming (Martin, 1998) in addition to the cascade model. Object processing in normal participants thus appears to be served by a distributed network in the brain, with both perceptual and semantic differences between living and nonliving objects. A separate study assessing how the level at which participants are asked to identify an object influences processing in the brain found evidence supporting the convergence zone hypothesis (Damasio, 1989). Taken together, these findings demonstrate the utility of MEG in exploring the time-course of object processing and in isolating early perceptual and later semantic effects within the brain.

Relevance:

100.00%

Publisher:

Abstract:

The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. This thesis extends the method of productivity measurement and analysis based on the concept of added value, making it appropriate to companies in which the materials, bought-in parts and services change substantially and in which a number of plants and inter-related units provide components for final assembly. Alternative productivity indices and their problems are reviewed and compared, and appropriate solutions are put forward for productivity analysis in general and the added-value method in particular. Based on this concept and method, three kinds of computerised model have been developed: two deterministic, called sensitivity analysis and deterministic appraisal, and one stochastic, called risk simulation. They support the planning of productivity and productivity growth with reference to changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution. The models are designed to be flexible and can be adjusted according to the available computer capacity, the expected accuracy and the presentation of the output. The stochastic model assumes statistical independence between the individual variables and normality of their probability distributions; the component variables were forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment were made as required. The results of applying these measurement and planning models to the British motor vehicle manufacturing companies are presented and discussed.
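A minimal sketch of the kind of risk simulation described, with hypothetical variable names, distributions and figures (the thesis's actual model is considerably more elaborate): added-value productivity is sampled from independent, normally distributed component variables.

```python
# Minimal sketch of an added-value risk simulation (hypothetical parameters,
# not the thesis's model): productivity = added value / employment costs,
# with the components treated as independent normal variables.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo trials

# Component variables (means and standard deviations are illustrative, in GBP m).
sales      = rng.normal(520.0, 40.0, N)  # sales revenue
materials  = rng.normal(310.0, 30.0, N)  # materials, bought-in parts, services
employment = rng.normal(120.0, 8.0, N)   # employment costs

added_value = sales - materials
productivity = added_value / employment

print(f"mean productivity: {productivity.mean():.2f}")
print(f"5th-95th percentile: {np.percentile(productivity, [5, 95]).round(2)}")
```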

Relevance:

100.00%

Publisher:

Abstract:

Two reactive comonomers, divinylbenzene (DVB) and trimethylolpropane triacrylate (TRIS), were evaluated for their role in the melt free-radical grafting reaction of the monomer glycidyl methacrylate (GMA) onto polypropylene (PP). The characteristics of the GMA-grafting systems in the presence and absence of DVB or TRIS were examined and compared in terms of the yield of the grafting reaction and the extent of the main side reactions, namely homopolymerisation of GMA (poly-GMA) and polymer degradation, using different chemical compositions of the reactive systems and different processing conditions. In the absence of the comonomers, i.e. in a conventional system, high peroxide initiator concentrations were typically required to achieve the highest possible GMA grafting levels, which were generally low; concomitantly, both poly-GMA formation and degradation of the polymer by chain scission increased with increasing initiator amount. By contrast, the presence of a small amount of DVB or TRIS in the GMA-grafting system was shown to bring about a significant increase in the grafting level, paralleled by a large reduction in poly-GMA and PP degradation. In the presence of these highly reactive comonomers, the optimum grafting system requires a much lower concentration of the peroxide initiator and, consequently, leads to the much lower degree of polymer degradation observed in these systems. The effects of DVB and of TRIS on the rates of the GMA-grafting and homopolymerisation reactions, and on the extent of PP degradation (followed through melt flow changes), were compared and contrasted with a conventional GMA-grafting system.

Relevance:

100.00%

Publisher:

Abstract:

Guest editorial

Ali Emrouznejad is a Senior Lecturer at Aston Business School in Birmingham, UK. His research interests include performance measurement and management, efficiency and productivity analysis, and data mining. He has published widely in international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and has been Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Sector Management. He is on the editorial board of several international journals and is co-founder of Performance Improvement Management Software.

William Ho is a Senior Lecturer at Aston Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in international journals including Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of OR Insight. Currently, he is a Scholar of the Advanced Institute of Management Research.

Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector

This special issue focuses on holistic, applied research on performance measurement in energy sector management, with the aim of bridging the gap between industry and academia. After a rigorous refereeing process, seven papers were included.

The volume opens with five data envelopment analysis (DEA)-based papers. Wu et al. apply the DEA-based Malmquist index to evaluate changes in the relative efficiency and total factor productivity of coal-fired electricity generation in 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labour, capital, sulphur dioxide emissions, and electricity generated. The authors find that the eastern provinces were relatively and technically more efficient, whereas the western provinces had the highest growth rate over the period studied.

Ioannis E. Tsolas applies DEA to assess the performance of Greek fossil-fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, bootstrapping is deployed to address the uncertainty surrounding DEA point estimates and to provide bias-corrected estimates and confidence intervals. The author finds that, in the sample, the non-lignite-fired stations are on average more efficient than the lignite-fired stations.

Maethee Mekaroonreung and Andrew L. Johnson compare three DEA-based measures that estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while considering undesirable outputs. Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors find that refineries in the Rocky Mountain region performed best, and that about 60 percent of the refineries in the sample could improve their efficiency further.

H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach, combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA), to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model.

Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first is a traditional DEA model for analysing cost-only efficiency. The second includes (inverse) quality by modelling total customer minutes lost as an input. The third uses total social costs, comprising the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and number of consumers are treated as outputs in the models.

After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple-criteria weighting approach for evaluating energy and climate policy. The proposed approach is akin to the analytic hierarchy process, consisting of pairwise comparisons, consistency verification, and criteria prioritisation. Stakeholders and experts in the energy policy field are incorporated in the evaluation process through an interactive means of expressing their preferences verbally, numerically, and visually. A total of 14 evaluation criteria were considered, classified under four objectives: climate change mitigation, energy effectiveness, socioeconomic, and competitiveness and technology.

Finally, Borge Hess applies stochastic frontier analysis to examine the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on firm efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant change in a firm's efficiency following an acquisition, and only weak evidence of efficiency improvements caused by the new shareholder. Parent companies also appear not to influence a subsidiary's efficiency positively, and the analysis shows a negative impact of joint ventures on the technical efficiency of the pipeline company.

To conclude, we are grateful to all the authors for their contributions and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.
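For readers unfamiliar with the machinery behind the first five papers, the following is a minimal sketch of the input-oriented CCR DEA envelopment problem, solved once per decision-making unit (DMU); the data are hypothetical and the formulation is the textbook one, not any of the authors' models.

```python
# Minimal sketch of input-oriented CCR DEA (envelopment form):
#   min  theta
#   s.t. sum_j lam_j * X[:, j] <= theta * X[:, o]   (inputs)
#        sum_j lam_j * Y[:, j] >= Y[:, o]           (outputs)
#        lam >= 0
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency of DMU o. X: (m inputs x n DMUs), Y: (s outputs x n DMUs)."""
    m, n = X.shape
    s, _ = Y.shape
    c = np.r_[1.0, np.zeros(n)]                # variables: [theta, lam_1..lam_n]
    A_in = np.hstack([-X[:, [o]], X])          # lam @ X_i - theta * X_io <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # -lam @ Y_r <= -Y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    bounds = [(0, None)] * (1 + n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun  # theta in (0, 1]; 1 means efficient

# Hypothetical example: 2 inputs, 1 output, 4 power plants.
X = np.array([[4.0, 7.0, 8.0, 4.0],
              [3.0, 3.0, 1.0, 2.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
for o in range(4):
    print(f"DMU {o}: efficiency = {ccr_efficiency(X, Y, o):.3f}")
```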

Relevance:

100.00%

Publisher:

Abstract:

This paper explores demand and production management challenges in the food processing industry. The goals are, first, to identify the main production planning constraints and, second, to explore how each of these constraints affects a company's performance in terms of costs and customer service level. A single-case-study methodology was preferred since it enabled the collection of in-depth data. Findings suggest that product shelf life, carcass utilization and production lead time are the main constraints affecting supply chain efficiency; hence, a single planning approach is not appropriate when different products have different technological and processing characteristics.

Relevance:

100.00%

Publisher:

Abstract:

Measuring variations in efficiency, and in its extension eco-efficiency, during a restructuring period in different industries has always been of interest to regulators and policy makers. This paper assesses the impact of the restructuring of procurement in the Iranian power industry on the performance of power plants. We introduce a new slacks-based model for measuring the Malmquist-Luenberger (ML) index and apply it to the power plants to calculate efficiency, eco-efficiency, and technological change over the eight-year restructuring period (2003-2010). The results reveal that although the restructuring affected individual power plants differently, the overall growth in the eco-efficiency of the sector was due mainly to advances in pure technology. We also assess the correlation between the efficiency and eco-efficiency of the power plants, which indicates a close relationship between the two measures, lending support to the incorporation of environmental factors in efficiency analysis.

Relevance:

100.00%

Publisher:

Abstract:

Thermal effects in uncontrolled factory environments are often the largest source of uncertainty in large-volume dimensional metrology. As the standard temperature for metrology of 20°C cannot be achieved practically or economically in many manufacturing facilities, the characterisation and modelling of temperature offer a way to improve the uncertainty of dimensional measurement and to quantify thermal variability in large assemblies. Technologies currently available for temperature measurement in the range 0-50°C are presented, alongside a discussion of their usefulness for monitoring temperatures in a manufacturing context. Aspects of production where each technology could play a role are highlighted, as are practical considerations for deployment. Contact sensors such as platinum resistance thermometers come closest to the required accuracy, calculated to be ∼0.02°C under the most challenging measurement conditions. Non-contact solutions would be most practical in the light controlled factory (LCF), and semi-invasive techniques appear least useful, but all the technologies can play some role during the initial development of thermal variability models.
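To make the numbers concrete, the textbook thermal-expansion correction (not a method from the paper) scales a measured length back to the 20°C reference; on large parts even small temperature errors are significant.

```python
# Standard thermal-expansion correction to the 20 degC metrology reference
# (textbook formula, not a method from the paper; values are illustrative).
def length_at_20C(measured_mm: float, temp_C: float, cte_per_C: float) -> float:
    """Scale a length measured at temp_C back to 20 degC: L20 = L / (1 + a*dT)."""
    return measured_mm / (1.0 + cte_per_C * (temp_C - 20.0))

# A 5 m aluminium part (CTE ~23e-6 per degC) measured in a 24 degC workshop:
L20 = length_at_20C(5000.0, 24.0, 23e-6)
print(f"corrected length: {L20:.3f} mm")  # about 0.46 mm shorter than measured
```

On this 5 m aluminium part, the ∼0.02°C target accuracy corresponds to roughly 2.3 µm of length uncertainty, which is why sensor selection matters at this scale.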