873 results for PACS: engineering mathematics and mathematical techniques
Abstract:
While the retrieval of existing designs to prevent unnecessary duplication of parts is a recognised strategy in the control of design costs, the available techniques to achieve this, even in product data management systems, are limited in performance or require large resources. A novel system has been developed, based on a new version of an existing coding system (CAMAC), that allows automatic coding of engineering drawings and their subsequent retrieval using a drawing of the desired component as the input. The ability to find designs using a detail drawing rather than textual descriptions is a significant achievement in itself. Previous testing of the system has demonstrated this capability, but if parts could also be found from a simple sketch, the system's practical application would be much more effective. This paper describes the development and testing of such a search capability using a database of over 3000 engineering components.
Abstract:
In this paper we propose a data envelopment analysis (DEA) based method for assessing the comparative efficiencies of units operating production processes where input-output levels are inter-temporally dependent. One cause of inter-temporal dependence between input and output levels is capital stock, which influences output levels over many production periods. Such units cannot be assessed by traditional or 'static' DEA, which assumes input-output correspondences are contemporaneous in the sense that the output levels observed in a time period are the product solely of the input levels observed during that same period. The method developed in the paper overcomes the problem of inter-temporal input-output dependence by using input-output 'paths' mapped out by operating units over time as the basis for assessing them. As an application we compare the results of the dynamic and static models for a set of UK universities. The paper suggests that the dynamic model captures efficiency better than the static model. © 2003 Elsevier Inc. All rights reserved.
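For orientation, the static baseline that the paper generalises can be sketched numerically. The following is a minimal sketch of a standard input-oriented CCR (constant returns to scale) DEA model solved as a linear programme; the function name, data layout and unit values are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency score (theta) of unit j0.
    X: (m, n) input matrix, Y: (s, n) output matrix, columns = units."""
    m, n = X.shape
    s = Y.shape[0]
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(n + 1)
    c[0] = 1.0  # minimise theta
    # Input constraints: sum_j lambda_j * x_ij - theta * x_i,j0 <= 0
    A_in = np.hstack([-X[:, [j0]], X])
    b_in = np.zeros(m)
    # Output constraints: sum_j lambda_j * y_rj >= y_r,j0
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, j0]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (n + 1))
    return res.fun

# Toy data: one input, one output; unit 1 uses twice the input of
# unit 0 for the same output, so it is half as efficient.
X = np.array([[1.0, 2.0]])
Y = np.array([[1.0, 1.0]])
print(ccr_efficiency(X, Y, 0))
print(ccr_efficiency(X, Y, 1))
```

The 'static' assumption is visible here: each unit is scored against same-period peers only, which is exactly what the paper's path-based dynamic model relaxes.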
Abstract:
Topical and transdermal formulations are promising platforms for the delivery of drugs. A unit dose topical or transdermal drug delivery system that optimises the solubility of drugs within the vehicle, provides efficacious delivery and offers a simple manufacturing technique is therefore desirable. This study used Witepsol® H15 wax as a base for the delivery system. One aspect of this project involved determination of the solubility of ibuprofen, flurbiprofen and naproxen in the wax using microscopy, Higuchi release kinetics, HyperDSC and mathematical modelling techniques. Correlations between the results obtained via these techniques were noted, with additional merits such as the provision of valuable information on drug release kinetics and possible interactions between the drug and excipients. A second aspect of this project involved the incorporation of additional excipients, Tween 20 (T), Carbopol® 971 (C) and menthol (M), into the wax formulation. On in vitro permeation through porcine skin, the preferred formulations were: ibuprofen (5% w/w) within Witepsol® H15 + 1% w/w T; flurbiprofen (10% w/w) within Witepsol® H15 + 1% w/w T; naproxen (5% w/w) within Witepsol® H15 + 1% w/w T + 1% w/w C; and sodium diclofenac (10% w/w) within Witepsol® H15 + 1% w/w T + 1% w/w C + 5% w/w M. Unit dose transdermal tablets containing ibuprofen and diclofenac were produced with improved flux compared to marketed products: Voltarol Emulgel® demonstrated a flux of 1.68 x 10^-3 cm/h compared to 123 x 10^-3 cm/h for the optimised diclofenac product described above; Ibugel Forte® demonstrated a permeation coefficient of 7.65 x 10^-3 cm/h compared to 8.69 x 10^-3 cm/h for the optimised ibuprofen product described above.
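The Higuchi model referred to above relates cumulative drug release Q to the square root of time, Q = kH * sqrt(t). As a minimal sketch of how the rate constant kH can be fitted by least squares through the origin (the time points, release values and function name below are invented for illustration, not the study's data):

```python
import numpy as np

def higuchi_k(t, Q):
    """Least-squares fit of the Higuchi model Q = kH * sqrt(t),
    forced through the origin (closed-form slope estimate)."""
    x = np.sqrt(np.asarray(t, dtype=float))
    Q = np.asarray(Q, dtype=float)
    return float(x @ Q / (x @ x))

t = [1, 4, 9, 16]          # sampling times, e.g. hours
Q = [2.0, 4.0, 6.0, 8.0]   # cumulative release per unit area
print(higuchi_k(t, Q))     # -> 2.0
```

A good straight-line fit of Q against sqrt(t) is the usual diagnostic that release is diffusion-controlled, which is the information the abstract says the kinetic analysis provided.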
Abstract:
The present study describes a pragmatic approach to the implementation of production planning and scheduling techniques in foundries of all types and looks at the use of `state-of-the-art' management control and information systems. Following a review of systems for the classification of manufacturing companies, a definitive statement is made which highlights the important differences between foundries (i.e. `component makers') and other manufacturing companies (i.e. `component buyers'). An investigation of the manual procedures which are used to plan and control the manufacture of components reveals the inherent problems facing foundry production management staff, which suggests the unsuitability of many manufacturing techniques which have been applied to general engineering companies. From the literature it was discovered that computer-assisted systems are required which are primarily `information-based' rather than `decision-based', whilst the availability of low-cost computers and `packaged software' has enabled foundries to `get their feet wet' without the financial penalties which characterized many of the early attempts at computer-assistance (i.e. pre-1980). Moreover, no evidence of a single methodology for foundry scheduling emerged from the review. A philosophy for the development of a CAPM system is presented, which details the essential information requirements and puts forward proposals for the subsequent interactions between types of information and the sub-systems of CAPM which they support. The work developed was oriented specifically at the functions of production planning and scheduling and introduces the concept of `manual interaction' for effective scheduling. The techniques developed were designed to use the information which is readily available in foundries and were found to be practically successful following their implementation in a wide variety of foundries.
The limitations of the techniques developed are subsequently discussed within the wider issues which form a CAPM system, prior to a presentation of the conclusions which can be drawn from the study.
Abstract:
Much of the geometrical data relating to engineering components and assemblies is stored in the form of orthographic views, either on paper or in computer files. For various engineering applications, however, it is necessary to describe objects in formal geometric modelling terms. The work reported in this thesis is concerned with the development and implementation of concepts and algorithms for the automatic interpretation of orthographic views as solid models. The various rules and conventions associated with engineering drawings are reviewed and several geometric modelling representations are briefly examined. A review of existing techniques for the automatic, and semi-automatic, interpretation of engineering drawings as solid models is given. A new theoretical approach is then presented and discussed. The author shows how the implementation of such an approach for uniform thickness objects may be extended to more general objects by introducing the concept of `approximation models'. Means by which the quality of the transformations is monitored are also described. Detailed descriptions of the interpretation algorithms and the software package that were developed for this project are given. The process is then illustrated by a number of practical examples. Finally, the thesis concludes that, using the techniques developed, a substantial percentage of drawings of engineering components could be converted into geometric models with a specific degree of accuracy. This degree is indicative of the suitability of the model for a particular application. Further work on important details is required before a commercially acceptable package can be produced.
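A classic first step in this kind of interpretation is to generate candidate 3D vertices whose projections appear consistently in all three orthographic views (often called "fleshing out projections"). The sketch below illustrates that step only; the three-view point representation and the unit-cube example are simplifying assumptions made here, not the thesis's actual algorithm:

```python
def candidate_vertices(front, top, side):
    """Candidate 3D vertices (x, y, z) whose projections appear in all
    three views: front gives (x, z), top gives (x, y), side gives (y, z)."""
    return sorted(
        (x, y, z)
        for (x, z) in front
        for (tx, y) in top if tx == x
        for (sy, sz) in side if sy == y and sz == z
    )

# A unit cube projects to a unit square in each view; its candidate
# vertex set is the 8 cube corners.
sq = [(0, 0), (0, 1), (1, 0), (1, 1)]
print(candidate_vertices(sq, sq, sq))
```

Real drawings produce spurious candidates as well (points consistent with the views but absent from the object), which is one reason the thesis needs the `approximation models' and quality-monitoring machinery it describes.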
Abstract:
Energy consumption in wireless networks, and in particular in cellular mobile networks, is now of major concern in respect of its potential adverse impact upon the environment and the escalating operating energy costs. The recent phenomenal growth of data services in cellular mobile networks has exacerbated the energy consumption issue and is forcing researchers to address how to design future wireless networks that take energy consumption constraints into account. One fundamental approach to reducing the energy consumption of wireless networks is to adopt new radio access architectures and radio techniques. The Mobile VCE (MVCE) Green Radio project, established in 2009, is considering such new architectural and technical approaches. This paper highlights the key research issues pursued in the MVCE Green Radio project.
Abstract:
Nonlinear systems with periodic variations of nonlinearity and/or dispersion occur in a variety of physical problems and engineering applications. The mathematical concept of dispersion managed solitons already has made an impact on the development of fibre communications, optical signal processing and laser science. We overview here the field of the dispersion managed solitons starting from mathematical theories of Hamiltonian and dissipative systems and then discuss recent advances in practical implementation of this concept in fibre-optics and lasers.
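For orientation (this equation is standard in the dispersion-management literature, though not spelled out in the abstract): pulse propagation in a dispersion-managed fibre line is commonly modelled by the nonlinear Schrödinger equation with periodically varying coefficients,

```latex
i\,\frac{\partial u}{\partial z} + \frac{d(z)}{2}\,\frac{\partial^2 u}{\partial t^2} + c(z)\,|u|^2 u = 0,
```

where $u(z,t)$ is the pulse envelope, $d(z)$ the periodic group-velocity dispersion map and $c(z)$ the (possibly varying) nonlinearity. Dispersion-managed solitons are the breathing, periodically recovering solutions of this equation.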
Abstract:
Purpose – This paper aims to provide a critical analysis of UK Government policy in respect of recent moves to attract young people into engineering. Drawing together UK and EU policy literature, the paper considers why young people fail to look at engineering positively. Design/methodology/approach – Drawing together UK policy, practitioner and academic-related literature, the paper critically considers the various factors influencing young people's decision-making processes in respect of entering the engineering profession. A conceptual framework providing a diagrammatic representation of the "push" and "pull" factors impacting young people at pre-university level is given. Findings – The discussion argues that higher education in general has a responsibility to help young people overcome negative stereotypical views of engineering education. Universities are in the business of building human capability ethically and sustainably; as such they hold a duty of care towards the next generation. From an engineering education perspective, the major challenge is to present a relevant and sustainable learning experience that will equip students with the necessary skills and competencies for a lifelong career in engineering. This may be achieved by promoting transferable skills and competencies or by introducing a capabilities-driven curriculum which brings together generic and engineering skills and abilities. Social implications – In identifying the push/pull factors impacting young people's decisions to study engineering, this paper considers why, at a time of global recession, young people should choose to study the mathematics, science and technology subjects required for a degree in engineering. The paper identifies the long-term social benefits of increasing the number of young people studying engineering.
Originality/value – In bringing together pedagogy and policy within an engineering framework, the paper adds to current debates in engineering education, providing a distinctive look at what seems to be a recurring problem – the failure to attract young people into engineering.
Abstract:
Constructing and executing distributed systems that can adapt to their operating context in order to sustain provided services and service qualities are complex tasks. Managing the adaptation of multiple, interacting services is particularly difficult, since these services tend to be distributed across the system, interdependent and sometimes tangled with other services. Furthermore, the exponential growth of the number of potential system configurations, derived from the variabilities of each service, needs to be handled. Current practices of writing low-level reconfiguration scripts as part of the system code to handle runtime adaptation are both error-prone and time-consuming, and make adaptive systems difficult to validate and evolve. In this paper, we propose to combine model-driven and aspect-oriented techniques to better cope with the complexities of adaptive system construction and execution, and to handle the problem of the exponential growth of the number of possible configurations. Combining these techniques allows us to use high-level domain abstractions, simplify the representation of variants and limit the problem of the combinatorial explosion of possible configurations. In our approach we also use models at runtime to generate the adaptation logic, by comparing the current configuration of the system to a composed model representing the configuration we want to reach. © 2008 Springer-Verlag Berlin Heidelberg.
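The runtime step described, comparing the running configuration against a composed target model to derive adaptation actions rather than hand-writing reconfiguration scripts, can be illustrated with a deliberately simplified sketch. Representing a configuration as a plain set of active service names is an assumption made here for brevity; real models-at-runtime approaches diff richer structured models:

```python
def adaptation_plan(current, target):
    """Derive reconfiguration actions by diffing the running
    configuration against the target configuration model."""
    return {
        "stop":  sorted(current - target),   # services to deactivate
        "start": sorted(target - current),   # services to activate
    }

current = {"logger", "cache", "http-v1"}
target  = {"logger", "cache", "http-v2"}
print(adaptation_plan(current, target))
# -> {'stop': ['http-v1'], 'start': ['http-v2']}
```

The point of generating the plan from models is the one the abstract makes: the adaptation logic need not be written (or validated) by hand for each of the exponentially many configuration pairs.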
Abstract:
Guest editorial Ali Emrouznejad is a Senior Lecturer at the Aston Business School in Birmingham, UK. His areas of research interest include performance measurement and management, efficiency and productivity analysis, as well as data mining. He has published widely in various international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and Guest Editor of several special issues of journals including Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Management Sector. He is on the editorial board of several international journals and a co-founder of Performance Improvement Management Software. William Ho is a Senior Lecturer at Aston Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in various international journals, including Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. Currently, he is a Scholar of the Advanced Institute of Management Research.
Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector
This special issue focuses on holistic, applied research on performance measurement in energy sector management, publishing relevant applied research that bridges the gap between industry and academia. After a rigorous refereeing process, seven papers were included in this special issue. The volume opens with five data envelopment analysis (DEA)-based papers. Wu et al. apply the DEA-based Malmquist index to evaluate the changes in relative efficiency and the total factor productivity of coal-fired electricity generation of 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors reveal that the east provinces were relatively and technically more efficient, whereas the west provinces had the highest growth rate in the period studied. Ioannis E. Tsolas applies the DEA approach to assess the performance of Greek fossil fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, the bootstrapping approach is deployed to address the uncertainty surrounding DEA point estimates and to provide bias-corrected estimations and confidence intervals for the point estimates. The author finds from the sample that the non-lignite-fired stations are on average more efficient than the lignite-fired stations. Maethee Mekaroonreung and Andrew L. Johnson compare the relative performance of three DEA-based measures, which estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while considering undesirable outputs.
Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors discover that refineries in the Rocky Mountain region performed the best, and that about 60 percent of oil refineries in the sample could improve their efficiencies further. H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach, combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA), to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model. Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first model is a traditional DEA model for analyzing cost-only efficiency. The second model includes (inverse) quality by modelling total customer minutes lost as an input. The third model is based on the idea of using total social costs, including the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and number of consumers are treated as the outputs in the models. After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple criteria analysis weighting approach to evaluate energy and climate policy.
The proposed approach is akin to the analytic hierarchy process, which consists of pairwise comparisons, consistency verification, and criteria prioritization. In the approach, stakeholders and experts in the energy policy field are incorporated in the evaluation process by providing an interactive means with verbal, numerical, and visual representation of their preferences. A total of 14 evaluation criteria were considered and classified under four objectives: climate change mitigation, energy effectiveness, socioeconomic, and competitiveness and technology. Finally, Borge Hess applies the stochastic frontier analysis approach to analyze the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on a firm's efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds that an acquisition produced no significant change in the firm's efficiency, and only weak evidence for efficiency improvements caused by the new shareholder. The author also finds that parent companies appear not to influence a subsidiary's efficiency positively; in addition, the analysis shows a negative impact of a joint venture on the technical efficiency of the pipeline company. To conclude, we are grateful to all the authors for their contributions, and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.
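The pairwise-comparison weighting scheme described above, akin to the analytic hierarchy process, derives priority weights from a matrix of relative-importance judgements. A minimal sketch follows, using power iteration to approximate the principal eigenvector; the three criteria and the judgement values are invented for illustration:

```python
import numpy as np

def ahp_weights(M, iters=50):
    """Priority weights from a pairwise comparison matrix M
    (M[i, j] = importance of criterion i relative to j), approximated
    by power iteration toward the principal eigenvector."""
    w = np.ones(M.shape[0])
    for _ in range(iters):
        w = M @ w
        w /= w.sum()   # normalise so the weights sum to 1
    return w

# Criterion A judged 3x as important as B and 5x as important as C;
# the matrix below is perfectly consistent (M[i, j] = w_i / w_j).
M = np.array([[1.0, 3.0,   5.0],
              [1/3, 1.0,   5/3],
              [1/5, 3/5,   1.0]])
print(ahp_weights(M).round(3))
```

For an inconsistent matrix (real stakeholder judgements rarely satisfy M[i, j] = w_i / w_j exactly), the same eigenvector computation still yields weights, and the deviation from consistency is what the verification step in the papers above checks.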
Abstract:
We introduce a flexible visual data mining framework which combines advanced projection algorithms from the machine learning domain and visual techniques developed in the information visualization domain. The advantage of such an interface is that the user is directly involved in the data mining process. We integrate principled projection algorithms, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates and billboarding, to provide a visual data mining framework. Results on a real-life chemoinformatics dataset using GTM are promising and have been analytically compared with the results from the traditional projection methods. It is also shown that the HGTM algorithm provides additional value for large datasets. The computational complexity of these algorithms is discussed to demonstrate their suitability for the visual data mining framework. Copyright 2006 ACM.
Abstract:
This paper builds on previous work (Clark, 2009; Clark & Andrews 2011, 2014) to continue the debate around a seemingly universal question: "How can educational theory be applied to engineering education in such a way as to make the subject more accessible and attractive to students?" It argues that there are three key elements to student success: Relationships, Variety & Synergy (RVS). By further examining the purposefully developed bespoke learning and teaching approach constructed around these three elements (RVS), the discourse in this paper links educational theory to engineering education and, in doing so, further develops the argument for the introduction of a purposefully designed pedagogic approach for use in engineering education.
Abstract:
Making a reasonable choice is a critical success factor for decision-making in the field of software engineering (SE). A case-driven comparative analysis has been introduced and a procedure for its systematic application has been suggested. The paper describes how the proposed method can be built into a general framework for SE activities. Some examples of experimental versions of the framework are briefly presented.
Abstract:
Electronic publishing exploits numerous possibilities to present or exchange information and to communicate via current media such as the Internet. By utilizing modern Web technologies, such as Web Services, loosely coupled services, and peer-to-peer networks, we describe the integration of an intelligent business news presentation and distribution network. Employing semantic technologies enables the coupling of multinational and multilingual business news data on a scalable international level, thus introducing a service quality not yet achieved by alternative technologies in the news distribution area. Architecturally, we identified the loose coupling of existing services as the most feasible way to address multinational and multilingual news presentation and distribution networks. Furthermore, we semantically enrich multinational news contents by relating them using AI techniques such as the Vector Space Model. Summarizing our experiences, we describe the technical integration of semantic and communication technologies in order to create a modern international news network.
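The Vector Space Model mentioned above relates news items by representing each as a weighted term vector and scoring similarity by the cosine of the angle between vectors. The following is a minimal sketch with a toy TF-IDF weighting; the tokenised example headlines and function names are illustrative assumptions, not the system's actual components:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build simple TF-IDF vectors for a list of tokenised documents."""
    n = len(docs)
    df = Counter(t for d in docs for t in set(d))          # document frequency
    idf = {t: math.log(n / df[t]) for t in df}
    return [{t: c * idf[t] for t, c in Counter(d).items()} for d in docs]

def cosine(u, v):
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

docs = [["market", "rates", "rise"],
        ["market", "rates", "fall"],
        ["football", "cup", "final"]]
vecs = tfidf_vectors(docs)
print(round(cosine(vecs[0], vecs[1]), 3))   # related financial stories
print(cosine(vecs[0], vecs[2]))             # no shared terms -> 0.0
```

In a multilingual setting like the one described, the same machinery applies once terms are mapped into a shared (e.g. translated or concept-level) vocabulary, which is where the semantic technologies come in.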