963 results for Engineering, Industrial | Engineering, System Science | Operations Research
Abstract:
This paper addresses the difficult problem of how to improve the process of evaluating organisational change. Given that the data emerging from an evaluative exercise will strongly influence the subsequent strategic and operational decisions taken by organisational managers, it is critical that the evaluation approach itself is capable of delivering high-quality, accurate and timely data. The aim of this paper is to examine the role of the IT-based Optionfinder technology, used in conjunction with focus groups, in generating management decision-making data and in reflecting changes in key performance indicators in a utility organisation. The case study research evaluates the innovative integrative approach adopted by the utility organisation and concludes that the proposed approach contributes to improvements in the decision-making capability of managers.
Abstract:
This article presents a novel classification of wavelet neural networks based on the orthogonality or non-orthogonality of the neurons and the type of nonlinearity employed. On the basis of this classification, different network types are studied and their characteristics illustrated by means of simple one-dimensional nonlinear examples. For multidimensional problems, which are affected by the curse of dimensionality, the idea of spherical wavelet functions is considered. The behaviour of these networks is also studied for the modelling of a low-dimensional map.
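A minimal sketch of the kind of network this classification covers may help: a non-orthogonal wavelet network approximating a one-dimensional nonlinear map, with Mexican-hat neurons on a fixed grid of translations and dilations and output weights fitted by least squares. All function choices and parameters below are illustrative assumptions, not taken from the article.

```python
# Minimal sketch of a non-orthogonal wavelet network for a 1-D nonlinear map.
# All names and parameter choices are illustrative, not taken from the article.
import numpy as np

def mexican_hat(t):
    """Mexican-hat (Ricker) wavelet, a common non-orthogonal mother wavelet."""
    return (1.0 - t**2) * np.exp(-0.5 * t**2)

# Target 1-D nonlinear map to approximate
x = np.linspace(-1.0, 1.0, 400)
y = np.sin(3.0 * np.pi * x) * np.exp(-x**2)

# Fixed grid of translations and dilations ("wavelet neurons")
translations = np.linspace(-1.0, 1.0, 15)
dilations = [0.05, 0.1, 0.2, 0.4]

# Design matrix: one column per wavelet neuron (plus a bias column)
columns = [mexican_hat((x - b) / a) for a in dilations for b in translations]
Phi = np.column_stack(columns + [np.ones_like(x)])

# Output weights by linear least squares (the nonlinearity lives in the neurons)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_hat = Phi @ w

print("RMS approximation error:", np.sqrt(np.mean((y - y_hat)**2)))
```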
Abstract:
This paper introduces a novel modelling framework for identifying dynamic models of systems that are under feedback control. These models are identified under closed-loop conditions and produce a joint representation that includes both the plant and controller models in state-space form. The joint plant/controller model is identified using subspace model identification (SMI), after which the plant model is separated from the identified joint model. Compared to previous research, this work (i) proposes a new modelling framework for identifying closed-loop systems, (ii) introduces a generic structure to represent the controller and (iii) explains how the new framework gives rise to a simplified determination of the plant models. In contrast, the use of the conventional modelling approach renders the separation of the plant model a difficult task. The benefits of the new modelling method are demonstrated using a number of application studies.
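As a point of reference for the joint plant/controller representation the abstract mentions, the following sketch builds the closed-loop state-space model of an assumed discrete-time plant under an assumed feedback controller. The matrices are illustrative, and this is not the paper's SMI procedure, only the standard interconnection that such a joint model represents.

```python
# Minimal sketch of a joint plant/controller state-space model under feedback.
# The matrices below are illustrative; this is not the SMI algorithm of the paper,
# only the standard interconnection the identified joint model would represent.
import numpy as np

# Discrete-time plant: x_p+ = Ap x_p + Bp u,  y = Cp x_p
Ap = np.array([[0.9, 0.1], [0.0, 0.8]])
Bp = np.array([[0.0], [0.5]])
Cp = np.array([[1.0, 0.0]])

# Discrete-time controller acting on e = r - y:  x_c+ = Ac x_c + Bc e,  u = Cc x_c + Dc e
Ac = np.array([[1.0]])          # integral action
Bc = np.array([[0.1]])
Cc = np.array([[1.0]])
Dc = np.array([[0.5]])

# Joint (closed-loop) state z = [x_p; x_c], input r, output y
A = np.block([[Ap - Bp @ Dc @ Cp, Bp @ Cc],
              [-Bc @ Cp,          Ac     ]])
B = np.vstack([Bp @ Dc, Bc])
C = np.hstack([Cp, np.zeros((1, Ac.shape[0]))])

# Simulate the closed-loop step response
z = np.zeros((A.shape[0], 1))
for k in range(30):
    y = C @ z
    z = A @ z + B * 1.0      # unit step reference r = 1
print("output after 30 steps:", y[0, 0])
```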
Abstract:
Universities planning the provision of space for their teaching requirements need to do so in a fashion that reduces capital and maintenance costs whilst still providing a high quality of service. Space plans should aim to provide sufficient capacity without incurring excessive costs due to over-capacity. A simple measure used to estimate over-provision is utilisation. Essentially, utilisation is the fraction of seats that are used in practice, or the ratio of demand to supply. However, studies usually find that utilisation is low, often only 20–40%, which is suggestive of significant over-capacity.
Our previous work has provided methods to improve such space planning. These methods identify a critical level of utilisation as the highest level that can be achieved whilst still reliably satisfying the demand for places to allocate teaching events. In this paper, we extend this body of work to incorporate the notions of event-types and space-types. Teaching events come in multiple ‘event-types’, such as lecture, tutorial and workshop, and there are generally corresponding space-types. Matching the type of an event to a room of a corresponding space-type is generally desirable. However, realistically, allocation happens in a mixed space-type environment where teaching events of a given type are allocated to rooms of another space-type; e.g., tutorials will borrow lecture theatres or workshop rooms.
We propose a model and methodology to quantify the effects of space-type mixing and establish methods to search for better space-type profiles, where the term “space-type profile” refers to the relative numbers of each type of space. We give evidence that these methods have the potential to improve utilisation levels. Hence, the contribution of this paper is twofold. Firstly, we present informative studies of the effects of space-type mixing on utilisation, and of critical utilisations. Secondly, we present straightforward though novel methods to determine better space-type profiles, and give an example in which the resulting profiles are indeed significantly improved.
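To make the utilisation measure concrete, the following small sketch computes demand, supply and utilisation for an invented set of teaching events and an invented space-type profile; it is not the authors' allocation model, only the ratio their analysis starts from.

```python
# Small numeric sketch of the utilisation measure and a space-type profile.
# The event and room data are invented for illustration; this is not the
# authors' allocation model, only the ratio their analysis starts from.

# Demand: (event_type, seats required) for one timetable slot
events = [("lecture", 120), ("tutorial", 20), ("tutorial", 15), ("workshop", 25)]

# Supply: space-type profile, i.e. how many rooms of each type and their sizes
rooms = {"lecture_theatre": [200, 150], "seminar_room": [30, 30, 25], "lab": [40]}

demand = sum(size for _, size in events)                           # seats requested
supply = sum(size for sizes in rooms.values() for size in sizes)   # seats provided

print(f"utilisation = demand / supply = {demand}/{supply} = {demand / supply:.0%}")
# A low value (e.g. 20-40%) suggests over-capacity, but the critical utilisation
# that can actually be achieved depends on how well event-types match space-types.
```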
Abstract:
The present paper examines the role of organisational learning and transaction cost economics in strategic outsourcing decisions. Interorganisational learning is critical to competitive success, and organisations often learn more effectively by collaborating with other organisations. However, learning processes may also complicate the formation of interorganisational partnerships, which may increase transaction costs. Based on the literature, the authors develop refutable implications for outsourcing supply chain logistics, and a sample of 121 firms in the supply chain logistics industry is used to test the hypotheses. The results show that trust and transaction costs are significant and substantial drivers of strategic outsourcing of supply chain logistics (a strategic flexibility action). Learning intent and knowledge acquisition have no significant influence on the decision to outsource supply chain logistics. The paper concludes with a discussion of the different and often conflicting implications for managing interorganisational learning processes.
Abstract:
Regional investment in R&D, technological development and innovation is perceived as being strongly associated with productivity, growth and sustained international competitiveness. One policy instrument by which policy makers have attempted to create regional advantage has been the establishment of publicly funded research centres (PRCs). In this paper we develop a logic model for this type of regional intervention and examine the outputs and longer-term outcomes from a group of 18 publicly funded R&D centres. Our results suggest some positive regional impacts but also identify significant differences in terms of innovation, additionality and sustainability between university-based and company-based PRCs. University-based PRCs have higher levels of short-term additionality and demonstrate higher levels of organisational innovation, but prove less sustainable. Company-based PRCs demonstrate more partial additionality in the short term but ultimately prove more sustainable.
Abstract:
This paper arose from the work carried out for the Cullen/Uff Joint Inquiry into Train Protection Systems. It is concerned with the problem of evaluating the benefits of safety enhancements intended to avoid rare but catastrophic accidents, and with the role of Operations Research in the process. The problems include both the input values and the representation of outcomes. A key input is the value of life. This paper briefly discusses why the value of life might vary from incident to incident and reviews alternative estimates before producing a 'best estimate' for rail. When the occurrence of an event is uncertain, the normal method is to apply a single 'expected' value. This paper argues that a more effective method of representing such situations is Monte Carlo simulation, and demonstrates the use of the methodology on a case study of the decision as to whether or not automatic train protection (ATP) should have been installed on a route to the west of London. This paper suggests that the output is more informative than traditional cost-benefit appraisals or engineering event-tree approaches. It also shows that, unlike the results from the traditional approach, the value of ATP on this route would be positive over 50% of the time.
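The following is a minimal sketch of the kind of Monte Carlo appraisal the paper advocates: the net benefit of a safety enhancement is simulated over many possible futures, with accident frequency and severity drawn from distributions rather than collapsed into a single expected value. Every figure (cost, value of life, accident rates) is invented for illustration and is not taken from the ATP case study.

```python
# Minimal Monte Carlo sketch of the kind of appraisal the paper advocates.
# Every figure below (cost, value of life, accident rates) is invented for
# illustration and is not taken from the ATP case study.
import numpy as np

rng = np.random.default_rng(0)
n = 20_000                      # simulated futures for the appraisal period

cost = 100e6                    # cost of installing/operating ATP on the route
value_of_life = 10e6            # 'best estimate' value of preventing a fatality

# Uncertain inputs: number of preventable catastrophic accidents and fatalities each
accidents = rng.poisson(lam=0.5, size=n)              # rare events over the period
fatalities = np.array([rng.gamma(shape=2.0, scale=10.0, size=k).sum()
                       for k in accidents])           # severity per accident

net_benefit = fatalities * value_of_life - cost

print("mean net benefit (expected-value view): %.0fm" % (net_benefit.mean() / 1e6))
print("P(net benefit > 0):                     %.0f%%" % (100 * (net_benefit > 0).mean()))
```

The second output line is the kind of statement an expected-value appraisal cannot make: the fraction of simulated futures in which the enhancement pays for itself.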
Abstract:
Modern business practices in engineering are increasingly turning to post-manufacture service provision in an attempt to generate additional revenue streams and ensure commercial sustainability. Maintainability has always been a consideration during the design process, but in the past it has generally been considered of tertiary importance behind manufacturability and primary product function in terms of design priorities. The need to draw whole-life considerations into concurrent engineering (CE) practice has encouraged companies to address issues such as maintenance earlier in the design process, giving equal importance to all aspects of the product lifecycle. The consideration of design for maintainability (DFM) early in the design process has the potential to significantly reduce maintenance costs and improve overall running efficiencies as well as safety levels. However, a lack of simulation tools still hinders the adaptation of CE to include practical elements of design, and therefore further research is required to develop methods by which ‘hands-on’ activities such as maintenance can be fully assessed and optimised as concepts develop. Virtual Reality (VR) has the potential to address this issue, but the application of these traditionally high-cost systems can require complex infrastructure, and their use has typically focused on aesthetic aspects of mature designs. This paper examines the application of cost-effective VR technology to the rapid assessment of aircraft interior inspection during conceptual design. It focuses on the integration of VR hardware with a typical desktop engineering system and examines the challenges of data transfer, graphics quality and the development of practical user functions within the VR environment. Conclusions drawn to date indicate that the system has the potential to improve maintenance planning by providing a usable environment for inspection that is available as soon as preliminary structural models are generated as part of the conceptual design process. Challenges still exist in the efficient transfer of data between the CAD and VR environments, as well as in the quantification of any benefits that result from the proposed approach. The results of this research will help to improve product maintainability, reduce product development cycle times and lower maintenance costs.
Abstract:
Final Master's project submitted to obtain the degree of Master in Maintenance Engineering.
Abstract:
This paper develops a model of short-range ballistic missile defense and uses it to study the performance of Israel’s Iron Dome system. The deterministic base model allows for inaccurate missiles, unsuccessful interceptions, and civil defense. Model enhancements consider the trade-offs in attacking the interception system, the difficulties faced by militants in assembling large salvos, and the effects of imperfect missile classification by the defender. A stochastic model is also developed. Analysis shows that system performance can be highly sensitive to the missile salvo size, and that systems with higher interception rates are more “fragile” when overloaded. The model is calibrated using publicly available data about Iron Dome’s use during Operation Pillar of Defense in November 2012. If the systems performed as claimed, they saved Israel an estimated 1778 casualties and $80 million in property damage, and thereby made preemptive strikes on Gaza about 8 times less valuable to Israel. Gaza militants could have inflicted far more damage by grouping their rockets into large salvos, but this may have been difficult given Israel’s suppression efforts. Counter-battery fire by the militants is unlikely to be worthwhile unless they can obtain much more accurate missiles.
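A toy sketch of the salvo-size effect described above: with an assumed per-salvo engagement capacity and single-shot interception probability, expected 'leakers' stay near zero until the capacity is exceeded and then rise sharply, which is the sense in which a high-interception system is fragile when overloaded. The parameters are invented, not calibrated to Iron Dome or to the paper's model.

```python
# Toy sketch of the salvo-size effect: a defence with a high single-shot
# interception probability stops almost everything until its engagement
# capacity per salvo is exceeded, after which 'leakers' rise sharply.
# All parameters are invented for illustration, not calibrated to Iron Dome.
def expected_leakers(salvo, capacity, p_intercept, p_threat=0.3):
    """Expected threatening missiles that land, for one salvo."""
    threatening = salvo * p_threat            # only some missiles head for populated areas
    engaged = min(threatening, capacity)      # defender can engage up to its capacity
    leak_engaged = engaged * (1 - p_intercept)
    leak_unengaged = threatening - engaged    # everything beyond capacity gets through
    return leak_engaged + leak_unengaged

for salvo in (10, 50, 100, 200, 400):
    print(salvo, round(expected_leakers(salvo, capacity=30, p_intercept=0.9), 1))
```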
Abstract:
The basic concepts of digital signal processing are taught to students in engineering and science. The focus of such courses is on linear, time-invariant systems. The question of what happens when the system is governed by a quadratic or cubic equation remains unanswered in the vast majority of the literature on signal processing. Light was shed on this problem when John V Mathews and Giovanni L Sicuranza published the book Polynomial Signal Processing. This book opened up an unseen vista of polynomial systems for signal and image processing, presenting the theory and implementations of both adaptive and non-adaptive FIR and IIR quadratic systems, which offer improved performance over conventional linear systems. The theory of quadratic systems remains a largely unexplored area of research that involves computationally intensive work. Once the area of research was selected, the next issue was the choice of software tool with which to carry out the work. Conventional languages like C and C++ were easily eliminated, as they are not interpreted and lack good-quality plotting libraries. MATLAB proved to be very slow, as did SCILAB and Octave. The search for a language for scientific computing that was as fast as C, but with a good-quality plotting library, ended in Python, a distant relative of LISP, which proved to be ideal for scientific computing. An account of the use of Python, its scientific computing package scipy and the plotting library pylab is given in the appendix. Initially, the work focused on designing predictors that exploit the polynomial nonlinearities inherent in speech generation mechanisms. The work was then diverted into medical image processing, which offered more potential for the use of quadratic methods. The major focus in this area is on quadratic edge detection methods for retinal images and fingerprints, as well as on de-noising raw MRI signals.
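As an illustration of the quadratic systems the thesis works with, the sketch below fits a second-order Volterra (quadratic FIR) one-step predictor by least squares using numpy; the test signal and memory length are invented, and this is not code from the thesis.

```python
# Minimal sketch of a quadratic (second-order Volterra) FIR predictor: the next
# sample is predicted from a linear combination of past samples plus all their
# pairwise products. The test signal and parameters are invented for illustration.
import numpy as np
from itertools import combinations_with_replacement

def quadratic_features(x, memory):
    """Linear taps x[n-1..n-M] plus their pairwise products (and a bias)."""
    rows = []
    for n in range(memory, len(x)):
        past = x[n - memory:n][::-1]
        quad = [past[i] * past[j] for i, j in
                combinations_with_replacement(range(memory), 2)]
        rows.append(np.concatenate(([1.0], past, quad)))
    return np.array(rows)

# Toy nonlinear signal: a sinusoid passed through a mild quadratic distortion
n = np.arange(1000)
s = np.sin(0.05 * np.pi * n)
x = s + 0.3 * s**2

memory = 4
Phi = quadratic_features(x, memory)       # features for samples memory..end
target = x[memory:]                       # one-step-ahead prediction targets
coeffs, *_ = np.linalg.lstsq(Phi, target, rcond=None)
pred = Phi @ coeffs

print("prediction RMS error:", np.sqrt(np.mean((target - pred)**2)))
```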
Abstract:
The rapid growth of optical communication and the enormous demand for more bandwidth require novel networks such as dense wavelength division multiplexing (DWDM). These networks enable higher-bitrate transmission over existing optical fibers. Micromechanically tunable optical microcavity devices such as VCSELs, Fabry-Pérot filters and photodetectors are core components of these novel DWDM systems. Several air-gap-based tunable devices were successfully implemented in recent years. Even though these concepts are very promising, two main disadvantages remain: on the one hand, the high fabrication and integration cost, and on the other hand the undesired buckling of the suspended membranes. This thesis addresses these two problems and consists of two main parts:
• PECVD dielectric material investigation and stress control, resulting in membrane shape engineering.
• Implementation and characterization of novel tunable optical devices with tailored shapes of the suspended membranes.
For these purposes, low-cost PECVD technology is investigated and developed in detail. The macro- and microstress of silicon nitride and silicon dioxide are controlled over a wide range. Furthermore, the effect of stress on the optical and mechanical properties of the suspended membranes and on the microcavities is evaluated. Various membrane shapes (concave, convex and planar) with several radii of curvature are fabricated. Using this resonator shape engineering, microcavity devices such as non-tunable and tunable Fabry-Pérot filters, VCSELs and PIN photodetectors are successfully implemented. The fabricated Fabry-Pérot filters cover a spectral range of over 200 nm and show resonance linewidths down to 1.5 nm. By varying the stress distribution across the vertical direction within a DBR, the shape and the radius of curvature of the top membrane are explicitly tailored. By adjusting the waist of the incoming light beam to the curvature, the fundamental resonant mode is supported and the higher-order modes are suppressed. For instance, a tunable VCSEL with 26 nm tuning range, 400 µW maximal output power, 47 nm free spectral range and over 57 dB side-mode suppression ratio (SMSR) is demonstrated. Other technologies, such as introducing light-emitting organic materials into microcavities, are also investigated.
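For orientation, the standard Fabry-Pérot relations connect the figures quoted above (free spectral range and linewidth) to cavity length and mirror reflectivity; the short numeric sketch below uses assumed values for both, not the actual device parameters of the thesis.

```python
# Quick numeric sketch of the standard Fabry-Perot relations behind the reported
# figures (free spectral range and linewidth). The cavity length and mirror
# reflectivity below are illustrative assumptions, not the thesis's device data.
import math

wavelength = 1.55e-6        # operating wavelength (m), telecom C-band
optical_length = 25e-6      # optical cavity length n*L (m), assumed air-gap cavity
reflectivity = 0.905        # effective DBR mirror reflectivity, assumed

fsr = wavelength**2 / (2 * optical_length)                     # free spectral range (m)
finesse = math.pi * math.sqrt(reflectivity) / (1 - reflectivity)
linewidth = fsr / finesse                                      # resonance linewidth (m)

print(f"FSR       ~ {fsr * 1e9:.1f} nm")
print(f"finesse   ~ {finesse:.1f}")
print(f"linewidth ~ {linewidth * 1e9:.2f} nm")
```

With these assumed values the sketch returns an FSR of roughly 48 nm and a linewidth of about 1.5 nm, the same order as the figures reported above.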
Abstract:
The measures to promote wind energy in Germany have provided important impetus for further technological development and laid the foundations for the enormous growth in installed turbines. Installed wind capacity has now reached a considerable magnitude, and further growth of a similar scale is expected in the coming years. In some grid areas, the electrical power generated from wind already covers the grid load during low-load periods. This shows that wind energy has become a factor in the electrical power supply that can no longer be neglected. In power plant scheduling, the magnitude and profile of the next day's wind power have become important variables that are at the same time difficult to determine. Strong fluctuations and incorrect forecasts of wind power feed-in cause an additional need for regulating and balancing power provided by the system operator. The forecasting model developed in this work delivers the expected wind power at 16 representative wind farms, or groups of wind farms, up to 48 hours in advance. Based on forecast weather data from the German Weather Service (Deutscher Wetterdienst, DWD), the power output of the individual wind farms is computed using artificial neural networks (ANN). Compared with physical approaches, this method has the advantage that the complex relationship between weather conditions and wind farm power does not have to be analysed laboriously and described mathematically in detail; instead, it is learned by the neural networks from historical data. The forecasting model consists of two modules. The first produces a forecast for the following day based on the meteorological predictions of the DWD. The second module incorporates the power data measured online at the representative wind farms in order to improve the original day-ahead forecast and to compute a very accurate short-term forecast for the next three to six hours. The results of the forecasting modules for the representative sites are then used by a transformation model, the so-called online model, to compute the total feed-in for a larger region. The particular strengths of the forecasting procedure lie in its accuracy, short computation times and low operating costs, since the use of the already implemented online model means that only a small number of forecast and measurement sites is required. The forecasting model presented here was originally developed and optimised for E.ON Netz GmbH and has been in operation there since July 2001. It can, however, easily be adapted to other regions; all that is required are the measured power data of selected representative wind farms and the corresponding weather forecasts in order to train the neural networks accordingly.
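A minimal sketch of the two-module idea described above: a day-ahead power forecast from predicted wind speed, followed by a short-term correction using the latest online power measurement. A crude power-curve stand-in replaces the trained neural networks of the thesis, and all figures are invented for illustration.

```python
# Minimal sketch of the two-module structure: a day-ahead forecast of wind-farm
# power from predicted wind speed, then a short-term correction using the online
# power measurement. A simple power-curve stand-in replaces the thesis's trained
# neural networks; all figures are invented for illustration.
import numpy as np

def power_curve(v, rated_kw=20000, v_cut_in=3.0, v_rated=13.0, v_cut_out=25.0):
    """Very rough wind-farm power curve (kW) as a function of wind speed (m/s)."""
    if v < v_cut_in or v >= v_cut_out:
        return 0.0
    if v >= v_rated:
        return float(rated_kw)
    return rated_kw * ((v - v_cut_in) / (v_rated - v_cut_in)) ** 3

# Module 1: day-ahead forecast from predicted wind speeds (one value per hour)
forecast_speeds = np.array([6.0, 7.5, 9.0, 11.0, 12.5, 13.5])
day_ahead = np.array([power_curve(v) for v in forecast_speeds])

# Module 2: blend with the latest online measurement to correct the near-term hours
measured_now = 1200.0                         # kW measured online at the farm
error_now = measured_now - day_ahead[0]
weights = np.array([1.0, 0.8, 0.6, 0.4, 0.2, 0.0])   # correction fades with lead time
short_term = day_ahead + weights * error_now

print("day-ahead  (kW):", np.round(day_ahead))
print("short-term (kW):", np.round(short_term))
```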