937 results for Free Vibration Analysis
Investigation of the Effect of Array Geometry on the Performance of Free-Space Optical Interconnects
Abstract:
The effect of transmitter and receiver array configurations on stray-light and diffraction-caused crosstalk in free-space optical interconnects (FSOIs) was investigated. The optical system simulation software Code V was used to simulate both the stray-light and the diffraction-caused crosstalk. Experimentally measured, spectrally resolved near-field images of VCSEL higher-order modes were used as extended sources in our simulation model. In addition, we included electrical and optical noise in our analysis to give a more accurate picture of the overall performance of the FSOI system. Our results show that changing the square lattice geometry to a hexagonal configuration yields an overall signal-to-noise ratio improvement of 3 dB. Furthermore, system density is increased by up to 4 channels/mm².
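The density gain follows directly from lattice geometry: at equal nearest-neighbour pitch, a hexagonal lattice packs 2/√3 (about 15%) more channels per unit area than a square one. A minimal sketch of that calculation, with an illustrative array pitch rather than one taken from the paper:

```python
import math

def channel_density(pitch_mm: float, lattice: str) -> float:
    """Channels per mm^2 for a given centre-to-centre pitch.

    Square lattice: one channel per pitch^2 cell.
    Hexagonal lattice: one channel per (sqrt(3)/2)*pitch^2 cell,
    i.e. a 2/sqrt(3) ~ 15% density gain at the same nearest-neighbour spacing.
    """
    cell = pitch_mm ** 2
    if lattice == "hexagonal":
        cell *= math.sqrt(3) / 2
    return 1.0 / cell

pitch = 0.25  # mm, illustrative VCSEL array pitch
for lat in ("square", "hexagonal"):
    print(f"{lat:10s}: {channel_density(pitch, lat):6.2f} channels/mm^2")
```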
Abstract:
Derivational morphology proposes meaningful connections between words and is largely unrepresented in lexical databases. This thesis presents a project to enrich a lexical database with morphological links and to evaluate their contribution to disambiguation. A lexical database with sense distinctions was required. WordNet was chosen because of its free availability and widespread use. Its suitability was assessed through critical evaluation, with respect to specifications and criticisms, using a transparent, extensible model. The identification of serious shortcomings suggested a portable enrichment methodology, applicable to alternative resources. Although 40% of the most frequent words are prepositions, they have been largely ignored by computational linguists, so the addition of prepositions was also required. The preferred approach to morphological enrichment was to infer relations from phenomena discovered algorithmically. Both existing databases and existing algorithms can capture regular morphological relations but cannot capture exceptions correctly; neither provides any semantic information. Some morphological analysis algorithms are subject to the fallacy that morphological analysis can be performed simply by segmentation. Morphological rules, grounded in observation and etymology, govern the association and attachment of suffixes and contribute to defining the meaning of morphological relationships. Specifying character substitutions circumvents the segmentation fallacy. Morphological rules are prone to undergeneration, minimised through a variable lexical-validity requirement, and to overgeneration, minimised by rule reformulation and by restricting monosyllabic output. The rules take into account the morphology of ancestor languages through co-occurrences of morphological patterns. Where multiple rules apply to an input suffix, their precedence must be established. The resistance of prefixations to segmentation has been addressed by identifying linking-vowel exceptions and irregular prefixes. The automatic affix-discovery algorithm applies heuristics to identify meaningful affixes and is combined with the morphological rules into a hybrid model, fed only with empirical data collected without supervision. Further algorithms apply the rules optimally to automatically pre-identified suffixes and break words into their component morphemes. To handle exceptions, stoplists were created in response to initial errors and fed back into the model through iterative development, leading to 100% precision, contestable only on lexicographic criteria. Stoplist length is minimised by special treatment of monosyllables and by reformulation of rules. 96% of words and phrases are analysed. 218,802 directed derivational links have been encoded in the lexicon rather than in the wordnet component of the model, because the lexicon provides the optimal clustering of word senses. Both the links and the analyser are portable to an alternative lexicon. The evaluation uses the extended gloss overlaps disambiguation algorithm. The enriched model outperformed WordNet in terms of recall without loss of precision. The failure of all experiments to outperform disambiguation by frequency reflects on WordNet's sense distinctions.
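As a rough illustration of rule application by character substitution rather than naive segmentation, with a toy lexicon, rule set and stoplist standing in for the thesis's full resources:

```python
# A minimal sketch: a rule maps a source suffix pattern to a target suffix,
# and a candidate relation is kept only if the derived base is a valid
# lexicon entry and the pair is not stoplisted. All data here are tiny
# illustrative stand-ins, not the thesis's actual rules or lexicon.

LEXICON = {"deny", "denial", "happy", "happiness", "decide", "decision"}
STOPLIST = {("pension", "pens")}  # exceptions fed back during iterative development

# (source_suffix, target_suffix): substitution, not segmentation, so stem
# alternations like deny -> deni- are handled by the rule itself.
RULES = [("ial", "y"),      # denial -> deny
         ("iness", "y"),    # happiness -> happy
         ("sion", "de")]    # decision -> decide

def derive_links(word: str):
    """Yield (derived, base) links licensed by a rule and validated in the lexicon."""
    for src, tgt in RULES:
        if word.endswith(src):
            base = word[: -len(src)] + tgt
            if base in LEXICON and (word, base) not in STOPLIST:
                yield (word, base)

for w in sorted(LEXICON):
    for link in derive_links(w):
        print("%s -> %s" % link)
```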
Abstract:
The modelling of mechanical structures using finite element analysis has become an indispensable stage in the design of new components and products. Once the theoretical design has been optimised, a prototype may be constructed and tested. What can the engineer do if the measured and theoretically predicted vibration characteristics of the structure are significantly different? This thesis considers the problem of changing the parameters of the finite element model to improve the correlation between a physical structure and its mathematical model. Two new methods are introduced to perform the systematic parameter updating. The first uses the measured modal model to derive the parameter values with minimum variance. The user must provide estimates of the variance of the theoretical parameter values and of the measured data. Previous authors using similar methods have assumed that the estimated parameters and the measured modal properties are statistically independent; this will generally be the case during the first iteration but not subsequently. The second method updates the parameters directly from the frequency response functions. The order of the finite element model of the structure is reduced as a function of the unknown parameters, and a method related to a weighted equation-error algorithm is used to update the parameters. After each iteration the weighting changes, so that on convergence the output error is minimised. The suggested methods are extensively tested using simulated data. An H-frame is then used to demonstrate the algorithms on a physical structure.
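A minimal numerical sketch of one minimum-variance update step of the general kind described, assuming a Kalman-style gain built from the user-supplied parameter and measurement variances; the sensitivity matrix, variances and residual below are illustrative stand-ins, not the thesis's formulation:

```python
import numpy as np

def min_variance_update(theta, P, S, V, residual):
    """One minimum-variance (Bayesian) parameter update.

    theta: current parameter estimates; P: their covariance;
    S: sensitivity matrix d(modal data)/d(parameters);
    V: measurement covariance; residual: measured minus predicted modal data.
    Note: after the first iteration theta and the measurements are no longer
    statistically independent, the point the thesis makes about prior work.
    """
    G = P @ S.T @ np.linalg.inv(S @ P @ S.T + V)   # minimum-variance gain
    theta_new = theta + G @ residual
    P_new = (np.eye(len(theta)) - G @ S) @ P       # updated parameter covariance
    return theta_new, P_new

theta = np.array([1.0, 2.0])               # e.g. stiffness scaling parameters
P = np.diag([0.1, 0.2])                    # confidence in the FE parameter values
V = np.diag([1e-4, 1e-4, 1e-4])            # confidence in the measured modal data
S = np.array([[0.5, 0.1], [0.2, 0.4], [0.3, 0.3]])  # 3 modal outputs, 2 parameters
residual = np.array([0.01, -0.02, 0.005])
theta, P = min_variance_update(theta, P, S, V, residual)
print("updated parameters:", theta)
```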
Abstract:
The rapid developments in computer technology have resulted in a widespread use of discrete event dynamic systems (DEDSs). This type of system is complex because it exhibits properties such as concurrency, conflict and non-determinism. It is therefore important to model and analyse such systems before implementation to ensure safe, deadlock-free and optimal operation. This thesis investigates current modelling techniques and describes Petri net theory in more detail. It reviews top-down, bottom-up and hybrid Petri net synthesis techniques that are used to model large systems, and introduces an object-oriented methodology to enable the modelling of larger and more complex systems. Designs obtained by this methodology are modular, easy to understand and allow re-use of designs. Control is the next logical step in the design process. This thesis reviews recent developments in the control of DEDSs and investigates the use of Petri nets in the design of supervisory controllers. The scheduling of the exclusive use of resources is investigated; an efficient Petri-net-based scheduling algorithm is designed and a reconfigurable controller is proposed. To enable the analysis and control of large and complex DEDSs, an object-oriented C++ software tool kit was developed and used to implement a Petri net analysis tool and Petri net scheduling and control algorithms. Finally, the methodology was applied to two industrial DEDSs: a prototype can-sorting machine developed by Eurotherm Controls Ltd., and a semiconductor testing plant belonging to SGS Thomson Microelectronics Ltd.
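A minimal sketch of the reachability analysis such a tool kit performs, for a toy single-machine/single-resource net (assuming a bounded net, so the enumeration terminates):

```python
from collections import deque

class PetriNet:
    """Place/transition net; markings are tuples of token counts."""
    def __init__(self, pre, post):
        self.pre, self.post = pre, post      # transition -> {place: tokens}

    def enabled(self, m, t):
        return all(m[p] >= n for p, n in self.pre[t].items())

    def fire(self, m, t):
        m = list(m)
        for p, n in self.pre[t].items():
            m[p] -= n
        for p, n in self.post[t].items():
            m[p] += n
        return tuple(m)

    def reachability(self, m0):
        """Breadth-first reachability graph (terminates for bounded nets)."""
        seen, edges, frontier = {m0}, [], deque([m0])
        while frontier:
            m = frontier.popleft()
            for t in self.pre:
                if self.enabled(m, t):
                    m2 = self.fire(m, t)
                    edges.append((m, t, m2))
                    if m2 not in seen:
                        seen.add(m2)
                        frontier.append(m2)
        return seen, edges

# places: 0 = machine idle, 1 = machine busy, 2 = resource free
net = PetriNet(pre={"start": {0: 1, 2: 1}, "stop": {1: 1}},
               post={"start": {1: 1}, "stop": {0: 1, 2: 1}})
markings, edges = net.reachability((1, 0, 1))
print(len(markings), "reachable markings;", len(edges), "edges")
```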
Abstract:
Using current software engineering technology, the robustness required for safety-critical software is not assurable. However, different approaches are possible which can help to assure software robustness to some extent. To achieve highly reliable software, methods should be adopted which avoid introducing faults (fault avoidance); testing should then be carried out to identify any faults which persist (error removal); finally, techniques should be used which allow any undetected faults to be tolerated (fault tolerance). The verification of correctness in the system design specification and the performance analysis of the model are the basic issues in concurrent systems. In this context, modelling distributed concurrent software is one of the most important activities in the software life cycle, and communication analysis is a primary consideration in achieving reliability and safety. By and large, fault avoidance requires human analysis, which is error prone; by reducing human involvement in the tedious aspects of modelling and analysing the software, it is hoped that fewer faults will persist into its implementation in the real-time environment. The Occam language supports concurrent programming and is a language in which interprocess interaction takes place by communication. This may lead to deadlock due to communication failure. Proper systematic methods must be adopted in the design of concurrent software for distributed computing systems if the communication structure is to be free of pathologies such as deadlock. The objective of this thesis is to provide a design environment which ensures that processes are free from deadlock. A software tool was designed and used to facilitate the production of fault-tolerant software for distributed concurrent systems. Where Occam is used as a design language, state-space methods such as Petri nets can be used in analysis and simulation to determine the dynamic behaviour of the software and to identify structures which may be prone to deadlock, so that they may be eliminated from the design before the program is ever run. This design software tool consists of two parts. One takes an input program and translates it into a mathematical model (a Petri net), which is used for modelling and analysis of the concurrent software. The second part is the Petri net simulator, which takes the translated program as its input and runs a simulation to generate the reachability tree. The tree identifies 'deadlock potential', which the user can explore further. Finally, the software tool has been applied to a number of Occam programs; two examples are given to show how the tool works in the early design phase for fault prevention, before the program is ever run.
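A rough sketch of the 'deadlock potential' check on the reachability set: any reachable marking that enables no transition is reported. The toy net below encodes two processes acquiring two shared channels in opposite order, the classic circular wait; the encoding is illustrative, not the tool's actual input format:

```python
def enabled(m, pre):
    return all(m[p] >= n for p, n in pre.items())

def fire(m, pre, post):
    m = list(m)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] += n
    return tuple(m)

def dead_markings(transitions, m0):
    """Depth-first search of the reachability set; collect markings
    from which no transition can fire."""
    seen, stack, dead = {m0}, [m0], []
    while stack:
        m = stack.pop()
        succ = [fire(m, pre, post) for pre, post in transitions if enabled(m, pre)]
        if not succ:
            dead.append(m)
        for m2 in succ:
            if m2 not in seen:
                seen.add(m2)
                stack.append(m2)
    return dead

# places: 0 = chan A free, 1 = chan B free, 2 = P ready, 3 = P holds A,
#         4 = Q ready,     5 = Q holds B,   6 = P done,  7 = Q done
transitions = [
    ({2: 1, 0: 1}, {3: 1}),              # P acquires channel A
    ({3: 1, 1: 1}, {6: 1, 0: 1, 1: 1}),  # P uses B, finishes, releases both
    ({4: 1, 1: 1}, {5: 1}),              # Q acquires channel B
    ({5: 1, 0: 1}, {7: 1, 0: 1, 1: 1}),  # Q uses A, finishes, releases both
]
print(dead_markings(transitions, (1, 1, 1, 0, 1, 0, 0, 0)))
```

The circular-wait marking (P holds A, Q holds B) is reported alongside the intended terminal marking; both are dead by this definition, which is why such a tool flags 'deadlock potential' for the user to explore rather than deciding outright.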
Abstract:
This chapter provides the theoretical foundation and background of the data envelopment analysis (DEA) method. We first introduce the basic DEA models. The balance of the chapter focuses on evidence that DEA has been extensively applied for measuring the efficiency and productivity of services, including financial services (banking, insurance, securities and fund management), professional services, health services, education services, environmental and public services, energy services, logistics, tourism, information technology, telecommunications, transport, distribution, audio-visual, media, entertainment, cultural and other business services. Finally, we provide information on the use of the Performance Improvement Management Software (PIM-DEA). A free limited version of this software, and the procedure for downloading it, are also included in this chapter.
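As a concrete illustration of the basic input-oriented CCR model introduced above (solved here with SciPy rather than the PIM-DEA software the chapter describes), with a small illustrative data set:

```python
# For each decision-making unit o, minimise theta subject to a composite unit
# dominating it: X @ lam <= theta * x_o, Y @ lam >= y_o, lam >= 0.

import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0],    # inputs:  rows = inputs,  cols = DMUs
              [3.0, 1.0, 2.0]])
Y = np.array([[1.0, 1.0, 1.0]])   # outputs: rows = outputs, cols = DMUs

def ccr_efficiency(o):
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # decision vector [theta, lam]
    A_in = np.hstack([-X[:, [o]], X])            # X @ lam - theta * x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # -Y @ lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, o]])
    bounds = [(None, None)] + [(0, None)] * n    # theta free, lam >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.fun

for o in range(X.shape[1]):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```

With these toy data, DMUs 0 and 1 lie on the frontier (efficiency 1) and DMU 2 scores 0.7, i.e. it could produce its output with 70% of its inputs.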
Abstract:
Purpose: To compare distance and near visual performance of a zero-aberration aspheric intraocular lens (IOL) (Softec HD; Lenstec, Inc., FL, USA) with that of an otherwise identical but spherical IOL (Softec 1). Setting: Department of Ophthalmology, Solihull Hospital, West Midlands, United Kingdom. Methods: This prospective study comprised 37 patients with a Softec 1 spherical IOL implanted in one eye who underwent phacoemulsification and received the Softec HD aspheric IOL in the fellow eye. One month post-operatively, unaided distance and near vision, residual refraction, best spectacle-corrected distance and near visual acuity, reading speed, pseudoaccommodation and photopic contrast sensitivity were recorded. Wavefront analysis enabled comparison of higher-order aberrations between the IOLs. Results: Prior to surgery, the Softec 1 and Softec HD eyes were not significantly different. Post-operatively, unaided vision, best spectacle-corrected visual acuity and residual refraction were not significantly different between the eyes, nor were there significant differences between the measured wavefront aberrations. Once implanted, the range of focus was significantly better in the Softec HD IOL eye than in the Softec 1 IOL eye and, although reading speed was equivalent to that of the Softec 1 eye, the print size at which it could be achieved was significantly smaller. Conclusions: Depth of field was significantly improved with the aspheric IOL compared with the spherical IOL, without any compromise in distance visual performance between the two IOLs.
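The fellow-eye design pairs each Softec 1 eye with a Softec HD eye in the same patient, so outcome measures are naturally compared with paired statistics. A minimal sketch, assuming a paired t-test on logMAR acuities with placeholder values (the study's actual data and choice of test are not reproduced here):

```python
import numpy as np
from scipy import stats

# logMAR distance acuity one month post-op (illustrative placeholders, n = 10)
softec1   = np.array([0.10, 0.00, 0.18, 0.10, 0.30, 0.00, 0.10, 0.20, 0.00, 0.10])
softec_hd = np.array([0.10, 0.00, 0.20, 0.08, 0.28, 0.02, 0.10, 0.18, 0.00, 0.12])

# paired test across fellow eyes of the same patients
t, p = stats.ttest_rel(softec1, softec_hd)
print(f"paired t = {t:.2f}, p = {p:.3f}")
```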
Abstract:
This thesis is concerned with demonstrating how the visual representation of the sequence distribution of the individual monomer units of a polymer, as would be observed upon polymerisation, may be utilised in designing and synthesising polymers with relatively low cell-adhesion characteristics. The initial part of this thesis demonstrates the use of a computer simulation technique to illustrate the sequence distribution that would be observed upon the polymerisation of a set of monomers. The power of the computer simulation technique has been demonstrated through the simulation of the sequence distributions of some generic contact lens materials. These generic contact lens materials were chosen simply because, in the field of biomaterials, their compositions are amongst the most systematically regulated and they present a wide range of compositions. The validity of the computer simulation technique has been assessed through the synthesis and analysis of linear free-radical polymers at different conversions. Two main parameters were examined: the composition and the number-average sequence lengths of the individual monomer units at various conversions. The polymers were synthesised by solution polymerisation. The monomer composition was determined by elemental analysis and ¹³C nuclear magnetic resonance (NMR) spectroscopy; number-average sequence lengths were determined exclusively by ¹³C NMR. Although the computer simulation technique provides a visual representation of the monomer sequence distribution up to 100% conversion, these assessments were made on linear polymers at reasonably high conversion (above 50%) but below 100% conversion, for ease of analysis. The analyses proved that the computer simulation technique was reasonably accurate in predicting the sequence distribution of monomer units upon polymerisation. An approach has been presented which allows one to manipulate the choice of monomers, with their reactivity ratios, thereby enabling the design of polymers with controlled sequence distributions. Hydrogel membranes with relatively controlled sequence distributions, polymerised to 100% conversion, were synthesised to represent prospective biomaterials. Cell adhesion studies were used as a biological probe to investigate the susceptibility of the surfaces of these membranes to cell adhesion; this was necessary in order to assess the surface biocompatibility, or biotolerance, of these prospective biomaterials.
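A rough sketch of the kind of sequence-distribution simulation described, assuming the terminal model of copolymerisation: the monomer-addition probability depends on the chain-end unit and the reactivity ratios, the feed is depleted as the chain grows, and number-average sequence lengths are read off the simulated chain. Feed sizes, reactivity ratios and target conversion are illustrative:

```python
import random

def simulate_chain(n1, n2, r1, r2, conversion=0.6, seed=1):
    """Terminal-model Monte Carlo growth of one chain to a target conversion."""
    rng = random.Random(seed)
    target = int((n1 + n2) * conversion)
    last = 1 if rng.random() < n1 / (n1 + n2) else 2
    chain = [last]
    while len(chain) < target and n1 + n2 > 0:
        # P(add M1 | chain ends in M1) = r1[M1] / (r1[M1] + [M2]), etc.
        if last == 1:
            p1 = r1 * n1 / (r1 * n1 + n2)
        else:
            p1 = n1 / (n1 + r2 * n2)
        last = 1 if rng.random() < p1 else 2
        chain.append(last)
        if last == 1:
            n1 -= 1
        else:
            n2 -= 1
    return chain

def number_average_run_length(chain, monomer):
    runs, length = [], 0
    for unit in chain:
        if unit == monomer:
            length += 1
        elif length:
            runs.append(length)
            length = 0
    if length:
        runs.append(length)
    return sum(runs) / len(runs) if runs else 0.0

chain = simulate_chain(n1=5000, n2=5000, r1=0.8, r2=1.2)
for m in (1, 2):
    print(f"monomer {m}: n-avg sequence length = {number_average_run_length(chain, m):.2f}")
```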
Abstract:
This thesis reports the development of a reliable method for the prediction of the response to electromagnetically induced vibration in large electric machines. The machines of primary interest are DC ship-propulsion motors, but much of the work reported has broader significance. The investigation has involved work in five principal areas: (1) the development and use of dynamic substructuring methods; (2) the development of special elements to represent individual machine components; (3) laboratory-scale investigations to establish empirical values for properties which affect machine vibration levels; (4) experiments on machines on the factory test-bed to provide data for correlation with prediction; and (5) reasoning with regard to the effect of various design features. The limiting factor in producing good models for machines in vibration is the time required for an analysis to take place. Dynamic substructuring methods were adopted early in the project to maximise the efficiency of the analysis. A review of existing substructure-representation and composite-structure assembly methods includes comments on which are most suitable for this application. Three appendices to the main volume present methods developed by the author to accelerate analyses. Despite significant advances in this area, the limiting factor in machine analyses is still time. The representation of individual machine components was addressed as another means by which the time required for an analysis could be reduced. This has resulted in the development of special elements which are more efficient than their finite-element counterparts. The laboratory-scale experiments reported were undertaken to establish empirical values for the properties of three distinct features: lamination stacks, bolted-flange joints in rings and cylinders, and the shimmed pole-yoke joint. These are central to the preparation of an accurate machine model. The theoretical methods are tested numerically and correlated with tests on two machines (running and static). A system has been devised with which the general electromagnetic forcing may be split into its most fundamental components; this is used to draw some conclusions about the probable effects of various design features.
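A minimal sketch of the substructuring idea, assuming toy spring-mass chains in place of reduced component models: two substructures are coupled at a shared interface degree of freedom by primal assembly, and the coupled natural frequencies follow from the generalised eigenproblem Kx = ω²Mx:

```python
import numpy as np
from scipy.linalg import eigh

def free_chain(masses, stiffs):
    """Free-free spring-mass chain: spring i couples DOFs i and i+1."""
    n = len(masses)
    M = np.diag(masses)
    K = np.zeros((n, n))
    for i, k in enumerate(stiffs):          # len(stiffs) == n - 1
        K[i, i] += k
        K[i + 1, i + 1] += k
        K[i, i + 1] -= k
        K[i + 1, i] -= k
    return M, K

def assemble(MA, KA, MB, KB):
    """Primal assembly: the last DOF of A and the first DOF of B are one node."""
    nA, nB = len(MA), len(MB)
    n = nA + nB - 1
    M, K = np.zeros((n, n)), np.zeros((n, n))
    M[:nA, :nA] += MA
    K[:nA, :nA] += KA
    M[nA - 1:, nA - 1:] += MB
    K[nA - 1:, nA - 1:] += KB
    return M, K

MA, KA = free_chain([1.0, 1.0, 0.5], [1e4, 2e4])   # substructure A (toy values)
MB, KB = free_chain([0.5, 2.0], [5e3])             # substructure B (toy values)
M, K = assemble(MA, KA, MB, KB)
# ground the first DOF of A, then solve K x = w^2 M x for the coupled system
w2 = eigh(K[1:, 1:], M[1:, 1:], eigvals_only=True)
print("coupled natural frequencies (Hz):", np.sqrt(w2) / (2 * np.pi))
```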
Abstract:
The analysis and prediction of the dynamic behaviour of structural components plays an important role in modern engineering design. In this work, the so-called "mixed" finite element models based on Reissner's variational principle are applied to the solution of free and forced vibration problems for beam and plate structures. The mixed beam models are obtained using elements with various shape functions, ranging from simple linear to more complex quadratic and cubic functions. The elements were in general capable of predicting the natural frequencies and dynamic responses with good accuracy. An isoparametric quadrilateral element with 8 nodes was developed for application to thin-plate problems. The element has 32 degrees of freedom (one deflection, two bending moments and one twisting moment per node), which makes it suitable for the discretisation of plates with arbitrary geometry. A linear isoparametric element and two non-conforming displacement elements (4-node and 8-node quadrilaterals) were extended to the solution of dynamic problems. An auto-mesh generation program was used to facilitate the preparation of the input data required by the 8-node quadrilateral elements of mixed and displacement type. Numerical examples were solved using both the mixed beam and plate elements to predict a structure's natural frequencies and its dynamic response to a variety of forcing functions. The solutions were compared with the available analytical and displacement-model solutions. The mixed elements developed have been found to have significant advantages over conventional displacement elements in the solution of plate-type problems: a dramatic saving in computational time is possible without any loss of solution accuracy. With beam-type problems, there appear to be no significant advantages in using mixed models.
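For comparison with the mixed formulation, a minimal sketch of the conventional displacement-based route to beam natural frequencies: standard Euler-Bernoulli elements assembled for a simply supported beam and checked against the exact values ω_n = (nπ/L)²√(EI/ρA). Section properties are illustrative:

```python
import numpy as np
from scipy.linalg import eigh

E, I, rho, A, L, n_el = 210e9, 1e-6, 7800.0, 1e-3, 2.0, 20
le = L / n_el
# element stiffness and consistent mass (DOFs per element: w1, th1, w2, th2)
k = E * I / le**3 * np.array([[12, 6*le, -12, 6*le],
                              [6*le, 4*le**2, -6*le, 2*le**2],
                              [-12, -6*le, 12, -6*le],
                              [6*le, 2*le**2, -6*le, 4*le**2]])
m = rho * A * le / 420 * np.array([[156, 22*le, 54, -13*le],
                                   [22*le, 4*le**2, 13*le, -3*le**2],
                                   [54, 13*le, 156, -22*le],
                                   [-13*le, -3*le**2, -22*le, 4*le**2]])
ndof = 2 * (n_el + 1)
K, M = np.zeros((ndof, ndof)), np.zeros((ndof, ndof))
for e in range(n_el):                       # node i carries DOFs 2i, 2i+1
    sl = slice(2 * e, 2 * e + 4)
    K[sl, sl] += k
    M[sl, sl] += m
# simply supported: suppress the transverse DOF at each end
keep = [i for i in range(ndof) if i not in (0, ndof - 2)]
K, M = K[np.ix_(keep, keep)], M[np.ix_(keep, keep)]
w = np.sqrt(eigh(K, M, eigvals_only=True))
exact = [(n * np.pi / L)**2 * np.sqrt(E * I / (rho * A)) for n in (1, 2, 3)]
print("FE    (rad/s):", w[:3])
print("exact (rad/s):", exact)
```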
Abstract:
The unmitigated transmission of undesirable vibration can cause problems such as human discomfort, machinery and equipment failure, and degraded quality in a manufacturing process. When identifiable transmission paths are discernible, vibrations from the source can be isolated from the rest of the system, which prevents or minimises these problems. The approach proposed here for vibration isolation is active force cancellation at points close to the vibration source. It uses force feedback for multiple-input multiple-output (MIMO) control at the mounting locations. This is particularly attractive for the rigid mounting of a machine on a relatively flexible base, where machine alignment and motions are to be restricted. The force transfer function matrix is used as a disturbance-rejection performance specification for the design of MIMO controllers. For a machine soft-mounted via flexible isolators, a model for this matrix has been derived. Under certain conditions, a simple multiplicative uncertainty model is obtained that shows the amount of perturbation a flexible base introduces into the machine-isolator-rigid-base transmissibility matrix. Such a model is very suitable for use within the robust control design paradigm. A different model is derived for the machine on hard mounts without the flexible isolators. With this model, the level of force transmitted from a machine to a final mounting structure can be determined using measurements taken with the machine running on another mounting structure, the two mounting structures having dissimilar dynamic characteristics. Experiments have verified the usefulness of the expression, and the model compares well with other methods in the literature; the disadvantage lies in the large amount of data that has to be collected. Active force cancellation is demonstrated on an experimental rig using an industrial AC motor hard-mounted onto a relatively flexible structure. The force transfer function matrix, determined from measurements, is used to design H∞ and static output feedback controllers. Both types of controller are stable and robust to modelling errors within the identified frequency range. They reduce the RMS of the transmitted force by 30–80% at all mounting locations for the machine running at 1340 rpm. At the rated speed of 1440 rpm, only the static-gain controller is able to provide a 30–55% reduction at all locations; the H∞ controllers could only give a small reduction at one mount location. This is due in part to deficiencies in the model used in the design: higher-frequency dynamics have been ignored. This can be resolved by using a higher-order model, which in turn results in a high-order controller. A low-order static-gain controller, with some tuning, performs better, but it lacks the analytical framework for analysis and design.
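A rough sketch of the multiplicative-perturbation view described, assuming a single-axis model: the machine-isolator transmissibility over a rigid base is compared with the same isolator over a flexible base modelled as a grounded mass-spring-damper, and Δ = T_flex/T_rigid − 1 is the perturbation a robust design would bound. All parameter values are illustrative, not from the experimental rig:

```python
import numpy as np

w = 2 * np.pi * np.linspace(1, 200, 2000)     # frequency grid, rad/s
m, k, c = 50.0, 2e5, 300.0                    # machine mass and isolator (k, c)
mb, kb, cb = 200.0, 5e6, 1000.0               # flexible base: mass grounded via (kb, cb)

s = 1j * w
iso = k + c * s                               # isolator dynamic stiffness
base = kb + cb * s                            # base-to-ground dynamic stiffness

# rigid base: force through the isolator to ground
T_rigid = iso / (m * s**2 + iso)
# 2-DOF model: transmitted force taken through the base spring to ground
den = (m * s**2 + iso) * (mb * s**2 + iso + base) - iso**2
T_flex = iso * base / den

delta = np.abs(T_flex / T_rigid - 1)          # multiplicative perturbation
print("max |Delta| over 1-200 Hz: %.2f" % delta.max())
```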
Abstract:
Oxidized and chlorinated phospholipids are generated under inflammatory conditions and are increasingly understood to play important roles in diseases involving oxidative stress. MS is a sensitive and informative technique for monitoring phospholipid oxidation that can provide structural information and simultaneously detect a wide variety of oxidation products, including chain-shortened and chlorinated phospholipids. MSn technologies involve fragmentation of the compounds to yield diagnostic fragment ions and thus assist in identification. Advanced methods such as neutral-loss and precursor-ion scanning can facilitate the analysis of specific oxidation products in complex biological samples. This is essential for determining the contributions of different phospholipid oxidation products in disease. While many pro-inflammatory signalling effects of oxPLs (oxidized phospholipids) have been reported, it has more recently become clear that they can also have anti-inflammatory effects in conditions such as infection and endotoxaemia. In contrast with free radical-generated oxPLs, the signalling effects of chlorinated lipids are much less well understood, but they appear to be mainly pro-inflammatory. Specific analysis of oxidized and chlorinated lipids and the determination of their molecular effects are crucial to understanding their role in disease pathology.
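A minimal sketch of a neutral-loss filter of the kind used to pull specific products out of complex spectra: precursor/product pairs are kept when their mass difference matches a diagnostic loss within tolerance. The peak list and the loss value below are illustrative placeholders, not data from the paper:

```python
PEAKS = [  # (precursor m/z, product m/z) pairs from a tandem MS run
    (782.57, 723.50),
    (810.60, 751.53),
    (766.54, 675.50),
]

def neutral_loss_scan(pairs, loss, tol=0.02):
    """Keep pairs whose precursor-product mass difference matches `loss`."""
    return [(p, f) for p, f in pairs if abs((p - f) - loss) <= tol]

for prec, frag in neutral_loss_scan(PEAKS, loss=59.073):
    print(f"precursor {prec} -> product {frag} (neutral loss ~59.07)")
```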
Abstract:
The field of free radical biology and medicine continues to move at a tremendous pace, with a constant flow of ground-breaking discoveries. The following collection of papers in this issue of Biochemical Society Transactions highlights several key areas of topical interest, including the crucial role of validated measurements of radicals and reactive oxygen species in underpinning nearly all research in the field, the important advances being made as a result of the overlap of free radical research with the reinvigorated field of lipidomics (driven in part by innovations in MS-based analysis), the acceleration of new insights into the role of oxidative protein modifications (particularly to cysteine residues) in modulating cell signalling, and the effects of free radicals on the functions of mitochondria, extracellular matrix and the immune system. In the present article, we provide a brief overview of these research areas, but, throughout this discussion, it must be remembered that it is the availability of reliable analytical methodologies that will be a key factor in facilitating continuing developments in this exciting research area.
Abstract:
In their policy proposals on how best to stimulate economic growth, economists have increasingly emphasized free markets. It is, however, possible that free-market-led economic growth can lead to increased income inequity, which can further increase poverty. One of the more interesting but thus far insufficiently explored mechanisms for the latter is food–feed competition. Using Peruvian Living Standard Survey (PLSS) data for 1985–86 and 1990, the paper examines the demand patterns of households and concludes that the empirical evidence is in agreement with the hypotheses underlying the theory of food–feed competition.
Abstract:
Objectives and Methods: Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. Results: The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined, highlighting the advantages of dynamic analytical techniques over static methods. Discussion of the surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble and Wilhelmy plate techniques are discussed; their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. Conclusions: No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is no ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable. © 2013 Contact Lens Association of Ophthalmologists.
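As an illustration of how contact angles feed surface free energy analysis, a minimal sketch of the Owens-Wendt (geometric-mean) method, assuming two probe liquids with tabulated dispersive/polar components; the measured angles below are illustrative placeholders:

```python
import numpy as np

# probe liquids: (gamma_total, gamma_dispersive, gamma_polar) in mN/m
WATER = (72.8, 21.8, 51.0)
DIIODOMETHANE = (50.8, 50.8, 0.0)

def owens_wendt(theta_w_deg, theta_d_deg):
    """Solve gamma_L(1 + cos t) = 2*sqrt(gd_s*gd_L) + 2*sqrt(gp_s*gp_L)
    for the solid's sqrt(gd_s), sqrt(gp_s) using two probe liquids."""
    A, b = [], []
    for (gl, gd, gp), theta in ((WATER, theta_w_deg), (DIIODOMETHANE, theta_d_deg)):
        A.append([2 * np.sqrt(gd), 2 * np.sqrt(gp)])
        b.append(gl * (1 + np.cos(np.radians(theta))))
    x = np.linalg.solve(np.array(A), np.array(b))   # x = [sqrt(gd_s), sqrt(gp_s)]
    gd_s, gp_s = x[0] ** 2, x[1] ** 2
    return gd_s, gp_s, gd_s + gp_s

gd, gp, total = owens_wendt(theta_w_deg=65.0, theta_d_deg=40.0)
print(f"dispersive {gd:.1f}, polar {gp:.1f}, total {total:.1f} mN/m")
```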