52 results for optimising compiler
Abstract:
A common method for inducing the production of recombinant proteins in Pichia pastoris is through the use of methanol. However, the by-products of methanol metabolism are toxic to yeast cells, and therefore its addition to recombinant cultures must be controlled and monitored throughout the process in order to maximise recombinant protein yields. Described here are online and off-line methods to monitor and control methanol addition to bench-top-scale bioreactors. © 2012 Springer Science+Business Media, LLC.
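The feedback loop described above — measure methanol concentration, adjust the feed to hold a setpoint — can be sketched minimally as a proportional controller. All names, gains and units below are illustrative, not taken from the protocol:

```python
def methanol_feed_rate(measured, setpoint, k_p, base_rate=0.0, max_rate=1.0):
    """Proportional feed control: raise the pump rate when the measured
    methanol concentration falls below the setpoint, and clamp the result
    to the pump's physical limits (hypothetical units, e.g. L/h)."""
    rate = base_rate + k_p * (setpoint - measured)
    return min(max(rate, 0.0), max_rate)
```

In practice a real bioprocess controller would add integral action and anti-windup, but the clamped proportional term captures the monitor-and-control idea.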
Abstract:
Mobile WiFi devices are becoming increasingly popular for non-seamless, user-controlled mobile traffic offloading alongside standard WiFi hotspots. Unlike operator-controlled hotspots, a mobile WiFi device relies on the capacity of the macro-cell for the data rate allocated to it. Such devices can help offload data traffic from the macro-cell base station and serve end users within a closer range, but they change the pattern of resource distribution operated by the base station. We propose a resource allocation scheme that aims to optimise user quality of experience (QoE) when accessing video services in an environment where traffic offloading takes place through interworking between a mobile communication system and short-range wireless LANs. In this scheme, a rate redistribution algorithm is derived to perform scheduling, controlled by a no-reference quality assessment metric, in order to achieve the desired trade-off between efficiency and fairness. We show the performance of this algorithm in terms of the distribution of the allocated data rates throughout the investigated macro-cell and the service coverage offered by the WiFi access point.
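The efficiency–fairness trade-off in rate allocation is commonly formalised as alpha-fair utility maximisation. The sketch below is a generic illustration of that trade-off, not the paper's QoE-driven algorithm:

```python
def alpha_fair_rates(weights, capacity, alpha):
    """Split `capacity` across users maximising the alpha-fair utility
    sum_i w_i * r_i^(1-alpha) / (1-alpha); the optimum allocates rates
    proportional to w_i^(1/alpha).  alpha = 1 gives proportional fairness,
    while large alpha approaches max-min fairness (rates equalise)."""
    shares = [w ** (1.0 / alpha) for w in weights]
    total = sum(shares)
    return [capacity * s / total for s in shares]
```

Sweeping `alpha` moves the allocation continuously from throughput-weighted towards equal rates, which is the kind of dial a scheduler tunes when balancing efficiency against fairness.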
Abstract:
The aim of this work is to empirically generate a shortened version of the Geriatric Depression Scale (GDS), with the intention of maximising the diagnostic performance in the detection of depression compared with previously validated GDS versions, while optimising the size of the instrument. A total of 233 individuals (128 from a Day Hospital, 105 randomly selected from the community) aged 60 or over completed the GDS and other measures. The 30 GDS items were entered in the Day Hospital sample as independent variables in a stepwise logistic regression analysis predicting diagnosis of Major Depression. A final solution of 10 items was retained, which correctly classified 97.4% of cases. The diagnostic performance of these 10 GDS items was analysed in the random sample with a receiver operating characteristic (ROC) curve. Sensitivity (100%), specificity (97.2%), positive (81.8%) and negative (100%) predictive power, and the area under the curve (0.994) were comparable with values for the GDS-30 and higher compared with the GDS-15, GDS-10 and GDS-5. In addition, the new scale proposed had excellent fit when testing its unidimensionality with CFA for categorical outcomes (e.g., CFI=0.99). The 10-item version of the GDS proposed here, the GDS-R, seems to retain the diagnostic performance for detecting depression in older adults of the 30-item GDS, while increasing the sensitivity and predictive values relative to other shortened versions.
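The diagnostic indices reported above (sensitivity, specificity, predictive values, AUC) are standard quantities; a minimal sketch of how they are computed from scale scores and diagnoses, with toy data rather than the study's sample:

```python
def diagnostic_stats(scores, labels, cutoff):
    """Sensitivity, specificity and predictive values at a given cutoff
    (scores at or above the cutoff are classified as depressed)."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= cutoff and y)
    fp = sum(1 for s, y in zip(scores, labels) if s >= cutoff and not y)
    fn = sum(1 for s, y in zip(scores, labels) if s < cutoff and y)
    tn = sum(1 for s, y in zip(scores, labels) if s < cutoff and not y)
    return {"sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp), "npv": tn / (tn + fn)}

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen case scores above a non-case."""
    pos = [s for s, y in zip(scores, labels) if y]
    neg = [s for s, y in zip(scores, labels) if not y]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.994, as reported for the GDS-R, means a randomly chosen depressed participant outscores a randomly chosen non-depressed one 99.4% of the time.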
Abstract:
Training Mixture Density Network (MDN) configurations within the NETLAB framework takes time due to the nature of the computation of the error function and the gradient of the error function. By optimising the computation of these functions, so that gradient information is computed in parameter space, training time is decreased by at least a factor of sixty for the example given. Decreased training time increases the spectrum of problems to which MDNs can be practically applied, making the MDN framework an attractive method for the applied problem solver.
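The error function in question is the negative log-likelihood of a mixture density; the kind of optimisation described — evaluating it over all samples and mixture components at once rather than in loops — can be sketched as follows (a generic vectorised Gaussian-mixture NLL, not NETLAB's actual code):

```python
import numpy as np

def mdn_nll(mix, mu, sigma, t):
    """Negative log-likelihood of 1-D targets t (shape (N,)) under a
    Gaussian mixture with per-sample parameters mix, mu, sigma (each
    shape (N, M)).  Broadcasting evaluates every sample and component
    in one pass, avoiding the per-pattern loops that dominate runtime."""
    norm = (np.exp(-0.5 * ((t[:, None] - mu) / sigma) ** 2)
            / (sigma * np.sqrt(2.0 * np.pi)))
    return -np.sum(np.log(np.sum(mix * norm, axis=1)))
```

The same broadcasting pattern applies to the gradient terms, which is where the reported order-of-magnitude speed-ups typically come from.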
Abstract:
This thesis is devoted to the tribology of the head-to-tape interface of linear tape recording systems, the OnStream ADR™ system being used as the experimental platform. Combining experimental characterisation with computer modelling, a comprehensive picture of the mechanisms involved in a tape recording system is drawn. The work is designed to isolate the mechanisms responsible for the physical spacing between head and tape, with the aim of minimising spacing losses and errors and optimising signal output. Standard heads (used in current ADR products) and prototype heads (DLC- and SPL-coated heads, and dummy heads built from Al2O3-TiC and alternative single-phase ceramics intended to constitute the head tape-bearing surface) are tested in a controlled environment for up to 500 hours (exceptionally 1000 hours). Evidence of wear on the standard head is mainly observable as preferential wear of the TiC phase of the Al2O3-TiC ceramic. The TiC grains are believed to delaminate due to a fatigue wear mechanism, a hypothesis further confirmed via modelling, which locates the maximum von Mises equivalent stress at a depth equivalent to the TiC recession (20 to 30 nm). Debris of delaminated TiC residues is moreover found trapped within the pole-tip recession, and is therefore assumed to provide three-body abrasive particles, thus increasing the pole-tip recession. Iron-rich stain is found over the cycled standard head surface (preferentially over the pole-tip and, to a lesser extent, over the TiC grains) under all environmental conditions except high temperature/humidity, where mainly organic stain is apparent. Temperature (local or global) affects staining rate and aspect; stain transfer is generally promoted at high temperature. Humidity affects transfer rate and quantity; low humidity produces thinner stains at a higher rate. Stain generally targets head materials with high electrical conductivity, i.e. Permalloy and TiC.
Stains are found to decrease the friction at the head-to-tape interface, delay the hollowing-out of the TiC recession and act as a protective soft coating reducing the pole-tip recession. This is, however, at the expense of an additional spacing at the head-to-tape interface of the order of 20 nm. Two kinds of wear-resistant coating are tested: diamond-like carbon (DLC) and superprotective layer (SPL), 10 nm and 20 to 40 nm thick, respectively. The DLC coating disappears within 100 hours, possibly due to abrasive and fatigue wear. SPL coatings are generally more resistant, particularly at high temperature and low humidity, possibly in relation to stain transfer. 20 nm coatings are found to rely on the substrate wear behaviour, whereas 40 nm coatings are found to rely on the adhesive strength at the coating/substrate interface. These observations seem to locate the wear-driving forces 40 nm below the surface, and hence indicate that for coatings in the 10 nm thickness range (i.e. compatible with high-density recording) the substrate resistance must be taken into account. Single-phase ceramics as candidates for a wear-resistant tape-bearing surface are tested in the form of full-contour dummy heads. The absence of a second phase eliminates the preferential wear observed at the Al2O3-TiC surface; very low wear rates and no evidence of brittle fracture are observed.
Abstract:
A graphical process control language has been developed as a means of defining process control software. The user configures a block diagram describing the required control system, from a menu of functional blocks, using a graphics software system with a graphics terminal. Additions may be made to the menu of functional blocks, to extend the system capability, and a group of blocks may be defined as a composite block. This latter feature provides for segmentation of the overall system diagram and the repeated use of the same group of blocks within the system. The completed diagram is analysed by a graphics compiler which generates the programs and data structure to realise the run-time software. The run-time software has been designed as a data-driven system which allows for modifications at the run-time level in both parameters and system configuration. Data structures have been specified to ensure efficient execution and minimal storage requirements in the final control software. Machine independence has been accommodated as far as possible using CORAL 66 as the high level language throughout the entire system; the final run-time code being generated by a CORAL 66 compiler appropriate to the target processor.
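The functional-block and composite-block scheme described above can be sketched as a small data structure (a Python stand-in for the CORAL 66 system; names and the P-controller example are illustrative):

```python
class Block:
    """A functional block: applies fn to the outputs of its input blocks.
    Evaluation is data-driven - each block pulls its inputs on demand."""
    def __init__(self, name, fn, inputs=()):
        self.name, self.fn, self.inputs = name, fn, list(inputs)

    def run(self):
        return self.fn(*(b.run() for b in self.inputs))

class CompositeBlock(Block):
    """A group of blocks packaged as a single reusable block, so the same
    sub-diagram can appear repeatedly in the overall system diagram."""
    def __init__(self, name, output_block):
        super().__init__(name, lambda: output_block.run())

# Illustrative wiring: setpoint and sensor feed an error block,
# which feeds a gain block wrapped as a composite "P controller".
sp   = Block("setpoint", lambda: 10.0)
pv   = Block("sensor",   lambda: 7.0)
err  = Block("error",    lambda a, b: a - b, [sp, pv])
ctrl = CompositeBlock("p_controller", Block("gain", lambda e: 2.0 * e, [err]))
```

A graphics compiler in this scheme would emit exactly such a linked structure, which is what makes run-time reconfiguration of parameters and topology possible.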
Abstract:
A system for the NDT testing of the integrity of composite materials and of adhesive bonds has been developed to meet industrial requirements. The vibration techniques used were found to be applicable to the development of fluid-measuring transducers. The vibrational spectra of thin rectangular bars were used for the NDT work. A machined cut in a bar had a significant effect on the spectrum, but a genuine crack gave an unambiguous response at high amplitudes: the generation of fretting crack noise at frequencies far above that of the drive. A specially designed vibrational decrement meter which, in effect, measures mechanical energy loss enabled a numerical classification of material adhesion to be obtained. This was used to study bars which had been flame- or plasma-sprayed with a variety of materials, and has become a useful tool in optimising coating methods. A direct industrial application was to classify piston rings of high-performance I.C. engines. Each consists of a cast iron ring with a channel into which molybdenum, a good bearing surface, is sprayed. The NDT classification agreed quite well with the destructive test normally used. The techniques and equipment used for the NDT work were applied to the development of the tuning-fork transducers investigated by Hassan into commercial density and viscosity devices. Using narrowly spaced, large-area tines, a thin lamina of fluid is trapped between them. It stores a large fraction of the vibrational energy which, acting as an inertial load, reduces the frequency. Magnetostrictive and piezoelectric effects, separately or in combination, enable the fork to be operated through a flange. This allows it to be used in pipeline or 'dipstick' applications. Using a different tine geometry, the viscosity loading can be predominant. This, as well as the signal decrement of the density transducer, makes a practical viscometer.
Abstract:
We propose a simplified approach to optical signal pre-distortion based on adaptive pulse shaping through unconventional use of an MZ modulator. The scheme allows natural tailoring of transmitted pulses by optimising the received pulse.
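For context, the field transfer of a chirp-free push-pull Mach-Zehnder modulator follows a cosine of the drive voltage, so pre-distortion amounts to inverting that transfer when shaping the drive. A minimal sketch of the textbook relation (not the paper's specific scheme):

```python
import math

def mzm_field(e_in, v_drive, v_pi):
    """Field transfer of a chirp-free, push-pull Mach-Zehnder modulator:
    E_out = E_in * cos(pi * V / (2 * Vpi)); optical power follows cos^2."""
    return e_in * math.cos(math.pi * v_drive / (2.0 * v_pi))

def predistort(target_field, v_pi):
    """Invert the transfer: the drive voltage that yields `target_field`
    (normalised to unit input field, target in [0, 1])."""
    return 2.0 * v_pi / math.pi * math.acos(target_field)
```

Driving the modulator with the inverted waveform reproduces the desired pulse shape at its output, which is the basic mechanism any MZM-based pulse-shaping pre-distorter exploits.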
Abstract:
Packed beds have many industrial applications and are increasingly used in the process industries due to their low pressure drop. With the introduction of more efficient packings, novel packing materials (e.g. adsorbents) and new applications (e.g. flue gas desulphurisation), the aspect ratio (height to diameter) of such beds is decreasing. Obtaining uniform gas distribution in such beds is of crucial importance in minimising operating costs and optimising plant performance. Since to some extent a packed bed acts as its own distributor, the importance of obtaining uniform gas distribution has increased as aspect ratios decrease. There is no rigorous design method for distributors due to a limited understanding of the fluid flow phenomena and in particular of the effect of the bed base / free fluid interface. This study is based on a combined theoretical and modelling approach. The starting point is the Ergun equation, which is used to determine the pressure drop over a bed where the flow is uni-directional. This equation has been applied in a vectorial form so it can be applied to maldistributed and multi-directional flows, and has been realised in the Computational Fluid Dynamics code PHOENICS. The use of this equation and its application has been verified by modelling experimental measurements of maldistributed gas flows, where there is no free fluid / bed base interface. A novel, two-dimensional experiment has been designed to investigate the fluid mechanics of maldistributed gas flows in shallow packed beds. The flow through the outlet of the duct below the bed can be controlled, permitting a rigorous investigation. The results from this apparatus provide useful insights into the fluid mechanics of flow in and around a shallow packed bed and show the critical effect of the bed base. The PHOENICS/vectorial Ergun equation model has been adapted to model this situation.
The model has been improved by the inclusion of spatial voidage variations in the bed and the prescription of a novel bed base boundary condition. This boundary condition is based on the logarithmic law for velocities near walls without restricting the velocity at the bed base to zero and is applied within a turbulence model. The flow in a curved bed section, which is three-dimensional in nature, is examined experimentally. The effect of the walls and the changes in gas direction on the gas flow are shown to be particularly significant. As before, the relative amounts of gas flowing through the bed and duct outlet can be controlled. The model and improved understanding of the underlying physical phenomena form the basis for the development of new distributors and rigorous design methods for them.
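For reference, the scalar Ergun equation combines a viscous (Blake-Kozeny) and an inertial (Burke-Plummer) term; the vectorial form used in the thesis replaces the scalar velocity with the velocity vector and its magnitude. A sketch of the scalar relation, with illustrative air-flow parameters:

```python
def ergun_pressure_gradient(u, eps, d_p, mu, rho):
    """Pressure gradient (Pa/m) for superficial velocity u (m/s) through a
    bed of voidage eps and particle diameter d_p (m); mu is the fluid
    viscosity (Pa.s) and rho its density (kg/m^3).  In the vectorial form,
    -grad(p) = A*u_vec + B*|u_vec|*u_vec with the same coefficients."""
    viscous = 150.0 * mu * (1.0 - eps) ** 2 * u / (eps ** 3 * d_p ** 2)
    inertial = 1.75 * rho * (1.0 - eps) * u ** 2 / (eps ** 3 * d_p)
    return viscous + inertial
```

Because the inertial term scales with |u|u, the pressure-drop/flow relation is nonlinear, which is why maldistributed flows cannot simply be superposed and a CFD realisation such as the PHOENICS one is needed.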
The effective use of implicit parallelism through the use of an object-oriented programming language
Abstract:
This thesis explores translating well-written sequential programs in a subset of the Eiffel programming language - without syntactic or semantic extensions - into parallelised programs for execution on a distributed architecture. The main focus is on constructing two object-oriented models: a theoretical self-contained model of concurrency which enables a simplified second model for implementing the compiling process. There is a further presentation of principles that, if followed, maximise the potential levels of parallelism. Model of Concurrency. The concurrency model is designed to be a straightforward target for mapping sequential programs onto, thus making them parallel. It aids the compilation process by providing a high level of abstraction, including a useful model of parallel behaviour which enables easy incorporation of message interchange, locking, and synchronisation of objects. Further, the model is sufficiently complete that a compiler can be, and has been, practically built. Model of Compilation. The compilation model's structure is based upon an object-oriented view of grammar descriptions and capitalises on both a recursive-descent style of processing and abstract syntax trees to perform the parsing. A composite-object view with an attribute grammar style of processing is used to extract sufficient semantic information for the parallelisation (i.e. code-generation) phase. Programming Principles. The set of principles presented are based upon information hiding, sharing and containment of objects and the dividing up of methods on the basis of a command/query division. When followed, the level of potential parallelism within the presented concurrency model is maximised. Further, these principles naturally arise from good programming practice. Summary. In summary this thesis shows that it is possible to compile well-written programs, written in a subset of Eiffel, into parallel programs without any syntactic additions or semantic alterations to Eiffel: i.e. 
no parallel primitives are added, and the parallel program is modelled to execute with equivalent semantics to the sequential version. If the programming principles are followed, a parallelised program achieves the maximum level of potential parallelisation within the concurrency model.
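The command/query division underpinning the programming principles can be illustrated in miniature (Python standing in for the Eiffel subset; the class and names are illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

class Point:
    """Command/query division: commands mutate state and must be
    serialised per object; queries are side-effect-free, so a compiler
    may safely evaluate them concurrently."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def move(self, dx, dy):      # command: mutates state
        self.x += dx
        self.y += dy

    def norm_sq(self):           # query: read-only, freely parallelisable
        return self.x * self.x + self.y * self.y

p = Point(3, 4)
with ThreadPoolExecutor() as ex:
    # Four concurrent queries on the same object - safe because
    # norm_sq neither writes state nor depends on evaluation order.
    results = list(ex.map(lambda q: q(), [p.norm_sq] * 4))
```

This is the property the thesis exploits: once methods are divided this way, the parallelism is implicit in the program rather than expressed through added primitives.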
Abstract:
The scaling problems which afflict attempts to optimise neural networks (NNs) with genetic algorithms (GAs) are disclosed. A novel GA-NN hybrid is introduced, based on the bumptree, a little-used connectionist model. As well as being computationally efficient, the bumptree is shown to be more amenable to genetic coding than other NN models. A hierarchical genetic coding scheme is developed for the bumptree and shown to have low redundancy, as well as being complete and closed with respect to the search space. When applied to optimising bumptree architectures for classification problems the GA discovers bumptrees which significantly out-perform those constructed using a standard algorithm. The fields of artificial life, control and robotics are identified as likely application areas for the evolutionary optimisation of NNs. An artificial life case-study is presented and discussed. Experiments are reported which show that the GA-bumptree is able to learn simulated pole balancing and car parking tasks using only limited environmental feedback. A simple modification of the fitness function allows the GA-bumptree to learn mappings which are multi-modal, such as robot arm inverse kinematics. The dynamics of the 'geographic speciation' selection model used by the GA-bumptree are investigated empirically and the convergence profile is introduced as an analytical tool. The relationships between the rate of genetic convergence and the phenomena of speciation, genetic drift and punctuated equilibrium are discussed. The importance of genetic linkage to GA design is discussed and two new recombination operators are introduced. The first, linkage mapped crossover (LMX) is shown to be a generalisation of existing crossover operators. LMX provides a new framework for incorporating prior knowledge into GAs. Its adaptive form, ALMX, is shown to be able to infer linkage relationships automatically during genetic search.
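One way to read a linkage-mapped crossover is as one-point crossover performed in a permuted "linkage space", so that genes known to interact stay together regardless of their positions on the chromosome. The sketch below is an illustrative reading of that idea, not the thesis's exact LMX operator:

```python
import random

def linkage_mapped_crossover(p1, p2, linkage, rng=random):
    """One-point crossover in linkage space: `linkage` is a permutation
    of gene indices; a cut is made in that ordering, and the genes that
    fall after the cut are taken from the second parent.  With the
    identity permutation this reduces to ordinary one-point crossover."""
    cut = rng.randrange(1, len(p1))
    child = list(p1)
    for pos in linkage[cut:]:
        child[pos] = p2[pos]
    return child
```

Supplying a problem-specific permutation is how prior knowledge about gene interactions enters the operator; an adaptive variant would learn that permutation during the search.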
The transformational implementation of JSD process specifications via finite automata representation
Abstract:
Conventional structured methods of software engineering are often based on the use of functional decomposition coupled with the Waterfall development process model. This approach is argued to be inadequate for coping with the evolutionary nature of large software systems. Alternative development paradigms, including the operational paradigm and the transformational paradigm, have been proposed to address the inadequacies of this conventional view of software development, and these are reviewed. JSD is presented as an example of an operational approach to software engineering, and is contrasted with other well documented examples. The thesis shows how aspects of JSD can be characterised with reference to formal language theory and automata theory. In particular, it is noted that Jackson structure diagrams are equivalent to regular expressions and can be thought of as specifying corresponding finite automata. The thesis discusses the automatic transformation of structure diagrams into finite automata using an algorithm adapted from compiler theory, and then extends the technique to deal with areas of JSD which are not strictly formalisable in terms of regular languages. In particular, an elegant and novel method for dealing with so-called recognition (or parsing) difficulties is described. Various applications of the extended technique are described. They include a new method of automatically implementing the dismemberment transformation; an efficient way of implementing inversion in languages lacking a goto-statement; and a new in-the-large implementation strategy.
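The equivalence noted above rests on the fact that the three Jackson structure-diagram constructs map directly onto the regular-expression operators. A minimal sketch, using Python's `re` engine as a stand-in for the constructed finite automaton (the process itself is a hypothetical example):

```python
import re

# Structure-diagram constructs as regular-expression combinators:
# sequence -> concatenation, selection -> alternation,
# iteration -> Kleene star.
def seq(*parts):
    return "".join(parts)

def sel(*parts):
    return "(" + "|".join(parts) + ")"

def it(part):
    return "(" + part + ")*"

# Hypothetical JSD process: a header record followed by any number of
# data records, each of which is either an 'a' or a 'b' record.
process = seq("h", it(sel("a", "b")))
```

Because the diagram is just a regular expression, the standard compiler-theory constructions (Thompson NFA, subset construction) yield the finite automaton that the thesis's transformation targets.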
Abstract:
This investigation looks critically at conventional magnetic lenses in the light of present-day technology with the aim of advancing electron microscopy in its broadest sense. By optimising the cooling arrangements and heat transfer characteristics of lens windings it was possible to increase substantially the current density in the winding, and achieve a large reduction in the size of conventional magnetic electron lenses. Following investigations into the properties of solenoidal lenses, a new type of lens with only one pole-piece was developed. The focal properties of such lenses, which differ considerably from those of conventional lenses, have been derived from a combination of mathematical models and experimentally measured axial flux density distributions. These properties can be profitably discussed with reference to "half-lenses". Miniature conventional twin pole-piece lenses and the proposed radial field single pole-piece lenses have been designed and constructed, and both types of lenses have been evaluated by constructing miniature electron optical columns. A miniature experimental transmission electron microscope (TEM), a miniature scanning electron microscope (SEM) and a scanning transmission microscope (STEM) have been built. A single pole-piece miniature one-million-volt projector lens of only 10 cm diameter and weighing 2.1 kg was designed, built and tested at 1 million volts in a commercial electron microscope. Preliminary experiments indicate that in single pole lenses it is possible to extract secondary electrons from the specimen in spite of the presence of the magnetic field of the probe-forming lens. This may well be relevant for the SEM, in which it is desirable to examine a large specimen at a moderately good resolution.
Abstract:
This thesis is concerned with the optimising of hearing protector selection. A computer model was used to estimate the reduction in noise exposure and risk of occupational deafness provided by the wearing of hearing protectors in industrial noise spectra. The model was used to show that low attenuation hearing protectors can provide greater protection than high attenuation protectors if the high attenuation protectors are not worn for the total duration of noise exposure, or are not used by a small proportion of the population. The model was also used to show that high attenuation protectors will not necessarily provide significantly greater reduction in risk than low attenuation protectors if the population has been exposed to the noise for many years prior to the provision of hearing protectors. The effects of earplugs and earmuffs on the localisation of sounds were studied to determine whether high attenuation earmuffs are likely to have greater potential than the lower attenuation earplugs for affecting personal safety. Laboratory studies and experiments at a foundry with normal-hearing office employees and noise-exposed foundrymen who had some experience of wearing hearing protectors showed that although earplugs reduced the ability of the wearer to determine the direction of warning sounds, earmuffs produced more total angular error and more confusions between left and right. It is concluded from the research findings that the key to the selection of hearing protectors is to be found in the provision of hearing protectors that can be worn for a very high percentage of the exposure time by a high percentage of the exposed population with the minimum effect on the personal safety of the wearers - the attenuation provided by the protection should be adequate but not a maximum value.
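The central finding — that nominal attenuation is wasted unless the protector is worn for almost all of the exposure — follows from energy-averaging the protected and unprotected portions of the exposure. A minimal sketch of that standard calculation for steady noise (illustrative of the effect, not the thesis's full exposure/risk model):

```python
import math

def effective_attenuation(nominal_db, worn_fraction):
    """Effective attenuation (dB) of a protector with `nominal_db` nominal
    attenuation worn for only `worn_fraction` of the exposure time to a
    steady noise: the unprotected fraction dominates the energy average."""
    protected = worn_fraction * 10.0 ** (-nominal_db / 10.0)
    unprotected = 1.0 - worn_fraction
    return -10.0 * math.log10(protected + unprotected)
```

For example, a 30 dB protector worn 90% of the time delivers only about 10 dB of effective attenuation, while a 10 dB protector worn the same fraction delivers about 7 dB, which is the quantitative basis for preferring comfortable, consistently worn protection over maximum attenuation.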
Abstract:
The development of an advanced outdoor valve requires coordinated research in the areas of light-triggered self-protecting thyristors, light triggering systems, insulation, cooling and mechanical design aspects. This thesis addresses the first two areas primarily, with a conceptual discussion of the remainder. Using the experience gained from evaluation of a prototype thyristor and computer modelling of turn-on behaviour, a light-triggered thyristor with immunity to damage from weak optical triggering and dv/dt triggering was designed, manufactured and evaluated. The optical turn-on process was investigated by measuring currents and voltages in the gate structure during turn-on, and this yielded insights not obtained through conventional measurement techniques. The mechanism by which the thyristor was immune to weak triggering damage is explained, and techniques for optimising the design of the gate structure are proposed. The most significant achievement, however, was the first demonstration of the feasibility of self-protection against forward recovery failure conditions. Furthermore, this was achieved without the need for complex structures or high levels of irradiation. The performance of the devices was limited by the inrush capability of the zones, but it is believed that this can be improved by conventional means. A light triggering system was developed using semiconductor lasers, and this incorporated several improvements over prior art in terms of optical performance and flexibility.