914 results for Development Applications


Relevance: 40.00%

Abstract:

This paper reviews the recent developments in the mechanics of superplasticity and its applications in industrial practice. After introducing the phenomena of superplasticity, the basic experiments for determining material deformation behavior and related parameters, and constructing superplastic constitutive equations, are reviewed. Finite element related formulations and techniques for simulating superplastic forming are discussed, together with some practical applications. (C) 2004 Elsevier B.V. All rights reserved.
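The strain-rate dependence at the heart of superplastic constitutive equations can be sketched with the classic power-law relation sigma = K * (strain rate)^m; the strength coefficient K and strain-rate sensitivity m below are illustrative values, not parameters from the paper.

```python
# Sketch of the classic superplastic power-law constitutive relation
# sigma = K * (strain rate)^m.  K and m are illustrative assumptions.
import math

def flow_stress(strain_rate, K=500.0, m=0.5):
    """Flow stress (MPa) at a given strain rate (1/s)."""
    return K * strain_rate ** m

def strain_rate_sensitivity(rate1, rate2, K=500.0, m=0.5):
    """Recover m as d ln(sigma) / d ln(strain rate) from two points."""
    s1, s2 = flow_stress(rate1, K, m), flow_stress(rate2, K, m)
    return math.log(s2 / s1) / math.log(rate2 / rate1)

print(flow_stress(1e-4))                    # stress at a typical superplastic rate
print(strain_rate_sensitivity(1e-4, 1e-3))  # recovers m
```

A high m (roughly 0.3 to 0.8) is what resists necking and permits the very large tensile elongations that superplastic forming exploits.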

Relevance: 40.00%

Abstract:

Background. We describe the development, reliability and applications of the Diagnostic Interview for Psychoses (DIP), a comprehensive interview schedule for psychotic disorders. Method. The DIP is intended for use by interviewers with a clinical background and was designed to occupy the middle ground between fully structured, lay-administered schedules and semi-structured, psychiatrist-administered interviews. It encompasses four main domains: (a) demographic data; (b) social functioning and disability; (c) a diagnostic module comprising symptoms, signs and past history ratings; and (d) patterns of service utilization and patient-perceived need for services. It generates diagnoses according to several sets of criteria using the OPCRIT computerized diagnostic algorithm and can be administered either on-screen or in a hard-copy format. Results. The DIP proved easy to use and was well accepted in the field. For the diagnostic module, inter-rater reliability was assessed on 20 cases rated by 24 clinicians: good reliability was demonstrated for both ICD-10 and DSM-III-R diagnoses. Seven cases were interviewed 2-11 weeks apart to determine test-retest reliability, with pairwise agreement of 0.8-1.0 for most items. Diagnostic validity was assessed in 10 cases, interviewed with the DIP and using the SCAN as 'gold standard': in nine cases clinical diagnoses were in agreement. Conclusions. The DIP is suitable for use in large-scale epidemiological studies of psychotic disorders, as well as in smaller studies where time is at a premium. While the diagnostic module stands on its own, the full DIP schedule, covering demography, social functioning and service utilization, makes it a versatile multi-purpose tool.
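The pairwise agreement figures quoted above can be made concrete with a short sketch, with Cohen's kappa included as the usual chance-corrected companion statistic; the binary ratings below are invented for illustration, not DIP data.

```python
# Pairwise agreement and Cohen's kappa for two raters scoring one item
# (e.g. a symptom as present/absent) across cases.  Ratings are invented.
from collections import Counter

def pairwise_agreement(r1, r2):
    """Proportion of cases on which the two raters agree."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(r1)
    p_o = pairwise_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[k] * c2.get(k, 0) for k in c1) / n ** 2
    return (p_o - p_e) / (1 - p_e)

rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
print(pairwise_agreement(rater_a, rater_b))   # 0.8, i.e. at the low end of the 0.8-1.0 band
```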

Relevance: 40.00%

Abstract:

The objective of the research described in this report was to observe the first ever in-situ sonochemical reaction in the NMR spectrometer in the megahertz region of ultrasound. Several reactions were investigated as potential systems for a sonochemical reaction followed by NMR spectroscopy. The primary problem to resolve when applying ultrasound to a chemical reaction is heating: ultrasound causes the liquid to move and produces 'hot spots', resulting in an increase in sample temperature. The problem was confronted by producing a device that would counteract this effect and so remove the need to account for heating. However, the design of the device limited the length of time during which it would function, while longer reaction times were required to enable observations to be carried out in the NMR spectrometer. The first and most obvious reactions attempted were those of the well-known ultrasonic dosimeter. Such a reaction would, theoretically, enable the author to simultaneously observe a reaction and determine the exact power entering the system for direct comparison of results. Unfortunately, in order to monitor the reactions in the NMR spectrometer the reactant concentrations had to be significantly increased, which resulted in a notable increase in reaction time, making the experiment too lengthy to follow in the time allocated. The Diels-Alder reaction is probably one of the most highly investigated reaction systems in the field of chemistry, and it was to this that the author turned her attention. Previous authors have carried out ultrasonic investigations, with considerable success, of the reaction of anthracene with maleic anhydride, and it was this reaction that was next attempted. The first ever sonochemically enhanced reaction using a frequency of ultrasound in the megahertz (MHz) region was successfully carried out as bench experiments.
Due to the complexity of the component reactants, the product precipitated from the solution; because the reaction could only be monitored by product formation, it was not possible to observe this reaction in the NMR spectrometer. The solvolysis of 2-chloro-2-methylpropane was examined in various solvent systems, the most suitable of which was determined to be aqueous 2-methylpropan-2-ol. The experiment was successfully enhanced by the application of ultrasound and monitored in-situ in the NMR spectrometer. Product formation in the ultrasonic reaction exceeded that of the traditional thermal reaction, with a 1.4- to 2.9-fold improvement depending on the reaction conditions investigated. An investigation into the effect of sonication upon a large biological molecule, in this case aqueous lysozyme, was also carried out. An easily observed effect upon the sample was noted, but no explanation for the observed effects could be established.
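The reported 1.4- to 2.9-fold enhancement can be illustrated by comparing conversions of a sonicated and a silent first-order run (solvolysis of 2-chloro-2-methylpropane follows first-order kinetics); the rate constants and run time below are assumed for illustration, not the thesis's measured values.

```python
# Fold enhancement in product formation: sonicated vs. silent
# first-order solvolysis.  Both rate constants are assumed values.
import math

def conversion(k, t):
    """Fractional conversion of a first-order reaction after time t (s)."""
    return 1.0 - math.exp(-k * t)

k_thermal = 1.0e-4   # 1/s, assumed silent (thermal) rate constant
k_sonic = 2.0e-4     # 1/s, assumed rate under MHz ultrasound

t = 600.0            # a 10-minute run
enhancement = conversion(k_sonic, t) / conversion(k_thermal, t)
print(round(enhancement, 2))   # ~1.9, inside the reported 1.4-2.9 range
```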

Relevance: 40.00%

Abstract:

Software development methodologies are becoming increasingly abstract, progressing from low level assembly and implementation languages such as C and Ada, to component based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model driven approaches emphasise the role of higher level models and notations, and embody a process of automatically deriving lower level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component based software development and game development technologies in order to define novel component technologies for the description of data driven component based applications. The thesis makes explicit contributions to the fields of component based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. 
The thesis also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
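The idea of content data driving runtime behaviour can be sketched as a loader that instantiates and configures components from an XML description rather than from hard-coded calls; the component classes and XML schema here are hypothetical illustrations, not the Fluid project's actual technology.

```python
# Data-driven component assembly: an XML document (the "empowered data")
# names components and parameters, and a loader builds them at runtime.
# Component names and schema are hypothetical.
import xml.etree.ElementTree as ET

class Emitter:
    def __init__(self, rate): self.rate = int(rate)
    def describe(self): return f"Emitter(rate={self.rate})"

class Renderer:
    def __init__(self, mode): self.mode = mode
    def describe(self): return f"Renderer(mode={self.mode})"

REGISTRY = {"Emitter": Emitter, "Renderer": Renderer}

def load_components(xml_text):
    """Build components from content data rather than hard-coded calls."""
    root = ET.fromstring(xml_text)
    return [REGISTRY[c.get("type")](**{a.tag: a.text for a in c})
            for c in root.findall("component")]

scene = """
<scene>
  <component type="Emitter"><rate>30</rate></component>
  <component type="Renderer"><mode>wireframe</mode></component>
</scene>
"""
for comp in load_components(scene):
    print(comp.describe())
```

Changing the XML, not the code, changes what the application assembles at runtime, which is the essence of the data-driven approach described above.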

Relevance: 40.00%

Abstract:

The Teallach project has adapted model-based user-interface development techniques to the systematic creation of user-interfaces for object-oriented database applications. Model-based approaches aim to provide designers with a more principled approach to user-interface development using a variety of underlying models, and tools which manipulate these models. Here we present the results of the Teallach project, describing the tools developed and the flexible design method supported. Distinctive features of the Teallach system include provision of database-specific constructs, comprehensive facilities for relating the different models, and support for a flexible design method in which models can be constructed and related by designers in different orders and in different ways, to suit their particular design rationales. The system then creates the desired user-interface as an independent, fully functional Java application, with automatically generated help facilities.
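The model-based idea, deriving a concrete interface from declarative models rather than hand-coding it, can be caricatured in a few lines; the model format and widget names below are hypothetical, and Teallach itself generates full Java applications from much richer task, domain and presentation models.

```python
# Toy model-based UI generation: a declarative {attribute: type} model is
# mapped to widgets by a generator instead of hand-coded layout.
# Widget names and the model format are hypothetical illustrations.
FIELD_WIDGETS = {"string": "TextField", "int": "SpinBox", "bool": "CheckBox"}

def generate_ui(model):
    """Derive one widget per model attribute from a {name: type} mapping."""
    return [f"{FIELD_WIDGETS[t]}(label={name!r})" for name, t in model.items()]

person = {"name": "string", "age": "int", "member": "bool"}
for w in generate_ui(person):
    print(w)
```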

Relevance: 40.00%

Abstract:

Bio-impedance analysis (BIA) provides a rapid, non-invasive technique for body composition estimation. BIA offers a convenient alternative to standard techniques such as MRI, CT or DEXA scans for selected types of body composition analysis. The accuracy of BIA is limited because it is an indirect method of composition analysis: it relies on linear relationships between measured impedance and morphological parameters such as height and weight to derive estimates. To overcome these underlying limitations of BIA, a multi-frequency segmental bio-impedance device was constructed through a series of iterative enhancements and improvements of existing BIA instrumentation. Key features of the design included an easy-to-construct current source and a compact PCB design. The final device was trialled with 22 human volunteers, and measured impedance was compared against body composition estimates obtained by DEXA scan. This enabled the development of newer techniques for making BIA predictions. To add a 'visual aspect' to BIA, volunteers were scanned in 3D using an inexpensive structured-light device (an Xbox Kinect controller), and 3D volumes of their limbs were compared with BIA measurements to further improve BIA predictions. A three-stage digital filtering scheme was also implemented to enable extraction of heart-rate data from recorded bio-electrical signals. Additionally, modifications were introduced to measure changes in bio-impedance with motion, which could be adapted to further improve the accuracy and veracity of limb composition analysis. The findings in this thesis aim to give new direction to the prediction of body composition using BIA. The design development and refinement applied to BIA in this research programme suggest new opportunities to enhance the accuracy and clinical utility of BIA for the prediction of body composition.
In particular, the use of bio-impedance to predict limb volumes would provide an additional metric for body composition measurement and help distinguish between fat and muscle content.
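The 'linear relationships' underpinning standard BIA can be sketched with the common regression form FFM ≈ a·H²/Z + b·W + c; the coefficients below are illustrative placeholders, not values fitted against the DEXA data in this work.

```python
# Indirect BIA estimate: fat-free mass (FFM) from the impedance index
# height^2 / Z plus weight.  Coefficients a, b, c are assumed placeholders.
def fat_free_mass(height_cm, impedance_ohm, weight_kg,
                  a=0.5, b=0.2, c=5.0):
    """FFM (kg) ~ a * H^2/Z + b * W + c."""
    index = height_cm ** 2 / impedance_ohm
    return a * index + b * weight_kg + c

weight = 70.0
ffm = fat_free_mass(height_cm=175.0, impedance_ohm=500.0, weight_kg=weight)
fat_pct = 100.0 * (weight - ffm) / weight
print(round(ffm, 1), round(fat_pct, 1))   # FFM in kg, body fat in percent
```

The indirectness the abstract mentions is visible here: everything hinges on how well the fitted coefficients generalise, which is why agency-of-record comparisons against DEXA are needed.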

Relevance: 40.00%

Abstract:

The successful design of polymers for contact lens applications depends on the ability to provide a balance of properties appropriate to the ocular environment. The principal relevant aspects of the anterior eye are the tear film, eyelid and cornea, which govern the requirements for surface properties, modulus and oxygen permeability, respectively. Permeability requirements and the developing view of the needs of the cornea, in terms of oxygen consumption, are discussed, together with the particular roles of fluorine and silicon in the design of silicone hydrogels, which have proved to be the most successful family of materials for this demanding application. The contact lens field is complicated by the fact that contact lenses are used in a range of wear modalities, the extremes of which can conveniently be classified as lenses that are disposed of at the end of a single period of daily wear and those used for 30 days of successive day-and-night periods, frequently referred to as extended or continuous wear. In the decade following the launch of silicone hydrogels, there has been a progressive trend in properties taking both modulus and water content closer to those of conventional hydrogels. This is particularly evident in the family of daily disposable contact lenses that have appeared since 2008.
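The permeability requirement discussed above is usually expressed as oxygen transmissibility Dk/t, the material permeability Dk divided by the lens centre thickness t; the Dk and thickness values below are illustrative of typical published materials, not figures from this paper.

```python
# Oxygen transmissibility Dk/t: permeability Dk over centre thickness t.
# Values are illustrative, in the customary contact-lens convention
# (Dk in barrer, t in cm).
def transmissibility(dk, thickness_cm):
    """Dk/t for a lens of given centre thickness."""
    return dk / thickness_cm

conventional = transmissibility(dk=28.0, thickness_cm=0.009)   # conventional hydrogel
silicone = transmissibility(dk=140.0, thickness_cm=0.008)      # silicone hydrogel
print(round(conventional), round(silicone))
```

The order-of-magnitude gap in Dk/t is why silicone hydrogels, not conventional hydrogels, made day-and-night extended wear plausible.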

Relevance: 40.00%

Abstract:

Modern power networks incorporate communications and information technology infrastructure into the electrical power system to create a smart grid in terms of control and operation. The smart grid enables real-time communication and control between consumers and utility companies, allowing suppliers to optimize energy usage based on price preference and system technical issues. The smart grid design aims to provide overall power system monitoring and to create protection and control strategies that maintain system performance, stability and security. This dissertation contributed to the development of a unique and novel smart grid test-bed laboratory with integrated monitoring, protection and control systems. This test-bed was used as a platform to test the smart grid operational ideas developed here; its implementation in real-time software creates an environment for studying, implementing and verifying the novel control and protection schemes developed in this dissertation. Phasor measurement techniques were developed using the available data acquisition (DAQ) devices in order to monitor all points in the power system in real time. This provides a practical view of system parameter changes, abnormal conditions, and stability and security information, and supplies valuable measurements to power system operators in energy control centers. Phasor measurement technology is an excellent solution for improving system planning, operation and energy trading, in addition to enabling advanced applications in Wide Area Monitoring, Protection and Control (WAMPAC). Moreover, a virtual protection system with integrated functionality for wide-area applications was developed and implemented in the smart grid laboratory. Experiments and procedures were developed to detect abnormal system conditions and apply proper remedies to heal the system.
A DC microgrid was designed and integrated with the AC system with appropriate control capability. This system represents realistic hybrid AC/DC microgrid connectivity to the AC side, enabling study of such an architecture in system operation to help remedy abnormal conditions. In addition, this dissertation explored the challenges and feasibility of implementing real-time system analysis features to monitor system security and stability measures. These indices were measured experimentally during the operation of the developed hybrid AC/DC microgrids. Furthermore, a real-time optimal power flow system was implemented to optimally manage power sharing between AC generators and DC-side resources. A study of a real-time energy management algorithm in hybrid microgrids was performed to evaluate the effects of using energy storage resources and their use in mitigating heavy load impacts on system stability and operational security.
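The phasor measurement idea can be sketched as a single-frequency DFT over one cycle of samples, recovering the magnitude and phase angle a PMU would report; the signal parameters are illustrative, and this ignores the filtering and time synchronization a real PMU requires.

```python
# Single-frequency phasor estimation: correlate one cycle of samples with
# the fundamental to recover an RMS magnitude and phase.  Parameters are
# illustrative.
import math, cmath

def estimate_phasor(samples, n):
    """DFT at the fundamental over one cycle of n samples -> complex phasor."""
    acc = sum(samples[k] * cmath.exp(-2j * math.pi * k / n) for k in range(n))
    return acc * math.sqrt(2) / n   # scale to an RMS phasor

f, fs = 60.0, 1920.0               # 60 Hz sampled at 32 samples/cycle
n = int(fs / f)
amp, phase = 170.0, math.pi / 6    # peak volts, 30-degree phase
wave = [amp * math.cos(2 * math.pi * f * k / fs + phase) for k in range(n)]

ph = estimate_phasor(wave, n)
print(round(abs(ph), 2), round(math.degrees(cmath.phase(ph)), 2))  # RMS magnitude, phase (deg)
```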

Relevance: 40.00%

Abstract:

In 2010, the American Association of State Highway and Transportation Officials (AASHTO) released a safety analysis software system known as SafetyAnalyst. SafetyAnalyst implements the empirical Bayes (EB) method, which requires the use of Safety Performance Functions (SPFs). The system is equipped with a set of national default SPFs, and the software calibrates the default SPFs to represent the agency's safety performance. However, it is recommended that agencies generate agency-specific SPFs whenever possible. Many investigators support the view that the agency-specific SPFs represent the agency data better than the national default SPFs calibrated to agency data. Furthermore, it is believed that the crash trends in Florida are different from the states whose data were used to develop the national default SPFs. In this dissertation, Florida-specific SPFs were developed using the 2008 Roadway Characteristics Inventory (RCI) data and crash and traffic data from 2007-2010 for both total and fatal and injury (FI) crashes. The data were randomly divided into two sets, one for calibration (70% of the data) and another for validation (30% of the data). The negative binomial (NB) model was used to develop the Florida-specific SPFs for each of the subtypes of roadway segments, intersections and ramps, using the calibration data. Statistical goodness-of-fit tests were performed on the calibrated models, which were then validated using the validation data set. The results were compared in order to assess the transferability of the Florida-specific SPF models. The default SafetyAnalyst SPFs were calibrated to Florida data by adjusting the national default SPFs with local calibration factors. The performance of the Florida-specific SPFs and SafetyAnalyst default SPFs calibrated to Florida data were then compared using a number of methods, including visual plots and statistical goodness-of-fit tests. 
The plots of SPFs against the observed crash data were used to compare the prediction performance of the two models. Three goodness-of-fit tests, represented by the mean absolute deviance (MAD), the mean square prediction error (MSPE), and Freeman-Tukey R2 (R2FT), were also used for comparison in order to identify the better-fitting model. The results showed that Florida-specific SPFs yielded better prediction performance than the national default SPFs calibrated to Florida data. The performance of Florida-specific SPFs was further compared with that of the full SPFs, which include both traffic and geometric variables, in two major applications of SPFs, i.e., crash prediction and identification of high crash locations. The results showed that both SPF models yielded very similar performance in both applications. These empirical results support the use of the flow-only SPF models adopted in SafetyAnalyst, which require much less effort to develop compared to full SPFs.
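The calibration and goodness-of-fit comparison described above can be sketched in a few lines: a flow-only SPF, a single calibration factor C = Σobserved/Σpredicted as used by SafetyAnalyst, and MAD/MSPE scores. The SPF coefficients and site data are illustrative, not Florida values.

```python
# Calibrating a default flow-only SPF with one factor C and scoring the
# calibrated predictions with MAD and MSPE.  Coefficients and data are
# illustrative placeholders.
import math

def spf(aadt, length_mi, a=-7.0, b=0.9):
    """Flow-only SPF: predicted crashes/yr = exp(a) * AADT^b * length."""
    return math.exp(a) * aadt ** b * length_mi

def calibration_factor(observed, predicted):
    """C = sum(observed) / sum(predicted)."""
    return sum(observed) / sum(predicted)

sites = [(12000, 1.0), (8000, 0.5), (20000, 2.0)]   # (AADT, segment length)
observed = [6.0, 2.0, 16.0]                          # observed crashes/yr
predicted = [spf(aadt, ln) for aadt, ln in sites]

C = calibration_factor(observed, predicted)
calibrated = [C * p for p in predicted]
mad = sum(abs(o - p) for o, p in zip(observed, calibrated)) / len(observed)
mspe = sum((o - p) ** 2 for o, p in zip(observed, calibrated)) / len(observed)
print(round(C, 3), round(mad, 3), round(mspe, 3))
```

By construction the calibrated predictions sum to the observed total; the agency-specific alternative compared in the dissertation instead refits the coefficients by negative binomial regression on local data.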

Relevance: 40.00%

Abstract:

Capillary electrophoresis (CE) is a modern analytical technique in which electrokinetic separation, generated by high voltage, takes place inside small capillaries. In this dissertation, several advanced capillary electrophoresis methods are presented using different CE approaches, with UV and mass spectrometry as the detection methods. Capillary electrochromatography (CEC), one of the CE modes, is a recently developed technique that is a hybrid of capillary electrophoresis and high performance liquid chromatography (HPLC), exhibiting advantages of both techniques. In Chapter 2, a monolithic capillary column is fabricated using an in-situ photoinitiated polymerization method; the column was then applied to the separation of six antidepressant compounds. Meanwhile, a simple chiral separation method is developed and presented in Chapter 3. Beta-cyclodextrin was utilized to achieve chiral separation: not only were twelve cathinone analytes separated, but isomers of several analytes were also enantiomerically separated. To better capture molecular information on the analytes, a TOF-MS system was coupled with the CE. A sheath liquid and a partial filling technique (PFT) were employed to reduce contamination of the MS ionization source, and accurate molecular information was obtained. It is necessary to propose, develop and optimize new techniques suitable for trace-level analysis of samples in forensic, pharmaceutical and environmental applications. Capillary electrophoresis was selected for this task, as it requires lower amounts of sample, simplifies sample preparation, and has the flexibility to perform separations of neutral and charged molecules as well as enantiomers. Overall, the study demonstrates the versatility of capillary electrophoresis methods in forensic, pharmaceutical and environmental applications.
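The basic CE arithmetic behind such separations is the apparent electrophoretic mobility, computed from the capillary lengths, applied voltage and migration time; the run parameters below are illustrative, not values from the dissertation.

```python
# Apparent electrophoretic mobility from a CE run:
# mu_app = (Ld * Lt) / (V * t), with Ld the length to the detector,
# Lt the total capillary length.  Run parameters are illustrative.
def apparent_mobility(l_det_cm, l_tot_cm, voltage_v, t_mig_s):
    """mu_app in cm^2 / (V * s)."""
    return (l_det_cm * l_tot_cm) / (voltage_v * t_mig_s)

mu = apparent_mobility(l_det_cm=50.0, l_tot_cm=60.0,
                       voltage_v=25000.0, t_mig_s=300.0)
print(f"{mu:.2e}")   # cm^2 V^-1 s^-1
```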

Relevance: 40.00%

Abstract:

The main focus of this research is to design and develop a high performance linear actuator based on a four bar mechanism. The present work includes the detailed analysis (kinematics and dynamics), design, implementation and experimental validation of the newly designed actuator. High performance is characterized by the acceleration of the actuator end effector. The principle of the newly designed actuator is to network the four bar rhombus configuration (where some bars are extended to form an X shape) to attain high acceleration. First, a detailed kinematic analysis of the actuator is presented and its kinematic performance is evaluated through MATLAB simulations. The dynamic equation of the actuator is derived using the Lagrangian formulation, and a SIMULINK control model of the actuator is developed from it. In addition, a Bond Graph methodology is presented for the dynamic simulation; the Bond Graph model comprises individual component models of the actuator along with its control, and the required torque was simulated using this model. Results indicate that high acceleration (around 20g) can be achieved with a modest (3 N-m or less) torque input. A practical prototype of the actuator was designed in SOLIDWORKS and then produced to verify the proof of concept. The design goal was to achieve a peak acceleration of more than 10g at the middle of the travel when the end effector traverses the stroke length (around 1 m). The actuator is primarily designed to operate standalone, and later to be used in a 3RPR parallel robot. A DC motor drives the actuator, with a quadrature encoder attached to the motor to control the end effector. The associated control scheme is analyzed and integrated with the physical prototype. In standalone experiments, around 17g acceleration was achieved by the end effector (over a stroke from 0.2 m to 0.78 m).
Results indicate that the developed dynamic model is in good agreement with the experimental results. Finally, a Design of Experiments (DOE) based statistical approach is introduced to identify the parametric combination that yields the greatest performance, with data collected using the Bond Graph model. This approach helps in designing the actuator without much complexity.
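As a rough sketch of why a rhombus (X-shaped) linkage can yield high end-effector acceleration: assuming equal bars of length L and a drive half-angle theta, one diagonal is x = 2L·cos(theta), so at constant drive speed w the acceleration magnitude is 2L·w²·cos(theta). This simplified geometry and all parameter values are illustrative assumptions, not the kinematic model derived in the thesis.

```python
# Rhombus-linkage sketch: end-effector travel along one diagonal,
# x = 2 * L * cos(theta), driven at constant angular speed w, giving
# acceleration -2 * L * w**2 * cos(theta).  L, w, theta are assumed values.
import math

def position(L, theta):
    """End-effector travel (m) along the diagonal."""
    return 2.0 * L * math.cos(theta)

def acceleration(L, w, theta):
    """End-effector acceleration (m/s^2) at constant drive speed w (rad/s)."""
    return -2.0 * L * w ** 2 * math.cos(theta)

L, w = 0.5, 10.0
theta = math.pi / 3
a_g = abs(acceleration(L, w, theta)) / 9.81
print(round(position(L, theta), 3), round(a_g, 2))   # travel (m), acceleration (g)
```

Even this toy model shows the design lever: acceleration scales with w², so modest drive speeds and bar lengths already reach multi-g end-effector accelerations.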