896 results for Automated proof
Abstract:
Measures of icon designs rely heavily on surveys of the perceptions of population samples, so measuring the extent to which changes in the structure of an icon alter its perceived complexity can be costly and slow. An automated system capable of producing reliable estimates of perceived complexity could reduce development costs and time. Measures of icon complexity developed by Garcia, Badre, and Stasko (1994) and McDougall, Curry, and de Bruijn (1999) were correlated with six icon properties measured using Matlab (MathWorks, 2001) image-processing routines. The six properties were icon foreground, the number of objects in an icon, the number of holes in those objects, and two calculations of icon edges and of homogeneity in icon structure. The strongest correlates with human judgments of perceived icon complexity (McDougall et al., 1999) were structural variability (r(s) = .65) and edge information (r(s) = .64).
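The image-based properties named above (foreground proportion, object count, hole count, edge information) can be approximated without Matlab. The sketch below works on a small binary icon grid in plain Python; it is an illustrative reconstruction under simplified metric definitions, not the code used in the study.

```python
from collections import deque

def count_components(grid, value, exclude_border=False):
    """Count 4-connected components of cells equal to `value`.

    With exclude_border=True, components touching the grid border are
    skipped, so counting 0-components gives the holes enclosed by objects.
    """
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    components = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != value or seen[r][c]:
                continue
            touches_border = False
            queue = deque([(r, c)])
            seen[r][c] = True
            while queue:                      # flood fill one component
                y, x = queue.popleft()
                if y in (0, rows - 1) or x in (0, cols - 1):
                    touches_border = True
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < rows and 0 <= nx < cols \
                            and grid[ny][nx] == value and not seen[ny][nx]:
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if not (exclude_border and touches_border):
                components += 1
    return components

def icon_metrics(grid):
    """Simplified analogues of the icon properties: foreground proportion,
    object count, hole count, and edge information as the number of
    foreground/background transitions between neighbouring pixels."""
    rows, cols = len(grid), len(grid[0])
    foreground = sum(map(sum, grid)) / (rows * cols)
    objects = count_components(grid, 1)
    holes = count_components(grid, 0, exclude_border=True)
    edges = sum(grid[r][c] != grid[r][c + 1]
                for r in range(rows) for c in range(cols - 1))
    edges += sum(grid[r][c] != grid[r + 1][c]
                 for r in range(rows - 1) for c in range(cols))
    return {"foreground": foreground, "objects": objects,
            "holes": holes, "edges": edges}

# A 5x5 "ring" icon: one object enclosing one hole.
icon = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 0, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
print(icon_metrics(icon))
```

The homogeneity/structural-variability measure from the abstract is not reconstructed here, as the abstract does not specify how it was computed.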
Abstract:
OBJECTIVE: To assess whether the impedance cardiogram recorded by an automated external defibrillator during cardiac arrest can facilitate emergency care by lay persons. Lay persons are poor at emergency pulse checks (sensitivity 84%, specificity 36%), and guidelines recommend that they should not be performed. The impedance cardiogram (dZ/dt) is used to indicate stroke volume. Can an impedance cardiogram algorithm in a defibrillator rapidly determine circulatory arrest and facilitate prompt initiation of external cardiac massage?
DESIGN: Clinical study.
SETTING: University hospital.
PATIENTS: Phase 1 patients attended for myocardial perfusion imaging. Phase 2 patients were recruited during cardiac arrest. This group included nonarrest controls.
INTERVENTIONS: The impedance cardiogram was recorded through defibrillator/electrocardiographic pads oriented in the standard cardiac arrest position.
MEASUREMENTS AND MAIN RESULTS: Phase 1: Stroke volumes from gated myocardial perfusion imaging scans were correlated with parameters from the impedance cardiogram system (dZ/dt(max) and the peak amplitude of the Fast Fourier Transform of dZ/dt between 1.5 Hz and 4.5 Hz). Multivariate analysis was performed to fit stroke volumes from gated myocardial perfusion imaging scans with linear and quadratic terms for dZ/dt(max) and the Fast Fourier Transform to identify significant parameters for incorporation into a cardiac arrest diagnostic algorithm. The square of the peak amplitude of the Fast Fourier Transform of dZ/dt was the best predictor of reduction in stroke volumes from gated myocardial perfusion imaging scans (range = 33-85 mL; p = .016). Having established that the two pad impedance cardiogram system could detect differences in stroke volumes from gated myocardial perfusion imaging scans, we assessed its performance in diagnosing cardiac arrest. Phase 2: The impedance cardiogram was recorded in 132 "cardiac arrest" patients (53 training, 79 validation) and 97 controls (47 training, 50 validation): the diagnostic algorithm indicated cardiac arrest with sensitivities and specificities (+/- exact 95% confidence intervals) of 89.1% (85.4-92.1) and 99.6% (99.4-99.7; training) and 81.1% (77.6-84.3) and 97% (96.7-97.4; validation).
CONCLUSIONS: The impedance cardiogram algorithm is a significant marker of circulatory collapse. Automated defibrillators with an integrated impedance cardiogram could improve emergency care by lay persons, enabling rapid and appropriate initiation of external cardiac massage.
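The band-limited spectral feature used in Phase 1 (the peak amplitude of the Fast Fourier Transform of dZ/dt between 1.5 Hz and 4.5 Hz, then squared) can be sketched as follows. This is an illustrative reconstruction in plain Python; the sampling rate, recording length, and synthetic signal are assumptions, not details from the study.

```python
import cmath
import math

def fft_band_peak(dzdt, fs, f_lo=1.5, f_hi=4.5):
    """Peak single-sided spectral amplitude of dZ/dt within [f_lo, f_hi] Hz.

    A direct DFT evaluated only over the band of interest; equivalent to
    taking the FFT and reading off the largest magnitude in the band.
    """
    n = len(dzdt)
    mean = sum(dzdt) / n
    x = [v - mean for v in dzdt]          # remove the DC offset
    peak = 0.0
    for k in range(1, n // 2 + 1):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            coeff = sum(x[j] * cmath.exp(-2j * math.pi * k * j / n)
                        for j in range(n))
            peak = max(peak, 2.0 * abs(coeff) / n)
    return peak

# Synthetic 10 s recording at 100 Hz: a 2 Hz "cardiac" component inside the
# band plus a 0.3 Hz respiratory drift outside it.
fs, n = 100.0, 1000
dzdt = [0.8 * math.sin(2 * math.pi * 2.0 * j / fs)
        + 0.5 * math.sin(2 * math.pi * 0.3 * j / fs) for j in range(n)]

peak = fft_band_peak(dzdt, fs)
predictor = peak ** 2    # the squared band peak used as the stroke-volume predictor
```

In this synthetic example the band peak recovers the 0.8-amplitude cardiac component while ignoring the slower respiratory drift; a flat (arrested) signal would yield a near-zero predictor.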
Abstract:
Details are presented of the DAC (DSP ASIC Compiler) silicon compiler framework. DAC allows a non-specialist to automatically design DSP ASICs and DSP ASIC cores directly from a high-level specification. Typical designs take only a few minutes, and the resulting layouts are comparable in area and performance to handcrafted designs.
Abstract:
The need to account for the effect of design decisions on manufacture, and for the impact of manufacturing cost on the life cycle cost of any product, is well established. In this context, digital design and manufacturing solutions have to be further developed to facilitate and automate the integration of cost as one of the major drivers in product life cycle management. This article presents an integration methodology for implementing cost estimation capability within a digital manufacturing environment. A digital manufacturing structure of knowledge databases is set out, and an ontology of assembly and part costing consistent with that structure is provided. Although the methodology is currently used for recurring cost prediction, it can equally be applied to other functional developments, such as process planning. A prototype tool is developed to integrate both assembly time costs and parts manufacturing costs within the same digital environment. An industrial example is used to validate this approach.
Abstract:
This paper presents a robust finite element procedure for modelling the behaviour of postbuckling structures undergoing mode-jumping. Current non-linear implicit finite element solution schemes, found in most finite element codes, are discussed and their shortcomings highlighted. A more effective strategy is presented which combines a quasi-static and a pseudo-transient routine for modelling this behaviour. The switching between these two schemes is fully automated and therefore eliminates the need for user intervention during the solution process. The quasi-static response is modelled using the arc-length constraint, while the pseudo-transient routine uses a modified explicit dynamic routine, which is more computationally efficient than standard implicit and explicit dynamic schemes. The strategies for switching between the quasi-static and pseudo-transient routines are presented.
Abstract:
Introducing automation into a managed environment involves significant initial overhead and abstraction, creating a disconnect between the administrator and the system. To ease the transition to automated management, this paper proposes an approach in which automation increases gradually, gathering data from the task deployment process. The stored data is analysed to determine the task outcome status and can then be compared against future deployments of the same task, alerting the administrator to deviations from the expected outcome. Using a machine-learning approach, the automation tool can learn from the administrator's reaction to task failures and eventually react to faults autonomously.
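As a minimal illustration of this gradual approach, the sketch below records outcome data from past deployments of a task and flags later runs that deviate from the learned baseline. It is a hypothetical z-score baseline check, not the paper's actual machine-learning method; the chosen feature (run duration) and the threshold are assumptions.

```python
import statistics

class DeploymentMonitor:
    """Learn what a 'normal' deployment of a task looks like from past
    runs, then flag later runs that deviate from that baseline."""

    def __init__(self, threshold=3.0):
        self.history = []          # (duration_s, exit_code) of past runs
        self.threshold = threshold # z-score beyond which a run is flagged

    def record(self, duration_s, exit_code):
        """Store the outcome of a completed deployment."""
        self.history.append((duration_s, exit_code))

    def is_deviation(self, duration_s, exit_code):
        """True if this run deviates from the expected outcome."""
        if exit_code != 0:
            return True            # a hard failure is always a deviation
        durations = [d for d, code in self.history if code == 0]
        if len(durations) < 3:
            return False           # not enough data to judge yet
        mean = statistics.mean(durations)
        stdev = statistics.stdev(durations) or 1e-9
        return abs(duration_s - mean) / stdev > self.threshold

monitor = DeploymentMonitor()
for duration in (10.2, 9.8, 10.5, 10.1):   # four normal past runs
    monitor.record(duration, 0)

print(monitor.is_deviation(10.3, 0))   # similar duration: not flagged
print(monitor.is_deviation(65.0, 0))   # far outside learned range: flagged
```

A real system would track richer features (log output, resource usage, changed files) and, as the abstract suggests, also learn from how the administrator responds to each flagged deviation.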
Abstract:
Modern business practices in engineering are increasingly turning to post-manufacture service provision in an attempt to generate additional revenue streams and ensure commercial sustainability. Maintainability has always been a consideration during the design process, but in the past it has generally been considered of tertiary importance, behind manufacturability and primary product function, in terms of design priorities. The need to draw whole-life considerations into concurrent engineering (CE) practice has encouraged companies to address issues such as maintenance earlier in the design process, giving equal importance to all aspects of the product lifecycle. The consideration of design for maintainability (DFM) early in the design process has the potential to significantly reduce maintenance costs and improve overall running efficiencies as well as safety levels. However, a lack of simulation tools still hinders the adaptation of CE to include practical elements of design, and further research is therefore required to develop methods by which 'hands on' activities such as maintenance can be fully assessed and optimised as concepts develop. Virtual Reality (VR) has the potential to address this issue, but these traditionally high-cost systems can require complex infrastructure, and their use has typically focused on aesthetic aspects of mature designs. This paper examines the application of cost-effective VR technology to the rapid assessment of aircraft interior inspection during conceptual design. It focuses on the integration of VR hardware with a typical desktop engineering system and examines the challenges of data transfer, graphics quality, and the development of practical user functions within the VR environment.
Conclusions drawn to date indicate that the system has the potential to improve maintenance planning by providing a usable environment for inspection that is available as soon as preliminary structural models are generated as part of the conceptual design process. Challenges remain in the efficient transfer of data between the CAD and VR environments, as well as in quantifying any benefits of the proposed approach. The results of this research will help to improve product maintainability, reduce product development cycle times, and lower maintenance costs.
Abstract:
A novel multiplexed immunoassay for the analysis of phycotoxins in shellfish samples has been developed. To this end, a regenerable chemiluminescence (CL) microarray was established that automatically analyzes three different phycotoxins (domoic acid (DA), okadaic acid (OA), and saxitoxin (STX)) in parallel on the MCR3 analysis platform. An indirect competitive immunoassay was used as the test format. The phycotoxins were directly immobilized on an epoxy-activated PEG chip surface, and parallel analysis was enabled by the simultaneous addition of all analytes and specific antibodies on one microarray chip. After the competitive reaction, the CL signal was recorded by a CCD camera. Because the toxin microarray can be regenerated, internal calibrations of the phycotoxins were performed in parallel on the same microarray chip, which was suitable for 25 consecutive measurements. Multi-analyte calibration curves were generated for the three target phycotoxins. In extracted shellfish matrix, the determined LODs for DA, OA, and STX, at 0.5±0.3 µg L(-1), 1.0±0.6 µg L(-1), and 0.4±0.2 µg L(-1), were slightly lower than in PBS buffer. For the determination of toxin recoveries, the signal loss observed on regeneration was corrected. After applying these mathematical corrections, spiked shellfish samples were quantified in 20 min, with recoveries for DA, OA, and STX of 86.2%, 102.5%, and 61.6%, respectively. This is the first demonstration of an antibody-based phycotoxin microarray.
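Indirect competitive immunoassays of this kind are commonly calibrated with a four-parameter logistic (4PL) curve, in which the signal falls as analyte concentration rises. The sketch below shows the forward and inverse 4PL with made-up parameter values; the abstract does not state the actual calibration model or its constants, so this is a generic illustration only.

```python
def four_pl(x, a, b, c, d):
    """Four-parameter logistic: CL signal as a function of toxin
    concentration x. a = signal at zero toxin, d = background signal,
    c = midpoint (IC50), b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inverse_four_pl(y, a, b, c, d):
    """Back-calculate toxin concentration from a measured CL signal y."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

# Hypothetical curve parameters (illustrative only, not from the paper).
a, b, c, d = 1000.0, 1.2, 2.0, 50.0

signal = four_pl(0.5, a, b, c, d)            # signal at 0.5 ug/L toxin
conc = inverse_four_pl(signal, a, b, c, d)   # round-trip back to 0.5 ug/L
```

In practice the four parameters would be fitted to the on-chip calibration measurements for each toxin, and the regeneration-related signal loss corrected before back-calculation.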
Abstract:
Pressure myography studies have played a crucial role in our understanding of vascular physiology and pathophysiology. Such studies depend upon the reliable measurement of changes in the diameter of isolated vessel segments over time. Although several software packages are available to carry out such measurements on small arteries and veins, no such software exists to study smaller vessels (<50 µm in diameter). We provide here a new, freely available open-source algorithm, MyoTracker, to measure and track changes in the diameter of small isolated retinal arterioles. The program has been developed as an ImageJ plug-in and uses a combination of cost analysis and edge enhancement to detect the vessel walls. In tests performed on a dataset of 102 images, automatic measurements were found to be comparable to manual ones. The program was also able to track both fast and slow constrictions and dilations during intraluminal pressure changes and following the application of several drugs. Variability in automated measurements during video analysis, and processing times, were also investigated and are reported. MyoTracker is a new software tool to assist during pressure myography experiments on small isolated retinal arterioles. It provides fast and accurate measurements with low levels of noise and works with both individual images and videos. Although the program was developed to work with small arterioles, it is also capable of tracking the walls of other types of microvessels, including venules and capillaries. It also works well with larger arteries, and may therefore provide an alternative to other packages developed for larger vessels when its features are considered advantageous.
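The core wall-detection idea, locating the two opposite-signed intensity edges across the vessel, can be illustrated on a one-dimensional intensity profile. This is a simplified sketch, not MyoTracker's actual cost-analysis and edge-enhancement algorithm; the synthetic profile and pixel size are made up.

```python
def vessel_diameter(profile, pixel_size_um=1.0):
    """Estimate vessel diameter from a 1-D intensity profile taken
    perpendicular to the vessel axis.

    Assumes a bright lumen between darker walls, so the walls appear as
    the strongest positive (entering the lumen) and strongest negative
    (leaving the lumen) intensity gradients along the profile.
    """
    grad = [profile[i + 1] - profile[i] for i in range(len(profile) - 1)]
    left = max(range(len(grad)), key=lambda i: grad[i])    # dark-to-bright edge
    right = min(range(len(grad)), key=lambda i: grad[i])   # bright-to-dark edge
    return abs(right - left) * pixel_size_um

# Synthetic profile: dark background, bright 6-pixel-wide lumen.
profile = [10, 10, 10, 90, 90, 90, 90, 90, 90, 10, 10, 10]
print(vessel_diameter(profile))
```

Tracking diameter over time would repeat this per frame along a fixed scan line; a robust tool additionally smooths the profile, enhances the edges, and penalises implausible jumps between frames, as the abstract's cost-analysis approach suggests.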
Abstract:
Pollen grains are microscopic so their identification and quantification has, for decades, depended upon human observers using light microscopes: a labour-intensive approach. Modern improvements in computing and imaging hardware and software now bring automation of pollen analyses within reach. In this paper, we provide the first review in over 15 yr of progress towards automation of the part of palynology concerned with counting and classifying pollen, bringing together literature published from a wide spectrum of sources. We
consider which attempts offer the most potential for an automated palynology system for universal application across all fields of research concerned with pollen classification and counting. We discuss what is required to make the datasets of these automated systems as acceptable as those produced by human palynologists, and present suggestions for how automation will generate novel approaches to counting and classifying pollen that have hitherto been unthinkable.
Abstract:
Optimizing and editing enterprise software systems after implementation has started is widely recognized to be an expensive process. This has led to increasing emphasis on locating mistakes within software systems at the design stage, to help minimize development costs. There is growing interest in architecture evaluation techniques that can identify problems at the design stage, within either complete or partially complete architectures. Most current techniques rely on manual, review-based evaluation methods that require advanced skills from architects and evaluators. We are currently considering what a formal Architecture Description Language (ADL) can contribute to the process of architecture evaluation and validation. Our investigation considers the inter-relationships between the activities performed during the architecture evaluation process, the characteristics an ADL should possess to support these activities, and the tools needed to provide convenient access to, and presentation of, architectural information.