916 results for WAVELENGTH AUTOMATED PERIMETRY
Abstract:
A new variation of holographic interferometry has been used to perform simultaneous two-wavelength measurements, allowing quantitative analysis of the heavy-particle and electron densities in a superorbital facility. An air test gas accelerated to 12 km/s was passed over a cylindrical model, simulating the reentry conditions encountered by a space vehicle on a superorbital mission. Laser beams with two different wavelengths were overlapped, passed through the test section, and simultaneously recorded on a single holographic plate. Reconstruction of the hologram generated two separate interferograms at different angles, from which the quantitative measurements were made. With this technique, a peak electron concentration of (5.5 +/- 0.5) x 10^23 m^-3 was found behind a bow shock on a cylinder. (C) 1997 Optical Society of America.
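The abstract does not spell out the data reduction, but the standard two-wavelength analysis rests on the electron contribution to refractivity scaling with the square of the wavelength, while the heavy-particle (Gladstone-Dale) contribution is essentially achromatic; two fringe-shift measurements therefore give a 2x2 linear system in the two densities. A minimal sketch, with all numerical inputs (wavelengths, path length, refractivity, fringe shifts) hypothetical rather than taken from the paper:

```python
import numpy as np

R_E = 2.8179403e-15               # classical electron radius [m]

# Hypothetical inputs: the paper's actual wavelengths and data are not given
lam1, lam2 = 694.3e-9, 347.2e-9   # e.g. ruby fundamental and second harmonic [m]
L = 0.05                          # optical path length through the flow [m]
K = 1.03e-29                      # approx. (n - 1) per molecule for air [m^3]
d1, d2 = -6.0, -0.5               # measured fringe shifts at lam1, lam2

# Fringe shift: d = (L / lam) * (K * N_h - R_E * lam**2 / (2*pi) * N_e),
# i.e. a 2x2 linear system  A @ [N_h, N_e] = b  for the two densities.
A = np.array([[K, -R_E * lam1**2 / (2 * np.pi)],
              [K, -R_E * lam2**2 / (2 * np.pi)]])
b = np.array([d1 * lam1 / L, d2 * lam2 / L])

N_h, N_e = np.linalg.solve(A, b)
print(f"heavy-particle density ~ {N_h:.2e} m^-3, electron density ~ {N_e:.2e} m^-3")
```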
Abstract:
Concerns have been raised about the reproducibility of brachial artery reactivity (BAR), because subjective decisions regarding the location of interfaces may influence the measurement of very small changes in lumen diameter. We studied 120 consecutive patients with BAR to address whether an automated technique could be applied, and whether experience influenced reproducibility between two observers, one experienced and one inexperienced. Digital cineloops were measured both automatically, using software that detects the leading edge of the endothelium and tracks it across sequential frames, and manually, by averaging a set of three point-to-point measurements. There was a high correlation between the automated and manual techniques for both observers, although less variability was present with the expert reader. The overall limits of agreement for interobserver concordance were 0.13 +/- 0.65 mm for the manual and 0.03 +/- 0.74 mm for the automated measurement. For intraobserver concordance, the limits of agreement were -0.07 +/- 0.38 mm for observer 1 and -0.16 +/- 0.55 mm for observer 2. We conclude that BAR measurements are highly concordant between observers, although more concordant using the automated method, and that experience does affect concordance. Care must be taken to ensure that the same segments are measured between observers and serially.
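The "limits of agreement" quoted above follow the usual Bland-Altman convention of bias +/- 1.96 standard deviations of the paired differences. A minimal sketch of that calculation (the paired readings below are invented; the study's raw data are not given in the abstract):

```python
import numpy as np

def limits_of_agreement(a, b):
    """Bland-Altman bias +/- 1.96 * SD of the paired differences."""
    d = np.asarray(a) - np.asarray(b)
    return d.mean(), 1.96 * d.std(ddof=1)

# Invented paired lumen-diameter readings (mm) from two observers
obs1 = np.array([4.10, 3.85, 4.42, 3.98, 4.21, 4.05])
obs2 = np.array([4.02, 3.90, 4.35, 4.10, 4.18, 3.99])
bias, spread = limits_of_agreement(obs1, obs2)
print(f"limits of agreement: {bias:.2f} +/- {spread:.2f} mm")
```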
Abstract:
A system has been developed for studying the biodegradation of natural and synthetic polymeric materials. The system is based on standard methods developed by the European Committee for Standardisation (CEN TC 261) (ISO/DIS 14855) and the American Society for Testing and Materials, 'Standard Test Method for Determining Aerobic Biodegradation of Plastic Materials under Controlled Composting Conditions' (ASTM D 5338-92). A new low-cost compost facility has been used which satisfies the requirements of these standards. The system has been automated for data collection and has been run under the conditions specified by the standards. In the system, cellulose, newspaper and two starch-based polymers were treated with compost in a series of 3 dm^3 vessels at 52 °C under conditions of optimum moisture and pH. The degradation was followed over time by measuring the amount of carbon released as carbon dioxide. (C) 2001 Society of Chemical Industry.
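Under ISO 14855 / ASTM D 5338, "following the degradation by measuring carbon released as carbon dioxide" amounts to expressing net CO2 evolution as a percentage of the sample's theoretical CO2. A short sketch of that bookkeeping (the masses and carbon fraction below are illustrative, not the paper's data):

```python
def percent_biodegradation(co2_test_g, co2_blank_g, sample_g, carbon_fraction):
    """Net CO2 evolved, as a percentage of the theoretical CO2 of the sample
    (carbon mass x 44/12), per the ISO 14855 / ASTM D 5338 convention."""
    th_co2 = sample_g * carbon_fraction * (44.0 / 12.0)
    return 100.0 * (co2_test_g - co2_blank_g) / th_co2

# Illustrative cumulative figures for a cellulose charge (C6H10O5, ~44.4 % C)
print(f"{percent_biodegradation(95.0, 30.0, 50.0, 0.444):.1f} % biodegradation")
```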
Abstract:
Observations of accelerating seismic activity prior to large earthquakes in natural fault systems have raised hopes for intermediate-term earthquake forecasting. If this phenomenon does exist, what causes it to occur? Recent theoretical work suggests that the accelerating seismic release sequence is a symptom of increasing long-wavelength stress correlation in the fault region. A more traditional explanation, based on Reid's elastic rebound theory, argues that an accelerating sequence of seismic energy release could be a consequence of increasing stress in a fault system whose stress moment release is dominated by large events. Both of these theories are examined using two discrete models of seismicity: a Burridge-Knopoff block-slider model and an elastic continuum-based model. Both models display an accelerating release of seismic energy prior to large simulated earthquakes. In both models there is a correlation between the rate of seismic energy release and both the total root-mean-squared stress and the level of long-wavelength stress correlation. Furthermore, both models exhibit a systematic increase in the number of large events at high stress and high long-wavelength stress correlation levels. These results suggest that either explanation is plausible for the accelerating moment release in the models examined. A statistical model based on the Burridge-Knopoff block-slider is constructed which indicates that stress alone is sufficient to produce an accelerating release of seismic energy with time prior to a large earthquake.
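As a rough illustration of how such discrete models produce event catalogues from which accelerating moment release can be measured, here is a minimal one-dimensional stick-slip cellular automaton in the Olami-Feder-Christensen spirit; it is a much simplified stand-in for the paper's Burridge-Knopoff model, with all parameters arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200          # blocks in the 1-D array
FAIL = 1.0       # failure threshold
ALPHA = 0.45     # fraction of a dropped stress passed to each neighbour (< 0.5)
stress = rng.uniform(0.0, FAIL, N)

def one_event():
    """Load uniformly to the next failure, cascade, return the event size."""
    global stress
    stress += FAIL - stress.max()              # tectonic loading to threshold
    size = 0
    while (over := np.flatnonzero(stress >= FAIL)).size:
        for i in over:
            drop = stress[i]
            stress[i] = 0.0                    # block slips, stress drops
            if i > 0:
                stress[i - 1] += ALPHA * drop  # redistribute to neighbours
            if i < N - 1:
                stress[i + 1] += ALPHA * drop
            size += 1
    return size

sizes = np.array([one_event() for _ in range(5000)])
# Cumulative Benioff strain (sqrt of event energy ~ size); an upward bend of
# this curve before the largest events is the accelerating release at issue.
benioff = np.cumsum(np.sqrt(sizes))
print("largest event size:", sizes.max())
```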
Abstract:
Quantification of stress echocardiography may overcome the training requirements and subjective nature of visual wall motion score (WMS) assessment, but quantitative approaches may be difficult to apply and require significant time for image processing. The integral of long-axis myocardial velocity is displacement, which may be represented as a color map over the left ventricular myocardium. This study was designed to explore the feasibility and accuracy of measuring long-axis myocardial displacement, derived from tissue Doppler, for the detection of coronary artery disease (CAD) during dobutamine stress echocardiography (DBE). One hundred thirty patients underwent standard DBE, including 30 patients at low risk of CAD, 30 patients with normal coronary angiography (both groups studied to define normal ranges of displacement), and 70 patients who underwent coronary angiography, in whom the accuracy of the normal ranges was tested. Regional myocardial displacement was obtained by analysis of color tissue Doppler apical images acquired at peak stress. Displacement was compared with WMS and with the presence of CAD by angiography. The analysis time was 3.2 +/- 1.5 minutes per patient. Segmental displacement was correlated with wall motion (normal 7.4 +/- 3.2 mm, ischemia 5.8 +/- 4.2 mm, viability 4.6 +/- 3.0 mm, scar 4.5 +/- 3.5 mm, p < 0.001). Reversal of normal base-apex displacement was an insensitive (19%) but specific (90%) marker of CAD. The sum of displacements within each vascular territory had a sensitivity and specificity of 89% and 79%, respectively, for prediction of significant CAD, compared with 86% and 78%, respectively, for WMS (p = NS). The displacements in the basal segments had a sensitivity and specificity of 83% and 78%, respectively (p = NS). Regional myocardial displacement during DBE is feasible and offers a fast and accurate method for the diagnosis of CAD. (C) 2002 by Excerpta Medica, Inc.
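The quantity being mapped is just the running time integral of the tissue-Doppler velocity trace. A minimal sketch (the velocity profile and frame rate below are invented for illustration):

```python
import numpy as np

def displacement_mm(velocity_cm_s, frame_interval_s):
    """Running time integral of a tissue-Doppler velocity trace, in mm."""
    return np.cumsum(np.asarray(velocity_cm_s) * 10.0) * frame_interval_s

# Invented systolic velocity profile for one basal segment, 100 frames/s
t = np.arange(0.0, 0.30, 0.01)
v = 4.0 * np.sin(np.pi * t / 0.30)       # cm/s
d = displacement_mm(v, 0.01)
print(f"peak systolic displacement ~ {d.max():.1f} mm")
```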
Abstract:
Organic microcavity light-emitting diodes typically exhibit a blueshift of the emitting wavelength with increasing viewing angle. We have modeled the shift of the resonance wavelength for several metal mirrors. Eight metals (Al, Ag, Cr, Ti, Au, Ni, Pt, and Cu) have been considered as top or bottom mirrors, depending on their work functions. The model fully takes into account the dependence on angle and wavelength of the phase change that occurs on reflection, for both s and p polarization, as well as dispersion in the organic layers. Different contributions to the emission wavelength shift are discussed. The influence of the thickness of the bottom mirror and of the choice and thickness of the organic materials inside the cavity has been investigated. Based on the results obtained, guidelines for a choice of materials to reduce the blueshift are given. (C) 2002 Optical Society of America.
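The blueshift follows directly from the round-trip phase condition of the cavity: as the internal angle grows, cos(theta) shrinks and the resonance moves to shorter wavelengths. A toy version with the mirror reflection phase frozen to a single hypothetical value (the paper's model makes it a function of angle, wavelength, and polarization):

```python
import numpy as np

def resonance_wavelength(theta_ext_deg, n=1.7, d=280e-9, m=1, phi_sum=-1.6 * np.pi):
    """Resonance from (4*pi*n*d/lam) * cos(theta_int) + phi_sum = 2*pi*m.
    n, d: effective index and thickness of the organic stack; phi_sum: total
    mirror reflection phase, held constant here (all values hypothetical)."""
    theta_int = np.arcsin(np.sin(np.radians(theta_ext_deg)) / n)  # Snell's law
    return 4 * np.pi * n * d * np.cos(theta_int) / (2 * np.pi * m - phi_sum)

for ang in (0, 30, 60):
    print(f"{ang:2d} deg -> {resonance_wavelength(ang) * 1e9:.0f} nm")  # blueshift
```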
Abstract:
The refinement calculus is a well-established theory for deriving program code from specifications. Recent research has extended the theory to handle timing requirements, as well as functional ones, and we have developed an interactive programming tool based on these extensions. Through a number of case studies completed using the tool, this paper explains how the tool helps the programmer by supporting the many forms of variables needed in the theory. These include simple state variables as in the untimed calculus, trace variables that model the evolution of properties over time, auxiliary variables that exist only to support formal reasoning, subroutine parameters, and variables shared between parallel processes.
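As a loose illustration of the variable kinds mentioned, and only as an analogue outside the refinement calculus itself, the following Python fragment distinguishes a simple state variable, a trace variable recording the evolution of a property over time, and an auxiliary (ghost) variable that exists only to support reasoning:

```python
import time

# Python analogue only -- the paper's setting is a formal refinement calculus.
state = 0               # simple state variable: the current value
trace = []              # trace variable: the value's evolution over time
ghost_initial = state   # auxiliary variable: used only for reasoning

for step in range(5):
    state += step                        # the actual computation
    trace.append((time.time(), state))   # timed history of the property

# The auxiliary variable never influences execution; it only lets us state
# (and, in the calculus, prove) the postcondition below.
assert state == ghost_initial + sum(range(5))
```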
Abstract:
Many organisations need to extract useful information from huge amounts of movement data. One example is found in maritime transportation, where the automated identification of a diverse range of traffic routes is a key management issue for improving the maintenance of ports and ocean routes, and for accelerating ship traffic. This paper addresses, as a first stage, the research challenge of developing an approach for the automated identification of traffic routes based on clustering motion vectors rather than reconstructed trajectories. The immediate benefit of the proposed approach is that it avoids reconstructing trajectories in terms of the geometric shape of the path, the position in space, the life span, and changes of speed, direction and other attributes over time. For clustering the moving objects, an adapted version of the Shared Nearest Neighbour algorithm is used. The motion vectors, each with a position and a direction, are analysed to identify clusters of vectors moving in the same direction. These clusters represent traffic routes, and the preliminary results have proved promising for the automated identification of traffic routes with different shapes and densities, as well as for handling noisy data.
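A heavily reduced sketch of the shared-nearest-neighbour idea applied to motion vectors: mutual k-nearest neighbours that share enough of their neighbour lists are merged into one cluster, with a distance that mixes position and heading. The weighting, thresholds, and test data are all invented; the paper's adapted SNN algorithm is more elaborate:

```python
import numpy as np

def snn_clusters(pos, ang_deg, k=6, min_shared=3, w_ang=0.05):
    """Jarvis-Patrick style shared-nearest-neighbour grouping of motion
    vectors: mutual k-NNs sharing >= min_shared neighbours are merged.
    The metric mixes spatial distance with heading difference (degrees)."""
    n = len(pos)
    d_pos = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
    d_ang = np.abs((ang_deg[:, None] - ang_deg[None, :] + 180) % 360 - 180)
    dist = d_pos + w_ang * d_ang
    knn = [set(row) for row in np.argsort(dist, axis=1)[:, 1:k + 1]]

    parent = list(range(n))                      # union-find over vectors
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for i in range(n):
        for j in range(i + 1, n):
            if i in knn[j] and j in knn[i] and len(knn[i] & knn[j]) >= min_shared:
                parent[find(i)] = find(j)        # merge into one route

    return np.unique([find(i) for i in range(n)], return_inverse=True)[1]

# Two synthetic "routes": an eastbound lane and a northbound lane
rng = np.random.default_rng(1)
pos = np.vstack([np.c_[rng.uniform(0, 10, 20), rng.normal(0, 0.3, 20)],
                 np.c_[rng.normal(5, 0.3, 20), rng.uniform(0, 10, 20)]])
ang = np.r_[rng.normal(90, 5, 20), rng.normal(0, 5, 20)]
print(snn_clusters(pos, ang))
```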
Abstract:
Background: Regulating mechanisms of branching morphogenesis of fetal lung rat explants have been an essential tool for molecular research. This work presents a new methodology to accurately quantify the epithelium, the outer contour and the peripheral airway buds of lung explants during cellular development from microscopic images. Methods: The outer contour was defined using an adaptive and multi-scale threshold algorithm whose level was automatically calculated based on an entropy maximization criterion. The inner lung epithelium was defined by a clustering procedure that groups small image regions according to the minimum description length principle and local statistical properties. Finally, the number of peripheral buds was counted as the skeleton branch ends of a skeletonized image of the inner lung epithelium. Results: The time for lung branching morphometric analysis was reduced by 98% compared with the manual method. The best results were obtained in the first two days of cellular development, with lower standard deviations. Non-significant differences were found between the automatic and manual results on all culture days. Conclusions: The proposed method introduces a series of advantages related to its intuitive use and accuracy, making the technique suitable for images with different lighting characteristics and allowing a reliable comparison between different researchers.
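Two of the pipeline's steps translate naturally into code: an entropy-maximisation threshold (the Kapur criterion is one standard reading of the abstract's description; the authors' adaptive, multi-scale variant is more involved) and counting peripheral buds as skeleton end-points. A sketch using numpy, scipy and scikit-image:

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def kapur_threshold(img):
    """Pick the grey level that maximises the summed entropies of the
    foreground and background histograms (Kapur criterion)."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    c = np.cumsum(p)
    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        w0, w1 = c[t - 1], 1.0 - c[t - 1]
        if w0 == 0 or w1 == 0:
            continue
        p0, p1 = p[:t][p[:t] > 0] / w0, p[t:][p[t:] > 0] / w1
        h = -np.sum(p0 * np.log(p0)) - np.sum(p1 * np.log(p1))
        if h > best_h:
            best_t, best_h = t, h
    return best_t

def count_buds(mask):
    """Count peripheral buds as skeleton end-points: skeleton pixels with
    exactly one skeleton neighbour."""
    skel = skeletonize(mask)
    neighbours = convolve(skel.astype(int), np.ones((3, 3), int),
                          mode="constant") - skel
    return int(np.sum(skel & (neighbours == 1)))

# Usage idea: buds = count_buds(img > kapur_threshold(img)) for an 8-bit image
```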
Abstract:
With the purpose of lowering costs and rendering the requested information available to users without internet access, service companies have adopted automated interaction technologies in their call centers, which may or may not meet users' expectations. Drawing on different areas of knowledge (man-machine interaction, consumer behavior and use of IT), 13 propositions are raised and a study is carried out in three parts: a focus group, a field study with users, and interviews with experts. Eleven automated service characteristics that help explain user satisfaction are listed, a preference model is proposed, and evidence for or against each of the 13 propositions is presented. Using balanced scorecard concepts, a managerial assessment model is proposed for the use of automated call center technology. In future work, the propositions may become verifiable hypotheses through conclusive empirical research.
Abstract:
In this paper we present results on the optimization of multilayered a-SiC:H heterostructures that can be used as optical transducers for fluorescent protein detection using the Fluorescence Resonance Energy Transfer (FRET) approach. Double structures composed of pin-based a-SiC:H cells are analyzed. Color discrimination is achieved by AC photocurrent measurement under different externally applied bias. Experimental data on spectral response analysis, current-voltage characteristics, and color and transmission rate discrimination are reported. An electrical model, supported by a numerical simulation, gives insight into the device operation. Results show that the optimized a-SiC:H heterostructures act as voltage-controlled optical filters in the visible spectrum. When the applied voltages are chosen appropriately, these optical transducers can detect not only the selective excitation of specimen fluorophores but also the subsequent weak acceptor fluorescent channel emission.
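Operationally, a voltage-controlled optical filter of this kind lets two photocurrent readings taken at two bias voltages be unmixed into two optical channels by inverting a calibrated responsivity matrix. A minimal sketch with entirely hypothetical calibration numbers (real values would come from the spectral-response measurements the abstract describes):

```python
import numpy as np

# Hypothetical responsivities (A/W) of the double pin a-SiC:H transducer at
# two wavelengths under two bias voltages; rows: bias, cols: wavelength.
R = np.array([[0.42, 0.05],    # bias 1: short-wavelength dominated
              [0.10, 0.38]])   # bias 2: long-wavelength dominated

def unmix(i_bias1, i_bias2):
    """Recover the two optical channel powers (e.g. fluorophore excitation
    and the weak acceptor emission) from photocurrents at the two biases."""
    return np.linalg.solve(R, [i_bias1, i_bias2])

p_short, p_long = unmix(2.2e-6, 1.1e-6)   # hypothetical currents [A]
print(f"short channel {p_short * 1e6:.2f} uW, long channel {p_long * 1e6:.2f} uW")
```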
Abstract:
INTRODUCTION: The correct identification of the underlying cause of death and its precise assignment to a code from the International Classification of Diseases are important for achieving accurate and universally comparable mortality statistics. These factors, among others, led to the development of computer software programs to automatically identify the underlying cause of death. OBJECTIVE: This work was conceived to compare the underlying causes of death processed by the Automated Classification of Medical Entities (ACME) and the "Sistema de Seleção de Causa Básica de Morte" (SCB) programs. MATERIAL AND METHOD: The comparative evaluation of the underlying causes of death processed by the ACME and SCB systems was performed using the input data file for the ACME system, which included deaths that occurred in the State of S. Paulo from June to December 1993, totalling 129,104 records of the corresponding death certificates. The differences between the underlying causes selected by the ACME and SCB systems verified in the month of June, when considered as SCB errors, were used to correct and improve the SCB processing logic and its decision tables. RESULTS: The processing of the underlying causes of death by the ACME and SCB systems resulted in 3,278 differences, which were analysed and ascribed to unanswered dialogue boxes during processing, to deaths due to human immunodeficiency virus [HIV] disease, for which there was no specific provision in either system, to coding and/or keying errors, and to actual problems. Detailed analysis of the latter disclosed that the majority of the underlying causes of death processed by the SCB system were correct, that the two systems interpreted the mortality coding rules differently, that some particular problems could not be explained with the available documentation, and that a smaller proportion of problems were identified as SCB errors. CONCLUSION: These results, disclosing a very low and insignificant number of actual problems, support the use of this version of the SCB system for the Ninth Revision of the International Classification of Diseases and assure the continuity of the work being undertaken for the Tenth Revision version.
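The core of such a comparison is a record-by-record tally of where the two coders agree on the underlying cause, with the disagreements queued for manual analysis. A minimal sketch (the record format and ICD-9 codes below are hypothetical):

```python
from collections import Counter

def compare_underlying_causes(records):
    """Tally agreement between two automated coders, in the spirit of the
    ACME-vs-SCB comparison. `records` yields hypothetical tuples of
    (certificate_id, acme_icd_code, scb_icd_code)."""
    tally = Counter()
    differences = []
    for cert_id, acme, scb in records:
        if acme == scb:
            tally["agree"] += 1
        else:
            tally["differ"] += 1
            differences.append((cert_id, acme, scb))   # queue for review
    return tally, differences

sample = [("93-000001", "410.9", "410.9"),
          ("93-000002", "042.9", "043.9"),   # e.g. an HIV-coding divergence
          ("93-000003", "162.9", "162.9")]
tally, diffs = compare_underlying_causes(sample)
print(tally, diffs)
```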