982 results for Computer techniques
Abstract:
The influence of initial perturbation geometry and material properties on final fold geometry has been investigated using finite-difference (FLAC) and finite-element (MARC) numerical models. Previous studies using these two different codes reported very different folding behaviour although the material properties, boundary conditions and initial perturbation geometries were similar. The current results establish that the discrepancy was not due to the different computer codes but to the different strain rates employed in the two previous studies (i.e. 10⁻⁶ s⁻¹ in the FLAC models and 10⁻¹⁴ s⁻¹ in the MARC models). As a result, different parts of the elasto-viscous rheological field were being investigated. For the same material properties, strain rate and boundary conditions, the present results using the two different codes are consistent. A transition in folding behaviour, from a situation where the geometry of the initial perturbation determines final fold shape to a situation where material properties control the final geometry, is produced using both models. This transition takes place with increasing strain rate, decreasing elastic moduli or increasing viscosity (reflecting in each case the increasing influence of the elastic component in the Maxwell elasto-viscous rheology). The transition described here is mechanically feasible but is associated with very high stresses in the competent layer (on the order of GPa), which is improbable under natural conditions. (C) 2000 Elsevier Science Ltd. All rights reserved.
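The competition between elastic and viscous behaviour described above can be sketched with a Maxwell model, in which the relaxation time τ = η/G sets the strain rate at which elasticity starts to dominate. The modulus, viscosity and time values below are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

# Illustrative Maxwell parameters for a competent layer (assumed values,
# not taken from the paper): shear modulus G and Newtonian viscosity eta.
G = 1e10       # Pa
eta = 1e15     # Pa*s
tau = eta / G  # Maxwell relaxation time, s

def deborah(strain_rate):
    """Relaxation time over the deformation time scale (1/strain_rate).
    Larger values mean a stronger elastic contribution to the response."""
    return tau * strain_rate

def maxwell_stress(strain_rate, t):
    """Stress under a constant strain rate: sigma = eta*rate*(1 - exp(-t/tau))."""
    return eta * strain_rate * (1.0 - np.exp(-t / tau))

# The two strain rates used in the earlier FLAC and MARC studies:
de_fast = deborah(1e-6)    # elastic effects become noticeable
de_slow = deborah(1e-14)   # essentially pure viscous flow
sigma_fast = maxwell_stress(1e-6, 10 * tau)  # ~1 GPa: the "very high stress"
```

With these (assumed) values the fast strain rate sits near the elasto-viscous transition and carries GPa-level stresses, while the slow strain rate is deep in the viscous regime, mirroring why the two earlier studies saw different folding behaviour.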
Abstract:
The reported experimental work on the systems Fe-Zn-O and Fe-Zn-Si-O in equilibrium with metallic iron is part of a wider research program that combines experimental and thermodynamic computer modeling techniques to characterize zinc/lead industrial slags and sinters in the system PbO-ZnO-SiO2-CaO-FeO-Fe2O3. Extensive experimental investigations using high-temperature equilibration and quenching techniques followed by electron probe X-ray microanalysis (EPMA) were carried out. Special experimental procedures were developed to enable accurate measurements in these ZnO-containing systems to be performed in equilibrium with metallic iron. The systems Fe-Zn-O and Fe-Zn-Si-O were experimentally investigated in equilibrium with metallic iron in the temperature ranges 900 °C to 1200 °C (1173 to 1473 K) and 1000 °C to 1350 °C (1273 to 1623 K), respectively. The liquidus surface in the system Fe-Zn-Si-O in equilibrium with metallic iron was characterized in the composition ranges 0 to 33 wt pct ZnO and 0 to 40 wt pct SiO2. The wustite (Fe,Zn)O, zincite (Zn,Fe)O, willemite (Zn,Fe)2SiO4, and fayalite (Fe,Zn)2SiO4 solid solutions in equilibrium with metallic iron were measured.
Abstract:
In this paper we propose a new framework for evaluating designs based on work domain analysis, the first phase of cognitive work analysis. We develop a rationale for a new approach to evaluation by describing the unique characteristics of complex systems and by showing that systems engineering techniques only partially accommodate these characteristics. We then present work domain analysis as a complementary framework for evaluation. We explain this technique by example, showing how the Australian Defence Force used work domain analysis to evaluate design proposals for a new system called Airborne Early Warning and Control. This case study also demonstrates that work domain analysis is a useful and feasible approach that complements standard techniques for evaluation and promotes a central role for human factors professionals early in the system design and development process. Actual or potential applications of this research include the evaluation of designs for complex systems.
Abstract:
The finite element method is used to simulate coupled problems, which describe the related physical and chemical processes of ore body formation and mineralization, in geological and geochemical systems. The main purpose of this paper is to illustrate some simulation results for different types of modelling problems in pore-fluid saturated rock masses. The aims of the simulation results presented in this paper are: (1) getting a better understanding of the processes and mechanisms of ore body formation and mineralization in the upper crust of the Earth; (2) demonstrating the usefulness and applicability of the finite element method in dealing with a wide range of coupled problems in geological and geochemical systems; (3) qualitatively establishing a set of showcase problems, against which any numerical method and computer package can be reasonably validated. (C) 2002 Published by Elsevier Science B.V.
Abstract:
While multimedia data, image data in particular, is an integral part of most websites and web documents, our quest for information is still largely restricted to text-based search. To explore the World Wide Web more effectively, especially its rich repository of truly multimedia information, we face a number of challenging problems. Firstly, we face the ambiguous and highly subjective nature of defining image semantics and similarity. Secondly, multimedia data can come from highly diversified sources, as a result of automatic image capturing and generation processes. Finally, multimedia information exists in decentralised sources over the Web, making it difficult to use conventional content-based image retrieval (CBIR) techniques for effective and efficient search. In this special issue, we present a collection of five papers on visual and multimedia information management and retrieval topics, addressing some aspects of these challenges. These papers have been selected from the conference proceedings (Kluwer Academic Publishers, ISBN: 1-4020-7060-8) of the Sixth IFIP 2.6 Working Conference on Visual Database Systems (VDB6), held in Brisbane, Australia, on 29–31 May 2002.
Abstract:
Direct and simultaneous observation of root growth and plant water uptake is difficult because soils are opaque. X-ray imaging techniques such as projection radiography or computed tomography (CT) offer a partial alternative to these limitations. Nevertheless, there is a trade-off between resolution, a large field-of-view and three-dimensionality: with the current state of the technology, it is possible to have any two. In this study, we used X-ray transmission through thin-slab systems to monitor transient saturation fields that develop around roots as plants grow. Although restricted to two dimensions, this approach offers a large field-of-view together with high spatial and dynamic resolutions. To illustrate the potential of this technology, we grew peas in 1 cm thick containers filled with soil and imaged them at regular intervals. The dynamics of both the root growth and the water content field that developed around the roots could be conveniently monitored. Compared to other techniques such as X-ray CT, our system is relatively inexpensive and easy to implement. It can potentially be applied to study many agronomic problems, such as issues related to the impact of soil constraints (physical, chemical or biological) on root development.
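X-ray transmission imaging of this kind relies on the Beer-Lambert law: a change in transmitted intensity through the thin slab maps to a change in water path length along the beam. A minimal sketch, with an attenuation coefficient assumed for illustration (not a value from the study):

```python
import math

# Assumed linear attenuation coefficient of water at a typical X-ray tube
# energy (illustrative value, not from the study), in 1/cm.
MU_WATER = 0.2

def water_thickness_change(i_ref, i_now, mu=MU_WATER):
    """Beer-Lambert: I = I_ref * exp(-mu * dx), so the change in water
    path length along the beam is dx = -ln(I_now / I_ref) / mu, in cm.
    Negative dx means water was lost relative to the reference image."""
    return -math.log(i_now / i_ref) / mu

# A drying region around a root transmits more photons than before:
dx = water_thickness_change(1000.0, 1105.2)  # roughly -0.5 cm of water
```

Computing this pixel by pixel against a reference frame is what turns the transmission image series into a time-resolved water content field.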
Abstract:
This paper is part of a larger study assessing the adequacy of the use of multivariate statistical techniques in theses and dissertations on consumer behavior in the marketing area at some higher education institutions from 1997 to 2006. This paper focuses on regression and conjoint analysis, two techniques with great potential for use in marketing studies. The objective of this study was to analyze whether the employment of these techniques suits the needs of the research problem presented, as well as to evaluate the level of success in meeting their premises. Overall, the results suggest the need for greater involvement of researchers in verifying all the theoretical precepts for applying the techniques classified in the category of investigation of dependence among variables.
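The kind of premise verification the study evaluates can be illustrated on synthetic data: after an ordinary least-squares fit, residuals are screened for normality and independence before the model is trusted. Everything below (data, thresholds) is illustrative and not drawn from the theses reviewed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic consumer-survey-style data: one response driven linearly by
# two predictors plus Gaussian noise (illustrative only).
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
y = X @ np.array([2.0, 1.5, -0.7]) + rng.normal(scale=0.5, size=n)

# Ordinary least-squares fit
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Rough premise checks of the sort the paper argues are often skipped:
# 1) residual normality (skewness near 0),
# 2) residual independence (Durbin-Watson statistic near 2).
skew = float(np.mean(resid**3) / np.std(resid) ** 3)
dw = float(np.sum(np.diff(resid) ** 2) / np.sum(resid**2))
```

On real thesis data the same diagnostics (plus homoscedasticity and multicollinearity checks) would decide whether the regression premises actually hold.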
Abstract:
Recent advances in the control of molecular engineering architectures have allowed unprecedented ability of molecular recognition in biosensing, with a promising impact for clinical diagnosis and environmental control. The availability of large amounts of data from electrical, optical, or electrochemical measurements requires, however, sophisticated data treatment in order to optimize sensing performance. In this study, we show how an information visualization system based on projections, referred to as Projection Explorer (PEx), can be used to achieve high performance for biosensors made with nanostructured films containing immobilized antigens. As a proof of concept, various visualizations were obtained with impedance spectroscopy data from an array of sensors whose electrical response could be specific toward a given antibody (analyte) owing to molecular recognition processes. In addition to discussing the distinct methods for projection and normalization of the data, we demonstrate that an excellent distinction can be made between real samples tested positive for Chagas disease and Leishmaniasis, which could not be achieved with conventional statistical methods. Such high performance probably arose from the possibility of treating the data in the whole frequency range. Through a systematic analysis, it was inferred that Sammon's mapping with standardization to normalize the data gives the best results, with which blood serum samples containing 10⁻⁷ mg/mL of the antibody could be distinguished. The method inherent in PEx and the procedures for analyzing the impedance data are entirely generic and can be extended to optimize any type of sensor or biosensor.
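Sammon's mapping, which the study found to pair best with standardization, can be sketched as plain gradient descent on the Sammon stress. PEx itself likely uses a more refined optimizer, so treat this as an illustration; the two-cluster data below are invented stand-ins, not impedance spectra:

```python
import numpy as np

def sammon(X, n_iter=1000, lr=0.3, eps=1e-9, seed=0):
    """Minimal Sammon mapping of X to 2-D by gradient descent on the
    Sammon stress E = (1/c) * sum (d*_ij - d_ij)^2 / d*_ij, where d*_ij
    are input-space distances, d_ij embedded distances, c = sum d*_ij."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # Standardize features (the normalization the paper found to work best)
    Z = (X - X.mean(0)) / (X.std(0) + eps)
    Dstar = np.sqrt(((Z[:, None] - Z[None, :]) ** 2).sum(-1)) + eps
    c = Dstar[np.triu_indices(n, 1)].sum()
    Y = rng.normal(scale=1e-2, size=(n, 2))  # small random initialization
    for _ in range(n_iter):
        diff = Y[:, None] - Y[None, :]                 # y_i - y_j
        D = np.sqrt((diff ** 2).sum(-1)) + eps
        ratio = (Dstar - D) / (Dstar * D)
        np.fill_diagonal(ratio, 0.0)
        grad = (-2.0 / c) * (ratio[:, :, None] * diff).sum(axis=1)
        Y -= lr * grad                                  # descend the stress
    return Y

# Demo: two well-separated synthetic "sensor response" clusters in 6-D
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (10, 6)), rng.normal(3.0, 0.3, (10, 6))])
Y = sammon(X)  # 2-D embedding; the two clusters stay apart
```

The 1/d*_ij weighting is what distinguishes Sammon stress from classical MDS: it preserves small inter-sample distances, which is useful when nearby samples differ only by small analyte concentrations.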
Abstract:
BACKGROUND AND OBJECTIVE: To compare the analgesic effectiveness and aesthetic appearance associated with topical, subconjunctival, and peribulbar anesthesia for intravitreal bevacizumab injection. PATIENTS AND METHODS: Sixty consecutive patients undergoing their first intravitreal bevacizumab injection were randomized to receive one of three forms of anesthesia: proxymetacaine eye drops, subconjunctival injection of 2% xylocaine, or peribulbar injection of 2% xylocaine. Pain associated with the intravitreal injection and with the entire procedure (including anesthesia administration) was recorded using a Visual Analog Scale 15 minutes after intravitreal injection. Anterior segment evaluation was performed 24 hours after injection to measure the number of clock hours of subconjunctival hemorrhage. RESULTS: Median injection-related pain score was significantly lower in the peribulbar group compared with the topical and subconjunctival groups (P < .05). Median entire-procedure pain score was significantly higher in the peribulbar group compared with the topical and subconjunctival groups (P < .05). The median extent of subconjunctival hemorrhage was significantly lower in the topical group compared with the other groups (P < .05). CONCLUSION: Among the three anesthetic techniques, peribulbar anesthesia was most effective in controlling injection-related pain but least effective in controlling entire-procedure pain. There was no significant difference in pain scores between the topical and subconjunctival groups, and topical anesthesia was associated with less subconjunctival hemorrhage.
Abstract:
Using synchrotron radiation, we simultaneously combined wide-angle X-ray scattering (WAXS) and small-angle X-ray scattering (SAXS) techniques to obtain the scattering profiles of normal and neoplastic breast tissue samples at the momentum transfer ranges 6.28 nm⁻¹ ≤ Q ≤ 50.26 nm⁻¹ and 0.15 nm⁻¹ ≤ Q ≤ 1.90 nm⁻¹, respectively, where Q = 4π·sin(θ/2)/λ. The results obtained show considerable differences between the scattering profiles of these tissues. We verified that the combination of some parameters (ratio between glandular and adipose peak intensity and third-order axial peak intensity) extracted from the scattering profiles can be used to identify breast cancer. (c) 2009 Elsevier Ltd. All rights reserved.
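The momentum transfer follows directly from the scattering angle and wavelength; a small helper shows how the quoted SAXS and WAXS ranges correspond to small and wide angles. The 0.1 nm wavelength in the example is an assumed synchrotron value, not a parameter from the paper:

```python
import math

def momentum_transfer(theta_deg, wavelength_nm):
    """Q = 4*pi*sin(theta/2)/lambda, with theta the full scattering angle
    in degrees and lambda in nm; Q is returned in nm^-1."""
    return 4.0 * math.pi * math.sin(math.radians(theta_deg) / 2.0) / wavelength_nm

# Assumed wavelength of 0.1 nm for illustration:
q_saxs = momentum_transfer(0.14, 0.1)  # small angle -> Q near the SAXS range
q_waxs = momentum_transfer(30.0, 0.1)  # wide angle  -> Q in the WAXS range
```

This is why two detector geometries are needed: the SAXS range probes large-scale structure (e.g. collagen axial periodicity), while the WAXS range probes the short-range molecular order of adipose and glandular tissue.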
Abstract:
There is no specific test to diagnose Alzheimer's disease (AD). Its diagnosis should be based upon clinical history, neuropsychological and laboratory tests, neuroimaging and electroencephalography (EEG). Therefore, new approaches are necessary to enable earlier and more accurate diagnosis and to follow treatment results. In this study we used a Machine Learning (ML) technique, named Support Vector Machine (SVM), to search for patterns in EEG epochs to differentiate AD patients from controls. As a result, we developed a quantitative EEG (qEEG) processing method for automatic differentiation of patients with AD from normal individuals, as a complement to the diagnosis of probable dementia. We studied EEGs from 19 normal subjects (14 females/5 males, mean age 71.6 years) and 16 patients with probable mild to moderate AD (14 females/2 males, mean age 73.4 years). The analysis of EEG epochs yielded 79.9% accuracy and 83.2% sensitivity. The analysis considering the diagnosis of each individual patient reached 87.0% accuracy and 91.7% sensitivity.
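The epoch-level classification step can be sketched with a standard SVM pipeline. The features below are synthetic stand-ins for spectral band powers (AD EEGs tend to show increased slow-band activity); the data, feature layout and hyperparameters are assumptions for illustration, not the paper's actual qEEG processing:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# Synthetic relative band powers per epoch (delta, theta, alpha, beta):
# AD epochs shifted toward the slow bands (illustrative only).
n_epochs = 400
controls = rng.normal(loc=[0.2, 0.3, 0.3, 0.2], scale=0.08, size=(n_epochs, 4))
patients = rng.normal(loc=[0.3, 0.4, 0.2, 0.1], scale=0.08, size=(n_epochs, 4))
X = np.vstack([controls, patients])
y = np.array([0] * n_epochs + [1] * n_epochs)  # 1 = probable AD epoch

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)

accuracy = float((pred == y_te).mean())
sensitivity = float(pred[y_te == 1].mean())  # fraction of AD epochs detected
```

Aggregating epoch-level predictions per subject (e.g. by majority vote) is one way to obtain the per-patient diagnosis figures the abstract reports.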
Abstract:
Objectives: Lung hyperinflation may be assessed by computed tomography (CT). As shown for patients with emphysema, however, CT image reconstruction affects quantification of hyperinflation. We studied the impact of reconstruction parameters on hyperinflation measurements in mechanically ventilated (MV) patients. Design: Observational analysis. Setting: A university hospital-affiliated research unit. Patients: MV patients with injured (n = 5) or normal lungs (n = 6), and spontaneously breathing patients (n = 5). Interventions: None. Measurements and results: Eight image series involving 3, 5, 7, and 10 mm slices and standard and sharp filters were reconstructed from identical CT raw data. Hyperinflated (V-hyper), normally aerated (V-normal), poorly aerated (V-poor), and nonaerated (V-non) volumes were calculated by densitometry as percentages of total lung volume (V-total). V-hyper obtained with the sharp filter systematically exceeded that with the standard filter, showing a median (interquartile range) increment of 138 (62-272) ml, corresponding to approximately 4% of V-total. In contrast, sharp filtering minimally affected the other subvolumes (V-normal, V-poor, V-non, and V-total). Decreasing slice thickness also increased V-hyper significantly. When changing from 10 to 3 mm thickness, V-hyper increased by a median value of 107 (49-252) ml, in parallel with a small and inconsistent increment in V-non of 12 (7-16) ml. Conclusions: Reconstruction parameters significantly affect quantitative CT assessment of V-hyper in MV patients. Our observations suggest that sharp filters are inappropriate for this purpose. Thin slices combined with standard filters and more appropriate thresholds (e.g., -950 HU in normal lungs) might improve the detection of V-hyper. Different studies on V-hyper can only be compared if identical reconstruction parameters were used.
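The densitometric subvolumes are obtained by binning voxel Hounsfield units into aeration compartments. The thresholds below are the ones commonly used in the quantitative CT literature (an assumption, not this paper's exact protocol); the sketch also shows the effect of moving the hyperinflation cut-off from -900 to the -950 HU the authors suggest:

```python
import numpy as np

def aeration_fractions(hu, hyper_cut=-900):
    """Bin voxel HU values into the four standard aeration compartments and
    return (V_hyper, V_normal, V_poor, V_non) as fractions of total volume.
    Thresholds follow common quantitative-CT convention (assumed here)."""
    hu = np.asarray(hu)
    v_hyper = np.mean((hu >= -1000) & (hu < hyper_cut))   # hyperinflated
    v_normal = np.mean((hu >= hyper_cut) & (hu < -500))   # normally aerated
    v_poor = np.mean((hu >= -500) & (hu < -100))          # poorly aerated
    v_non = np.mean((hu >= -100) & (hu <= 100))           # nonaerated
    return float(v_hyper), float(v_normal), float(v_poor), float(v_non)

# Synthetic voxel HU histogram for a well-aerated lung (illustrative):
rng = np.random.default_rng(0)
hu = rng.normal(-700, 150, size=100_000).clip(-1000, 100)
f900 = aeration_fractions(hu, hyper_cut=-900)
f950 = aeration_fractions(hu, hyper_cut=-950)  # stricter cut -> smaller V_hyper
```

Because sharp filters add high-frequency noise that pushes borderline voxels below the cut-off, the same histogram logic explains why V-hyper is so sensitive to the reconstruction kernel.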
Abstract:
An important consideration in the development of mathematical models for dynamic simulation is the identification of the appropriate mathematical structure. By building models with an efficient structure that is devoid of redundancy, it is possible to create simple, accurate and functional models. This leads not only to efficient simulation, but to a deeper understanding of the important dynamic relationships within the process. In this paper, a method is proposed for systematic model development for startup and shutdown simulation which is based on the identification of the essential process structure. The key tool in this analysis is the method of nonlinear perturbations for structural identification and model reduction. Starting from a detailed mathematical process description, both singular and regular structural perturbations are detected. These techniques are then used to give insight into the system structure and, where appropriate, to eliminate superfluous model equations or reduce them to other forms. This process retains the ability to interpret the reduced-order model in terms of the physico-chemical phenomena. Using this model reduction technique it is possible to attribute observable dynamics to particular unit operations within the process. This relationship then highlights the unit operations which must be accurately modelled in order to develop a robust plant model. The technique generates detailed insight into the dynamic structure of the models, providing a basis for system re-design and dynamic analysis. The technique is illustrated on the modelling of an evaporator startup. Copyright (C) 1996 Elsevier Science Ltd
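The singular-perturbation reduction the paper relies on can be seen in a toy two-state example: a fast state is replaced by its quasi-steady value, eliminating one equation while preserving the slow dynamics. The system below is invented for illustration and is not the paper's evaporator model:

```python
EPS = 1e-3  # small parameter: the z-dynamics are ~1000x faster than x
U = 1.0     # constant input

def simulate_full(t_end=5.0, dt=1e-4):
    """Full model: slow holdup x driven by a fast internal state z,
    integrated by explicit Euler (dt must resolve the fast time scale)."""
    x, z = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        x += dt * (-x + z)
        z += dt * (-(z - U) / EPS)  # fast relaxation of z toward U
    return x

def simulate_reduced(t_end=5.0, dt=1e-3):
    """Reduced model: setting EPS*z' ~ 0 gives z = U, eliminating one
    state and permitting a 10x larger time step."""
    x = 0.0
    for _ in range(int(t_end / dt)):
        x += dt * (-x + U)  # z replaced by its quasi-steady value
    return x
```

The two simulations agree closely in the slow variable, which is exactly the justification for dropping superfluous equations: the eliminated state still has a physical interpretation (its quasi-steady relation), so the reduced model remains readable in process terms.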
Abstract:
Background/Aims: Cytokines have a significant role in the response to injury following liver transplantation, but the origin and course of such molecules are not completely known. The aim of this study was to evaluate the production and hepatic metabolism of the inflammatory cytokines interleukin (IL)-1 beta, IL-6, IL-8, interferon (IFN)-γ and tumor necrosis factor (TNF)-alpha in orthotopic liver transplantation (OLT), comparing the conventional and piggyback methods. Methodology: We studied 30 patients who underwent elective OLT and were randomized to the conventional or piggyback technique at the beginning of the operation. The amount of cytokines and their hepatic metabolism were calculated based on plasma concentrations and vascular blood flow at 2, 5, 10, 15, 30, 60, 90, and 120 minutes after revascularization. Results: The amount of IL-1 beta in portal blood was higher in patients who underwent surgery using the conventional technique (estimate = 63,783.9 ± 16,586.1 pg/min, versus 11,979.6 ± 16,585.7 pg/min in the piggyback group, p = 0.035). There were no significant differences between the two operative methods in IL-6, IL-8, IFN-γ and TNF-alpha production. The hepatic metabolism of cytokines did not differ between groups. Although all the curves showed higher amounts of cytokines with the conventional technique, the differences were not statistically significant. Conclusion: The study shows the similarity between the two techniques concerning the stimuli for the production of inflammatory molecules.