Abstract:
Objectives: Pneumothorax is a frequent complication during mechanical ventilation. Electrical impedance tomography (EIT) is a noninvasive tool that allows real-time imaging of regional ventilation. The purpose of this study was to 1) identify characteristic changes in the EIT signals associated with pneumothoraces; 2) develop and fine-tune an algorithm for their automatic detection; and 3) prospectively evaluate this algorithm for its sensitivity and specificity in detecting pneumothoraces in real time. Design: Prospective controlled laboratory animal investigation. Setting: Experimental Pulmonology Laboratory of the University of Sao Paulo. Subjects: Thirty-nine anesthetized mechanically ventilated supine pigs (31.0 +/- 3.2 kg, mean +/- SD). Interventions: In a first group of 18 animals monitored by EIT, we either injected progressive amounts of air (from 20 to 500 mL) through chest tubes or applied large positive end-expiratory pressure (PEEP) increments to simulate extreme lung overdistension. This first data set was used to calibrate an EIT-based pneumothorax detection algorithm. Subsequently, we evaluated the real-time performance of the detection algorithm in 21 additional animals (with normal or preinjured lungs) submitted to multiple ventilatory interventions or traumatic punctures of the lung. Measurements and Main Results: Primary EIT relative images were acquired online (50 images/sec) and processed by a few image-analysis routines running automatically and in parallel. Pneumothoraces as small as 20 mL could be detected with a sensitivity of 100% and a specificity of 95%, and could easily be distinguished from parenchymal overdistension induced by PEEP or recruiting maneuvers. Their location was correctly identified in all cases, with a total delay of only three respiratory cycles. Conclusions: We created an EIT-based algorithm capable of detecting early signs of pneumothoraces in high-risk situations and of identifying their location.
It requires that the pneumothorax occurs or enlarges at least minimally during the monitoring period. Such detection was operator-free and in quasi real-time, opening opportunities for improving patient safety during mechanical ventilation.
Abstract:
A number of theoretical and experimental investigations have been made into the nature of purlin-sheeting systems over the past 30 years. These systems commonly consist of cold-formed zed or channel section purlins, connected to corrugated sheeting. They have proven difficult to model due to the complexity of both the purlin deformation and the restraint provided to the purlin by the sheeting. Part 1 of this paper presented a non-linear elasto-plastic finite element model which, by incorporating both the purlin and the sheeting in the analysis, allowed the interaction between the two components of the system to be modelled. This paper presents a simplified version of the first model which has considerably decreased requirements in terms of computer memory, running time and data preparation. The Simplified Model includes only the purlin but allows for the sheeting's shear and rotational restraints by modelling these effects as springs located at the purlin-sheeting connections. Two accompanying programs determine the stiffness of these springs numerically. As in the Full Model, the Simplified Model is able to account for the cross-sectional distortion of the purlin, the shear and rotational restraining effects of the sheeting, and failure of the purlin by local buckling or yielding. The model requires no experimental or empirical input and its validity is shown by its good correlation with experimental results. (C) 1997 Elsevier Science Ltd.
Abstract:
Background: Real-time myocardial contrast echocardiography (RTMCE) is an emerging imaging modality for assessing myocardial perfusion that allows for noninvasive quantification of regional myocardial blood flow (MBF). Aim: We sought to assess the value of qualitative analysis of myocardial perfusion and quantitative assessment of MBF by RTMCE for predicting regional function recovery in patients with ischemic heart disease who underwent coronary artery bypass grafting (CABG). Methods: Twenty-four patients with coronary disease and left ventricular systolic dysfunction (ejection fraction < 45%) underwent RTMCE before and 3 months after CABG. RTMCE was performed using continuous intravenous infusion of a commercially available contrast agent with low mechanical index power modulation imaging. Viability was defined by qualitative assessment of myocardial perfusion as homogeneous opacification at rest in >= 2 segments of the anterior or >= 1 segment of the posterior territory. Viability by quantitative assessment of MBF was determined by receiver-operating characteristic curve analysis. Results: Regional function recovery was observed in 74% of territories considered viable by qualitative analysis of myocardial perfusion and 40% of nonviable ones (P = 0.03). Sensitivity, specificity, positive and negative predictive values of qualitative RTMCE for detecting regional function recovery were 74%, 60%, 77%, and 56%, respectively. The cutoff value of MBF for predicting regional function recovery was 1.76 (AUC = 0.77; 95% CI = 0.62-0.92). MBF obtained by RTMCE had sensitivity of 91%, specificity of 50%, positive predictive value of 75%, and negative predictive value of 78%. Conclusion: Qualitative and quantitative RTMCE provide good accuracy for predicting regional function recovery after CABG. Determination of MBF increases the sensitivity for detecting hibernating myocardium. (Echocardiography 2011;28:342-349).
Wavelet correlation between subjects: A time-scale data driven analysis for brain mapping using fMRI
Abstract:
Functional magnetic resonance imaging (fMRI) based on the BOLD signal has been used to indirectly measure the local neural activity induced by cognitive tasks or stimulation. Most fMRI data analysis is carried out using the general linear model (GLM), a statistical approach which predicts the changes in the observed BOLD response based on an expected hemodynamic response function (HRF). In cases when the task is cognitively complex, or in cases of disease, variations in shape and/or delay may reduce the reliability of results. A novel exploratory method using fMRI data, which attempts to discriminate neurophysiological signals induced by the stimulation protocol from artifacts or other confounding factors, is introduced in this paper. This new method is based on the fusion of correlation analysis and the discrete wavelet transform, to identify similarities in the time course of the BOLD signal in a group of volunteers. We illustrate the usefulness of this approach by analyzing fMRI data from normal subjects presented with standardized human face pictures expressing different degrees of sadness. The results show that the proposed wavelet correlation analysis has greater statistical power than conventional GLM or time domain intersubject correlation analysis. (C) 2010 Elsevier B.V. All rights reserved.
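The fusion of correlation analysis with the discrete wavelet transform described above can be sketched in a few lines. The code below is an illustrative toy, not the authors' implementation: it assumes a Haar wavelet, two subjects' BOLD time courses of power-of-two length, and simply correlates the subjects' detail coefficients scale by scale. All function names are invented for this example.

```python
import math

def haar_dwt_details(signal):
    """Full Haar decomposition; returns the detail coefficients at each
    scale (finest first). Signal length must be a power of two."""
    approx = list(signal)
    details = []
    s2 = math.sqrt(2.0)
    while len(approx) > 1:
        details.append([(approx[i] - approx[i + 1]) / s2
                        for i in range(0, len(approx), 2)])
        approx = [(approx[i] + approx[i + 1]) / s2
                  for i in range(0, len(approx), 2)]
    return details

def pearson(x, y):
    """Pearson correlation; assumes both inputs are non-constant."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def scalewise_correlation(ts1, ts2):
    """Correlate two subjects' time courses scale by scale, skipping
    scales with fewer than two coefficients."""
    return [pearson(d1, d2)
            for d1, d2 in zip(haar_dwt_details(ts1), haar_dwt_details(ts2))
            if len(d1) >= 2]
```

Identical time courses give a correlation of 1 at every usable scale; in the group setting the per-scale correlations would then be aggregated or tested across subject pairs.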
Abstract:
Background/Aims: To present a protocol of immediate surgical repair of myelomeningocele (MMC) after birth ('time zero') and to compare this surgical outcome with that of surgery performed after the newborn's admission to the nursery. Methods: Data from the medical files of 31 patients with MMC who underwent surgery after birth and after admission to the nursery (group I) were compared with a group of 23 patients with MMC admitted and prospectively followed, who underwent surgery immediately after birth - 'at time zero' (group II). Results: Preoperative rupture of the MMC occurred more frequently in group I (67 vs. 39%, p < 0.05). The need for a ventriculoperitoneal shunt was 84% in group I and 65% in group II, and 4 of the shunts were placed during the same anesthetic time as the immediate MMC repair, with no statistically significant difference. Group I had a higher incidence of small dehiscences when compared to group II (29 vs. 13%, p < 0.05); however, there was no statistically significant difference regarding infections. After 1 year of follow-up, 61% of group I showed neurodevelopmental delay, whereas only 35% of group II did. Conclusions: Surgical intervention carried out immediately after birth showed benefits in terms of a lower incidence of preoperative rupture of the MMC, fewer postoperative dehiscences and a lower incidence of neurodevelopmental delay 1 year after birth. Copyright (C) 2009 S. Karger AG, Basel
Abstract:
The aim of the present study was to evaluate the genetic correlations among real-time ultrasound carcass, BW, and scrotal circumference (SC) traits in Nelore cattle. Carcass traits, measured by real-time ultrasound of the live animal, were recorded from 2002 to 2004 on 10 farms across 6 Brazilian states on 2,590 males and females ranging in age from 450 to 599 d. Ultrasound records of LM area (LMA) and backfat thickness (BF) were obtained from cross-sectional images between the 12th and 13th ribs, and rump fat thickness (RF) was measured between the hook and pin bones over the junction between the gluteus medius and biceps femoris muscles. Also, BW (n = 22,778) and SC (n = 5,695) were recorded on animals born between 1998 and 2003. The BW traits were 120-, 210-, 365-, 450-, and 550-d standardized BW (W120, W210, W365, W450, and W550), plus BW (WS) and hip height (HH) on the ultrasound scanning date. The SC traits were 365-, 450-, and 550-d standardized SC (SC365, SC450, and SC550). For the BW and SC traits, the database used was from the Nelore Breeding Program-Nelore Brazil. The genetic parameters were estimated with multivariate animal models and REML. Estimated genetic correlations between LMA and other traits were 0.06 (BF), -0.04 (RF), 0.05 (HH), 0.58 (WS), 0.53 (W120), 0.62 (W210), 0.67 (W365), 0.64 (W450 and W550), 0.28 (SC365), 0.24 (SC450), and 0.00 (SC550). Estimated genetic correlations between BF and other traits were 0.74 (RF), -0.32 (HH), 0.19 (WS), -0.03 (W120), -0.10 (W210), 0.04 (W365), 0.01 (W450), 0.06 (W550), 0.17 (SC365 and SC450), and -0.19 (SC550). Estimated genetic correlations between RF and other traits were -0.41 (HH), -0.09 (WS), -0.13 (W120), -0.09 (W210), -0.01 (W365), 0.02 (W450), 0.03 (W550), 0.05 (SC365), 0.11 (SC450), and -0.18 (SC550). These estimates indicate that selection for carcass traits measured by real-time ultrasound should not cause antagonism in the genetic improvement of SC and BW traits.
Also, selection to increase HH might decrease subcutaneous fat as a correlated response. Therefore, to obtain animals suited to specific tropical production systems, carcass, BW, and SC traits should be considered in selection programs.
Abstract:
Objective. To evaluate the influence of shaft design on the shaping ability of 3 rotary nickel-titanium (NiTi) systems. Study design. Sixty curved mesial canals of mandibular molars were used. Specimens were scanned by spiral tomography before and after canal preparation using ProTaper, ProFile, and ProSystem GT rotary instruments. One-millimeter-thick slices were scanned from the apical end point to the pulp chamber. The cross-sectional images from the slices taken before and after canal preparation at the apical, coronal, and midroot levels were compared. Results. The mean working time was 137.22 +/- 5.15 s. Mean transportation, mean centering ratio, and percentage of area increase were 0.022 +/- 0.131 mm, 0.21 +/- 0.11, and 76.90 +/- 42.27%, respectively, with no statistical differences (P > .05). Conclusions. All instruments were able to shape curved mesial canals in mandibular molars to size 30 without significant errors. The differences in shaft designs seemed not to affect their shaping capabilities.
Abstract:
The catalytic properties of enzymes are usually evaluated by measuring and analyzing reaction rates. However, analyzing the complete time course can be advantageous because it contains additional information about the properties of the enzyme. Moreover, for systems that are not at steady state, the analysis of time courses is the preferred method. One of the major barriers to the wide application of time courses is that it may be computationally more difficult to extract information from these experiments. Here the basic approach to analyzing time courses is described, together with some examples of the essential computer code to implement these analyses. A general method that can be applied to both steady-state and non-steady-state systems is recommended. (C) 2001 Academic Press.
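As a hedged illustration of the kind of "essential computer code" the abstract refers to (the actual routines are in the cited paper), the sketch below extracts a rate constant from a complete progress curve rather than from initial rates. The integrated first-order rate law P(t) = S0(1 - exp(-kt)), the noise-free synthetic data, and all names are assumptions made for this example.

```python
import math

def progress_curve(S0, k, times):
    """Integrated first-order rate law: P(t) = S0 * (1 - exp(-k t)).
    (A stand-in for a measured time course; names are illustrative.)"""
    return [S0 * (1.0 - math.exp(-k * t)) for t in times]

def fit_rate_constant(times, P, S0):
    """Estimate k from the whole time course by least squares on the
    linearised form ln(1 - P/S0) = -k t (slope through the origin)."""
    xs, ys = [], []
    for t, p in zip(times, P):
        frac = 1.0 - p / S0
        if frac > 0.0:          # guard against log of non-positive values
            xs.append(t)
            ys.append(math.log(frac))
    return -sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

times = [0.5 * i for i in range(1, 20)]
P = progress_curve(10.0, 0.3, times)   # synthetic, noise-free data
k_hat = fit_rate_constant(times, P, 10.0)
```

For noisy data or mechanisms without a closed-form integrated rate law, the same idea carries over by integrating the rate equations numerically inside a nonlinear least-squares fit.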
Abstract:
Background. Although digital and videotaped images are known to be comparable for the evaluation of left ventricular function, their relative accuracy for assessment of more complex anatomy is unclear. We sought to compare reading time, storage costs, and concordance of video and digital interpretations across multiple observers and sites. Methods. One hundred one patients with valvular disease (90 mitral, 48 aortic, 80 tricuspid) were selected prospectively, and studies were stored according to video and standardized digital protocols. The same reviewer interpreted video and digital images independently and at different times with the use of a standard report form to evaluate 40 items (e.g., severity of stenosis or regurgitation, leaflet thickening, and calcification) as normal or mildly, moderately, or severely abnormal. Concordance between modalities was expressed as kappa. Major discordance (difference of >1 level of severity) was ascribed to the modality that gave the lesser severity. CD-ROM was used to store digital data (20:1 lossy compression), and super-VHS videotape was used to store video data. The reading time and storage costs for each modality were compared. Results. Measured parameters were highly concordant (ejection fraction was 52% +/- 13% by both). Major discordance was rare, and lesser values were reported with digital rather than video interpretation in the categories of aortic and mitral valve thickening (1% to 2%) and severity of mitral regurgitation (2%). Digital reading time was 6.8 +/- 2.4 minutes, 38% shorter than with video (11.0 +/- 3.0, range 8 to 22 minutes, P < .001). Compressed digital studies had an average size of 60 +/- 14 megabytes (range 26 to 96 megabytes). Storage cost for video was A$0.62 per patient (18 studies per tape, total cost A$11.20), compared with A$0.31 per patient for digital storage (8 studies per CD-ROM, total cost A$2.50). Conclusion.
Digital and video interpretation were highly concordant; in the few cases of major discordance, the digital scores were lower, perhaps reflecting undersampling. Use of additional views and longer clips may be indicated to minimize discordance with video in patients with complex problems. Digital interpretation offers a significant reduction in reading times and the cost of archiving.
Abstract:
We compare the performance of two different low-storage filter diagonalisation (LSFD) strategies in the calculation of complex resonance energies of the HO2 radical. The first is carried out within a complex-symmetric Lanczos subspace representation [H. Zhang, S.C. Smith, Phys. Chem. Chem. Phys. 3 (2001) 2281]. The second involves harmonic inversion of a real autocorrelation function obtained via a damped Chebychev recursion [V.A. Mandelshtam, H.S. Taylor, J. Chem. Phys. 107 (1997) 6756]. We find that while the Chebychev approach has the advantage of utilizing real algebra in the time-consuming process of generating the vector recursion, the Lanczos method (using complex vectors) requires fewer iterations, especially for the low-energy part of the spectrum. The overall efficiency in calculating resonances for these two methods is comparable for this challenging system. (C) 2001 Elsevier Science B.V. All rights reserved.
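The damped Chebychev recursion mentioned above can be sketched for a small real symmetric Hamiltonian. This is a minimal illustration, not the cited implementation: it assumes H has already been scaled so its spectrum lies in [-1, 1], and it uses a single scalar damping factor per iteration as a simplification of the energy-dependent damping operator used in practice.

```python
import numpy as np

def chebyshev_autocorrelation(H, psi0, n_terms, gamma=0.0):
    """Real damped Chebyshev recursion: returns c_n = <psi0|T_n(H)|psi0>,
    with each iteration damped by exp(-gamma). Assumes H is real
    symmetric and scaled so its eigenvalues lie in [-1, 1]."""
    d = np.exp(-gamma)            # scalar per-step damping factor
    c = np.empty(n_terms)
    v_prev = psi0.copy()          # T_0(H) |psi0>
    c[0] = psi0 @ v_prev
    if n_terms == 1:
        return c
    v_curr = d * (H @ psi0)       # damped T_1(H) |psi0>
    c[1] = psi0 @ v_curr
    for n in range(2, n_terms):
        # damped three-term recursion: v_{n+1} = d*2H v_n - d^2 v_{n-1}
        v_next = d * 2.0 * (H @ v_curr) - d * d * v_prev
        c[n] = psi0 @ v_next
        v_prev, v_curr = v_curr, v_next
    return c
```

With gamma = 0 this reduces to the plain Chebyshev recursion; the coefficients c_n would then be fed to a harmonic-inversion routine to extract resonance positions and widths.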
Abstract:
A new method is presented to determine an accurate eigendecomposition of difficult low temperature unimolecular master equation problems. Based on a generalisation of the Nesbet method, the new method is capable of achieving complete spectral resolution of the master equation matrix with relative accuracy in the eigenvectors. The method is applied to a test case of the decomposition of ethane at 300 K from a microcanonical initial population, with energy transfer modelled by both Ergodic Collision Theory and the exponential-down model. It is demonstrated that quadruple precision (16-byte) arithmetic is required irrespective of the eigensolution method used. (C) 2001 Elsevier Science B.V. All rights reserved.
Abstract:
Teledermatology holds great potential for revolutionizing the delivery of dermatology services, providing equitable service to remote areas and allowing primary care physicians to refer patients to dermatology centres of excellence at a distance. However, before its routine application as a service tool, its reliability, accuracy and cost-effectiveness need to be verified by rigorous evaluation. Teledermatology can be applied in one of two ways: it may be conducted in real time, utilizing videoconferencing equipment, or by store-and-forward methods, in which transmitted digital images or photographs are submitted with a clinical history. While there is a considerable range of reported accuracy and reliability, evidence suggests that teledermatology will become increasingly utilized and incorporated into more conventional dermatology service delivery systems. Studies to date have generally found that real-time dermatology is likely to allow greater clinical information to be obtained from the patient. This may result in fewer patients requiring conventional consultations, but it is generally more time-consuming and costly to the health service provider. It is often favoured by the patient because of the instantaneous nature of the diagnosis and management regimen for the condition, and it has educational value to the primary care physician. Store-and-forward systems of teledermatology often give high levels of diagnostic accuracy, and are cheaper and more convenient for the health care provider, but lack the immediacy of patient contact with the dermatologist, and involve a delay in obtaining the diagnosis and advice on management. It is increasingly likely that teledermatology will prove to be a significant tool in the provision of dermatology services in the future. These services will probably be provided by store-and-forward digital image systems, with real-time videoconferencing being used for case conferences and education.
However, much more research is needed into the outcomes and limitations of such a service and its effect on waiting lists, as well as possible cost benefits for patients, primary health care professionals and dermatology departments.
Abstract:
Research on the stability of flavours during high temperature extrusion cooking is reviewed. The important factors that affect flavour and aroma retention during the process of extrusion are illustrated. A substantial number of flavour volatiles incorporated prior to extrusion are normally lost during expansion because of steam distillation. Therefore, a general practice has been to introduce a flavour mix after the extrusion process. This extra operation requires a binding agent (normally oil), and may also result in a non-uniform distribution of the flavour and low oxidative stability of the flavours exposed on the surface. Therefore, the importance of encapsulated flavours, particularly the beta-cyclodextrin-flavour complex, is highlighted in this paper.
Abstract:
A data warehouse is a data repository which collects and maintains a large amount of data from multiple distributed, autonomous and possibly heterogeneous data sources. Often the data is stored in the form of materialized views in order to provide fast access to the integrated data. One of the most important decisions in designing a data warehouse is the selection of views for materialization. The objective is to select an appropriate set of views that minimizes the total query response time with the constraint that the total maintenance time for these materialized views is within a given bound. This view selection problem is totally different from the view selection problem under the disk space constraint. In this paper the view selection problem under the maintenance time constraint is investigated. Two efficient, heuristic algorithms for the problem are proposed. The key to devising the proposed algorithms is to define good heuristic functions and to reduce the problem to some well-solved optimization problems. As a result, an approximate solution of the known optimization problem will give a feasible solution of the original problem. (C) 2001 Elsevier Science B.V. All rights reserved.
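As a concrete (and deliberately simplified) illustration of view selection under a maintenance-time constraint, the sketch below uses a generic greedy benefit-per-maintenance-cost heuristic. It is not one of the paper's two algorithms, and the view names and costs are invented for this example.

```python
def select_views(views, maintenance_bound):
    """Greedy sketch: pick views in order of query-time benefit per unit
    of maintenance time until the maintenance bound is exhausted.
    `views` maps a view name to (query_time_saved, maintenance_time)."""
    order = sorted(views, key=lambda v: views[v][0] / views[v][1],
                   reverse=True)
    chosen, used = [], 0.0
    for v in order:
        saved, cost = views[v]
        if used + cost <= maintenance_bound:
            chosen.append(v)
            used += cost
    return chosen, used

# Invented example: three candidate views with hypothetical costs.
views = {
    "sales_by_region": (120.0, 30.0),
    "sales_by_month": (80.0, 10.0),
    "inventory_totals": (50.0, 25.0),
}
chosen, used = select_views(views, maintenance_bound=40.0)
```

A greedy ratio rule like this can be arbitrarily far from optimal in the worst case, which is why the paper's reduction to well-solved optimization problems, with its approximation guarantees, matters.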