989 results for correction methods
Abstract:
Background. We assessed end-diastolic right ventricular (RV) dimensions and left ventricular (LV) ejection fraction by intraoperative transesophageal echocardiography before and after surgical correction of pectus excavatum in adults. Methods. A prospective study was conducted including 17 patients undergoing surgical correction of pectus excavatum according to the Ravitch-Shamberger technique between 1999 and 2004. Intraoperative transesophageal echocardiography was performed under general anesthesia before and after surgery to assess end-diastolic RV dimensions and LV ejection fraction. The end-diastolic RV diameter and area were measured in the four-chamber and RV inflow-outflow views, and the RV volume was calculated from these data. The LV was assessed in the transgastric short-axis view, and its ejection fraction was calculated using the Teichholz formula. Results. The end-diastolic RV diameter, area, and volume all increased significantly after surgery (mean ± SD, respectively: 2.4 ± 0.8 cm versus 3.0 ± 0.9 cm, p < 0.001; 12.5 ± 5.2 cm² versus 18.4 ± 7.5 cm², p < 0.001; and 21.7 ± 11.7 mL versus 40.8 ± 23 mL, p < 0.001). The LV ejection fraction also increased significantly after surgery (58.4% ± 15% versus 66.2% ± 6%, p < 0.001). Conclusions. Surgical correction of pectus excavatum according to the Ravitch-Shamberger technique results in a significant increase in end-diastolic RV dimensions and a significantly increased LV ejection fraction. (Ann Thorac Surg 2010; 89: 240-4) © 2010 by The Society of Thoracic Surgeons
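For reference, a commonly cited form of the Teichholz correction mentioned above estimates LV volume from a single internal diameter D; the abstract does not state the exact variant used, so this is only an illustrative sketch:

V = \frac{7.0}{2.4 + D}\, D^{3}, \qquad \mathrm{EF} = \frac{V_{\mathrm{ED}} - V_{\mathrm{ES}}}{V_{\mathrm{ED}}} \times 100\%

Here V_{ED} and V_{ES} denote the volumes obtained from the end-diastolic and end-systolic diameters, respectively.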
Abstract:
Background. Tetralogy of Fallot is the most common cyanotic congenital heart disease. It occurs with a prevalence of 1 in 3000 births, and its share of congenital cardiac malformations is 6.8% (Baltimore-Washington Infant Study). Correction, which must be surgical, is generally carried out within the first year of life by an open-heart operation that repairs the various malformations. In some cases a palliative procedure is performed first, followed by a second operation for complete repair. Objectives. The aims of this work are to present the different cardiac surgical correction strategies currently practiced and the reasons for choosing them; to evaluate the long-term complications encountered and their management; and to analyze possible differences in the long-term clinical course of patients who underwent single-stage versus two-stage repair. Methods. Review of the specialized literature, with particular attention to studies addressing the long-term clinical course and the complications encountered after surgical correction of tetralogy of Fallot, and comparison of the information gathered on the two strategies currently practiced: complete repair preceded or not by neonatal palliative treatment. Conclusions. This work summarizes current knowledge on the surgical treatment of this cardiac anomaly. It provides an up-to-date analysis of recent studies on the long-term outcome after correction and offers a basis for comparison between the different repair strategies.
Abstract:
Due to their relatively small size and central location within the thorax, improvement in the signal-to-noise ratio (SNR) is of paramount importance for in vivo coronary vessel wall imaging. Thus, with higher field strengths, coronary vessel wall imaging is likely to benefit from the expected "near linear" proportional gain in SNR. In this study, we demonstrate the feasibility of in vivo human high-field (3 T) coronary vessel wall imaging using a free-breathing black-blood fast gradient echo technique with respiratory navigator gating and real-time motion correction. With the broader availability of more SNR-efficient fast spin echo and spiral techniques, further improvements can be expected.
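As a rough illustration of the "near linear" SNR gain invoked above (a general rule of thumb, not a result of this study), thermal SNR in MRI scales approximately with the main field strength B_0:

\mathrm{SNR} \propto B_0 \quad\Rightarrow\quad \frac{\mathrm{SNR}_{3\,\mathrm{T}}}{\mathrm{SNR}_{1.5\,\mathrm{T}}} \approx \frac{3.0}{1.5} = 2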
Abstract:
Flow cytometry (FCM) is emerging as an important tool in environmental microbiology. Although flow cytometry applications have to date largely been restricted to certain specialized fields of microbiology, such as the bacterial cell cycle and marine phytoplankton communities, technical advances in instrumentation and methodology are leading to its increased popularity and extending its range of applications. Here we focus on a number of recent flow cytometry developments important for addressing questions in environmental microbiology. These include (i) the study of microbial physiology under environmentally relevant conditions, (ii) new methods to identify active microbial populations and to isolate previously uncultured microorganisms, and (iii) the development of high-throughput autofluorescence bioreporter assays.
Abstract:
Elucidating the molecular and neural basis of complex social behaviors such as communal living, division of labor and warfare requires model organisms that exhibit these multi-faceted behavioral phenotypes. Social insects, such as ants, bees, wasps and termites, are attractive models to address this problem, with rich ecological and ethological foundations. However, their atypical systems of reproduction have hindered application of classical genetic approaches. In this review, we discuss how recent advances in social insect genomics, transcriptomics, and functional manipulations have enhanced our ability to observe and perturb gene expression, physiology and behavior in these species. Such developments begin to provide an integrated view of the molecular and cellular underpinnings of complex social behavior.
Abstract:
Recently, kernel-based machine learning methods have gained great popularity in many data analysis and data mining fields: pattern recognition, biocomputing, speech and vision, engineering, remote sensing, etc. The paper describes the use of kernel methods for processing large datasets from environmental monitoring networks. Several typical problems of the environmental sciences and their solutions provided by kernel-based methods are considered: classification of categorical data (soil type classification), mapping of continuous environmental and pollution information (pollution of soil by radionuclides), and mapping with auxiliary information (climatic data from the Aral Sea region). Promising developments, such as automatic emergency hot-spot detection and monitoring network optimization, are discussed as well.
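As a minimal sketch of the kind of kernel-based classification described (e.g. categorical soil-type mapping from monitoring-network samples), the snippet below fits an RBF-kernel support vector machine; the synthetic coordinates, covariate, and labels are placeholders, not the datasets analyzed in the paper:

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder monitoring-network data: planar coordinates plus one auxiliary
# covariate (e.g. altitude), with a categorical soil-type label per sample.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 100.0, size=(500, 3))        # [x_coord, y_coord, altitude]
y = (X[:, 0] + 0.5 * X[:, 2] > 75.0).astype(int)  # synthetic two-class labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# RBF-kernel SVM, a typical kernel method for categorical environmental data;
# standardizing the inputs first is the usual practice.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))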
Abstract:
This paper analyses the correction of errors and mistakes made by students in the foreign language teaching classroom. Its goal is to point out typical correction behaviors in language teaching classrooms in Cape Verde and to raise teachers' consciousness of better correction practices.
Abstract:
This paper presents 3-D brain tissue classification schemes using three recent promising energy minimization methods for Markov random fields: graph cuts, loopy belief propagation, and tree-reweighted message passing. The classification is performed using the well-known finite Gaussian mixture Markov random field model. Results from the above methods are compared with the widely used iterated conditional modes (ICM) algorithm. The evaluation is performed on a dataset containing simulated T1-weighted MR brain volumes with varying noise and intensity non-uniformities. The comparisons are performed in terms of energies as well as based on ground truth segmentations, using various quantitative metrics.
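For context, the energy minimized in such finite Gaussian mixture MRF segmentation typically combines a Gaussian data term with a Potts smoothness prior; the exact parameterization used in the paper may differ, so this is only a generic sketch:

E(\mathbf{x}) = \sum_{i} \left[ \frac{(y_i - \mu_{x_i})^2}{2\sigma_{x_i}^2} + \ln \sigma_{x_i} \right] + \beta \sum_{(i,j) \in \mathcal{N}} \mathbf{1}[x_i \neq x_j]

Here y_i is the observed intensity at voxel i, x_i its tissue label, (\mu_k, \sigma_k) the Gaussian parameters of class k, \mathcal{N} the set of neighboring voxel pairs, and \beta the smoothness weight; graph cuts, loopy belief propagation, tree-reweighted message passing, and ICM are alternative ways of (approximately) minimizing such an energy.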
Abstract:
This paper analyzes whether standard covariance matrix tests work when dimensionality is large, and in particular larger than sample size. In the latter case, the singularity of the sample covariance matrix makes likelihood ratio tests degenerate, but other tests based on quadratic forms of sample covariance matrix eigenvalues remain well-defined. We study the consistency property and limiting distribution of these tests as dimensionality and sample size go to infinity together, with their ratio converging to a finite non-zero limit. We find that the existing test for sphericity is robust against high dimensionality, but not the test for equality of the covariance matrix to a given matrix. For the latter test, we develop a new correction to the existing test statistic that makes it robust against high dimensionality.
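One classical quadratic-form statistic of this kind is John's sphericity statistic, based on the p-dimensional sample covariance matrix S; the abstract does not spell out which variant is analyzed, so take this only as an illustration:

U = \frac{1}{p} \operatorname{tr}\!\left[\left(\frac{S}{(1/p)\operatorname{tr}(S)} - I_p\right)^{2}\right]

Because U depends on S only through moments of its eigenvalues, it remains well-defined even when S is singular (p > n), which is the setting of interest here.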
Abstract:
OBJECTIVES: We studied the influence of noninjecting and injecting drug use on mortality, dropout rate, and the course of antiretroviral therapy (ART) in the Swiss HIV Cohort Study (SHCS). METHODS: Cohort participants, registered prior to April 2007 and with at least one drug use questionnaire completed by May 2013, were categorized according to their self-reported drug use behaviour. The probabilities of death and dropout were separately analysed using multivariable competing risks proportional hazards regression models with mutual correction for the other endpoint. Furthermore, we describe the influence of drug use on the course of ART. RESULTS: A total of 6529 participants (including 31% women) were followed during 31 215 person-years; 5.1% of participants died; 10.5% were lost to follow-up. Among persons with homosexual or heterosexual HIV transmission, noninjecting drug use was associated with higher all-cause mortality [subhazard ratio (SHR) 1.73; 95% confidence interval (CI) 1.07-2.83], compared with no drug use. Also, mortality was increased among former injecting drug users (IDUs) who reported noninjecting drug use (SHR 2.34; 95% CI 1.49-3.69). Noninjecting drug use was associated with higher dropout rates. The mean proportion of time with suppressed viral replication was 82.2% in all participants, irrespective of ART status, and 91.2% in those on ART. Drug use lowered adherence and increased rates of ART change and ART interruptions. Virological failure on ART was more frequent in participants who reported concomitant drug injections while on opiate substitution, and in current IDUs, but not among noninjecting drug users. CONCLUSIONS: Noninjecting drug use and injecting drug use are modifiable risks for death, and they lower retention in a cohort and complicate ART.
Abstract:
The paper contrasts empirically the results of alternative methods for estimating the value and the depreciation of mineral resources. The historical data of Mexico and Venezuela, covering the period from the 1920s to the 1980s, are used to contrast the results of several methods. These are the present value, the net price method, the user cost method, and the imputed income method. The paper establishes that the net price and the user cost are not competing methods as such, but alternative adjustments to different scenarios of closed and open economies. The results prove that the biases of the methods, as commonly described in the theoretical literature, only hold under the most restricted scenario of constant rents over time. It is argued that the difference between what is expected to happen and what actually did happen is for the most part due to a missing variable, namely technological change. This is an important caveat to the recommendations made based on these models.
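As a hedged sketch of two of the quantities involved (the notation is introduced here for illustration, not taken from the paper): the present value of a resource generating rents R_t over a remaining lifetime of T years at discount rate r, and El Serafy's user-cost rule, which splits a constant rent R into a true-income share X and a depreciation (user-cost) share:

PV = \sum_{t=1}^{T} \frac{R_t}{(1+r)^{t}}, \qquad \frac{X}{R} = 1 - \frac{1}{(1+r)^{T+1}}

The net price method, by contrast, values depreciation as the unit rent (price minus marginal extraction cost) times the quantity extracted in the period.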
Abstract:
Consider the problem of testing k hypotheses simultaneously. In this paper, we discuss finite and large sample theory of stepdown methods that provide control of the familywise error rate (FWE). In order to improve upon the Bonferroni method or Holm's (1979) stepdown method, Westfall and Young (1993) make effective use of resampling to construct stepdown methods that implicitly estimate the dependence structure of the test statistics. However, their methods depend on an assumption called subset pivotality. The goal of this paper is to construct general stepdown methods that do not require such an assumption. In order to accomplish this, we take a close look at what makes stepdown procedures work, and a key component is a monotonicity requirement on critical values. By imposing such monotonicity on estimated critical values (which is not an assumption on the model but an assumption on the method), it is demonstrated that the problem of constructing a valid multiple test procedure which controls the FWE can be reduced to the problem of constructing a single test which controls the usual probability of a Type I error. This reduction allows us to draw upon an enormous resampling literature as a general means of test construction.
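As a minimal, self-contained illustration of a stepdown procedure of the kind being generalized, the snippet below implements Holm's (1979) method, which controls the FWE with fixed (non-resampled) critical values; the paper's resampling-based construction is considerably more general:

import numpy as np

def holm_stepdown(p_values, alpha=0.05):
    """Holm's stepdown procedure: returns a boolean rejection decision
    for each hypothesis while controlling the familywise error rate."""
    p = np.asarray(p_values, dtype=float)
    k = len(p)
    order = np.argsort(p)                 # step down from the smallest p-value
    reject = np.zeros(k, dtype=bool)
    for step, idx in enumerate(order):
        # Compare the (step+1)-th smallest p-value with alpha / (k - step).
        if p[idx] <= alpha / (k - step):
            reject[idx] = True
        else:
            break                         # first failure: stop rejecting
    return reject

print(holm_stepdown([0.001, 0.04, 0.03, 0.2]))

Note that the thresholds alpha/(k - step) increase as the procedure steps down, an instance of the monotone critical values the abstract refers to.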