989 results for Log steaming


Relevance:

10.00%

Publisher:

Abstract:

Context: Peritoneal dialysis (PD) is an extrarenal blood-purification technique that uses the physiological properties of the peritoneum as a dialysis membrane. The technique requires a catheter surgically placed in the pouch of Douglas to allow instillation of a dialysis solution, the dialysate. One of the feared complications of this technique is infectious peritonitis, which requires the rapid administration of adequate antibiotic therapy. Peritonitis can sometimes lead to removal of the dialysis catheter with definitive failure of the technique or, more rarely, to the patient's death. This study examines predictive factors for this complication. It catalogues the organisms involved and their sensitivity to different antibiotics. It also analyses the consequences of peritonitis, such as the mean length of hospital stay, technique failures requiring definitive transfer to haemodialysis, and the occurrence of death. Methods: This is a retrospective single-centre study of the records of patients enrolled in the peritoneal dialysis programme of the CHUV between 1 January 1995 and 31 December 2010. Results: The study includes 108 patients, 65 men and 43 women. Mean age was 52.5 ± 17.84 years (range 22-87). We recorded 113 episodes of peritonitis over a cumulative exposure of 2932.24 patient-months. The overall incidence of peritonitis was 1 episode per 25.95 patient-months. Median overall peritonitis-free survival was 23.56 months.
Statistically significant between-group differences in peritonitis-free survival were demonstrated between autonomous and non-autonomous patients [log-rank (Mantel-Cox): 0.04], between diabetic and non-diabetic patients [log-rank (Mantel-Cox): 0.002], and between patients with a Charlson score above 5 and those with a score of 5 or less [log-rank (Mantel-Cox): 0.002]. A statistically significant difference in technique survival was also demonstrated between autonomous and non-autonomous patients [log-rank (Mantel-Cox) < 0.001] and between patients with a Charlson score above 5 and those with a score of 5 or less [log-rank (Mantel-Cox): 0.047]. Staphylococcus epidermidis was the pathogen most frequently isolated during peritonitis episodes (23.9%); this organism showed 40.74% sensitivity to oxacillin. No case of MRSA peritonitis was recorded in this patient cohort. One peritonitis episode caused a patient's death (<1%). Conclusion: The calculated peritonitis incidence satisfies the recommendations of the International Society for Peritoneal Dialysis (ISPD). Statistically significant between-group differences in peritonitis-free survival were shown for autonomy, metabolic status, and the Charlson comorbidity score. Statistically significant between-group differences in technique survival were also demonstrated for autonomy and the Charlson comorbidity score. Sensitivity statistics show excellent antibiotic coverage of the isolated organisms by the current empirical treatment (vancomycin + ceftazidime). Peritonitis-related mortality was extremely low in this cohort.
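The peritonitis-free survival figures above come from Kaplan-Meier estimates compared with log-rank (Mantel-Cox) tests. A minimal sketch of the Kaplan-Meier product-limit estimator, using made-up (time, event) pairs rather than the study's data:

```python
def kaplan_meier(samples):
    """Product-limit survival curve; samples = [(time, event)], event=1 means
    the endpoint (here, peritonitis) occurred, 0 means the time is censored."""
    event_times = sorted({t for t, e in samples if e})
    surv, curve = 1.0, []
    for t in event_times:
        at_risk = sum(1 for tt, _ in samples if tt >= t)
        events = sum(1 for tt, e in samples if tt == t and e)
        surv *= 1.0 - events / at_risk       # multiply conditional survival
        curve.append((t, surv))
    return curve

# Illustrative follow-up times in months (invented, not study data)
data = [(3, 1), (8, 0), (12, 1), (20, 1), (24, 0), (30, 1)]
curve = kaplan_meier(data)
```

A log-rank test would then compare two such curves (e.g. autonomous vs non-autonomous patients) by contrasting observed and expected event counts at each event time.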


OBJECTIVES: (1) To evaluate the changes in surface roughness and gloss after simulated toothbrushing of 9 composite materials and 2 ceramic materials in relation to brushing time and load in vitro; (2) to assess the relationship between surface gloss and surface roughness. METHODS: Eight flat specimens of the composite materials (microfilled: Adoro, Filtek Supreme, Heliomolar; microhybrid: Four Seasons, Tetric EvoCeram; hybrid: Compoglass F, Targis, Tetric Ceram; macrohybrid: Grandio) and the two ceramic materials (IPS d.SIGN and IPS Empress, polished) were fabricated according to the manufacturers' instructions and optimally polished with up to 4000-grit SiC. The specimens were subjected to a toothbrushing (TB) simulation device (Willytec) with rotating movements, toothpaste slurry and three different loads (100g/250g/350g). At hourly intervals from 1h to 10h TB, mean surface roughness Ra was measured with an optical sensor and surface gloss (Gl) with a glossmeter. Statistical analysis was performed on log-transformed Ra data, applying two-way ANOVA to evaluate the interactions between load and material and between load and brushing time. RESULTS: There was a significant interaction between material and load as well as between load and brushing time (p<0.0001). The microhybrid and hybrid materials demonstrated more surface deterioration with higher loads, whereas for the microfilled resins Heliomolar and Adoro it was vice versa. For the ceramic materials, no or little deterioration was observed over time, independent of the load. The ceramic materials and 3 of the composite materials (roughness) showed no further deterioration after 5h of toothbrushing. Mean surface gloss was the parameter which discriminated best between the materials, followed by mean surface roughness Ra. There was a strong correlation between surface gloss and surface roughness for all the materials except the ceramics.
The evaluation of the deterioration curves of individual specimens revealed a more or less synchronous course, hinting at specific external conditions and not showing the true variability in relation to the tested material. SIGNIFICANCE: The surface roughness and gloss of dental materials change with brushing time and load, and thus result in different material rankings. Apart from Grandio, the hybrid composite resins were more prone to surface changes than the microfilled composites. The deterioration potential of a composite material can be quickly assessed by measuring surface gloss; for this purpose, a brushing time of 10h (=72,000 strokes) is needed. In further comparative studies, specimens of different materials should be tested in one series to estimate the true variability.
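The reported gloss-roughness relationship amounts to a correlation between gloss and log-transformed Ra. A sketch with invented measurements (the study's raw data are not reproduced here):

```python
import math

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

ra = [0.05, 0.08, 0.15, 0.30, 0.55]   # invented mean roughness values, um
gloss = [88, 80, 65, 50, 35]          # invented gloss readings
r = pearson_r([math.log(x) for x in ra], gloss)   # strongly negative
```

Log-transforming Ra before the correlation (and before the ANOVA in the abstract) compresses the right-skewed roughness scale so that multiplicative deterioration shows up as additive effects.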


We analyze crash data collected by the Iowa Department of Transportation using Bayesian methods. The data set includes monthly crash numbers, estimated monthly traffic volumes, site length and other information collected at 30 paired sites in Iowa over more than 20 years during which an intervention experiment was set up. The intervention consisted in transforming 15 undivided road segments from four-lane to three lanes, while an additional 15 segments, thought to be comparable in terms of traffic safety-related characteristics were not converted. The main objective of this work is to find out whether the intervention reduces the number of crashes and the crash rates at the treated sites. We fitted a hierarchical Poisson regression model with a change-point to the number of monthly crashes per mile at each of the sites. Explanatory variables in the model included estimated monthly traffic volume, time, an indicator for intervention reflecting whether the site was a “treatment” or a “control” site, and various interactions. We accounted for seasonal effects in the number of crashes at a site by including smooth trigonometric functions with three different periods to reflect the four seasons of the year. A change-point at the month and year in which the intervention was completed for treated sites was also included. The number of crashes at a site can be thought to follow a Poisson distribution. To estimate the association between crashes and the explanatory variables, we used a log link function and added a random effect to account for overdispersion and for autocorrelation among observations obtained at the same site. We used proper but non-informative priors for all parameters in the model, and carried out all calculations using Markov chain Monte Carlo methods implemented in WinBUGS. 
We evaluated the effect of the four- to three-lane conversion by comparing the expected number of crashes per year per mile during the years preceding the conversion and following the conversion for treatment and control sites. We estimated this difference using the observed traffic volumes at each site and also on a per-100,000,000-vehicles basis. We also conducted a prospective analysis to forecast the expected number of crashes per mile at each site in the study one year, three years and five years following the four- to three-lane conversion. Posterior predictive distributions of the number of crashes, the crash rate and the percent reduction in crashes per mile were obtained for each site for the months of January and June one, three and five years after completion of the intervention. The model appears to fit the data well. We found that in most sites, the intervention was effective and reduced the number of crashes. Overall, and for the observed traffic volumes, the reduction in the expected number of crashes per year and mile at converted sites was 32.3% (31.4% to 33.5% with 95% probability), while at the control sites the reduction was estimated to be 7.1% (5.7% to 8.2% with 95% probability). When the reduction in the expected number of crashes per year, mile and 100,000,000 AADT was computed, the estimates were 44.3% (43.9% to 44.6%) and 25.5% (24.6% to 26.0%) for converted and control sites, respectively. In both cases, the percent reduction in the expected number of crashes during the years following the conversion was significantly larger at converted sites than at control sites, even though the number of crashes appears to decline over time at all sites. Results indicate that the reduction in the expected number of crashes per mile has a steeper negative slope at converted than at control sites.
Consistent with this, the forecasted reduction in the number of crashes per year and mile during the years after completion of the conversion at converted sites is more pronounced than at control sites. Seasonal effects on the number of crashes have been well documented. In this dataset we found that, as expected, the expected number of monthly crashes per mile tends to be higher during winter months than during the rest of the year. Perhaps more interestingly, we found that there is an interaction between the four- to three-lane conversion and season; the reduction in the number of crashes appears to be more pronounced during months when the weather is nice than during other times of the year, even though a reduction was estimated for the entire year. Thus, it appears that the four- to three-lane conversion, while effective year-round, is particularly effective in reducing the expected number of crashes in nice weather.
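A sketch of the mean structure described above: a Poisson mean on the log scale with a linear trend, an intervention change-point, and trigonometric seasonal terms. All coefficient values here are invented for illustration, not estimates from the paper:

```python
import math

def expected_crashes(month, treated, converted_month,
                     b0=-0.5, b_trend=-0.002, b_change=-0.4,
                     seasonal=((0.15, 12), (0.05, 6), (0.02, 4))):
    """Poisson mean (crashes per mile) via a log link: intercept + trend
    + change-point drop for treated sites + cosine seasonal terms."""
    eta = b0 + b_trend * month
    if treated and month >= converted_month:
        eta += b_change                       # drop in log-mean after conversion
    for amp, period in seasonal:              # three periods ~ four seasons
        eta += amp * math.cos(2 * math.pi * month / period)
    return math.exp(eta)                      # inverse of the log link

before = expected_crashes(59, treated=True, converted_month=60)
after = expected_crashes(72, treated=True, converted_month=60)
```

The full hierarchical model additionally carries a per-site random effect for overdispersion and autocorrelation, and the posterior of these coefficients would be sampled by MCMC (WinBUGS in the paper) rather than fixed as here.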


The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P) with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, very elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information such as Bayesian updating, combination of likelihoods, and robust M-estimation functions are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turns out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
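The centered log-ratio (clr) transform and the Aitchison distance referred to above can be sketched directly; the example exploits the fact that the distance is invariant to rescaling a composition:

```python
import math

def clr(x):
    """Centered log-ratio transform of a composition of positive parts:
    log of each part divided by the geometric mean of all parts."""
    g = math.exp(sum(math.log(v) for v in x) / len(x))   # geometric mean
    return [math.log(v / g) for v in x]

def aitchison_distance(x, y):
    """Aitchison distance = Euclidean distance between clr images."""
    return math.dist(clr(x), clr(y))

# [0.4, 0.6, 1.0] is [0.2, 0.3, 0.5] rescaled, i.e. the same composition,
# so its Aitchison distance from it is (numerically) zero.
d = aitchison_distance([0.2, 0.3, 0.5], [0.4, 0.6, 1.0])
```

The clr image always sums to zero, which is the finite-dimensional shadow of the centering that defines A2(P).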


We obtain minimax lower bounds on the regret for the classical two-armed bandit problem. We provide a finite-sample minimax version of the well-known log n asymptotic lower bound of Lai and Robbins. Also, in contrast to the log n asymptotic results on the regret, we show that the minimax regret is achieved by mere random guessing under fairly mild conditions on the set of allowable configurations of the two arms. That is, we show that for every allocation rule and for every n, there is a configuration such that the regret at time n is at least (1 - ε) times the regret of random guessing, where ε is any small positive constant.
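The baseline in that statement, the regret of random guessing, can be illustrated with a small Monte-Carlo sketch of a two-armed Bernoulli bandit; the arm probabilities, horizon and trial count below are arbitrary choices, not values from the paper:

```python
import random

def random_guess_regret(p1, p2, n, trials=2000, seed=0):
    """Monte-Carlo estimate of the expected regret at time n when each
    pull picks one of the two Bernoulli arms uniformly at random."""
    rng = random.Random(seed)
    best = max(p1, p2)
    total = 0.0
    for _ in range(trials):
        reward = sum(rng.random() < (p1 if rng.random() < 0.5 else p2)
                     for _ in range(n))
        total += n * best - reward          # shortfall vs always-best-arm
    return total / trials

r = random_guess_regret(0.4, 0.6, n=100)    # expectation is n*(0.6-0.5) = 10
```

Random guessing incurs regret linear in n, n(p* - (p1+p2)/2) in expectation, and the quoted result says no allocation rule can beat essentially that rate uniformly over configurations.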


We show that the welfare of a representative consumer can be related to observable aggregate data. To a first order, the change in welfare is summarized by (the present value of) the Solow productivity residual and by the growth rate of the capital stock per capita. We also show that productivity and the capital stock suffice to calculate differences in welfare across countries, with both variables computed as log level deviations from a reference country. These results hold for arbitrary production technology, regardless of the degree of product market competition, and apply to open economies as well if TFP is constructed using absorption rather than GDP as the measure of output. They require that TFP be constructed using prices and quantities as perceived by consumers. Thus, factor shares need to be calculated using after-tax wages and rental rates, and will typically sum to less than one. We apply these results to calculate welfare gaps and growth rates in a sample of developed countries for which high-quality TFP and capital data are available. We find that under realistic scenarios the United Kingdom and Spain had the highest growth rates of welfare over our sample period of 1985-2005, but the United States had the highest level of welfare.
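To first order, the welfare change described above is the present value of Solow-residual growth plus the growth of capital per capita. A toy arithmetic sketch; the discount factor and all log changes are invented for illustration:

```python
# First-order welfare-change bookkeeping, following the abstract's decomposition:
# welfare change ~ present value of Solow residual growth
#                  + growth of the capital stock per capita.
beta = 0.96                                  # assumed discount factor
tfp_growth = [0.010, 0.012, 0.008, 0.011]    # assumed log TFP changes by year
capital_growth = 0.015                       # assumed log change of capital p.c.

pv_tfp = sum(beta ** t * g for t, g in enumerate(tfp_growth))
welfare_change = pv_tfp + capital_growth     # in log (percentage) units
```

Cross-country welfare gaps work the same way, with the log changes replaced by log level deviations from a reference country.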


OBJECTIVES: To determine clinical and ultrasonographic predictors of joint replacement surgery across Europe in primary osteoarthritis (OA) of the knee. METHODS: This was a 3-year prospective study of a painful OA knee cohort (from a EULAR-sponsored, multicentre study). All subjects had clinical evaluation, radiographs and ultrasonography (US) at study entry. The rate of knee replacement surgery over the 3-year follow-up period was determined using Kaplan-Meier survival data analyses. Predictive factors for joint replacement were identified by univariate log-rank test, then multivariate analysis using a Cox proportional-hazards regression model. Potential baseline predictors included demographic, clinical, radiographic and US features. RESULTS: Of the 600 original patients, 531 (88.5%), mean age 67±10 years, mean disease duration 6.1±6.9 years, had follow-up data and were analysed. During follow-up (median 3 years; range 0-4 years), knee replacement was done or required for 94 patients (estimated event rate of 17.7%). In the multivariate analysis, predictors of joint replacement were as follows: Kellgren and Lawrence radiographic grade (grade ≥III vs <III, hazard ratio (HR) = 4.08 (95% CI 2.34 to 7.12), p<0.0001); ultrasonographic knee effusion (≥4 mm vs <4 mm) (HR = 2.63 (95% CI 1.70 to 4.06), p<0.0001); knee pain intensity on a 0-100 mm visual analogue scale (≥60 vs <60) (HR = 1.81 (95% CI 1.15 to 2.83), p=0.01) and disease duration (≥5 years vs <5 years) (HR = 1.63 (95% CI 1.08 to 2.47), p=0.02). Clinically detected effusion and US synovitis were not associated with joint replacement in the univariate analysis. CONCLUSION: Longitudinal evaluation of this OA cohort demonstrated significant progression to joint replacement. In addition to severity of radiographic damage and pain, US-detected effusion was a predictor of subsequent joint replacement.
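Under a Cox proportional-hazards model as used above, the hazard is h(t|x) = h0(t)·exp(b·x), so a patient's hazard relative to baseline is the product of the hazard ratios of the risk factors present (a sketch that ignores interaction terms, which is a simplification). Using the HRs reported in the abstract:

```python
import math

# Reported multivariate hazard ratios from the abstract
hr = {"KL_grade_geq_III": 4.08, "effusion_geq_4mm": 2.63,
      "pain_VAS_geq_60": 1.81, "duration_geq_5y": 1.63}

def relative_hazard(present):
    """Hazard relative to a baseline patient: exp of the summed log-HRs
    of the risk factors present (no-interaction simplification)."""
    b = sum(math.log(hr[k]) for k in present)
    return math.exp(b)

worst = relative_hazard(hr)      # all four risk factors present
```

This is only an illustration of how Cox coefficients combine; the paper's model would be needed for actual patient-level predictions.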


BACKGROUND: Replicative phenotypic HIV resistance testing (rPRT) uses recombinant infectious virus to measure viral replication in the presence of antiretroviral drugs. Owing to its high sensitivity in detecting viral minorities and its dissecting power for complex viral resistance patterns and mixed virus populations, rPRT might help to improve HIV resistance diagnostics, particularly for patients with multiple drug failures. The aim was to investigate whether the addition of rPRT to genotypic resistance testing (GRT), compared to GRT alone, is beneficial for obtaining a virological response in heavily pre-treated HIV-infected patients. METHODS: Patients with resistance tests between 2002 and 2006 were followed within the Swiss HIV Cohort Study (SHCS). We assessed patients' virological success after their antiretroviral therapy was switched following resistance testing. Multilevel logistic regression models with SHCS centre as a random effect were used to investigate the association between the type of resistance test and virological response (HIV-1 RNA <50 copies/mL or ≥1.5 log reduction). RESULTS: Of 1158 individuals with resistance tests, 221 with GRT+rPRT and 937 with GRT were eligible for analysis. Overall virological response rates were 85.1% for GRT+rPRT and 81.4% for GRT. In the subgroup of patients with >2 previous failures, the odds ratio (OR) for virological response of GRT+rPRT compared to GRT was 1.45 (95% CI 1.00-2.09). Multivariate analyses indicate a significant improvement with GRT+rPRT compared to GRT alone (OR 1.68, 95% CI 1.31-2.15). CONCLUSIONS: In heavily pre-treated patients, rPRT-based resistance information adds benefit, contributing to a higher rate of treatment success.
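The virological response criterion above (HIV-1 RNA <50 copies/mL or a ≥1.5 log reduction) is easy to make precise; a sketch with invented viral-load values, not patient data:

```python
import math

def virological_response(baseline_copies, followup_copies):
    """Response = viral load below 50 copies/mL, or a drop of at least
    1.5 log10 from baseline (the endpoint used in the abstract)."""
    if followup_copies < 50:
        return True
    drop = math.log10(baseline_copies) - math.log10(followup_copies)
    return drop >= 1.5

ok = virological_response(100_000, 2_000)   # 1.7 log10 drop -> response
```

Working on the log10 scale is what makes a "1.5 log reduction" mean a roughly 32-fold fall in viral load regardless of the starting level.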


Power transformations of positive data tables, prior to applying the correspondence analysis algorithm, are shown to open up a family of methods with direct connections to the analysis of log-ratios. Two variations of this idea are illustrated. The first approach is simply to power the original data and perform a correspondence analysis; this method is shown to converge to unweighted log-ratio analysis as the power parameter tends to zero. The second approach is to apply the power transformation to the contingency ratios, that is, the values in the table relative to expected values based on the marginals; this method converges to weighted log-ratio analysis, or the spectral map. Two applications are described: first, a matrix of population genetic data which is inherently two-dimensional, and second, a larger cross-tabulation with higher dimensionality, from a linguistic analysis of several books.
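The convergence of powered data to log-ratio analysis rests on the elementary limit (x^a - 1)/a -> log x as the power parameter a tends to zero. A quick numerical check of that limit:

```python
import math

def power_transform(x, alpha):
    """Box-Cox-style power transform; tends to log(x) as alpha -> 0."""
    return (x ** alpha - 1) / alpha

x = 3.7
gap = abs(power_transform(x, 1e-6) - math.log(x))   # shrinks with alpha
```

Since the transform differs from log x by about alpha*(log x)^2/2, halving alpha roughly halves the gap, which is why correspondence analysis of powered data slides continuously into log-ratio analysis.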


The Cretaceous Mont Saint-Hilaire complex (Quebec, Canada) comprises three major rock units that were emplaced in the following sequence: (I) gabbros; (II) diorites; (III) diverse, partly agpaitic foid syenites. The major element compositions of the rock-forming minerals, age-corrected Nd and oxygen isotope data for mineral separates, and trace element data of Fe-Mg silicates from the various lithologies imply a common source for all units. The distribution of the rare earth elements in clinopyroxene from the gabbros indicates an ocean island basalt type composition for the parental magma. Gabbros record temperatures of 1200 to 800°C, variable silica activities between 0.7 and 0.3, and f(O2) values between -0.5 and +0.7 (log ΔFMQ, where FMQ is fayalite-magnetite-quartz). The diorites crystallized under uniform a(SiO2) (a(SiO2) = 0.4-0.5) and more reduced f(O2) conditions (log ΔFMQ ≈ -1) between ~1100 and ~800°C. Phase equilibria in various foid syenites indicate that silica activities decrease from 0.6-0.3 at ~1000°C to <0.3 at ~550°C. Release of an aqueous fluid during the transition to the hydrothermal stage caused a(SiO2) to drop to very low values, which results from reduced SiO2 solubilities in aqueous fluids compared with silicate melts. During the hydrothermal stage, high water activities stabilized zeolite-group minerals. Fluid inclusions record a complex post-magmatic history, which includes trapping of an aqueous fluid that unmixed from the restitic foid syenitic magma. Cogenetic aqueous and carbonic fluid inclusions reflect heterogeneous trapping of coexisting immiscible external fluids in the latest evolutionary stage.
The O and C isotope characteristics of fluid-inclusion hosted CO2 and late-stage carbonates imply that the surrounding limestones were the source of the external fluids. The mineral-rich syenitic rocks at Mont Saint-Hilaire evolved as follows: first, alkalis, high field strength and large ion lithophile elements were pre-enriched in the (late) magmatic and subsequent hydrothermal stages; second, percolation of external fluids in equilibrium with the carbonate host-rocks, mixing processes with internal fluids, and fluid-rock interaction governed dissolution of pre-existing minerals, element transport and precipitation of mineral assemblages determined by locally variable parameters. It is this hydrothermal interplay between internal and external fluids that is responsible for the mineral wealth found at Mont Saint-Hilaire.


Purpose/Objective(s): Primary bone lymphoma (PBL) represents less than 1% of all malignant lymphomas and 4-5% of all extranodal lymphomas. In this study, we assessed the disease profile, outcome, and prognostic factors in patients with stage I and II PBL. Materials/Methods: Between 1987 and 2008, 116 consecutive patients with PBL treated in 13 RCN institutions were included in this study. Inclusion criteria were: age ≥17 yrs, PBL in stage I and II, and a minimum 6 months of follow-up. The median age was 51 yrs (range: 17-93). Diagnostic work-up included plain bone X-ray (74% of patients), scintigraphy (62%), CT-scan (65%), MRI (58%), PET (18%), and bone-marrow biopsy (84%). All patients had biopsy-proven confirmation of non-Hodgkin's lymphoma (NHL). The histopathological type was predominantly diffuse large B-cell lymphoma (78%) and follicular lymphoma (6%), according to the WHO classification. One hundred patients had a high-grade, 7 an intermediate and 9 a low-grade NHL. Ninety-three patients had an Ann Arbor stage I, and 23 had a stage II. Seventy-seven patients underwent chemoradiotherapy (CXRT), 12 radiotherapy (RT) alone, 10 chemotherapy alone (CXT), 9 surgery followed by CXRT, 5 surgery followed by CXT, and 2 surgery followed by RT. One patient died before treatment. Median RT dose was 40 Gy (range: 4-60). The median number of CXT cycles was 6 (range: 2-8). Median follow-up was 41 months (range: 6-242). Results: Following treatment, the overall response rate was 91% (CR 74%, PR 17%). Local recurrence was observed in 12 (10%) patients, and systemic recurrence in 17 (15%) patients. Causes of death included disease progression in 16, unrelated disease in 6, CXT-related toxicity in 1, and secondary cancer in 2 patients. The 5-yr overall survival (OS), disease-free survival (DFS), lymphoma-specific survival (LSS), and local control (LC) were 76%, 69%, 78%, and 92%, respectively.
In univariate analyses (log-rank test), favorable prognostic factors for survival were: age <50 years (p = 0.008), IPI score ≤1 (p = 0.009), complete response (p < 0.001), CXT (p = 0.008), number of CXT cycles ≥6 (p = 0.007), and RT dose >40 Gy (p = 0.005). In multivariate analysis, age, RT dose, complete response, and absence of B symptoms were independent factors influencing the outcome. Three patients developed grade 3 or higher (CTCAE v3.0) toxicities. Conclusions: This large multicenter study confirms the relatively good prognosis of early-stage PBL treated with combined CXRT. Local control was excellent, and systemic failure occurred infrequently. A sufficient dose of RT (>40 Gy) and a complete CXT regime (≥6 cycles) were associated with a better outcome. Combined modality appears to be the treatment of choice. Author Disclosure: L. Cai, None; M.C. Stauder, None; Y.J. Zhang, None; P. Poortmans, None; Y.X. Li, None; N. Constantinou, None; J. Thariat, None; S. Kadish, None; M. Ozsahin, None; R.O. Mirimanoff, None.


In order to interpret the biplot it is necessary to know which points (usually the variables) are the important contributors to the solution, and this information is available separately as part of the biplot's numerical results. We propose a new scaling of the display, called the contribution biplot, which incorporates this diagnostic directly into the graphical display, showing visually the important contributors and thus facilitating the biplot interpretation and often simplifying the graphical representation considerably. The contribution biplot can be applied to a wide variety of analyses such as correspondence analysis, principal component analysis, log-ratio analysis and the graphical results of a discriminant analysis/MANOVA; in fact, it applies to any method based on the singular value decomposition. In the contribution biplot, one set of points, usually the rows of the data matrix, optimally represents the spatial positions of the cases or sample units, according to some distance measure that usually incorporates some form of standardization unless all data are comparable in scale. The other set of points, usually the columns, is represented by vectors that are related to their contributions to the low-dimensional solution. A fringe benefit is that usually only one common scale for row and column points is needed on the principal axes, thus avoiding the problem of enlarging or contracting the scale of one set of points to make the biplot legible. Furthermore, this version of the biplot also solves the problem in correspondence analysis of low-frequency categories that are located on the periphery of the map, giving the false impression that they are important, when they are in fact contributing minimally to the solution.
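In SVD-based methods of this family, the contribution of a point to a principal axis is conventionally its mass times its squared principal coordinate, divided by the axis inertia; the contributions along an axis sum to 1. A sketch with invented masses and coordinates (not output of any particular analysis):

```python
# Contribution of each point to a principal axis: mass * coord^2 / axis_inertia.
# Masses and principal coordinates below are invented for illustration.
masses = [0.5, 0.3, 0.2]
coords_axis1 = [0.1, -0.4, 0.9]     # principal coordinates on axis 1

axis_inertia = sum(m * c * c for m, c in zip(masses, coords_axis1))
contributions = [m * c * c / axis_inertia
                 for m, c in zip(masses, coords_axis1)]
```

The contribution biplot rescales one set of points so that these quantities become visible as vector lengths rather than living only in a side table.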


We use aggregate GDP data and within-country income shares for the period 1970-1998 to assign a level of income to each person in the world. We then estimate the Gaussian kernel density function for the worldwide distribution of income. We compute world poverty rates by integrating the density function below the poverty lines. The $1/day poverty rate has fallen from 20% to 5% over the last twenty-five years. The $2/day rate has fallen from 44% to 18%. There are between 300 and 500 million fewer poor people in 1998 than there were in the 70s. We estimate global income inequality using seven different popular indexes: the Gini coefficient, the variance of log-income, two of Atkinson's indexes, the Mean Logarithmic Deviation, the Theil index and the coefficient of variation. All indexes show a reduction in global income inequality between 1980 and 1998. We also find that most global disparities can be accounted for by across-country, not within-country, inequalities. Within-country disparities have increased slightly during the sample period, but not nearly enough to offset the substantial reduction in across-country disparities. The across-country reductions in inequality are driven mainly, but not fully, by the large growth rate of the incomes of the 1.2 billion Chinese citizens. Unless Africa starts growing in the near future, we project that income inequalities will start rising again. If Africa does not start growing, then China, India, the OECD and the rest of the middle-income and rich countries will diverge away from it, and global inequality will rise. Thus, the aggregate GDP growth of the African continent should be the priority of anyone concerned with rising global income inequality.
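The Gini coefficient, one of the seven indexes listed above, can be computed directly from a sorted income vector; a minimal sketch with toy incomes rather than the paper's data:

```python
def gini(incomes):
    """Gini coefficient via the standard sorted-rank formula:
    G = 2 * sum(i * x_i) / (n * sum(x)) - (n + 1) / n, with x sorted."""
    xs = sorted(incomes)
    n = len(xs)
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * sum(xs)) - (n + 1) / n

g_equal = gini([10, 10, 10, 10])    # perfect equality -> 0
g_skewed = gini([1, 2, 3, 94])      # one person holds almost everything
```

The $1/day and $2/day headcounts in the abstract are the complementary operation: integrating the estimated kernel density below a fixed poverty line instead of summarizing its spread.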


Valganciclovir (VGC) has proved efficacious and safe for prophylaxis against cytomegalovirus (CMV) in high-risk transplant recipients and for the treatment of CMV retinitis in AIDS patients. We used VGC for the treatment of CMV infection (viremia without symptoms) or disease (CMV syndrome or tissue-invasive disease) in kidney, heart, and lung transplant recipients. Fourteen transplant recipients were treated: five for asymptomatic CMV infection and nine for CMV disease. VGC was administered in doses adjusted to renal function for 4 to 12 weeks (induction and maintenance therapy). Clinically, all nine patients with CMV disease responded to treatment. Microbiologically, treatment with VGC turned blood culture negative for CMV within 2 weeks in all patients and was associated with a ≥2 log decrease in blood CMV DNA within 3 weeks in 8 of 8 tested patients. With a follow-up of 6 months (n = 12 patients), asymptomatic recurrent CMV viremia was noted in five cases, and CMV syndrome in one case (all in the first 2 months after the end of treatment). VGC was clinically well tolerated in all patients; however, laboratory abnormalities occurred in three cases (mild increase in transaminases, thrombocytopenia, and pancytopenia). This preliminary experience strongly suggests that therapy with VGC is effective against CMV in organ transplant recipients; however, the exact duration of therapy remains to be determined: a longer course may be necessary to prevent early recurrence.


We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to "spectral mapping", a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
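The row and column weighting referred to above enters as a weighted double-centering of the log-transformed table before the dimension-reduction step (the SVD). A minimal sketch with invented weights and data, not an implementation of the paper's full method:

```python
import math

def weighted_double_center(table, row_w, col_w):
    """Log-transform a positive table, then subtract weighted row means
    and weighted column means, so weighted margins of the result are zero."""
    logs = [[math.log(v) for v in row] for row in table]
    row_means = [sum(w * v for w, v in zip(col_w, row)) for row in logs]
    centered = [[v - rm for v in row] for row, rm in zip(logs, row_means)]
    col_means = [sum(w * centered[i][j] for i, w in enumerate(row_w))
                 for j in range(len(col_w))]
    return [[v - cm for v, cm in zip(row, col_means)] for row in centered]

table = [[2.0, 4.0], [1.0, 8.0]]
z = weighted_double_center(table, row_w=[0.5, 0.5], col_w=[0.5, 0.5])
```

A weighted SVD of the doubly-centered matrix then yields the spectral-map display; choosing the weights as the table's margins is what restores distributional equivalence.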