942 results for LOW ASPECT RATIO
Impact of low-level viremia on clinical and virological outcomes in treated HIV-1-infected patients.
Abstract:
BACKGROUND: The goal of antiretroviral therapy (ART) is to reduce HIV-related morbidity and mortality by suppressing HIV replication. The prognostic value of persistent low-level viremia (LLV), particularly for clinical outcomes, is unknown. OBJECTIVE: To assess the association of different levels of LLV with virological failure, AIDS events, and death among HIV-infected patients receiving combination ART. METHODS: We analyzed data from 18 cohorts in Europe and North America contributing to the ART Cohort Collaboration. Eligible patients achieved a viral load below 50 copies/ml within 3-9 months after ART initiation. LLV50-199 was defined as two consecutive viral loads between 50 and 199 copies/ml, and LLV200-499 as two consecutive viral loads between 50 and 499 copies/ml with at least one between 200 and 499 copies/ml. We used Cox models to estimate the association of LLV with virological failure (two consecutive viral loads of at least 500 copies/ml, or one viral load of at least 500 copies/ml followed by a modification of ART) and AIDS event/death. RESULTS: Among 17 902 patients, 624 (3.5%) experienced LLV50-199 and 482 (2.7%) LLV200-499. Median follow-up was 2.3 and 3.1 years for virological and clinical outcomes, respectively. There were 1903 virological failures, 532 AIDS events and 480 deaths. LLV200-499 was strongly associated with virological failure [adjusted hazard ratio (aHR) 3.97, 95% confidence interval (CI) 3.05-5.17]. LLV50-199 was weakly associated with virological failure (aHR 1.38, 95% CI 0.96-2.00). Neither LLV50-199 nor LLV200-499 was associated with AIDS event/death (aHR 1.19, 95% CI 0.78-1.82; and aHR 1.11, 95% CI 0.72-1.71, respectively). CONCLUSION: LLV200-499 was strongly associated with virological failure, but not with AIDS event/death. Our results support the US guidelines, which define virological failure as a confirmed viral load above 200 copies/ml.
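The hazard ratios quoted above come from Cox proportional hazards models of time to virological failure and to AIDS/death. Purely as an illustration of that kind of model (the cohort data are not reproduced here), the sketch below fits a Cox model on simulated data with the lifelines library; all column names and values are hypothetical.

# Hypothetical sketch: estimating an adjusted hazard ratio for low-level
# viremia (LLV) with a Cox proportional hazards model using lifelines.
# Columns and values are simulated, not from the ART Cohort Collaboration.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "llv_200_499": rng.integers(0, 2, n),          # 1 = two consecutive VL 50-499, >=1 in 200-499
    "age": rng.normal(40, 10, n),
    "male": rng.integers(0, 2, n),
    "time_to_event": rng.exponential(5.0, n),      # years of follow-up (simulated)
    "virological_failure": rng.integers(0, 2, n),  # 1 = event observed, 0 = censored
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_event", event_col="virological_failure")
# exp(coef) for llv_200_499 is the adjusted hazard ratio for virological failure
print(cph.summary)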
Abstract:
BACKGROUND: Both the human immunodeficiency virus (HIV) and hepatitis C virus (HCV), either alone or as a coinfection, persist in their hosts by destroying and/or escaping immune defenses, with high morbidity as a consequence. In some cases, however, a balance between infection and immunity is reached, leading to prolonged asymptomatic periods. We report a case of such an indolent co-infection, which could be explained by the development of a peculiar subset of Natural Killer (NK) cells. RESULTS: Persistently high peripheral levels of CD56+ NK cells were observed in a hemophilia A patient co-infected with HIV and HCV who had low CD4 counts, an almost undetectable HIV viral load and no opportunistic infections. Thorough analysis of NK subsets identified a marked increase in the CD56bright/dim cell ratio and low numbers of CD16+/CD56- cells. These cells had high levels of natural cytotoxicity receptors but low NCR2 and CD69, and lacked both CD57 and CD25 expression. Degranulation, which correlates with target cytolysis, was atypically carried out mainly by CD56bright NK cells, whereas no production of interferon γ (IFN-γ) was observed following NK activation by K562 cells. CONCLUSIONS: These data suggest that the expansion and lytic capacity of the CD56bright NK subset may be involved in protecting this rare HIV/HCV co-infected hemophilia A patient from opportunistic infections and virus-related cancers despite very low CD4+ cell counts.
Abstract:
The spontaneous activity of the brain shows different features at different scales. On one hand, neuroimaging studies show that long-range correlations are highly structured in spatiotemporal patterns known as resting-state networks; on the other hand, neurophysiological reports show that short-range correlations between neighboring neurons are low, despite a large amount of shared presynaptic input. Different dynamical mechanisms of local decorrelation have been proposed, among which is feedback inhibition. Here, we investigated the effect of locally regulating feedback inhibition on the global dynamics of a large-scale brain model in which the long-range connections are given by diffusion imaging data from human subjects. We used simulations and analytical methods to show that locally constraining the feedback inhibition to compensate for the excess of long-range excitatory connectivity, and thereby preserve the asynchronous state, crucially changes the characteristics of the emergent resting and evoked activity. First, it significantly improves the model's prediction of the empirical human functional connectivity. Second, relaxing this constraint leads to unrealistic network evoked activity, with systematic coactivation of cortical areas that are components of the default-mode network, whereas regulation of feedback inhibition prevents this. Finally, information theoretic analysis shows that regulation of the local feedback inhibition increases both the entropy and the Fisher information of the network evoked responses. Hence, it enhances the information capacity and the discrimination accuracy of the global network. In conclusion, the local excitation-inhibition ratio shapes both the structure of the spontaneous activity and information transmission at the large-scale brain level.
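In large-scale models of this type, local regulation of feedback inhibition is typically implemented as an iterative tuning loop that raises or lowers each region's inhibitory weight until its activity returns to a common target despite heterogeneous long-range excitatory input. The sketch below illustrates that tuning idea on a toy linear rate model; the equations, parameters, and random connectivity are assumptions for illustration only, not the model used in the study.

# Toy sketch of local feedback inhibition control: tune a regional inhibitory
# weight J[i] until each region's steady-state rate matches a common target,
# despite heterogeneous long-range excitatory input.  Model and parameters
# are illustrative assumptions, not the study's actual dynamics.
import numpy as np

rng = np.random.default_rng(1)
n = 20
C = rng.random((n, n)) * 0.02          # surrogate long-range (structural) connectivity
np.fill_diagonal(C, 0.0)

I_ext, w_exc = 1.0, 0.5                # external drive and local excitatory gain
r_target, eta = 2.0, 0.05              # target rate and tuning step size
J = np.zeros(n)                        # local inhibitory weights, tuned below

def steady_rates(J):
    """Steady state of tau*dr/dt = -r + I_ext + (w_exc - J)*r + C @ r (linear toy model)."""
    W = np.diag(w_exc - J) + C
    return np.linalg.solve(np.eye(n) - W, np.full(n, I_ext))

for _ in range(100):                   # iterative tuning loop
    r = steady_rates(J)
    J += eta * (r - r_target)          # strengthen inhibition where the rate is too high
    J = np.clip(J, 0.0, None)

print(np.round(steady_rates(J), 3))    # all regions end up near r_target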
Abstract:
The quantity of interest for high-energy photon beam therapy recommended by most dosimetric protocols is the absorbed dose to water. Thus, ionization chambers are calibrated in absorbed dose to water, which is the same quantity as that calculated by most treatment planning systems (TPS). However, when measurements are performed in a low-density medium, the presence of the ionization chamber generates a perturbation at the level of the secondary particle range. Therefore, the measured quantity is close to the absorbed dose to a volume of water equivalent to the chamber volume. This quantity is not equivalent to the dose calculated by a TPS, which is the absorbed dose to an infinitesimally small volume of water. This phenomenon can lead to an overestimation of the absorbed dose measured with an ionization chamber of up to 40% in extreme cases. In this paper, we propose a method to calculate correction factors based on Monte Carlo simulations. These correction factors are obtained as the ratio of the absorbed dose to water in the low-density medium, D̄(w,Q,V1)(low), averaged over a scoring volume V1 for a geometry where V1 is filled with the low-density medium, to the absorbed dose to water, D̄(w,Q,V2)(low), averaged over a volume V2 for a geometry where V2 is filled with water. In the Monte Carlo simulations, D̄(w,Q,V2)(low) is obtained by replacing the volume of the ionization chamber by an equivalent volume of water, in accordance with the definition of the absorbed dose to water. The method is validated in two different configurations, which allowed us to study the behavior of this correction factor as a function of depth in the phantom, photon beam energy, phantom density and field size.
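The correction factor itself is just the ratio of the two Monte Carlo scored doses, with their statistical uncertainties combined in quadrature. A minimal bookkeeping sketch, assuming the two averaged doses and their uncertainties have already been scored by a Monte Carlo code (the numbers below are placeholders, not results from the paper):

# Sketch of the correction-factor bookkeeping described above.  D_v1 is the
# dose to water averaged over V1 with V1 filled with the low-density medium;
# D_v2 is the dose to water averaged over V2 with V2 filled with water
# (chamber replaced by water).  Values are placeholders.

def correction_factor(d_w_v1, u_v1, d_w_v2, u_v2):
    """Return k = D_w,Q,V1(low) / D_w,Q,V2(low) and its absolute uncertainty
    (uncorrelated relative uncertainties added in quadrature)."""
    k = d_w_v1 / d_w_v2
    rel_u = ((u_v1 / d_w_v1) ** 2 + (u_v2 / d_w_v2) ** 2) ** 0.5
    return k, k * rel_u

# Example with placeholder values (Gy per primary particle, 1-sigma uncertainties)
k, u_k = correction_factor(d_w_v1=2.45e-16, u_v1=0.01e-16,
                           d_w_v2=3.30e-16, u_v2=0.01e-16)
print(f"correction factor k = {k:.3f} +/- {u_k:.3f}")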
Abstract:
OBJECTIVE: The aim of the study was to determine whether the consumption of low-protein dietetic foods improved quality of life and nutritional status with respect to B vitamins and homocysteine in patients with chronic renal failure. METHODOLOGY: This nutritional intervention involved 28 men and 21 women, divided into two groups. The control group consumed the prescribed low-protein diet, and the experimental group consumed a diet in which some commonly used foods were replaced by low-protein dietetic foods. The study lasted 6 months. Food consumption was assessed by 24-h recall. Vitamin B6 status (αEAST) was measured in blood. Creatinine, urea, vitamin B12, folate and homocysteine were measured in plasma. The impact of consuming the dietetic foods on the patients' quality of life was assessed with the SF-36 questionnaire. RESULTS: After 6 months, protein intake in the experimental group had decreased by 40%, and the urea/creatinine ratio and αEAST activity were also lower. The results of the SF-36 questionnaire show that patients in the experimental group obtained higher scores in the categories of general health and physical status. CONCLUSIONS: The dietetic foods were very well accepted by all patients, and their use allowed better control of protein intake, improved B6 status and a better quality of life.
Abstract:
BACKGROUND: Intensive smoking cessation interventions in the hospital setting have not been widely adopted, possibly because of organizational barriers. In this study, we evaluated the efficacy of a less demanding approach. METHODS: We designed and conducted a cohort study with a historical control group in the department of medicine of an 850-bed university hospital. One hundred and seventeen consecutive eligible smokers received a smoking cessation intervention, and 113 smokers hospitalized before the implementation of this intervention constituted the control group. The 30-minute smoking cessation intervention was delivered by a resident physician trained in smoking cessation counselling, with no subsequent follow-up contact. All patients then received a questionnaire assessing their smoking habits 6 months after hospital discharge. Patients lost to follow-up were counted as smokers, and point-prevalence abstinence (at least 7 consecutive days) among ex-smokers was validated by their attending physician. RESULTS: Validated smoking cessation rates were 23.9% in the intervention group and 9.7% in the control group (odds ratio 2.9, 95% confidence interval [95% CI] 1.4 to 6.2). After adjustment for potential confounders, the intervention remained effective, with an adjusted odds ratio of 2.7 (95% CI 1.0 to 5.0). CONCLUSION: A low-intensity smoking cessation intervention without follow-up contact was associated with a higher smoking cessation rate at 6 months compared with a historical control group.
Abstract:
BACKGROUND: Iterative reconstruction (IR) techniques reduce image noise in multidetector computed tomography (MDCT) imaging. They can therefore be used to reduce radiation dose while maintaining diagnostic image quality nearly constant. However, CT manufacturers offer several strength levels of IR to choose from. PURPOSE: To determine the optimal strength level of IR in low-dose MDCT of the cervical spine. MATERIAL AND METHODS: Thirty consecutive patients investigated by low-dose cervical spine MDCT were prospectively studied. Raw data were reconstructed using filtered back-projection and sinogram-affirmed IR (SAFIRE, strength levels 1 to 5) techniques. Image noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) were measured at C3-C4 and C6-C7 levels. Two radiologists independently and blindly evaluated various anatomical structures (both dense and soft tissues) using a 4-point scale. They also rated the overall diagnostic image quality using a 10-point scale. RESULTS: As IR strength levels increased, image noise decreased linearly, while SNR and CNR both increased linearly at C3-C4 and C6-C7 levels (P < 0.001). For the intervertebral discs, the content of neural foramina and dural sac, and for the ligaments, subjective image quality scores increased linearly with increasing IR strength level (P ≤ 0.03). Conversely, for the soft tissues and trabecular bone, the scores decreased linearly with increasing IR strength level (P < 0.001). Finally, the overall diagnostic image quality scores increased linearly with increasing IR strength level (P < 0.001). CONCLUSION: The optimal strength level of IR in low-dose cervical spine MDCT depends on the anatomical structure to be analyzed. For the intervertebral discs and the content of neural foramina, high strength levels of IR are recommended.
Abstract:
Purpose: To develop and evaluate a practical method for the quantification of signal-to-noise ratio (SNR) on coronary MR angiograms (MRA) acquired with parallel imaging. Materials and Methods: To quantify the spatially varying noise due to parallel imaging reconstruction, a new method has been implemented incorporating image data acquisition followed by a fast noise scan during which radio-frequency pulses, cardiac triggering and navigator gating are disabled. The performance of this method was evaluated in a phantom study where SNR measurements were compared with those of a reference standard (multiple repetitions). Subsequently, SNR of myocardium and posterior skeletal muscle was determined on in vivo human coronary MRA. Results: In a phantom, the SNR measured using the proposed method deviated less than 10.1% from the reference method for small geometry factors (≤2). In vivo, the noise scan for a 10 min coronary MRA acquisition was acquired in 30 s. Higher signal and lower SNR, due to spatially varying noise, were found in myocardium compared with posterior skeletal muscle. Conclusion: SNR quantification based on a fast noise scan is a validated and easy-to-use method when applied to three-dimensional coronary MRA obtained with parallel imaging, as long as the geometry factor remains low.
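As a rough illustration of the measurement described above: the signal is taken from a region of interest on the reconstructed image, and the noise level from the same region of the separately acquired, identically reconstructed noise-only scan. The arrays, ROI, and statistics below are simplified placeholders; a full parallel-imaging SNR analysis also has to handle reconstruction scaling and magnitude-image noise statistics more carefully than this.

# Illustrative sketch: SNR from an image and a matching noise-only scan
# reconstructed the same way (RF pulses, triggering and navigator gating
# disabled during the noise scan).  Arrays and ROI coordinates are placeholders.
import numpy as np

rng = np.random.default_rng(2)
image = rng.normal(100.0, 5.0, size=(128, 128))   # placeholder reconstructed image
noise = rng.normal(0.0, 5.0, size=(128, 128))     # placeholder noise-only reconstruction

def roi_snr(image, noise, rows, cols):
    """SNR in an ROI: mean signal divided by the standard deviation of the
    noise-only scan in the same ROI (captures spatially varying noise)."""
    sig = image[rows, cols].mean()
    sd = noise[rows, cols].std(ddof=1)
    return sig / sd

rows, cols = slice(40, 60), slice(40, 60)          # hypothetical myocardium ROI
print(f"SNR = {roi_snr(image, noise, rows, cols):.1f}")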
Abstract:
The aim of the present study was to compare, under the same nursing conditions, the energy-nitrogen balance and protein turnover in small for gestational age (SGA) and appropriate for gestational age (AGA) low-birthweight infants. We compared 8 SGA infants (mean +/- s.d.: gestational age 35 +/- 2 weeks, birthweight 1520 +/- 330 g) to 11 AGA premature infants (32 +/- 2 weeks, birthweight 1560 +/- 240 g). When their rate of weight gain was above 15 g/kg/d (17.6 +/- 3.0 and 18.2 +/- 2.6 g/kg/d, mean postnatal age 18 +/- 10 and 20 +/- 9 d, respectively), they were studied with respect to their metabolizable energy intake, energy expenditure, energy and protein gain, and protein turnover. Energy balance was assessed as the difference between metabolizable energy and energy expenditure measured by indirect calorimetry. Protein gain was calculated from the amount of retained nitrogen. Protein turnover was estimated by a stable isotope enrichment technique using repeated nasogastric administration of 15N-glycine for 72 h. Although there was no difference in metabolizable energy intake (110 +/- 12 versus 108 +/- 11 kcal/kg/d), SGA infants had a higher rate of resting energy expenditure (64 +/- 8 versus 57 +/- 8 kcal/kg/d, P < 0.05). Protein gain and the composition of weight gain were very similar in both groups (2.0 +/- 0.4 versus 2.1 +/- 0.4 g protein/kg/d; 3.5 +/- 1.1 versus 3.3 +/- 1.4 g fat/kg/d in SGA and AGA infants, respectively). However, the rate of protein synthesis was significantly lower in SGA infants (7.7 +/- 1.6 g/kg/d) than in AGA infants (9.7 +/- 2.8 g/kg/d; P < 0.05). It is concluded that SGA infants have a more efficient protein gain/protein synthesis ratio, since for the same weight and protein gains, SGA infants show a 20 per cent slower protein turnover. They might therefore tolerate slightly higher protein intakes. Postconceptional age seems to be an important factor in the regulation of protein turnover.
Abstract:
Immigrants from high-burden countries and HIV-coinfected individuals are risk groups for tuberculosis (TB) in countries with low TB incidence. Therefore, we studied their role in transmission of Mycobacterium tuberculosis in Switzerland. We included all TB patients from the Swiss HIV Cohort and a sample of patients from the national TB registry. We identified molecular clusters by spoligotyping and mycobacterial interspersed repetitive-unit-variable-number tandem-repeat (MIRU-VNTR) analysis and used weighted logistic regression adjusted for age and sex to identify risk factors for clustering, taking sampling proportions into account. In total, we analyzed 520 TB cases diagnosed between 2000 and 2008; 401 were foreign born, and 113 were HIV coinfected. The Euro-American M. tuberculosis lineage dominated throughout the study period (378 strains; 72.7%), with no evidence for another lineage, such as the Beijing genotype, emerging. We identified 35 molecular clusters with 90 patients, indicating recent transmission; 31 clusters involved foreign-born patients, and 15 involved HIV-infected patients. Birth origin was not associated with clustering (adjusted odds ratio [aOR], 1.58; 95% confidence interval [CI], 0.73 to 3.43; P = 0.25, comparing Swiss-born with foreign-born patients), but clustering was reduced in HIV-infected patients (aOR, 0.49; 95% CI, 0.26 to 0.93; P = 0.030). Cavitary disease, male sex, and younger age were all associated with molecular clustering. In conclusion, most TB patients in Switzerland were foreign born, but transmission of M. tuberculosis was not more common among immigrants and was reduced in HIV-infected patients followed up in the national HIV cohort study. Continued access to health services and clinical follow-up will be essential to control TB in this population.
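A common way to take sampling proportions into account, as described above, is to weight each observation by the inverse of its sampling probability in the logistic regression. A minimal sketch with statsmodels follows; the variables, weights, and data are hypothetical, not the study data, and a full design-based analysis would also adjust the standard errors, which simple frequency weights do not.

# Sketch of a sampling-weighted logistic regression for molecular clustering.
# Each patient is weighted by the inverse of the (hypothetical) proportion
# sampled from the TB registry; values are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 520
df = pd.DataFrame({
    "clustered": rng.integers(0, 2, n),          # 1 = strain belongs to a molecular cluster
    "foreign_born": rng.integers(0, 2, n),
    "hiv": rng.integers(0, 2, n),
    "age": rng.normal(40, 15, n),
    "male": rng.integers(0, 2, n),
})
df["weight"] = np.where(df["hiv"] == 1, 1.0, 4.0)  # hypothetical inverse sampling proportions

X = sm.add_constant(df[["foreign_born", "hiv", "age", "male"]])
fit = sm.GLM(df["clustered"], X, family=sm.families.Binomial(),
             freq_weights=df["weight"].to_numpy()).fit()
print(np.exp(fit.params))                        # adjusted odds ratios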
Abstract:
While adaptive adjustment of sex ratio as a function of colony kin structure and food availability commonly occurs in social Hymenoptera, long-term studies have revealed substantial unexplained between-year variation in sex ratio at the population level. In order to identify factors that contribute to increased between-year variation in population sex ratio, we conducted a comparative analysis across 47 Hymenoptera species differing in their breeding system. We found that between-year variation in population sex ratio steadily increased as one moved from solitary species to primitively eusocial species, to single-queen eusocial species, to multiple-queen eusocial species. Specifically, between-year variation in population sex ratio was low (6.6% of total possible variation) in solitary species, which is consistent with the view that in solitary species, sex ratio can vary only in response to fluctuations in ecological factors such as food availability. In contrast, we found significantly higher (19.5%) between-year variation in population sex ratio in multiple-queen eusocial species, which supports the view that in these species, sex ratio can also fluctuate in response to temporal changes in social factors such as queen number and queen-worker control over sex ratio, as well as factors influencing caste determination. The simultaneous adjustment of sex ratio in response to temporal fluctuations in ecological and social factors seems to preclude the existence of a single sex ratio optimum. The absence of such an optimum may reflect an additional cost associated with the evolution of complex breeding systems in Hymenoptera societies.
Abstract:
OBJECTIVE: HIV-infected children have impaired antibody responses after exposure to certain antigens. Our aim was to determine whether HIV-infected children had lower varicella zoster virus (VZV) antibody levels compared with HIV-infected adults or healthy children and, if so, whether this was attributable to an impaired primary response, accelerated antibody loss, or failure to reactivate the memory VZV response. METHODS: In a prospective, cross-sectional and retrospective longitudinal study, we compared antibody responses, measured by enzyme-linked immunosorbent assay (ELISA), elicited by VZV infection in 97 HIV-infected children and 78 HIV-infected adults treated with antiretroviral therapy, followed over 10 years, and 97 age-matched healthy children. We also tested antibody avidity in HIV-infected and healthy children. RESULTS: Median anti-VZV immunoglobulin G (IgG) levels were lower in HIV-infected children than in adults (264 vs. 1535 IU/L; P<0.001) and levels became more frequently unprotective over time in the children [odds ratio (OR) 17.74; 95% confidence interval (CI) 4.36-72.25; P<0.001]. High HIV viral load was predictive of VZV antibody waning in HIV-infected children. Anti-VZV antibodies did not decline more rapidly in HIV-infected children than in adults. Antibody levels increased with age in healthy (P=0.004) but not in HIV-infected children. Thus, antibody levels were lower in HIV-infected than in healthy children (median 1151 IU/L; P<0.001). Antibody avidity was lower in HIV-infected than healthy children (P<0.001). A direct correlation between anti-VZV IgG level and avidity was present in HIV-infected children (P=0.001), but not in healthy children. CONCLUSION: Failure to maintain anti-VZV IgG levels in HIV-infected children results from failure to reactivate memory responses. Further studies are required to investigate long-term protection and the potential benefits of immunization.
Abstract:
BACKGROUND: The diagnosis of hypertension in children is difficult because of the multiple sex-, age-, and height-specific thresholds used to define elevated blood pressure (BP). The blood pressure-to-height ratio (BPHR) has been proposed to facilitate the identification of elevated BP in children. OBJECTIVE: We assessed the performance of BPHR at a single screening visit to identify children with hypertension, that is, sustained elevated BP. METHOD: In a school-based study conducted in Switzerland, BP was measured at up to three visits in 5207 children. Children were considered hypertensive if BP was elevated at all three visits. Sensitivity, specificity, negative predictive value (NPV), and positive predictive value (PPV) for the identification of hypertension were assessed for different thresholds of BPHR. The ability of BPHR at a single screening visit to discriminate children with and without hypertension was evaluated with receiver operating characteristic (ROC) curve analyses. RESULTS: The prevalence of systolic/diastolic hypertension was 2.2%. Systolic BPHR performed better than diastolic BPHR in identifying hypertension (area under the ROC curve: 0.95 vs. 0.84). The best performance was obtained with a systolic BPHR threshold set at 0.80 mmHg/cm (sensitivity: 98%; specificity: 85%; PPV: 12%; NPV: 100%) and a diastolic BPHR threshold set at 0.45 mmHg/cm (sensitivity: 79%; specificity: 70%; PPV: 5%; NPV: 99%). The PPV was higher among tall or overweight children. CONCLUSION: BPHR at a single screening visit performed well in identifying hypertension in children, although the low prevalence of hypertension led to a low PPV.
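The combination of high sensitivity and NPV with a low PPV at 2.2% prevalence follows directly from Bayes' rule. A short worked check using the published systolic figures (sensitivity 98%, specificity 85%, prevalence 2.2%):

# Worked check of how a 2.2% prevalence drives the PPV down even with
# sensitivity 98% and specificity 85% (the published systolic BPHR figures).
def ppv_npv(sens, spec, prev):
    tp = sens * prev                   # true positives per screened child
    fp = (1.0 - spec) * (1.0 - prev)   # false positives
    fn = (1.0 - sens) * prev           # false negatives
    tn = spec * (1.0 - prev)           # true negatives
    return tp / (tp + fp), tn / (tn + fn)

ppv, npv = ppv_npv(sens=0.98, spec=0.85, prev=0.022)
print(f"PPV = {ppv:.1%}, NPV = {npv:.3%}")   # about 13% and >99.9%, close to the reported 12% and 100%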
Abstract:
BACKGROUND: Low-molecular-weight heparin (LMWH) appears to be safe and effective for treating pulmonary embolism (PE), but its cost-effectiveness has not been assessed. METHODS: We built a Markov state-transition model to evaluate the medical and economic outcomes of a 6-day course of fixed-dose LMWH or adjusted-dose unfractionated heparin (UFH) in a hypothetical cohort of 60-year-old patients with acute submassive PE. Probabilities for clinical outcomes were obtained from a meta-analysis of clinical trials. Cost estimates were derived from Medicare reimbursement data and other sources. The base-case analysis used an inpatient setting, whereas secondary analyses examined early discharge and outpatient treatment with LMWH. Using a societal perspective, strategies were compared based on lifetime costs, quality-adjusted life-years (QALYs), and the incremental cost-effectiveness ratio. RESULTS: Inpatient treatment costs were higher for LMWH than for UFH ($13,001 vs $12,780), but LMWH yielded a greater number of QALYs than did UFH (7.677 QALYs vs 7.493 QALYs). The incremental cost of $221 and the corresponding incremental effectiveness of 0.184 QALYs resulted in an incremental cost-effectiveness ratio of $1,209/QALY. Our results were highly robust in sensitivity analyses. LMWH became cost-saving if the daily pharmacy costs for LMWH were < $51, if ≥8% of patients were eligible for early discharge, or if ≥5% of patients could be treated entirely as outpatients. CONCLUSION: For inpatient treatment of PE, the use of LMWH is cost-effective compared to UFH. Early discharge or outpatient treatment in suitable patients with PE would lead to substantial cost savings.
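The incremental cost-effectiveness ratio is simply the incremental cost divided by the incremental effectiveness. A quick check with the rounded figures quoted in the abstract (the small difference from the published $1,209/QALY comes from rounding of the inputs):

# Incremental cost-effectiveness ratio (ICER) from the rounded abstract figures.
# The published value ($1,209/QALY) differs slightly because it was computed
# from unrounded costs and QALYs.
cost_lmwh, cost_ufh = 13_001.0, 12_780.0   # lifetime costs, USD
qaly_lmwh, qaly_ufh = 7.677, 7.493         # quality-adjusted life-years

delta_cost = cost_lmwh - cost_ufh          # $221
delta_qaly = qaly_lmwh - qaly_ufh          # 0.184 QALYs
icer = delta_cost / delta_qaly
print(f"ICER = ${icer:,.0f} per QALY gained")   # about $1,200/QALY with rounded inputs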
Abstract:
BACKGROUND: The risk of end-stage renal disease (ESRD) is increased among individuals with low income and in low-income communities. However, few studies have examined the relation of both individual and community socioeconomic status (SES) with incident ESRD. METHODS: Among 23,314 U.S. adults in the population-based Reasons for Geographic and Racial Differences in Stroke study, we assessed participant differences across geospatially linked categories of county poverty [outlier poverty, extremely high poverty, very high poverty, high poverty, neither (reference), high affluence and outlier affluence]. Multivariable Cox proportional hazards models were used to examine associations of annual household income and geospatially linked county poverty measures with incident ESRD, while accounting for death as a competing event using the Fine and Gray method. RESULTS: There were 158 ESRD cases during follow-up. Incident ESRD rates were 178.8 per 100,000 person-years (10⁵ py) in high-poverty outlier counties and 76.3/10⁵ py in affluent outlier counties (p trend = 0.06). In unadjusted competing-risk models, persons residing in high-poverty outlier counties had a higher incidence of ESRD, although this was not statistically significant, when compared with persons residing in counties with neither high poverty nor affluence [hazard ratio (HR) 1.54, 95% confidence interval (CI) 0.75-3.20]. This association was markedly attenuated following adjustment for socio-demographic factors (age, sex, race, education, and income): HR 0.96, 95% CI 0.46-2.00. However, in the same adjusted model, income was independently associated with risk of ESRD [HR 3.75, 95% CI 1.62-8.64, comparing the < $20,000 income group to the > $75,000 group]. There were no statistically significant associations of county measures of poverty with incident ESRD, and no evidence of effect modification. CONCLUSIONS: In contrast to annual family income, geospatially linked measures of county poverty have little relation with risk of ESRD. Efforts to mitigate socioeconomic disparities in kidney disease may be best targeted at the individual level.
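The Fine and Gray method models the subdistribution hazard so that deaths occurring before ESRD are handled as competing events rather than as censoring; it is usually fitted with R's cmprsk package. As a loose Python illustration of the competing-risk idea only (not of the Fine and Gray regression itself), the non-parametric Aalen-Johansen estimator in lifelines gives the cumulative incidence of ESRD while accounting for death; the data below are simulated placeholders.

# Illustration of the competing-risk idea (not the Fine-Gray regression used
# in the paper): cumulative incidence of ESRD with death as a competing event,
# via the Aalen-Johansen estimator in lifelines.  Data are simulated.
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(4)
n = 1000
durations = rng.exponential(8.0, n)                  # years of follow-up (simulated)
# event codes: 0 = censored, 1 = incident ESRD, 2 = death before ESRD
events = rng.choice([0, 1, 2], size=n, p=[0.85, 0.02, 0.13])

ajf = AalenJohansenFitter()
ajf.fit(durations, events, event_of_interest=1)      # ESRD, treating death as competing
print(ajf.cumulative_density_.tail())                # cumulative incidence of ESRD over time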