50 results for Low fungal load settings
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Immigrants from high tuberculosis (TB) incidence regions are a risk group for TB in low-incidence countries such as Switzerland. In a previous analysis of a nationwide collection of 520 Mycobacterium tuberculosis isolates from 2000-2008, we identified 35 clusters comprising 90 patients based on standard genotyping (24-loci MIRU-VNTR and spoligotyping). Here, we used whole genome sequencing (WGS) to revisit these transmission clusters. Genome-based transmission clusters were defined as isolate pairs separated by ≤12 single nucleotide polymorphisms (SNPs). WGS confirmed 17/35 (49%) MIRU-VNTR clusters; the other 18 clusters contained pairs separated by >12 SNPs. Most transmission clusters (3/4) of Swiss-born patients were confirmed by WGS, as opposed to 25% (4/16) of clusters involving only foreign-born patients. The overall clustering proportion using standard genotyping was 17% (90 patients, 95% confidence interval [CI]: 14-21%), but only 8% (43 patients, 95% CI: 6-11%) using WGS. The clustering proportion was 17% (67/401, 95% CI: 13-21%) using standard genotyping and 7% (26/401, 95% CI: 4-9%) using WGS among foreign-born patients, and 19% (23/119, 95% CI: 13-28%) and 14% (17/119, 95% CI: 9-22%), respectively, among Swiss-born patients. Using weighted logistic regression, we found weak evidence for an association between birth origin and transmission (aOR 2.2, 95% CI: 0.9-5.5, comparing Swiss-born patients to others). In conclusion, standard genotyping overestimated recent TB transmission in Switzerland when compared to WGS, particularly among immigrants from high TB incidence regions, where genetically closely related strains often predominate. We recommend the use of WGS to identify transmission clusters in low TB incidence settings.
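As an illustration of the genome-based clustering rule described in this abstract (isolate pairs linked when separated by ≤12 SNPs), the following is a minimal sketch that groups isolates into transmission clusters from a pairwise SNP-distance table. The isolate IDs and distances are hypothetical, not data from the study.

```python
# Minimal sketch: isolates whose pairwise SNP distance is <= 12 are linked, and
# connected components of that graph form transmission clusters.
SNP_THRESHOLD = 12

# Hypothetical pairwise SNP distances between isolate IDs.
snp_distances = {
    ("CH-001", "CH-002"): 3,
    ("CH-001", "CH-003"): 45,
    ("CH-002", "CH-003"): 44,
    ("CH-004", "CH-005"): 9,
}

def transmission_clusters(distances, threshold=SNP_THRESHOLD):
    """Group isolates into clusters by single-linkage at the SNP threshold."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for (a, b), d in distances.items():
        find(a)
        find(b)                    # register both isolates
        if d <= threshold:
            union(a, b)

    clusters = {}
    for isolate in parent:
        clusters.setdefault(find(isolate), set()).add(isolate)
    # Only groups of two or more isolates count as transmission clusters.
    return [c for c in clusters.values() if len(c) > 1]

print(transmission_clusters(snp_distances))
```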
Abstract:
PURPOSE: To determine how the ADC value of parotid glands is influenced by the choice of b-values. MATERIALS AND METHODS: In eight healthy volunteers, diffusion-weighted echo-planar imaging (DW-EPI) was performed on a 1.5 T system, with b-values (in s/mm²) of 0, 50, 100, 150, 200, 250, 300, 500, 750, and 1000. ADC values were calculated by two alternative methods (exponential vs. logarithmic fit) from five different sets of b-values: (A) all b-values; (B) b=0, 50, and 100; (C) b=0 and 750; (D) b=0, 500, and 1000; and (E) b=500, 750, and 1000. RESULTS: The mean ADC values for the different settings were (in 10⁻³ mm²/second, exponential fit): (A) 0.732 ± 0.019, (B) 2.074 ± 0.084, (C) 0.947 ± 0.020, (D) 0.890 ± 0.023, and (E) 0.581 ± 0.021. ADC values were significantly (P < 0.001) different for all pairwise comparisons of settings (A-E) of b-values, except for A vs. D (P=0.172) and C vs. D (P=0.380). The ADC(B) was significantly higher than ADC(C) or ADC(D), which was significantly higher than ADC(E). ADC values from exponential vs. logarithmic fit (P=0.542), as well as left vs. right parotid gland (P=0.962), were indistinguishable. CONCLUSION: The ADC values calculated from low b-value settings were significantly higher than those calculated from high b-value settings. These results suggest that not only true diffusion but also perfusion and saliva flow may contribute to the ADC.
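The two fitting approaches compared in this abstract can be illustrated with a short sketch of the monoexponential diffusion model S(b) = S0·exp(-b·ADC): a logarithmic (log-linear) fit of ln(S) against b, and a nonlinear exponential fit of the signal itself. The b-values and signal intensities below are hypothetical ROI means, not the study's measurements.

```python
# Minimal sketch of ADC estimation by logarithmic vs. exponential fitting.
import numpy as np
from scipy.optimize import curve_fit

b_values = np.array([0, 500, 1000.0])        # s/mm^2 (a set "D"-style choice)
signals = np.array([1000.0, 640.0, 410.0])   # hypothetical ROI mean signals

# Logarithmic fit: regress ln(S) on b; the slope is -ADC.
slope, _intercept = np.polyfit(b_values, np.log(signals), 1)
adc_log = -slope

# Exponential fit: nonlinear least squares on the signal decay itself.
def monoexp(b, s0, adc):
    return s0 * np.exp(-b * adc)

(s0_fit, adc_exp), _cov = curve_fit(monoexp, b_values, signals, p0=(signals[0], 1e-3))

print(f"ADC (logarithmic fit): {adc_log:.4e} mm^2/s")
print(f"ADC (exponential fit): {adc_exp:.4e} mm^2/s")
```

With only a few b-values and little noise the two fits give nearly identical ADCs, which matches the abstract's finding that the fitting method mattered far less than the choice of b-values.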
Abstract:
Background. Few studies consider the incidence of individual AIDS-defining illnesses (ADIs) at higher CD4 counts, relevant on a population level for monitoring and resource allocation. Methods. Individuals from the Collaboration of Observational HIV Epidemiological Research Europe (COHERE) aged ≥14 years with ≥1 CD4 count of ≥200 cells/µL between 1998 and 2010 were included. Incidence rates (per 1000 person-years of follow-up [PYFU]) were calculated for each ADI within different CD4 strata; Poisson regression, using generalized estimating equations and robust standard errors, was used to model rates of ADIs with current CD4 ≥500 cells/µL. Results. A total of 12 135 ADIs occurred at a CD4 count of ≥200 cells/µL among 207 539 persons with 1 154 803 PYFU. Incidence rates declined from 20.5 per 1000 PYFU (95% confidence interval [CI], 20.0–21.1 per 1000 PYFU) with current CD4 200–349 cells/µL to 4.1 per 1000 PYFU (95% CI, 3.6–4.6 per 1000 PYFU) with current CD4 ≥1000 cells/µL. Persons with a current CD4 of 500–749 cells/µL had a significantly higher rate of ADIs (adjusted incidence rate ratio [aIRR], 1.20; 95% CI, 1.10–1.32), whereas those with a current CD4 of ≥1000 cells/µL had a similar rate (aIRR, 0.92; 95% CI, 0.79–1.07), compared to a current CD4 of 750–999 cells/µL. Results were consistent in persons with high or low viral load. Findings were stronger for malignant ADIs (aIRR, 1.52; 95% CI, 1.25–1.86) than for nonmalignant ADIs (aIRR, 1.12; 95% CI, 1.01–1.25), comparing persons with a current CD4 of 500–749 cells/µL to 750–999 cells/µL. Discussion. The incidence of ADIs was higher in individuals with a current CD4 count of 500–749 cells/µL compared to those with a CD4 count of 750–999 cells/µL, but did not decrease further at higher CD4 counts. Results were similar in patients virologically suppressed on combination antiretroviral therapy, suggesting that immune reconstitution is not complete until the CD4 count increases to >750 cells/µL.
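The basic quantity reported in this abstract, an incidence rate per 1000 PYFU with a 95% CI, can be sketched as below. This does not reproduce the study's GEE Poisson regression; it only shows a crude stratum rate with an exact Poisson (Garwood) interval, and the event and PYFU counts are illustrative rather than the cohort's actual numbers.

```python
# Minimal sketch: incidence rate per 1000 person-years with an exact Poisson CI.
from scipy.stats import chi2

def incidence_rate(events, pyfu, per=1000, alpha=0.05):
    """Rate per `per` PYFU with an exact (Garwood) Poisson confidence interval."""
    rate = events / pyfu * per
    lower = chi2.ppf(alpha / 2, 2 * events) / 2 / pyfu * per if events > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2 / pyfu * per
    return rate, lower, upper

# Hypothetical stratum: 450 ADIs over 110,000 PYFU at a given current CD4 level.
print(incidence_rate(450, 110_000))
```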
Abstract:
The aim was to study the variation in metabolic responses in early-lactating dairy cows (n = 232) on-farm that were pre-selected for a high milk fat content (>45 g/l) and a high fat/protein ratio in milk (>1.5) in their previous lactation. Blood was assayed for concentrations of metabolites and hormones. Liver was measured for mRNA abundance of 25 candidate genes encoding enzymes and receptors involved in gluconeogenesis (6), fatty acid β-oxidation (6), fatty acid and triglyceride synthesis (5), cholesterol synthesis (4), ketogenesis (2) and the urea cycle (2). Two groups of cows were formed based on the plasma concentrations of glucose, non-esterified fatty acids (NEFA) and β-hydroxybutyric acid (BHBA) (GRP+, high metabolic load: glucose <3.0 mM, NEFA >300 µM and BHBA >1.0 mM, n = 30; GRP-, low metabolic load: glucose >3.0 mM, NEFA <300 µM and BHBA <1.0 mM, n = 30). No differences were found between GRP+ and GRP- for the milk yield at 3 weeks post-partum, but milk fat content was higher (p < 0.01) for GRP+ than for GRP-. In week 8 post-partum, milk yield was higher in GRP+ than in GRP- (37.5 vs. 32.5 kg/d; p < 0.01). GRP+ had higher (p < 0.001) NEFA and BHBA concentrations and lower glucose, insulin, IGF-I, T3 and T4 concentrations (p < 0.01) than GRP-. The mRNA abundance of genes related to gluconeogenesis, fatty acid β-oxidation, fatty acid and triglyceride synthesis, cholesterol synthesis and the urea cycle differed between GRP+ and GRP- (p < 0.05), although gene transcripts related to ketogenesis were similar between GRP+ and GRP-. In conclusion, high metabolic load post-partum in dairy cows on-farm corresponds to differences in the liver relative to dairy cows with low metabolic load, even though all cows were pre-selected for a high milk fat content and fat/protein ratio in milk in their previous lactation.
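The group assignment described in this abstract is a simple threshold rule on three plasma measures; a minimal sketch is shown below. The cut-offs come from the abstract, while the example values and the decision to leave animals meeting neither full criterion unclassified are assumptions for illustration.

```python
# Minimal sketch: classify a cow as GRP+ (high metabolic load) or GRP- (low
# metabolic load) from plasma glucose (mM), NEFA (µM) and BHBA (mM).
def metabolic_group(glucose_mm, nefa_um, bhba_mm):
    if glucose_mm < 3.0 and nefa_um > 300 and bhba_mm > 1.0:
        return "GRP+"
    if glucose_mm > 3.0 and nefa_um < 300 and bhba_mm < 1.0:
        return "GRP-"
    return None  # does not meet all three criteria of either group

print(metabolic_group(2.7, 420, 1.4))   # hypothetical cow -> "GRP+"
print(metabolic_group(3.4, 180, 0.6))   # hypothetical cow -> "GRP-"
```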
Abstract:
BACKGROUND: In high-income countries, viral load is routinely measured to detect failure of antiretroviral therapy (ART) and guide switching to second-line ART. Viral load monitoring is not generally available in resource-limited settings. We examined switching from nonnucleoside reverse transcriptase inhibitor (NNRTI)-based first-line regimens to protease inhibitor-based regimens in Africa, South America and Asia. DESIGN AND METHODS: Multicohort study of 17 ART programmes. All sites monitored CD4 cell count and had access to second-line ART and 10 sites monitored viral load. We compared times to switching, CD4 cell counts at switching and obtained adjusted hazard ratios for switching (aHRs) with 95% confidence intervals (CIs) from random-effects Weibull models. RESULTS: A total of 20 113 patients, including 6369 (31.7%) patients from 10 programmes with access to viral load monitoring, were analysed; 576 patients (2.9%) switched. Low CD4 cell counts at ART initiation were associated with switching in all programmes. Median time to switching was 16.3 months [interquartile range (IQR) 10.1-26.6] in programmes with viral load monitoring and 21.8 months (IQR 14.0-21.8) in programmes without viral load monitoring (P < 0.001). Median CD4 cell counts at switching were 161 cells/microl (IQR 77-265) in programmes with viral load monitoring and 102 cells/microl (44-181) in programmes without viral load monitoring (P < 0.001). Switching was more common in programmes with viral load monitoring during months 7-18 after starting ART (aHR 1.38; 95% CI 0.97-1.98), similar during months 19-30 (aHR 0.97; 95% CI 0.58-1.60) and less common during months 31-42 (aHR 0.29; 95% CI 0.11-0.79). CONCLUSION: In resource-limited settings, switching to second-line regimens tends to occur earlier and at higher CD4 cell counts in ART programmes with viral load monitoring compared with programmes without viral load monitoring.
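The time-to-switching comparison in this abstract relies on Weibull survival models. A heavily simplified sketch is given below: it fits a single Weibull curve to censored follow-up times with the lifelines package (assumed to be installed), rather than the random-effects models across programmes used in the study, and the durations and switch indicators are hypothetical.

```python
# Minimal sketch: parametric Weibull fit of time to switching to second-line ART.
from lifelines import WeibullFitter

# Hypothetical months of follow-up and switch indicators
# (1 = switched to second-line ART, 0 = censored without switching).
durations = [15.0, 28.0, 11.5, 33.0, 20.5, 14.0, 30.0, 9.5, 18.5, 24.0]
switched  = [1,    0,    1,    0,    1,    1,    0,    1,    0,    0]

wf = WeibullFitter()
wf.fit(durations, event_observed=switched)

print(wf.lambda_, wf.rho_)        # scale and shape of the fitted Weibull
print(wf.median_survival_time_)   # median time to switching (months)
```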
Abstract:
CT pulmonary angiography is the currently accepted standard in ruling out acute pulmonary embolism. Issues of radiation dose received by patients via CT have been extensively disputed by radiologists and reported by the media. In recent years there has been considerable research performed to find ways for reducing radiation exposure from CT. Herein, we will discuss specific measures that have been shown to be valuable for CT pulmonary angiography. The limitations and the potential benefits of reduced CT peak tube kilovoltage will be detailed as this method is capable of reducing both radiation exposure and iodine load to the patient simultaneously. We discuss some of the emerging tools, which will hopefully play a significant role in wider acceptance of low-dose CT pulmonary angiography protocols.
Abstract:
The aim of this study involving 170 patients suffering from non-specific low back pain was to test the validity of the Spinal Function Sort (SFS) in a European rehabilitation setting. The SFS, a picture-based questionnaire, assesses perceived functional ability for work tasks involving the spine. All measurements were taken by a blinded research assistant; work status was assessed with questionnaires. Our study demonstrated high internal consistency, shown by a Cronbach's alpha of 0.98, reasonable evidence for unidimensionality, Spearman correlations of >0.6 with work activities, and discriminating power for work status at 3 and 12 months by ROC curve analysis (area under the curve = 0.760 (95% CI 0.689-0.822) and 0.801 (95% CI 0.731-0.859), respectively). The standardised response mean within the two treatment groups was 0.18 and -0.31. As a result, we conclude that perceived functional ability for work tasks can be validly assessed with the SFS in a European rehabilitation setting in patients with non-specific low back pain, and that it is predictive of future work status.
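The internal-consistency statistic reported here (Cronbach's alpha) is computed from item-level scores; a minimal sketch follows. The response matrix is hypothetical and much smaller than the SFS, which has many more picture items.

```python
# Minimal sketch: Cronbach's alpha from a respondents-by-items score matrix.
import numpy as np

def cronbach_alpha(scores):
    """scores: 2-D array, rows = respondents, columns = questionnaire items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the sum score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings of 5 patients on 4 picture items.
demo = [[4, 5, 4, 4],
        [2, 2, 3, 2],
        [5, 5, 5, 4],
        [3, 3, 2, 3],
        [1, 2, 1, 1]]
print(round(cronbach_alpha(demo), 3))
```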
Abstract:
In low-income settings, treatment failure is often identified using CD4 cell count monitoring. Consequently, patients remain on a failing regimen, resulting in a higher risk of transmission. We investigated the benefit of routine viral load monitoring for reducing HIV transmission.
Abstract:
To assess the effect of radiation dose reduction on the appearance and visual quantification of specific CT patterns of fungal infection in immunocompromised patients.
Abstract:
OBJECTIVES: To assess the frequency of and risk factors for discordant responses at 6 months on highly active antiretroviral therapy (HAART) in previously treatment-naive HIV patients from resource-limited countries. METHODS: The Antiretroviral Therapy in Low-Income Countries Collaboration is a network of clinics providing care and treatment to HIV-infected patients in Africa, Latin America, and Asia. Patients who initiated therapy between 1996 and 2004, were aged 16 years or older, and had a baseline CD4 cell count were included in this analysis. Responses were defined based on plasma viral load (PVL) and CD4 cell count at 6 months as complete virologic and immunologic (VR(+)IR(+)), virologic only (VR(+)IR(-)), immunologic only (VR(-)IR(+)), and nonresponse (VR(-)IR(-)). Multinomial logistic regression was used to assess the association between therapy responses and clinical and demographic variables. RESULTS: Of the 3111 patients eligible for analysis, 1914 had available information at 6 months of therapy: 1074 (56.1%) were VR(+)IR(+), 364 (19.0%) were VR(+)IR(-), 283 (14.8%) were VR(-)IR(+), and 193 (10.1%) were VR(-)IR(-). Compared with complete responders, virologic-only responders were older, had a higher baseline CD4 cell count, had a lower baseline PVL, and were more likely to have received a nonstandard HAART regimen; immunologic-only responders were younger, had a lower baseline CD4 cell count, had a higher baseline PVL, and were more likely to have received a protease inhibitor-based regimen. CONCLUSIONS: The frequency of and risk factors for discordant responses were comparable to those observed in developed countries. Longer follow-up is needed to assess the long-term impact of discordant responses on mortality in these resource-limited settings.
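The four response categories defined in this abstract can be expressed as a small classification rule. The virologic and immunologic cut-offs used below (PVL < 500 copies/ml; CD4 gain ≥ 50 cells/µl at 6 months) are illustrative assumptions, not the study's actual definitions.

```python
# Minimal sketch: assign a patient to one of the four 6-month response categories.
def response_category(pvl_6m, cd4_baseline, cd4_6m, vl_cutoff=500, cd4_gain=50):
    vr = pvl_6m < vl_cutoff                    # virologic response (assumed cut-off)
    ir = (cd4_6m - cd4_baseline) >= cd4_gain   # immunologic response (assumed cut-off)
    return {(True, True): "VR+IR+", (True, False): "VR+IR-",
            (False, True): "VR-IR+", (False, False): "VR-IR-"}[(vr, ir)]

print(response_category(pvl_6m=40, cd4_baseline=120, cd4_6m=210))     # -> VR+IR+
print(response_category(pvl_6m=12000, cd4_baseline=150, cd4_6m=160))  # -> VR-IR-
```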
Abstract:
We used a PCR method to quantify the loads of Chlamydia trachomatis organisms in self-collected urine and vulvovaginal swab (VVS) samples from 93 women and 30 men participating in the Chlamydia Screening Studies Project, a community-based study of individuals not seeking health care. For women, self-collected VVS had a higher mean chlamydial load (10,405 organisms/ml; 95% confidence interval [95% CI], 5,167 to 21,163 organisms/ml) than did first-void urines (FVU) (503 organisms/ml; 95% CI, 250 to 1,022 organisms/ml; P < 0.001). Chlamydial loads in female and male self-collected FVU specimens were similar (P = 0.634). The mean chlamydial load in FVU specimens decreased with increasing age in females and males. There was no strong statistical evidence of differences in chlamydial load in repeat male and female FVU specimens taken when patients attended for treatment a median of 23.5 (range, 14 to 62) and 28 (range, 13 to 132) days later, respectively, or in VVS taken a median of 35 (range, 14 to 217) days later. In this study, chlamydial load values for infected persons in the community who were not seeking treatment were lower than those published in other studies involving symptomatic patients attending clinical settings. This might have implications for estimates of the infectiousness of chlamydia. The results of this study provide a scientific rationale for preferring VVS to FVU specimens from women.
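The asymmetric confidence intervals around the mean loads reported here suggest averaging on a log scale (a geometric mean) with back-transformation; the sketch below assumes that style of calculation. The load values are hypothetical, not study data.

```python
# Minimal sketch: geometric mean organism load (organisms/ml) with a t-based 95% CI.
import numpy as np
from scipy import stats

def geometric_mean_ci(loads, alpha=0.05):
    logs = np.log10(np.asarray(loads, dtype=float))
    n = logs.size
    mean, sem = logs.mean(), logs.std(ddof=1) / np.sqrt(n)
    t = stats.t.ppf(1 - alpha / 2, df=n - 1)
    return tuple(10 ** v for v in (mean, mean - t * sem, mean + t * sem))

# Hypothetical vulvovaginal swab loads from a handful of participants.
print(geometric_mean_ci([1.2e4, 3.4e3, 8.9e4, 5.6e3, 2.1e4]))
```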
Abstract:
BACKGROUND Monitoring of HIV viral load in patients on combination antiretroviral therapy (ART) is not generally available in resource-limited settings. We examined the cost-effectiveness of qualitative point-of-care viral load tests (POC-VL) in sub-Saharan Africa. DESIGN Mathematical model based on longitudinal data from the Gugulethu and Khayelitsha township ART programmes in Cape Town, South Africa. METHODS Cohorts of patients on ART monitored by POC-VL, CD4 cell count or clinically were simulated. Scenario A considered the more accurate detection of treatment failure with POC-VL only, and scenario B also considered the effect on HIV transmission. Scenario C further assumed that the risk of virologic failure is halved with POC-VL due to improved adherence. We estimated the change in costs per quality-adjusted life-year gained (incremental cost-effectiveness ratios, ICERs) of POC-VL compared with CD4 and clinical monitoring. RESULTS POC-VL tests with detection limits less than 1000 copies/ml increased costs due to unnecessary switches to second-line ART, without improving survival. Assuming POC-VL unit costs between US$5 and US$20 and detection limits between 1000 and 10,000 copies/ml, the ICER of POC-VL was US$4010-US$9230 compared with clinical and US$5960-US$25540 compared with CD4 cell count monitoring. In Scenario B, the corresponding ICERs were US$2450-US$5830 and US$2230-US$10380. In Scenario C, the ICER ranged between US$960 and US$2500 compared with clinical monitoring and between cost-saving and US$2460 compared with CD4 monitoring. CONCLUSION The cost-effectiveness of POC-VL for monitoring ART is improved by a higher detection limit, by taking the reduction in new HIV infections into account and assuming that failure of first-line ART is reduced due to targeted adherence counselling.
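The central quantity in this abstract, the incremental cost-effectiveness ratio (ICER), is the extra cost of a strategy divided by the extra quality-adjusted life-years it gains over a comparator. The sketch below uses illustrative per-patient numbers, not outputs of the study's model.

```python
# Minimal sketch: incremental cost-effectiveness ratio (cost per QALY gained).
def icer(cost_new, qaly_new, cost_comparator, qaly_comparator):
    delta_cost = cost_new - cost_comparator
    delta_qaly = qaly_new - qaly_comparator
    if delta_qaly <= 0:
        raise ValueError("comparator is at least as effective; ICER not meaningful here")
    return delta_cost / delta_qaly

# Hypothetical per-patient lifetime costs (US$) and QALYs for two strategies.
print(icer(cost_new=6200, qaly_new=9.1, cost_comparator=5400, qaly_comparator=8.9))
```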
Abstract:
BACKGROUND The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, and focused on POC-VL monitoring. METHODS We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no 2nd-line, clinical, CD4 (with or without targeted VL), POC-VL, and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs). We calculated incremental cost-effectiveness ratios (ICERs). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. RESULTS Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-US$5488/DALY averted and that of VL monitoring US$951-US$5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except at the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. CONCLUSION Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART. Our Excel tool is useful for determining optimal monitoring strategies for specific settings, with specific sex- and age-distributions and unit costs.
Abstract:
BACKGROUND HIV-1 RNA viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not available in many resource-limited settings. We developed and validated CD4-based risk charts to guide targeted VL testing. METHODS We modeled the probability of virologic failure up to 5 years of ART based on current and baseline CD4 counts, developed decision rules for targeted VL testing of 10%, 20% or 40% of patients in seven cohorts of patients starting ART in South Africa, and plotted cut-offs for VL testing on colour-coded risk charts. We assessed the accuracy of risk chart-guided VL testing to detect virologic failure in validation cohorts from South Africa, Zambia and the Asia-Pacific. FINDINGS 31,450 adult patients were included in the derivation cohorts and 25,294 patients in the validation cohorts. Positive predictive values increased with the percentage of patients tested: from 79% (10% tested) to 98% (40% tested) in the South African, from 64% to 93% in the Zambian and from 73% to 96% in the Asia-Pacific cohorts. Corresponding increases in sensitivity were from 35% to 68% in South Africa, from 55% to 82% in Zambia and from 37% to 71% in the Asia-Pacific. The area under the receiver operating characteristic curve increased from 0.75 to 0.91 in South Africa, from 0.76 to 0.91 in Zambia and from 0.77 to 0.92 in the Asia-Pacific. INTERPRETATION CD4-based risk charts with optimal cut-offs for targeted VL testing may be useful to monitor ART in settings where VL capacity is limited.
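The validation metrics reported in this abstract (positive predictive value and sensitivity of the targeted-testing rule against true virologic failure) follow directly from a 2x2 confusion table; a minimal sketch is below. The counts are hypothetical, not the cohorts' data.

```python
# Minimal sketch: PPV and sensitivity of a "test VL" flag vs. true virologic failure.
def ppv_sensitivity(tp, fp, fn, tn):
    """Return (PPV, sensitivity) from a 2x2 confusion table."""
    ppv = tp / (tp + fp)          # flagged patients who truly failed
    sensitivity = tp / (tp + fn)  # failures that were flagged for VL testing
    return ppv, sensitivity

# Hypothetical counts when 20% of patients are targeted for VL testing.
print(ppv_sensitivity(tp=180, fp=40, fn=150, tn=2000))
```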