70 results for Load-unload Response Ratio (LURR)
Abstract:
OBJECTIVES: Persons from sub-Saharan Africa (SSA) are increasingly enrolled in the Swiss HIV Cohort Study (SHCS). Cohorts from other European countries showed higher rates of viral failure among their SSA participants. We analyzed long-term outcomes of SSA versus North Western European participants. DESIGN: We analyzed data of the SHCS, a nation-wide prospective cohort study of HIV-infected adults at 7 sites in Switzerland. METHODS: SSA and North Western European participants were included if their first treatment combination consisted of at least 3 antiretroviral drugs (cART), if they had at least 1 follow-up visit, did not report active injecting drug use, and did not start cART with CD4 counts >200 cells per microliter during pregnancy. Early viral response, CD4 cell recovery, viral failure, adherence, discontinuation from SHCS, new AIDS-defining events, and survival were analyzed using linear regression and Cox proportional hazard models. RESULTS: The proportion of participants from SSA within the SHCS increased from 2.6% (<1995) to 20.8% (2005-2009). Of 4656 included participants, 808 (17.4%) were from SSA. Early viral response (6 months) and rate of viral failure in an intent-to-stay-on-cART approach were similar. However, SSA participants had a higher risk of viral failure on cART (adjusted hazard ratio: 2.03, 95% confidence interval: 1.50 to 2.75). Self-reported adherence was inferior for SSA. There was no increase of AIDS-defining events or mortality in SSA participants. CONCLUSIONS: Increased attention must be given to factors negatively influencing adherence to cART in participants from SSA to guarantee equal longer-term results on cART.
Abstract:
In shade-intolerant plants such as Arabidopsis, a reduction in the red/far-red (R/FR) ratio, indicative of competition from other plants, triggers a suite of responses known as the shade avoidance syndrome (SAS). The phytochrome photoreceptors measure the R/FR ratio and control the SAS. The phytochrome-interacting factors 4 and 5 (PIF4 and PIF5) are stabilized in the shade and are required for a full SAS, whereas the related bHLH factor HFR1 (long hypocotyl in FR light) is transcriptionally induced by shade and inhibits this response. Here we show that HFR1 interacts with PIF4 and PIF5 and limits their capacity to induce the expression of shade marker genes and to promote elongation growth. HFR1 directly inhibits these PIFs by forming non-DNA-binding heterodimers with PIF4 and PIF5. Our data indicate that PIF4 and PIF5 promote SAS by directly binding to G-boxes present in the promoter of shade marker genes, but their action is limited later in the shade when HFR1 accumulates and forms non-DNA-binding heterodimers. This negative feedback loop is important to limit the response of plants to shade.
Abstract:
Recent laboratory studies have suggested that heart rate variability (HRV) may be an appropriate criterion for training load (TL) quantification. The aim of this study was to validate a novel HRV index that may be used to assess TL in field conditions. Eleven well-trained long-distance male runners performed four exercises of different duration and intensity. TL was evaluated using the Foster and Banister methods. In addition, HRV measurements were performed 5 minutes before exercise and 5 and 30 minutes after exercise. We calculated an HRV index (TLHRV) based on the ratio between the HRV decrease during exercise and the HRV increase during recovery. The HRV decrease during exercise was strongly correlated with exercise intensity (R = -0.70; p < 0.01) but not with exercise duration or training volume. The TLHRV index was correlated with the Foster (R = 0.61; p = 0.01) and Banister (R = 0.57; p = 0.01) methods. This study confirms that HRV changes during exercise and the recovery phase are affected by both the intensity and the physiological impact of the exercise. Since the TLHRV formula takes into account the disturbance and the return to homeostatic balance induced by exercise, this new method provides an objective and rational TL index. However, some simplification of the measurement protocol could be envisaged for field use.
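The abstract describes the TLHRV index only qualitatively, as a ratio of the HRV decrease during exercise to the HRV increase during recovery. A minimal sketch of such a ratio-based index is below; the function, its input names, and the exact form of the numerator and denominator are illustrative assumptions, not the authors' published formula.

```python
# Illustrative ratio-based training-load index in the spirit of TLHRV.
# The exact published formula is not given in the abstract; this sketch
# and its inputs are assumptions for illustration only.

def tl_hrv(hrv_pre: float, hrv_post_exercise: float, hrv_post_recovery: float) -> float:
    """Hypothetical TL index: HRV decrease during exercise divided by
    HRV increase during recovery (all values in the same unit, e.g. ms)."""
    decrease = hrv_pre - hrv_post_exercise            # HRV suppression by exercise
    increase = hrv_post_recovery - hrv_post_exercise  # HRV rebound during recovery
    if increase <= 0:
        raise ValueError("no HRV recovery observed; index undefined")
    return decrease / increase

# Example: resting HRV 80 ms, 20 ms just after exercise, 50 ms after 30 min
print(tl_hrv(80.0, 20.0, 50.0))  # → 2.0
```

Under this sketch, a harder session (larger suppression, slower rebound) yields a larger index, which matches the abstract's rationale that both the disturbance and the return to homeostasis should contribute to the load estimate.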
Abstract:
BACKGROUND: Hepatitis B virus (HBV) genotypes can influence treatment outcome in HBV-monoinfected and human immunodeficiency virus (HIV)/HBV-coinfected patients. Tenofovir disoproxil fumarate (TDF) plays a pivotal role in antiretroviral therapy (ART) of HIV/HBV-coinfected patients. The influence of HBV genotypes on the response to antiviral drugs, particularly TDF, is poorly understood. METHODS: HIV/HBV-coinfected participants with detectable HBV DNA prior to TDF therapy were selected from the Swiss HIV Cohort Study. HBV genotypes were identified and resistance testing was performed prior to antiviral therapy, and in patients with delayed treatment response (>6 months). The efficacy of TDF to suppress HBV (HBV DNA <20 IU/mL) and the influence of HBV genotypes were determined. RESULTS: 143 HIV/HBV-coinfected participants with detectable HBV DNA were identified. The predominant HBV genotypes were A (82 patients, 57 %) and D (35 patients, 24 %); 20 patients (14 %) were infected with multiple genotypes (3 % A + D and 11 % A + G); and genotypes B, C and E were each present in two patients (1 %). TDF completely suppressed HBV DNA in 131 patients (92 %) within 6 months; in 12 patients (8 %), HBV DNA suppression was delayed. No HBV resistance mutations to TDF were found in patients with delayed response, but all were infected with HBV genotype A (among these, 5 patients with genotype A + G), and all had previously been exposed to lamivudine. CONCLUSION: In HIV/HBV-coinfected patients, infection with multiple HBV genotypes was more frequent than previously reported. The large majority of patients had an undetectable HBV viral load at six months of TDF-containing ART. In patients without viral suppression, no TDF-related resistance mutations were found. The role of specific genotypes and prior lamivudine treatment in the delayed response to TDF warrants further investigation.
Abstract:
BACKGROUND: Artemisinin-resistant Plasmodium falciparum has emerged in the Greater Mekong sub-region and poses a major global public health threat. Slow parasite clearance is a key clinical manifestation of reduced susceptibility to artemisinin. This study was designed to establish the baseline values for clearance in patients from Sub-Saharan African countries with uncomplicated malaria treated with artemisinin-based combination therapies (ACTs). METHODS: A literature review in PubMed was conducted in March 2013 to identify all prospective clinical trials (uncontrolled trials, controlled trials and randomized controlled trials), including ACTs conducted in Sub-Saharan Africa, between 1960 and 2012. Individual patient data from these studies were shared with the WorldWide Antimalarial Resistance Network (WWARN) and pooled using an a priori statistical analytical plan. Factors affecting early parasitological response were investigated using logistic regression with study sites fitted as a random effect. The risk of bias in included studies was evaluated based on study design, methodology and missing data. RESULTS: In total, 29,493 patients from 84 clinical trials were included in the analysis, treated with artemether-lumefantrine (n = 13,664), artesunate-amodiaquine (n = 11,337) and dihydroartemisinin-piperaquine (n = 4,492). The overall parasite clearance rate was rapid. The parasite positivity rate (PPR) decreased from 59.7 % (95 % CI: 54.5-64.9) on day 1 to 6.7 % (95 % CI: 4.8-8.7) on day 2 and 0.9 % (95 % CI: 0.5-1.2) on day 3. The 95th percentile of observed day 3 PPR was 5.3 %. 
Independent risk factors predictive of day 3 positivity were: high baseline parasitaemia (adjusted odds ratio (AOR) = 1.16 (95 % CI: 1.08-1.25) per 2-fold increase in parasite density, P <0.001); fever (>37.5 °C) (AOR = 1.50 (95 % CI: 1.06-2.13), P = 0.022); severe anaemia (AOR = 2.04 (95 % CI: 1.21-3.44), P = 0.008); areas of low/moderate transmission (AOR = 2.71 (95 % CI: 1.38-5.36), P = 0.004); and treatment with the loose formulation of artesunate-amodiaquine (AOR = 2.27 (95 % CI: 1.14-4.51), P = 0.020, compared to dihydroartemisinin-piperaquine). CONCLUSIONS: The three ACTs assessed in this analysis continue to achieve rapid early parasitological clearance across the sites assessed in Sub-Saharan Africa. A threshold of 5 % day 3 parasite positivity from a minimum sample size of 50 patients provides a more sensitive benchmark in Sub-Saharan Africa compared to the currently recommended threshold of 10 % to trigger further investigation of artemisinin susceptibility.
Abstract:
BACKGROUND: Transmitted human immunodeficiency virus type 1 (HIV) drug resistance (TDR) mutations are transmitted from nonresponding patients (defined as patients with no initial response to treatment and those with an initial response for whom treatment later failed) or from patients who are naive to treatment. Although the prevalence of drug resistance in patients who are not responding to treatment has declined in developed countries, the prevalence of TDR mutations has not. The mechanisms causing this paradox are poorly explored. METHODS: We included recently infected, treatment-naive patients with genotypic resistance tests performed ≤1 year after infection and before 2013. Potential risk factors for TDR mutations were analyzed using logistic regression. The association between the prevalence of TDR mutations and population viral load (PVL) among treated patients during 1997-2011 was estimated with Poisson regression for all TDR mutations and individually for the most frequent resistance mutations against each drug class (ie, M184V/L90M/K103N). RESULTS: We included 2421 recently infected, treatment-naive patients and 5399 patients with no response to treatment. The prevalence of TDR mutations fluctuated considerably over time. Two opposing developments could explain these fluctuations: generally continuous increases in the prevalence of TDR mutations (odds ratio, 1.13; P = .010), punctuated by sharp decreases in the prevalence when new drug classes were introduced. Overall, the prevalence of TDR mutations increased with decreasing PVL (rate ratio [RR], 0.91 per 1000 decrease in PVL; P = .033). Additionally, we observed that the transmitted high-fitness-cost mutation M184V was positively associated with the PVL of nonresponding patients carrying M184V (RR, 1.50 per 100 increase in PVL; P < .001). Such an association was absent for K103N (RR, 1.00 per 100 increase in PVL; P = .99) and negative for L90M (RR, 0.75 per 100 increase in PVL; P = .022).
CONCLUSIONS: Transmission of antiretroviral drug resistance is temporarily reduced by the introduction of new drug classes and is driven by both nonresponding and treatment-naive patients. These findings suggest a continuous need for new drugs and for early detection and treatment of HIV-1 infection.
Abstract:
BACKGROUND: Diagnosing pediatric pneumonia is challenging in low-resource settings. The World Health Organization (WHO) has defined primary end-point radiological pneumonia for use in epidemiological and vaccine studies. However, radiography requires expertise and is often inaccessible. We hypothesized that plasma biomarkers of inflammation and endothelial activation may be useful surrogates for end-point pneumonia, and may provide insight into its biological significance. METHODS: We studied children with WHO-defined clinical pneumonia (n = 155) within a prospective cohort of 1,005 consecutive febrile children presenting to Tanzanian outpatient clinics. Based on x-ray findings, participants were categorized as primary end-point pneumonia (n = 30), other infiltrates (n = 31), or normal chest x-ray (n = 94). Plasma levels of 7 host response biomarkers at presentation were measured by ELISA. Associations between biomarker levels and radiological findings were assessed by Kruskal-Wallis test and multivariable logistic regression. Biomarker ability to predict radiological findings was evaluated using receiver operating characteristic curve analysis and Classification and Regression Tree analysis. RESULTS: Compared to children with normal x-ray, children with end-point pneumonia had significantly higher C-reactive protein, procalcitonin and Chitinase 3-like-1, while those with other infiltrates had elevated procalcitonin and von Willebrand Factor and decreased soluble Tie-2 and endoglin. Clinical variables were not predictive of radiological findings. Classification and Regression Tree analysis generated multi-marker models with improved performance over single markers for discriminating between groups. 
A model based on C-reactive protein and Chitinase 3-like-1 discriminated between end-point pneumonia and non-end-point pneumonia with 93.3% sensitivity (95% confidence interval 76.5-98.8), 80.8% specificity (72.6-87.1), positive likelihood ratio 4.9 (3.4-7.1), negative likelihood ratio 0.083 (0.022-0.32), and misclassification rate 0.20 (standard error 0.038). CONCLUSIONS: In Tanzanian children with WHO-defined clinical pneumonia, combinations of host biomarkers distinguished between end-point pneumonia, other infiltrates, and normal chest x-ray, whereas clinical variables did not. These findings generate pathophysiological hypotheses and may have potential research and clinical utility.
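The positive and negative likelihood ratios reported above follow from the standard definitions LR+ = sensitivity / (1 − specificity) and LR− = (1 − sensitivity) / specificity. A quick check against the reported point estimates (93.3 % sensitivity, 80.8 % specificity) reproduces the published values; only the point estimates are used here, not the confidence intervals.

```python
# Recomputing the reported likelihood ratios from sensitivity/specificity
# using the standard definitions (point estimates only).

sens, spec = 0.933, 0.808
lr_pos = sens / (1 - spec)   # positive likelihood ratio
lr_neg = (1 - sens) / spec   # negative likelihood ratio

print(round(lr_pos, 1))   # → 4.9, matching the reported LR+
print(round(lr_neg, 3))   # → 0.083, matching the reported LR-
```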
Abstract:
Lactate may represent a supplemental fuel for the brain. We examined cerebral lactate metabolism during prolonged brain glucose depletion (GD) in acute brain injury (ABI) patients monitored with cerebral microdialysis (CMD). Sixty episodes of GD (defined as spontaneous decreases of CMD glucose from normal to low [<1.0 mmol/L] for at least 2 h) were identified among 26 patients. During GD, we found a significant increase of CMD lactate (from 4±2.3 to 5.4±2.9 mmol/L), pyruvate (126.9±65.1 to 172.3±74.1 μmol/L), and lactate/pyruvate ratio (LPR; 27±6 to 35±9; all, p<0.005), while brain oxygen and blood lactate remained normal. The dynamics of lactate and glucose supply during GD were further studied by analyzing the relationships between blood and CMD samples. There was a strong correlation between blood and brain lactate when LPR was normal (r=0.56; p<0.0001), while an inverse correlation (r=-0.11; p=0.04) was observed at elevated LPR >25. The correlation between blood and brain glucose also decreased from r=0.62 to r=0.45. These findings in ABI patients suggest increased cerebral lactate delivery in the absence of brain hypoxia when glucose availability is limited and support the concept that lactate acts as an alternative fuel.
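The LPR is a dimensionless ratio, so the CMD lactate concentration (reported in mmol/L) must be converted to the same unit as pyruvate (μmol/L) before dividing. The sketch below illustrates this unit conversion using the pre-GD group means reported above as example inputs; note that the mean of per-sample LPRs (27±6) need not equal the ratio of the group means, so the result is only a consistency illustration, not a reproduction of the published figure.

```python
# Lactate/pyruvate ratio (LPR) with the required unit conversion.
# Inputs are the pre-GD group means from the abstract, used for illustration.

lactate_mmol_per_l = 4.0     # CMD lactate before glucose depletion (mmol/L)
pyruvate_umol_per_l = 126.9  # CMD pyruvate before glucose depletion (umol/L)

lpr = (lactate_mmol_per_l * 1000) / pyruvate_umol_per_l  # mmol/L → umol/L
print(round(lpr, 1))  # → 31.5 (ratio of means; mean per-sample LPR was 27±6)
```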
Abstract:
PURPOSE: Pretreatment measurements of the systemic inflammatory response, including the Glasgow prognostic score (GPS), the neutrophil-to-lymphocyte ratio (NLR), the monocyte-to-lymphocyte ratio (MLR), the platelet-to-lymphocyte ratio (PLR) and the prognostic nutritional index (PNI), have been recognized as prognostic factors in clear cell renal cell carcinoma (CCRCC), but to date no study has compared these markers. METHODS: We evaluated the pretreatment GPS, NLR, MLR, PLR and PNI in 430 patients who underwent surgery for clinically localized CCRCC (pT1-3N0M0). Associations with disease-free survival were assessed with Cox models. Discrimination was measured with the C-index, and a decision curve analysis was used to evaluate the clinical net benefit. RESULTS: On multivariable analyses, all measures of systemic inflammatory response were significant prognostic factors. The increase in discrimination compared with the stage, size, grade and necrosis (SSIGN) score alone was 5.8 % for the GPS, 1.1-1.4 % for the NLR, 2.9-3.4 % for the MLR, 2.0-3.3 % for the PLR and 1.4-3.0 % for the PNI. On the simultaneous multivariable analysis of all candidate measures, the final multivariable model contained the SSIGN score (HR 1.40, P < 0.001), the GPS (HR 2.32, P < 0.001) and the MLR (HR 5.78, P = 0.003) as significant variables. Adding both the GPS and the MLR increased the discrimination of the SSIGN score by 6.2 % and improved the clinical net benefit. CONCLUSIONS: In patients with clinically localized CCRCC, the GPS and the MLR appear to be the most relevant prognostic measures of systemic inflammatory response. They may be used as an adjunct for patient counseling, tailoring management and clinical trial design.
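The NLR, MLR and PLR discussed above are simple ratios of absolute counts from a differential blood count (the GPS and PNI are composite scores and are not sketched here). A minimal helper is below; the example count values are hypothetical and chosen only to show the arithmetic.

```python
# Blood-count inflammation ratios as used above: each is an absolute cell
# count divided by the absolute lymphocyte count. Example values are
# hypothetical illustrations, not data from the study.

def inflammation_ratios(neutrophils: float, lymphocytes: float,
                        monocytes: float, platelets: float) -> dict:
    """All counts in the same unit (e.g. 10^9 cells/L); returns NLR/MLR/PLR."""
    return {
        "NLR": neutrophils / lymphocytes,
        "MLR": monocytes / lymphocytes,
        "PLR": platelets / lymphocytes,
    }

r = inflammation_ratios(neutrophils=4.2, lymphocytes=2.0,
                        monocytes=0.5, platelets=250.0)
print(r)  # NLR 2.1, MLR 0.25, PLR 125.0 for these example counts
```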
Abstract:
The adult sex ratio (ASR) is a key parameter of the demography of human and other animal populations, yet the causes of variation in ASR, how individuals respond to this variation, and how their response feeds back into population dynamics remain poorly understood. A prevalent hypothesis is that ASR is regulated by intrasexual competition, which would cause more mortality or emigration in the sex of increasing frequency. Our experimental manipulation of populations of the common lizard (Lacerta vivipara) shows the opposite effect. Male mortality and emigration are not higher under male-biased ASR. Rather, an excess of adult males begets aggression toward adult females, whose survival and fecundity drop, along with their emigration rate. The ensuing prediction that adult male skew should be amplified and total population size should decline is supported by long-term data. Numerical projections show that this amplifying effect causes a major risk of population extinction. In general, such an "evolutionary trap" toward extinction threatens populations in which there is a substantial mating cost for females, and environmental changes or management practices skew the ASR toward males.