62 results for microcystin-LR and -RR


Relevance: 30.00%

Abstract:

BACKGROUND: Elevated plasma fibrinogen levels have prospectively been associated with an increased risk of coronary artery disease in different populations. Plasma fibrinogen is a measure of systemic inflammation crucially involved in atherosclerosis. The vagus nerve curtails inflammation via a cholinergic anti-inflammatory pathway. We hypothesized that lower vagal control of the heart relates to higher plasma fibrinogen levels. METHODS: Study participants were 559 employees (age 17-63 years; 89% men) of an airplane manufacturing plant in southern Germany. All subjects underwent medical examination, blood sampling, and 24-hour ambulatory heart rate recording while following their normal work routine. The root mean square of successive differences in RR intervals during the night period (nighttime RMSSD) was computed as the heart rate variability index of vagal function. RESULTS: After controlling for demographic, lifestyle, and medical factors, nighttime RMSSD explained 1.7% (P = 0.001), 0.8% (P = 0.033), and 7.8% (P = 0.007), respectively, of the variance in fibrinogen levels in all subjects, men, and women. Nighttime RMSSD and fibrinogen levels were more strongly correlated in women than in men. In all workers, men, and women, respectively, there was a mean ± SEM increase of 0.41 ± 0.13 mg/dL, 0.28 ± 0.13 mg/dL, and 1.16 ± 0.41 mg/dL in fibrinogen for each millisecond decrease in nighttime RMSSD. CONCLUSIONS: Reduced vagal outflow to the heart correlated with elevated plasma fibrinogen levels independent of established cardiovascular risk factors. This relationship appeared stronger in women than in men. Such an autonomic mechanism might contribute to the atherosclerotic process and its thrombotic complications.
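For reference, RMSSD is a standard time-domain heart rate variability statistic. A minimal sketch of its computation, with illustrative interval values rather than study data:

```python
# Minimal sketch of the RMSSD computation described above. `rr_ms`
# stands for successive normal-to-normal RR intervals in milliseconds
# from the night recording; the values are placeholders.
import math

def rmssd(rr_ms):
    """Root mean square of successive differences in RR intervals."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

print(rmssd([812, 830, 825, 841, 818]))  # ~16.8 ms
```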

Relevance: 30.00%

Abstract:

BACKGROUND: Treatment strategies for acute basilar artery occlusion (BAO) are based on case series and data that have been extrapolated from stroke intervention trials in other cerebrovascular territories, and information on the efficacy of different treatments in unselected patients with BAO is scarce. We therefore assessed outcomes and differences in treatment response after BAO. METHODS: The Basilar Artery International Cooperation Study (BASICS) is a prospective, observational registry of consecutive patients who presented with an acute symptomatic and radiologically confirmed BAO between November 1, 2002, and October 1, 2007. Stroke severity at time of treatment was dichotomised as severe (coma, locked-in state, or tetraplegia) or mild to moderate (any deficit that was less than severe). Outcome was assessed at 1 month. Poor outcome was defined as a modified Rankin scale score of 4 or 5, or death. Patients were divided into three groups according to the treatment they received: antithrombotic treatment only (AT), which comprised antiplatelet drugs or systemic anticoagulation; primary intravenous thrombolysis (IVT), including subsequent intra-arterial thrombolysis; or intra-arterial therapy (IAT), which comprised thrombolysis, mechanical thrombectomy, stenting, or a combination of these approaches. Risk ratios (RR) for treatment effects were adjusted for age, the severity of neurological deficits at the time of treatment, time to treatment, prodromal minor stroke, location of the occlusion, and diabetes. FINDINGS: 619 patients were entered in the registry. 27 patients were excluded from the analyses because they did not receive AT, IVT, or IAT, and all had a poor outcome. Of the 592 patients who were analysed, 183 were treated with only AT, 121 with IVT, and 288 with IAT. Overall, 402 (68%) of the analysed patients had a poor outcome. No statistically significant superiority was found for any treatment strategy. Compared with outcome after AT, patients with a mild-to-moderate deficit (n=245) had about the same risk of poor outcome after IVT (adjusted RR 0.94, 95% CI 0.60-1.45) or after IAT (adjusted RR 1.29, 0.97-1.72) but had a worse outcome after IAT compared with IVT (adjusted RR 1.49, 1.00-2.23). Compared with AT, patients with a severe deficit (n=347) had a lower risk of poor outcome after IVT (adjusted RR 0.88, 0.76-1.01) or IAT (adjusted RR 0.94, 0.86-1.02), whereas outcomes were similar after treatment with IAT or IVT (adjusted RR 1.06, 0.91-1.22). INTERPRETATION: Most patients in the BASICS registry received IAT. Our results do not support unequivocal superiority of IAT over IVT, and the efficacy of IAT versus IVT in patients with an acute BAO needs to be assessed in a randomised controlled trial. FUNDING: Department of Neurology, University Medical Center Utrecht.
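For context, the unadjusted statistic underlying these comparisons can be computed directly from 2×2 counts; the registry's adjusted RRs additionally control for the covariates listed above. A sketch with placeholder counts, not BASICS data:

```python
# Crude risk ratio with a large-sample (Wald) 95% CI on the log scale.
# Counts are illustrative placeholders, not BASICS registry data.
import math

def risk_ratio(events_a, n_a, events_b, n_b):
    rr = (events_a / n_a) / (events_b / n_b)
    se_log = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, (lo, hi)

print(risk_ratio(70, 100, 60, 100))  # RR ~1.17, 95% CI ~(0.95, 1.43)
```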

Relevance: 30.00%

Abstract:

BACKGROUND Improved survival among HIV-infected individuals on antiretroviral therapy (ART) has focused attention on AIDS-related cancers including Kaposi sarcoma (KS). However, the effect of KS on response to ART is not well described in Southern Africa. We assessed the effect of KS on survival and on immunologic and virologic treatment responses at 6 and 12 months after initiation of ART. METHODS We analyzed prospectively collected data from a cohort of HIV-infected adults initiating ART in South Africa. Differences in mortality between those with and without KS at ART initiation were estimated with Cox proportional hazard models. Log-binomial models were used to assess differences in CD4 count response and HIV virologic suppression within a year of initiating treatment. RESULTS Between January 2001 and January 2008, 13,847 HIV-infected adults initiated ART at the study clinics. Those with KS at ART initiation (n = 247; 2%) were similar to those without KS (n = 13,600; 98%) with respect to age (35 vs. 35 years), presenting CD4 count (74 vs. 85 cells/mm³), and proportion on TB treatment (37% vs. 30%). In models adjusted for sex, baseline CD4 count, age, treatment site, tuberculosis, and year of ART initiation, KS patients were over three times more likely to have died at any time after ART initiation (hazard ratio [HR]: 3.62; 95% CI: 2.71-4.84) than those without KS. The increased risk was highest within the first year on ART (HR: 4.05; 95% CI: 2.95-5.55) and attenuated thereafter (HR: 2.30; 95% CI: 1.08-4.89). Those with KS also gained, on average, 29 fewer CD4 cells (95% CI: 7-52 cells/mm³) and were less likely to increase their CD4 count by 50 cells from baseline (RR: 1.43; 95% CI: 0.99-2.06) within the first 6 months of treatment. CONCLUSIONS HIV-infected adults presenting with KS have an increased risk of mortality even after initiation of ART, with the greatest risk in the first year. Among those who survived the first year on therapy, subjects with KS demonstrated a poorer immunologic response to ART than those without KS.
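A minimal sketch of the kind of covariate-adjusted Cox model named above, using the lifelines package; the column names and toy data are invented for illustration and are not the study's schema:

```python
# Hypothetical Cox proportional hazards fit; data are placeholders.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days_on_art":    [120, 365, 80, 365, 200, 365, 50, 300],
    "died":           [1,   0,   1,  0,   0,   0,   1,  1],
    "ks_at_baseline": [1,   0,   1,  0,   1,   0,   0,  1],
    "baseline_cd4":   [60,  90,  40, 110, 75,  120, 55, 65],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_on_art", event_col="died")
print(cph.summary["exp(coef)"])  # hazard ratio per covariate
```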

Relevance: 30.00%

Abstract:

OBJECTIVES This study sought to report the final 5-year follow-up of the landmark LEADERS (Limus Eluted From A Durable Versus ERodable Stent Coating) trial. BACKGROUND The LEADERS trial is the first randomized study to evaluate biodegradable polymer-based drug-eluting stents (DES) against durable polymer DES. METHODS The LEADERS trial was a 10-center, assessor-blind, noninferiority, "all-comers" trial (N = 1,707). All patients were centrally randomized to treatment with either biodegradable polymer biolimus-eluting stents (BES) (n = 857) or durable polymer sirolimus-eluting stents (SES) (n = 850). The primary endpoint was a composite of cardiac death, myocardial infarction (MI), or clinically indicated target vessel revascularization within 9 months. Secondary endpoints included an extension of the primary endpoint to 5 years and stent thrombosis (ST) (Academic Research Consortium definition). Analysis was by intention to treat. RESULTS At 5 years, the BES was noninferior to the SES for the primary endpoint (186 [22.3%] vs. 216 [26.1%], rate ratio [RR]: 0.83 [95% confidence interval (CI): 0.68 to 1.02], p for noninferiority <0.0001, p for superiority = 0.069). The BES was associated with a significant reduction in the more comprehensive patient-oriented composite endpoint of all-cause death, any MI, and all-cause revascularization (297 [35.1%] vs. 339 [40.4%], RR: 0.84 [95% CI: 0.71 to 0.98], p for superiority = 0.023). A significant reduction in very late definite ST from 1 to 5 years was evident with the BES (n = 5 [0.7%] vs. n = 19 [2.5%], RR: 0.26 [95% CI: 0.10 to 0.68], p = 0.003), corresponding to a significant reduction in ST-associated clinical events (primary endpoint) over the same period (n = 3 of 749 vs. n = 14 of 738, RR: 0.20 [95% CI: 0.06 to 0.71], p = 0.005). CONCLUSIONS The safety benefit of the biodegradable polymer BES, compared with the durable polymer SES, was related to a significant reduction in very late ST (>1 year) and the associated composite clinical outcomes. (Limus Eluted From A Durable Versus ERodable Stent Coating [LEADERS] trial; NCT00389220).

Relevance: 30.00%

Abstract:

BACKGROUND Partner notification is essential to the comprehensive case management of sexually transmitted infections. Systematic reviews and mathematical modelling can be used to synthesise information about the effects of new interventions to enhance the outcomes of partner notification. OBJECTIVE To study the effectiveness and cost-effectiveness of traditional and new partner notification technologies for curable sexually transmitted infections (STIs). DESIGN Secondary data analysis of clinical audit data; systematic reviews of randomised controlled trials (MEDLINE, EMBASE and Cochrane Central Register of Controlled Trials) published from 1 January 1966 to 31 August 2012 and of studies of health-related quality of life (HRQL) [MEDLINE, EMBASE, ISI Web of Knowledge, NHS Economic Evaluation Database (NHS EED), Database of Abstracts of Reviews of Effects (DARE) and Health Technology Assessment (HTA)] published from 1 January 1980 to 31 December 2011; static models of clinical effectiveness and cost-effectiveness; and dynamic modelling studies to improve parameter estimation and examine effectiveness. SETTING General population and genitourinary medicine clinic attenders. PARTICIPANTS Heterosexual women and men. INTERVENTIONS Traditional partner notification by patient or provider referral, and new partner notification by expedited partner therapy (EPT) or its UK equivalent, accelerated partner therapy (APT). MAIN OUTCOME MEASURES Population prevalence; index case reinfection; and partners treated per index case. RESULTS Expedited partner therapy reduced reinfection in index cases with curable STIs more than simple patient referral [risk ratio (RR) 0.71; 95% confidence interval (CI) 0.56 to 0.89]. There are no randomised trials of APT. The median number of partners treated for chlamydia per index case in UK clinics was 0.60. The number of partners needed to treat to interrupt transmission of chlamydia was lower for casual than for regular partners. In dynamic model simulations, >10% of partners are chlamydia positive with look-back periods of up to 18 months. In the presence of a chlamydia screening programme that reduces population prevalence, treatment of current partners achieves most of the additional reduction in prevalence attributable to partner notification. Dynamic model simulations show that cotesting and treatment for chlamydia and gonorrhoea reduce the prevalence of both STIs. APT has a limited additional effect on prevalence but reduces the rate of index case reinfection. Published quality-adjusted life-year (QALY) weights were of insufficient quality to be used in a cost-effectiveness study of partner notification in this project. Using an intermediate outcome of cost per infection diagnosed, doubling the efficacy of partner notification from 0.4 to 0.8 partners treated per index case was more cost-effective than increasing chlamydia screening coverage. CONCLUSIONS There is evidence to support the improved clinical effectiveness of EPT in reducing index case reinfection. In a general heterosexual population, partner notification identifies new infected cases, but its impact on chlamydia prevalence is limited. Notifying casual partners might have a greater impact than notifying regular partners in genitourinary clinic populations.
Recommendations for future research are (1) to conduct randomised controlled trials, using biological outcomes, of the effectiveness of APT and of methods to increase testing for human immunodeficiency virus (HIV) and STIs after APT; (2) to prioritise the collection of HRQL data so that the QALYs associated with the sequelae of curable STIs can be determined; and (3) to develop standardised parameter sets for curable STIs for use in mathematical models of STI transmission that inform policy-making. FUNDING The National Institute for Health Research Health Technology Assessment programme.
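The intermediate outcome used above reduces to a simple ratio. A hedged sketch with entirely invented numbers (the report's actual cost model is far more detailed):

```python
# Toy "cost per infection diagnosed" calculation; every number is an
# invented placeholder, not a figure from the HTA report.
def cost_per_infection(cost_per_index_case, partners_treated, positivity):
    infections_found = partners_treated * positivity
    return cost_per_index_case / infections_found

for partners_per_case in (0.4, 0.8):
    print(partners_per_case,
          round(cost_per_infection(50.0, partners_per_case, 0.4), 1))
# Doubling partners treated per index case halves the cost per infection
# diagnosed when the per-case cost is held fixed (a simplification).
```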

Relevance: 30.00%

Abstract:

OBJECTIVES Mortality in patients starting antiretroviral therapy (ART) is higher in Malawi and Zambia than in South Africa. We examined whether different monitoring of ART (viral load [VL] in South Africa and CD4 count in Malawi and Zambia) could explain this mortality difference. DESIGN Mathematical modelling study based on data from ART programmes. METHODS We used a stochastic simulation model to study the effect of VL monitoring on mortality over 5 years. In baseline scenario A, all parameters were identical between strategies except for more timely and complete detection of treatment failure with VL monitoring. Additional scenarios introduced delays in switching to second-line ART (scenario B) or higher virologic failure rates (due to worse adherence) when monitoring was based on CD4 counts only (scenario C). Results are presented as relative risks (RR) with 95% prediction intervals and as the percentage of the observed mortality difference explained. RESULTS RRs comparing VL with CD4 cell count monitoring were 0.94 (0.74-1.03) in scenario A, 0.94 (0.77-1.02) with delayed switching (scenario B), and 0.80 (0.44-1.07) when assuming a threefold higher failure rate (scenario C). The observed mortality at 3 years was 10.9% in Malawi and Zambia and 8.6% in South Africa (absolute difference 2.3%). The percentage of the mortality difference explained by VL monitoring ranged from 4% (scenario A) to 32% (scenarios B and C combined, assuming a threefold higher failure rate). Eleven percent was explained by non-HIV-related mortality. CONCLUSIONS VL monitoring reduces mortality moderately when improved adherence and decreased failure rates are assumed.
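The "percentage of the mortality difference explained" reduces to simple arithmetic: the simulated mortality reduction attributable to VL monitoring divided by the observed between-country difference. A sketch using the 3-year figures above, where the simulated reduction is a back-calculated placeholder rather than a value from the paper:

```python
# Illustrative arithmetic for "percent of mortality difference explained".
observed_mw_zm = 0.109  # 3-year mortality, Malawi/Zambia (CD4 monitoring)
observed_sa = 0.086     # 3-year mortality, South Africa (VL monitoring)
observed_diff = observed_mw_zm - observed_sa  # 0.023, i.e. 2.3%

# Back-calculated so the output matches the reported 4% for scenario A;
# this is not a number taken from the paper.
simulated_reduction = 0.0009
print(f"{simulated_reduction / observed_diff:.0%} explained")  # -> 4%
```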

Relevance: 30.00%

Abstract:

OBJECTIVES This study sought to assess the efficacy and safety of newer-generation drug-eluting stents (DES) compared with bare-metal stents (BMS) in an appropriately powered population of patients with ST-segment elevation myocardial infarction (STEMI). BACKGROUND Among patients with STEMI, early generation DES improved efficacy but not safety compared with BMS. Newer-generation DES (everolimus-eluting and biolimus A9-eluting stents) have been shown to improve clinical outcomes compared with early generation DES. METHODS Individual patient data for 2,665 STEMI patients enrolled in 2 large-scale randomized clinical trials comparing newer-generation DES with BMS were pooled: 1,326 patients received a newer-generation DES (everolimus-eluting stent or biolimus A9-eluting stent), whereas the remaining 1,329 patients received a BMS. Random-effects models were used to assess differences between the 2 groups for the device-oriented composite endpoint of cardiac death, target-vessel reinfarction, and target-lesion revascularization and for the patient-oriented composite endpoint of all-cause death, any infarction, and any revascularization at 1 year. RESULTS Newer-generation DES substantially reduced the risk of the device-oriented composite endpoint compared with BMS at 1 year (relative risk [RR]: 0.58; 95% confidence interval [CI]: 0.43 to 0.79; p = 0.0004). Similarly, the risk of the patient-oriented composite endpoint was lower with newer-generation DES than with BMS (RR: 0.78; 95% CI: 0.63 to 0.96; p = 0.02). Differences in favor of newer-generation DES were driven by both a lower risk of repeat revascularization of the target lesion (RR: 0.33; 95% CI: 0.20 to 0.52; p < 0.0001) and a lower risk of target-vessel reinfarction (RR: 0.36; 95% CI: 0.14 to 0.92; p = 0.03). Newer-generation DES also reduced the risk of definite stent thrombosis (RR: 0.35; 95% CI: 0.16 to 0.75; p = 0.006) compared with BMS. CONCLUSIONS Among patients with STEMI, newer-generation DES improve safety and efficacy compared with BMS throughout 1 year. It remains to be determined whether the differences in favor of newer-generation DES are sustained during long-term follow-up.
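A minimal sketch of random-effects pooling in the DerSimonian-Laird style, one common way to combine trial-level effects such as those above; the inputs are placeholder log risk ratios and variances, not the two trials' data:

```python
# DerSimonian-Laird random-effects pooling of log risk ratios.
# Inputs below are illustrative placeholders, not the trial data.
import math

def dersimonian_laird(log_rr, var):
    k = len(log_rr)
    w = [1 / v for v in var]                      # fixed-effect weights
    y_fe = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    q = sum(wi * (y - y_fe) ** 2 for wi, y in zip(w, log_rr))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)            # between-trial variance
    w_re = [1 / (v + tau2) for v in var]
    pooled = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))

print(dersimonian_laird([math.log(0.55), math.log(0.62)], [0.04, 0.05]))
```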

Relevance: 30.00%

Abstract:

Objectives: To update the 2006 systematic review of the comparative benefits and harms of erythropoiesis-stimulating agent (ESA) strategies and non-ESA strategies to manage anemia in patients undergoing chemotherapy and/or radiation for malignancy (excluding myelodysplastic syndrome and acute leukemia), including the impact of alternative thresholds for initiating treatment and the optimal duration of therapy. Data sources: Literature searches were updated in electronic databases (n=3), conference proceedings (n=3), and Food and Drug Administration transcripts. Multiple sources (n=13) were searched for potential gray literature. A primary source for current survival evidence was a recently published individual patient data meta-analysis, in which patient data were obtained from investigators for studies enrolling more than 50 patients per arm. Because those data are the most current available for this update, as well as the source for on-study (active treatment) mortality data, we limited inclusion in the current report to studies enrolling more than 50 patients per arm to avoid potential differential endpoint ascertainment in smaller studies. Review methods: Title and abstract screening was performed by one or two (to resolve uncertainty) reviewers; potentially included publications were reviewed in full text. Two or three (to resolve disagreements) reviewers assessed trial quality. Results were independently verified and pooled for outcomes of interest. The balance of benefits and harms was examined in a decision model. Results: We evaluated evidence from 5 trials directly comparing darbepoetin with epoetin, 41 trials comparing epoetin with control, and 8 trials comparing darbepoetin with control; 5 trials evaluated early versus late (delay until Hb ≤9 to 11 g/dL) treatment. Trials varied according to duration, tumor types, cancer therapy, trial quality, iron supplementation, baseline hemoglobin, ESA dosing frequency (and therefore amount per dose), and dose escalation. ESAs decreased the risk of transfusion (pooled relative risk [RR], 0.58; 95% confidence interval [CI], 0.53 to 0.64; I² = 51%; 38 trials) without evidence of a meaningful difference between epoetin and darbepoetin. Thromboembolic event rates were higher in ESA-treated patients (pooled RR, 1.51; 95% CI, 1.30 to 1.74; I² = 0%; 37 trials), again without a difference between epoetin and darbepoetin. In 14 trials reporting the Functional Assessment of Cancer Therapy (FACT)-Fatigue subscale, the most common patient-reported outcome, scores changed by −0.6 in control arms (95% CI, −6.4 to 5.2; I² = 0%) and by +2.1 in ESA arms (95% CI, −3.9 to 8.1; I² = 0%). There were fewer thromboembolic and on-study mortality adverse events when ESA treatment was delayed until baseline Hb was less than 10 g/dL, in keeping with current treatment practice, but the difference in effect from early treatment was not significant, and the evidence was limited and insufficient for conclusions. No evidence informed the optimal duration of therapy. Mortality was increased during the on-study period (pooled hazard ratio [HR], 1.17; 95% CI, 1.04 to 1.31; I² = 0%; 37 trials). There was one additional death for every 59 treated patients when the control-arm on-study mortality was 10 percent, and one additional death for every 588 treated patients when the control-arm on-study mortality was 1 percent. A cohort decision model yielded a consistent result: greater loss of life-years when control-arm on-study mortality was higher.
There was no discernible increase in mortality with ESA use over the longest available follow-up (pooled HR, 1.04; 95% CI, 0.99 to 1.10; I² = 38%; 44 trials), but many trials did not include an overall survival endpoint, and potential time-dependent confounding was not considered. Conclusions: Results of this update were consistent with the 2006 review. ESAs reduced the need for transfusions and increased the risk of thromboembolism. FACT-Fatigue scores were better with ESA use, but the improvement was smaller than the minimal clinically important difference. An increase in mortality accompanied the use of ESAs. An important unanswered question is whether dosing practices and overall ESA exposure might influence harms.
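The "one additional death per N treated" figures follow from short arithmetic once the risk ratio is approximated by the pooled on-study hazard ratio; a sketch of that approximation (a simplification; the report's exact calculation may differ):

```python
# Worked arithmetic behind the "one additional death per N treated"
# figures, approximating the risk ratio by the pooled on-study HR of
# 1.17 (a simplification; the report's calculation may differ).
hr = 1.17
for control_mortality in (0.10, 0.01):
    excess_risk = control_mortality * (hr - 1)  # absolute risk increase
    print(f"control mortality {control_mortality:.0%}: "
          f"one extra death per {1 / excess_risk:.0f} treated")
# -> roughly 1 per 59 and 1 per 588, matching the figures above
```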

Relevance: 30.00%

Abstract:

This study compared Pundamilia nyererei and Pundamilia pundamilia males in routine metabolic rate (RR) and in the metabolic costs males pay during territorial interactions (active metabolic rate, RA). Pundamilia nyererei and P. pundamilia males housed in social isolation did not differ in RR. In contrast to expectation, however, P. nyererei males used less oxygen than P. pundamilia males for a given mass and level of agonistic activity. This increased metabolic efficiency may be an adaptation that limits the metabolic cost P. nyererei males pay for their higher rate of aggressiveness compared with P. pundamilia males. Thus, the divergence between the species in agonistic behaviour is correlated with metabolic differentiation. Such concerted divergence in physiology and behaviour might be widespread in the dramatically diverse cichlid radiations of the East African lakes and may be an important factor in the remarkably rapid speciation of these fishes. The results did not support the hypothesis that higher metabolic rates impose a physiological cost on P. nyererei males that would offset their dominance advantage.

Relevance: 30.00%

Abstract:

The goal of the present study was to evaluate the effect of different methods of rubber-ring castration on acute and chronic pain in calves. Sixty-three 4-6-week-old calves were randomly and sequentially allocated to one of five groups: group RR (traditional rubber-ring castration); group BRR (combination of one rubber ring with Burdizzo); group Rcut (one rubber ring applied, with the scrotal tissue and rubber ring removed on day 9); group 3RR (three rubber rings placed one above the other around the scrotal neck); and group CO (controls; sham-castrated). All calves received 0.2 mL/kg bodyweight lidocaine 2%, injected into the spermatic cords and around the scrotal neck 15 min before castration. The presence of acute and chronic pain was assessed using plasma cortisol concentrations, response to palpation of the scrotal area, time from castration until complete wound healing, and behavioural signs. Calves of group 3RR showed severe swelling and inflammation, and licking of the scrotal area occurred significantly more often than in groups Rcut and CO. Technique 3RR was discontinued for welfare reasons before the end of the study. All castration groups showed significantly more pain upon palpation than calves of group CO, but palpation elicited markedly less pain in group Rcut than in the other castration groups. The most rapid wound healing and the shortest duration of chronic pain after castration were observed in group Rcut. For welfare reasons, the Rcut technique should be considered a valuable alternative to traditional rubber-ring castration of calves at 4-6 weeks of age.

Relevance: 30.00%

Abstract:

Objective: To characterize the clinical findings in dogs and cats that sustained blunt trauma and to compare clinical respiratory examination results with post-traumatic thoracic radiography findings. Design: Retrospective clinical study. Setting: University small animal teaching hospital. Animals, interventions and measurements: Case records of 63 dogs and 96 cats presenting with a history of blunt trauma and thoracic radiographs between September 2001 and May 2003 were examined. Clinical signs of respiratory distress (respiratory rate [RR] and pulmonary auscultation findings) and outcome were compared with radiographic signs of blunt trauma. Results: Forty-nine percent of dogs and 63.5% of cats had radiographic signs attributed to thoracic trauma. Twenty-two percent of dogs and 28% of cats had normal radiographs. Abnormal auscultation results were significantly associated with radiographic signs of thoracic trauma, with the radiography score, and with the presence and degree of contusions. Seventy-two percent of animals with no other injuries showed signs of thoracic trauma on chest radiographs. No correlation was found between the radiographic findings and outcome, whereas the trauma score at presentation was significantly associated with outcome and with signs of chest trauma but not with the radiography score. Conclusion: Thoracic trauma is encountered in many blunt trauma patients. The RR of animals with blunt trauma is not useful in predicting thoracic injury, whereas abnormal chest auscultation results are indicative of chest abnormalities. Thorough chest auscultation is therefore mandatory in all trauma animals and might help in assessing the need for thoracic radiographs.

Relevance: 30.00%

Abstract:

INTRODUCTION HIV care and treatment programmes worldwide are transforming as they push to deliver universal access to essential prevention, care and treatment services to persons living with HIV and their communities. The characteristics and capacity of these HIV programmes affect patient outcomes and quality of care. Despite the importance of ensuring optimal outcomes, few studies have addressed the capacity of HIV programmes to deliver comprehensive care. We sought to describe such capacity in HIV programmes in seven regions worldwide. METHODS Staff from 128 sites in 41 countries participating in the International epidemiologic Databases to Evaluate AIDS completed a site survey from 2009 to 2010, including sites in the Asia-Pacific region (n=20), Latin America and the Caribbean (n=7), North America (n=7), Central Africa (n=12), East Africa (n=51), Southern Africa (n=16) and West Africa (n=15). We computed a measure of the comprehensiveness of care based on seven World Health Organization-recommended essential HIV services. RESULTS Most sites reported serving urban populations (61%; region range (rr): 33-100%) and both adult and paediatric populations (77%; rr: 29-96%). Only 45% of HIV clinics that reported treating children had paediatricians on staff. As for the seven essential services, survey respondents reported that CD4+ cell count testing was available at all but one site, while tuberculosis (TB) screening and community outreach services were available at 80% and 72% of sites, respectively. The remaining four essential services - nutritional support (82%), combination antiretroviral therapy adherence support (88%), prevention of mother-to-child transmission (PMTCT) (94%) and other prevention and clinical management services (97%) - were widely available. Approximately half (46%) of sites reported offering all seven services. Programme characteristics and comprehensiveness varied with the number of years a site had been in operation and with the UN Human Development Index (HDI) of the site setting: more recently established clinics in low-HDI settings, especially those in the President's Emergency Plan for AIDS Relief focus countries, tended to offer a more comprehensive array of essential services. Survey respondents frequently identified contact tracing of patients, patient outreach, nutritional counselling, onsite viral load testing, universal TB screening and the provision of isoniazid preventive therapy as unavailable services. CONCLUSIONS This study serves as a baseline for ongoing monitoring of the evolution of care delivery over time and lays the groundwork for evaluating HIV treatment outcomes in relation to site capacity for comprehensive care.
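As an illustration of how such a seven-service comprehensiveness measure might be operationalized (the scoring rule below is an assumption; the abstract does not spell out a formula):

```python
# Hypothetical comprehensiveness score: the count of WHO-recommended
# essential services available at a site. Service identifiers are
# paraphrased from the abstract; the scoring rule is an assumption.
ESSENTIAL_SERVICES = (
    "cd4_testing", "tb_screening", "community_outreach",
    "nutritional_support", "art_adherence_support", "pmtct",
    "other_prevention_and_clinical_management",
)

def comprehensiveness(site_services):
    """Number of the seven essential services offered by a site."""
    return sum(s in site_services for s in ESSENTIAL_SERVICES)

print(comprehensiveness({"cd4_testing", "tb_screening", "pmtct"}))  # 3
```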

Relevance: 30.00%

Abstract:

OBJECTIVE To assess the impact of dental caries and traumatic dental injuries (TDI) on the oral health-related quality of life (OHRQoL) of 5- to 6-year-olds according to both self- and parental reports. METHODS A total of 335 pairs of parents and children who sought dental screening at the Dental School, University of São Paulo, completed the Scale of Oral Health Outcomes for 5-year-old children (SOHO-5), which consists of a child self-report and a parental proxy-report version. Three calibrated examiners assessed caries experience as the number of primary teeth that were decayed, indicated for extraction due to caries, or filled (def-t). TDI were classified into uncomplicated and complicated injuries. Poisson regression models were used to associate the different clinical and sociodemographic factors with the outcome. RESULTS Overall, 74.6% of children reported an oral impact, and the corresponding estimate for parental reports was 70.5%. The mean (standard deviation) SOHO-5 scores in the child self-report and parental versions were 3.32 (3.22) and 5.18 (6.28), respectively. In both versions, caries was associated with worse children's OHRQoL, for the total score and all SOHO-5 items (P < 0.001). In contrast, TDI did not have a negative impact on children's OHRQoL, with the exception of two items of the parental version and one item of the child self-report version. In the final multivariate adjusted models, there was a gradient in the association between caries experience and the child's OHRQoL, with worse SOHO-5 scores at each successively more severe level of caries experience, for both child and parental perceptions [RR (95% CI) = 6.37 (4.71, 8.62) and 10.81 (7.65, 15.27), respectively]. Higher family income had a positive impact on the children's OHRQoL in the child and parental versions [RR (95% CI) = 0.68 (0.49, 0.94) and 0.70 (0.54, 0.90), respectively]. CONCLUSIONS Dental caries, but not TDI, is associated with worse OHRQoL of 5- to 6-year-old children in the perceptions of both children and their parents. Families with higher income report better OHRQoL at this age, independent of the presence of oral diseases.
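The rate ratios above come from Poisson regression; a minimal sketch with statsmodels, where the variable names and toy data are invented for illustration (the study's exact specification, e.g. robust variance estimation, may differ):

```python
# Hypothetical Poisson regression for a SOHO-5-style count outcome;
# column names and data are invented placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "soho5_score": [0, 2, 5, 1, 8, 3, 6, 0],
    "caries_deft": [0, 1, 4, 0, 6, 2, 5, 0],
    "income_high": [1, 1, 0, 1, 0, 0, 0, 1],
})

model = smf.glm("soho5_score ~ caries_deft + income_high",
                data=df, family=sm.families.Poisson()).fit()
print(np.exp(model.params))  # exponentiated coefficients ~ rate ratios
```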

Relevance: 30.00%

Abstract:

BACKGROUND Transmitted HIV-1 drug-resistance mutations (TDR) are transmitted from treatment-failing or treatment-naïve patients. Although the prevalence of drug resistance in treatment-failing patients has declined in developed countries, TDR prevalence has not. The mechanisms behind this paradox are poorly explored. METHODS We included recently infected, treatment-naïve patients with genotypic resistance tests performed ≤1 year post-infection and before 2013. Potential risk factors for TDR were analyzed using logistic regression. The association of TDR prevalence with the population viral load (PVL) of treatment-failing patients during 1997-2011 was estimated with Poisson regression, both for all TDR and individually for the most frequent resistance mutations against each drug class (M184V, L90M, K103N). RESULTS We included 2,421 recently infected, treatment-naïve patients and 5,399 treatment-failing patients. TDR prevalence fluctuated considerably over time. Two opposing developments could explain these fluctuations: a generally continuous increase in TDR (odds ratio [OR] = 1.13, p = 0.010), punctuated by sharp decreases when new drug classes were introduced. Overall, TDR prevalence increased with decreasing PVL (rate ratio [RR] = 0.91 per 1,000 log10 PVL, p = 0.033). Additionally, we observed that the transmitted high-fitness-cost mutation M184V was positively associated with the PVL of treatment-failing patients carrying M184V (RR = 1.50 per 100 log10 PVL, p < 0.001). This association was absent for K103N (RR = 1.00 per 100 log10 PVL, p = 0.99) and negative for L90M (RR = 0.75 per 100 log10 PVL, p = 0.022). CONCLUSIONS Transmission of antiretroviral drug resistance is temporarily reduced by the introduction of new drug classes and is driven by both treatment-failing and treatment-naïve patients. These findings suggest a continuous need for new drugs and for early detection and treatment of HIV-1 infection.

Relevance: 30.00%

Abstract:

In this study, we use IP25 and alkenone biomarker proxies to document the subdecadal variations of sea ice and sea surface temperature in the subpolar North Atlantic induced by the decadally paced explosive tropical volcanic eruptions of the second half of the thirteenth century. The short- and long-term evolution of both variables was investigated by cross-analysis with a simulation of the IPSL-CM5A-LR model. Our results show short-term ocean cooling and sea ice expansion in response to each volcanic eruption. They also highlight that the long response time of the ocean leads to cumulative surface cooling and subsurface heat buildup due to sea ice capping. As the volcanic forcing relaxes, the surface ocean rapidly warms, likely amplified by the subsurface heat, and remains almost ice-free for several decades.