763 results for Janet-Cartan


Relevance: 10.00%

Abstract:

BACKGROUND There are limited published data on the outcomes of infants starting antiretroviral therapy (ART) in routine care in Southern Africa. This study aimed to examine the baseline characteristics and outcomes of infants initiating ART. METHODS We analyzed prospectively collected cohort data from routine ART initiation in infants from 11 cohorts contributing to the International Epidemiologic Database to Evaluate AIDS in Southern Africa. We included ART-naive HIV-infected infants aged <12 months who initiated ≥3 antiretroviral drugs between 2004 and 2012. Kaplan-Meier estimates were calculated for mortality, loss to follow-up (LTFU), transfer out, and virological suppression. We used Cox proportional hazards models stratified by cohort to determine the baseline characteristics associated with mortality and virological suppression. RESULTS The median (interquartile range) age at ART initiation of the 4945 infants was 5.9 months (3.7-8.7), with a median follow-up of 11.2 months (2.8-20.0). At ART initiation, 77% had WHO clinical stage 3 or 4 disease and 87% were severely immunosuppressed. The three-year mortality probability was 16% and the LTFU probability 29%. Severe immunosuppression, WHO stage 3 or 4 disease, anemia, being severely underweight, and initiation of treatment before 2010 were associated with higher mortality. At 12 months after ART initiation, 17% of infants were severely immunosuppressed and the probability of attaining virological suppression was 56%. CONCLUSIONS Most infants initiating ART in Southern Africa had severe disease, with a high probability of LTFU and mortality on ART. Although the majority of infants remaining in care showed immune recovery and virological suppression, these responses were suboptimal.
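
The Kaplan-Meier estimates named in the methods can be illustrated with a short sketch. Below is a minimal example (not the study's code) using the Python lifelines library: the toy data frame and column names are hypothetical, and the 3-year mortality probability corresponds to 1 - S(36 months).

```python
# Minimal Kaplan-Meier sketch with invented data; not the study's code.
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical per-infant records: follow-up in months and a death flag
# (0 = censored at LTFU, transfer out, or database closure).
df = pd.DataFrame({
    "months_on_art": [2.8, 11.2, 20.0, 5.0, 36.0, 14.5, 7.3, 30.0],
    "died":          [1,   0,    0,    1,   0,    0,    1,   0],
})

kmf = KaplanMeierFitter()
kmf.fit(df["months_on_art"], event_observed=df["died"])

# 1 - S(36) is the cumulative mortality probability at 3 years.
print("3-year mortality probability:", 1 - kmf.predict(36))
```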

Relevance: 10.00%

Abstract:

BACKGROUND Ongoing CD4 monitoring in patients on antiretroviral therapy (ART) with viral suppression has been questioned. We evaluated the probability of CD4 decline in children with viral suppression and CD4 recovery after 1 year on ART. METHODS We included children from 8 South African cohorts with routine HIV-RNA monitoring if (1) they were "responders" [HIV-RNA <400 copies/mL and no severe immunosuppression after ≥1 year on ART (time 0)] and (2) they had ≥1 HIV-RNA and CD4 measurement within 15 months of time 0. We determined the probability of CD4 decline to World Health Organization-defined severe immunosuppression for 3 years after time 0 if viral suppression was maintained. Follow-up was censored at the earliest of the following dates: the day before the first HIV-RNA measurement >400 copies/mL; the day before a >15-month gap in testing; and the date of death, loss to follow-up, transfer out, or database closure. RESULTS Among 5984 children [median age at time 0: 5.8 years (interquartile range: 3.1-9.0)], 270 experienced a single CD4 decline to severe immunosuppression within 3 years of time 0, a probability of 6.6% (95% CI: 5.8-7.4). A subsequent CD4 measurement within 15 months of the first low measurement was available for 63% of children with CD4 decline, and 86% of these showed CD4 recovery. The probability of CD4 decline was lowest (2.8%) in children aged 2 years or older with no or mild immunosuppression who had been on ART for <18 months at time 0; this group comprised 40% of the children. CONCLUSIONS These findings suggest that it may be safe to stop routine CD4 monitoring in children older than 2 years and rely on virologic monitoring alone.
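
The censoring rule in the methods is worth spelling out, since follow-up ends at the earliest of several candidate dates. A sketch in pandas, with hypothetical column names and invented dates, is shown below.

```python
# Sketch of the censoring rule described above; columns are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "first_rna_gt400":     pd.to_datetime(["2011-03-02", None]),
    "gap_gt15m_start":     pd.to_datetime([None, "2012-06-10"]),
    "death_ltfu_transfer": pd.to_datetime([None, None]),
    "database_closure":    pd.to_datetime(["2013-12-31", "2013-12-31"]),
})

# Day before the first HIV-RNA >400 copies/mL and day before a >15-month
# testing gap, per the definition in the abstract.
candidates = pd.concat(
    [
        df["first_rna_gt400"] - pd.Timedelta(days=1),
        df["gap_gt15m_start"] - pd.Timedelta(days=1),
        df["death_ltfu_transfer"],
        df["database_closure"],
    ],
    axis=1,
)
df["censor_date"] = candidates.min(axis=1)  # earliest non-missing date
print(df["censor_date"])
```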

Relevance: 10.00%

Abstract:

The paracaspase MALT1 plays an important role in immune receptor-driven signaling pathways leading to NF-κB activation. MALT1 promotes signaling by acting as a scaffold that recruits downstream signaling proteins, as well as by proteolytic cleavage of multiple substrates. However, the relative contributions of these two activities to T and B cell function are not well understood. To investigate how MALT1 proteolytic activity contributes to overall immune cell regulation, we generated MALT1 protease-deficient mice (Malt1(PD/PD)) and compared their phenotype with that of MALT1 knockout animals (Malt1(-/-)). Malt1(PD/PD) mice displayed defects in multiple cell types, including marginal zone B cells, B1 B cells, IL-10-producing B cells, regulatory T cells, and mature T and B cells. In general, immune defects were more pronounced in Malt1(-/-) animals. Both mouse lines showed abrogated B cell responses upon immunization with T-dependent and T-independent Ags. In vitro, inactivation of MALT1 protease activity caused reduced stimulation-induced T cell proliferation, impaired IL-2 and TNF-α production, and defective Th17 differentiation. Consequently, Malt1(PD/PD) mice were protected in a Th17-dependent experimental autoimmune encephalomyelitis model. Surprisingly, Malt1(PD/PD) animals developed a multiorgan inflammatory pathology, characterized by Th1 and Th2/0 responses and enhanced IgG1 and IgE levels, which was delayed by wild-type regulatory T cell reconstitution. We therefore propose that the pathology characterizing Malt1(PD/PD) animals arises from an immune imbalance featuring pathogenic Th1- and Th2/0-skewed effector responses and reduced immunosuppressive compartments. These data uncover a previously unappreciated key function of MALT1 protease activity in immune homeostasis and underline its relevance in human health and disease.

Relevance: 10.00%

Abstract:

BACKGROUND Little is known about the risk of cancer in HIV-positive children in sub-Saharan Africa. We examined the incidence and risk factors of AIDS-defining and other cancers in pediatric antiretroviral therapy (ART) programs in South Africa. METHODS We linked the records of five ART programs in Johannesburg and Cape Town to those of pediatric oncology units, based on name and surname, date of birth, and folder and civil identification numbers. We calculated incidence rates and obtained hazard ratios (HR) with 95% confidence intervals (CI) from Cox regression models including ART, sex, age, and degree of immunodeficiency. Missing CD4 counts and CD4% were multiply imputed. Immunodeficiency was defined according to World Health Organization 2005 criteria. RESULTS Data from 11,707 HIV-positive children were included in the analysis. During 29,348 person-years of follow-up, 24 cancers were diagnosed, for an incidence rate of 82 per 100,000 person-years (95% CI 55-122). The most frequent cancers were Kaposi sarcoma (34 per 100,000 person-years) and non-Hodgkin lymphoma (31 per 100,000 person-years). The incidence of non-AIDS-defining malignancies was 17 per 100,000 person-years. The risk of developing cancer was lower on ART (HR 0.29, 95% CI 0.09-0.86) and increased with age at enrolment (>10 versus <3 years: HR 7.3, 95% CI 2.2-24.6) and immunodeficiency at enrolment (advanced/severe versus no/mild: HR 3.5, 95% CI 1.1-12.0). The HR for the effect of ART from the complete case analysis was similar but ceased to be statistically significant (p = 0.078). CONCLUSIONS Early HIV diagnosis and linkage to care, with start of ART before advanced immunodeficiency develops, may substantially reduce the burden of cancer in HIV-positive children in South Africa and elsewhere.
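
The headline incidence rate can be reproduced from the counts in the abstract. The sketch below computes 24 cancers over 29,348 person-years with an exact Poisson CI via scipy; the study may have used a slightly different interval method, so the bounds need not match the reported 55-122 exactly.

```python
# Incidence rate with an exact (chi-square-based) Poisson CI.
from scipy.stats import chi2

events, pyears = 24, 29_348
rate = events / pyears * 100_000  # ≈ 82 per 100,000 person-years

lo = chi2.ppf(0.025, 2 * events) / 2 / pyears * 100_000
hi = chi2.ppf(0.975, 2 * (events + 1)) / 2 / pyears * 100_000
print(f"{rate:.0f} per 100,000 person-years (95% CI {lo:.0f}-{hi:.0f})")
```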

Relevance: 10.00%

Abstract:

Predicting the timing and amount of tree mortality after a forest fire is of paramount importance for post-fire management decisions, such as salvage logging or reforestation. Such knowledge is particularly needed in mountainous regions, where forest stands often serve as protection against natural hazards (e.g., snow avalanches, rockfalls, landslides). In this paper, we focus on the drivers and timing of mortality in fire-injured beech trees (Fagus sylvatica L.) in mountain regions. We studied beech forests in the southwestern European Alps that burned between 1970 and 2012. The results show that beech trees, which lack fire-resistance traits, experience increased mortality within the first two decades post-fire, with a timing and amount strongly related to burn severity. Beech mortality is fast and ubiquitous in high-severity sites, whereas small-diameter (DBH <12 cm) and intermediate-diameter (DBH 12-36 cm) trees face a higher risk of dying in moderate-severity sites. Large-diameter trees mostly survive, representing a crucial ecological legacy for beech regeneration. In low-severity sites, mortality remains low and at a level similar to unburnt beech forests. Tree diameter, the presence of fungal infestation, and elevation are the most significant drivers of mortality. The risk of dying increases toward higher elevation and is higher for small-diameter than for large-diameter trees. In the case of secondary fungal infestation, beech generally faces a higher risk of dying. Interestingly, the fungi that initiate post-fire tree mortality differ from those occurring after mechanical injury. From a management point of view, the insights into the controls of post-fire mortality provided by this study should help in planning post-fire silvicultural measures in montane beech forests.
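
The abstract does not specify the mortality model, but the reported pattern of drivers (diameter, elevation, fungal infestation) can be illustrated with a hedged logistic-regression sketch on simulated data; the coefficients and scales below are invented.

```python
# Illustrative only: simulated beech mortality regressed on invented drivers.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
dbh = rng.uniform(5, 60, n)        # stem diameter, cm
elev = rng.uniform(0.8, 1.8, n)    # elevation, km a.s.l.
fungi = rng.integers(0, 2, n)      # secondary fungal infestation (0/1)

# Simulate the reported pattern: small trees, higher elevation, and fungal
# infestation raise the risk of dying.
logit = -1.0 - 0.05 * dbh + 2.0 * (elev - 0.8) + 1.2 * fungi
died = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([dbh, elev, fungi])
model = LogisticRegression().fit(X, died)
print(dict(zip(["dbh", "elev", "fungi"], model.coef_[0])))
```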

Relevance: 10.00%

Abstract:

METHODS Spirometry datasets from South-Asian children were collated from four centres in India and five within the UK. Records with transcription errors, missing values for height or spirometry, or implausible values were excluded (n = 110). RESULTS Following exclusions, cross-sectional data were available from 8,124 children (56.3% male; 5-17 years). When compared with GLI-predicted values for White Europeans, forced expiratory volume in 1 s (FEV1) and forced vital capacity (FVC) in South-Asian children were on average 15% lower, ranging from 4-19% between centres. By contrast, proportional reductions in FEV1 and FVC within all but two datasets meant that the FEV1/FVC ratio remained independent of ethnicity. The 'GLI-Other' equation fitted data from North India reasonably well, while the 'GLI-Black' equations provided a better approximation for South-Asian data than the 'GLI-White' equations. However, marked discrepancies in mean lung function z-scores between centres, especially when examined according to socio-economic conditions, precluded derivation of a single South-Asian GLI adjustment. CONCLUSION Until improved and more robust prediction equations can be derived, we recommend the use of the 'GLI-Black' equations for interpreting most South-Asian data, although 'GLI-Other' may be more appropriate for North Indian data. Prospective data collection using standardised protocols is urgently required to explore potential sources of variation due to socio-economic circumstances, secular changes in growth/predictors of lung function, and ethnicities within the South-Asian classification.
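
The GLI comparisons above rest on LMS z-scores: given the predicted median M, skewness L, and coefficient of variation S from the GLI equations, z = ((measured/M)^L - 1)/(L*S). The sketch below uses invented numbers, not actual GLI-2012 coefficients.

```python
# LMS z-score as used by GLI reference equations; coefficients invented.
def lms_z(measured: float, L: float, M: float, S: float) -> float:
    """Convert a measured lung-function value to an LMS z-score."""
    return ((measured / M) ** L - 1.0) / (L * S)

# Hypothetical example: FEV1 of 1.70 L against a predicted median of 2.00 L.
print(round(lms_z(1.70, L=1.2, M=2.00, S=0.15), 2))  # ≈ -0.98
```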

Relevance: 10.00%

Abstract:

The surveillance of HIV-related cancers in South Africa is hampered by the lack of systematic collection of cancer diagnoses in HIV cohorts and the absence of HIV status in cancer registries. To improve cancer ascertainment and estimate cancer incidence, we linked the records of adults (aged ≥16 years) on antiretroviral treatment (ART) enrolled at the Sinikithemba HIV clinic, McCord Hospital, in KwaZulu-Natal (KZN) with the cancer records of public laboratories in KZN province, using probabilistic record linkage methods. We calculated incidence rates, with 95% confidence intervals (CI), for all cancers, Kaposi sarcoma (KS), cervical cancer, non-Hodgkin lymphoma, and non-AIDS-defining cancers (NADCs), before and after inclusion of linkage-identified cancers. A total of 8,721 records of HIV-positive patients were linked with 35,536 cancer records. Between 2004 and 2010 we identified 448 cancers: 82% (n = 367) were recorded in the cancer registry only, 10% (n = 43) in the HIV cohort only, and 8% (n = 38) in both the HIV cohort and the cancer registry. The overall cancer incidence rate in patients starting ART increased from 134 (95% CI 91-212) to 877 (95% CI 744-1,041) per 100,000 person-years after inclusion of linkage-identified cancers. Incidence rates (per 100,000 person-years) were highest for KS (432, 95% CI 341-555); rates were 259 (95% CI 179-390) for cervical cancer and 294 (95% CI 223-395) for NADCs. Ascertainment of cancer in HIV cohorts is incomplete; probabilistic record linkage is both feasible and essential for cancer ascertainment.
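
Probabilistic record linkage scores candidate pairs by how likely their field agreements are among true matches versus non-matches. A toy Fellegi-Sunter-style score (not the study's implementation) is sketched below; the field names and m/u probabilities are invented.

```python
# Toy Fellegi-Sunter match score: log2(m/u) for agreement, otherwise
# log2((1-m)/(1-u)), where m and u are assumed agreement probabilities
# among true matches and non-matches.
from math import log2

FIELD_WEIGHTS = {            # (m, u) per linkage field
    "surname":    (0.95, 0.01),
    "first_name": (0.90, 0.05),
    "birth_date": (0.98, 0.001),
}

def match_score(rec_a: dict, rec_b: dict) -> float:
    score = 0.0
    for field, (m, u) in FIELD_WEIGHTS.items():
        if rec_a[field] == rec_b[field]:
            score += log2(m / u)
        else:
            score += log2((1 - m) / (1 - u))
    return score  # declare a link if the score exceeds a chosen threshold

a = {"surname": "Dlamini", "first_name": "Thandi", "birth_date": "1978-04-02"}
b = {"surname": "Dlamini", "first_name": "Tandi",  "birth_date": "1978-04-02"}
print(match_score(a, b))  # high despite the first-name mismatch
```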

Relevance: 10.00%

Abstract:

OBJECTIVE To illustrate an approach to comparing CD4 cell count and HIV-RNA monitoring strategies in HIV-positive individuals on antiretroviral therapy (ART). DESIGN Prospective studies of HIV-positive individuals in Europe and the USA in the HIV-CAUSAL Collaboration and the Center for AIDS Research Network of Integrated Clinical Systems. METHODS Antiretroviral-naive individuals who initiated ART and became virologically suppressed within 12 months were followed from the date of suppression. We compared three CD4 cell count and HIV-RNA monitoring strategies: once every (1) 3 ± 1 months, (2) 6 ± 1 months, and (3) 9-12 ± 1 months. We used inverse-probability weighted models to compare these strategies with respect to clinical, immunologic, and virologic outcomes. RESULTS Among 39,029 eligible individuals, there were 265 deaths and 690 AIDS-defining illnesses or deaths. Compared with the 3-month strategy, the mortality hazard ratios (95% CIs) were 0.86 (0.42 to 1.78) for the 6-month strategy and 0.82 (0.46 to 1.47) for the 9-12-month strategy. The respective 18-month risk ratios (95% CIs) of virologic failure (HIV-RNA >200 copies/mL) were 0.74 (0.46 to 1.19) and 2.35 (1.56 to 3.54), and the 18-month mean CD4 differences (95% CIs) were -5.3 (-18.6 to 7.9) and -31.7 (-52.0 to -11.3) cells/μL. The estimates for the 2-year risk of AIDS-defining illness or death were similar across strategies. CONCLUSIONS Our findings suggest that the monitoring frequency of virologically suppressed individuals can be decreased from every 3 months to every 6, 9, or 12 months with respect to clinical outcomes. Because the effects of different monitoring strategies could take years to materialize, longer follow-up is needed to fully evaluate this question.
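
Inverse-probability weighting, the general technique named in the methods, can be conveyed in a few lines, although a full per-visit implementation for dynamic monitoring strategies is far more involved. In the sketch below (invented data and model), individuals who follow a strategy are up-weighted by 1/Pr(following it | covariates).

```python
# Minimal IPW sketch; not the study's analysis.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1000
cd4_low = rng.integers(0, 2, n)                   # baseline covariate
follows = rng.random(n) < 0.7 - 0.3 * cd4_low     # adheres to the strategy
outcome = rng.random(n) < 0.05 + 0.04 * cd4_low   # e.g. virologic failure

# Model the probability of following the strategy, then weight followers.
ps = LogisticRegression().fit(cd4_low.reshape(-1, 1), follows)
p_follow = ps.predict_proba(cd4_low.reshape(-1, 1))[:, 1]
w = follows / p_follow

# Weighted risk under "everyone follows the strategy".
print("IPW risk estimate:", (w * outcome).sum() / w.sum())
```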

Relevance: 10.00%

Abstract:

Background. Population health within and between nations is heavily influenced by political determinants, yet these determinants have received significantly less attention than socioeconomic factors in public health. It has been hypothesized that the welfare state, as a political variable, may play a particularly prominent role in affecting both health indicators and health disparities in developed countries. The research, however, provides conflicting evidence regarding the health impact of particular regimes over others and the mechanisms through which the welfare state can most significantly affect health. Objective. To perform a systematic review of the literature as a means of exploring what the current research indicates regarding the benefits or detriments of particular regime styles and the pathways through which the welfare state can impact health indicators and health disparities within developed countries. Methods. A thorough search of the EBSCO, PubMed, Medline, Web of Science, and Scopus electronic databases was conducted and resulted in the identification of 15 studies that evaluated the association between welfare state regime and population health outcomes, and/or the pathways through which the welfare state influences health. Results. Social democratic countries tended to perform best when infant mortality rate (IMR) was the primary outcome of interest, whereas liberal countries performed strongly in relation to self-perceived health. The results were mixed regarding welfare state effectiveness in mitigating health inequities, with Christian democratic countries performing as well as social democratic countries. In relation to welfare state pathways, public health spending and medical coverage were associated with positive health indicators. The redistributive impact of the welfare state was also consistently associated with better health outcomes, while social security expenditures were not. Discussion/Conclusions. Studies consistently discovered a significant relationship between the welfare state and population health and/or health disparities, lending support to the hypothesis that the welfare state is, indeed, an important non-medical determinant of health. However, it is still fairly unclear which welfare state regime may be most protective for health, as results varied according to the measured health indicator. The research regarding welfare state pathways is particularly undeveloped, and does not provide much insight into the relative importance of in-kind service provision versus cash transfers, or targeted versus universal approaches to the welfare state. Suggestions to direct future research are provided.

Relevance: 10.00%

Abstract:

Excessively high, accelerating lung cancer rates among women in Harris County, Texas, prompted this case-comparison study. The objectives were to compare patterns of employment, indirect exposures, and sociodemographic variables of lung cancer cases with those of comparison subjects (compeers) after standardizing for possible confounders, such as age and cigarette smoking. Lung cancer cases were microscopically confirmed, white, Harris County residents. Compeers, chosen from Medicare records and Texas Department of Public Safety records, were matched on gender, race, age, and resident and vital status. Personal interviews were conducted with study subjects or next-of-kin. Industries and occupations were categorized as high risk based on previous studies. Almost all cases (95.0%) and 60.0% of compeers smoked cigarettes; the odds ratio for lung cancer and smoking was 13.9. Stopping smoking between ages 30-50 years carries a lower risk than stopping at age 58 or more years. Women's employment in a high-risk industry or occupation resulted in consistently elevated, smoking-adjusted odds ratios, and frequency and duration of employment demonstrated a moderate dose-response effect. A temporal association exists with employment in a high-risk occupation during 1940-1949. No increased risk appeared with passive smoking. A husband's employment in a construction industry or a structural occupation significantly increased the smoking-adjusted odds ratios (O.R. = 2.9 and 2.2, respectively). Smoking-adjusted odds ratios also increased significantly when women had resided with persons employed in cement (O.R. = 3.2) or insulation (O.R. = 5.5) manufacturing, or in high-rise construction (O.R. = 2.4). A family history of lung cancer resulted in a two-fold increase in smoking-adjusted odds ratios, and the vital status of compeers affected the odds ratios. Work-related exposures appear to increase the risk of lung cancer in women, although cigarette smoking has the single highest odds ratio. Indirect exposure to certain employment also plays a significant role in lung cancer in women. Investigations of specific direct and indirect hazardous exposures in the workplace and home are needed. Cigarette smoking is as hazardous for women as for men, and should be prevented and eliminated.
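
The smoking odds ratio quoted above comes from a 2×2 table. A minimal sketch of the calculation with a Woolf-type 95% CI follows; the counts are invented, chosen only so that the OR lands near 13.9.

```python
# Odds ratio from a 2x2 table with a Woolf (log-based) 95% CI.
from math import exp, log, sqrt

# smokers/non-smokers among cases and among compeers (invented counts)
a, b, c, d = 190, 10, 120, 88

or_ = (a * d) / (b * c)
se = sqrt(1/a + 1/b + 1/c + 1/d)
lo, hi = exp(log(or_) - 1.96 * se), exp(log(or_) + 1.96 * se)
print(f"OR = {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```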

Relevance: 10.00%

Abstract:

Next to leisure, sport, and household activities, the most common activity resulting in medically consulted injuries and poisonings in the United States is work, with an estimated 4 million workplace-related episodes reported in 2008 (U.S. Department of Health and Human Services, 2009). To address the risks inherent to various occupations, risk management programs are typically put in place that include worker training, engineering controls, and personal protective equipment. Recent studies have shown that such interventions alone are insufficient to adequately manage workplace risks, and that the climate in which the workers and safety program exist (known as the "safety climate") is an equally important consideration. The organizational safety climate is so important that many studies have focused on developing means of measuring it in various work settings. While safety climate studies have been reported for several industrial settings, published studies assessing safety climate in the university work setting are largely absent. Universities are particularly unique workplaces because of the potential exposure to a diversity of agents representing both acute and chronic risks, and because readily detectable health and safety outcomes are relatively rare. The ability to measure safety climate in a work setting with rarely observed systemic outcome measures could serve as a powerful means of evaluating safety risk management programs. The goal of this research study was the development of a survey tool to measure safety climate specifically in the university work setting; the use of a standardized tool also allows for comparisons among universities throughout the United States. A specific study objective, to quantitatively assess safety climate at five universities across the United States, was accomplished: 971 participants completed an online questionnaire measuring safety climate. The average safety climate score across the five universities was 3.92 on a scale of 1 to 5, with 5 indicating very high perceptions of safety. The two lowest-scoring dimensions of university safety climate were "acknowledgement of safety performance" and "department and supervisor's safety commitment". The results underscore how the perception of safety climate is significantly influenced at the local level. A second study objective, evaluating the reliability and validity of the safety climate questionnaire, was also accomplished. A third objective fulfilled was to provide executive summaries of the questionnaire results to the participating universities' health and safety professionals and collect feedback on usefulness, relevance, and perceived accuracy; overall, the professionals found the survey and results to be very useful, relevant, and accurate. Finally, the safety climate questionnaire will be offered to other universities for benchmarking purposes at the annual meeting of a nationally recognized university health and safety organization. The ultimate goal of the project, the creation of a standardized tool that can measure safety climate in the university work setting and facilitate meaningful comparisons among institutions, was accomplished.
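
The abstract does not state which reliability statistic was used, but for blocks of 1-5 Likert items Cronbach's alpha is the usual choice; a sketch with invented responses follows.

```python
# Cronbach's alpha for one safety-climate dimension; responses invented.
import numpy as np

# rows = respondents, columns = items of the dimension (scored 1-5)
items = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])

k = items.shape[1]
item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
total_var = items.sum(axis=1).var(ddof=1)     # variance of the scale total
alpha = k / (k - 1) * (1 - item_vars / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")      # ≈ 0.92 for these toy data
```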

Relevance: 10.00%

Abstract:

Viral hepatitis is a significant public health problem worldwide and is caused by viral infections classified as Hepatitis A, B, C, D, and E. A safe and effective vaccine for Hepatitis B was first developed in 1981 and was adopted into national immunization programs targeting infants from 1990 and adolescents from 1995. In the U.S., this vaccination schedule has led to an 82% reduction in incidence, from 8.5 cases per 100,000 in 1990 to 1.5 cases per 100,000 in 2007. Although there has been a decline in infection among adolescents, there is still a large burden of Hepatitis B infection among adults and minorities, and there is very little research on vaccination gaps among adults. Using the National Health and Nutrition Examination Survey (NHANES) question "{Have you/Has SP (Study Participant)} ever received the 3-dose series of the hepatitis B vaccine?", we explored the existence of racial/ethnic gaps in a cross-sectional study design. Other variables, such as age, gender, socioeconomic variables (federal poverty line, educational attainment), and behavioral factors (sexual practices, self-report of men having sex with men, and intravenous drug use), were also examined. We found that the current vaccination programs and policies for Hepatitis B had eliminated racial and ethnic disparities in Hepatitis B vaccination, but that coverage remains low, particularly for adults who engage in high-risk behaviors. This study also found a statistically significant 10% gap in Hepatitis B vaccination between those who have and those who do not have access to health insurance.
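
The reported insurance gap is a difference in proportions; a sketch of how such a gap could be tested is below. The counts are invented, not the NHANES estimates, and a proper NHANES analysis would also apply survey weights, which the sketch ignores.

```python
# Two-proportion z-test on invented vaccination counts.
from math import sqrt
from statistics import NormalDist

vacc_ins, n_ins = 420, 1000   # insured: vaccinated / total
vacc_uni, n_uni = 160, 500    # uninsured: vaccinated / total

p1, p2 = vacc_ins / n_ins, vacc_uni / n_uni
p = (vacc_ins + vacc_uni) / (n_ins + n_uni)   # pooled proportion
z = (p1 - p2) / sqrt(p * (1 - p) * (1 / n_ins + 1 / n_uni))
pval = 2 * (1 - NormalDist().cdf(abs(z)))
print(f"gap = {p1 - p2:.0%}, z = {z:.2f}, p = {pval:.4f}")
```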

Relevance: 10.00%

Abstract:

Despite continued research and public health efforts to reduce smoking during pregnancy, prenatal cessation rates in the United States decreased, and the incidence of low birth weight increased, from 1985 to 1991. Lower socioeconomic status women, who are at increased risk for poor pregnancy outcomes, may be resistant to current intervention efforts during pregnancy. The purpose of this dissertation was to investigate the determinants of continued smoking and quitting among low-income pregnant women. Using data from cross-sectional surveys of 323 low-income pregnant smokers, the first study developed and tested measures of the pros and cons of smoking during pregnancy. The original decisional balance measure for smoking was compared with a new measure that added items thought to be more salient to the target population. Confirmatory factor analysis using structural equation modeling showed that neither the original nor the new measure fit the data adequately. Using behavioral science theory, content from interviews with the population, and statistical evidence, two 7-item scales representing the pros and cons were developed from a portion of the sample (n = 215) and successfully cross-validated on the remainder (n = 108). Logistic regression found that only the pros were significantly associated with continued smoking. In a discriminant function analysis, stage of change was significantly associated with the pros and cons of smoking. The second study examined the structural relationships between psychosocial constructs representing some of the levels of change and the pros and cons of smoking. Because the design was cross-sectional, statements regarding prediction do not establish causation or directionality. Structural equation modeling found the following: more stressors and family criticism were significantly more predictive of negative affect than social support; a bi-directional relationship was found between negative affect and current nicotine addiction; and negative affect, addiction, stressors, and family criticism were significant predictors of the pros of smoking. The findings imply that reversing the trend of decreasing smoking cessation during pregnancy may require supplementing current interventions for this population of pregnant smokers with programs addressing nicotine addiction, negative affect, and other psychosocial factors such as family functioning and stressors.
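
The first study's key model (continued smoking regressed on the pros and cons scores) is easy to sketch. The data below are simulated to mimic the reported pattern, in which only the pros mattered; nothing here is the study's data or code.

```python
# Logistic regression of continued smoking on pros/cons; simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 323
pros = rng.normal(0, 1, n)   # standardized pros-of-smoking score
cons = rng.normal(0, 1, n)   # standardized cons-of-smoking score

# Simulate the reported pattern: pros predict continued smoking, cons do not.
p = 1 / (1 + np.exp(-(0.3 + 0.8 * pros + 0.0 * cons)))
smoking = (rng.random(n) < p).astype(float)

X = sm.add_constant(np.column_stack([pros, cons]))
fit = sm.Logit(smoking, X).fit(disp=0)
print(fit.params)  # [intercept, pros, cons]; the pros coefficient dominates
```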

Relevance: 10.00%

Abstract:

Widespread Lower Cretaceous magmatism occurred along the Indian-Australian/Antarctic margins, and in the juvenile Indian Ocean, during the rifting of eastern Gondwana. The formation of this magmatic province probably began around 120-130 Ma with the eruption of basalts on the Naturaliste Plateau and at Bunbury, western Australia. On the northeast margin of India, activity began around 117 Ma with the Rajmahal continental basalts and associated lamprophyre intrusions. The formation of the Kerguelen Plateau in the Indian Ocean began no later than 114 Ma. Ultramafic lamprophyres (alnoites) were emplaced in the Prince Charles Mountains near the Antarctic continental margin at ~110 Ma. These events are considered to be related to a major mantle plume, the remnant of which is situated beneath the region of Kerguelen and Heard Islands at the present day. Geochemical data are presented for each of these volcanic suites and are indicative of complex interactions between asthenosphere-derived magmas and the continental lithosphere. Kerguelen Plateau basalts have Sr and Nd isotopic compositions lying outside the field for Indian Ocean mid-ocean ridge basalts (MORB) but, with the exception of Site 738 at the southern end of the plateau, within the range of more recent hotspot basalts from Kerguelen and Heard Islands. However, a number of the plateau tholeiites are characterized by lower 206Pb/204Pb ratios than are basalts from Kerguelen Island, and many also have anomalously high La/Nb ratios. These features suggest that the source of the Kerguelen Plateau basalts suffered contamination by components derived from the Gondwana continental lithosphere. An extreme expression of this lithospheric signature is shown by a tholeiite from Site 738, suggesting that the southernmost part of the Kerguelen Plateau may be underlain by continental crust. The Rajmahal tholeiites mostly fall into two distinct geochemical groups. Some Group I tholeiites have Sr and Nd isotopic compositions and incompatible element abundances similar to Kerguelen Plateau tholeiites from Sites 749 and 750, indicating that the Kerguelen-Heard mantle plume may have directly furnished Rajmahal volcanism. However, their elevated 207Pb/204Pb ratios indicate that these magmas did not totally escape contamination by continental lithosphere. In contrast to the Group I tholeiites, significant contamination is suggested for Group II Rajmahal tholeiites, on the basis of incompatible element abundances and isotopic compositions. The Naturaliste Plateau and the Bunbury Basalt samples show varying degrees of enrichment in incompatible elements over normal MORB. The Naturaliste Plateau samples (and Bunbury Basalt) have high La/Nb ratios, a feature not inconsistent with the notion that the plateau may consist of stretched continental lithosphere, near the ocean-continent divide.