30 results for median
Abstract:
Conventional hedonic techniques for estimating the value of local amenities rely on the assumption that households move freely among locations. We show that when moving is costly, the variation in housing prices and wages across locations may no longer reflect the value of differences in local amenities. We develop an alternative discrete-choice approach that models the household location decision directly, and we apply it to the case of air quality in US metro areas in 1990 and 2000. Because air pollution is likely to be correlated with unobservable local characteristics such as economic activity, we instrument for air quality using the contribution of distant sources to local pollution, excluding emissions from local sources, which are most likely to be correlated with local conditions. Our model yields an estimated elasticity of willingness to pay with respect to air quality of 0.34-0.42. These estimates imply that the median household would pay $149-$185 (in constant 1982-1984 dollars) for a one-unit reduction in average ambient concentrations of particulate matter. These estimates are three times greater than the marginal willingness to pay estimated by a conventional hedonic model using the same data. Our results are robust to a range of covariates, instrumenting strategies, and functional form assumptions. The findings also confirm the importance of instrumenting for local air pollution. © 2009 Elsevier Inc. All rights reserved.
Abstract:
Following the completion of a 20-week, open-label study of the safety and efficacy of liquid rivastigmine for adolescents with Down syndrome, 5 of the 10 adolescents in the clinical trial continued long-term rivastigmine therapy and 5 did not. After an average period of 38 months, all 10 subjects returned for a follow-up assessment to determine the safety and efficacy of long-term rivastigmine use. Rivastigmine was well tolerated and overall health appeared to be unaffected by long-term rivastigmine use. Performance change on cognitive and language measures administered at the termination of the open-label clinical trial was compared between the two groups. No between-group difference in median performance change across the long-term period was found, suggesting that the long-term use of rivastigmine does not improve cognitive and language performance. However, two subjects demonstrated remarkable improvement in adaptive function over the long-term period. Both subjects had received long-term rivastigmine therapy. The discussion addresses the challenge of assessing cognitive change in clinical trials using adolescents with Down syndrome as subjects and the use of group versus individual data to evaluate the relevance of medication effects.
Abstract:
INTRODUCTION: Anti-cholinergic medications have been associated with increased risks of cognitive impairment, premature mortality and hospitalisation. Anti-cholinergic load associated with medication increases as death approaches in those with advanced cancer, yet little is known about associated adverse outcomes in this setting. METHODS: A substudy of 112 participants in a randomised controlled trial, all of whom had cancer and an Australia-modified Karnofsky Performance Scale (AKPS) score of 60 or above, explored survival and health service utilisation, with anti-cholinergic load calculated longitudinally to death using the Clinician Rated Anti-cholinergic Scale (modified version). An AKPS score of 60 or above served as the standardised starting point for prospectively calculating survival. RESULTS: Baseline entry to the substudy was a mean of 62 +/- 81 days (median 37, range 1-588) before death (survival), with a mean of 4.8 study assessments (median 3, SD 4.18, range 1-24) in this time period. Participants spent 22% of this time as inpatients. There was no significant association between anti-cholinergic score and time spent as an inpatient (adjusted for survival time; p = 0.94), or between anti-cholinergic score and survival time. DISCUSSION: No association between anti-cholinergic load and survival or time spent as an inpatient was seen. Future studies need to include cognitively impaired populations, in whom the risks of symptomatic deterioration may be more substantial.
Abstract:
BACKGROUND: Invasive fungal infections (IFIs) are a major cause of morbidity and mortality among organ transplant recipients. Multicenter prospective surveillance data to determine disease burden and secular trends are lacking. METHODS: The Transplant-Associated Infection Surveillance Network (TRANSNET) is a consortium of 23 US transplant centers, including 15 that contributed to the organ transplant recipient dataset. We prospectively identified IFIs among organ transplant recipients from March, 2001 through March, 2006 at these sites. To explore trends, we calculated the 12-month cumulative incidence among 9 sequential cohorts. RESULTS: During the surveillance period, 1208 IFIs were identified among 1063 organ transplant recipients. The most common IFIs were invasive candidiasis (53%), invasive aspergillosis (19%), cryptococcosis (8%), non-Aspergillus molds (8%), endemic fungi (5%), and zygomycosis (2%). Median time to onset of candidiasis, aspergillosis, and cryptococcosis was 103, 184, and 575 days, respectively. Among a cohort of 16,808 patients who underwent transplantation between March 2001 and September 2005 and were followed through March 2006, a total of 729 IFIs were reported among 633 persons. One-year cumulative incidences of the first IFI were 11.6%, 8.6%, 4.7%, 4.0%, 3.4%, and 1.3% for small bowel, lung, liver, heart, pancreas, and kidney transplant recipients, respectively. One-year incidence was highest for invasive candidiasis (1.95%) and aspergillosis (0.65%). Trend analysis showed a slight increase in cumulative incidence from 2002 to 2005. CONCLUSIONS: We detected a slight increase in IFIs during the surveillance period. These data provide important insights into the timing and incidence of IFIs among organ transplant recipients, which can help to focus effective prevention and treatment strategies.
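The one-year cumulative incidences reported above are simple proportions: the share of a transplant cohort whose first IFI occurs within the 12-month window. A minimal sketch, using hypothetical cohort counts rather than the study's data:

```python
def cumulative_incidence(first_ifis, cohort_size):
    """12-month cumulative incidence: proportion of a transplant
    cohort whose first invasive fungal infection (IFI) occurs
    within 12 months of transplantation."""
    return first_ifis / cohort_size

# Hypothetical counts, for illustration only.
pct = cumulative_incidence(58, 500) * 100
print(f"{pct:.1f}%")  # 11.6%
```

Trend analysis in the study compares this quantity across sequential yearly cohorts, which is why a fixed 12-month window is needed for each cohort.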
Abstract:
BACKGROUND: The incidence and epidemiology of invasive fungal infections (IFIs), a leading cause of death among hematopoietic stem cell transplant (HSCT) recipients, are derived mainly from single-institution retrospective studies. METHODS: The Transplant Associated Infections Surveillance Network, a network of 23 US transplant centers, prospectively enrolled HSCT recipients with proven and probable IFIs occurring between March 2001 and March 2006. We collected denominator data on all HSCTs performed at each site and clinical, diagnostic, and outcome information for each IFI case. To estimate trends in IFI, we calculated the 12-month cumulative incidence among 9 sequential subcohorts. RESULTS: We identified 983 IFIs among 875 HSCT recipients. The median age of the patients was 49 years; 60% were male. Invasive aspergillosis (43%), invasive candidiasis (28%), and zygomycosis (8%) were the most common IFIs. Fifty-nine percent and 61% of IFIs were recognized within 60 days of neutropenia and graft-versus-host disease, respectively. Median onset of candidiasis and aspergillosis after HSCT was 61 days and 99 days, respectively. Within a cohort of 16,200 HSCT recipients who received their first transplants between March 2001 and September 2005 and were followed up through March 2006, we identified 718 IFIs in 639 persons. Twelve-month cumulative incidences, based on the first IFI, were 7.7 cases per 100 transplants for matched unrelated allogeneic, 8.1 cases per 100 transplants for mismatched-related allogeneic, 5.8 cases per 100 transplants for matched-related allogeneic, and 1.2 cases per 100 transplants for autologous HSCT. CONCLUSIONS: In this national prospective surveillance study of IFIs in HSCT recipients, the cumulative incidence was highest for aspergillosis, followed by candidiasis. Understanding the epidemiologic trends and burden of IFIs may lead to improved management strategies and study design.
Abstract:
BACKGROUND: To our knowledge, the antiviral activity of pegylated interferon alfa-2a has not been studied in participants with untreated human immunodeficiency virus type 1 (HIV-1) infection but without chronic hepatitis C virus (HCV) infection. METHODS: Untreated HIV-1-infected volunteers without HCV infection received 180 microg of pegylated interferon alfa-2a weekly for 12 weeks. Changes in plasma HIV-1 RNA load, CD4(+) T cell counts, pharmacokinetics, pharmacodynamic measurements of 2',5'-oligoadenylate synthetase (OAS) activity, and induction levels of interferon-inducible genes (IFIGs) were measured. Nonparametric statistical analysis was performed. RESULTS: Eleven participants completed 12 weeks of therapy. The median plasma viral load decrease and change in CD4(+) T cell counts at week 12 were 0.61 log(10) copies/mL (90% confidence interval [CI], 0.20-1.18 log(10) copies/mL) and -44 cells/microL (90% CI, -95 to 85 cells/microL), respectively. There was no correlation between plasma viral load decreases and concurrent pegylated interferon plasma concentrations. However, participants with larger increases in OAS level exhibited greater decreases in plasma viral load at weeks 1 and 2 (r = -0.75 [90% CI, -0.93 to -0.28] and r = -0.61 [90% CI, -0.87 to -0.09], respectively; estimated Spearman rank correlation). Participants with higher baseline IFIG levels had smaller week 12 decreases in plasma viral load (0.66 log(10) copies/mL [90% CI, 0.06-0.91 log(10) copies/mL]), whereas those with larger IFIG induction levels exhibited larger decreases in plasma viral load (-0.74 log(10) copies/mL [90% CI, -0.93 to -0.21 log(10) copies/mL]). CONCLUSION: Pegylated interferon alfa-2a was well tolerated and exhibited statistically significant anti-HIV-1 activity in HIV-1-monoinfected patients. The anti-HIV-1 effect correlated with OAS protein levels (weeks 1 and 2) and IFIG induction levels (week 12) but not with pegylated interferon concentrations.
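The correlations above are estimated Spearman rank correlations. A self-contained sketch of the estimator (average ranks for ties, then the Pearson correlation of the ranks); the data here are hypothetical, not the study's measurements:

```python
def ranks(xs):
    # Assign 1-based average ranks, handling ties.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of tied positions
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    # Spearman's rho: Pearson correlation computed on the ranks.
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Perfectly monotone decreasing pairs give rho = -1,
# matching the sign of the OAS/viral-load relationship reported.
print(spearman([1, 2, 3, 4], [8, 6, 4, 2]))  # -1.0
```

A negative rho here means larger OAS increases ranked with larger viral-load decreases, which is how the week 1 and week 2 results should be read.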
Abstract:
BACKGROUND: Monogamy, together with abstinence, partner reduction, and condom use, is widely advocated as a key behavioral strategy to prevent HIV infection in sub-Saharan Africa. We examined the association between the number of sexual partners and the risk of HIV seropositivity among men and women presenting for HIV voluntary counseling and testing (VCT) in northern Tanzania. METHODOLOGY/PRINCIPAL FINDINGS: Clients presenting for HIV VCT at a community-based AIDS service organization in Moshi, Tanzania were surveyed between November 2003 and December 2007. Data on sociodemographic characteristics, reasons for testing, sexual behaviors, and symptoms were collected. Men and women were categorized by number of lifetime sexual partners, and rates of seropositivity were reported by category. Factors associated with HIV seropositivity among monogamous males and females were identified by a multivariate logistic regression model. Of 6,549 clients, 3,607 (55%) were female, and the median age was 30 years (IQR 24-40). 939 (25%) females and 293 (10%) males (p<0.0001) were HIV seropositive. Among 1,244 (34%) monogamous females and 423 (14%) monogamous males, the risk of HIV infection was 19% and 4%, respectively (p<0.0001). The risk increased monotonically with additional partners, up to 45% (p<0.001) for women and 15% (p<0.001) for men with 5 or more partners. In multivariate analysis, HIV seropositivity among monogamous women was most strongly associated with age (p<0.0001), lower education (p<0.004), and reporting a partner with other partners (p = 0.015). Only age was a significant risk factor for monogamous men (p = 0.0004). INTERPRETATION: Among women presenting for VCT, the number of partners is strongly associated with rates of seropositivity; however, even women reporting lifetime monogamy have a high risk for HIV infection.
Partner reduction should be coupled with efforts to place tools in the hands of sexually active women to reduce their risk of contracting HIV.
Abstract:
Eukaryotic genomes are mostly composed of noncoding DNA whose role is still poorly understood. Studies in several organisms have shown correlations between the length of the intergenic and genic sequences of a gene and the expression of its corresponding mRNA transcript. Some studies have found a positive relationship between intergenic sequence length and expression diversity between tissues, and concluded that genes under greater regulatory control require more regulatory information in their intergenic sequences. Other reports found a negative relationship between expression level and gene length and the interpretation was that there is selection pressure for highly expressed genes to remain small. However, a correlation between gene sequence length and expression diversity, opposite to that observed for intergenic sequences, has also been reported, and to date there is no testable explanation for this observation. To shed light on these varied and sometimes conflicting results, we performed a thorough study of the relationships between sequence length and gene expression using cell-type (tissue) specific microarray data in Arabidopsis thaliana. We measured median gene expression across tissues (expression level), expression variability between tissues (expression pattern uniformity), and expression variability between replicates (expression noise). We found that intergenic (upstream and downstream) and genic (coding and noncoding) sequences have generally opposite relationships with respect to expression, whether it is tissue variability, median, or expression noise. To explain these results we propose a model, in which the lengths of the intergenic and genic sequences have opposite effects on the ability of the transcribed region of the gene to be epigenetically regulated for differential expression. These findings could shed light on the role and influence of noncoding sequences on gene expression.
Abstract:
BACKGROUND: Small laboratory fish share many anatomical and histological characteristics with other vertebrates, yet can be maintained in large numbers at low cost for lifetime studies. Here we characterize biomarkers associated with normal aging in the Japanese medaka (Oryzias latipes), a species that has been widely used in toxicology studies and has potential utility as a model organism for experimental aging research. PRINCIPAL FINDINGS: The median lifespan of medaka was approximately 22 months under laboratory conditions. We performed quantitative histological analysis of tissues from age-grouped individuals representing young adults (6 months old), mature adults (16 months old), and adults that had survived beyond the median lifespan (24 months). Livers of 24-month-old individuals showed extensive morphologic changes, including spongiosis hepatis, steatosis, ballooning degeneration, inflammation, and nuclear pyknosis. There were also phagolysosomes, vacuoles, and residual bodies in parenchymal cells and congestion of sinusoidal vessels. Livers of aged individuals were characterized by increases in lipofuscin deposits and in the number of TUNEL-positive apoptotic cells. Some of these degenerative characteristics were seen, to a lesser extent, in the livers of 16-month-old individuals, but not in 6-month-old individuals. The basal layer of the dermis showed an age-dependent decline in the number of dividing cells and an increase in senescence-associated β-galactosidase. The hearts of aged individuals were characterized by fibrosis and lipofuscin deposition. There was also a loss of pigmented cells from the retinal epithelium. By contrast, age-associated changes were not apparent in skeletal muscle, the ocular lens, or the brain. SIGNIFICANCE: The results provide a set of markers that can be used to trace the process of normal tissue aging in medaka and to evaluate the effect of environmental stressors.
Abstract:
This study examines the timing of menarche in relation to infant-feeding methods, specifically addressing the potential effects of soy isoflavone exposure through soy-based infant feeding. Subjects were participants in the Avon Longitudinal Study of Parents and Children (ALSPAC). Mothers were enrolled during pregnancy and their children have been followed prospectively. Early-life feeding regimes, categorised as primarily breast, early formula, early soy and late soy, were defined using infant-feeding questionnaires administered during infancy. For this analysis, age at menarche was assessed using questionnaires administered approximately annually between ages 8 and 14.5. Eligible subjects were limited to term, singleton, White females. We used Kaplan-Meier survival curves and Cox proportional hazards models to assess age at menarche and risk of menarche over the study period. The present analysis included 2920 girls. Approximately 2% of mothers reported that soy products were introduced into the infant diet at or before 4 months of age (early soy). The median age at menarche [interquartile range (IQR)] in the study sample was 153 months [144-163], approximately 12.8 years. The median age at menarche among early soy-fed girls was 149 months (12.4 years) [IQR, 140-159]. Compared with girls fed non-soy-based infant formula or milk (early formula), early soy-fed girls were at 25% higher risk of menarche throughout the course of follow-up (hazard ratio 1.25 [95% confidence interval 0.92, 1.71]). Our results also suggest that girls fed soy products in early infancy may have an increased risk of menarche specifically in early adolescence. These findings may be the observable manifestation of mild endocrine-disrupting effects of soy isoflavone exposure. However, our study is limited by few soy-exposed subjects and is not designed to assess biological mechanisms. 
Because soy formula use is common in some populations, this subtle association with menarche warrants more in-depth evaluation in future studies.
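The ALSPAC analysis above estimates age at menarche with Kaplan-Meier survival curves. A minimal product-limit estimator, shown with a hypothetical handful of follow-up records (1 = menarche observed, 0 = censored), not the study's data:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate S(t).
    times: months of follow-up for each girl.
    events: 1 if the event (menarche) was observed at that time,
            0 if she was censored (lost to follow-up)."""
    event_times = sorted({t for t, e in zip(times, events) if e})
    s, curve = 1.0, []
    for t in event_times:
        at_risk = sum(1 for ti in times if ti >= t)       # still under observation
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei)
        s *= 1.0 - d / at_risk                            # conditional survival
        curve.append((t, s))
    return curve

# Hypothetical sample: three observed events, one girl censored at 150 months.
print(kaplan_meier([140, 150, 155, 160], [1, 0, 1, 1]))
# [(140, 0.75), (155, 0.375), (160, 0.0)]
```

The censored observation at 150 months contributes to the risk set at 140 but not afterwards, which is exactly why survival methods are preferred over simple means when some girls have not yet reached menarche by the end of follow-up.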
Abstract:
Fixed dose combination abacavir/lamivudine/zidovudine (ABC/3TC/ZDV) among HIV-1 and tuberculosis (TB)-coinfected patients was evaluated and outcomes between early vs. delayed initiation were compared. In a randomized, pilot study conducted in the Kilimanjaro Region of Tanzania, HIV-infected inpatients with smear-positive TB and total lymphocyte count <1200/mm(3) were randomized to initiate ABC/3TC/ZDV either 2 (early) or 8 (delayed) weeks after commencing antituberculosis therapy and were followed for 104 weeks. Of 94 patients screened, 70 enrolled (41% female, median CD4 count 103 cells/mm(3)), and 33 in each group completed 104 weeks. Two deaths and 12 serious adverse events (SAEs) were observed in the early arm vs. one death, one clinical failure, and seven SAEs in the delayed arm (p = 0.6012 for time to first grade 3/4 event, SAE, or death). CD4 cell increases were +331 and +328 cells/mm(3), respectively. TB-immune reconstitution inflammatory syndromes (TB-IRIS) were not observed in any subject. Using intent-to-treat (ITT) analyses in which missing data were counted as failure, 74% (26/35) vs. 89% (31/35) randomized to early vs. delayed therapy had HIV RNA levels <400 copies/ml at 104 weeks (p = 0.2182) and 66% (23/35) vs. 74% (26/35), respectively, had HIV RNA levels <50 copies/ml (p = 0.6026). In an analysis in which switches from ABC/3TC/ZDV were counted as failure, those receiving early therapy were less likely to be suppressed to <400 copies/ml [60% (21/35) vs. 86% (30/35), p = 0.030]. TB-IRIS was not observed among the 70 coinfected subjects beginning antiretroviral treatment. ABC/3TC/ZDV was well tolerated and resulted in steady immunologic improvement. Rates of virologic suppression were similar between early and delayed treatment strategies with triple nucleoside regimens when substitutions were allowed.
Abstract:
Previously we have shown that a functional nonsynonymous single nucleotide polymorphism (rs6318) of the 5HTR2C gene located on the X-chromosome is associated with hypothalamic-pituitary-adrenal axis response to a stress recall task, and with endophenotypes associated with cardiovascular disease (CVD). These findings suggest that individuals carrying the rs6318 Ser23 C allele will be at higher risk for CVD compared to Cys23 G allele carriers. The present study examined allelic variation in rs6318 as a predictor of coronary artery disease (CAD) severity and a composite endpoint of all-cause mortality or myocardial infarction (MI) among Caucasian participants consecutively recruited through the cardiac catheterization laboratory at Duke University Hospital (Durham, NC) as part of the CATHGEN biorepository. The study population consisted of 6,126 Caucasian participants (4,036 [65.9%] males and 2,090 [34.1%] females). A total of 1,769 events occurred (1,544 deaths and 225 MIs; median follow-up time = 5.3 years, interquartile range = 3.3-8.2). Unadjusted Cox time-to-event regression models showed that, compared to Cys23 G carriers, males hemizygous for Ser23 C and females homozygous for Ser23 C were at increased risk for the composite endpoint of all-cause death or MI: Hazard Ratio (HR) = 1.47, 95% confidence interval (CI) = 1.17, 1.84, p = .0008. Adjusting for age, rs6318 genotype was not related to body mass index, diabetes, hypertension, dyslipidemia, smoking history, number of diseased coronary arteries, or left ventricular ejection fraction in either males or females. After adjustment for these covariates the estimate for the two Ser23 C groups was modestly attenuated, but remained statistically significant: HR = 1.38, 95% CI = 1.10, 1.73, p = .005.
These findings suggest that this functional polymorphism of the 5HTR2C gene is associated with increased risk for CVD mortality and morbidity, but this association is apparently not explained by the association of rs6318 with traditional risk factors or conventional markers of atherosclerotic disease.
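The reported hazard ratio, 95% CI, and p-value can be cross-checked against one another, assuming (as is standard for Cox models) a Wald confidence interval on the log-hazard scale. This is a consistency check on the published numbers, not the study's actual computation:

```python
import math

def wald_p_from_hr_ci(hr, lo, hi):
    """Recover the two-sided Wald p-value implied by a hazard ratio
    and its 95% CI, assuming the CI is hr * exp(+/- 1.96 * se)
    on the log-hazard scale."""
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # width of CI in log units
    z = math.log(hr) / se                            # Wald z-statistic
    return math.erfc(abs(z) / math.sqrt(2))          # two-sided normal p-value

# HR = 1.47, 95% CI 1.17-1.84, as reported for the composite endpoint.
print(wald_p_from_hr_ci(1.47, 1.17, 1.84))  # ~0.00085, consistent with p = .0008
```

The same check on the covariate-adjusted estimate (HR 1.38, CI 1.10-1.73) lands near the reported p = .005, so the published figures are internally coherent.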
Elucidation of hepatitis C virus transmission and early diversification by single genome sequencing.
Abstract:
A precise molecular identification of transmitted hepatitis C virus (HCV) genomes could illuminate key aspects of transmission biology, immunopathogenesis and natural history. We used single genome sequencing of 2,922 half or quarter genomes from plasma viral RNA to identify transmitted/founder (T/F) viruses in 17 subjects with acute community-acquired HCV infection. Sequences from 13 of 17 acute subjects, but none of 14 chronic controls, exhibited one or more discrete low diversity viral lineages. Sequences within each lineage generally revealed a star-like phylogeny of mutations that coalesced to unambiguous T/F viral genomes. Numbers of transmitted viruses leading to productive clinical infection were estimated to range from 1 to 37 or more (median = 4). Four acutely infected subjects showed a distinctly different pattern of virus diversity that deviated from a star-like phylogeny. In these cases, empirical analysis and mathematical modeling suggested high multiplicity virus transmission from individuals who themselves were acutely infected or had experienced a virus population bottleneck due to antiviral drug therapy. These results provide new quantitative and qualitative insights into HCV transmission, revealing for the first time virus-host interactions that successful vaccines or treatment interventions will need to overcome. Our findings further suggest a novel experimental strategy for identifying full-length T/F genomes for proteome-wide analyses of HCV biology and adaptation to antiviral drug or immune pressures.
Abstract:
BACKGROUND: There have been major changes in the management of anemia in US hemodialysis patients in recent years. We sought to determine the influence of clinical trial results, safety regulations, and changes in reimbursement policy on practice. METHODS: We examined indicators of anemia management among incident and prevalent hemodialysis patients from a medium-sized dialysis provider over three time periods: (1) 2004 to 2006, (2) 2007 to 2009, and (3) 2010. Trends across the three time periods were compared using generalized estimating equations. RESULTS: Prior to 2007, the median proportion of patients with monthly hemoglobin >12 g/dL for patients on dialysis 0 to 3, 4 to 6, and 7 to 18 months, respectively, was 42%, 55%, and 46%; it declined to 41%, 54%, and 40% after 2007, and declined more sharply in 2010 to 34%, 41%, and 30%. Median weekly Epoetin alfa doses over the same periods were 18,000, 12,400, and 9,100 units before 2007; remained relatively unchanged from 2007 to 2009; and decreased sharply in the patients 3-6 and 6-18 months on dialysis to 10,200 and 7,800 units, respectively, in 2010. Iron doses, serum ferritin, and transferrin saturation levels increased over time, with more pronounced increases in 2010. CONCLUSION: Modest changes in anemia management occurred between 2007 and 2009, followed by more dramatic changes in 2010. Studies are needed to examine the effects of declining erythropoietin use and hemoglobin levels and increasing intravenous iron use on quality of life, transplantation rates, infection rates and survival.
Abstract:
BACKGROUND: Primary care providers' suboptimal recognition of the severity of chronic kidney disease (CKD) may contribute to untimely referrals of patients with CKD to subspecialty care. It is unknown whether U.S. primary care physicians' use of estimated glomerular filtration rate (eGFR) rather than serum creatinine to estimate CKD severity could improve the timeliness of their subspecialty referral decisions. METHODS: We conducted a cross-sectional study of 154 United States primary care physicians to assess the effect of use of eGFR (versus creatinine) on the timing of their subspecialty referrals. Primary care physicians completed a questionnaire featuring questions regarding a hypothetical White or African American patient with progressing CKD. We asked primary care physicians to identify the serum creatinine and eGFR levels at which they would recommend patients like the hypothetical patient be referred for subspecialty evaluation. We assessed significant improvement in the timing (from eGFR < 30 to ≥ 30 mL/min/1.73m(2)) of their recommended referrals based on their use of creatinine versus eGFR. RESULTS: Primary care physicians recommended subspecialty referrals later (CKD more advanced) when using creatinine versus eGFR to assess kidney function [median eGFR 32 versus 55 mL/min/1.73m(2), p < 0.001]. Forty percent of primary care physicians significantly improved the timing of their referrals when basing their recommendations on eGFR. Improved timing occurred more frequently among primary care physicians practicing in academic (versus non-academic) practices or presented with White (versus African American) hypothetical patients [adjusted percentage (95% CI): 70% (45-87) versus 37% (reference) and 57% (39-73) versus 25% (reference), respectively, both p ≤ 0.01]. CONCLUSIONS: Primary care physicians recommended subspecialty referrals earlier when using eGFR (versus creatinine) to assess kidney function. 
Enhanced use of eGFR by primary care physicians could lead to more timely subspecialty care and improved clinical outcomes for patients with CKD.
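For context on why eGFR and raw creatinine can point to very different severities, laboratories commonly derive eGFR from serum creatinine plus age, sex, and race. A sketch of the widely used 4-variable MDRD study equation (IDMS-traceable calibration); this is illustrative background, not the instrument used in the study:

```python
def egfr_mdrd(scr_mg_dl, age_years, female=False, black=False):
    """4-variable MDRD study equation (IDMS-traceable calibration).
    scr_mg_dl: serum creatinine in mg/dL.
    Returns eGFR in mL/min/1.73 m^2."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742  # sex adjustment
    if black:
        egfr *= 1.212  # race adjustment used by the original equation
    return egfr

# Same serum creatinine, very different kidney function:
print(round(egfr_mdrd(1.4, 30)))               # younger man
print(round(egfr_mdrd(1.4, 70, female=True)))  # older woman: much lower eGFR
```

A creatinine of 1.4 mg/dL looks only mildly abnormal in isolation, yet it can correspond to moderate-to-severe CKD in an older woman, which is the recognition gap the study's referral findings describe.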