29 results for Ehrenreich, Barbara: Nickel and dimed - undercover in low-wage USA
Abstract:
BACKGROUND: Reduced bone mineral density (BMD) is common in adults infected with human immunodeficiency virus (HIV). The role of proximal renal tubular dysfunction (PRTD) and alterations in bone metabolism in HIV-related low BMD are incompletely understood. METHODS: We quantified BMD (dual-energy x-ray absorptiometry), blood and urinary markers of bone metabolism and renal function, and risk factors for low BMD (hip or spine T score, -1 or less) in an ambulatory care setting. We determined factors associated with low BMD and calculated 10-year fracture risks using the World Health Organization FRAX equation. RESULTS: We studied 153 adults (98% men; median age, 48 years; median body mass index, 24.5; 67 [44%] were receiving tenofovir, 81 [53%] were receiving a boosted protease inhibitor [PI]). Sixty-five participants (42%) had low BMD, and 11 (7%) had PRTD. PI therapy was associated with low BMD in multivariable analysis (odds ratio, 2.69; 95% confidence interval, 1.09-6.63). Tenofovir use was associated with increased osteoblast and osteoclast activity (P ≤ .002). The mean estimated 10-year risks were 1.2% for hip fracture and 5.4% for any major osteoporotic fracture. CONCLUSIONS: In this mostly male population, low BMD was significantly associated with PI therapy. Tenofovir recipients showed evidence of increased bone turnover. Measurement of BMD and estimation of fracture risk may be warranted in treated HIV-infected adults.
Abstract:
In recent years, γ-hydroxybutyric acid (GHB) and γ-butyrolactone (GBL) have attracted much interest as recreational drugs and as knock-out drops in drug-facilitated sexual assaults. This experiment aimed to provide insight into the pharmacokinetics of GHB after intake of GBL. Two volunteers each took a single dose of 1.5 ml GBL, which had been added to a soft drink. Assuming that GBL was completely metabolized to GHB, the corresponding amount of GHB was 2.1 g. Blood and urine samples were collected 5 h and 24 h after ingestion, respectively. Additionally, hair samples (head hair and beard hair) were taken within four to five weeks after intake of GBL. Samples were analyzed by liquid chromatography-tandem mass spectrometry (LC-MS/MS) after protein precipitation with acetonitrile. The following observations were made: when added to a soft drink, GBL, which tastes very bitter, formed a liquid layer at the bottom of the glass that disappeared only on stirring. Both volunteers reported weak central effects after approximately 15 min, which disappeared completely half an hour later. Maximum GHB concentrations in serum were measured after 20 min (95 µg/ml and 106 µg/ml). Within 4-5 h the serum GHB concentrations decreased below 1 µg/ml. In urine, maximum GHB concentrations (140 µg/ml and 120 µg/ml) were measured after 1-2 h and decreased to less than 1 µg/ml within 8-10 h. The ratio of GHB in serum versus blood was 1.2 and 1.6 for the two volunteers, respectively.
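The serum figures above imply a very short apparent half-life. As a rough illustration (not from the paper), assuming simple first-order elimination, the half-life can be back-calculated from two concentration-time points; the numbers below are illustrative values consistent with the reported serum data (~100 µg/ml at 20 min, ~1 µg/ml by about 4.5 h):

```python
import math

def elimination_half_life(t1, c1, t2, c2):
    # Apparent first-order half-life from two concentration-time points:
    # t_half = (t2 - t1) * ln(2) / ln(c1 / c2)
    return (t2 - t1) * math.log(2) / math.log(c1 / c2)

# Illustrative values consistent with the abstract: ~100 ug/ml at 20 min,
# falling to ~1 ug/ml by 270 min (4.5 h after ingestion).
t_half = elimination_half_life(20, 100.0, 270, 1.0)
print(round(t_half, 1))  # ~37.6 min under these assumptions
```

A half-life on the order of half an hour is consistent with the abstract's picture of a drug detectable in serum for only a few hours.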
Abstract:
OBJECTIVE To describe the CD4 cell count at the start of combination antiretroviral therapy (cART) in low-income (LIC), lower middle-income (LMIC), upper middle-income (UMIC), and high-income (HIC) countries. METHODS Patients aged 16 years or older starting cART in a clinic participating in a multicohort collaboration spanning 6 continents (International epidemiologic Databases to Evaluate AIDS and ART Cohort Collaboration) were eligible. Multilevel linear regression models were adjusted for age, gender, and calendar year; missing CD4 counts were imputed. RESULTS In total, 379,865 patients from 9 LIC, 4 LMIC, 4 UMIC, and 6 HIC were included. In LIC, the median CD4 cell count at cART initiation increased by 83% from 80 to 145 cells/μL between 2002 and 2009. Corresponding increases in LMIC, UMIC, and HIC were from 87 to 155 cells/μL (76% increase), 88 to 135 cells/μL (53%), and 209 to 274 cells/μL (31%). In 2009, compared with LIC, median counts were 13 cells/μL [95% confidence interval (CI): -56 to +30] lower in LMIC, 22 cells/μL (-62 to +18) lower in UMIC, and 112 cells/μL (+75 to +149) higher in HIC. They were 23 cells/μL (95% CI: +18 to +28 cells/μL) higher in women than men. Median counts were 88 cells/μL (95% CI: +35 to +141 cells/μL) higher in countries with an estimated national cART coverage >80%, compared with countries with <40% coverage. CONCLUSIONS Median CD4 cell counts at the start of cART increased between 2000 and 2009 but remained below 200 cells/μL in LIC and MIC and below 300 cells/μL in HIC. Earlier start of cART will require substantial efforts and resources globally.
Abstract:
In a forest grove at Korup dominated by the ectomycorrhizal species Microberlinia bisulcata, an experiment tested whether phosphorus (P) was a limiting nutrient. P-fertilization of seven subplots in 1995-97 was compared with seven controls. It led to large increases in soil P concentrations. Trees were measured in 1995 and 2000. M. bisulcata and four other species were transplanted into the treatments, and a wild cohort of M. bisulcata seedlings was followed in both. Leaf litter fall from trees and seedlings was analysed for nutrients. Growth of trees was not affected by added P. Transplanted seedlings survived better in the controls than in the added-P subplots, and they did not grow better with added P. M. bisulcata wildlings survived slightly better in the added-P subplots in yr 1 but not later. Litter fall and transplanted survivors had much higher concentrations of P (not N) in the added-P than in the control subplots. Under current conditions, it appears that P does not limit growth of trees or hinder seedling establishment, especially of M. bisulcata, in these low-P grove soils.
Abstract:
BACKGROUND Management of persistent low-level viraemia (pLLV) in patients on combined antiretroviral therapy (cART) with previously undetectable HIV viral loads (VLs) is challenging. We examined virological outcome and management among patients enrolled in the Swiss HIV Cohort Study (SHCS). METHODS In this retrospective study (2000-2011), pLLV was defined as a VL of 21-400 copies/mL on ≥3 consecutive plasma samples with ≥8 weeks between first and last analyses, in patients undetectable for ≥24 weeks on cART. Control patients had ≥3 consecutive undetectable VLs over ≥32 weeks. Virological failure (VF), analysed in the pLLV patient group, was defined as a VL >400 copies/mL. RESULTS Among 9972 patients, 179 had pLLV and 5389 were controls. Compared to controls, pLLV patients were more often on unboosted PI-based (adjusted odds ratio, aOR, [95%CI] 3.2 [1.8-5.9]) and NRTI-only combinations (aOR 2.1 [1.1-4.2]) than on NNRTI and boosted PI-based regimens. At 48 weeks, 102/155 pLLV patients (66%) still had pLLV, 19/155 (12%) developed VF, and 34/155 (22%) had undetectable VLs. Predictors of VF were previous VF (aOR 35 [3.8-315]), unboosted PI-based (aOR 12.8 [1.7-96]) or NRTI-only combinations (aOR 115 [6.8-1952]), and VLs >200 copies/mL during pLLV (aOR 3.7 [1.1-12]). No VF occurred in patients with persistent very LLV (pVLLV, 21-49 copies/mL; N=26). At 48 weeks, 29/39 patients (74%) who changed cART had undetectable VLs, compared to 19/74 (26%) without change (P<0.001). CONCLUSIONS Among patients with pLLV, VF was predicted by previous VF, cART regimen and VL ≥200 copies/mL. Most patients who changed cART had undetectable VLs 48 weeks later. These findings support cART modification for pLLV >200 copies/mL.
Abstract:
BACKGROUND The CD4 cell count or percent (CD4%) at the start of combination antiretroviral therapy (cART) is an important prognostic factor in children starting therapy and an important indicator of program performance. We describe trends and determinants of CD4 measures at cART initiation in children from low-, middle-, and high-income countries. METHODS We included children aged <16 years from clinics participating in a collaborative study spanning sub-Saharan Africa, Asia, Latin America, and the United States. Missing CD4 values at cART start were estimated through multiple imputation. Severe immunodeficiency was defined according to World Health Organization criteria. Analyses used generalized additive mixed models adjusted for age, country, and calendar year. RESULTS A total of 34,706 children from 9 low-income, 6 lower middle-income, 4 upper middle-income countries, and 1 high-income country (United States) were included; 20,624 children (59%) had severe immunodeficiency. In low-income countries, the estimated prevalence of children starting cART with severe immunodeficiency declined from 76% in 2004 to 63% in 2010. Corresponding figures for lower middle-income countries were from 77% to 66% and for upper middle-income countries from 75% to 58%. In the United States, the percentage decreased from 42% to 19% during the period 1996 to 2006. In low- and middle-income countries, infants and children aged 12-15 years had the highest prevalence of severe immunodeficiency at cART initiation. CONCLUSIONS Despite progress in most low- and middle-income countries, many children continue to start cART with severe immunodeficiency. Early diagnosis and treatment of HIV-infected children to prevent morbidity and mortality associated with immunodeficiency must remain a global public health priority.
Abstract:
Pressure–Temperature–time (P–T–t) estimates of the syn-kinematic strain at the peak-pressure conditions reached during shallow underthrusting of the Briançonnais Zone in the Alpine subduction zone were made by thermodynamic modelling and 40Ar/39Ar dating in the Plan-de-Phasy unit (SE of the Pelvoux Massif, Western Alps). The dated phengite minerals crystallized syn-kinematically in a shear zone indicating top-to-the-N motion. By combining X-ray mapping with multi-equilibrium calculations, we estimate the phengite crystallization conditions at 270 ± 50 °C and 8.1 ± 2 kbar at an age of 45.9 ± 1.1 Ma. Combining this P–T–t estimate with data from the literature allows us to constrain the timing and geometry of Alpine continental subduction. We propose that the Briançonnais units were scalped on top of the slab during ongoing continental subduction and exhumed continuously until collision.
Abstract:
Geological site characterisation programmes typically rely on drill cores for direct information on subsurface rocks. However, porosity, transport properties and porewater composition measured on drill cores can deviate from in-situ values due to two main artefacts caused by drilling and sample recovery: (1) mechanical disruption that increases porosity and (2) contamination of the porewater by drilling fluid. We investigated the effect and magnitude of these perturbations on large drill core samples (12–20 cm long, 5 cm diameter) of high-grade, granitic gneisses obtained from 350 to 600 m depth in a borehole on Olkiluoto Island (SW Finland). The drilling fluid was traced with sodium iodide. By combining out-diffusion experiments, gravimetry, UV-microscopy and iodide mass balance calculations, we successfully quantified the magnitudes of the artefacts: 2–6% increase in porosity relative to the bulk connected porosity and 0.9 to 8.9 vol.% contamination by drilling fluid. The spatial distribution of the drilling-induced perturbations was revealed by numerical simulations of 2D diffusion matched to the experimental data. This showed that the rims of the samples have a mechanically disrupted zone 0.04 to 0.22 cm wide, characterised by faster transport properties compared to the undisturbed centre (1.8 to 7.7 times higher pore diffusion coefficient). Chemical contamination was shown to affect an even wider zone in all samples, ranging from 0.15 to 0.60 cm, in which iodide enrichment was up to 180 mg/kg water, compared to 0.5 mg/kg water in the uncontaminated centre. For all samples in the present case study, it turned out that the magnitude of the artefacts caused by drilling and sample recovery is so small that no correction is required for their effects. Therefore, the standard laboratory measurements of porosity, transport properties and porewater composition can be taken as valid in-situ estimates.
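The drilling-fluid contamination described above is essentially a diffusion front penetrating inward from the core rim. As a hedged illustration of the qualitative shape of such a profile (a 1D explicit finite-difference sketch with invented parameters, not the authors' 2D model matched to data):

```python
# Minimal 1D explicit finite-difference diffusion sketch (illustrative only,
# not the authors' 2D simulation): a tracer held at fixed concentration at
# the core rim diffuses into initially tracer-free rock.
def diffuse(n=50, steps=2000, D=1.0, dx=1.0, dt=0.2, c_boundary=1.0):
    r = D * dt / dx**2
    assert r <= 0.5  # stability limit of the explicit scheme
    c = [0.0] * n
    c[0] = c_boundary                      # contaminated rim
    for _ in range(steps):
        new = c[:]
        for i in range(1, n - 1):
            new[i] = c[i] + r * (c[i-1] - 2*c[i] + c[i+1])
        new[0] = c_boundary                # rim held at fixed concentration
        c = new
    return c

profile = diffuse()
# The concentration decreases monotonically away from the contaminated rim,
# mirroring the iodide enrichment confined to the outer few millimetres.
```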
However, it is clear that the magnitudes strongly depend on site- and drilling-specific factors and therefore our results cannot be transferred simply to other locations. We recommend the approach presented in this study as a route to obtain reliable values in future drilling campaigns aimed at characterising in-situ bedrock properties.
Abstract:
OBJECTIVES This study compared clinical outcomes and revascularization strategies among patients presenting with low ejection fraction, low-gradient (LEF-LG) severe aortic stenosis (AS) according to the assigned treatment modality. BACKGROUND The optimal treatment modality for patients with LEF-LG severe AS and concomitant coronary artery disease (CAD) requiring revascularization is unknown. METHODS Of 1,551 patients, 204 with LEF-LG severe AS (aortic valve area <1.0 cm², ejection fraction <50%, and mean gradient <40 mm Hg) were allocated to medical therapy (MT) (n = 44), surgical aortic valve replacement (SAVR) (n = 52), or transcatheter aortic valve replacement (TAVR) (n = 108). CAD complexity was assessed using the SYNTAX score (SS) in 187 of 204 patients (92%). The primary endpoint was mortality at 1 year. RESULTS LEF-LG severe AS patients undergoing SAVR were more likely to undergo complete revascularization (17 of 52, 35%) compared with TAVR (8 of 108, 8%) and MT (0 of 44, 0%) patients (p < 0.001). Compared with MT, both SAVR (adjusted hazard ratio [adj HR]: 0.16; 95% confidence interval [CI]: 0.07 to 0.38; p < 0.001) and TAVR (adj HR: 0.30; 95% CI: 0.18 to 0.52; p < 0.001) improved survival at 1 year. In TAVR and SAVR patients, CAD severity was associated with higher rates of cardiovascular death (no CAD: 12.2% vs. low SS [0 to 22], 15.3% vs. high SS [>22], 31.5%; p = 0.037) at 1 year. Compared with no CAD/complete revascularization, TAVR and SAVR patients undergoing incomplete revascularization had significantly higher 1-year cardiovascular death rates (adj HR: 2.80; 95% CI: 1.07 to 7.36; p = 0.037). CONCLUSIONS Among LEF-LG severe AS patients, SAVR and TAVR improved survival compared with MT. CAD severity was associated with worse outcomes and incomplete revascularization predicted 1-year cardiovascular mortality among TAVR and SAVR patients.
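The 1-year survival comparisons above come from hazard models fitted to patient data. For readers unfamiliar with the underlying machinery, here is a minimal Kaplan-Meier survival estimator in pure Python on invented data (illustrative only, not the study's data or code):

```python
# Kaplan-Meier estimator: at each observed death time t, survival is
# multiplied by (1 - deaths_at_t / number_still_at_risk_at_t).
def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = death, 0 = censored.
    Returns a list of (time, survival probability) at each death time."""
    data = sorted(zip(times, events))
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        n_at_risk = sum(1 for tt, _ in data if tt >= t)
        if deaths:
            s *= 1 - deaths / n_at_risk
            curve.append((t, s))
        i += sum(1 for tt, _ in data if tt == t)  # skip ties
    return curve

# Invented follow-up data (months); 1 = death, 0 = censored.
times  = [2, 3, 3, 5, 8, 10, 12, 12]
events = [1, 1, 0, 1, 0, 1, 0, 0]
curve = kaplan_meier(times, events)
```

Real analyses like the one above additionally adjust for covariates (hence the reported adjusted hazard ratios), typically with a Cox proportional hazards model.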
Abstract:
OBJECTIVES Hypothetically, the atherogenic effect of the metabolic syndrome may be mediated through the increased occurrence of small LDL particles, which are easily modified to atherogenic oxidized LDL (ox-LDL). The aim of this study was to test this concept by examining the association between circulating ox-LDL, LDL-particle size, and the metabolic syndrome. DESIGN AND RESULTS A population-based sample of clinically healthy 58-year-old men (n = 391) was recruited. Ox-LDL was measured by ELISA (specific monoclonal antibody, mAb-4E6) and LDL-particle size by gradient gel electrophoresis. Ox-LDL correlated significantly with factors constituting the metabolic syndrome: triglycerides (r = 0.43), plasma insulin (r = 0.20), body mass index (r = 0.20), waist-to-hip ratio (r = 0.21) and HDL (r = -0.24) (P < 0.001). Ox-LDL also correlated with LDL-particle size (r = -0.42), Apo-B (r = 0.70) and LDL (r = 0.65) (P < 0.001), and, furthermore, with Apo A-1 (r = -0.13) and heart rate (r = 0.13) (P < 0.01). CONCLUSION Men with the metabolic syndrome had higher plasma ox-LDL concentrations than those without the syndrome. Ox-LDL levels were associated with most of the risk factors constituting the metabolic syndrome and were, in addition, related to small LDL-particle size. To our knowledge, the present study is the first to demonstrate that circulating ox-LDL levels are associated with small LDL-particle size in a population-representative sample of clinically healthy middle-aged men. The high degree of intercorrelation amongst several factors makes it difficult to clarify the independent role of any specific factor.
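The r values above are Pearson correlation coefficients. A minimal sketch of how such a coefficient is computed (pure Python, invented data, not the study's code):

```python
def pearson_r(x, y):
    # Pearson correlation: covariance of x and y divided by the
    # product of their standard deviations; ranges from -1 to +1.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Invented example: a perfectly linear relationship gives r close to 1.
r = pearson_r([1, 2, 3, 4], [2, 4, 6, 8])
```

Note that, as the abstract's final sentence cautions, pairwise correlations alone cannot disentangle the independent contribution of intercorrelated factors; that requires multivariable analysis.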
Abstract:
BACKGROUND The global burden of childhood tuberculosis (TB) is estimated to be 0.5 million new cases per year. Human immunodeficiency virus (HIV)-infected children are at high risk for TB. Diagnosis of TB in HIV-infected children remains a major challenge. METHODS We describe TB diagnosis and screening practices of pediatric antiretroviral treatment (ART) programs in Africa, Asia, the Caribbean, and Central and South America. We used web-based questionnaires to collect data on ART programs and patients seen from March to July 2012. Forty-three ART programs treating children in 23 countries participated in the study. RESULTS Sputum microscopy and chest radiograph were available at all programs, mycobacterial culture in 40 (93%) sites, gastric aspiration in 27 (63%), induced sputum in 23 (54%), and Xpert MTB/RIF in 16 (37%) sites. Screening practices to exclude active TB before starting ART included contact history in 41 sites (84%), symptom screening in 38 (88%), and chest radiograph in 34 sites (79%). The use of diagnostic tools was examined among 146 children diagnosed with TB during the study period. Chest radiograph was used in 125 (86%) children, sputum microscopy in 76 (52%), induced sputum microscopy in 38 (26%), gastric aspirate microscopy in 35 (24%), culture in 25 (17%), and Xpert MTB/RIF in 11 (8%) children. CONCLUSIONS Induced sputum and Xpert MTB/RIF were infrequently available to diagnose childhood TB, and screening was largely based on symptom identification. There is an urgent need to improve the capacity of ART programs in low- and middle-income countries to exclude and diagnose TB in HIV-infected children.
Abstract:
BACKGROUND Viral load and CD4% are often not available in resource-limited settings for monitoring children's responses to antiretroviral therapy (ART). We aimed to construct normative curves for weight gain at 6, 12, 18, and 24 months following initiation of ART in children, and to assess the association between poor weight gain and subsequent responses to ART. DESIGN Analysis of data from HIV-infected children younger than 10 years old from African and Asian clinics participating in the International epidemiologic Databases to Evaluate AIDS. METHODS The generalized additive model for location, scale, and shape was used to construct normative percentile curves for weight gain at 6, 12, 18, and 24 months following ART initiation. Cox proportional hazards models were used to assess the association between lower percentiles (< 50th) of weight gain distribution at the different time points and subsequent death, virological suppression, and virological failure. RESULTS Among 7173 children from five regions of the world, 45% were underweight at baseline. Weight gain below the 50th percentile at 6, 12, 18, and 24 months of ART was associated with increased risk of death, independent of baseline characteristics. Poor weight gain was not associated with increased hazards of virological suppression or virological failure. CONCLUSION Monitoring weight gain on ART using age-specific and sex-specific normative curves specifically developed for HIV-infected children on ART is a simple, rapid, sustainable tool that can aid in the identification of children who are at increased risk of death in the first year of ART.