40 results for Transformation-based semi-parametric estimators
Abstract:
BACKGROUND: We sought to characterize the impact that hepatitis C virus (HCV) infection has on CD4 cells during the first 48 weeks of antiretroviral therapy (ART) in previously ART-naive human immunodeficiency virus (HIV)-infected patients. METHODS: The HIV/AIDS Drug Treatment Programme at the British Columbia Centre for Excellence in HIV/AIDS distributes all ART in this Canadian province. Eligible individuals were those whose first-ever ART included 2 nucleoside reverse transcriptase inhibitors and either a protease inhibitor or a nonnucleoside reverse transcriptase inhibitor and who had a documented positive result for HCV antibody testing. Outcomes were binary events (time to an increase of ≥75 CD4 cells/mm³ or an increase of ≥10% in the percentage of CD4 cells in the total T cell population [CD4 cell fraction]) and continuous repeated measures. Statistical analyses used parametric and nonparametric methods, including multivariate mixed-effects linear regression analysis and Cox proportional hazards analysis. RESULTS: Of 1186 eligible patients, 606 (51%) were positive and 580 (49%) were negative for HCV antibodies. HCV antibody-positive patients were slower to have an absolute (P<.001) and a fraction (P = .02) CD4 cell event. In adjusted Cox proportional hazards analysis (controlling for age, sex, baseline absolute CD4 cell count, baseline plasma viral load (pVL), type of ART initiated, AIDS diagnosis at baseline, adherence to ART regimen, and number of CD4 cell measurements), HCV antibody-positive patients were less likely to have an absolute CD4 cell event (adjusted hazard ratio [AHR], 0.84; 95% confidence interval [CI], 0.72-0.98) and somewhat less likely to have a CD4 cell fraction event (AHR, 0.89; 95% CI, 0.70-1.14) than HCV antibody-negative patients. In multivariate mixed-effects linear regression analysis, HCV antibody-negative patients had increases of an average of 75 cells in the absolute CD4 cell count and 4.4% in the CD4 cell fraction, compared with 20 cells and 1.1% in HCV antibody-positive patients, during the first 48 weeks of ART, after adjustment for time-updated pVL, number of CD4 cell measurements, and other factors. CONCLUSION: HCV antibody-positive HIV-infected patients may have an altered immunologic response to ART.
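As an illustration of the kind of adjusted Cox model described in this abstract, here is a minimal sketch using the lifelines library. The data file and column names (time_to_cd4_event, event, hcv_pos, etc.) are hypothetical placeholders, categorical covariates are assumed to be numerically coded, and this is not the study's actual code.

```python
# Sketch of an adjusted Cox proportional hazards analysis (hypothetical names).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cohort.csv")  # hypothetical file: one row per patient

covariates = ["hcv_pos", "age", "sex", "baseline_cd4", "baseline_pvl",
              "art_type", "aids_at_baseline", "adherence", "n_cd4_measures"]

cph = CoxPHFitter()
cph.fit(df[["time_to_cd4_event", "event"] + covariates],
        duration_col="time_to_cd4_event", event_col="event")

# exp(coef) for hcv_pos is the adjusted hazard ratio, analogous to the
# reported AHR of 0.84 (95% CI, 0.72-0.98) for the absolute CD4 cell event.
cph.print_summary()
```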
Abstract:
The current paradigm of leukemogenesis indicates that leukemias are propagated by leukemic stem cells, and the genomic events and pathways involved in the transformation of hematopoietic precursors into leukemic stem cells are increasingly understood. This understanding is based on genomic mutations or functional dysregulation of transcription factors in malignant cells of patients with acute myeloid leukemia (AML). Loss of CCAAT/enhancer binding protein-alpha (CEBPA) function in myeloid cells in vitro and in vivo leads to a differentiation block similar to that observed in blasts from AML patients. CEBPA alterations in specific subgroups of AML comprise genomic mutations leading to dominant-negative mutant proteins, transcriptional suppression by leukemic fusion proteins, translational inhibition by activated RNA-binding proteins, and functional inhibition by phosphorylation or increased proteasome-dependent degradation. The PU.1 gene can be mutated, or its expression or function can be blocked by leukemogenic fusion proteins in AML. Point mutations in the RUNX1/AML1 gene are also observed in specific subtypes of AML, in addition to RUNX1 being the most frequent target of chromosomal translocation in AML. These data are persuasive evidence that impaired function of particular transcription factors contributes directly to the development of human AML, and restoring their function represents a promising target for novel therapeutic strategies in AML.
Abstract:
OBJECTIVES: Bone attrition probably constitutes remodeling of the bone, resulting in flattening or depression of the articular surfaces. Defining bone attrition is challenging because it is an accentuation of the normal curvature of the tibial plateaus. We aimed to define bone attrition on magnetic resonance imaging (MRI) of the knee using information from both radiographs and MRIs, and to assess whether bone attrition is common prior to end-stage osteoarthritis (OA) of the tibio-femoral joint. METHODS: All knees of participants in the community-based sample of the Framingham OA Study were evaluated for bone attrition on radiographs and MRIs. Radiographs were scored based on templates designed to outline the normal contours of the tibio-femoral joint. MRIs were analyzed using the semi-quantitative Whole-Organ Magnetic Resonance Imaging Scoring (WORMS) method. The prevalence of bone attrition was calculated using two different thresholds for MRI scores. RESULTS: Inter-observer agreement for identification of bone attrition was substantial for the radiographs (kappa=0.71, 95% CI 0.67-0.81) and moderate for MRI (kappa=0.56, 95% CI 0.40-0.72). Of 964 knees, 5.7% of the radiographs showed bone attrition; of these, 91% of the corresponding MRIs were also read as showing bone attrition. We selected a conservative threshold for bone attrition on MRI scoring (≥2 on a 0-3 scale) based on agreement with attrition on the radiograph or co-occurrence of bone attrition with cartilage loss on MRI. Using this threshold, bone attrition was common in knees with OA. For example, in knees with mild OA but no joint space narrowing, 13 of 88 MRIs (14.8%) showed bone attrition. CONCLUSIONS: Using MRI, we found that many knees with mild OA without joint space narrowing on radiographs had bone attrition, even using conservative definitions. The validity of our definition of bone attrition should be evaluated in further studies. Bone attrition may occur in milder OA and at earlier stages of disease than previously thought.
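The inter-observer agreement statistics quoted above are Cohen's kappa values. A minimal sketch of how such a kappa with a bootstrap 95% CI can be computed, using simulated placeholder ratings rather than the study's data:

```python
# Cohen's kappa for two observers' binary attrition ratings, with a
# percentile-bootstrap 95% CI. Ratings are simulated placeholders.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)
n = 964  # number of knees, as in the abstract
reader1 = rng.integers(0, 2, n)
reader2 = np.where(rng.random(n) < 0.1, 1 - reader1, reader1)  # ~10% disagreement

kappa = cohen_kappa_score(reader1, reader2)

idx = [rng.integers(0, n, n) for _ in range(2000)]        # bootstrap resamples
boot = [cohen_kappa_score(reader1[i], reader2[i]) for i in idx]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"kappa = {kappa:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```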
Abstract:
Annually laminated (varved) sediments of proglacial Lake Silvaplana (46°27′N, 9°48′E, 1791 m a.s.l., Engadine, eastern Swiss Alps) provide an excellent archive for quantitative high-resolution (seasonal to annual) reconstruction of high- and low-frequency climate signals back to AD 1580. The chronology of the core is based on varve counting, Cs-137, Pb-210 and event stratigraphy. In this study we present a reconstruction based on in-situ reflectance spectroscopy, a cost- and time-effective non-destructive method for semi-quantitative analysis of pigments (e.g., chlorins and carotenoids) and of lithoclastic sediment fractions. Reflectance-dependent absorption (RDA) was measured with a GretagMacbeth Spectrolino at 2 mm resolution. The spectral coverage ranges from 380 nm to 730 nm at 10 nm band resolution. In proglacial Lake Silvaplana, 99% of the sediment is lithoclastic prior to AD 1950; we therefore concentrate on absorption features that are characteristic of lithoclastic sediment fractions. In Lake Silvaplana, two significant correlations that are stable in time were found between RDA typical of lithoclastics and meteorological data: (1) the time series R570/R630 (ratio between RDA at 570 nm and 630 nm) of varves in Lake Silvaplana correlates highly significantly with May to October temperatures at the nearby station of Sils (calibration period AD 1864-1951, r = 0.74, p < 0.01 for 5-pt smoothed series; RMSE = 0.28°C, RE = 0.41 and CE = 0.38), and (2) the minimum reflectance within the 690 nm band (min690) correlates with May to October temperatures (calibration period AD 1864-1951, r = 0.68, p < 0.01 for 5-pt smoothed series; RMSE = 0.22°C, RE = 0.5, CE = 0.31). Both proxy series (min690 and R570/R630) are internally highly consistent (r = 0.8, p < 0.001). In proglacial Lake Silvaplana, the largest amount of sediment is transported by glacial meltwater. The melting season spans approximately May to October, which gives us a good understanding of the geophysical processes explaining the correlations between lithoclastic proxies and the meteorological data. The reconstructions were extended back to AD 1580 and show a broad correspondence with fully independent reconstructions from tree rings and documentary data.
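The calibration statistics reported above (r, RMSE, RE, CE) are the standard split-period verification measures of paleoclimate reconstruction. A minimal sketch of how they are computed, assuming aligned observed and predicted temperature series split into calibration and verification periods:

```python
# Split-period verification statistics for a proxy-based reconstruction.
import numpy as np

def verification_stats(obs_cal, pred_cal, obs_ver, pred_ver):
    """r over the calibration period; RMSE, RE, CE over verification."""
    r = np.corrcoef(obs_cal, pred_cal)[0, 1]
    sse = np.sum((obs_ver - pred_ver) ** 2)
    rmse = np.sqrt(sse / len(obs_ver))
    # RE benchmarks against the calibration-period mean, CE against the
    # verification-period mean; values > 0 indicate predictive skill.
    re = 1 - sse / np.sum((obs_ver - obs_cal.mean()) ** 2)
    ce = 1 - sse / np.sum((obs_ver - obs_ver.mean()) ** 2)
    return r, rmse, re, ce
```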
Abstract:
Loss to follow-up (LTFU) is a common problem in many epidemiological studies. In antiretroviral treatment (ART) programs for patients with human immunodeficiency virus (HIV), mortality estimates can be biased if the LTFU mechanism is non-ignorable, that is, if mortality differs between lost and retained patients. In this setting, routine procedures for handling missing data may lead to biased estimates. To deal appropriately with non-ignorable LTFU, explicit modeling of the missing data mechanism is needed. This can be based on additional outcome ascertainment for a sample of patients LTFU, for example, through linkage to national registries or through survey-based methods. In this paper, we demonstrate how this additional information can be used to construct estimators based on inverse probability weights (IPW) or multiple imputation. We use simulations to contrast the performance of the proposed estimators with methods widely used in HIV cohort research for dealing with missing data. The practical implications of our approach are illustrated using South African ART data, which are partially linkable to South African national vital registration data. Our results demonstrate that while IPW estimators and proper imputation procedures can easily be constructed from additional outcome ascertainment to obtain valid overall estimates, neglecting non-ignorable LTFU can result in substantial bias. We believe the proposed estimators are readily applicable to a growing number of studies where LTFU is appreciable, but additional outcome data are available through linkage or surveys of patients LTFU.
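A minimal sketch of the IPW idea described above, assuming a hypothetical patient-level table with indicator columns ltfu, traced, and died (these names are illustrative, not from the paper):

```python
# IPW mortality estimate when outcomes are ascertained for a traced
# subsample of patients lost to follow-up (LTFU). Column names hypothetical.
import numpy as np
import pandas as pd

df = pd.read_csv("art_cohort.csv")  # hypothetical file

# Retained patients get weight 1; traced LTFU patients are up-weighted by
# the inverse of the probability of being traced; untraced LTFU patients
# drop out of the weighted analysis.
p_traced = df.loc[df.ltfu == 1, "traced"].mean()
df["w"] = np.where(df.ltfu == 0, 1.0,
                   np.where(df.traced == 1, 1.0 / p_traced, 0.0))

obs = df[df.w > 0]
ipw_mortality = np.average(obs["died"], weights=obs["w"])
naive_mortality = df.loc[df.ltfu == 0, "died"].mean()  # ignores LTFU deaths
print(ipw_mortality, naive_mortality)
```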
Abstract:
BACKGROUND Moraxella catarrhalis, a major nasopharyngeal pathogen of the human respiratory tract, is exposed to rapid downshifts of environmental temperature when humans breathe cold air. The prevalence of pharyngeal colonization and respiratory tract infections caused by M. catarrhalis is greatest in winter. We investigated how M. catarrhalis uses the physiologic exposure to cold air to regulate pivotal survival systems that may contribute to M. catarrhalis virulence. RESULTS In this study we used RNA-seq to quantitatively catalogue the transcriptome of M. catarrhalis exposed to a 26 °C cold shock or to continuous growth at 37 °C. Validation of RNA-seq data using quantitative RT-PCR analysis demonstrated the RNA-seq results to be highly reliable. We observed that a 26 °C cold shock induces the expression of genes that in other bacteria have been related to virulence: a strong induction was observed for genes involved in high-affinity phosphate transport and iron acquisition, indicating that M. catarrhalis makes better use of both phosphate and iron resources after exposure to cold shock. We detected the induction of genes involved in nitrogen metabolism, as well as of several outer membrane proteins, including ompA, the m35-like porin and the multidrug efflux pump (acrAB), indicating that M. catarrhalis remodels its membrane components in response to a downshift in temperature. Furthermore, we demonstrate that a 26 °C cold shock enhances the induction of genes encoding the type IV pili that are essential for natural transformation, and increases the genetic competence of M. catarrhalis, which may facilitate the rapid spread and acquisition of novel virulence-associated genes. CONCLUSION Cold shock at a physiologically relevant temperature of 26 °C induces in M. catarrhalis a complex of adaptive mechanisms that could convey novel pathogenic functions and may contribute to enhanced colonization and virulence.
Abstract:
High-resolution quantitative computed tomography (HRQCT)-based analysis of spinal bone density and microstructure, finite element analysis (FEA), and DXA were used to investigate the vertebral bone status of men with glucocorticoid-induced osteoporosis (GIO). DXA of L1–L3 and total hip, QCT of L1–L3, and HRQCT of T12 were available for 73 men (54.6 ± 14.0 years) with GIO. Prevalent vertebral fracture status was evaluated on radiographs using a semi-quantitative (SQ) score (normal=0 to severe fracture=3), and the spinal deformity index (SDI) score (sum of SQ scores of T4 to L4 vertebrae). Thirty-one (42.4%) subjects had prevalent vertebral fractures. Cortical BMD (Ct.BMD) and thickness (Ct.Th), trabecular BMD (Tb.BMD), apparent trabecular bone volume fraction (app.BV/TV), and apparent trabecular separation (app.Tb.Sp) were analyzed by HRQCT. Stiffness and strength of T12 were computed by HRQCT-based nonlinear FEA for axial compression, anterior bending and axial torsion. In logistic regressions adjusted for age, glucocorticoid dose and osteoporosis treatment, Tb.BMD was most closely associated with vertebral fracture status (standardized odds ratio [sOR]: Tb.BMD T12: 4.05 [95% CI: 1.8–9.0], Tb.BMD L1–L3: 3.95 [1.8–8.9]). Strength divided by cross-sectional area for axial compression showed the most significant association with spine fracture status among FEA variables (2.56 [1.29–5.07]). SDI was best predicted by a microstructural model using Ct.Th and app.Tb.Sp (r² = 0.57, p < 0.001). Spinal or hip DXA measurements did not show significant associations with fracture status or severity. In this cross-sectional study of males with GIO, QCT, HRQCT-based measurements and FEA variables were superior to DXA in discriminating between patients of differing prevalent vertebral fracture status. A microstructural model combining aspects of cortical and trabecular bone reflected fracture severity most accurately.
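The standardized odds ratios (sORs) reported above are odds ratios per one standard deviation of the predictor from an adjusted logistic regression. A minimal sketch with statsmodels; the file and variable names are hypothetical:

```python
# Standardized odds ratio (OR per 1 SD of Tb.BMD) from a logistic regression
# adjusted for age, glucocorticoid dose and osteoporosis treatment.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("gio_cohort.csv")  # hypothetical patient-level file

df["tb_bmd_z"] = (df["tb_bmd"] - df["tb_bmd"].mean()) / df["tb_bmd"].std()
X = sm.add_constant(df[["tb_bmd_z", "age", "gc_dose", "op_treatment"]])
fit = sm.Logit(df["vert_fracture"], X).fit(disp=False)

sor = np.exp(fit.params["tb_bmd_z"])          # standardized odds ratio
ci = np.exp(fit.conf_int().loc["tb_bmd_z"])   # 95% CI bounds
print(sor, ci.values)
```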
Abstract:
Background Agroforestry is a sustainable land use method with a long tradition in the Bolivian Andes. A better understanding of people's knowledge and valuation of woody species can help to adjust agroforestry systems to local actors. In this case study, carried out in a peasant community of the Bolivian Andes, we aimed at calculating the cultural importance of selected agroforestry species, and at analysing the intracultural variation in the cultural importance and knowledge of plants according to peasants' sex, age, and migration. Methods Data collection was based on semi-structured interviews and freelisting exercises. Two ethnobotanical indices (Composite Salience, Cultural Importance) were used for calculating the cultural importance of plants. Intracultural variation in the cultural importance and knowledge of plants was detected by using linear and generalised linear (mixed) models. Results and discussion The culturally most important woody species were mainly trees and exotic species (e.g. Schinus molle, Prosopis laevigata, Eucalyptus globulus). We found that knowledge and valuation of plants increased with age but were lower for migrants; sex, by contrast, played a minor role. The age effects possibly result from the decreasing ecological apparency of valuable native species and their substitution by exotic marketable trees, the loss of traditional plant uses, or the use of other materials (e.g. plastic) instead of wood. Decreasing dedication to traditional farming may have led to successive abandonment of traditional tool uses, and the overall transformation of woody plant use is possibly related to diminishing medicinal knowledge. Conclusions Age and migration affect how people value woody species and what they know about their uses. For this reason, we recommend paying particular attention to the potential of native species, which could open promising perspectives especially for the young migrating peasant generation and draw their interest to agroforestry. These native species should be ecologically sound and selected for their potential to provide subsistence and promising commercial uses. In addition to offering socio-economic and environmental services, agroforestry initiatives using native trees and shrubs can play a crucial role in recovering elements of the lost ancient landscape that still forms part of local people's collective identity.
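As an illustration of freelist-based salience indices such as the Composite Salience index mentioned above, here is a sketch of Smith's salience index, a common building block of such measures; the exact formula used in the study may differ, and the freelists shown are invented:

```python
# Smith's salience: each species scores (L - r + 1) / L in a freelist of
# length L where it appears at rank r, averaged over all informants.
from collections import defaultdict

freelists = [  # invented example freelists from two informants
    ["Schinus molle", "Prosopis laevigata", "Eucalyptus globulus"],
    ["Eucalyptus globulus", "Schinus molle"],
]

scores = defaultdict(float)
for fl in freelists:
    L = len(fl)
    for r, species in enumerate(fl, start=1):
        scores[species] += (L - r + 1) / L

salience = {sp: s / len(freelists) for sp, s in scores.items()}
print(sorted(salience.items(), key=lambda kv: -kv[1]))
```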
Abstract:
Identifying drivers of species diversity is a major challenge in understanding and predicting the dynamics of species-rich semi-natural grasslands. In temperate grasslands in particular, changes in land use and their consequences, i.e. increasing fragmentation, the ongoing loss of habitat, and the declining importance of regional processes such as seed dispersal by livestock, are considered key drivers of the diversity loss witnessed within the last decades. It is a largely unresolved question to what degree current temperate grassland communities already reflect a decline of regional processes such as longer-distance seed dispersal. Answering this question is challenging since it requires both a mechanistic approach to community dynamics and sufficient data to identify general patterns. Here, we present results of a local individual- and trait-based community model that was initialized with plant functional types (PFTs) derived from an extensive empirical data set of species-rich grasslands within the 'Biodiversity Exploratories' in Germany. Driving model processes included above- and belowground competition, dynamic resource allocation to shoots and roots, clonal growth, grazing, and local seed dispersal. To test for the impact of regional processes we also simulated seed input from a regional species pool. Model output, with and without regional seed input, was compared with empirical community response patterns along a grazing gradient. Simulated response patterns of changes in PFT richness, Shannon diversity, and biomass production matched observed grazing response patterns surprisingly well when only local processes were considered. Even low levels of additional regional seed input led to stronger deviations from the empirical community patterns. While these findings cannot rule out that regional processes other than those considered in the modeling study play a role in shaping local grassland communities, our comparison indicates that European grasslands are largely isolated, i.e. local mechanisms explain observed community patterns to a large extent.
Abstract:
BACKGROUND & AIMS: Standardized instruments are needed to assess the activity of eosinophilic esophagitis (EoE) and to provide endpoints for clinical trials and observational studies. We aimed to develop and validate a patient-reported outcome (PRO) instrument and score, based on items that could account for variations in patients' assessments of disease severity. We also evaluated relationships between patients' assessment of disease severity and EoE-associated endoscopic, histologic, and laboratory findings. METHODS: We collected information from 186 patients with EoE in Switzerland and the US (69.4% male; median age, 43 years) via surveys (n = 135), focus groups (n = 27), and semi-structured interviews (n = 24). Items for the instrument to assess biologic activity were generated based on physician input. Linear regression was used to quantify the extent to which variations in patient-reported disease characteristics could account for variations in patients' assessment of EoE severity. The PRO instrument was prospectively used in 153 adult patients with EoE (72.5% male; median age, 38 years), and validated in an independent group of 120 patients with EoE (60.8% male; median age, 40.5 years). RESULTS: Seven PRO factors that assess characteristics of dysphagia, behavioral adaptations to living with dysphagia, and pain while swallowing accounted for 67% of the variation in patients' assessment of disease severity. Based on statistical considerations and patient input, a 7-day recall period was selected. Highly active EoE, based on endoscopic and histologic findings, was associated with an increase in patient-assessed disease severity. In the validation study, the mean difference between patient assessment of EoE severity and the PRO score was 0.13 (on a scale from 0 to 10). CONCLUSIONS: We developed and validated an EoE scoring system based on 7 PRO items that assesses symptoms over a 7-day recall period. Clinicaltrials.gov number: NCT00939263.
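The 67% figure above is the R² of a linear regression of patients' global severity rating on the candidate PRO items. A minimal sketch of that step, with hypothetical item and file names:

```python
# Fraction of variance in patient-assessed severity explained by PRO items.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("eoe_items.csv")  # hypothetical item-level data
items = ["dysphagia_freq", "dysphagia_severity", "avoid_solid_food",
         "eat_slowly", "drink_with_meals", "pain_swallowing", "modify_food"]

fit = sm.OLS(df["global_severity"], sm.add_constant(df[items])).fit()
print(f"R^2 = {fit.rsquared:.2f}")  # abstract reports 0.67 for 7 items
```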
Abstract:
Many techniques based on data drawn by the Ranked Set Sampling (RSS) scheme assume that the ranking of observations is perfect. It is therefore essential to develop methods for testing this assumption. In this article, we propose a parametric location-scale-free test for assessing the assumption of perfect ranking. The results of a simulation study in the two special cases of normal and exponential distributions indicate that the proposed test performs well in comparison with its leading competitors.
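To make the setting concrete, here is a sketch of how a ranked set sample is generated under the perfect-ranking assumption that the proposed test is designed to check (the test itself is not reproduced here):

```python
# Ranked set sampling under perfect ranking: in each cycle, draw k sets of
# k units, rank each set, and measure only the i-th ranked unit of set i.
import numpy as np

def rss_sample(draw, k, cycles, rng):
    out = []
    for _ in range(cycles):
        for i in range(k):
            ranked = np.sort(draw(k, rng))  # perfect ranking of one set
            out.append(ranked[i])           # keep the i-th order statistic
    return np.array(out)

rng = np.random.default_rng(1)
sample = rss_sample(lambda n, g: g.normal(size=n), k=3, cycles=50, rng=rng)
# Imperfect ranking distorts the order-statistic structure of the measured
# units; a perfect-ranking test looks for exactly such distortions.
```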
Abstract:
BACKGROUND AND PURPOSE: The precise mechanisms underlying the effectiveness of the stroke unit (SU) are not fully established. Studies that compare monitored stroke units (semi-intensive type, SI-SU) versus an intensive care unit (ICU)-based mobile stroke team (MST-ICU) are lacking. Although inequalities in access to stroke unit care are globally improving, acute stroke patients may be admitted to intensive care units for monitoring and followed by a mobile stroke team in hospitals lacking an SU with continuous cardiovascular monitoring. We aimed at comparing stroke outcome between SI-SU and MST-ICU and hypothesized that the benefits of the SI-SU are driven by elements other than cardiovascular monitoring, which is equally offered in both care systems. METHODS: In a single-center setting, we compared the unfavorable outcomes (dependency and mortality) at 3 months in consecutive patients with ischemic stroke or spontaneous intracerebral hemorrhage admitted to a stroke unit with semi-intensive monitoring (SI-SU) to a cohort of stroke patients hospitalized in an ICU and followed by a mobile stroke team (MST-ICU) during an equal observation period of 27 months. Secondary objectives included comparing mortality and the proportion of patients with excellent outcomes (modified Rankin Score [mRS] 0-1). Equal cardiovascular monitoring was offered to patients admitted in both the SI-SU and the MST-ICU. RESULTS: 458 patients treated in the SI-SU were compared to the MST-ICU (n = 370) cohort. The proportion of patients dead or dependent after 3 months was significantly lower in the SI-SU than in the MST-ICU (p < 0.001; aOR = 0.45; 95% CI: 0.31-0.65). Shift analysis of the mRS distribution showed a significant shift toward lower mRS scores in the SI-SU group (p < 0.001). Mortality at 3 months also differed between the MST-ICU and the SI-SU (p < 0.05), but after adjusting for confounders this association was not significant (aOR = 0.59; 95% CI: 0.31-1.13). The proportion of patients with excellent outcome was higher in the SI-SU (59.4% vs. 44.9%, p < 0.001), but the relationship was no longer significant after adjustment (aOR = 1.17; 95% CI: 0.87-1.5). CONCLUSIONS: Our study shows that moving from a stroke team in a monitored setting (ICU) to an organized stroke unit leads to a significant reduction in the 3-month unfavorable outcome in patients with an acute ischemic or hemorrhagic stroke. Cardiovascular monitoring is indispensable, but the benefits of a semi-intensive stroke unit are driven by additional elements beyond intensive cardiovascular monitoring. This observation supports the ongoing development of stroke centers for efficient stroke care.
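The shift analysis of the mRS distribution reported above is commonly modeled as an ordinal (proportional-odds) logistic regression. A minimal sketch with statsmodels' OrderedModel; the data file and covariate names are hypothetical:

```python
# Ordinal logistic regression for an mRS shift analysis (hypothetical names).
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("stroke_cohort.csv")  # hypothetical file; mrs coded 0-6

# No intercept column: the model's threshold parameters play that role.
mod = OrderedModel(df["mrs"], df[["si_su", "age", "nihss"]], distr="logit")
res = mod.fit(method="bfgs", disp=False)

# Common odds ratio for SI-SU admission; values < 1 indicate a shift toward
# lower (better) mRS scores relative to MST-ICU.
print(np.exp(res.params["si_su"]))
```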
Abstract:
Inflammation is one possible mechanism underlying the associations between mental disorders and cardiovascular diseases (CVD). However, studies on mental disorders and inflammation have yielded inconsistent results, and the majority did not adjust for potential confounding factors. We examined the associations of several pro-inflammatory cytokines (IL-1β, IL-6 and TNF-α) and high-sensitivity C-reactive protein (hsCRP) with lifetime and current mood, anxiety and substance use disorders (SUD), while adjusting for multiple covariates. The sample included 3719 subjects, randomly selected from the general population, who underwent thorough somatic and psychiatric evaluations. Psychiatric diagnoses were made with a semi-structured interview. Major depressive disorder was subtyped into "atypical", "melancholic", "combined atypical-melancholic" and "unspecified". Associations between inflammatory markers and psychiatric diagnoses were assessed using multiple linear and logistic regression models. Lifetime bipolar disorders and atypical depression were associated with increased levels of hsCRP, but not after multivariate adjustment. After multivariate adjustment, SUD remained associated with increased hsCRP levels in men (β = 0.13; 95% CI: 0.03, 0.23) but not in women. After multivariate adjustment, lifetime combined and unspecified depression were associated with decreased levels of IL-6 (β = -0.27 (-0.51, -0.02) and β = -0.19 (-0.34, -0.05), respectively) and TNF-α (β = -0.16 (-0.30, -0.01) and β = -0.10 (-0.19, -0.02), respectively), whereas current combined and unspecified depression were associated with decreased levels of hsCRP (β = -0.20 (-0.39, -0.02) and β = -0.12 (-0.24, -0.01), respectively). Our data suggest that the significant associations between increased hsCRP levels and mood disorders are mainly attributable to the effects of comorbid disorders, medication, and behavioral and physical cardiovascular risk factors (CVRFs).
Abstract:
Binding of CD47 to signal regulatory protein alpha (SIRPα), an inhibitory receptor, negatively regulates phagocytosis. In acute myeloid leukemia (AML), CD47 is overexpressed on peripheral blasts and leukemia stem cells and inversely correlates with survival. The aim of the study was to investigate the correlation between CD47 protein expression, assessed by immunohistochemistry (IHC) in a bone marrow (BM) tissue microarray (TMA), and clinical outcome in AML patients. CD47 staining on BM leukemia blasts was scored semi-quantitatively and correlated with clinical parameters and known prognostic factors in AML. Low (scores 0-2) and high (score 3) CD47 protein expression were observed in 75% and 25% of AML patients, respectively. CD47 expression correlated significantly with the percentage of BM blast infiltration and peripheral blood blasts. Moreover, high CD47 expression was associated with nucleophosmin (NPM1) gene mutations. In contrast, CD47 expression did not significantly correlate with overall or progression-free survival or with response to therapy. In summary, a BM TMA permits rapid and reproducible semi-quantitative analysis of CD47 protein expression by IHC. While CD47 expression on circulating AML blasts has been shown to be a negative prognostic marker for a well-defined population of patients with normal-karyotype (NK) AML, CD47 expression on AML BM blasts is not.
Abstract:
This paper discusses the effects of global change in African mountains, using the example of Mount Kenya. The geographical focus is the northwestern, semi-arid foot zone of the mountain (Laikipia District). Over the past 50 years, this area has experienced rapid and profound transformation through processes that are all linked to global change. The main driving forces behind these processes have been political and economic in nature; in recent years, an environmental factor has been added – climate change. After introducing the area of research, the paper presents three dimensions of global change that are manifested in the region and largely shape its development: socio-political change, economic change, and environmental change. For the regions northwest of Mount Kenya, climate models predict important changes in rainfall distribution that will have a profound impact on freshwater availability and management. The results presented here are based on research undertaken northwest of Mount Kenya within the framework of a series of long-term Kenyan-Swiss research programmes that began in the early 1980s.