920 results for nonparametric inference
Abstract:
Citalopram is a chiral antidepressant drug. Its eutomer, S-citalopram (escitalopram), has recently been introduced as an antidepressant. In an open pilot study, four outpatients and two inpatients with a major depressive episode (ICD-10), who were nonresponders to a 4-week pretreatment with 40-60 mg/day citalopram, were comedicated for another 4-week period with carbamazepine (200-400 mg/day). Some of the patients also had comorbidities: phobic anxiety disorder with panic attacks (n=2), generalised anxiety disorder, alcohol abuse, dependent personality disorder, and hypertension (n=1). After the 4-week augmentation therapy with carbamazepine, a significant (P<0.03) decrease of the plasma concentrations of S-citalopram and R-citalopram, by 27 and 31%, respectively, was observed. Apparently, the probable induction of CYP3A4 by carbamazepine results in a nonstereoselective increase in the N-demethylation of citalopram. Moreover, there was a significant (P<0.03) decrease of the S/R ratio of the citalopram propionic acid derivative, whose formation is partly regulated by MAO-A and MAO-B. Within 1 week after the addition of carbamazepine, there was already a slight but significant (P<0.03) decrease of the MADRS depression scores, from 27.0+/-7.7 (mean+/-S.D.) to 23.3+/-6.6, and the final score on day 56 was 18.8+/-10.9. The treatment was generally well tolerated, and there was no evidence of a serotonin syndrome. After augmentation with carbamazepine, treatment-related adverse events were nausea in one case, diarrhea in one case, and rash in two cases. In conclusion, the results of this pilot study suggest that carbamazepine augmentation of citalopram treatment in previous nonresponders to citalopram may be clinically useful, but that carbamazepine can also lead to a decrease of the plasma concentrations of the active enantiomer, escitalopram.
Abstract:
Introduction: The thalidomide-dexamethasone (TD) regimen has provided encouraging results in relapsed MM. To improve results, bortezomib (Velcade) has been added to the combination in previous phase II studies, the so-called VTD regimen. In January 2006, the European Group for Blood and Marrow Transplantation (EBMT) and the Intergroupe Francophone du Myélome (IFM) initiated a prospective, randomized, parallel-group, open-label, multicenter phase III study comparing VTD (arm A) with TD (arm B) for MM patients progressing or relapsing after autologous transplantation. Patients and Methods: Inclusion criteria: patients in first progression or relapse after at least one autologous transplantation, including those who had received bortezomib or thalidomide before transplant. Exclusion criteria: subjects with neuropathy above grade 1 or nonsecretory MM. The primary study end point was time to progression (TTP). Secondary end points included safety, response rate, progression-free survival (PFS) and overall survival (OS). Treatment was scheduled as follows: bortezomib 1.3 mg/m2 was given as an i.v. bolus on days 1, 4, 8 and 11, followed by a 10-day rest period (days 12 to 21), for 8 cycles (6 months), and then on days 1, 8, 15 and 22, followed by a 20-day rest period (days 23 to 42), for 4 cycles (6 months). In both arms, thalidomide was scheduled at 200 mg/day orally for one year and dexamethasone at 40 mg/day orally four days every three weeks for one year. Patients reaching remission could proceed to a new stem cell harvest. However, transplantation, either autologous or allogeneic, could only be performed in patients who completed the planned one-year treatment period. Response was assessed by EBMT criteria, with the additional category of near complete remission (nCR). Adverse events were graded by the NCI-CTCAE, Version 3.0. The trial was based on a group sequential design, with 4 planned interim analyses and one final analysis, allowing stopping for efficacy as well as futility. The overall alpha and power were set equal to 0.025 and 0.90, respectively. The test for decision making was based on the comparison of the ratio of the cause-specific hazards of relapse/progression, estimated in a Cox model stratified on the number of previous autologous transplantations. The relapse/progression cumulative incidence was estimated using the proper nonparametric estimator, and the comparison was done by the Gray test. PFS and OS probabilities were estimated by Kaplan-Meier curves, and the comparison was performed by the log-rank test. An interim safety analysis was performed when the first hundred patients had been included; the safety committee recommended continuing the trial. Results: As of 1st July 2010, 269 patients had been enrolled in the study: 139 in France (IFM 2005-04 study), 21 in Italy, 38 in Germany, 19 in Switzerland (a SAKK study), 23 in Belgium, 8 in Austria, 8 in the Czech Republic, 11 in Hungary, 1 in the UK and 1 in Israel. One hundred and sixty-nine patients were males and 100 females; the median age was 61 yrs (range 29-76). One hundred and thirty-six patients were randomized to receive VTD and 133 to receive TD. The current analysis is based on 246 patients (124 in arm A, 122 in arm B) included in the second interim analysis, carried out when 134 events had been observed. Following this analysis, the trial was stopped because of the significant superiority of VTD over TD. The data from the remaining patients were too immature to contribute to the analysis.
The number of previous autologous transplants was one in 63 vs 60 patients and two or more in 61 vs 62 patients in arms A vs B, respectively. The median follow-up was 25 months. The median TTP was 20 months vs 15 months in arms A and B, respectively, with a cumulative incidence of relapse/progression at 2 years of 52% (95% CI: 42%-64%) vs 70% (95% CI: 61%-81%) (p=0.0004, Gray test). The same superiority of arm A was also observed when stratifying on the number of previous autologous transplantations. At 2 years, PFS was 39% (95% CI: 30%-51%) vs 23% (95% CI: 16%-34%) (A vs B, p=0.0006, log-rank test). OS in the first two years was comparable in the two groups. Conclusion: VTD resulted in significantly longer TTP and PFS in patients relapsing after ASCT. Analyses of response and safety data are ongoing, and results will be presented at the meeting. Protocol EU-DRACT number: 2005-001628-35.
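The survival comparisons described in this abstract (Kaplan-Meier estimation with a log-rank test; the Gray test for cumulative incidence is typically run with the R cmprsk package and has no direct equivalent here) can be outlined in Python with the lifelines library. This is a hedged sketch on hypothetical follow-up data, not the trial dataset:

```python
# Minimal sketch: Kaplan-Meier curves and a log-rank test for two arms (hypothetical data).
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
# Hypothetical times to progression (months) and event indicators (1 = event observed).
t_a, e_a = rng.exponential(20.0, 124), rng.integers(0, 2, 124)
t_b, e_b = rng.exponential(15.0, 122), rng.integers(0, 2, 122)

km_a = KaplanMeierFitter().fit(t_a, e_a, label="VTD (arm A)")
km_b = KaplanMeierFitter().fit(t_b, e_b, label="TD (arm B)")
print(km_a.median_survival_time_, km_b.median_survival_time_)  # median TTP per arm

# Log-rank comparison of the two survival curves.
res = logrank_test(t_a, t_b, event_observed_A=e_a, event_observed_B=e_b)
print(res.p_value)
```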
Abstract:
The Gene Ontology (GO) (http://www.geneontology.org) is a community bioinformatics resource that represents gene product function through the use of structured, controlled vocabularies. The number of GO annotations of gene products has increased due to curation efforts among GO Consortium (GOC) groups, including focused literature-based annotation and ortholog-based functional inference. The GO ontologies continue to expand and improve as a result of targeted ontology development, including the introduction of computable logical definitions and development of new tools for the streamlined addition of terms to the ontology. The GOC continues to support its user community through the use of e-mail lists, social media and web-based resources.
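As a hedged illustration of how the structured, controlled vocabulary can be used programmatically, the sketch below loads the ontology with the goatools Python package and walks a term's parent relationships. The ontology file name and the example term ID are assumptions for illustration; the GOC also provides web and API access routes.

```python
# Minimal sketch (assumes go-basic.obo has already been downloaded from geneontology.org).
from goatools.obo_parser import GODag

godag = GODag("go-basic.obo")      # parse the ontology into a DAG of GO terms
term = godag["GO:0006915"]         # illustrative term ID (apoptotic process)
print(term.name, term.namespace)

# Walk up the is_a hierarchy to show the "structured" aspect of the vocabulary.
for parent in term.parents:
    print("parent:", parent.name)
```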
Abstract:
BACKGROUND: In alcohol withdrawal, fixed doses of benzodiazepine are generally recommended as a first-line pharmacologic approach. This study determines the benefits of an individualized treatment regimen on the quantity of benzodiazepine administered and the duration of its use during alcohol withdrawal treatment. METHODS: We conducted a prospective, randomized, double-blind, controlled trial including 117 consecutive patients with alcohol dependence, according to the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, entering an alcohol treatment program at both the Lausanne and Geneva university hospitals, Switzerland. Patients were randomized into 2 groups: (1) 56 were treated with oxazepam in response to the development of signs of alcohol withdrawal (symptom-triggered); and (2) 61 were treated with oxazepam every 6 hours with additional doses as needed (fixed-schedule). The administration of oxazepam in group 1 and additional oxazepam in group 2 was determined using a standardized measure of alcohol withdrawal. The main outcome measures were the total amount and duration of treatment with oxazepam, the incidence of complications, and the comfort level. RESULTS: A total of 22 patients (39%) in the symptom-triggered group were treated with oxazepam vs 100% in the fixed-schedule group (P<.001). The mean oxazepam dose administered in the symptom-triggered group was 37.5 mg compared with 231.4 mg in the fixed-schedule group (P<.001). The mean duration of oxazepam treatment was 20.0 hours in the symptom-triggered group vs 62.7 hours in the fixed-schedule group (P<.001). Withdrawal complications were limited to a single episode of seizures in the symptom-triggered group. There were no differences in the measures of comfort between the 2 groups. CONCLUSIONS: Symptom-triggered benzodiazepine treatment for alcohol withdrawal is safe, comfortable, and associated with a decrease in the quantity of medication and duration of treatment.
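The contrast between the two dosing regimens can be sketched as simple decision logic. The withdrawal-score cutoff and the per-administration dose below are hypothetical placeholders; the study used a standardized withdrawal measure, but the abstract does not report the exact threshold or dose steps.

```python
# Illustrative sketch only: symptom-triggered vs fixed-schedule oxazepam dosing.
# THRESHOLD and DOSE_MG are hypothetical values, not taken from the study protocol.
THRESHOLD = 8    # hypothetical withdrawal-score cutoff
DOSE_MG = 15     # hypothetical oxazepam dose per administration

def symptom_triggered_dose(withdrawal_score: int) -> int:
    """Give oxazepam only when the standardized withdrawal score crosses the cutoff."""
    return DOSE_MG if withdrawal_score >= THRESHOLD else 0

def fixed_schedule_dose(hour: int, withdrawal_score: int) -> int:
    """Give oxazepam every 6 hours, plus an extra dose when symptoms break through."""
    scheduled = DOSE_MG if hour % 6 == 0 else 0
    extra = DOSE_MG if withdrawal_score >= THRESHOLD else 0
    return scheduled + extra

print(symptom_triggered_dose(5), fixed_schedule_dose(6, 5))  # 0 mg vs 15 mg for a low score
```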
Abstract:
Introduction: Schizophrenia is associated with multiple neuropsychological dysfunctions, such as disturbances of attention, memory, perceptual functioning, concept formation and executive processes. These cognitive functions are reported to depend on the integrity of the prefrontal and thalamo-prefrontal circuits. Multiple lines of evidence suggest that schizophrenia is related to abnormalities in neural circuitry and impaired structural connectivity. Here, we report a preliminary case-control study that showed a correlation between thalamo-frontal connections and several cognitive functions known to be impaired in schizophrenia. Materials and Methods: We investigated 9 schizophrenic patients (DSM-IV criteria, Diagnostic Interview for Genetic Studies) and 9 age- and sex-matched control subjects. We obtained from each volunteer a DT-MRI dataset (3 T, b = 1,000 s/mm2) and a high-resolution anatomical T1 image. The thalamo-frontal tracts were simulated with DTI tractography on these datasets, a method allowing inference of the main neural fiber tracts from diffusion MRI data. In order to test for a possible correlation with the thalamo-frontal connections, every subject performed a battery of neuropsychological tests including computerized tests of attention (sustained attention, selective attention and reaction time), working memory tests (Plane test and the working memory sub-tests of the Wechsler Adult Intelligence Scale), an executive functioning task (Tower of Hanoï) and a test of visual binding abilities. Results: In a pilot case-control study (patients: n = 9; controls: n = 9), we showed that this methodology is appropriate and gives results in the expected range. Considering the relation between connectivity density and the neuropsychological data, a correlation between the number of thalamo-frontal fibers and the performance in the Tower of Hanoï was observed in the patients (Pearson correlation, r = 0.76, p < 0.05) but not in control subjects. In the most difficult item of the test, the lowest number of fibers corresponded to the worst performance on the test (fig. 2, number of supplementary movements of the elements necessary to reach the correct configuration). It is interesting to note that, in an independent study, we showed that schizophrenia patients (n = 32) performed significantly worse than control subjects (n = 29) on the most difficult item of the Tower of Hanoï (Mann-Whitney, p < 0.005). This has been observed in several other neuropsychological studies. Discussion: This pilot study of schizophrenia patients shows a correlation between the number of thalamo-frontal fibers and the performance in the Tower of Hanoï, a planning and goal-oriented action task known to be associated with frontal dysfunction. This observation is consistent with the proposed impaired connectivity in schizophrenia. We aim to pursue the study with a larger sample in order to determine whether other neuropsychological tests may be associated with connectivity density.
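The statistics reported above can be reproduced in outline with SciPy; the arrays below are hypothetical placeholders, not the study data, and are only meant to show how the Pearson correlation and the Mann-Whitney comparison are computed.

```python
# Sketch of the reported analyses on placeholder data (n = 9 per group for the correlation).
import numpy as np
from scipy.stats import pearsonr, mannwhitneyu

rng = np.random.default_rng(1)
fiber_counts = rng.integers(200, 800, 9)                          # hypothetical thalamo-frontal fiber counts
extra_moves = 40 - 0.04 * fiber_counts + rng.normal(0, 3, 9)      # hypothetical supplementary moves (fewer fibers -> more errors)

r, p = pearsonr(fiber_counts, extra_moves)                        # Pearson correlation, as in the patient group
print(f"r = {r:.2f}, p = {p:.3f}")

# Nonparametric group comparison, as used for patients vs controls on the hardest item.
patients = rng.normal(12, 4, 32)
controls = rng.normal(8, 4, 29)
u, p_mw = mannwhitneyu(patients, controls, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, p = {p_mw:.4f}")
```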
Abstract:
BACKGROUND: Finding genes that are differentially expressed between conditions is an integral part of understanding the molecular basis of phenotypic variation. In the past decades, DNA microarrays have been used extensively to quantify the abundance of mRNA corresponding to different genes, and more recently high-throughput sequencing of cDNA (RNA-seq) has emerged as a powerful competitor. As the cost of sequencing decreases, it is conceivable that the use of RNA-seq for differential expression analysis will increase rapidly. To exploit the possibilities and address the challenges posed by this relatively new type of data, a number of software packages have been developed especially for differential expression analysis of RNA-seq data. RESULTS: We conducted an extensive comparison of eleven methods for differential expression analysis of RNA-seq data. All methods are freely available within the R framework and take as input a matrix of counts, i.e. the number of reads mapping to each genomic feature of interest in each of a number of samples. We evaluate the methods based on both simulated data and real RNA-seq data. CONCLUSIONS: Very small sample sizes, which are still common in RNA-seq experiments, impose problems for all evaluated methods and any results obtained under such conditions should be interpreted with caution. For larger sample sizes, the methods combining a variance-stabilizing transformation with the 'limma' method for differential expression analysis perform well under many different conditions, as does the nonparametric SAMseq method.
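As a hedged, much-simplified illustration of nonparametric differential expression testing on a count matrix (not a reimplementation of any of the eleven R-based packages compared in the study), the sketch below applies a per-gene Mann-Whitney test with Benjamini-Hochberg multiple-testing correction to simulated counts.

```python
# Simplified sketch: rank-based per-gene test on a counts matrix (genes x samples).
import numpy as np
from scipy.stats import mannwhitneyu
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(42)
n_genes, n_per_group = 500, 5
counts = rng.negative_binomial(n=10, p=0.3, size=(n_genes, 2 * n_per_group))  # simulated read counts
group = np.array([0] * n_per_group + [1] * n_per_group)                       # two-condition design

# One nonparametric test per genomic feature, comparing the two conditions.
pvals = np.array([
    mannwhitneyu(row[group == 0], row[group == 1], alternative="two-sided").pvalue
    for row in counts
])
reject, padj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} genes called differentially expressed at FDR 0.05")
```

With only five samples per group, as this example deliberately uses, rank-based tests have very little power, which is one concrete reason the abstract warns against over-interpreting results from very small RNA-seq experiments.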
Abstract:
Uncertainty quantification of petroleum reservoir models is one of the present challenges, which is usually approached with a wide range of geostatistical tools linked with statistical optimisation and/or inference algorithms. The paper considers a data-driven approach to modelling uncertainty in spatial predictions. The proposed semi-supervised Support Vector Regression (SVR) model has demonstrated its capability to represent realistic features and to describe the stochastic variability and non-uniqueness of spatial properties. It is able to capture and preserve key spatial dependencies such as connectivity, which is often difficult to achieve with two-point geostatistical models. Semi-supervised SVR is designed to integrate various kinds of conditioning data and to learn dependencies from them. A stochastic semi-supervised SVR model is integrated into a Bayesian framework to quantify uncertainty with multiple models fitted to dynamic observations. The developed approach is illustrated with a reservoir case study. The resulting probabilistic production forecasts are described by uncertainty envelopes.
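The supervised core of such an approach can be sketched with scikit-learn's SVR. The semi-supervised extension and the Bayesian history-matching loop described in the paper are not reproduced here, and the well-log-style conditioning data below are hypothetical:

```python
# Sketch of the supervised SVR core on hypothetical spatial conditioning data.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(7)
wells_xy = rng.uniform(0, 1000, size=(30, 2))                              # hypothetical well locations (m)
porosity = 0.2 + 0.05 * np.sin(wells_xy[:, 0] / 200) + rng.normal(0, 0.01, 30)  # hypothetical property values

# epsilon-SVR with an RBF kernel interpolates the property between conditioning points.
model = SVR(kernel="rbf", C=10.0, epsilon=0.005, gamma="scale").fit(wells_xy, porosity)

# Predict the property over a regular grid covering the model area.
grid = np.stack(np.meshgrid(np.linspace(0, 1000, 50), np.linspace(0, 1000, 50)), axis=-1).reshape(-1, 2)
pred = model.predict(grid)
print(pred.min(), pred.max())
```

In the paper's setting, many such stochastic model realisations would then be weighted against dynamic production observations within the Bayesian framework to produce the uncertainty envelopes.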
Abstract:
An enormous burst of interest in the public health burden of chronic disease in Africa has emerged as a consequence of efforts to estimate global population health. Detailed estimates are now published for Africa as a whole and for each country on the continent. These data have formed the basis for warnings about sharp increases in cardiovascular disease (CVD) in the coming decades. In this essay we briefly examine the trajectory of social development on the continent and its consequences for the epidemiology of CVD and potential control strategies. Since full vital registration has only been implemented in segments of South Africa and the island nations of Seychelles and Mauritius - formally part of WHO-AFRO - mortality data are extremely limited. Numerous sample surveys have been conducted, but they often lack standardization or objective measures of health status. Trend data are even less informative. However, using the best-quality data available, age-standardized trends in CVD are downward, and in the case of stroke, sharply so. While acknowledging that the extremely limited available data cannot be used as the basis for inference to the continent, we raise the concern that general estimates based on imputation to fill in the missing mortality tables may be even more misleading. No immediate remedies to this problem can be identified; however, bilateral collaborative efforts to strengthen local educational institutions and governmental agencies rank as the highest priority for near-term development.
Abstract:
An effect of subthalamic nucleus deep brain stimulation (STN-DBS) on cognition has been suspected, but long-term observations are lacking. The aim of this study was to evaluate the long-term cognitive profile and the incidence of dementia in a cohort of Parkinson's disease (PD) patients treated by STN-DBS. 57 consecutive patients were prospectively assessed by means of a neuropsychological battery over 3 years after surgery. Dementia (DSM-IV) and UPDRS I to IV were recorded. 24.5% of patients converted to dementia over 3 years (incidence of 89 per 1,000 per year). This group of patients worsened cognitively throughout the 3 years, up to fulfilling dementia criteria (PDD). The rest of the cohort remained cognitively stable (PD) over the whole follow-up. Preoperative differences between PDD and PD included older age (69.2 +/- 5.8 years vs 62.6 +/- 8 years), presence of hallucinations, and a poorer executive score (10.1 +/- 5.9 vs 5.5 +/- 4.4). The incidence of dementia over 3 years after STN-DBS is similar to the one reported in medically treated patients. The PDD group presented preoperative risk factors for developing dementia similar to those described in medically treated patients. These observations suggest that dementia is secondary to the natural evolution of PD rather than a direct effect of STN-DBS.
Abstract:
Modern cochlear implantation technologies allow deaf patients to understand auditory speech; however, the implants deliver only a coarse auditory input, and patients must rely on long-term adaptive processes to achieve coherent percepts. In adults with post-lingual deafness, the greatest progress in speech recovery is observed during the first year after cochlear implantation, but there is a large range of variability in the level of cochlear implant outcomes and in the temporal evolution of recovery. It has been proposed that when profoundly deaf subjects receive a cochlear implant, the visual cross-modal reorganization of the brain is deleterious for auditory speech recovery. We tested this hypothesis in post-lingually deaf adults by analysing whether brain activity shortly after implantation correlated with the level of auditory recovery 6 months later. Based on brain activity induced by a speech-processing task, we found strong positive correlations in areas outside the auditory cortex. The highest positive correlations were found in the occipital cortex involved in visual processing, as well as in the posterior-temporal cortex known for audio-visual integration. The other area that positively correlated with auditory speech recovery was localized in the left inferior frontal area known for speech processing. Our results demonstrate that the functional level of the visual modality is related to the proficiency of auditory recovery. Based on the positive correlation of visual activity with auditory speech recovery, we suggest that the visual modality may facilitate the perception of a word's auditory counterpart in communicative situations. The link demonstrated between visual activity and auditory speech perception indicates that visuoauditory synergy is crucial for cross-modal plasticity and for fostering speech-comprehension recovery in adult cochlear-implanted deaf patients.
Abstract:
To test whether quantitative traits are under directional or homogenizing selection, it is common practice to compare population differentiation estimates at molecular markers (F(ST)) and quantitative traits (Q(ST)). If the trait is neutral and its genetic determination is additive, then theory predicts that Q(ST) = F(ST), while Q(ST) > F(ST) is predicted under directional selection for different local optima, and Q(ST) < F(ST) is predicted under homogenizing selection. However, nonadditive effects can alter these predictions. Here, we investigate the influence of dominance on the relation between Q(ST) and F(ST) for neutral traits. Using analytical results and computer simulations, we show that dominance generally deflates Q(ST) relative to F(ST). Under inbreeding, the effect of dominance vanishes, and we show that for selfing species, a better estimate of Q(ST) is obtained from selfed families than from half-sib families. We also compare several sampling designs and find that it is always better to sample many populations (>20) with few families (five) than few populations with many families. Provided that estimates of Q(ST) are derived from individuals originating from many populations, we conclude that the pattern Q(ST) > F(ST), and hence the inference of directional selection for different local optima, is robust to the effect of nonadditive gene action.
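For orientation, the quantity compared with F(ST) is usually defined as follows for an outcrossing species with purely additive variance; this is standard background rather than material from the abstract, and the dominance and inbreeding effects analysed in the study act through these variance components:

$$Q_{ST} = \frac{\sigma^{2}_{GB}}{\sigma^{2}_{GB} + 2\,\sigma^{2}_{GW}}$$

where $\sigma^{2}_{GB}$ and $\sigma^{2}_{GW}$ are the between- and within-population additive genetic variances of the trait, and $F_{ST}$ is the analogous fixation index estimated from the neutral molecular markers.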
Abstract:
Macrophage migration inhibitory factor (MIF) is a proinflammatory cytokine produced by many cells and tissues including pancreatic beta-cells, liver, skeletal muscle, and adipocytes. This study investigates the potential role of MIF in carbohydrate homeostasis in a physiological setting outside of severe inflammation, utilizing Mif knockout (MIF-/-) mice. Compared with wild-type (WT) mice, MIF-/- mice had a lower body weight, from birth until 4 months of age, but subsequently gained weight faster, resulting in a higher body weight at 12 months of age. The lower weight in young mice was related to a higher energy expenditure, and the higher weight in older mice was related to an increased food intake and a higher fat mass. Fasting blood insulin level was higher in MIF-/- mice compared with WT mice at any age. After i.p. glucose injection, the elevation of blood insulin level was higher in MIF-/- mice compared with WT mice, at 2 months of age, but was lower in 12-month-old MIF-/- mice. As a result, the glucose clearance during intraperitoneal glucose tolerance tests was higher in MIF-/- mice compared with WT mice until 4 months of age, and was lower in 12-month-old MIF-/- mice. Insulin resistance was estimated (euglycemic-hyperinsulinemic clamp tests), and the phosphorylation activity of AKT was similar in MIF-/- mice and WT mice. In conclusion, this mouse model provides evidence for the role of MIF in the control of glucose homeostasis.
Abstract:
Background and Aims Paleoclimatic data indicate that an abrupt climate change occurred at the Eocene-Oligocene (E-O) boundary, affecting the distribution of tropical forests on Earth. The same period saw the emergence of South-East (SE) Asia, caused by the collision of the Eurasian and Australian plates. How the combination of these climatic and geomorphological factors affected the spatio-temporal history of angiosperms is little known. This topic is investigated here by using the worldwide sapindaceous clade as a case study. Methods Analyses of divergence time inference, diversification and biogeography (constrained by paleogeography) are applied to a combined plastid and nuclear DNA sequence data set. Biogeographical and diversification analyses are performed over a set of trees to take phylogenetic and dating uncertainty into account. Results are analysed in the context of past climatic fluctuations. Key Results An increase in the number of dispersal events at the E-O boundary is recorded, which intensified during the Miocene. This pattern is associated with a higher rate of emergence of new genera. These results are discussed in light of the geomorphological importance of SE Asia, which acted as a tropical bridge allowing multiple contacts between areas and additional speciation across landmasses derived from Laurasia and Gondwana. Conclusions This study demonstrates the importance of the combined effect of geomorphological (the emergence of most islands in SE Asia approx. 30 million years ago) and climatic (the dramatic E-O climate change that shifted the tropical belt and reduced sea levels) factors in shaping species distribution within the sapindaceous clade.
Abstract:
The trabecular bone score (TBS) is a new parameter that is determined from gray-level analysis of dual-energy X-ray absorptiometry (DXA) images. It relies on the mean thickness and volume fraction of trabecular bone microarchitecture. This was a preliminary case-control study to evaluate the potential diagnostic value of TBS as a complement to bone mineral density (BMD), by comparing postmenopausal women with and without fractures. The sample consisted of 45 women with osteoporotic fractures (5 hip fractures, 20 vertebral fractures, and 20 other types of fracture) and 155 women without a fracture. Stratification was performed, taking into account each type of fracture (except hip), and women with and without fractures were matched for age and spine BMD. BMD and TBS were measured at the total spine. TBS measured at the total spine revealed a significant difference between the fracture and age- and spine BMD-matched nonfracture group, when considering all types of fractures and vertebral fractures. In these cases, the diagnostic value of the combination of BMD and TBS likely will be higher compared with that of BMD alone. TBS, as evaluated from standard DXA scans directly, potentially complements BMD in the detection of osteoporotic fractures. Prospective studies are necessary to fully evaluate the potential role of TBS as a complementary risk factor for fracture.
Abstract:
PURPOSE: To evaluate the effect of a real-time adaptive trigger delay on image quality to correct for heart rate variability in 3D whole-heart coronary MR angiography (MRA). MATERIALS AND METHODS: Twelve healthy adults underwent 3D whole-heart coronary MRA with and without the use of an adaptive trigger delay. The moment of minimal coronary artery motion was visually determined on a high temporal resolution MRI. Throughout the scan performed without adaptive trigger delay, trigger delay was kept constant, whereas during the scan performed with adaptive trigger delay, trigger delay was continuously updated after each RR-interval using physiological modeling. Signal-to-noise, contrast-to-noise, vessel length, vessel sharpness, and subjective image quality were compared in a blinded manner. RESULTS: Vessel sharpness improved significantly for the middle segment of the right coronary artery (RCA) with the use of the adaptive trigger delay (52.3 +/- 7.1% versus 48.9 +/- 7.9%, P = 0.026). Subjective image quality was significantly better in the middle segments of the RCA and left anterior descending artery (LAD) when the scan was performed with adaptive trigger delay compared to constant trigger delay. CONCLUSION: Our results demonstrate that the use of an adaptive trigger delay to correct for heart rate variability improves image quality mainly in the middle segments of the RCA and LAD.