909 results for Nonparametric Bayes
Abstract:
When preparing an article on image restoration in astronomy, some topics obviously have to be dropped to keep the work at a reasonable length. We have decided to concentrate on image and noise models and on the algorithms used to compute the restoration. Topics like parameter estimation and stopping rules are also commented on. We start by describing the Bayesian paradigm and then proceed to study the noise and blur models used by the astronomical community. The prior models used to restore astronomical images are then examined, and we describe the algorithms used to find the restoration for the most common combinations of degradation and image models. We then comment on important issues such as acceleration of algorithms, stopping rules, and parameter estimation, and on the huge amount of information available to, and made available by, the astronomical community.
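For context, one of the most common degradation/algorithm pairings in this literature is Poisson (photon-counting) noise with a known blur, for which the Richardson-Lucy iteration is the classical maximum-likelihood restoration. Below is a minimal sketch; the Gaussian PSF and point-source test image are toy assumptions for illustration, not taken from the article.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, n_iter=30):
    """Richardson-Lucy iteration: maximum-likelihood restoration under
    Poisson noise with a known point-spread function (PSF)."""
    psf_mirror = psf[::-1, ::-1]               # correlation = convolution with flipped PSF
    estimate = np.full(observed.shape, observed.mean())
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)   # guard against division by zero
        estimate = estimate * fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Toy example: a point source blurred by a Gaussian PSF, then restored.
yy, xx = np.mgrid[-3:4, -3:4]
psf = np.exp(-0.5 * (xx**2 + yy**2)); psf /= psf.sum()
truth = np.zeros((32, 32)); truth[16, 16] = 100.0
observed = np.clip(fftconvolve(truth, psf, mode="same"), 0, None)
restored = richardson_lucy(observed, psf)
print(restored.max())  # flux re-concentrates near the original point source
```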
Abstract:
An effect of subthalamic nucleus deep brain stimulation (STN-DBS) on cognition has been suspected, but long-term observations are lacking. The aim of this study was to evaluate the long-term cognitive profile and the incidence of dementia in a cohort of Parkinson's disease (PD) patients treated by STN-DBS. 57 consecutive patients were prospectively assessed by means of a neuropsychological battery over 3 years after surgery. Dementia (DSM-IV) and UPDRS I to IV were recorded. 24.5% of patients converted to dementia over 3 years (an incidence of 89 per 1,000 per year). This group of patients worsened cognitively throughout the 3 years, eventually fulfilling dementia criteria (PDD). The rest of the cohort remained cognitively stable (PD) over the whole follow-up. Preoperative differences between the PDD and PD groups included older age (69.2 +/- 5.8 years vs. 62.6 +/- 8 years), presence of hallucinations, and poorer executive score (10.1 +/- 5.9 vs. 5.5 +/- 4.4). The incidence of dementia over the 3 years after STN-DBS is similar to that reported in medically treated patients, and the PDD group presented preoperative risk factors for dementia similar to those described in medically treated patients. These observations suggest that dementia is secondary to the natural evolution of PD rather than a direct effect of STN-DBS.
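As a consistency check (assuming a constant annual hazard, an assumption not stated in the abstract), the reported annual incidence of 89 per 1,000 reproduces the 3-year cumulative conversion rate:

\[
1 - (1 - 0.089)^{3} \approx 1 - 0.756 = 0.244 \approx 24.5\%
\]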
Abstract:
Modern cochlear implantation technologies allow deaf patients to understand auditory speech; however, the implants deliver only a coarse auditory input, and patients must rely on long-term adaptive processes to achieve coherent percepts. In adults with post-lingual deafness, most of the progress in speech recovery is observed during the first year after cochlear implantation, but there is large variability in both the level of cochlear implant outcomes and the temporal evolution of recovery. It has been proposed that when profoundly deaf subjects receive a cochlear implant, visual cross-modal reorganization of the brain is deleterious for auditory speech recovery. We tested this hypothesis in post-lingually deaf adults by analysing whether brain activity shortly after implantation correlated with the level of auditory recovery 6 months later. Based on brain activity induced by a speech-processing task, we found strong positive correlations in areas outside the auditory cortex. The highest positive correlations were found in the occipital cortex involved in visual processing, as well as in the posterior-temporal cortex known for audio-visual integration. A further positive correlation with auditory speech recovery was localized in the left inferior frontal area, known for speech processing. Our results demonstrate that the functional level of the visual modality is related to the proficiency of auditory recovery. Based on the positive correlation of visual activity with auditory speech recovery, we suggest that the visual modality may facilitate the perception of a word's auditory counterpart in communicative situations. The link demonstrated between visual activity and auditory speech perception indicates that visuoauditory synergy is crucial for cross-modal plasticity and for fostering speech-comprehension recovery in adult cochlear-implanted deaf patients.
Abstract:
Macrophage migration inhibitory factor (MIF) is a proinflammatory cytokine produced by many cells and tissues, including pancreatic beta-cells, liver, skeletal muscle, and adipocytes. This study investigates the potential role of MIF in carbohydrate homeostasis in a physiological setting outside of severe inflammation, utilizing Mif knockout (MIF-/-) mice. Compared with wild-type (WT) mice, MIF-/- mice had a lower body weight from birth until 4 months of age, but subsequently gained weight faster, resulting in a higher body weight at 12 months of age. The lower weight in young mice was related to a higher energy expenditure, and the higher weight in older mice was related to an increased food intake and a higher fat mass. The fasting blood insulin level was higher in MIF-/- mice than in WT mice at all ages. After i.p. glucose injection, the rise in blood insulin level was higher in MIF-/- mice than in WT mice at 2 months of age, but lower in 12-month-old MIF-/- mice. As a result, glucose clearance during intraperitoneal glucose tolerance tests was higher in MIF-/- mice than in WT mice until 4 months of age, and lower in 12-month-old MIF-/- mice. Insulin resistance was estimated by euglycemic-hyperinsulinemic clamp tests, and the phosphorylation activity of AKT was similar in MIF-/- and WT mice. In conclusion, this mouse model provides evidence for a role of MIF in the control of glucose homeostasis.
Abstract:
The trabecular bone score (TBS) is a new parameter determined from gray-level analysis of dual-energy X-ray absorptiometry (DXA) images. It relies on the mean thickness and volume fraction of the trabecular bone microarchitecture. This was a preliminary case-control study to evaluate the potential diagnostic value of TBS as a complement to bone mineral density (BMD), by comparing postmenopausal women with and without fractures. The sample consisted of 45 women with osteoporotic fractures (5 hip fractures, 20 vertebral fractures, and 20 other types of fracture) and 155 women without a fracture. Stratification was performed taking into account each type of fracture (except hip), and women with and without fractures were matched for age and spine BMD. BMD and TBS were measured at the total spine. TBS revealed a significant difference between the fracture group and the age- and spine BMD-matched nonfracture group when considering all types of fractures and vertebral fractures. In these cases, the diagnostic value of the combination of BMD and TBS is likely to be higher than that of BMD alone. TBS, as evaluated directly from standard DXA scans, potentially complements BMD in the detection of osteoporotic fractures. Prospective studies are necessary to fully evaluate the potential role of TBS as a complementary risk factor for fracture.
Abstract:
PURPOSE: To evaluate the effect on image quality of a real-time adaptive trigger delay that corrects for heart rate variability in 3D whole-heart coronary MR angiography (MRA). MATERIALS AND METHODS: Twelve healthy adults underwent 3D whole-heart coronary MRA with and without the use of an adaptive trigger delay. The moment of minimal coronary artery motion was visually determined on a high-temporal-resolution MRI. Throughout the scan performed without adaptive trigger delay, the trigger delay was kept constant, whereas during the scan performed with adaptive trigger delay, the trigger delay was continuously updated after each RR-interval using physiological modeling. Signal-to-noise ratio, contrast-to-noise ratio, vessel length, vessel sharpness, and subjective image quality were compared in a blinded manner. RESULTS: Vessel sharpness improved significantly for the middle segment of the right coronary artery (RCA) with the use of the adaptive trigger delay (52.3 +/- 7.1% versus 48.9 +/- 7.9%, P = 0.026). Subjective image quality was significantly better in the middle segments of the RCA and the left anterior descending artery (LAD) when the scan was performed with the adaptive trigger delay rather than a constant trigger delay. CONCLUSION: Our results demonstrate that the use of an adaptive trigger delay to correct for heart rate variability improves image quality, mainly in the middle segments of the RCA and LAD.
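The abstract does not specify the physiological model, but a per-beat update of this kind can be illustrated with a simple linear rescaling of the trigger delay around reference values. Everything below (function name, linear form, coefficients) is a hypothetical sketch, not the authors' model.

```python
# Hypothetical per-beat trigger-delay update: shift the delay in proportion
# to the change in RR interval so the acquisition window keeps tracking the
# diastolic rest period. All values are illustrative assumptions.
def update_trigger_delay(rr_ms, td_ref_ms=650.0, rr_ref_ms=1000.0, slope=0.5):
    """Return a new trigger delay (ms) given the last RR interval (ms)."""
    return td_ref_ms + slope * (rr_ms - rr_ref_ms)

# Heart rate speeds up (RR drops from 1000 ms to 900 ms): the acquisition
# window is moved 50 ms earlier in the cardiac cycle.
print(update_trigger_delay(900.0))  # 600.0
```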
Abstract:
BACKGROUND: Gastroesophageal reflux and progressive esophageal dilatation can develop after gastric banding (GB). HYPOTHESIS: Gastric banding may interfere with esophageal motility, enhance reflux, or promote esophageal dilatation. DESIGN: Before-after trial in patients undergoing GB. SETTING: University teaching hospital. PATIENTS AND METHODS: Between January 1999 and August 2002, 43 patients undergoing laparoscopic GB for morbid obesity underwent upper gastrointestinal endoscopy, 24-hour pH monitoring, and stationary esophageal manometry before GB and between 6 and 18 months postoperatively. MAIN OUTCOME MEASURES: Reflux symptoms, endoscopic esophagitis, pressures measured at manometry, esophageal acid exposure. RESULTS: There was no difference in the prevalence of reflux symptoms or esophagitis before and after GB. The lower esophageal sphincter was unaffected by surgery, but contractions in the lower esophagus weakened after GB, in correlation with preoperative values. There was a trend toward more postoperative nonspecific motility disorders. Esophageal acid exposure tended to decrease after GB, with fewer reflux episodes. A few patients developed massive postoperative reflux. There was no clear correlation between preoperative testing and postoperative esophageal acid exposure, although patients with abnormal preoperative acid exposure tended to maintain high values after GB. CONCLUSIONS: Postoperative esophageal dysmotility and gastroesophageal reflux are not uncommon after GB. Preoperative testing should be done routinely. Low amplitude of contraction in the lower esophagus and increased esophageal acid exposure should be regarded as contraindications to GB. Patients with such findings should be offered an alternative procedure, such as Roux-en-Y gastric bypass.
Abstract:
BACKGROUND: Many factors affect survival in haemodialysis (HD) patients. Our aim was to study whether the quality of clinical care affects survival in this population, when adjusted for demographic characteristics and co-morbidities. METHODS: We studied survival in 553 patients treated by chronic HD during March 2001 in 21 dialysis facilities in western Switzerland. Indicators of quality of care were established for anaemia control, calcium-phosphate product, serum albumin, pre-dialysis blood pressure (BP), type of vascular access, and dialysis adequacy (spKt/V), and their baseline values were related to 3-year survival. The modified Charlson co-morbidity index (including age) and transplantation status were also considered as predictors of survival. RESULTS: Three-year survival status was obtained for 96% of the patients; 39% (211/541) of these patients had died. The 3-year survival was 50, 62 and 69%, respectively, in patients who had 0-2, 3, and >=4 fulfilled indicators of quality of care (test for linear trend, P < 0.001). In a Cox multivariate analysis model, the absence of transplantation, a higher modified Charlson score, lower fulfilment of the indicators of good clinical care, and low pre-dialysis systolic BP were independent predictors of death. CONCLUSION: Good clinical care improves survival in HD patients, even after adjustment for availability of transplantation and co-morbidities.
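For reference, a Cox multivariate model of this kind can be fitted in a few lines with the Python lifelines package. The data frame below is a synthetic mock-up with illustrative covariate names and effect sizes, not the study's dataset.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic mock-up (not the study's data): 200 patients with follow-up time
# in months, a death indicator, and two baseline predictors.
rng = np.random.default_rng(0)
n = 200
quality = rng.integers(0, 6, n)            # fulfilled quality-of-care indicators
charlson = rng.integers(2, 10, n)          # modified Charlson score
hazard = np.exp(0.25 * charlson - 0.3 * quality)
time = rng.exponential(60.0 / hazard)      # latent survival time (months)
df = pd.DataFrame({
    "months": np.minimum(time, 36.0),      # administrative censoring at 3 years
    "died": (time <= 36.0).astype(int),
    "quality": quality,
    "charlson": charlson,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")  # partial-likelihood fit
cph.print_summary()  # hazard ratios: charlson > 1 (harmful), quality < 1 (protective)
```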
Abstract:
The determination of dyes present in illicit pills is shown to provide useful and readily obtainable information for strategic and tactical drug intelligence. An analytical strategy combining solid-phase extraction (SPE), thin-layer chromatography (TLC), and capillary zone electrophoresis with a diode array detector (CZE-DAD) was developed to identify and quantify 14 hydrosoluble, acidic, synthetic food dyes allowed in the European Community. Indeed, these are the dyes most likely to be found in illicit pills, given their availability and ease of use. The results show (1) that this analytical method is well adapted to small samples such as illicit pills, (2) that most dyes actually found belong to the hydrosoluble, acidic, synthetic food dyes allowed in the European Community, and (3) that this evidence is important in drug intelligence and may be assessed within a Bayesian framework.
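In forensic practice, "assessed within a Bayesian framework" usually means evaluating the evidence E (here, a dye profile) as a likelihood ratio that updates the odds of two competing propositions, e.g. H_p, the pills share a common source, versus H_d, they do not. The abstract does not spell this out; the standard form is:

\[
\underbrace{\frac{P(H_p \mid E)}{P(H_d \mid E)}}_{\text{posterior odds}}
= \underbrace{\frac{P(E \mid H_p)}{P(E \mid H_d)}}_{\text{likelihood ratio}}
\times \underbrace{\frac{P(H_p)}{P(H_d)}}_{\text{prior odds}}
\]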
Abstract:
This research considers the problem of spatial data classification using machine learning algorithms: probabilistic neural networks (PNN) and support vector machines (SVM). As a benchmark model, the simple k-nearest neighbor algorithm is considered. The PNN is a neural network reformulation of well-known nonparametric principles of probability density modeling, using a kernel density estimator together with Bayesian optimal or maximum a posteriori decision rules. PNNs are well suited to problems where not only predictions but also quantification of accuracy and integration of prior information are necessary. An important property of PNNs is that they can easily be used in decision support systems dealing with problems of automatic classification. The support vector machine is an implementation of the principles of statistical learning theory for classification tasks. Recently, SVMs have been successfully applied to different environmental topics: classification of soil types and hydro-geological units, optimization of monitoring networks, and susceptibility mapping of natural hazards. In the present paper, both simulated and real-data case studies (low- and high-dimensional) are considered. The main attention is paid to the detection and learning of spatial patterns by the applied algorithms.
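The PNN construction described here — per-class kernel density estimation combined with a Bayes/MAP decision rule — can be sketched in a few lines. The Gaussian kernel, the bandwidth, and the toy data below are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.neighbors import KernelDensity

def pnn_fit(X, y, bandwidth=0.5):
    """Fit one Gaussian kernel density estimator per class, plus class priors."""
    classes = np.unique(y)
    models = {c: KernelDensity(kernel="gaussian", bandwidth=bandwidth).fit(X[y == c])
              for c in classes}
    priors = {c: np.mean(y == c) for c in classes}
    return classes, models, priors

def pnn_predict(X, classes, models, priors):
    """MAP decision rule: argmax over classes of log prior + log class density."""
    log_post = np.column_stack(
        [np.log(priors[c]) + models[c].score_samples(X) for c in classes])
    return classes[np.argmax(log_post, axis=1)]

# Toy usage: two Gaussian blobs in 2D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
classes, models, priors = pnn_fit(X, y)
print(pnn_predict(np.array([[0.0, 0.0], [3.0, 3.0]]), classes, models, priors))  # [0 1]
```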
Abstract:
BACKGROUND: The majority of Haemosporida species infect birds or reptiles, but many important genera, including Plasmodium, infect mammals. Dipteran vectors shared by avian, reptilian, and mammalian Haemosporida suggest multiple invasions of Mammalia during haemosporidian evolution; yet phylogenetic analyses have detected only a single invasion event. Until now, several important mammal-infecting genera have been absent from these analyses. This study focuses on the evolutionary origin of Polychromophilus, a unique malaria genus that only infects bats (Microchiroptera) and is transmitted by bat flies (Nycteribiidae). METHODS: Two species of Polychromophilus were obtained from wild bats caught in Switzerland. These were molecularly characterized using four genes (asl, clpc, coI, cytb) from three different genomes (nucleus, apicoplast, mitochondrion). These data were then combined with data from 60 Haemosporida taxa available in GenBank. Bayesian inference, maximum likelihood, and a range of rooting methods were used to test specific hypotheses concerning the phylogenetic relationships between Polychromophilus and the other haemosporidian genera. RESULTS: The Polychromophilus melanipherus and Polychromophilus murinus samples show genetically distinct patterns and group according to species. The Bayesian tree topology suggests that the monophyletic clade of Polychromophilus falls within the avian/saurian clade of Plasmodium, and directed hypothesis testing confirms the Plasmodium origin. CONCLUSION: The ancestor of Polychromophilus was most likely a bird- or reptile-infecting Plasmodium before it switched to bats. The invasion of mammals as hosts has therefore not been a unique event in the evolutionary history of Haemosporida, despite the suspected costs of adapting to a new host. It was, moreover, accompanied by a switch in dipteran host.
Abstract:
Plasmid DNA and adenovirus vectors currently used in cardiovascular gene therapy trials are limited by low efficiency and short-lived transgene expression, respectively. Recombinant adeno-associated virus (AAV) has recently emerged as an attractive vector for cardiovascular gene therapy. In the present study, we compared AAV and adenovirus vectors with respect to gene transfer efficiency and duration of transgene expression in mouse hearts and arteries in vivo. AAV vectors (titer: 5 x 10^8 transducing units (TU)/ml) and adenovirus vectors (1.2 x 10^10 TU/ml) expressing an enhanced green fluorescent protein (EGFP) gene were injected either intramyocardially (n=32) or intrapericardially (n=3) in CD-1 mice. Hearts were harvested at varying time intervals (3 days to 1 year) after gene delivery. After intramyocardial injection of 5 microliters of virus stock solution, cardiomyocyte transduction rates with AAV vectors were 4-fold lower than with adenovirus vectors (1.5% (range: 0.5-2.6%) vs. 6.2% (range: 2.7-13.7%); P<0.05), but similar to titer-matched adenovirus vectors (0.7%; range: 0.2-1.2%). AAV-mediated EGFP expression lasted for at least 1 year. AAV vectors instilled into the pericardial space transduced epicardial myocytes. Arterial gene transfer was studied in mouse carotids (n=26). Both vectors selectively transduced endothelial cells after luminal instillation. Transduction rates with AAV vectors were 8-fold lower than with adenovirus vectors (2.0% (range: 0-3.2%) vs. 16.2% (range: 8.5-20.2%); P<0.05). Prolonged EGFP expression was observed after AAV- but not adenovirus-mediated gene transfer. In conclusion, AAV vectors deliver and express genes for extended periods of time in the myocardium and arterial endothelium in vivo. AAV vectors may be useful for gene therapy approaches to chronic cardiovascular diseases.
Abstract:
BACKGROUND: Outcome following foot and ankle surgery can be assessed by disease- and region-specific scores. Many scoring systems exist, making comparison among studies difficult. The present study focused on outcome measures for a common foot and ankle abnormality and compared the results obtained by 2 disease-specific and 2 body region-specific scores. METHODS: We reviewed 41 patients who underwent lateral ankle ligament reconstruction. Four outcome scales were administered simultaneously: the Cumberland Ankle Instability Tool (CAIT) and the Chronic Ankle Instability Scale (CAIS), which are disease-specific, and the American Orthopedic Foot & Ankle Society (AOFAS) hindfoot scale and the Foot and Ankle Ability Measure (FAAM), which are both body region-specific. The degree of correlation between scores was assessed by Pearson's correlation coefficient. Nonparametric tests, the Kruskal-Wallis test and the Mann-Whitney test for pairwise comparison of the scores, were performed. RESULTS: A significant difference (P < .005) was observed between the CAIS and the AOFAS score (P = .0002), between the CAIS and the FAAM 1 (P = .0001), and between the CAIT and the AOFAS score (P = .0003). CONCLUSIONS: This study compared the performance of 4 disease- and body region-specific scoring systems. We demonstrated a correlation between the 4 administered scoring systems, along with notable differences between the results given by each of them. Disease-specific scores appeared more accurate than body region-specific scores. A strong correlation between the AOFAS score and the other scales was observed. The FAAM seemed a good compromise because it allows patients to be evaluated according to their own functional demand. CLINICAL RELEVANCE: The present study contributes to the development of more critical and accurate outcome assessment methods in foot and ankle surgery.
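The statistical pipeline named here (Pearson correlation plus Kruskal-Wallis and Mann-Whitney tests) maps directly onto scipy.stats. The score vectors below are synthetic stand-ins for two of the four scales, with correlation built in for illustration; they are not the study's data, and in practice scores on different ranges would be normalized before comparison.

```python
import numpy as np
from scipy import stats

# Synthetic stand-ins for two scales, both on a 0-100 range, 41 patients.
rng = np.random.default_rng(1)
aofas = rng.uniform(60, 100, 41)
faam = np.clip(aofas + rng.normal(0, 8, 41), 0, 100)

r, p_r = stats.pearsonr(aofas, faam)        # degree of correlation between scores
h, p_kw = stats.kruskal(aofas, faam)        # omnibus nonparametric test
u, p_mw = stats.mannwhitneyu(aofas, faam)   # pairwise nonparametric comparison
print(f"r={r:.2f} (p={p_r:.3g}); Kruskal-Wallis p={p_kw:.3g}; Mann-Whitney p={p_mw:.3g}")
```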
Abstract:
The historically reactive approach to identifying and mitigating safety problems involves selecting black spots or hot spots by ranking locations based on crash frequency and severity. This approach focuses mainly on the corridor level, without taking into consideration the exposure rate (vehicle miles traveled) and the socio-demographic information of the study area, both of which are very important in the transportation planning process. A larger analysis unit at the Transportation Analysis Zone (TAZ) level, or the network planning level, should be used to address the future development needs of the community and to incorporate safety into the long-range transportation planning process. In this study, existing planning tools (such as the PLANSAFE models presented in NCHRP Report 546) were evaluated for forecasting safety in small and medium-sized communities, particularly as related to changes in socio-demographic characteristics, traffic demand, road network, and countermeasures. The research also evaluated the applicability of the Empirical Bayes (EB) method to network-level analysis. In addition, application of the United States Road Assessment Program (usRAP) protocols at the local urban road network level was investigated. This research evaluated the applicability of these three methods for the City of Ames, Iowa. The outcome of this research is a systematic process and framework for considering road safety issues explicitly in the small and medium-sized community transportation planning process and for quantifying the safety impacts of new developments and policy programs. More specifically, quantitative safety may be incorporated into the planning process through effective visualization and increased awareness of safety issues (usRAP), the identification of high-risk locations with potential for improvement (usRAP maps and EB), countermeasures for high-risk locations (EB before-and-after study and PLANSAFE), and socio-economic and demographic induced changes at the planning level (PLANSAFE).
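For reference, the core of the EB method combines a safety performance function (SPF) prediction with the observed crash count, weighted via the overdispersion parameter of the negative binomial SPF. The sketch below follows a common Highway Safety Manual-style formulation; the numbers are illustrative, not from this study.

```python
def eb_expected_crashes(n_predicted, n_observed, k):
    """Empirical Bayes estimate of long-term expected crash frequency.

    n_predicted: SPF-predicted crashes over the study period
    n_observed:  observed crash count over the same period
    k:           overdispersion parameter of the negative binomial SPF
    """
    w = 1.0 / (1.0 + k * n_predicted)   # weight given to the SPF prediction
    return w * n_predicted + (1.0 - w) * n_observed

# Illustrative site: SPF predicts 4.2 crashes, 9 were observed, k = 0.35.
# The EB estimate (~7.05) shrinks the observed count toward the SPF value,
# discounting random high counts that drive reactive hot-spot rankings.
print(eb_expected_crashes(4.2, 9, 0.35))
```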
Abstract:
Wolves in Italy declined strongly in the past and have been confined south of the Alps since the turn of the last century; by the 1970s they were reduced to approximately 100 individuals surviving in two fragmented subpopulations in the central-southern Apennines. The Italian wolves are presently expanding in the Apennines and started to recolonize the western Alps in Italy, France, and Switzerland about 16 years ago. In this study, we used a population genetic approach to elucidate some aspects of the wolf recolonization process. DNA extracted from 3068 tissue and scat samples collected in the Apennines (the source populations) and in the Alps (the colony) was genotyped at 12 microsatellite loci, aiming to assess (i) the strength of the bottleneck and founder effects during the onset of colonization; (ii) the rates of gene flow between source and colony; and (iii) the minimum number of colonizers needed to explain the genetic variability observed in the colony. We identified a total of 435 distinct wolf genotypes, which showed that wolves in the Alps: (i) have significantly lower genetic diversity (heterozygosity, allelic richness, number of private alleles) than wolves in the Apennines; (ii) are genetically distinct according to pairwise F_ST values, a population assignment test, and Bayesian clustering; and (iii) are not in genetic equilibrium (significant bottleneck test). Spatial autocorrelations are significant among samples separated by up to c. 230 km, roughly corresponding to the apparent gap in permanent wolf presence between the Alps and the north Apennines. The estimated number of first-generation migrants indicates that migration has been unidirectional and male-biased, from the Apennines to the Alps, and that wolves in southern Italy did not contribute to the Alpine population. These results suggest that: (i) the Alps were colonized by a few long-range migrating wolves originating in the north Apennine subpopulation; (ii) during the colonization process there was a moderate bottleneck; and (iii) gene flow between sources and colonies was moderate (corresponding to 1.25-2.50 wolves per generation), despite high potential for dispersal. Bottleneck simulations showed that a total of c. 8-16 effective founders is needed to explain the genetic diversity observed in the Alps. Levels of genetic diversity in the expanding Alpine wolf population, and the permanence of genetic structuring, will depend on future rates of gene flow among the distinct wolf subpopulation fragments.
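The diversity and differentiation statistics named here (expected heterozygosity and pairwise F_ST) reduce to simple allele-frequency computations. A minimal sketch using Nei's definitions follows; the allele frequencies are illustrative, not the study's microsatellite data.

```python
import numpy as np

def expected_heterozygosity(p):
    """Nei's expected heterozygosity at one locus: He = 1 - sum(p_i^2)."""
    return 1.0 - np.sum(np.asarray(p) ** 2)

def fst(p_subpops):
    """F_ST = (H_T - H_S) / H_T for one locus.

    p_subpops: array of shape (n_subpops, n_alleles); each row sums to 1.
    H_S is the mean within-subpopulation heterozygosity, H_T the
    heterozygosity of the pooled allele frequencies.
    """
    p = np.asarray(p_subpops, dtype=float)
    h_s = np.mean([expected_heterozygosity(row) for row in p])
    h_t = expected_heterozygosity(p.mean(axis=0))
    return (h_t - h_s) / h_t

# Two subpopulations with diverged frequencies at a biallelic locus: F_ST ~ 0.25
print(fst([[0.8, 0.2], [0.3, 0.7]]))
```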