11 results for Baire-Open Maps

at Duke University

Relevance:

20.00%

Publisher:

Duke University

Abstract:

This paper examines the effects of permanent and transitory changes in government purchases in the context of a model of a small open economy that produces and consumes both traded and nontraded goods. The model incorporates an equilibrium interpretation of the business cycle that emphasizes the responsiveness of agents to intertemporal relative price changes. It is demonstrated that transitory increases in government purchases lead to an appreciation of the real exchange rate and an ambiguous change (although a likely worsening) in the current account, while permanent increases have an ambiguous impact on the real exchange rate and no effect on the current account. When agents do not know whether a given increase in government purchases is permanent or transitory, the effect is a weighted average of these separate effects, with weights that depend on the relative variances of the transitory and permanent components of government purchases. © 1985.
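
The weighted-average result reads naturally as a signal-extraction problem. As an indicative sketch only (standard signal-extraction notation, not the paper's own): if an observed change in purchases is the sum of independent permanent and transitory components with variances \(\sigma_P^2\) and \(\sigma_T^2\), the projection weight on the permanent interpretation is

\[
  \lambda = \frac{\sigma_P^2}{\sigma_P^2 + \sigma_T^2}, \qquad
  \text{effect}(\Delta g) \approx \lambda\,\text{effect}_{\text{permanent}} + (1-\lambda)\,\text{effect}_{\text{transitory}},
\]

so the weight on the permanent-shock response grows with the share of variance attributable to the permanent component.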

Relevance:

20.00%

Publisher:

Duke University

Abstract:

PURPOSE: The purpose of this work is to improve the noise power spectrum (NPS), and thus the detective quantum efficiency (DQE), of computed radiography (CR) images by correcting for spatial gain variations specific to individual imaging plates. CR devices have not traditionally employed gain-map corrections, unlike flat-panel detectors, because of the multiplicity of plates used with each reader. The lack of gain-map correction has limited the DQE(f) at higher exposures with CR. This work describes a feasible method for generating plate-specific gain maps.

METHODS: Ten high-exposure open-field images were taken with an RQA5 spectrum, using a sixth-generation CR plate suspended in air without a cassette. Image values were converted to exposure, the plates were registered using fiducial dots on the plate, the ten images were averaged, and the average was high-pass filtered to remove low-frequency contributions from field inhomogeneity. A gain map was then produced by normalizing all pixel values in the averaged image to a mean of one. The resulting gain map was used to normalize subsequent single images, correcting for spatial gain fluctuation. To validate performance, the normalized NPS (NNPS) for all images was calculated both with and without the gain-map correction. Variations in the quality of correction due to exposure level, beam voltage/spectrum, CR reader, and registration were investigated.

RESULTS: The NNPS with plate-specific gain-map correction showed improvement over the uncorrected case over the range of frequencies from 0.15 to 2.5 mm^-1. At high exposure (40 mR), the NNPS was 50%-90% better with gain-map correction than without. A small further improvement in NNPS was obtained by carefully registering the gain map with subsequent images using the small fiducial dots, compensating for slight misregistration during scanning. Further improvement in the NNPS was obtained by scaling the gain map about its mean to account for different beam spectra.

CONCLUSIONS: This study demonstrates that a simple gain map can be used to correct for the fixed-pattern noise of a given plate and thus improve the DQE of CR imaging. The method could easily be implemented by manufacturers because each plate has a unique bar code, and the gain maps for all plates associated with a reader could be stored for future retrieval. These experiments indicate that an improvement in NPS (and hence DQE) is possible over a wide range of frequencies, depending on exposure level.
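
The pipeline in METHODS maps directly onto a few array operations. The sketch below assumes NumPy/SciPy and a Gaussian-blur division as the high-pass step; the function names, the sigma, and that filter choice are illustrative assumptions, since the paper specifies only that the average was high-pass filtered:

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def build_gain_map(registered_exposure_images, highpass_sigma=50.0):
        """Average registered open-field exposure images, remove low-frequency
        field inhomogeneity, and normalize to a mean of one (a sketch, not the
        authors' code)."""
        avg = np.mean(registered_exposure_images, axis=0)  # average of the ten images
        low_freq = gaussian_filter(avg, sigma=highpass_sigma)
        flat = avg / low_freq              # retain high-frequency fixed-pattern structure
        return flat / flat.mean()          # fractions with a mean of one

    def correct_image(exposure_image, gain_map):
        """Normalize a single exposure image by the plate-specific gain map."""
        return exposure_image / gain_map

Scaling the gain map about its mean, as described in RESULTS, would be a one-line variant: 1.0 + scale * (gain_map - 1.0).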

Relevance:

20.00%

Publisher:

Duke University

Abstract:

The ability to predict the existence and crystal type of ordered structures of materials from their components is a major challenge of current materials research. Empirical methods use experimental data to construct structure maps and make predictions based on the clustering of simple physical parameters. Their usefulness depends on the availability of reliable data over the entire parameter space. The recent development of high-throughput methods opens the possibility of enhancing these empirical structure maps with ab initio calculations in regions of the parameter space where the experimental evidence is lacking or not well characterized. In this paper we construct enhanced maps for the binary alloys of hcp metals, where the experimental data leave large regions of poorly characterized systems believed to be phase separating. In these enhanced maps, the clusters of noncompound-forming systems are much smaller than indicated by the empirical results alone. © 2010 The American Physical Society.

Relevance:

20.00%

Publisher:

Duke University

Abstract:

BACKGROUND: The ability to write clearly and effectively is of central importance to the scientific enterprise. Encouraged by the success of simulation environments in other biomedical sciences, we developed WriteSim TCExam, an open-source, Web-based, textual simulation environment for teaching effective writing techniques to novice researchers. We shortlisted and modified an existing open-source application, TCExam, to serve as the textual simulation environment. After testing usability internally within our team, we conducted formal field usability studies with novice researchers, followed by formal surveys with researchers in the roles of administrators and users (novice researchers).

RESULTS: The development process was guided by feedback from usability tests within our research team. Online surveys and formal studies, involving members of the Research on Research group and selected novice researchers, show that the application is user-friendly. To date it has been used to train 25 novice researchers in scientific writing, with encouraging results.

CONCLUSION: WriteSim TCExam is the first Web-based, open-source textual simulation environment designed to complement traditional scientific writing instruction. While initial reviews by students and educators have been positive, a formal study is needed to measure its benefits in comparison to standard instructional methods.

Relevance:

20.00%

Publisher:

Duke University

Abstract:

Maps are a mainstay of visual, somatosensory, and motor coding in many species. However, auditory maps of space have not been reported in the primate brain. Instead, recent studies have suggested that sound location may be encoded via broadly responsive neurons whose firing rates vary roughly proportionately with sound azimuth. Within frontal space, maps and such rate codes involve different response patterns at the level of individual neurons: maps consist of neurons exhibiting circumscribed receptive fields, whereas rate codes involve open-ended response patterns that peak in the periphery. This discrepancy in coding format therefore poses a potential problem for brain regions responsible for representing both visual and auditory information. Here, we investigated the coding of auditory space in the primate superior colliculus (SC), a structure known to contain visual and oculomotor maps for guiding saccades. We report that, for visual stimuli, neurons showed circumscribed receptive fields consistent with a map, but for auditory stimuli, they had open-ended response patterns consistent with a rate or level-of-activity code for location. The discrepant response patterns were not segregated into different neural populations but occurred in the same neurons. We show that a read-out algorithm in which both the site and the level of SC activity contribute to the computation of stimulus location successfully evaluates the discrepant visual and auditory codes, and can account for subtle but systematic differences in the accuracy of auditory compared to visual saccades. This suggests that a given population of neurons can use different codes to support appropriate multimodal behavior.
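
The abstract does not spell out the read-out algorithm. The toy sketch below, with entirely hypothetical parameters, illustrates one way a site term and a level term can both enter a single location estimate; it is an illustrative reconstruction, not the authors' published algorithm:

    import numpy as np

    def decode_location(rates, site_azimuths, level_gain=0.0):
        """Toy hybrid read-out: the activity-weighted centroid over the map
        gives a 'site' term, and the summed population rate gives a 'level'
        term that can shift the estimate toward the periphery for open-ended
        rate codes. Illustrative only."""
        total = rates.sum()
        if total == 0.0:
            return 0.0
        site_term = np.dot(rates, site_azimuths) / total  # where on the map activity sits
        level_term = level_gain * total                   # how strongly the population fires
        return site_term + level_term

With level_gain = 0 this reduces to a pure map (site) read-out appropriate for circumscribed visual fields; a nonzero level_gain lets overall activity carry azimuth information for open-ended auditory responses.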

Relevance:

20.00%

Publisher:

Duke University

Abstract:

Given a probability distribution on an open book (a metric space obtained by gluing a disjoint union of copies of a half-space along their boundary hyperplanes), we define a precise concept of when the Fréchet mean (barycenter) is sticky. This nonclassical phenomenon is quantified by a law of large numbers (LLN) stating that the empirical mean eventually almost surely lies on the (codimension 1 and hence measure 0) spine that is the glued hyperplane, and a central limit theorem (CLT) stating that the limiting distribution is Gaussian and supported on the spine. We also state versions of the LLN and CLT for the cases where the mean is nonsticky (i.e., not lying on the spine) and partly sticky (i.e., lying on the spine but not sticky). © Institute of Mathematical Statistics, 2013.
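
For orientation, the Fréchet mean referred to above is the standard metric-space generalization of the barycenter (general background, not notation specific to this paper): for a probability measure \(\mu\) on a metric space \((M, d)\),

\[
  \bar{\mu} \;=\; \operatorname*{arg\,min}_{x \in M} \int_{M} d(x, y)^{2}\, \mu(dy),
\]

and the empirical mean is obtained by taking \(\mu\) to be the empirical measure of the sample; stickiness concerns where this minimizer lands as the sample grows.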

Relevance:

20.00%

Publisher:

Duke University

Abstract:

Transcriptional regulation has been studied intensively in recent decades. One important aspect of this regulation is the interaction between regulatory proteins, such as transcription factors (TFs) and nucleosomes, and the genome. Different high-throughput techniques have been invented to map these interactions genome-wide, including ChIP-based methods (ChIP-chip, ChIP-seq, etc.), nuclease digestion methods (DNase-seq, MNase-seq, etc.), and others. However, a single experimental technique often provides only partial and noisy information about the full picture of protein-DNA interactions. The overarching goal of this dissertation is therefore to develop computational methods for jointly modeling different experimental datasets to achieve a holistic inference of the protein-DNA interaction landscape.

We first present a computational framework that can incorporate the protein binding information in MNase-seq data into a thermodynamic model of protein-DNA interaction. We use a correlation-based objective function to model the MNase-seq data and a Markov chain Monte Carlo method to maximize the function. Our results show that the inferred protein-DNA interaction landscape is concordant with the MNase-seq data and provides a mechanistic explanation for the experimentally collected MNase-seq fragments. Our framework is flexible and can easily incorporate other data sources. To demonstrate this flexibility, we use prior distributions to integrate experimentally measured protein concentrations.
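
As a rough illustration of the described scheme (the proposal kernel, the acceptance temperature, and all names below are assumptions, not the dissertation's code), a random-walk Metropolis sampler can maximize a correlation objective of this kind:

    import numpy as np

    rng = np.random.default_rng(0)

    def correlation_objective(params, predict_occupancy, observed_mnase):
        """Pearson correlation between the occupancy landscape predicted by a
        thermodynamic model and the observed MNase-seq coverage."""
        return np.corrcoef(predict_occupancy(params), observed_mnase)[0, 1]

    def mcmc_maximize(objective, init_params, n_steps=10000, step_sd=0.05, temp=0.01):
        """Random-walk Metropolis targeting exp(objective / temp); keeping the
        best visited state turns the sampler into a stochastic maximizer."""
        params = np.asarray(init_params, dtype=float)
        score = objective(params)
        best, best_score = params.copy(), score
        for _ in range(n_steps):
            proposal = params + rng.normal(0.0, step_sd, size=params.shape)
            new_score = objective(proposal)
            if np.log(rng.uniform()) < (new_score - score) / temp:
                params, score = proposal, new_score
                if score > best_score:
                    best, best_score = params.copy(), score
        return best, best_score

    # usage sketch: mcmc_maximize(lambda p: correlation_objective(p, model, data), p0)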

We also study the ability of DNase-seq data to position nucleosomes. Traditionally, DNase-seq has only been widely used to identify DNase hypersensitive sites, which tend to be open chromatin regulatory regions devoid of nucleosomes. We reveal for the first time that DNase-seq datasets also contain substantial information about nucleosome translational positioning, and that existing DNase-seq data can be used to infer nucleosome positions with high accuracy. We develop a Bayes-factor-based nucleosome scoring method to position nucleosomes using DNase-seq data. Our approach utilizes several effective strategies to extract nucleosome positioning signals from the noisy DNase-seq data, including jointly modeling data points across the nucleosome body and explicitly modeling the quadratic and oscillatory DNase I digestion pattern on nucleosomes. We show that our DNase-seq-based nucleosome map is highly consistent with previous high-resolution maps. We also show that the oscillatory DNase I digestion pattern is useful in revealing the nucleosome rotational context around TF binding sites.
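
A stripped-down sketch of such a score follows; the plain Poisson emission model is an assumption standing in for the dissertation's fuller treatment, in which the quadratic and oscillatory digestion pattern would be encoded in the position-specific rate profile:

    import numpy as np
    from scipy.stats import poisson

    def log_bayes_factor(cut_counts, nucleosome_rates, background_rate):
        """Score a candidate nucleosome position: likelihood of the DNase-seq
        cut counts across the nucleosome body under a position-specific
        digestion profile versus a flat background."""
        log_nuc = poisson.logpmf(cut_counts, nucleosome_rates).sum()
        log_bg = poisson.logpmf(cut_counts, background_rate).sum()
        return log_nuc - log_bg

With fixed rates this is a log likelihood ratio; integrating the rates over priors would give the full Bayes factor.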

Finally, we present a state-space model (SSM) for jointly modeling different kinds of genomic data to provide an accurate view of the protein-DNA interaction landscape. We also provide an efficient expectation-maximization algorithm to learn model parameters from data. We first show in simulation studies that the SSM can effectively recover the underlying true protein binding configurations. We then apply the SSM to model real genomic data (both DNase-seq and MNase-seq). By incrementally increasing the types of genomic data in the SSM, we show that different data types contribute complementary information for inferring the protein binding landscape and that the most accurate inference comes from modeling all available datasets.
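
The abstract does not specify the SSM itself. As a deliberately simplified analogue, a Gaussian HMM fit by EM over stacked per-position signals shows the shape of the computation; the file names are hypothetical, and the dissertation's model and EM algorithm are richer than this:

    import numpy as np
    from hmmlearn.hmm import GaussianHMM  # third-party package: pip install hmmlearn

    # Hidden state per genomic position = a binding configuration (e.g., free /
    # TF-bound / nucleosomal); emissions = per-position assay signals. Stacking
    # DNase-seq and MNase-seq coverage as two emission dimensions mirrors how
    # adding data types sharpens the inference.
    dnase = np.loadtxt("dnase_coverage.txt")   # hypothetical input files
    mnase = np.loadtxt("mnase_coverage.txt")
    X = np.column_stack([dnase, mnase])        # shape: (n_positions, n_assays)

    model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=100)
    model.fit(X)                               # EM (Baum-Welch) parameter learning
    states = model.predict(X)                  # most likely configuration per position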

This dissertation provides a foundation for future research by taking a step toward the genome-wide inference of the protein-DNA interaction landscape through data integration.

Relevance:

20.00%

Publisher:

Duke University

Abstract:

BACKGROUND: Lumbar disc herniation has a prevalence of up to 58% in the athletic population. Lumbar discectomy is a common surgical procedure to alleviate pain and disability in athletes. We systematically reviewed the current clinical evidence on athlete return to sport (RTS) following lumbar discectomy compared with conservative treatment.

METHODS: A computer-assisted literature search of the MEDLINE, CINAHL, Web of Science, PEDro, OVID and PubMed databases (from inception to August 2015) was performed using keywords related to lumbar disc herniation and surgery. The design of this systematic review followed the guidelines of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). The methodological quality of individual studies was assessed using the Downs and Black scale (0-16 points).

RESULTS: The search strategy yielded 14 articles. Downs and Black quality scores were generally low: no article in this review earned a high-quality rating, only 5 articles earned a moderate rating, and 9 of the 14 earned a low rating. The pooled RTS rate for surgical intervention across all included studies was 81% (95% CI 76% to 86%) with significant heterogeneity (I²=63.4%, p<0.001), although pooled estimates indicate only 59% RTS at the same level. Pooled analysis showed no difference in RTS rate between surgical (84% (95% CI 77% to 90%)) and conservative intervention (76% (95% CI 56% to 92%); p=0.33).

CONCLUSIONS: Studies comparing surgical versus conservative treatment found no significant difference between groups regarding RTS. Not all athletes who return to sport do so at the level of participation at which they performed prior to surgery. Owing to the heterogeneity and low methodological quality of the included studies, rates of RTS cannot be accurately determined.
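
For reference, the heterogeneity statistic quoted in RESULTS is conventionally computed from Cochran's Q with df = k - 1 degrees of freedom for k pooled studies (standard meta-analysis background, not a detail specific to this review):

\[
  I^{2} = \max\!\left(0,\ \frac{Q - \mathrm{df}}{Q}\right) \times 100\%,
\]

so the reported I² = 63.4% indicates that roughly two thirds of the observed variance in RTS rates reflects between-study heterogeneity rather than chance.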

Relevance:

20.00%

Publisher:

Duke University

Abstract:

The Bioinformatics Open Source Conference (BOSC) is organized by the Open Bioinformatics Foundation (OBF), a nonprofit group dedicated to promoting the practice and philosophy of open source software development and open science within the biological research community. Since its inception in 2000, BOSC has provided bioinformatics developers with a forum for communicating the results of their latest efforts to the wider research community. BOSC offers a focused environment for developers and users to interact and share ideas about standards; software development practices; practical techniques for solving bioinformatics problems; and approaches that promote open science and the sharing of data, results, and software. BOSC is run as a two-day special interest group (SIG) before the annual Intelligent Systems in Molecular Biology (ISMB) conference. BOSC 2015 took place in Dublin, Ireland, and was attended by over 125 people, about half of whom were first-time attendees. Session topics included "Data Science", "Standards and Interoperability", "Open Science and Reproducibility", "Translational Bioinformatics", "Visualization", and "Bioinformatics Open Source Project Updates". In addition to two keynote talks and dozens of shorter talks chosen from submitted abstracts, BOSC 2015 included a panel, titled "Open Source, Open Door: Increasing Diversity in the Bioinformatics Open Source Community", that provided an opportunity for open discussion about ways to increase the diversity of participants in BOSC in particular, and in open source bioinformatics in general. The complete program of BOSC 2015 is available online at http://www.open-bio.org/wiki/BOSC_2015_Schedule.

Relevance:

20.00%

Publisher:

Duke University

Abstract:

This chapter presents a model averaging approach in the M-open setting, using sample re-use methods to approximate the predictive distribution of future observations. It first reviews the standard M-closed Bayesian Model Averaging approach and decision-theoretic methods for producing inferences and decisions. It then reviews model selection from the M-complete and M-open perspectives, before formulating a Bayesian solution to model averaging in the M-open perspective. It constructs optimal weights for MOMA (M-open Model Averaging) using a decision-theoretic framework in which models are treated as part of the ‘action space’ rather than as unknown states of nature. Using ‘incompatible’ retrospective and prospective models for data from a case-control study, the chapter demonstrates that MOMA gives better predictive accuracy than the proxy models. It concludes with open questions and future directions.
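
The abstract does not give the weight construction explicitly. As a hedged sketch only, one common sample re-use formulation of decision-theoretic weights in this spirit (stacking of predictive distributions, offered as an assumption rather than the chapter's own derivation) chooses weights on the simplex to maximize leave-one-out predictive fit:

\[
  \hat{w} \;=\; \operatorname*{arg\,max}_{w \in \Delta^{K-1}}
  \sum_{i=1}^{n} \log \sum_{k=1}^{K} w_k \, p_k\!\left(y_i \mid y_{-i}\right),
\]

where \(p_k(y_i \mid y_{-i})\) is model k's predictive density for observation i given the remaining data.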