992 results for computational statistics


Relevance:

20.00%

Publisher:

Abstract:

Both the emission properties and the evolution of the radio jets of Active Galactic Nuclei are dependent on the magnetic (B) fields that thread them. A number of observations of AGN jets suggest that the B fields they carry have a significant helical component, at least on parsec scales. This thesis uses a model, first proposed by Laing and then developed by Papageorgiou, to explore how well the observed properties of AGN jets can be reproduced by assuming a helical B field with three parameters: pitch angle, viewing angle and degree of entanglement. This model has been applied to multifrequency Very Long Baseline Interferometry (VLBI) observations of the AGN jets of Markarian 501 and M87, making it possible to derive values for the helical pitch angle, the viewing angle and the degree of entanglement for these jets. Faraday rotation measurements are another important tool for investigating the B fields of AGN jets. A helical B field component should result in a systematic gradient in the observed Faraday rotation across the jet. Real observed radio images have finite resolution; typical beam sizes for cm-wavelength VLBI observations are often comparable to or larger than the intrinsic jet widths, raising questions about how well resolved a jet must be in the transverse direction in order to reliably detect transverse Faraday-rotation structure. This thesis presents results of Monte Carlo simulations of Faraday rotation images designed to directly investigate this question, together with a detailed investigation into the probabilities of observing spurious Faraday-rotation gradients as a result of random noise and finite resolution. These simulations clearly demonstrate the possibility of detecting transverse Faraday-rotation structures even when the intrinsic jet widths are appreciably smaller than the beam width.
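
For readers who want to experiment with the resolution question, the sketch below is a deliberately simplified, one-dimensional stand-in for the kind of Monte Carlo experiment described (the thesis works with full two-dimensional polarization images; the jet model, beam size, and noise level here are illustrative assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(1)

def trial(jet_width_beams=0.5, grad=30.0, noise=5.0, n=256):
    """One Monte Carlo trial: is the sign of a transverse RM gradient
    recovered after beam convolution and noise?

    jet_width_beams -- intrinsic jet width in units of the beam FWHM
    grad            -- true RM change across the jet (rad/m^2)
    noise           -- per-pixel RM noise (rad/m^2)
    """
    beam_fwhm = 40.0                          # beam FWHM in pixels
    jet = jet_width_beams * beam_fwhm         # jet width in pixels
    x = np.arange(n) - n / 2
    rm = np.where(np.abs(x) < jet / 2, grad * x / jet, 0.0)  # linear RM ramp
    blurred = gaussian_filter1d(rm, beam_fwhm / 2.355)       # FWHM -> sigma
    observed = blurred + rng.normal(0, noise, n)
    fit = np.abs(x) < (jet + beam_fwhm) / 2   # fit across the blurred jet
    slope = np.polyfit(x[fit], observed[fit], 1)[0]
    return np.sign(slope) == np.sign(grad)

# Fraction of 1000 trials recovering the correct gradient direction for a
# jet intrinsically only half a beam wide:
print(np.mean([trial() for _ in range(1000)]))
```

Even with the jet only half a beam wide, the recovered gradient sign agrees with the input in most trials of this toy, which is the qualitative effect the abstract reports.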

Relevance:

20.00%

Publisher:

Abstract:

This PhD thesis concerns the computational modeling of the electronic and atomic structure of point defects in technologically relevant materials. Identifying the atomistic origin of defects observed in the electrical characteristics of electronic devices has been a long-term goal of first-principles methods. In this thesis, first-principles simulations, consisting of density functional theory (DFT) supplemented with many-body perturbation theory (MBPT) methods, are performed for native defects in bulk and slab models of In0.53Ga0.47As. The latter consist of (100)-oriented surfaces passivated with Al2O3. Our results indicate that the experimentally extracted midgap interface state density (Dit) peaks are not the result of defects directly at the semiconductor/oxide interface, but originate from defects in a more bulk-like chemical environment. This conclusion is reached by considering the energy of charge transition levels for defects at the interface as a function of distance from the oxide. Our work provides insight into the types of defects responsible for the observed departure from ideal electrical behaviour in III-V metal-oxide-semiconductor (MOS) capacitors. In addition, the formation energetics and electron scattering properties of point defects in carbon nanotubes (CNTs) are studied using DFT in conjunction with Green’s function based techniques. The latter are applied to evaluate the low-temperature, low-bias Landauer conductance spectrum, from which mesoscopic transport properties such as the elastic mean free path and localization length of technologically relevant CNT sizes can be estimated from computationally tractable CNT models. Our calculations show that at CNT diameters pertinent to interconnect applications, the 555777 divacancy defect results in increased scattering and hence higher electrical resistance for electron transport near the Fermi level.
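
For context, the quantities named above are conventionally defined as follows (standard first-principles defect notation, e.g. Freysoldt et al.; this is our reconstruction, not an excerpt from the thesis):

```latex
% Formation energy of defect D in charge state q, and the charge transition
% level (q/q') used to locate defect levels relative to the band edges:
\begin{align}
  E^{f}[D^{q}](E_{\mathrm{F}}) &= E_{\mathrm{tot}}[D^{q}] - E_{\mathrm{tot}}[\mathrm{bulk}]
      - \sum_i n_i \mu_i + q\,(E_{\mathrm{F}} + E_{\mathrm{VBM}}) + E_{\mathrm{corr}}, \\
  \varepsilon(q/q') &= \frac{E^{f}[D^{q}](E_{\mathrm{F}}{=}0)
      - E^{f}[D^{q'}](E_{\mathrm{F}}{=}0)}{q' - q}.
\end{align}
% Low-bias, low-temperature Landauer conductance from the transmission T(E):
\begin{equation}
  G = \frac{2e^{2}}{h}\, T(E_{\mathrm{F}}).
\end{equation}
```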

Relevance:

20.00%

Publisher:

Abstract:

Copper is the main interconnect material in microelectronic devices, and a 2 nm-thick continuous Cu seed layer needs to be deposited to produce microelectronic devices with the smallest features and more functionality. Atomic layer deposition (ALD) is the most suitable method to deposit such thin films. However, the reaction mechanism and the surface chemistry of copper ALD remain unclear, which is deterring the development of better precursors and the design of new ALD processes. In this thesis, we study the surface chemistries during ALD of copper by means of density functional theory (DFT). To understand the effect of temperature and pressure on the composition of copper with substrates, we used ab initio atomistic thermodynamics to obtain the phase diagram of the Cu(111)/SiO2(0001) interface. We found that the interfacial oxide Cu2O phases prefer high oxygen pressure and low temperature, while the silicide phases are stable at low oxygen pressure and high temperature, in good agreement with experimental observations. Understanding precursor adsorption on surfaces is important for understanding the surface chemistry and reaction mechanism of the Cu ALD process. Focusing on two common Cu ALD precursors, Cu(dmap)2 and Cu(acac)2, we studied precursor adsorption on Cu surfaces by means of van der Waals (vdW) inclusive DFT methods. We found that the adsorption energies and adsorption geometries depend on the adsorption sites and on the method used to include vdW in the DFT calculation. Both precursor molecules are partially decomposed and the Cu cations partially reduced in their chemisorbed structures. Clean cleavage of the ligand−metal bond is found to be one of the requirements for selecting precursors for ALD of metals. Bonding between the surface and a ligand atom that is not coordinated to the Cu may result in impurities in the thin film. To gain insight into the reaction mechanism of a full Cu ALD cycle, we proposed reaction pathways based on activation energies and reaction energies for a range of surface reactions between Cu(dmap)2 and Et2Zn. The butane formation and desorption steps are found to be extremely exothermic, explaining the ALD reaction scheme of the original experimental work. Endothermic ligand diffusion and re-ordering steps may result in residual dmap ligands blocking surface sites at the end of the Et2Zn pulse, and in residual Zn being reduced and incorporated as an impurity. This may lead to a very slow growth rate, as was the case in the experimental work. By investigating the reduction of CuO to metallic Cu, we elucidated the role of the reducing agent in indirect ALD of Cu. We found that CuO bulk is protected from reduction during vacuum annealing by the CuO surface, and that H2 is required in order to reduce that surface, which shows that the strength of the reducing agent is important for obtaining fully reduced metal thin films during indirect ALD processes. Overall, in this thesis, we studied the surface chemistries and reaction mechanisms of Cu ALD processes and the nucleation of Cu to form a thin film.
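
The ab initio atomistic thermodynamics construction mentioned here is, in its standard form (following Reuter and Scheffler; our notation, not necessarily the thesis's), a comparison of interface free energies as a function of the oxygen chemical potential, which carries the temperature and pressure dependence:

```latex
% Interface free energy versus the oxygen chemical potential; the most
% stable interface phase at (T, p) is the one minimizing gamma.
\begin{align}
  \gamma(T,p) &= \frac{1}{A}\Big[ G_\mathrm{interface}
      - \sum_i N_i\,\mu_i(T,p) \Big], \\
  \mu_\mathrm{O}(T,p) &= \tfrac{1}{2} E_{\mathrm{O}_2}
      + \Delta\mu_\mathrm{O}(T,p^{0})
      + \tfrac{1}{2} k_\mathrm{B} T \ln\!\big(p/p^{0}\big).
\end{align}
```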

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we examine exchange rates in Vietnam’s transitional economy. Evidence of long-run equilibrium is established in most cases through a single cointegrating vector among the endogenous variables that determine the real exchange rates. This supports relative PPP, in which the error-correction terms (ECT) of the system can be combined linearly into a stationary process, reducing deviation from PPP in the long run. Restricted coefficient vectors ß’ = (1, 1, -1) for the real exchange rates of the currencies in question are not rejected. This evidence for relative PPP adds to the findings of many researchers, including Flôres et al. (1999), Lee (1999), Johnson (1990), Culver and Papell (1999), and Cuddington and Liang (2001). Instead of testing different time series against a common base currency, we use different base currencies (USD, GBP, JPY and EUR). By doing so we ask whether the theory posits significant differences against any one currency; we find consensus, allowing for inevitable technical differences, even with the smaller data sample for EUR. Speeds of convergence to PPP and of adjustment are faster than those reported by other researchers for developed economies, using both observed and bootstrapped half-life (HL) measures. A likely explanation is the adjustment from the hyperinflation period, after which the theory indicates that the adjustment process actually accelerates. We observe that deviations appear to have been large in the early stages of the reform, mostly overvaluation. Over time, correction took place, leading significant deviations to gradually disappear.
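
To make the methodology concrete, here is a minimal, hypothetical sketch of the kind of Johansen cointegration test and half-life calculation the paper relies on. The series below are synthetic stand-ins constructed so that relative PPP holds by construction; the variable names and parameter choices are illustrative assumptions, not the paper's data:

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(0)

# Synthetic stand-ins: I(1) log price series and a log nominal rate built
# so that the real rate q = e + p* - p is a stationary AR(1), i.e.
# relative PPP holds by construction with beta' = (1, 1, -1).
T = 300
p_star = np.cumsum(rng.normal(0.002, 0.01, T))   # foreign log prices, I(1)
p_dom = np.cumsum(rng.normal(0.005, 0.01, T))    # domestic log prices, I(1)
q = np.zeros(T)
for t in range(1, T):
    q[t] = 0.8 * q[t - 1] + rng.normal(0, 0.01)  # PPP deviation, I(0)
e = q - p_star + p_dom                           # log nominal exchange rate

res = coint_johansen(np.column_stack([e, p_star, p_dom]),
                     det_order=0, k_ar_diff=2)
print("trace statistics:", res.lr1)              # compare against res.cvt
print("leading cointegrating vector:", res.evec[:, 0])

# Half-life (HL) of PPP deviations implied by the AR(1) coefficient:
rho = 0.8
print("half-life in periods:", np.log(0.5) / np.log(rho))   # ~3.1
```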

Relevance:

20.00%

Publisher:

Abstract:

We describe a strategy for Markov chain Monte Carlo analysis of non-linear, non-Gaussian state-space models involving batch analysis for inference on dynamic, latent state variables and fixed model parameters. The key innovation is a Metropolis-Hastings method for the time series of state variables based on sequential approximation of filtering and smoothing densities using normal mixtures. These mixtures are propagated through the non-linearities using an accurate, local mixture approximation method, and we use a regenerating procedure to deal with potential degeneracy of mixture components. This provides accurate, direct approximations to sequential filtering and retrospective smoothing distributions, and hence a useful construction of global Metropolis proposal distributions for simulation of posteriors for the set of states. This analysis is embedded within a Gibbs sampler to include uncertain fixed parameters. We give an example motivated by an application in systems biology. Supplemental materials provide an example based on a stochastic volatility model as well as MATLAB code.
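
A skeletal sketch of the outer Metropolis-within-Gibbs structure follows, on a toy non-linear model. Note that the global state-path proposal here is a crude random walk; the paper's contribution is precisely to replace such a proposal with accurate normal-mixture approximations to the filtering and smoothing densities, so everything model-specific below is an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy non-linear state-space model (a stand-in, not the paper's
# systems-biology application):
#   x_t = 0.9 tanh(x_{t-1}) + w_t,   w_t ~ N(0, sigma2)   [sigma2 unknown]
#   y_t = x_t^2 / 5 + v_t,           v_t ~ N(0, 0.5)
T = 100
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * np.tanh(x_true[t - 1]) + rng.normal(0, 0.5)
y = x_true ** 2 / 5 + rng.normal(0, np.sqrt(0.5), T)

def log_post(x, sigma2):
    """Log joint of the state path and the data given sigma2 (up to a constant)."""
    trans = x[1:] - 0.9 * np.tanh(x[:-1])
    lp = -0.5 * np.sum(trans ** 2) / sigma2 - 0.5 * (T - 1) * np.log(sigma2)
    return lp - np.sum((y - x ** 2 / 5) ** 2)   # obs. variance 0.5 -> factor 1

x = np.zeros(T)
sigma2 = 1.0
for it in range(5000):
    # (1) Global MH update of the whole state trajectory; a random-walk
    #     proposal stands in for the paper's mixture-based global proposal.
    x_prop = x + rng.normal(0, 0.1, T)
    if np.log(rng.uniform()) < log_post(x_prop, sigma2) - log_post(x, sigma2):
        x = x_prop
    # (2) Gibbs update of sigma2 under an inverse-gamma(2, 1) prior.
    rss = np.sum((x[1:] - 0.9 * np.tanh(x[:-1])) ** 2)
    sigma2 = 1.0 / rng.gamma(2 + (T - 1) / 2, 1.0 / (1 + 0.5 * rss))

print("final draw of sigma2:", round(sigma2, 3))
```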

Relevance:

20.00%

Publisher:

Abstract:

This article describes advances in statistical computation for large-scale data analysis in structured Bayesian mixture models via graphics processing unit (GPU) programming. The developments are partly motivated by computational challenges arising in fitting models of increasing heterogeneity to increasingly large datasets. An example context concerns common biological studies using high-throughput technologies that generate many, very large datasets and require increasingly high-dimensional mixture models with large numbers of mixture components. We outline important strategies and processes for GPU computation in Bayesian simulation and optimization approaches, give examples of the benefits of GPU implementations in terms of processing speed and scale-up in the ability to analyze large datasets, and provide a detailed, tutorial-style exposition that will benefit readers interested in developing GPU-based approaches in other statistical models. Novel, GPU-oriented approaches to modifying existing algorithms and software design can lead to vast speed-ups and, critically, enable statistical analyses that would otherwise not be performed due to compute time limitations in traditional computational environments. Supplemental materials are provided with all source code, example data, and details that will enable readers to implement and explore the GPU approach in this mixture modeling context.
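
The core data-parallel structure being exploited can be illustrated in a few lines. The sketch below uses vectorized NumPy as a stand-in for a GPU kernel: each cell of the n-by-k likelihood grid is independent, which is what maps onto one GPU thread per (observation, component) pair. Data, component count, and parameters are hypothetical:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Hypothetical data and a k-component univariate normal mixture.
n, k = 100_000, 16
x = rng.normal(0, 3, n)
mu = np.linspace(-6, 6, k)                  # component means
w = np.full(k, 1.0 / k)                     # component weights

# The n x k grid of log-likelihoods: every cell is independent, so on a
# GPU each (observation, component) pair can be one thread. Vectorized
# NumPy plays that role here.
log_resp = np.log(w) + norm.logpdf(x[:, None], loc=mu[None, :], scale=1.0)
log_resp -= log_resp.max(axis=1, keepdims=True)   # numerical stabilization
resp = np.exp(log_resp)
resp /= resp.sum(axis=1, keepdims=True)           # posterior responsibilities

# The follow-on reductions (weights, means) are also massively parallel.
print("updated weights:", resp.mean(axis=0)[:4])
print("updated means:  ", ((resp * x[:, None]).sum(axis=0) / resp.sum(axis=0))[:4])
```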

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Scale-invariant neuronal avalanches have been observed in cell cultures and slices as well as anesthetized and awake brains, suggesting that the brain operates near criticality, i.e. within a narrow margin between avalanche propagation and extinction. In theory, criticality provides many desirable features for the behaving brain, optimizing computational capabilities, information transmission, sensitivity to sensory stimuli and size of memory repertoires. However, a thorough characterization of neuronal avalanches in freely-behaving (FB) animals is still missing, thus raising doubts about their relevance for brain function. METHODOLOGY/PRINCIPAL FINDINGS: To address this issue, we employed chronically implanted multielectrode arrays (MEA) to record avalanches of action potentials (spikes) from the cerebral cortex and hippocampus of 14 rats, as they spontaneously traversed the wake-sleep cycle, explored novel objects or were subjected to anesthesia (AN). We then modeled spike avalanches to evaluate the impact of sparse MEA sampling on their statistics. We found that the size distributions of spike avalanches are well fit by lognormal distributions in FB animals, and by truncated power laws in the AN group. FB data surrogation markedly decreases the tail of the distribution, i.e. spike shuffling destroys the largest avalanches. The FB data are also characterized by multiple key features compatible with criticality in the temporal domain, such as 1/f spectra and long-term correlations as measured by detrended fluctuation analysis. These signatures are very stable across waking, slow-wave sleep and rapid-eye-movement sleep, but collapse during anesthesia. Likewise, waiting time distributions obey a single scaling function during all natural behavioral states, but not during anesthesia. Results are equivalent for neuronal ensembles recorded from visual and tactile areas of the cerebral cortex, as well as the hippocampus. CONCLUSIONS/SIGNIFICANCE: Altogether, the data provide a comprehensive link between behavior and brain criticality, revealing a unique scale-invariant regime of spike avalanches across all major behaviors.
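
The model-comparison step reported above (lognormal versus power law for avalanche sizes) can be sketched as follows; the data here are synthetic stand-ins and the estimator is the standard continuous power-law MLE, not the study's full avalanche-detection pipeline:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical avalanche sizes standing in for spike-avalanche data.
sizes = rng.lognormal(mean=1.0, sigma=1.2, size=5000)
sizes = sizes[sizes >= 1.0]

# Lognormal fit (location pinned at 0, as is usual for sizes).
shape, loc, scale = stats.lognorm.fit(sizes, floc=0)
ll_lognorm = np.sum(stats.lognorm.logpdf(sizes, shape, loc, scale))

# Continuous power-law MLE above x_min (Clauset-style estimator):
# p(x) = (alpha - 1) / x_min * (x / x_min)^(-alpha)
x_min = sizes.min()
alpha = 1 + len(sizes) / np.sum(np.log(sizes / x_min))
ll_power = np.sum(np.log((alpha - 1) / x_min) - alpha * np.log(sizes / x_min))

# A positive log-likelihood ratio favors the lognormal, the outcome the
# abstract reports for freely-behaving animals.
print("log-likelihood ratio (lognormal - power law):", ll_lognorm - ll_power)
```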

Relevance:

20.00%

Publisher:

Abstract:

We report a comprehensive study of the binary systems of the platinum-group metals with the transition metals, using high-throughput first-principles calculations. These computations predict the stability of new compounds in 28 binary systems where no compounds have been experimentally reported in the literature, and a few dozen as-yet unreported compounds in additional systems. Our calculations also identify stable structures at compound compositions that have been previously reported without detailed structural data and indicate that some experimentally reported compounds may actually be unstable at low temperatures. With these results, we construct enhanced structure maps for the binary alloys of platinum-group metals. These maps are much more complete, systematic, and predictive than those based on empirical results alone.
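
For reference, the stability criterion underlying such high-throughput searches is conventionally written as follows (our notation, per-atom energies; the abstract itself does not spell out the formula):

```latex
% Per-atom formation enthalpy of an ordered compound A_{x}B_{1-x}; the
% compound is predicted stable when Delta H_f lies on the convex hull of
% all competing phases at that composition:
\begin{equation}
  \Delta H_f(A_xB_{1-x}) = E(A_xB_{1-x}) - x\,E(A) - (1-x)\,E(B).
\end{equation}
```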

Relevance:

20.00%

Publisher:

Abstract:

Nolan and Temple Lang argue that “the ability to express statistical computations is an essential skill.” A key related capacity is the ability to conduct and present data analysis in a way that another person can understand and replicate. The copy-and-paste workflow that is an artifact of antiquated user-interface design makes reproducibility of statistical analysis more difficult, especially as data become increasingly complex and statistical methods become increasingly sophisticated. R Markdown is a new technology that makes creating fully reproducible statistical analysis simple and painless. It provides a solution suitable not only for cutting-edge research, but also for use in an introductory statistics course. We present experiential and statistical evidence that R Markdown can be used effectively in introductory statistics courses, and discuss its role in the rapidly-changing world of statistical computation.

Relevance:

20.00%

Publisher:

Abstract:

Proteins are essential components of cells and are crucial for catalyzing reactions, signaling, recognition, motility, recycling, and structural stability. This diversity of function suggests that nature is only scratching the surface of protein functional space. Protein function is determined by structure, which in turn is determined predominantly by amino acid sequence. Protein design aims to explore protein sequence and conformational space to design novel proteins with new or improved function. The vast number of possible protein sequences makes exploring the space a challenging problem.

Computational structure-based protein design (CSPD) allows for the rational design of proteins. Because of the large search space, CSPD methods must balance search accuracy and modeling simplifications. We have developed algorithms that allow for the accurate and efficient search of protein conformational space. Specifically, we focus on algorithms that maintain provability, account for protein flexibility, and use ensemble-based rankings. We present several novel algorithms for incorporating improved flexibility into CSPD with continuous rotamers. We applied these algorithms to two biomedically important design problems. We designed peptide inhibitors of the cystic fibrosis agonist CAL that were able to restore function of the vital cystic fibrosis protein CFTR. We also designed improved HIV antibodies and nanobodies to combat HIV infections.
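
As one concrete example of the provable machinery such algorithms build on (an illustration of the family, not necessarily the exact criterion used in this thesis), the classic dead-end elimination condition prunes rotamer r at position i whenever some competing rotamer t is provably better:

```latex
% Dead-end elimination: rotamer r at position i cannot be part of the
% global minimum-energy conformation if some rotamer t at i satisfies
\begin{equation}
  E(i_r) + \sum_{j \ne i} \min_{s} E(i_r, j_s)
  \;>\;
  E(i_t) + \sum_{j \ne i} \max_{s} E(i_t, j_s),
\end{equation}
% where E(i_r) is the self energy of rotamer r at position i and
% E(i_r, j_s) is the pairwise energy with rotamer s at position j.
```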

Relevance:

20.00%

Publisher:

Abstract:

Determining how information flows along anatomical brain pathways is a fundamental requirement for understanding how animals perceive their environments, learn, and behave. Attempts to reveal such neural information flow have been made using linear computational methods, but neural interactions are known to be nonlinear. Here, we demonstrate that a dynamic Bayesian network (DBN) inference algorithm we originally developed to infer nonlinear transcriptional regulatory networks from gene expression data collected with microarrays is also successful at inferring nonlinear neural information flow networks from electrophysiology data collected with microelectrode arrays. The inferred networks we recover from the songbird auditory pathway are correctly restricted to a subset of known anatomical paths, are consistent with timing of the system, and reveal both the importance of reciprocal feedback in auditory processing and greater information flow to higher-order auditory areas when birds hear natural as opposed to synthetic sounds. A linear method applied to the same data incorrectly produces networks with information flow to non-neural tissue and over paths known not to exist. To our knowledge, this study represents the first biologically validated demonstration of an algorithm to successfully infer neural information flow networks.
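
The structural idea, scoring how well a candidate parent set at time t-1 explains a channel at time t, can be sketched with a minimal discrete example. The XOR interaction below is a hypothetical stand-in that illustrates why a nonlinear DBN-style score can recover influences a linear method misses:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical binned activity from three channels; channel c is driven
# one bin later by an XOR of channels a and b, a nonlinear interaction
# that linear methods cannot capture.
T = 20000
a = rng.integers(0, 2, T)
b = rng.integers(0, 2, T)
c = np.zeros(T, dtype=int)
c[1:] = a[:-1] ^ b[:-1]

def dbn_score(child, parents):
    """Log-likelihood of child[t] given parents[t-1]: a minimal DBN family
    score (multinomial MLE; a real DBN adds a complexity penalty such as
    BIC or a Bayesian Dirichlet score)."""
    X = np.column_stack([p[:-1] for p in parents])
    y = child[1:]
    ll = 0.0
    for cfg in np.unique(X, axis=0):        # enumerate parent configurations
        mask = np.all(X == cfg, axis=1)
        p1 = min(max(y[mask].mean(), 1e-9), 1 - 1e-9)   # guard log(0)
        ll += np.sum(y[mask] * np.log(p1) + (1 - y[mask]) * np.log(1 - p1))
    return ll

print("score c <- {a}:   ", dbn_score(c, [a]))      # near chance
print("score c <- {a, b}:", dbn_score(c, [a, b]))   # far higher: XOR found
```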

Relevance:

20.00%

Publisher:

Abstract:

Our media is saturated with claims of "facts" made from data. Database research has in the past focused on how to answer queries, but has not devoted much attention to discerning more subtle qualities of the resulting claims, e.g., is a claim "cherry-picking"? This paper proposes a Query Response Surface (QRS) based framework that models claims based on structured data as parameterized queries. A key insight is that we can learn a lot about a claim by perturbing its parameters and seeing how its conclusion changes. This framework lets us formulate and tackle practical fact-checking tasks --- reverse-engineering vague claims, and countering questionable claims --- as computational problems. Within the QRS based framework, we take one step further and propose a problem, along with efficient algorithms, for finding high-quality claims of a given form from data, i.e., raising good questions, in the first place. This is achieved by using a limited number of high-valued claims to represent high-valued regions of the QRS. Besides the general-purpose high-quality claim-finding problem, lead-finding can be tailored towards specific claim-quality measures, also defined within the QRS framework. An example of uniqueness-based lead-finding is presented for "one-of-the-few" claims, yielding interpretable high-quality claims and an adjustable mechanism for ranking objects, e.g., NBA players, based on what claims can be made for them. Finally, we study the use of visualization as a powerful way of conveying the results of a large number of claims. An efficient two-stage sampling algorithm is proposed for generating the input of a 2D scatter plot with heatmap, evaluating a limited amount of data while preserving the two essential visual features, namely outliers and clusters. For all the problems, we present real-world examples and experiments that demonstrate the power of our model, the efficiency of our algorithms, and the usefulness of their results.
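
The parameter-perturbation insight can be made concrete with a toy example (hypothetical data and a made-up windowed-average claim; this is not the paper's algorithm, only the intuition behind it):

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "cherry-picking" check in the QRS spirit: the claim is a windowed
# average, q(w) = mean of the last w values, with the window w chosen by
# the claimant. Perturb the parameter and see how much of its neighborhood
# still supports the conclusion.
series = rng.normal(10, 3, 82)           # hypothetical per-game statistic
claimed_w = 5                            # window the claimant picked

def q(w):
    return series[-w:].mean()

claimed_value = q(claimed_w)
neighborhood = range(3, 21)              # perturbations of the parameter
support = np.mean([q(w) >= claimed_value for w in neighborhood])
print(f"claim q({claimed_w}) = {claimed_value:.2f}; "
      f"fraction of nearby windows at least as strong: {support:.2f}")
# A low fraction suggests cherry-picking: small parameter perturbations
# mostly weaken the conclusion.
```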

Relevance:

20.00%

Publisher:

Abstract:

With increasing recognition of the roles RNA molecules and RNA/protein complexes play in an unexpected variety of biological processes, understanding of RNA structure-function relationships is of high current importance. To make clean biological interpretations from three-dimensional structures, it is imperative to have high-quality, accurate RNA crystal structures available, and the community has thoroughly embraced that goal. However, due to the many degrees of freedom inherent in RNA structure (especially for the backbone), it is a significant challenge to succeed in building accurate experimental models for RNA structures. This chapter describes the tools and techniques our research group and our collaborators have developed over the years to help RNA structural biologists both evaluate and achieve better accuracy. Expert analysis of large, high-resolution, quality-conscious RNA datasets provides the fundamental information that enables automated methods for robust and efficient error diagnosis in validating RNA structures at all resolutions. The even more crucial goal of correcting the diagnosed outliers has steadily developed toward highly effective, computationally based techniques. Automation enables solving complex issues in large RNA structures, but cannot circumvent the need for thoughtful examination of local details, and so we also provide some guidance for interpreting and acting on the results of current structure validation for RNA.

Relevance:

20.00%

Publisher:

Abstract:

Slowly compressed single crystals, bulk metallic glasses (BMGs), rocks, granular materials, and the earth all deform via intermittent slips or "quakes". We find that although these systems span 12 decades in length scale, they all show the same scaling behavior for their slip size distributions and other statistical properties. Remarkably, the size distributions follow the same power law multiplied by the same exponential cutoff. The cutoff grows with applied force for materials spanning length scales from nanometers to kilometers. The tunability of the cutoff with stress reflects "tuned critical" behavior, rather than self-organized criticality (SOC), which would imply stress-independence. A simple mean field model for avalanches of slipping weak spots explains the agreement across scales. It predicts the observed slip-size distributions and the observed stress-dependent cutoff function. The results enable extrapolations from one scale to another, and from one force to another, across different materials and structures, from nanocrystals to earthquakes.
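
In the notation commonly used for such mean-field slip models (our reconstruction, not a formula quoted from the abstract; mean-field theory gives tau = 3/2), the shared scaling form is:

```latex
% Shared avalanche-size scaling: a power law with a stress-dependent
% exponential cutoff.
\begin{equation}
  P(S) \sim S^{-\tau}\, e^{-S/S_{\max}},
  \qquad
  S_{\max} \propto (\sigma_c - \sigma)^{-1/\tilde{\sigma}},
\end{equation}
% so the cutoff S_max grows as the applied stress sigma approaches the
% critical stress sigma_c, rather than being fixed as SOC would imply.
```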

Relevance:

20.00%

Publisher:

Abstract:

The adoption of antisense gene silencing as a novel disinfectant for prokaryotic organisms is hindered by poor silencing efficiencies. Few studies have considered the effects of off-targets on silencing efficiencies, especially in prokaryotic organisms. In this computational study, a novel algorithm was developed that determined and sorted the number of off-targets as a function of alignment length in Escherichia coli K-12 MG1655 and Mycobacterium tuberculosis H37Rv. The mean number of off-targets per single location was calculated to be 14.1 ± 13.3 and 36.1 ± 58.5 for the genomes of E. coli K-12 MG1655 and M. tuberculosis H37Rv, respectively. Furthermore, when the entire transcriptome was analyzed, it was found that there was no general gene location that could be targeted to minimize or maximize the number of off-targets. In an effort to determine the effects of off-targets on silencing efficiencies, previously published studies were used. Analyses with acpP, ino1, and marORAB revealed a statistically significant relationship between the number of short-alignment-length off-target hybrids and the efficacy of the antisense gene silencing, suggesting that minimizing off-targets may be beneficial for antisense gene silencing in prokaryotic organisms.
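
A minimal sketch of exact-match off-target counting as a function of alignment length follows (random synthetic genome and an arbitrary target site; the study's algorithm and the real genomes are not reproduced here):

```python
import random
from collections import defaultdict

def count_off_targets(genome, target, lengths=range(8, 16)):
    """For each alignment length L, count genome positions (other than the
    target site itself) sharing an exact L-mer with the target."""
    counts = {}
    for L in lengths:
        index = defaultdict(int)
        for i in range(len(genome) - L + 1):     # index every L-mer once
            index[genome[i:i + L]] += 1
        # Subtract 1 per target L-mer to exclude the site's own occurrence.
        counts[L] = sum(index[target[j:j + L]] - 1
                        for j in range(len(target) - L + 1))
    return counts

random.seed(0)
genome = "".join(random.choice("ACGT") for _ in range(200_000))
target = genome[50_000:50_020]                   # a 20-nt antisense site
print(count_off_targets(genome, target))         # hits drop sharply with L
```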