951 results for Statistical Analysis
Abstract:
The brain is perhaps the most complex system ever to have been subjected to rigorous scientific investigation. The scale is staggering: over 10^11 neurons, each making an average of 10^3 synapses, with computation occurring on scales ranging from a single dendritic spine to an entire cortical area. Slowly, we are beginning to acquire experimental tools that can gather the massive amounts of data needed to characterize this system. However, understanding and interpreting these data will also require substantial strides in inferential and statistical techniques. This dissertation attempts to meet this need, extending and applying the modern tools of latent variable modeling to problems in neural data analysis.
It is divided into two parts. The first begins with an exposition of the general techniques of latent variable modeling. A new and extremely general optimization algorithm, called Relaxation Expectation Maximization (REM), is proposed; it may be used to learn the optimal parameter values of arbitrary latent variable models. This algorithm appears to alleviate the common problem of convergence to local, sub-optimal likelihood maxima. REM leads to a natural framework for model size selection; in combination with standard model selection techniques, the quality of fits may be further improved while the appropriate model size is automatically and efficiently determined. Next, a new latent variable model, the mixture of sparse hidden Markov models, is introduced, and approximate inference and learning algorithms are derived for it. This model is applied in the second part of the thesis.
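As a rough illustration of the baseline that REM builds on, the sketch below runs standard EM updates for a one-dimensional Gaussian mixture, a simple latent variable model. The relaxation schedule and automatic model-size selection specific to REM are not reproduced here, and all names and data are invented for the example.

    import numpy as np

    def em_gmm(x, k=2, n_iter=100, seed=0):
        """Standard EM for a 1-D Gaussian mixture (the baseline that REM extends)."""
        rng = np.random.default_rng(seed)
        n = len(x)
        pi = np.full(k, 1.0 / k)                   # mixing weights
        mu = rng.choice(x, size=k, replace=False)  # component means
        var = np.full(k, np.var(x))                # component variances
        for _ in range(n_iter):
            # E-step: posterior responsibility of each component for each point.
            log_p = (-0.5 * np.log(2 * np.pi * var)[:, None]
                     - 0.5 * (x[None, :] - mu[:, None]) ** 2 / var[:, None]
                     + np.log(pi)[:, None])
            log_p -= log_p.max(axis=0)             # guard against underflow
            resp = np.exp(log_p)
            resp /= resp.sum(axis=0)
            # M-step: re-estimate parameters from the responsibilities.
            nk = resp.sum(axis=1)
            pi = nk / n
            mu = (resp * x).sum(axis=1) / nk
            var = (resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk
        return pi, mu, var

    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(0, 1, 300), rng.normal(4, 1, 200)])
    print(em_gmm(x, k=2))

A single random initialization is shown; REM's relaxation is aimed precisely at avoiding the sub-optimal local maxima such a run can converge to.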
The second part brings the technology of Part I to bear on two important problems in experimental neuroscience. The first, known as spike sorting, is the problem of separating the spikes of different neurons embedded within an extracellular recording. The dissertation offers the first thorough statistical analysis of this problem, which then yields the first powerful probabilistic solution. The second problem addressed is that of characterizing the distribution of spike trains recorded from the same neuron under identical experimental conditions. A latent variable model is proposed. Inference and learning in this model lead to new principled algorithms for smoothing and clustering of spike data.
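For concreteness, a minimal spike-sorting sketch in the spirit of probabilistic clustering is shown below. It clusters PCA features of spike waveforms with an off-the-shelf Gaussian mixture rather than the mixture of sparse hidden Markov models developed in the thesis, and the waveform array is a random placeholder.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.mixture import GaussianMixture

    # Hypothetical array of detected spikes, one 32-sample waveform snippet per row.
    waveforms = np.random.randn(1000, 32)

    features = PCA(n_components=3).fit_transform(waveforms)              # reduce dimension
    labels = GaussianMixture(n_components=3, random_state=0).fit_predict(features)
    print(np.bincount(labels))   # spike count assigned to each putative neuron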
Abstract:
High-resolution orbital and in situ observations of the Martian surface acquired during the past two decades provide the opportunity to study the rock record of Mars at an unprecedented level of detail. This dissertation consists of four studies whose common goal is to establish new standards for the quantitative analysis of visible and near-infrared data from the surface of Mars. Through the compilation of global image inventories, the application of stratigraphic and sedimentologic statistical methods, and the use of laboratory analogs, this dissertation provides insight into the history of past depositional and diagenetic processes on Mars. The first study presents a global inventory of stratified deposits observed in images from the High Resolution Imaging Science Experiment (HiRISE) camera on board the Mars Reconnaissance Orbiter. This work uses the widespread coverage of high-resolution orbital images to make global-scale observations about the processes controlling sediment transport and deposition on Mars. The next chapter presents a study of bed thickness distributions in Martian sedimentary deposits, showing how statistical methods can be used to establish quantitative criteria for evaluating the depositional history of stratified deposits observed in orbital images. The third study tests the ability of spectral mixing models to obtain quantitative mineral abundances from near-infrared reflectance spectra of clay and sulfate mixtures in the laboratory, for application to the analysis of orbital spectra of sedimentary deposits on Mars. The final study employs a statistical analysis of the size, shape, and distribution of nodules observed by the Mars Science Laboratory Curiosity rover team in the Sheepbed mudstone at Yellowknife Bay in Gale crater. This analysis is used to evaluate hypotheses for nodule formation and to gain insight into the diagenetic history of an ancient habitable environment on Mars.
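As a hedged illustration of the bed-thickness analysis mentioned above, the sketch below fits two candidate distributions to a set of thickness measurements and compares their log-likelihoods. The thickness values are synthetic placeholders rather than Martian data, and the choice of candidate distributions is only an assumption for the example.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    thicknesses = rng.lognormal(mean=0.0, sigma=0.6, size=200)   # synthetic bed thicknesses (m)

    # Fit exponential and log-normal models (location fixed at zero) and compare fits.
    exp_params = stats.expon.fit(thicknesses, floc=0)
    logn_params = stats.lognorm.fit(thicknesses, floc=0)

    ll_exp = stats.expon.logpdf(thicknesses, *exp_params).sum()
    ll_logn = stats.lognorm.logpdf(thicknesses, *logn_params).sum()
    print(f"exponential log-likelihood: {ll_exp:.1f}")
    print(f"log-normal  log-likelihood: {ll_logn:.1f}")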
Abstract:
In multisource industrial scenarios (MSIS), NOAA-generating activities coexist with other productive sources of airborne particles, such as parallel manufacturing processes or electrical and diesel machinery. A distinctive characteristic of MSIS is the spatially complex distribution of aerosol sources, as well as their potentially different dynamics, owing to the feasibility of multi-task configurations at any given time. Thus, the background signal is expected to challenge the aerosol analyzers over a probably wide range of concentrations and size distributions, depending on the multisource configuration at a given time. Monitoring and prediction using statistical analysis of time series captured by on-line particle analyzers in industrial scenarios have been proven feasible for predicting the evolution of particle number concentration (PNC), provided a given quality of the net signal (the difference between the signal at the source and the background). However, the analysis and modelling of inconsistent time series affected by a low signal-to-noise ratio (SNR) could provide a misleading basis for decision making. In this context, this work explores the use of stochastic models based on the ARIMA methodology to monitor and predict exposure values (PNC). The study was carried out in an MSIS, with a case study focused on the manufacture of perforated tablets of nano-TiO2 by cold pressing.
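The snippet below is a minimal sketch of the ARIMA-based monitoring idea, assuming the statsmodels library and a synthetic PNC trace; in practice the model order would be chosen from the measured series (for example by AIC) rather than fixed at (1, 1, 1).

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    pnc = pd.Series(1e4 + np.cumsum(rng.normal(0, 50, 500)))   # synthetic PNC trace

    model = ARIMA(pnc, order=(1, 1, 1)).fit()       # fit an ARIMA(1,1,1) to the series
    forecast = model.forecast(steps=30)             # predict the next 30 sampling intervals
    print(forecast.head())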
Abstract:
Bottlenose dolphins (Tursiops truncatus) inhabit the estuarine waters near Charleston, South Carolina (SC), where they feed, nurse, and socialize. While in these waters, dolphins are exposed to multiple direct and indirect threats, including anthropogenic impacts (e.g., harassment by boat traffic and entanglement in fishing gear) and environmental degradation. Bottlenose dolphins are protected under the Marine Mammal Protection Act of 1972. Over the years, the percentage of strandings occurring in the estuaries has increased in South Carolina; specifically, recent stranding data show an increase in strandings in Charleston, SC, near areas of residential development. During the same timeframe, Charleston experienced a shift in human population towards the coastline. These two trends, the rise in estuarine dolphin strandings and the shift in human population, have raised the question of whether the increase in strandings reflects more strandings being detected and reported, or a true increase in stranding events. Using GIS, the trends in strandings were compared to residential growth, boat permits, fishing permits, and dock permits in Charleston County from 1994-2009. A simple linear regression analysis was performed to determine whether there were any significant relationships between strandings and boat permits, commercial fishing permits, and crab-pot permits. The results show that the stranding trend moves toward Charleston Harbor and adjacent rivers over time, which suggests the increase in strandings is related to strandings becoming more detectable. The statistical analysis shows that the factors that cause human-interaction strandings, such as boats, commercial fishing, and crab-pot line entanglements, are not significantly related to strandings, further supporting the hypothesis that the increase in strandings is due to increased observation on the water as the human coastal population grows, and is not a natural phenomenon. This study has local, and potentially regional, marine spatial planning implications for protecting coastal natural resources, such as the bottlenose dolphin, while balancing coastal development.
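A hedged sketch of the simple linear regression step is given below. The permit and stranding counts are random placeholders, not the Charleston County data, and only illustrate how such a relationship would be tested.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    years = np.arange(1994, 2010)                                # the 1994-2009 study span
    boat_permits = rng.integers(800, 1200, size=years.size)      # placeholder annual counts
    strandings = rng.integers(5, 25, size=years.size)            # placeholder annual counts

    result = stats.linregress(boat_permits, strandings)
    print(f"slope={result.slope:.3f}  r={result.rvalue:.2f}  p={result.pvalue:.3f}")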
Abstract:
A statistical analysis was made of the dimensions involved in the adoption of improved fish curing practices.
Abstract:
We conducted a comparative statistical analysis of tetra- through hexanucleotide frequencies in two sets of introns of yeast genes. The first set consisted of introns of genes with transcription rates higher than 30 mRNAs/h, while the second set contained introns of genes whose transcription rates were lower than or equal to 10 mRNAs/h. Several oligonucleotides were detected whose occurrence frequencies in the first set of introns are significantly higher than those in the second set. The occurrence frequencies of most of these oligonucleotides are also significantly higher than those in the exons flanking the introns of the first set. Interestingly, some of the detected oligonucleotides match well-known "signature" sequences of transcriptional regulatory elements. This could imply the existence of potential positive regulatory motifs of transcription in yeast introns. © 2003 Elsevier Ltd. All rights reserved.
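The sketch below illustrates, under stated assumptions, the kind of k-mer frequency comparison described: it counts 4-mers in two sets of sequences and applies a chi-squared test to one motif. The sequences are random stand-ins for the two yeast intron sets, and the motif "TATA" is chosen purely as an example.

    import random
    from collections import Counter
    from scipy.stats import chi2_contingency

    def kmer_counts(seqs, k):
        """Count all overlapping k-mers across a list of sequences."""
        counts = Counter()
        for s in seqs:
            for i in range(len(s) - k + 1):
                counts[s[i:i + k]] += 1
        return counts

    random.seed(0)
    make = lambda n: ["".join(random.choices("ACGT", k=300)) for _ in range(n)]
    high_tx, low_tx = make(50), make(50)            # stand-ins for the two intron sets

    k = 4
    c_hi, c_lo = kmer_counts(high_tx, k), kmer_counts(low_tx, k)
    motif = "TATA"                                   # example oligonucleotide, not from the study
    table = [[c_hi[motif], sum(c_hi.values()) - c_hi[motif]],
             [c_lo[motif], sum(c_lo.values()) - c_lo[motif]]]
    chi2, p, _, _ = chi2_contingency(table)
    print(f"{motif}: chi2={chi2:.2f}, p={p:.3f}")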
Abstract:
A novel launch scheme is proposed for multimode-fiber (MMF) links. Enhanced performance in 10 Gb/s MMF links using electronic equalization is demonstrated by statistical analysis of installed-base fiber and an experimental investigation. © 2007 Optical Society of America.
Abstract:
Rigorous statistical analysis is applied for the first time to identify optimal launch conditions and carrier frequencies for SCM transmission over worst-case MMF. The feasibility of multichannel schemes for 10 Gb/s over 300 m is demonstrated. © 2005 Optical Society of America.
Abstract:
Toivonen, H., Srinivasan, A., King, R. D., Kramer, S. and Helma, C. (2003) Statistical Evaluation of the Predictive Toxicology Challenge 2000-2001. Bioinformatics 19: 1183-1193
Abstract:
The spread of democracy in the latter part of the twentieth century has been accompanied by an increasing focus on its perceived performance in established western democracies. Recent literature has expressed concern about a critical outlook among younger cohorts which threatens their political support and engagement. Political efficacy, referring to the feeling of political effectiveness, is considered a key indicator of the performance of democratic politics, as it refers to the empowerment of citizens and relates to their willingness to engage in political matters. The aim of this thesis is to analyse the socialisation of political efficacy among those on the threshold of political adulthood, i.e. 'threshold voters'. The long-term significance of the attitudes developed by the time of entry to adulthood for political engagement during adulthood has been emphasised in recent literature. By capturing the effect of non-political and political learning among threshold voters, the study advances existing research frames, which focus on childhood and early adolescent socialisation. The theoretical and methodological framework applied herein recognises the distinction between internal and external political efficacy, which has not been consistently operationalised in existing research on efficacy socialisation. This research involves a case study of threshold voters in the Republic of Ireland and employs a quantitative methodology. A study of Irish threshold voters is timely, as the parliament and government have recently proposed a lowering of the voting age and an expansion of formal political education for this age group. A project-specific survey instrument was developed and administered to a systematic stratified sample of 1,042 post-primary students in the Cork area. Interpretation of the results of the statistical analysis leads to findings on the divergent influence of family, school, associational, and political agents/environments on threshold voters' internal and external political efficacy.
Abstract:
In 1966, Roy Geary, Director of the ESRI, noted that "the absence of any kind of import and export statistics for regions is a grave lacuna" and that, if regional analyses were to be developed, regional Input-Output Tables must be put on the "regular statistical assembly line". Forty-five years later, the lacuna lamented by Geary still exists and remains the most significant challenge to the construction of regional Input-Output Tables in Ireland. The continued paucity of regional data sufficient to compile effective regional Supply and Use and Input-Output Tables has retarded the capacity to construct sound regional economic models and to provide a robust evidence base with which to formulate and assess regional policy. This study takes a first step towards addressing this gap by presenting the first set of fully integrated, symmetric Supply and Use and domestic Input-Output Tables compiled for the NUTS 2 regions in Ireland: the Border, Midland and Western region and the Southern and Eastern region. These tables are general purpose in nature and are fully consistent with the official national Supply and Use and Input-Output Tables and with the regional accounts. The tables are constructed using a survey-based, bottom-up approach rather than modelling techniques, yielding more robust and credible tables. They are used to present a descriptive statistical analysis of the two administrative NUTS 2 regions in Ireland, drawing particular attention to the underlying structural differences in regional trade balances and in the composition of Gross Value Added in those regions. By deriving regional employment multipliers and constructing Domestic Demand Employment matrices, the study quantifies and illustrates the supply-chain impact on employment. In the final part of the study, the predictive capability of the Input-Output framework is tested over two time periods. For both periods, the static Leontief production function assumptions are relaxed to allow for labour productivity. Comparative results from this experiment are presented.
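To make the multiplier machinery concrete, the sketch below works through a two-sector Leontief calculation with invented coefficients. It is not the study's table, only an illustration of how a Leontief inverse and Type I employment multipliers are derived.

    import numpy as np

    # Invented two-sector technical coefficients (inputs per unit of gross output).
    A = np.array([[0.20, 0.30],
                  [0.10, 0.40]])
    final_demand = np.array([100.0, 150.0])        # illustrative final demand
    emp_coeff = np.array([0.02, 0.05])             # jobs per unit of output (invented)

    L = np.linalg.inv(np.eye(2) - A)               # Leontief inverse (I - A)^-1
    gross_output = L @ final_demand                # output required to meet final demand

    jobs_per_fd = emp_coeff @ L                    # employment per unit of final demand
    type1_multipliers = jobs_per_fd / emp_coeff    # Type I employment multipliers

    print("gross output:", gross_output)
    print("Type I employment multipliers:", type1_multipliers)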
Abstract:
Nolan and Temple Lang argue that "the ability to express statistical computations is an essential skill." A key related capacity is the ability to conduct and present data analysis in a way that another person can understand and replicate. The copy-and-paste workflow that is an artifact of antiquated user-interface design makes reproducibility of statistical analysis more difficult, especially as data become increasingly complex and statistical methods become increasingly sophisticated. R Markdown is a new technology that makes creating fully-reproducible statistical analysis simple and painless. It provides a solution suitable not only for cutting edge research, but also for use in an introductory statistics course. We present experiential and statistical evidence that R Markdown can be used effectively in introductory statistics courses, and discuss its role in the rapidly-changing world of statistical computation.
Abstract:
New representations of tree-structured data objects, using ideas from topological data analysis, enable improved statistical analyses of a population of brain artery trees. A number of representations of each data tree arise from persistence diagrams that quantify branching and looping of vessels at multiple scales. Novel approaches to the statistical analysis, through various summaries of the persistence diagrams, lead to heightened correlations with covariates such as age and sex, relative to earlier analyses of this data set. The correlation with age continues to be significant even after controlling for correlations from earlier significant summaries.
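A toy version of the "summarize the persistence diagrams, then correlate with covariates" workflow is sketched below. The diagrams and ages are synthetic, and total persistence stands in for the richer summaries used in the paper; real diagrams would come from a TDA library applied to the artery trees.

    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n_subjects = 98
    ages = rng.uniform(20, 80, n_subjects)

    def total_persistence(diagram):
        """Sum of (death - birth) over all points of a persistence diagram."""
        return float(np.sum(diagram[:, 1] - diagram[:, 0]))

    # Synthetic diagrams whose total persistence drifts weakly with age.
    diagrams = []
    for a in ages:
        births = rng.uniform(0, 1, 30)
        deaths = births + rng.exponential(0.1 + 0.002 * a, 30)
        diagrams.append(np.column_stack([births, deaths]))

    summaries = np.array([total_persistence(d) for d in diagrams])
    r, p = pearsonr(summaries, ages)
    print(f"correlation with age: r={r:.2f}, p={p:.3g}")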
Abstract:
The objectives of this study were to determine the fracture toughness of adhesive interfaces between dentine and clinically relevant, thin layers of dental luting cements. Cements tested included a conventional glass-ionomer, F (Fuji I), a resin-modified glass-ionomer, FP (Fuji Plus), and a compomer cement, D (DyractCem). Ten miniature short-bar chevron-notch specimens were manufactured for each cement, each comprising a 40 µm thick chevron of lute between two 1.5 mm thick blocks of bovine dentine, encased in resin composite. The interfacial K_IC results (MN/m^3/2), as median (range), were: F, 0.152 (0.14-0.16); FP, 0.306 (0.27-0.37); D, 0.351 (0.31-0.37). Non-parametric statistical analysis showed that the fracture toughness of F was significantly lower (p
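The excerpt above is truncated, so the exact non-parametric test is not shown. The sketch below therefore assumes a Kruskal-Wallis test with pairwise Mann-Whitney U comparisons, and it draws placeholder K_IC values from the reported ranges rather than using the study's raw measurements.

    import numpy as np
    from scipy.stats import kruskal, mannwhitneyu

    rng = np.random.default_rng(0)
    F  = rng.uniform(0.14, 0.16, 10)    # conventional glass-ionomer (Fuji I), placeholder values
    FP = rng.uniform(0.27, 0.37, 10)    # resin-modified glass-ionomer (Fuji Plus), placeholder values
    D  = rng.uniform(0.31, 0.37, 10)    # compomer cement (DyractCem), placeholder values

    print(kruskal(F, FP, D))                            # overall test for a group difference
    print(mannwhitneyu(F, FP), mannwhitneyu(F, D))      # pairwise comparisons against F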