89 results for Data Interpretation, Statistical


Relevance: 40.00%

Abstract:

We are developing computational tools to support detailed analysis of how neural electrophysiological responses depend on dendritic morphology. We approach this problem by combining simulations of faithful neuron models (real experimental morphological data combined with known models of channel kinetics) with algorithmic extraction of morphological and physiological parameters and statistical analysis. In this paper, we present a novel method for the automatic recognition of spike trains in voltage traces, which eliminates the need for human intervention. This enables classification of waveforms with consistent criteria across all analyzed traces and thereby reduces noise in the data. The method allows automatic extraction of the physiological parameters needed for further statistical analysis. To illustrate the usefulness of this procedure, we characterized the influence of the somatic current injection level on several electrophysiological parameters in a set of modeled neurons. This application suggests that such algorithmic processing of physiological data extracts parameters in a form suitable for further investigation of structure-activity relationships in single neurons.
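
As a minimal illustration of threshold-based spike recognition (a Python sketch, not the authors' algorithm; the trace, threshold and spike shapes below are invented), upward threshold crossings yield spike times from which simple physiological parameters follow:

    import numpy as np

    def detect_spikes(v, dt, threshold=-20.0):
        """Return spike times (s) where v crosses `threshold` upward."""
        above = v >= threshold
        onsets = np.flatnonzero(~above[:-1] & above[1:]) + 1
        return onsets * dt

    # Synthetic trace: resting potential with three crude 2 ms "spikes".
    dt = 1e-4                                # 0.1 ms sampling interval
    t = np.arange(0.0, 0.5, dt)
    v = np.full_like(t, -65.0)
    for t_spike in (0.10, 0.22, 0.31):       # hypothetical spike onsets (s)
        i = int(t_spike / dt)
        v[i:i + 20] = 30.0

    times = detect_spikes(v, dt)
    isi = np.diff(times)
    print(f"{times.size} spikes; mean inter-spike interval {isi.mean()*1e3:.0f} ms")

Because the same threshold rule is applied to every trace, the extracted counts and inter-spike intervals are comparable across an entire dataset, which is the consistency property the abstract emphasizes.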

Relevance: 40.00%

Abstract:

It is generally assumed that the variability of neuronal morphology has an important effect on both the connectivity and the activity of the nervous system, but this effect has not been thoroughly investigated. Neuroanatomical archives represent a crucial tool to explore structure-function relationships in the brain. We are developing computational tools to describe, generate, store and render large sets of three-dimensional neuronal structures in a format that is compact, quantitative, accurate and readily accessible to the neuroscientist. Single-cell neuroanatomy can be characterized quantitatively at several levels. In computer-aided neuronal tracing files, a dendritic tree is described as a series of cylinders, each represented by diameter, spatial coordinates and the connectivity to other cylinders in the tree. This Cartesian description constitutes a completely accurate mapping of dendritic morphology, but it bears little intuitive information for the neuroscientist. In contrast, a classical neuroanatomical analysis characterizes neuronal dendrites on the basis of the statistical distributions of morphological parameters, e.g. maximum branching order or bifurcation asymmetry. This description is intuitively more accessible, but it only yields information on the collective anatomy of a group of dendrites, i.e. it is not complete enough to provide a precise blueprint of the original data. We are adopting a third, intermediate level of description, which consists of the algorithmic generation of neuronal structures within a certain morphological class based on a set of fundamental, measured parameters. This description is as intuitive as a classical neuroanatomical analysis (parameters have an intuitive interpretation), and as complete as a Cartesian file (the algorithms generate and display complete neurons). The advantages of the algorithmic description of neuronal structure are immense. If an algorithm can measure the values of a handful of parameters from an experimental database and generate virtual neurons whose anatomy is statistically indistinguishable from that of their real counterparts, a great deal of data compression and amplification can be achieved. Data compression results from the quantitative and complete description of thousands of neurons with a handful of statistical distributions of parameters. Data amplification is possible because, from a set of experimental neurons, many more virtual analogues can be generated. This approach could allow one, in principle, to create and store a neuroanatomical database containing data for an entire human brain on a personal computer. We are using two programs, L-NEURON and ARBORVITAE, to investigate systematically the potential of several different algorithms for the generation of virtual neurons. Using these programs, we have generated anatomically plausible virtual neurons for several morphological classes, including guinea pig cerebellar Purkinje cells and cat spinal cord motor neurons. These virtual neurons are stored in an online electronic archive of dendritic morphology. This process highlights the potential and the limitations of the computational neuroanatomy strategy for neuroscience databases.
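
The algorithmic level of description can be sketched as follows (a toy Python sketch, not L-NEURON or ARBORVITAE; every distribution and parameter value is invented for illustration): a binary dendritic tree is grown recursively by sampling segment length, taper and bifurcation probability from assumed measured statistics.

    import random

    def grow(diameter, depth=0, max_depth=8):
        """Grow a dendritic subtree; each node carries (length, diameter)."""
        length = random.gauss(40.0, 10.0)          # assumed segment-length stats (um)
        p_branch = max(0.0, 0.8 - 0.1 * depth)     # branching less likely distally
        node = {"length": length, "diameter": diameter, "children": []}
        if depth < max_depth and random.random() < p_branch:
            for _ in range(2):                     # bifurcation: two daughters
                child_d = diameter * random.uniform(0.6, 0.8)  # taper at branch
                node["children"].append(grow(child_d, depth + 1, max_depth))
        return node

    def count_tips(n):
        return 1 if not n["children"] else sum(count_tips(c) for c in n["children"])

    random.seed(1)
    tree = grow(diameter=2.0)
    print("terminal tips:", count_tips(tree))

A handful of distribution parameters (here the length mean/spread, taper range and branching probabilities) thus stands in for, and can regenerate, arbitrarily many complete trees, which is the compression/amplification argument made above.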

Relevance: 40.00%

Abstract:

This chapter introduces the latest practices and technologies in the interactive interpretation of environmental data. With environmental data becoming ever larger, more diverse and more complex, there is a need for a new generation of tools that provides new capabilities over and above those of the standard workhorses of science. These new tools aid the scientist in discovering interesting new features (and also problems) in large datasets by allowing the data to be explored interactively using simple, intuitive graphical tools. In this way, new discoveries are made that are commonly missed by automated batch data processing. This chapter discusses the characteristics of environmental science data, common current practice in data analysis and the supporting tools and infrastructure. New approaches are introduced and illustrated from the points of view of both the end user and the underlying technology. We conclude by speculating as to future developments in the field and what must be achieved to fulfil this vision.

Relevance: 40.00%

Abstract:

As in any field of scientific inquiry, advances in the field of second language acquisition (SLA) rely in part on the interpretation and generalizability of study findings obtained through quantitative data analysis and inferential statistics. While statistical techniques such as ANOVA and t-tests are widely used in second language research, this article reviews a class of newer statistical models that has not yet been widely adopted in the field but has garnered interest in other fields of language research: mixed-effects models. These models are introduced, and their potential benefits for the second language researcher are discussed. A simple example of mixed-effects data analysis using the statistical software package R (R Development Core Team, 2011) is provided as an introduction to these techniques and to exemplify how such analyses can be reported in research articles. It is concluded that mixed-effects models provide the second language researcher with a powerful tool for the analysis of a variety of types of second language acquisition data.
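
The article's worked example is in R; as a hedged parallel, the same kind of model can be fit in Python with statsmodels (all data below are simulated, and the names score, condition and subject are invented): a fixed effect of condition with a random intercept per subject.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    subjects = np.repeat(np.arange(20), 10)            # 20 learners, 10 items each
    condition = np.tile([0, 1], 100)                   # two experimental conditions
    subj_intercept = rng.normal(0, 30, 20)[subjects]   # by-subject variability
    score = 500 + 25 * condition + subj_intercept + rng.normal(0, 20, 200)

    df = pd.DataFrame({"score": score, "condition": condition, "subject": subjects})
    model = smf.mixedlm("score ~ condition", df, groups=df["subject"])
    print(model.fit().summary())          # fixed effect near 25, as simulated

The random intercept absorbs between-subject variation that a plain t-test or ANOVA would either ignore or average away, which is the core benefit the review describes.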

Relevance: 40.00%

Abstract:

Human brain imaging techniques, such as Magnetic Resonance Imaging (MRI) and Diffusion Tensor Imaging (DTI), have been established as scientific and diagnostic tools, and their adoption continues to grow. Statistical methods, machine learning and data mining algorithms have successfully been adopted to extract predictive and descriptive models from neuroimage data. However, the knowledge discovery process typically also requires pre-processing, post-processing and visualisation techniques in complex data workflows. A main obstacle to the integrated pre-processing and mining of MRI data is currently the lack of comprehensive platforms: pre-processing and mining tools must instead be invoked manually, which is error-prone and inefficient. In this work we present K-Surfer, a novel plug-in for the Konstanz Information Miner (KNIME) workbench that automates the pre-processing of brain images and leverages the mining capabilities of KNIME in an integrated way. K-Surfer supports the importing, filtering, merging and pre-processing of neuroimage data from FreeSurfer, a tool for human brain MRI feature extraction and interpretation. By automating the steps for importing FreeSurfer data, K-Surfer reduces time costs, eliminates human errors and enables the design of complex analytics workflows for neuroimage data that leverage the rich functionality available in the KNIME workbench.
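
This is not K-Surfer itself, but a minimal Python sketch of the kind of import step it automates: parsing a FreeSurfer-style stats table, in which comment lines begin with '#' and data rows are whitespace-separated (the column names and demo rows below are assumptions for illustration; real stats files carry more columns).

    import io
    import pandas as pd

    def read_fs_stats(handle, columns):
        """Parse a FreeSurfer-style stats table: '#' lines are comments."""
        rows = [ln.split() for ln in handle if ln.strip() and not ln.startswith("#")]
        return pd.DataFrame(rows, columns=columns)

    # Tiny inline example of the assumed format:
    demo = io.StringIO(
        "# Title: demo segmentation statistics\n"
        "1 4 5210 5210.0 Left-Lateral-Ventricle\n"
        "2 10 7634 7634.0 Left-Thalamus\n"
    )
    df = read_fs_stats(demo, ["index", "seg_id", "n_voxels", "volume_mm3", "name"])
    print(df[["name", "volume_mm3"]])

Wrapping such parsing in a workbench node is what removes the manual, error-prone invocation step described above.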

Relevance: 30.00%

Abstract:

Recent interest in the validation of general circulation models (GCMs) has been devoted to objective methods. A small number of authors have used direct synoptic identification of phenomena together with statistical analysis to perform objective comparisons between various datasets. This paper describes a general method for the synoptic identification of phenomena that can be used for an objective analysis of atmospheric or oceanographic datasets obtained from numerical models and remote sensing. Methods usually associated with image processing are used to segment the scene and to identify suitable feature points to represent the phenomena of interest. This is performed for each time level. A technique from dynamic scene analysis is then used to link the feature points to form trajectories. The method is fully automatic and should be applicable to a wide range of geophysical fields. An example is shown of results obtained by applying the method to data from a run of the Universities Global Atmospheric Modelling Project GCM.
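
The two stages can be illustrated with a toy Python sketch (not the authors' implementation; the threshold and association radius are invented): each time level is segmented and reduced to feature-point centroids, which are then linked across time levels by nearest neighbour.

    import numpy as np
    from scipy import ndimage

    def feature_points(field, threshold):
        """Segment by threshold, label connected features, return centroids."""
        labels, n = ndimage.label(field > threshold)
        return np.array(ndimage.center_of_mass(field, labels, range(1, n + 1)))

    def link(points_t0, points_t1, max_dist=5.0):
        """Greedy nearest-neighbour association between consecutive time levels."""
        pairs = []
        for i, p in enumerate(points_t0):
            d = np.linalg.norm(points_t1 - p, axis=1)
            j = int(d.argmin())
            if d[j] <= max_dist:
                pairs.append((i, j))
        return pairs

    # Two synthetic time levels with one blob drifting 2 grid points east.
    y, x = np.mgrid[0:50, 0:50]
    f0 = np.exp(-((x - 20) ** 2 + (y - 25) ** 2) / 20.0)
    f1 = np.exp(-((x - 22) ** 2 + (y - 25) ** 2) / 20.0)
    print(link(feature_points(f0, 0.5), feature_points(f1, 0.5)))

Chaining such pairings over many time levels yields the trajectories on which the statistical comparison of datasets is then based.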

Relevance: 30.00%

Abstract:

Regular visual observations of persistent contrails over Reading, UK, have been used to evaluate radiosonde measurements of temperature and humidity that define the cold, ice-supersaturated atmospheric regions assumed to be a necessary condition for persistent condensation trails (contrails) to form. Results show a good correlation between observations and predictions using data from Larkhill, 63 km from Reading. A statistical analysis of this result and of forecasts using data from four additional UK radiosonde stations is presented. From this, the horizontal extent of supersaturated layers could be inferred to be several hundred kilometres. The necessity of bias corrections to radiosonde humidity measurements is discussed, and an analysis of measured ice-supersaturated atmospheric layers in the troposphere is presented. Ice supersaturation is found to be more likely in winter than in summer, with frequencies of 17.3% and 9.4% respectively, mostly because the layers are thicker in winter than in summer. The most probable height for them to occur is about 10 km.
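
The detection criterion can be made concrete with a small worked example (a sketch: the Magnus-type saturation vapour pressure coefficients and the -38 C cold-air cutoff are common textbook values assumed here, not taken from the paper): convert radiosonde relative humidity over water to relative humidity over ice, and flag cold levels exceeding ice saturation.

    import numpy as np

    def rh_ice(rh_water, t_c):
        """Convert RH w.r.t. water (%) to RH w.r.t. ice at temperature t_c (C)."""
        e_sat_water = 6.112 * np.exp(17.62 * t_c / (243.12 + t_c))  # hPa, assumed Magnus fit
        e_sat_ice = 6.112 * np.exp(22.46 * t_c / (272.62 + t_c))    # hPa, assumed Magnus fit
        return rh_water * e_sat_water / e_sat_ice

    # Hypothetical sounding levels: (temperature C, RH over water %)
    levels = [(-40.0, 60.0), (-55.0, 85.0), (-60.0, 90.0)]
    for t, rh in levels:
        ri = rh_ice(rh, t)
        flag = "ice-supersaturated" if ri > 100.0 and t < -38.0 else "subsaturated"
        print(f"T={t:5.1f} C  RH_ice={ri:5.1f}%  {flag}")

Because saturation pressure over ice is lower than over water, air well below saturation with respect to water can still be supersaturated with respect to ice, which is why cold upper-tropospheric layers support persistent contrails.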

Relevance: 30.00%

Abstract:

Bayesian inference has been used to determine rigorous estimates of hydroxyl radical concentrations ([OH]) and air-mass dilution rates (K), averaged following air masses between linked observations of non-methane hydrocarbons (NMHCs) spanning the North Atlantic during the Intercontinental Transport and Chemical Transformation (ITCT)-Lagrangian-2K4 experiment. The Bayesian technique obtains a refined (posterior) distribution of a parameter given data related to the parameter through a model and prior beliefs about the parameter distribution. Here, the model describes hydrocarbon loss through OH reaction and mixing with a background concentration at rate K. The Lagrangian experiment provides direct observations of hydrocarbons at two time points, removing assumptions regarding composition or sources upstream of a single observation. The estimates are sharpened by using many hydrocarbons with different reactivities and accounting for their variability and measurement uncertainty. A novel technique is used to construct prior background distributions of many species, described by variation of a single parameter. This exploits the high correlation of species, related through the first principal component of many NMHC samples. The Bayesian method obtains posterior estimates of [OH], K and the background parameter following each air mass. Median [OH] values are typically between 0.5 and 2.0 x 10^6 molecules cm^-3, but are elevated to between 2.5 and 3.5 x 10^6 molecules cm^-3 in low-level pollution. A comparison of [OH] estimates from absolute NMHC concentrations and from NMHC ratios assuming zero background (the photochemical clock method) shows similar distributions but reveals a systematic high bias in the estimates from ratios. Estimates of K are about 0.1 day^-1 but show more sensitivity to the assumed prior distribution.
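
The inference can be sketched in simplified form (flat priors evaluated on a grid rather than the paper's full formulation; every rate constant, concentration, background value and uncertainty below is invented): each hydrocarbon decays toward its background as c2 = b + (c1 - b) * exp(-(k*[OH] + K) * dt), and the posterior over [OH] and K is scored against the second observation.

    import numpy as np

    k = np.array([2.4e-13, 8.5e-13, 2.3e-12])    # assumed OH rate constants (cm3/s)
    c1 = np.array([120.0, 45.0, 12.0])           # upstream mixing ratios (ppt), invented
    b = np.array([30.0, 8.0, 1.0])               # assumed background values (ppt)
    dt = 2 * 86400.0                             # 2 days between linked samples
    oh_true, K_true = 1.2e6, 0.15 / 86400.0      # used only to simulate "data"
    c2 = b + (c1 - b) * np.exp(-(k * oh_true + K_true) * dt)
    sigma = 0.05 * c2                            # 5% measurement uncertainty

    oh = np.linspace(0.1e6, 4e6, 200)[:, None, None]          # [OH] grid (cm^-3)
    K = (np.linspace(0.01, 0.5, 200) / 86400.0)[None, :, None]  # K grid (s^-1)
    model = b + (c1 - b) * np.exp(-(k * oh + K) * dt)
    log_post = -0.5 * (((c2 - model) / sigma) ** 2).sum(axis=-1)  # Gaussian likelihood
    i, j = np.unravel_index(np.argmax(log_post), log_post.shape)
    print(f"posterior mode: [OH]={oh[i,0,0]:.2e} cm^-3, K={K[0,j,0]*86400:.2f} day^-1")

Using several species with different reactivities is what separates the two parameters: fast-reacting hydrocarbons constrain [OH] while slow ones mainly constrain dilution.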

Relevance: 30.00%

Abstract:

We suggest that climate variability in Europe in the pre-industrial period 1500-1900 is fundamentally a consequence of internal fluctuations of the climate system. This is because a model simulation using fixed pre-industrial forcing is, in several important aspects, consistent with recent observational reconstructions at high temporal resolution. This includes extreme warm and cold seasonal events as well as different measures of the decadal to multi-decadal variance. Significant trends of 50-year duration can be seen in the model simulation. While the global temperature is highly correlated with ENSO (El Niño-Southern Oscillation), European seasonal temperature is only weakly correlated with the global temperature, broadly consistent with data from ERA-40 reanalyses. Seasonal temperature anomalies of the European land area are largely controlled by the position of the North Atlantic storm tracks. We believe this result is highly relevant for the interpretation of past observational records, suggesting that the effect of external forcing is of secondary importance. The suggestion in some previous studies that variations in solar irradiation could have been a credible cause of climate variations during the last centuries is presumably due to the fact that the models used in those studies may have underestimated the internal variability of the climate. The general interpretation from this study is that the past climate is just one of many possible realizations and thus, in many respects, is not reproducible in its time evolution with a general circulation model, but only reproducible in a statistical sense.

Relevance: 30.00%

Abstract:

This paper proposes the hypothesis that the low-frequency variability of the North Atlantic Oscillation (NAO) arises as a result of variations in the occurrence of upper-level Rossby wave-breaking events over the North Atlantic. These events lead to synoptic situations similar to midlatitude blocking that are referred to as high-latitude blocking episodes. A positive NAO is envisaged as describing periods in which these episodes are infrequent, and can be considered the basic, unblocked situation. A negative NAO describes periods in which episodes occur frequently. A similar, but weaker, relationship exists between wave breaking over the Pacific and the west Pacific pattern. Evidence is given to support this hypothesis by using a two-dimensional potential-vorticity-based index to identify wave breaking at various latitudes. This is applied to Northern Hemisphere winter data from the 40-yr ECMWF Re-Analysis (ERA-40), and the events identified are then related to the NAO. Certain dynamical precursors are identified that appear to increase the likelihood of wave breaking. These suggest mechanisms by which variability in the tropical Pacific, and in the stratosphere, could affect the NAO.
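
A toy version of a two-dimensional PV-based overturning check (the paper's index is more elaborate; the field here is synthetic): flag longitudes where potential vorticity locally decreases poleward, i.e. where the climatological meridional PV gradient is reversed.

    import numpy as np

    lats = np.linspace(20.0, 80.0, 61)                 # degrees north
    lons = np.linspace(0.0, 355.0, 72)
    pv = np.tile(lats[:, None], (1, lons.size)) / 10.0 # PV increasing poleward
    pv[30:40, 10:14] -= 2.0                            # synthetic overturned patch

    dpv = np.diff(pv, axis=0)                          # poleward PV difference
    breaking = (dpv < 0).any(axis=0)                   # gradient reversal anywhere
    print("longitudes flagged:", lons[breaking])

Counting such flagged events per winter over the North Atlantic sector is the kind of statistic that can then be regressed against the NAO index.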

Relevance: 30.00%

Abstract:

The long-term stability, high accuracy, all-weather capability, high vertical resolution and global coverage of Global Navigation Satellite System (GNSS) radio occultation (RO) make it a promising tool for global monitoring of atmospheric temperature change. With the aim of investigating and quantifying how well a GNSS RO observing system can detect climate trends, we are currently performing a (climate) observing system simulation experiment over the 25-year period 2001 to 2025, which involves quasi-realistic modeling of the neutral atmosphere and the ionosphere. We carried out two climate simulations with the general circulation model MAECHAM5 (Middle Atmosphere European Centre/Hamburg Model Version 5) of the MPI-M Hamburg, covering the period 2001-2025: one control run with natural variability only, and one run also including anthropogenic forcings due to greenhouse gases, sulfate aerosols and tropospheric ozone. On this basis, we perform quasi-realistic simulations of RO observables for a small GNSS receiver constellation (six satellites), state-of-the-art data processing for atmospheric profile retrieval, and a statistical analysis of temperature trends in both the observed climatology and the true climatology. Here we describe the setup of the experiment and results from a test bed study conducted to obtain a basic set of realistic estimates of observational errors (instrument- and retrieval-processing-related errors) and sampling errors (due to spatial-temporal undersampling). The test bed results, obtained for a typical summer season and compared to the climatic 2001-2025 trends from the MAECHAM5 simulation including anthropogenic forcing, were found encouraging for performing the full 25-year experiment. They indicated that observational and sampling errors (both contributing about 0.2 K) are consistent with recent estimates of these errors from real RO data, and that they should be sufficiently small for monitoring expected temperature trends in the global atmosphere over the next 10 to 20 years in most regions of the upper troposphere and lower stratosphere (UTLS). Inspection of the MAECHAM5 trends in different RO-accessible atmospheric parameters (microwave refractivity and pressure/geopotential height in addition to temperature) indicates complementary climate change sensitivity in different regions of the UTLS, so that optimized climate monitoring should combine information from all climatic key variables retrievable from GNSS RO data.
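
As a back-of-envelope illustration of the quoted error budget (the 0.3 K/decade trend below is an assumption of the sketch, not a result from the paper), the two ~0.2 K error terms combine in quadrature, and a least-squares fit shows a trend of that size is recoverable over the 25-year record:

    import numpy as np

    obs_err, samp_err = 0.2, 0.2                # K, error estimates from the test bed
    total_err = np.hypot(obs_err, samp_err)     # combined in quadrature, ~0.28 K

    rng = np.random.default_rng(42)
    years = np.arange(2001, 2026)
    trend = 0.03                                # assumed 0.3 K/decade UTLS trend
    series = trend * (years - 2001) + rng.normal(0.0, total_err, years.size)
    slope = np.polyfit(years - 2001, series, 1)[0]
    print(f"combined error {total_err:.2f} K; recovered trend {slope*10:.2f} K/decade")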

Relevance: 30.00%

Abstract:

Recent analysis of the Arctic Oscillation (AO) in the stratosphere and troposphere has suggested that predictability of the state of the tropospheric AO may be obtained from the state of the stratospheric AO. However, much of this research has been purely qualitative. We present a more thorough statistical analysis of a long AO amplitude dataset which seeks to establish the magnitude of such a link. A relationship between the AO in the lower stratosphere and on the 1000 hPa surface on a 10-45 day time-scale is revealed. At its peak, the relationship accounts for 5% of the variance of the 1000 hPa time series and is significant at the 5% level. Over a similar time-scale the 1000 hPa time series accounts for only 1% of its own variance and is not significant at the 5% level. Further investigation reveals that the relationship is present only during the winter season, in particular during February and March. It is also demonstrated that using stratospheric AO amplitude data as a predictor in a simple statistical model results in a gain of skill of 5% over a troposphere-only statistical model. This gain in skill is not repeated if an unrelated time series is included as a predictor in the model. Copyright 2003 Royal Meteorological Society
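
The predictor comparison can be sketched with synthetic indices (the coupling strength below is tuned only so the stratospheric predictor explains a few percent of the variance, echoing the magnitudes above): regress the 1000 hPa index on a lagged stratospheric index and on its own lagged values.

    import numpy as np

    rng = np.random.default_rng(7)
    n, lag = 2000, 20                                   # daily values, ~20-day lag
    strat = np.convolve(rng.normal(size=n), np.ones(30) / 30, "same")
    strat = (strat - strat.mean()) / strat.std()        # slow stratospheric AO index
    trop = 0.25 * np.roll(strat, lag) + rng.normal(0, 1, n)  # weak downward link

    def r2(x, y):
        return np.corrcoef(x, y)[0, 1] ** 2

    print("stratospheric predictor R^2:", round(r2(np.roll(strat, lag), trop), 3))
    print("tropospheric self-predictor R^2:", round(r2(np.roll(trop, lag), trop), 3))

Even a weak but real downward link yields more explained variance at this lag than tropospheric persistence alone, which is the essence of the skill gain reported.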

Relevance: 30.00%

Abstract:

The complexity inherent in climate data makes it necessary to introduce more than one statistical tool to gain insight into the climate system. Empirical orthogonal function (EOF) analysis is one of the most widely used methods to analyze weather/climate modes of variability and to reduce the dimensionality of the system. Simple structure rotation of EOFs can enhance the interpretability of the obtained patterns, but cannot provide anything more than temporal uncorrelatedness. In this paper, an alternative rotation method based on independent component analysis (ICA) is considered. The ICA is viewed here as a method of EOF rotation: starting from an initial EOF solution, instead of rotating the loadings toward simplicity, ICA seeks a rotation matrix that maximizes the independence between the components in the time domain. If the underlying climate signals have independent forcing, one can expect to find loadings with interpretable patterns whose time coefficients have properties that go beyond the simple noncorrelation observed in EOFs. The methodology is presented and an application to the monthly mean sea level pressure (SLP) field is discussed. Among the EOFs rotated to independence, the North Atlantic Oscillation (NAO) pattern, an Arctic Oscillation-like pattern, and a Scandinavian-like pattern have been identified. There is the suggestion that the NAO is an intrinsic mode of variability independent of the Pacific.
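
ICA-as-rotation can be sketched in a few lines of Python (not the paper's exact procedure; the field is synthetic, built from two independent super-Gaussian sources): PCA supplies the EOF/PC truncation, and FastICA then rotates the retained components toward temporal independence.

    import numpy as np
    from sklearn.decomposition import PCA, FastICA

    rng = np.random.default_rng(0)
    n_time, n_grid = 500, 200
    # Synthetic SLP-like field: two independent sources with fixed spatial patterns.
    sources = np.sign(rng.normal(size=(n_time, 2))) * rng.gamma(2.0, 1.0, (n_time, 2))
    patterns = rng.normal(size=(2, n_grid))
    field = sources @ patterns + 0.1 * rng.normal(size=(n_time, n_grid))

    pcs = PCA(n_components=2).fit_transform(field)          # EOF/PC truncation
    ics = FastICA(n_components=2, random_state=0).fit_transform(pcs)  # rotation
    # ICs typically align with the true sources better than the raw PCs do.
    for name, z in (("PC", pcs), ("IC", ics)):
        c = np.abs(np.corrcoef(z.T, sources.T)[:2, 2:]).max(axis=1)
        print(name, np.round(c, 2))

PCA can only deliver orthogonal mixtures of the sources, whereas the ICA rotation exploits their non-Gaussianity to unmix them, which is why the rotated loadings can map onto physically separate modes such as the NAO.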

Relevance: 30.00%

Abstract:

The foraminiferal-rich pelagic Bateig Limestone forms several varieties of the important building stones quarried at Bateig Hill in southeastern Spain. Three principal ichnofabrics (Bichordites, mottled-Palaeophycus and mottled-Ophiomorpha) are recognized, which are present in at least two (possibly up to four) repeated successions (cycles). Each succession begins with an erosional event. The Bichordites ichnofabric represents a new type of facies, formed as thin turbidity/grain-flow, stratiform units derived from sediment slips off a fault into deep water. Each slipped unit became almost completely bioturbated by infaunal echinoids, colonizing by lateral migration. Because of the thinness of the units, successive colonizations tended to truncate the underlying burrows, giving rise to a pseudo-stratification. As the Bichordites ichnofabric accumulated on the fault apron, reducing the effective height of the fault scarp, the substrate gradually came under the influence of currents traversing the shelf. This led to a change in hydraulic regime, and to the mottled-Palaeophycus and mottled-Ophiomorpha ichnofabrics in sediment deposited under bed-load transport, associated with laminar and cross-stratified beds and local muddy intervals. Reactivation of the fault triggered erosion and channeling and a return to grain-flow sedimentation, and to the Bichordites ichnofabric of the succeeding cycle. The highest unit of the Bateig Limestone is formed entirely of cross-stratified calcarenites with occasional Ophiomorpha (Ophiomorpha-primary lamination ichnofabric) and is similar to many shallow marine facies, although the beds still bear a significant content of pelagic foraminifera. The sedimentary setting resembles that described for the Pleistocene Monte Torre Paleostrait and the modern Strait of Messina (Italy), where the narrow morphology of the depositional area enhanced tidal currents and allowed high-energy sandy deposition in relatively deep areas. More data on the Miocene paleogeography of the Bateig area should provide further testing of this hypothesis. The ichnofacies and stacking of the Bateig Limestone differ from the classic Seilacherian model in that they reflect changes in hydraulic process and are associated with faulting, subsidence and changes in sediment supply. Recognition of the unusual ichnofabrics and their relationships provides a clear indication of the overall dynamic setting. (c) 2006 Elsevier B.V. All rights reserved.