990 results for Data Integrity


Relevance: 20.00%

Abstract:

Combining datasets across independent studies can boost statistical power by increasing the number of observations and can achieve more accurate estimates of effect sizes. This is especially important for genetic studies, where a large number of observations is required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for joint analysis of the rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating approaches for obtaining pooled estimates of heritability through meta- and mega-genetic analytical approaches, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2248 children and adults (ages 9-85) collected with various imaging protocols. We used the imaging genetics analysis tool SOLAR-Eclipse to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large "mega-family". We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical approaches (sample-size weighted and standard-error weighted) and a mega-genetic analysis to calculate heritability estimates across populations. We performed a leave-one-out analysis of the joint estimates of heritability, removing a different cohort each time, to understand the variability of the estimates. Overall, meta- and mega-genetic analyses produced robust estimates of heritability.
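
The two meta-analytic weighting schemes named above can be sketched as follows. This is a minimal illustration of sample-size and standard-error (inverse-variance) weighting; the cohort heritability values, sizes, and standard errors below are invented for the example, not taken from the study.

```python
# Illustrative sketch of the two meta-analytic weighting schemes described
# in the abstract. All numeric values below are made up for illustration.
def sample_size_weighted(estimates, ns):
    """Pool estimates, weighting each cohort by its sample size."""
    return sum(h * n for h, n in zip(estimates, ns)) / sum(ns)

def inverse_variance_weighted(estimates, ses):
    """Pool estimates, weighting each cohort by 1/SE^2 (standard-error weighted)."""
    weights = [1.0 / se ** 2 for se in ses]
    return sum(h * w for h, w in zip(estimates, weights)) / sum(weights)

h2 = [0.55, 0.62, 0.48]   # per-cohort heritability of FA (illustrative)
n  = [800, 900, 548]      # cohort sizes (illustrative split of 2248 subjects)
se = [0.05, 0.04, 0.08]   # per-cohort standard errors (illustrative)

pooled_n  = sample_size_weighted(h2, n)
pooled_se = inverse_variance_weighted(h2, se)
```

Both pooled estimates necessarily fall between the smallest and largest cohort estimates; they differ only in how much influence each cohort gets.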

Relevance: 20.00%

Abstract:

Several common genetic variants have recently been discovered that appear to influence white matter microstructure, as measured by diffusion tensor imaging (DTI). Each genetic variant explains only a small proportion of the variance in brain microstructure, so we set out to explore their combined effect on the white matter integrity of the corpus callosum. We measured six common candidate single-nucleotide polymorphisms (SNPs) in the COMT, NTRK1, BDNF, ErbB4, CLU, and HFE genes, and investigated their individual and aggregate effects on white matter structure in 395 healthy adult twins and siblings (ages 20-30 years). All subjects were scanned with 4-tesla 94-direction high angular resolution diffusion imaging. When combined using mixed-effects linear regression, a joint model based on five of the candidate SNPs (COMT, NTRK1, ErbB4, CLU, and HFE) explained approximately 6% of the variance in the average fractional anisotropy (FA) of the corpus callosum. This predictive model had detectable effects on FA at 82% of the corpus callosum voxels, including the genu, body, and splenium. Predicting the brain's fiber microstructure from genotypes may ultimately help in early risk assessment, and eventually in personalized treatment for neuropsychiatric disorders in which brain integrity and connectivity are affected.
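
The "variance explained" by a joint SNP model can be sketched as below. Note this is a simplification: the study used mixed-effects regression to account for twin relatedness, whereas this sketch uses ordinary least squares on simulated, unrelated subjects; all effect sizes and data are invented for illustration.

```python
import numpy as np

# Simplified sketch of computing the variance in mean FA explained (R^2)
# by a joint model over five SNPs. Plain least squares stands in for the
# mixed-effects model of the study; all data here are simulated.
rng = np.random.default_rng(1)
n = 395                                    # sample size from the abstract
snps = rng.integers(0, 3, size=(n, 5))     # 5 SNPs coded 0/1/2 minor alleles
beta = np.array([0.02, -0.015, 0.01, 0.012, -0.01])  # invented effect sizes
fa = 0.5 + snps @ beta + 0.08 * rng.standard_normal(n)  # simulated mean FA

X = np.column_stack([np.ones(n), snps])    # intercept + additive SNP codes
coef, *_ = np.linalg.lstsq(X, fa, rcond=None)
resid = fa - X @ coef
r2 = 1 - resid.var() / fa.var()            # proportion of variance explained
```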

Relevance: 20.00%

Abstract:

Information from the full diffusion tensor (DT) was used to compute voxel-wise genetic contributions to brain fiber microstructure. First, we designed a new multivariate intraclass correlation formula in the log-Euclidean framework. We then used the full multivariate structure of the tensor in a multivariate version of a voxel-wise maximum-likelihood structural equation model (SEM) that computes the variance contributions in the DTs from genetic (A), common environmental (C) and unique environmental (E) factors. Our algorithm was tested on DT images from 25 identical and 25 fraternal twin pairs. After linear and fluid registration to a mean template, we computed the intraclass correlation and Falconer's heritability statistic for several scalar DT-derived measures and for the full multivariate tensors. Covariance matrices were computed from the DTs and input into the SEM. Analyzing the full DT enhanced the detection of A and C effects. This approach should empower imaging genetics studies that use DTI.
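
Falconer's heritability statistic mentioned above has a simple closed form. A minimal sketch, with illustrative (not study-derived) twin correlations:

```python
# Falconer's decomposition from twin intraclass correlations:
#   h2 (A, additive genetic)   = 2 * (r_MZ - r_DZ)
#   c2 (C, common environment) = 2 * r_DZ - r_MZ
#   e2 (E, unique environment) = 1 - r_MZ
# r_MZ / r_DZ values below are illustrative, not from the study.
def falconer_ace(r_mz, r_dz):
    a2 = 2.0 * (r_mz - r_dz)
    c2 = 2.0 * r_dz - r_mz
    e2 = 1.0 - r_mz
    return a2, c2, e2

a2, c2, e2 = falconer_ace(r_mz=0.80, r_dz=0.50)
# The three components sum to 1 by construction.
```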

Relevance: 20.00%

Abstract:

Fractional anisotropy (FA), a very widely used measure of fiber integrity based on diffusion tensor imaging (DTI), is a problematic concept as it is influenced by several quantities including the number of dominant fiber directions within each voxel, each fiber's anisotropy, and partial volume effects from neighboring gray matter. High-angular resolution diffusion imaging (HARDI) can resolve more complex diffusion geometries than standard DTI, including fibers crossing or mixing. The tensor distribution function (TDF) can be used to reconstruct multiple underlying fibers per voxel, representing the diffusion profile as a probabilistic mixture of tensors. Here we found that DTI-derived mean diffusivity (MD) correlates well with actual individual fiber MD, but DTI-derived FA correlates poorly with actual individual fiber anisotropy, and may be suboptimal when used to detect disease processes that affect myelination. Analysis of the TDFs revealed that almost 40% of voxels in the white matter had more than one dominant fiber present. To more accurately assess fiber integrity in these cases, we propose the differential diffusivity (DD), which measures the average anisotropy based on all dominant directions in each voxel.
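
The standard per-tensor FA and MD measures discussed above are computed from the three diffusion-tensor eigenvalues. A minimal sketch of those textbook formulas (not the TDF fitting itself):

```python
import math

# Standard DTI scalar measures from the tensor eigenvalues l1 >= l2 >= l3.
def mean_diffusivity(l1, l2, l3):
    """MD: average of the three eigenvalues."""
    return (l1 + l2 + l3) / 3.0

def fractional_anisotropy(l1, l2, l3):
    """FA: normalized variance of the eigenvalues, in [0, 1]."""
    num = (l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return math.sqrt(0.5 * num / den)

# Isotropic diffusion (equal eigenvalues, e.g. CSF) gives FA = 0;
# a highly elongated tensor (single coherent fiber) approaches FA = 1.
```

This single-tensor FA is exactly the quantity the abstract argues is ambiguous when a voxel contains more than one dominant fiber direction.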

Relevance: 20.00%

Abstract:

There is a major effort in medical imaging to develop algorithms to extract information from DTI and HARDI, which provide detailed information on brain integrity and connectivity. Although these images have recently advanced to provide extraordinarily high angular resolution and spatial detail, including an entire manifold of information at each point in the 3D images, there has been no readily available means to view the results. This impedes developments in HARDI research, which needs some method to check the plausibility and validity of image processing operations on HARDI data, or to appreciate data features or invariants that might serve as a basis for new directions in image segmentation, registration, and statistics. We present a set of tools to provide interactive display of HARDI data, including both a local rendering application and an off-screen renderer that works with a web-based viewer. Visualizations are presented after registration and averaging of HARDI data from 90 human subjects, revealing important details that could not be appreciated directly using conventional display of scalar images.

Relevance: 20.00%

Abstract:

Heritability of brain anatomical connectivity has been studied with diffusion-weighted imaging (DWI) mainly by modeling each voxel's diffusion pattern as a tensor (e.g., to compute fractional anisotropy), but this method cannot accurately represent the many crossing connections present in the brain. We hypothesized that different brain networks (i.e., their component fibers) might have different heritability and we investigated brain connectivity using High Angular Resolution Diffusion Imaging (HARDI) in a cohort of twins comprising 328 subjects that included 70 pairs of monozygotic and 91 pairs of dizygotic twins. Water diffusion was modeled in each voxel with a Fiber Orientation Distribution (FOD) function to study heritability for multiple fiber orientations in each voxel. Precision was estimated in a test-retest experiment on a sub-cohort of 39 subjects. This was taken into account when computing heritability of FOD peaks using an ACE model on the monozygotic and dizygotic twins. Our results confirmed the overall heritability of the major white matter tracts but also identified differences in heritability between connectivity networks. Inter-hemispheric connections tended to be more heritable than intra-hemispheric and cortico-spinal connections. The highly heritable tracts were found to connect particular cortical regions, such as medial frontal cortices, postcentral, paracentral gyri, and the right hippocampus.

Relevance: 20.00%

Abstract:

The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in neuroscience, genetics, and medicine, ENIGMA studies have analyzed neuroimaging data from over 12,826 subjects. In addition, data from 12,171 individuals were provided by the CHARGE consortium for replication of findings, in a total of 24,997 subjects. By meta-analyzing results from many sites, ENIGMA has detected factors that affect the brain that no individual site could detect on its own, and that require larger numbers of subjects than any individual neuroimaging study has currently collected. ENIGMA's first project was a genome-wide association study identifying common variants in the genome associated with hippocampal volume or intracranial volume. Continuing work is exploring genetic associations with subcortical volumes (ENIGMA2) and white matter microstructure (ENIGMA-DTI). Working groups also focus on understanding how schizophrenia, bipolar illness, major depression and attention deficit/hyperactivity disorder (ADHD) affect the brain. We review the current progress of the ENIGMA Consortium, along with challenges and unexpected discoveries made on the way.

Relevance: 20.00%

Abstract:

Reliance on police data for counting road crash injuries can be problematic: it is well known that not all road crash injuries are reported to police, which leads to under-estimation of the overall burden of road crash injuries. The aim of this study was to use multiple linked data sources to estimate the extent of under-reporting of road crash injuries to police in the Australian state of Queensland. Data from the Queensland Road Crash Database (QRCD), the Queensland Hospital Admitted Patients Data Collection (QHAPDC), the Emergency Department Information System (EDIS), and the Queensland Injury Surveillance Unit (QISU) for the year 2009 were linked. The completeness of road crash cases reported to police was examined via discordance rates between the police data (QRCD) and the hospital data collections. In addition, the potential bias of this discordance (under-reporting) was assessed with respect to gender, age, road user group, and regional location. Results showed that the level of under-reporting varied depending on the data set with which the police data were compared. When all hospital data collections were examined together, the estimated population of road crash injuries was approximately 28,000, with around two-thirds not linking to any record in the police data. The results also showed that under-reporting was more likely for motorcyclists, cyclists, males, young people, and injuries occurring in Remote and Inner Regional areas. These results have important implications for road safety research and policy in terms of prioritising funding and resources, targeting road safety interventions into areas of higher risk, and estimating the burden of road crash injuries.
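
The discordance-rate idea above reduces to simple set operations once records are linked by a common identifier. A minimal sketch; the record IDs and counts are invented and the real study linked records probabilistically across four collections:

```python
# Illustrative computation of a discordance (non-linkage) rate between a
# hospital collection and police data. IDs below are invented; real
# linkage across QRCD/QHAPDC/EDIS/QISU is far more involved.
hospital_cases = {"a1", "a2", "a3", "a4", "a5", "a6"}
police_cases   = {"a1", "a2"}       # cases that also appear in police data

linked     = hospital_cases & police_cases   # reported to police
discordant = hospital_cases - police_cases   # hospital-only, i.e. unreported
under_reporting_rate = len(discordant) / len(hospital_cases)
```

With these toy numbers two-thirds of hospital cases have no police record, mirroring the proportion the study reports for the pooled hospital collections.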

Relevance: 20.00%

Abstract:

In recent years, rapid advances in information technology have led to various data collection systems which are enriching the sources of empirical data for use in transport systems. Currently, traffic data are collected through various sensors including loop detectors, probe vehicles, cell-phones, Bluetooth, video cameras, remote sensing and public transport smart cards. It has been argued that combining the complementary information from multiple sources will generally result in better accuracy, increased robustness and reduced ambiguity. Although there have been substantial advances in data assimilation techniques to reconstruct and predict the traffic state from multiple data sources, such methods are generally data-driven and do not fully utilize the power of traffic models. Furthermore, the existing methods are still limited to freeway networks and are not yet applicable in the urban context due to the greater complexity of the flow behavior. The main traffic phenomena on urban links are generally caused by the boundary conditions at intersections, signalized or un-signalized, at which the switching of the traffic lights and the turning maneuvers of the road users lead to shock-wave phenomena that propagate upstream of the intersections. This paper develops a new model-based methodology to build a real-time traffic prediction model for arterial corridors using data from multiple sources, particularly loop detectors and partial observations from Bluetooth and GPS devices.

Relevance: 20.00%

Abstract:

In public transport, seamless coordinated transfer strengthens the quality of service and attracts ridership. Transfer coordination is a complex problem owing to (1) the stochastic variability of travel times and (2) the unavailability of passengers' transfer plans. However, the proliferation of Big Data technologies provides a tremendous opportunity to solve these problems. This dissertation enhances passenger transfer quality through offline and online transfer coordination. While offline transfer coordination exploits the knowledge of travel time variability to coordinate transfers, online transfer coordination provides simultaneous vehicle arrivals at stops to facilitate transfers by employing knowledge of passenger behaviours.

Relevance: 20.00%

Abstract:

The 3D Water Chemistry Atlas is an intuitive, open source, Web-based system that enables the three-dimensional (3D) sub-surface visualization of ground water monitoring data, overlaid on the local geological model (formation and aquifer strata). This paper first describes the results of evaluating existing virtual globe technologies, which led to the decision to use the Cesium open source WebGL Virtual Globe and Map Engine as the underlying platform. Next, it describes the backend database and the search, filtering, browse and analysis tools that were developed to enable users to interactively explore the groundwater monitoring data and interpret it spatially and temporally relative to the local geological formations and aquifers via the Cesium interface. The result is an integrated 3D visualization system that enables environmental managers and regulators to assess groundwater conditions, identify inconsistencies in the data, manage impacts and risks and make more informed decisions about coal seam gas extraction, waste water extraction, and water reuse.

Relevance: 20.00%

Abstract:

To this point, the collection has provided research-based, empirical accounts of the various and multiple effects of the National Assessment Program – Literacy and Numeracy (NAPLAN) in Australian schooling as a specific example of the global phenomenon of national testing. In this chapter, we want to develop a more theoretical analysis of national testing systems, globalising education policy and the promise of national testing as adaptive, online tests. These future moves claim to provide faster feedback and more useful diagnostic help for teachers. There is a utopian testing dream that one day adaptive, online tests will be responsive in real time providing an integrated personalised testing, pedagogy and intervention for each student. The moves towards these next generation assessments are well advanced, including the work of Pearson’s NextGen Learning and Assessment research group, the Organization for Economic Co-operation and Development’s (OECD) move into assessing affective skills and the Australian Curriculum, Assessment and Reporting Authority’s (ACARA) decision to phase in NAPLAN as an online, adaptive test from 2017...

Relevance: 20.00%

Abstract:

High-stakes testing is changing what it means to be a ‘good teacher’ in the contemporary school. This paper uses Deleuze and Guattari's ideas on the control society and dividuation in the context of National Assessment Program Literacy and Numeracy (NAPLAN) testing in Australia to suggest that the database generates new understandings of the ‘good teacher’. Media reports are used to look at how teachers are responding to the high-stakes database through manipulating the data. This article argues that manipulating the data is a regrettable, but logical, response to manifestations of teaching where only the data counts.

Relevance: 20.00%

Abstract:

Fusing data from multiple sensing modalities, e.g. laser and radar, is a promising approach to achieving resilient perception in challenging environmental conditions. However, this may lead to 'catastrophic fusion' in the presence of inconsistent data, i.e. when the sensors do not detect the same target due to distinct attenuation properties. It is often difficult to discriminate consistent from inconsistent data across sensing modalities using local spatial information alone. In this paper we present a novel consistency test based on the log marginal likelihood of a Gaussian process model that evaluates data from range sensors in a relative manner. A new data point is deemed to be consistent if the model statistically improves as a result of its fusion. This approach avoids the need for the absolute spatial distance threshold parameters required by previous work. We report results from object reconstruction with both synthetic and experimental data that demonstrate an improvement in reconstruction quality, particularly in cases where data points are inconsistent yet spatially proximal.
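
The core idea of the consistency test can be sketched as follows: fit a Gaussian process to the existing range readings, then accept a candidate point only if fusing it does not degrade the per-point log marginal likelihood. This is our paraphrase under stated assumptions (a fixed RBF kernel, fixed noise level, an arbitrary tolerance), not the authors' exact implementation; the data are synthetic.

```python
import numpy as np

# Sketch of a relative consistency test via GP log marginal likelihood.
# Kernel, noise, and tolerance values are illustrative assumptions.
def rbf_kernel(X1, X2, length=0.2, sigma=1.0):
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return sigma ** 2 * np.exp(-0.5 * d2 / length ** 2)

def log_marginal_likelihood(X, y, noise=1e-2):
    """Standard GP LML: -0.5 y^T K^-1 y - 0.5 log|K| - (n/2) log 2pi."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
            - 0.5 * len(y) * np.log(2 * np.pi))

def is_consistent(X, y, x_new, y_new, tol=0.05):
    """Accept the new point if per-point LML does not drop by more than tol."""
    lml_old = log_marginal_likelihood(X, y) / len(y)
    X2, y2 = np.append(X, x_new), np.append(y, y_new)
    lml_new = log_marginal_likelihood(X2, y2) / len(y2)
    return lml_new >= lml_old - tol

X = np.linspace(0.0, 1.0, 20)          # sensor sample positions (synthetic)
y = np.sin(2 * np.pi * X)              # measured surface (synthetic)

good = is_consistent(X, y, 0.525, np.sin(2 * np.pi * 0.525))  # on the surface
bad  = is_consistent(X, y, 0.525, 5.0)                        # far off it
```

Note that the decision is relative to the model's own fit, with no absolute spatial distance threshold: the far-off reading is rejected because it makes the joint data less probable under the GP, not because of its distance to any neighbor.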

Relevance: 20.00%

Abstract:

Resolving species relationships and confirming diagnostic morphological characters for insect clades that are highly plastic, and/or include morphologically cryptic species, is crucial for both academic and applied reasons. Within the true fly (Diptera) family Chironomidae, a most ubiquitous freshwater insect group, the genera Cricotopus Wulp, 1874 and Paratrichocladius Santos-Abreu, 1918 have long been taxonomically confusing. Indeed, until recently the Australian fauna had been examined in just two unpublished theses: most species were known by informal manuscript names only, with no concept of relationships. Understanding species limits, and the associated ecology and evolution, is essential to address taxonomic sufficiency in biomonitoring surveys. Immature stages are collected routinely, but tolerance is generalized at the genus level, despite marked variation among species. Here, we explored this issue using a multilocus molecular phylogenetic approach, including the standard mitochondrial barcode region, and tested explicitly for phylogenetic signal in the ecological tolerance of species. Additionally, we addressed biogeographical patterns by conducting Bayesian divergence time estimation. We sampled all but one of the now recognized Australian Cricotopus species and tested monophyly using representatives from other austral and Asian locations. Cricotopus is revealed as paraphyletic by the inclusion of a nested monophyletic Paratrichocladius, with in-group diversification beginning in the Eocene. Previous morphological species concepts are largely corroborated, but some additional cryptic diversity is revealed. No significant relationship was observed between the phylogenetic position of a species and its ecology, implying either that tolerance to deleterious environmental impacts is a convergent trait among many Cricotopus species or that sensitive and restricted taxa have diversified into more narrow niches from a widely tolerant ancestor.