879 results for Edge based analysis
Abstract:
Perception of simultaneity and temporal order is studied with simultaneity judgment (SJ) and temporal-order judgment (TOJ) tasks. In the former, observers report whether presentation of two stimuli was subjectively simultaneous; in the latter, they report which stimulus was subjectively presented first. SJ and TOJ tasks typically give discrepant results, which has prompted the view that performance is mediated by different processes in each task. We examined these discrepancies through a model that yields psychometric functions whose parameters characterize the timing, decisional, and response processes involved in SJ and TOJ tasks. We analyzed 12 data sets from published studies in which both tasks had been used in within-subjects designs, all of which had reported differences in performance across tasks. Fitting the model jointly to data from both tasks, we tested the hypothesis that common timing processes sustain simultaneity and temporal order judgments, with differences in performance arising from task-dependent decisional and response processes. The results supported this hypothesis and also showed that model psychometric functions account for aspects of SJ and TOJ data that classical analyses overlook. Implications for research on perception of simultaneity and temporal order are discussed.
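The central idea of the joint fit, in which SJ and TOJ data share timing parameters while each task keeps its own decisional parameter, can be illustrated with a deliberately simplified sketch (a Gaussian arrival-time-difference observer, not the authors' actual model; the counts and starting values below are made up):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def p_sj_simultaneous(soa, mu, sigma, delta):
    # "Simultaneous" response: the perceived arrival-time difference,
    # assumed Gaussian around (soa - mu), falls within +/- delta.
    return norm.cdf((delta - (soa - mu)) / sigma) - norm.cdf((-delta - (soa - mu)) / sigma)

def p_toj_probe_first(soa, mu, sigma):
    # "Probe first" response: the perceived difference exceeds zero.
    return norm.cdf((soa - mu) / sigma)

def neg_log_lik(params, soas, sj_yes, sj_n, toj_first, toj_n):
    mu, sigma, delta = params[0], abs(params[1]), abs(params[2])
    eps = 1e-9
    p_sj = np.clip(p_sj_simultaneous(soas, mu, sigma, delta), eps, 1 - eps)
    p_toj = np.clip(p_toj_probe_first(soas, mu, sigma), eps, 1 - eps)
    ll = np.sum(sj_yes * np.log(p_sj) + (sj_n - sj_yes) * np.log(1 - p_sj))
    ll += np.sum(toj_first * np.log(p_toj) + (toj_n - toj_first) * np.log(1 - p_toj))
    return -ll

# Made-up counts of "simultaneous" (SJ) and "probe first" (TOJ) responses
# out of 20 trials at each stimulus-onset asynchrony (ms).
soas = np.array([-200.0, -100.0, 0.0, 100.0, 200.0])
sj_yes = np.array([2, 10, 18, 11, 3])
toj_first = np.array([1, 4, 10, 16, 19])
n = np.full(5, 20)

fit = minimize(neg_log_lik, x0=[0.0, 60.0, 80.0],
               args=(soas, sj_yes, n, toj_first, n), method="Nelder-Mead")
print(fit.x)  # shared mu and sigma, plus the SJ-specific half-width delta
```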
Abstract:
Because of the role that DNA damage and depletion play in human disease, it is important to develop and improve tools to assess these endpoints. This unit describes PCR-based methods to measure nuclear and mitochondrial DNA damage and copy number. Long amplicon quantitative polymerase chain reaction (LA-QPCR) is used to detect DNA damage by measuring the number of polymerase-inhibiting lesions present based on the amount of PCR amplification; real-time PCR (RT-PCR) is used to calculate genome content. In this unit, we provide step-by-step instructions to perform these assays in Homo sapiens, Mus musculus, Rattus norvegicus, Caenorhabditis elegans, Drosophila melanogaster, Danio rerio, Oryzias latipes, Fundulus grandis, and Fundulus heteroclitus, and discuss the advantages and disadvantages of these assays.
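The lesion-frequency calculation behind LA-QPCR is compact enough to sketch. Assuming polymerase-blocking lesions are Poisson-distributed along the amplicon, the relative amplification of a treated versus a control sample gives the average number of lesions per amplicon, which can then be rescaled to lesions per 10 kb. A minimal sketch with illustrative numbers:

```python
import math

def lesions_per_10kb(amp_treated, amp_control, amplicon_bp):
    """Estimate polymerase-blocking lesions per 10 kb from LA-QPCR data.

    Assumes lesions follow a Poisson distribution, so the fraction of
    fully amplifiable templates equals exp(-lesions_per_amplicon).
    """
    relative_amplification = amp_treated / amp_control
    lesions_per_amplicon = -math.log(relative_amplification)
    return lesions_per_amplicon * 10000.0 / amplicon_bp

# Illustrative values: the treated sample amplifies at 60% of the control
# for a 10 kb amplicon, giving roughly 0.51 lesions per 10 kb.
print(lesions_per_10kb(amp_treated=0.6, amp_control=1.0, amplicon_bp=10000))
```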
Abstract:
Purpose: Our purpose in this report was to define genes and pathways dysregulated as a consequence of the t(4;14) in myeloma, and to gain insight into the downstream functional effects that may explain the different prognosis of this subgroup. Experimental Design: Fibroblast growth factor receptor 3 (FGFR3) overexpression, the presence of immunoglobulin heavy chain-multiple myeloma SET domain (IgH-MMSET) fusion products and the identification of t(4;14) breakpoints were determined in a series of myeloma cases. Differentially expressed genes were identified between cases with (n = 55) and without (n = 24) a t(4;14) by using global gene expression analysis. Results: Cases with a t(4;14) have a distinct expression pattern compared with other cases of myeloma. A total of 127 genes were identified as being differentially expressed, including MMSET and cyclin D2, which have been previously reported as being associated with this translocation. Other important functional classes of genes include cell signaling, apoptosis and related genes, oncogenes, chromatin structure, and DNA repair genes. Interestingly, 25% of myeloma cases lacking evidence of this translocation had up-regulation of the MMSET transcript to the same level as cases with a translocation. Conclusions: t(4;14) cases form a distinct subgroup of myeloma cases with a unique gene signature that may account for their poor prognosis. A number of non-t(4;14) cases also express MMSET, consistent with this gene playing a role in myeloma pathogenesis.
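The study's own analysis pipeline is not reproduced here, but the basic step of calling genes differentially expressed between t(4;14)-positive (n = 55) and t(4;14)-negative (n = 24) cases can be sketched generically with a per-gene Welch t-test and Benjamini-Hochberg correction (random data for illustration only):

```python
import numpy as np
from scipy.stats import ttest_ind

# Hypothetical expression matrix: rows = genes, columns = samples.
rng = np.random.default_rng(0)
n_genes, n_t414, n_other = 1000, 55, 24          # group sizes as in the study
expr = rng.normal(size=(n_genes, n_t414 + n_other))
labels = np.array([1] * n_t414 + [0] * n_other)  # 1 = t(4;14), 0 = other

# Per-gene Welch t-test between the two groups.
t_stat, p_val = ttest_ind(expr[:, labels == 1], expr[:, labels == 0],
                          axis=1, equal_var=False)

# Benjamini-Hochberg FDR correction.
order = np.argsort(p_val)
ranked = p_val[order] * n_genes / (np.arange(n_genes) + 1)
fdr = np.minimum.accumulate(ranked[::-1])[::-1]
significant = np.zeros(n_genes, dtype=bool)
significant[order] = fdr < 0.05
print(f"{significant.sum()} genes called differentially expressed")
```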
Abstract:
This thesis sets out to explore the place and agency of non-comital women in twelfth-century Anglo-Norman England. Until now, broad generalisations have been applied to all aristocratic women based on a long-established scholarship on royal and comital women. Non-comital women have been overlooked, mainly because of an assumed lack of suitable sources from this time period. The first aim of this thesis is to demonstrate that there is a sufficient corpus of charters for a study of this social group of women. It is based on a database created from 5545 charters, of which 3046 were issued by non-comital women and men, taken from three case-study counties, Oxfordshire, Suffolk and Yorkshire, and is also supported by other government records. This thesis demonstrates that non-comital women had significant social and economic agency in their own person. By means of a detailed analysis of charters and their clauses, this thesis argues that scholarship must rethink the framework applied to the study of non-comital women and address the life cycle as one of continuities, with women as active agents in a wider public society. Non-comital women’s agency and identity were not based only on land or on widowhood (the one period of the life cycle in which scholars have recognised some level of autonomy); women had agency at all stages of their life cycle. Women’s agency and identity were drawn from, and were part of, a wider framework that included their families, their kin, and broader local political, religious, and social networks. Natal families continued to be important sources of agency and identity to women long after they had married. Part A of the thesis applies modern charter diplomatic analysis methods to the corpus of charters to bring out and explore women’s presence therein. Part B contextualises these findings and explores women’s agency in their families, landholding, the gift-economy, and the wider religious and social networks of which they were a part.
Abstract:
Background: Celiac disease (CD) has a negative impact on the health-related quality of life (HRQL) of affected patients. Although HRQL and its determinants have been examined in Spanish CD patients specifically recruited in hospital settings, these aspects of CD have not been assessed among the general Spanish population. Methods: An observational, cross-sectional study of a non-randomized, representative sample of adult celiac patients throughout all of Spain's Autonomous Regions. Subjects were recruited through celiac patient associations. A Spanish version of the self-administered Celiac Disease-Quality of Life (CD-QOL) questionnaire was used. Determinant factors of HRQL were assessed with the aid of multivariate analysis to control for confounding factors. Results: We analyzed the responses provided by 1,230 patients, 1,092 (89.2%) of whom were women. The overall mean value for the CD-QOL index was 56.3 ± 18.27 points. The dimension with the highest score was dysphoria, with 81.3 ± 19.56 points, followed by limitations, with 52.3 ± 23.43 points; health problems, with 51.6 ± 26.08 points; and inadequate treatment, with 36.1 ± 21.18 points. Patient age and sex, along with time to diagnosis and length of time on a gluten-free diet, were all independent determinant factors of certain dimensions of HRQL: women aged 31 to 40 expressed poorer HRQL, while time to diagnosis and length of time on a gluten-free diet were determinant factors for better HRQL scores. Conclusions: The HRQL level of adult Spanish celiac subjects is moderate, improving with the length of time patients remain on a gluten-free diet.
Abstract:
There is an increasing concern to reduce the cost and overheads during the development of reliable systems. Selective protection of the most critical parts of a system represents a viable solution to obtain a high level of reliability at a fraction of the cost. In particular, to design a selective fault mitigation strategy for processor-based systems, it is mandatory to identify and prioritize the most vulnerable registers in the register file as the best candidates to be protected (hardened). This paper presents an application-based metric to estimate the criticality of each register of the microprocessor register file in microprocessor-based systems. The proposed metric relies on the combination of three different criteria based on common features of executed applications. The applicability and accuracy of our proposal have been evaluated on a set of applications running on different microprocessors. Results show a significant improvement in accuracy compared with previous approaches, regardless of the underlying architecture.
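The paper's exact criteria are not given in the abstract, so the sketch below only illustrates the general shape of such a metric: three hypothetical application-level features per register (read frequency, cumulative live time, and use in address or control computations) are normalised and combined linearly into a criticality score used to rank hardening candidates:

```python
# Hypothetical sketch of an application-based register criticality ranking.
# The three criteria and the weights are illustrative assumptions, not the
# paper's definitions.

def criticality_scores(profile, weights=(0.4, 0.4, 0.2)):
    """profile: dict mapping register -> (reads, live_cycles, critical_uses)."""
    def normalise(values):
        top = max(values) or 1
        return [v / top for v in values]

    regs = list(profile)
    reads = normalise([profile[r][0] for r in regs])
    live = normalise([profile[r][1] for r in regs])
    crit = normalise([profile[r][2] for r in regs])
    w1, w2, w3 = weights
    return {r: w1 * a + w2 * b + w3 * c
            for r, a, b, c in zip(regs, reads, live, crit)}

# Example profile gathered from an instruction-level trace (made-up numbers).
profile = {"r1": (1200, 5000, 300), "r2": (300, 800, 20), "r3": (900, 4000, 150)}
ranking = sorted(criticality_scores(profile).items(), key=lambda kv: -kv[1])
print(ranking)  # most critical registers first -> best hardening candidates
```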
Abstract:
The mechanics-based analysis framework predicts top-down fatigue cracking initiation time in asphalt concrete pavements by utilising fracture mechanics and mixture morphology-based properties. To reduce the level of complexity involved, traffic data were characterised and incorporated into the framework using the equivalent single axle load (ESAL) approach. There is a concern that this kind of simplistic traffic characterisation might result in erroneous performance predictions and pavement structural designs. This paper integrates axle load spectra and other traffic characterisation parameters into the mechanics-based analysis framework and studies the impact these parameters have on predicted fatigue cracking performance. The traffic characterisation inputs studied are traffic growth rate, axle load spectra, lateral wheel wander and volume adjustment factors. For this purpose, a traffic integration approach which incorporates Monte Carlo simulation and representative traffic characterisation inputs was developed. The significance of these traffic characterisation parameters was established by evaluating a number of field pavement sections. The results show that all the traffic characterisation parameters except truck wheel wander have a significant influence on predicted top-down fatigue cracking performance.
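A minimal sketch of the Monte Carlo side of such a traffic integration approach is shown below; the load spectrum, growth rate, and wander distribution are illustrative assumptions, and the fracture-mechanics damage model itself is abstracted away:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical single-axle load spectrum: load bins (kN) and their frequencies.
load_bins = np.array([40, 60, 80, 100, 120, 140])
bin_probs = np.array([0.30, 0.28, 0.20, 0.12, 0.07, 0.03])

def simulate_axle_passes(n_passes, growth_rate=0.03, years=10, wander_std_m=0.25):
    """Draw axle loads and lateral wheel-wander offsets over a design period.

    Loads are sampled from the spectrum, annual volumes grow geometrically,
    and wheel wander is normally distributed about the lane centre.
    """
    loads, offsets = [], []
    for year in range(years):
        n_year = int(n_passes * (1 + growth_rate) ** year)
        loads.append(rng.choice(load_bins, size=n_year, p=bin_probs))
        offsets.append(rng.normal(0.0, wander_std_m, size=n_year))
    return np.concatenate(loads), np.concatenate(offsets)

loads, offsets = simulate_axle_passes(n_passes=100_000)
print("mean axle load (kN):", loads.mean())
print("mean |wander| (m):", np.abs(offsets).mean())
```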
Abstract:
The acquisition and update of Geographic Information System (GIS) data are typically carried out using aerial or satellite imagery. Since new roads are usually linked to the georeferenced pre-existing road network, the extraction of pre-existing road segments may provide good hypotheses for the updating process. This paper addresses the problem of extracting georeferenced roads from images and formulating hypotheses for the presence of new road segments. Our approach proceeds in three steps. First, salient points are identified and measured along roads from a map or GIS database by an operator or an automatic tool. These salient points are then projected onto image space and the errors inherent in this process are calculated. In the second step, the georeferenced roads are extracted from the image using a dynamic programming (DP) algorithm. The projected salient points and the corresponding error estimates are used as input for this extraction process. Finally, the road center axes extracted in the previous step are analyzed to identify potential new segments attached to the extracted, pre-existing one. This analysis is performed using a combination of edge-based and correlation-based algorithms. In this paper we present our approach and early implementation results.
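The dynamic-programming step can be sketched generically: given a cost field built from edge or correlation responses around the projected salient points, a DP recursion with a lateral smoothness penalty picks the cheapest centreline. This is an assumed formulation for illustration, not the paper's exact algorithm:

```python
import numpy as np

def dp_road_path(cost, smooth=1.0):
    """Dynamic-programming extraction of a road centreline.

    cost[i, j]: negative 'road-likeness' of lateral candidate j at step i.
    A quadratic penalty discourages abrupt lateral jumps between steps.
    """
    n_steps, n_cands = cost.shape
    acc = cost.copy()
    back = np.zeros((n_steps, n_cands), dtype=int)
    for i in range(1, n_steps):
        for j in range(n_cands):
            trans = acc[i - 1] + smooth * (np.arange(n_cands) - j) ** 2
            back[i, j] = int(np.argmin(trans))
            acc[i, j] += trans[back[i, j]]
    # Backtrack from the cheapest terminal candidate.
    path = [int(np.argmin(acc[-1]))]
    for i in range(n_steps - 1, 0, -1):
        path.append(back[i, path[-1]])
    return path[::-1]

# Toy cost field: 20 steps along the road, 7 lateral candidates per step.
rng = np.random.default_rng(1)
print(dp_road_path(rng.random((20, 7))))
```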
Abstract:
The significant technological advances of recent years have resulted in a strong demand for new and efficient computer vision applications. On the one hand, the increasing use of video editing software has created a need for faster and more efficient editing tools that, as a first step, perform a temporal segmentation into shots. On the other hand, the number of electronic devices with integrated cameras has grown enormously. These devices require new, fast, and efficient computer vision applications that include moving object detection strategies. In this dissertation, we propose a temporal segmentation strategy and several moving object detection strategies, which are suitable for the latest generation of computer vision applications requiring both low computational cost and high quality results. First, a novel real-time, high-quality shot detection strategy is proposed. While abrupt transitions are detected through a very fast pixel-based analysis, gradual transitions are obtained from an efficient edge-based analysis. Both analyses are reinforced with a motion analysis that makes it possible to detect and discard false detections. This analysis is carried out exclusively over a reduced number of candidate transitions, thus keeping the computational requirements low. In addition, a moving object detection strategy based on the popular Mixture of Gaussians method is proposed. This strategy, taking into account the recent history of each image pixel, dynamically adapts the number of Gaussians required to model its variations. As a result, we significantly improve the computational efficiency with respect to other similar methods and, additionally, we reduce the influence of the parameter settings on the results. Alternatively, in order to improve the quality of the results in complex scenarios containing dynamic backgrounds, we propose different non-parametric moving object detection strategies that model both background and foreground. To obtain high quality results regardless of the characteristics of the analyzed sequence, we dynamically estimate the most suitable bandwidth matrices for the kernels used in the background and foreground modeling. Moreover, the application of a particle filter makes it possible to update the spatial information and provides a priori knowledge about the areas to analyze in the following images, enabling an important reduction in the computational requirements and improving the segmentation results. Additionally, we propose the use of an innovative combination of chromaticity and gradients that reduces the influence of shadows and reflections on the detections.
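For the Mixture-of-Gaussians part, a generic illustration can be given with OpenCV's standard MOG2 subtractor, which also adapts the number of Gaussians per pixel; this is not the dissertation's own variant, and the input file name is hypothetical:

```python
import cv2

# Standard OpenCV MOG2 background subtractor (Zivkovic), shown here only as a
# generic example of Mixture-of-Gaussians moving-object detection.
cap = cv2.VideoCapture("input_video.mp4")   # hypothetical input file
subtractor = cv2.createBackgroundSubtractorMOG2(history=500,
                                                varThreshold=16,
                                                detectShadows=True)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)          # 255 = foreground, 127 = shadow
    mask = cv2.medianBlur(mask, 5)          # simple post-processing
    cv2.imshow("foreground mask", mask)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```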
Abstract:
The increasing use of video editing software requires faster and more efficient editing tools. As a first step, these tools perform a temporal segmentation into shots that allows the later building of indexes describing the video content. Here, we propose a novel real-time, high-quality shot detection strategy, suitable for the latest generation of video editing software requiring both low computational cost and high quality results. While abrupt transitions are detected through a very fast pixel-based analysis, gradual transitions are obtained from an efficient edge-based analysis. Both analyses are reinforced with a motion analysis that helps to detect and discard false detections. This motion analysis is carried out exclusively over a reduced set of candidate transitions, thus keeping the computational requirements within the bounds that new applications demand to fulfill user needs.
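A bare-bones version of the pixel-based stage can be sketched as follows: a hard cut is flagged when the mean absolute grey-level difference between consecutive frames exceeds a threshold. The edge-based and motion analyses described above are omitted, and the file name and threshold are illustrative:

```python
import cv2
import numpy as np

def abrupt_cuts(video_path, threshold=30.0):
    """Minimal pixel-based abrupt-transition (hard cut) detector."""
    cap = cv2.VideoCapture(video_path)
    cuts, prev, idx = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            # Mean absolute grey-level difference between consecutive frames.
            diff = np.mean(cv2.absdiff(grey, prev).astype(np.float32))
            if diff > threshold:
                cuts.append(idx)
        prev, idx = grey, idx + 1
    cap.release()
    return cuts

print(abrupt_cuts("example_clip.mp4"))   # hypothetical input file
```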
Abstract:
Introduction: The field of connectomic research is growing rapidly as a result of methodological advances in structural neuroimaging on many spatial scales. In particular, progress in diffusion MRI data acquisition and processing has made macroscopic structural connectivity maps available in vivo through Connectome Mapping Pipelines (Hagmann et al, 2008) into so-called Connectomes (Hagmann 2005, Sporns et al, 2005). They exhibit both spatial and topological information that constrain functional imaging studies and are relevant in their interpretation. The need has grown for a special-purpose software tool, for both clinical researchers and neuroscientists, to support investigations of such connectome data. Methods: We developed the ConnectomeViewer, a powerful, extensible software tool for visualization and analysis in connectomic research. It uses the newly defined, container-like Connectome File Format, specifying networks (GraphML), surfaces (Gifti), volumes (Nifti), track data (TrackVis) and metadata. Using Python as the programming language allows it to be cross-platform and to access a multitude of scientific libraries. Results: Using a flexible plugin architecture, it is possible to easily enhance functionality for specific purposes. The following features are already implemented:
* Ready use of libraries, e.g. for complex network analysis (NetworkX) and data plotting (Matplotlib). More brain connectivity measures will be implemented in a future release (Rubinov et al, 2009).
* 3D view of networks with node positioning based on the corresponding ROI surface patch. Other layouts are possible.
* Picking functionality to select nodes, select edges, get more node information (ConnectomeWiki), and toggle surface representations.
* Interactive thresholding and modality selection of edge properties using filters.
* Arbitrary metadata can be stored for networks, thereby allowing e.g. group-based analysis or meta-analysis.
* Python shell for scripting. Application data is exposed and can be modified or used for further post-processing.
* Visualization pipelines using filters and modules can be composed with Mayavi (Ramachandran et al, 2008).
* Interface to TrackVis to visualize track data. Selected nodes are converted to ROIs for fiber filtering.
The Connectome Mapping Pipeline (Hagmann et al, 2008) processed 20 healthy subjects into an average Connectome dataset. The figures show the ConnectomeViewer user interface using this dataset. Connections are shown that occur in all 20 subjects. The dataset is freely available from the homepage (connectomeviewer.org). Conclusions: The ConnectomeViewer is a cross-platform, open-source software tool that provides extensive visualization and analysis capabilities for connectomic research. It has a modular architecture, integrates relevant datatypes and is completely scriptable. Visit www.connectomics.org to get involved as a user or developer.
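Since the tool builds on NetworkX and Matplotlib, a short hedged sketch of the kind of scripting it enables is shown below; the file name and the "weight" edge attribute are assumptions, not a documented ConnectomeViewer API:

```python
import networkx as nx
import matplotlib.pyplot as plt

# Load a connectome network stored as GraphML and collapse it to a simple
# graph; the file name and the "weight" edge attribute are assumptions.
G = nx.Graph(nx.read_graphml("average_connectome.graphml"))

# Rough analogue of interactive edge thresholding: drop weak connections.
weak = [(u, v) for u, v, d in G.edges(data=True) if d.get("weight", 0.0) < 0.1]
G.remove_edges_from(weak)
print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())

# Simple complex-network measures via NetworkX, plotted with Matplotlib.
degrees = [deg for _, deg in G.degree()]
print("mean clustering coefficient:", nx.average_clustering(G))

plt.hist(degrees, bins=20)
plt.xlabel("node degree")
plt.ylabel("number of nodes")
plt.show()
```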
Abstract:
Natural systems are inherently nonlinear, and recurrent behaviours are typical of them. Recurrence is a fundamental property of nonlinear dynamical systems which can be exploited to characterize system behaviour effectively. This thesis presents a cross recurrence based analysis of sensor signals from nonlinear dynamical systems. The mutual dependency among relatively independent components of a system is referred to as coupling. The analysis is first carried out on a mechanically coupled system specifically designed for the experiments. The cross recurrence method is then extended to an actual machining process in a lathe to characterize chatter during turning, and the result is verified by the permutation entropy method. Conventional linear methods or models are incapable of capturing the critical and strange behaviours associated with the dynamical process; hence any effective feature extraction methodology should gather information through nonlinear time series analysis. The sensor signals from the dynamical system normally contain noise and non-stationarity. To overcome these two issues to the maximum possible extent, this work adopts the cross recurrence quantification analysis (CRQA) methodology, since it is found to be robust against noise and non-stationarity in the signals. The study reveals that CRQA is capable of characterizing even weak coupling among system signals. It also shows unambiguously the dependence of certain CRQA variables, such as percent determinism, percent recurrence and entropy, on chatter. The surrogate data test shows that the results obtained by CRQA are true properties of the temporal evolution of the dynamics and contain a degree of deterministic structure. The results are verified using permutation entropy (PE) to detect the onset of chatter from the time series. The present study ascertains that this CRP-based methodology is capable of recognizing the transition from regular cutting to chatter cutting irrespective of the machining parameters or workpiece material. The results establish this methodology as feasible for the detection of chatter in metal cutting operations in a lathe.
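A minimal sketch of the core CRQA quantities is given below: a cross recurrence plot of two (already embedded) signals, together with percent recurrence and percent determinism. Delay embedding and the remaining CRQA measures are omitted, and the test signals are synthetic:

```python
import numpy as np

def cross_recurrence(x, y, eps):
    """Cross recurrence plot of two scalar series: R[i, j] = 1 if |x[i] - y[j]| < eps."""
    return (np.abs(x[:, None] - y[None, :]) < eps).astype(int)

def percent_recurrence(R):
    # Fraction of recurrence points in the whole plot.
    return 100.0 * R.mean()

def percent_determinism(R, lmin=2):
    # Fraction of recurrence points lying on diagonal lines of length >= lmin.
    n = R.shape[0]
    diag_points = 0
    for k in range(-n + 1, n):
        run = 0
        for v in np.append(np.diagonal(R, offset=k), 0):  # sentinel flushes last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag_points += run
                run = 0
    total = R.sum()
    return 100.0 * diag_points / total if total else 0.0

t = np.linspace(0, 20 * np.pi, 500)
x, y = np.sin(t), np.sin(t + 0.3)          # two weakly coupled test signals
R = cross_recurrence(x, y, eps=0.1)
print("%REC:", percent_recurrence(R), "%DET:", percent_determinism(R))
```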