935 results for Electronic data processing - Distributed processing
Abstract:
MicroRNAs (miRs) are involved in the pathogenesis of several neoplasms; however, there are no data on their expression patterns and possible roles in adrenocortical tumors. Our objective was to study adrenocortical tumors by an integrative bioinformatics analysis involving miR and transcriptomics profiling, pathway analysis, and a novel, tissue-specific miR target prediction approach. Thirty-six tissue samples, including normal adrenocortical tissues, benign adenomas, and adrenocortical carcinomas (ACC), were studied by simultaneous miR and mRNA profiling. Novel data-processing software was used to identify all predicted miR-mRNA interactions retrieved from PicTar, TargetScan, and miRBase. Tissue-specific target prediction was achieved by filtering out mRNAs with undetectable expression and searching for mRNA targets whose expression alterations were the inverse of those of their regulatory miRs. Target sets and significant microarray data were subjected to Ingenuity Pathway Analysis. Six miRs with significantly different expression were found: miR-184 and miR-503 showed significantly higher, whereas miR-511 and miR-214 showed significantly lower, expression in ACCs than in the other groups. Expression of miR-210 was significantly lower in cortisol-secreting adenomas than in ACCs. By calculating the difference between dCT(miR-511) and dCT(miR-503) (dCT, delta cycle threshold), ACCs could be distinguished from benign adenomas with high sensitivity and specificity. Pathway analysis revealed the possible involvement of G2/M checkpoint damage in ACC pathogenesis. To our knowledge, this is the first report describing miR expression patterns and pathway analysis in sporadic adrenocortical tumors. miR biomarkers may be helpful for the diagnosis of adrenocortical malignancy, and this tissue-specific target prediction approach may be applicable to other tumors as well.
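The diagnostic score described here is simply the difference of two delta-Ct values, and the tissue-specific filter keeps only predicted pairs with a detectable target that moves inversely to its regulatory miR. Below is a minimal sketch of both ideas; the column names, data layout, and helper functions are invented for illustration (the abstract does not describe the actual software):

```python
# Sketch only: score = difference of two delta-Ct values; filter keeps
# predicted miR-mRNA pairs whose target is detectable and changes in the
# opposite direction to its regulatory miR. All column names are invented.
import pandas as pd

def acc_score(dct_mir511: float, dct_mir503: float) -> float:
    """dCT(miR-511) - dCT(miR-503); per the abstract, this separates ACCs
    from benign adenomas with high sensitivity and specificity."""
    return dct_mir511 - dct_mir503

def tissue_specific_targets(mrna: pd.DataFrame,
                            predicted: set[tuple[str, str]],
                            mir_direction: dict[str, int]) -> list[tuple[str, str]]:
    """Keep (miR, target) pairs with a detectable target whose log2 fold
    change is opposite in sign to its regulatory miR (+1 = up, -1 = down)."""
    expressed = mrna[mrna["detected"]]            # drop undetectable mRNAs
    kept = []
    for mir, target in predicted:
        row = expressed.loc[expressed["gene"] == target, "log2fc"]
        if not row.empty and row.iloc[0] * mir_direction[mir] < 0:
            kept.append((mir, target))
    return kept
```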
Abstract:
Multi-centre data repositories like the Alzheimer's Disease Neuroimaging Initiative (ADNI) offer a unique research platform, but pose questions concerning the comparability of results when a range of imaging protocols and data processing algorithms is used. The variability is mainly due to the non-quantitative character of the widely used structural T1-weighted magnetic resonance (MR) images. Although the stability of the main effect of Alzheimer's disease (AD) on brain structure across platforms and field strengths has been addressed in previous studies using multi-site MR images, there are only sparse empirically based recommendations for the processing and analysis of pooled multi-centre structural MR data acquired at different magnetic field strengths (MFS). Aiming to minimise potential systematic bias when using ADNI data, we investigate the specific contributions of spatial registration strategies and the impact of MFS on voxel-based morphometry (VBM) in AD. We perform a whole-brain analysis within the framework of Statistical Parametric Mapping, testing for main effects of various diffeomorphic spatial registration strategies and of MFS, and for their interaction with disease status. Beyond confirming medial temporal lobe volume loss in AD, we detect a significant impact of the spatial registration strategy on the estimation of AD-related atrophy. Additionally, we report a significant effect of MFS on the assessment of brain anatomy in (i) the cerebellum, (ii) the precentral gyrus, and (iii) the thalamus bilaterally, showing no interaction with disease status. We provide empirical evidence in support of pooling data in multi-centre VBM studies irrespective of disease status or MFS.
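The factorial logic of the design (main effects of field strength and diagnosis plus their interaction) can be illustrated at a single voxel with an ordinary least squares model. The sketch below uses statsmodels with made-up grey-matter values and only two of the factors; the actual study uses SPM's whole-brain machinery with proper multiple-comparison control:

```python
# Hypothetical single-voxel factorial model; values and group labels are
# invented. SPM fits the equivalent model at every voxel of the brain.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "gmv":       [0.61, 0.55, 0.58, 0.49, 0.63, 0.52],  # grey-matter value at one voxel
    "diagnosis": ["CTL", "AD", "CTL", "AD", "CTL", "AD"],
    "mfs":       ["1.5T", "1.5T", "3T", "3T", "1.5T", "3T"],
})

# Main effects of diagnosis and field strength plus their interaction.
model = smf.ols("gmv ~ C(diagnosis) * C(mfs)", data=df).fit()
print(model.params)
```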
Abstract:
This report is divided into two volumes. This volume (Volume I) summarizes a structural health monitoring (SHM) system that was developed for the Iowa DOT to remotely and continuously monitor fatigue-critical bridges (FCB) to aid in the detection of crack formation. The developed FCB SHM system enables bridge owners to remotely monitor FCB for gradual or sudden damage formation. The SHM system utilizes fiber Bragg grating (FBG) fiber optic sensors (FOSs) to measure strains at critical locations. The strain-based SHM system is trained with measured performance data to identify typical bridge response under ambient traffic loads, and that knowledge is used to evaluate newly collected data. At specified intervals, the SHM system autonomously generates evaluation reports that summarize the current behavior of the bridge. The evaluation reports are collected and distributed to the bridge owner for interpretation and decision making. Volume II summarizes the development and demonstration of an autonomous, continuous SHM system that can be used to monitor typical girder bridges. The developed SHM system can be grouped into two main components: an office component and a field component. The office component is a structural analysis software program used to generate the thresholds by which isolated events are identified. The field component includes hardware and field monitoring software that performs data processing and evaluation. The hardware system consists of sensors, data acquisition equipment, and a communication system backbone. The field monitoring software has been developed such that, once started, it operates autonomously with minimal user interaction. In general, the SHM system features two key uses: first, it can be integrated into an active bridge management system that tracks usage and structural changes; second, it helps owners identify damage and deterioration.
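The report's thresholding details are not given in this summary; the sketch below captures only the general pattern — learn per-sensor strain envelopes from ambient-traffic training data, then flag new readings that fall outside them. The mean ± 3σ rule and all names are assumptions:

```python
import numpy as np

def learn_thresholds(training_strains: np.ndarray, k: float = 3.0):
    """Per-sensor thresholds from ambient-traffic training data (mean ± k·std)."""
    mu = training_strains.mean(axis=0)
    sigma = training_strains.std(axis=0)
    return mu - k * sigma, mu + k * sigma

def evaluate(new_strains: np.ndarray, lo: np.ndarray, hi: np.ndarray) -> np.ndarray:
    """Boolean mask of readings falling outside the learned envelope."""
    return (new_strains < lo) | (new_strains > hi)

# Synthetic demo: 1000 ambient readings from 4 sensors, then one new sample.
train = np.random.default_rng(1).normal(0.0, 1.0, size=(1000, 4))
lo, hi = learn_thresholds(train)
print(evaluate(np.array([[0.1, 5.0, -0.2, 0.3]]), lo, hi))  # flags sensor 2
```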
Abstract:
This work proposes a parallel architecture for a motion estimation algorithm. It is well known that image processing requires a huge amount of computation, mainly at the low-level processing stage, where algorithms deal with great numbers of pixel data. One solution for estimating motion involves detecting correspondences between two images. Owing to its regular processing scheme, a parallel implementation of the correspondence problem is an adequate approach to reducing computation time. This work introduces a parallel, real-time implementation of such low-level tasks, carried out from the moment the current image is acquired by the camera until the pairs of point matchings are detected.
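As a rough illustration of why the correspondence problem parallelizes well, the sketch below distributes exhaustive block matching (sum of absolute differences) over a process pool. The paper targets dedicated parallel hardware rather than Python processes, and the block and search sizes here are arbitrary:

```python
# Data-parallel sketch of block-matching correspondence between two frames.
import numpy as np
from multiprocessing import Pool

BLOCK, SEARCH = 8, 4  # block size and search radius, in pixels (arbitrary)

def match_block(args):
    """Find the displacement of one reference block by exhaustive SAD search."""
    y, x, img1, img2 = args
    ref = img1[y:y + BLOCK, x:x + BLOCK].astype(int)
    best, best_dxy = np.inf, (0, 0)
    for dy in range(-SEARCH, SEARCH + 1):
        for dx in range(-SEARCH, SEARCH + 1):
            yy, xx = y + dy, x + dx
            if 0 <= yy <= img2.shape[0] - BLOCK and 0 <= xx <= img2.shape[1] - BLOCK:
                sad = np.abs(ref - img2[yy:yy + BLOCK, xx:xx + BLOCK].astype(int)).sum()
                if sad < best:
                    best, best_dxy = sad, (dy, dx)
    return (y, x), best_dxy

def motion_field(img1, img2):
    """Match every block of img1 against img2, one task per block."""
    tasks = [(y, x, img1, img2)
             for y in range(0, img1.shape[0] - BLOCK + 1, BLOCK)
             for x in range(0, img1.shape[1] - BLOCK + 1, BLOCK)]
    with Pool() as pool:
        return dict(pool.map(match_block, tasks))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.integers(0, 255, (32, 32))
    b = np.roll(a, (2, 1), axis=(0, 1))      # shift the frame by (2, 1)
    print(motion_field(a, b)[(8, 8)])        # -> (2, 1)
```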
Abstract:
BACKGROUND: PCR has the potential to detect and precisely quantify specific DNA sequences, but it is not yet often used as a fully quantitative method. A number of data collection and processing strategies have been described for the implementation of quantitative PCR. However, they can be experimentally cumbersome, their relative performances have not been evaluated systematically, and they often remain poorly validated statistically and/or experimentally. In this study, we evaluated the performance of known methods and compared them with newly developed data processing strategies in terms of resolution, precision, and robustness. RESULTS: Our results indicate that simple methods that do not rely on estimating the efficiency of PCR amplification may provide reproducible and sensitive data, but that they do not quantify DNA with precision. Other evaluated methods, based on sigmoidal or exponential curve fitting, were generally of both poor resolution and poor precision. A statistical analysis of the parameters that influence efficiency indicated that it depends mostly on the selected amplicon and, to a lesser extent, on the particular biological sample analyzed. Thus, we devised various strategies based on individual or averaged efficiency values, which were used to assess the regulated expression of several genes in response to a growth factor. CONCLUSION: Overall, qPCR data analysis methods differ significantly in their performance, and this analysis identifies methods that provide DNA quantification estimates of high precision, robustness, and reliability. These methods allow reliable estimation of relative expression ratios of two-fold or higher, and our analysis provides an estimate of the number of biological samples that must be analyzed to achieve a given precision.
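One widely used efficiency-based calculation of the kind compared here is the Pfaffl-type ratio, in which each amplicon's own amplification efficiency corrects its delta-Ct. The sketch below uses invented Ct values; the paper's own strategies are variants built on individual or averaged efficiencies:

```python
def relative_ratio(e_target: float, e_ref: float,
                   ct_target_ctrl: float, ct_target_sample: float,
                   ct_ref_ctrl: float, ct_ref_sample: float) -> float:
    """Efficiency-corrected relative expression ratio (Pfaffl-style).

    e_target and e_ref are amplification efficiencies per cycle
    (2.0 = perfect doubling); Ct values come from control and sample.
    """
    return (e_target ** (ct_target_ctrl - ct_target_sample)
            / e_ref ** (ct_ref_ctrl - ct_ref_sample))

# Invented Ct values: target induced by the growth factor, reference stable.
print(relative_ratio(1.95, 2.0, 24.0, 22.0, 18.0, 18.1))  # ≈ 4.1-fold
```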
Abstract:
Background: Combining different sources of information to improve the available biological knowledge is a current challenge in bioinformatics. One of the most powerful approaches to integrating heterogeneous data types is kernel-based methods. Kernel-based data integration consists of two basic steps: first, an appropriate kernel is chosen for each data set; second, the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results: We analyze the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables that belong to each dataset. In particular, for each input variable or linear combination of input variables, we can represent the local direction of maximum growth, which allows us to identify the samples with higher or lower values of the variables analyzed. Conclusions: The integration of different datasets and the simultaneous representation of samples and variables together give a better understanding of biological knowledge.
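A minimal sketch of the two-step scheme — one kernel per source, then a combined kernel fed to kernel PCA — is given below. The unweighted kernel sum, the RBF kernels, and all parameters are illustrative choices, not necessarily those of the paper:

```python
# Combine one RBF kernel per data source by summation, centre the combined
# kernel, and extract kernel principal components for the samples.
import numpy as np

def rbf_kernel(X: np.ndarray, gamma: float) -> np.ndarray:
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_pca(K: np.ndarray, n_components: int) -> np.ndarray:
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n        # centring matrix
    Kc = H @ K @ H
    vals, vecs = np.linalg.eigh(Kc)            # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    # Sample projections onto the leading kernel principal components.
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

# Two hypothetical data sources measured on the same 30 samples.
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(30, 50)), rng.normal(size=(30, 8))
K = rbf_kernel(X1, 0.01) + rbf_kernel(X2, 0.1)  # simple unweighted sum
Z = kernel_pca(K, 2)                            # 2-D representation of samples
```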
Abstract:
DnaSP is a software package for a comprehensive analysis of DNA polymorphism data. Version 5 implements a number of new features and analytical methods allowing extensive DNA polymorphism analyses on large datasets. Among other features, the newly implemented methods allow for: (i) analyses on multiple data files; (ii) haplotype phasing; (iii) analyses on insertion/deletion polymorphism data; (iv) visualizing sliding window results integrated with available genome annotations in the UCSC browser.
Abstract:
Transaction processing is generally considered one of the basic requirements of reliable data processing. A transaction is a sequence of operations whose execution can be regarded as a single logical action. Four basic rules, known by the acronym ACID, govern the execution of transactions. In distributed systems the need for transaction processing grows further, since the success of operations cannot be ensured by local mechanisms alone. Several attempts have been made to standardize distributed transaction processing, but despite these attempts no generally accepted and open standards exist. The closest to such a standard is probably the X/Open DTP family of standards, and in particular its XA standard. This work additionally investigates how the vendor-independent database architecture of the Intellitel ONE system should be developed, with the goal of enabling applications that require transaction processing to run on top of it.
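The XA standard mentioned above coordinates resource managers through a two-phase commit. The toy sketch below mirrors only that control flow — vote in a prepare phase, then commit or roll back everywhere — not the actual XA C interface:

```python
# Toy two-phase commit: the coordinator commits only if every resource
# manager votes yes in the prepare phase; otherwise all roll back.
class ResourceManager:
    def __init__(self, name: str, healthy: bool = True):
        self.name, self.healthy = name, healthy

    def prepare(self) -> bool:          # phase 1: vote
        return self.healthy

    def commit(self):                   # phase 2a: everyone voted yes
        print(f"{self.name}: committed")

    def rollback(self):                 # phase 2b: at least one vote was no
        print(f"{self.name}: rolled back")

def two_phase_commit(resources) -> bool:
    if all(rm.prepare() for rm in resources):
        for rm in resources:
            rm.commit()
        return True
    for rm in resources:
        rm.rollback()
    return False

two_phase_commit([ResourceManager("db1"), ResourceManager("db2", healthy=False)])
```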
Abstract:
Gaia is the most ambitious space astrometry mission currently envisaged and is a technological challenge in all its aspects. We describe a proposal for the payload data handling system of Gaia, as an example of a high-performance, real-time, concurrent, and pipelined data system. This proposal includes the front-end systems for the instrumentation, the data acquisition and management modules, the star data processing modules, and the payload data handling unit. We also review other payload and service module elements, and we illustrate a proposed data flow.
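The pipelined, concurrent structure described can be caricatured with queue-connected stages. The sketch below is purely structural — an acquisition producer feeding processing stages over bounded queues — and has no relation to Gaia's actual payload software:

```python
# Structural sketch of a pipelined, concurrent data-handling chain:
# acquisition -> process -> format stages linked by bounded queues.
import queue
import threading

SENTINEL = None  # end-of-stream marker

def stage(fn, inbox, outbox):
    """Consume items from inbox, transform them, and forward downstream."""
    while (item := inbox.get()) is not SENTINEL:
        outbox.put(fn(item))
    outbox.put(SENTINEL)

q1, q2, q3 = (queue.Queue(maxsize=64) for _ in range(3))
threads = [
    threading.Thread(target=stage, args=(lambda x: x * 2, q1, q2)),       # "process"
    threading.Thread(target=stage, args=(lambda x: f"pkt:{x}", q2, q3)),  # "format"
]
for t in threads:
    t.start()
for sample in range(5):                 # front-end "acquisition"
    q1.put(sample)
q1.put(SENTINEL)
for pkt in iter(q3.get, SENTINEL):      # downlink/storage consumer
    print(pkt)
for t in threads:
    t.join()
```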
Abstract:
Early warning systems (EWSs) rely on the capacity to forecast a dangerous event a certain amount of time in advance, by defining warning criteria on which the safety of the population will depend. Monitoring of landslides is facilitated by new technologies, decreasing prices, and easier data processing. At the same time, predicting the onset of rapid failure, or the sudden transition from slow to rapid failure and subsequent collapse, and its consequences is challenging for scientists, who must deal with uncertainties and have limited tools to do so. Furthermore, EWSs and warning criteria are increasingly a subject of concern among technical experts, researchers, stakeholders, and decision makers responsible for the activation, enforcement, and approval of civil protection actions. EWSs also imply a sharing of responsibilities that is often avoided by technical staff, managers of technical offices, and governing institutions. We organized the First International Workshop on Warning Criteria for Active Slides (IWWCAS) to promote sharing and networking among members of specialized institutions and relevant EWS experts. In this paper, we summarize the event to stimulate discussion and collaboration between organizations dealing with the complex task of managing hazard and risk related to active slides.
Abstract:
This study examined nurses' attitudes toward computers before training and 2 months after training. A quantitative approach and a nonexperimental survey design were used. Stronge and Brodt's (1985) instrument, the Nurses' Attitudes Toward Computerization Questionnaire, was used to assess 27 nurses' attitudes prior to and 2 months after computer training. Demographic variables also were collected on the questionnaires. The results showed that, overall, nurses had positive attitudes toward computers on both questionnaires. The results of the first questionnaire were consistent with other studies; no previous studies had involved a follow-up questionnaire using Stronge and Brodt's (1985) instrument. Attitude scores on Questionnaire 2 were higher than those on Questionnaire 1. Questions concerning more time for nursing tasks, less time for quality patient care, and threat to job security were found to be statistically significant. The study found no statistically significant relationship between attitudes and demographic variables. Younger nurses and nurses with fewer years of computer experience were most likely to exhibit positive attitudes. Implications for practice and future research were discussed, and some limitations were identified.
Abstract:
A simple, low-cost concentric capillary nebulizer (CCN) was developed and evaluated for ICP spectrometry. The CCN could be operated at sample uptake rates of 0.050-1.00 ml min⁻¹ and under oscillating and non-oscillating conditions. Aerosol characteristics of the CCN were studied using a laser Fraunhofer diffraction analyzer. Solvent transport efficiencies and transport rates, detection limits, and short- and long-term stabilities were evaluated for the CCN with a modified cyclonic spray chamber at different sample uptake rates. The Mg II (280.2 nm)/Mg I (285.2 nm) ratio was used for matrix effect studies. Results were compared with those of conventional nebulizers: a cross-flow nebulizer with a Scott-type spray chamber, a GemCone nebulizer with a cyclonic spray chamber, and a Meinhard TR-30-K3 concentric nebulizer with a cyclonic spray chamber. Transport efficiencies of up to 57% were obtained for the CCN. For the elements tested, short- and long-term precisions and detection limits obtained with the CCN at 0.050-0.500 ml min⁻¹ are similar to, or better than, those obtained on the same instrument using the conventional nebulizers (at 1.0 ml min⁻¹). The depressive and enhancing effects of the easily ionizable element Na, sulfuric acid, and dodecylamine surfactant on analyte signals with the CCN are similar to, or less severe than, those obtained with the conventional nebulizers. However, capillary clogging was observed when sample solutions with high dissolved solids were nebulized for more than 40 min. The effects of data acquisition and data processing on detection limits were studied using inductively coupled plasma-atomic emission spectrometry. The study examined the effects of different detection limit approaches, data integration modes, regression modes, the standard concentration range and the number of standards, sample uptake rate, and integration time. All the experiments followed the same protocols. Three detection limit approaches were examined: the IUPAC method, the residual standard deviation (RSD) method, and the signal-to-background ratio and relative standard deviation of the background (SBR-RSDB) method. The study demonstrated that the choice of approach, integration mode, regression method, and sample uptake rate can all affect detection limits. It also showed that the different approaches give different detection limits and that some (for example, RSD) are sensitive to the quality of the calibration curves. Multicomponent spectral fitting (MSF) gave the best results among the three integration modes examined: peak height, peak area, and MSF. The weighted least squares method yielded better-quality calibration curves. Although no effect of the number of standards on detection limits was observed, multiple standards are recommended because they provide more reliable calibration curves. In principle, increasing the sample uptake rate and the integration time could improve detection limits; in practice, however, no improvement with increased integration time was observed because the auto-integration mode was used.
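Of the three detection-limit approaches listed, the IUPAC-style method is commonly computed as three standard deviations of the blank divided by the calibration slope. A small sketch with invented numbers (the study's actual data and protocols are not reproduced here):

```python
import numpy as np

def detection_limit(blank_signals, conc, signal, k: float = 3.0) -> float:
    """IUPAC-style detection limit: k·s_blank / calibration slope."""
    slope = np.polyfit(conc, signal, 1)[0]        # unweighted linear fit
    return k * np.std(blank_signals, ddof=1) / slope

# Invented blank replicates and a five-point calibration (mg/L vs counts).
blanks = [102, 98, 101, 99, 100, 103]
conc   = [0.0, 0.5, 1.0, 2.0, 5.0]
signal = [100, 610, 1120, 2130, 5150]
print(f"DL ≈ {detection_limit(blanks, conc, signal):.4f} mg/L")
```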
Abstract:
The rapid evolution of exoplanet detection and characterization technologies since the early 1990s suggests that new instruments of the Terrestrial Planet Finder (TPF) type will be able to take the first spectra of Earth-like exoplanets within one or two decades. In this context, the study of the spectrum of the only known inhabited planet, the Earth, is essential for designing these instruments and analyzing their results. This research presents visible spectra of the Earth (390-900 nm), acquired over 8 observing nights spread over more than a year. These spectra were obtained by observing the earthshine on the Moon with the 1.6 m telescope of the Observatoire du Mont-Mégantic (OMM). Since the lunar surface diffusely reflects light coming from a portion of the Earth, these spectra are spatially unresolved. The evolution of these spectra as a function of the light reflected at different Earth phases is analogous to that of the spectrum of an exoplanet whose phase changes as it orbits its star. Atmospheric water, oxygen, and ozone, detected in all our spectra, are biomarkers whose presence suggests the habitability of the planet and/or the presence of biological activity. The Vegetation Red Edge (VRE), another spectral biosignature, caused by photosynthetic organisms at the surface, is characterized by an increase in reflectivity around 700 nm. For the spectra of 5 nights, this increase was evaluated at between -5 and 15% ± ~5%, after the contributions of Rayleigh scattering, aerosols, and a broad molecular ozone band were removed. The measured values are consistent with the presence of vegetation in the Earth phase contributing to the spectrum, but span a wider range of variation than those found in the literature (0-10%). This could be explained by choices made during data reduction and the VRE calculation, or by the presence of other surface or atmospheric features whose spectral contribution around 700 nm would be variable.
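The VRE quoted above is conventionally computed from mean reflectances in a red band and a near-infrared band on either side of the ~700 nm rise, VRE = (R_NIR − R_red)/R_red. The band edges in this sketch are typical choices, not necessarily those used in the study:

```python
import numpy as np

def vre(wavelength_nm: np.ndarray, reflectance: np.ndarray,
        red=(600.0, 670.0), nir=(740.0, 800.0)) -> float:
    """Vegetation Red Edge: relative reflectance rise across ~700 nm,
    VRE = (R_NIR - R_red) / R_red, with band means on each side."""
    r_red = reflectance[(wavelength_nm >= red[0]) & (wavelength_nm <= red[1])].mean()
    r_nir = reflectance[(wavelength_nm >= nir[0]) & (wavelength_nm <= nir[1])].mean()
    return (r_nir - r_red) / r_red

# Synthetic flat spectrum with a 5% step at 700 nm -> VRE ≈ 0.05.
wl = np.linspace(600.0, 800.0, 201)
r = np.where(wl < 700.0, 1.00, 1.05)
print(vre(wl, r))
```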