888 results for Parallel processing (Electronic computers) - Research


Relevance:

30.00%

Publisher:

Abstract:

DNA strand-breaks (SBs) with non-ligatable ends are generated by ionizing radiation, oxidative stress, various chemotherapeutic agents, and also as base excision repair (BER) intermediates. Several neurological diseases have already been identified as being due to a deficiency in DNA end-processing activities. Two common dirty ends, 3'-P and 5'-OH, are processed by mammalian polynucleotide kinase 3'-phosphatase (PNKP), a bifunctional enzyme with 3'-phosphatase and 5'-kinase activities. We have made the unexpected observation that PNKP stably associates with Ataxin-3 (ATXN3), a polyglutamine repeat-containing protein mutated in spinocerebellar ataxia type 3 (SCA3), also known as Machado-Joseph Disease (MJD). This disease is one of the most common dominantly inherited ataxias worldwide; the defect in SCA3 is due to CAG repeat expansion (from the normal 14-41 to 55-82 repeats) in the ATXN3 coding region. However, how the expanded form gains its toxic function is still not clearly understood. Here we report that purified wild-type (WT) ATXN3 stimulates, and by contrast the mutant form specifically inhibits, PNKP's 3' phosphatase activity in vitro. ATXN3-deficient cells also show decreased PNKP activity. Furthermore, transgenic mice conditionally expressing the pathological form of human ATXN3 also showed decreased 3'-phosphatase activity of PNKP, mostly in the deep cerebellar nuclei, one of the most affected regions in MJD patients' brain. Finally, long amplicon quantitative PCR analysis of human MJD patients' brain samples showed a significant accumulation of DNA strand breaks. Our results thus indicate that the accumulation of DNA strand breaks due to functional deficiency of PNKP is etiologically linked to the pathogenesis of SCA3/MJD.

Relevance:

30.00%

Publisher:

Abstract:

"A workshop within the 19th International Conference on Applications and Theory of Petri Nets - ICATPN’1998"

Relevance:

30.00%

Publisher:

Abstract:

As digital image processing techniques become increasingly used in a broad range of consumer applications, the critical need to evaluate algorithm performance has become recognised by developers as an area of vital importance. With digital image processing algorithms now playing a greater role in security and protection applications, it is of crucial importance that we are able to empirically study their performance. Apart from the field of biometrics, little emphasis has been put on algorithm performance evaluation until now, and where evaluation has taken place, it has been carried out in a somewhat cumbersome and unsystematic fashion, without any standardised approach. This paper presents a comprehensive testing methodology and framework aimed at automating the evaluation of image processing algorithms. Ultimately, the test framework aims to shorten the algorithm development life cycle by helping to identify algorithm performance problems quickly and more efficiently.
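
To make the kind of automated evaluation described above concrete, here is a minimal sketch of a test harness that runs a detection algorithm over labelled ground truth and tallies a confusion matrix. It is illustrative only: the paper does not publish its framework's API, and the function signature and dataset shape below are assumptions.

    # Hedged sketch of an automated evaluation harness for an image
    # processing algorithm, assuming per-image boolean ground truth.
    from dataclasses import dataclass
    from typing import Callable, Iterable, Tuple

    @dataclass
    class Metrics:
        tp: int = 0
        fp: int = 0
        fn: int = 0
        tn: int = 0

        @property
        def precision(self) -> float:
            return self.tp / (self.tp + self.fp) if (self.tp + self.fp) else 0.0

        @property
        def recall(self) -> float:
            return self.tp / (self.tp + self.fn) if (self.tp + self.fn) else 0.0

    def evaluate(algorithm: Callable[[str], bool],
                 samples: Iterable[Tuple[str, bool]]) -> Metrics:
        """Run `algorithm` on every (image_path, expected) pair and tally
        a confusion matrix, so performance regressions surface quickly."""
        m = Metrics()
        for path, expected in samples:
            predicted = algorithm(path)
            if predicted and expected:
                m.tp += 1
            elif predicted and not expected:
                m.fp += 1
            elif not predicted and expected:
                m.fn += 1
            else:
                m.tn += 1
        return m

Running such a harness on every build is what shortens the development cycle: a drop in precision or recall is detected the day it is introduced rather than during late-stage testing.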

Relevance:

30.00%

Publisher:

Abstract:

This project was funded under the Applied Research Grants Scheme administered by Enterprise Ireland. The project was a partnership between Galway-Mayo Institute of Technology and an industrial company, Tyco/Mallinckrodt Galway. The project aimed to develop a semi-automatic, self-learning pattern recognition system capable of detecting defects on printed circuit boards, such as component vacancy, component misalignment, component orientation, component error, and component weld. The research was conducted in three directions: image acquisition, image filtering/recognition, and software development. Image acquisition studied the process of forming and digitizing images and some fundamental aspects of human visual perception. The importance of choosing the right camera and illumination system for a given type of problem has been highlighted. Probably the most important step towards image recognition is image filtering. The filters are used to correct and enhance images in order to prepare them for recognition. Convolution, histogram equalisation, filters based on Boolean mathematics, noise reduction, edge detection, geometrical filters, cross-correlation filters, and image compression are some examples of the filters that have been studied and successfully implemented in the software application. The software application developed during the research is customized to meet the requirements of the industrial partner. The application is able to analyze pictures, perform filtering, build libraries, process images, and generate log files. It incorporates most of the filters studied and, together with the illumination system and the camera, provides a fully integrated framework able to analyze defects on printed circuit boards.
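
As an illustration of one of the filters listed above, the sketch below implements normalized cross-correlation template matching, a standard way to flag component vacancy against a "golden" reference image. The function names and the 0.8 threshold are invented for the example, not taken from the project.

    import numpy as np

    def normalized_cross_correlation(patch: np.ndarray, template: np.ndarray) -> float:
        """Score how well `patch` matches `template`; 1.0 means identical
        up to brightness/contrast, values near 0 suggest a mismatch."""
        p = patch.astype(float) - patch.mean()
        t = template.astype(float) - template.mean()
        denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
        return float((p * t).sum() / denom) if denom else 0.0

    def component_present(board_region: np.ndarray,
                          golden_template: np.ndarray,
                          threshold: float = 0.8) -> bool:
        """Flag component vacancy by comparing a board region against a
        reference image of a correctly placed component."""
        return normalized_cross_correlation(board_region, golden_template) >= threshold

Because the score is invariant to brightness and contrast, this kind of filter tolerates the illumination variations that the project's camera and lighting study was concerned with.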

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study was to evaluate the determinism of the AS-Interface network and the three main families of control systems which may use it, namely PLC, PC, and RTOS. During the course of this study the PROFIBUS and Ethernet field-level networks were also considered, in order to ensure that they would not introduce unacceptable latencies into the overall control system. This research demonstrated that an incorrectly configured Ethernet network introduces unacceptable latencies of variable duration into the control system, so care must be exercised if the determinism of a control system is not to be compromised. This study introduces a new concept of using statistics and process capability metrics, in the form of Cpk values, to specify how suitable a control system is for a given control task. The PLC systems which were tested demonstrated extremely deterministic responses, but when a large number of iterations were introduced in the user program, the mean control system latency was much too great for an AS-I network. Thus the PLC was found to be unsuitable for an AS-I network if a large, complex user program is required. The PC systems which were tested were non-deterministic and had latencies of variable duration. These latencies became extremely exaggerated when a graphing ActiveX control was included in the control application. These PC systems also exhibited a non-normal frequency distribution of control system latencies, and as such are unsuitable for implementation with an AS-I network. The RTOS system which was tested overcame the problems identified with the PLC systems and produced an extremely deterministic response, even when a large number of iterations were introduced in the user program. The RTOS system is therefore capable of providing a suitably deterministic control system response, even when an extremely large, complex user program is required.
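
The Cpk metric the study introduces compares the spread of measured latencies against specification limits: Cpk = min((USL - mu) / (3 * sigma), (mu - LSL) / (3 * sigma)). A minimal sketch of the computation, with the latency samples and limits invented for the example (the study's actual limits are not given in the abstract):

    import statistics

    def cpk(samples: list[float], lsl: float, usl: float) -> float:
        """Process capability index: how comfortably the latency
        distribution sits inside [lsl, usl]. Higher is better; values
        below about 1.33 are conventionally considered inadequate.
        Assumes an approximately normal distribution, which is exactly
        what the tested PC systems violated."""
        mu = statistics.mean(samples)
        sigma = statistics.stdev(samples)
        return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

    # Example: latencies in milliseconds against a 0-5 ms specification.
    latencies_ms = [1.9, 2.1, 2.0, 2.2, 1.8, 2.0, 2.1, 1.9]
    print(cpk(latencies_ms, lsl=0.0, usl=5.0))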

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, a great deal of attention from academia and research teams is drawn to the potential of the 60 GHz frequency band in wireless communications. The 60 GHz band offers great possibilities for a wide variety of applications that are yet to be implemented, but these applications also pose huge implementation challenges. One such example is building a high data rate transceiver that at the same time has very low power consumption. In this paper we present a prototype of a Single Carrier (SC) transceiver system, giving a brief overview of the baseband design and emphasizing the most important decisions that need to be made. A brief overview of the possible approaches to implementing the equalizer, the most complex module in the SC transceiver, is also presented. The main focus of this paper is to suggest a parallel architecture for the receiver in a Single Carrier communication system. This provides higher data rates than the communication system could otherwise achieve, at the price of higher power consumption. The suggested receiver architecture is illustrated in the paper, and the results of its implementation are given in comparison with the corresponding serial implementation.
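
The abstract only surveys equalizer options, so as a hedged illustration the sketch below uses block-wise frequency-domain equalization with a zero-forcing criterion, a common choice for cyclic-prefixed single-carrier systems; it is the independence of blocks, after cyclic-prefix removal, that a parallel receiver architecture can exploit. All names and parameters are assumptions, not the paper's design.

    import numpy as np

    def fde_zero_forcing(rx_block: np.ndarray, channel_freq: np.ndarray) -> np.ndarray:
        """Equalize one cyclic-prefixed single-carrier block in the
        frequency domain: FFT, per-bin division by the channel
        response, then IFFT back to the time domain."""
        return np.fft.ifft(np.fft.fft(rx_block) / channel_freq)

    def equalize_parallel(rx_blocks: list[np.ndarray],
                          channel_freq: np.ndarray) -> list[np.ndarray]:
        """Blocks are independent, so they can be dispatched to
        parallel hardware lanes; this loop stands in for those lanes."""
        return [fde_zero_forcing(b, channel_freq) for b in rx_blocks]

In hardware, each lane would be a replicated FFT/multiply/IFFT pipeline, which is where the trade-off the paper describes comes from: more lanes mean higher throughput but proportionally higher power consumption.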

Relevance:

30.00%

Publisher:

Abstract:

The modern computer systems in use nowadays are mostly processor-dominant, which means that their memory is treated as a slave element with one major task: to serve the data requirements of the execution units. This organization is based on the classical von Neumann computer model, proposed seven decades ago in the 1950s. This model suffers from a substantial processor-memory bottleneck, because of the huge disparity between processor and memory working speeds. In order to address this problem, in this paper we propose a novel architecture and organization of processors and computers that attempts to provide a stronger match between the processing and memory elements in the system. The proposed model utilizes a memory-centric architecture, wherein execution hardware is added to the memory code blocks, allowing them to perform instruction scheduling and execution, management of data requests and responses, and direct communication with the data memory blocks without using registers. This organization allows concurrent execution of all threads, processes, or program segments that fit in the memory at a given time. We therefore describe several possibilities for organizing the proposed memory-centric system with multiple merged data and logic-memory blocks, utilizing a high-speed interconnection switching network.
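
As a deliberately simplified illustration of the memory-centric idea (a software toy model, not the proposed hardware), the sketch below stores code and data together in each block and lets a per-block "execution unit" (here, a thread) run the code in place, so every resident program segment executes concurrently with no central processor; the block structure is invented for the example.

    import threading

    class MemoryBlock:
        """Toy model of a merged logic-memory block: code and data live
        together, and a local execution unit runs the code in place."""

        def __init__(self, name: str, program, data: list[int]):
            self.name = name
            self.program = program      # code stored in the block
            self.data = data            # data stored in the block
            self.result = None

        def execute(self):
            # Local execution: no instruction fetch across a shared
            # processor-memory bus, no central register file.
            self.result = self.program(self.data)

    blocks = [
        MemoryBlock("b0", sum, [1, 2, 3]),
        MemoryBlock("b1", max, [7, 4, 9]),
    ]
    threads = [threading.Thread(target=b.execute) for b in blocks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print({b.name: b.result for b in blocks})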

Relevance:

30.00%

Publisher:

Abstract:

This paper shows how a high-level matrix programming language may be used to perform Monte Carlo simulation, bootstrapping, estimation by maximum likelihood and GMM, and kernel regression in parallel on symmetric multiprocessor computers or clusters of workstations. Parallelization is implemented in such a way that an investigator may use the programs without any knowledge of parallel programming. A bootable CD that allows rapid creation of a cluster for parallel computing is introduced. Examples show that parallelization can lead to important reductions in computational time. A detailed discussion of how the Monte Carlo problem was parallelized is included as an example for learning to write parallel programs for Octave.
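
The paper's programs are written for Octave; the sketch below restates the same embarrassingly parallel Monte Carlo pattern in Python, with the per-replication statistic invented as a placeholder. Replications are independent, which is why the investigator-facing interface can hide the parallelism entirely, as the paper advocates.

    import multiprocessing as mp
    import random

    def one_replication(seed: int) -> float:
        """One Monte Carlo replication; a stand-in for the statistic
        of interest (here, a sample mean of 100 standard normals)."""
        rng = random.Random(seed)
        return sum(rng.gauss(0, 1) for _ in range(100)) / 100

    def monte_carlo(n_reps: int, n_workers: int = 4) -> float:
        """Replications are independent, so they parallelize trivially:
        farm them out to a worker pool and average the results. The
        caller never touches the parallel machinery."""
        with mp.Pool(n_workers) as pool:
            draws = pool.map(one_replication, range(n_reps))
        return sum(draws) / n_reps

    if __name__ == "__main__":
        print(monte_carlo(1000))

Seeding each replication from its index keeps the runs reproducible regardless of how they are distributed across workers, which matters when the same study is rerun on a cluster of a different size.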

Relevance:

30.00%

Publisher:

Abstract:

This article presents the results of a survey on the use of electronic journals conducted among the teaching staff of the universities that make up the Consorci de Biblioteques Universitàries de Catalunya (CBUC). The results show a high degree of awareness of the electronic journal collection among teaching and research staff, and a growing preference for the electronic format over print. This high degree of awareness and use of electronic titles, and the preference for this medium, translate into a high valuation of the electronic journal collection. At the same time, most users anticipate an increase in their use of electronic titles over the coming years. The results also confirm the importance of discipline and age as explanatory factors in the use of electronic journals.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: This study describes the prevalence, associated anomalies, and demographic characteristics of cases of multiple congenital anomalies (MCA) in 19 population-based European registries (EUROCAT) covering 959,446 births in 2004 and 2010. METHODS: EUROCAT implemented a computer algorithm for classification of congenital anomaly cases followed by manual review of potential MCA cases by geneticists. MCA cases are defined as cases with two or more major anomalies of different organ systems, excluding sequences, chromosomal and monogenic syndromes. RESULTS: The combination of an epidemiological and clinical approach for classification of cases has improved the quality and accuracy of the MCA data. Total prevalence of MCA cases was 15.8 per 10,000 births. Fetal deaths and termination of pregnancy were significantly more frequent in MCA cases compared with isolated cases (p < 0.001) and MCA cases were more frequently prenatally diagnosed (p < 0.001). Live born infants with MCA were more often born preterm (p < 0.01) and with birth weight < 2500 grams (p < 0.01). Respiratory and ear, face, and neck anomalies were the most likely to occur with other anomalies (34% and 32%) and congenital heart defects and limb anomalies were the least likely to occur with other anomalies (13%) (p < 0.01). However, due to their high prevalence, congenital heart defects were present in half of all MCA cases. Among males with MCA, the frequency of genital anomalies was significantly greater than the frequency of genital anomalies among females with MCA (p < 0.001). CONCLUSION: Although rare, MCA cases are an important public health issue, because of their severity. The EUROCAT database of MCA cases will allow future investigation on the epidemiology of these conditions and related clinical and diagnostic problems.
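
The first stage of the classification described above, flagging cases with two or more major anomalies in different organ systems before manual review by geneticists, reduces to a simple set operation. A hedged sketch, with the data representation invented for illustration:

    def is_potential_mca(case_anomalies: list[tuple[str, str]]) -> bool:
        """A case is a potential multiple congenital anomaly (MCA) case
        if it carries two or more major anomalies in *different* organ
        systems. Each anomaly is (organ_system, description); exclusion
        of sequences, chromosomal and monogenic syndromes happens in
        the subsequent manual review by geneticists."""
        organ_systems = {system for system, _ in case_anomalies}
        return len(organ_systems) >= 2

    # Example: heart defect plus limb anomaly -> flagged for review.
    print(is_potential_mca([("cardiac", "VSD"), ("limb", "polydactyly")]))  # True
    print(is_potential_mca([("cardiac", "VSD"), ("cardiac", "ASD")]))       # False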

Relevance:

30.00%

Publisher:

Abstract:

We have used massively parallel signature sequencing (MPSS) to sample the transcriptomes of 32 normal human tissues to an unprecedented depth, thus documenting the patterns of expression of almost 20,000 genes with high sensitivity and specificity. The data confirm the widely held belief that differences in gene expression between cell and tissue types are largely determined by transcripts derived from a limited number of tissue-specific genes, rather than by combinations of more promiscuously expressed genes. Expression of a little more than half of all known human genes seems to account for both the common requirements and the specific functions of the tissues sampled. A classification of tissues based on patterns of gene expression largely reproduces classifications based on anatomical and biochemical properties. The unbiased sampling of the human transcriptome achieved by MPSS supports the idea that most human genes have been mapped, if not functionally characterized. This data set should prove useful for the identification of tissue-specific genes, for the study of global changes induced by pathological conditions, and for the definition of a minimal set of genes necessary for basic cell maintenance. The data are available on the Web at http://mpss.licr.org and http://sgb.lynxgen.com.

Relevance:

30.00%

Publisher:

Abstract:

Previous studies have demonstrated that a region in the left ventral occipito-temporal (LvOT) cortex is highly selective to the visual forms of written words and objects relative to closely matched visual stimuli. Here, we investigated why LvOT activation is not higher for reading than picture naming even though written words and pictures of objects have grossly different visual forms. To compare neuronal responses for words and pictures within the same LvOT area, we used functional magnetic resonance imaging adaptation and instructed participants to name target stimuli that followed briefly presented masked primes that were either presented in the same stimulus type as the target (word-word, picture-picture) or a different stimulus type (picture-word, word-picture). We found that activation throughout posterior and anterior parts of LvOT was reduced when the prime had the same name/response as the target irrespective of whether the prime-target relationship was within or between stimulus type. As posterior LvOT is a visual form processing area, and there was no visual form similarity between different stimulus types, we suggest that our results indicate automatic top-down influences from pictures to words and words to pictures. This novel perspective motivates further investigation of the functional properties of this intriguing region.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The synthesis of published research in systematic reviews is essential when providing evidence to inform clinical and health policy decision-making. However, the validity of systematic reviews is threatened if journal publications represent a biased selection of all studies that have been conducted (dissemination bias). To investigate the extent of dissemination bias, we conducted a systematic review that determined the proportion of studies published as peer-reviewed journal articles and investigated factors associated with full publication in cohorts of studies (i) approved by research ethics committees (RECs) or (ii) included in trial registries. METHODS AND FINDINGS: Four bibliographic databases were searched for methodological research projects (MRPs) without limitations on publication year, language, or study location. The searches were supplemented by handsearching the references of included MRPs. We estimated the proportion of studies published using prediction intervals (PI) and a random-effects meta-analysis. Pooled odds ratios (OR) were used to express associations between study characteristics and journal publication. Seventeen MRPs (23 publications) evaluated cohorts of studies approved by RECs; the proportion of published studies had a PI between 22% and 72%, and the weighted pooled proportion when combining estimates would be 46.2% (95% CI 40.2%-52.4%, I2 = 94.4%). Twenty-two MRPs (22 publications) evaluated cohorts of studies included in trial registries; the PI of the proportion published ranged from 13% to 90%, and the weighted pooled proportion would be 54.2% (95% CI 42.0%-65.9%, I2 = 98.9%). REC-approved studies with statistically significant results (compared with those without statistically significant results) were more likely to be published (pooled OR 2.8; 95% CI 2.2-3.5). Phase III trials were also more likely to be published than phase II trials (pooled OR 2.0; 95% CI 1.6-2.5). The probability of publication within two years after study completion ranged from 7% to 30%. CONCLUSIONS: A substantial proportion of the studies approved by RECs or included in trial registries remains unpublished. Due to the large heterogeneity, a prediction of the publication probability for a future study is very uncertain. Non-publication of research is not a random process; for example, it is associated with the direction of study findings. Our findings suggest that the dissemination of research findings is biased.
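
The pooled proportions quoted come from a random-effects meta-analysis; a minimal DerSimonian-Laird sketch on logit-transformed proportions is shown below. The input data are made up for illustration, not the review's, and the sketch assumes every study has 0 < events < total.

    import math

    def dersimonian_laird(events: list[int], totals: list[int]) -> float:
        """Pool proportions across studies with a DerSimonian-Laird
        random-effects model on the logit scale; returns the pooled
        proportion on the natural scale."""
        # Per-study logit proportions and within-study variances.
        thetas = [math.log(e / (n - e)) for e, n in zip(events, totals)]
        variances = [1 / e + 1 / (n - e) for e, n in zip(events, totals)]

        # Fixed-effect weights and heterogeneity statistic Q.
        w = [1 / v for v in variances]
        theta_fixed = sum(wi * t for wi, t in zip(w, thetas)) / sum(w)
        q = sum(wi * (t - theta_fixed) ** 2 for wi, t in zip(w, thetas))

        # Method-of-moments estimate of between-study variance tau^2.
        k = len(thetas)
        tau2 = max(0.0, (q - (k - 1)) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))

        # Random-effects weights, pooled estimate, back-transform.
        w_re = [1 / (v + tau2) for v in variances]
        pooled_logit = sum(wi * t for wi, t in zip(w_re, thetas)) / sum(w_re)
        return 1 / (1 + math.exp(-pooled_logit))

    # Made-up example: published / total studies in three cohorts.
    print(dersimonian_laird([30, 50, 12], [60, 120, 20]))

The between-study variance term is what widens the interval when heterogeneity is large, which is why, with I2 values above 94%, the review warns that the publication probability of any future study is very uncertain.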

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: Several studies observed a female advantage in the prognosis of cutaneous melanoma, for which behavioral factors or an underlying biologic mechanism might be responsible. Using complete and reliable follow-up data from four phase III trials of the European Organisation for Research and Treatment of Cancer (EORTC) Melanoma Group, we explored the female advantage across multiple end points and in relation to other important prognostic indicators. PATIENTS AND METHODS: Patients diagnosed with localized melanoma were included in EORTC adjuvant treatment trials 18832, 18871, 18952, and 18961 and randomly assigned during the period of 1984 to 2005. Cox proportional hazard models were used to calculate hazard ratios (HRs) and 95% CIs for women compared with men, adjusted for age, Breslow thickness, body site, ulceration, performed lymph node dissection, and treatment. RESULTS: A total of 2,672 patients with stage I/II melanoma were included. Women had a highly consistent and independent advantage in overall survival (adjusted HR, 0.70; 95% CI, 0.59 to 0.83), disease-specific survival (adjusted HR, 0.74; 95% CI, 0.62 to 0.88), time to lymph node metastasis (adjusted HR, 0.70; 95% CI, 0.51 to 0.96), and time to distant metastasis (adjusted HR, 0.69; 95% CI, 0.59 to 0.81). Subgroup analysis showed that the female advantage was consistent across all prognostic subgroups (with the possible exception of head and neck melanomas) and in pre- and postmenopausal age groups. CONCLUSION: Women have a consistent and independent relative advantage in all aspects of the progression of localized melanoma of approximately 30%, most likely caused by an underlying biologic sex difference.