111 results for Parallel methods
Abstract:
Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
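Of the methods listed, the quantitative-matrix approach is the simplest to make concrete: each position of a candidate peptide is scored against a position-specific matrix and the scores are summed. The sketch below illustrates the idea only; the matrix values, the 3-mer length (real MHC class I matrices typically cover 9-mers), and the unseen-residue penalty are all invented for illustration and are not taken from any published binding model.

```python
# Illustrative quantitative-matrix scoring of a peptide.
# Matrix values are invented, not real MHC-binding data.

# One {amino_acid: score} dict per peptide position (a 3-mer here;
# real MHC matrices usually cover 9-mer peptides).
PSSM = [
    {"A": 1, "L": 3, "K": -1},
    {"A": 0, "L": 1, "K": 2},
    {"A": -1, "L": 2, "K": 1},
]

def score_peptide(peptide, matrix, default=-2):
    """Sum per-position scores; residues absent from the matrix
    receive a default penalty."""
    return sum(pos.get(aa, default) for pos, aa in zip(matrix, peptide))

print(score_peptide("LKA", PSSM))  # 3 + 2 + (-1) = 4
```

A binding threshold would then be calibrated on known binders and non-binders, which is where the careful data selection and validation stressed in the abstract come in.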
Abstract:
Background: A survey of pathology reporting of breast cancer in Western Australia in 1989 highlighted the need for improvement. The current study documents (1) changes in pathology reporting from 1989 to 1999 and (2) changes in patterns of histopathological prognostic indicators for breast cancer following introduction of mammographic screening in 1989. Methods: Data concerning all breast cancer cases reported in Western Australia in 1989, 1994 and 1999 were retrieved from the State Cancer Registry, Hospital Morbidity data system, and pathology laboratory records. Results: Pathology reports improved in quality during the decade surveyed. For invasive carcinoma, tumour size was not recorded in 1.2% of pathology reports in 1999 compared with 16.1% in 1989 (p<0.001). Corresponding figures for other prognostic factors were: tumour grade 3.3% and 51.6% (p<0.001), tumour type 0.2% and 4.1% (p<0.001), vascular invasion 3.7% and 70.9% (p<0.001), and lymph node status 1.9% and 4.5% (p=0.023). In 1999, 5.9% of reports were not in a synoptic/checklist format, whereas all reports were descriptive in 1989 (p<0.001). For the population as a whole, the proportion of invasive carcinomas <1 cm was 20.9% in 1999 compared with 14.5% in 1989 (p<0.001); for tumours <2 cm the corresponding figures were 65.4% and 59.7% (p=0.013). In 1999, 30.5% of tumours were histologically well-differentiated compared with 10.6% in 1989 (p<0.001), and 61.7% were lymph node negative in 1999 compared with 57.1% in 1989 (p=0.006). Pure ductal carcinoma in situ (DCIS) constituted 10.9% and 7.9% of total cases of breast carcinoma in 1999 and 1989, respectively (p=0.01). Conclusions: Quality of pathology reporting improved markedly over the period, in parallel with adoption of standardised synoptic pathology reports. By 1999, recording of important prognostic information was almost complete.
Frequency of favourable prognostic factors generally increased over time, reflecting expected effects of mammographic screening.
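Comparisons of proportions like those above (e.g. 20.9% vs 14.5% of carcinomas <1 cm) are commonly tested with a two-proportion z-test. The sketch below shows that calculation; the cohort sizes are hypothetical, chosen only for illustration, and the study itself may have used a different (e.g. exact) test.

```python
# Hedged sketch of a two-proportion z-test, as could underlie p-values
# like those reported in the abstract. Sample sizes are invented.
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the normal CDF, via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 20.9% vs 14.5% of carcinomas <1 cm, with hypothetical cohorts of 1000
z, p = two_proportion_z(209, 1000, 145, 1000)
print(f"z = {z:.2f}, p = {p:.5f}")
```

With the study's actual case counts the resulting p-value would differ, but the direction of the comparison is the same.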
Abstract:
This special issue represents a further exploration of some issues raised at a symposium entitled “Functional magnetic resonance imaging: From methods to madness” presented during the 15th annual Theoretical and Experimental Neuropsychology (TENNET XV) meeting in Montreal, Canada in June, 2004. The special issue’s theme is methods and learning in functional magnetic resonance imaging (fMRI), and it comprises 6 articles (3 reviews and 3 empirical studies). The first (Amaro and Barker) provides a beginner’s guide to fMRI and the BOLD effect (perhaps an alternative title might have been “fMRI for dummies”). While fMRI is now commonplace, there are still researchers who have yet to employ it as an experimental method and need some basic questions answered before they venture into new territory. This article should serve them well. A key issue of interest at the symposium was how fMRI could be used to elucidate cerebral mechanisms responsible for new learning. The next 4 articles address this directly, with the first (Little and Thulborn) an overview of data from fMRI studies of category-learning, and the second from the same laboratory (Little, Shin, Siscol, and Thulborn) an empirical investigation of changes in brain activity occurring across different stages of learning. While a role for medial temporal lobe (MTL) structures in episodic memory encoding has been acknowledged for some time, the different experimental tasks and stimuli employed across neuroimaging studies have, not surprisingly, produced conflicting data in terms of the precise subregion(s) involved. The next paper (Parsons, Haut, Lemieux, Moran, and Leach) addresses this by examining effects of stimulus modality during verbal memory encoding.
Typically, BOLD fMRI studies of learning are conducted over short time scales. However, the fourth paper in this series (Olson, Rao, Moore, Wang, Detre, and Aguirre) describes an empirical investigation of learning occurring over a longer than usual period, achieving this by employing a relatively novel technique called perfusion fMRI. This technique shows considerable promise for future studies. The final article in this special issue (de Zubicaray) represents a departure from the more familiar cognitive neuroscience applications of fMRI, instead describing how neuroimaging studies might be conducted to both inform and constrain information processing models of cognition.
Abstract:
Minimal perfect hash functions are used for memory efficient storage and fast retrieval of items from static sets. We present an infinite family of efficient and practical algorithms for generating order preserving minimal perfect hash functions. We show that almost all members of the family construct space and time optimal order preserving minimal perfect hash functions, and we identify the one with minimum constants. Members of the family generate a hash function in two steps. First a special kind of function into an r-graph is computed probabilistically. Then this function is refined deterministically to a minimal perfect hash function. We give strong theoretical evidence that the first step uses linear random time. The second step runs in linear deterministic time. The family not only has theoretical importance, but also offers the fastest known method for generating perfect hash functions.
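The two-step construction described above can be sketched compactly. The version below is in the spirit of that family (randomly map each key to an edge of a sparse graph, retry until the graph is acyclic, then deterministically assign vertex values so that the two endpoint values sum to the key's index); the specific hash functions, the vertex-to-key ratio, and the retry policy are simplifications for illustration, not the paper's exact algorithm.

```python
# Sketch of an order-preserving minimal perfect hash via the two-step
# random-graph construction. Hash choices and constants are illustrative.
import random

def build_mphf(keys, ratio=2.1):
    n = len(keys)
    m = max(int(ratio * n), n + 1)  # vertices; sparse graphs are acyclic w.h.p.
    while True:
        # Step 1 (probabilistic): map each key to an edge of an undirected graph.
        s1, s2 = random.getrandbits(32), random.getrandbits(32)
        edges = [(hash((s1, k)) % m, hash((s2, k)) % m) for k in keys]
        if any(u == v for u, v in edges):
            continue  # a self-loop is a cycle; pick new hash functions
        adj = [[] for _ in range(m)]
        for i, (u, v) in enumerate(edges):
            adj[u].append((v, i))
            adj[v].append((u, i))
        # Step 2 (deterministic): assign g so (g[u] + g[v]) mod n = key index.
        g = [0] * m
        seen = [False] * m
        ok = True
        for root in range(m):
            if seen[root]:
                continue
            seen[root] = True
            stack = [root]
            while stack:
                u = stack.pop()
                for v, i in adj[u]:
                    if not seen[v]:
                        seen[v] = True
                        g[v] = (i - g[u]) % n
                        stack.append(v)
                    elif (g[u] + g[v]) % n != i:
                        ok = False  # inconsistent cycle: graph was not acyclic
            if not ok:
                break
        if ok:
            return lambda k: (g[hash((s1, k)) % m] + g[hash((s2, k)) % m]) % n

keys = ["apple", "banana", "cherry", "date"]
h = build_mphf(keys)
print([h(k) for k in keys])  # order preserving: [0, 1, 2, 3]
```

The probabilistic first step dominates the expected cost (retries until an acyclic graph appears), while the second step is a single linear-time traversal, mirroring the linear random/deterministic split claimed in the abstract.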
Abstract:
Little consensus exists in the literature regarding methods for determination of the onset of electromyographic (EMG) activity. The aim of this study was to compare the relative accuracy of a range of computer-based techniques with respect to EMG onset determined visually by an experienced examiner. Twenty-seven methods were compared, which varied in terms of EMG processing (low pass filtering at 10, 50 and 500 Hz), threshold value (1, 2 and 3 SD beyond mean of baseline activity) and the number of samples for which the mean must exceed the defined threshold (20, 50 and 100 ms). Three hundred randomly selected trials of a postural task were evaluated using each technique. The visual determination of EMG onset was found to be highly repeatable between days. Linear regression equations were calculated for the values selected by each computer method which indicated that the onset values selected by the majority of the parameter combinations deviated significantly from the visually derived onset values. Several methods accurately selected the time of onset of EMG activity and are recommended for future use. Copyright (C) 1996 Elsevier Science Ireland Ltd.
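The parameter combinations compared above share one skeleton: rectify (and optionally low-pass filter) the EMG, set a threshold at the baseline mean plus k standard deviations, and declare onset at the first point from which the mean over a fixed window exceeds that threshold. The sketch below implements that skeleton on a synthetic signal; the low-pass filtering stage that the study varied is omitted for brevity, and the signal, sampling rate, and parameter values are illustrative only.

```python
# Illustrative threshold-based EMG onset detector: baseline mean + k*SD,
# onset when the mean over a forward window exceeds the threshold.
import random
import statistics

def emg_onset(signal, fs, baseline_ms=100, k=2, window_ms=25):
    """Return the first sample index from which the mean of the next
    `window_ms` of rectified signal exceeds baseline mean + k*SD,
    or None if no onset is found. (No low-pass filtering here.)"""
    rect = [abs(x) for x in signal]          # full-wave rectification
    nb = int(fs * baseline_ms / 1000)        # baseline samples
    w = int(fs * window_ms / 1000)           # detection window samples
    base = rect[:nb]
    threshold = statistics.mean(base) + k * statistics.stdev(base)
    for i in range(nb, len(rect) - w):
        if statistics.mean(rect[i:i + w]) > threshold:
            return i
    return None

# Synthetic trial at 1 kHz: 200 ms of quiet baseline, then a burst.
random.seed(0)
fs = 1000
quiet = [random.gauss(0.0, 0.01) for _ in range(200)]
burst = [random.gauss(0.0, 0.5) for _ in range(100)]
onset = emg_onset(quiet + burst, fs)
print(onset)  # close to the true onset at sample 200
```

Note that a forward-looking window makes the detected onset lead the true burst slightly; this kind of systematic bias is exactly what the study quantified against visually determined onsets.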
Abstract:
This study investigated the effect of two anti-pronation taping techniques on vertical navicular height, an indicator of foot pronation, after application and after 20 min of exercise. The taping techniques were the low-Dye (LD) and the low-Dye with the addition of calcaneal slings and reverse sixes (LDCR). A repeated-measures design was used. It found that LDCR was superior to LD and control immediately after application and after exercise. LD was better than control immediately after application but not after exercise. These findings provide practical directions to clinicians regularly using anti-pronation taping techniques.
Abstract:
A robust semi-implicit central partial difference algorithm for the numerical solution of coupled stochastic parabolic partial differential equations (PDEs) is described. This can be used for calculating correlation functions of systems of interacting stochastic fields. Such field equations can arise in the description of Hamiltonian and open systems in the physics of nonlinear processes, and may include multiplicative noise sources. The algorithm can be used for studying the properties of nonlinear quantum or classical field theories. The general approach is outlined and applied to a specific example, namely the quantum statistical fluctuations of ultra-short optical pulses in χ^(2) parametric waveguides. This example uses a non-diagonal coherent state representation, and correctly predicts the sub-shot noise level spectral fluctuations observed in homodyne detection measurements. It is expected that the methods used will be applicable for higher-order correlation functions and other physical problems as well. A stochastic differencing technique for reducing sampling errors is also introduced. This involves solving nonlinear stochastic parabolic PDEs in combination with a reference process, which uses the Wigner representation in the example presented here. A computer implementation on MIMD parallel architectures is discussed. (C) 1997 Academic Press.
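The core numerical idea, a semi-implicit (midpoint) central-difference step for a stochastic parabolic PDE with multiplicative noise, can be illustrated on a toy problem. The sketch below integrates a 1-D stochastic heat equation du = D u_xx dt + g u dW on a periodic lattice, standing in for the coupled field equations of the paper; the fixed-point iteration count, parameters, and test field are all illustrative assumptions, not the paper's implementation.

```python
# Toy semi-implicit midpoint step for du = D u_xx dt + g u dW (periodic).
# Parameters and the single real field are illustrative simplifications.
import math
import random

def semi_implicit_step(u, dt, dx, D, g, rng, iters=3):
    """One midpoint step: the derivative terms are evaluated at the
    midpoint field, found by fixed-point iteration. This implicitness
    is what gives semi-implicit central-difference schemes their
    robustness for stiff stochastic PDEs."""
    n = len(u)
    # One Wiener increment per lattice site.
    dW = [rng.gauss(0.0, math.sqrt(dt)) for _ in range(n)]
    mid = list(u)
    for _ in range(iters):  # iterate the implicit midpoint equation
        lap = [(mid[(i - 1) % n] - 2 * mid[i] + mid[(i + 1) % n]) / dx**2
               for i in range(n)]
        mid = [u[i] + 0.5 * (D * lap[i] * dt + g * mid[i] * dW[i])
               for i in range(n)]
    # Full step using midpoint-evaluated drift and noise terms.
    lap = [(mid[(i - 1) % n] - 2 * mid[i] + mid[(i + 1) % n]) / dx**2
           for i in range(n)]
    return [u[i] + D * lap[i] * dt + g * mid[i] * dW[i] for i in range(n)]

rng = random.Random(1)
u = [math.sin(2 * math.pi * i / 32) for i in range(32)]  # one sine mode
for _ in range(100):
    u = semi_implicit_step(u, dt=1e-3, dx=1.0, D=1.0, g=0.1, rng=rng)
print(max(abs(x) for x in u))  # diffusion slowly damps the mode
```

In a real calculation one would average correlation functions over many independent noise trajectories, and the paper's stochastic differencing trick pairs each trajectory with a reference process to reduce that sampling error.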