977 results for Burroughs D-machine (Computer)
Abstract:
Theorem-proving is a one-player game. The history of computer programs as players goes back to 1956 and the 'LT' Logic Theory Machine of Newell, Shaw, and Simon. In game-playing terms, the 'initial position' is the core set of axioms chosen for the particular logic and the 'moves' are the rules of inference. More recently, the Univalent Foundations Program at IAS Princeton and the resulting 'HoTT' book on Homotopy Type Theory have demonstrated the success of a new kind of experimental mathematics using computer theorem proving.
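To make the game analogy concrete, here is a minimal sketch (not the original LT program) of proof search as a one-player game: the axioms are the initial position and each application of an inference rule, here a Horn-clause forward step, is a move. The toy theory at the bottom is hypothetical.

```python
# Minimal sketch of theorem proving as a one-player game: the axioms are the
# initial position, each rule application (modus ponens over Horn clauses) is a move.

def prove(axioms, rules, goal):
    """Forward-chain over Horn rules, each given as (premises, conclusion).
    Returns True when the goal becomes derivable."""
    known = set(axioms)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in known and all(p in known for p in premises):
                known.add(conclusion)   # one "move" in the game
                changed = True
                if goal in known:
                    return True
    return goal in known

# Hypothetical toy theory: from P, derive R via two rules.
axioms = {"P"}
rules = [({"P"}, "Q"), ({"Q"}, "R")]
print(prove(axioms, rules, "R"))  # True
```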
Abstract:
We present an intuitive geometric approach for analysing the structure and fragility of T1-weighted structural MRI scans of human brains. Apart from computing characteristics such as the surface area and volume of regions of the brain that consist of highly active voxels, we also employ network theory to test how close these regions are to breaking apart. This analysis is used in an attempt to automatically classify subjects into three categories for the CADDementia Challenge: Alzheimer's disease, mild cognitive impairment, and healthy controls.
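To illustrate the kind of measurement described, here is a minimal sketch assuming a binary 3-D mask of "highly active" voxels: volume as voxel count, surface area as exposed voxel faces, and a simple network-theoretic fragility score (fraction of articulation points). The mask and the fragility definition are placeholders, not the authors' pipeline.

```python
# Sketch of geometric and network measures on a binary 3-D voxel mask
# (the mask and fragility definition are hypothetical illustrations).
import numpy as np
import networkx as nx

def region_measures(mask):
    """Volume = voxel count; surface area = number of exposed voxel faces
    (6-connectivity); fragility = fraction of cut vertices in the voxel graph."""
    voxels = set(map(tuple, np.argwhere(mask)))
    volume = len(voxels)
    offsets = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
    surface = sum(1 for v in voxels for d in offsets
                  if tuple(np.add(v, d)) not in voxels)
    g = nx.Graph()
    g.add_nodes_from(voxels)
    for v in voxels:
        for d in offsets:
            w = tuple(np.add(v, d))
            if w in voxels:
                g.add_edge(v, w)
    # Articulation points: voxels whose removal disconnects the region.
    fragility = sum(1 for _ in nx.articulation_points(g)) / max(volume, 1)
    return volume, surface, fragility

mask = np.zeros((5, 5, 5), dtype=bool)
mask[2, 1:4, 2] = True          # a thin, bridge-like toy region
print(region_measures(mask))    # (3, 14, 0.33...): middle voxel is a cut vertex
```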
Abstract:
Performance modelling is a useful tool in the lifecycle of high-performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These predictions are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow water model as an example.
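As a much-simplified illustration of a data-driven performance model, the sketch below fits assumed cost terms to toy timing data by least squares and extrapolates to a larger problem size. The model form, grid sizes and timings are hypothetical, not the paper's benchmark data.

```python
# Hypothetical sketch of benchmark-based performance prediction: fit
# t(n) = a*n^2 + b*n^2*log(n) + c to small-run timings, then extrapolate.
import numpy as np

sizes = np.array([64, 128, 256, 512])        # grid points per side (toy data)
times = np.array([0.11, 0.45, 1.9, 8.2])     # measured seconds (toy data)

# Design matrix for the assumed cost terms of a 2-D grid model.
X = np.column_stack([sizes**2, sizes**2 * np.log(sizes),
                     np.ones_like(sizes, dtype=float)])
coef, *_ = np.linalg.lstsq(X, times, rcond=None)

def predict(n):
    return coef @ np.array([n**2, n**2 * np.log(n), 1.0])

print(f"predicted time at n=1024: {predict(1024):.1f} s")
```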
Abstract:
The three-dimensional molecular dynamics simulation method has been used to study the dynamic responses of an electrorheological (ER) fluid in oscillatory shear. The structure and related viscoelastic behaviour of the fluid are found to be sensitive to the amplitude of the strain. As the strain amplitude increases, the structure formed by the particles changes from isolated columns to sheet-like structures which may be perpendicular or parallel to the oscillating direction. Along with this structural evolution, the field-induced moduli decrease significantly with increasing strain amplitude. The viscoelastic behaviour of the structures obtained at different strain amplitudes was examined in the linear response regime, and a clear structure dependence of the moduli was found; the reason lies in the anisotropy of the arrangement of the particles in these structures. Short-range interactions between the particles cannot be neglected in determining the viscoelastic behaviour of ER fluids at small strain amplitude, especially for parallel sheets. The simulation results were compared with available experimental data, and good agreement was reached for most of them.
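The field-induced moduli referred to above can be extracted from an oscillatory-shear run by projecting the stress onto the in-phase and quadrature components of the strain. The sketch below demonstrates the projection on a synthetic stress trace with known moduli, not on MD output.

```python
# Extracting storage (G') and loss (G'') moduli by Fourier projection of the
# stress over one strain cycle; the stress signal here is synthetic.
import numpy as np

gamma0, omega = 0.05, 2 * np.pi               # strain amplitude, angular frequency
t = np.linspace(0, 1, 1000, endpoint=False)   # exactly one full cycle

# Synthetic stress with known moduli, for checking the recovery.
G_storage_true, G_loss_true = 120.0, 35.0
stress = gamma0 * (G_storage_true * np.sin(omega * t)
                   + G_loss_true * np.cos(omega * t))

# Project onto in-phase and quadrature components (mean of sin^2 = 1/2).
G_storage = 2 * np.mean(stress * np.sin(omega * t)) / gamma0
G_loss = 2 * np.mean(stress * np.cos(omega * t)) / gamma0
print(G_storage, G_loss)   # ~120.0, ~35.0
```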
Abstract:
Algorithms for computer-aided diagnosis of dementia based on structural MRI have demonstrated high performance in the literature, but are difficult to compare as different data sets and methodology were used for evaluation. In addition, it is unclear how the algorithms would perform on previously unseen data, and thus, how they would perform in clinical practice when there is no real opportunity to adapt the algorithm to the data at hand. To address these comparability, generalizability and clinical applicability issues, we organized a grand challenge that aimed to objectively compare algorithms based on a clinically representative multi-center data set. Using clinical practice as the starting point, the goal was to reproduce the clinical diagnosis. Therefore, we evaluated algorithms for multi-class classification of three diagnostic groups: patients with probable Alzheimer's disease, patients with mild cognitive impairment and healthy controls. The diagnosis based on clinical criteria was used as reference standard, as it was the best available reference despite its known limitations. For evaluation, a previously unseen test set was used consisting of 354 T1-weighted MRI scans with the diagnoses blinded. Fifteen research teams participated with a total of 29 algorithms. The algorithms were trained on a small training set (n = 30) and optionally on data from other sources (e.g., the Alzheimer's Disease Neuroimaging Initiative, the Australian Imaging Biomarkers and Lifestyle flagship study of aging). The best performing algorithm yielded an accuracy of 63.0% and an area under the receiver-operating-characteristic curve (AUC) of 78.8%. In general, the best performances were achieved using feature extraction based on voxel-based morphometry or a combination of features that included volume, cortical thickness, shape and intensity. The challenge is open for new submissions via the web-based framework: http://caddementia.grand-challenge.org.
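For reference, here is a minimal sketch of how the two reported metrics, accuracy and a one-vs-rest multi-class AUC, can be computed for a three-class problem with scikit-learn. The labels and probabilities are fabricated placeholders, and the challenge's exact AUC definition may differ.

```python
# Sketch of the two challenge metrics for a three-class problem
# (AD / MCI / control); the predictions below are placeholders.
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

y_true = np.array([0, 1, 2, 0, 2, 1, 0, 2])      # 0=AD, 1=MCI, 2=control
y_prob = np.array([[0.7, 0.2, 0.1],
                   [0.3, 0.5, 0.2],
                   [0.1, 0.2, 0.7],
                   [0.5, 0.3, 0.2],
                   [0.2, 0.2, 0.6],
                   [0.2, 0.6, 0.2],
                   [0.4, 0.4, 0.2],
                   [0.1, 0.3, 0.6]])             # rows sum to 1

acc = accuracy_score(y_true, y_prob.argmax(axis=1))
auc = roc_auc_score(y_true, y_prob, multi_class='ovr')   # one-vs-rest average
print(f"accuracy={acc:.3f}, AUC={auc:.3f}")
```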
Abstract:
Dietary assessment in older adults can be challenging. The Novel Assessment of Nutrition and Ageing (NANA) method is a touch-screen computer-based food record that enables older adults to record their dietary intakes. The objective of the present study was to assess the relative validity of the NANA method for dietary assessment in older adults. For this purpose, three studies were conducted in which a total of ninety-four older adults (aged 65–89 years) used the NANA method of dietary assessment. On a separate occasion, participants completed a 4 d estimated food diary. Blood and 24 h urine samples were also collected from seventy-six of the volunteers for the analysis of biomarkers of nutrient intake. The results from all three studies were combined, and nutrient intake data collected using the NANA method were compared against the 4 d estimated food diary and biomarkers of nutrient intake. Bland–Altman analysis showed reasonable agreement between the dietary assessment methods for energy and macronutrient intake; however, there were small but significant differences for energy and protein intake, reflecting the tendency of the NANA method to record marginally lower energy intakes. Significant positive correlations were observed between urinary urea and dietary protein intake using both the NANA and the 4 d estimated food diary methods, and between plasma ascorbic acid and dietary vitamin C intake using the NANA method. The results demonstrate the feasibility of computer-based dietary assessment in older adults, and suggest that the NANA method is comparable to the 4 d estimated food diary and could be used as an alternative to the food diary for the short-term assessment of an individual's dietary intake.
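A minimal sketch of the Bland–Altman computation used above, i.e. the mean bias and 95% limits of agreement between two methods; the intake values are hypothetical placeholders, not study data.

```python
# Bland-Altman agreement between two dietary assessment methods
# (all values below are hypothetical placeholders).
import numpy as np

nana_energy = np.array([7.9, 8.4, 6.8, 9.1, 7.2])    # MJ/day, hypothetical
diary_energy = np.array([8.2, 8.9, 7.1, 9.0, 7.8])   # MJ/day, hypothetical

diff = nana_energy - diary_energy
bias = diff.mean()                            # mean difference between methods
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)    # 95% limits of agreement
print(f"bias={bias:.2f} MJ/day, limits of agreement={loa[0]:.2f} to {loa[1]:.2f}")
```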
Abstract:
A brain-computer music interface (BCMI) is developed to allow continuous modification of the tempo of dynamically generated music. Six of seven participants are able to control the BCMI with statistically significant accuracy, and their performance is observed to improve over time.
Abstract:
Objective. This study was designed to determine the precision and accuracy of angular measurements using three-dimensional computed tomography (3D-CT) volume rendering by computer systems. Study design. The study population consisted of 28 dried skulls that were scanned with a 64-row multislice CT scanner, and 3D-CT images were generated. Angular measurements (n = 6) based on conventional craniometric anatomical landmarks (n = 9) were identified independently by 2 radiologists, twice each, and were then performed on the 3D-CT images. Subsequently, physical measurements were made by a third examiner using a Beyond Crysta-C9168 series 900 device. Results. The results demonstrated no statistically significant difference between interexaminer and intraexaminer analyses. The mean differences between the physical and 3D-CT-based angular measurements were -1.18% and -0.89% for the two examiners, demonstrating high accuracy. Conclusion. Maxillofacial analysis of angular measurements using 3D-CT volume rendering from 64-row multislice CT is reliable and can be used for orthodontic and dentofacial orthopedic applications.
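The accuracy figure reported above is a mean percentage difference between physical and 3D-CT-based angles; a minimal sketch with hypothetical angle values:

```python
# Mean percentage difference between 3D-CT-derived and physical angular
# measurements (all angles below are hypothetical placeholders).
import numpy as np

physical = np.array([118.2, 74.5, 131.0, 92.3])   # degrees, hypothetical
ct_based = np.array([117.0, 73.9, 129.5, 91.6])   # degrees, hypothetical

pct_diff = 100 * (ct_based - physical) / physical
print(f"mean difference: {pct_diff.mean():.2f}%")
```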
Abstract:
Vector field formulation based on the Poisson theorem allows an automatic determination of rock physical properties (the magnetization-to-density ratio, MDR, and the magnetization inclination, MI) from combined processing of gravity and magnetic geophysical data. The basic assumptions (i.e., Poisson conditions) are that gravity and magnetic fields share common sources and that these sources have a uniform magnetization direction and MDR. In addition, the previously existing formulation was restricted to profile data and assumed sufficiently elongated (2-D) sources. For sources that violate Poisson conditions or have a 3-D geometry, the apparent values of MDR and MI generated in this way have an unclear relationship to the actual properties in the subsurface. We present Fortran programs that estimate MDR and MI values for 3-D sources through processing of gridded gravity and magnetic data. Tests with simple geophysical models indicate that magnetization polarity can be successfully recovered by MDR-MI processing, even in cases where juxtaposed bodies cannot be clearly distinguished on the basis of anomaly data. These results may be useful in crustal studies, especially in mapping magnetization polarity from marine-based gravity and magnetic data.
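The programs described are in Fortran; as a heavily simplified Python sketch of the shared core idea, a moving-window least-squares fit of the magnetic grid against a gravity-derived grid yields an apparent MDR per window. All preprocessing, units and physical constants are omitted, and the grids below are synthetic.

```python
# Heavily simplified sketch of the moving-window regression underlying
# Poisson-relation (MDR-MI) methods: the window-wise slope of magnetic
# anomaly vs. a gravity-derived field is an apparent MDR (constants omitted).
import numpy as np

def apparent_mdr(grav_deriv, mag, window=5):
    """Return a grid of window-wise regression slopes (apparent MDR)."""
    ny, nx = mag.shape
    half = window // 2
    out = np.full_like(mag, np.nan, dtype=float)
    for i in range(half, ny - half):
        for j in range(half, nx - half):
            g = grav_deriv[i-half:i+half+1, j-half:j+half+1].ravel()
            m = mag[i-half:i+half+1, j-half:j+half+1].ravel()
            slope, _ = np.polyfit(g, m, 1)
            out[i, j] = slope
    return out

rng = np.random.default_rng(0)
g = rng.normal(size=(20, 20))
m = 2.5 * g + 0.1 * rng.normal(size=(20, 20))   # synthetic: true slope 2.5
print(np.nanmean(apparent_mdr(g, m)))           # ~2.5
```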
Abstract:
Species' potential distribution modelling consists of building a representation of the fundamental ecological requirements of a species from the biotic and abiotic conditions where the species is known to occur. Such models can be valuable tools for understanding the biogeography of a species and for predicting its presence or absence under a particular environmental scenario. This paper investigates the use of different supervised machine learning techniques to model the potential distribution of 35 plant species from Latin America. Each technique was able to extract a different representation of the relations between the environmental conditions and the distribution profile of the species. The experimental results highlight the good performance of random tree classifiers, indicating this particular technique as a promising candidate for modelling species' potential distribution.
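A minimal sketch of this kind of supervised presence/absence modelling, here with scikit-learn's random forest standing in for the paper's random tree classifiers; the environmental predictors and occurrence rule are synthetic placeholders.

```python
# Fitting a tree-ensemble classifier to presence/absence records given
# environmental predictors (all data below are synthetic placeholders).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
# Columns might represent mean temperature, precipitation, elevation (toy values).
env = rng.normal(size=(200, 3))
present = (env[:, 0] + 0.5 * env[:, 1] > 0).astype(int)   # synthetic rule

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(env[:150], present[:150])
print("held-out accuracy:", model.score(env[150:], present[150:]))
```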
Abstract:
To plan testing activities, testers face the challenge of determining a strategy, including a test coverage criterion that offers an acceptable compromise between the available resources and the test goals. Known theoretical properties of coverage criteria do not always help, and thus empirical data are needed. The results of an experimental evaluation of several coverage criteria for finite state machines (FSMs) are presented, namely state and transition coverage, and initialisation fault and transition fault coverage. The first two criteria focus on FSM structure, whereas the other two focus on potential faults in FSM implementations. The authors elaborate a comparison approach that includes random generation of FSMs, construction of an adequate test suite and test minimisation for each criterion to ensure that tests are obtained in a uniform way. The last step uses an improved greedy algorithm.
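A minimal sketch of one of the evaluated criteria, transition coverage, together with a plain greedy minimisation (the authors use an improved greedy algorithm, not this basic form); the FSM and test suite are toy examples.

```python
# Transition coverage and greedy test-suite minimisation for an FSM given as
# {(state, input): next_state}; the machine and tests are toy examples.
def transitions_covered(fsm, initial, test):
    covered, state = set(), initial
    for symbol in test:
        covered.add((state, symbol))
        state = fsm[(state, symbol)]
    return covered

def greedy_minimise(fsm, initial, suite):
    """Repeatedly keep the test covering the most still-uncovered transitions."""
    remaining = set(fsm)          # all transitions of the machine
    chosen = []
    while remaining:
        best = max(suite, key=lambda t:
                   len(transitions_covered(fsm, initial, t) & remaining))
        gain = transitions_covered(fsm, initial, best) & remaining
        if not gain:
            break                 # the suite cannot cover the rest
        chosen.append(best)
        remaining -= gain
    return chosen

fsm = {("A", 0): "A", ("A", 1): "B", ("B", 0): "A", ("B", 1): "B"}
suite = [(0, 1, 1), (1, 0, 0), (0, 0, 1, 0)]
print(greedy_minimise(fsm, "A", suite))   # two tests suffice here
```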
Abstract:
In medical procedures where ionizing radiation is used, dose planning and dose delivery are the key elements of patient safety and treatment success, particularly when the dose delivered in a single session of treatment can be an order of magnitude higher than the regular doses of radiotherapy. Therefore, the radiation dose should be well defined and precisely delivered to the target while minimizing radiation exposure to the surrounding normal tissues [1]. Several methods have been proposed to obtain three-dimensional (3-D) dose distributions [2, 3]. In this paper, we propose an alternative method which can be easily implemented in any stereotactic radiosurgery center with a magnetic resonance imaging (MRI) facility. A phantom, with or without scattering centers, filled with Fricke gel solution is irradiated with a Gamma Knife® system at a chosen spot. The phantom can be a replica of a human organ such as the head, breast or any other organ. It can even be constructed from a real 3-D MR image of an organ of a patient using computer-aided construction and irradiated at a specific region corresponding to the tumor position determined by MRI. The spin-lattice relaxation time T1 of different parts of the irradiated phantom is determined by localized spectroscopy. The T1-weighted phantom images are used to correlate the image pixel intensity to the absorbed dose, and consequently a 3-D dose distribution with high resolution is obtained.
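The readout step can be sketched as follows: for Fricke gel dosimeters, the relaxation rate R1 = 1/T1 grows approximately linearly with absorbed dose, so a calibrated slope converts a measured T1 map into a dose map. The constants below are hypothetical placeholders, not calibration values from the paper.

```python
# Converting a T1 map of a Fricke-gel phantom into an absorbed-dose map,
# assuming a linear dose response R1 = R1_0 + k*D (constants hypothetical).
import numpy as np

R1_0 = 0.40    # 1/s, relaxation rate of unirradiated gel (hypothetical)
k = 0.015      # 1/s per Gy, dose sensitivity (hypothetical)

def dose_map(t1_map_s):
    """Convert a measured T1 map (seconds) into absorbed dose (Gy)."""
    r1 = 1.0 / np.asarray(t1_map_s)
    return (r1 - R1_0) / k

t1 = np.array([[2.5, 1.8], [1.4, 2.5]])   # toy T1 values in seconds
print(dose_map(t1))   # Gy; higher dose where T1 is shorter
```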