Abstract:
Even though many short-term benefits of agile methods have been recognized, we still know very little about their long-term effects. In this panel, we discuss the long-term perspective of agile methods. The panelists are industrial and academic representatives. They will discuss problems and benefits related to long-term lifecycle system management in agile projects. Ideally, the panel's outcome will provide ideas for future research.
Abstract:
BCI systems require correct classification of signals interpreted from the brain for useful operation. To this end, this paper investigates a method proposed in [1] to correctly classify a series of images presented to a group of subjects in [2]. We show that it is possible to use the proposed methods to correctly recognise the original stimuli presented to a subject from analysis of their EEG. Additionally, we use a verification set to show that the trained classification method can be applied to a different set of data. We go on to investigate the issue of invariance in EEG signals, that is, whether the brain's representation of similar stimuli is recognisable across different subjects. Finally, we consider the usefulness of the investigated methods for an improved BCI system and discuss how they could lead to great improvements in ease of use for the end user by offering an alternative, more intuitive, control-based mode of operation.
Abstract:
Many scientific and engineering applications involve inverting large matrices or solving systems of linear algebraic equations. Solving these problems with proven direct methods can take a very long time, as their cost depends on the size of the matrix. The computational complexity of stochastic Monte Carlo methods depends only on the number of chains and the length of those chains. The computing power needed by inherently parallel Monte Carlo methods can be supplied very efficiently by distributed computing technologies such as Grid computing. In this paper we show how a load-balanced Monte Carlo method for computing the inverse of a dense matrix can be constructed, how the method can be implemented on the Grid, and how efficiently the method scales on multiple processors. (C) 2007 Elsevier B.V. All rights reserved.
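As an illustration of the stochastic idea the abstract describes (not the paper's load-balanced Grid implementation), the following sketch estimates the inverse of a small matrix A = I - B by sampling the Neumann series A^-1 = I + B + B^2 + ... with independent random walks; the test matrix, chain count, and chain length are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_inverse(A, n_chains=5000, chain_len=20):
    """Monte Carlo estimate of A^-1, assuming A = I - B with
    spectral radius of B below 1.  Each chain is an independent
    random walk whose weighted visits estimate one row of A^-1."""
    n = A.shape[0]
    B = np.eye(n) - A
    inv = np.zeros((n, n))
    for i in range(n):                       # estimate row i of A^-1
        for _ in range(n_chains):
            state, weight = i, 1.0
            inv[i, state] += weight          # k = 0 (identity) term
            for _ in range(chain_len):
                nxt = rng.integers(n)        # uniform transition, p = 1/n
                weight *= n * B[state, nxt]  # importance weight b / p
                state = nxt
                inv[i, state] += weight      # contributes the B^k term
    return inv / n_chains

# small, strictly diagonally dominant example matrix
A = np.array([[1.0, -0.1, 0.05],
              [0.02, 1.0, -0.1],
              [-0.05, 0.1, 1.0]])
approx = mc_inverse(A)
exact = np.linalg.inv(A)
```

Because each chain is independent, the loop over chains is trivially parallel, which is exactly the property that makes the method attractive for Grid-style distributed computing.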
Abstract:
Myoglobin has been studied in considerable detail using different experimental and computational techniques over the past decades. Recent developments in time-resolved spectroscopy have provided experimental data amenable to detailed atomistic simulations. The main theme of the present review is results on the structures, energetics and dynamics of ligands (CO, NO) interacting with myoglobin from computer simulations. Modern computational methods, including free energy simulations, mixed quantum mechanics/molecular mechanics simulations, and reactive molecular dynamics simulations, provide insight into ligand dynamics in confined spaces complementary to experiment. Applications of these methods to calculate and understand experimental observations for myoglobin interacting with CO and NO are presented and discussed.
Abstract:
Time correlation functions yield profound information about the dynamics of a physical system and hence are frequently calculated in computer simulations. For systems whose dynamics span a wide range of time scales, currently used methods require significant computer time and memory. In this paper, we discuss the multiple-tau correlator method for the efficient calculation of accurate time correlation functions on the fly during computer simulations. The multiple-tau correlator is efficacious in terms of computational requirements and can be tuned to the desired level of accuracy. Further, we derive estimates for the error arising from the use of the multiple-tau correlator and extend it for use in the calculation of mean-square particle displacements and dynamic structure factors. The method described here is routinely used, in hardware implementations, in light scattering experiments but has not yet found widespread use in computer simulations.
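A minimal multiple-tau correlator along the lines the abstract describes can be sketched as follows: each level keeps a short buffer of recent values and accumulates lagged products, and every m samples an average is pushed to the next, coarser level, giving logarithmically spaced lags at fixed memory cost. The buffer size, block factor, and number of levels below are illustrative choices, not the paper's parameters.

```python
import numpy as np

class MultiTauCorrelator:
    """On-the-fly autocorrelation with logarithmically spaced lags."""

    def __init__(self, n_levels=5, p=8, m=2):
        self.p, self.m = p, m
        self.buf = np.zeros((n_levels, p))     # ring buffer per level
        self.count = np.zeros(n_levels, int)   # samples seen per level
        self.corr = np.zeros((n_levels, p))    # accumulated lagged products
        self.ncorr = np.zeros((n_levels, p))   # number of products per lag
        self.accum = np.zeros(n_levels)        # running sum for coarse-graining
        self.naccum = np.zeros(n_levels, int)

    def _push(self, level, x):
        if level >= len(self.buf):
            return
        self.buf[level] = np.roll(self.buf[level], 1)
        self.buf[level, 0] = x
        self.count[level] += 1
        k = min(self.count[level], self.p)     # only correlate filled slots
        self.corr[level, :k] += x * self.buf[level, :k]
        self.ncorr[level, :k] += 1
        self.accum[level] += x                 # coarse-grain: average m samples
        self.naccum[level] += 1
        if self.naccum[level] == self.m:
            self._push(level + 1, self.accum[level] / self.m)
            self.accum[level] = 0.0
            self.naccum[level] = 0

    def add(self, x):
        self._push(0, x)

    def result(self):
        lags, vals = [], []
        for lev in range(len(self.buf)):
            for j in range(self.p):
                # coarse levels only report lags not covered by finer ones
                if self.ncorr[lev, j] > 0 and (lev == 0 or j >= self.p // self.m):
                    lags.append(j * self.m ** lev)
                    vals.append(self.corr[lev, j] / self.ncorr[lev, j])
        order = np.argsort(lags)
        return np.array(lags)[order], np.array(vals)[order]
```

Memory and per-sample cost are O(levels * p) regardless of how long the simulation runs, which is the property that makes the method attractive for dynamics spanning many decades in time.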
Abstract:
People with motion impairments can often have difficulty with accurate control of standard pointing devices for computer input. The nature of the difficulties may vary, so to be most effective, methods of assisting cursor control must be suited to each user's needs. The work presented here involves a study of cursor trajectories as a means of assessing the requirements of motion-impaired computer users. A new cursor characteristic is proposed that attempts to capture difficulties with moving the cursor in a smooth trajectory. A study was conducted to see if haptic tunnels could improve performance in "point and click" tasks. Results indicate that the tunnels reduced times to target for those users identified by the new characteristic as having the most difficulty moving in a smooth trajectory. This suggests that cursor characteristics have potential applications in performing assessments of a user's cursor control capabilities, which can then be used to determine appropriate methods of assistance.
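The abstract does not specify the proposed cursor characteristic. One plausible smoothness-style measure, shown purely for illustration and not taken from the paper, is the ratio of the travelled path length to the straight-line distance between start and end points (1.0 for a perfectly direct trajectory, larger for wandering paths):

```python
import numpy as np

def trajectory_smoothness(points):
    """Path length divided by straight-line distance for a cursor
    trajectory given as a sequence of (x, y) points.  A hypothetical
    metric for illustration; the paper's characteristic may differ."""
    p = np.asarray(points, float)
    path_len = np.linalg.norm(np.diff(p, axis=0), axis=1).sum()
    direct = np.linalg.norm(p[-1] - p[0])
    return path_len / direct
```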
Abstract:
Clinical pathways are widely adopted by many large hospitals around the world in order to provide high-quality patient treatment and reduce the length and cost of hospital stays. However, most of them are currently static and non-personalized. Our objective is to capture and represent clinical pathways using the organizational semiotics method, including Semantic Analysis, which determines the semantic units in a clinical pathway, their relationships and their patterns of behavior, and Norm Analysis, which extracts and specifies the norms that establish how and when these medical behaviors will occur. Finally, we propose a method to develop a clinical pathway ontology based on the results of Semantic Analysis and Norm Analysis. This approach will contribute to the design of personalized clinical pathways by defining a set of possible patterns of behavior and the norms that govern the behavior based on the patient's condition.
Abstract:
This paper details an investigation into sensory substitution by means of direct electrical stimulation of the tongue for the purpose of information input to the human brain. In particular, a device has been constructed and a series of trials have been performed in order to demonstrate the efficacy and performance of an electro-tactile array mounted onto the tongue surface for the purpose of sensory augmentation. Tests have shown that, using a low-resolution array, a computer-human feedback loop can be successfully employed by users to complete tasks such as object tracking, surface shape identification and shape recognition with no training or prior experience with the device. Comparisons of this technique with visual alternatives show that the tongue-based tactile array can match such methods in convenience and accuracy for simple tasks.
Abstract:
Recent studies showed that features extracted from brain MRIs can discriminate well between Alzheimer's disease and Mild Cognitive Impairment. This study provides an algorithm that sequentially applies advanced feature selection methods for finding the best subset of features in terms of binary classification accuracy. The classifiers that provided the highest accuracies have then been used for solving a multi-class problem with the one-versus-one strategy. Although several approaches based on Regions of Interest (ROIs) extraction exist, the predictive power of features has not yet been investigated by comparing filter and wrapper techniques. The findings of this work suggest that (i) IntraCranial Volume (ICV) normalization can lead to overfitting and worsen prediction accuracy on the test set, and (ii) the combined use of a Random Forest-based filter with a Support Vector Machines-based wrapper improves the accuracy of binary classification.
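The filter/wrapper distinction at the heart of the study can be sketched in a toy form. Here a correlation ranking stands in for the Random Forest filter and a nearest-centroid classifier stands in for the SVM wrapper, so this illustrates only the shape of the pipeline (cheap ranking first, then expensive model-in-the-loop selection), not the study's actual methods.

```python
import numpy as np

def filter_rank(X, y):
    """Filter step: rank features by |correlation| with binary labels."""
    yc = y - y.mean()
    Xc = X - X.mean(axis=0)
    corr = (Xc * yc[:, None]).sum(0) / (
        np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12)
    return np.argsort(-np.abs(corr))

def centroid_accuracy(X, y):
    """Training accuracy of a nearest-centroid classifier
    (a cheap stand-in for the SVM used in the study)."""
    c0, c1 = X[y == 0].mean(0), X[y == 1].mean(0)
    pred = (np.linalg.norm(X - c1, axis=1)
            < np.linalg.norm(X - c0, axis=1)).astype(int)
    return (pred == y).mean()

def wrapper_select(X, y, candidates, k):
    """Wrapper step: greedy forward selection over the filter's
    top-ranked candidates, scored by the classifier itself."""
    chosen = []
    for _ in range(k):
        scores = [(centroid_accuracy(X[:, chosen + [f]], y), f)
                  for f in candidates if f not in chosen]
        chosen.append(max(scores)[1])
    return chosen
```

A real pipeline would score the wrapper step with cross-validation rather than training accuracy; the toy keeps it simple to stay readable.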
Abstract:
Dietary assessment in older adults can be challenging. The Novel Assessment of Nutrition and Ageing (NANA) method is a touch-screen computer-based food record that enables older adults to record their dietary intakes. The objective of the present study was to assess the relative validity of the NANA method for dietary assessment in older adults. For this purpose, three studies were conducted in which a total of ninety-four older adults (aged 65–89 years) used the NANA method of dietary assessment. On a separate occasion, participants completed a 4 d estimated food diary. Blood and 24 h urine samples were also collected from seventy-six of the volunteers for the analysis of biomarkers of nutrient intake. The results from all three studies were combined, and nutrient intake data collected using the NANA method were compared against the 4 d estimated food diary and biomarkers of nutrient intake. Bland–Altman analysis showed a reasonable agreement between the dietary assessment methods for energy and macronutrient intake; however, there were small but significant differences for energy and protein intake, reflecting the tendency for the NANA method to record marginally lower energy intakes. Significant positive correlations were observed between urinary urea and dietary protein intake using both the NANA and the 4 d estimated food diary methods, and between plasma ascorbic acid and dietary vitamin C intake using the NANA method. The results demonstrate the feasibility of computer-based dietary assessment in older adults, and suggest that the NANA method is comparable to the 4 d estimated food diary and could be used as an alternative to the food diary for the short-term assessment of an individual's dietary intake.
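The Bland–Altman analysis used to compare the two dietary assessment methods reduces to a few lines: compute the per-subject differences, their mean (the bias between methods), and the 95% limits of agreement at bias ± 1.96 standard deviations. The numbers below are invented purely for illustration, not the study's data.

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two measurement methods."""
    a = np.asarray(method_a, float)
    b = np.asarray(method_b, float)
    diff = a - b                      # per-subject disagreement
    bias = diff.mean()                # systematic difference between methods
    sd = diff.std(ddof=1)             # sample standard deviation
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical paired intakes from two methods (illustrative values only)
nana = [10.0, 12.0, 11.0, 13.0]
diary = [9.0, 11.0, 12.0, 12.0]
bias, loa = bland_altman(nana, diary)
```

A positive bias would mean the first method records systematically higher values; the study reports the opposite tendency (NANA marginally lower for energy).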
Abstract:
A fully automated and online artifact removal method for the electroencephalogram (EEG) is developed for use in brain-computer interfacing. The method (FORCe) is based upon a novel combination of wavelet decomposition, independent component analysis, and thresholding. FORCe is able to operate on a small channel set during online EEG acquisition and does not require additional signals (e.g. electrooculogram signals). Evaluation of FORCe is performed offline on EEG recorded from 13 BCI participants with cerebral palsy (CP) and online with three healthy participants. The method outperforms the state-of-the-art automated artifact removal methods lagged auto-mutual information clustering (LAMIC) and fully automated statistical thresholding (FASTER), and is able to remove a wide range of artifact types, including blink, electromyogram (EMG), and electrooculogram (EOG) artifacts.
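FORCe itself combines wavelet decomposition, ICA, and thresholding; the sketch below shows only the wavelet-thresholding ingredient, using a single-level Haar transform with soft thresholding of the detail coefficients. This is an illustrative simplification, not the FORCe algorithm.

```python
import numpy as np

def haar_denoise(x, thresh):
    """Single-level Haar wavelet soft-threshold denoising.
    Input length must be even.  Sharp transients concentrate in the
    detail coefficients, so shrinking them suppresses spike artifacts
    while leaving slow (approximation) structure intact."""
    x = np.asarray(x, float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)  # soft threshold
    y = np.empty_like(x)                     # inverse Haar transform
    y[0::2] = (a + d) / np.sqrt(2.0)
    y[1::2] = (a - d) / np.sqrt(2.0)
    return y
```

A full pipeline in the FORCe spirit would apply such thresholding to independent components rather than raw channels, which is what lets it separate ocular and muscular sources from neural activity.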
Abstract:
This case series compares patient experiences and therapeutic processes between two modalities of cognitive behaviour therapy (CBT) for depression: computerized CBT (cCBT) and therapist-delivered CBT (tCBT). In a mixed-methods repeated-measures case series, six participants were offered cCBT and tCBT in sequence, with the order of delivery randomized across participants. Questionnaires about patient experiences were administered after each session and a semi-structured interview was completed with each participant at the end of each therapy modality. Therapy expectations, patient experiences and session impact ratings in this study generally favoured tCBT. Participants typically experienced cCBT sessions as less meaningful, less positive and less helpful compared to tCBT sessions in terms of developing understanding, facilitating problem-solving and building a therapeutic relationship.