24 results for Annette
Abstract:
Purpose: To quantify the extent to which the new registration method, DARTEL (Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra), may reduce the required smoothing kernel width, and to investigate the minimum group size necessary for voxel-based morphometry (VBM) studies. Materials and Methods: A simulated atrophy approach was employed to explore the roles of smoothing kernel, group size, and their interactions in VBM detection accuracy. Group sizes of 10, 15, 25, and 50 were compared for kernels of 0–12 mm. Results: A smoothing kernel of 6 mm achieved the highest atrophy detection accuracy for groups of 50 participants, and 8–10 mm for groups of 25, at P < 0.05 with familywise correction. The results further demonstrated that a group size of 25 was the lower limit when two different groups of participants were compared, whereas a group size of 15 was the minimum for longitudinal comparisons, though only at P < 0.05 with false discovery rate correction. Conclusion: Our data confirm that DARTEL-based VBM generally benefits from smaller kernels and that different kernels perform best for different group sizes, with a tendency toward smaller kernels for larger groups. Importantly, kernel selection was also affected by the statistical threshold applied. This highlights that the choice of kernel in relation to group size should be considered with care.
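The smoothing kernels compared above are Gaussian filters specified by their full width at half maximum (FWHM) in millimetres. As a minimal sketch of the operation, assuming SciPy and an isotropic 2 mm voxel size (the function name, stand-in array, and voxel size are illustrative, not taken from the study):

import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_fwhm(volume, fwhm_mm, voxel_mm=2.0):
    # FWHM = sigma * 2*sqrt(2*ln 2), so convert the mm FWHM to a sigma in voxels.
    sigma_vox = fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0))) / voxel_mm
    return gaussian_filter(volume, sigma=sigma_vox)

# Illustrative only: sweep the kernel widths examined in the study (0-12 mm)
# over a stand-in grey-matter map.
gm_map = np.random.rand(91, 109, 91)
smoothed = {f: (gm_map if f == 0 else smooth_fwhm(gm_map, f))
            for f in (0, 4, 6, 8, 10, 12)}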
Abstract:
Purpose – This paper describes visitors' reactions to using an Apple iPad or smartphone to follow trails in a museum by scanning QR codes, and draws conclusions on the potential for this technology to help improve accessibility at low cost. Design/methodology/approach – Activities were devised which involved visitors following trails around museum objects, each labelled with a QR code and symbolised text. Visitors scanned the QR codes using a mobile device, which then showed more information about an object. Project-team members acted as participant-observers, engaging with visitors and noting how they used the system. Experiences from each activity fed into the design of the next. Findings – Some physical and technical problems with using QR codes can be overcome with the introduction of simple aids, particularly movable object labels. A layered approach to information access is possible, with the first layer comprising a label, the second a mobile-web-enabled screen, and the third a choice of text, pictures, video and audio. Video was especially appealing to young people. The ability to repeatedly watch video or listen to audio seemed to be appreciated by visitors with learning disabilities. This approach can have a low equipment cost; however, maintaining the information behind labels and keeping up with technological changes are ongoing processes. Originality/value – Using QR codes on movable, symbolised object labels as part of a layered information system might help modestly funded museums enhance their accessibility, particularly as visitors increasingly arrive with their own smartphones or tablets.
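Mechanically, each movable label encodes a URL that resolves to a mobile-web page holding the layered content. A minimal sketch using the Python qrcode package; the URL and filename are placeholders, not the project's actual addresses:

import qrcode

# Each object label encodes a URL; the page behind it offers the layered
# choices (text, pictures, video, audio) described above.
object_url = "https://example-museum.org/objects/123"  # placeholder URL
img = qrcode.make(object_url)
img.save("object-123-label.png")  # print and attach to the movable label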
Abstract:
We describe the development of a questionnaire to elicit pain symptoms and experience, for use by people with dementia or their carers, at hospital admission. The questionnaire provided contextual information to support professionals' use of the Abbey Pain Scale, a validated tool used by nursing staff internationally. Appropriate information and physical design were required not only to create an approachable questionnaire for patients and carers, but also to ensure fit with hospital processes. Fit with hospital processes had a significant influence on the final form of the questionnaire, compromising some aspects of the design for patients and carers, but this compromise was considered essential to ensure that pain management procedures were supplemented by wider, contextual information.
Abstract:
Background. Initial evidence suggests that the integrity of the ipsilesional corticospinal tract (CST) after stroke is strongly related to motor function in the chronic state but not to the treatment gain induced by motor rehabilitation. Objective. We examined the association of motor status and treatment benefit by testing patients with a wide range of severity of hemiparesis of the left and right upper extremity. Methods. Diffusion tensor imaging was performed in 22 patients more than 12 months after stroke onset with severe to moderate hemiparesis. Motor function was tested before and after 2 weeks of modified constraint-induced movement therapy. Results. CST integrity, but not lesion volume, correlated with the motor ability measures of the Wolf Motor Function Test and the Motor Activity Log. No differences were found between left and right hemiparesis. Motor performance improved significantly with the treatment regimen, and did so equally for patients with left and right arm paresis. However, treatment benefit was not associated with either CST integrity or lesion volume. Conclusion. In this small trial, CST integrity correlated best with chronic long-term status but not with treatment-induced improvements. The CST may play a different role in the mechanisms mediating long-term outcome than in those underlying practice-induced gains after a chronic plateau in motor function.
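The central analysis reported here is an association between a DTI-derived CST integrity measure and the motor scores. A hedged sketch with SciPy, using invented values in place of the 22 patients' data:

from scipy import stats

# Invented values: CST integrity (e.g. fractional anisotropy) and Wolf
# Motor Function Test scores for a few hypothetical patients.
cst_integrity = [0.31, 0.42, 0.28, 0.51, 0.39]
wmft_score = [28.0, 44.5, 22.1, 58.3, 37.9]

r, p = stats.pearsonr(cst_integrity, wmft_score)
rho, p_rank = stats.spearmanr(cst_integrity, wmft_score)  # rank-based alternative
print(f"Pearson r = {r:.2f} (p = {p:.3f}); Spearman rho = {rho:.2f} (p = {p_rank:.3f})")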
Abstract:
Approximately 20% of individuals with Parkinson's disease (PD) report a positive family history. Yet, a large portion of causal and disease-modifying variants is still unknown. We used exome sequencing in two affected individuals from a family with late-onset PD to identify 15 potentially causal variants. Segregation analysis and frequency assessment in 862 PD cases and 1,014 ethnically matched controls highlighted variants in EEF1D and LRRK1 as the best candidates. Mutation screening of the coding regions of these genes in 862 cases and 1,014 controls revealed several novel non-synonymous variants in both genes in cases and controls. An in silico multi-model bioinformatics analysis was used to prioritize identified variants in LRRK1 for functional follow-up. However, protein expression, subcellular localization, and cell viability were not affected by the identified variants. Although it has yet to be proven conclusively that variants in LRRK1 are indeed causative of PD, our data strengthen a possible role for LRRK1, in addition to LRRK2, in the genetic underpinnings of PD but, at the same time, highlight the difficulties encountered in the study of rare variants identified by next-generation sequencing in diseases with autosomal dominant or complex patterns of inheritance.
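The frequency-assessment step can be sketched as a case/control carrier-count comparison; the counts below are invented, and a Fisher exact test stands in for whatever statistic the study actually applied:

from scipy.stats import fisher_exact

# Hypothetical records: (gene, carriers among 862 cases, carriers among
# 1,014 controls). Counts are illustrative only, not the study's data.
variants = [("LRRK1", 6, 1), ("EEF1D", 3, 2), ("OTHER", 1, 5)]

for gene, case_carriers, control_carriers in variants:
    table = [[case_carriers, 862 - case_carriers],
             [control_carriers, 1014 - control_carriers]]
    odds_ratio, p_value = fisher_exact(table)
    print(f"{gene}: OR = {odds_ratio:.2f}, p = {p_value:.3f}")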
Abstract:
Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. The resulting predictions are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven, application-benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow model as an example.
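A traditional analytic model of the kind referred to here predicts loop time from operation counts and hardware rates. A minimal roofline-style sketch follows; every number is an assumed placeholder, not a measured HECToR or Opteron figure:

# Minimal analytic model: step time is bounded by compute and by memory
# traffic; the slower of the two bounds dominates. All numbers are
# assumptions for illustration.
FLOPS_PER_CELL = 30        # floating-point ops per grid cell per timestep
BYTES_PER_CELL = 200       # memory traffic per grid cell per timestep
PEAK_FLOPS = 9.2e9         # assumed per-core peak, flop/s
BANDWIDTH = 2.5e9          # assumed sustained memory bandwidth, bytes/s

def predicted_step_time(n_cells):
    t_compute = n_cells * FLOPS_PER_CELL / PEAK_FLOPS
    t_memory = n_cells * BYTES_PER_CELL / BANDWIDTH
    return max(t_compute, t_memory)

print(predicted_step_time(512 * 512))  # e.g. a 512x512 grid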
A benchmark-driven modelling approach for evaluating deployment choices on a multi-core architecture
Abstract:
The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating-point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on a core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and using non-standard task-to-core mappings can dramatically alter performance. Finding this out, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work, loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
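That final interpolation step can be sketched with NumPy: benchmark times measured at a few local problem sizes are interpolated to predict an unmeasured size. All numbers below are invented placeholders:

import numpy as np

# Hypothetical benchmark results for the array-update kernel under one
# deployment scenario: per-step time (s) at several local grid sizes.
sizes = np.array([128.0, 256.0, 512.0, 1024.0])
times = np.array([0.004, 0.017, 0.071, 0.295])

def predict_step_time(local_size):
    # Interpolate between benchmarked sizes, as the model does when a
    # target decomposition's local size was not measured directly.
    return float(np.interp(local_size, sizes, times))

# E.g. a 2x2 decomposition of a 1536x1536 domain gives 768x768 local tiles.
print(predict_step_time(768))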
Abstract:
We assess the roles of long-lived greenhouse gases and ozone depletion in driving meridional surface pressure gradients in the southern extratropics; these gradients are a defining feature of the Southern Annular Mode. Stratospheric ozone depletion is thought to have caused a strengthening of this mode during summer, with increasing long-lived greenhouse gases playing a secondary role. Using a coupled atmosphere-ocean chemistry-climate model, we show that the direct radiative effect of increasing greenhouse gases is partly cancelled by the substantial indirect (chemical and dynamical) feedbacks that greenhouse gases exert via their impact on ozone. This sensitivity of the mode to greenhouse gas-induced ozone changes suggests that a consistent implementation of ozone changes due to long-lived greenhouse gases in climate models would benefit the simulation of this important aspect of Southern Hemisphere climate.