53 results for Post and Core Technique
Abstract:
Background: Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique for therapeutics discovery and drug re-purposing based on differential gene expression analysis. On a normal desktop PC, it is common for a connectivity mapping task with a single gene signature to take more than 2 hours to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, implemented in CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) and greatly reduce processing times for connectivity mapping.
Results: cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating high-throughput discovery of candidate therapeutics. We demonstrate dramatic speed differentials between GPU-assisted and CPU-only execution as the computational load increases for high-accuracy evaluation of statistical significance.
Conclusion: Emerging 'omics' technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution to the discovery of candidate therapeutics by enabling speedy execution of heavy-duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap.
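To make the parallelisation concrete, below is a minimal CUDA C/C++ sketch of the core idea: each GPU thread scores one reference expression profile against the query gene signature using a simplified signed-rank connection score. The data layout, rank encoding and scoring details here are illustrative assumptions, not cudaMap's actual implementation.

// Hypothetical, simplified connectivity-mapping kernel: one GPU thread scores
// one reference expression profile against a query gene signature using a
// signed-rank connection score. The real cudaMap scoring details may differ.
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

__global__ void connectionScores(const float* ranks, const int* sigGenes,
                                 const float* sigSigns, float* scores,
                                 int numGenes, int sigLen, int numProfiles)
{
    int p = blockIdx.x * blockDim.x + threadIdx.x;   // one profile per thread
    if (p >= numProfiles) return;
    float s = 0.0f;
    for (int i = 0; i < sigLen; ++i)                 // sum signed ranks of signature genes
        s += sigSigns[i] * ranks[p * numGenes + sigGenes[i]];
    scores[p] = s;                                   // normalisation omitted for brevity
}

int main()
{
    const int numGenes = 4, numProfiles = 2, sigLen = 2;
    std::vector<float> ranks = { 1, -2, 3, -4,       // profile 0: signed gene ranks
                                 -1, 2, -3, 4 };     // profile 1
    std::vector<int>   sigGenes = { 0, 3 };          // signature: gene 0 up, gene 3 down
    std::vector<float> sigSigns = { +1.0f, -1.0f };

    float *dRanks, *dSigns, *dScores; int *dGenes;
    cudaMalloc(&dRanks, ranks.size() * sizeof(float));
    cudaMalloc(&dGenes, sigLen * sizeof(int));
    cudaMalloc(&dSigns, sigLen * sizeof(float));
    cudaMalloc(&dScores, numProfiles * sizeof(float));
    cudaMemcpy(dRanks, ranks.data(), ranks.size() * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dGenes, sigGenes.data(), sigLen * sizeof(int), cudaMemcpyHostToDevice);
    cudaMemcpy(dSigns, sigSigns.data(), sigLen * sizeof(float), cudaMemcpyHostToDevice);

    connectionScores<<<1, 32>>>(dRanks, dGenes, dSigns, dScores, numGenes, sigLen, numProfiles);

    std::vector<float> scores(numProfiles);
    cudaMemcpy(scores.data(), dScores, numProfiles * sizeof(float), cudaMemcpyDeviceToHost);
    for (int p = 0; p < numProfiles; ++p)
        printf("profile %d: score %.1f\n", p, scores[p]);
    cudaFree(dRanks); cudaFree(dGenes); cudaFree(dSigns); cudaFree(dScores);
    return 0;
}

Because each reference profile, and each random signature scored during significance estimation, can be processed independently, the task maps naturally onto thousands of GPU threads; this is where the reported speed-up over single-threaded CPU execution comes from.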
Abstract:
This paper presents a scalable, statistical ‘black-box’ model for predicting the performance of parallel programs on multi-core non-uniform memory access (NUMA) systems. We derive a model with low overhead by reducing data collection and model training time. The model can accurately predict the behaviour of parallel applications in response to changes in their concurrency, thread layout on NUMA nodes, and core voltage and frequency. We present a framework that applies the model to achieve significant energy and energy-delay-square (ED2) savings (9% and 25%, respectively) along with performance improvement (10% mean) on an actual 16-core NUMA system running realistic application workloads. Our prediction model proves substantially more accurate than previous efforts.
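As an illustration of the 'black-box' approach, the following sketch fits a linear model mapping hypothetical configuration features (thread count, threads placed on a remote NUMA node, core frequency) to measured runtime, then predicts an untried configuration. The paper's actual model, feature set and training procedure are not specified here.

// Hypothetical sketch of a 'black-box' performance model: fit a linear map
// from configuration features to measured runtime on a few profiled runs,
// then predict configurations that were never measured. Toy data throughout.
#include <cstdio>
#include <vector>

struct Sample { double feats[3]; double runtime; };

// Prediction: w0 + w1*threads + w2*remoteThreads + w3*freqGHz
double predict(const double w[4], const double f[3]) {
    return w[0] + w[1] * f[0] + w[2] * f[1] + w[3] * f[2];
}

int main() {
    // Toy training data: (threads, threads on remote NUMA node, GHz) -> seconds
    std::vector<Sample> train = {
        {{4, 0, 2.0}, 10.0}, {{8, 0, 2.0}, 6.0},
        {{8, 4, 2.0}, 8.0},  {{8, 0, 1.0}, 11.0},
        {{16, 8, 2.0}, 7.5}, {{16, 0, 1.5}, 6.5},
    };
    double w[4] = {0, 0, 0, 0};
    const double lr = 1e-3;                       // learning rate
    for (int epoch = 0; epoch < 100000; ++epoch)  // plain least-squares fit by SGD
        for (const Sample& s : train) {
            double err = predict(w, s.feats) - s.runtime;
            w[0] -= lr * err;
            for (int j = 0; j < 3; ++j) w[j + 1] -= lr * err * s.feats[j];
        }
    double query[3] = {16, 4, 2.0};               // an untried configuration
    printf("predicted runtime: %.2f s\n", predict(w, query));
    return 0;
}

The design point such a model captures is that the expensive step (profiling runs) happens once, after which any combination of concurrency, thread placement and frequency can be evaluated cheaply, enabling the energy and ED2 optimisation the framework performs.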
Abstract:
Enhancing sampling and analyzing simulations are central issues in molecular simulation. Recently, we introduced PLUMED, an open-source plug-in that provides some of the most popular molecular dynamics (MD) codes with implementations of a variety of different enhanced sampling algorithms and collective variables (CVs). The rapid changes in this field, in particular new directions in enhanced sampling and dimensionality reduction together with new hardware, require a code that is more flexible and more efficient. We therefore present here PLUMED 2, a complete rewrite of the code in an object-oriented programming language (C++). This new version introduces greater flexibility and greater modularity, which both extend its core capabilities and make it far easier to add new methods and CVs. It also has a simpler interface with the MD engines and provides a single software library containing both tools and core facilities. Ultimately, the new code better serves the ever-growing community of users and contributors in coping with the new challenges arising in the field.
Program summary
Program title: PLUMED 2
Catalogue identifier: AEEE_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEE_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Yes
No. of lines in distributed program, including test data, etc.: 700646
No. of bytes in distributed program, including test data, etc.: 6618136
Distribution format: tar.gz
Programming language: ANSI-C++.
Computer: Any computer capable of running an executable produced by a C++ compiler.
Operating system: Linux and other Unix operating systems.
Has the code been vectorized or parallelized?: Yes, parallelized using MPI.
RAM: Depends on the number of atoms, the method chosen and the collective variables used.
Classification: 3, 7.7, 23.
Catalogue identifier of previous version: AEEE_v1_0.
Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 1961.
External routines: GNU libmatheval, Lapack, Blas, MPI.
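To illustrate what the object-oriented rewrite buys, here is a hypothetical miniature of a collective-variable hierarchy in C++: an abstract base class defines the interface an MD engine sees, and a new CV only needs to implement one method. This sketches the design idea only; it is not the actual PLUMED 2 class layout or API.

// Hypothetical miniature of an object-oriented CV hierarchy, illustrating why
// a C++ rewrite makes new collective variables easy to add. This is NOT the
// actual PLUMED 2 class hierarchy.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { double x, y, z; };

class Colvar {                                   // common interface for all CVs
public:
    virtual ~Colvar() = default;
    virtual double calculate(const std::vector<Vec3>& pos) const = 0;
};

class RadiusOfGyration : public Colvar {         // one concrete CV
public:
    double calculate(const std::vector<Vec3>& pos) const override {
        Vec3 c{0, 0, 0};                         // geometric centre (mass weights omitted)
        for (const Vec3& p : pos) { c.x += p.x; c.y += p.y; c.z += p.z; }
        double n = static_cast<double>(pos.size());
        c.x /= n; c.y /= n; c.z /= n;
        double s = 0.0;                          // mean squared distance from the centre
        for (const Vec3& p : pos)
            s += (p.x-c.x)*(p.x-c.x) + (p.y-c.y)*(p.y-c.y) + (p.z-c.z)*(p.z-c.z);
        return std::sqrt(s / n);
    }
};

int main() {
    std::vector<Vec3> atoms = {{0,0,0}, {1,0,0}, {0,1,0}, {0,0,1}};
    RadiusOfGyration rg;                         // an MD engine would only see Colvar*
    printf("Rg = %.3f\n", rg.calculate(atoms));  // prints 0.750 for this geometry
    return 0;
}

The benefit of this structure is that adding a new CV or bias method means writing one self-contained class against a stable interface, rather than modifying the MD engine or the core of the library.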
Abstract:
This programme of research aimed to understand the extent to which current UK medical graduates are prepared for practice. Commissioned by the General Medical Council, we conducted: (1) a Rapid Review of the literature between 2009 and 2013; (2) narrative interviews with a range of stakeholders; and (3) longitudinal audio-diaries with Foundation Year 1 (F1) doctors. The Rapid Review (RR) resulted in data from 81 manuscripts being extracted and mapped against a coding framework (including outcomes from Tomorrow's Doctors (2009) (TD09)). A narrative synthesis of the data was undertaken. Narrative interviews were conducted with 185 participants from 8 stakeholder groups: F1 trainees, newly registered trainee doctors, clinical educators, undergraduate and postgraduate deans and foundation programme directors, other healthcare professionals, employers, policy and government representatives, and patient and public representatives. Longitudinal audio-diaries were recorded by 26 F1 trainees over 4 months. The data were analysed thematically and mapped against TD09. Together these data shed light on how preparedness for practice is conceptualised and measured, how prepared UK medical graduates are for practice, the effectiveness of transition interventions, and the currently debated issue of bringing full registration forward to align with medical students' graduation. Preparedness for practice was conceptualised as both a long- and short-term venture that included personal readiness as well as knowledge, skills and attitudes. It has mainly been researched using self-report measures of generalised incidents, an approach that has been shown to be problematic. In terms of transition interventions: assistantships were found to be valuable and efficacious for students who are proactive team members; shadowing is effective when undertaken close to the employment/setting of the F1 post; and induction is generally effective but of inconsistent quality. The August transition was highlighted in our interview and audio-diary data, where F1s felt unprepared, particularly for the step-change in responsibility, workload, degree of multitasking and understanding where to go for help. Evidence of preparedness for specific tasks, skills and knowledge was contradictory: trainees are well prepared for some practical procedures but not others, reasonably well prepared for history taking and full physical examinations, but mostly unprepared for adopting a holistic understanding of the patient, involving patients in their care, safe and legal prescribing, diagnosing and managing complex clinical conditions, and providing immediate care in medical emergencies. Evidence for preparedness for interactional and interpersonal aspects of practice was inconsistent, with some studies in the RR suggesting graduates were prepared for team working and communicating with colleagues and patients, and other studies contradicting this. Interview and audio-diary data highlight concerns around F1s' preparedness for communicating with angry or upset patients and relatives, breaking bad news, communicating with the wider team (including interprofessionally) and handover communication. There was some evidence in the RR to suggest that graduates were unprepared for dealing with error and safety incidents and lack an understanding of how the clinical environment works. Interview and audio-diary data back this up, adding that F1s are also unprepared for understanding the financial aspects of healthcare.
In terms of being personally prepared, RR, interview and audio-diary evidence is mixed around graduates' preparedness for identifying their own limitations, but all data point to graduates' difficulties in the domain of time management. In terms of personal and situational demographic factors, the RR found that gender did not typically predict perceptions of preparedness, but graduates from more recent cohorts, graduate-entry students, graduates from problem-based learning courses, UK-educated graduates and graduates with an integrated degree reported feeling better prepared. The longitudinal audio-diaries provided insights into the preparedness journey of F1s. There seems to be a general development in the direction of trainees feeling more confident and competent as they gain more experience. However, this development was not necessarily linear, as challenging circumstances (e.g. a new specialty, new colleagues, or a lack of staffing) sometimes made trainees feel unprepared for situations in which they had previously indicated preparedness.
Abstract:
The water activity (a(w)) of microbial substrates, biological samples, and foods and drinks is usually determined by direct measurement of the equilibrium relative humidity above a sample. However, these materials can contain ethanol, which disrupts the operation of humidity sensors. Previously, an indirect and problematic technique based on freezing-point depression measurements was needed to calculate the a(w) when ethanol was present. We now describe a rapid and accurate method to determine the a(w) of ethanol-containing samples at ambient temperatures. Disruption of sensor measurements was minimized by using a newly developed, alcohol-resistant humidity sensor fitted with an alcohol filter. Linear equations were derived from a(w) measurements of standard ethanol-water mixtures, and from Norrish's equation, to correct sensor measurements. To our knowledge, this is the first time that electronic sensors have been used to determine the a(w) of ethanol-containing samples.
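For reference, one common form of Norrish's equation, and the general shape of a linear sensor correction of the kind described, are as follows; the specific fitted coefficients are in the paper, and sign conventions for K vary across the literature:

    a_w = x_w \exp\left(K x_s^2\right)
    a_w^{\text{corrected}} = \alpha \, a_w^{\text{sensor}} + \beta

where x_w and x_s are the mole fractions of water and solute (here, ethanol), K is the empirical Norrish coefficient for the solute, and \alpha and \beta are fitted from measurements of standard ethanol-water mixtures.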
Abstract:
The initial composition of acrylic bone cement, along with the mixing and delivery technique used, can influence its final properties and therefore its clinical success in vivo. The polymerisation of acrylic bone cement is complex, with a number of processes happening simultaneously. Acrylic bone cement mixing and delivery systems have undergone several design changes over their development, although the cement constituents themselves have remained unchanged since they were first used. This study was conducted to determine the factors that had the greatest effect on the final properties of acrylic bone cement produced using a pre-filled bone cement mixing and delivery system. A design of experiments (DoE) approach was used to determine the impact of the factors associated with this mixing and delivery method on the final properties of the cement produced. The DoE illustrated that all factors considered in this study had a significant impact on the final properties of the cement. An optimum cement composition was hypothesised and tested. This optimum recipe produced cement with final mechanical and thermal properties within the clinical guidelines stated by ISO 5833 (International Standard Organisation (ISO), International standard 5833: implants for surgery: acrylic resin cements, 2002); however, the low setting times observed would not be clinically viable and could result in complications during surgery. As a result, further development would be required to improve the setting time of the cement in order for it to be deemed suitable for use in total joint replacement surgery.
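As a sketch of how a DoE of this kind quantifies factor importance, the following example estimates main effects and a two-factor interaction from a two-level full-factorial design. The factor names (liquid-to-powder ratio, initiator concentration) and the response values are illustrative placeholders, not the study's actual factors or data.

// Hypothetical sketch of a 2-level, 2-factor full-factorial DoE analysis:
// estimate each factor's main effect on a response (e.g. setting time).
// Factor names and numbers are illustrative, not the study's actual data.
#include <cstdio>

int main() {
    // Coded levels (-1 low, +1 high) for factors A and B, with responses.
    // A: liquid-to-powder ratio, B: initiator concentration (illustrative).
    const int n = 4;
    const double A[n] = {-1, +1, -1, +1};
    const double B[n] = {-1, -1, +1, +1};
    const double settingTime[n] = {9.5, 7.0, 11.0, 8.0};   // minutes (made up)

    double effA = 0, effB = 0, effAB = 0;
    for (int i = 0; i < n; ++i) {
        effA  += A[i] * settingTime[i];
        effB  += B[i] * settingTime[i];
        effAB += A[i] * B[i] * settingTime[i];   // two-factor interaction
    }
    // Main effect = (mean at high level) - (mean at low level) = 2*sum/n.
    printf("effect of A:  %+.2f min\n", 2 * effA  / n);
    printf("effect of B:  %+.2f min\n", 2 * effB  / n);
    printf("A x B:        %+.2f min\n", 2 * effAB / n);
    return 0;
}

Ranking factors by the magnitude of these effects is what lets a study of this kind identify which mixing and delivery parameters dominate the cement's final properties, and then hypothesise an optimum composition.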
Abstract:
Research has shown that fibre reinforced polymer (FRP) wraps are effective for strengthening concrete columns for increased axial and flexural load and deformation capacity, and this technique is now used around the world. The experimental study presented in this paper is focused on the mechanics of FRP confined concrete, with a particular emphasis on the influence of the unconfined concrete compressive strength on confinement effectiveness and hoop strain efficiency. An experimental programme was undertaken to study the compressive strength and stress-strain behaviour of unconfined and FRP confined concrete cylinders of different concrete strengths but otherwise similar mix designs, aggregates, and constituents. This was accomplished by varying only the water-to-cement ratio during concrete mixing operations. Through the use of high-resolution digital image correlation to measure both axial and hoop strains, the observations yield insights into the mechanics of FRP confinement of concretes of similar composition but with varying unconfined concrete compressive strength.
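The two quantities the study emphasises are commonly expressed in the FRP-confinement literature as the confinement effectiveness ratio and the hoop strain efficiency factor; the paper's exact definitions may differ slightly, but the standard forms are:

    \text{confinement effectiveness} = \frac{f'_{cc}}{f'_{co}}, \qquad k_\varepsilon = \frac{\varepsilon_{h,\text{rup}}}{\varepsilon_{fu}}

where f'_{cc} and f'_{co} are the confined and unconfined concrete compressive strengths, \varepsilon_{h,\text{rup}} is the hoop strain in the jacket at rupture (here measured by digital image correlation), and \varepsilon_{fu} is the ultimate tensile strain of the FRP material from coupon tests.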
Abstract:
Background: Large-scale biological jobs on high-performance computing systems require manual intervention if one or more of the computing cores on which they execute fail. This imposes not only the cost of maintaining the job, but also the cost of the time taken to reinstate it and the risk of losing the data and execution the job accomplished before it failed. Approaches that can proactively detect computing core failures and take action to relocate the affected core's job onto reliable cores can make a significant step towards automating fault tolerance. Method: This paper describes an experimental investigation into the use of multi-agent approaches for fault tolerance. Two approaches are studied, the first at the job level and the second at the core level. The approaches are investigated for single-core failure scenarios that can occur in the execution of parallel reduction algorithms on computer clusters. A third approach is proposed that incorporates multi-agent technology at both the job and core levels. Experiments are pursued in the context of genome searching, a popular computational biology application. Result: The key conclusion is that the approaches proposed are feasible for automating fault tolerance in high-performance computing systems with minimal human intervention. In a typical experiment in which fault tolerance is studied, centralised and decentralised checkpointing approaches on average add 90% to the actual time for executing the job. By contrast, in the same experiment the multi-agent approaches add only 10% to the overall execution time.
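As a sketch of the core-level idea, the following example shows an agent that polls a per-core health signal and proactively relocates a job from a suspect core to the most reliable alternative before failure. The health metric, threshold and relocation step are simulated placeholders; the paper's multi-agent implementation is not reproduced here.

// Hypothetical sketch of core-level proactive fault tolerance: an agent polls
// a per-core health signal and relocates a job from a suspect core to a
// reliable one before it fails. Health values and relocation are simulated.
#include <cstdio>
#include <vector>

struct Core { int id; double health; };       // health in [0,1]; low = failing

// Choose the healthiest core other than the one being evacuated.
int pickReliableCore(const std::vector<Core>& cores, int avoid) {
    int best = -1; double bestHealth = -1.0;
    for (const Core& c : cores)
        if (c.id != avoid && c.health > bestHealth) {
            best = c.id; bestHealth = c.health;
        }
    return best;
}

int main() {
    std::vector<Core> cores = { {0, 0.95}, {1, 0.30}, {2, 0.88} };
    int jobCore = 1;                          // job currently on core 1
    const double threshold = 0.5;             // predicted-failure threshold

    // One iteration of the agent's monitoring loop (real agents poll repeatedly).
    if (cores[jobCore].health < threshold) {
        int target = pickReliableCore(cores, jobCore);
        printf("core %d predicted to fail; relocating job to core %d\n",
               jobCore, target);
        jobCore = target;                     // checkpoint/restart of the job elided
    }
    printf("job now running on core %d\n", jobCore);
    return 0;
}

Proactive relocation of this kind, performed only when a failure is predicted, is what keeps the reported overhead near 10% of execution time, against roughly 90% for the centralised and decentralised checkpointing approaches.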