44 results for 130306 Educational Technology and Computing
in CentAUR: Central Archive University of Reading - UK
Abstract:
Objective: To describe the training undertaken by pharmacists employed in a pharmacist-led, information technology-based intervention study to reduce medication errors in primary care (PINCER Trial), to evaluate the pharmacists’ assessment of that training, and to determine the time implications of undertaking it. Methods: Six pharmacists received training, which included root cause analysis and educational outreach, to enable them to deliver the PINCER Trial intervention. The training was evaluated using self-report questionnaires at the end of each session, and the time taken to complete each session was recorded. Data from the evaluation forms were entered onto a Microsoft Excel spreadsheet, independently checked, and the summary of results further verified. Frequencies were calculated for responses to the three-point Likert scale questions. Free-text comments from the evaluation forms and the pharmacists’ diaries were analysed thematically. Key findings: All six pharmacists received 22 hours of training over five sessions. In four of the five sessions, the pharmacists who completed an evaluation form (27 of 30 forms were completed) stated they were satisfied or very satisfied with the various elements of the training package. Analysis of the free-text comments and the pharmacists’ diaries showed that the principles of root cause analysis and educational outreach were viewed as useful tools for conducting pharmaceutical interventions, both in the study and in the other pharmacy roles the pharmacists undertook. The opportunity to undertake role play was seen as a valuable part of the training. Conclusions: The findings presented in this paper suggest that providing the PINCER pharmacists with training in root cause analysis and educational outreach contributed to the successful delivery of the PINCER interventions, and that this training could be used by other pharmacists based in general practice to deliver pharmaceutical interventions that improve patient safety.
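To make the frequency analysis described above concrete, the following is a minimal sketch (not the study's actual code) of how frequencies might be calculated for three-point Likert scale responses from the evaluation forms; the response labels and example data are hypothetical placeholders.

```python
# Illustrative sketch only: tally three-point Likert responses from
# training-evaluation forms. Labels and example data are hypothetical.
from collections import Counter

LIKERT_LEVELS = ["very satisfied", "satisfied", "not satisfied"]

def likert_frequencies(responses):
    """Return count and percentage for each Likert level."""
    counts = Counter(responses)
    total = len(responses)
    return {
        level: {
            "count": counts.get(level, 0),
            "percent": round(100 * counts.get(level, 0) / total, 1),
        }
        for level in LIKERT_LEVELS
    }

# Hypothetical responses to one question across the completed forms.
example_responses = (
    ["very satisfied"] * 15 + ["satisfied"] * 10 + ["not satisfied"] * 2
)
print(likert_frequencies(example_responses))
```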
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change occur mainly through changes in the statistics of regional weather variations, the scientific and computational requirements for predicting its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenge of developing the capability to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computing capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a scientific workforce sufficient to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to determine what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level; current limits on computing power have severely constrained such an investigation, which is now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it, and will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.
Abstract:
Consideration of the quality, relevance and utility of research in educational leadership and management has been a growing concern of researchers, policy-makers and practitioners, but there is little agreement about its current state or priorities for development. The article reflects on the key criticisms that have been made of research in educational leadership and management in this issue, and elsewhere. It considers how we might begin to devise better ways of understanding its audiences, judging its quality and identifying priorities for the future. It argues that the research reflects its capture by those with particular interests or values, and that it impacts in ways which are complex and indirect. If educational leadership and management research is to be secure in its perceived value and contribution in the future, several developments are needed, including a greater emphasis on interdisciplinarity and an expansion of the range of methodologies, particularly qualitative studies; these shifts must be evident in the training of researchers as well as in the conduct of research.
Abstract:
Gene chips are finding extensive use in animal and plant science. Microarrays are generally of two kinds, cDNA or oligonucleotide: cDNA microarrays were developed at Stanford University, whereas oligonucleotide arrays were developed by Affymetrix. Arraying cDNA or oligonucleotide probes on a glass slide makes it possible to compare the gene expression levels of treated and control samples by labelling their mRNA with green (Cy3) and red (Cy5) dyes. The hybridized gene chip emits fluorescence whose intensity and colour can be measured. RNA labelling can be done directly or indirectly; the indirect method uses aminoallyl-modified dUTP instead of pre-labelled nucleotides. Hybridization of the gene chip is generally carried out in the minimum volume possible, and to ensure heteroduplex formation roughly tenfold more DNA is spotted on the slide than is present in solution. Confocal or semi-confocal laser technologies coupled with a CCD camera are used for image acquisition. For standardization, housekeeping genes are used, or cDNAs that are not present in the treated or control samples are spotted on the gene chip. Moreover, statistical analysis (image analysis) and cluster analysis software have been developed at Stanford University. Gene-chip technology has many applications, such as expression analysis, gene expression signatures (molecular phenotypes) and promoter regulatory element co-expression.
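As an illustration of the two-colour workflow described above, the following minimal sketch computes Cy5/Cy3 log-ratios per spot, centres them on housekeeping genes, and clusters the genes; the gene names and intensity values are hypothetical, and real pipelines include further steps such as background correction.

```python
# Minimal sketch of a two-colour microarray analysis: log2(Cy5/Cy3)
# per spot, normalisation to housekeeping genes, simple clustering.
# All gene names and intensities below are hypothetical placeholders.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

genes = ["geneA", "geneB", "GAPDH", "ACTB", "geneC"]
housekeeping = {"GAPDH", "ACTB"}

cy3 = np.array([1200.0, 300.0, 5000.0, 4800.0, 150.0])  # control channel
cy5 = np.array([2400.0, 310.0, 5100.0, 4700.0, 900.0])  # treated channel

# log2 ratio of treated to control for each spot
log_ratio = np.log2(cy5 / cy3)

# Normalise so that housekeeping genes have a median log-ratio of zero
hk_mask = np.array([g in housekeeping for g in genes])
log_ratio -= np.median(log_ratio[hk_mask])

# Simple hierarchical clustering of genes by their normalised ratios
Z = linkage(log_ratio.reshape(-1, 1), method="average")
clusters = fcluster(Z, t=2, criterion="maxclust")

for g, r, c in zip(genes, log_ratio, clusters):
    print(f"{g}: log2(Cy5/Cy3) = {r:+.2f}, cluster {c}")
```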
Abstract:
Marketing activities are introduced into a rational expectations model of the food marketing system. The model is used to evaluate the effects of alternative marketing technologies on the distribution of the benefits of contingency markets in agriculture. Benefits depend on two parameters: the cost share of farm inputs and the elasticity of substitution between farm and nonfarm inputs in food marketing. Over a broad spectrum of technologies, consumers are likely to be the net beneficiaries, and farmers the net losers, from the provision of contingency markets.
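As a purely illustrative aside, the two parameters named in the abstract (the cost share of farm inputs and the elasticity of substitution between farm and nonfarm inputs) are also the parameters of a standard CES aggregator; the sketch below shows how they might enter a generic CES marketing technology, and is not the rational expectations model used in the paper.

```python
# Illustrative only: a generic CES marketing technology parameterised by
# the farm-input cost share and the farm/nonfarm elasticity of
# substitution. Not the paper's rational expectations model.
def ces_food_output(farm_input, marketing_input, cost_share, elasticity):
    """CES aggregate of farm and nonfarm (marketing) inputs.

    cost_share: share parameter on the farm input (0 < cost_share < 1)
    elasticity: elasticity of substitution sigma (> 0, != 1)
    """
    rho = (elasticity - 1.0) / elasticity  # CES exponent
    return (cost_share * farm_input ** rho
            + (1.0 - cost_share) * marketing_input ** rho) ** (1.0 / rho)

# Example: low substitutability (sigma = 0.5) vs. high (sigma = 2.0)
for sigma in (0.5, 2.0):
    q = ces_food_output(farm_input=1.0, marketing_input=2.0,
                        cost_share=0.4, elasticity=sigma)
    print(f"sigma = {sigma}: food output = {q:.3f}")
```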