870 results for Annette


Relevance:

10.00%

Publisher:

Abstract:

Aim: To determine the prevalence and nature of prescribing errors in general practice, to explore their causes, and to identify defences against error. Methods: 1) Systematic reviews; 2) retrospective review of unique medication items prescribed over a 12-month period to a 2% sample of patients from 15 general practices in England; 3) interviews with 34 prescribers regarding 70 potential errors; 15 root cause analyses; and six focus groups involving 46 primary health care team members. Results: The study involved examination of 6,048 unique prescription items for 1,777 patients. Prescribing or monitoring errors were detected for one in eight patients, involving around one in 20 of all prescription items. The vast majority of errors were of mild to moderate severity, with one in 550 items being associated with a severe error. The following factors were associated with an increased risk of prescribing or monitoring errors: male gender, age less than 15 years or greater than 64 years, the number of unique medication items prescribed, and being prescribed preparations in the following therapeutic areas: cardiovascular, infections, malignant disease and immunosuppression, musculoskeletal, eye, ENT and skin. Prescribing or monitoring errors were not associated with the grade of GP or whether prescriptions were issued as acute or repeat items. A wide range of underlying causes of error was identified, relating to the prescriber, the patient, the team, the working environment, the task, the computer system and the primary/secondary care interface. Many defences against error were also identified, including strategies employed by individual prescribers and primary care teams, and making best use of health information technology. Conclusion: Prescribing errors in general practice are common, although severe errors are unusual. Many factors increase the risk of error. Strategies for reducing the prevalence of error should focus on GP training, continuing professional development for GPs, clinical governance, effective use of clinical computer systems, and improving safety systems within general practices and at the interface with secondary care.

Relevance:

10.00%

Publisher:

Abstract:

Dissolved organic carbon (DOC) concentrations in surface waters have increased across much of Europe and North America, with implications for the terrestrial carbon balance, aquatic ecosystem functioning, water treatment costs and human health. Over the past decade, many hypotheses have been put forward to explain this phenomenon, from changing climate and land management to eutrophication and acid deposition. Resolution of this debate has been hindered by a reliance on correlative analyses of time-series data and a lack of robust experimental testing of proposed mechanisms. In a four-year, four-site replicated field experiment involving both acidifying and de-acidifying treatments, we tested the hypothesis that DOC leaching was previously suppressed by high levels of soil acidity in peat and organo-mineral soils, and therefore that observed DOC increases are a consequence of decreasing soil acidity. We observed a consistent, positive relationship between DOC and acidity change at all sites. Responses were described by similar hyperbolic relationships between standardised changes in DOC and hydrogen ion concentrations at all sites, suggesting potentially general applicability. These relationships explained a substantial proportion of observed changes in peak DOC concentrations in nearby monitoring streams, and application to a UK-wide upland soil pH dataset suggests that recovery from acidification alone could have led to soil solution DOC increases in the range of 46-126%, depending on habitat type, since 1978. Our findings raise the possibility that changing soil acidity may have wider impacts on ecosystem carbon balances. Decreasing sulphur deposition may be accelerating terrestrial carbon loss and returning surface waters to a natural, high-DOC condition.

Relevance:

10.00%

Publisher:

Abstract:

Bimanual actions impose intermanual coordination demands not present during unimanual actions. We investigated the functional neuroanatomical correlates of these coordination demands in motor imagery (MI) of everyday actions using functional magnetic resonance imaging (fMRI). To this end, 17 participants imagined unimanual actions with the left and right hand as well as bimanual actions while undergoing fMRI. A univariate fMRI analysis showed no reliable cortical activations specific to bimanual MI, indicating that intermanual coordination demands in MI are not associated with increased neural processing. A functional connectivity analysis based on psychophysiological interactions (PPI), however, revealed marked increases in connectivity between parietal and premotor areas within and between hemispheres. We conclude that in MI of everyday actions intermanual coordination demands are met primarily by changes in connectivity between areas and only moderately, if at all, by changes in the amount of neural activity. These results provide the first characterization of the neuroanatomical correlates of bimanual coordination demands in MI. Our findings support the assumed equivalence of overt and imagined actions and highlight the differences between uni- and bimanual actions. The findings extend our understanding of the motor system and may aid the development of clinical neurorehabilitation approaches based on mental practice.

Relevance:

10.00%

Publisher:

Abstract:

Motor imagery, passive movement, and movement observation have been suggested to activate the sensorimotor system without overt movement. The present study investigated these three covert movement modes together with overt movement in a within-subject design to allow a fine-grained comparison of their ability to activate the sensorimotor system, i.e., the premotor, primary motor, and somatosensory cortices. For this, 21 healthy volunteers underwent functional magnetic resonance imaging (fMRI). In addition, we explored the ability of the different covert movement modes to activate the sensorimotor system in a pilot study of five stroke patients suffering from chronic severe hemiparesis. Results demonstrated that while all covert movement modes activated sensorimotor areas, there were profound differences between modes and between healthy volunteers and patients. In healthy volunteers, the pattern of neural activation during overt execution was best resembled by passive movement, followed by motor imagery, and lastly by movement observation. In patients, attempted overt execution was best resembled by motor imagery, followed by passive movement, and lastly by movement observation. Our results indicate that for severely hemiparetic stroke patients motor imagery may be the preferred way to activate the sensorimotor system without overt behavior. In addition, the clear differences between the covert movement modes point to the need for within-subject comparisons.

Relevance:

10.00%

Publisher:

Abstract:

Purpose: To quantify the extent to which the new registration method, DARTEL (Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra), may reduce the smoothing kernel width required, and to investigate the minimum group size necessary for voxel-based morphometry (VBM) studies. Materials and Methods: A simulated atrophy approach was employed to explore the roles of smoothing kernel, group size, and their interaction in VBM detection accuracy. Group sizes of 10, 15, 25, and 50 were compared for kernels between 0 and 12 mm. Results: A smoothing kernel of 6 mm achieved the highest atrophy detection accuracy for groups of 50 participants, and 8-10 mm for groups of 25, at P < 0.05 with family-wise error correction. The results further demonstrated that a group size of 25 was the lower limit when two different groups of participants were compared, whereas a group size of 15 was the minimum for longitudinal comparisons, but at P < 0.05 with false discovery rate correction. Conclusion: Our data confirm that DARTEL-based VBM generally benefits from smaller kernels and that different kernels perform best for different group sizes, with a tendency towards smaller kernels for larger groups. Importantly, kernel selection was also affected by the statistical threshold applied. This highlights that the choice of kernel in relation to group size should be considered with care.
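To illustrate the smoothing step whose kernel width is at issue here, the following minimal Python sketch applies an isotropic Gaussian kernel of a given FWHM to an image volume. The FWHM-to-sigma conversion is standard, but the use of scipy, the voxel size and the example volumes are illustrative assumptions rather than part of the published pipeline.

import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_volume(volume, fwhm_mm, voxel_size_mm):
    # Convert the kernel's full width at half maximum to a Gaussian sigma:
    # FWHM = 2 * sqrt(2 * ln 2) * sigma.
    sigma_mm = fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    sigma_voxels = sigma_mm / voxel_size_mm  # assumes isotropic voxels
    return gaussian_filter(volume, sigma=sigma_voxels)

# Hypothetical grey-matter map, smoothed with two of the kernel widths discussed above
volume = np.random.rand(91, 109, 91)
smoothed_6mm = smooth_volume(volume, fwhm_mm=6.0, voxel_size_mm=1.5)
smoothed_10mm = smooth_volume(volume, fwhm_mm=10.0, voxel_size_mm=1.5)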

Relevance:

10.00%

Publisher:

Abstract:

Purpose – This paper describes visitors' reactions to using an Apple iPad or smartphone to follow trails in a museum by scanning QR codes, and draws conclusions on the potential for this technology to help improve accessibility at low cost. Design/methodology/approach – Activities were devised in which visitors followed trails around museum objects, each labelled with a QR code and symbolised text. Visitors scanned the QR codes using a mobile device, which then showed more information about an object. Project-team members acted as participant-observers, engaging with visitors and noting how they used the system. Experiences from each activity fed into the design of the next. Findings – Some physical and technical problems with using QR codes can be overcome with the introduction of simple aids, particularly movable object labels. A layered approach to information access is possible, with the first layer comprising a label, the second a mobile-web-enabled screen, and the third a choice of text, pictures, video and audio. Video was especially appealing to young people. The ability to repeatedly watch video or listen to audio seemed to be appreciated by visitors with learning disabilities. This approach can have a low equipment cost; however, maintaining the information behind labels and keeping up with technological changes are ongoing processes. Originality/value – Using QR codes on movable, symbolised object labels as part of a layered information system might help modestly funded museums enhance their accessibility, particularly as visitors increasingly arrive with their own smartphones or tablets.
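As a rough illustration of how such labels could be produced, the following Python sketch generates a printable QR code for each object that points to a mobile-web page. The qrcode library, URLs and object identifiers are hypothetical and are not the system built in the project.

# Generate one QR code image per museum object, each encoding the address of a
# mobile-web page offering the layered text, picture, video and audio options.
import qrcode

BASE_URL = "https://example-museum.org/objects"  # hypothetical mobile-web site

for object_id in ["roman-coin-01", "victorian-loom-02", "ship-model-03"]:
    img = qrcode.make(f"{BASE_URL}/{object_id}")  # encode the object's page address
    img.save(f"label_{object_id}.png")            # print and attach to a movable label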

Relevance:

10.00%

Publisher:

Abstract:

We describe the development of a questionnaire to elicit pain symptoms and experience, for use by people with dementia or their carers at hospital admission. The questionnaire provided contextual information to support professionals' use of the Abbey Pain Scale, a validated tool used by nursing staff internationally. Appropriate information and physical design were required not only to create an approachable questionnaire for patients and carers, but also to ensure fit with hospital processes. Fit with hospital processes had a significant influence on the final form of the questionnaire, compromising some aspects of the design for patients and carers, but this compromise was considered essential to ensure that pain management procedures were supplemented by wider, contextual information.

Relevance:

10.00%

Publisher:

Abstract:

Background. Initial evidence suggests that the integrity of the ipsilesional corticospinal tract (CST) after stroke is strongly related to motor function in the chronic state, but not to the treatment gain induced by motor rehabilitation. Objective. We examined the association of motor status and treatment benefit by testing patients with a wide range of severity of hemiparesis of the left and right upper extremity. Method. Diffusion tensor imaging was performed in 22 patients with severe to moderate hemiparesis more than 12 months after stroke onset. Motor function was tested before and after 2 weeks of modified constraint-induced movement therapy. Results. CST integrity, but not lesion volume, correlated with the motor ability measures of the Wolf Motor Function Test and the Motor Activity Log. No differences were found between left and right hemiparesis. Motor performance improved significantly with the treatment regimen, and did so equally for patients with left and right arm paresis. However, treatment benefit was not associated with either CST integrity or lesion volume. Conclusion. In this small trial, CST integrity correlated best with chronic long-term status but not with treatment-induced improvements. The CST may play a different role in the mechanisms mediating long-term outcome than in those underlying practice-induced gains after a chronic plateau in motor function.

Relevance:

10.00%

Publisher:

Abstract:

Approximately 20% of individuals with Parkinson's disease (PD) report a positive family history. Yet a large proportion of causal and disease-modifying variants is still unknown. We used exome sequencing in two affected individuals from a family with late-onset PD to identify 15 potentially causal variants. Segregation analysis and frequency assessment in 862 PD cases and 1,014 ethnically matched controls highlighted variants in EEF1D and LRRK1 as the best candidates. Mutation screening of the coding regions of these genes in the 862 cases and 1,014 controls revealed several novel non-synonymous variants in both genes in cases and controls. An in silico multi-model bioinformatics analysis was used to prioritize the identified variants in LRRK1 for functional follow-up. However, protein expression, subcellular localization, and cell viability were not affected by the identified variants. Although it has yet to be proven conclusively that variants in LRRK1 are indeed causative of PD, our data strengthen a possible role for LRRK1, in addition to LRRK2, in the genetic underpinnings of PD but, at the same time, highlight the difficulties encountered in the study of rare variants identified by next-generation sequencing in diseases with autosomal dominant or complex patterns of inheritance.

Relevance:

10.00%

Publisher:

Abstract:

Performance modelling is a useful tool in the lifecycle of high-performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model for a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model, which is illustrative of the computation (and communication) involved in climate models. These predictions are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented, using the shallow water model as an example.
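As a sketch of the kind of traditional analytic model referred to above, the following Python snippet estimates the per-timestep cost of a 2-D shallow water kernel as a sum of computation and halo-exchange communication terms. The cost expressions and all coefficients (flop rate, latency, bandwidth) are illustrative assumptions, not measurements from HECToR.

def predict_step_time(nx, ny, flops_per_point, flop_rate,
                      latency, bandwidth, bytes_per_point, n_neighbours=4):
    # Loop-based array updates: work proportional to the number of grid points.
    compute = nx * ny * flops_per_point / flop_rate
    # Nearest-neighbour halo exchange: latency per message plus perimeter data volume.
    halo_bytes = (2 * nx + 2 * ny) * bytes_per_point
    comm = n_neighbours * latency + halo_bytes / bandwidth
    return compute + comm

# Hypothetical parameters for a 1000 x 1000 local domain
t = predict_step_time(nx=1000, ny=1000, flops_per_point=50, flop_rate=5e9,
                      latency=2e-6, bandwidth=5e9, bytes_per_point=24)
print(f"predicted time per step: {t * 1e3:.2f} ms")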

Relevance:

10.00%

Publisher:

Abstract:

The complexity of current and emerging architectures provides users with options about how best to use the available resources, but makes predicting performance challenging. In this work a benchmark-driven model is developed for a simple shallow water code on a Cray XE6 system, to explore how deployment choices such as domain decomposition and core affinity affect performance. The resource sharing present in modern multi-core architectures adds various levels of heterogeneity to the system. Shared resources often include cache, memory, network controllers and, in some cases, floating-point units (as in the AMD Bulldozer), which means that access time depends on the mapping of application tasks and on a core's location within the system. Heterogeneity increases further with the use of hardware accelerators such as GPUs and the Intel Xeon Phi, where many specialist cores are attached to general-purpose cores. This trend towards shared resources and non-uniform cores is expected to continue into the exascale era. The complexity of these systems means that various runtime scenarios are possible, and it has been found that under-populating nodes, altering the domain decomposition and using non-standard task-to-core mappings can dramatically alter performance. Finding this out, however, is often a process of trial and error. To better inform this process, a performance model was developed for a simple regular grid-based kernel code, shallow. The code comprises two distinct types of work: loop-based array updates and nearest-neighbour halo exchanges. Separate performance models were developed for each part, both based on a similar methodology. Application-specific benchmarks were run to measure performance for different problem sizes under different execution scenarios. These results were then fed into a performance model that derives resource usage for a given deployment scenario, interpolating between results as necessary.
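The benchmark-driven part of this approach can be sketched as follows in Python: measured timings for the two work types at a few problem sizes are interpolated to estimate the cost of a new deployment scenario. The benchmark figures below are placeholders for illustration, not Cray XE6 measurements.

import numpy as np

# Hypothetical benchmark results: local grid points per task -> compute time per step (s)
compute_sizes = np.array([250_000, 500_000, 1_000_000, 2_000_000])
compute_times = np.array([0.011, 0.023, 0.047, 0.096])

# Hypothetical halo-exchange benchmarks: message size (bytes) -> exchange time (s)
halo_sizes = np.array([8_000, 16_000, 32_000, 64_000])
halo_times = np.array([2.1e-5, 3.4e-5, 6.0e-5, 1.1e-4])

def predict(local_points, halo_bytes, n_exchanges=4):
    # Interpolate between benchmarked problem sizes to estimate the time per step.
    compute = np.interp(local_points, compute_sizes, compute_times)
    halo = n_exchanges * np.interp(halo_bytes, halo_sizes, halo_times)
    return compute + halo

# Example deployment scenario: 750,000 points per task, 24,000-byte halo messages
print(f"predicted step time: {predict(750_000, 24_000):.4f} s")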

Relevance:

10.00%

Publisher:

Abstract:

We assess the roles of long-lived greenhouse gases and ozone depletion in driving meridional surface pressure gradients in the southern extratropics; these gradients are a defining feature of the Southern Annular Mode. Stratospheric ozone depletion is thought to have caused a strengthening of this mode during summer, with increasing long-lived greenhouse gases playing a secondary role. Using a coupled atmosphere-ocean chemistry-climate model, we show that there is cancellation between the direct, radiative effect of increasing greenhouse gases and the also substantial indirect (chemical and dynamical) feedbacks that greenhouse gases have via their impact on ozone. This sensitivity of the mode to greenhouse gas-induced ozone changes suggests that a consistent implementation of ozone changes due to long-lived greenhouse gases in climate models would benefit the simulation of this important aspect of Southern Hemisphere climate.

Relevance:

10.00%

Publisher:

Abstract:

Purpose: To describe corneal graft survival and visual outcome after therapeutic penetrating keratoplasty in patients with Acanthamoeba keratitis (AK) unresponsive to clinical treatment. Methods: Retrospective study. Thirty-two patients with AK who underwent therapeutic penetrating keratoplasty (tPK) between August 1996 and August 2005 were included. Data relating to clinical features, visual acuity, surgical technique, graft survival and complications were collected. Graft survival was evaluated by the Kaplan-Meier method and comparisons were performed using the log-rank test. Results: Most patients (62.5%) were female. Mean age (± standard deviation) was 35 (± 13) years (range 15-68 years). All patients were contact lens wearers. Eighteen patients (56%) presented with paralytic mydriasis and glaucoma during treatment. Thirteen patients (40%) developed glaucoma after surgery; eight of them (61%) required a second PK because of graft failure. Of the 32 keratoplasty eyes, 56.2% experienced graft failure at some point during follow-up. Forty-five per cent of graft failures occurred before the 12-month follow-up, so 55% remained clear in the first year after surgery. Twelve patients underwent a second PK; seven of these failed and 45% were clear at 1 year. Two patients presented with recurrence of amoebic infection in the graft. There was no significant difference in graft survival when eyes with or without mydriasis were compared (P = 0.40). Eyes with glaucoma presented a significantly shorter graft survival (P = 0.01). Conclusion: Penetrating keratoplasty is a treatment option for eyes with infections unresponsive to clinical treatment. However, graft survival is poor; postoperative glaucoma is frequent and is associated with shorter graft survival.
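For readers unfamiliar with the survival methods named above, a minimal Python sketch of a Kaplan-Meier graft-survival analysis with a log-rank comparison is given below; the lifelines library and the follow-up data are placeholders for illustration and are not the study's data or code.

import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical follow-up times (months) and failure indicators (1 = graft failure)
months_glaucoma = np.array([3, 6, 9, 12, 18, 24, 30])
failed_glaucoma = np.array([1, 1, 1, 1, 0, 1, 0])
months_no_glaucoma = np.array([12, 18, 24, 36, 48, 60, 72])
failed_no_glaucoma = np.array([0, 1, 0, 0, 1, 0, 0])

# Kaplan-Meier estimate of graft survival for one group
kmf = KaplanMeierFitter()
kmf.fit(months_glaucoma, event_observed=failed_glaucoma, label="glaucoma")
print(kmf.survival_function_)

# Log-rank comparison of graft survival with and without glaucoma
result = logrank_test(months_glaucoma, months_no_glaucoma,
                      event_observed_A=failed_glaucoma,
                      event_observed_B=failed_no_glaucoma)
print(f"log-rank p-value: {result.p_value:.3f}")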