984 results for Resource productivity
Abstract:
Understanding how communities assemble is a key challenge in ecology. Conflicting hypotheses suggest that plant traits within communities should show divergence to reflect strategies to reduce competition or convergence to reflect strong selection for the environmental conditions operating. Further hypotheses suggest that plant traits related to productivity show convergence within communities, but those related to disturbance show divergence. Data on functional diversity (FD) of 12 traits from 30 communities, ranging from arable fields and mown and grazed grasslands to moorland and woodland, were employed to test this using randomisation tests and correlation and regression analysis. No traits showed consistent significant convergence or divergence in functional diversity. When correlated to measures of the environment, the most common pattern was for functional diversity to decline (7 out of 12 traits) and the degree of convergence (7 out of 12 traits) to increase as the levels of productivity (measured as primary productivity, soil nitrogen release and vegetation C:N) and disturbance increased. Convergence or a relationship between functional diversity and the environment was not seen for a number of important traits, such as LDMC and SLA, which are considered key predictors of ecosystem function. The analysis indicates that taking into account functional diversity within a system may be a necessary part of predicting the relationship between plant traits and ecosystem function, and that this may be of particular importance within less productive and less disturbed systems. © 2011 Springer-Verlag.
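For readers less familiar with this type of null-model approach, the sketch below illustrates a trait-by-trait randomisation test of convergence versus divergence. It is a minimal illustration only: the FD metric (here simply the variance of trait values), the species pool, and all data are hypothetical assumptions, not the study's actual method or data.

```python
import numpy as np

rng = np.random.default_rng(0)

def trait_fd(trait_values):
    """Functional diversity of one trait within a community, here simply
    the variance of species trait values (a placeholder for whichever
    FD metric a given study uses)."""
    return np.var(trait_values)

def convergence_test(community_traits, pool_traits, n_null=999):
    """Compare observed FD against a null distribution built by drawing
    the same number of species at random from the regional pool.

    Returns the observed FD and the proportion of null FD values below
    it: a small proportion indicates convergence (observed diversity
    lower than expected by chance), a large proportion divergence."""
    obs = trait_fd(community_traits)
    k = len(community_traits)
    null = np.array([
        trait_fd(rng.choice(pool_traits, size=k, replace=False))
        for _ in range(n_null)
    ])
    return obs, (null < obs).mean()

# Hypothetical example: SLA values for a 6-species community drawn from
# a 40-species regional pool, purely for illustration.
pool = rng.lognormal(mean=3.0, sigma=0.4, size=40)
community = rng.choice(pool, size=6, replace=False)
print(convergence_test(community, pool))
```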
Abstract:
To describe the patient demographic characteristics and organisational factors that influence length of stay (LOS) among emergency medical admissions. Also, to describe differences in investigation practice among consultant physicians and to examine the impact of these on LOS.
Abstract:
Many scientific applications are programmed using hybrid programming models that use both message passing and shared memory, due to the increasing prevalence of large-scale systems with multicore, multisocket nodes. Previous work has shown that energy efficiency can be improved using software-controlled execution schemes that consider both the programming model and the power-aware execution capabilities of the system. However, such approaches have focused on identifying optimal resource utilization for one programming model, either shared memory or message passing, in isolation. The potential solution space, and thus the challenge, increases substantially when optimizing hybrid models, since the number of possible resource configurations grows exponentially. Nonetheless, with the accelerating adoption of hybrid programming models, we increasingly need improved energy efficiency in hybrid parallel applications on large-scale systems. In this work, we present new software-controlled execution schemes that consider the effects of dynamic concurrency throttling (DCT) and dynamic voltage and frequency scaling (DVFS) in the context of hybrid programming models. Specifically, we present predictive models and novel algorithms based on statistical analysis that anticipate application power and time requirements under different concurrency and frequency configurations. We apply our models and methods to the NPB MZ benchmarks and selected applications from the ASC Sequoia codes. Overall, we achieve substantial energy savings (8.74 percent on average and up to 13.8 percent) with some performance gain (up to 7.5 percent) or negligible performance loss.
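As a rough illustration of how such a scheme might use predicted time and power values, the sketch below enumerates (thread count, frequency) configurations and selects the one that minimises predicted energy within a performance-loss bound. The prediction functions, configuration ranges, and the 5% slowdown threshold are hypothetical assumptions, not the paper's models or algorithm.

```python
from itertools import product

def choose_configuration(predict_time, predict_power,
                         thread_counts, frequencies,
                         max_slowdown=1.05):
    """Pick the (threads, frequency) pair minimising predicted energy
    (power * time) while keeping the predicted slowdown relative to the
    fastest configuration within max_slowdown.

    predict_time / predict_power are stand-ins for learned models: any
    callables taking (threads, freq) and returning seconds / watts."""
    predictions = {
        (t, f): (predict_time(t, f), predict_power(t, f))
        for t, f in product(thread_counts, frequencies)
    }
    best_time = min(time for time, _ in predictions.values())
    feasible = {
        cfg: (time, power)
        for cfg, (time, power) in predictions.items()
        if time <= max_slowdown * best_time
    }
    return min(feasible, key=lambda cfg: feasible[cfg][0] * feasible[cfg][1])

# Hypothetical analytic models standing in for the statistical ones.
def fake_time(threads, freq):   # predicted runtime in seconds
    return 100.0 / (threads ** 0.8 * freq)

def fake_power(threads, freq):  # predicted average power in watts
    return 40.0 + 5.0 * threads * freq ** 2

cfg = choose_configuration(fake_time, fake_power,
                           thread_counts=[2, 4, 8, 16],
                           frequencies=[1.2, 1.8, 2.4])
print("chosen (threads, GHz):", cfg)
```

In practice the paper's statistical models would take the place of the analytic stand-ins used here.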
Abstract:
BACKGROUND:
A novel online resource has been developed to aid OSCE examiner training, comprising a series of videos of OSCE performances that allow inter-examiner comparison of global grade decisions.
AIMS:
To evaluate this training resource in terms of usefulness and ability to improve examiner confidence in awarding global grades in OSCEs.
METHOD:
Data collected from the first 200 users included global grades awarded, willingness to change grades following peer comparison and confidence in awarding grades before and after training.
RESULTS:
Most (86.5%) agreed that the resource was useful in developing global grade scoring ability in OSCEs, with a significant improvement in confidence in awarding grades after using the training package (p<0.001).
CONCLUSIONS:
This is a useful and effective online training package. As an adjunct to traditional training, it offers a practical solution to the problem of examiner availability.
Abstract:
Lakes in Arctic and subarctic regions display extreme levels of seasonal variation in light, temperature and ice cover. Comparatively little is known regarding the effects of such seasonal variation on the diet and resource use of fish species inhabiting these systems. Variation in the diet of European whitefish Coregonus lavaretus (L.) during periods of ice cover in this region is often regarded as 'common knowledge'; however, this aspect of the species' ecology has not been examined empirically. Here, we outline the differences in invertebrate community structure, fish activity, and resource use of monomorphic whitefish populations between summer (August-September) and winter (February-March) in three subarctic lakes in Finnish Lapland. Benthic macroinvertebrate densities did not exhibit measurable differences between summer and winter. Zooplankton diversity and abundance, and activity levels of all fish species (measured as catch per unit effort), were lower in winter. The summer diet of C. lavaretus was typical of a generalist utilising a variety of prey sources. In winter, its dietary niche was significantly reduced, and the diet was dominated by chironomid larvae in all study sites. Pelagic productivity decreases during winter, and fish species inhabiting these systems are therefore restricted to feeding on benthic prey. Sampling time has a strong effect on our understanding of resource utilisation by whitefish in subarctic lakes and should be taken into account in future studies of these systems. © 2012 John Wiley & Sons A/S.
Abstract:
In this article, the authors provide an overview of the development of a Long-Term Care Best Practise Resource Centre. The results of both a feasibility study and the outcomes of a 1-year demonstration project are presented. The demonstration project involved a hospital as the information service provider and two demonstration sites, a home care service agency and a nursing home, that used the services of the Centre. The goals of the Centre were threefold: provide access to literature for staff in long-term care (LTC) settings; improve the information management skills of health care providers; and support research and the integration of best practices in LTC organizations. The results of the pilot study contributed to the development of a collaborative information access system for LTC clinicians and managers that provides timely, up-to-date information and contributes to improving the quality of care for adults receiving LTC. Based on this demonstration project, strategies for successful innovation in LTC are identified.
Abstract:
Background: Increasing emphasis is being placed on the economics of health care service delivery - including home-based palliative care. Aim: This paper analyzes resource utilization and costs of a shared-care demonstration project in rural Ontario (Canada) from the public health care system's perspective. Design: To provide enhanced end-of-life care, the shared-care approach ensured exchange of expertise and knowledge and coordination of services in line with the understood goals of care. Resource utilization and costs were tracked over the 15-month study period from January 2005 to March 2006. Results: Of the 95 study participants (average age 71 years), 83 had a cancer diagnosis (87%); the non-cancer diagnoses (12 patients, 13%) included mainly advanced heart diseases and COPD. Community Care Access Centre and Enhanced Palliative Care Team-based homemaking and specialized nursing services were the most frequented offerings, followed by equipment/transportation services and palliative care consults for pain and symptom management. Total costs for all patient-related services (in 2007 CAN$) were $1,625,658.07, or $17,112.19 per patient and $117.95 per patient day. Conclusion: While higher than expenditures previously reported for a cancer-only population in an urban Ontario setting, the costs were still within the parameters of the US Medicare Hospice Benefits, on a par with the per diem funding assigned for long-term care homes and lower than both average alternate level of care and hospital costs within the Province of Ontario. The study results may assist service planners in the appropriate allocation of resources and service packaging to meet the complex needs of palliative care populations. © 2012 The Author(s).
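The per-patient and per-diem figures follow directly from the reported totals; the short check below recomputes them (values taken from the abstract; the implied number of patient-days and mean days per patient are derived here and are not stated in the abstract).

```python
total_cost = 1_625_658.07   # CAN$ (2007), all patient-related services
n_patients = 95
per_day    = 117.95         # reported cost per patient day, CAN$

per_patient  = total_cost / n_patients     # ≈ 17,112.19, as reported
patient_days = total_cost / per_day        # implied total patient-days (derived)
mean_stay    = patient_days / n_patients   # implied mean days per patient (derived)

print(f"per patient: CAN${per_patient:,.2f}")
print(f"implied patient-days: {patient_days:,.0f} (about {mean_stay:.0f} days per patient)")
```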
Abstract:
This paper presents a detailed description of health care resource utilisation and costs of a pilot interdisciplinary health care model of palliative home care in Ontario, Canada. The descriptive evaluation entailed examining the use of services and costs of the pilot program: patient demographics, length of stay broken down by disposition (discharged, alive, death), access to services/resources, use of family physician and specialist services, and drug use. There were 434 patients included in the pilot program. Total costs were approximately CAN$2.4 million, the cost per person amounted to approximately CAN$5586.33, and the average length of stay was just over 2 months (64.22 days). One may assume that length of stay would be influenced by the amount of service and support available. Future research might investigate whether in-home palliative care is the most cost-effective and suitable care setting for those patients requiring home care services for expected periods of time. © 2009 SAGE Publications.
Abstract:
Community identities enhance well-being through the provision of social support and feelings of collective efficacy as well as by acting as a basis for collective action and social change. However, the precise mechanisms through which community identity acts to enhance well-being are complicated by stigmatisation, which potentially undermines solidarity and collective action. The present research examines a real-world stigmatised community group in order to investigate: (1) the community identity factors that act to enhance well-being, and (2) the consequences of community identity for community action. Study 1 consisted of a household survey conducted in disadvantaged areas of Limerick city in Ireland. Participants (n=322) completed measures of community identification, social support, collective efficacy, community action, and psychological well-being. Mediation analysis indicated that perceptions of collective efficacy mediated the relationship between identification and well-being. However, levels of self-reported community action were low and unrelated to community identification. In Study 2, twelve follow-up multiple-participant interviews with residents and community group workers were thematically analysed, revealing high levels of stigmatisation and opposition to identity-related collective action. These findings suggest the potential for stigma to reduce collective action through undermining solidarity and social support.
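For illustration, the sketch below shows one common way to estimate the kind of indirect (mediated) effect described in Study 1, using simple OLS regressions and a bootstrap confidence interval. The simulated data, variable names, and choice of bootstrap procedure are assumptions for demonstration only; they are not the study's data or analysis code.

```python
import numpy as np

rng = np.random.default_rng(1)

def ols_slope(y, x, covariate=None):
    """Slope of x in an OLS regression of y on x (plus an optional covariate)."""
    cols = [np.ones_like(x), x]
    if covariate is not None:
        cols.append(covariate)
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

def indirect_effect(identification, efficacy, wellbeing, n_boot=2000):
    """Bootstrap the indirect effect a*b, where a is the path from
    identification to efficacy and b is the path from efficacy to
    well-being controlling for identification. Returns the point
    estimate and a 95% percentile confidence interval."""
    n = len(identification)
    a = ols_slope(efficacy, identification)
    b = ols_slope(wellbeing, efficacy, covariate=identification)
    boot = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        a_b = ols_slope(efficacy[idx], identification[idx])
        b_b = ols_slope(wellbeing[idx], efficacy[idx], covariate=identification[idx])
        boot.append(a_b * b_b)
    low, high = np.percentile(boot, [2.5, 97.5])
    return a * b, (low, high)

# Simulated data with a genuine indirect path, purely for illustration.
n = 322
identification = rng.normal(size=n)
efficacy = 0.5 * identification + rng.normal(size=n)
wellbeing = 0.4 * efficacy + rng.normal(size=n)
print(indirect_effect(identification, efficacy, wellbeing))
```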
Abstract:
As a society, we have a responsibility to provide an inclusive built environment. As part of the need to promote inclusion, there is now a growing trend to place pupils with Special Educational Needs (SEN) into a mainstream school setting. This is often facilitated by providing a specialist SEN resource base located within the mainstream school. Where this is the case, the following paper outlines why the whole school should be considered when locating and implementing a SEN resource base. It also highlights the wider opportunities for enhancing inclusion for SEN pupils when holistic thought is given to the wider context of the resource base. It then sets out a four-stage approach, using the ASD pupil as an illustrative example, to help evaluate the optimum SEN resource base location within a mainstream school setting. Finally, it highlights some benefits and challenges of an enriched school environment for all pupils when genuine inclusion is considered.
Abstract:
Evidence of 11-year Schwabe solar sunspot cycles, the El Niño-Southern Oscillation (ENSO) and the Pacific Decadal Oscillation (PDO) was detected in an annual record of diatomaceous laminated sediments from anoxic Effingham Inlet, Vancouver Island, British Columbia. Radiometric dating and counting of annual varves date the sediments to AD 1947-1993. Intact sediment slabs were X-rayed for sediment structure (lamina thickness and composition based on gray-scale), and subsamples were examined for diatom abundances and for grain size. Wavelet analysis reveals the presence of ~2-3, ~4.5, ~7 and ~9-12-year cycles in the diatom record and a ~11-13-year cycle in the sedimentary varve thickness record. These cycle lengths suggest that both ENSO and the sunspot cycle had an influence on primary productivity and sedimentation patterns. Sediment grain size could not be correlated to the sunspot cycle, although a peak in the grain size data centered around the mid-1970s may be related to the 1976-1977 Pacific climate shift, which occurred when the PDO index shifted from negative (cool conditions) to positive (warm conditions). Additional evidence of the PDO regime shift is found in wavelet and cross-wavelet results for Skeletonema costatum, a weakly silicified variant of S. costatum, annual precipitation and April to June precipitation. Higher spring (April/May) values of the North Pacific High pressure index during sunspot minima suggest that during this time, increased cloud cover and concomitant suppression of the Aleutian Low (AL) pressure system led to strengthened coastal upwelling and enhanced diatom production earlier in the year. These results suggest that the 11-year solar cycle, amplified by cloud cover and upwelling changes, as well as ENSO, exert significant influence on marine primary productivity in the northeast Pacific. The expression of these cyclic phenomena in the sedimentary record was in turn modulated by the phase of the PDO, as indicated by the change in period of ENSO and the suppression of the solar signal in the record after the 1976-1977 regime shift. © 2013 Elsevier Ltd and INQUA. All rights reserved.
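A minimal sketch of the kind of continuous wavelet analysis described is shown below, assuming a Morlet mother wavelet and a purely hypothetical annual series with an imposed ~11-year cycle; it illustrates the technique only and is not the authors' implementation.

```python
import numpy as np

def morlet_power(series, scales, w0=6.0):
    """Wavelet power of an annually sampled series at the given scales
    (in years), using a Morlet mother wavelet. Returns an array of
    shape (len(scales), len(series))."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    n = len(x)
    t = np.arange(n) - (n - 1) / 2.0        # wavelet support in years
    power = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        u = t / s
        wavelet = np.exp(1j * w0 * u) * np.exp(-0.5 * u**2) / np.sqrt(s)
        coeffs = np.convolve(x, wavelet, mode="same")
        power[i] = np.abs(coeffs) ** 2
    return power

# Hypothetical annual varve-thickness series for 1947-1993 with an
# imposed ~11-year cycle plus noise, purely for illustration.
rng = np.random.default_rng(2)
years = np.arange(1947, 1994)
series = np.sin(2 * np.pi * (years - 1947) / 11.0) + 0.5 * rng.normal(size=years.size)

scales = np.arange(2, 16)                   # candidate periods of roughly 2-15 years
power = morlet_power(series, scales)
print("scale (years) with greatest mean power:", scales[power.mean(axis=1).argmax()])
```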
Abstract:
Background: Resource utilisation and direct costs associated with glaucoma progression in Europe are unknown. As the population progressively ages, the economic impact of the disease will increase. Methods: From a total of 1655 consecutive cases, the records of 194 patients were selected and stratified by disease severity. Record selection was based on diagnoses of primary open angle glaucoma, glaucoma suspect, ocular hypertension, or normal tension glaucoma; a minimum of 5 years of follow-up was required. Glaucoma severity was assessed using a six-stage glaucoma staging system based on static threshold visual field parameters. Resource utilisation data were abstracted from the charts and unit costs were applied to estimate direct costs to the payer. Resource utilisation and estimated direct cost of treatment, per person-year, were calculated. Results: A statistically significant increasing linear trend (p = 0.018) in direct cost as disease severity worsened was demonstrated. The direct cost of treatment increased by an estimated €86 for each incremental step, ranging from €455 per person-year for stage 0 to €969 per person-year for stage 4 disease. Medication costs ranged from 42% to 56% of total direct cost for all stages of disease. Conclusions: These results demonstrate for the first time in Europe that resource utilisation and direct medical costs of glaucoma management increase with worsening disease severity. Based on these findings, managing glaucoma and effectively delaying disease progression would be expected to significantly reduce the economic burden of this disease. These data are relevant to general practitioners and healthcare administrators who have a direct influence on the distribution of resources.