947 results for evaluation methods


Relevance: 30.00%

Abstract:

Background: Chronic kidney disease (CKD) and hypertension are global public health problems associated with considerable morbidity, premature mortality and attendant healthcare costs. Previous studies have highlighted that non-invasive examination of the retinal microcirculation can detect microvascular pathology that is associated with systemic disorders of the circulatory system such as hypertension. We examined the associations of retinal vessel caliber (RVC) and fractal dimension (DF) with both hypertension and CKD in elderly Irish nuns.

Methods: Data from 1233 participants in the cross-sectional, observational Irish Nun Eye Study (INES) were included; retinal vascular parameters were measured from digital photographs using a standardized protocol and computer-assisted software. Multivariate regression analyses were used to assess associations with hypertension and CKD, with adjustment for age, body mass index (BMI), refraction, fellow-eye RVC, smoking, alcohol consumption, ischemic heart disease (IHD), cerebrovascular accident (CVA), diabetes and medication use.
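As a hedged illustration of the covariate-adjusted modelling described above (with CRAE as the outcome and hypertension as the exposure, consistent with the effect size being reported in micrometres), the sketch below uses hypothetical file and column names; it is not the study's actual data or code.

```python
# Hedged sketch: covariate-adjusted regression of retinal vessel caliber on
# hypertension, loosely following the adjustment set described above.
# File and column names (crae_um, hypertension, age, bmi, ...) are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ines_retinal.csv")  # hypothetical dataset, one row per participant

model = smf.ols(
    "crae_um ~ hypertension + age + bmi + refraction + fellow_eye_crae"
    " + smoking + alcohol + ihd + cva + diabetes + medication_use",
    data=df,
).fit()

# Adjusted effect of hypertension on CRAE (in um), with its 95% CI
print(model.params["hypertension"])
print(model.conf_int().loc["hypertension"])
```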

Results: In total, 1122 (91%) participants (mean age: 76.3 [range: 56-100] years) had gradable retinal images of sufficient quality for blood vessel assessment. Hypertension was significantly associated with a narrower central retinal arteriolar equivalent (CRAE) in a fully adjusted analysis (P = 0.002; effect size = -2.16 μm; 95% confidence interval [CI]: -3.51, -0.81 μm). No significant associations between other retinal vascular parameters and hypertension, or between any retinal vascular parameters and CKD, were found.

Conclusions: Individuals with hypertension have significantly narrower retinal arterioles, which may afford an earlier opportunity for tailored prevention and treatment to optimize the structure and function of the microvasculature and provide additional clinical utility. No significant associations between retinal vascular parameters and CKD were detected.

Relevance: 30.00%

Abstract:

The influence of mixed hematopoietic chimerism (MC) after allogeneic bone marrow transplantation (BMT) remains unknown. Increasingly sensitive detection methods have shown that MC occurs frequently. We report a highly sensitive novel method to assess MC based on the polymerase chain reaction (PCR). Simple dinucleotide repeat sequences called microsatellites have been found to vary in their repeat number between individuals. We used this variation to type donor-recipient pairs following allogeneic BMT. A panel of seven microsatellites was used to distinguish between donor and recipient cells of 32 transplants. Informative microsatellites were subsequently used to assess MC after BMT in this group of patients. Seventeen of the 32 transplants involved a donor of the opposite sex; hence, cytogenetics and Y chromosome-specific PCR were also used as an index of chimerism in these patients. MC was detected in bone marrow aspirates and peripheral blood in 18 of 32 patients (56%) by PCR. In several cases, only stored slide material was available for analysis, but PCR of microsatellites or Y-chromosomal material could be used successfully to assess the origin of cells in this archival material. Cytogenetic analysis was possible in 17 patients and MC was detected in three patients. Twelve patients received T-cell-depleted marrow and showed a high incidence of MC as revealed by PCR (greater than 80%). Twenty patients received unmanipulated marrow, and while the incidence of MC was lower (44%), this was still a high percentage compared with other studies. Once MC was detected, the percentage of recipient cells tended to increase. However, in patients exhibiting MC who subsequently relapsed, this increase was relatively sudden. The overall level of recipient cells in the group of MC patients who subsequently relapsed was higher than in those who exhibited stable MC. Thus, while the occurrence of MC was not indicative of a poor prognosis per se, sudden increases in the proportion of recipient cells may be a prelude to graft rejection or relapse.
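One common way to quantify mixed chimerism from informative microsatellite markers is to compare the signal of recipient-specific and donor-specific alleles; the sketch below illustrates that calculation with invented values and is not necessarily the method used in this study.

```python
# Hedged sketch: estimating the percentage of recipient cells from the signal
# intensities of informative (recipient-specific vs donor-specific)
# microsatellite alleles. One common quantification approach, shown with
# made-up numbers; not the study's own procedure.

def recipient_percentage(recipient_allele_signal: float,
                         donor_allele_signal: float) -> float:
    """Percent recipient cells for a single informative microsatellite."""
    total = recipient_allele_signal + donor_allele_signal
    if total == 0:
        raise ValueError("No signal detected for this marker")
    return 100.0 * recipient_allele_signal / total

# Average over a small panel of informative markers (values are hypothetical)
markers = [(1200.0, 10800.0), (900.0, 9100.0), (1500.0, 13500.0)]
estimates = [recipient_percentage(r, d) for r, d in markers]
print(sum(estimates) / len(estimates))  # roughly 10% recipient cells -> mixed chimerism
```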

Relevance: 30.00%

Abstract:

Solar heating systems have the potential to be an efficient renewable energy technology, provided they are sized correctly. Sizing a solar thermal system for domestic applications does not warrant the cost of a detailed simulation; as a result, simplified sizing procedures are required. The size of a system depends on a number of variables, including the efficiency of the collector itself, the hot water demand and the solar radiation at a given location. First, Domestic Hot Water (DHW) demand, which varies with time, is assessed using a detailed multi-parameter model. Secondly, the national energy evaluation methodologies are examined from the perspective of solar thermal system sizing. Based on this assessment of the standards, limitations in the evaluation method for solar thermal systems are outlined and an adapted method, specific to the sizing of solar thermal systems, is proposed. The methodology is presented for two common dwelling scenarios. Results showed that it is difficult to achieve a high solar fraction with practical sizes of system infrastructure (storage tanks) for standard domestic properties. However, solar thermal systems can significantly offset the energy loads associated with DHW consumption, particularly when sized appropriately. The presented methodology is valuable for simple solar system design and for the quick comparison of salient criteria.
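A minimal sketch of the kind of simplified sizing check described above: monthly DHW demand versus collector yield, giving a solar fraction. All parameter values and the single-efficiency collector model are illustrative assumptions, not figures from the study.

```python
# Hedged sketch of a simplified solar thermal sizing check: monthly collector
# yield versus domestic hot water (DHW) demand, giving a solar fraction.
# Parameter values and the constant-efficiency collector model are assumptions.

CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def monthly_dhw_demand_kwh(litres_per_day: float, days: int,
                           delta_t: float = 45.0) -> float:
    """Energy needed to heat the month's DHW draw-off by delta_t kelvin."""
    joules = litres_per_day * days * CP_WATER * delta_t
    return joules / 3.6e6  # J -> kWh

def monthly_solar_yield_kwh(area_m2: float, irradiation_kwh_m2: float,
                            collector_efficiency: float = 0.45) -> float:
    """Collector output using a single average efficiency (a crude assumption)."""
    return area_m2 * irradiation_kwh_m2 * collector_efficiency

demand = monthly_dhw_demand_kwh(litres_per_day=120, days=30)
yield_kwh = monthly_solar_yield_kwh(area_m2=4.0, irradiation_kwh_m2=60.0)
solar_fraction = min(1.0, yield_kwh / demand)
print(f"demand={demand:.0f} kWh, yield={yield_kwh:.0f} kWh, SF={solar_fraction:.2f}")
```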

Relevance: 30.00%

Abstract:

Background: Rapid Response Systems (RRS) have been implemented nationally and internationally to improve patient safety in hospital. However, to date the majority of the RRS research evidence has focused on measuring the effectiveness of the intervention on patient outcomes. To evaluate RRS, a multimodal approach has been recommended in order to address the broad range of process and outcome measures needed to determine the effectiveness of the RRS concept. Aim: The aim of this paper is to evaluate the official RRS programme theories, that is, the assumptions about how the programme is meant to work, against actual practice in order to determine what works. Methods: The research design was a multiple case study approach of four wards in two hospitals in Northern Ireland. It followed the principles of realist evaluation research, which allowed empirical data to be gathered to test and refine RRS programme theory [1]. This approach used a variety of mixed methods to test the programme theories, including individual and focus group interviews with a purposive sample of 75 nurses and doctors, observation of ward practices and documentary analysis. The findings from the case studies were analysed and compared within and across cases to identify what works, for whom and in what circumstances. Results: The RRS programme theories were critically evaluated and compared with the study findings to develop a mid-range theory to explain what works, for whom and in what circumstances. The findings on what works suggest that clinical experience, established working relationships, flexible implementation of protocols, ongoing experiential learning, empowerment and pre-emptive management are key to the success of RRS implementation. Conclusion: These findings highlight the combination of factors that can improve the implementation of RRS, and in light of this evidence several recommendations are made to provide policymakers with guidance and direction for their success and sustainability. References: 1. Pawson R and Tilley N. (1997) Realistic Evaluation. Sage Publications; London. Type of submission: Concurrent session. Source of funding: Sandra Ryan Fellowship funded by the School of Nursing & Midwifery, Queen’s University of Belfast

Relevance: 30.00%

Abstract:

Realistic Evaluation of EWS and ALERT: factors enabling and constraining implementation. Background: The implementation of EWS and ALERT in practice is essential to the success of Rapid Response Systems but is dependent upon nurses utilising EWS protocols and applying ALERT best practice guidelines. To date there is limited evidence on the effectiveness of EWS or ALERT, as research has primarily focused on measuring patient outcomes (cardiac arrests, ICU admissions) following the implementation of a Rapid Response Team. Complex interventions in healthcare aimed at changing service delivery and the related behaviour of health professionals require a different research approach to evaluate the evidence. To understand how and why EWS and ALERT work, or might not work, research needs to consider the social, cultural and organisational influences that will impact on successful implementation in practice. This requires a research approach that considers both the processes and outcomes of complex interventions, such as EWS and ALERT, implemented in practice. Realistic Evaluation is such an approach and was used to explain the factors that enable and constrain the implementation of EWS and ALERT in practice [1]. Aim: The aim of this study was to evaluate factors that enabled and constrained the implementation and service delivery of early warning systems (EWS) and ALERT in practice in order to provide direction for enabling their success and sustainability. Methods: The research design was a multiple case study approach of four wards in two hospitals in Northern Ireland. It followed the principles of realist evaluation research, which allowed empirical data to be gathered to test and refine RRS programme theory. This approach used a variety of mixed methods to test the programme theories, including individual and focus group interviews, observation and documentary analysis in a two-stage process. A purposive sample of 75 key informants participated in individual and focus group interviews. Observation and documentary analysis of EWS compliance data and ALERT training records provided further evidence to support or refute the interview findings. Data were analysed using NVivo 8 to categorise interview findings and SPSS for ALERT documentary data. These findings were further synthesised by undertaking a within- and cross-case comparison to explain the factors enabling and constraining EWS and ALERT. Results: A cross-case analysis highlighted similarities, differences and factors enabling or constraining successful implementation across the case study sites. Findings showed that personal (confidence; clinical judgement; personality), social (ward leadership; communication), organisational (workload and staffing issues; pressure from managers to complete EWS audits and targets), educational (constraints on training; no clinical educator on the ward) and cultural (routine tasks delegated) influences impact on EWS and acute care training outcomes. There were also differences noted between medical and surgical wards across both case sites. Conclusions: Realist Evaluation allows refinement and development of the RRS programme theory to explain the realities of practice. These refined RRS programme theories are capable of informing the planning of future service provision and provide direction for enabling their success and sustainability. References: 1. McGaughey J, Blackwood B, O’Halloran P, Trinder TJ & Porter S (2010) A realistic evaluation of Track and Trigger systems and acute care training for early recognition and management of deteriorating ward-based patients. Journal of Advanced Nursing 66 (4), 923-932. Type of submission: Concurrent session. Source of funding: Sandra Ryan Fellowship funded by the School of Nursing & Midwifery, Queen’s University of Belfast

Relevance: 30.00%

Abstract:

Aim: The aim of the study is to evaluate factors that enable or constrain the implementation and service delivery of early warning systems or acute care training in practice. Background: To date there is limited evidence to support the effectiveness of acute care initiatives (early warning systems, acute care training, outreach) in reducing the number of adverse events (cardiac arrest, death, unanticipated Intensive Care admission) through increased recognition and management of deteriorating ward-based patients in hospital [1-3]. The reasons posited are that previous research primarily focused on measuring patient outcomes following the implementation of an intervention or programme without considering the social factors (the organisation, the people, external influences) which may have affected the process of implementation and hence the measured end-points. Further research which considers the social processes is required in order to understand why a programme works, or does not work, in particular circumstances [4]. Method: The design is a multiple case study approach of four general wards in two acute hospitals where Early Warning Systems (EWS) and the Acute Life-threatening Events Recognition and Treatment (ALERT) course have been implemented. Various methods are being used to collect data about individual capacities, interpersonal relationships and institutional balance and infrastructures in order to understand the intended and unintended process outcomes of implementing EWS and ALERT in practice. This information will be gathered from individual and focus group interviews with key participants (ALERT facilitators, nursing and medical ALERT instructors, ward managers, doctors, ward nurses and health care assistants from each hospital); non-participant observation of ward organisation and structure; audit of patients' EWS charts; and audit of the medical notes of patients who deteriorated during the study period to ascertain whether ALERT principles were followed. Discussion and progress to date: This study commenced in January 2007. Ethical approval has been granted and data collection is ongoing, with interviews being conducted with key stakeholders. The findings from this study will provide evidence for policy-makers to make informed decisions regarding the direction for strategic and service planning of acute care services to improve the level of care provided to acutely ill patients in hospital. References: 1. Esmonde L, McDonnell A, Ball C, Waskett C, Morgan R, Rashidian A et al. Investigating the effectiveness of Critical Care Outreach Services: A systematic review. Intensive Care Medicine 2006; 32: 1713-1721. 2. McGaughey J, Alderdice F, Fowler R, Kapila A, Mayhew A, Moutray M. Outreach and Early Warning Systems for the prevention of Intensive Care admission and death of critically ill patients on general hospital wards. Cochrane Database of Systematic Reviews 2007, Issue 3. www.thecochranelibrary.com. 3. Winters BD, Pham JC, Hunt EA, Guallar E, Berenholtz S, Pronovost PJ. Rapid Response Systems: A systematic review. Critical Care Medicine 2007; 35 (5): 1238-43. 4. Pawson R and Tilley N. Realistic Evaluation. London: Sage; 1997

Relevance: 30.00%

Abstract:

Statement of purpose: The purpose of this concurrent session is to present the main findings and recommendations from a five-year study evaluating the implementation of Early Warning Systems (EWS) and the Acute Life-threatening Events: Recognition and Treatment (ALERT) course in Northern Ireland. The presentation will provide delegates with an understanding of the factors that enable and constrain successful implementation of EWS and ALERT in practice in order to provide an impetus for change. Methods: The research design was a multiple case study approach of four wards in two hospitals in Northern Ireland. It followed the principles of realist evaluation research, which allowed empirical data to be gathered to test and refine RRS programme theory [1]. The stages included identifying the programme theories underpinning EWS and ALERT, generating hypotheses, gathering empirical evidence and refining the programme theories. This approach used a variety of mixed methods including individual and focus group interviews, observation and documentary analysis of EWS compliance data and ALERT training records. A within- and across-case comparison facilitated the development of mid-range theories from the research evidence. Results: The official RRS theories developed from the realist synthesis were critically evaluated and compared with the study findings to develop a mid-range theory to explain what works, for whom and in what circumstances. The findings on what works suggest that clinical experience, established working relationships, flexible implementation of protocols, ongoing experiential learning, empowerment and pre-emptive management are key to the success of EWS and ALERT implementation. Each concept is presented as ‘context, mechanism and outcome configurations’ to provide an understanding of how the context impacts on individual reasoning or behaviour to produce certain outcomes. Conclusion: These findings highlight the combination of factors that can improve the implementation and sustainability of EWS and ALERT, and in light of this evidence several recommendations are made to provide policymakers with guidance and direction for future policy development. References: 1. Pawson R and Tilley N. (1997) Realistic Evaluation. Sage Publications; London. Type of submission: Concurrent session. Source of funding: Sandra Ryan Fellowship funded by the School of Nursing & Midwifery, Queen’s University of Belfast

Relevance: 30.00%

Abstract:

Background: People with intellectual disabilities often present with unique challenges that make it more difficult to meet their palliative care needs.
Aim: To define consensus norms for palliative care of people with intellectual disabilities in Europe.
Design: Delphi study in four rounds: (1) a taskforce of 12 experts from seven European countries drafted the norms, based on available empirical knowledge and regional/national guidelines; (2) using an online survey, 34 experts from 18 European countries evaluated the draft norms, provided feedback and distributed the survey within their professional networks. Criteria for consensus were clearly defined; (3) modifications and recommendations were made by the taskforce; and (4) the European Association for Palliative Care reviewed and approved the final version.
Setting and participants: Taskforce members: identified through international networking strategies. Expert panel: a purposive sample identified through taskforce members’ networks.
Results: A total of 80 experts from 15 European countries evaluated 52 items within the following 13 norms: equity of access, communication, recognising the need for palliative care, assessment of total needs, symptom management, end-of-life decision making, involving those who matter, collaboration, support for family/carers, preparing for death, bereavement support, education/training and developing/managing services. None of the items scored less than 86% agreement, making a further round unnecessary. In light of respondents’ comments, several items were modified and one item was deleted.
Conclusion: This White Paper presents the first guidance for clinical practice, policy and research related to palliative care for people with intellectual disabilities based on evidence and European consensus, setting a benchmark for changes in policy and practice.
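As an illustration of how per-item agreement in a Delphi round can be checked against a consensus threshold (the study reports that no item fell below 86% agreement), the sketch below uses invented ratings and an assumed cut-off; it is not the study's instrument or criteria.

```python
# Hedged sketch: per-item percentage agreement in a Delphi round, checked
# against an assumed consensus threshold. Ratings and the cut-off are
# illustrative, not data from the study.
from typing import List

def percent_agreement(ratings: List[int], agree_levels=(4, 5)) -> float:
    """Share of experts rating the item 'agree' or 'strongly agree' on a 1-5 Likert scale."""
    agreeing = sum(1 for r in ratings if r in agree_levels)
    return 100.0 * agreeing / len(ratings)

items = {
    "equity of access": [5, 5, 4, 4, 5, 3, 4, 5],
    "symptom management": [4, 5, 5, 4, 4, 4, 5, 2],
}
THRESHOLD = 75.0  # assumed consensus cut-off; thresholds vary between Delphi studies
for name, ratings in items.items():
    pct = percent_agreement(ratings)
    verdict = "consensus" if pct >= THRESHOLD else "revise"
    print(f"{name}: {pct:.0f}% agreement -> {verdict}")
```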

Relevance: 30.00%

Abstract:

BACKGROUND: Diabetic retinopathy is an important cause of visual loss. Laser photocoagulation preserves vision in diabetic retinopathy but is currently used at the stage of proliferative diabetic retinopathy (PDR).

OBJECTIVES: The primary aim was to assess the clinical effectiveness and cost-effectiveness of pan-retinal photocoagulation (PRP) given at the non-proliferative stage of diabetic retinopathy (NPDR) compared with waiting until the high-risk PDR (HR-PDR) stage was reached. There have been recent advances in laser photocoagulation techniques, and in the use of laser treatments combined with anti-vascular endothelial growth factor (VEGF) drugs or injected steroids. Our secondary questions were: (1) If PRP were to be used in NPDR, which form of laser treatment should be used? and (2) Is adjuvant therapy with intravitreal drugs clinically effective and cost-effective in PRP?

ELIGIBILITY CRITERIA: Randomised controlled trials (RCTs) for efficacy, but other designs were also used.

REVIEW METHODS: Systematic review and economic modelling.

RESULTS: The Early Treatment Diabetic Retinopathy Study (ETDRS), published in 1991, was the only trial designed to determine the best time to initiate PRP. It randomised one eye of 3711 patients with mild-to-severe NPDR or early PDR to early photocoagulation, and the other to deferral of PRP until HR-PDR developed. The risk of severe visual loss after 5 years for eyes assigned to PRP for NPDR or early PDR, compared with deferral of PRP, was reduced by 23% (relative risk 0.77, 99% confidence interval 0.56 to 1.06). However, the ETDRS did not provide results separately for NPDR and early PDR. In economic modelling, the base case found that early PRP could be more effective and less costly than deferred PRP. Sensitivity analyses gave similar results, with early PRP continuing to dominate or having a low incremental cost-effectiveness ratio. However, there are substantial uncertainties. For our secondary aims we found 12 trials of lasers in diabetic retinopathy, with 982 patients in total (individual trial sizes ranged from 40 to 150). Most were in PDR, but five included some patients with severe NPDR. Three compared multi-spot pattern lasers against the argon laser. RCTs comparing laser applied in a lighter manner (less-intensive burns) with conventional methods (more intense burns) reported little difference in efficacy but fewer adverse effects. One RCT suggested that selective laser treatment targeting only ischaemic areas was effective. Observational studies showed that the most important adverse effect of PRP was macular oedema (MO), which can cause visual impairment, usually temporary. Ten trials of laser and anti-VEGF or steroid drug combinations were consistent in reporting a reduction in the risk of PRP-induced MO.
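The base-case result described above (early PRP both more effective and less costly than deferred PRP, i.e. dominant) can be illustrated with a minimal cost-effectiveness comparison. The sketch below is a hedged illustration with placeholder costs and QALYs, not figures from the assessment's economic model.

```python
# Hedged sketch of a dominance / ICER comparison between two strategies,
# e.g. early PRP (A) versus deferred PRP (B). All numbers are placeholders.

def compare(cost_a: float, qaly_a: float, cost_b: float, qaly_b: float) -> str:
    """Plain-language comparison of strategy A against strategy B."""
    d_cost, d_qaly = cost_a - cost_b, qaly_a - qaly_b
    if d_cost <= 0 and d_qaly >= 0:
        return "A dominates B (no more costly and at least as effective)"
    if d_qaly == 0:
        return "Equal effect; choose the cheaper strategy"
    return f"ICER = {d_cost / d_qaly:.0f} per QALY gained"

# Early PRP (A) vs deferred PRP (B) with purely illustrative values
print(compare(cost_a=2400.0, qaly_a=8.10, cost_b=2550.0, qaly_b=8.05))
```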

LIMITATION: The current evidence is insufficient to recommend PRP for severe NPDR.

CONCLUSIONS: There is, as yet, no convincing evidence that modern laser systems are more effective than the argon laser used in the ETDRS, but they appear to have fewer adverse effects. We recommend a trial of PRP for severe NPDR and early PDR compared with deferring PRP until the HR-PDR stage. The trial would use modern laser technologies and investigate the value of adjuvant prophylactic anti-VEGF or steroid drugs.

STUDY REGISTRATION: This study is registered as PROSPERO CRD42013005408.

FUNDING: The National Institute for Health Research Health Technology Assessment programme.

Relevance: 30.00%

Abstract:

PURPOSE: To establish the relationship between myopia and lens opacity. DESIGN: Population-based cross-sectional study. PARTICIPANTS: Two thousand five hundred twenty participants from the Salisbury Eye Evaluation aged 65 to 84 years. METHODS: Participants filled out questionnaires regarding medical history, social habits, and a detailed history of distance spectacle wear. They underwent a full ocular examination. Lens photographs were taken for assessment of lens opacity using the Wilmer grading system. Multivariate logistic regression models using generalized estimating equations were used to analyze the relationship between lens opacity type and degree of myopia, while accounting for potential confounders. MAIN OUTCOME MEASURES: Presence of posterior subcapsular opacity, cortical opacity, or nuclear opacity. RESULTS: Significant associations were found between myopia and both nuclear and posterior subcapsular opacities. For nuclear opacity, the odds ratios (ORs) were 2.25 for myopia between -0.50 diopters (D) and -1.99 D (P<0.001), 3.65 for myopia between -2.00 D and -3.99 D (P<0.001), 4.54 for myopia between -4.00 D and -5.99 D (P<0.001), and 3.61 for myopia -6.00 D or more (P = 0.002). For posterior subcapsular cataracts, ORs were 1.59 for myopia between -0.50 D and -1.99 D (P = 0.11), 3.22 for myopia between -2.00 D and -3.99 D (P = 0.002), 5.36 for myopia between -4.00 D and -5.99 D (P<0.001), and 12.34 for myopia -6.00 D or more (P<0.001). No association was found between myopia and cortical opacity. The association between posterior subcapsular opacity and myopia was equally strong for those wearing glasses by age 21 years and for those without glasses; for nuclear opacity, significantly higher ORs were found for myopes who started wearing glasses after age 21. CONCLUSIONS: These results confirm the previously reported association between myopia, posterior subcapsular opacity, and nuclear opacity. Furthermore, the strong association between early spectacle wear and posterior subcapsular opacity among myopes, absent for nuclear opacity, suggests that myopia may precede opacity in the case of posterior subcapsular opacity, but not nuclear opacity. Measures of association between posterior subcapsular opacity and myopia were stronger in the current study than have previously been found. Longitudinal studies to confirm the association are warranted.
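The analysis described above uses multivariate logistic regression with generalized estimating equations (GEE) to relate myopia category to opacity type while accounting for the correlation between a participant's two eyes. A minimal sketch of that kind of model is shown below; the file name, column names, categories and covariates are hypothetical, not the study's specification.

```python
# Hedged sketch: GEE logistic regression of nuclear opacity on myopia category,
# with an exchangeable working correlation for the two eyes of each subject.
# File and column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("see_lens_opacity.csv")  # hypothetical file, one row per eye

model = smf.gee(
    "nuclear_opacity ~ C(myopia_cat, Treatment('none')) + age + smoking",
    groups="subject_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
).fit()

# Odds ratios with 95% confidence intervals
print(np.exp(model.params))
print(np.exp(model.conf_int()))
```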

Relevance: 30.00%

Abstract:

PURPOSE:

To compare the outcomes of cataract surgery performed in 3 groups defined by phacoemulsification incision size (1.8, 2.2, and 3.0 mm).

DESIGN:

Prospective randomized comparative study.

METHODS:

One hundred twenty eyes of 120 patients with age-related cataract (grades 2 to 4) were categorized according to the Lens Opacities Classification System III. Eligible subjects were randomly assigned to 3 surgical groups using coaxial phacoemulsification through 3 clear corneal incision sizes (1.8, 2.2, and 3.0 mm). Different intraoperative and postoperative outcome measures were obtained, with corneal incision size and surgically induced astigmatism as the main clinical outcomes.

RESULTS:

There were no statistically significant differences in most of the intraoperative and postoperative outcome measures among the 3 groups. However, the mean cord length of the clear corneal incision was increased in each group after surgery. The mean maximal clear corneal incision thickness in the 1.8-mm group was significantly greater than for the other groups at 1 month. The mean surgically induced astigmatism in the 1.8- and 2.2-mm groups was significantly less than that in the 3.0-mm group after 1 month, without significant difference between the 1.8- and 2.2-mm groups.
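The abstract does not state how surgically induced astigmatism was calculated; a common approach is double-angle vector analysis of the pre- and postoperative corneal astigmatism, sketched below with made-up values as an assumption about method rather than a reported detail.

```python
# Hedged sketch: surgically induced astigmatism (SIA) via standard double-angle
# vector analysis of preoperative vs postoperative corneal astigmatism.
# One common calculation, shown with invented values; not necessarily the
# method used in this study.
import math

def sia(pre_mag_d: float, pre_axis_deg: float,
        post_mag_d: float, post_axis_deg: float):
    """Return (magnitude in dioptres, axis in degrees) of the induced astigmatism vector."""
    # Convert each astigmatism to double-angle Cartesian coordinates
    px = pre_mag_d * math.cos(math.radians(2 * pre_axis_deg))
    py = pre_mag_d * math.sin(math.radians(2 * pre_axis_deg))
    qx = post_mag_d * math.cos(math.radians(2 * post_axis_deg))
    qy = post_mag_d * math.sin(math.radians(2 * post_axis_deg))
    dx, dy = qx - px, qy - py
    magnitude = math.hypot(dx, dy)
    axis = (math.degrees(math.atan2(dy, dx)) / 2.0) % 180.0
    return magnitude, axis

# Example with invented pre/post keratometric astigmatism
print(sia(pre_mag_d=0.75, pre_axis_deg=90, post_mag_d=1.00, post_axis_deg=85))
```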

CONCLUSIONS:

With appropriate equipment, smaller incisions may result in less astigmatism, but the particular system used will influence incision stress and wound integrity, and may thus limit the reduction in incision size and astigmatism that is achievable.

Relevance: 30.00%

Abstract:

Background

Specialty Registrars in Restorative Dentistry (StRs) should be competent in the independent restorative management of patients with developmental disorders, including hypodontia and cleft lip/palate, upon completion of their specialist training [1]. Knowledge and management may be assessed via the Intercollegiate Specialty Fellowship Examination (ISFE) in Restorative Dentistry [2].

Objective

The aim of this study was to collate and compare data on the training and experience of StRs in the management of patients with developmental disorders across different training units within the British Isles.

Methods

Questionnaires were distributed to all StRs attending the Annual General Meeting of the Specialty Registrars in Restorative Dentistry Group, Belfast, in October 2015. Participants were asked to rate their confidence and experience of assessing and planning treatment for patients with developmental disorders, construction of appropriate prostheses, and provision of dental implants. Respondents were also asked to record clinical supervision and didactic teaching at their unit, and to rate their confidence of passing a future ISFE station assessing knowledge of developmental disorders.

Results

Responses were obtained from 32 StRs training within all five countries of the British Isles. The majority of respondents were based in England (72%), with three in Wales and two in each of Scotland, Northern Ireland and the Republic of Ireland. Approximately one third of respondents (34%) were in the final years of training (years 4-6). Almost half of the StRs (44%) reported that they were not confident in independently assessing new patients with a developmental disorder, and a larger proportion (72%) indicated a lack of confidence in treatment planning. Six respondents rated their experience of treating obturator patients as ‘poor’ or ‘very poor’. The majority (56%) rated their experience of implant provision in these cases as ‘good’ or ‘excellent’, with three-quarters (75%) rating clinical supervision at their unit as ‘good’ or ‘excellent’. Less than half (41%) rated the didactic teaching at their unit as ‘good’ or ‘excellent’, and only 8 StRs indicated that they were confident of passing an ISFE station focused on developmental disorders.

Conclusion

Experience and training regarding patients with developmental disorders are inconsistent for StRs across the British Isles, with a number of trainees reporting a lack of clinical exposure.

Relevance: 30.00%

Abstract:

The European “Community Bureau of Reference” (BCR) sequential extraction procedure, the diffusive gradients in thin films (DGT) technique and a physiologically based extraction test were applied to assess metal bioavailability in sediments of Lake Taihu (n = 13). Findings from the three methods showed that Cd was a significant problem in the western lake, whereas Cu, Zn and Ni pollution was most severe in the northern lake. Results from the sequential extraction revealed that more than 50 % of the Cu and Zn were highly mobile, falling within the extractable fraction (AS1 + FM2 + OS3), in the majority of the sediments; in contrast, the extractable fractions of Ni and Cd were lower than 50 % at most of the sampling sites. Average Cu, Zn, Ni and Cd bioaccessibilities were <50 % in the gastric phase. Zn and Cd bioaccessibility in the intestinal phase was ∼50 % lower than in the gastric phase, while the bioaccessibilities of Cu and Ni were 47–57 % greater than in the gastric phase. Linear regression analysis between the DGT and BCR measurements indicated that the extractable fractions (AS1 + FM2 + OS3) in the reducing environment were the main source of DGT uptake, suggesting that DGT is a good in situ evaluation tool for metal bioavailability in sediments.
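As a rough illustration of how the BCR extractable fraction (AS1 + FM2 + OS3) and its relationship to DGT-labile metal might be computed, the sketch below uses invented concentrations; it is not the study's data or analysis code.

```python
# Hedged sketch: mobile (extractable) fraction from the three BCR steps and a
# simple linear fit between that fraction and DGT-measured labile metal.
# All sample values are invented for illustration only.
from scipy.stats import linregress

def extractable_fraction_pct(as1: float, fm2: float, os3: float,
                             residual: float) -> float:
    """(acid-soluble + reducible + oxidisable) as a share of the total metal."""
    total = as1 + fm2 + os3 + residual
    return 100.0 * (as1 + fm2 + os3) / total

# Cu in three hypothetical sediments (mg/kg per BCR step, plus residual)
bcr = [extractable_fraction_pct(18, 25, 14, 43),
       extractable_fraction_pct(30, 41, 22, 37),
       extractable_fraction_pct(9, 15, 10, 66)]
dgt = [0.8, 1.6, 0.4]  # DGT-labile Cu, hypothetical units

fit = linregress(bcr, dgt)
print(f"slope={fit.slope:.3f}, r={fit.rvalue:.2f}")
```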

Relevance: 30.00%

Abstract:

BACKGROUND: Healthcare integration is a priority in many countries, yet there remains little direction on how to systematically evaluate this construct to inform further development. The examination of community-based palliative care networks provides an ideal opportunity for the advancement of integration measures, in consideration of how fundamental provider cohesion is to effective care at end of life.

AIM: This article presents a variable-oriented analysis from a theory-based case study of a palliative care network to help bridge the knowledge gap in integration measurement.

DESIGN: Data from a mixed-methods case study were mapped to a conceptual framework for evaluating integrated palliative care and a visual array depicting the extent of key factors in the represented palliative care network was formulated.

SETTING/PARTICIPANTS: The study included data from 21 palliative care network administrators, 86 healthcare professionals, and 111 family caregivers, all from an established palliative care network in Ontario, Canada.

RESULTS: The framework used to guide this research proved useful in assessing qualities of integration and functioning in the palliative care network. The resulting visual array of elements illustrates that while this network performed relatively well at the multiple levels considered, room for improvement exists, particularly in terms of interventions that could facilitate the sharing of information.

CONCLUSION: This study, along with the other evaluative examples mentioned, represents an important initial attempt at empirically and comprehensively examining network-integrated palliative care and healthcare integration in general.

Relevance: 30.00%

Abstract:

This chapter focuses on the development of organizational creativity using the CPS methodology, aiming to demonstrate its effectiveness in improving individual and team divergent thinking when identifying organizational problems. A study was undertaken using problem-solving teams in seven companies, in which each individual completed a pre-post test of attitudes towards divergent thinking and was asked to give an evaluation of the method. All the information reported in the sessions was recorded. The results indicate a change in attitude favourable to divergent thinking; the provision of a professional, efficient method of organizing knowledge in a way that can help individuals find original solutions to problems; and an important means of leading teams towards creativity and innovation, in line with companies' different orientations.
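One plausible way to analyse the pre-post attitude test mentioned above is a paired comparison of each participant's scores before and after the CPS sessions; the sketch below uses invented scores and is not the chapter's actual analysis.

```python
# Hedged sketch: paired comparison of pre- vs post-session attitude scores
# towards divergent thinking. Scores are invented for illustration only.
from scipy.stats import ttest_rel

pre  = [3.1, 2.8, 3.5, 3.0, 2.6, 3.2, 2.9, 3.4]
post = [3.6, 3.1, 3.9, 3.4, 3.0, 3.5, 3.2, 3.8]

result = ttest_rel(post, pre)
mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"mean change = {mean_change:.2f}, t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```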