818 results for MANAGEMENT OF HEALTH SERVICES
Level of contribution of intrinsic risk factors to the management of patients with plantar heel pain
Abstract:
Introduction: Injuries of the lower extremity are considered to have multifactorial causes, and heel pain represents the most frequent cause of visits to health professionals. Managing these patients can be very difficult. The purpose of this research is to identify key variables that can influence foot health in patients with heel pain. Materials and method: A cross-sectional observational study was carried out with a sample of sixty-two participants recruited from the Educational Welfare Unit of the University of Malaga. Therapists, blinded to the study, recorded anthropometric data and the Foot Posture Index (FPI), while participants completed the Foot Health Status Questionnaire (FHSQ). The most significant results reveal a moderate relationship between the clinical variables and the FHSQ domains, with BMI making the largest contribution to the foot health status questionnaire scores. Conclusion: The variables that can help in managing clinical subjects with heel pain are age, BMI, footwear and FPI (left foot).
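The kind of analysis described in this abstract, estimating how much each intrinsic factor contributes to a foot-health score, can be illustrated with a simple multiple regression. The sketch below is only illustrative: the column names, the coding of footwear, and the synthetic data are assumptions, not the study's dataset or its exact statistical model.

```python
# Illustrative sketch (not the study's data or code): how much do intrinsic
# factors (age, BMI, footwear, Foot Posture Index) contribute to an
# FHSQ-style foot-health score?  All columns and values are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 62  # sample size reported in the abstract
df = pd.DataFrame({
    "age": rng.integers(18, 75, n),
    "bmi": rng.normal(27, 4, n),
    "footwear_score": rng.integers(0, 3, n),   # hypothetical footwear coding
    "fpi_left": rng.integers(-2, 10, n),       # Foot Posture Index, left foot
})
# Hypothetical FHSQ-style outcome (0-100); in the study this would be measured.
df["fhsq_foot_health"] = 90 - 1.2 * df["bmi"] - 0.2 * df["age"] + rng.normal(0, 8, n)

X = sm.add_constant(df[["age", "bmi", "footwear_score", "fpi_left"]])
model = sm.OLS(df["fhsq_foot_health"], X).fit()
print(model.summary())  # coefficients indicate each factor's contribution
```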
Abstract:
Phosphine is the primary fumigant used to protect the majority of the world's grain and a variety of other stored commodities from insect pests. Phosphine is playing an increasingly important role in the protection of commodities for two primary reasons. Firstly, use of the alternative fumigant, methyl bromide, has been sharply curtailed and is tightly regulated due to its role in ozone depletion, and secondly, consumers are becoming increasingly intolerant of contact pesticides. Niche alternatives to phosphine exist, but they suffer from a range of factors that limit their use, including: 1) Limited commercial adoption due to expense or slow mode of action; 2) Poor efficacy due to low toxicity, rapid sorption, limited volatility or high density; 3) Public health concerns due to toxicity to handlers or nearby residents, as well as risk of explosion; 4) Poor consumer acceptance due to toxic residues or smell. These same factors limit the prospects of quickly identifying and deploying a new fumigant. Given that resistance toward phosphine is increasing among insect pests, improved monitoring and management of resistance is a priority. Knowledge of the mode of action of phosphine as well as the mechanisms of resistance may also greatly reduce the effort and expense of identifying synergists or novel replacement compounds.
Abstract:
1. The successful introduction of the red fox Vulpes vulpes into Australia in the 1870s has had dramatic and deleterious impacts on both native fauna and agricultural production. Historical accounts detail how the arrival of foxes in many areas coincided with the local demise of native fauna. Recent analyses suggest that native fauna can be successfully reintroduced to their former ranges only if foxes have been controlled, and several replicated removal experiments have confirmed that foxes are the major agents of extirpation of native fauna. Predation is the primary cause of losses, but competition and transmission of disease may be important for some species. 2. In agricultural landscapes, fox predation on lambs can cause losses of 1–30%; variation is due to flock size, health and management, as well as differences in the timing and duration of lambing and the density of foxes. 3. Fox control measures include trapping, shooting, den fumigation and exclusion fencing; baiting using the toxin 1080 is the most commonly employed method. Depending on the baiting strategy, habitat and area covered, baiting can reduce fox activity by 50–97%. We review patterns of baiting in a large sheep-grazing region in central New South Wales, and propose guidelines to increase landholder awareness of baiting strategies, to concentrate and coordinate bait use, and to maximize the cost-effectiveness of baiting programs. 4. The variable reduction in fox density within the baited area, together with the ability of the fox to recolonize rapidly, suggests that current baiting practices in eastern Australia are often ineffective, and that reforms are required. These might include increasing landholder awareness and involvement in group control programs, and the use of more efficient broadscale techniques, such as aerial baiting.
Abstract:
Background: In a small, low socioeconomic status rental retirement village, we have shown that the older-aged managed their medicines poorly [1]. Objective: As the number of participants was only 25, and the population in the rental retirement village turns over regularly, our objective was to determine whether the findings were consistent and ongoing. Methods: We returned to the rental retirement village after one and two years, and reassessed the management of medicines using the same semi-structured interview method. Main outcome measure: The main outcome measure was the perception of present and ongoing adherence. Results: Although similar numbers (23-25) participated in the studies in 2011-2013, the actual participants changed, with only 3 being interviewed on all 3 occasions. Nevertheless, the findings over the 3 years were similar: less than 50% of the participants were adherent at the time of the study and considered unlikely to have problems in the next 6-12 months; only 50% had a good knowledge of their illnesses. Conclusions: The management of medicines by the older-aged living in a low socioeconomic status rental retirement village is poor, and this finding is ongoing and consistent. This supports the need for extra assistance and resources for the older-aged living in rental retirement villages to manage their medicines.
Abstract:
This project built upon the successful outcomes of a previous project (TU02005) by adding to the database of salt tolerance among warm season turfgrass cultivars, through further hydroponic screening trials. Hydroponic screening trials focussed on new cultivars or cultivars that were not possible to cover in the time available under TU02005, including: 11 new cultivars of Paspalum vaginatum; 13 cultivars of Cynodon dactylon; six cultivars of Stenotaphrum secundatum; one accession of Cynodon transvaalensis; 12 Cynodon dactylon x transvaalensis hybrids; two cultivars of Sporobolus virginicus; five cultivars of Zoysia japonica; one cultivar of Z. macrantha; one common form of Z. tenuifolia; and one Z. japonica x tenuifolia hybrid. The relative salinity tolerance of different turfgrasses is quantified in terms of their growth response to increasing levels of salinity, often defined by the salt level that equates to a 50% reduction in shoot yield, or alternatively the threshold salinity (a simple illustration of this calculation follows the abstract). The most salt-tolerant species in these trials were Sporobolus virginicus and Paspalum vaginatum, consistent with the findings from TU02005 (Loch, Poulter et al. 2006). Cynodon dactylon showed the largest range in threshold values, with some cultivars highly sensitive to salt while others were tolerant to levels approaching those of the more halophytic grasses. Coupled with the observational and anecdotal evidence of high drought tolerance, this species and other intermediately tolerant species provide options for site-specific situations in which soil salinity is coupled with additional challenges such as shade and high traffic conditions. By recognising that a salt-tolerant grass is not the complete solution to salinity problems, this project has been able to further investigate sustainable long-term establishment and management practices that maximise the ability of the selected grass to survive and grow under a particular set of salinity and usage parameters. Salt-tolerant turf grasses with potential for special use situations were trialled under field conditions at three sites within the Gold Coast City Council, while three sites established under TU02005 within the Redland City Council boundaries were monitored for continued grass survival. Several randomised block experiments within Gold Coast City were established to compare the health and longevity of seashore paspalum (Paspalum vaginatum) and Manila grass (Zoysia matrella), as well as the more tolerant cultivars of other species like buffalo grass (Stenotaphrum secundatum) and green couch (Cynodon dactylon). Whilst scientific results were difficult to achieve in the field situation, where conditions cannot be controlled, these trials provided valuable observational evidence of the likely survival of these species. Alternatives to laying full sod, such as sprigging, were investigated and were found to be more appropriate for areas of low traffic, as the establishment time is greater. Trials under controlled and protected conditions successfully achieved a full cover of Paspalum vaginatum from sprigs in a 10-week time frame. Salt-affected sites are often associated with poor soil structure. Part of the research investigated techniques for the alleviation of soil compaction frequently found on saline sites. Various methods of soil de-compaction were investigated on highly compacted heavy clay soil in Redland City.
It was found that the heavy duplex soil of marine clay sediments required the most aggressive of treatments to achieve even limited short-term effects. Interestingly, a well-constructed sports field showed a far greater and longer-term response to de-compaction operations, highlighting the importance of appropriate construction in the successful establishment and management of turfgrasses on salt-affected sites. Fertiliser trials in this project determined plant demand for nitrogen (N) to species level. This work produced data that can be used as a guide when fertilising, in order to produce optimal growth and quality in the major turf grass species used in public parkland. An experiment, commenced during TU02005 and monitored further in this project, investigated six representative warm-season turfgrasses to determine the optimum maintenance requirements for fertiliser N in south-east Queensland. In doing so, we recognised that the optimum level is also related to use and intensity of use, with high-profile, well-used parks requiring higher maintenance N than low-profile parks where maintaining botanical composition at a lower level of turf quality might be acceptable. Kikuyu (Pennisetum clandestinum) seemed to require the greatest N input (300-400 kg N/ha/year), followed by the green couch (Cynodon dactylon) cultivars ‘Wintergreen’ and ‘FloraTeX’, requiring approximately 300 kg N/ha/year for optimal condition and growth. ‘Sir Walter’ (Stenotaphrum secundatum) and ‘Sea Isle 1’ (Paspalum vaginatum) had a moderate requirement of approximately 200 kg N/ha/year. ‘Aussiblue’ (Digitaria didactyla) maintained optimal growth and quality at 100-200 kg N/ha/year. A set of guidelines has been prepared to provide various options, from the construction and establishment of new grounds through to the remediation of existing parklands by supporting the growth of endemic grasses. They describe a best management process through which salt-affected sites should be assessed, remediated and managed. These guidelines, or Best Management Practices, will be readily available to councils. Previously, some high-salinity sites have been turfed several times over a number of years (and Council budgets) with a 100% failure record. By eliminating this budgetary waste through targeted, workable solutions, local authorities will be more amenable to investing appropriate amounts into these areas. In some cases, this will lead to cost savings as well as better quality turf. In all cases, however, improved turf quality will benefit ratepayers, directly through increased local use of open space in parks and sportsfields, and indirectly by attracting tourists and other visitors to the region, bringing associated economic benefits. At the same time, environmental degradation and erosion of soil in bare areas will be greatly reduced.
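The salinity-tolerance index used in the hydroponic trials, the salt level corresponding to a 50% reduction in shoot yield, can be illustrated with a simple interpolation over a dose-response series. The sketch below uses invented yield data purely as an example; the salinity levels, units and yields are assumptions, not values from the project.

```python
# Illustrative sketch of the "50% shoot-yield reduction" measure of salinity
# tolerance described above.  Salinity levels and yields are hypothetical
# example data, not results from the trials.
import numpy as np

salinity = np.array([0, 5, 10, 20, 30, 40])          # dS/m, increasing salt levels
shoot_yield = np.array([100, 96, 85, 62, 41, 22])    # % of unsalinised control

target = 50.0  # yield level defining the tolerance index (50% of control)
# np.interp needs increasing x-values, so interpolate on the reversed series.
ec50 = np.interp(target, shoot_yield[::-1], salinity[::-1])
print(f"Estimated salinity for a 50% yield reduction: {ec50:.1f} dS/m")
```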
Abstract:
Aims: The aims of this study were 1) to identify and describe health economic studies that have used quality-adjusted life years (QALYs) based on actual measurements of patients' health-related quality of life (HRQoL); 2) to test the feasibility of routine collection of HRQoL data as an indicator of the effectiveness of secondary health care; and 3) to establish and compare the cost-utility of three large-volume surgical procedures in a real-world setting in the Helsinki University Central Hospital, a large referral hospital providing secondary and tertiary health-care services for a population of approximately 1.4 million. Patients and methods: To identify studies that have used QALYs as an outcome measure, a systematic search of the literature was performed using the Medline, Embase, CINAHL, SCI and Cochrane Library electronic databases. Initial screening of the identified articles involved two reviewers independently reading the abstracts; the full-text articles were also evaluated independently by two reviewers, with a third reviewer used in cases where the two reviewers could not reach consensus on which articles should be included. The feasibility of routinely evaluating the cost-effectiveness of secondary health care was tested by setting up a system for collecting data on approximately 4 900 patients' HRQoL before and after operative treatments performed in the hospital. The HRQoL data used as an indicator of treatment effectiveness were combined with diagnostic and financial indicators routinely collected in the hospital. To compare the cost-effectiveness of three surgical interventions, 712 patients admitted for routine operative treatment completed the 15D HRQoL questionnaire before and again 3-12 months after the operation. QALYs were calculated using the obtained utility data and the expected remaining life years of the patients. Direct hospital costs were obtained from the clinical patient administration database of the hospital, and a cost-utility analysis was performed from the perspective of the provider of secondary health care services. Main results: The systematic review (Study I) showed that although QALYs gained are considered an important measure of the effectiveness of health care, the number of studies in which QALYs are based on actual measurements of patients' HRQoL is still fairly limited. Of the reviewed full-text articles, only 70 reported QALYs based on actual before-after measurements using a valid HRQoL instrument. Collection of simple cost-effectiveness data in secondary health care is feasible and could easily be expanded and performed on a routine basis (Study II). It allows meaningful comparisons between various treatments and provides a means for allocating limited health care resources. The cost per QALY gained was 2 770 for cervical operations and 1 740 for lumbar operations. In cases where surgery was delayed, the cost per QALY was doubled (Study III). The cost per QALY varied between subgroups in cataract surgery (Study IV). The cost per QALY gained was 5 130 for patients having both eyes operated on and 8 210 for patients with only one eye operated on during the 6-month follow-up. In patients whose first eye had been operated on prior to the study period, the mean HRQoL deteriorated after surgery, thus precluding the establishment of the cost per QALY.
In arthroplasty patients (Study V), the mean cost per QALY gained in a one-year period was 6 710 for primary hip replacement, 52 270 for revision hip replacement, and 14 000 for primary knee replacement. Conclusions: Although the importance of cost-utility analyses has been stressed during recent years, there are only a limited number of studies in which the evaluation is based on patients' own assessment of the treatment effectiveness. Most cost-effectiveness and cost-utility analyses are based on modeling that employs expert opinion regarding the outcome of treatment, not on patient-derived assessments. Routine collection of effectiveness information from patients entering treatment in secondary health care turned out to be easy enough and did not, for instance, require additional personnel on the wards in which the study was executed. The mean patient response rate was more than 70%, suggesting that patients were happy to participate and appreciated the fact that the hospital showed an interest in their well-being even after the actual treatment episode had ended. Spinal surgery leads to a statistically significant and clinically important improvement in HRQoL. The cost per QALY gained was reasonable, at less than half of that observed, for instance, for hip replacement surgery. However, prolonged waiting for an operation approximately doubled the cost per QALY gained from the surgical intervention. The mean utility gain following routine cataract surgery in a real-world setting was relatively small and confined mostly to patients who had had both eyes operated on. The cost of cataract surgery per QALY gained was higher than previously reported and was associated with a considerable degree of uncertainty. Hip and knee replacement both improve HRQoL. The cost per QALY gained from knee replacement is two-fold that of hip replacement. Cost-utility results from the three studied specialties showed that there is great variation in the cost-utility of surgical interventions performed in a real-world setting, even when only common, widely accepted interventions are considered. However, the cost per QALY of all the studied interventions, except for revision hip arthroplasty, was well below 50 000, a figure sometimes cited in the literature as a threshold level for the cost-effectiveness of an intervention. Based on the present study, it may be concluded that routine evaluation of the cost-utility of secondary health care is feasible and produces information essential for a rational and balanced allocation of scarce health care resources.
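As a rough illustration of the cost-utility arithmetic described above, the QALY gain can be approximated as the change in a 15D-type utility score multiplied by the patient's expected remaining life years, and the cost per QALY as the direct hospital cost divided by that gain. The sketch below is a minimal, hypothetical example; the function name, utility values and cost are assumptions, not figures or code from the thesis.

```python
# Illustrative sketch of a cost-per-QALY calculation: utility gain (on a 0-1
# scale, e.g. from a 15D-style instrument) times expected remaining life
# years gives QALYs gained; direct cost divided by QALYs gained gives the
# cost-utility ratio.  All numbers below are hypothetical.
def cost_per_qaly(utility_before: float, utility_after: float,
                  remaining_life_years: float, direct_cost: float) -> float:
    qalys_gained = (utility_after - utility_before) * remaining_life_years
    if qalys_gained <= 0:
        raise ValueError("No utility gain: cost per QALY is undefined")
    return direct_cost / qalys_gained

# Hypothetical example: a 0.04 utility gain sustained over 20 remaining
# life years, for an operation costing 5 000 in the study's currency.
print(round(cost_per_qaly(0.80, 0.84, 20, 5_000)))   # ~6250 per QALY gained
```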
Abstract:
Although the principle of equal access to medically justified treatment has been promoted by official health policies in many Western health care systems, practices do not completely meet policy targets. Waiting times for elective surgery vary between patient groups and regions, and growing problems in the availability of services threaten equal access to treatment. Waiting times have come to the attention of decision-makers, and several policy initiatives have been introduced to ensure the availability of care within a reasonable time. In Finland, for example, the treatment guarantee came into force in 2005. However, no consensus exists on optimal waiting time for different patient groups. The purpose of this multi-centre randomized controlled trial was to analyse health-related quality of life, pain and physical function in total hip or knee replacement patients during the waiting time and to evaluate whether the waiting time is associated with patients' health outcomes at admission. This study also assessed whether the length of waiting time is associated with social and health services utilization in patients awaiting total hip or knee replacement. In addition, patients' health-related quality of life was compared with that of the general population. Consecutive patients with a need for a primary total hip or knee replacement due to osteoarthritis were placed on the waiting list between August 2002 and November 2003. Patients were randomly assigned to a short waiting time (maximum 3 months) or a non-fixed waiting time (waiting time not fixed in advance; instead, the patient followed the hospital's routine practice). Patients' health-related quality of life was measured upon being placed on the waiting list and again at hospital admission using the generic 15D instrument. Pain and physical function were evaluated using the self-report Harris Hip Score for hip patients and a scale modified from the Knee Society Clinical Rating System for knee patients. Utilization measures were the use of home health care, rehabilitation and social services, physician visits and inpatient care. Health and social services use was low in both waiting time groups. The most common services used while waiting were rehabilitation services and informal care, including unpaid care provided by relatives, neighbours and volunteers. Although patients suffered from clear restrictions in usual activities and physical functioning, they seemed primarily to lean on informal care and personal networks instead of professional care. While longer waiting time did not result in poorer health-related quality of life at admission, and use of services during the waiting time was similar to that at the time of placement on the list, there are likely to be higher costs of waiting for people who wait longer, simply because they are using services for a longer period. In economic terms, this would represent a negative impact of waiting. Only a few reports have been published on the health-related quality of life of patients awaiting total hip or knee replacement. These findings demonstrate that, in addition to physical dimensions of health, patients suffered from restrictions in psychological well-being such as depression, distress and reduced vitality. This raises the question of how to support patients who suffer from psychological distress during the waiting time, and how to develop strategies to improve patients' initiatives to reduce symptoms and the burden of waiting.
Key words: waiting time, total hip replacement, total knee replacement, health-related quality of life, randomized controlled trial, outcome assessment, social service, utilization of health services
Abstract:
Escalating health care delivery costs and consumer expectations have led to a range of health service and workforce innovations in the provision of high-quality, cost-effective patient care. This research has operationalised a theoretical framework to examine factors that influence the sustainability of health service innovations, in particular emergency nurse practitioner services. The results of this research will inform health service policy and practice for future implementation of innovative workforce models and add to the understanding of factors that influence sustainability.
Abstract:
In this 'Summary Guidance for Daily Practice', we describe the basic principles of prevention and management of foot problems in persons with diabetes. This summary is based on the International Working Group on the Diabetic Foot (IWGDF) Guidance 2015. There are five key elements that underpin prevention of foot problems: (1) identification of the at-risk foot; (2) regular inspection and examination of the at-risk foot; (3) education of patient, family and healthcare providers; (4) routine wearing of appropriate footwear; and (5) treatment of pre-ulcerative signs. Healthcare providers should follow a standardized and consistent strategy for evaluating a foot wound, as this will guide further evaluation and therapy. The following items must be addressed: type, cause, site and depth, and signs of infection. There are seven key elements that underpin ulcer treatment: (1) relief of pressure and protection of the ulcer; (2) restoration of skin perfusion; (3) treatment of infection; (4) metabolic control and treatment of co-morbidity; (5) local wound care; (6) education for patient and relatives; and (7) prevention of recurrence. Finally, successful efforts to prevent and manage foot problems in diabetes depend upon a well-organized team, using a holistic approach in which the ulcer is seen as a sign of multi-organ disease, and integrating the various disciplines involved.
Abstract:
Foot problems complicating diabetes are a source of major patient suffering and societal costs. Investing in evidence-based, internationally appropriate diabetic foot care guidance is likely among the most cost-effective forms of healthcare expenditure, provided it is goal-focused and properly implemented. The International Working Group on the Diabetic Foot (IWGDF) has been publishing and updating international Practical Guidelines since 1999. The 2015 updates are based on systematic reviews of the literature, and recommendations are formulated using the Grading of Recommendations Assessment, Development and Evaluation system. As such, we changed the name from 'Practical Guidelines' to 'Guidance'. In this article we describe the development of the 2015 IWGDF Guidance documents on prevention and management of foot problems in diabetes. This Guidance consists of five documents, prepared by five working groups of international experts. These documents provide guidance related to foot complications in persons with diabetes on: prevention; footwear and offloading; peripheral artery disease; infections; and wound healing interventions. Based on these five documents, the IWGDF Editorial Board produced a summary guidance for daily practice. The result of this process, after review of all documents by the Editorial Board and by international IWGDF members, is an evidence-based global consensus on prevention and management of foot problems in diabetes. Plans are already under way to implement this Guidance. We believe that following the recommendations of the 2015 IWGDF Guidance will almost certainly result in improved management of foot problems in persons with diabetes and a subsequent worldwide reduction in the tragedies caused by these foot problems.
Abstract:
Chronic wounds cost the Australian health system at least US$2.85 billion per year. Wound care services in Australia involve a complex mix of treatment options, health care sectors and funding mechanisms. It is clear that implementation of evidence-based wound care coincides with large health improvements and cost savings, yet the majority of Australians with chronic wounds do not receive evidence-based treatment. High initial treatment costs, inadequate reimbursement, poor financial incentives to invest in optimal care and limitations in clinical skills are major barriers to the adoption of evidence-based wound care. Enhanced education and appropriate financial incentives in primary care will improve uptake of evidence-based practice. Secondary-level wound specialty clinics to fill referral gaps in the community, boosted by appropriate credentialing, will improve access to specialist care. In order to secure funding for better services in a competitive environment, evidence of cost-effectiveness is required. Future efforts to generate evidence on the cost-effectiveness of wound management interventions should provide evidence that decision makers find easy to interpret. If this happens, and it will require a large effort of health services research, it could be used to inform future policy and decision-making activities, reduce health care costs and improve patient outcomes.
Abstract:
OBJECTIVE We aimed to 1) describe the peripartum management of type 1 diabetes at an Australian teaching hospital and 2) discuss factors influencing the apparent transient insulin independence postpartum. RESEARCH DESIGN AND METHODS We conducted a retrospective review of women with type 1 diabetes delivering singleton pregnancies from 2005 to 2010. Information was collected regarding demographics, medical history, peripartum management and outcome, and breast-feeding. To detect a difference in time to first postpartum blood glucose level (BGL) >8 mmol/L between women with an early (<4 h) and late (>12 h) requirement for insulin postpartum, with a power of 80% and a type I error of 0.05, at least 24 patients were required. RESULTS An intravenous insulin infusion was commenced in almost 95% of women. Univariate analysis showed that increased BMI at term, lower creatinine at term, longer duration from last dose of long- or intermediate-acting insulin, and discontinuation of an insulin infusion postpartum were associated with a shorter time to first requirement of insulin postpartum (P = 0.005, 0.026, 0.026, and <0.001, respectively). There was a correlation between higher doses of insulin commenced postpartum and the number of out-of-range BGLs (r[36] = 0.358, P = 0.030) and hypoglycemia (r[36] = 0.434, P = 0.007). Almost 60% had at least one BGL <3.5 mmol/L between delivery and discharge. CONCLUSIONS Changes in the pharmacodynamic profile of insulin may contribute to the transient insulin independence sometimes observed postpartum in type 1 diabetes. A dose of 50–60% of the prepregnancy insulin requirement resulted in the lowest rate of hypoglycemia and glucose excursions. These results require validation in a larger, prospective study.
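The sample-size statement above (80% power, 5% type I error, at least 24 patients) can be reproduced in outline with a standard two-group power calculation. The sketch below is only an illustration: the abstract does not report the assumed effect size, so the Cohen's d value used here is a hypothetical assumption chosen to show the mechanics, and the calculation assumes a simple two-sample t-test rather than the authors' actual method.

```python
# Illustrative sketch of a two-group sample-size calculation at 80% power
# and a two-sided alpha of 0.05.  The effect size is a hypothetical value
# (a large standardised difference) chosen only for demonstration.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=1.2,   # assumed Cohen's d (hypothetical)
                                   alpha=0.05,
                                   power=0.80,
                                   alternative="two-sided")
print(f"Patients needed per group: {n_per_group:.1f}  (total ~{2 * n_per_group:.0f})")
```

With these assumptions the result is roughly 12 patients per group, i.e. about 24 in total, which is consistent with the figure quoted in the abstract.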
Abstract:
This thesis studies the incentives and behaviour of providers of expert services, such as doctors, financial advisors and mechanics. The focus is in particular on the provision of health care, using a series of credence goods experiments conducted to investigate undertreatment, overtreatment and overcharging in a medical context. The findings of study one suggest that a medical framing, compared to a neutral framing, significantly increases pro-social behaviour for standard participants in economic experiments. Study two compares the behaviour of medical practitioners - mainly doctors - to that of students. It is observed that medical doctors undertreat and overcharge significantly less, but at the same time overtreat significantly more, than students. The final study compares behaviours for other experts - accountants, engineers and lawyers - using experimental framings drawn from the respective contexts and students from the respective faculties as participants in credence goods experiments.
Abstract:
Persistent pain is a commonly experienced symptom. It affects 25% of community-dwelling older adults and up to 80% of nursing home residents, and can have a major impact on quality of life and functional capacity. Unfortunately pain in older patients is often undertreated and misunderstood. Assessment of pain type and severity is important. Most older people, even with moderately impaired cognition, are able to self-report pain. Validated assessment tools using non-verbal pain cues are available for people with more advanced cognitive impairment. Management of pain in older people can be challenging. Physiological changes may impact on pain perception and the pharmacodynamics and pharmacokinetics of medications. Older people are often more sensitive to the adverse effects of analgesic medications and are at risk of drug–drug interactions due to the presence of co-morbidities and polypharmacy. In general, analgesic medications should be commenced at low doses, titrated based on effect and tolerability, and regularly reviewed. Contemporary pain management often utilises multiple analgesics in lower doses to optimise efficacy and avoid dose-related toxicity. A bio-psycho-social approach to the management of persistent pain, utilising a multidisciplinary team and including non-drug strategies, may produce the best results. The goal of pain management is not always to eliminate pain, since this may not be attainable, but rather to enhance function and improve quality of life. This article discusses persistent non-cancer pain in older people, its assessment and management, and the risks and benefits of pharmacological treatment in this population.
Abstract:
M.A. (Educ.) Anu Kajamaa from the University of Helsinki, Center for Research on Activity, Development and Learning (CRADLE), examines change efforts and their consequences in public-sector health care. The aim of her academic dissertation is, by providing a new conceptual framework, to widen our understanding of organizational change efforts, their consequences and the associated managerial challenges. Despite the multiple change efforts, the results of health care development projects have not been very promising, and many developmental needs and managerial challenges exist. The study challenges the predominant, well-framed health care change paradigm and calls for an expanded view to explore the underlying issues and multiplicities of change efforts and their consequences. The study asks what kind of expanded conceptual framework is needed to better understand organizational change as transcending currently dominant oppositions in management thinking, specifically in the field of health care. The study includes five explorative case studies of health care change efforts and their consequences in Finland. Theory and practice are tightly interconnected in the study. The methodology of the study integrates the ethnography of organizational change, a narrative approach and cultural-historical activity theory. From the stance of activity theory, historicity, contradictions, locality and employee participation play significant roles in developing health care. The empirical data of the study were mainly collected in two projects, funded by the Finnish Work Environment Fund. The data were collected in public-sector health care organizations during the years 2004-2010. By exploring the oppositions between distinct views on organizational change and the multi-site, multi-level and multi-logic nature of organizational change, the study develops an expanded, multidimensional activity-theoretical framework on organizational change and management thinking. The findings of the study contribute to activity theory and organization studies, and provide information for health care management and practitioners. The study illuminates that continuous development efforts, bridged to one another and anchored to collectively created new activity models, can lead to significant improvements and organizational learning in health care. The study presents such expansive learning processes. The ways of conducting change efforts in organizations play a critical role in the creation of collective new practices and tools and in establishing ownership over them. Some of the studied change efforts were discontinuous or encapsulated, not benefiting the larger whole. The study shows that the stagnation and unexpected consequences of change efforts relate to the unconnectedness of the different organizational sites, levels and logics. If not dealt with, unintended consequences such as obstacles, breaks and conflicts may stall promising change and learning processes.