52 results for Postoperative Care.
in Helda - Digital Repository of the University of Helsinki
Abstract:
Background: Patients may need massive volume-replacement therapy after cardiac surgery because of large perioperative fluid shifts and the use of cardiopulmonary bypass. Hemodynamic stability is better maintained with colloids than with crystalloids, but colloids cause more adverse effects, such as coagulation disturbances and impairment of renal function, than crystalloids do. The present study examined the effects of modern hydroxyethyl starch (HES) and gelatin solutions on blood coagulation and hemodynamics. The mechanism by which colloids disturb blood coagulation was investigated by thromboelastometry (TEM) after cardiac surgery and in vitro by means of experimental hemodilution. Materials and methods: Ninety patients scheduled for elective primary cardiac surgery (Studies I, II, IV, V) and twelve healthy volunteers (Study III) were included in this study. After admission to the cardiac surgical intensive care unit (ICU), patients were randomized to receive different doses of HES 130/0.4, HES 200/0.5, or 4% albumin solutions. Ringer's acetate or albumin solutions served as controls. Coagulation was assessed by TEM, and hemodynamic measurements were based on cardiac index (CI) measured by thermodilution. Results: HES and gelatin solutions impaired whole-blood coagulation similarly, as measured by TEM, even at a small dose of 7 mL/kg. These solutions reduced clot strength and prolonged clot formation time, and these effects became more pronounced with increasing doses of colloids. Neither albumin nor Ringer's acetate solution disturbed blood coagulation significantly. Coagulation disturbances after infusion of HES or gelatin solutions were clinically slight, and postoperative blood loss was comparable with that after Ringer's acetate or albumin solutions. Both single and multiple doses of all the colloids increased CI postoperatively, and this effect was dose-dependent. Ringer's acetate had no effect on CI. At a small dose (7 mL/kg), the effect of gelatin on CI was comparable with that of Ringer's acetate and significantly less than that of HES 130/0.4 (Study V). However, when the dose was increased to 14 and 21 mL/kg, the hemodynamic effect of gelatin rose and became comparable with that of HES 130/0.4. Conclusions: After cardiac surgery, HES and gelatin solutions impaired clot strength in a dose-dependent manner. The potential mechanisms were interaction with fibrinogen and fibrin formation, resulting in decreased clot strength, and hemodilution. Although the use of HES and gelatin inhibited coagulation, postoperative bleeding on the first postoperative morning was similar in all the study groups. A single dose of HES solutions improved CI postoperatively more than did gelatin, albumin, or Ringer's acetate. However, when administered repeatedly (cumulative dose of 14 mL/kg or more), no differences were evident between HES 130/0.4 and gelatin.
Abstract:
Continuous epidural analgesia (CEA) and continuous spinal postoperative analgesia (CSPA) provided by a mixture of local anaesthetic and opioid are widely used for postoperative pain relief. With the introduction of so-called microcatheters, for example, CSPA found its way particularly into orthopaedic surgery. These techniques, however, may be associated with dose-dependent side-effects such as hypotension, weakness in the legs, and nausea and vomiting. At times, they may fail to offer sufficient analgesia, e.g., because of a misplaced catheter. The correct position of an epidural catheter might be confirmed by the supposedly easy and reliable epidural stimulation test (EST). The aims of this thesis were to determine a) whether the efficacy, tolerability, and reliability of CEA might be improved by adding the α2-adrenergic agonists adrenaline and clonidine to CEA, and by the repeated use of EST during CEA; and b) the feasibility of CSPA given through a microcatheter after vascular surgery. Studies I-IV were double-blinded, randomized, and controlled trials; Study V was of a diagnostic, prospective nature. Patients underwent arterial bypass surgery of the legs (I, n=50; IV, n=46), total knee arthroplasty (II, n=70; III, n=72), and abdominal surgery or thoracotomy (V, n=30). Postoperative lumbar CEA consisted of regular mixtures of ropivacaine and fentanyl either without or with adrenaline (2 µg/ml (I) and 4 µg/ml (II)) or clonidine (2 µg/ml (III)). CSPA (IV) was given through a microcatheter (28G) and contained either ropivacaine (max. 2 mg/h) or a mixture of ropivacaine (max. 1 mg/h) and morphine (max. 8 µg/h). Epidural catheter tip position (V) was evaluated both by EST at the moment of catheter placement and several times during CEA, and by epidurography as the reference diagnostic test. CEA and CSPA were administered for 24 or 48 h. Study parameters included pain scores assessed with a visual analogue scale, requirements of rescue pain medication, vital signs, and side-effects. Adrenaline (I and II) had no beneficial influence on the efficacy or tolerability of CEA; the total amounts of epidurally infused drugs were even increased in the adrenaline group in Study II (p=0.02, RM ANOVA). Clonidine (III) augmented pain relief with lowered amounts of epidurally infused drugs (p=0.01, RM ANOVA) and a reduced need for rescue oxycodone given i.m. (p=0.027, MW-U; median difference 3 mg (95% CI 0-7 mg)). Clonidine did not contribute to sedation, and its influence on haemodynamics was minimal. CSPA (IV) provided satisfactory pain relief with only limited blockade of the legs (no inter-group differences). EST (V) was often associated with technical problems and difficulties of interpretation; e.g., it failed to identify the four patients whose catheters were outside the spinal canal already at the time of catheter placement. As adjuvants to lumbar CEA, clonidine only slightly improved pain relief, while adrenaline did not provide any benefit. The role of EST applied at the time of epidural catheter placement or repeatedly during CEA remains open. The microcatheter CSPA technique appeared effective and reliable, but needs to be compared with routine CEA after peripheral arterial bypass surgery.
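Study V above is a diagnostic evaluation of EST against epidurography as the reference test. As a rough illustration of how such an evaluation is summarized, the sketch below computes sensitivity and specificity from a 2x2 table; the counts are purely illustrative and are not Study V's data.

    def diagnostic_accuracy(tp, fp, fn, tn):
        # Sensitivity: proportion of truly misplaced catheters the index test detects.
        # Specificity: proportion of correctly placed catheters the index test confirms.
        sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
        specificity = tn / (tn + fp) if (tn + fp) else float("nan")
        return sensitivity, specificity

    # Illustrative counts only (not Study V's data): 30 catheters, 4 of them
    # misplaced, none of which the index test flags.
    sens, spec = diagnostic_accuracy(tp=0, fp=2, fn=4, tn=24)
    print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # 0.00 and 0.92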
Abstract:
The adequacy of anesthesia has been studied since the introduction of balanced general anesthesia. Commercial monitors based on electroencephalographic (EEG) signal analysis have been available for monitoring the hypnotic component of anesthesia since the beginning of the 1990s. Monitors measuring the depth of anesthesia assess the cortical function of the brain and have gained acceptance during surgical anesthesia with most of the anesthetic agents used. However, due to frequent artifacts, they are considered unsuitable for monitoring consciousness in intensive care patients. The assessment of analgesia is one of the cornerstones of general anesthesia. Prolonged surgical stress may lead to increased morbidity and delayed postoperative recovery. However, no validated monitoring method is currently available for evaluating analgesia during general anesthesia. Awareness during anesthesia is caused by an inadequate level of hypnosis. This rare but severe complication of general anesthesia may lead to marked emotional stress and possibly posttraumatic stress disorder. In the present series of studies, the incidence of awareness and recall during outpatient anesthesia was evaluated and compared with that in inpatient anesthesia. A total of 1500 outpatients and 2343 inpatients underwent a structured interview. Clear intraoperative recollections were rare, the incidence being 0.07% in outpatients and 0.13% in inpatients. No significant differences emerged between outpatients and inpatients. However, significantly smaller doses of sevoflurane were administered to outpatients with awareness than to those without recollections (p<0.05). EEG artifacts in 16 brain-dead organ donors were evaluated during organ harvest surgery in a prospective, open, nonselective study. The source of the frontotemporal biosignals in brain-dead subjects was studied, and the resistance of the bispectral index (BIS) and Entropy to the signal artifacts was compared. The hypothesis was that in brain-dead subjects, most of the biosignals recorded from the forehead would consist of artifacts. The original EEG was recorded, and State Entropy (SE), Response Entropy (RE), and BIS were calculated and monitored during solid organ harvest. SE differed from zero (inactive EEG) in 28%, RE in 29%, and BIS in 68% of the total recording time (p<0.0001 for all). The median values during the operation were SE 0.0, RE 0.0, and BIS 3.0. In four of the 16 organ donors, the EEG was not inactive, and unphysiologically distributed, nonreactive rhythmic theta activity was present in the original EEG signal. After the results from subjects with persistent residual EEG activity were excluded, SE, RE, and BIS differed from zero in 17%, 18%, and 62% of the recorded time, respectively (p<0.0001 for all). Due to various artifacts, the highest readings in all indices were recorded without neuromuscular blockade. The main sources of artifacts were electrocauterization, electromyography (EMG), 50-Hz artifact, handling of the donor, ballistocardiography, and electrocardiography. In a prospective, randomized study of 26 patients, the ability of the Surgical Stress Index (SSI) to differentiate between two clinically different analgesic levels during shoulder surgery was evaluated. SSI values were lower in patients with an interscalene brachial plexus block than in patients without an additional plexus block. In all patients, anesthesia was maintained with desflurane, the concentration of which was targeted to maintain SE at 50.
Increased blood pressure or heart rate (HR), movement, and coughing were considered signs of intraoperative nociception and treated with alfentanil. Photoplethysmographic waveforms were collected from the arm contralateral to the operated side, and SSI was calculated offline. Two minutes after skin incision, SSI was not increased in the brachial plexus block group and was lower (38 ± 13) than in the control group (58 ± 13, p<0.005). Among the controls, one minute prior to alfentanil administration, the SSI value was higher than during periods of adequate antinociception, 59 ± 11 vs. 39 ± 12 (p<0.01). The total cumulative need for alfentanil was higher in controls (2.7 ± 1.2 mg) than in the brachial plexus block group (1.6 ± 0.5 mg, p=0.008). Tetanic stimulation of the ulnar region of the hand increased SSI significantly only among patients whose brachial plexus block did not cover the site of stimulation. The prognostic value of EEG-derived indices was evaluated and compared with transcranial Doppler ultrasonography (TCD), serum neuron-specific enolase (NSE), and S-100B after cardiac arrest. Thirty patients resuscitated from out-of-hospital cardiac arrest and treated with induced mild hypothermia for 24 h were included. The original EEG signal was recorded, and the burst suppression ratio (BSR), RE, SE, and wavelet subband entropy (WSE) were calculated. Neurological outcome during the six-month period after arrest was assessed with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). Twenty patients had a CPC of 1-2, one patient had a CPC of 3, and nine patients died (CPC 5). BSR, RE, and SE differed between the good (CPC 1-2) and poor (CPC 3-5) outcome groups (p=0.011, p=0.011, p=0.008, respectively) during the first 24 h after arrest. WSE was borderline higher in the good outcome group between 24 and 48 h after arrest (p=0.050). All patients with status epilepticus died, and their WSE values were lower (p=0.022). S-100B was lower in the good outcome group upon arrival at the intensive care unit (p=0.010). After hypothermia treatment, NSE and S-100B values were lower (p=0.002 for both) in the good outcome group. The pulsatility index was also lower in the good outcome group (p=0.004). In conclusion, the incidence of awareness in outpatient anesthesia did not differ from that in inpatient anesthesia; outpatients are not at increased risk for intraoperative awareness relative to inpatients undergoing general anesthesia. SE, RE, and BIS showed non-zero values that normally indicate cortical neuronal function, but in these subjects such values were mostly due to artifacts after the clinical diagnosis of brain death. Entropy was more resistant to artifacts than BIS. During general anesthesia and surgery, SSI values were lower in patients with an interscalene brachial plexus block covering the sites of nociceptive stimuli. In detecting nociceptive stimuli, SSI performed better than HR, blood pressure, or RE. BSR, RE, and SE differed between the good and poor neurological outcome groups during the first 24 h after cardiac arrest, and they may aid in differentiating patients with good neurological outcomes from those with poor outcomes after out-of-hospital cardiac arrest.
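One of the EEG-derived indices above, the burst suppression ratio (BSR), quantifies how much of an epoch the EEG spends suppressed. The sketch below computes a simplified BSR under a common textbook-style definition (amplitude staying below a fixed threshold for at least a minimum duration); the threshold, minimum duration, and epoch length are illustrative choices, not the parameters of any commercial monitor or of these studies.

    import numpy as np

    def burst_suppression_ratio(eeg_uv, fs, threshold_uv=5.0, min_suppression_s=0.5):
        # Fraction of the epoch spent in suppression, where suppression means the
        # absolute EEG amplitude stays below threshold_uv for at least
        # min_suppression_s seconds. Parameter defaults are illustrative only.
        below = np.abs(eeg_uv) < threshold_uv
        min_len = int(min_suppression_s * fs)
        suppressed = np.zeros(below.size, dtype=bool)
        run_start = None
        for i, flag in enumerate(below):
            if flag and run_start is None:
                run_start = i
            elif not flag and run_start is not None:
                if i - run_start >= min_len:
                    suppressed[run_start:i] = True
                run_start = None
        if run_start is not None and below.size - run_start >= min_len:
            suppressed[run_start:] = True
        return float(suppressed.mean())  # 0.0 = continuous EEG, 1.0 = isoelectric epoch

    # Example: a 10-second epoch at 100 Hz whose second half is nearly flat.
    fs = 100
    t = np.arange(0, 10, 1 / fs)
    eeg = np.where(t < 5, 30 * np.sin(2 * np.pi * 10 * t), np.random.randn(t.size))
    print(burst_suppression_ratio(eeg, fs))  # approximately 0.5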
Abstract:
The proportion of patients over 75 years of age, receiving all different types of healthcare, is constantly increasing. The elderly undergo surgery and anaesthetic procedures more often than middle-aged patients, and poor pain management in the elderly is still an issue. Although the elderly consume the greatest proportion of prescribed medicines in Western Europe, most clinical pharmacological studies have been performed in healthy volunteers or middle-aged patients. The aim of this study was to investigate pain measurement and management in cognitively impaired patients in long-term hospital care and in cognitively normal elderly patients after cardiac surgery. This thesis incorporated 366 patients, including 86 home-dwelling or hospitalized elderly with chronic pain and 280 patients undergoing cardiac surgery with acute pain. The mean age of the patients was 77 (SD 8) years, and approximately 8,400 pain measurements were performed with four pain scales: the Verbal Rating Scale (VRS), the Visual Analogue Scale (VAS), the Red Wedge Scale (RWS), and the Facial Pain Scale (FPS). Cognitive function, depression, functional ability in daily life, postoperative sedation, and postoperative confusion were assessed with the MMSE, GDS, Barthel Index, RASS, and CAM-ICU, respectively. The effects and plasma concentrations of fentanyl and oxycodone were measured in elderly (≥ 75 years) and middle-aged patients (≤ 60 years), and the opioid-sparing effect of pregabalin was studied after cardiac surgery. The VRS pain scores after movement correlated with the Barthel Index. The VRS was most successful in the groups of demented patients (MMSE 17-23, 11-16 and ≤ 10) and in elderly patients on the first day after cardiac surgery. The elderly had a higher plasma concentration of fentanyl at the end of surgery than younger patients, whereas the plasma concentrations of oxycodone were comparable between the groups. Pain intensity on the VRS was lower and the sedation scores were higher in the elderly. Total oxycodone consumption during five postoperative days was reduced by 48%, and the CAM-ICU scores were higher on the first postoperative day in the pregabalin group. The incidence of postoperative pain during movement was lower in the pregabalin group three months after surgery. This investigation demonstrates that chronic pain did not seem to impair daily activities in home-dwelling Finnish elderly. The VRS appeared to be applicable for elderly patients with clear cognitive dysfunction (MMSE ≤ 17), and it was the most feasible pain scale for the early postoperative period after cardiac surgery. After cardiac surgery, plasma concentrations of fentanyl in the elderly were elevated, although oxycodone concentrations were at a similar level to those in middle-aged patients. The elderly had less pain and were more sedated after doses of oxycodone. Therefore, particular attention must be given to individual dosing of opioids in elderly surgical patients, who often need smaller amounts for adequate analgesia than middle-aged patients. The administration of pregabalin reduced postoperative oxycodone consumption after cardiac surgery. Pregabalin-treated patients had less confusion and, in addition, less postoperative pain on the first postoperative day and during movement at three months post-surgery. Pregabalin might be a new alternative analgesic for acute postoperative and chronic pain management in the elderly; its clinical role and safety remain to be verified in large-scale randomized and controlled studies. In the future, many clinical trials in older patients will be needed to facilitate improvements in health care methods.
Abstract:
This study examines boundaries in health care organizations. Boundaries are sometimes considered things to be avoided in everyday living. This study suggests that boundaries can instead be important, temporally and spatially emerging locations of development, learning, and change in inter-organizational activity, and that they can act as mediators of cultural and social formations and practices. The data of the study were gathered in an intervention project during the years 2000-2002 in Helsinki, in which the care of 26 patients with multiple and chronic illnesses was improved. The project used the Change Laboratory method, a research-assisted method for developing work. The research questions of the study are: (1) What are the boundary dynamics of development, learning, and change in health care for patients with multiple and chronic illnesses? (2) How do individual patients experience boundaries in their health care? (3) How are the boundaries of health care constructed and reconstructed in social interaction? (4) What are the dynamics of boundary crossing in the experimentation with the new tools and new practice? The methodology of the study, the ethnography of a multi-organizational field of activity, draws on cultural-historical activity theory and anthropological methods. The ethnographic fieldwork involved multiple research techniques and a collaborative strategy for gathering research data. The data of this study consist of observations, interviews, transcribed intervention sessions, and patients' health documents. According to the findings, the care of patients with multiple and chronic illnesses emerges as fragmented by divisions between the patient and the professionals, between the specialties of medicine, and between the levels of the health care organization. These boundaries have a historical origin in the Finnish health care system. As a consequence of these boundaries, patients frequently experience uncertainty and neglect in their care. However, the boundaries of a single patient were transformed in the Change Laboratory discussions among patients, professionals, and researchers. In these discussions, the questioning of the prevailing boundaries was triggered by the observation of gaps in inter-organizational care. Transformation of the prevailing boundaries was achieved in the implementation of the collaborative care agreement tool and the practice of negotiated care. However, the new tool and practice did not expand into general use during the project. The study identifies two complementary models for the development of health care organization in Finland: the 'care package model', which is based on productivity and process models adopted from engineering, and the 'model of negotiated care', which is based on co-configuration and the public good.
Abstract:
The aim of the present study was to determine relationships between insurance status and the utilization of oral health care and its characteristics, and to identify factors related to insured patients' selection of a dental clinic or dentist. The study was based on cross-sectional data obtained through phone interviews. The target population included adults in the city of Tehran. Using a two-stage stratified random technique, 3,200 seven-digit numbers resembling real phone numbers were drawn; when called, 1,669 numbers were unavailable (busy, no answer, fax, line blocked). Of the 1,531 subjects who answered the phone call, 224 were outside the target age (under 18), and 221 refused to respond, leaving 1,086 subjects in the final sample. The interviews were carried out using a structured questionnaire and covered characteristics of dental visits, the respondent's reason for selecting a particular dentist or clinic, and demographic and socio-economic background (gender, age, level of education, income, and insurance status). Data analysis included the Chi-square test, ANOVA, and logistic regression with the corresponding odds ratios (OR). Of all the 1,086 respondents, 57% were women, 62% were under age 35, 46% had a medium and 34% a high level of education, 13% were under the poverty line, and 70% had insurance coverage; 64% with public and 6% with commercial insurance. Having insurance coverage was more likely for women (OR=1.5), for those in the oldest age group (OR=2.0), and for those with a high level of education (OR=2.5). Of those with dental insurance, 54% reported having had a dental visit within the past 12 months; this was more often the case for those with commercial insurance than for those with public insurance (65% vs. 53%, p<0.001). A check-up as the reason for the most recent visit occurred most frequently among those with commercial insurance (28%) compared with those having public insurance (16%) or being non-insured (13%) (p<0.001). Having had two or more dental visits within the past 12 months was more common among insured respondents than among the non-insured (31% vs. 22%, p=0.01). The non-insured respondents reported tooth extractions almost twice as frequently as did the insured ones (p<0.001). Of the 726 insured subjects, 60% selected fully out-of-pocket-paid services (FOP), and 53% were unaware of their insurance benefits. Good interpersonal aspects (OR=4.6), being unaware of dental insurance benefits (OR=4.6), and good technical aspects (OR=2.3) as reasons were associated with greater odds of selecting FOP. The present study revealed that dental insurance was positively related to demand for oral health care as well as to utilization of services, although to the latter only to a minor extent. Among insured respondents, despite their opportunity to use fully or highly subsidized oral health care services, a good interpersonal relationship and high quality of services were the most important factors when an insured patient selected a dentist or a clinic. The present findings indicate a clear need to modify dental insurance systems in Iran to facilitate optimal use of oral health care services and to maximize the oral health of the population. A special emphasis in the insurance schemes should be placed on preventive care.
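As a rough illustration of the logistic regression step mentioned above, the sketch below fits a logistic model on synthetic data and exponentiates the coefficients to obtain odds ratios of the kind reported (e.g. OR=1.5 for women having insurance). The data, column names, and effect sizes are invented for the example and are not the study's.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic data with hypothetical column names (not the study's data set).
    rng = np.random.default_rng(0)
    n = 1000
    df = pd.DataFrame({
        "female": rng.integers(0, 2, n),
        "oldest_age_group": rng.integers(0, 2, n),
        "high_education": rng.integers(0, 2, n),
    })
    # Simulate insurance status with arbitrary positive effects for the predictors.
    linpred = -0.5 + 0.4 * df["female"] + 0.7 * df["oldest_age_group"] + 0.9 * df["high_education"]
    df["insured"] = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

    # Fit the logistic regression; exponentiated coefficients are odds ratios.
    model = smf.logit("insured ~ female + oldest_age_group + high_education", data=df).fit(disp=False)
    odds_ratios = np.exp(model.params)
    ci = np.exp(model.conf_int())  # 95% confidence intervals on the OR scale
    print(pd.concat([odds_ratios.rename("OR"), ci], axis=1))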
Abstract:
The present cross-sectional study aimed to assess oral health behaviour, dental and periodontal conditions, dental care, and their relationships among elderly dentate patients in Lithuania. The target population of the study was dentate patients aged 60 and older attending public dental services in Kedainiai, Lithuania. The data collection took place between the autumn of 1999 and the winter of 2001. Data were collected by means of a self-administered questionnaire for all subjects (n=174) and a clinical examination targeting about half of them (n=100). The questionnaire inquired about oral health behaviour, the first and the most recent dental treatments, sources of information on and self-assessed knowledge of oral self-care, the self-reported number of teeth, and socio-demographic information. The clinical examination covered basic dental and periodontal conditions. A total of 82 women and 92 men completed the questionnaire; their mean age was 69.2 years and their average number of teeth was 16.2 (95% CI 15.4-17.1). In all, 25% had 21 or more teeth and 32% indicated wearing removable dentures. The oral health behaviour the participants reported was poor: 30% reported twice-daily toothbrushing, 57% responded that they always use fluoride toothpaste, 19% indicated daily interdental cleaning, nearly all said they take sugar in their coffee and tea, and 30% indicated going for check-ups. As the main source of information on oral self-care, the subjects indicated health professionals (82%), followed by social contacts (72%), broadcast media (58%), and printed media (42%). A total of 34% assessed their knowledge of oral self-care as good, and their self-assessed knowledge correlated (r=0.52) with the professional guidance they had received on oral self-care. In their most recent treatment, conservative (39%) and non-conservative (34%) treatments dominated, and preventive ones were the least reported (7%). Regarding guidance in oral self-care, 54% reported having received such guidance about toothbrushing, 32% about interdental cleaning, and 33% had been given visual information. Clinical examinations revealed the presence of plaque, calculus, bleeding on probing, and deepened pockets in all of the subjects; 70% of the subjects were diagnosed with pockets of 6 mm or deeper, 94% with caries, and 73% with overhangs of restorations. Those subjects who assessed their knowledge of oral self-care as good and reported having received more intensive guidance in oral self-care indicated practicing the recommended oral self-care more frequently. Twice-daily toothbrushing was associated with good self-assessed knowledge of oral self-care (OR 4.1, p<0.001) and a university education (OR 5.6, p<0.001). Those subjects with better oral health behaviour had a greater number of teeth; having 21 or more teeth was associated with good self-assessed knowledge of oral self-care (OR 4.1, p=0.03). Better periodontal conditions were associated with a higher frequency of toothbrushing. The presence of periodontal pockets of 6 mm or deeper was associated with a below-good level of self-assessed knowledge of oral self-care (OR=3.0, p=0.04) and poor dental cleanliness (OR=2.7, p=0.02). To conclude, oral health behaviour and conditions call for improvement in elderly subjects in Lithuania. To improve the oral health of their elderly dentate patients, dentists should apply all the available tools of chair-side prevention and active guidance. The latter would be an effective means of updating the patients' knowledge of oral self-care and supporting recommended oral health behaviour. A preventive approach should be strongly emphasized in countries with limited resources for oral health care, such as Lithuania. Author's address: Sonata Vyšniauskaite, Department of Oral Public Health, Institute of Dentistry, University of Helsinki, P.O. Box 41, FI-00014 Helsinki, Finland. E-mail: sonata.vysniauskaite@helsinki.fi
Abstract:
Aims: The aims of this study were 1) to identify and describe health economic studies that have used quality-adjusted life years (QALYs) based on actual measurements of patients' health-related quality of life (HRQoL); 2) to test the feasibility of routine collection of HRQoL data as an indicator of the effectiveness of secondary health care; and 3) to establish and compare the cost-utility of three large-volume surgical procedures in a real-world setting in the Helsinki University Central Hospital, a large referral hospital providing secondary and tertiary health-care services for a population of approximately 1.4 million. Patients and methods: To identify studies that have used QALYs as an outcome measure, a systematic search of the literature was performed using the Medline, Embase, CINAHL, SCI and Cochrane Library electronic databases. Initial screening of the identified articles involved two reviewers independently reading the abstracts; the full-text articles were also evaluated independently by two reviewers, with a third reviewer used in cases where the two could not reach consensus on which articles should be included. The feasibility of routinely evaluating the cost-effectiveness of secondary health care was tested by setting up a system for collecting HRQoL data on approximately 4,900 patients before and after operative treatments performed in the hospital. The HRQoL data used as an indicator of treatment effectiveness were combined with diagnostic and financial indicators routinely collected in the hospital. To compare the cost-effectiveness of three surgical interventions, 712 patients admitted for routine operative treatment completed the 15D HRQoL questionnaire before and also 3-12 months after the operation. QALYs were calculated using the obtained utility data and the expected remaining life years of the patients. Direct hospital costs were obtained from the clinical patient administration database of the hospital, and a cost-utility analysis was performed from the perspective of the provider of secondary health care services. Main results: The systematic review (Study I) showed that although QALYs gained are considered an important measure of the effectiveness of health care, the number of studies in which QALYs are based on actual measurements of patients' HRQoL is still fairly limited. Of the reviewed full-text articles, only 70 reported QALYs based on actual before-after measurements using a valid HRQoL instrument. Collection of simple cost-effectiveness data in secondary health care is feasible and could easily be expanded and performed on a routine basis (Study II). It allows meaningful comparisons between various treatments and provides a means for allocating limited health care resources. The cost per QALY gained was €2,770 for cervical operations and €1,740 for lumbar operations; in cases where surgery was delayed, the cost per QALY was doubled (Study III). The cost per QALY varied between subgroups in cataract surgery (Study IV): it was €5,130 for patients having both eyes operated on and €8,210 for patients with only one eye operated on during the 6-month follow-up. In patients whose first eye had been operated on prior to the study period, the mean HRQoL deteriorated after surgery, thus precluding the establishment of the cost per QALY. In arthroplasty patients (Study V), the mean cost per QALY gained in a one-year period was €6,710 for primary hip replacement, €52,270 for revision hip replacement, and €14,000 for primary knee replacement. Conclusions: Although the importance of cost-utility analyses has been stressed during recent years, there are only a limited number of studies in which the evaluation is based on patients' own assessment of treatment effectiveness. Most cost-effectiveness and cost-utility analyses are based on modeling that employs expert opinion regarding the outcome of treatment, not on patient-derived assessments. Routine collection of effectiveness information from patients entering treatment in secondary health care turned out to be easy enough and did not, for instance, require additional personnel on the wards in which the study was executed. The mean patient response rate was more than 70%, suggesting that patients were happy to participate and appreciated the fact that the hospital showed an interest in their well-being even after the actual treatment episode had ended. Spinal surgery leads to a statistically significant and clinically important improvement in HRQoL. The cost per QALY gained was reasonable, at less than half of that observed, for instance, for hip replacement surgery. However, prolonged waiting for an operation approximately doubled the cost per QALY gained from the surgical intervention. The mean utility gain following routine cataract surgery in a real-world setting was relatively small and confined mostly to patients who had had both eyes operated on. The cost of cataract surgery per QALY gained was higher than previously reported and was associated with a considerable degree of uncertainty. Hip and knee replacement both improve HRQoL, but the cost per QALY gained from knee replacement is two-fold compared with hip replacement. Cost-utility results from the three studied specialties showed that there is great variation in the cost-utility of surgical interventions performed in a real-world setting, even when only common, widely accepted interventions are considered. However, the cost per QALY of all the studied interventions, except for revision hip arthroplasty, was well below €50,000, a figure sometimes cited in the literature as a threshold level for the cost-effectiveness of an intervention. Based on the present study, it may be concluded that routine evaluation of the cost-utility of secondary health care is feasible and produces information essential for a rational and balanced allocation of scarce health care resources.
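For readers unfamiliar with the cost-per-QALY figures above, the sketch below shows the basic arithmetic the abstract describes: the change in HRQoL utility multiplied by expected remaining life years gives the QALY gain, and direct costs divided by that gain give the cost per QALY. It is a simplified illustration with invented numbers; the actual analyses presumably involve further details, such as discounting, that are not shown here.

    def qalys_gained(utility_before, utility_after, remaining_life_years):
        # Simplified QALY gain: change in HRQoL utility (e.g. a 15D score on a
        # 0-1 scale) multiplied by the expected remaining life years.
        # Discounting and decay of the utility gain over time are ignored.
        return (utility_after - utility_before) * remaining_life_years

    def cost_per_qaly(direct_costs, utility_before, utility_after, remaining_life_years):
        gain = qalys_gained(utility_before, utility_after, remaining_life_years)
        if gain <= 0:
            # Mirrors the cataract subgroup whose mean HRQoL deteriorated:
            # without a utility gain, the cost per QALY cannot be established.
            raise ValueError("no utility gain; cost per QALY cannot be established")
        return direct_costs / gain

    # Invented example: utility rises from 0.80 to 0.85 after an operation costing
    # 2,000 euros, with 20 expected remaining life years.
    print(cost_per_qaly(2000, 0.80, 0.85, 20))  # 2000 / (0.05 * 20) = 2000 euros per QALY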
Abstract:
Background. In Finland, the incidence of type 1 diabetes mellitus (T1DM) is the highest in the world, and it continues to increase steadily. No effective preventive interventions exist either for individuals at high risk or for the population as a whole. In addition to problems with daily lifelong insulin replacement therapy, T1DM patients with long-lasting disease suffer from various diabetes-related complications. The complications can lead to severe impairments and reductions in functional capacity and quality of life, and in the worst case they can be fatal. Longitudinal studies on the costs of T1DM are extremely rare, especially in Finland. Typically, these studies have not distinguished between the various types of diabetes, and costs have not been calculated separately for the sexes. Aims. The aim of this study was to describe inpatient hospital care and the costs of inpatient care in a cohort of 5,166 T1DM patients by sex during 1973-1998 in Finland. Inpatient care and costs of care due to T1DM without complications, due to T1DM with complications, and due to other causes were calculated separately. Material and Methods. The study population consisted of all Finnish T1DM patients diagnosed before the age of 18 years between January 1st, 1965 and December 31st, 1979, derived from the Finnish population-based T1DM register (N=5,120 in 1979 and N=4,701 in 1997). Data on hospitalisations were obtained from the Finnish Hospital Discharge Register. Results. In the early stages of T1DM, the majority of the use of inpatient care was due to the treatment of T1DM without complications. There were enormous increases in the use of inpatient care for certain complications as T1DM lasted longer (from 9.5 years to 16.5 years). For women, the yearly number of bed-days for renal complications increased 4.8-fold, for peripheral vascular disease 4.3-fold, and for ophthalmic complications 2.5-fold. For men, the corresponding increases were 5-fold, 6.9-fold, and 2.5-fold. In the total population, the yearly bed-days for glaucoma increased 8-fold, for nephropathy 7-fold, and for microangiopathy 6-fold. During these 7 years, the yearly numbers of bed-days for T1DM without complications dropped dramatically. The length of stay in inpatient care decreased notably, but hospital visits became more frequent as the duration of T1DM increased from 9.5 years to 16.5 years. The costs of treatment due to complications increased as T1DM lasted longer. Costs due to inpatient care of complications in the cohort increased 2.5-fold as the duration of T1DM increased from 9.5 years to 16.5 years, while the total costs of inpatient care in the cohort dropped by 22% owing to an 80% decrease in the costs of care of T1DM without complications. Treating the complications of female patients was more expensive than treating those of men when T1DM had lasted 9.5 years: the mean annual costs for inpatient care of a female diabetic (any cause) were €1,642, and the yearly costs of care of complications were €237. The corresponding yearly mean costs for a male patient were €1,198 and €167. Treating the complications of female patients was also more expensive than treating those of male patients when the duration of diabetes was 16.5 years, although the difference in average annual costs between the sexes was somewhat smaller. Conclusions. In the early phases of T1DM, the treatment of T1DM without complications accounts for a considerable number of hospital bed-days. The use of inpatient care due to complications of T1DM increases strongly as patients age. The economic burden of inpatient care of T1DM is substantial.
Abstract:
Background: Malnutrition is a common problem for residents of nursing homes and long-term care hospitals. It has a negative influence on elderly residents' and patients' health and quality of life. Nutritional care seems to have a positive effect on elderly individuals' nutritional status and well-being. Studies of Finnish elderly people's nutrition and nutritional care in institutions are scarce. Objectives: The primary aim was to investigate the nutritional status of elderly nursing home residents and long-term care patients in Finland and the factors associated with it; in particular, to find out whether nursing or nutritional care factors are associated with nutritional status, and how well carers and nurses recognize malnutrition. A further aim was to assess the energy and nutrient intake of the residents of dementia wards. A final objective was to find out whether the nutrition training of professionals leads to changes in their knowledge and further translates into better nutrition for the aged residents of dementia wards. Subjects and methods: The residents' (n=2114) and patients' (n=1043) nutritional status was assessed in all studies using the Mini Nutritional Assessment test (MNA). Information on the residents' and patients' daily routines of nutritional care was gathered with a questionnaire. The energy and nutrient intakes of residents of dementia wards (n=23; n=21) were determined over three days by the precise weighing method. Constructive learning theory was the basis for educating the professionals (n=28), and a semi-structured questionnaire was used to assess their learning. Studies I-IV were cross-sectional studies, whereas Study V was an intervention study. Results: Malnutrition was common among elderly residents and patients living in nursing homes and hospitals in Finland. According to the MNA, 11% to 57% of the studied elderly people suffered from malnutrition, and 40-89% were at risk of malnutrition, whereas only 0-16% had a good nutritional status. Resident- and patient-related factors such as dementia, impaired ADL (Activities of Daily Living), swallowing difficulties, and constipation mainly explained the malnutrition, but some nutritional care-related factors, such as eating less than half of the offered food portion and not receiving snacks, were also related to malnutrition. The intake of energy and some nutrients by the residents of dementia wards was lower than recommended, although the offered food contained enough energy and nutrients. The proportion of residents receiving vitamin D supplementation was low, although adequate vitamin D intake is recommended and has known benefits. Nurses recognized malnutrition poorly, identifying only about one in four (26.7%) of the actual cases. Keeping and analysing food diaries and reflecting on nutritional issues in small group discussions were effective training methods for professionals. The nutrition education of professionals had a positive impact on the energy and protein intake, BMIs, and MNA scores of some residents in dementia wards. Conclusions: Malnutrition was common among elderly residents and patients living in nursing homes and hospitals in Finland. Although resident- and patient-related factors mainly explained malnutrition, nurses recognized malnutrition poorly, and the possibilities of nutritional care were underused. Professionals' nutrition education had a positive impact on the nutrition of elderly residents. Further studies describing successful nutritional care and nutrition education of professionals are needed.
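The malnutrition categories reported above come from the Mini Nutritional Assessment (MNA). As a small illustration, the sketch below classifies full-MNA scores using the commonly cited cutoffs (below 17 = malnutrition, 17-23.5 = at risk, 24-30 = good nutritional status); the studies' exact scoring procedure may differ.

    from collections import Counter

    def mna_category(score):
        # Classify a full Mini Nutritional Assessment (MNA) score (0-30) using the
        # commonly cited cutoffs; this is illustrative, not the studies' protocol.
        if not 0 <= score <= 30:
            raise ValueError("full MNA scores range from 0 to 30")
        if score < 17:
            return "malnutrition"
        if score < 24:          # covers 17-23.5
            return "at risk of malnutrition"
        return "good nutritional status"

    # Example: distribution of categories for a set of invented scores.
    scores = [12.5, 16.0, 18.5, 22.0, 24.5, 27.0]
    print(Counter(mna_category(s) for s in scores))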