Abstract:
The occurrence and nature of civilian firearm and explosion injuries in Finland, and the nature of severe gunshot injuries of the extremities, were described in seven original articles. The main data sources were the National Hospital Discharge Register, the Cause-of-Death Register, and the Archive of Death Certificates at Statistics Finland. The study was population based. Epidemiologic methods were used in six papers and clinical analyses in five. In the clinical studies, every original hospital record and death certificate was critically analyzed. The incidence of hospitalized firearm injuries declined slightly in Finland from the late 1980s to the early 2000s, from 5.1 per 100 000 person-years in 1990 to 2.6 in 2003. The decline occurred in unintentional firearm injuries. A high incidence of unintentional firearm injuries was characteristic of the country, while violence and homicides by firearms represented a minor problem. The incidence of fatal non-suicidal firearm injuries remained stable at 1.8 cases per 100 000 person-years; suicides by firearm were eight times more common during the period studied. This is contrary to corresponding reports from many other countries. The use of alcohol or illegal drugs or substances was detected in as many as one-third of the injuries studied. The median length of hospitalization was three days and was significantly associated (p<0.001) with the type of injury. The mean length of hospital stay decreased from the 1980s to the early 2000s. This study took a special interest in gunshot injuries of the extremities. From a clinical point of view, the nature of severe extremity gunshot wounds, as well as the primary operative approach in their management, varied. 
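Incidence figures such as those above are simple person-year rate computations. As a minimal sketch, with hypothetical counts rather than the study's register data:

```python
def incidence_rate(cases, person_years, per=100_000):
    """Crude incidence rate: events divided by person-time at risk,
    scaled to a convenient denominator (here per 100 000 person-years)."""
    return cases / person_years * per

# Hypothetical example: 130 hospitalized injuries over 5 million
# person-years corresponds to a rate of about 2.6 per 100 000
# person-years, the level reported for 2003.
rate = incidence_rate(130, 5_000_000)
```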
Patients with severe injuries of this kind were managed at university and central hospital emergency departments, by general surgeons in smaller hospitals and by cardiothoracic or vascular surgeons in larger hospitals. Such injuries were rarities and, as such, challenges for the surgeons on call. Some noteworthy aspects of management were noticed that should be focused on in the future. On the other hand, the low population density and relatively large geographic area of Finland do not favor high-volume, centralized trauma management systems. However, experimental war surgery has been increasingly taught in the country since the 1990s, and excellent results can be expected during the present decade. Epidemiologically, explosion injuries can be considered a minor problem in Finland at present, but their significance should not be underestimated. Fatal explosion injuries occurred sporadically; an increase from 2002 to 2004 had no obvious reason. In view of the historical facts, however, another rare major explosion involving several people might become likely within the next decade. The national control system of firearms is mainly based on the new legislation of 1998 and 2002. However, as shown in this study, there is no reason to assume that national hospitalization policies, the political climate, or the legislation changed over the study period in a way that influenced the declining trend, at least not directly. Indeed, the reason why the decline appeared in the incidence of unintentional injuries only remains unclear. It may derive from many practical steps, e.g. locked firearm cases, or from the stability of the community itself. For effective reduction of firearm-related injuries, preventive measures, such as education and counseling, should be targeted at recreational firearm users. 
To sum up, this study showed that the often-reported increasing trend in firearm- and explosion-related injuries has not manifested in Finland. Consequently, it can be concluded that, overall, the Finnish legislation, together with the various prevention strategies, has succeeded in preventing firearm- and explosion-related injuries in the country.
Abstract:
Visual acuities at the time of referral and on the day before surgery were compared in 124 patients operated on for cataract in Vaasa Central Hospital, Finland. Preoperative visual acuity and the occurrence of ocular and general disease were compared in samples of consecutive cataract extractions performed in 1982, 1985, 1990, 1995 and 2000 in two hospitals in the Vaasa region in Finland. The repeatability and standard deviation of random measurement error in visual acuity and refractive error determination in a clinical environment were estimated in cataractous, pseudophakic and healthy eyes by re-examining the visual acuity and refractive error of patients referred for cataract surgery or consultation by ophthalmic professionals. Altogether 99 eyes of 99 persons (41 cataractous, 36 pseudophakic and 22 healthy eyes) with a visual acuity range of Snellen 0.3 to 1.3 (0.52 to -0.11 logMAR) were examined. During an average waiting time of 13 months, visual acuity in the study eye deteriorated from 0.68 logMAR to 0.96 logMAR (from 0.2 to 0.1 in Snellen decimal values). The average decrease in vision was 0.27 logMAR per year. In the fastest quartile, visual acuity change per year was 0.75 logMAR, and in the second fastest 0.29 logMAR; the third and fourth quartiles were virtually unaffected. From 1982 to 2000, the incidence of cataract surgery increased from 1.0 to 7.2 operations per 1000 inhabitants per year in the Vaasa region. The average preoperative visual acuity in the operated eye improved by 0.85 logMAR (in decimal values from 0.03 to 0.2) and in the better eye by 0.27 logMAR (in decimal values from 0.23 to 0.43) over this period. The proportion of patients profoundly visually handicapped (VA in the better eye <0.1) before the operation fell from 15% to 4%, and that of patients less profoundly visually handicapped (VA in the better eye 0.1 to <0.3) from 47% to 15%. 
The repeatability of visual acuity measurement, estimated as a coefficient of repeatability for all 99 eyes, was ±0.18 logMAR, and the standard deviation of measurement error was 0.06 logMAR. Eyes with the lowest visual acuity (0.3-0.45) had the largest variability, with coefficient of repeatability values of ±0.24 logMAR, and eyes with a visual acuity of 0.7 or better had the smallest, ±0.12 logMAR. The repeatability of refractive error measurement was studied in the same patient material as the repeatability of visual acuity. Differences between measurements 1 and 2 were calculated as three-dimensional vector values and spherical equivalents and expressed as coefficients of repeatability. Coefficients of repeatability for all eyes for the vertical, torsional and horizontal vectors were ±0.74D, ±0.34D and ±0.93D, respectively, and for the spherical equivalent for all eyes ±0.74D. Eyes with lower visual acuity (0.3-0.45) had larger variability in vector and spherical equivalent values (±1.14), but the difference between visual acuity groups was not statistically significant. The difference in the mean defocus equivalent between measurements 1 and 2 was, however, significantly greater in the lower visual acuity group. If a change of ±0.5D (measured in defocus equivalents) is accepted as a basis for a change of spectacles for eyes with good vision, the corresponding basis for eyes in the visual acuity range of 0.3-0.65 would be ±1D. Differences in repeated visual acuity measurements are partly explained by errors in refractive error measurements.
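The conversions and repeatability statistics above follow standard definitions: Snellen decimal acuity maps to logMAR by a negative base-10 logarithm, and the coefficient of repeatability is conventionally 1.96 times the standard deviation of the test-retest differences. A minimal illustrative sketch (not the study's own code):

```python
import math

def snellen_to_logmar(decimal_acuity):
    """Convert Snellen decimal acuity to logMAR: logMAR = -log10(acuity).
    E.g. Snellen 0.3 is about 0.52 logMAR, Snellen 1.3 about -0.11 logMAR,
    matching the range quoted above."""
    return -math.log10(decimal_acuity)

def coefficient_of_repeatability(diffs):
    """Bland-Altman coefficient of repeatability: 1.96 times the sample
    standard deviation of the differences between repeated measurements."""
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return 1.96 * sd

# Hypothetical test-retest differences (in logMAR), not the study's data:
cor = coefficient_of_repeatability([0.1, -0.1, 0.0, 0.05, -0.05])
```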
Abstract:
There is little information available on 4-6-year-old children's hospital-related fears, and on coping with such fears, as expressed by the children themselves. However, previous data collected from parents and hospital personnel indicate that hospitalization is an anxiety-producing experience for young children. The purpose of this study was to describe the experience of hospital-related fears and of coping with those fears among 4-6-year-old children, and its aim was to form a descriptive model of the subjective experience of hospital-related fears and coping strategies of 4-6-year-old children. The data were collected by interviewing 4-6-year-old children in hospital and kindergarten settings in Finland from 2004 to 2006. Ninety children were interviewed to describe hospital-related fear and the experience of fear, and 89 to describe their coping with the fear and the experience of coping. The children were chosen through purposive sampling. The data were gathered by semi-structured interview, supported by pictures. The data on hospital-related fears and on strategies for coping with them were reviewed by qualitative and quantitative methods. The experiences of hospital-related fears and of coping with these fears were analyzed using Colaizzi's Method of Phenomenological Analysis. The results revealed that more than 90% of the children said they were afraid of at least one thing in hospital. Most of the fears could be categorized as fears of nursing interventions, fears of being a patient, and fears caused by the developmental stage of the child. Children interviewed in the hospital expressed substantially more fears than children interviewed in kindergarten. Children's meanings of hospital-related fears were placed into four main clusters: 1) insecurity, 2) injury, 3) helplessness, and 4) rejection. 
The results also showed that children have plenty of coping strategies to deal with their fears, especially strategies in which the children themselves play an active role. The most often mentioned coping strategies were 1) the presence of parents and other family members, 2) the help of the personnel, 3) positive images and humour, 4) play, and 5) the child's own safety toy. The children interviewed in the hospital mentioned play, positive imagination and humour as their coping strategies statistically significantly more often than the children interviewed in kindergarten. The meaning of coping with hospital fears consisted of six clusters: pleasure, security, care, understanding the meaning of the situation, participating, and protecting oneself. Being admitted to a hospital is an event which may increase the fears of a 4-6-year-old child. Children who have personal experience of being admitted to a hospital describe more fears than healthy children in kindergarten. For young children, hospital-related fear can be such a distressing experience that it reflects on their feelings of security and their behaviour. Children can sometimes find it difficult to admit their fear. Children need the help of adults to express their hospital-related fears and the objects of those fears, and to cope with the fears. Personnel should be aware of children's fears and support them in the use of coping strategies. In addition to the experiences of security and care, pre-school-aged children need active coping strategies that they can use themselves, regardless of the presence of parents or nurses. Most of all, children need the possibility to play and experience pleasure. Children can also be taught coping strategies which give them an active, positive role.
Abstract:
Rheumatoid arthritis (RA) and other chronic inflammatory joint diseases begin to affect patients' health-related quality of life (HRQoL) already in the earliest phases of these diseases. In the treatment of inflammatory joint diseases, the last two decades have seen new strategies and treatment options introduced: treatment is started at an earlier phase; combinations of disease-modifying anti-rheumatic drugs (DMARDs) and corticosteroids are used; and in refractory cases new drugs such as tumour necrosis factor (TNF) inhibitors or other biologicals can be started. In patients newly referred to the Department of Rheumatology of the Helsinki University Central Hospital, we evaluated the 15D and the Stanford Health Assessment Questionnaire (HAQ) results at baseline and approximately 8 months after the first visit. Altogether the analysis included 295 patients with various rheumatic diseases. The mean baseline 15D score (0.822, SD 0.114) was significantly lower than that of the age-matched general population (0.903, SD 0.098). Patients with osteoarthritis (OA) and spondyloarthropathies (SPA) reported the poorest HRQoL. In patients with RA and reactive arthritis (ReA), the HRQoL improved in a statistically significant manner during the 8-month follow-up. In addition, a clinically important change appeared in patients with systemic rheumatic diseases. The HAQ score improved significantly in patients with RA, arthralgia and fibromyalgia, and ReA. In a study of 97 RA patients treated with either etanercept or adalimumab, we assessed HRQoL with the RAND 36-Item Health Survey 1.0 (RAND-36) questionnaire. We also analysed changes in clinical parameters and the HAQ. With both etanercept and adalimumab, the values of all domains in the RAND-36 questionnaire increased during the first 3 months. The efficacy of each in improving HRQoL was statistically significant, and the drug effects were comparable. 
Compared to Finnish age- and sex-matched general population values, the HRQoL of the RA patients was significantly lower at baseline and, despite the improvement, remained lower at follow-up. Our RA patients had long-standing and severe disease, which can explain the low HRQoL also at follow-up. In a pharmacoeconomic study of patients treated with infliximab, we evaluated medical and work disability costs for patients with chronic inflammatory joint disease during one year before and one year after institution of infliximab treatment. Clinical and economic data for 96 patients with different arthritis diagnoses showed significantly improved clinical and laboratory variables in all patients. However, the medical costs increased significantly during the second period, by €12 015 (95% confidence interval, 6 496 to 18 076). Only a minimal decrease in work disability costs occurred (mean decrease €130; 95% CI, -1 268 to 1 072). In a study involving a switch from infliximab to etanercept, we investigated the clinical outcome in 49 patients with RA. The reasons for switching were failure to respond by American College of Rheumatology (ACR) 50% criteria in 42%, an adverse event in 12%, and non-medical reasons in 46%, although these patients had responded to infliximab. The Disease Activity Score with 28 joints examined (DAS28) allowed us to measure patients' disease activity and compare outcome between groups based on the reason for switching. In the patients in whom infliximab was switched to etanercept for non-medical reasons, etanercept continued to suppress disease activity effectively, and the 1-year drug survival for etanercept was 77% (95% CI, 62 to 97). In patients in the infliximab failure and adverse event groups, DAS28 values improved significantly during etanercept therapy. However, the 1-year drug survival of etanercept was only 43% (95% CI, 26 to 70) and 50% (95% CI, 33 to 100), respectively. 
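Drug-survival percentages such as the 77% one-year figure are typically Kaplan-Meier (product-limit) estimates, which account for patients whose follow-up ends while still on the drug. A minimal sketch with hypothetical follow-up data, not the study's patients:

```python
def kaplan_meier(observations):
    """Product-limit survival estimate.
    observations: (time, event) pairs; event=1 if the drug was stopped,
    0 if follow-up ended while the patient was still on the drug (censored).
    Returns (time, survival probability) at each discontinuation time."""
    obs = sorted(observations)
    at_risk = len(obs)
    s = 1.0
    curve = []
    i = 0
    while i < len(obs):
        t = obs[i][0]
        events = 0
        removed = 0
        # Group all observations tied at time t.
        while i < len(obs) and obs[i][0] == t:
            events += obs[i][1]
            removed += 1
            i += 1
        if events:
            s *= 1 - events / at_risk  # multiply by conditional survival
            curve.append((t, s))
        at_risk -= removed
    return curve

# Hypothetical cohort of five patients (times in months):
curve = kaplan_meier([(1, 1), (2, 0), (3, 1), (4, 1), (5, 0)])
```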
Although the HRQoL of patients with inflammatory joint diseases is significantly lower than that of the general population, early and aggressive treatment strategies, including TNF inhibitors, can improve patients' HRQoL effectively. Further research is needed to find new treatment strategies for those patients who fail to respond, or lose their response, to TNF inhibitors.
Abstract:
Cervical cancer is the second most common cancer among women globally. Most, probably all, cases arise through a precursor, cervical intraepithelial neoplasia (CIN). Effective cytological screening programmes and surgical treatment of precancerous lesions have dramatically reduced its prevalence and related mortality. Although these treatments are effective, they may have adverse effects on future fertility and pregnancy outcomes. The aim of this study was to evaluate the effects of surgical treatment of the uterine cervix on pregnancy and fertility outcomes, with a particular focus on preterm birth. The general preterm birth rates and risk factors during 1987-2005 were studied, as were the long-term mortality rates of the treated women. Information from the Medical Birth Register (MBR), the Hospital Discharge Register (HDR), the Cause-of-Death Register (CDR), and hospital records was used. Treatments were performed during 1987-2003, and subsequent deliveries, IVF treatments and deaths were analyzed. The general preterm birth rate in Finland was relatively stable, varying from 5.1% to 5.4% during the study period (1987 to 2005), although the proportion of extremely preterm births had decreased substantially, by 12%. The main risk factor for preterm birth was multiple pregnancy, followed by elective delivery (induction of delivery or elective cesarean section), primiparity, in vitro fertilization treatment, maternal smoking and advanced maternal age. The risk of preterm birth and low birth weight was increased after any cervical surgical treatment; after conization the risk of preterm birth was almost two-fold (RR 1.99, 95% CI 1.81-2.20). In the conization group the risk was highest for very preterm birth (28-31 gestational weeks) and it was also high for extremely preterm birth (less than 28 weeks). In this group the perinatal mortality was also increased. In subgroup analysis, laser ablation was not associated with preterm birth. 
When comparing deliveries before and after loop conization, we found that the risk of preterm birth increased 1.94-fold (95% CI 1.10-3.40). Adjusting for age, parity, or both did not affect the results. Large or repeat cones increased the risk of preterm birth compared with smaller cones, suggesting that the size of the removed cone plays a role. This was corroborated by the finding that repeat treatment increased the risk as much as five-fold compared with the background preterm birth rate. The proportion of IVF deliveries (1.6% vs. 1.5%) was not increased after treatment for CIN when adjusted for year of delivery, maternal age, or parity. Those women who received both treatment for CIN and IVF treatment were older and more often primiparous, which explained their increased risk of preterm birth. We also found that mortality rates were 17% higher among women previously treated for CIN. This excess mortality was seen particularly in increased general disease mortality and in deaths from alcohol poisoning (by 13%), suicide (by 67%) and injury (by 31%). The risk of cervical cancer was high, as expected (SMR 7.69, 95% CI 4.23-11.15). Women treated for CIN who had a subsequent delivery had a decreased general mortality rate (by 22%) and decreased disease mortality (by 37%). However, those with preterm birth had increased general mortality (SMR 2.51, 95% CI 1.24-3.78), as a result of cardiovascular diseases, alcohol-related causes, and injuries. In conclusion, the general preterm birth rate has not increased in Finland, unlike in many other developed countries. The rate of extremely preterm births has even decreased. While other risk factors for preterm birth, such as multiple pregnancy and smoking during pregnancy, have decreased, surgical treatments of the uterine cervix have become more important risk factors for preterm birth. Cervical conization is a predisposing factor for preterm birth, low birth weight and even perinatal mortality. 
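Relative risks such as RR 1.94 (95% CI 1.10-3.40) come from comparing event proportions in exposed and unexposed groups, with the confidence interval conventionally computed on the log scale. A minimal sketch with hypothetical counts, not the register data:

```python
import math

def relative_risk_ci(a, n1, c, n0, z=1.96):
    """Relative risk of an outcome in an exposed vs. an unexposed group,
    with a Wald confidence interval on the log scale.
    a: events among n1 exposed subjects; c: events among n0 unexposed."""
    rr = (a / n1) / (c / n0)
    # Standard error of ln(RR):
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical example: 10 preterm births among 100 treated women vs.
# 5 among 100 untreated gives RR 2.0 with a wide confidence interval.
rr, lo, hi = relative_risk_ci(10, 100, 5, 100)
```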
The most frequently used treatment modality, loop conization, is also associated with an increased risk of preterm birth. Treatments should be tailored individually; low-grade lesions should not be treated at all in young women. The first treatment should be curative, because repeat treatments are especially harmful. The proportion of IVF deliveries was not increased after treatment for CIN, suggesting that current treatment modalities do not strongly impair fertility. The long-term risk of cervical cancer remains high even many years after treatment; therefore careful surveillance is necessary. In addition, accidental deaths and deaths from injury were common among treated women, suggesting risk-taking behavior in these women. Preterm birth seems to be associated with extremely high mortality rates, due to cardiovascular, alcohol-related and injury deaths. These women could benefit from health counseling, for example encouragement to quit smoking.
Abstract:
This study is part of an ongoing collaborative bipolar research project, the Jorvi Bipolar Study (JoBS). The JoBS is run by the Department of Mental Health and Alcohol Research of the National Public Health Institute, Helsinki, and the Department of Psychiatry, Jorvi Hospital, Helsinki University Central Hospital (HUCH), Espoo, Finland. It is a prospective, naturalistic cohort study of secondary-level care psychiatric in- and outpatients with a new episode of bipolar disorder (BD). The second report also included 269 major depressive disorder (MDD) patients from the Vantaa Depression Study (VDS), which was carried out in collaboration with the Department of Psychiatry of the Peijas Medical Care District. Using the Mood Disorder Questionnaire (MDQ), all in- and outpatients at the Department of Psychiatry at Jorvi Hospital who possibly had a new phase of DSM-IV BD were sought. Altogether, 1630 psychiatric patients were screened, and 490 were interviewed using a semistructured interview (SCID-I/P). The patients included in the cohort (n=191) had a current phase of BD at intake. The patients were evaluated at intake and at 6- and 18-month interviews. Based on this study, BD is poorly recognized even in psychiatric settings: of the BD patients with acute worsening of illness, 39% had never been correctly diagnosed. The classic presentations of BD, with hospitalizations, manic episodes, and psychotic symptoms, lead clinicians to a correct diagnosis of BD I in psychiatric care. Time of follow-up elapsed in psychiatric care, but none of the clinical features, seemed to explain correct diagnosis of BD II, suggesting reliance on the cross-sectional presentation of illness. Even though BD II was clearly less often correctly diagnosed than BD I, few other differences between the two types of BD were detected. 
BD I and II patients appeared to differ little in terms of clinical picture or comorbidity, and the prevalence of psychiatric comorbidity was strongly related to the current illness phase in both types. At the same time, the difference in outcome was clear: BD II patients spent about 40% more time depressed than BD I patients. Patterns of psychiatric comorbidity of BD and MDD differed somewhat qualitatively. Overall, MDD patients were likely to have more anxiety disorders and cluster A personality disorders, and bipolar patients more cluster B personality disorders. The adverse consequences of missed or delayed diagnosis are potentially serious. Thus, these findings strongly support the value of screening for BD in psychiatric settings, especially among patients with major depression. Nevertheless, the diagnosis must be based on a clinical interview and follow-up of mood. Comorbidity, present in 59% of bipolar patients in a current phase, needs concomitant evaluation, follow-up, and treatment. To improve outcome in BD, treatment of bipolar depression is a major challenge for clinicians.
Abstract:
The outcome of the successfully resuscitated patient is mainly determined by the extent of hypoxic-ischemic cerebral injury, and hypothermia has multiple mechanisms of action in mitigating such injury. The present study was undertaken from 1997 to 2001 in Helsinki as part of the European multicenter study Hypothermia After Cardiac Arrest (HACA) to test the neuroprotective effect of therapeutic hypothermia in patients resuscitated from out-of-hospital ventricular fibrillation (VF) cardiac arrest (CA). The aim of this substudy was to examine the neurological and cardiological outcome of these patients, and especially to study and develop methods for predicting outcome in the hypothermia-treated patients. A total of 275 patients were randomized to the HACA trial in Europe. In Helsinki, 70 patients were enrolled in the study according to the inclusion criteria. Those randomized to hypothermia were actively cooled externally to a core temperature of 33 ± 1 °C for 24 hours with a cooling device. Serum markers of ischemic neuronal injury, NSE and S-100B, were sampled at 24, 36, and 48 hours after CA. Somatosensory and brain stem auditory evoked potentials (SEPs and BAEPs) were recorded 24 to 28 hours after CA; 24-hour ambulatory electrocardiography recordings were performed three times during the first two weeks, and arrhythmias and heart rate variability (HRV) were analyzed from the tapes. The clinical outcome was assessed 3 and 6 months after CA. Neuropsychological examinations were performed on the conscious survivors 3 months after the CA; quantitative electroencephalography (Q-EEG) and auditory P300 event-related potentials were studied at the same time-point. Therapeutic hypothermia at 33 °C for 24 hours led to an increased chance of good neurological outcome and survival after out-of-hospital VF CA. In the HACA study, 55% of hypothermia-treated patients and 39% of normothermia-treated patients reached a good neurological outcome (p=0.009) at 6 months after CA. 
Use of therapeutic hypothermia was not associated with any increase in clinically significant arrhythmias. The levels of serum NSE, but not those of S-100B, were lower in hypothermia- than in normothermia-treated patients. A decrease in NSE values between 24 and 48 hours was associated with good outcome at 6 months after CA. Decreasing levels of serum NSE, but not of S-100B, over time may indicate selective attenuation of delayed neuronal death by therapeutic hypothermia, and the time-course of serum NSE between 24 and 48 hours after CA may help in clinical decision-making. In SEP recordings, bilaterally absent N20 responses predicted permanent coma with a specificity of 100% in both treatment arms. Recording of BAEPs provided no additional benefit in outcome prediction. Preserved 24- to 48-hour HRV may be a predictor of favorable outcome in CA patients treated with hypothermia. At 3 months after CA, no differences appeared in any cognitive functions between the two groups: 67% of patients in the hypothermia group and 44% in the normothermia group were cognitively intact or had only very mild impairment. No significant differences emerged in any of the Q-EEG parameters between the two groups. The amplitude of the P300 potential was significantly higher in the hypothermia-treated group. These results give further support to the use of therapeutic hypothermia in patients with sudden out-of-hospital CA.
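Predictive-performance statements such as "specificity of 100%" come straight from a 2x2 confusion table. A minimal sketch with hypothetical counts, not the trial's data:

```python
def sensitivity(tp, fn):
    """Proportion of true positives among all actually positive cases."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Proportion of true negatives among all actually negative cases.
    A specificity of 1.0 means no false positives: no patient flagged by
    the test ever belonged to the negative (e.g. recovering) group."""
    return tn / (tn + fp)

# Hypothetical example: among 30 patients who regained consciousness,
# none had bilaterally absent N20 responses (0 false positives),
# giving a specificity of 1.0, i.e. 100%.
spec = specificity(30, 0)
```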
Abstract:
The Molecular Adsorbent Recirculating System (MARS) is an extracorporeal albumin dialysis device used in the treatment of liver failure. The treatment was first utilized in Finland in 2001, and since then over 200 patients have been treated. The aim of this thesis was to evaluate the impact of MARS treatment on patient outcome, on clinical and biochemical variables, and on the psychological and economic aspects of the treatment in Finland. The thesis encompasses 195 MARS-treated patients (including patients with acute liver failure (ALF), acute-on-chronic liver failure (AOCLF) and graft failure) and a historical control group of 46 ALF patients who did not undergo MARS. All patients received similar standard medical therapy at the same intensive care unit. The baseline data (demographics, laboratory and clinical variables) and MARS treatment-related and health-related quality-of-life data were recorded before and after treatment. The direct medical costs were determined for a period of 3.5 years. Additionally, the outcome of patients (survival, native liver recovery and need for liver transplantation) and factors predicting survival were investigated. In the outcome analysis, the MARS-treated ALF patients had higher 6-month survival (75% vs. 61%, P=0.07) and native liver recovery rates (49% vs. 17%, P<0.001), and a lower need for transplantation (29% vs. 57%, P=0.001), than the historical controls. However, the etiological distribution of the ALF patients referred to our unit has changed considerably over the past decade, and the percentage of patients with a more favorable prognosis has increased. The etiology of liver failure was the most important predictor of outcome. Other factors predicting survival in ALF included hepatic encephalopathy, the coagulation factors and the liver enzyme levels prior to MARS treatment. 
In terms of prognosis, MARS treatment of the cirrhotic AOCLF patient seems meaningful only when the patient is eligible for transplantation. The MARS treatment appears to halt the progression of encephalopathy and to reduce the blood concentration of neuroactive amino acids and of albumin-bound and water-soluble toxins. In general, the effects of the MARS treatment seem to stabilize the patients, thus allowing additional time either for the native liver to recover or for the patients to endure the prolonged wait for transplantation. Furthermore, for the ALF patients, the MARS treatment appeared to be less costly and more cost-efficient than standard medical therapy alone. In conclusion, the MARS treatment appears to have a beneficial effect on patient outcome in ALF and in those AOCLF patients who can be bridged to transplantation.
Abstract:
Intensive care is to be provided to patients benefiting from it, in an ethical, efficient, effective and cost-effective manner. This implies long-term qualitative and quantitative analysis of intensive care procedures and related resources. The study population consists of 2709 patients treated in the general intensive care unit (ICU) of Helsinki University Hospital. The study investigates intensive care patients' mortality, quality of life (QOL), Quality-Adjusted Life-Years (QALY units) and factors related to severity of illness, length of stay (LOS), patient's age and evaluation period, as well as experiences and memories connected with the ICU episode. In addition, the study examines the qualities of two QOL measures, the RAND 36-Item Health Survey 1.0 (RAND-36) and the EuroQol-5D (EQ-5D), and assesses the correlation of their results. Patients treated in 1995 responded to the RAND-36 questionnaire in 1996. All patients treated from 1995 to 2000 received a QOL questionnaire in 2001, when 1-7 years had elapsed from the intensive treatment. The response rate was 79.5%. Main results: 1) Of the patients who died within the first year (n = 1047), 66% died during the intensive care period or within the following month. The non-survivors were older than the surviving patients, generally had higher than average APACHE II and SOFA scores depicting the severity of illness, and their ICU LOS was longer and hospital stay shorter than those of the surviving patients (p < 0.001). Mortality of patients receiving conservative treatment was higher than that of those receiving surgical treatment. Patients replying to the QOL survey in 2001 (n = 1099) had recovered well: 97% of them lived at home. More than half considered their QOL good or extremely good, 40% satisfactory and 7% bad. All QOL indices of those of working age were considerably lower (p < 0.001) than the comparable figures for the age- and gender-adjusted Finnish population. 
The 5-year monitoring period made it evident that mental recovery was slower than physical recovery. 2) The results of the RAND-36 and the EQ-5D correlated well (p < 0.01). The RAND-36 profile measure distinguished more clearly between the different categories of QOL and their levels. The EQ-5D measured the patient groups' general QOL well, and its sum index was used to calculate QALY units. 3) QALY units were calculated by multiplying the time the patient survived after the ICU stay, or the expected life-years, by the EQ-5D sum index. Aging automatically lowers the number of QALY units. Patients under the age of 65 receiving conservative treatment benefited from treatment to a greater extent, measured in QALY units, than their peers receiving surgical treatment, but in the age group 65 and over, patients with surgical treatment received higher QALY ratings than recipients of conservative treatment. 4) The intensive care experience and QOL ratings were connected. The QOL indices were statistically highest for those with memories of intensive care as a positive experience, albeit their illness requiring intensive care was less serious than average. No statistically significant differences were found in the QOL indices of those with negative memories, no memories, or those who did not express the quality of their experiences.
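The QALY computation described above is a straightforward product of survival time and the EQ-5D utility index. A minimal sketch with hypothetical values, not the cohort's data:

```python
def qaly_units(periods):
    """Quality-Adjusted Life-Years: each (years, eq5d_index) period
    contributes its length weighted by the EQ-5D sum index, where
    1.0 represents full health and 0.0 a state equivalent to death."""
    return sum(years * index for years, index in periods)

# Hypothetical patient surviving 2 years at utility 0.8 and then
# 3 years at utility 0.6 accrues roughly 3.4 QALYs; at a constant
# utility the calculation reduces to years * index, as in the study.
total = qaly_units([(2, 0.8), (3, 0.6)])
```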
Abstract:
Pitfalls in the treatment of persons with dementia. Persons with dementia require high-quality health care, rehabilitation, and sufficient social services to support their autonomy and to postpone permanent institutionalization. This study sought to investigate possible pitfalls in the care of patients with dementia: hip fracture rehabilitation, use of inappropriate or antipsychotic medication, and the social and medico-legal services offered to dementia caregiving families. Three different Finnish samples from the years 1999-2005 were used, with mean ages of 78 to 86 years. After hip fracture surgery, a weight-bearing restriction, especially in the group of patients with dementia, was associated with a longer rehabilitation period (73.5 days vs. 45.5 days, p=0.03) and the inability to learn to walk within six weeks (p<0.001). Almost half (44%) of the pre-surgery home-dwellers with dementia in our sample required permanent hospitalization after hip fracture. Potentially inappropriate drugs (PIDs) were used by 36.2% of nursing home and hospital patients. The most common PIDs in Finland were temazepam over 15 mg/day, oxybutynin, and dipyridamole. However, PID use failed to predict mortality or the use of health services. Nearly half (48.4%) of the nursing home and hospital patients with dementia used antipsychotic medication. The two-year mortality did not differ among the users of conventional antipsychotics, the users of atypical antipsychotics, and the non-users (45.3% vs. 32.1% vs. 49.6%, p=0.195). The mean number of hospital admissions was highest among non-users (p=0.029). A high number of medications (HR 1.12, p<0.001) and the use of physical restraints (HR 1.72, p=0.034) predicted higher mortality at two years, while the use of atypical antipsychotics (HR 0.49, p=0.047) appeared, if anything, protective. 
The services most often offered to caregiving families of persons with Alzheimer's disease (AD) included financial support from the community (36%), technical devices (33%), physiotherapy (32%), and respite care in nursing homes (31%). The services most often needed included physiotherapy for the spouse with dementia (56%), financial support (50%), house cleaning (41%), and home respite (40%). Only a third of the caregivers were satisfied with these services, and 69% felt unable to influence the range of services offered. The use of legal guardians was quite rare (only 4.3%), while the use of financial powers of attorney was 37.8%. Almost half (47.9%) of the couples expressed an unmet need for discussion with their doctor about medico-legal issues, while only 9.9% stated that their doctor had informed them of such matters. Although we already have many practical methods to develop the medical and social care of persons with AD, these patients and their families require better planning and tailoring of such services. In this way, society could offer these elderly persons a better quality of life while economizing on its financial resources. This study was supported by the Social Insurance Institution of Finland, and part of it was made in cooperation with The Central Union for the Welfare of the Aged, Finland.
Abstract:
Stroke is the second leading cause of death and the leading cause of disability worldwide. Of all strokes, up to 80% to 85% are ischemic, and of these, less than 10% occur in young individuals. Stroke in young adults—most often defined as stroke occurring under the age of 45 or 50—can be particularly devastating due to the long expected life-span ahead and the marked socio-economic consequences. Current basic knowledge on ischemic stroke in this age group originates mostly from rather small and imprecise patient series. Regarding emergency treatment, systematic data on the use of intravenous thrombolysis are absent. For this thesis project, we collected detailed clinical and radiological data on all consecutive patients aged 15 to 49 with first-ever ischemic stroke treated at the Helsinki University Central Hospital between 1994 and 2007. The aims of the study were to define the demographic characteristics, risk factors, imaging features, etiology, and long-term mortality and its predictors in this patient population. We additionally sought to investigate whether intravenous thrombolysis is safe and beneficial for the treatment of acute ischemic stroke in the young. Of our 1008 patients, most were males (ratio 1.7:1), who clearly outnumbered females after the age of 44, but females were preponderant among those aged <30. Occurrence increased exponentially with age. The most frequent risk factors were dyslipidemia (60%), smoking (44%), and hypertension (39%). Risk factors accumulated in males and with aging. Cardioembolism (20%) and cervicocerebral artery dissection (15%) were the most frequent etiologic subgroups, followed by small-vessel disease (14%) and large-artery atherosclerosis (8%). A total of 33% had undetermined etiology. Left hemisphere strokes were more common in general. Posterior circulation infarcts were more common among those aged <45. Multiple brain infarcts were present in 23% of our patients, 13% had silent infarcts, and 5% had leukoaraiosis. 
Of those with silent brain infarcts, the majority (54%) had only a single lesion, and most of the silent infarcts were located in the basal ganglia (39%) and subcortical regions (21%). In a logistic regression analysis, type 1 diabetes mellitus in particular predicted the presence of both silent brain infarcts (odds ratio 5.78, 95% confidence interval 2.37-14.10) and leukoaraiosis (9.75; 3.39-28.04). We identified 48 young patients with hemispheric ischemic stroke treated with intravenous tissue plasminogen activator, alteplase. For comparison, we identified 96 untreated control patients matched by age, gender, and admission stroke severity, as well as 96 alteplase-treated older controls aged 50 to 79 matched by gender and stroke severity. Alteplase-treated young patients more often recovered completely (27% versus 10%, P=0.010) or had only mild residual symptoms (40% versus 22%, P=0.025) compared to the age-matched controls. None of the alteplase-treated young patients had symptomatic intracerebral hemorrhage or died within the 3-month follow-up. Overall long-term mortality was low in our patient population. Cumulative mortality risks were 2.7% (95% confidence interval 1.5-3.9%) at 1 month, 4.7% (3.1-6.3%) at 1 year, and 10.7% (9.9-11.5%) at 5 years. Among the 30-day survivors who died during the 5-year follow-up, more than half died of vascular causes. Malignancy, heart failure, heavy drinking, preceding infection, type 1 diabetes, increasing age, and large-artery atherosclerosis causing the index stroke independently predicted 5-year mortality when adjusted for age, gender, relevant risk factors, stroke severity, and etiologic subtype. In sum, young adults with ischemic stroke have distinct demographic patterns and frequently harbor traditional vascular risk factors. Etiology in the young is extremely diverse, but in as many as one-third the exact cause remains unknown. 
Silent brain infarcts and leukoaraiosis are not uncommon brain imaging findings in these patients and should not be overlooked due to their potential prognostic relevance. Outcomes in young adults with hemispheric ischemic stroke can safely be improved with intravenous thrombolysis. Furthermore, despite their overall low risk of death after ischemic stroke, several easily recognizable factors—of which most are modifiable—predict higher mortality in the long term in young adults.
Abstract:
The adequacy of anesthesia has been studied since the introduction of balanced general anesthesia. Commercial monitors based on electroencephalographic (EEG) signal analysis have been available for monitoring the hypnotic component of anesthesia since the beginning of the 1990s. Monitors measuring the depth of anesthesia assess the cortical function of the brain and have gained acceptance during surgical anesthesia with most of the anesthetic agents used. However, due to frequent artifacts, they are considered unsuitable for monitoring consciousness in intensive care patients. The assessment of analgesia is one of the cornerstones of general anesthesia. Prolonged surgical stress may lead to increased morbidity and delayed postoperative recovery. However, no validated monitoring method is currently available for evaluating analgesia during general anesthesia. Awareness during anesthesia is caused by an inadequate level of hypnosis. This rare but severe complication of general anesthesia may lead to marked emotional stress and possibly posttraumatic stress disorder. In the present series of studies, the incidence of awareness and recall during outpatient anesthesia was evaluated and compared with that in inpatient anesthesia. A total of 1500 outpatients and 2343 inpatients underwent a structured interview. Clear intraoperative recollections were rare, the incidence being 0.07% in outpatients and 0.13% in inpatients. No significant differences emerged between outpatients and inpatients. However, significantly smaller doses of sevoflurane were administered to outpatients with awareness than to those without recollections (p<0.05). EEG artifacts in 16 brain-dead organ donors were evaluated during organ harvest surgery in a prospective, open, nonselective study. The source of the frontotemporal biosignals in brain-dead subjects was studied, and the resistance of the bispectral index (BIS) and Entropy to signal artifacts was compared. 
The hypothesis was that in brain-dead subjects, most of the biosignals recorded from the forehead would consist of artifacts. The original EEG was recorded, and State Entropy (SE), Response Entropy (RE), and BIS were calculated and monitored during solid organ harvest. SE differed from zero (inactive EEG) in 28%, RE in 29%, and BIS in 68% of the total recording time (p<0.0001 for all). The median values during the operation were SE 0.0, RE 0.0, and BIS 3.0. In four of the 16 organ donors, the EEG was not inactive, and unphysiologically distributed, nonreactive rhythmic theta activity was present in the original EEG signal. After the results from subjects with persistent residual EEG activity were excluded, SE, RE, and BIS differed from zero in 17%, 18%, and 62% of the recorded time, respectively (p<0.0001 for all). Due to various artifacts, the highest readings in all indices were recorded without neuromuscular blockade. The main sources of artifacts were electrocauterization, electromyography (EMG), 50-Hz artifact, handling of the donor, ballistocardiography, and electrocardiography. In a prospective, randomized study of 26 patients, the ability of the Surgical Stress Index (SSI) to differentiate between two clinically different analgesic levels during shoulder surgery was evaluated. SSI values were lower in patients with an interscalene brachial plexus block than in patients without an additional plexus block. In all patients, anesthesia was maintained with desflurane, the concentration of which was targeted to maintain SE at 50. Increased blood pressure or heart rate (HR), movement, and coughing were considered signs of intraoperative nociception and were treated with alfentanil. Photoplethysmographic waveforms were collected from the arm contralateral to the operated side, and SSI was calculated offline. Two minutes after skin incision, SSI was not increased in the brachial plexus block group and was lower (38 ± 13) than in the control group (58 ± 13, p<0.005). 
Among the controls, one minute prior to alfentanil administration, the SSI value was higher than during periods of adequate antinociception, 59 ± 11 vs. 39 ± 12 (p<0.01). The total cumulative need for alfentanil was higher in the controls (2.7 ± 1.2 mg) than in the brachial plexus block group (1.6 ± 0.5 mg, p=0.008). Tetanic stimulation to the ulnar region of the hand increased SSI significantly only among patients with a brachial plexus block not covering the site of stimulation. The prognostic value of EEG-derived indices was evaluated and compared with transcranial Doppler ultrasonography (TCD), serum neuron-specific enolase (NSE), and S-100B after cardiac arrest. Thirty patients resuscitated from out-of-hospital cardiac arrest and treated with induced mild hypothermia for 24 h were included. The original EEG signal was recorded, and the burst suppression ratio (BSR), RE, SE, and wavelet subband entropy (WSE) were calculated. Neurological outcome during the six-month period after arrest was assessed with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). Twenty patients had a CPC of 1-2, one patient had a CPC of 3, and nine patients died (CPC 5). BSR, RE, and SE differed between the good (CPC 1-2) and poor (CPC 3-5) outcome groups (p=0.011, p=0.011, and p=0.008, respectively) during the first 24 h after arrest. WSE was borderline higher in the good outcome group between 24 and 48 h after arrest (p=0.050). All patients with status epilepticus died, and their WSE values were lower (p=0.022). S-100B was lower in the good outcome group upon arrival at the intensive care unit (p=0.010). After hypothermia treatment, NSE and S-100B values were lower (p=0.002 for both) in the good outcome group. The pulsatile index was also lower in the good outcome group (p=0.004). In conclusion, the incidence of awareness in outpatient anesthesia did not differ from that in inpatient anesthesia. 
Outpatients are not at increased risk for intraoperative awareness relative to inpatients undergoing general anesthesia. SE, RE, and BIS showed non-zero values that normally indicate cortical neuronal function, but in these subjects they were mostly due to artifacts after the clinical diagnosis of brain death. Entropy was more resistant to artifacts than BIS. During general anesthesia and surgery, SSI values were lower in patients with an interscalene brachial plexus block covering the sites of nociceptive stimuli. In detecting nociceptive stimuli, SSI performed better than HR, blood pressure, or RE. BSR, RE, and SE differed between the good and poor neurological outcome groups during the first 24 h after cardiac arrest, and they may aid in differentiating patients with good neurological outcomes from those with poor outcomes after out-of-hospital cardiac arrest.
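The burst suppression ratio used in the cardiac-arrest study has a simple standard definition: the fraction of an EEG epoch during which the signal is suppressed, expressed as a percentage. A minimal sketch follows; the function name and the binary suppression mask are illustrative assumptions, not the thesis's actual analysis code (which derived suppression from the raw EEG signal).

```python
# Illustrative burst suppression ratio (BSR): the percentage of an EEG
# epoch during which the signal is suppressed (amplitude below a
# detection threshold). The input here is a hypothetical per-sample
# binary suppression mask rather than a real EEG recording.
def burst_suppression_ratio(suppressed: list[bool]) -> float:
    """BSR as a percentage: suppressed samples / total samples * 100."""
    if not suppressed:
        raise ValueError("empty epoch")
    return 100.0 * sum(suppressed) / len(suppressed)

# An epoch with 30 of 100 samples suppressed gives a BSR of 30%.
mask = [True] * 30 + [False] * 70
print(burst_suppression_ratio(mask))  # 30.0
```

A completely inactive EEG corresponds to a BSR of 100%, which is why BSR tracked the difference between the good and poor outcome groups during the first 24 h.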
Abstract:
In the general population, the timing of puberty is normally distributed. This variation is determined by genetic and environmental factors, but the exact mechanisms underlying these influences remain elusive. The purpose of this study was to gain insight into the genetic regulation of pubertal timing. The contributions of genetic versus environmental factors to the normal variation of pubertal timing were explored in twins. Familial occurrence and inheritance patterns of constitutional delay of growth and puberty, CDGP (a variant of normal pubertal timing), were studied in pedigrees of patients with this condition. To ultimately detect genes involved in the regulation of pubertal timing, genetic loci conferring susceptibility to CDGP were mapped by linkage analysis in the same family cohort. To subdivide the overall phenotypic variance of pubertal timing into genetic and environmental components, genetic modeling based on monozygotic twins sharing 100% and dizygotic twins sharing 50% of their genes was used in 2309 girls and 1828 boys from the FinnTwin 12-17 study. The timing of puberty was estimated from height growth, i.e. the change in relative height between the age when pubertal growth velocity peaks in the general population and adulthood. This reflects the percentage of adult height achieved at the average peak height velocity age and, thus, pubertal timing. Boys and girls diagnosed with CDGP were gathered through medical records from six pediatric clinics in Finland. First-degree relatives of the probands were invited to participate by letter; altogether, 286 families were recruited. When possible, families were extended to also include second-, third-, or fourth-degree relatives. The timing of puberty in all family members was primarily assessed from longitudinal growth data. Delayed puberty was defined by the onset of the pubertal growth spurt or peak height velocity taking place 1.5 SD (relaxed criterion) or 2 SD (strict criterion) beyond the mean. 
If growth data were unavailable, pubertal timing was based on interviews. In this case, the CDGP criteria were set as having undergone pubertal development more than 2 years (strict criterion) or 1.5 years (relaxed criterion) later than their peers, or menarche after 15 years (strict criterion) or 14 years (relaxed criterion). The familial occurrence of strict CDGP was explored in the families of 124 patients (95 males and 29 females) from two clinics in Southern Finland. In the linkage analysis, we used the relaxed CDGP criteria; 52 families with solely growth data-based CDGP diagnoses were selected from all clinics. Based on the twin data, genetic factors explain 86% and 82% of the variance of pubertal timing in girls and boys, respectively. In the families, 80% of male and 76% of female probands had affected first-degree relatives, in whom CDGP was 15 times more common than expected (2.5%). In 74% (17 of 23) of the extended families with only one affected parent, familial patterns were consistent with autosomal dominant inheritance. By using 383 multiallelic markers and subsequently fine-mapping with 25 additional markers, significant linkage for CDGP was detected to the pericentromeric region of chromosome 2, at 2p13-2q13 (multipoint HLOD 4.44, α 0.41). The findings of the large twin study imply that the vast majority of the normal variation of pubertal timing is attributable to genetic effects. Moreover, the high frequency of dominant inheritance patterns and the large number of affected relatives of CDGP patients suggest that genetic factors also markedly contribute to constitutional delay of puberty. Detection of the locus 2p13-2q13 in the pericentromeric region of chromosome 2 associated with CDGP is one step towards unraveling the genes that determine pubertal timing.
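The twin-modeling logic above rests on the contrast between monozygotic twins (sharing 100% of their genes) and dizygotic twins (sharing 50% on average). The classical Falconer approximation captures it in one line: heritability is roughly twice the difference between MZ and DZ twin-pair correlations. The study itself used formal variance-component modeling, so the sketch below is a simplification, and the correlations in the example are hypothetical values, not the study's data.

```python
# Falconer's approximation of (broad-sense) heritability from twin data:
# h2 = 2 * (r_MZ - r_DZ), where r_MZ and r_DZ are the phenotypic
# correlations within monozygotic and dizygotic twin pairs.
# This is a textbook simplification of the study's variance-component
# modeling; the example inputs are hypothetical.
def falconer_h2(r_mz: float, r_dz: float) -> float:
    """Heritability estimate, clipped to the interpretable [0, 1] range."""
    return max(0.0, min(1.0, 2.0 * (r_mz - r_dz)))

# Hypothetical pair correlations of 0.86 (MZ) and 0.43 (DZ) would imply
# that genetic effects explain about 86% of the phenotypic variance.
print(falconer_h2(0.86, 0.43))
```

The intuition: MZ pairs differ only through environment, while DZ pairs also differ through the half of their genes they do not share, so doubling the correlation gap isolates the genetic component.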
Abstract:
Acute renal failure (ARF) is a clinical syndrome characterized by a rapidly decreasing glomerular filtration rate, which results in disturbances in electrolyte and acid-base homeostasis, derangement of extracellular fluid volume, and retention of nitrogenous waste products, and is often associated with decreased urine output. ARF affects about 5-25% of patients admitted to intensive care units (ICUs) and is linked to high mortality and morbidity rates. In this thesis, the outcome of critically ill patients with ARF and the factors related to outcome were evaluated. A total of 1662 patients from two ICUs and one acute dialysis unit in Helsinki University Hospital were included. In Study I, the prevalence of ARF was calculated and classified according to two ARF-specific scoring methods, the RIFLE classification and the classification created by Bellomo et al. (2001). Study II evaluated monocyte human histocompatibility leukocyte antigen-DR (HLA-DR) expression and plasma levels of the proinflammatory cytokines interleukin (IL) 6 and IL-8 and the anti-inflammatory cytokine IL-10 in predicting survival of critically ill ARF patients. Study III investigated serum cystatin C as a marker of renal function in ARF and its power in predicting survival of critically ill ARF patients. Study IV evaluated the effect of intermittent hemodiafiltration (HDF) on myoglobin elimination from plasma in severe rhabdomyolysis. Study V assessed long-term survival and health-related quality of life (HRQoL) in ARF patients. Neither of the ARF-specific scoring methods showed good discriminative power regarding hospital mortality. The maximum RIFLE score for the first three days in the ICU was an independent predictor of hospital mortality. As a marker of renal dysfunction, serum cystatin C failed to show benefit over plasma creatinine in detecting ARF or predicting patient survival. 
Neither cystatin C, nor the plasma concentrations of IL-6, IL-8, and IL-10, nor monocyte HLA-DR expression were clinically useful in predicting mortality in ARF patients. HDF may be used to clear myoglobin from plasma in rhabdomyolysis, especially if alkalinization of the diuresis does not succeed. The long-term survival of patients with ARF was found to be poor, and the HRQoL of those who survive is lower than that of the age- and gender-matched general population.
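The RIFLE classification used in Study I grades severity by the fold-increase of serum creatinine over baseline: Risk at a ≥1.5-fold increase, Injury at ≥2-fold, and Failure at ≥3-fold. A hedged sketch of that creatinine arm follows; the function name and example values are illustrative, and the parallel urine-output criteria and the outcome classes (Loss, End-stage kidney disease) are omitted for brevity.

```python
# Hedged sketch of the creatinine arm of the RIFLE classification:
# severity is graded by the ratio of current to baseline serum
# creatinine. Urine-output criteria and the Loss/End-stage outcome
# classes of full RIFLE are intentionally left out.
def rifle_creatinine_class(baseline: float, current: float) -> str:
    """Map the fold-increase in serum creatinine to a RIFLE severity class."""
    if baseline <= 0:
        raise ValueError("baseline creatinine must be positive")
    ratio = current / baseline
    if ratio >= 3.0:
        return "Failure"
    if ratio >= 2.0:
        return "Injury"
    if ratio >= 1.5:
        return "Risk"
    return "No ARF"

# A rise from 80 to 250 umol/l (ratio ~3.1) classifies as Failure.
print(rifle_creatinine_class(80, 250))  # Failure
```

Taking the worst class reached over the first three ICU days, as the thesis does for its maximum RIFLE score, would amount to applying this mapping daily and keeping the most severe result.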
Abstract:
Atopy-related allergic diseases, i.e. allergic rhinoconjunctivitis, atopic dermatitis, and asthma, have increased in frequency in the industrialized countries. In order to reverse this trend, effective preventive strategies need to be developed. This requires a better understanding of the early-life events leading to the expression of the atopic phenotype. The present study aimed to define early-life factors and markers associated with the subsequent development of allergic diseases in a cohort of 200 healthy, unselected Finnish newborns prospectively followed from birth to age 20 years. Their mothers were encouraged to start and maintain exclusive breastfeeding for as long as it was nutritionally sufficient for the infant. Consequently, all the infants received some period of exclusive breastfeeding: 58% of the infants were exclusively breastfed for the first 6 months of life, and 18% for at least the first 9 months. Of the infants, 42% had a family history of allergy. After the first year of follow-up, the children were re-assessed at ages 5, 11, and 20 years with clinical examination, skin prick testing, and parental and personal interviews. Exclusive breastfeeding for over 9 months was associated with atopic dermatitis and symptoms of food hypersensitivity at age 5 years, and with symptoms of food hypersensitivity at age 11 years in the children with a familial allergy. Subjects with allergic symptoms or a positive skin prick test in childhood or adolescence had lower retinol concentrations during their infancy and childhood than the others. An elevated cord serum immunoglobulin E (IgE) concentration predicted subsequent atopic manifestations, though with modest sensitivity. Children and adolescents with allergic symptoms, skin prick test positivity, and an elevated IgE had lower total cholesterol levels in infancy and childhood than the nonatopic subjects. 
In conclusion, prolonging strictly exclusive breastfeeding beyond 9 months of age did not help prevent allergic symptoms; instead, it was associated with increased atopic dermatitis and food hypersensitivity symptoms in childhood. Due to its modest sensitivity, cord serum IgE is not an effective screening method for atopic predisposition in the general population. Retinol and cholesterol concentrations in infancy were inversely associated with the subsequent development of allergic symptoms. Based on these findings, it is proposed that there may be differences in the inborn regulation of retinol and cholesterol levels in children with and without a genetic susceptibility to atopy, and that these may play a role in the development of atopic sensitization and allergic diseases.