Abstract:
Severe sepsis is common and associated with high costs of care and significant mortality. The incidence of severe sepsis has been reported to vary between 0.5/1,000 and 3/1,000 in different studies. The worldwide Surviving Sepsis Campaign, guidelines and treatment protocols aim to decrease the high morbidity and mortality associated with severe sepsis. Various mediators of inflammation, such as high mobility group box-1 protein (HMGB1) and vascular endothelial growth factor (VEGF), have been studied as markers of severity of illness and outcome in severe sepsis. Long-term survival with quality of life (QOL) assessment is an important outcome after severe sepsis. The objective of this study was to evaluate the incidence, severity of organ dysfunction and outcome of severe sepsis in intensive care patients in Finland (study I). HMGB1 and VEGF were studied as predictors of severity of illness, development and type of organ dysfunction, and hospital mortality (studies II and III). The long-term outcome and quality of life were assessed, and quality-adjusted life years (QALYs) and the cost per QALY were estimated (study IV). A total of 470 patients with severe sepsis were included in the Finnsepsis study. Patients were treated in 24 Finnish intensive care units during a 4-month period from 1 November 2004 to 28 February 2005. The incidence of severe sepsis was 0.38/1,000 in the adult population (95% confidence interval 0.34-0.41). Septic shock (77%), severe oxygenation impairment (71.4%) and acute renal failure (23.2%) were the most common organ failures. The ICU, hospital, one-year and two-year mortalities were 15.5%, 28.3%, 40.9% and 44.9%, respectively. HMGB1 and VEGF were elevated in patients with severe sepsis. VEGF concentrations were lower in non-survivors than in survivors, but HMGB1 levels did not differ between the groups. Neither HMGB1 nor VEGF was predictive of hospital mortality. QOL was measured a median of 17 months after severe sepsis and was lower than in the reference population.
The mean QALY gain for a surviving patient was 15.2 years and the cost per QALY was 2,139 euros. The study showed that the incidence of severe sepsis is lower in Finland than in other countries. The short-term outcome is comparable with that in other countries, but the long-term outcome is poor. HMGB1 and VEGF are not useful in predicting mortality in severe sepsis. As the mean QALY gain for a surviving patient is 15.2 years and the cost per QALY is reasonably low, intensive care is cost-effective in patients with severe sepsis.
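The cost-utility figure above is simple arithmetic: cost per QALY is the total cost of care divided by the total QALYs gained. The Python sketch below illustrates the calculation; the total-cost and survivor figures are hypothetical (the abstract reports only the mean QALY gain and the resulting per-QALY cost), so this shows the method, not the study's actual data.

```python
# Sketch of the cost-utility arithmetic used in study IV.
# The total cost and survivor count below are hypothetical, chosen only so
# the result lands near the reported figure of roughly 2,139 euros per QALY.

def cost_per_qaly(total_cost: float, survivors: int, mean_qaly_per_survivor: float) -> float:
    """Cost per quality-adjusted life year: total cost / total QALYs gained."""
    total_qalys = survivors * mean_qaly_per_survivor
    return total_cost / total_qalys

# Mean QALY gain of 15.2 per survivor is from the abstract; the rest is made up.
print(round(cost_per_qaly(total_cost=10_960_000, survivors=337,
                          mean_qaly_per_survivor=15.2)))  # ~2140 with these inputs
```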
Abstract:
Infection is a major cause of mortality and morbidity after thoracic organ transplantation. The aim of the present study was to evaluate infectious complications after lung and heart transplantation, with special emphasis on the usefulness of bronchoscopy and the demonstration of cytomegalovirus (CMV), human herpes virus (HHV)-6, and HHV-7. We reviewed all the consecutive bronchoscopies performed on heart transplant recipients (HTRs) from May 1988 to December 2001 (n = 44) and lung transplant recipients (LTRs) from February 1994 to November 2002 (n = 472). To compare different assays in the detection of CMV, a total of 21 thoracic organ transplant recipients were prospectively monitored with CMV pp65 antigenemia, DNAemia (PCR), and mRNAemia (NASBA) tests. The antigenemia test served as the reference assay for therapeutic intervention. In addition to CMV antigenemia, 22 LTRs were monitored for HHV-6 and HHV-7 antigenemia. The diagnostic yield of the clinically indicated bronchoscopies was 41% in the HTRs and 61% in the LTRs. The utility of bronchoscopy was highest from one to six months after transplantation. In contrast, the findings from the surveillance bronchoscopies performed on LTRs led to a change in the previous treatment in only 6% of the cases. Pneumocystis carinii and CMV were the most commonly detected pathogens. Furthermore, 15 (65%) of the P. carinii infections in the LTRs were detected during chemoprophylaxis. None of the complications of the bronchoscopies was fatal. Antigenemia, DNAemia, and mRNAemia were present in 98%, 72%, and 43% of the CMV infections, respectively. The optimal DNAemia cut-off levels (sensitivity/specificity) were 400 (75.9%/92.7%), 850 (91.3%/91.3%), and 1,250 (100%/91.5%) copies/ml for antigenemia levels of 2, 5, and 10 pp65-positive leukocytes/50,000 leukocytes, respectively. The sensitivities of NASBA in detecting the same cut-off levels were 25.9%, 43.5%, and 56.3%.
CMV DNAemia was detected in 93% and mRNAemia in 61% of the CMV antigenemias requiring antiviral therapy. HHV-6, HHV-7, and CMV antigenemia were detected in 20 (91%), 11 (50%), and 12 (55%) of the 22 LTRs (at a median of 16, 31, and 165 days), respectively. HHV-6 appeared in 15 (79%), HHV-7 in seven (37%), and CMV in one (7%) of these patients during ganciclovir or valganciclovir prophylaxis. One case of pneumonitis and another of encephalitis were associated with HHV-6. In conclusion, bronchoscopy is a safe and useful diagnostic tool in LTRs and HTRs with a suspected respiratory infection, but the role of surveillance bronchoscopy in LTRs remains controversial. The PCR assay performs comparably with the antigenemia test in guiding pre-emptive therapy against CMV when threshold levels of over 5 pp65-antigen-positive leukocytes are used. In contrast, the low sensitivity of NASBA limits its usefulness. HHV-6 and HHV-7 activation is common after lung transplantation despite ganciclovir or valganciclovir prophylaxis, but clinical manifestations are infrequently linked to these viruses.
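The cut-off sensitivities and specificities above come from dichotomizing the quantitative DNAemia result at a threshold and comparing it against the antigenemia reference. A minimal Python sketch of that computation follows; the data values are made up for illustration and are not the study's.

```python
# Sensitivity and specificity of a quantitative assay (e.g. CMV DNAemia in
# copies/ml) dichotomized at a cut-off, against a binary reference result
# (e.g. antigenemia >= 5 pp65-positive leukocytes). Toy data only.

def sens_spec(values, reference_positive, cutoff):
    """Return (sensitivity, specificity) of the rule `value >= cutoff`."""
    tp = sum(1 for v, ref in zip(values, reference_positive) if ref and v >= cutoff)
    fn = sum(1 for v, ref in zip(values, reference_positive) if ref and v < cutoff)
    tn = sum(1 for v, ref in zip(values, reference_positive) if not ref and v < cutoff)
    fp = sum(1 for v, ref in zip(values, reference_positive) if not ref and v >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

dnaemia = [100, 900, 700, 2000, 300, 1200]          # copies/ml (made up)
antigenemia_pos = [False, True, True, True, False, False]
print(sens_spec(dnaemia, antigenemia_pos, 850))      # ~0.67 sensitivity, ~0.67 specificity
```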
Abstract:
Fatigue fracture is an overuse injury commonly encountered in military and sports medicine, and is known to be related to intensive or recently intensified physical activity. Bone responds to increased stress by enhanced remodeling. If physical stress exceeds the bone's capacity to remodel, accumulation of microfractures can lead to bone fatigue and stress fracture. Clinical diagnosis of stress fractures is complex and is based on the patient's anamnesis and radiological imaging. Bone stress fractures are mostly low-risk injuries that heal well after non-operative management; however, stress fractures occurring in high-risk areas can progress to displacement, often necessitating surgical treatment and resulting in prolonged morbidity. In the current study, the role of vitamin D as a predisposing factor for fatigue fractures was assessed using serum 25OHD level as the index. The average serum 25OHD concentration was significantly lower in conscripts with fatigue fracture than in controls. When the bone resorption marker TRACP-5b was evaluated as an indicator of fatigue fractures, patients with elevated serum TRACP-5b levels had an eight times higher probability of sustaining a stress fracture than controls. Among the 154 patients with exercise-induced anterior lower leg pain and no previous findings on plain radiography, MRI revealed a total of 143 bone stress injuries in 86 patients. In 99% of the cases the injuries were in the tibia, 57% in the distal third of the tibial shaft. Forty-nine (57%) of the patients with an injury exhibited bilateral stress injuries. In a 20-year follow-up, the incidence of femoral neck fatigue fractures was 20.8/100,000 prior to the Finnish Defence Forces' new regimen of 1986 addressing prevention of these fractures, but rose to 53.2/100,000 afterwards, a significant 2.6-fold increase.
In nineteen subjects with displaced femoral neck fatigue fractures, ten early local complications (in the first postoperative year) were evident; after the first postoperative year, osteonecrosis of the femoral head was found in six patients and osteoarthritis of the hip in thirteen. It seems likely that low vitamin D levels are related to fatigue fractures, and that elevation of the bone resorption marker TRACP-5b is associated with increasing fatigue fracture incidence. Though seldom detected by plain radiography, fatigue fractures often underlie unclear stress-related lower leg pain occurring in the distal parts of the tibia. Displaced femoral neck fatigue fractures lead to long-term morbidity in a high percentage of patients, whereas non-displaced fractures do not predispose patients to subsequent adverse complications. Importantly, an educational intervention can diminish the incidence of fracture displacement by enhancing awareness and providing instructions for earlier diagnosis of fatigue fractures.
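The 2.6-fold figure above is a simple rate ratio between the two incidence estimates. A short Python sketch of the arithmetic follows; only the two rates are from the abstract, and the case and person-year counts in the example are hypothetical.

```python
# Incidence expressed as cases per 100,000 person-years, and the rate ratio
# between two periods. The case/person-year inputs below are made up; only
# the 20.8 and 53.2 per-100,000 rates come from the abstract.

def incidence_per_100k(cases: int, person_years: float) -> float:
    return cases / person_years * 100_000

rate_before = 20.8   # per 100,000, pre-1986 (from the abstract)
rate_after = 53.2    # per 100,000, post-1986 (from the abstract)
print(round(rate_after / rate_before, 1))  # prints 2.6
```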
Abstract:
Chronic venous disease (CVD), including uncomplicated varicose veins and chronic venous insufficiency, is one of the most common medical conditions in the Western world. The central feature of CVD is venous reflux, which may be primary, congenital, or result from an antecedent event, usually an acute deep venous thrombosis (DVT). When the history of DVT is clear, the clinical manifestations of secondary CVD are commonly referred to as the post-thrombotic syndrome. Regardless of the underlying etiology, the final pathway leading to symptoms is ambulatory venous hypertension. The spectrum of symptoms and signs of CVD ranges from minor cosmetic problems to venous ulceration, which results in considerable morbidity and increased medical costs. The aims of this study were to evaluate the outcome of superficial venous surgery performed with or without preoperative duplex evaluation and venous marking with hand-held Doppler, to assess the short-term outcome of ultrasound-guided foam sclerotherapy in patients with axial superficial venous incompetence, to compare reflux patterns after catheter-directed and systemic thrombolysis of deep iliofemoral venous thrombosis, and to evaluate the long-term outcome of deep venous reconstructions for severe chronic venous insufficiency. The study consists of five separate retrospective projects and includes 315 patients. Of these, 133 patients had undergone superficial venous surgery 2 to 5 years earlier, either according to a preoperative duplex examination and venous marking, according to clinical evaluation alone, or according to a written plan without venous marking. A total of 112 patients had undergone ultrasound-guided foam sclerotherapy 5.5 to 16.5 months earlier. In addition, 32 patients had received either catheter-directed or systemic thrombolysis for DVT 2 to 3 years earlier, and 38 patients had undergone deep venous reconstructions 2 to 7 years earlier.
In the present studies, some venous reflux was present postoperatively irrespective of the method of evaluation or ablation of the reflux. It seemed, however, that preoperative examination with duplex ultrasound and marking of the reflux sites by the operating surgeon before the operation improved the outcome of superficial venous surgery. Ultrasound-guided foam sclerotherapy was effective in eliminating venous reflux in selected cases at short-term follow-up. Catheter-directed thrombolysis for deep iliofemoral venous thrombosis reduces later reflux and most probably the development of post-thrombotic syndrome as well. The outcome of deep venous reconstructions, especially for post-thrombotic deep venous incompetence, is poor. Thus, prevention of valvular damage by active treatment of deep venous thrombosis is important.
Abstract:
Background and Purpose: Subarachnoid hemorrhage (SAH) caused by rupture of a saccular cerebral artery aneurysm (SCAA) is often fatal (in 35-50% of cases) and affects mainly the working-aged population. The incidence of SAH is 10-11/100,000 in Western countries and twice as high in Finland and Japan. The estimated prevalence of SCAAs is around 2%, and many of these never rupture. Currently, however, there are no diagnostic methods to distinguish rupture-prone SCAAs from quiescent (dormant) ones. Finding diagnostic markers for rupture-prone SCAAs is of primary importance, since SCAA rupture has such a sinister outcome and all current treatment modalities are associated with morbidity and mortality. The therapies that prevent SCAA rupture also need to be developed to be as minimally invasive as possible. Although the clinical risk factors for SCAA rupture have been extensively studied and documented in large patient series, the cellular and molecular mechanisms by which these risk factors lead to SCAA wall rupture remain incompletely known. Elucidation of the molecular and cellular pathobiology of the SCAA wall is needed in order to develop i) novel diagnostic tools that could identify rupture-prone SCAAs or patients at risk of SAH, and ii) novel biological therapies that prevent SCAA wall rupture. Materials and Methods: In this study, histological samples from unruptured and ruptured SCAAs and plasma samples from SCAA carriers were compared in order to identify structural changes, cell populations, growth factor receptors, or other molecular markers that associate with SCAA wall rupture. In addition, experimental saccular aneurysm models and experimental models of mechanical vascular injury were used to study the cellular mechanisms of scar formation in the arterial wall and the adaptation of the arterial wall to increased mechanical stress.
Results and Interpretation: Inflammation and degeneration of the SCAA wall, namely loss of mural cells and degradation of the wall matrix, were found to associate with rupture. Unruptured SCAA walls structurally resembled pads of myointimal hyperplasia, or so-called neointima, which characterizes early atherosclerotic lesions and is the repair and adaptation mechanism of the arterial wall after injury or increased mechanical stress. As in pads of myointimal hyperplasia elsewhere in the vasculature, oxidized LDL (OxLDL) was found in the SCAA walls. Immunity against OxLDL was demonstrated in SAH patients by the detection of circulating anti-oxidized LDL antibodies, which were significantly associated with the risk of rupture in patients with solitary SCAAs. Growth factor receptors associated with arterial wall remodeling and angiogenesis were more strongly expressed in ruptured SCAA walls. In experimental saccular aneurysm models, capillary growth, arterial wall remodeling and neointima formation were found. The neointimal cells were shown to originate from the experimental aneurysm wall, with a minor contribution from the adjacent artery and a negligible contribution from bone marrow-derived neointimal cells. Since loss of mural cells characterizes ruptured human SCAAs and likely impairs the adaptation and repair mechanisms of ruptured or rupture-prone SCAAs, we also investigated the hypothesis that bone marrow-derived or circulating neointimal precursor cells could be used to enhance neointima formation and compensate for the impaired repair capacity of ruptured SCAA walls. However, no significant contribution of bone marrow cells or circulating mononuclear cells to neointima formation was found.
Abstract:
The purpose of this work was to elucidate the ontogeny of interleukin-10 (IL-10) secretion from newborn mononuclear cells (MCs), and to examine its relation to the secretion of interferon-gamma (IFN-γ) and immunoglobulins (Igs). The initial hypothesis was that the decreased Ig synthesis of newborn babies results from immature regulation of cytokine synthesis, which would lead to excessive IL-10 production, in turn suppressing IFN-γ secretion. Altogether 57 full-term newborns and 34 adult volunteers were enrolled. Additionally, the surface marker compositions of 29 premature babies were included. Enzyme-linked immunoassays were used to determine the amounts of secreted IL-10, IFN-γ, and Igs, and the surface marker composition of the MCs was analyzed with a FACScan flow cytometer. The three most important findings were: 1. Cord blood MCs, including CD5+ B cells, are able to secrete IL-10. However, compared with adults, the secretion of IL-10 was decreased. This indicates that reasons other than excessive IL-10 secretion are responsible for the reduced IFN-γ secretion in newborns. 2. As illustrated by the IL-10 and IFN-γ secretion pattern, the newborn cytokine profile was skewed towards the Th2 type. However, approximately 25% of the newborns had an adult-like cytokine profile with good secretion of both IL-10 and IFN-γ, demonstrating that full-term newborns are not an immunologically homogeneous group at the time of birth. 3. There were significant differences in the surface marker composition of MCs between individual neonates. While gestational age correlated with the proportion of some MC types, it is evident that many other maternal and fetal factors influence the maturity and nature of lymphocyte subpopulations in individual neonates. In conclusion, the reduced ability of neonates to secrete Igs and IFN-γ is not a consequence of high IL-10 secretion.
However, individual newborns differ significantly in their ability to secrete cytokines as well as Igs.
Abstract:
Atrial fibrillation (AF) is the most common arrhythmia requiring treatment. This thesis investigated AF with specific emphasis on atrial remodeling, analysed from epidemiological, clinical and magnetocardiographic (MCG) perspectives. In the first study, we evaluated in real-life clinical practice a population-based cohort of AF patients referred for their first elective cardioversion (CV). 183 consecutive patients were included, and sinus rhythm (SR) was restored in 153 (84%). Only 39 (25%) of those maintained SR for one year. Shorter duration of AF and the use of sotalol were the only characteristics associated with better restoration and maintenance of SR. During the one-year follow-up, 40% of the patients ended up in permanent AF. Female gender and older age were associated with the acceptance of permanent AF. The LIFE trial was a prospective, randomised, double-blinded study that evaluated losartan and atenolol in patients with hypertension and left ventricular hypertrophy (LVH). Of the 8,851 patients with SR at baseline and without a history of AF, 371 developed new-onset AF during the study. Patients with new-onset AF had an increased risk of cardiac events and stroke, and an increased rate of hospitalisation for heart failure. Younger age, female gender, lower systolic blood pressure, lesser LVH in the ECG and randomisation to losartan therapy were independently associated with a lower frequency of new-onset AF. The impact of AF on morbidity and mortality was evaluated in a post-hoc analysis of the OPTIMAAL trial, which compared losartan with captopril in patients with acute myocardial infarction (AMI) and evidence of LV dysfunction. Of the 5,477 randomised patients, 655 had AF at baseline, and 345 developed new AF during the follow-up period (median 3.0 years). Older patients and patients with signs of more serious heart disease more often had AF at baseline and developed it during follow-up.
Patients with AF at baseline had an increased risk of mortality (hazard ratio (HR) 1.32) and stroke (HR 1.77). New-onset AF was associated with increased mortality (HR 1.82) and stroke (HR 2.29). In the fourth study, we assessed the reproducibility of our MCG method. This method was used in the fifth study, in which 26 patients with persistent AF, immediately after CV, had a longer P-wave duration and a higher energy of the last portion of the atrial signal (RMS40) in MCG, increased P-wave dispersion in SAECG, and decreased atrial pump function as well as enlarged atrial diameter in echocardiography compared with age- and disease-matched controls. After one month in SR, P-wave duration in MCG still remained longer and left atrial (LA) diameter greater compared with the controls, while the other measurements had returned to the level of the control group. In conclusion, AF is not a rare condition either in the general population or in patients with hypertension or AMI, and it is associated with an increased risk of morbidity and mortality. Therefore, atrial remodeling, which increases the likelihood of AF and also seems to be relatively stable, has to be identified and prevented. MCG was found to be an encouraging new method for studying electrical atrial remodeling and reverse remodeling. RAAS-suppressing medications appear to be the most promising means of preventing atrial remodeling and AF.
Abstract:
The adequacy of anesthesia has been studied since the introduction of balanced general anesthesia. Commercial monitors based on electroencephalographic (EEG) signal analysis have been available for monitoring the hypnotic component of anesthesia since the beginning of the 1990s. Monitors measuring the depth of anesthesia assess the cortical function of the brain and have gained acceptance during surgical anesthesia with most of the anesthetic agents used. However, due to frequent artifacts, they are considered unsuitable for monitoring consciousness in intensive care patients. The assessment of analgesia is one of the cornerstones of general anesthesia. Prolonged surgical stress may lead to increased morbidity and delayed postoperative recovery. However, no validated monitoring method is currently available for evaluating analgesia during general anesthesia. Awareness during anesthesia is caused by an inadequate level of hypnosis. This rare but severe complication of general anesthesia may lead to marked emotional stress and possibly posttraumatic stress disorder. In the present series of studies, the incidence of awareness and recall during outpatient anesthesia was evaluated and compared with that in inpatient anesthesia. A total of 1,500 outpatients and 2,343 inpatients underwent a structured interview. Clear intraoperative recollections were rare, the incidence being 0.07% in outpatients and 0.13% in inpatients. No significant differences emerged between outpatients and inpatients. However, significantly smaller doses of sevoflurane were administered to outpatients with awareness than to those without recollections (p<0.05). EEG artifacts in 16 brain-dead organ donors were evaluated during organ harvest surgery in a prospective, open, nonselective study. The source of the frontotemporal biosignals in brain-dead subjects was studied, and the resistance of the bispectral index (BIS) and Entropy to signal artifacts was compared.
The hypothesis was that in brain-dead subjects, most of the biosignals recorded from the forehead would consist of artifacts. The original EEG was recorded, and State Entropy (SE), Response Entropy (RE), and BIS were calculated and monitored during solid organ harvest. SE differed from zero (inactive EEG) in 28%, RE in 29%, and BIS in 68% of the total recording time (p<0.0001 for all). The median values during the operation were SE 0.0, RE 0.0, and BIS 3.0. In four of the 16 organ donors, the EEG was not inactive, and unphysiologically distributed, nonreactive rhythmic theta activity was present in the original EEG signal. After the results from subjects with persistent residual EEG activity were excluded, SE, RE, and BIS differed from zero in 17%, 18%, and 62% of the recorded time, respectively (p<0.0001 for all). Due to various artifacts, the highest readings in all indices were recorded without neuromuscular blockade. The main sources of artifacts were electrocauterization, electromyography (EMG), 50-Hz artifact, handling of the donor, ballistocardiography, and electrocardiography. In a prospective, randomized study of 26 patients, the ability of the Surgical Stress Index (SSI) to differentiate between two clinically different analgesic levels during shoulder surgery was evaluated. SSI values were lower in patients with an interscalene brachial plexus block than in patients without an additional plexus block. In all patients, anesthesia was maintained with desflurane, the concentration of which was targeted to maintain SE at 50. Increased blood pressure or heart rate (HR), movement, and coughing were considered signs of intraoperative nociception and were treated with alfentanil. Photoplethysmographic waveforms were collected from the arm contralateral to the operated side, and SSI was calculated offline. Two minutes after skin incision, SSI was not increased in the brachial plexus block group and was lower (38 ± 13) than in the control group (58 ± 13, p<0.005).
Among the controls, one minute prior to alfentanil administration, the SSI value was higher than during periods of adequate antinociception, 59 ± 11 vs. 39 ± 12 (p<0.01). The total cumulative need for alfentanil was higher in the controls (2.7 ± 1.2 mg) than in the brachial plexus block group (1.6 ± 0.5 mg, p=0.008). Tetanic stimulation of the ulnar region of the hand increased SSI significantly only in patients with a brachial plexus block not covering the site of stimulation. The prognostic value of EEG-derived indices was evaluated and compared with transcranial Doppler ultrasonography (TCD), serum neuron-specific enolase (NSE) and S-100B after cardiac arrest. Thirty patients resuscitated from out-of-hospital cardiac arrest and treated with induced mild hypothermia for 24 h were included. The original EEG signal was recorded, and the burst suppression ratio (BSR), RE, SE, and wavelet subband entropy (WSE) were calculated. Neurological outcome during the six-month period after arrest was assessed with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). Twenty patients had a CPC of 1-2, one patient had a CPC of 3, and nine patients died (CPC 5). BSR, RE, and SE differed between the good (CPC 1-2) and poor (CPC 3-5) outcome groups (p=0.011, p=0.011, and p=0.008, respectively) during the first 24 h after arrest. WSE was borderline higher in the good outcome group between 24 and 48 h after arrest (p=0.050). All patients with status epilepticus died, and their WSE values were lower (p=0.022). S-100B was lower in the good outcome group upon arrival at the intensive care unit (p=0.010). After hypothermia treatment, NSE and S-100B values were lower (p=0.002 for both) in the good outcome group. The pulsatile index was also lower in the good outcome group (p=0.004). In conclusion, the incidence of awareness in outpatient anesthesia did not differ from that in inpatient anesthesia.
Outpatients are not at increased risk for intraoperative awareness relative to inpatients undergoing general anesthesia. SE, RE, and BIS showed non-zero values that would normally indicate cortical neuronal function, but in these subjects the values were mostly due to artifacts after the clinical diagnosis of brain death. Entropy was more resistant to artifacts than BIS. During general anesthesia and surgery, SSI values were lower in patients with an interscalene brachial plexus block covering the sites of nociceptive stimuli. In detecting nociceptive stimuli, SSI performed better than HR, blood pressure, or RE. BSR, RE, and SE differed between the good and poor neurological outcome groups during the first 24 h after cardiac arrest, and they may aid in differentiating patients with good neurological outcomes from those with poor outcomes after out-of-hospital cardiac arrest.
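Of the EEG-derived indices above, the burst suppression ratio is the simplest to state: the fraction of an EEG epoch spent in suppression, i.e. with amplitude below a small threshold. The Python sketch below illustrates that idea only; the 5 µV threshold is a common textbook choice, and real monitors additionally require suppression to persist for a minimum duration, which is omitted here.

```python
# Simplified burst suppression ratio (BSR): the percentage of samples in an
# EEG epoch whose amplitude stays below a small threshold. Real BSR
# implementations also require suppression to last a minimum duration
# (commonly ~0.5 s); that refinement is omitted in this sketch.

def burst_suppression_ratio(samples, threshold_uv=5.0):
    """Percentage of samples (in microvolts) with |amplitude| < threshold."""
    suppressed = sum(1 for s in samples if abs(s) < threshold_uv)
    return 100.0 * suppressed / len(samples)

# Toy epoch: half the samples suppressed, half burst activity -> BSR 50%
epoch_uv = [0.5, -1.0, 2.0, 40.0, -35.0, 50.0, 1.0, -60.0]
print(burst_suppression_ratio(epoch_uv))  # prints 50.0
```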
Abstract:
The Vantaa Primary Care Depression Study (PC-VDS) is a naturalistic and prospective cohort study of primary care patients with depressive disorders. It is a collaborative research project between the Department of Mental Health and Alcohol Research of the National Public Health Institute and the Primary Health Care Organization of the City of Vantaa. The aim is to obtain a comprehensive view of clinically significant depression in primary care, and to compare depressive patients in primary care with those in secondary-level psychiatric care in terms of clinical characteristics. Consecutive patients (N=1,111) in three primary care health centres were screened for depression with the PRIME-MD, and positive cases were interviewed by telephone. Cases with current depressive symptoms were diagnosed face-to-face with the Structured Clinical Interview for DSM-IV Axis I Disorders (SCID-I/P). A cohort of 137 patients with unipolar depressive disorders, comprising all patients with at least two depressive symptoms and clinically significant distress or disability, was recruited. The Structured Clinical Interview for DSM-IV Axis II Disorders (SCID-II), medical records, rating scales, interviews and a retrospective life-chart were used to obtain comprehensive cross-sectional and retrospective longitudinal information. For the investigation of suicidal behaviour, the Scale for Suicidal Ideation (SSI), patient records and the interview were used. The methodology was designed to be comparable to the Vantaa Depression Study (VDS) conducted in secondary-level psychiatric care. Patients with major depressive disorder (MDD) aged 20-59 from primary care in PC-VDS (N=79) were compared with new psychiatric outpatients (N=223) and inpatients (N=46) in VDS. The PC-VDS cohort was prospectively followed up at 3, 6 and 18 months. Altogether 123 patients (90%) completed the follow-up. The duration of the index episode and the timing of relapses or recurrences were examined using a life-chart.
The retrospective investigation revealed current MDD in most (66%), and lifetime MDD in nearly all (90%) cases of clinically significant depressive syndromes. Two thirds of the “subsyndromal” cases had a history of major depressive episode (MDE), although they were currently either in partial remission or a potential prodromal phase. Recurrences and chronicity were common. The picture of depression was complicated by Axis I co-morbidity in 59%, Axis II in 52% and chronic Axis III disorders in 47%; only 12% had no co-morbidity. Within their lifetimes, one third (37%) had seriously considered suicide, and one sixth (17%) had attempted it. Suicidal behaviour clustered in patients with moderate to severe MDD, co-morbidity with personality disorders, and a history of treatment in psychiatric care. The majority had received treatment for depression, but suicidal ideation had mostly remained unrecognised. The comparison of patients with MDD in primary care to those in psychiatric care revealed that the majority of suicidal or psychotic patients were receiving psychiatric treatment, and the patients with the most severe symptoms and functional limitations were hospitalized. In other clinical aspects, patients with MDD in primary care were surprisingly similar to psychiatric outpatients. Mental health contacts earlier in the current MDE were common among primary care patients. The 18-month prospective investigation with a life-chart methodology verified the chronic and recurrent nature of depression in primary care. Only one-quarter of patients with MDD achieved and maintained full remission during the follow-up, while another quarter failed to remit at all. The remaining patients suffered either from residual symptoms or recurrences. While severity of depression was the strongest predictor of recovery, presence of co-morbid substance use disorders, chronic medical illness and cluster C personality disorders all contributed to an adverse outcome. 
In clinical decision making, besides the severity of depression and co-morbidity, a history of previous MDD should not be ignored by primary care doctors: depression in primary care is usually severe enough to indicate at least follow-up and, for patients with residual symptoms, evaluation of their current treatment. Moreover, recognition of suicidal behaviour among depressed patients should be improved. In order to improve the outcome of depression in primary care, its often chronic and recurrent nature should be taken into account in organizing the care. According to the literature, chronic disease management programs, with an enhanced role for case managers and greater integration of primary and specialist care, have been successful. Optimal ways of allocating resources between treatment providers as well as within health centres should be found.
Abstract:
Acute renal failure (ARF) is a clinical syndrome characterized by a rapidly decreasing glomerular filtration rate, which results in disturbances of electrolyte and acid-base homeostasis, derangement of extracellular fluid volume, and retention of nitrogenous waste products, and is often associated with decreased urine output. ARF affects about 5-25% of patients admitted to intensive care units (ICUs) and is linked to high mortality and morbidity rates. In this thesis, the outcome of critically ill patients with ARF and factors related to outcome were evaluated. A total of 1,662 patients from two ICUs and one acute dialysis unit in Helsinki University Hospital were included. In study I, the prevalence of ARF was calculated and classified according to two ARF-specific scoring methods, the RIFLE classification and the classification created by Bellomo et al. (2001). Study II evaluated monocyte human histocompatibility leukocyte antigen-DR (HLA-DR) expression and plasma levels of one proinflammatory (interleukin (IL) 6) and two anti-inflammatory (IL-8 and IL-10) cytokines in predicting the survival of critically ill ARF patients. Study III investigated serum cystatin C as a marker of renal function in ARF and its power in predicting the survival of critically ill ARF patients. Study IV evaluated the effect of intermittent hemodiafiltration (HDF) on myoglobin elimination from plasma in severe rhabdomyolysis. Study V assessed long-term survival and health-related quality of life (HRQoL) in ARF patients. Neither of the ARF-specific scoring methods showed good discriminative power regarding hospital mortality. The maximum RIFLE score during the first three days in the ICU was an independent predictor of hospital mortality. As a marker of renal dysfunction, serum cystatin C failed to show benefit over plasma creatinine in detecting ARF or predicting patient survival.
Neither cystatin C, plasma concentrations of IL-6, IL-8, and IL-10, nor monocyte HLA-DR expression was clinically useful in predicting mortality in ARF patients. HDF may be used to clear myoglobin from plasma in rhabdomyolysis, especially if alkaline diuresis fails. The long-term survival of patients with ARF was found to be poor, and the HRQoL of survivors is lower than that of the age- and gender-matched general population.
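The RIFLE classification referenced above stages renal dysfunction by the relative rise in serum creatinine (and by urine output, omitted here). As an illustration only — the thresholds below follow the commonly published RIFLE consensus criteria, not figures taken from this thesis — the creatinine arm can be sketched as:

```python
def rifle_stage(baseline_creatinine: float, current_creatinine: float) -> str:
    """Stage the severity arm (Risk/Injury/Failure) of the RIFLE
    criteria using the serum-creatinine criterion only.

    Thresholds follow the commonly published RIFLE consensus
    definition: Risk = >=1.5x baseline, Injury = >=2x, Failure = >=3x.
    The urine-output criteria and the outcome classes Loss/ESKD are
    omitted for brevity.
    """
    ratio = current_creatinine / baseline_creatinine
    if ratio >= 3.0:
        return "Failure"
    if ratio >= 2.0:
        return "Injury"
    if ratio >= 1.5:
        return "Risk"
    return "No ARF"
```

Note that the thesis applied the maximum stage over the first three ICU days, so in practice the function would be evaluated on daily creatinine values and the worst result kept.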
Abstract:
Congenital nephrotic syndrome of the Finnish type (NPHS1, CNF) is an autosomal recessive disease enriched in the Finnish population. NPHS1 is caused by a mutation in the NPHS1 gene, which encodes nephrin, a major structural component of the slit diaphragm connecting podocyte foot processes in the glomerular capillary wall. In NPHS1, the genetic defect in nephrin leads to heavy proteinuria already in the newborn period. Finnish NPHS1 patients are nephrectomized in infancy, and after a short period of dialysis they receive a kidney transplant, the only curative therapy for the disease. In this thesis we examined the cellular and molecular mechanisms leading to the progression of glomerulosclerosis and tubulointerstitial fibrosis in NPHS1 kidneys. Progressive mesangial expansion in NPHS1 kidneys is caused by mesangial cell hyperplasia and the accumulation of extracellular matrix proteins. The matrix expansion consisted of collagen IV, a normal mesangial matrix component; no significant changes in mesangial cell phenotype or in the composition of the extracellular matrix were observed. Endotheliosis was the main ultrastructural lesion observed in the endothelium of NPHS1 glomeruli. The abundant expression of vascular endothelial growth factor and its transcription factor, hypoxia-inducible factor-1 alpha, was in accordance with the preserved structure of the endothelium in NPHS1 kidneys. Hypoperfusion of peritubular capillaries and tubulointerstitial hypoxia were evident in NPHS1 kidneys, indicating that these may play an important role in the rapid progression of fibrosis in the kidneys of NPHS1 patients. Upregulation of angiotensin II was obvious, emphasizing its role in the pathophysiology of NPHS1. Excessive oxidative stress was evident in NPHS1 kidneys, manifested as increased expression of p22phox, superoxide production and lipid peroxidation, and as reduced antioxidant activity.
In conclusion, our data indicate that mesangial cell proliferation and the accumulation of extracellular matrix are associated with the obliteration of glomerular capillaries, reducing the circulation in peritubular capillaries. The injury and rarefaction of peritubular capillaries impair oxygen and nutrient delivery to the tubuli and interstitial cells, which correlates with the fibrosis, tubular atrophy and oxidative stress observed in NPHS1 kidneys.
Abstract:
Atrial fibrillation (AF) is the most common tachyarrhythmia and is associated with substantial morbidity, increased mortality and cost. The treatment modalities of AF have increased, but the results are still far from optimal. More individualized therapy may be beneficial, and achieving it calls for improved diagnostics. The aim of this study was to find non-invasive parameters, obtained during sinus rhythm, that reflect the electrophysiological patterns related to the propensity to AF, particularly to AF occurring without any associated heart disease (lone AF). In all, 240 subjects were enrolled: 136 patients with paroxysmal lone AF and 104 controls (mean age 45 years, 75% males). Signal measurements were performed by non-invasive magnetocardiography (MCG) and by invasive electroanatomic mapping (EAM). High-pass filtering techniques and a new method based on a surface gradient technique were adapted to analyze the atrial MCG signal. The EAM was used to elucidate atrial activation in patients and as a reference for MCG. The results showed that MCG mapping is an accurate method for detecting atrial electrophysiologic properties. In lone paroxysmal AF, the duration of the atrial depolarization complex was marginally prolonged. The difference was more obvious in women and was also related to interatrial conduction patterns. In the focal type of AF (75%), the root mean square (RMS) amplitudes of the atrial signal were normal, but in AF without demonstrable triggers the late atrial RMS amplitudes were reduced. In addition, the atrial characteristics tended to remain similar even when examined several years after the first AF episodes. The intra-atrial recordings confirmed the occurrence of three distinct sites of electrical connection from the right to the left atrium (LA): the Bachmann bundle (BB), the margin of the fossa ovalis (FO), and the coronary sinus ostial area (CS). The propagation of the atrial signal could also be evaluated non-invasively.
Three MCG atrial wave types were identified, each representing a distinct interatrial activation pattern. In conclusion, in paroxysmal lone AF, active focal triggers are common, atrial depolarization is slightly prolonged but of normal amplitude, and the arrhythmia does not necessarily lead to electrical or mechanical dysfunction of the atria. In women the prolongation of atrial depolarization is more obvious, which may be related to gender differences in the presentation of AF. A significant minority of patients with lone AF lack frequent focal triggers, and in them the late atrial signal amplitude is reduced, possibly signifying a wider degenerative process in the LA. In lone AF, natural impulse propagation to the LA during sinus rhythm goes through one or more of the principal pathways described. The BB is the most common route, but in one-third of patients the earliest LA activation occurs outside the BB. Susceptibility to paroxysmal lone AF is associated with propagation of the atrial signal via the margin of the FO or via multiple pathways; when conduction occurs via the BB, it is associated with prolonged atrial activation. Thus, altered and alternative conduction pathways may contribute to the pathogenesis of lone AF. There is growing evidence of variability in the genesis of AF even within lone paroxysmal AF, and the present study suggests that this variation may be reflected in the cardiac signal pattern. Recognizing the distinct signal profiles may assist in understanding the pathogenesis of AF and in identifying subgroups for patient-tailored therapy.
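The high-pass filtering and RMS-amplitude analysis of the atrial MCG signal described above are generic signal-processing steps. A minimal sketch, substituting a crude moving-average high-pass for the thesis's unspecified filter settings (the window size and function names are illustrative assumptions):

```python
import math

def highpass(signal, window):
    """Crude high-pass filter: subtract a centered moving average of
    `window` samples from each sample. A stand-in for the thesis's
    filtering, whose exact settings are not given in the abstract."""
    n = len(signal)
    out = []
    for i in range(n):
        lo = max(0, i - window // 2)
        hi = min(n, i + window // 2 + 1)
        baseline = sum(signal[lo:hi]) / (hi - lo)
        out.append(signal[i] - baseline)
    return out

def rms_amplitude(signal):
    """Root-mean-square amplitude over the analysis window."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))
```

In the study's terms, the RMS amplitude would be computed over selected portions of the filtered atrial complex (e.g. its late part) and compared between patient groups.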
Abstract:
The purpose of this dissertation was to study the applicability of a minced autologous fascia graft for injection laryngoplasty of unilateral vocal fold paralysis (UVFP). The permanence of augmentation and host-versus-graft tissue reactions were of special interest. The topic belongs to phonosurgery, a subdivision of the Ear, Nose and Throat specialty of medicine. UVFP results from an injury to the recurrent laryngeal or the vagal nerve. The main symptom is a hoarse and weak voice. Surgery is warranted for patients in whom spontaneous reinnervation and a course of voice therapy fail to improve the voice. Injection laryngoplasty is a widespread surgical technique that aims to restore glottic closure by augmenting the atrophied vocal muscle and by turning the paralyzed vocal fold towards the midline. Currently, a great diversity of synthetic, xenologous, homologous, and autologous substances is available for injection. An autologous graft is ideal in terms of biocompatibility. Free fascia grafts have been used successfully in head and neck surgery for decades, but fascia had not previously been applied to the vocal fold. The fascia is harvested from the lateral thigh under local anesthesia and minced into a paste with scissors. Injection of the vocal fold is performed during laryngomicroscopy under general anesthesia. Three series of clinical trials of injection laryngoplasty with autologous fascia (ILAF) for patients with UVFP were conducted at the Department of Otorhinolaryngology of the Helsinki University Central Hospital. The follow-up ranged from a few months to ten years. The aim was to document the vocal results and any morbidity related to graft harvesting and vocal fold injection. To address the tissue reactions and the degree of resorption of the graft, an animal study with a follow-up ranging from 3 days to 12 months was performed at the National Laboratory Animal Center, University of Kuopio.
Harvesting of the graft and the injection were associated with only minor morbidity. Histological analysis of the vocal fold tissue showed that fascia was well tolerated. Although some resorption or compaction of the graft during the first months is evident, graft volume is well maintained. When injected deep and laterally into the vocalis muscle, the fascia graft allows normal vibration of the vocal fold mucosa during phonation. Improvement of voice quality was seen in all series by multiple objective parameters of voice evaluation. However, the vocal results were poor in cases where the nerve trauma was severe, such as UVFP after chest surgery. ILAF is most suitable for the correction of mild to moderate glottic gaps related to less severe nerve damage. Our results indicate that autologous fascia is a feasible and safe new injection material with good and stable vocal results. It offers a practical solution for surgeons who treat this complex condition.
Abstract:
Premature delivery is a major cause of neonatal morbidity and mortality, and its incidence has increased around the world. In Finland 5.3%, or about 3,000 children per year, are born prematurely, before 37 weeks of gestation. The corresponding figure in the United States is about 13%. Morbidity and mortality are highest among infants delivered before 32 weeks of gestation - about 600 children each year in Finland. Approximately 70% of premature deliveries are unexplained. Preterm delivery can be caused by an asymptomatic infection between the uterus and the fetal membranes, which can begin already in early pregnancy. Preterm delivery is difficult to predict, and many patients are therefore unnecessarily admitted to hospital for observation and exposed to medical treatments. On the other hand, high-risk women should be identified early to ensure the best treatment of the mother and the preterm infant. In a prospective study conducted at the Department of Obstetrics and Gynecology, Helsinki University Central Hospital, two biochemical inflammation-related markers were measured in the lower genital tract fluids of asymptomatic women in early and mid pregnancy in order to see whether these markers could identify women at increased risk of preterm delivery. The biomarkers were phosphorylated insulin-like growth factor binding protein-1 (phIGFBP-1) and matrix metalloproteinase-8 (MMP-8). The study involved 5180 asymptomatic pregnant women, examined during the first and second ultrasound screening visits; the samples were taken from the vagina and cervix. In addition, 246 symptomatic women were studied (pregnancy weeks 22-34). The study showed that an increased phIGFBP-1 concentration in cervical canal fluid in early pregnancy increased the risk of preterm delivery; the risk of very premature birth (before 32 weeks of gestation) was nearly four-fold.
A low MMP-8 concentration in mid pregnancy increased the risk of subsequent preterm premature rupture of the fetal membranes (PPROM), whereas high MMP-8 concentrations in the cervical fluid increased the risk of premature delivery initiated by preterm labour with intact membranes. Among women with preterm contractions, both a shortened cervical length measured by ultrasound and an elevated cervical fluid phIGFBP-1 predicted premature delivery. In summary, because of its relatively low sensitivity, cervical fluid phIGFBP-1 is not suitable for routine screening, but it provides an additional tool in assessing the risk of preterm delivery. Cervical fluid MMP-8 is not useful in early or mid pregnancy for predicting premature delivery because of its dual role, and further studies on its role are therefore needed. Our study confirms that phIGFBP-1 testing is useful in predicting preterm delivery.
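The "relatively low sensitivity" argument above rests on standard screening-test metrics derived from a 2x2 table of test result versus outcome. A minimal sketch of those metrics, with purely illustrative counts (not data from this study):

```python
def screening_performance(tp: int, fp: int, fn: int, tn: int):
    """Sensitivity, specificity and positive predictive value from a
    2x2 screening table (true/false positives, false/true negatives).
    Counts used below are illustrative, not figures from the thesis."""
    sensitivity = tp / (tp + fn)   # fraction of preterm deliveries flagged
    specificity = tn / (tn + fp)   # fraction of term deliveries not flagged
    ppv = tp / (tp + fp)           # fraction of positive tests that deliver preterm
    return sensitivity, specificity, ppv
```

For a rare outcome such as very preterm birth, even a specific test yields a low positive predictive value, which is why a biomarker can add to risk assessment while remaining unsuitable for routine screening.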
Abstract:
The aim of the study was to evaluate the long-term results of operative treatment for Hirschsprung's disease (HD) and internal anal sphincter achalasia. Fecal continence and quality of life were evaluated by questionnaire in 100 adult patients who had undergone surgery for HD during 1950-75. Fecal continence was scored numerically as described by Holschneider. Fifty-four of the 100 patients underwent clinical examination, rigid sigmoidoscopy and manometric evaluation. In anorectal manometry, the basal resting pressure (BRP) and maximal squeeze pressure (MSP) were measured, and the voluntary sphincter force (VSF) was calculated by subtracting the BRP from the MSP. The results of operative treatment for adult HD were compared with those of patients operated on in childhood; in adult HD the symptoms are so mild that the patients reach adolescence or even adulthood before treatment. Patients with both HD and cartilage-hair hypoplasia were evaluated separately. The outcome of patients with internal anal sphincter achalasia operated on by myectomy was evaluated by questionnaire, with continence again scored as described by Holschneider. Of the 100 patients operated on for HD, 38 had completely normal bowel habits. A normal or good continence score was found in 91 out of 100 patients; nine patients had fair continence. One of the patients with fair continence had Down's syndrome and two were mentally retarded for other reasons. Only one patient suffered from constipation. In anorectal manometry, the difference in BRP between patients with normal and good continence was statistically significant, whereas the difference between the good and fair continence groups was not. The differences in MSP and VSF between patient groups with different continence outcomes were not statistically significant. The differences between the patient groups and normal controls were statistically significant in BRP and MSP.
In VSF there was no statistically significant difference between the patients and the normal controls. The VSF reflects the working power of the muscles involved, including the external sphincter, levator ani and gluteal muscles. Patients operated on at adult age had continence as good as that of patients operated on in childhood. Patients with both HD and cartilage-hair hypoplasia had considerably higher morbidity and mortality than HD patients without cartilage-hair hypoplasia; their mortality was as high as 38%. In patients with internal anal sphincter achalasia, constipation was cured or alleviated by myectomy, although a significant number suffered from soiling-related social problems.
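The manometric quantities above admit a direct computation: voluntary sphincter force is the maximal squeeze pressure minus the basal resting pressure. A minimal sketch (variable names are illustrative; pressures are assumed to share one unit, e.g. mmHg):

```python
def voluntary_sphincter_force(basal_resting_pressure: float,
                              maximal_squeeze_pressure: float) -> float:
    """VSF = MSP - BRP, per the anorectal manometry protocol described
    above: the squeeze pressure includes the resting tone, so the
    resting pressure is subtracted to isolate the voluntary component."""
    return maximal_squeeze_pressure - basal_resting_pressure
```

This separation is why VSF, unlike BRP, reflects the voluntary musculature (external sphincter, levator ani, gluteal muscles) rather than the internal sphincter's resting tone.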