33 results for Reference Curves


Relevance:

20.00%

Publisher:

Abstract:

Continuous epidural analgesia (CEA) and continuous spinal postoperative analgesia (CSPA) provided by a mixture of local anaesthetic and opioid are widely used for postoperative pain relief. With the introduction of so-called microcatheters, CSPA found its way particularly into orthopaedic surgery. These techniques, however, may be associated with dose-dependent side-effects such as hypotension, weakness in the legs, and nausea and vomiting. At times they may fail to offer sufficient analgesia, e.g., because of a misplaced catheter. The correct position of an epidural catheter might be confirmed by the supposedly easy and reliable epidural stimulation test (EST). The aims of this thesis were to determine a) whether the efficacy, tolerability, and reliability of CEA might be improved by adding the α2-adrenergic agonists adrenaline and clonidine to CEA, and by the repeated use of EST during CEA; and b) the feasibility of CSPA given through a microcatheter after vascular surgery. Studies I–IV were double-blinded, randomized, controlled trials; Study V was of a diagnostic, prospective nature. Patients underwent arterial bypass surgery of the legs (I, n=50; IV, n=46), total knee arthroplasty (II, n=70; III, n=72), and abdominal surgery or thoracotomy (V, n=30). Postoperative lumbar CEA consisted of regular mixtures of ropivacaine and fentanyl, either without or with adrenaline (2 µg/ml (I) and 4 µg/ml (II)) or clonidine (2 µg/ml (III)). CSPA (IV) was given through a microcatheter (28G) and contained either ropivacaine (max. 2 mg/h) or a mixture of ropivacaine (max. 1 mg/h) and morphine (max. 8 µg/h). Epidural catheter tip position (V) was evaluated both by EST, at the moment of catheter placement and several times during CEA, and by epidurography as the reference diagnostic test. CEA and CSPA were administered for 24 or 48 h. Study parameters included pain scores assessed with a visual analogue scale, requirements of rescue pain medication, vital signs, and side-effects.
Adrenaline (I and II) had no beneficial influence on the efficacy or tolerability of CEA. The total amounts of epidurally infused drugs were even increased in the adrenaline group in Study II (p=0.02, RM ANOVA). Clonidine (III) augmented pain relief with lowered amounts of epidurally infused drugs (p=0.01, RM ANOVA) and a reduced need for rescue oxycodone given i.m. (p=0.027, MW-U; median difference 3 mg (95% CI 0–7 mg)). Clonidine did not contribute to sedation, and its influence on haemodynamics was minimal. CSPA (IV) provided satisfactory pain relief with only limited blockade of the legs (no inter-group differences). EST (V) was often associated with technical problems and difficulties of interpretation; e.g., it failed to identify the four patients whose catheters were outside the spinal canal already at the time of catheter placement. As adjuvants to lumbar CEA, clonidine only slightly improved pain relief, while adrenaline did not provide any benefit. The role of EST applied at the time of epidural catheter placement, or repeatedly during CEA, remains open. The microcatheter CSPA technique appeared effective and reliable, but needs to be compared with routine CEA after peripheral arterial bypass surgery.

Relevance:

20.00%

Publisher:

Abstract:

Acute renal failure (ARF) is a clinical syndrome characterized by a rapidly decreasing glomerular filtration rate, which results in disturbances in electrolyte and acid-base homeostasis, derangement of extracellular fluid volume, and retention of nitrogenous waste products, and is often associated with decreased urine output. ARF affects about 5-25% of patients admitted to intensive care units (ICUs) and is linked to high mortality and morbidity. In this thesis, the outcome of critically ill patients with ARF and factors related to outcome were evaluated. A total of 1662 patients from two ICUs and one acute dialysis unit at Helsinki University Hospital were included. In Study I the prevalence of ARF was calculated and classified according to two ARF-specific scoring methods: the RIFLE classification and the classification created by Bellomo et al. (2001). Study II evaluated monocyte human histocompatibility leukocyte antigen-DR (HLA-DR) expression and plasma levels of one proinflammatory (interleukin (IL)-6) and two anti-inflammatory (IL-8 and IL-10) cytokines in predicting survival of critically ill ARF patients. Study III investigated serum cystatin C as a marker of renal function in ARF and its power in predicting survival of critically ill ARF patients. Study IV evaluated the effect of intermittent hemodiafiltration (HDF) on myoglobin elimination from plasma in severe rhabdomyolysis. Study V assessed long-term survival and health-related quality of life (HRQoL) in ARF patients. Neither of the ARF-specific scoring methods showed good discriminative power regarding hospital mortality. The maximum RIFLE score during the first three days in the ICU was an independent predictor of hospital mortality. As a marker of renal dysfunction, serum cystatin C showed no benefit over plasma creatinine in detecting ARF or predicting patient survival.
Neither cystatin C, plasma concentrations of IL-6, IL-8, and IL-10, nor monocyte HLA-DR expression was clinically useful in predicting mortality in ARF patients. HDF may be used to clear myoglobin from plasma in rhabdomyolysis, especially if forced alkaline diuresis does not succeed. The long-term survival of patients with ARF was found to be poor, and the HRQoL of those who survive is lower than that of the age- and gender-matched general population.
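The RIFLE classification used in Study I grades renal dysfunction by the relative rise in serum creatinine (among other criteria). As a rough illustration only, here is a minimal sketch of the creatinine arm with the standard 1.5×, 2× and 3× baseline cut-offs; the function name is an assumption, and the GFR, urine-output, Loss and ESKD criteria of the full scheme are deliberately omitted.

```python
# Sketch of the creatinine arm of the RIFLE classification of acute
# renal failure. The ratio cut-offs (1.5x, 2x, 3x baseline) are the
# standard RIFLE creatinine criteria; the function name is an
# illustrative assumption, and the GFR, urine-output, Loss and ESKD
# criteria of the full scheme are intentionally omitted.

def rifle_creatinine_stage(baseline_crea: float, current_crea: float) -> str:
    """Return the RIFLE stage implied by the serum creatinine ratio."""
    ratio = current_crea / baseline_crea
    if ratio >= 3.0:
        return "Failure"
    if ratio >= 2.0:
        return "Injury"
    if ratio >= 1.5:
        return "Risk"
    return "No ARF"

print(rifle_creatinine_stage(80.0, 250.0))  # creatinine more than tripled -> Failure
```

In the thesis, the maximum such score over the first three ICU days was the independent mortality predictor, so the stage would be recomputed daily and the worst value retained.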

Relevance:

20.00%

Publisher:

Abstract:

Anterior cruciate ligament (ACL) tear is a common sports injury of the knee. Arthroscopic reconstruction using autogenous graft material is widely used for patients with ACL instability. The grafts most commonly used are the patellar and hamstring tendons, with various fixation techniques. Although clinical evaluation and conventional radiography are routinely used in follow-up after ACL surgery, magnetic resonance imaging (MRI) plays an important role in the diagnosis of postoperative complications. The aim of this thesis was to study the clinical outcome of patellar and hamstring tendon ACL reconstruction techniques. In addition, the postoperative appearance of the ACL graft was evaluated using several MRI sequences. Of the 175 patients who underwent arthroscopically assisted ACL reconstruction, 99 were randomized into patellar tendon (n=51) or hamstring tendon (n=48) groups. In addition, 62 patients with hamstring graft ACL reconstruction were randomized into either cross-pin (n=31) or interference screw (n=31) fixation groups. Follow-up evaluation determined knee laxity, isokinetic muscle performance, and several knee scores. Lateral and anteroposterior radiographs were obtained, and several MRI sequences were acquired with a 1.5-T imager. The appearance and enhancement pattern of the graft and periligamentous tissue, and the location of the bone tunnels, were evaluated. After MRI, arthroscopy was performed on 14 symptomatic knees. The results revealed no significant differences in the 2-year outcome between the groups. In the hamstring tendon group, the average femoral and tibial bone tunnel diameters increased during the 2-year follow-up by 33% and 23%, respectively. In the asymptomatic knees, the graft showed homogeneous, low signal intensity with periligamentous streaks of intermediate signal intensity on T2-weighted MR images. In the symptomatic knees, arthroscopy revealed 12 abnormal grafts and two meniscal tears, each with an intact graft.
In 3 of the lax grafts seen on arthroscopy, MRI showed an intact graft and improper bone tunnel placement. For diagnosing graft failure, all MRI findings combined gave a specificity of 90% and a sensitivity of 81%. In conclusion, all techniques appeared to improve patients' performance and were therefore considered good choices for ACL reconstruction. In follow-up, MRI permits direct evaluation of the ACL graft, the bone tunnels, and additional disorders of the knee. Bone tunnel enlargement and contrast enhancement of periligamentous tissue were non-specific MRI findings that did not signify ACL deficiency. With an intact graft and optimal femoral bone tunnel placement, graft deficiency is unlikely, and the MRI examination should be carefully scrutinized for other possible causes of the patient's symptoms.
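The reported accuracy figures follow from the standard confusion-matrix definitions. This minimal sketch uses hypothetical counts, chosen only so that the ratios land near the reported 81% sensitivity and 90% specificity; the abstract does not give the underlying per-patient counts.

```python
# Illustration of how sensitivity and specificity are computed from a
# confusion matrix. The counts below are hypothetical, picked only so
# that the ratios come out near the reported 81% sensitivity and 90%
# specificity of combined MRI findings for diagnosing graft failure.

def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: share of failed grafts that MRI called failed."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: share of intact grafts that MRI called intact."""
    return tn / (tn + fp)

tp, fn = 17, 4  # hypothetical: failed grafts detected / missed by MRI
tn, fp = 9, 1   # hypothetical: intact grafts confirmed / falsely flagged

print(f"{sensitivity(tp, fn):.0%}")  # 81%
print(f"{specificity(tn, fp):.0%}")  # 90%
```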

Relevance:

20.00%

Publisher:

Abstract:

IgA nephropathy (IgAN) is the most common primary glomerulonephritis. In one third of patients the disease progresses, and they eventually need renal replacement therapy. IgAN is in most cases a slowly progressing disease; prediction of progression has been difficult, and the results of studies have been conflicting. Henoch-Schönlein nephritis (HSN) is rare in adults, and predicting its outcome is even more difficult than in IgAN. This study was conducted to evaluate the clinical and histopathological features and predictors of outcome of IgAN and HSN diagnosed in one centre (313 IgAN patients and 38 HSN patients), especially in patients with normal renal function at the time of renal biopsy. The study also aimed to evaluate whether the progression rates differ among four countries (259 patients from Finland, 112 from the UK, 121 from Australia and 274 from Canada) and, if so, whether this can be explained by differences in renal biopsy policy. The third aim was to measure the urinary excretion of the cytokines interleukin-1β (IL-1β) and interleukin-1 receptor antagonist (IL-1ra) in patients with IgAN and HSN, and the correlations of their excretion with histopathological damage and clinical factors. A large proportion of the patients diagnosed in Helsinki as having IgAN had normal renal function (161/313 patients). Four factors that independently predicted progression in mild disease were identified by logistic regression analysis: hypertension, higher amounts of urinary erythrocytes, severe arteriolosclerosis, and a higher glomerular score. There was geographic variability in renal survival of patients with IgAN; when age, level of renal function, proteinuria and blood pressure were taken into account, the variability was related mostly to lead-time bias and renal biopsy indications. Proteinuria of more than 0.4 g/24 h was the only factor significantly related to the progression of HSN.
Hypertension and the level of renal function were found to predict outcome in patients with normal renal function at the time of diagnosis. In IgAN patients, urinary IL-1ra excretion was found to be decreased compared with HSN patients and healthy controls. Patients with a high IL-1ra/IL-1β ratio had milder histopathological changes on renal biopsy than patients with a low/normal IL-1ra/IL-1β ratio. The excretion of IL-1β, and especially of IL-1ra, was also significantly higher in women. In conclusion, it was shown that factors associated with outcome can reliably be identified even in mild cases of IgAN. Predicting outcome in adult HSN, however, remains difficult.
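The logistic regression analysis mentioned above yields adjusted odds ratios for each predictor. As a simpler point of reference, here is a minimal sketch of the crude (unadjusted) odds ratio from a 2x2 table of one predictor versus progression; all counts are invented for illustration and do not come from the study, and, unlike multivariable logistic regression, this estimate does not adjust for the other predictors.

```python
# Crude (unadjusted) odds ratio from a 2x2 table of a single predictor
# (here: hypertension) versus disease progression. All counts are
# invented for illustration; the thesis used multivariable logistic
# regression, which adjusts each predictor for the others.

def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """a: exposed & progressed, b: exposed & stable,
    c: unexposed & progressed, d: unexposed & stable."""
    return (a * d) / (b * c)

# hypothetical counts: 20/50 hypertensive vs 10/70 normotensive progressed
print(odds_ratio(20, 30, 10, 60))  # (20*60)/(30*10) = 4.0
```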

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this study was to produce information on, and practical recommendations for, informed decision-making and capacity building for sustainable forest management (SFM) and good forest governance. This was done within the overall global framework for sustainable development, with special emphasis on the EU and African frameworks and on Southern Sudan and Ethiopia in particular. The case studies on Southern Sudan and Ethiopia focused on local, national and regional issues. Moreover, this study attempted to provide new insight, both theoretical and practical. The aim was to build an overall theoretical framework and to study its key contents and main implications for SFM and good forest governance at all administrative levels, in order to provide new tools for capacity building in natural resources management. The theoretical framework and research approach were based on the original research problem and the general and specific aims of the study. The key elements of the framework encompass sustainable development, global and EU governance, SFM, good forest governance, and international and EU law. The selected research approach comprised matrix-based assessment of international, regional (EU and Africa) and national (Southern Sudan and Ethiopia) policy and legal documents. The specific case study on Southern Sudan also involved interviews and group discussions with local community members and government officials. As a whole, this study attempted to link the global, regional, national and local levels in forest-sector development, and especially to analyse how international policy development in environmental and forestry issues is reflected in field-level progress towards SFM and good forest governance in the specific cases of Southern Sudan and Ethiopia.
The results on Southern Sudan focused on the existing situation and the perceived needs in capacity building for SFM and good forest governance at all administrative levels. Specifically, the case study on Southern Sudan presented the current situation in selected villages in the northern parts of Renk County in Upper Nile State, and the implications of Multilateral Environmental Agreements (MEAs) and of the new forest policy framework for capacity building actions. The results on Ethiopia focused on training, extension, research, education and new curriculum development within higher education institutions, particularly at the Wondo Genet College of Forestry and Natural Resources (WGCF-NR), which administratively lies under Hawassa University. The results suggest that, for both case studies, informed decision-making and capacity building for SFM and good forest governance require comprehensive, long-term, cross-sectoral, coherent and consistent approaches within the dynamic and evolving overall global framework, including its multiple interlinked levels. The specific priority and focus areas comprised the establishment of SFM and good forest governance in accordance with overall sustainable development priorities, with greater focus on international trade in forest products derived from sustainable and legal sources, and with emphasis on effective forest law enforcement and governance at all levels. In Upper Nile State in Southern Sudan there were positive signals, such as the willingness of local people to plant more multipurpose trees on farmlands and rangelands, and the recognition of the importance of forests and trees for sustainable rural development, in which food security is a key element. In addition, it was evident that the local communities studied in Southern Sudan also wanted to establish good governance systems through partnerships with all actors and through increased local responsibilities.
The results also suggest that the implementation of MEAs at the local level in Southern Sudan requires mutually supportive and coherent approaches within the agreements, as well as significantly more resources and financial and technical assistance for capacity building, training and extension. Finally, the findings confirm the importance of fully utilizing the existing local governance and management systems, with their traditional and customary knowledge and practices, and of new development partnerships with the full participation of all stakeholders. The planned new forest law for Southern Sudan, based on an already existing new forest policy, is expected to recognize the roles of local-level actors and would thus facilitate the achievement of sustainable forest management.

Relevance:

20.00%

Publisher:

Abstract:

Human parvovirus B19 (B19V) is known to cause anemia, hydrops fetalis, and fetal death, especially during the first half of pregnancy. Women in occupational contact with young children are at increased risk of B19V infection. The role of the recently discovered human parvovirus, human bocavirus (HBoV), in reproduction is unknown. The aim of this research project was to establish a scientific basis for assessing the work safety of pregnant women and for issuing special maternity leave regulations during B19V epidemics in Finland. The impact of HBoV infection on the pregnant woman and her fetus was also defined. B19V DNA was found in 0.8% of the miscarriages and in 2.4% of the intrauterine fetal deaths (IUFD; fetal death after 22 completed gestational weeks). All control fetuses (from induced abortions) were B19V-DNA negative. The findings on hydropic, B19V DNA-positive IUFDs with evidence of acute or recent maternal B19V infection are in line with those of previous Swedish studies. However, the high prevalence of B19V-related nonhydropic IUFDs noted in the Swedish studies, mostly without evidence of maternal B19V infection, was not found during the third trimester. HBoV was not associated with miscarriages or IUFDs. Almost all of the studied pregnant women were HBoV-IgG positive, and thus most probably immune to HBoV. All preterm births, perinatal deaths, cases of smallness for gestational age (SGA) and congenital anomalies were recorded among the infants of child-care employees in a nationwide register-based cohort study covering a period of 14 years. Little or no difference was found between the infants of the child-care employees and those of the comparison group. The annual B19V seroconversion rate was over two-fold among the child-care employees compared with the women in the comparison group. The seropositivity of the child-care employees increased with age and with years since qualification/joining the trade union.
In general, child-care employees are not at increased risk of adverse pregnancy outcomes. However, at the population level, the risk of rare events, such as adverse pregnancy outcomes attributed to infections, could not be determined. According to previous studies, seronegative women have a 5–10% excess risk of losing the fetus during the first half of pregnancy, but thereafter the risk is very low. Therefore, an over two-fold increased risk of B19V infection among child-care employees is considerable and should be taken into account in assessing the occupational safety of pregnant women, especially during the first half of pregnancy.