Abstract:
The purpose of this series of studies was to evaluate the biocompatibility of poly(ortho ester) (POE), a copolymer of ε-caprolactone and D,L-lactide [P(ε-CL/DL-LA)], and composites of P(ε-CL/DL-LA) and tricalcium phosphate (TCP) as bone-filling materials in bone defects. Tissue reactions and resorption times were examined in experimental animals for two solid POE implants (POE 140 and POE 46) sterilized by different methods (gamma and ethylene oxide sterilization), for P(ε-CL/DL-LA) (40/60 w/w) in paste form, and for a 50/50 w/w composite of 40/60 w/w P(ε-CL/DL-LA) and TCP and a 27/73 w/w composite of 60/40 w/w P(ε-CL/DL-LA) and TCP. The follow-up times ranged from one week to 52 weeks. The bone samples were evaluated histologically, and the soft tissue samples histologically, immunohistochemically, and electron microscopically. The results showed that the resorption time in bone was eight weeks for gamma-sterilized POE 140 and 13 weeks for ethylene oxide-sterilized POE 140. The resorption time of POE 46 was more than 24 weeks. For both POEs, the gamma-sterilized rods started to erode from the surface faster than the ethylene oxide-sterilized rods. Inflammation in bone ranged from slight to moderate with POE 140 and was moderate with POE 46. No highly fluorescent layer of tenascin or fibronectin was found in the soft tissue. With the copolymer, bone healing at the implantation sites was slower than at control sites in small bone defects. The resorption time of the copolymer was over one year, and inflammation in bone was mostly moderate. With the composites, bone healing at the implantation sites was also slower than at the control sites in small and large mandibular bone defects, and in large mandibular bone defects bone formation had ceased at both sites by the end of follow-up. The ultrastructure of the connective tissue was normal throughout the period of observation. It can be concluded that the method of sterilization influenced the resorption time of both POEs. Gamma-sterilized POE 140 could have been a suitable material for filling small bone defects, whereas the degradation of solid ethylene oxide-sterilized POE 140 and of POE 46 was too slow for them to be considered as bone-filling materials. Solid material is also difficult to contour, which is a disadvantage. The composites were excellent to handle, but the degradation of the copolymer and the composites was too slow. Therefore, neither the copolymer nor the composites can be recommended as bone-filling material.
Abstract:
Spirometry is the most widely used lung function test in the world. It is fundamental in the diagnostic and functional evaluation of various pulmonary diseases. The studies described in this thesis examined the spirometric assessment of reversibility of bronchial obstruction, its determinants, and its variation in a general population sample from Helsinki, Finland. This study is part of the FinEsS study, a collaborative study of the clinical epidemiology of respiratory health between Finland (Fin), Estonia (Es), and Sweden (S). Asthma and chronic obstructive pulmonary disease (COPD) constitute the two major obstructive airways diseases. The prevalence of asthma has increased, with around 6% of the population in Helsinki reporting physician-diagnosed asthma. The main cause of COPD is smoking, and changes in smoking habits in the population affect its prevalence with a delay. Whereas airway obstruction in asthma is by definition reversible, COPD is characterized by fixed obstruction. Cough and sputum production, the first symptoms of COPD, are often misinterpreted as smoker's cough and not recognized as the first signs of a chronic illness. COPD is therefore widely underdiagnosed. More extensive use of spirometry in primary care is advocated to focus smoking cessation interventions on populations at risk. The use of forced expiratory volume in six seconds (FEV6) instead of forced vital capacity (FVC) has been suggested to enable office spirometry to be used in the earlier detection of airflow limitation. Although spirometry is a widely accepted standard method of assessing lung function, its methodology and interpretation are constantly developing. In 2005, the ATS/ERS Task Force issued a joint statement that endorsed the 12% and 200 ml thresholds for a significant change in forced expiratory volume in one second (FEV1) or FVC during bronchodilation testing, but noted that in cases where only FVC improves it should be verified that the improvement is not caused by a longer exhalation time in the post-bronchodilator spirometry. This elicited new interest in the assessment of forced expiratory time (FET), a spirometric variable not usually reported or used in assessment. In this population sample, FET averaged 10.7 (SD 4.3) s and increased with ageing and with airflow limitation in spirometry. The intrasession repeatability of FET was the poorest of the spirometric variables assessed. Based on the intrasession repeatability, a limit of 3 s was suggested as a significant change in FET during bronchodilation testing. FEV6 was found to perform as well as FVC both in the whole population and in a subgroup of subjects with airways obstruction. In the bronchodilation test, decreases were frequently observed in FEV1 and particularly in FVC. The limit of significant increase based on the 95th percentile of the population sample was 9% for FEV1 and 6% for FEV6 and FVC; these are slightly lower than the current limits for single bronchodilation tests (ATS/ERS guidelines). FEV6 proved to be a valid alternative to FVC also in the bronchodilation test and would remove the need to control the duration of exhalation during the spirometric bronchodilation test.
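As a rough illustration of how such thresholds are applied, the sketch below (Python; the function names and example volumes are illustrative assumptions, not data from the thesis) checks a bronchodilation test result against the ATS/ERS 2005 criterion of at least a 12% and 200 ml increase, and against the population-derived 9% relative limit for FEV1 mentioned above.

```python
def significant_response_ats_ers(pre_ml: float, post_ml: float) -> bool:
    """ATS/ERS 2005 criterion: FEV1 (or FVC) must increase by >= 12% of the
    pre-bronchodilator value AND by >= 200 ml."""
    change_ml = post_ml - pre_ml
    return change_ml >= 200 and change_ml / pre_ml >= 0.12


def significant_response_population(pre_ml: float, post_ml: float, limit: float = 0.09) -> bool:
    """Population-derived limit from this study: a relative increase in FEV1
    exceeding the 95th percentile of the sample (9%)."""
    return (post_ml - pre_ml) / pre_ml >= limit


# Hypothetical example: pre-bronchodilator FEV1 2500 ml, post-bronchodilator 2850 ml
print(significant_response_ats_ers(2500, 2850))      # True: +350 ml, +14%
print(significant_response_population(2500, 2850))   # True: +14% exceeds 9%
```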
Abstract:
The continuous production of blood cells, a process termed hematopoiesis, is sustained throughout the lifetime of an individual by a relatively small population of cells known as hematopoietic stem cells (HSCs). HSCs are unique cells characterized by their ability to self-renew and give rise to all types of mature blood cells. Given their high proliferative potential, HSCs need to be tightly regulated at the cellular and molecular levels or they could otherwise turn malignant. On the other hand, the tight regulatory control of HSC function also translates into difficulties in culturing and expanding HSCs in vitro. In fact, it is currently not possible to maintain or expand HSCs ex vivo without rapid loss of self-renewal. Increased knowledge of the unique features of important HSC niches and of key transcriptional regulatory programs that govern HSC behavior is thus needed. Additional insight into the mechanisms of stem cell formation could enable us to recapitulate the processes of HSC formation and self-renewal/expansion ex vivo, with the ultimate goal of creating an unlimited supply of HSCs from, e.g., human embryonic stem cells (hESCs) or induced pluripotent stem (iPS) cells to be used in therapy. We thus asked: How are hematopoietic stem cells formed and in what cellular niches does this happen (Papers I, II)? What are the molecular mechanisms that govern hematopoietic stem cell development and differentiation (Papers III, IV)? Importantly, we could show that the placenta is a major fetal hematopoietic niche that harbors a large number of HSCs during midgestation (Paper I) (Gekas et al., 2005). To address whether the HSCs found in the placenta were formed there, we utilized the Runx1-LacZ knock-in and Ncx1 knockout mouse models (Paper II). Importantly, we could show that HSCs emerge de novo in the placental vasculature in the absence of circulation (Rhodes et al., 2008). Furthermore, we could identify defined microenvironmental niches within the placenta with distinct roles in hematopoiesis: the large vessels of the chorioallantoic mesenchyme serve as sites of HSC generation, whereas the placental labyrinth is a niche supporting HSC expansion (Rhodes et al., 2008). Overall, these studies illustrate the importance of distinct milieus in the emergence and subsequent maturation of HSCs. To ensure proper function of HSCs, several regulatory mechanisms are in place. The microenvironment in which HSCs reside provides soluble factors and cell-cell interactions. In the cell nucleus, these cell-extrinsic cues are interpreted in the context of cell-intrinsic developmental programs, which are governed by transcription factors. An essential transcription factor for the initiation of hematopoiesis is Scl/Tal1 (stem cell leukemia gene/T-cell acute leukemia gene 1). Loss of Scl results in early embryonic death and a total lack of all blood cells, yet deactivation of Scl in the adult does not affect HSC function (Mikkola et al., 2003b). To define the temporal window of Scl requirement during fetal hematopoietic development, we deactivated Scl in all hematopoietic lineages shortly after hematopoietic specification in the embryo. Interestingly, maturation, expansion, and function of fetal HSCs were unaffected, and, as in the adult, red blood cell and platelet differentiation was impaired (Paper III) (Schlaeger et al., 2005).
These findings highlight that, once specified, the hematopoietic fate is stable even in the absence of Scl and is maintained through mechanisms distinct from those required for the initial fate choice. As the critical downstream targets of Scl remain unknown, we sought to identify and characterize target genes of Scl (Paper IV). We identified the transcription factor Mef2C (myocyte enhancer factor 2C) as a novel direct target gene of Scl, specifically in the megakaryocyte lineage, which largely explains the megakaryocyte defect observed in Scl-deficient mice. In addition, we observed an Scl-independent requirement for Mef2C in the B-cell compartment, as loss of Mef2C leads to accelerated B-cell aging (Gekas et al., submitted). Taken together, these studies identify key extracellular microenvironments and intracellular transcriptional regulators that dictate different stages of HSC development, from emergence to lineage choice to aging.
Abstract:
Infection is a major cause of mortality and morbidity after thoracic organ transplantation. The aim of the present study was to evaluate the infectious complications after lung and heart transplantation, with a special emphasis on the usefulness of bronchoscopy and the demonstration of cytomegalovirus (CMV), human herpes virus (HHV)-6, and HHV-7. We reviewed all the consecutive bronchoscopies performed on heart transplant recipients (HTRs) from May 1988 to December 2001 (n = 44) and lung transplant recipients (LTRs) from February 1994 to November 2002 (n = 472). To compare different assays in the detection of CMV, a total of 21 thoracic organ transplant recipients were prospectively monitored by CMV pp65-antigenemia, DNAemia (PCR), and mRNAemia (NASBA) tests. The antigenemia test was the reference assay for therapeutic intervention. In addition to CMV antigenemia, 22 LTRs were monitored for HHV-6 and HHV-7 antigenemia. The diagnostic yield of the clinically indicated bronchoscopies was 41% in the HTRs and 61% in the LTRs. The utility of bronchoscopy was highest from one to six months after transplantation. In contrast, the findings from the surveillance bronchoscopies performed on LTRs led to a change in treatment in only 6% of the cases. Pneumocystis carinii and CMV were the most commonly detected pathogens. Furthermore, 15 (65%) of the P. carinii infections in the LTRs were detected during chemoprophylaxis. None of the complications of the bronchoscopies were fatal. Antigenemia, DNAemia, and mRNAemia were present in 98%, 72%, and 43% of the CMV infections, respectively. The optimal DNAemia cut-off levels (sensitivity/specificity) were 400 (75.9%/92.7%), 850 (91.3%/91.3%), and 1250 (100%/91.5%) copies/ml for antigenemia levels of 2, 5, and 10 pp65-positive leukocytes/50,000 leukocytes, respectively. The sensitivities of NASBA at the same cut-off levels were 25.9%, 43.5%, and 56.3%. CMV DNAemia was detected in 93% and mRNAemia in 61% of the CMV antigenemias requiring antiviral therapy. HHV-6, HHV-7, and CMV antigenemia was detected in 20 (91%), 11 (50%), and 12 (55%) of the 22 LTRs, respectively (at a median of 16, 31, and 165 days). HHV-6 appeared in 15 (79%), HHV-7 in seven (37%), and CMV in one (7%) of these patients during ganciclovir or valganciclovir prophylaxis. One case of pneumonitis and another of encephalitis were associated with HHV-6. In conclusion, bronchoscopy is a safe and useful diagnostic tool in LTRs and HTRs with a suspected respiratory infection, but the role of surveillance bronchoscopy in LTRs remains controversial. The PCR assay performs comparably to the antigenemia test in guiding pre-emptive therapy against CMV when threshold levels of over 5 pp65-antigen-positive leukocytes are used. In contrast, the low sensitivity of NASBA limits its usefulness. HHV-6 and HHV-7 activation is common after lung transplantation despite ganciclovir or valganciclovir prophylaxis, but clinical manifestations are infrequently linked to them.
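As a minimal sketch of how the performance of a DNAemia cut-off can be summarized against the antigenemia reference assay, the Python snippet below computes sensitivity and specificity from a 2x2 table. The counts are invented for illustration; the study's raw data are not given in this abstract.

```python
def sensitivity_specificity(tp: int, fp: int, tn: int, fn: int) -> tuple[float, float]:
    """Sensitivity and specificity of a candidate DNAemia cut-off, counted
    against the pp65-antigenemia reference result for each sample."""
    sensitivity = tp / (tp + fn)   # above the cut-off / all reference-positive samples
    specificity = tn / (tn + fp)   # below the cut-off / all reference-negative samples
    return sensitivity, specificity


# Hypothetical counts for a cut-off of 850 copies/ml vs. >= 5 pp65-positive
# leukocytes/50,000 leukocytes (not the study's data)
sens, spec = sensitivity_specificity(tp=21, fp=8, tn=85, fn=2)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```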
Abstract:
The purpose of this study was to estimate the prevalence and distribution of reduced visual acuity, major chronic eye diseases, and the subsequent need for eye care services in the Finnish adult population comprising persons aged 30 years and older. In addition, we analyzed the effect of decreased vision on functioning and the need for assistance using the World Health Organization’s (WHO) International Classification of Functioning, Disability, and Health (ICF) as a framework. The study was based on the Health 2000 health examination survey, a nationally representative population-based comprehensive survey of health and functional capacity carried out in 2000–2001 in Finland. The study sample representing the Finnish population aged 30 years and older was drawn by two-stage stratified cluster sampling. The Health 2000 survey included a home interview and a comprehensive health examination conducted at a nearby screening center. If the invited participants did not attend, an abridged examination was conducted at home or in an institution. Based on our findings in participants, the great majority (96%) of Finnish adults had at least moderate visual acuity (VA ≥ 0.5) with current refraction correction, if any. However, in the age group 75–84 years the prevalence decreased to 81%, and after 85 years to 46%. In the population aged 30 years and older, the prevalence of habitual visual impairment (VA ≤ 0.25) was 1.6%, and 0.5% were blind (VA < 0.1). The prevalence of visual impairment increased significantly with age (p < 0.001), and after the age of 65 years the increase was sharp. Visual impairment was equally common in both sexes (OR 1.20, 95% CI 0.82–1.74). Based on self-reported and/or register-based data, the estimated total prevalences of cataract, glaucoma, age-related maculopathy (ARM), and diabetic retinopathy (DR) in the study population were 10%, 5%, 4%, and 1%, respectively. The prevalence of all of these chronic eye diseases increased with age (p < 0.001). Cataract and glaucoma were more common in women than in men (OR 1.55, 95% CI 1.26–1.91 and OR 1.57, 95% CI 1.24–1.98, respectively). The most prevalent eye diseases in people with visual impairment (VA ≤ 0.25) were ARM (37%), unoperated cataract (27%), glaucoma (22%), and DR (7%). More than half (58%) of visually impaired people had had a vision examination during the past five years, and 79% had received some vision rehabilitation services, mainly in the form of spectacles (70%). Only one-third (31%) had received formal low vision rehabilitation (i.e., fitting of low vision aids, receiving patient education, training for orientation and mobility, training for activities of daily living (ADL), or consultation with a social worker). People with low vision (VA 0.1–0.25) were less likely to have received formal low vision rehabilitation, magnifying glasses, or other low vision aids than blind people (VA < 0.1). Furthermore, low cognitive capacity and living in an institution were associated with limited use of vision rehabilitation services. Of the visually impaired living in the community, 71% reported a need for assistance and 24% had an unmet need for assistance in everyday activities. The prevalence of limitations in ADL, instrumental activities of daily living (IADL), and mobility increased with decreasing VA (p < 0.001).
Visually impaired persons (VA ≤ 0.25) were four times more likely to have ADL disabilities than those with good VA (VA ≥ 0.8) after adjustment for sociodemographic and behavioral factors and chronic conditions (OR 4.36, 95% CI 2.44–7.78). Limitations in IADL and measured mobility were five times as likely (OR 4.82, 95% CI 2.38–9.76 and OR 5.37, 95% CI 2.44–7.78, respectively), and self-reported mobility limitations were three times as likely (OR 3.07, 95% CI 1.67–9.63), as in persons with good VA. The high prevalence of age-related eye diseases and subsequent visual impairment in the fastest-growing segment of the population will result in a substantial increase in the demand for eye care services in the future. Many of the visually impaired, especially older persons with decreased cognitive capacity or living in an institution, have not had a recent vision examination and lack adequate low vision rehabilitation. This highlights the need for regular evaluation of visual function in the elderly and active dissemination of information about rehabilitation services. Decreased VA is strongly associated with functional limitations, and even a slight decrease in VA was found to be associated with limited functioning. Thus, continuous efforts are needed to identify and treat eye diseases to maintain patients’ quality of life and to alleviate the social and economic burden of serious eye diseases.
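For readers unfamiliar with how such associations are expressed, the sketch below computes a crude odds ratio with a Woolf (log-based) 95% confidence interval from a 2x2 table in Python. The counts are invented for illustration; the ORs reported above came from models adjusted for sociodemographic and behavioral factors and chronic conditions, which a crude table cannot reproduce.

```python
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96) -> tuple[float, float, float]:
    """Crude odds ratio with Woolf 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * se_log_or)
    upper = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, lower, upper


# Hypothetical counts: ADL disability among visually impaired vs. well-sighted participants
or_, lo, hi = odds_ratio_ci(a=40, b=60, c=180, d=1200)
print(f"OR {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```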
Abstract:
Background: Alcohol consumption and smoking are the main causes of upper digestive tract cancers. These risk factors account for over 75% of all cases in developed countries. Epidemiological studies have shown that alcohol and tobacco interact multiplicatively on cancer risk, but the pathogenetic mechanism behind this is poorly understood. Strong experimental and human genetic linkage data suggest that acetaldehyde is one of the major factors behind the carcinogenic effect. In the digestive tract, acetaldehyde is mainly formed by microbial metabolism of ethanol. Acetaldehyde is also a major constituent of tobacco smoke. Thus, acetaldehyde from both of these sources may have an interacting carcinogenic effect in the human upper digestive tract. Aims: The first aim of this thesis was to investigate acetaldehyde production and exposure in the human mouth resulting from alcohol ingestion and tobacco smoking in vivo. Secondly, specific L-cysteine products were prepared to examine their efficacy in binding salivary acetaldehyde in order to reduce the exposure of the upper digestive tract to acetaldehyde. Methods: Acetaldehyde levels in saliva were measured in human volunteers during alcohol metabolism, during tobacco smoking, and during the combined use of alcohol and tobacco. The ability of L-cysteine to eliminate acetaldehyde during alcohol metabolism and tobacco smoking was also investigated with specifically developed tablets. The acetaldehyde production of Escherichia coli, an important member of the human microbiota, was also measured under the different conditions prevailing in the digestive tract. Results and conclusions: These studies established that smokers have significantly increased acetaldehyde exposure during ethanol consumption even when not actively smoking. Acetaldehyde exposure was dramatically further increased during active tobacco smoking. Thus, the elevated aerodigestive tract cancer risk observed in smokers and drinkers may be the result of the increased acetaldehyde exposure. Acetaldehyde produced in the oral cavity during ethanol challenge was significantly decreased by a buccal L-cysteine-releasing tablet. Smoking-derived acetaldehyde could also be totally removed by using a tablet containing L-cysteine. In conclusion, this thesis confirms the essential role of acetaldehyde in the pathogenesis of alcohol- and smoking-induced cancers. This thesis presents a novel experimental approach to decrease the local acetaldehyde exposure of the upper digestive tract with L-cysteine, with the eventual goal of reducing the prevalence of upper digestive tract cancers.
Abstract:
The neurotransmitter serotonin (5-HT) modulates many functions important for life, e.g., appetite and body temperature, and controls the development of the nervous system. Disturbed 5-HT function has been implicated in mood, anxiety, and eating disorders. The serotonin transporter (SERT) controls the amount of effective 5-HT by removing it from the extracellular space. The radionuclide imaging methods single photon emission tomography (SPET) and positron emission tomography (PET) enable studies of SERT in the brain. This thesis concentrated on both methodological and clinical aspects of brain SERT imaging using SPET. The first study compared the repeatability of automated and manual methods for the definition of volumes of interest (VOIs) in SERT images. The second study investigated within-subject seasonal variation of SERT binding in healthy young adults in two brain regions, the midbrain and thalamus. The third study investigated the association of midbrain and thalamic SERT binding with bulimia nervosa (BN) in female twins. The fourth study investigated the association of midbrain and hypothalamic/thalamic SERT binding with body mass index (BMI) in monozygotic (MZ) twin pairs. Two radioligands for SERT imaging were used: [123I]ADAM (studies I–III) and [123I]nor-beta-CIT (study IV). Study subjects included young adult MZ and dizygotic (DZ) twins screened from the FinnTwin16 twin cohort (studies I–IV) and healthy young adult men recruited for study II. The first study validated the use of an automated brain template in the analyses of [123I]ADAM images and showed automated VOI definition to be more reproducible than manual VOI definition. The second study found no systematic within-subject variation in SERT binding between scans done in summer and winter in either of the investigated brain regions. The third study found similar SERT binding between BN women (including purging and non-purging probands), their unaffected female co-twins, and other healthy women in both brain regions; in post hoc analyses, a subgroup of purging BN women had significantly higher SERT binding in the midbrain as compared to all healthy women. In the fourth study, MZ twin pairs were divided into twins with higher BMI and co-twins with lower BMI; twins with higher BMI were found to have higher SERT binding in the hypothalamus/thalamus than their leaner co-twins. Our results allow the following conclusions: 1) No systematic seasonal variation exists in midbrain or thalamic SERT binding between summer and winter. 2) In a population-based sample, BN does not associate with altered SERT status, but alterations are possible in purging BN women. 3) The higher SERT binding in MZ twins with higher BMIs as compared to their leaner co-twins suggests a non-genetic association between acquired obesity and the brain 5-HT system, which may have implications for feeding behavior and satiety.
Abstract:
The incidence of non-melanoma skin cancer is increasing worldwide. The most frequent skin tumors are basal cell carcinoma, followed by squamous cell carcinoma (SCC) and malignant melanoma. Immunosuppressed patients have an increased risk of neoplasia, of which non-melanoma skin cancer is the most common. Matrix metalloproteinases (MMPs) are proteolytic enzymes that collectively are capable of degrading virtually all components of the extracellular matrix. MMPs can also process substrates distinct from extracellular matrix proteins and influence cell proliferation, differentiation, angiogenesis, and apoptosis. MMP activity is regulated by their natural inhibitors, tissue inhibitors of metalloproteinases (TIMPs). In this study, the expression patterns of MMPs, TIMPs, and certain cancer-related molecules were investigated in premalignant and malignant lesions of the human skin. The methods used were immunohistochemistry, in situ hybridization, and reverse transcriptase polymerase chain reaction (RT-PCR) from cell cultures. Our aim was to evaluate the expression pattern of MMPs in extramammary Paget's disease in order to find markers for more advanced tumors, as well as to shed light on the origin of this rare neoplasm. The novel MMP-21, -26, and -28 were studied in melanoma cell culture, in primary cutaneous melanomas, and in their sentinel nodes. The MMP expression profile in keratoacanthomas and well-differentiated squamous cell carcinomas was analyzed to find markers to differentiate benign keratinocyte hyperproliferation from malignantly transformed cells. Squamous cell carcinomas of immunosuppressed organ transplant recipients were compared to squamous cell carcinomas of matched immunocompetent controls to investigate the factors explaining their more aggressive nature. We found that MMP-7 and -19 proteins are abundant in extramammary Paget's disease and that their presence may predict an underlying adenocarcinoma in these patients. In melanomas, MMP-21 was upregulated in the early phases of melanoma progression, but disappeared from the more aggressive tumors with lymph node metastases. The presence of MMP-13 in primary melanomas and lymph node metastases may relate to more aggressive disease. In keratoacanthomas, the expression of MMP-7 and -9 is rare, and their presence should therefore raise a suspicion of well-differentiated squamous cell carcinoma. Furthermore, MMP-19 and p16 were observed in the benign keratinocyte hyperproliferation of keratoacanthomas, whereas they were generally lost from the malignant keratinocytes of SCCs. MMP-26 staining was significantly stronger in squamous cell carcinoma and Bowen's disease samples of organ transplant recipients, and it may contribute to the more aggressive nature of squamous cell carcinomas in immunosuppressed patients. In addition, the staining for MMP-9 was significantly stronger in macrophages surrounding the tumors of the immunocompetent group and in neutrophils of those patients on cyclosporin medication. In conclusion, based on our studies, MMP-7 and -19 might serve as biomarkers for more aggressive extramammary Paget's disease and MMP-21 for malignant transformation of melanocytes. MMP-7, -9, and -26, however, could play an important role in the pathobiology of keratinocyte-derived malignancies.
Abstract:
Continuous epidural analgesia (CEA) and continuous spinal postoperative analgesia (CSPA) provided by a mixture of local anaesthetic and opioid are widely used for postoperative pain relief. With the introduction of so-called microcatheters, for example, CSPA found its way particularly into orthopaedic surgery. These techniques, however, may be associated with dose-dependent side-effects such as hypotension, weakness in the legs, and nausea and vomiting. At times, they may fail to offer sufficient analgesia, e.g., because of a misplaced catheter. The correct position of an epidural catheter might be confirmed by the supposedly easy and reliable epidural stimulation test (EST). The aims of this thesis were to determine a) whether the efficacy, tolerability, and reliability of CEA might be improved by adding the α2-adrenergic agonists adrenaline and clonidine to CEA, and by the repeated use of EST during CEA; and b) the feasibility of CSPA given through a microcatheter after vascular surgery. Studies I–IV were double-blind, randomized, controlled trials; Study V was of a diagnostic, prospective nature. Patients underwent arterial bypass surgery of the legs (I, n=50; IV, n=46), total knee arthroplasty (II, n=70; III, n=72), and abdominal surgery or thoracotomy (V, n=30). Postoperative lumbar CEA consisted of regular mixtures of ropivacaine and fentanyl, either without or with adrenaline (2 µg/ml (I) and 4 µg/ml (II)) or clonidine (2 µg/ml (III)). CSPA (IV) was given through a microcatheter (28G) and contained either ropivacaine (max. 2 mg/h) or a mixture of ropivacaine (max. 1 mg/h) and morphine (max. 8 µg/h). Epidural catheter tip position (V) was evaluated both by EST at the moment of catheter placement and several times during CEA, and by epidurography as the reference diagnostic test. CEA and CSPA were administered for 24 or 48 h. Study parameters included pain scores assessed with a visual analogue scale, requirements for rescue pain medication, vital signs, and side-effects. Adrenaline (I and II) had no beneficial influence on the efficacy or tolerability of CEA. The total amounts of epidurally infused drugs were even increased in the adrenaline group in Study II (p=0.02, RM ANOVA). Clonidine (III) augmented pain relief with lowered amounts of epidurally infused drugs (p=0.01, RM ANOVA) and reduced the need for rescue oxycodone given i.m. (p=0.027, MW-U; median difference 3 mg (95% CI 0–7 mg)). Clonidine did not contribute to sedation, and its influence on haemodynamics was minimal. CSPA (IV) provided satisfactory pain relief with only limited blockade of the legs (no inter-group differences). EST (V) was often related to technical problems and difficulties of interpretation; for example, it failed to identify the four patients whose catheters were already outside the spinal canal at the time of catheter placement. Of the adjuvants to lumbar CEA, clonidine only slightly improved pain relief, while adrenaline provided no benefit. The role of EST applied at the time of epidural catheter placement or repeatedly during CEA remains open. The microcatheter CSPA technique appeared effective and reliable, but needs to be compared to routine CEA after peripheral arterial bypass surgery.
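Group comparisons of rescue analgesic consumption of the kind reported above (MW-U) are typically made with the Mann-Whitney U test. The sketch below shows such a comparison in Python with SciPy; the dose values are invented for illustration and are not the trial's data.

```python
from scipy.stats import mannwhitneyu

# Hypothetical 24-h rescue oxycodone doses (mg) per patient in each group
clonidine_group = [0, 3, 3, 6, 0, 3, 9, 0, 6, 3]
control_group = [6, 9, 3, 12, 6, 9, 6, 3, 9, 12]

# Two-sided Mann-Whitney U test comparing the two independent samples
u_statistic, p_value = mannwhitneyu(clonidine_group, control_group, alternative="two-sided")
print(f"Mann-Whitney U = {u_statistic}, p = {p_value:.3f}")
```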