985 results for Long-Term Synaptic Depression
Abstract:
Previous short-term studies predict that the use of fire to manage lantana (Lantana camara) may promote its abundance. We tested this prediction by examining long-term recruitment patterns of lantana in a dry eucalypt forest in Australia from 1959 to 2007 in three fire frequency treatments: repeated annual burning, repeated triennial burning and long unburnt. The dataset was divided into two periods (1959–1972, 1974–2007) due to logging that occurred at the study site between 1972 and 1974 and the establishment of the triennial burn treatment in 1973. Our results showed that repeated burning decreased lantana regeneration under an annual burn regime in the pre- and post-logging periods and maintained low levels of regeneration in the triennial burn compartment during the post-logging period. In the absence of fire, lantana recruitment exhibited a dome-shaped response over time, with the total population peaking in 1982 before declining through to 2007. In addition to fire regime, soil pH and carbon-to-nitrogen ratio, the density of taller conspecifics and the interaction between rainfall and fire regime were found to influence changes in lantana regeneration over time. The results suggest that the reported positive association between fire disturbance and abundance of lantana does not hold for all forest types and that fire should be considered as part of an integrated weed management strategy for lantana in more fire-tolerant ecosystems.
Abstract:
Glyphosate resistance is a rapidly developing threat to profitability in Australian cotton farming. Resistance causes an immediate reduction in the effectiveness of in-crop weed control in glyphosate-resistant transgenic cotton and summer fallows. Although strategies for delaying glyphosate resistance and those for managing resistant populations are qualitatively similar, the longer resistance can be delayed, the longer cotton growers will have choice over which tactics to apply and when to apply them. Effective strategies to avoid, delay, and manage resistance are thus of substantial value. We used a model of glyphosate resistance dynamics to perform simulations of resistance evolution in Sonchus oleraceus (common sowthistle) and Echinochloa colona (awnless barnyard grass) under a range of resistance prevention, delaying, and management strategies. From these simulations, we identified several elements that could contribute to effective glyphosate resistance prevention and management strategies. (i) Controlling glyphosate survivors is the most robust approach to delaying or preventing resistance. High-efficacy, high-frequency survivor control almost doubled the useful lifespan of glyphosate from 13 to 25 years even with glyphosate alone used in summer fallows. (ii) Two non-glyphosate tactics in-crop plus two in summer fallows is the minimum intervention required for long-term delays in resistance evolution. (iii) Pre-emergence herbicides are important, but should be backed up with non-glyphosate knockdowns and strategic tillage; replacing a late-season, pre-emergence herbicide with inter-row tillage was predicted to delay glyphosate resistance by 4 years in awnless barnyard grass. (iv) Weed species' ecological characteristics, particularly seed bank dynamics, have an impact on the effectiveness of resistance strategies; S. oleraceus, because of its propensity to emerge year-round, was less exposed to selection with glyphosate than E. colona, resulting in an extra 5 years of glyphosate usefulness (18 v. 13 years) even in the most rapid cases of resistance evolution. Delaying tactics are thus available that can provide some or many years of continued glyphosate efficacy. If glyphosate-resistant cotton cropping is to remain profitable in Australian farming systems in the long term, however, growers must adapt to the probability that they will have to deal with summer weeds that are no longer susceptible to glyphosate. Robust resistance management systems will need to include a diversity of weed control options, used appropriately.
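The resistance-dynamics model used in the simulations above is not specified here, but the selection logic it captures can be illustrated with a deliberately simplified, hypothetical seed-bank sketch. All parameter values, the 5% failure criterion and the function name below are illustrative assumptions rather than values from the study, so the year counts will not match the reported 13- and 25-year figures; the point is only the qualitative effect that controlling glyphosate survivors slows enrichment of the resistant fraction.

```python
# Deliberately simplified, hypothetical seed-bank model of glyphosate
# resistance enrichment in an annual weed. All parameter values and the
# 5% failure criterion are illustrative assumptions, not taken from the
# study's model, so the year counts will not match its 13/25-year figures.

def years_until_failure(years=30,
                        seeds_s=1e6,        # susceptible seeds in the bank
                        seeds_r=1e-2,       # initially very rare resistant seeds
                        germination=0.2,    # fraction of the bank emerging each year
                        bank_survival=0.4,  # ungerminated seeds persisting a year
                        kill_s=0.99,        # glyphosate kill rate, susceptible plants
                        kill_r=0.01,        # glyphosate kill rate, resistant plants
                        survivor_control=0.0,  # extra control of glyphosate survivors
                        fecundity=50.0):       # seeds set per plant escaping control
    """Return the year in which <95% of emerged plants are controlled."""
    for year in range(1, years + 1):
        plants_s, plants_r = seeds_s * germination, seeds_r * germination
        surv_s = plants_s * (1 - kill_s) * (1 - survivor_control)
        surv_r = plants_r * (1 - kill_r) * (1 - survivor_control)
        if (surv_s + surv_r) / (plants_s + plants_r) > 0.05:
            return year  # crude proxy for glyphosate "failure"
        # Seed return from escapes plus carry-over of the ungerminated bank.
        seeds_s = seeds_s * (1 - germination) * bank_survival + surv_s * fecundity
        seeds_r = seeds_r * (1 - germination) * bank_survival + surv_r * fecundity
    return None  # no failure within the simulated horizon

print("No survivor control: ", years_until_failure(survivor_control=0.0))
print("90% survivor control:", years_until_failure(survivor_control=0.9))
```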
Abstract:
In 2001, the red imported fire ant (Solenopsis invicta Buren) was identified in Brisbane, Australia. An eradication program involving broadcast bait treatment with two insect growth regulators and a metabolic inhibitor began in September of that year and is currently ongoing. To gauge the impacts of these treatments on local ant populations, we examined long-term monitoring data and quantified abundance patterns of S. invicta and common local ant genera using a linear mixed-effects model. For S. invicta, presence in pitfalls declined over time to zero at every site. Significantly higher numbers of S. invicta workers were collected on high-density polygyne sites, which took longer to disinfest compared with monogyne and low-density polygyne sites. For local ants, nine genus groups of the 10 most common genera analyzed either increased in abundance or showed no significant trend. Five of these genus groups were significantly less abundant at the start of monitoring on high-density polygyne sites compared with monogyne and low-density polygyne sites. The genus Pheidole declined significantly in abundance over time, suggesting that it was affected by treatment efforts. These results demonstrate that the treatment regime used at the time successfully removed S. invicta from these sites in Brisbane, and that most local ant genera were not seriously impacted by the treatment. These results have important implications for current and future prophylactic treatment efforts, and suggest that native ants remain in treated areas to provide some biological resistance to S. invicta.
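The abstract above quantifies abundance trends with a linear mixed-effects model but does not give its specification. As a rough, hedged illustration of that class of analysis (not the authors' actual model), a formula-based mixed model with monitoring site as a random grouping factor might look like the sketch below; the synthetic data, variable names and formula are assumptions made purely for the example.

```python
# Illustrative sketch only: a linear mixed-effects model of (log) pitfall-trap
# abundance over time with monitoring site as a random effect. The synthetic
# data, variable names and formula are assumptions for the example; this is
# not the model specification used in the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for i in range(12):
    site_effect = rng.normal(0, 0.5)  # random site-level intercept
    colony_type = rng.choice(["monogyne", "polygyne_low", "polygyne_high"])
    for year in range(8):
        # assumed declining trend in log-abundance, plus noise
        y = 3.0 + site_effect - 0.3 * year + rng.normal(0, 0.4)
        rows.append({"site": f"site_{i}", "year": year,
                     "colony_type": colony_type, "log_abundance": y})
data = pd.DataFrame(rows)

# Fixed effects: time trend, colony type and their interaction;
# random effect: site-level intercept.
model = smf.mixedlm("log_abundance ~ year * colony_type", data, groups=data["site"])
print(model.fit().summary())
```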
Abstract:
Immediate and residual effects of two lengths of low plane of nutrition (PON) on the synthesis of milk protein and protein fractions were studied at the Mutdapilly Research Station in south-east Queensland. Thirty-six multiparous Holstein-Friesian cows, between 46 and 102 days in milk (DIM) initially, were used in a completely randomised design experiment with three treatments. All cows were fed on a basal diet of ryegrass pasture (7.0 kg DM/cow.day), barley-sorghum concentrate mix (2.7 kg DM/cow.day) and a canola meal-mineral mix (1.3 kg DM/cow.day). To increase PON, 5.0 kg DM/cow.day of supplemental maize and forage sorghum silage was added to the basal diet. The three treatments were: (C) high PON (basal diet + supplemental silage); (L9) low PON (basal diet only) for a period of 9 weeks; and (L3) low PON (basal diet only) for a period of 3 weeks. The experiment comprised three periods: (1) covariate – high PON, all groups (5 weeks); (2) period of low PON for either 3 weeks (L3) or 9 weeks (L9); and (3) period of high PON (all groups) to assess the ability of cows to recover any production lost as a result of treatments (5 weeks). The low PON treatment periods for L3 and L9 were end-aligned so that all treatment groups began Period 3 together. Although there was a significant effect of L9 on yields of milk, protein, fat and lactose, and concentrations of true protein, whey protein and urea, these were not significantly different from L3. There were no residual effects of L3 or L9 on protein concentration or nitrogen distribution after 5 weeks of realimentation. There was no significant effect of low PON for 3 or 9 weeks on casein concentration or composition.
Abstract:
Background Hyperferritinemia-cataract syndrome (HCS) is a rare Mendelian condition characterized by bilateral cataract and high levels of serum ferritin in the absence of iron overload. Methods HCS was diagnosed in three adult siblings. In two of them it was possible to assess lens changes initially in 1995 and again in 2013. Serum ferritin, iron and transferrin concentrations and transferrin saturation percentage were also measured, and the Iron Responsive Element (IRE) region of the L-ferritin gene (FTL) was studied. Results Serum ferritin concentrations were considerably elevated, while serum iron, transferrin and transferrin saturation levels were within the normal range in each sibling. Cataract changes in our patients were consistent with those previously reported in the literature. Progression of the cataract, an aspect examined in few studies of this syndrome, appeared to be quite limited in extent. The heterozygous +32G>T (-168G>T) substitution in the IRE of the FTL gene was detected in this family. Conclusions Ophthalmic and biochemical studies together with genetic testing confirmed HCS in three family members. Although the disorder has been extensively described in recent years, little is known regarding cataract evolution over time. In our cases, lens evaluations encompassed many years, identified bilateral cataract of typical morphology and supported the hypothesis that this unique clinical feature of the disease tends to be slowly progressive in nature, at least in adults.
Abstract:
Soil biogeochemical cycles are largely mediated by microorganisms, while fire significantly modifies biogeochemical cycles mainly by altering microbial communities and substrate availability. The majority of studies on fire effects have focused on the surface soil; therefore, our understanding of the vertical distribution of microbial communities and the impacts of fire on nitrogen (N) dynamics in the soil profile is limited. Here, we examined changes in soil denitrification capacity (DNC) and denitrifying communities with depth under different burning regimes, and their interaction with environmental gradients along the soil profile. Results showed that soil depth had a more pronounced impact than the burning treatment on bacterial community size. The abundance of 16S rRNA and denitrification genes (narG, nirK, and nirS) declined exponentially with soil depth. Surprisingly, the nosZ-harboring denitrifiers were enriched in the deeper soil layers, likely indicating that nosZ-harboring denitrifiers can better adapt to stress conditions (i.e., oxygen deficiency, nutrient limitation, etc.) than other denitrifiers. Soil nutrients, including dissolved organic carbon (DOC), total soluble N (TSN), ammonium (NH4+), and nitrate (NO3−), declined significantly with soil depth, which probably contributed to the vertical distribution of denitrifying communities. Soil DNC decreased significantly with soil depth and was negligible at depths below 20 cm. These findings provide new insights into niche separation of the N-cycling functional guilds along the soil profile under a varied fire disturbance regime.
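The exponential decline of gene abundance with depth reported above can be summarised by fitting a log-linear model. The sketch below shows one common way to do that; the depths and copy numbers are invented for illustration and are not data from the study.

```python
# Hedged sketch: summarising an exponential decline of gene copy numbers with
# soil depth by a log-linear least-squares fit. Depths and copy numbers are
# invented for illustration; they are not data from the study.
import numpy as np

depth_cm = np.array([5.0, 15.0, 25.0, 35.0, 45.0, 55.0])
copies_per_g = np.array([2.0e9, 6.5e8, 2.1e8, 8.0e7, 2.6e7, 9.0e6])

# ln(N) = ln(N0) - k * depth, i.e. a straight line in depth
slope, intercept = np.polyfit(depth_cm, np.log(copies_per_g), deg=1)
k, n0 = -slope, np.exp(intercept)
print(f"fitted N(depth) ~ {n0:.2e} * exp(-{k:.3f} * depth_cm)")
```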
Abstract:
Prescribed fire is one of the most widely used management tools for reducing fuel loads in managed forests. However, the long-term effects of repeated prescribed fires on soil carbon (C) and nitrogen (N) pools are poorly understood. This study aimed to investigate how different fire frequency regimes influence C and N pools in the surface soils (0–10 cm). A prescribed fire field experiment established in 1972 in a wet sclerophyll forest in southeast Queensland was used in this study. The fire frequency regimes included long unburnt (NB), burnt every 2 years (2yrB) and burnt every 4 years (4yrB), with four replications. Compared with the NB treatment, the 2yrB treatment lowered soil total C by 44%, total N by 54%, HCl hydrolysable C and N by 48% and 59%, KMnO4 oxidizable C by 81%, microbial biomass C and N by 42% and 33%, cumulative CO2–C by 28%, NaOCl-non-oxidizable C and N by 41% and 51%, and charcoal-C by 17%, respectively. The 4yrB and NB treatments showed no significant differences for these soil C and N pools. All labile, biologically active, recalcitrant and total soil C and N pools were correlated positively with each other and with soil moisture content, but negatively correlated with soil pH. The C:N ratios of different C and N pools were greater in the burned treatments than in the NB treatment. This study has highlighted that prescribed burning at a four-year interval is a more sustainable management practice for this subtropical forest ecosystem.
Abstract:
Bovine Viral Diarrhoea Virus (BVDV) is one of the most serious pathogens affecting the cattle industry, causing tremendous economic loss worldwide and meriting the development of improved subunit vaccines. The structural glycoprotein E2 is reported to be a major immunogenic determinant of the BVDV virion. We have developed a novel hollow silica vesicle (SV)-based platform to administer BVDV-1 Escherichia coli-expressed optimised E2 (oE2) antigen as a nanovaccine formulation. The SV-140 vesicles (diameter 50 nm, wall thickness 6 nm, perforated by pores of entrance size 16 nm and total pore volume of 0.934 cm³ g⁻¹) have proven to be ideal candidates to load the oE2 antigen and generate an immune response. The current study demonstrates for the first time the ability of freeze-dried (FD) as well as non-FD oE2/SV-140 nanovaccine formulations to induce long-term, balanced antibody and cell-mediated memory responses for at least 6 months with a shortened dosing regimen of two doses in a small animal model. The in vivo ability of oE2 (100 μg)/SV-140 (500 μg) and FD oE2 (100 μg)/SV-140 (500 μg) to induce long-term immunity was compared to immunisation with oE2 (100 μg) together with the conventional adjuvant Quil-A from Quillaja saponaria (10 μg) in mice. The oE2/SV-140 as well as the FD oE2/SV-140 nanovaccine generated oE2-specific antibody and cell-mediated responses for up to six months after the second and final immunisation. Significantly, the cell-mediated responses were consistently high in mice immunised with oE2/SV-140 (1,500 SFU/million cells) at the six-month time point. Histopathology studies showed no morphological changes at the site of injection or in the different organs harvested from the mice immunised with 500 μg SV-140 nanovaccine compared to the unimmunised control. The platform has the potential for developing single-dose vaccines without the requirement for cold-chain storage for veterinary and human applications.
Abstract:
This study investigated long-term use of custom-made orthopedic shoes (OS) at 1.5 years follow-up. In addition, the association between short-term outcomes and long-term use was studied. Patients from a previously published study who did use their first-ever pair of OS 3 months after delivery received another questionnaire after 1.5 years. Patients with different pathologies were included in the study (n = 269, response = 86%). Mean age was 63 ± 14 years, and 38% were male. After 1.5 years, 87% of the patients still used their OS (78% frequently [4-7 days/week] and 90% occasionally [1-3 days/week]) and 13% of the patients had ceased using their OS. Patients who were using their OS frequently after 1.5 years had significantly higher scores for 8 of 10 short-term usability outcomes (p-values ranged from <0.001 to 0.046). The largest differences between users and nonusers were found for scores on the short-term outcomes of OS fit and communication with the medical specialist and shoe technician (effect size range = 0.16 to 0.46). We conclude that patients with worse short-term usability outcomes for their OS are more likely to use their OS only occasionally or not at all at long-term follow-up.
Abstract:
Objective To determine mortality rates after a first lower limb amputation and explore the rates for different subpopulations. Methods Retrospective cohort study of all people who underwent a first amputation at or proximal to transtibial level, in an area of 1.7 million people. Analysis with Kaplan-Meier curves and log-rank tests for univariate associations of psycho-social and health variables. Logistic regression for odds of death at 30 days, 1 year and 5 years. Results 299 people were included. Median time to death was 20.3 months (95%CI: 13.1; 27.5). 30-day mortality = 22%; odds of death 2.3 times higher in people with a history of cerebrovascular disease (95%CI: 1.2; 4.7, P = 0.016). 1-year mortality = 44%; odds of death 3.5 times higher for people with renal disease (95%CI: 1.8; 7.0, P < 0.001). 5-year mortality = 77%; odds of death 5.4 times higher for people with renal disease (95%CI: 1.8; 16.0, P = 0.003). Variation in mortality rates was most apparent between age groups, with people aged 75–84 years having better short-term outcomes than those younger and older. Conclusions Mortality rates demonstrated the frailty of this population, with almost one quarter of people dying within 30 days, and almost half within 1 year. People with cerebrovascular disease had higher odds of death at 30 days, and those with renal disease at 1 and 5 years, respectively.
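The odds ratios quoted above (for example, odds of death 3.5 times higher at 1 year with renal disease) come from logistic regression, and the unadjusted version of that quantity is simply a ratio of odds from a 2x2 table. The worked example below uses invented counts, not the study's data, purely to show the arithmetic.

```python
# Worked example of an unadjusted odds ratio for death at 1 year, using
# invented counts (not the study's data). A logistic regression with a single
# binary predictor reproduces exactly this odds ratio.
died_renal, survived_renal = 30, 12          # hypothetical renal-disease group
died_no_renal, survived_no_renal = 101, 156  # hypothetical comparison group

odds_ratio = (died_renal / survived_renal) / (died_no_renal / survived_no_renal)
print(f"odds ratio for death at 1 year: {odds_ratio:.1f}")  # ~3.9 here
```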
Abstract:
Background Around the world, guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly. To prevent occlusion, most institutions recommend the use of heparin when the CVC is not in use. However, there is debate regarding the need for heparin and evidence to suggest normal saline may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased costs. Objectives To assess the clinical effects (benefits and harms) of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants, children and adolescents. Design A Cochrane systematic review of randomised controlled trials was undertaken. - Data sources: The Cochrane Vascular Group Specialised Register (including MEDLINE, CINAHL, EMBASE and AMED) and the Cochrane Register of Studies were searched. Hand searching of relevant journals and reference lists of retrieved articles was also undertaken. - Review Methods: Data were extracted and appraisal undertaken. We included studies that compared the efficacy of normal saline with heparin to prevent occlusion. We excluded temporary CVCs and peripherally inserted central catheters. Rate ratios per 1000 catheter days were calculated for two outcomes, occlusion of the CVC, and CVC-associated blood stream infection. Results Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin. However, between studies, all used different protocols with various concentrations of heparin and frequency of flushes. The quality of the evidence ranged from low to very low. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin group was 0.75 (95% CI 0.10 to 5.51, two studies, 229 participants, very low quality evidence). The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37, two studies, 231 participants; low quality evidence). Conclusions It remains unclear whether heparin is necessary for CVC maintenance. More well-designed studies are required to understand this relatively simple, but clinically important question. Ultimately, if this evidence were available, the development of evidenced-based clinical practice guidelines and consistency of practice would be facilitated.
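The review's effect measure is a rate ratio per 1000 catheter days. The calculation is simple; the sketch below uses invented occlusion counts and catheter-day totals (not figures from the included trials) solely to show how such a ratio is formed.

```python
# Rate ratio per 1000 catheter days, computed on invented counts
# (not data from the trials included in the review).
def rate_per_1000(events, catheter_days):
    return events / catheter_days * 1000

saline_rate = rate_per_1000(events=18, catheter_days=9500)   # hypothetical
heparin_rate = rate_per_1000(events=24, catheter_days=9400)  # hypothetical
print(f"saline:  {saline_rate:.2f} occlusions per 1000 catheter days")
print(f"heparin: {heparin_rate:.2f} occlusions per 1000 catheter days")
print(f"rate ratio (saline vs heparin): {saline_rate / heparin_rate:.2f}")
```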
Abstract:
Staphylococcus aureus is one of the most important bacteria that cause disease in humans, and methicillin-resistant S. aureus (MRSA) has become the most commonly identified antibiotic-resistant pathogen in many parts of the world. MRSA rates have been stable for many years in the Nordic countries and the Netherlands, which have a low MRSA prevalence within Europe, but in recent decades MRSA rates have increased in those low-prevalence countries as well. MRSA has been established as a major hospital pathogen, but has also been found increasingly in long-term facilities (LTF) and in communities of persons with no connections to the health-care setting. In Finland, the annual number of MRSA isolates reported to the National Infectious Disease Register (NIDR) has increased steadily, especially outside the Helsinki metropolitan area. Molecular typing has revealed numerous outbreak strains of MRSA, some of which have previously been associated with community acquisition. In this work, data on MRSA cases notified to the NIDR and on MRSA strain types identified with pulsed-field gel electrophoresis (PFGE), multilocus sequence typing (MLST), and staphylococcal cassette chromosome mec (SCCmec) typing at the National Reference Laboratory (NRL) in Finland from 1997 to 2004 were analyzed. An increasing trend in MRSA incidence in Finland from 1997 to 2004 was shown. In addition, non-multi-drug resistant (NMDR) MRSA isolates, especially those resistant only to methicillin/oxacillin, showed an emerging trend. The predominant MRSA strains changed over time and place, but two internationally spread epidemic strains of MRSA, FIN-16 and FIN-21, were related to the increase detected most recently. Those strains were also one cause of the striking increase in invasive MRSA findings. The rise of MRSA strains with SCCmec types IV or V, possibly community-acquired MRSA, was also detected. Questionnaires were used to review the diagnostic methods used for MRSA identification in Finnish microbiology laboratories and the number of MRSA screening specimens studied. Surveys focusing on the MRSA situation in long-term facilities in 2001 and on the background information of MRSA-positive persons in 2001-2003 were also carried out. The rates of MRSA and screening practices varied widely across geographic regions. Some of the NMDR MRSA strains could remain undetected in some laboratories because of the insufficient diagnostic techniques used. The increasing proportion of the elderly population carrying MRSA suggests that MRSA is an emerging problem in Finnish long-term facilities. Among the patients, 50% of the specimens were taken on a clinical basis, 43% on a screening basis after exposure to MRSA, 3% on a screening basis because of hospital contact abroad, and 4% for other reasons. In response to an outbreak of MRSA possessing a new genotype that occurred in a health care ward and in an associated nursing home of a small municipality in Northern Finland in autumn 2003, a point-prevalence survey was performed six months later. In the same study, the molecular epidemiology of MRSA and methicillin-sensitive S. aureus (MSSA) strains was also assessed, the results were compared to the national strain collection, and the difficulties of MRSA screening for isolates with low-level oxacillin resistance were documented. The original MRSA outbreak in the LTF, which consisted of isolates possessing a nationally new PFGE profile (FIN-22) and an internationally rare MLST type (ST-27), was contained.
Another previously unrecognized MRSA strain was found with additional screening, possibly indicating that current routine MRSA screening methods may be insufficiently sensitive for strains possessing low-level oxacillin resistance. Most of the MSSA strains found were genotypically related to the epidemic MRSA strains, but only a few of them had received the SCCmec element, and all those strains possessed the new SCCmec type V. In the second-largest nursing home in Finland, colonization with S. aureus and MRSA, and the effect of screening sites and broth enrichment culture on the sensitivity of S. aureus detection, were studied. Combining enrichment broth with perineal swabbing, in addition to swabbing of the nostrils and skin lesions, may be an alternative to throat swabs in the nursing home setting, especially when residents are uncooperative. Finally, in order to evaluate adequate phenotypic and genotypic methods for reliable laboratory diagnostics of MRSA, oxacillin disk diffusion and MIC tests were compared to the cefoxitin disk diffusion method at both +35°C and +30°C, each with or without the addition of sodium chloride (NaCl) to the Mueller-Hinton test medium, and an in-house PCR was compared to two commercial molecular methods (the GenoType® MRSA test and the EVIGENE™ MRSA Detection test) using different bacterial species in addition to S. aureus. The cefoxitin disk diffusion method was superior to oxacillin disk diffusion and to the MIC tests in predicting mecA-mediated resistance in S. aureus when incubating at +35°C with or without the addition of NaCl to the test medium. Both the GenoType® MRSA and EVIGENE™ MRSA Detection tests are usable, accurate, cost-effective, and sufficiently fast methods for rapid MRSA confirmation from a pure culture.
Abstract:
Life-history theory states that although natural selection would favour a maximisation of both reproductive output and life-span, such a combination cannot be achieved in any living organism. According to life-history theory, the reason not all traits can be maximised simultaneously is that different traits compete with each other for resources. These relationships between traits that constrain the simultaneous evolution of two or more traits are called trade-offs. Therefore, during different life-stages an individual needs to optimise its allocation of resources to life-history components such as growth, reproduction and survival. Resource limitation acts on these traits and therefore investment in one trait, e.g. reproduction, reduces the resources available for investment in another trait, e.g. residual reproduction or survival. In this thesis I study how food resources during different stages of the breeding event affect reproductive decisions in the Ural owl (Strix uralensis) and the consequences of these decisions for parents and offspring. The Ural owl is a suitable species for such studies in natural populations since it is long-lived, site-tenacious, and feeds on voles. The vole populations in Fennoscandia fluctuate in three- to four-year cycles, which create a variable food environment for the Ural owls to cope with. The thesis gives new insight into reproductive costs and their consequences in natural animal populations, with emphasis on underlying physiological mechanisms. I found that supplementary-fed Ural owl parents invest supplemental food resources during breeding in their own self-maintenance instead of allocating those resources to offspring growth. This investment in their own maintenance instead of improving current reproduction had carry-over effects to the following year in terms of increased reproductive output. Therefore, I found evidence that reduced reproductive costs improve future reproductive performance. Furthermore, I found evidence for the underlying mechanism behind this carry-over effect of supplementary food on fecundity. The supplementary-fed parents reduced their feeding investment in the offspring compared to controls, which enabled the fed female parents to invest the surplus resources in parasite resistance. Fed female parents had lower blood parasite loads than control females, and this effect lasted until the following year, when reproductive output was also increased. Hence, increased investment in parasite resistance when resources are plentiful has the potential to mediate positive carry-over effects on future reproduction. I further found that this carry-over effect was only present when the potential for future reproduction was good. The thesis also provides new knowledge on how resource limitation shapes maternal effects. I found that increased resources prior to egg laying improve the condition and health of Ural owl females and enable them to allocate more resources to reproduction than control females. These additional resources are not allocated to increasing the number of offspring, but instead to improving the quality of each offspring. Fed Ural owl females increased the size of their eggs and allocated more health-improving immunological components into the eggs. Furthermore, the increased egg size had long-lasting effects on offspring growth, as offspring from larger eggs were heavier at fledging.
Limiting resources can have different short- and long-term consequences on reproductive decisions that affect both offspring number and quality. In long-lived organisms, such as the Ural owl, it appears to be beneficial in terms of fitness to invest in long breeding life-span instead of additional investment in current reproduction. In Ural owls, females can influence the phenotypic quality of the offspring by transferring additional resources to the eggs that can have long-lasting effects on growth.
Abstract:
With transplant rejection rendered a minor concern and survival rates after liver transplantation (LT) steadily improving, long-term complications are attracting more attention. Current immunosuppressive therapies, together with other factors, are accompanied by considerable long-term toxicity, which clinically manifests as renal dysfunction, high risk for cardiovascular disease, and cancer. This thesis investigates the incidence, causes, and risk factors for such renal dysfunction, cardiovascular risk, and cancer after LT. Long-term effects of LT are further addressed by surveying the quality of life and employment status of LT recipients. The consecutive patients included had undergone LT at Helsinki University Hospital from 1982 onwards. Data regarding renal function – creatinine and estimated glomerular filtration rate (GFR) – were recorded before and repeatedly after LT in 396 patients. The presence of hypertension, dyslipidemia, diabetes, impaired fasting glucose, and overweight/obesity before and 5 years after LT was determined among 77 patients transplanted for acute liver failure. The entire cohort of LT patients (540 patients), including both children and adults, was linked with the Finnish Cancer Registry, and numbers of cancers observed were compared to site-specific expected numbers based on national cancer incidence rates stratified by age, gender, and calendar time. Health-related quality of life (HRQoL), measured by the 15D instrument, and employment status were surveyed among all adult patients alive in 2007 (401 patients). The response rate was 89%. Posttransplant cardiovascular risk factor prevalence and HRQoL were compared with that in the age- and gender-matched Finnish general population. The cumulative risk for chronic kidney disease increased from 10% at 5 years to 16% at 10 years following LT. GFR up to 10 years after LT could be predicted by the GFR at 1 year. In patients transplanted for chronic liver disease, a moderate correlation of pretransplant GFR with later GFR was also evident, whereas in acute liver failure patients after LT, even severe pretransplant renal dysfunction often recovered. By 5 years after LT, 71% of acute liver failure patients were receiving antihypertensive medications, 61% were exhibiting dyslipidemia, 10% were diabetic, 32% were overweight, and 13% obese. Compared with the general population, only hypertension displayed a significantly elevated prevalence among patients – 2.7-fold – whereas patients exhibited 30% less dyslipidemia and 71% less impaired fasting glucose. The cumulative incidence of cancer was 5% at 5 years and 13% at 10. Compared with the general population, patients were subject to a 2.6-fold cancer risk, with non-melanoma skin cancer (standardized incidence ratio, SIR, 38.5) and non-Hodgkin lymphoma (SIR 13.9) being the predominant malignancies. Non-Hodgkin lymphoma was associated with male gender, young age, and the immediate posttransplant period, whereas old age and antibody induction therapy raised skin-cancer risk. HRQoL deviated clinically unimportantly from the values in the general population, but significant deficits among patients were evident in some physical domains. HRQoL did not seem to decrease with longer follow-up. Although 87% of patients reported improved working capacity, data on return to working life showed marked age-dependency: Among patients aged less than 40 at LT, 70 to 80% returned to work, among those aged 40 to 50, 55%, and among those above 50, 15% to 28%. 
The most common cause for unemployment was early retirement before LT. Those patients employed exhibited better HRQoL than those unemployed. In conclusion, although renal impairment, hypertension, and cancer are evidently common after LT and increase with time, patients’ quality of life remains comparable with that of the general population.
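The cancer-risk figures in the last abstract are standardized incidence ratios (SIR): observed cases divided by the number expected from national incidence rates stratified by age, gender and calendar time. A minimal worked example with invented person-years and rates is sketched below; none of the numbers are from the cohort.

```python
# Standardized incidence ratio (SIR) = observed / expected cases, where the
# expected count sums person-years times the population rate over strata.
# The strata, person-years and rates below are invented for illustration.
strata = [
    # (person_years, population_rate_per_100000)
    (1200, 45.0),
    (2100, 120.0),
    (900, 310.0),
]
expected = sum(py * rate / 1e5 for py, rate in strata)
observed = 14  # hypothetical observed case count in the cohort
print(f"expected = {expected:.2f} cases, SIR = {observed / expected:.1f}")
```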