915 results for NATURAL MORTALITY-RATES


Relevance: 80.00%

Publisher:

Abstract:

People with psychotic disorders have higher mortality rates compared to the general population. Most deaths are due to cardiovascular (CV) disease, reflecting high rates of CV risk factors such as obesity and diabetes. Treatment with antipsychotic drugs is associated with weight gain in clinical trials. However, there is little information about how these drugs affect children and young people, and how early in the course of treatment the elevation in CV risk factors begins. This information is essential in understanding the costs and benefits of these treatments in young people, and establishing preventive and early intervention services to address physical health comorbidities. This symposium reports both prospective and naturalistic data from children and adolescents treated with antipsychotic drugs. These studies demonstrate that adverse effects on cardiometabolic measures, notably BMI and insulin resistance, become apparent very soon after treatment is initiated. Further, children and adolescents appear to be even more sensitive to these effects than adults. Population-wide studies are also informative. Danish data showing that young people exposed to antipsychotics have a higher risk of diabetes, compared with young people who had a psychiatric diagnosis but were not exposed to antipsychotic drugs, will be presented. In addition, an Australian comparison between a large, nationally representative sample of people with psychosis and a general population sample shows that higher rates of obesity and other cardiometabolic abnormalities are already evident in people with psychosis by the age of 25 years. Young people living with psychosis are already disadvantaged by the demands of living with mental illness, stigma, and social factors such as unemployment and low income. The addition of obesity, diabetes and other comorbidities adds a further burden. 
The data presented highlight the need for careful selection of antipsychotic drugs, regular monitoring of physical health, and early intervention when weight gain, glucose dysregulation, or other cardiometabolic abnormalities are detected.


Occasional strong droughts are an important feature of the climatic environment of tropical rain forest in much of Borneo. This paper compares the response of a lowland dipterocarp forest at Danum, Sabah, in a period of low (LDI) and a period of high (HDI) drought intensity (1986-96, 9.98 y; 1996-99, 2.62 y). Mean annual drought intensity was two-fold higher in the HDI than the LDI period (1997 v. 976 mm), and each period had one moderately strong main drought (viz. 1992, 1998). Mortality of 'all' trees ≥10 cm gbh (girth at breast height) and stem growth rates of 'small' trees 10-50 cm gbh were measured in sixteen 0.16-ha subplots (half on ridge, half on lower slope sites) within two 4-ha plots. These 10-50-cm trees were composed largely of true understorey species. A new procedure was developed to correct for the effect of differences in length of census interval when comparing tree mortality rates. Mortality rates of small trees declined slightly but not significantly between the LDI and HDI periods (1.53 to 1.48% y⁻¹); mortality of all trees showed a similar pattern. Relative growth rates declined significantly by 23% from the LDI to the HDI period (11.1 to 8.6 mm m⁻¹ y⁻¹); for absolute growth rates the decrease was 28% (2.45 to 1.77 mm y⁻¹). Neither mortality nor growth rates were significantly influenced by topography. For small trees, across subplots, absolute growth rate was positively correlated with mortality rate in the LDI period, but negatively correlated in the HDI period. There was no consistent pattern in the responses among the 19 most abundant species (n ≥ 50 trees), which included a proposed drought-tolerant guild. In terms of tree survival, the forest at Danum was resistant to increasing drought intensity, but showed decreased stem growth attributable to increasing water stress.
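The census-interval problem mentioned above arises because mortality rates computed over intervals of different length are not directly comparable. The paper's own correction procedure is not given in this abstract, so the sketch below shows only the conventional annual exponential mortality coefficient it builds on; the function name and stem counts are hypothetical, chosen to land near the reported ~1.5% y⁻¹:

```python
import math

def annual_mortality_rate(n0, n_survivors, interval_years):
    """Annual exponential mortality coefficient, in % per year:
    lambda = ln(N0 / Nt) / t * 100.
    Because mortality is not constant in time, values of lambda estimated
    over intervals of different length need a census-interval correction
    before they can be compared."""
    return math.log(n0 / n_survivors) / interval_years * 100.0

# Hypothetical counts: 1000 small trees at the first census,
# 860 still alive 9.98 years later -> roughly 1.5% per year.
rate = annual_mortality_rate(1000, 860, 9.98)
```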


In the southern part of Korup National Park, Cameroon, the mast-fruiting tree Microberlinia bisulcata occurs as a codominant in groves of ectomycorrhizal Caesalpiniaceae within a mosaic of otherwise species-rich lowland rain forest. To estimate the amount of carbon and nutrients invested in reproduction during a mast fruiting event, and the consequential seed and seedling survival, three related field studies were made in 1995. These provided a complete seed and seedling budget for the cohort. Seed production was estimated by counting woody pods on the forest floor. Trees produced on average 26,000 (range 0-92,000) seeds/tree, with a dry mass of 16.6 kg/tree. Seeds were contained in woody pods of mass 307 kg/tree. Dry mass production of pods and seeds was 1034 kg ha⁻¹, equivalent to over half (55%) of annual leaf litterfall for this species, and contained 13% of the nitrogen and 21% of the phosphorus in annual leaf litterfall. Seed and young-seedling mortality was investigated with open quadrats and cages to exclude vertebrate predators, at two distances from the parent tree. The proportion of seeds on the forest floor which disappeared in the first 6 wk after dispersal was 84%, of which 26.5% was due to likely vertebrate removal, 36% to rotting, and 21.5% to other causes. Vertebrate predation was greater close to the stem than 5 m beyond the crown (41 vs 12% of seeds disappearing), where the seed shadow was less dense. Previous studies have demonstrated an association between mast years at Korup and high dry-season radiation before flowering, and have shown lower leaf-litterfall phosphorus concentrations following mast fruiting. The emerging hypothesis is that mast fruiting is primarily imposed by energy limitation for fruit production, but phosphorus supply and vertebrate predation are regulating factors. Recording the survival of naturally regenerating M. bisulcata seedlings (6-wk stage) showed that 21% of seedlings survived to 31 mo.
A simple three-stage recruitment model was constructed. Mortality rates were initially high and peaked again in each of the next two dry seasons, with smaller peaks in the two intervening wet seasons, the latter coinciding with annual troughs in radiation. The very poor recruitment of M. bisulcata trees in Korup, demonstrated in previous investigations, appears not to be due to a limitation in seed or young-seedling supply, but rather to factors operating at the established-seedling stage.
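As a rough bookkeeping check, the stage-wise losses reported above can be chained for the mean seed crop. This assumes the 84% six-week loss and the 21% later survival apply sequentially to the average cohort, which is an illustrative simplification of the three-stage budget, not the study's model:

```python
# Mean seed crop per tree reported in the study
seeds = 26_000

# 84% of seeds on the forest floor disappeared within 6 wk of dispersal
# (26.5% likely vertebrate removal + 36% rotting + 21.5% other causes)
assert abs((26.5 + 36.0 + 21.5) - 84.0) < 1e-9
six_week_seedlings = seeds * (1 - 0.84)           # ~4,160 six-week-stage seedlings

# 21% of 6-wk-stage seedlings survived to 31 months
survivors_31_months = six_week_seedlings * 0.21   # ~874 established seedlings per tree
```

On these assumptions a mean tree's 26,000 seeds yield on the order of 900 established seedlings, which is why the abstract locates the recruitment bottleneck at the established-seedling stage rather than in seed supply.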


BACKGROUND Treatment of patients with paediatric acute lymphoblastic leukaemia has evolved such that the risk of late effects in survivors treated in accordance with contemporary protocols could be different from that noted in those treated decades ago. We aimed to estimate the risk of late effects in children with standard-risk acute lymphoblastic leukaemia treated with contemporary protocols. METHODS We used data from similarly treated members of the Childhood Cancer Survivor Study cohort. The Childhood Cancer Survivor Study is a multicentre, North American study of 5-year survivors of childhood cancer diagnosed between 1970 and 1986. We included cohort members if they were aged 1·0-9·9 years at the time of diagnosis of acute lymphoblastic leukaemia and had received treatment consistent with contemporary standard-risk protocols for acute lymphoblastic leukaemia. We calculated mortality rates and standardised mortality ratios, stratified by sex and survival time, after diagnosis of acute lymphoblastic leukaemia. We calculated standardised incidence ratios and absolute excess risk for subsequent neoplasms with age-specific, sex-specific, and calendar-year-specific rates from the Surveillance, Epidemiology and End Results Program. Outcomes were compared with a sibling cohort and the general US population. FINDINGS We included 556 (13%) of 4329 cohort members treated for acute lymphoblastic leukaemia. Median follow-up of the survivors from 5 years after diagnosis was 18·4 years (range 0·0-33·0). 28 (5%) of 556 participants had died (standardised mortality ratio 3·5, 95% CI 2·3-5·0). 16 (57%) deaths were due to causes other than recurrence of acute lymphoblastic leukaemia. Six (1%) survivors developed a subsequent malignant neoplasm (standardised incidence ratio 2·6, 95% CI 1·0-5·7). 107 participants (95% CI 81-193) in each group would need to be followed up for 1 year to observe one extra chronic health disorder in the survivor group compared with the sibling group.
415 participants (376-939) in each group would need to be followed up for 1 year to observe one extra severe, life-threatening, or fatal disorder in the group of survivors. Survivors did not differ from siblings in their educational attainment, rate of marriage, or independent living. INTERPRETATION The prevalence of adverse long-term outcomes in children treated for standard-risk acute lymphoblastic leukaemia according to contemporary protocols is low, but regular care from a knowledgeable primary-care practitioner is warranted. FUNDING National Cancer Institute, American Lebanese-Syrian Associated Charities, Swiss Cancer Research.
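The two summary measures used in this abstract are simple ratios: the standardised mortality ratio (observed over expected deaths) and a number-needed-to-follow (reciprocal of the absolute excess event rate). A minimal sketch, with hypothetical helper names and the abstract's figures used only as a consistency check (exact Poisson confidence-interval methods are omitted):

```python
def smr(observed_deaths, expected_deaths):
    """Standardised mortality ratio: deaths observed in the cohort divided
    by deaths expected from age-, sex- and calendar-year-specific
    population rates."""
    return observed_deaths / expected_deaths

def number_needed_to_follow(excess_events_per_person_year):
    """Participants followed for one year to observe one extra event:
    the reciprocal of the absolute excess event rate."""
    return 1.0 / excess_events_per_person_year

# 28 observed deaths with an SMR of 3.5 imply about 8 expected deaths
expected = 28 / 3.5
# An NNF of 107 corresponds to an absolute excess rate of ~0.0093 events
# per person-year in survivors versus siblings
nnf = number_needed_to_follow(1 / 107)
```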


BACKGROUND Recently, two simple clinical scores were published to predict survival in trauma patients. Both scores may successfully guide major trauma triage, but neither has been independently validated in a hospital setting. METHODS This is a cohort study with 30-day mortality as the primary outcome to validate two new trauma scores, the Mechanism, Glasgow Coma Scale (GCS), Age, and Pressure (MGAP) score and the GCS, Age, and Pressure (GAP) score, using data from the UK Trauma Audit and Research Network. First, an assessment of discrimination, using the area under the receiver operating characteristic (ROC) curve, and of calibration, comparing mortality rates with those originally published, was performed. Second, we calculated sensitivity, specificity, predictive values, and likelihood ratios for prognostic score performance. Third, we propose new cutoffs for the risk categories. RESULTS A total of 79,807 adult (≥16 years) major trauma patients (2000-2010) were included; 5,474 (6.9%) died. Mean (SD) age was 51.5 (22.4) years, median GCS score was 15 (interquartile range, 15-15), and median Injury Severity Score (ISS) was 9 (interquartile range, 9-16). More than 50% of the patients had a low-risk GAP or MGAP score (1% mortality). With regard to discrimination, areas under the ROC curve were 87.2% for the GAP score (95% confidence interval, 86.7-87.7) and 86.8% for the MGAP score (95% confidence interval, 86.2-87.3). With regard to calibration, 2,390 (3.3%), 1,900 (28.5%), and 1,184 (72.2%) patients died in the low-, medium-, and high-risk GAP categories, respectively. In the low- and medium-risk groups, these rates were almost double those previously published. For MGAP, 1,861 (2.8%), 1,455 (15.2%), and 2,158 (58.6%) patients died in the low-, medium-, and high-risk categories, consonant with the results originally published. Reclassifying score point cutoffs improved likelihood ratios, sensitivity and specificity, as well as areas under the ROC curve.
CONCLUSION We found both scores to be valid triage tools to stratify emergency department patients according to their risk of death. MGAP calibrated better, but GAP slightly improved discrimination. The newly proposed cutoffs better differentiate risk classification and may therefore facilitate hospital resource allocation. LEVEL OF EVIDENCE Prognostic study, level II.
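The score-performance measures reported above (sensitivity, specificity, likelihood ratios) all derive from a 2×2 table at a given score cutoff. A generic sketch of that machinery, with made-up counts for illustration rather than the TARN data:

```python
def cutoff_performance(tp, fp, fn, tn):
    """Sensitivity, specificity and likelihood ratios for one score cutoff.
    tp/fp/fn/tn: true/false positives/negatives for predicting death."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    lr_positive = sensitivity / (1 - specificity)   # LR+ : how much a positive result raises the odds of death
    lr_negative = (1 - sensitivity) / specificity   # LR- : how much a negative result lowers them
    return sensitivity, specificity, lr_positive, lr_negative

# Hypothetical counts for illustration only
sens, spec, lr_pos, lr_neg = cutoff_performance(tp=90, fp=20, fn=10, tn=80)
```

Moving a cutoff trades sensitivity against specificity; the abstract's reclassification exercise is a search over such cutoffs for ones with better likelihood ratios.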


Pneumococcal meningitis is associated with high morbidity and mortality rates. Brain damage caused by this disease is characterized by apoptosis in the hippocampal dentate gyrus, a morphological correlate of learning deficits in experimental paradigms. The mood stabilizer lithium has previously been found to attenuate brain damage in ischemic and inflammatory diseases of the brain. An infant rat model of pneumococcal meningitis was used to investigate the neuroprotective and neuroregenerative potential of lithium. To assess an effect on the acute disease, LiCl was administered starting five days prior to intracisternal infection with live Streptococcus pneumoniae. Clinical parameters were recorded, cerebrospinal fluid (CSF) was sampled, and the animals were sacrificed 42 hours after infection to harvest the brain and serum. Cryosections of the brains were stained for Nissl substance to quantify brain injury. Hippocampal gene expression of Bcl-2, Bax, p53, and BDNF was analyzed. Lithium concentrations were measured in serum and CSF. The effect of chronic lithium treatment on spatial memory function and cell survival in the dentate gyrus was evaluated in a Morris water maze and by quantification of BrdU incorporation after LiCl treatment during the 3 weeks following infection. In the hippocampus, LiCl significantly reduced apoptosis and gene expression of Bax and p53 while it increased expression of Bcl-2. IL-10, MCP-1, and TNF were significantly increased in animals treated with LiCl compared with NaCl-treated controls. Chronic LiCl treatment improved spatial memory in infected animals. The mood stabilizer lithium may thus be a therapeutic alternative to attenuate neurofunctional deficits resulting from pneumococcal meningitis.


OBJECTIVES To report the mid-term results of aortic root replacement using a self-assembled biological composite graft, consisting of a vascular tube graft and a stented tissue valve. METHODS Between January 2005 and December 2011, 201 consecutive patients [median age 66 (interquartile range, IQR, 55-77) years, 31 female patients (15.4%), median logistic EuroSCORE 10 (IQR 6.8-23.2)] underwent aortic root replacement using a stented tissue valve for the following indications: annulo-aortic ectasia or ascending aortic aneurysm with aortic valve disease in 162 (76.8%) patients, active infective endocarditis in 18 (9.0%) and acute aortic dissection Stanford type A in 21 (10.4%). All patients underwent clinical and echocardiographic follow-up. We analysed survival and valve-related events. RESULTS The overall in-hospital mortality rate was 4.5%. One- and 5-year cardiac-related mortality rates were 3 and 6%, and overall survival was 95 ± 1.5 and 75 ± 3.6%, respectively. The rate of freedom from structural valve failure was 99% and 97 ± 0.4% at the 1- and 5-year follow-up, respectively. The incidence rates of prosthetic valve endocarditis were 3 and 4%, respectively. During a median follow-up of 28 (IQR 14-51) months, only 2 (1%) patients required valve-related redo surgery due to prosthetic valvular endocarditis and none suffered from thromboembolic events. One percent of patients showed structural valve deterioration without any clinical symptoms; none of the patients suffered greater than mild aortic regurgitation. CONCLUSIONS Aortic root replacement using a self-assembled biological composite graft is an interesting option. Haemodynamic results are excellent, with freedom from structural valve failure. Need for reoperation is extremely low, but long-term results are necessary to prove the durability of this concept.


OBJECTIVES Loeys-Dietz syndrome (LDS) is characterized by acute aortic dissection (AAD) at aortic diameters below thresholds for intervention in patients with Marfan syndrome (MFS). The aim was to evaluate the outcome of LDS patients primarily treated as having MFS. METHODS We analysed 68 consecutive patients who underwent surgery between 1995 and 2007 under the assumption of having MFS before retrospectively being screened for LDS when genetic testing became available. These patients were followed up until 2013, and underwent a total of 115 aortic surgeries. RESULTS Genetic testing was performed in 76% of the patients. Sixty per cent of these patients were positive for FBN1 mutations associated with MFS, 20% had no FBN1 mutation and 17% harboured TGFBR1/2 mutations associated with LDS. Mean follow-up was 12.7 ± 7 years. All-cause 30-day, 6-month and 1-year mortality rates were 2.9, 4.4 and 7.3%, respectively. Interestingly, initial presentation with AAD did not differ between LDS and MFS (33 vs 37%, P = 0.48) nor did long-term mortality compared with MFS patients (11 vs 16%, P = 1.0) or within MFS subgroups (FBN1 positive 13%, P = 1.0; FBN1 negative 10%, P = 1.0; not tested 25%, P = 0.62). There was no difference in the need for secondary total arch replacement between LDS and MFS patients (11 vs 14%, P = 1.0), nor within MFS subgroups (FBN1 positive 16%, P = 1.0; FBN1 negative 10%, P = 1.0; not tested 13%, P = 1.0). Total aortic replacement became necessary in 22% of LDS compared with 12% of MFS patients (P = 0.6) and did not differ significantly between MFS subgroups. CONCLUSIONS Although early surgical intervention in LDS is warranted to avoid AAD, the current data suggest that once the diseased segment is repaired, there seems to be no additional burden in terms of mortality or reoperation rate compared with that in MFS patients, with or without confirmed FBN1 mutation.


Prostate cancer (CaP) is the most commonly diagnosed malignancy in males in the Western world, with one in six males diagnosed in their lifetime. Current clinical prognostication groupings use pathologic Gleason score, pre-treatment prostate-specific antigen and Union for International Cancer Control-TNM staging to place patients with localized CaP into low-, intermediate- and high-risk categories. These categories represent an increasing risk of biochemical failure and CaP-specific mortality rates; they also reflect the need for increasing treatment intensity and the justification for increased side effects. In this article, we point out that 30-50% of patients will still fail image-guided radiotherapy or surgery despite the judicious use of clinical risk categories, owing to interpatient heterogeneity in treatment response. To improve treatment individualization, better predictors of prognosis and radiotherapy treatment response are needed to triage patients to bespoke and intensified CaP treatment protocols. These should include the use of pre-treatment genomic tests based on DNA or RNA indices and/or assays that reflect cancer metabolism, such as hypoxia assays, to define patient-specific CaP progression and aggression. More importantly, it is argued that these novel prognostic assays could be even more useful if combined together to drive forward precision cancer medicine for localized CaP.


During the colonization of the Americas in the 18th and 19th centuries, Africans were captured and shipped to America. Harsh living and working conditions often led to chronic diseases and high mortality rates. Slaves in the Caribbean were forced to work mainly on sugar plantations. They were buried in cemeteries like Anse Sainte-Marguerite on the isle of Grande-Terre (Guadeloupe), which was examined by archaeologists and physical anthropologists. Morphological studies on the osseous remains of 148 individuals revealed 15 cases with signs of bone tuberculosis and a high frequency of periosteal reactions, which indicates early stages of the disease. Eleven bone samples from these cemeteries were analysed for ancient DNA. The samples were extracted with established procedures and examined for the cytoplasmic multicopy β-actin gene and Mycobacterium tuberculosis complex DNA (IS6110) by PCR. An amplification product for M. tuberculosis with the size of 123 bp was obtained. Sequencing confirmed the result. This study shows evidence of M. tuberculosis complex DNA in a Caribbean slave population.


Recent changes in sanitary policies within the European Union (EU) concerning the disposal of carcasses of domestic animals, and the increase in non-natural mortality factors such as illegal poisoning, are threatening European vultures. However, the effects of anthropogenic activities on demographic parameters are poorly studied. Using a long-term study (1994–2011) of the threatened Pyrenean Bearded Vulture Gypaetus barbatus population, we assess the variation in the proportion of breeding pairs, egg-laying dates, clutch size, breeding success, and survival following a sharp reduction in food availability in 2005 due to the application of restrictive sanitary policies decreasing livestock carcass availability. We found a delay in laying dates and a regressive trend in clutch size, breeding success, and survival following the policy change. The maintenance of specific supplementary feeding stations for Bearded Vultures probably reduced the negative effects of illegal poisoning and food shortages, which mainly affected subadult survival. A drop in food availability may have produced changes in demographic parameters and an increase in mortality due to increased exposure to contaminated food. As a result, supplementary feeding as a precautionary measure can be a useful tool to reduce illegal poisoning and declines in demographic parameters until previous food availability scenarios are achieved. This study shows how anthropogenic activities, through human health regulations that affect habitat quality, can suddenly modify demographic parameters in long-lived species, including parameters, such as survival, to which population growth rate is highly sensitive.


OBJECTIVE Reliable tools to predict long-term outcome among patients with well compensated advanced liver disease due to chronic HCV infection are lacking. DESIGN Risk scores for mortality and for cirrhosis-related complications were constructed with Cox regression analysis in a derivation cohort and evaluated in a validation cohort, both including patients with chronic HCV infection and advanced fibrosis. RESULTS In the derivation cohort, 100/405 patients died during a median 8.1 (IQR 5.7-11.1) years of follow-up. Multivariate Cox analyses showed age (HR=1.06, 95% CI 1.04 to 1.09, p<0.001), male sex (HR=1.91, 95% CI 1.10 to 3.29, p=0.021), platelet count (HR=0.91, 95% CI 0.87 to 0.95, p<0.001) and log10 aspartate aminotransferase/alanine aminotransferase ratio (HR=1.30, 95% CI 1.12 to 1.51, p=0.001) were independently associated with mortality (C statistic=0.78, 95% CI 0.72 to 0.83). In the validation cohort, 58/296 patients with cirrhosis died during a median of 6.6 (IQR 4.4-9.0) years. Among patients with estimated 5-year mortality risks <5%, 5-10% and >10%, the observed 5-year mortality rates in the derivation cohort and validation cohort were 0.9% (95% CI 0.0 to 2.7) and 2.6% (95% CI 0.0 to 6.1), 8.1% (95% CI 1.8 to 14.4) and 8.0% (95% CI 1.3 to 14.7), 21.8% (95% CI 13.2 to 30.4) and 20.9% (95% CI 13.6 to 28.1), respectively (C statistic in validation cohort = 0.76, 95% CI 0.69 to 0.83). The risk score for cirrhosis-related complications also incorporated HCV genotype (C statistic = 0.80, 95% CI 0.76 to 0.83 in the derivation cohort; and 0.74, 95% CI 0.68 to 0.79 in the validation cohort). CONCLUSIONS Prognosis of patients with chronic HCV infection and compensated advanced liver disease can be accurately assessed with risk scores including readily available objective clinical parameters.
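A Cox-based risk score of this kind converts a linear predictor (covariates weighted by log hazard ratios) into an absolute risk via a baseline survival function. The sketch below uses the abstract's hazard ratios as coefficients, but the baseline survival and reference predictor are made-up placeholders, so the numbers are illustrative only, not the published score:

```python
import math

# Coefficients are ln(hazard ratio), taken from the hazard ratios above
B_AGE = math.log(1.06)        # per year of age
B_MALE = math.log(1.91)       # male sex (1 = male, 0 = female)
B_PLATELET = math.log(0.91)   # per unit of platelet count
B_AST_ALT = math.log(1.30)    # per unit of log10(AST/ALT ratio)

def linear_predictor(age, male, platelet, log10_ast_alt):
    return (B_AGE * age + B_MALE * male
            + B_PLATELET * platelet + B_AST_ALT * log10_ast_alt)

def five_year_mortality_risk(lp, lp_reference, baseline_survival=0.95):
    """Cox model: S(t|x) = S0(t) ** exp(lp - lp_reference); risk = 1 - S.
    baseline_survival and lp_reference are hypothetical placeholders,
    not values from the paper."""
    return 1.0 - baseline_survival ** math.exp(lp - lp_reference)

# At the reference covariate pattern, the risk equals 1 - S0 = 5%
risk = five_year_mortality_risk(2.0, 2.0)
```

Observed 5-year mortality in each predicted-risk stratum (<5%, 5-10%, >10%) is then compared against these predictions, which is the calibration check the abstract reports.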


INTRODUCTION Community-acquired pneumonia (CAP) is the most common infectious reason for admission to the Intensive Care Unit (ICU). The GenOSept study was designed to determine genetic influences on sepsis outcome. Phenotypic data were recorded using a robust clinical database, allowing a contemporary analysis of the clinical characteristics, microbiology, outcomes and independent risk factors in patients with severe CAP admitted to ICUs across Europe. METHODS Kaplan-Meier analysis was used to determine mortality rates. A Cox proportional hazards (PH) model was used to identify variables independently associated with 28-day and six-month mortality. RESULTS Data from 1166 patients admitted to 102 centres across 17 countries were extracted. Median age was 64 years and 62% were male. The mortality rate at 28 days was 17%, rising to 27% at six months. Streptococcus pneumoniae was the commonest organism isolated (28% of cases), with no organism identified in 36%. Independent risk factors associated with an increased risk of death at six months included APACHE II score (hazard ratio, HR, 1.03; confidence interval, CI, 1.01-1.05), bilateral pulmonary infiltrates (HR 1.44; CI 1.11-1.87) and ventilator support (HR 3.04; CI 1.64-5.62). Lower haematocrit, pH and urine volume on day one were all associated with a worse outcome. CONCLUSIONS The mortality rate in patients with severe CAP admitted to European ICUs was 27% at six months. Streptococcus pneumoniae was the commonest organism isolated. In many cases the infecting organism was not identified. Ventilator support, the presence of diffuse pulmonary infiltrates, and lower haematocrit, urine volume and pH on admission were independent predictors of a worse outcome.
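The Kaplan-Meier estimate used for these mortality rates multiplies conditional survival across observed death times, with censored patients leaving the risk set without contributing an event. A minimal self-contained sketch of the product-limit estimator, run on toy data rather than the GenOSept cohort:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up time for each patient; events: 1 = died, 0 = censored.
    Returns a list of (time, S(time)) pairs at each death time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = censored = 0
        # Gather all patients whose follow-up ends at time t
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            censored += 1 - data[i][1]
            i += 1
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= deaths + censored
    return curve

# Toy example: deaths at day 28 and day 90, censoring at days 60 and 180
curve = kaplan_meier([28, 60, 90, 180], [1, 0, 1, 0])
# -> [(28, 0.75), (90, 0.375)]
```

A 28-day mortality rate such as the 17% above is read off as 1 - S(28) from the fitted curve.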


INTRODUCTION Faecal peritonitis (FP) is a common cause of sepsis and admission to the intensive care unit (ICU). The Genetics of Sepsis and Septic Shock in Europe (GenOSept) project is investigating the influence of genetic variation on the host response and outcomes in a large cohort of patients with sepsis admitted to ICUs across Europe. Here we report an epidemiological survey of the subset of patients with FP. OBJECTIVES To define the clinical characteristics, outcomes and risk factors for mortality in patients with FP admitted to ICUs across Europe. METHODS Data was extracted from electronic case report forms. Phenotypic data was recorded using a detailed, quality-assured clinical database. The primary outcome measure was 6-month mortality. Patients were followed for 6 months. Kaplan-Meier analysis was used to determine mortality rates. Cox proportional hazards regression analysis was employed to identify independent risk factors for mortality. RESULTS Data for 977 FP patients admitted to 102 centres across 16 countries between 29 September 2005 and 5 January 2011 was extracted. The median age was 69.2 years (IQR 58.3-77.1), with a male preponderance (54.3%). The most common causes of FP were perforated diverticular disease (32.1%) and surgical anastomotic breakdown (31.1%). The ICU mortality rate at 28 days was 19.1%, increasing to 31.6% at 6 months. The cause of FP, pre-existing co-morbidities and time from estimated onset of symptoms to surgery did not impact on survival. The strongest independent risk factors associated with an increased rate of death at 6 months included age, higher APACHE II score, acute renal and cardiovascular dysfunction within 1 week of admission to ICU, hypothermia, lower haematocrit and bradycardia on day 1 of ICU stay. CONCLUSIONS In this large cohort of patients admitted to European ICUs with FP the 6 month mortality was 31.6%. 
The most consistent predictors of mortality across all time points were increased age, development of acute renal dysfunction during the first week of admission, lower haematocrit and hypothermia on day 1 of ICU admission.


OBJECTIVES The aim of the Cavalier trial was to evaluate the safety and performance of the Perceval sutureless aortic valve in patients undergoing aortic valve replacement (AVR). We report the 30-day clinical and haemodynamic outcomes from the largest study cohort with a sutureless valve. METHODS From February 2010 to September 2013, 658 consecutive patients (mean age 77.8 years; 64.4% female; mean logistic EuroSCORE 10.2%) underwent AVR in 25 European centres. Isolated AVR was performed in 451 (68.5%) patients, with a less invasive approach in 219 (33.3%) cases. Of the total, 40.0% were octogenarians. Congenital bicuspid aortic valve was considered an exclusion criterion. RESULTS Implantation was successful in 628 patients (95.4%). In isolated AVR through sternotomy, the mean cross-clamp and cardiopulmonary bypass (CPB) times were 32.6 and 53.7 min, and with the less invasive approach 38.8 and 64.5 min, respectively. The 30-day overall and valve-related mortality rates were 3.7 and 0.5%, respectively. Valve explant, stroke and endocarditis occurred in 0.6, 2.1 and 0.1% of cases, respectively. Preoperative mean and peak pressure gradients decreased from 44.8 and 73.24 mmHg to 10.24 and 19.27 mmHg at discharge, respectively. The mean effective orifice area improved from 0.72 to 1.46 cm². CONCLUSIONS The current 30-day results show that the Perceval valve is safe (favourable haemodynamic effect and low complication rate) and can be implanted with a fast and reproducible technique after a short learning period. Short cross-clamp and CPB times were achieved in both isolated and combined procedures. The Perceval valve represents a promising alternative to biological AVR, especially with a less invasive approach and in older patients.