949 results for Observational Analysis
Abstract:
BACKGROUND In patients with cardiogenic shock, data on the comparative safety and efficacy of drug-eluting stents (DESs) vs. bare metal stents (BMSs) are lacking. We sought to assess the performance of DESs compared with BMSs among patients with cardiogenic shock undergoing percutaneous coronary intervention (PCI). METHODS Out of 236 patients with acute coronary syndromes complicated by cardiogenic shock, 203 were included in the final analysis. The primary endpoint was death, and the secondary endpoint of major adverse cardiac and cerebrovascular events (MACCEs) was the composite of death, myocardial infarction, any repeat revascularization and stroke. Patients were followed for a minimum of 30 days and up to 4 years. As stent assignment was not random, we performed a propensity score analysis to minimize potential bias. RESULTS Among patients treated with DESs, there was a lower risk of the primary and secondary endpoints compared with BMSs at 30 days (29 vs. 56%, P < 0.001; 34 vs. 58%, P = 0.001, respectively) and during long-term follow-up [hazard ratio 0.43, 95% confidence interval (CI) 0.29-0.65, P < 0.001; hazard ratio 0.49, 95% CI 0.34-0.71, P < 0.001, respectively]. After propensity score adjustment, all-cause mortality was reduced among patients treated with DESs compared with BMSs both at 30 days [adjusted odds ratio (OR) 0.26, 95% CI 0.11-0.62; P = 0.002] and during long-term follow-up (adjusted hazard ratio 0.40, 95% CI 0.22-0.72; P = 0.002). The rate of MACCEs was lower among patients treated with DESs compared with those treated with BMSs at 30 days (adjusted OR 0.42, 95% CI 0.19-0.95; P = 0.036). The difference in MACCEs between devices approached significance during long-term follow-up (adjusted hazard ratio 0.60, 95% CI 0.34-1.01; P = 0.052).
CONCLUSION DESs appear to be associated with improved clinical outcomes, including a reduction in all-cause mortality, compared with BMSs among patients undergoing PCI for cardiogenic shock, possibly because of pacification of the infarct-related artery by the anti-inflammatory drug. The results of this observational study require confirmation in an appropriately powered randomized trial.
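The propensity score workflow this abstract relies on can be sketched in a few lines. Everything below is a hypothetical illustration on synthetic data: the cohort, the confounder `x`, all coefficients, the `fit_logistic` helper, and the inverse-probability-weighted risk difference are assumptions for demonstration, not the study's actual model (which reported propensity-adjusted odds and hazard ratios).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic cohort: sicker patients (higher x) are both less
# likely to receive a DES and more likely to die -- classic confounding.
n = 5000
x = rng.normal(size=n)                               # baseline severity (standardized)
p_treat = 1 / (1 + np.exp(-(0.0 - 1.0 * x)))         # DES less likely when x is high
t = rng.binomial(1, p_treat)                         # 1 = DES, 0 = BMS
p_death = 1 / (1 + np.exp(-(-2.0 + 1.0 * x - 0.5 * t)))  # true protective DES effect
y = rng.binomial(1, p_death)

def fit_logistic(X, y, iters=25):
    """Logistic regression by Newton-Raphson; X includes an intercept column."""
    b = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ b))
        W = p * (1 - p)
        H = X.T @ (X * W[:, None])                   # Hessian of the log-likelihood
        b += np.linalg.solve(H, X.T @ (y - p))
    return b

# Step 1: propensity score = P(treatment | covariates)
Xp = np.column_stack([np.ones(n), x])
ps = 1 / (1 + np.exp(-Xp @ fit_logistic(Xp, t)))

# Step 2: inverse-probability weights balance the two arms on x
w = np.where(t == 1, 1 / ps, 1 / (1 - ps))

crude_diff = y[t == 1].mean() - y[t == 0].mean()
w1 = np.average(y[t == 1], weights=w[t == 1])
w0 = np.average(y[t == 0], weights=w[t == 0])
adj_diff = w1 - w0
print(f"crude risk difference    {crude_diff:+.3f}")
print(f"weighted risk difference {adj_diff:+.3f}")  # confounding largely removed
```

Weighting deliberately exaggerates the crude difference's confounded component here: the crude estimate mixes the treatment effect with the fact that healthier patients received DESs, while the weighted contrast isolates something closer to the true effect.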
Abstract:
Aims: The reported rate of stent thrombosis (ST) after drug-eluting stent (DES) implantation varies among registries. To investigate differences in baseline characteristics and clinical outcome in European and Japanese all-comers registries, we performed a pooled analysis of patient-level data. Methods and results: The j-Cypher registry (JC) is a multicentre observational study conducted in Japan, including 12,824 patients undergoing sirolimus-eluting stent (SES) implantation. From the Bern-Rotterdam registry (BR), enrolled at two academic hospitals in Switzerland and the Netherlands, 3,823 patients with SES were included in the current analysis. Patients in BR were younger, more frequently smokers and presented more frequently with ST-elevation myocardial infarction (MI). Conversely, JC patients more frequently had diabetes and hypertension. At five years, the definite ST rate was significantly lower in JC than in BR (JC 1.6% vs. BR 3.3%, p<0.001), while the unadjusted mortality tended to be lower in BR than in JC (BR 13.2% vs. JC 14.4%, log-rank p=0.052). After adjustment, the j-Cypher registry was associated with a significantly lower risk of all-cause mortality (HR 0.56, 95% CI: 0.49-0.64) as well as of definite stent thrombosis (HR 0.46, 95% CI: 0.35-0.61). Conclusions: The baseline characteristics of the two large registries were different. After statistical adjustment, JC was associated with lower mortality and ST.
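Registry mortality comparisons like the five-year log-rank result above rest on Kaplan-Meier survival estimation. A minimal sketch of the estimator follows; the ten-patient follow-up data (times in days, 1 = death, 0 = censored) are invented for illustration, not taken from either registry.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve as a list of (time, S(t)) pairs."""
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv = 1.0
    curve = []
    n_at_risk = len(times)
    for t in np.unique(times):
        d = np.sum((times == t) & (events == 1))     # deaths at time t
        if d:
            surv *= 1 - d / n_at_risk                # KM product-limit step
        curve.append((t, surv))
        n_at_risk -= np.sum(times == t)              # deaths and censored leave risk set
    return curve

# Hypothetical follow-up of 10 patients
times  = [100, 250, 250, 400, 500, 650, 800, 900, 1200, 1500]
events = [1,   0,   1,   1,   0,   0,   1,   0,   0,    0]
curve = kaplan_meier(times, events)
print(f"estimated survival at last follow-up: {curve[-1][1]:.3f}")
```

Censored patients contribute to the risk set until they drop out, which is exactly why unadjusted registry mortality figures depend on follow-up completeness as well as on true event rates.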
Abstract:
BACKGROUND Observational studies of a putative association between hormonal contraception (HC) and HIV acquisition have produced conflicting results. We conducted an individual participant data (IPD) meta-analysis of studies from sub-Saharan Africa to compare the incidence of HIV infection in women using combined oral contraceptives (COCs) or the injectable progestins depot-medroxyprogesterone acetate (DMPA) or norethisterone enanthate (NET-EN) with women not using HC. METHODS AND FINDINGS Eligible studies measured HC exposure and incident HIV infection prospectively using standardized measures, enrolled women aged 15-49 y, recorded ≥15 incident HIV infections, and measured prespecified covariates. Our primary analysis estimated the adjusted hazard ratio (aHR) using two-stage random effects meta-analysis, controlling for region, marital status, age, number of sex partners, and condom use. We included 18 studies with 37,124 women (43,613 woman-years) and 1,830 incident HIV infections. Relative to no HC use, the aHR for HIV acquisition was 1.50 (95% CI 1.24-1.83) for DMPA use, 1.24 (95% CI 0.84-1.82) for NET-EN use, and 1.03 (95% CI 0.88-1.20) for COC use. Between-study heterogeneity was mild (I² < 50%). DMPA use was associated with increased HIV acquisition compared with COC use (aHR 1.43, 95% CI 1.23-1.67) and NET-EN use (aHR 1.32, 95% CI 1.08-1.61). Effect estimates were attenuated for studies at lower risk of methodological bias (compared with no HC use, aHR for DMPA use 1.22, 95% CI 0.99-1.50; for NET-EN use 0.67, 95% CI 0.47-0.96; and for COC use 0.91, 95% CI 0.73-1.41) compared to those at higher risk of bias (p for interaction = 0.003). Neither age nor herpes simplex virus type 2 infection status modified the HC-HIV relationship.
CONCLUSIONS This IPD meta-analysis found no evidence that COC or NET-EN use increases women's risk of HIV but adds to the evidence that DMPA may increase HIV risk, underscoring the need for additional safe and effective contraceptive options for women at high HIV risk. A randomized controlled trial would provide more definitive evidence about the effects of hormonal contraception, particularly DMPA, on HIV risk.
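The two-stage random-effects approach used here fits each study separately and then pools the study-level estimates. A minimal DerSimonian-Laird pooling sketch follows; the per-study log hazard ratios and standard errors are invented for illustration and are not the IPD meta-analysis data.

```python
import numpy as np

def dersimonian_laird(effects, ses):
    """Pool study-level log hazard ratios with DerSimonian-Laird random effects."""
    effects, ses = np.asarray(effects, float), np.asarray(ses, float)
    w = 1 / ses**2                                   # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)           # Cochran's Q statistic
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance estimate
    w_star = 1 / (ses**2 + tau2)                     # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0   # I^2 heterogeneity (%)
    return pooled, se, tau2, i2

# Hypothetical per-study log(aHR)s and standard errors (DMPA vs. no HC)
log_hr = [0.45, 0.05, 0.75, 0.20, 0.60]
se = [0.15, 0.20, 0.25, 0.18, 0.30]
pooled, pse, tau2, i2 = dersimonian_laird(log_hr, se)
lo, hi = pooled - 1.96 * pse, pooled + 1.96 * pse
print(f"pooled aHR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f}), I2 = {i2:.0f}%")
```

In the two-stage design, stage one produces each study's adjusted log hazard ratio and standard error; stage two is exactly the pooling step above, which widens the confidence interval when between-study heterogeneity (tau²) is nonzero.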
Abstract:
BACKGROUND Acute cough is a common problem in general practice and is often caused by a self-limiting viral infection. Nonetheless, antibiotics are often prescribed in this situation, which may lead to unnecessary side effects and, even worse, the worldwide development of antibiotic-resistant microorganisms. This study assessed the role of point-of-care C-reactive protein (CRP) testing and other predictors of antibiotic prescription in patients who present with acute cough in general practice. METHODS Patient characteristics, symptoms, signs, and laboratory and X-ray findings from 348 patients presenting to 39 general practitioners (GPs) with acute cough, as well as characteristics of the GPs themselves, were recorded by fourth-year medical students during their three-week clerkships in general practice. Patient and clinician characteristics of those prescribed and not prescribed antibiotics were compared using a mixed-effects model. RESULTS Of 315 patients included in the study, 22% were prescribed antibiotics. The two groups of patients, those prescribed antibiotics and those treated symptomatically, differed significantly in age, demand for antibiotics, days of cough, rhinitis, lung auscultation, haemoglobin level, white blood cell count, CRP level and the GP's license to self-dispense antibiotics. After regression analysis, only the CRP level, the white blood cell count and the duration of symptoms remained statistically significant predictors of antibiotic prescription. CONCLUSIONS The antibiotic prescription rate of 22% in adult patients with acute cough in the Swiss primary care setting is low compared to other countries. GPs appear to use point-of-care CRP testing in addition to the duration of clinical symptoms to help them decide whether or not to prescribe antibiotics.
Abstract:
Background context Studies involving factor analysis (FA) of the items in the North American Spine Society (NASS) outcome assessment instrument have revealed inconsistent factor structures for the individual items. Purpose This study examined whether the factor structure of the NASS varied in relation to the severity of the back/neck problem and differed from that originally recommended by the developers of the questionnaire, by analyzing data before and after surgery in a large series of patients undergoing lumbar or cervical disc arthroplasty. Study design/setting Prospective multicenter observational case series. Patient sample Three hundred ninety-one patients with low back pain and 553 patients with neck pain completed questionnaires preoperatively and again at 3 to 6 and 12 months follow-ups (FUs), in connection with the SWISSspine disc arthroplasty registry. Outcome measures North American Spine Society outcome assessment instrument. Methods First, an exploratory FA without a priori assumptions and subsequently a confirmatory FA were performed on the 17 items of the NASS-lumbar and 19 items of the NASS-cervical collected at each assessment time point. The item-loading invariance was tested in the German version of the questionnaire for baseline and FU. Results Both NASS-lumbar and NASS-cervical factor structures differed between baseline and postoperative data sets. The confirmatory analysis and item-loading invariance showed better fit for a three-factor (3F) structure for NASS-lumbar, containing items on “disability,” “back pain,” and “radiating pain, numbness, and weakness (leg/foot)” and for a 5F structure for NASS-cervical including disability, “neck pain,” “radiating pain and numbness (arm/hand),” “weakness (arm/hand),” and “motor deficit (legs).” Conclusions The best-fitting factor structure at both baseline and FU was selected for both the lumbar- and cervical-NASS questionnaires. 
It differed from that proposed by the originators of the NASS instruments. Although the NASS questionnaire represents a valid outcome measure for degenerative spine diseases and is able to distinguish among all major symptom domains (factors) in patients undergoing lumbar and cervical disc arthroplasty, the item structure could nonetheless be improved. Any potential revision of the NASS should consider its factorial structure; factorial invariance over time should be aimed for, to allow for more precise interpretations of treatment success.
Abstract:
BACKGROUND Anthelmintic drugs have been widely used in sheep as a cost-effective means for gastro-intestinal nematode (GIN) control. However, growing anthelmintic resistance (AHR) has created a compelling need to identify evidence-based management recommendations that reduce the risk of further development and impact of AHR. OBJECTIVE To identify, critically assess, and synthesize available data from primary research on factors associated with AHR in sheep. METHODS Publications reporting original observational or experimental research on selected factors associated with AHR in sheep GINs and published after 1974 were identified through two processes. Three electronic databases (PubMed, Agricola, CAB) and Web of Science (a collection of databases) were searched for potentially relevant publications. Additional publications were identified through consultation with experts, manual search of references of included publications and conference proceedings, and information solicited from small ruminant practitioner list-serves. Two independent investigators screened abstracts for relevance. Relevant publications were assessed for risk of systematic bias. Where sufficient data were available, random-effects Meta-Analyses (MAs) were performed to estimate the pooled Odds Ratio (OR) and 95% Confidence Intervals (CIs) of AHR for factors reported in ≥2 publications. RESULTS Of the 1712 abstracts screened for eligibility, 131 were deemed relevant for full publication review. Thirty publications describing 25 individual studies (15 observational studies, 7 challenge trials, and 3 controlled trials) were included in the qualitative synthesis and assessed for systematic bias. Unclear (i.e. not reported, or unable to assess) or high risk of selection bias and confounding bias was found in 93% (14/15) and 60% (9/15) of the observational studies, respectively, while unclear risk of selection bias was identified in all of the trials.
Ten independent studies were included in the quantitative synthesis, and MAs were performed for five factors. Only high frequency of treatment was a significant risk factor (OR=4.39; 95% CI=1.59, 12.14), while the remaining four variables were not statistically significant: mixed-species grazing (OR=1.63; 95% CI=0.66, 4.07); flock size (OR=1.02; 95% CI=0.97, 1.07); use of long-acting drug formulations (OR=2.85; 95% CI=0.79, 10.24); and drench-and-shift pasture management (OR=4.08; 95% CI=0.75, 22.16). CONCLUSIONS While there is abundant literature on the topic of AHR in sheep GINs, few studies have explicitly investigated the association between putative risk or protective factors and AHR. Consequently, several of the current recommendations on parasite management are not evidence-based. Moreover, many of the studies included in this review had a high or unclear risk of systematic bias, highlighting the need to improve study design and/or reporting of future research carried out in this field.
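The pooled odds ratios above are built up from study-level 2x2 tables. As a reminder of the underlying arithmetic, a single-study odds ratio with a Woolf (log-scale) confidence interval can be computed as follows; the counts are hypothetical, not data from any included study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log-scale) CI from a 2x2 table:
         a = resistant & exposed,    b = susceptible & exposed,
         c = resistant & unexposed,  d = susceptible & unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)    # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: "high treatment frequency" vs. anthelmintic resistance
or_, lo, hi = odds_ratio_ci(30, 10, 25, 35)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")
```

Because the CI is built on the log scale, a factor is "significant" in the sense used above exactly when the whole interval sits on one side of 1, which is why intervals such as 0.75-22.16 fail that test despite a large point estimate.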
Abstract:
BACKGROUND After cardiac surgery with cardiopulmonary bypass (CPB), acquired coagulopathy often leads to post-CPB bleeding. Though multifactorial in origin, this coagulopathy is often aggravated by deficient fibrinogen levels. OBJECTIVE To assess whether laboratory and thrombelastometric testing on CPB can predict plasma fibrinogen immediately after CPB weaning. PATIENTS / METHODS This prospective study in 110 patients undergoing major cardiovascular surgery at risk of post-CPB bleeding compared fibrinogen level (Clauss method) and function (fibrin-specific thrombelastometry) in order to study the predictability of their course early after termination of CPB. Linear regression analysis and receiver operating characteristics were used to determine correlations and predictive accuracy. RESULTS Quantitative estimation of post-CPB Clauss fibrinogen from on-CPB fibrinogen was feasible with small bias (+0.19 g/l), but with poor precision and a percentage of error >30%. A clinically useful alternative approach was developed by using the on-CPB A10 (thrombelastometric clot amplitude at 10 minutes) to predict a Clauss fibrinogen range of interest instead of a discrete level. An on-CPB A10 ≤10 mm identified patients with a post-CPB Clauss fibrinogen of ≤1.5 g/l with a sensitivity of 0.99 and a positive predictive value of 0.60; it also identified those without a post-CPB Clauss fibrinogen <2.0 g/l with a specificity of 0.83. CONCLUSIONS When measured on CPB prior to weaning, a FIBTEM A10 ≤10 mm is an early alert for post-CPB fibrinogen levels below or within the substitution range (1.5-2.0 g/l) recommended in case of post-CPB coagulopathic bleeding. This helps to minimize the delay to data-based hemostatic management after weaning from CPB.
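The threshold performance reported here (sensitivity, specificity and PPV of A10 ≤10 mm) comes from a simple 2x2 classification of paired measurements. The sketch below reproduces that calculation on synthetic data; the linear A10-fibrinogen relation, the noise level and all counts are assumptions for illustration, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical paired measurements: on-CPB FIBTEM A10 (mm) and post-CPB
# Clauss fibrinogen (g/l), loosely assuming a linear relation with noise.
n = 200
fibrinogen = rng.uniform(0.8, 3.5, n)                # post-CPB Clauss (g/l)
a10 = 6 * fibrinogen + rng.normal(0, 1.5, n)         # on-CPB A10 (mm)

def screen(a10, fib, cutoff_mm=10.0, fib_low=1.5):
    """Performance of the rule 'A10 <= cutoff' for detecting fibrinogen <= fib_low."""
    test_pos = a10 <= cutoff_mm
    disease = fib <= fib_low
    tp = np.sum(test_pos & disease)                  # true positives
    fp = np.sum(test_pos & ~disease)                 # false positives
    fn = np.sum(~test_pos & disease)                 # false negatives
    tn = np.sum(~test_pos & ~disease)                # true negatives
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return sens, spec, ppv

sens, spec, ppv = screen(a10, fibrinogen)
print(f"sensitivity {sens:.2f}, specificity {spec:.2f}, PPV {ppv:.2f}")
```

The same pattern as in the abstract emerges: a cutoff placed to catch nearly all low-fibrinogen patients (high sensitivity) inevitably admits borderline cases, so the positive predictive value is markedly lower than the sensitivity.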
Abstract:
BACKGROUND Sepsis continues to be a major cause of death, disability, and health-care expenditure worldwide. Despite evidence suggesting that host genetics can influence sepsis outcomes, no specific loci have yet been convincingly replicated. The aim of this study was to identify genetic variants that influence sepsis survival. METHODS We did a genome-wide association study in three independent cohorts of white adult patients admitted to intensive care units with sepsis, severe sepsis, or septic shock (as defined by the International Consensus Criteria) due to pneumonia or intra-abdominal infection (cohorts 1-3, n=2534 patients). The primary outcome was 28 day survival. Results for the cohort of patients with sepsis due to pneumonia were combined in a meta-analysis of 1553 patients from all three cohorts, of whom 359 died within 28 days of admission to the intensive-care unit. The most significantly associated single nucleotide polymorphisms (SNPs) were genotyped in a further 538 white patients with sepsis due to pneumonia (cohort 4), of whom 106 died. FINDINGS In the genome-wide meta-analysis of three independent pneumonia cohorts (cohorts 1-3), common variants in the FER gene were strongly associated with survival (p=9·7 × 10^-8). Further genotyping of the top associated SNP (rs4957796) in the additional cohort (cohort 4) resulted in a combined p value of 5·6 × 10^-8 (odds ratio 0·56, 95% CI 0·45-0·69). In a time-to-event analysis, each allele reduced the mortality over 28 days by 44% (hazard ratio for death 0·56, 95% CI 0·45-0·69; likelihood ratio test p=3·4 × 10^-9, after adjustment for age and stratification by cohort). Mortality was 9·5% in patients carrying the CC genotype, 15·2% in those carrying the TC genotype, and 25·3% in those carrying the TT genotype. No significant genetic associations were identified when patients with sepsis due to pneumonia and intra-abdominal infection were combined.
INTERPRETATION We have identified common variants in the FER gene that associate with a reduced risk of death from sepsis due to pneumonia. The FER gene and associated molecular pathways are potential novel targets for therapy or prevention and candidates for the development of biomarkers for risk stratification. FUNDING European Commission and the Wellcome Trust.
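A quick consistency check on the genotype-specific mortalities quoted above: under a multiplicative per-allele model, each added C allele should scale the odds of death by roughly the same factor, close to the reported per-allele estimate of about 0.56. This uses only the percentages given in the abstract; the check itself is our illustration, not part of the study's analysis.

```python
# Genotype-specific 28-day mortality reported in the abstract (rs4957796)
mortality = {"TT": 0.253, "TC": 0.152, "CC": 0.095}

def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

# Odds ratio contributed by each added C allele
or_tt_tc = odds(mortality["TC"]) / odds(mortality["TT"])
or_tc_cc = odds(mortality["CC"]) / odds(mortality["TC"])
print(f"TT->TC odds ratio {or_tt_tc:.2f}, TC->CC odds ratio {or_tc_cc:.2f}")
```

Both per-allele odds ratios land near 0.53-0.59, i.e. the genotype pattern is roughly consistent with the multiplicative (per-allele) effect the time-to-event analysis assumed.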
Abstract:
OBJECTIVES Respondent-driven sampling (RDS) is a new data collection methodology used to estimate characteristics of hard-to-reach groups, such as the HIV prevalence in drug users. Many national public health systems and international organizations rely on RDS data. However, RDS reporting quality and available reporting guidelines are inadequate. We carried out a systematic review of RDS studies and present Strengthening the Reporting of Observational Studies in Epidemiology for RDS Studies (STROBE-RDS), a checklist of essential items to present in RDS publications, justified by an explanation and elaboration document. STUDY DESIGN AND SETTING We searched the MEDLINE (1970-2013), EMBASE (1974-2013), and Global Health (1910-2013) databases to assess the number and geographical distribution of published RDS studies. STROBE-RDS was developed based on STROBE guidelines, following Guidance for Developers of Health Research Reporting Guidelines. RESULTS RDS has been used in over 460 studies from 69 countries, including the USA (151 studies), China (70), and India (32). STROBE-RDS includes modifications to 12 of the 22 items on the STROBE checklist. The two key areas that required modification concerned the selection of participants and statistical analysis of the sample. CONCLUSION STROBE-RDS seeks to enhance the transparency and utility of research using RDS. If widely adopted, STROBE-RDS should improve global infectious diseases public health decision making.
Abstract:
OBJECTIVES The aim of this study was to quantify loss to follow-up (LTFU) in HIV care after delivery and to identify risk factors for LTFU, and implications for HIV disease progression and subsequent pregnancies. METHODS We used data on pregnancies within the Swiss HIV Cohort Study from 1996 to 2011. A delayed clinical visit was defined as > 180 days and LTFU as no visit for > 365 days after delivery. Logistic regression analysis was used to identify risk factors for LTFU. RESULTS A total of 695 pregnancies in 580 women were included in the study, of which 115 (17%) were subsequent pregnancies. Median maternal age was 32 years (IQR 28-36 years) and 104 (15%) women reported any history of injecting drug use (IDU). Overall, 233 of 695 (34%) women had a delayed visit in the year after delivery and 84 (12%) women were lost to follow-up. Being lost to follow-up was significantly associated with a history of IDU [adjusted odds ratio (aOR) 2.79; 95% confidence interval (CI) 1.32-5.88; P = 0.007] and not achieving an undetectable HIV viral load (VL) at delivery (aOR 2.42; 95% CI 1.21-4.85; P = 0.017) after adjusting for maternal age, ethnicity and being on antiretroviral therapy (ART) at conception. Forty-three of 84 (55%) women returned to care after LTFU. Of the 41 returning women with an available CD4 cell count, half (20) had a CD4 count < 350 cells/μL and 15% (six) a CD4 count < 200 cells/μL at their return. CONCLUSIONS A history of IDU and detectable HIV VL at delivery were associated with LTFU. Effective strategies are warranted to retain women in care beyond pregnancy and to avoid CD4 cell count decline. ART continuation should be advised especially if a subsequent pregnancy is planned.
Abstract:
BACKGROUND Chronic postsurgical pain (CPSP) is an important clinical problem. Prospective studies of the incidence, characteristics and risk factors of CPSP are needed. OBJECTIVES The objective of this study is to evaluate the incidence and risk factors of CPSP. DESIGN A multicentre, prospective, observational trial. SETTING Twenty-one hospitals in 11 European countries. PATIENTS Three thousand one hundred and twenty patients undergoing surgery and enrolled in the European registry PAIN OUT. MAIN OUTCOME MEASURES Pain-related outcome was evaluated on the first postoperative day (D1) using a standardised pain outcome questionnaire. Review at 6 and 12 months via e-mail or telephone interview used the Brief Pain Inventory (BPI) and the DN4 (Douleur Neuropathique four questions). Primary endpoint was the incidence of moderate to severe CPSP (numeric rating scale, NRS ≥3/10) at 12 months. RESULTS Complete data were available for 1044 patients at 6 months and for 889 patients at 12 months. At 12 months, the incidence of moderate to severe CPSP was 11.8% (95% CI 9.7 to 13.9) and of severe pain (NRS ≥6) 2.2% (95% CI 1.2 to 3.3). Signs of neuropathic pain were recorded in 35.4% (95% CI 23.9 to 48.3) and 57.1% (95% CI 30.7 to 83.4) of patients with moderate and severe CPSP, respectively. Functional impairment (BPI) at 6 and 12 months increased with the severity of CPSP (P < 0.01) and presence of neuropathic characteristics (P < 0.001). Multivariate analysis identified orthopaedic surgery, preoperative chronic pain and percentage of time in severe pain on D1 as risk factors. A 10% increase in percentage of time in severe pain was associated with a 30% increase of CPSP incidence at 12 months. CONCLUSION The collection of data on CPSP was feasible within the European registry PAIN OUT. The incidence of moderate to severe CPSP at 12 months was 11.8%. Functional impairment was associated with CPSP severity and neuropathic characteristics.
Risk factors for CPSP in the present study were chronic preoperative pain, orthopaedic surgery and percentage of time in severe pain on D1. TRIAL REGISTRATION Clinicaltrials.gov identifier: NCT01467102.
Abstract:
OBJECTIVES To analyse the nationwide prevalence of uveitis in juvenile idiopathic arthritis (JIA) and its complications over a whole decade. METHODS We conducted a prospective, observational and cross-sectional study including all JIA patients from a National Paediatric Rheumatological Database (NPRD) with a uveitis add-on module in Germany (2002-2013). Temporal changes in uveitis prevalence, related secondary complications and anti-inflammatory medication were evaluated. RESULTS A total of 60 centres including 18,555 JIA patients (mean 3,863 patients/year, SD=837) were documented in the NPRD between 2002 and 2013. The mean age of the patients was 11.4±4.6 years, their mean disease duration 4.4±3.7 years. Among them, 66.9% were female and 51.7% ANA positive. Patients' mean age at arthritis onset was 6.9±4.5 years. Treatment rates with synthetic and biological DMARDs increased during the observation period (sDMARDs: 39.8% to 47.2%; bDMARDs: 3.3% to 21.8%). Uveitis prevalence decreased significantly from 2002 to 2013 (13.0% to 11.6%, OR=0.98, p=0.015). The prevalence of secondary uveitis complications also decreased significantly between 2002 and 2013 (33.6% to 23.9%, OR=0.94, p<0.001). The most common complications were posterior synechiae, cataract and band keratopathy. The proportion of patients achieving uveitis inactivity increased significantly, from 30.6% in 2002 to 65.3% in 2013 (OR=1.15, p<0.001). CONCLUSIONS Uveitis prevalence and complications decreased significantly between 2002 and 2013. This may be associated with the more frequent use of DMARDs.
Abstract:
The All-sky Meteor Orbit System (AMOS) is a semi-autonomous video observatory for the detection of transient events in the sky, mostly meteors. Its hardware and software development and permanent placement at several locations in Slovakia allowed the establishment of the Slovak Video Meteor Network (SVMN), monitoring meteor activity above Central Europe. The data reduction, orbit determination and additional results from the AMOS cameras (the SVMN database), as well as from observational expeditions to the Canary Islands and Canada, provided dynamical and physical data for a better understanding of the connections between parent bodies (asteroids and comets) and their meteoroid streams. We present preliminary results on exceptional and rare meteor streams, such as the September ε Perseids (SPE), originating from an unknown long-period comet on a retrograde orbit; the suspected asteroidal meteor stream April α Comae Berenicids (ACO), which lies in the orbit of the meteorites Příbram and Neuschwanstein; and the newly observed meteor stream Camelopardalids (CAM), originating from the Jupiter-family comet 209P/LINEAR.
Abstract:
Today, there is little knowledge on the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in the past years, but are still limited with respect to an accurate estimate of motion vector orientations and magnitude. Especially for the preparation of Active Debris Removal (ADR) missions as planned by ESA’s Clean Space initiative or contingency scenarios for ESA spacecraft like ENVISAT, such knowledge is needed. ESA's “Debris Attitude Motion Measurements and Modelling” project (ESA Contract No. 40000112447), led by the Astronomical Institute of the University of Bern (AIUB), addresses this problem. The goal of the project is to achieve a good understanding of the attitude evolution and the considerable internal and external effects which occur. To characterize the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF) and radar measurements and signal level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). The In-Orbit Tumbling Analysis tool (ιOTA) is a prototype software, currently in development by Hyperschall Technologie Göttingen GmbH (HTG) within the framework of the project. ιOTA will be a highly modular software tool to perform short- (days), medium- (months) and long-term (years) propagation of the orbit and attitude motion (six degrees-of-freedom) of spacecraft in Earth orbit.
The simulation takes into account all relevant acting forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of Earth, Sun and Moon, eddy current damping, impulse and momentum transfer from space debris or micrometeoroid impact, as well as the optional definition of particular spacecraft-specific influences like tank sloshing, reaction wheel behaviour, magnetic torquer activity and thruster firing. The purpose of ιOTA is to provide high accuracy short-term simulations to support observers and potential ADR missions, as well as medium- and long-term simulations to study the significance of the particular internal and external influences on the attitude, especially damping factors and momentum transfer. The simulation will also enable the investigation of the altitude dependency of the particular external influences. ιOTA's post-processing modules will generate synthetic measurements for observers and for software validation. The validation of the software will be done by cross-calibration with observations and measurements acquired by the project partners.
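The rotational core of a six-degrees-of-freedom propagator like ιOTA can be illustrated with the torque-free Euler equations for a tumbling rigid body. The sketch below is not ιOTA itself: the inertia values and initial rates are invented, and all of the perturbing torques listed above are omitted, so the only physics checked is conservation of angular momentum under the integrator.

```python
import numpy as np

# Illustrative principal moments of inertia (kg m^2) -- not any real spacecraft
I = np.array([8000.0, 12000.0, 15000.0])

def euler_rates(w):
    """d(omega)/dt from Euler's rigid-body equations with zero external torque."""
    return np.array([
        (I[1] - I[2]) * w[1] * w[2] / I[0],
        (I[2] - I[0]) * w[2] * w[0] / I[1],
        (I[0] - I[1]) * w[0] * w[1] / I[2],
    ])

def rk4_step(w, dt):
    """One classical Runge-Kutta (RK4) step for the body rates."""
    k1 = euler_rates(w)
    k2 = euler_rates(w + 0.5 * dt * k1)
    k3 = euler_rates(w + 0.5 * dt * k2)
    k4 = euler_rates(w + dt * k3)
    return w + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6

w = np.array([0.02, 0.01, -0.015])           # initial body rates (rad/s), invented
h0 = np.linalg.norm(I * w)                   # angular momentum magnitude
for _ in range(10000):                       # propagate 10,000 s at dt = 1 s
    w = rk4_step(w, 1.0)
h1 = np.linalg.norm(I * w)
drift = abs(h1 - h0) / h0
print(f"relative angular-momentum drift over 10,000 s: {drift:.2e}")
```

In torque-free motion the angular momentum magnitude is an exact invariant, so its numerical drift is a convenient sanity check on the integrator before any of the environmental torques are switched on.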
Abstract:
BACKGROUND Sutureless aortic valve replacement (SU-AVR) has emerged as an innovative alternative for treatment of aortic stenosis. By avoiding the placement of sutures, this approach aims to reduce cross-clamp and cardiopulmonary bypass (CPB) duration and thereby improve surgical outcomes and facilitate a minimally invasive approach suitable for higher risk patients. The present systematic review and meta-analysis aims to assess the safety and efficacy of the SU-AVR approach in the current literature. METHODS Electronic searches were performed using six databases from their inception to January 2014. Relevant studies utilizing sutureless valves for aortic valve implantation were identified. Data were extracted and analyzed according to predefined clinical endpoints. RESULTS Twelve studies were identified for inclusion in the qualitative and quantitative analyses, all of which were observational reports. The minimally invasive approach was used in 40.4% of included patients, while 22.8% underwent concomitant coronary bypass surgery. Pooled cross-clamp and CPB durations for isolated AVR were 56.7 and 46.5 minutes, respectively. Pooled 30-day and 1-year mortality rates were 2.1% and 4.9%, respectively, while the incidences of stroke (1.5%), valve degeneration (0.4%) and paravalvular leak (PVL) (3.0%) were acceptable. CONCLUSIONS The evaluation of current observational evidence suggests that sutureless aortic valve implantation is a safe procedure associated with shorter cross-clamp and CPB duration, and comparable complication rates to the conventional approach in the short-term.