Abstract:
BACKGROUND AND PURPOSE Autografts are used for bone reconstruction in regenerative medicine, including oral and maxillofacial surgery. Bone grafts release paracrine signals that can reach mesenchymal cells at defect sites. The impact of these paracrine signals on osteogenic, adipogenic, and chondrogenic differentiation of mesenchymal cells has remained unclear. MATERIAL AND METHODS Osteogenesis, adipogenesis, and chondrogenesis were studied with murine ST2 osteoblast progenitors, 3T3-L1 preadipocytes, and ATDC5 prechondrogenic cells, respectively. Primary periodontal fibroblasts from the gingiva, the periodontal ligament, and bone were also included in the analysis. Cells were exposed to bone-conditioned medium (BCM) prepared from porcine cortical bone chips. RESULTS BCM inhibited osteogenic and adipogenic differentiation of ST2 and 3T3-L1 cells, respectively, as shown by histological staining and gene expression. No substantial changes in the expression of chondrogenic genes were observed in ATDC5 cells. Primary periodontal fibroblasts also showed a robust decrease in alkaline phosphatase and peroxisome proliferator-activated receptor gamma (PPARγ) expression when exposed to BCM. BCM also increased collagen type 10 expression. Pharmacologic blocking of transforming growth factor (TGF)-β receptor type I kinase with SB431542 and of Smad3 with the inhibitor SIS3 at least partially reversed the effect of BCM on PPARγ and collagen type 10 expression. Consistent with BCM carrying TGF-β activity, TGF-β target genes were upregulated in periodontal fibroblasts exposed to BCM. CONCLUSIONS The present work is a pioneering study of the paracrine activity of bone grafts. The findings suggest that cortical bone chips release soluble signals that can modulate the differentiation of mesenchymal cells in vitro, at least partially through TGF-β signaling.
Abstract:
We analyzed the species distribution of Candida blood isolates (CBIs), prospectively collected between 2004 and 2009 within FUNGINOS, and compared their antifungal susceptibility according to clinical breakpoints defined by the European Committee on Antimicrobial Susceptibility Testing (EUCAST) in 2013, and by the Clinical and Laboratory Standards Institute (CLSI) in 2008 (old CLSI breakpoints) and 2012 (new CLSI breakpoints). CBIs were tested for susceptibility to fluconazole, voriconazole and caspofungin by microtitre broth dilution (Sensititre® YeastOne™ test panel). Of 1090 CBIs, 675 (61.9%) were C. albicans, 191 (17.5%) C. glabrata, 64 (5.9%) C. tropicalis, 59 (5.4%) C. parapsilosis, 33 (3%) C. dubliniensis, 22 (2%) C. krusei and 46 (4.2%) rare Candida species. Independently of the breakpoints applied, C. albicans was almost uniformly (>98%) susceptible to all three antifungal agents. In contrast, the proportions of fluconazole- and voriconazole-susceptible C. tropicalis and of fluconazole-susceptible C. parapsilosis were lower according to EUCAST/new CLSI breakpoints than to the old CLSI breakpoints. For caspofungin, non-susceptibility occurred mainly in C. krusei (63.3%) and C. glabrata (9.4%). Nine isolates (five C. tropicalis, three C. albicans and one C. parapsilosis) were cross-resistant to azoles according to EUCAST breakpoints, compared with three isolates (two C. albicans and one C. tropicalis) according to new and two (both C. albicans) according to old CLSI breakpoints. Four species (C. albicans, C. glabrata, C. tropicalis and C. parapsilosis) represented >90% of all CBIs. In vitro resistance to fluconazole, voriconazole and caspofungin was rare among C. albicans, but an increase of non-susceptible isolates was observed among C. tropicalis/C. parapsilosis for the azoles and C. glabrata/C. krusei for caspofungin according to EUCAST and new CLSI breakpoints compared with old CLSI breakpoints.
Abstract:
PURPOSE To assess the need for clinically driven secondary revascularization in critical limb ischemia (CLI) patients after tibial angioplasty during a 2-year follow-up. METHODS Between 2008 and 2010, a total of 128 consecutive CLI patients (80 men; mean age 76.5±9.8 years) underwent tibial angioplasty in 139 limbs. Rutherford categories, ankle-brachial index measurements, and lower limb oscillometries were prospectively assessed. All patients were followed at 3, 6, and 12 months, and annually thereafter. Rates of death, primary and secondary sustained clinical improvement, target lesion revascularization (TLR) and target extremity revascularization (TER), as well as major amputation, were analyzed retrospectively. Primary clinical improvement was defined as improvement in Rutherford category to a level of intermittent claudication without unplanned amputation or TLR. RESULTS All-cause mortality was 8.6%, 14.8%, 22.9%, and 29.1% at 3, 6, 12, and 24 months, respectively. At the same intervals, rates of primary sustained clinical improvement were 74.5%, 53.0%, 42.7%, and 37.1%; for secondary improvement, the rates were 89.1%, 76.0%, 68.4%, and 65.0%. Clinically driven TLR rates were 14.6%, 29.1%, 41.6%, and 46.2%; the rates for TER were 3.0%, 13.6%, 17.2%, and 27.6% at the corresponding intervals, while the rates of major amputation were 1.5%, 5.5%, 10.1%, and 10.1%. CONCLUSION Clinically driven TLR is frequently required to maintain favorable functional clinical outcomes in CLI patients following tibial angioplasty. Dedicated technologies addressing tibial arterial restenosis warrant further academic scrutiny.
Abstract:
A positive relationship between species richness and island size is thought to emerge from an equilibrium between immigration and extinction rates, but the influence of species diversification on the form of this relationship is poorly understood. Here, we show that within-lake adaptive radiation strongly modifies the species-area relationship for African cichlid fishes. The total number of species derived from in situ speciation increases with lake size, resulting in faunas with species richness orders of magnitude higher than faunas assembled by immigration alone. Multivariate models provide evidence for an added influence of lake depth on the species-area relationship. The diversity of clades representing within-lake radiations shows responses to lake area, depth and energy consistent with limitation by these factors, suggesting that ecological factors influence the species richness of radiating clades within these ecosystems. Together, these processes produce lake fish faunas with highly variable composition, but with diversities that are well predicted by environmental variables.
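The abstract does not state the fitted model form. A conventional multivariate extension of the species-area relationship consistent with the analysis described (our illustrative assumption, not the paper's stated equation) is

    \log_{10} S = \beta_0 + \beta_1 \log_{10} A + \beta_2 \log_{10} D + \varepsilon,

where S is species richness, A is lake area and D is lake depth; the classic single-variable species-area relationship is the power law S = cA^z, i.e. \log S = \log c + z \log A.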
Abstract:
BACKGROUND We describe the long-term outcome after clinical introduction and dose escalation of somatostatin receptor-targeted therapy with [90Y-DOTA]-TOC in patients with metastasized neuroendocrine tumors. METHODS In a clinical phase I dose escalation study we treated patients with increasing [90Y-DOTA]-TOC activities. Multivariable Cox regression and competing risk regression were used to compare efficacy and toxicities of the different dosage protocols. RESULTS Overall, 359 patients were recruited; 60 patients were enrolled for low-dose (median: 2.4 GBq/cycle, range: 0.9-7.8 GBq/cycle), 77 patients for intermediate-dose (median: 3.3 GBq/cycle, range: 2.0-7.4 GBq/cycle) and 222 patients for high-dose (median: 6.7 GBq/cycle, range: 3.7-8.1 GBq/cycle) [90Y-DOTA]-TOC treatment. The incidences of grade 1-4 hematotoxicities were 65.0%, 64.9% and 74.8%; the incidences of grade 4/5 kidney toxicities were 8.4%, 6.5% and 14.0%; and the median survival was 39 (range: 1-158) months, 34 (range: 1-118) months and 29 (range: 1-113) months, respectively. The high-dose protocol was associated with an increased risk of kidney toxicity (hazard ratio: 3.12 (1.13-8.59) vs. intermediate dose, p = 0.03) and a shorter overall survival (hazard ratio: 2.50 (1.08-5.79) vs. low dose, p = 0.03). CONCLUSIONS Increasing [90Y-DOTA]-TOC activities may be associated with increasing hematological toxicities. The dose-related hematotoxicity profile of [90Y-DOTA]-TOC could facilitate tailoring treatment in patients with preexisting hematotoxicities. The long-term outcome results suggest that fractionated [90Y-DOTA]-TOC treatment might reduce renal toxicity and improve overall survival. (ClinicalTrials.gov number NCT00978211).
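As a minimal illustration of the kind of multivariable Cox regression used to compare the dosage protocols (a sketch only, not the study's analysis: the dataframe, column names and values below are hypothetical):

```python
# Minimal sketch of a multivariable Cox regression comparing dosage protocols.
# Not the study's code; all data below are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical per-patient data: follow-up (months), death indicator, and
# dummy variables for the intermediate- and high-dose protocols
# (low dose serves as the reference category).
df = pd.DataFrame({
    "months":            [39, 34, 29, 55, 12, 48, 20, 31],
    "died":              [ 1,  1,  1,  0,  1,  0,  1,  1],
    "intermediate_dose": [ 0,  1,  0,  0,  1,  0,  0,  1],
    "high_dose":         [ 0,  0,  1,  1,  0,  0,  1,  0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
cph.print_summary()  # the exp(coef) column gives hazard ratios vs. low dose
```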
Abstract:
PURPOSE: The purpose of this study was to assess the impact of different policies on access to hormonal contraception and pregnancy rates at two high school-based clinics. METHODS: Two clinics in high schools (Schools A and B), located in a large urban district in the southwest US, provide primary medical care to enrolled students with parental consent, the majority of whom have no health insurance coverage. The dispensing policy at School clinic A involves providing barrier, hormonal and emergency contraceptive services on site. School clinic B uses a referral policy that directs students to obtain contraception at an off-campus affiliated family planning clinic. Baseline data (age, race and history of prior pregnancy) on female students seeking hormonal contraception at the two clinics between 9/2008 and 12/2009 were extracted from an electronic administrative database (AHLERS Integrated System). Data on birth control use and pregnancy tests for each student were then tracked electronically through 3/31/2010. The outcome measures were access to hormonal contraception and positive pregnancy tests at any point after birth control was started, through 12/2009. The appointment-keeping rate for contraceptive services and the overall pregnancy rates were compared between the two schools. In addition, the pregnancy rates were compared between the two schools for students with and without a prior history of pregnancy. RESULTS: School clinic A: 79 students sought hormonal contraception; mean age was 17.5 years; 68% were >18 years; 77% were Hispanic; and 20% reported prior pregnancy. The mean duration of the observation period was 13 months (range: 4-19 months). All 79 students received hormonal contraception (65% pill and 35% long-acting progestin injection) on site. During the observation period, the overall pregnancy rate was 6% (5/79), and 4.7% (3/63) among students with no prior pregnancy. School clinic B: 40 students sought hormonal contraception; mean age was 17.5 years; 52% were >18 years; 88% were Hispanic; and 7.5% reported prior pregnancy. All 40 students were referred to the affiliated clinic. The mean duration of the observation period was 11.9 months (range: 4-19 months). 50% (20/40) kept their appointment. Pills were dispensed to 85% (17/20) and 15% (3/20) received long-acting progestin injection. The overall pregnancy rate was 20% (8/40), and 21.6% (8/37) among students with no prior pregnancy. A significantly higher proportion of students seeking hormonal contraception kept their initial appointment for birth control at the school dispensing contraception on site than at the school with a referral policy (p < 0.05). The pregnancy rate was significantly higher at the school with a referral policy than at the school with on-site contraceptive services (p < 0.05). The pregnancy rate was also significantly higher for students without a prior history of pregnancy at the school with a referral policy (21.6%) versus the school with on-site contraceptive services (4.7%) (p < 0.05). CONCLUSION: This preliminary study showed that School clinic B, with a referral policy, had a lower appointment-keeping rate for contraceptive services and a higher pregnancy rate than School clinic A, with on-site contraceptive services.
An on-site dispensing policy for hormonal contraceptives at high school-based health clinics may be a convenient and effective approach to prevent unintended first and repeat pregnancies among adolescents who seek hormonal contraception. This study has strong implications for reproductive health policy, especially as directed toward high-risk teenage populations.
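The abstract reports only "p < 0.05" and does not name the statistical test; one way to reproduce the two between-school comparisons from the reported counts (choosing Fisher's exact test is our assumption) is:

```python
# Recomputing the two-clinic comparisons from the counts reported above.
# The abstract states only "p < 0.05"; Fisher's exact test is an assumption.
from scipy.stats import fisher_exact

# Appointment keeping: School A 79/79 served on site vs. School B 20/40 kept.
_, p_appt = fisher_exact([[79, 0], [20, 20]])

# Overall pregnancies: School A 5/79 vs. School B 8/40.
_, p_preg = fisher_exact([[5, 74], [8, 32]])

print(f"appointment keeping: p = {p_appt:.4f}")
print(f"pregnancy rate:      p = {p_preg:.4f}")
```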
Abstract:
The impact of cancer on the population of Salvador-Bahia, Brazil was studied using mortality data available from the Brazilian National Bureau of Vital Statistics. Average annual site-, age-, and gender-specific and adjusted cancer mortality rates were determined for the years 1977-83 and contrasted with United States cancer mortality rates for 1977. The accuracy of the cancer mortality rates generated by this research was determined by comparing the underlying causes of death as coded on death certificates to pathology reports and to hospital diagnoses for a sample of 966 deaths occurring in Salvador during 1983. To further explore the information available on the death certificate, a population-based decedent case-control study was used to determine the relationship between type of occupation (a proxy for exposure) and mortality from cancer sites known to be occupationally related. The rates in Salvador for cancer of the stomach, oral cavity, and biliary passages are, on average, twofold higher than the U.S. rates. The death certificate was found to be accurate for 65 percent of the 485 cancer deaths studied. Thirty-five histologically confirmed cancer deaths were found in a random sample of 481 deaths from other causes. This means that approximately 700 more cancer deaths may be lost among the remaining 10,073 death certificates stating a cause other than cancer. In addition, despite the known limitations of decedent case-control studies, cancers of the oral cavity (OR = 2.44, CI = 1.17-5.09), stomach (OR = 2.31, CI = 1.18-4.52), liver (OR = 4.06, CI = 1.27-12.99), bladder (OR = 6.77, CI = 1.5-30.67), and lymphoma (OR = 2.55, CI = 1.04-6.25) had elevated point estimates for different age strata, indicating an association between these cancers and occupations that led to exposure to petroleum and its derivatives.
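For context, the odds ratios and confidence intervals quoted above come from 2x2 exposure tables. A minimal sketch of that computation using Woolf's logit method (the cell counts below are hypothetical illustrations, so the resulting interval will not match the paper's):

```python
# Odds ratio and 95% CI from a 2x2 table in a decedent case-control design.
# Hypothetical counts; the abstract reports only the resulting ORs and CIs.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a: exposed cases, b: unexposed cases,
    c: exposed controls, d: unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf's method
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Illustrative counts yielding OR = 4.06 (= 12*203 / (30*20)):
print(odds_ratio_ci(a=12, b=30, c=20, d=203))
```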
Abstract:
Bcl-2 oncogene expression plays a role in the establishment of persistent viral infection by blocking virus-induced apoptosis. This might be achieved by preventing virus-induced activation of caspase-3, an IL-1beta-converting enzyme (ICE)-like cysteine protease that has been implicated in the death effector phase of apoptosis. Contrary to this model, we show that three cell types highly overexpressing functional Bcl-2 displayed caspase-3 activation and underwent apoptosis in response to infection with alphaviruses Semliki Forest and Sindbis as efficiently as vector control counterparts. In all three cell types, overexpressed 26 kDa Bcl-2 was cleaved into a 23 kDa protein. Antibody epitope mapping revealed that cleavage occurred at one or two target sites for caspases within the amino acid region YEWD31↓AGD34↓A, removing the N-terminal BH4 region known to be essential for the death-protective activity of Bcl-2. Preincubation of cells with the caspase inhibitor Z-VAD prevented Bcl-2 cleavage and partially restored the protective activity of Bcl-2 against virus-induced apoptosis. Moreover, a murine Bcl-2 mutant having Asp31, Asp34 and Asp36 substituted by Glu was resistant to proteolytic cleavage and abrogated apoptosis following virus infection. These findings indicate that alphaviruses can trigger a caspase-mediated inactivation of Bcl-2 in order to evade the death protection imposed by this survival factor.
An Early-Warning System for Hypo-/Hyperglycemic Events Based on Fusion of Adaptive Prediction Models
Abstract:
Introduction: Early warning of future hypoglycemic and hyperglycemic events can improve the safety of type 1 diabetes mellitus (T1DM) patients. The aim of this study is to design and evaluate a hypoglycemia/hyperglycemia early warning system (EWS) for T1DM patients under sensor-augmented pump (SAP) therapy. Methods: The EWS is based on the combination of data-driven online adaptive prediction models and a warning algorithm. Three modeling approaches were investigated: (i) autoregressive (ARX) models, (ii) autoregressive models with an output correction module (cARX), and (iii) recurrent neural network (RNN) models. The warning algorithm performs postprocessing of the models' outputs and issues alerts if upcoming hypoglycemic/hyperglycemic events are detected. Fusion of the cARX and RNN models, owing to their complementary prediction performances, resulted in the hybrid autoregressive with an output correction module/recurrent neural network (cARN)-based EWS. Results: The EWS was evaluated on 23 T1DM patients under SAP therapy. The ARX-based system achieved hypoglycemic (hyperglycemic) event prediction with median values of accuracy of 100.0% (100.0%), detection time of 10.0 (8.0) min, and daily false alarms of 0.7 (0.5). The respective values for the cARX-based system were 100.0% (100.0%), 17.5 (14.8) min, and 1.5 (1.3), and for the RNN-based system were 100.0% (92.0%), 8.4 (7.0) min, and 0.1 (0.2). The hybrid cARN-based EWS outperformed both, with 100.0% (100.0%) prediction accuracy, detection 16.7 (14.7) min in advance, and 0.8 (0.8) daily false alarms. Conclusion: Combined use of cARX and RNN models for the development of an EWS outperformed the single use of each model, achieving accurate and prompt event prediction with few false alarms, thus providing increased safety and comfort.
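As a rough sketch of the ARX-plus-warning idea described above (not the study's implementation: the model order, the 30-min horizon, the least-squares fit and the 70/180 mg/dL thresholds are all illustrative assumptions):

```python
# Sketch of an ARX-style glucose predictor with a threshold warning rule.
# Illustrative only; order, horizon and thresholds are assumptions.
import numpy as np

def fit_arx(glucose, order=4):
    """Fit AR coefficients to a CGM trace by least squares."""
    X = np.column_stack([glucose[i:len(glucose) - order + i] for i in range(order)])
    y = glucose[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def predict_ahead(glucose, coef, steps=6):
    """Iterate the model 'steps' samples ahead (6 x 5 min = 30 min)."""
    window = list(glucose[-len(coef):])
    for _ in range(steps):
        window.append(float(np.dot(coef, window[-len(coef):])))
    return window[-1]

def warn(predicted_mgdl, hypo=70.0, hyper=180.0):
    """Issue an alert if the predicted glucose crosses a threshold."""
    if predicted_mgdl < hypo:
        return "hypoglycemia warning"
    if predicted_mgdl > hyper:
        return "hyperglycemia warning"
    return None

# Hypothetical falling CGM trace (mg/dL, 5-min samples):
cgm = np.array([110.0, 108.0, 105.0, 101.0, 96.0, 92.0, 88.0, 85.0, 82.0, 80.0])
coef = fit_arx(cgm)
print(warn(predict_ahead(cgm, coef)))
```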
Abstract:
Definitions of shock and resuscitation endpoints traditionally focus on blood pressures and cardiac output. This carries a high risk of overemphasizing systemic hemodynamics at the cost of tissue perfusion. In line with novel shock definitions and evidence of the lack of a correlation between macro- and microcirculation in shock, we recommend that macrocirculatory resuscitation endpoints, particularly arterial and central venous pressure as well as cardiac output, be reconsidered. In this viewpoint article, we propose a three-step approach to resuscitation endpoints in shock of all origins. This approach targets only a minimum individual and context-sensitive mean arterial blood pressure (for example, 45 to 50 mm Hg) to preserve heart and brain perfusion. Further resuscitation is guided exclusively by endpoints of tissue perfusion, irrespective of the presence of arterial hypotension ('permissive hypotension'). Finally, optimization of individual tissue (for example, renal) perfusion is targeted. Prospective clinical studies are necessary to confirm the postulated benefits of targeting these resuscitation endpoints.
Abstract:
INTRODUCTION According to reports from observational databases, classic AIDS-defining opportunistic infections (ADOIs) occur in patients with CD4 counts above 500/µL on and off cART. Adjudication of these events is usually not performed. However, ADOIs are often used as endpoints, for example, in analyses on when to start cART. MATERIALS AND METHODS In the Swiss HIV Cohort Study (SHCS) database, we identified 91 cases of ADOIs that occurred from 1996 onwards in patients with a nearest CD4 count >500/µL. Cases of tuberculosis and recurrent bacterial pneumonia were excluded, as they also occur in non-immunocompromised patients. Chart review was performed in 82 cases, and in 50 cases we identified CD4 counts within six months before until one month after the ADOI and had chart review material to allow an in-depth review. In these 50 cases, we assessed whether (1) the ADOI fulfilled the SHCS diagnostic criteria (www.shcs.ch), and (2) HIV infection with CD4 >500/µL was the main immune-compromising condition causing the ADOI. Adjudication of cases was done by two experienced clinicians who had to agree on the interpretation. RESULTS More than 13,000 participants were followed in the SHCS in the period of interest. Twenty-four (48%) of the 50 chart-reviewed patients with an ADOI and CD4 >500/µL had an HIV RNA <400 copies/mL at the time of the ADOI. Among the 50 cases, Candida oesophagitis was the most frequent ADOI, in 30 patients (60%), followed by Pneumocystis pneumonia and chronic ulcerative HSV disease (Table 1). Overall, chronic HIV infection with a CD4 count >500/µL was the likely explanation for the ADOI in only seven cases (14%). Other explanations (Table 1) were ADOIs occurring during primary HIV infection in 5 (10%) cases, unmasking IRIS in 1 (2%) case, chronic HIV infection with CD4 counts <500/µL near the ADOI in 13 (26%) cases, diagnosis not according to SHCS diagnostic criteria in 7 (14%) cases and, most importantly, other additional immune-compromising conditions such as immunosuppressive drugs in 14 (34%) cases. CONCLUSIONS In patients with CD4 counts >500/µL, chronic HIV infection is the cause of ADOIs in only a minority of cases. Other immune-compromising conditions are the more likely explanation in one-third of the patients, especially in cases of Candida oesophagitis. ADOIs in HIV patients with high CD4 counts should be used as endpoints only with great caution in studies based on observational databases.
Abstract:
BACKGROUND Pathogenic bacteria are often asymptomatically carried in the nasopharynx. Bacterial carriage can be reduced by vaccination and has been used as an alternative endpoint to clinical disease in randomised controlled trials (RCTs). Vaccine efficacy (VE) is usually calculated as 1 minus a measure of effect. Estimates of vaccine efficacy from cross-sectional carriage data collected in RCTs are usually based on prevalence odds ratios (PORs) and prevalence ratios (PRs), but it is unclear when these should be measured. METHODS We developed dynamic compartmental transmission models simulating RCTs of a vaccine against a carried pathogen to investigate how VE can best be estimated from cross-sectional carriage data, at which time carriage should optimally be assessed, and to which factors this timing is most sensitive. In the models, the vaccine could change carriage acquisition and clearance rates (a leaky vaccine); values for these effects were explicitly defined (facq, 1/fdur). POR and PR were calculated from model outputs. Models differed in infection source: other participants or external sources unaffected by the trial. Simulations using multiple vaccine doses were compared to empirical data. RESULTS The combined VE against acquisition and duration calculated using POR (VE^acq.dur = (1-POR)×100) best estimates the true VE (VEacq.dur = (1-facq×fdur)×100) for leaky vaccines in most scenarios. The mean duration of carriage was the most important factor determining the time until VE^acq.dur first approximates VEacq.dur: if the mean duration of carriage is 1-1.5 months, up to 4 months are needed; if the mean duration is 2-3 months, up to 8 months are needed. Minor differences were seen between models with different infection sources. In RCTs with shorter intervals between vaccine doses, it takes longer after the last dose until VE^acq.dur approximates VEacq.dur. CONCLUSION The timing of sample collection should be considered when interpreting vaccine efficacy against bacterial carriage measured in RCTs.
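A minimal numeric sketch of the estimator discussed above, using the formulas given in the abstract (the arm sizes and carrier counts are hypothetical):

```python
# VE^acq.dur = (1 - POR) x 100 from cross-sectional carriage data, versus
# the true combined effect VEacq.dur = (1 - facq x fdur) x 100.
# Counts below are hypothetical.
def prevalence_odds_ratio(carriers_vacc, n_vacc, carriers_ctrl, n_ctrl):
    odds_vacc = carriers_vacc / (n_vacc - carriers_vacc)
    odds_ctrl = carriers_ctrl / (n_ctrl - carriers_ctrl)
    return odds_vacc / odds_ctrl

def ve_hat_from_por(por):
    return (1.0 - por) * 100.0

def true_ve(f_acq, f_dur):
    # f_acq multiplies the acquisition rate; 1/f_dur multiplies clearance.
    return (1.0 - f_acq * f_dur) * 100.0

# One hypothetical cross-sectional sampling of the two trial arms:
por = prevalence_odds_ratio(carriers_vacc=60, n_vacc=500,
                            carriers_ctrl=120, n_ctrl=500)
print(ve_hat_from_por(por))  # POR-based estimate
print(true_ve(0.7, 0.8))     # approx. 44 for facq = 0.7, fdur = 0.8
```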
Abstract:
Staphylococcus aureus is one of the most important pathogens causing mastitis in dairy cows and in Mediterranean buffaloes. Genotype B (GTB) is contagious in dairy cows and may occur in up to 87% of cows of a dairy herd. The aim of this study was to evaluate the genotypes present, clinical outcomes, and prevalence of Staph. aureus in milk samples of primiparous Mediterranean dairy buffaloes. Two hundred composite milk samples originating from 40 primiparous buffaloes were collected from May to June 2012, at 10, 30, 60, 90, and 150 days in milk (DIM), to perform somatic cell counts and bacteriological cultures. Daily milk yields were recorded. From before parturition until 40 to 50 DIM, all primiparous animals were housed separately from the pluriparous animals. Milking was performed in the same milking parlor, but the primiparous animals were milked first. After 50 DIM, the primiparous animals were mixed with the pluriparous animals, including during the milking procedure. Individual quarter samples were collected from each animal, and aliquots of 1 mL were mixed and used for molecular identification and genotyping of Staph. aureus. Identification of Staph. aureus was performed by verifying the presence of the nuc gene by PCR. All nuc-positive isolates were subjected to genotype analysis by PCR amplification of the 16S-23S rRNA intergenic spacer region and analyzed with a miniaturized electrophoresis system. Of all 200 composite samples, 41 (20.5%) were positive for Staph. aureus, and no genotype other than GTB was identified. The prevalence of samples positive for Staph. aureus was 0% at 10 DIM and increased to a maximum of 22/40 (55%) at 90 DIM. During the period of interest, 14 buffaloes tested positive for Staph. aureus once, 6 were positive twice, and 5 were positive 3 times, whereas 15 animals were negative at every sampling. At 90 and 150 DIM, 7 (17.5%) and 3 buffaloes (7.5%), respectively, showed clinical mastitis (CM), and only 1 (2.5%) showed CM at both samplings. At 60, 90, and 150 DIM, 1 buffalo was found with subclinical mastitis at each sampling. At 30, 60, 90, and 150 DIM, 2.5% (1/40), 22.5% (9/40), 35% (14/40), and 10% (4/40) of the buffaloes, respectively, were considered affected by intramammary infection (IMI). Buffaloes with CM caused by Staph. aureus had statistically significantly higher mean somatic cell counts (6.06 ± 0.29 log10 cells/mL, mean ± standard deviation) and statistically significantly lower mean daily milk yields (7.15 ± 1.49 L/animal per day) than healthy animals (4.69 ± 0.23 and 13.87 ± 2.64, respectively), buffaloes with IMI (4.82 ± 0.23 and 11.16 ± 1.80, respectively), or buffaloes with subclinical mastitis (5.47 ± 0.10 and 10.33 ± 0.68, respectively). To our knowledge, this is the first time that Staph. aureus GTB has been identified in milk samples of dairy Mediterranean buffaloes.
Abstract:
BACKGROUND Limited information exists describing the results of transcatheter aortic valve (TAV) replacement in patients with bicuspid aortic valve (BAV) disease (TAV-in-BAV). OBJECTIVES This study sought to evaluate clinical outcomes of a large cohort of patients undergoing TAV-in-BAV. METHODS We retrospectively collected baseline characteristics, procedural data, and clinical follow-up findings from 12 centers in Europe and Canada that had performed TAV-in-BAV. RESULTS A total of 139 patients underwent TAV-in-BAV with balloon-expandable transcatheter heart valve (THV) (n = 48) or self-expandable THV (n = 91) systems. Mean patient age and Society of Thoracic Surgeons predicted risk of mortality score were 78.0 ± 8.9 years and 4.9 ± 3.4%, respectively. BAV stenosis occurred in 65.5%, regurgitation in 0.7%, and mixed disease in 33.8% of patients. The incidence of type 0 BAV was 26.7%, type 1 BAV 68.3%, and type 2 BAV 5.0%. Multislice computed tomography (MSCT)-based TAV sizing was used in 63.5% of patients (77.1% balloon-expandable THV vs. 56.0% self-expandable THV, p = 0.02). Procedural mortality was 3.6%, with TAV embolization in 2.2% and conversion to surgery in 2.2%. The mean aortic gradient decreased from 48.7 ± 16.5 mm Hg to 11.4 ± 9.9 mm Hg (p < 0.0001). Post-implantation aortic regurgitation (AR) grade ≥2 occurred in 28.4% (19.6% balloon-expandable THV vs. 32.2% self-expandable THV, p = 0.11) but in only 17.4% when MSCT-based TAV sizing was performed (16.7% balloon-expandable THV vs. 17.6% self-expandable THV, p = 0.99). MSCT sizing was associated with reduced AR on multivariate analysis (odds ratio [OR]: 0.19, 95% confidence interval [CI]: 0.08 to 0.45; p < 0.0001). Thirty-day device safety, success, and efficacy were noted in 79.1%, 89.9%, and 84.9% of patients, respectively. One-year mortality was 17.5%. Major vascular complications were associated with increased 1-year mortality (OR: 5.66, 95% CI: 1.21 to 26.43; p = 0.03). CONCLUSIONS TAV-in-BAV is feasible, with encouraging short- and intermediate-term clinical outcomes. Importantly, a high incidence of post-implantation AR is observed, which appears to be mitigated by MSCT-based TAV sizing. Given the suboptimal echocardiographic results, further study is required to evaluate long-term efficacy.
Abstract:
Introduction: Current demographic changes are characterized by population aging, such that the surgical treatment of degenerative spine conditions in the elderly is gaining increasing relevance. However, there is a general reluctance to consider spinal fusion procedures in this patient age group due to the increased likelihood of complications. The aim of this study was to assess the patient-rated outcome and complication rates associated with lumbar fusion procedures in three different age groups. Methods: This was a retrospective analysis of prospectively collected data from consecutive patients who underwent first-time, one- to three-level posterior instrumented fusion between 2004 and 2011 due to degenerative disease of the lumbar spine. Data were obtained from our Spine Surgery Outcomes Database (linked to the International Spine Tango Register). Before surgery, patients completed the multidimensional Core Outcome Measures Index (COMI), and at 3 and 12 months after surgery they completed the COMI and rated the Global Treatment Outcome (GTO) and their satisfaction with care. Patients were divided into three groups according to their age: younger (≥50 to <65 y; n = 317), older (≥65 to <80 y; n = 350), and geriatric (≥80 y; n = 40). Results: 707 consecutive patients were included. The preoperative comorbidity status differed significantly (p < 0.0001) between the age groups, with the highest scores in the geriatric group. General medical complications during surgery were less frequent in the younger age group (7%) than in the older (13.4%; p = 0.006) and geriatric groups (17.5%; p = 0.007). Duration of hospital stay was longer (p = 0.006) in the older group (10.8 ± 3.7 days) than in the younger group (10.0 ± 3.6 days). There were no significant group differences (p > 0.05) in any of the COMI domains covering pain, function, symptom-specific well-being, general quality of life, and social and work disability at either the 3-month or 12-month follow-up. Similarly, there were no differences (p > 0.05) between the age groups for GTO and patient-rated satisfaction at either follow-up. Conclusions: Preoperative comorbidity and general medical complications during lumbar fusion for degenerative disorders of the lumbar spine are both greater in geriatric patients than in younger patients. However, patient-rated outcome is as good in the elderly as it is in younger age groups. These data suggest that geriatric age per se is not a contraindication to instrumented fusion for lumbar degenerative disease.