68 results for Lumbar stabilization
Abstract:
Introduction: Symptomatic lumbar spinal stenosis is increasingly common. Treatment depends on clinical signs and radiological findings, but there is currently no consensus on radiological classification. The aim of our article is to study the relationship between two radiological morphological parameters recently described on MRI examinations: the first is the sedimentation sign and the second is the morphological grade of lumbar stenosis, both described in 2010. Materials and methods: We studied the MRI examinations of 137 patients followed at our institution. Of these 137, 110 came from a database of patients with lumbar stenosis and typical symptoms. In this group, 73 patients had been treated surgically and 37 conservatively, depending on symptom severity. A third group, the control group, consisted of 27 patients presenting only with low back pain without sciatica. Severity of the stenosis was assessed on MRI at disc level using the four grades (A to D) of the morphological classification. The presence of a sedimentation sign was recorded at pedicle level, above and below the level of maximal stenosis, as described in the original article. Results: A positive sedimentation sign was observed in 58% of patients with morphological grade B, 69% of patients with grade C, and 76% of patients with grade D. In the group treated surgically for spinal stenosis, 67% of patients had a positive sedimentation sign, versus 35% in the conservatively treated group and 8% in the control group. For the morphological grade classification, we pooled grades C and D: 97% of patients in the surgically treated group had grade C or D, versus 35% in the conservatively treated group and 18% in the control group. We calculated that a positive sedimentation sign in patients with symptomatic lumbar stenosis increases the odds of requiring surgery about 3.5-fold (OR = 3.5). Using the morphological grade classification, the calculated risk was even higher: a patient with grade C or D stenosis has 65 times higher odds of requiring surgery (OR = 65). Conclusion: The results show a correlation between these two morphological parameters, but they are not equivalent in predicting the need for surgery. One-third of patients in the surgically treated group did not have a positive sedimentation sign. This sign therefore appears to be a weaker predictor of treatment choice than the severity of stenosis judged by the morphological grade (OR 3.5 vs 65).
Abstract:
PURPOSE: We aimed to study the relationship between two morphological parameters recently described on MRI images in relation to lumbar spinal stenosis (LSS): the first is the sedimentation sign (SedS) and the second is the morphological grading of lumbar stenosis. MATERIALS AND METHODS: MRIs from a total of 137 patients were studied. Of those, 110 were drawn from a prospective database of symptomatic LSS patients, of whom 73 were treated surgically and 37 conservatively based on symptom severity. A third group consisting of 27 subjects complaining of low back pain (LBP) served as control. Severity of stenosis was judged at disc level using the four-grade (A to D) morphological classification. The presence of a SedS was judged at pedicle level, above or below the site of maximal stenosis. RESULTS: A positive SedS was observed in 58, 69 and 76 % of patients demonstrating B, C and D morphology, respectively, but in none with grade A morphology. The SedS was positive in 67 and 35 % of the surgically and conservatively treated patients, respectively, and in 8 % of the LBP group. C and D morphological grades were present in 97 and 35 % of patients in the surgically and conservatively treated groups, respectively, and in 18 % of the LBP group. Presence of a positive SedS carried an increased risk of undergoing surgery in the symptomatic LSS group (OR 3.5). This risk was even higher in the LSS patients demonstrating grade C or D morphology (OR 65). DISCUSSION AND CONCLUSION: One-third of surgically treated LSS patients did not present a SedS. This sign appears to be a weaker predictor of treatment modality in our setting of symptomatic LSS patients than the severity of stenosis judged by the morphological grade.
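Editorial aside: for readers unfamiliar with how the reported odds ratios follow from the group proportions above, the sketch below recomputes them from a 2x2 table. The counts are reconstructed from the reported percentages and group sizes (73 surgical, 37 conservative) and are illustrative assumptions, not the study's raw data.

```python
# Illustrative sketch only: the 2x2 counts below are reconstructed from the
# reported percentages (a positive SedS in 67% of the 73 surgically and 35%
# of the 37 conservatively treated patients); they are approximations for
# demonstration, not the study's raw data.

def odds_ratio(a, b, c, d):
    """Cross-product odds ratio for a 2x2 table:
    a = SedS-positive & operated, b = SedS-positive & not operated,
    c = SedS-negative & operated, d = SedS-negative & not operated."""
    return (a * d) / (b * c)

surgical, conservative = 73, 37
a = round(0.67 * surgical)        # ~49 SedS-positive, surgically treated
b = round(0.35 * conservative)    # ~13 SedS-positive, conservatively treated
c = surgical - a                  # SedS-negative, surgically treated
d = conservative - b              # SedS-negative, conservatively treated

print(f"Approximate OR for surgery given a positive SedS: {odds_ratio(a, b, c, d):.1f}")
# Close to the reported OR of 3.5; the small difference reflects rounding of percentages.
```

Applying the same cross-product to the grade C/D proportions (97% and 35%) gives a value on the order of the reported OR of 65.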
Abstract:
Cells couple growth with division and regulate size in response to nutrient availability. In rod-shaped fission yeast, cell-size control occurs at mitotic commitment. An important regulator is the DYRK-family kinase Pom1, which forms gradients from cell poles and inhibits the mitotic activator Cdr2, itself localized at the medial cortex. Where and when Pom1 modulates Cdr2 activity is unclear as Pom1 medial cortical levels remain constant during cell elongation. Here we show that Pom1 re-localizes to cell sides upon environmental glucose limitation, where it strongly delays mitosis. This re-localization is caused by severe microtubule destabilization upon glucose starvation, with microtubules undergoing catastrophe and depositing the Pom1 gradient nucleator Tea4 at cell sides. Microtubule destabilization requires PKA/Pka1 activity, which negatively regulates the microtubule rescue factor CLASP/Cls1/Peg1, reducing CLASP's ability to stabilize microtubules. Thus, PKA signalling tunes CLASP's activity to promote Pom1 cell side localization and buffer cell size upon glucose starvation.
Abstract:
Rapid deployment aortic valve replacement (RDAVR) with the use of rapid deployment valve systems represents an attractive alternative to standard aortic bioprostheses for aortic valve replacement. Nevertheless, its use is still debatable in patients with pure aortic valve regurgitation or a true bicuspid aortic valve because of the risk of postoperative paravalvular leak. To address this issue, an optimal annulus-valve size match seems to be the ideal surgical strategy. This article describes a new technique developed to stabilize the aortic annulus and prevent paravalvular leak after RDAVR. To confirm its feasibility, this technique was performed in six patients with severe symptomatic aortic stenosis who were scheduled to undergo aortic valve replacement at our center. All patients survived surgery and were discharged from the hospital. There were no new intracardiac conduction system disturbances observed, and permanent pacemaker implantation was not required in any of the patients. The intraoperative and postoperative echocardiograms confirmed successful positioning of the valve, and no paravalvular leak was observed. In this preliminary experience, RDAVR through a full sternotomy or an upper hemisternotomy approach with the use of the aortic annulus stabilization technique was safe, and no leak was observed. Future studies on large series of patients are necessary to confirm the safety and effectiveness of this technique in preventing paravalvular leak in patients with true bicuspid aortic valves or pure aortic regurgitation.
Abstract:
In many high-income developed countries, obesity is inversely associated with educational level. In some countries, a widening gap in obesity between educational groups has been reported. The aim of this study was to assess trends in body mass index (BMI) and in the prevalence of overweight and obesity, and their association with educational level, in the adult Swiss population. Four cross-sectional National Health Interview Surveys were conducted in 1992/93 (n = 14,521), 1997 (n = 12,474), 2002 (n = 18,908) and 2007 (n = 17,879) using representative samples of the Swiss population (age range 18-102 years). BMI was derived from self-reported data. Overweight was defined as BMI ≥ 25 and < 30 kg/m², and obesity as BMI ≥ 30 kg/m². Mean (± standard deviation) BMI increased from 24.7 ± 3.6 in 1992/3 to 25.4 ± 3.6 kg/m² in 2007 in men and from 22.8 ± 3.8 to 23.7 ± 4.3 kg/m² in women. Between 1992/3 and 2007, the prevalence of overweight + obesity increased from 40.4% to 49.5% in men and from 22.3% to 31.3% in women, while the prevalence of obesity increased from 6.3% to 9.4% in men and from 4.9% to 8.5% in women. The rate of increase in the prevalence of obesity was greater between 1992/3 and 2002 (men: +0.26%/year; women: +0.31%/year) than between 2002 and 2007 (men: +0.10%/year; women: +0.10%/year). A sizable fraction (approximately 25%) of the increase in mean BMI was due to increasing age of the participants over time. The increase was larger in low than in high education strata of the population. BMI was strongly associated with low educational level among women, and this gradient remained fairly constant over time. A weaker, similar gradient by educational level was apparent in men, but it tended to increase over time. In Switzerland, overweight and obesity increased between 1992 and 2007 and were associated with low education status in both men and women. A trend towards stabilization of mean BMI levels was noted in most age categories since 2002. The increase in the prevalence of obesity was larger in low education strata of the population. These findings suggest that obesity preventive measures should be targeted according to educational level in Switzerland.
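Editorial aside: as a brief illustration of the definitions and arithmetic behind these figures, the sketch below applies the stated BMI cut-offs and derives an average annual change in prevalence from two survey years. The function names are ours, chosen for the example, and the only inputs are figures reported above.

```python
# Illustrative sketch only: applies the BMI cut-offs stated in the abstract
# and shows how an average yearly change in prevalence can be derived from
# two survey years. Function names are ours, chosen for the example.

def bmi_category(bmi_kg_m2: float) -> str:
    """Classify BMI using the cut-offs above: >= 30 obese, 25-29.9 overweight."""
    if bmi_kg_m2 >= 30:
        return "obese"
    if bmi_kg_m2 >= 25:
        return "overweight"
    return "not overweight"

def annual_change(prev_start: float, prev_end: float, years: float) -> float:
    """Average change in prevalence, in percentage points per year."""
    return (prev_end - prev_start) / years

print(bmi_category(25.4))  # mean male BMI in 2007 -> "overweight"

# Obesity in men rose from 6.3% (1992/3) to 9.4% (2007), i.e. about +0.21 %/year
# over ~14.5 years; the abstract reports the rise was faster before 2002
# (+0.26%/year) than after (+0.10%/year).
print(round(annual_change(6.3, 9.4, 14.5), 2))
```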
Abstract:
Background/Purpose: The trabecular bone score (TBS), a novel gray-level texture index determined from lumbar spine DXA scans, correlates with 3D parameters of trabecular bone microarchitecture known to predict fracture. TBS may enhance the identification of patients at increased risk for vertebral fracture independently of bone mineral density (BMD) (Boutroy JBMR 2010; Hans JBMR 2011). Denosumab treatment for 36 months decreased bone turnover, increased BMD, and reduced new vertebral fractures in postmenopausal women with osteoporosis (Cummings NEJM 2009). We explored the effect of denosumab on TBS over 36 months and evaluated the association between TBS and lumbar spine BMD in women who had DXA scans obtained from scanners eligible for TBS evaluation in FREEDOM. Methods: FREEDOM was a 3-year, randomized, double-blind trial that enrolled postmenopausal women with a lumbar spine or total hip DXA T-score < −2.5, but not < −4.0, at both sites. Women received placebo or 60 mg denosumab every 6 months. A subset of women in FREEDOM participated in a DXA substudy in which lumbar spine DXA scans were obtained at baseline and months 1, 6, 12, 24, and 36. We retrospectively applied, in a blinded-to-treatment manner, a novel software program (TBS iNsight® v1.9, Med-Imaps, Pessac, France) to the standard lumbar spine DXA scans obtained in these women to determine their TBS indices at baseline and months 12, 24, and 36. From previous studies, a TBS ≥ 1.35 is considered normal microarchitecture, a TBS between 1.20 and 1.35 partially deteriorated microarchitecture, and a TBS ≤ 1.20 degraded microarchitecture. Results: There were 285 women (128 placebo, 157 denosumab) with a TBS value at baseline and ≥ 1 post-baseline visit. Their mean age was 73, their mean lumbar spine BMD T-score was −2.79, and their mean lumbar spine TBS was 1.20. In addition to the robust gains in DXA lumbar spine BMD observed with denosumab (9.8% at month 36), there were consistent, progressive, and significant increases in TBS compared with placebo and baseline (Table & Figure). BMD explained a very small fraction of the variance in TBS at baseline (r² = 0.07). In addition, the variance in the TBS change was largely unrelated to BMD change, whether expressed in absolute or percentage changes, regardless of treatment, throughout the study (all r² ≤ 0.06), indicating that TBS provides distinct information, independently of BMD. Conclusion: In postmenopausal women with osteoporosis, denosumab significantly improved TBS, an index of lumbar spine trabecular microarchitecture, independently of BMD.
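Editorial aside: the three-level TBS classification cited above can be written as a simple rule. The sketch below is purely illustrative, uses the thresholds exactly as stated (which come from the cited prior studies, not from the FREEDOM data), and the function name is ours.

```python
# Illustrative only: TBS microarchitecture categories cited in the abstract
# (>= 1.35 normal, 1.20-1.35 partially deteriorated, <= 1.20 degraded).

def tbs_category(tbs: float) -> str:
    """Map a trabecular bone score to the three-level classification above."""
    if tbs >= 1.35:
        return "normal microarchitecture"
    if tbs > 1.20:
        return "partially deteriorated microarchitecture"
    return "degraded microarchitecture"

# The cohort's mean baseline lumbar spine TBS of 1.20 falls in the degraded range.
print(tbs_category(1.20))
```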
Abstract:
Introduction: Several scores are commonly used to evaluate patients' postoperative satisfaction after lateral ankle ligament repair, including the AOFAS, FAAM, CAIT and CAIS. Comparing published studies in the literature is difficult, as the same patient can have markedly different results depending on which scoring system is used. The current study aims to address this gap in the literature by developing a system to compare these tests, to allow better analysis and comparison of published studies. Patients and methods: This is a retrospective cohort study of 47 patients following lateral ankle ligament repair using a modified Broström-Gould technique. All patients were operated on between 2005 and 2010 by a single surgeon and followed the same postoperative rehabilitation protocol. Six patients were excluded from the study because of concomitant surgery. Patients were assessed by an independent observer. We used the Pearson correlation coefficient to analyse the concordance of the scores, as well as scatter plots to assess the linear relationship between them. Results: A linear relationship between the scores was found when the results were analysed using scatter plots. We were thus able to use the Pearson correlation coefficient to evaluate the relationship between each pair of postoperative scores. The correlation was above 0.5 in all cases except for the comparison between the CAIT and the FAAM activities of daily living subscale (0.39). We were, therefore, able to compare the results obtained and assess the relative concordance of the scoring systems. The results showed that the more specific the scale, the worse the score, and vice versa: the CAIT and the CAIS appeared more severe than the AOFAS and the FAAM activities of daily living subscale, while the sports subscale of the FAAM gave intermediate results. Conclusion: This study outlines a system for comparing the different postoperative scores commonly used to evaluate outcome after ankle stabilization surgery. Its impact is that it makes comparison of published studies easier, even though they use a variety of different clinical scores, thus facilitating better outcome analysis of operative techniques.
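Editorial aside: to make the concordance analysis concrete, the sketch below computes a Pearson correlation coefficient between two outcome scores measured on the same patients. The score arrays are made-up placeholder values, not data from this cohort.

```python
# Illustrative sketch of the concordance analysis: Pearson correlation between
# two postoperative scores measured on the same patients. The arrays below are
# hypothetical placeholder values, not the study's data.
import numpy as np

aofas = np.array([90, 85, 100, 72, 95, 88, 80])   # hypothetical AOFAS scores
cait = np.array([24, 20, 28, 15, 27, 22, 18])     # hypothetical CAIT scores

r = np.corrcoef(aofas, cait)[0, 1]  # Pearson correlation coefficient
print(f"Pearson r = {r:.2f}")
# In the study, r exceeded 0.5 for all score pairs except CAIT vs. the FAAM
# activities of daily living subscale (0.39).
```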
Abstract:
STUDY DESIGN: Prospective, controlled, observational outcome study using clinical, radiographic, and patient/physician-based questionnaire data, with patient outcomes at 12 months' follow-up. OBJECTIVE: To validate appropriateness criteria for low back surgery. SUMMARY OF BACKGROUND DATA: Most surgical treatment failures are attributed to poor patient selection, but no widely accepted consensus exists on detailed indications for appropriate surgery. METHODS: Appropriateness criteria for low back surgery had been developed by a multispecialty panel using the RAND appropriateness method. Based on the panel criteria, a prospective study compared outcomes of patients appropriately and inappropriately treated at a single institution, with 12-month follow-up assessment. Included were patients with low back pain and/or sciatica referred to the neurosurgical department. Information about symptoms, neurologic signs, health-related quality of life (SF-36), disability status (Roland-Morris), and pain intensity (VAS) was assessed at baseline, at 6 months, and at 12 months' follow-up. The appropriateness criteria were administered prospectively to each clinical situation and outside of the clinical setting, with the surgeon and patients blinded to the results of the panel decision. The patients were further stratified into 2 groups: appropriate treatment group (ATG) and inappropriate treatment group (ITG). RESULTS: Overall, 398 patients completed all forms at 12 months. Treatment was considered appropriate for 365 participants and inappropriate for 33 participants. The mean improvement in the SF-36 physical component score at 12 months was significantly higher in the ATG (mean: 12.3 points) than in the ITG (mean: 6.8 points) (P = 0.01), as was the mean improvement in the SF-36 mental component score (ATG mean: 5.0 points; ITG mean: -0.5 points) (P = 0.02). Improvement was also significantly higher in the ATG for mean VAS back pain (ATG mean: 2.3 points; ITG mean: 0.8 points; P = 0.02) and the Roland-Morris disability score (ATG mean: 7.7 points; ITG mean: 4.2 points; P = 0.004). The ATG also had a higher improvement in mean VAS for sciatica (4.0 points) than the ITG (2.8 points), but the difference was not significant (P = 0.08). The SF-36 General Health score declined in both groups after 12 months; however, the decline was worse in the ITG (mean decline: 8.2 points) than in the ATG (mean decline: 1.2 points) (P = 0.04). Overall, in comparison to ITG patients, ATG patients had significantly higher improvement at 12 months, both statistically and clinically. CONCLUSION: In comparison to the previously reported literature, our study is the first to assess the utility of appropriateness criteria for low back surgery at 1-year follow-up with multiple outcome dimensions. Our results confirm the hypothesis that application of appropriateness criteria can significantly improve patient outcomes.
Abstract:
Bradykinin (BK) has been shown to stimulate the production of physiologically active metabolites, blood-brain barrier disruption, and brain edema. The aim of this prospective study was to measure BK concentrations in blood and cerebrospinal fluid (CSF) of patients with traumatic brain injury (TBI), subarachnoid hemorrhage (SAH), intracerebral hemorrhage (ICH), and ischemic stroke, and to correlate BK levels with the extent of cerebral edema and intracranial pressure (ICP). Blood and CSF samples of 29 patients suffering from acute cerebral lesions (TBI, 7; SAH, 10; ICH, 8; ischemic stroke, 4) were collected for up to 8 days after the insult. Seven patients with lumbar drainage served as controls. Edema (5-point scale), ICP, and the Glasgow Coma Score (GCS) at the time of sample withdrawal were correlated with BK concentrations. Although plasma BK levels were not significantly elevated, CSF-BK levels of all patients were significantly elevated in overall (n=73) and early (≤72 h) measurements (n=55; 4.3±6.9 and 5.6±8.9 fmol/mL), compared to 1.2±0.7 fmol/mL in controls (p=0.05 and 0.006). Within 72 h after ictus, patients suffering from TBI (p=0.01), ICH (p=0.001), and ischemic stroke (p=0.02) showed significant increases. CSF-BK concentrations correlated with the extent of edema formation (r=0.53; p<0.001) and with ICP (r=0.49; p<0.001). Our results demonstrate that acute cerebral lesions are associated with increased CSF-BK levels. Especially after TBI, subarachnoid hemorrhage, and intracerebral hemorrhage, CSF-BK levels correlate with the extent of edema evolution and ICP. BK-blocking agents may turn out to be effective remedies in brain injuries.
Abstract:
In the present study, we evaluated stimulation of the angiotensin type 2 receptor (AT2R) by the selective non-peptide agonist Compound 21 (C21) as a novel therapeutic concept for the treatment of multiple sclerosis using the model of experimental autoimmune encephalomyelitis (EAE) in mice. C57BL-6 mice were immunized with myelin-oligodendrocyte peptide and treated for 4 weeks with C21 (0.3 mg/kg/day i.p.). Potential effects on myelination, microglia and T-cell composition were estimated by immunostaining and FACS analyses of lumbar spinal cords. The in vivo study was complemented by experiments in aggregating brain cell cultures and microglia in vitro. In the EAE model, treatment with C21 ameliorated microglia activation and decreased the number of total T-cells and CD4+ T-cells in the spinal cord. Fluorescent myelin staining of spinal cords further revealed a significant reduction in EAE-induced demyelinated areas in lumbar spinal cord tissue after AT2R stimulation. C21-treated mice had a significantly better neurological score than vehicle-treated controls. In aggregating brain cell cultures challenged with lipopolysaccharide (LPS) plus interferon-γ (IFNγ), AT2R stimulation prevented demyelination, accelerated re-myelination and reduced the number of microglia. Cytokine synthesis and nitric oxide production by microglia in vitro were significantly reduced after C21 treatment. These results suggest that AT2R stimulation protects the myelin sheaths in autoimmune central nervous system inflammation by inhibiting the T-cell response and microglia activation. Our findings identify the AT2R as a potential new pharmacological target for demyelinating diseases such as multiple sclerosis.