953 results for CLASS-III MALOCCLUSIONS
Abstract:
Although intervertebral disc herniation is a well-known disease in dogs, pain management for this condition has remained a challenge. The goal of the present study is to address the lack of information regarding the innervation of anatomical structures within the canine vertebral canal. Immunolabeling was performed with antibodies against protein gene product 9.5, Tuj-1 (neuron-specific class III β-tubulin), calcitonin gene-related peptide, and neuropeptide Y in combination with the lectin from Lycopersicon esculentum as a marker for blood vessels. Staining was indicative of both sensory and sympathetic fibers. Innervation density was the highest in lateral areas, intermediate in dorsal areas, and the lowest in ventral areas. In the dorsal longitudinal ligament (DLL), the highest innervation density was observed in the lateral regions. Innervation was lower at mid-vertebral levels than at intervertebral levels. The presence of sensory and sympathetic fibers in the canine dura and DLL suggests that pain may originate from both these structures. Due to these regional differences in sensory innervation patterns, trauma to intervertebral DLL and lateral dura is expected to be particularly painful. The results ought to provide a better basis for the assessment of medicinal and surgical procedures.
Abstract:
Suppression of cyclic activity in cattle is often desired in alpine farming and for feedlot cattle not intended for breeding. A cattle-specific anti-GnRH vaccination (Bopriva, Zoetis Australia Ltd., West Ryde, Australia) is approved for use in heifers and bulls in New Zealand, Australia, Mexico, Brazil, Argentina, Turkey, and Peru. Eleven healthy, cyclic Swiss Fleckvieh cows were included in the study and vaccinated twice with Bopriva 4wk apart. Injection site, rectal body temperature, and heart and respiratory rates were recorded before and 3d following each vaccination. Blood samples were taken weekly for progesterone and estrogen analysis and to determine GnRH antibody titer. Ovaries were examined weekly, using ultrasound to count the number of follicles and identify the presence of a corpus luteum. Thirty weeks after the first vaccination, the cows were subjected to a controlled internal drug-releasing device-based Select-Synch treatment. The GnRH antibody titers increased after the second vaccination and peaked 2wk later. Estrogen levels were not influenced by vaccination, and progesterone level decreased in 7 of 11 cows up to 3wk after the second vaccination and remained low for 10 to 15wk following the second vaccination. The number of class I follicles (diameter ≤5mm) was not influenced by vaccination, whereas the number of class II follicles (diameter 6-9mm) decreased between 7 and 16wk after the first vaccination. Class III follicles (diameter >9mm) were totally absent during this period in most cows. The median period until recurrence of class III follicles was 78d from the day of the second vaccination (95% confidence interval: 60-92d). After vaccination, all cows showed swelling and pain at the injection site, and these reactions subsided within 2wk. Body temperature and heart and respiratory rates increased after the first and second vaccinations and returned to normal values within 2d of each vaccination. 
The cows in our study were not observed to display estrus behavior until 30wk after the first vaccination. Therefore, a Select-Synch protocol was initiated at that time. Ten cows became pregnant after the first insemination (the remaining cow required one further insemination before pregnancy was confirmed). Bopriva induced a reliable and reversible suppression of reproductive cyclicity for more than 2mo. The best practical predictor for the length of the anestrus period was the absence of class III follicles.
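For readers implementing similar analyses, the follicle size classes used in this abstract reduce to simple diameter cut-offs; a minimal sketch (the thresholds come from the abstract, the function name is illustrative):

```python
def follicle_class(diameter_mm: float) -> str:
    """Follicle size classes as defined in the abstract:
    class I: diameter <= 5 mm; class II: 6-9 mm; class III: > 9 mm."""
    if diameter_mm <= 5:
        return "class I"
    if diameter_mm <= 9:
        return "class II"
    return "class III"
```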
Abstract:
BACKGROUND The use of transcatheter mitral valve repair (TMVR) has gained widespread acceptance in Europe, but data on immediate success, safety, and long-term echocardiographic follow-up in real-world patients are still limited. OBJECTIVES The aim of this multinational registry is to present a real-world overview of TMVR use in Europe. METHODS The Transcatheter Valve Treatment Sentinel Pilot Registry is a prospective, independent, consecutive collection of individual patient data. RESULTS A total of 628 patients (mean age 74.2 ± 9.7 years, 63.1% men) underwent TMVR between January 2011 and December 2012 in 25 centers in 8 European countries. The prevalent pathogenesis was functional mitral regurgitation (FMR) (n = 452 [72.0%]). The majority of patients (85.5%) were highly symptomatic (New York Heart Association functional class III or higher), with a high logistic EuroSCORE (European System for Cardiac Operative Risk Evaluation) (20.4 ± 16.7%). Acute procedural success was high (95.4%) and similar in FMR and degenerative mitral regurgitation (p = 0.662). One clip was implanted in 61.4% of patients. In-hospital mortality was low (2.9%), without significant differences between groups. The estimated 1-year mortality was 15.3%, which was similar for FMR and degenerative mitral regurgitation. The estimated 1-year rate of rehospitalization because of heart failure was 22.8%, significantly higher in the FMR group (25.8% vs. 12.0%, p[log-rank] = 0.009). Paired echocardiographic data from the 1-year follow-up, available for 368 consecutive patients in 15 centers, showed a persistent reduction in the degree of mitral regurgitation at 1 year (6.0% of patients with severe mitral regurgitation). CONCLUSIONS This independent, contemporary registry shows that TMVR is associated with high immediate success, low complication rates, and sustained 1-year reduction of the severity of mitral regurgitation and improvement of clinical symptoms.
Abstract:
That gene transfer to plant cells is a temperature-sensitive process has been known for more than 50 years. Previous work indicated that this sensitivity results from the inability to assemble a functional T pilus required for T-DNA and protein transfer to recipient cells. The studies reported here extend these observations and more clearly define the molecular basis of this assembly and transfer defect. T-pilus assembly and virulence protein accumulation were monitored in Agrobacterium tumefaciens strain C58 at different temperatures ranging from 20 degrees C to growth-inhibitory 37 degrees C. Incubation at 28 degrees C but not at 26 degrees C strongly inhibited extracellular assembly of the major T-pilus component VirB2 as well as of pilus-associated protein VirB5, and the highest amounts of T pili were detected at 20 degrees C. Analysis of temperature effects on the cell-bound virulence machinery revealed three classes of virulence proteins. Whereas class I proteins (VirB2, VirB7, VirB9, and VirB10) were readily detected at 28 degrees C, class II proteins (VirB1, VirB4, VirB5, VirB6, VirB8, VirB11, VirD2, and VirE2) were only detected after cell growth below 26 degrees C. Significant levels of class III proteins (VirB3 and VirD4) were only detected at 20 degrees C and not at higher temperatures. Shift of virulence-induced agrobacteria from 20 to 28 or 37 degrees C had no immediate effect on cell-bound T pili or on stability of most virulence proteins. However, the temperature shift caused a rapid decrease in the amount of cell-bound VirB3 and VirD4, and VirB4 and VirB11 levels decreased next. To assess whether destabilization of virulence proteins constitutes a general phenomenon, levels of virulence proteins and of extracellular T pili were monitored in different A. tumefaciens and Agrobacterium vitis strains grown at 20 and 28 degrees C. 
Levels of many virulence proteins were strongly reduced at 28 degrees C compared to 20 degrees C, and T-pilus assembly did not occur in any strain except the "temperature-resistant" strains Ach5 and Chry5. Virulence protein levels correlated well with bacterial virulence at elevated temperature, suggesting that degradation of a limited set of virulence proteins accounts for the temperature sensitivity of gene transfer to plants.
Abstract:
BACKGROUND Treatment of furcation defects is a core component of periodontal therapy. The goal of this consensus report is to critically appraise the evidence and subsequently present interpretive conclusions regarding the effectiveness of regenerative therapy for the treatment of furcation defects, along with recommendations for future research in this area. METHODS A systematic review was conducted before the consensus meeting. This review aimed to evaluate and present the available evidence regarding the effectiveness of different regenerative approaches for the treatment of furcation defects in specific clinical scenarios compared with conventional surgical therapy. During the meeting, the outcomes of the systematic review, as well as other pertinent sources of evidence, were discussed by a committee of nine members. The consensus group members submitted additional material for consideration by the group in advance and at the time of the meeting. The group agreed on a comprehensive summary of the evidence and also formulated recommendations for the treatment of furcation defects via regenerative therapies and the conduct of future studies. RESULTS Histologic proof of periodontal regeneration after the application of a combined regenerative therapy for the treatment of maxillary facial, mesial, distal, and mandibular facial or lingual Class II furcation defects has been demonstrated in several studies. Evidence of histologic periodontal regeneration in mandibular Class III defects is limited to one case report. Favorable outcomes after regenerative therapy for maxillary Class III furcation defects are limited to clinical case reports. In Class I furcation defects, regenerative therapy may be beneficial in certain clinical scenarios, although generally Class I furcation defects may be treated predictably with non-regenerative therapies. There is a paucity of data regarding quantifiable patient-reported outcomes after surgical treatment of furcation defects.
CONCLUSIONS Based on the available evidence, it was concluded that regenerative therapy is a viable option to achieve predictable outcomes for the treatment of furcation defects in certain clinical scenarios. Future research should test the efficacy of novel regenerative approaches that have the potential to enhance the effectiveness of therapy in clinical scenarios associated historically with less predictable outcomes. Additionally, future studies should place emphasis on histologic demonstration of periodontal regeneration in humans and also include validated patient-reported outcomes. CLINICAL RECOMMENDATIONS Based on the prevailing evidence, the following clinical recommendations could be offered. 1) Periodontal regeneration has been established as a viable therapeutic option for the treatment of various furcation defects, among which Class II defects represent a highly predictable scenario. Hence, regenerative periodontal therapy should be considered before resective therapy or extraction; 2) The application of a combined therapeutic approach (i.e., barrier, bone replacement graft with or without biologics) appears to offer an advantage over monotherapeutic algorithms; 3) To achieve predictable regenerative outcomes in the treatment of furcation defects, adverse systemic and local factors should be evaluated and controlled when possible; 4) Stringent postoperative care and subsequent supportive periodontal therapy are essential to achieve sustainable long-term regenerative outcomes.
Abstract:
The ATLS program of the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is among the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

                               Class I           Class II         Class III              Class IV
Blood loss (ml)                Up to 750         750–1500         1500–2000              >2000
Blood loss (% blood volume)    Up to 15%         15–30%           30–40%                 >40%
Pulse rate (bpm)               <100              100–120          120–140                >140
Systolic blood pressure        Normal            Normal           Decreased              Decreased
Pulse pressure                 Normal or ↑       Decreased        Decreased              Decreased
Respiratory rate               14–20             20–30            30–40                  >35
Urine output (ml/h)            >30               20–30            5–15                   Negligible
CNS/mental status              Slightly anxious  Mildly anxious   Anxious, confused      Confused, lethargic
Initial fluid replacement      Crystalloid       Crystalloid      Crystalloid and blood  Crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated according to the injuries in nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min−1 postulated by ATLS.
Moreover, deterioration of the different parameters does not necessarily progress in parallel as suggested in the ATLS shock classification [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined. As in the retrospective studies, Lawton did not find a statistical difference in heart rate and blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement of fluid substitution.
This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation based on the shock classification. Nevertheless, it was a significant predictor of ISS [6].

Table 2. Shock grade categories in prospective validation study (Lawton, 2014) [6].

                                    Normal (no haemorrhage)  Class I (mild)                  Class II (moderate)             Class III (severe)               Class IV (moribund)
Vitals                              Normal                   Normal                          HR >100 with SBP >90 mmHg       SBP <90 mmHg                     SBP <90 mmHg or imminent arrest
Response to fluid bolus (1000 ml)   NA                       Yes, no further fluid required  Yes, no further fluid required  Requires repeated fluid boluses  Declining SBP despite fluid boluses
Estimated blood loss (ml)           None                     Up to 750                       750–1500                        1500–2000                        >2000

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the response of the organism to fluid loss: decrease of cardiac output, increase of heart rate and decrease of pulse pressure occurring first, and hypotension and bradycardia occurring only later. Increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit or the anesthetized patient in the OR. We will therefore continue to teach this typical pattern but will continue to mention the exceptions and pitfalls at a second stage. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to the neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension).
Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS refers to the pitfalls in the signs of acute haemorrhage as well: advanced age, athletes, pregnancy, medications and pacemakers, and explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be literally applied in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss together with clinical examination and laboratory findings (e.g. base deficit and lactate) but does not use a shock classification related to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching. The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs but includes the pattern of injuries, the requirement of fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but must be interpreted in view of the clinical context. Conflict of interest: none declared. The author is a member of the Swiss national ATLS core faculty.
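The blood-loss boundaries of the ATLS classification in Table 1 can be expressed as a simple lookup. The sketch below uses only the blood-loss-percentage column and deliberately ignores vital signs, which, as the editorial stresses, individual patients often fail to match (function name illustrative):

```python
def atls_shock_class(blood_loss_pct: float) -> str:
    """Map estimated blood loss (% of blood volume) to an ATLS shock
    class using the Table 1 boundaries: Class I up to 15%, Class II
    15-30%, Class III 30-40%, Class IV >40%."""
    if blood_loss_pct <= 15:
        return "Class I"
    if blood_loss_pct <= 30:
        return "Class II"
    if blood_loss_pct <= 40:
        return "Class III"
    return "Class IV"
```

Real triage, as discussed above, weighs injury pattern, fluid response and confounders rather than any single threshold.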
Abstract:
AIM To systematically search the literature and assess the available evidence for the influence of chin-cup therapy on the temporomandibular joint, with regard to both morphological adaptations and the appearance of temporomandibular disorders (TMD). MATERIALS AND METHODS Electronic database searches of published and unpublished literature were performed. The following electronic databases were searched with no language or publication date restrictions: MEDLINE (via Ovid and PubMed), EMBASE (via Ovid), the Cochrane Oral Health Group's Trials Register, and CENTRAL. Unpublished literature was searched on ClinicalTrials.gov, the National Research Register, and the ProQuest Dissertation Abstracts and Thesis database. The reference lists of all eligible studies were checked for additional studies. Two review authors performed data extraction independently and in duplicate using data collection forms. Disagreements were resolved by discussion or the involvement of an arbiter. RESULTS Of the 209 articles identified, 55 papers were considered eligible for inclusion in the review. Following the full-text reading stage, 12 studies qualified for the final review analysis. No randomized clinical trial was identified. Eight of the included studies were of prospective and four of retrospective design. All studies were assessed for their quality and eventually graded from low to medium level of evidence. Based on the reported evidence, chin-cup therapy affects the condylar growth pattern, even though two studies reported no significant changes in disc position and arthrosis configuration. Concerning the incidence of TMD, it can be concluded from the available evidence that chin-cup therapy constitutes no risk factor for TMD. CONCLUSION Based on the available evidence, chin-cup therapy for Class III orthodontic anomaly seems to induce craniofacial adaptations.
Nevertheless, there are insufficient or low-quality data in the orthodontic literature to allow the formulation of clear statements regarding the influence of chin-cup treatment on the temporomandibular joint.
Abstract:
OBJECTIVE Parametrial involvement (PMI) is one of the most important factors influencing prognosis in locally advanced stage cervical cancer (LACC) patients. We aimed to evaluate the PMI rate among LACC patients undergoing neoadjuvant chemotherapy (NACT), thus evaluating the utility of parametrectomy in tailoring adjuvant treatment. METHODS Retrospective evaluation of 275 consecutive patients affected by LACC (stage IB2-IIB) undergoing NACT followed by type C/class III radical hysterectomy. Basic descriptive statistics and univariate and multivariate analyses were applied in order to identify factors predicting PMI. Survival outcomes were assessed using Kaplan-Meier and Cox models. RESULTS PMI was detected in 37 (13%) patients: it was associated with vaginal involvement in 10 (4%), lymph node positivity in 5 (2%), and both in 12 (4%) patients, while PMI alone was observed in only 10 (4%) patients. In this latter group, adjuvant treatment was delivered in 3 (1%) patients on the basis of PMI alone, while the remaining patients had other characteristics driving adjuvant treatment. Considering factors predicting PMI, we observed that only suboptimal pathological response (OR: 1.11; 95% CI: 1.01, 1.22) and vaginal involvement (OR: 1.29; 95% CI: 1.17, 1.44) were independently associated with PMI. PMI did not correlate with survival (HR: 2.0; 95% CI: 0.82, 4.89), while clinical response to NACT (HR: 3.35; 95% CI: 1.59, 7.04), vaginal involvement (HR: 2.38; 95% CI: 1.12, 5.02) and lymph node positivity (HR: 3.47; 95% CI: 1.62, 7.41) independently correlated with worse survival outcomes. CONCLUSIONS Our data suggest that PMI had a limited role in the choice to administer adjuvant treatment, thus supporting the potential adoption of less radical surgery in LACC patients undergoing NACT. Further prospective studies are warranted.
Abstract:
Obesity is a complex multifactorial disease and a public health priority. Perilipin coats the surface of lipid droplets in adipocytes and is believed to stabilize these lipid bodies by protecting triglyceride from early lipolysis. This research project evaluated the association between genetic variation within the human perilipin (PLIN) gene and obesity-related quantitative traits and disease-related phenotypes in Non-Hispanic White (NHW) and African American (AA) participants from the Atherosclerosis Risk in Communities (ARIC) Study. Multivariate linear regression, multivariate logistic regression, and Cox proportional hazards models evaluated the association between single gene variants (rs2304794, rs894160, rs8179071, and rs2304795) and multilocus variation (rs894160 and rs2304795) within the PLIN gene and both obesity-related quantitative traits (body weight, body mass index [BMI], waist girth, waist-to-hip ratio [WHR], estimated percent body fat, and plasma total triglycerides) and disease-related phenotypes (prevalent obesity, metabolic syndrome [MetS], prevalent coronary heart disease [CHD], and incident CHD). Single variant analyses were stratified by race and by gender within race, while multilocus analyses were stratified by race. Single variant analyses revealed that rs2304794 and rs894160 were significantly related to plasma triglyceride levels in all NHWs and in NHW women. Among AA women, variant rs8179071 was associated with triglyceride levels and rs2304794 was associated with risk-raising waist circumference (>0.8 in women). The multilocus effects of variants rs894160 and rs2304795 were significantly associated with body weight, waist girth, WHR, estimated percent body fat, class II obesity (BMI ≥ 35 kg/m2), class III obesity (BMI ≥ 40 kg/m2), and risk-raising WHR (>0.9 in men and >0.8 in women) in AAs.
Variant rs2304795 was significantly related to prevalent MetS among AA males and to prevalent CHD in NHW women; multilocus effects of the PLIN gene were associated with prevalent CHD among NHWs. Rs2304794 was associated with incident CHD in the absence of MetS among AAs. These findings support the hypothesis that variation within the PLIN gene influences obesity-related traits and disease-related phenotypes. Understanding these effects of the PLIN genotype on the development of obesity can potentially lead to more effective, tailored health promotion interventions.
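The obesity classes referenced in this abstract follow the standard BMI cut-offs; a minimal sketch (the class I boundary of 30 kg/m2 is assumed from standard usage, since the abstract names only classes II and III, and the function name is illustrative):

```python
def obesity_class(bmi: float) -> str:
    """Standard BMI (kg/m^2) obesity classes: class I 30-34.9,
    class II 35-39.9, class III >= 40."""
    if bmi < 30:
        return "not obese"
    if bmi < 35:
        return "class I"
    if bmi < 40:
        return "class II"
    return "class III"
```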
Abstract:
Pulmonary fibrosis (PF) is the result of a variety of environmental and cancer-treatment-related insults and is characterized by excessive deposition of collagen. Gas exchange in the alveoli is impaired as the normal lung becomes dense and collapsed, leading to a loss of lung volume. It is now accepted that lung injury and fibrosis are in part genetically regulated. Bleomycin is a chemotherapeutic agent used for testicular cancer and lymphomas that induces significant pulmonary toxicity. We delivered bleomycin to mice subcutaneously via a mini-osmotic pump in order to elicit lung injury (LI) and quantified the %LI morphometrically using video imaging software. We previously identified a quantitative trait locus, Blmpf-1 (LOD = 17.4), in the Major Histocompatibility Complex (MHC), but the exact genetic components involved have remained unknown. In the current studies, Blmpf-1 was narrowed to an interval spanning 31.9-32.9 Mb on chromosome 17 using MHC congenic mice. This region includes the MHC Class II and III genes and is flanked by the TNF-alpha super locus and the MHC Class I genes. Knockout mice of MHC Class I genes (B2mko), MHC Class II genes (Cl2ko), and TNF-alpha (TNF-/-) and its receptors (p55-/-, p75-/-, and p55/p75-/-) were treated with bleomycin in order to ascertain the role of these genes in the pathogenesis of lung injury. Cl2ko mice had significantly better survival and %LI when compared to treated background BL/6 mice (B6, P<.05). In contrast, B2mko mice showed no differences in survival or %LI compared to B6. This suggests that the MHC Class II locus contains susceptibility genes for bleomycin-induced lung injury. TNF-alpha, a Class III gene, was examined, and it was found that TNF-/- and p55-/- mice had higher %LI and lower survival when compared to B6 (P<.05). In contrast, p75-/- mice had significantly reduced %LI compared to TNF-/-, p55-/-, and B6 mice, as well as higher survival (P<.01). These data contradict the current paradigm that TNF-alpha is a profibrotic mediator of lung injury and suggest a novel and distinct role for the p55 and p75 receptors in mediating lung injury.
Abstract:
Mammalian cells express seven β-tubulin isotypes in a tissue-specific manner. This has long fueled the speculation that different isotypes carry out different functions. To provide direct evidence for their functional significance, class III, IVa, and VI β-tubulin cDNAs were cloned into a tetracycline-regulated expression vector, and stably transfected Chinese hamster ovary cell lines expressing different levels of ectopic β-tubulin were compared for effects on microtubule organization, microtubule assembly, and sensitivity to antimitotic drugs. It was found that all three isotypes coassembled with endogenous β-tubulin. βVI expression caused distinct microtubule rearrangements, including microtubule dissociation from the centrosome and accumulation at the cell periphery, whereas expression of βIII and βIVa caused no observable changes in the interphase microtubule network. Overexpression of all three isotypes caused spindle malformation and mitotic defects. Both βIII and βIVa disrupted microtubule assembly in proportion to their abundance and thereby conferred supersensitivity to microtubule-depolymerizing drugs. In contrast, βVI stabilized microtubules at low stoichiometry and thus conferred resistance to many microtubule-destabilizing drugs, but not vinblastine. The three isotypes caused differing responses to microtubule-stabilizing drugs. Expression of βIII conferred paclitaxel resistance, while βVI did not. Low expression of βIVa caused supersensitivity to paclitaxel, whereas higher expression resulted in the loss of supersensitivity. The results suggest that βIVa may possess an enhanced ability to bind paclitaxel that increases sensitivity to the drug and acts substoichiometrically. At high levels of βIVa expression, however, microtubule-disruptive effects counteract the assembly-promoting pressure exerted by increased paclitaxel binding, and drug supersensitivity is lost.
From this study, I concluded that β-tubulin isotypes behave differently from each other in terms of microtubule organization, microtubule assembly and dynamics, and antimitotic drug sensitivity. The isotype composition of a cell can impart subtle to dramatic effects on the properties of microtubules, leading to potential functional consequences and opening the opportunity to exploit differences in microtubule isotype composition for therapeutic gain.
Abstract:
Baseline elevation of troponin I (TnI) has been associated with worse outcomes in heart failure (HF). However, the prevalence of persistent TnI elevation and its association with clinical outcomes have not been well described. HF is a major public health issue due to its wide prevalence, and prognosticators of this condition will have a significant impact on public health. Methods: A retrospective study was performed in 510 patients with an initial HF admission between 2002 and 2004, and all subsequent hospital admissions up to May 2009 were recorded in a de-identified database. Persistent TnI elevation was defined as a level ≥0.05 ng/ml on ≥3 HF admissions. Baseline characteristics, hospital readmissions and all-cause mortality were compared between patients with persistent TnI elevation ('Persistent'), patients with no persistence of TnI ('Nonpersistent') and patients who had fewer than three hospital admissions ('Admissions <3'). The same data were also analyzed using the mean method, in which the mean of all recorded troponin values for each patient was used to define persistence, i.e. patients with a mean troponin level ≥0.05 ng/ml were classified as persistent. Results: Mean age of the cohort was 68.4 years; 99.6% of subjects were male, 62.4% had ischemic HF, 78.2% had NYHA class III to IV HF, and mean LVEF was 25.9%. Persistent elevation of TnI was seen in 26% of the cohort and in 66% of patients with more than 3 hospital admissions. Mean TnI level was 0.67 ± 0.15 ng/ml in the 'Persistent' group. Mean TnI using the mean method was 1.11 ± 7.25 ng/ml. LVEF was significantly lower in the 'Persistent' group. Hypertension, diabetes, chronic renal insufficiency and mean age did not differ between the two groups.
'Persistent' patients had higher mortality (HR = 1.26, 95% CI = 0.89–1.78, p = 0.199 unadjusted, and HR = 1.29, 95% CI = 0.89–1.86, p = 0.176 adjusted for race, LVEF and ischemic etiology). The HR for mortality in persistent patients was 1.99 (95% CI = 1.06–3.73, p = 0.03) using the mean method. In those with ischemic cardiomyopathy, the corresponding results were HR = 1.44 (95% CI = 0.92–2.26, p = 0.113) and, using the mean method, HR = 1.89 (95% CI = 1.01–3.55, p = 0.046). Two out of three patients with HF who were readmitted three or more times had persistent elevation of troponin I levels. Patients with chronic persistence of troponin I elevation showed a trend towards lower survival compared with patients without chronic persistence, but this did not reach statistical significance. This trend was more pronounced among ischemic than non-ischemic patients, but again did not reach statistical significance. With the mean method, patients with chronic persistence of troponin I elevation had significantly lower survival than those without it, and ischemic patients had significantly lower survival than non-ischemic patients.
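The two persistence definitions compared in this study can be written down directly; a minimal sketch (threshold and admission count taken from the abstract, function names illustrative):

```python
def persistent_by_admissions(tni_by_admission, threshold=0.05, min_admissions=3):
    """Primary definition: TnI >= 0.05 ng/ml on >= 3 HF admissions."""
    return sum(v >= threshold for v in tni_by_admission) >= min_admissions

def persistent_by_mean(tni_by_admission, threshold=0.05):
    """'Mean method': the mean of all recorded TnI values is >= 0.05 ng/ml."""
    return sum(tni_by_admission) / len(tni_by_admission) >= threshold
```

The two definitions can disagree for the same patient, which is consistent with the differing hazard ratios reported above.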
Abstract:
The Food and Drug Administration (FDA) and the Centers for Medicare and Medicaid Services (CMS) play key roles in making Class III medical devices available to the public, and both are required by law to meet statutory deadlines for applications under review. Historically, both agencies have failed to meet their respective statutory requirements. Because these failures affect patient access and may adversely impact public health, Congress has enacted several “modernization” laws. However, the effectiveness of these modernization laws for Class III medical devices has not been adequately studied. The aim of this research study was, therefore, to analyze how these modernization laws may have affected public access to medical devices. Two questions were addressed: (1) How have the FDA modernization laws affected the time to approval for medical device premarket approval applications (PMAs)? (2) How has the CMS modernization law affected the time to approval for national coverage decisions (NCDs)? The data for this research study were collected from publicly available databases for the period January 1, 1995, through December 31, 2008. These dates were selected to capture a sufficient period to measure pre- and post-modernization effects on time to approval. All records containing original PMAs were obtained from the FDA database, and all records containing NCDs were obtained from the CMS database. Source documents, including FDA premarket approval letters and CMS national coverage decision memoranda, were reviewed to obtain additional data not found in the search results. Analyses were conducted to determine the effects of the pre- and post-modernization laws on time to approval. Secondary analyses of FDA subcategories were conducted to uncover causal factors that might explain differences in time to approval and to compare with the primary trends.
The primary analysis showed that the FDA modernization laws of 1997 and 2002 initially reduced PMA time to approval; after the 2002 law, however, time to approval began increasing and continued to increase through December 2008. The non-combined subcategory approval trends were similar to the primary trends. The combined subcategory analysis showed no clear trends, with the exception of non-implantable devices, for which time to approval trended down after 1997. The CMS modernization law of 2003 reduced NCD time to approval, a trend that continued through December 2008. This study also showed that approximately 86% of PMA devices do not receive NCDs. Based on these findings, recommendations are offered to help resolve statutory non-compliance and access issues: (1) authorities should examine underlying causal factors for the observed trends; (2) process improvements should be made to better coordinate FDA and CMS activities, including sharing data, reducing duplication, and establishing clear criteria for “safe and effective” and “reasonable and necessary”; (3) a common identifier should be established to allow tracking and trending of applications between the FDA and CMS databases; (4) statutory requirements may need to be revised; and (5) an investigation should be undertaken to determine why NCDs are not issued for the majority of PMAs. Any process improvements should be made without creating additional safety risks or adversely impacting public health. Finally, additional studies are needed to fully characterize and better understand the trends identified in this research study.
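The era-based comparison described above (pre-1997 vs. 1997–2002 vs. post-2002) amounts to grouping each application by its filing date and averaging time to approval within each era. A minimal sketch of that grouping, assuming illustrative era boundary dates (only the existence of the 1997 and 2002 laws comes from the abstract; the exact cutoff dates and all records here are hypothetical):

```python
# Illustrative sketch: group PMA records by modernization era and compare
# mean time to approval. Era boundary dates are assumptions for the sketch.
from datetime import date

ERAS = [(date(1995, 1, 1), "pre-1997"),
        (date(1997, 11, 21), "1997-2002"),   # assumed 1997 law cutoff
        (date(2002, 10, 26), "post-2002")]   # assumed 2002 law cutoff

def era_of(filed):
    """Return the era label for a filing date (latest era whose start <= filed)."""
    label = ERAS[0][1]
    for start, name in ERAS:
        if filed >= start:
            label = name
    return label

def mean_days_by_era(records):
    """records: list of (filed_date, approved_date) tuples."""
    buckets = {}
    for filed, approved in records:
        buckets.setdefault(era_of(filed), []).append((approved - filed).days)
    return {era: sum(d) / len(d) for era, d in buckets.items()}
```

A design note: grouping by *filing* date rather than approval date keeps each application in the regulatory regime it was submitted under, which is the comparison the study's questions imply.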
Abstract:
Chronic β-blocker treatment improves survival and left ventricular ejection fraction (LVEF) in patients with systolic heart failure (HF). Whether the improvement in LVEF after β-blocker therapy is sustained long term, or whether LVEF declines after an initial gain, is not known. Our study sought to determine the prevalence and prognostic role of a secondary decline in LVEF in chronic systolic HF patients on β-blocker therapy and to characterize these patients. A retrospective chart review of HF hospitalizations fulfilling the Framingham criteria was performed at the MEDVAMC between April 2000 and June 2006. Follow-up vital status and recurrent hospitalizations were ascertained until May 2010. Three groups of patients were identified based on LVEF response to β-blockers: group A, with a secondary decline in LVEF following an initial increase; group B, with a progressive increase in LVEF; and group C, with a progressive decline in LVEF. Covariate-adjusted Cox proportional hazards models were used to examine differences in HF re-hospitalizations and all-cause mortality between the groups. Twenty-five percent of patients (n=27) had a secondary decline in LVEF following an initial gain. The baseline, peak, and final LVEF in this group were 27.6±12%, 40.1±14%, and 27.4±13%, respectively. The mean nadir LVEF after the decline was 27.4±13%, and the decline occurred at a mean interval of 2.8±1.9 years from β-blocker initiation. These patients were older, more likely to be white, and had advanced heart failure (NYHA class III/IV) more often of non-ischemic etiology compared with groups B and C. They were also more likely to be treated with metoprolol (p=0.03) than the other two groups. No significant difference was observed in the combined risk of all-cause mortality and HF re-hospitalization (hazard ratio 0.80, 95% CI 0.47 to 1.38, p=0.42), nor in survival estimates between the groups.
In conclusion, a late decline in LVEF occurs in a significant proportion of heart failure patients treated with β-blockers, particularly those treated with metoprolol.
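The three-group classification described in this abstract (initial gain then decline, progressive increase, progressive decline) can be sketched from serial LVEF measurements. The grouping logic below is a plausible reading of the abstract's definitions, not the study's actual algorithm, and the example values are illustrative.

```python
# Hypothetical sketch of the A/B/C grouping by LVEF trajectory; the
# classification rule is inferred from the abstract's group definitions.

def classify_lvef_response(lvefs):
    """Classify a series of LVEF measurements (%) taken over time:
    'A' = secondary decline after an initial gain,
    'B' = progressive increase, 'C' = progressive decline."""
    baseline, peak, final = lvefs[0], max(lvefs), lvefs[-1]
    if peak > baseline and final < peak:
        return "A"  # initial gain, then secondary decline
    if final > baseline:
        return "B"  # progressive increase
    return "C"      # progressive decline

print(classify_lvef_response([28, 40, 27]))  # A (matches group A's 27.6/40.1/27.4 pattern)
print(classify_lvef_response([25, 30, 38]))  # B
print(classify_lvef_response([30, 26, 22]))  # C
```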
Abstract:
Background. Cancer cachexia is a common syndrome complex, occurring in nearly 80% of patients with advanced cancer and responsible for at least 20% of all cancer deaths. Cachexia results from increased resting energy expenditure, increased production of inflammatory mediators, and changes in lipid and protein metabolism. Non-steroidal anti-inflammatory drugs (NSAIDs), by virtue of their anti-inflammatory properties, may be protective against cancer-related cachexia. Because cachexia is also associated with increased hospitalizations, this outcome might likewise improve with NSAID exposure. Design. In this retrospective study, computerized records from 700 non-small cell lung cancer (NSCLC) patients were reviewed, and 487 (69.57%) were included in the final analyses. Exclusion criteria were severe chronic obstructive pulmonary disease, significant peripheral edema, class III or IV congestive heart failure, liver failure, other causes of weight loss, or use of research or anabolic medications. Information on medication history, body weight, and hospitalizations was collected from one year pre-diagnosis until three years post-diagnosis. NSAID exposure was defined as a history of treatment with NSAIDs for at least 50% of any given year in the observation period. We used t-tests and chi-square tests for statistical analyses. Results. Neither the proportion of patients with cachexia (p=0.27) nor the number of hospitalizations (p=0.74) differed between those with a history of NSAID use (n=92) and those without (n=395). Conclusions. In this study, NSAID exposure was not significantly associated with weight loss or hospital admissions in patients with NSCLC. Further studies may be needed to confirm these observations.
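The chi-square comparison of cachexia proportions between exposure groups reduces to a 2x2 contingency test. A minimal self-contained sketch (the function name and all counts are hypothetical; only the exposed/unexposed group structure comes from the abstract):

```python
# Minimal Pearson chi-square (1 df) for a 2x2 table such as
# cachexia yes/no by NSAID exposure; counts here are illustrative.
import math

def chi_square_2x2(a, b, c, d):
    """Return (statistic, p-value) for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # For 1 df, chi2 is Z^2, so P(X > x) = P(|Z| > sqrt(x)) = erfc(sqrt(x/2))
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Identical proportions in both groups give statistic 0, p = 1.
print(chi_square_2x2(10, 10, 20, 20))  # (0.0, 1.0)
```

Note this sketch omits the Yates continuity correction that some implementations apply to 2x2 tables with small expected counts.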