512 results for Diagnostic techniques and procedures


There is little literature about the clinical presentation and time-course of postoperative venous thromboembolism (VTE) after different surgical procedures. RIETE is an ongoing, prospective registry of consecutive patients with objectively confirmed, symptomatic acute VTE. In this analysis, we examined the baseline characteristics, thromboprophylaxis and therapeutic patterns, time-course, and three-month outcome of all patients with postoperative VTE. As of January 2006, there were 1,602 patients with postoperative VTE in RIETE: 393 (25%) after major orthopaedic surgery (145 elective hip arthroplasty, 126 knee arthroplasty, 122 hip fracture); 207 (13%) after cancer surgery; and 1,002 (63%) after other procedures. The percentage of patients presenting with clinically overt pulmonary embolism (PE) (48%, 48%, and 50%, respectively), the average time elapsed from surgery to VTE (22 ± 16, 24 ± 16, and 21 ± 15 days, respectively), and the three-month incidence of fatal PE (1.3%, 1.4%, and 0.8%, respectively), fatal bleeding (0.8%, 1.0%, and 0.2%, respectively), or major bleeding (2.3%, 2.9%, and 2.8%, respectively) were similar in the three groups. However, the percentage of patients who had received thromboprophylaxis (96%, 76%, and 52%, respectively), the duration of prophylaxis (17 ± 9.6, 13 ± 8.9, and 12 ± 11 days, respectively), and the mean daily doses of low-molecular-weight heparin (4,252 ± 1,016, 3,260 ± 1,141, and 3,769 ± 1,650 IU, respectively) were significantly lower in those undergoing cancer surgery or other procedures. In conclusion, the clinical presentation, time-course, and three-month outcome of VTE were similar among the different subgroups of patients, but the use of prophylaxis in patients undergoing cancer surgery or other procedures was suboptimal.
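
The cohort breakdown above can be checked arithmetically. A short sketch (plain Python, counts taken from the abstract) verifies that the three surgical groups sum to the registry total and reproduce the quoted percentages:

```python
# Counts of postoperative VTE patients by surgical group, as reported in RIETE.
groups = {
    "major orthopaedic surgery": 393,
    "cancer surgery": 207,
    "other procedures": 1002,
}

total = sum(groups.values())
print(total)  # 1602, the registry total quoted above

# Rounded percentages reproduce the quoted 25%, 13%, and 63%.
for name, n in groups.items():
    print(name, round(100 * n / total))
```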

BACKGROUND Illiteracy, a universal problem, limits the use of the most widely employed short cognitive tests. Our objective was to assess and compare the effectiveness and cost of three short cognitive tests applicable to illiterate individuals for the screening of cognitive impairment (CI) and dementia (DEM). METHODS A phase III diagnostic test evaluation study was performed over one year in four primary care centers, prospectively including individuals with suspected CI or DEM. All underwent the Eurotest, the Memory Alteration Test (M@T), and the Phototest, applied in a balanced order. Clinical, functional, and cognitive studies were independently performed in a blinded fashion in a Cognitive Behavioral Neurology Unit, and the gold-standard diagnosis was established by consensus of expert neurologists on the basis of these results. Effectiveness was assessed as the proportion of correct diagnoses (diagnostic accuracy [DA]) and the kappa index of concordance (k) with respect to the gold-standard diagnoses. Costs were based on public prices at the time and on hospital accounts. RESULTS The study included 139 individuals: 47 with DEM, 36 with CI, and 56 without CI. No significant differences in effectiveness were found among the tests. For DEM screening: Eurotest (k = 0.71 [0.59-0.83], DA = 0.87 [0.80-0.92]), M@T (k = 0.72 [0.60-0.84], DA = 0.87 [0.80-0.92]), Phototest (k = 0.70 [0.57-0.82], DA = 0.86 [0.79-0.91]). For CI screening: Eurotest (k = 0.67 [0.55-0.79], DA = 0.83 [0.76-0.89]), M@T (k = 0.52 [0.37-0.67], DA = 0.80 [0.72-0.86]), Phototest (k = 0.59 [0.46-0.72], DA = 0.79 [0.71-0.86]). There were no differences in the cost of DEM screening, but the cost of CI screening was significantly higher with M@T (330.7 ± 177.1 €, mean ± SD) than with Eurotest (294.1 ± 195.0 €) or Phototest (296.0 ± 196.5 €). Application time was shorter with Phototest (2.8 ± 0.8 min) than with Eurotest (7.1 ± 1.8 min) or M@T (6.8 ± 2.2 min).
CONCLUSIONS Eurotest, M@T, and Phototest are equally effective. Eurotest and Phototest are the less expensive options, and Phototest is the most efficient, requiring the shortest application time.
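
The two effectiveness measures used in this study, diagnostic accuracy and the kappa index of concordance, can be computed directly from a 2×2 confusion matrix of test versus gold-standard diagnoses. A minimal Python sketch follows; the counts in the example are illustrative, not the study's data.

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Proportion of test diagnoses that match the gold standard (DA)."""
    return (tp + tn) / (tp + fp + fn + tn)

def cohens_kappa(tp, fp, fn, tn):
    """Chance-corrected agreement (k) between test and gold standard."""
    n = tp + fp + fn + tn
    po = (tp + tn) / n                           # observed agreement
    p_pos = ((tp + fp) / n) * ((tp + fn) / n)    # chance agreement on "positive"
    p_neg = ((fn + tn) / n) * ((fp + tn) / n)    # chance agreement on "negative"
    pe = p_pos + p_neg                           # total expected chance agreement
    return (po - pe) / (1 - pe)

# Illustrative counts for a cohort of 139: 40 true positives, 8 false positives,
# 7 false negatives, 84 true negatives.
print(round(diagnostic_accuracy(40, 8, 7, 84), 2))  # 0.89
print(round(cohens_kappa(40, 8, 7, 84), 2))         # 0.76
```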

Assigning causality in drug-induced liver injury is challenging, particularly when more than one drug could be responsible. We report the case of a woman on long-term raloxifene therapy who developed acute cholestasis shortly after starting fenofibrate. The picture evolved into chronic cholestasis. We hypothesized that an interaction at the metabolic level could have triggered the presentation of hepatotoxicity after a very short exposure to fenofibrate in this patient. The finding of overexpression of vascular endothelial growth factor in the liver biopsy suggests that angiogenesis might play a role in the persistence of toxic cholestasis.

The goal of our study was to assess the diagnostic yield of procalcitonin (PCT) in septic shock compared with another biomarker, C-reactive protein (CRP). Results: Fifty-four septic patients were assessed; 66% were male; mean age, 63 years. Eighty-eight percent were diagnosed with septic shock and 11% with severe sepsis. Seventy-six percent were medical patients. Blood cultures were positive in 42.5%. Origin of sepsis: respiratory 46%, neurological 5%, digestive 37%, and urinary 3%. The average SOFA score was 10.4. Conclusions: PCT and CRP have the same efficiency in the early diagnosis of sepsis. The diagnostic efficiency of PCT and CRP combined is significant but small. We suggest using both when sepsis is suspected.

Temporomandibular joint disc disorders are highly prevalent in adult populations. Autologous chondrocyte implantation is a well-established method for the treatment of several chondral defects. However, very few studies have been carried out using human fibrous chondrocytes from the temporomandibular joint (TMJ). One of the main drawbacks associated with chondrocyte cell culture is that chondrocytes kept in culture tend to de-differentiate and to lose viability under in vitro conditions. In this work, we isolated human temporomandibular joint fibrochondrocytes (TMJF) from the human disc and used a highly sensitive technique to determine the cell viability, cell proliferation, and gene expression of nine consecutive cell passages, in order to identify the most appropriate cell passage for use in tissue engineering and future clinical applications. Our results revealed that the most viable and functional cell passages were P5-P6, in which an adequate equilibrium exists between cell viability and the capability to synthesize all major extracellular matrix components. The combined action of pro-apoptotic (TRAF5, PHLDA1) and anti-apoptotic (SON, HTT, FAIM2) genes may explain the differential cell viability levels found in this study. These results suggest that TMJF should be used at P5-P6 for cell therapy protocols.

We investigated the effects of uninephrectomy (UNX) in 6-week-old male and female rats on blood pressure (BP), renal sodium handling, salt sensitivity, oxidative stress, and renal injury over 18 months post-surgery. Control sham-operated and UNX rats were studied at 6, 12, and 18 months post-surgery, evaluating renal sodium handling, BP, urinary isoprostanes, N-acetyl-β-D-glucosaminidase, and proteinuria before and after a 2-week high-salt intake period. At 18 months, plasma variables were measured and kidney samples were taken for the analysis of renal morphology and tissue variables. BP was increased versus controls at 6 months in male UNX rats and at 12 and 18 months in both male and female UNX rats, and was higher in male than in female UNX groups at 18 months. UNX did not affect water and sodium excretion under basal conditions or after the different tests in male and female rats at any age. However, the renal function curve was shifted to the right in both male and female UNX rats. High-salt intake increased BP in both UNX groups at 6, 12, and 18 months and in the female control group at 18 months, and it increased proteinuria, N-acetyl-β-D-glucosaminidase, and isoprostanes in both UNX groups throughout the study. Renal lesions at 18 months were more severe in male than in female UNX rats. In summary, long-term UNX increased BP, creatinine, proteinuria, pathological signs of renal injury, and salt sensitivity. BP elevation appeared earlier, and morphological lesions were more severe, in male than in female UNX rats.

Advances in clinical virology for detecting respiratory viruses have focused on nucleic acid amplification techniques, which have become the reference method for the diagnosis of acute respiratory infections of viral aetiology. Improvements of current commercial molecular assays to reduce hands-on time rely on two strategies: stepwise automation (semi-automation) and complete automation of the whole procedure. Contributions to the former strategy have been the use of automated nucleic acid extractors, multiplex PCR, real-time PCR and/or DNA arrays for the detection of amplicons. Commercial fully automated molecular systems are now available for the detection of respiratory viruses. Some of them could become point-of-care methods, replacing antigen tests for the detection of respiratory syncytial virus and influenza A and B viruses. This article describes laboratory methods for the detection of respiratory viruses. A cost-effective and rational diagnostic algorithm is proposed, considering the technical aspects of the available assays, the infrastructure of each laboratory, and clinical-epidemiological factors of the infection.

Perioperative anaemia, with iron deficiency as its leading cause, is a frequent condition among surgical patients and has been linked to increased postoperative morbidity and mortality and decreased quality of life. Postoperative anaemia is even more frequent and is mainly caused by perioperative blood loss, aggravated by inflammation-induced blunting of erythropoiesis. Allogeneic transfusion is commonly used to treat acute perioperative anaemia, but it also increases the rate of morbidity and mortality in surgical and critically ill patients. Thus, overall concerns about the adverse effects of both preoperative anaemia and allogeneic transfusion have prompted the review of transfusion practice and the search for safer and more biologically rational treatment options. In this paper, the role of intravenous iron therapy (mostly with iron sucrose and ferric carboxymaltose) as a safe and efficacious tool for treating anaemia and reducing transfusion requirements in surgical patients, as well as in other medical areas, is reviewed. From the analysis of published data, and despite the lack of high-quality evidence in some areas, it seems fair to conclude that perioperative intravenous iron administration, with or without erythropoiesis-stimulating agents, is safe, results in lower transfusion requirements, and hastens recovery from postoperative anaemia. In addition, some studies have reported decreased rates of postoperative infection and mortality, and shorter lengths of hospital stay, in surgical patients receiving intravenous iron.

BACKGROUND Although Hodgkin's lymphoma is a highly curable disease with modern chemotherapy protocols, some patients are primary refractory or relapse after first-line chemotherapy or even after high-dose therapy and autologous stem cell transplantation. We investigated the potential role of allogeneic stem cell transplantation in this setting. DESIGN AND METHODS In this phase II study 92 patients with relapsed Hodgkin's lymphoma and an HLA-identical sibling, a matched unrelated donor or a one antigen mismatched, unrelated donor were treated with salvage chemotherapy followed by reduced intensity allogeneic transplantation. Fourteen patients showed refractory disease and died from progressive lymphoma with a median overall survival after trial entry of 10 months (range, 6-17). Seventy-eight patients proceeded to allograft (unrelated donors, n=23). Fifty were allografted in complete or partial remission and 28 in stable disease. Fludarabine (150 mg/m(2) iv) and melphalan (140 mg/m(2) iv) were used as the conditioning regimen. Anti-thymocyte globulin was additionally used as graft-versus-host-disease prophylaxis for recipients of grafts from unrelated donors. RESULTS The non-relapse mortality rate was 8% at 100 days and 15% at 1 year. Relapse was the major cause of failure. The progression-free survival rate was 47% at 1 year and 18% at 4 years from trial entry. For the allografted population, the progression-free survival rate was 48% at 1 year and 24% at 4 years. Chronic graft-versus-host disease was associated with a lower incidence of relapse. Patients allografted in complete remission had a significantly better outcome. The overall survival rate was 71% at 1 year and 43% at 4 years. CONCLUSIONS Allogeneic stem cell transplantation can result in long-term progression-free survival in heavily pre-treated patients with Hodgkin's lymphoma. 
The reduced intensity conditioning approach significantly reduced non-relapse mortality; the high relapse rate represents the major remaining challenge in this setting. The HDR-Allo trial was registered in the European Clinical Trials Database (EUDRACT, https://eudract.ema.europa.eu/) with number 02-0036.

In Europe, the combination of plerixafor + granulocyte colony-stimulating factor is approved for the mobilization of hematopoietic stem cells for autologous transplantation in patients with lymphoma and myeloma whose cells mobilize poorly. The purpose of this study was to further assess the safety and efficacy of plerixafor + granulocyte colony-stimulating factor for front-line mobilization in European patients with lymphoma or myeloma. In this multicenter, open label, single-arm study, patients received granulocyte colony-stimulating factor (10 μg/kg/day) subcutaneously for 4 days; on the evening of day 4 they were given plerixafor (0.24 mg/kg) subcutaneously. Patients underwent apheresis on day 5 after a morning dose of granulocyte colony-stimulating factor. The primary study objective was to confirm the safety of mobilization with plerixafor. Secondary objectives included assessment of efficacy (apheresis yield, time to engraftment). The combination of plerixafor + granulocyte colony-stimulating factor was used to mobilize hematopoietic stem cells in 118 patients (90 with myeloma, 25 with non-Hodgkin's lymphoma, 3 with Hodgkin's disease). Treatment-emergent plerixafor-related adverse events were reported in 24 patients. Most adverse events occurred within 1 hour after injection, were grade 1 or 2 in severity and included gastrointestinal disorders or injection-site reactions. The minimum cell yield (≥ 2 × 10(6) CD34(+) cells/kg) was harvested in 98% of patients with myeloma and in 80% of those with non-Hodgkin's lymphoma in a median of one apheresis. The optimum cell dose (≥ 5 × 10(6) CD34(+) cells/kg for non-Hodgkin's lymphoma or ≥ 6 × 10(6) CD34(+) cells/kg for myeloma) was harvested in 89% of myeloma patients and 48% of non-Hodgkin's lymphoma patients. 
In this prospective, multicenter European study, mobilization with plerixafor + granulocyte colony-stimulating factor allowed the majority of patients with myeloma or non-Hodgkin's lymphoma to undergo transplantation with minimal toxicity, providing further data supporting the safety and efficacy of plerixafor + granulocyte colony-stimulating factor for front-line mobilization of hematopoietic stem cells in patients with non-Hodgkin's lymphoma or myeloma.
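
The yield criteria quoted above reduce to simple per-diagnosis thresholds: a universal minimum of ≥ 2 × 10⁶ CD34+ cells/kg, and an optimum dose of ≥ 5 × 10⁶ for non-Hodgkin's lymphoma or ≥ 6 × 10⁶ for myeloma. The sketch below encodes them; the function name and input layout are assumptions for illustration.

```python
MIN_YIELD = 2.0  # x10^6 CD34+ cells/kg, minimum harvest for any diagnosis

# Optimum dose differs by diagnosis (x10^6 CD34+ cells/kg), per the criteria above.
OPTIMUM = {"myeloma": 6.0, "NHL": 5.0}

def classify_yield(diagnosis: str, cd34_per_kg: float) -> str:
    """Classify an apheresis yield as 'optimum', 'minimum', or 'failed'."""
    if cd34_per_kg >= OPTIMUM[diagnosis]:
        return "optimum"
    if cd34_per_kg >= MIN_YIELD:
        return "minimum"
    return "failed"

print(classify_yield("myeloma", 6.4))  # optimum
print(classify_yield("NHL", 3.1))      # minimum (above 2, below the NHL optimum of 5)
print(classify_yield("myeloma", 1.2))  # failed
```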

INTRODUCTION Recurrence risk in breast cancer varies throughout the follow-up time. We examined whether these changes are related to the level of expression of the proliferation pathway and to intrinsic subtypes. METHODS Expression of estrogen and progesterone receptor, Ki-67, human epidermal growth factor receptor 2 (HER2), epidermal growth factor receptor (EGFR) and cytokeratin 5/6 (CK 5/6) was assessed on tissue microarrays constructed from a large and uniformly managed series of early breast cancer patients (N = 1,249). Subtype definitions by four biomarkers were as follows: luminal A (ER+ and/or PR+, HER2-, Ki-67 <14%), luminal B (ER+ and/or PR+, HER2-, Ki-67 ≥14%), HER2-enriched (any ER, any PR, HER2+, any Ki-67), triple-negative (ER-, PR-, HER2-, any Ki-67). Subtype definitions by six biomarkers were as follows: luminal A (ER+ and/or PR+, HER2-, Ki-67 <14%, any CK 5/6, any EGFR), luminal B (ER+ and/or PR+, HER2-, Ki-67 ≥14%, any CK 5/6, any EGFR), HER2-enriched (ER-, PR-, HER2+, any Ki-67, any CK 5/6, any EGFR), Luminal-HER2 (ER+ and/or PR+, HER2+, any Ki-67, any CK 5/6, any EGFR), Basal-like (ER-, PR-, HER2-, any Ki-67, CK 5/6+ and/or EGFR+), triple-negative nonbasal (ER-, PR-, HER2-, any Ki-67, CK 5/6-, EGFR-). Each four- or six-marker defined intrinsic subtype was divided into two groups, with Ki-67 <14% or Ki-67 ≥14%. The recurrence hazard rate function was determined for each intrinsic subtype as a whole and according to Ki-67 value. RESULTS Luminal A displayed a slow risk increase, reaching its maximum after three years and then remaining steady. Luminal B presented most of its relapses during the first five years. HER2-enriched tumors showed a peak of recurrence at nearly twenty months post-surgery, with a greater risk in tumors with Ki-67 ≥14%. However, a second peak occurred at 72 months, with a greater risk magnitude in tumors with Ki-67 <14%. Triple-negative tumors with a low proliferation rate displayed a smooth risk curve, whereas those with Ki-67 ≥14% showed a sharp peak at nearly 18 months.
CONCLUSIONS Each intrinsic subtype has a particular pattern of relapses over time, which changes depending on the level of activation of the proliferation pathway as assessed by Ki-67. These findings could have clinical implications both for adjuvant treatment trial design and for recommendations concerning the surveillance of patients.
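
The four-biomarker subtype definitions in the Methods amount to a deterministic decision rule. A minimal sketch follows; the input encoding (booleans for receptor status, Ki-67 as a percentage) is an assumption for illustration.

```python
def intrinsic_subtype(er: bool, pr: bool, her2: bool, ki67: float) -> str:
    """Assign the four-marker intrinsic subtype used in the study."""
    if her2:
        # HER2-enriched: HER2+, any ER, any PR, any Ki-67
        return "HER2-enriched"
    if er or pr:
        # Hormone-receptor-positive, HER2-: split by the Ki-67 14% cut-off
        return "luminal A" if ki67 < 14 else "luminal B"
    # ER-, PR-, HER2-, any Ki-67
    return "triple-negative"

print(intrinsic_subtype(er=True, pr=False, her2=False, ki67=8))    # luminal A
print(intrinsic_subtype(er=True, pr=True, her2=False, ki67=25))    # luminal B
print(intrinsic_subtype(er=False, pr=False, her2=True, ki67=30))   # HER2-enriched
print(intrinsic_subtype(er=False, pr=False, her2=False, ki67=40))  # triple-negative
```

The six-marker scheme would extend the same structure with CK 5/6 and EGFR status to separate Luminal-HER2, Basal-like, and triple-negative nonbasal cases.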

BACKGROUND Dermatologic surgeons routinely harvest pedicled flaps at a distance, with an axial or random pattern, to repair facial defects. These types of skin flaps are time-consuming and have high economic, social, and personal costs. These drawbacks could be avoided with the single-step transfer of free flaps to the recipient site with microvascular anastomosis. OBJECTIVE To demonstrate that better results are obtained with myocutaneous or fasciocutaneous free flaps, and to determine which is more suitable in surgical dermatology. MATERIAL AND METHODS We selected two patients of opposite sexes and similar ages who had undergone Mohs surgery to remove recurrent malignant tumors located in the upper cheek bordering the zygomatic zone. The woman was treated with a fasciocutaneous radial free flap and the man with a rectus abdominis free flap. RESULTS Both patients had excellent immediate postoperative outcomes. The complications observed in the male patient were related to a pre-existing pulmonary condition. The fasciocutaneous radial free flap reconstruction was easier to perform than the rectus abdominis free flap; nevertheless, the radial free flap is very thin and, even when the palmaris longus tendon is used, it does not yield enough volume, requiring the later use of implants. In contrast, the rectus abdominis free flap transfers a wide flap with enough fat tissue to expand in the future. As for the cosmetic results at the donor site, the rectus abdominis free flap produces better-looking scars, since secondary defects of the palmar surface cannot be closed directly and usually require grafting, a situation that some patients do not accept. CONCLUSIONS In surgical dermatology each case, once the tumor has been excised, requires its own reconstructive technique. The radial free flap is suitable for thin patients who are willing to cover the donor arm with a shirt.
The rectus abdominis free flap is best suited for obese patients with deep and voluminous defects, although it requires displacing the navel from its original position.

BACKGROUND Very few data exist on the clinical impact of permanent pacemaker implantation (PPI) after transcatheter aortic valve implantation. The objective of this study was to assess the impact of PPI after transcatheter aortic valve implantation on late outcomes in a large cohort of patients. METHODS AND RESULTS A total of 1556 consecutive patients without prior PPI undergoing transcatheter aortic valve implantation were included. Of them, 239 patients (15.4%) required a PPI within the first 30 days after transcatheter aortic valve implantation. At a mean follow-up of 22±17 months, no association was observed between the need for 30-day PPI and all-cause mortality (hazard ratio, 0.98; 95% confidence interval, 0.74-1.30; P=0.871), cardiovascular mortality (hazard ratio, 0.81; 95% confidence interval, 0.56-1.17; P=0.270), and all-cause mortality or rehospitalization for heart failure (hazard ratio, 1.00; 95% confidence interval, 0.77-1.30; P=0.980). A lower rate of unexpected (sudden or unknown) death was observed in patients with PPI (hazard ratio, 0.31; 95% confidence interval, 0.11-0.85; P=0.023). Patients with new PPI showed a poorer evolution of left ventricular ejection fraction over time (P=0.017), and new PPI was an independent predictor of left ventricular ejection fraction decrease at the 6- to 12-month follow-up (estimated coefficient, -2.26; 95% confidence interval, -4.07 to -0.44; P=0.013; R(2)=0.121). CONCLUSIONS The need for PPI was a frequent complication of transcatheter aortic valve implantation, but it was not associated with any increase in overall or cardiovascular death or rehospitalization for heart failure after a mean follow-up of ≈2 years. Indeed, 30-day PPI was a protective factor for the occurrence of unexpected (sudden or unknown) death. However, new PPI did have a negative effect on left ventricular function over time.

OBJECTIVE To describe and compare the consumption of the main groups and sub-groups of vegetables and fruits (V&F) in men and women from the centres participating in the European Prospective Investigation into Cancer and Nutrition (EPIC). DESIGN Cross-sectional analysis. Dietary intake was assessed by means of a 24-hour dietary recall using computerised interview software and standardised procedures. Crude and adjusted means were computed for the main groups and sub-groups of V&F by centre, separately for men and women. Adjusted means by season, day of the week and age were estimated using weights and covariance analysis. SETTING Twenty-seven centres in 10 European countries participating in the EPIC project. SUBJECTS In total, 35 955 subjects (13 031 men and 22 924 women), aged 35-74 years, randomly selected from each EPIC cohort. RESULTS The centres from southern countries had the highest consumption of V&F, while the lowest intake was seen in The Netherlands and Scandinavia for both genders. These differences were more evident for fruits, particularly citrus. However, slightly different patterns arose for some sub-groups of vegetables, such as root vegetables and cabbage. Adjustment for body mass index, physical activity, smoking habits and education did not substantially modify the mean intakes of vegetables and fruits. CONCLUSIONS Total vegetable and fruit intake follows a south-north gradient in both genders, whereas for several sub-groups of vegetables a different geographic distribution exists. Differences in mean intake of V&F by centre were not explained by lifestyle factors associated with V&F intake.

Introduction: The high prevalence of disease-related hospital malnutrition justifies the need for screening tools and the early detection of patients at risk of malnutrition, followed by an assessment targeted towards diagnosis and treatment. At the same time, there is clear under-coding of malnutrition diagnoses and of the procedures to correct it. Objectives: To describe the INFORNUT® program/process and its development as an information system; to quantify performance in its different phases; to cite other tools used as a coding source; to calculate the coding rates for malnutrition diagnoses and related procedures; to show the relationship to mean stay, mortality rate, and urgent readmission; and to quantify its impact on the hospital complexity index and its effect on the justification of hospitalization costs. Material and methods: The INFORNUT® process is based on an automated screening program for the systematic detection and early identification of malnourished patients on hospital admission, together with their assessment, diagnosis, documentation and reporting. Of all admissions with stays longer than three days in 2008 and 2010, we recorded patients who underwent analytical screening with an alert for a medium or high risk of malnutrition, as well as the subgroup of patients in whom the complete INFORNUT® process could be administered, generating a report for each.
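
An analytical alert of the kind the INFORNUT® process automates can be sketched as a simple rule over routine admission labs. The analytes and cut-offs below are illustrative assumptions only, not the actual INFORNUT® criteria.

```python
def malnutrition_alert(albumin_g_dl: float,
                       total_cholesterol_mg_dl: float,
                       lymphocytes_per_ul: float) -> str:
    """Return 'high', 'medium', or 'low' malnutrition risk from admission labs.

    Cut-offs are illustrative placeholders, not the INFORNUT(R) thresholds.
    """
    flags = 0
    if albumin_g_dl < 3.0:
        flags += 1
    if total_cholesterol_mg_dl < 140:
        flags += 1
    if lymphocytes_per_ul < 1200:
        flags += 1
    if flags >= 2:
        return "high"
    if flags == 1:
        return "medium"
    return "low"

# Patients flagged 'medium' or 'high' would proceed to the full assessment,
# diagnosis, documentation, and reporting steps described above.
print(malnutrition_alert(2.6, 120, 900))   # high
print(malnutrition_alert(3.4, 130, 1500))  # medium
print(malnutrition_alert(4.1, 180, 2000))  # low
```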