Abstract:
The ATLS program of the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012). Values given for Classes I / II / III / IV:
- Blood loss (ml): up to 750 / 750–1500 / 1500–2000 / >2000
- Blood loss (% blood volume): up to 15% / 15–30% / 30–40% / >40%
- Pulse rate (bpm): <100 / 100–120 / 120–140 / >140
- Systolic blood pressure: normal / normal / decreased / decreased
- Pulse pressure: normal or increased / decreased / decreased / decreased
- Respiratory rate: 14–20 / 20–30 / 30–40 / >35
- Urine output (ml/h): >30 / 20–30 / 5–15 / negligible
- CNS/mental status: slightly anxious / mildly anxious / anxious, confused / confused, lethargic
- Initial fluid replacement: crystalloid / crystalloid / crystalloid and blood / crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated according to the injuries in nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min⁻¹ postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily proceed in parallel, as suggested by the ATLS shock classification [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II, and only 1% as Class III and Class IV, respectively. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss into the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or of >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.
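Purely as an illustration of how rigid the Table 1 bands are, the class boundaries can be written as a simple threshold lookup. This is a minimal sketch classifying by estimated blood-loss percentage alone; the function name and interface are our own assumptions, not part of ATLS:

```python
# Illustrative only: the Table 1 blood-loss bands as a threshold lookup.
# The function and its interface are a sketch, not part of the ATLS program.

def atls_shock_class(blood_loss_pct: float) -> str:
    """Map estimated blood loss (% of blood volume) to an ATLS class."""
    if blood_loss_pct <= 15:
        return "Class I"
    if blood_loss_pct <= 30:
        return "Class II"
    if blood_loss_pct <= 40:
        return "Class III"
    return "Class IV"

print(atls_shock_class(25))  # -> Class II
```

The editorial's point, of course, is that real patients frequently cross these bands in ways such a rigid lookup cannot capture.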
As in the retrospective studies, Lawton et al. did not find a statistically significant difference in heart rate and blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement of fluid substitution. This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation. Nevertheless, the classification was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6]:
- Normal (no haemorrhage): vitals normal; response to fluid bolus (1000 ml) not applicable; no estimated blood loss.
- Class I (mild): vitals normal; responds to fluid bolus, no further fluid required; estimated blood loss up to 750 ml.
- Class II (moderate): HR >100 with SBP >90 mmHg; responds to fluid bolus, no further fluid required; estimated blood loss 750–1500 ml.
- Class III (severe): SBP <90 mmHg; requires repeated fluid boluses; estimated blood loss 1500–2000 ml.
- Class IV (moribund): SBP <90 mmHg or imminent arrest; declining SBP despite fluid boluses; estimated blood loss >2000 ml.

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: a decrease in cardiac output, an increase in heart rate and a decrease in pulse pressure occur first; hypotension and bradycardia occur only later. Increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anaesthetized patient in the OR. We will therefore continue to teach this typical pattern, but will also continue to mention the exceptions and pitfalls at a second stage. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension), as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS also refers to the pitfalls in the signs of acute haemorrhage: advanced age, athletes, pregnancy, medications and pacemakers, and explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching.
The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement of fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but they must be interpreted in view of the clinical context. Conflict of interest: none declared. The author is a member of the Swiss national ATLS core faculty.
Abstract:
PURPOSE The aim was to assess changes in tumour hypoxia during primary radiochemotherapy (RCT) for head and neck cancer (HNC) and to evaluate their relationship with treatment outcome. MATERIAL AND METHODS Hypoxia was assessed by FMISO-PET in weeks 0, 2 and 5 of RCT. The tumour volume (TV) was determined using FDG-PET/MRI/CT co-registered images. The level of hypoxia was quantified on FMISO-PET as TBRmax (SUVmax of the TV / SUVmean of the background). The hypoxic subvolume (HSV) was defined as the part of the TV showing FMISO uptake ≥1.4 times blood pool activity. RESULTS Sixteen consecutive patients (T3-4, N+, M0) were included (mean follow-up 31 months, median 44 months). Mean TBRmax decreased significantly (p<0.05) from 1.94 at baseline to 1.57 (week 2) and 1.27 (week 5). Mean HSVs in week 2 and week 5 (HSV2 = 5.8 ml, HSV3 = 0.3 ml) were significantly (p<0.05) smaller than at baseline (HSV1 = 15.8 ml). Kaplan-Meier plots of local recurrence-free survival stratified at the median TBRmax showed superior local control for less hypoxic tumours, the difference being significant at baseline and after 2 weeks (p=0.031 and p=0.016, respectively). CONCLUSIONS FMISO-PET documented that in most HNC reoxygenation starts early during RCT and is correlated with better outcome.
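A minimal sketch of the two hypoxia measures as defined above (TBRmax and HSV). The array contents, variable names and the voxel volume are assumptions for illustration only:

```python
import numpy as np

# Illustrative sketch of the two FMISO-PET measures described above.
# Voxel data and voxel volume are made up for the example.

def tbr_max(suv_tumour: np.ndarray, suv_mean_background: float) -> float:
    """TBRmax = SUVmax within the tumour volume / SUVmean of background."""
    return float(suv_tumour.max()) / suv_mean_background

def hypoxic_subvolume_ml(suv_tumour: np.ndarray, blood_pool_activity: float,
                         voxel_volume_ml: float = 0.1) -> float:
    """HSV = tumour volume with FMISO uptake >= 1.4 x blood pool activity."""
    n_hypoxic = int((suv_tumour >= 1.4 * blood_pool_activity).sum())
    return n_hypoxic * voxel_volume_ml

suv = np.array([1.0, 1.3, 1.8, 2.1])                       # toy tumour voxel SUVs
print(tbr_max(suv, suv_mean_background=1.1))               # ~1.91
print(hypoxic_subvolume_ml(suv, blood_pool_activity=1.2))  # 0.2 ml
```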
Abstract:
OBJECTIVE To assess maxillary second molar (M2) and third molar (M3) inclination following orthodontic treatment of Class II subdivision malocclusion with unilateral maxillary first molar (M1) extraction. MATERIALS AND METHODS Panoramic radiographs of 21 Class II subdivision adolescents (eight boys, 13 girls; mean age, 12.8 years; standard deviation, 1.7 years) taken before treatment, after treatment with extraction of one maxillary first molar and Begg appliances, and after at least 1.8 years in retention were retrospectively collected from a private practice. M2 and M3 inclination angles (M2/ITP, M2/IOP, M3/ITP, M3/IOP), constructed from the intertuberosity plane (ITP) and the interorbital plane (IOP), were calculated for the extracted and nonextracted segments. Random-effects regression analysis was performed to evaluate the effect of extraction, time, and gender on molar angulation after adjusting for baseline measurements. RESULTS Time and extraction status were significant predictors of M2 angulation. M2/ITP and M2/IOP decreased by 4.04° (95% confidence interval [CI]: -6.93, -1.16; P = .001) and 3.67° (95% CI: -6.76, -0.58; P = .020) in the extraction group compared to the nonextraction group after adjusting for time and gender. The adjusted analysis showed that extraction was the only predictor of M3 angulation to reach statistical significance: M3 mesial inclination increased by 7.38° (95% CI: -11.2, -3.54; P < .001) and 7.33° (95% CI: -11.48, -3.19; P = .001). CONCLUSIONS M2 and M3 uprighting improved significantly on the extraction side after orthodontic treatment with unilateral maxillary M1 extraction. There was a significant increase in mesial tipping of maxillary second molar crowns over time.
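For orientation, a random-effects regression of the kind described above can be sketched with a random intercept per patient. Everything below is simulated and the variable names are assumptions, not the study's data or model specification:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical sketch: molar angulation modelled on extraction status, time
# and gender, with a random intercept per patient. All data are simulated.
rng = np.random.default_rng(42)
n_patients = 21
patient = np.repeat(np.arange(n_patients), 3)            # 3 time points each
time = np.tile([0.0, 1.0, 2.0], n_patients)              # T0, T1, T2
extraction = np.repeat(rng.integers(0, 2, n_patients), 3)
gender = np.repeat(rng.integers(0, 2, n_patients), 3)
angle = 90 - 4.0 * extraction + 1.5 * time + rng.normal(0, 2, len(time))

df = pd.DataFrame({"patient": patient, "time": time,
                   "extraction": extraction, "gender": gender, "angle": angle})

model = smf.mixedlm("angle ~ extraction + time + gender", df,
                    groups=df["patient"])
print(model.fit().summary())
```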
Abstract:
INTRODUCTION Conventional 2-dimensional radiography uses defined criteria for outcome assessment of apical surgery. However, these radiographic healing criteria are not applicable to 3-dimensional radiography. The present study evaluated the repeatability and reproducibility of new cone-beam computed tomographic (CBCT)-based healing criteria for the judgment of periapical healing 1 year after apical surgery. METHODS CBCT scans taken 1 year after apical surgery (61 roots of 54 teeth in 54 patients, mean age = 54.4 years) were evaluated by 3 blinded and calibrated observers using 4 different indices. Reformatted buccolingual CBCT sections through the longitudinal axis of the treated roots were analyzed. Radiographic healing was assessed at the resection plane (R index), within the apical area (A index), of the cortical plate (C index), and regarding a combined apical-cortical area (B index). All readings were performed twice to calculate the intraobserver agreement (repeatability). Second-time readings were used for analyzing the interobserver agreement (reproducibility). Various statistical tests (Cohen κ, Fleiss κ, Fisher, and Spearman) were performed to measure the intra- and interobserver concurrence, the variability of score ratios, and the correlation of indices. RESULTS For all indices, the rates of identical first- and second-time scores were always higher than 80% (intraobserver Cohen κ values ranging from 0.793 to 0.963). The B index (94.0%) showed the highest intraobserver agreement. Regarding interobserver agreement, the highest rate was found for the B index (72.1%). The Fleiss κ values for the R and B indices exhibited substantial agreement (0.626 and 0.717, respectively), whereas the values for the A and C indices showed moderate agreement (0.561 and 0.573, respectively). The Spearman correlation coefficients for the R, A, C, and B indices all exhibited moderate to very strong correlations, with the highest correlation found between the C and B indices (rs = 0.8069). CONCLUSIONS All indices showed excellent intraobserver agreement (repeatability). With regard to interobserver agreement (reproducibility), the B index (healing of apical and cortical defects combined) and the R index (healing at the resection plane) showed substantial congruence and can thus be recommended in future studies using buccolingual CBCT sections for radiographic outcome assessment of apical surgery.
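For context, the intraobserver κ values above come from comparing an observer's first- and second-time scores of the same scans. A minimal sketch of Cohen's κ on hypothetical index scores:

```python
from collections import Counter

# Minimal sketch (hypothetical data) of the repeatability computation:
# Cohen's kappa between first- and second-time readings of the same scans.

def cohen_kappa(ratings1, ratings2):
    n = len(ratings1)
    categories = set(ratings1) | set(ratings2)
    observed = sum(a == b for a, b in zip(ratings1, ratings2)) / n
    c1, c2 = Counter(ratings1), Counter(ratings2)
    expected = sum(c1[c] * c2[c] for c in categories) / (n * n)
    return (observed - expected) / (1 - expected)

first  = [1, 2, 2, 3, 1, 2, 3, 3, 1, 2]   # hypothetical index scores
second = [1, 2, 2, 3, 1, 2, 3, 2, 1, 2]
print(cohen_kappa(first, second))          # ~0.85 for this toy data
```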
Abstract:
Although there has been a significant decrease in caries prevalence in developed countries, the slower progression of dental caries requires methods capable of detecting and quantifying lesions at an early stage. The aim of this study was to evaluate the effectiveness of fluorescence-based methods (DIAGNOdent 2095 laser fluorescence device [LF], DIAGNOdent 2190 pen [LFpen], and VistaProof fluorescence camera [FC]) in monitoring the progression of noncavitated caries-like lesions on smooth surfaces. Caries-like lesions were developed in 60 blocks of bovine enamel using a bacterial model of Streptococcus mutans and Lactobacillus acidophilus. Enamel blocks were evaluated by two independent examiners using the LF, LFpen, and FC at baseline (phase I), after the first cariogenic challenge (eight days; phase II), and after the second cariogenic challenge (a further eight days; phase III). Blocks were submitted to surface microhardness (SMH) and cross-sectional microhardness analyses. The intraclass correlation coefficient for intra- and interexaminer reproducibility ranged from 0.49 (FC) to 0.94 (LF/LFpen). SMH values decreased and fluorescence values increased significantly across the three phases. Higher values for sensitivity, specificity, and area under the receiver operating characteristic curve were observed for FC (phase II) and LFpen (phase III). A significant correlation was found between fluorescence values and SMH in all phases, and between fluorescence values and integrated loss of surface hardness (ΔKHN) in phase III. In conclusion, fluorescence-based methods were effective in monitoring noncavitated caries-like lesions on smooth surfaces, with moderate correlation with SMH, allowing differentiation between sound and demineralized enamel.
Abstract:
The aim of this study was to test a newly developed LED-based fluorescence device for approximal caries detection in vitro. We assembled 120 extracted molars without frank cavitations or fillings pairwise in order to create contact areas. The teeth were independently assessed by two examiners using visual caries detection (International Caries Detection and Assessment System, ICDAS), bitewing radiography (BW), laser fluorescence (LFpen), and LED fluorescence (Midwest Caries I.D., MW). The measurements were repeated at least 1 week later. Diagnostic performance was calculated with Bayesian analyses, and post-test probabilities were calculated in order to judge the diagnostic performance of combined methods. Reliability analyses were performed using kappa statistics for nominal data and the intraclass correlation coefficient (ICC) for continuous data. Histology served as the gold standard. Sensitivities/specificities at the enamel threshold were 0.33/0.84 for ICDAS, 0.23/0.86 for BW, 0.47/0.78 for LFpen, and 0.32/0.87 for MW. Sensitivities/specificities at the dentine threshold were 0.04/0.89 for ICDAS, 0.27/0.94 for BW, 0.39/0.84 for LFpen, and 0.07/0.96 for MW. Reliability was fair to moderate for MW and good for BW and LFpen. The combination of ICDAS and radiography yielded the best diagnostic performance (post-test probability of 0.73 at the dentine threshold). The newly developed LED device cannot be recommended for approximal caries detection; there may be too much signal loss during signal transduction from the occlusal aspect to the proximal lesion site and back.
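As a reader's aid, a post-test probability of the kind reported above follows from sensitivity and specificity via the positive likelihood ratio. A small sketch; the sensitivity/specificity are the BW dentine-threshold figures from the abstract, while the 30% pretest probability is an assumption for the example:

```python
# Hedged sketch of the standard post-test probability calculation.
# Inputs: BW dentine-threshold figures above; pretest probability assumed.

def post_test_probability(pretest: float, sensitivity: float,
                          specificity: float) -> float:
    lr_pos = sensitivity / (1 - specificity)   # positive likelihood ratio
    pre_odds = pretest / (1 - pretest)
    post_odds = pre_odds * lr_pos
    return post_odds / (1 + post_odds)

print(post_test_probability(0.30, sensitivity=0.27, specificity=0.94))
# ~0.66 for this assumed 30% pretest probability
```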
Abstract:
BACKGROUND The early diagnosis of acute myocardial infarction (AMI) very soon after symptom onset remains a major clinical challenge, even when using high-sensitivity cardiac troponin T (hs-cTnT). METHODS AND RESULTS We investigated the incremental value of heart-type fatty acid-binding protein (hFABP) in a pre-specified subgroup analysis of patients presenting to the emergency department (ED) with suspected AMI within 1 h of symptom onset in a multicentre study. hFABP was measured in a blinded fashion. Two independent cardiologists adjudicated the final diagnosis using all available clinical information, including hs-cTnT. Overall, 1411 patients were enrolled, of whom 105 presented within 1 h of symptom onset. Of these, 34 patients (32.4%) had AMI. The diagnostic accuracy of hFABP, as quantified by the area under the receiver-operating characteristic curve (AUC), was high (0.84; 95% CI 0.74-0.94). However, the additional use of hFABP only marginally increased the diagnostic accuracy of hs-cTnT (AUC 0.88 (95% CI 0.81-0.94) for hs-cTnT alone vs 0.90 (95% CI 0.83-0.98) for the combination; p=ns). Similar results were obtained after the exclusion of 18 AMI patients with ST-segment elevation. Among the 16 AMI patients without ST-segment elevation, six had normal hs-cTnT at presentation; of these, hFABP was elevated in two (33.3%). CONCLUSIONS hFABP does not seem to significantly improve the early diagnostic accuracy of hs-cTnT in the important subgroup of patients with suspected AMI presenting to the ED very early after symptom onset.
Abstract:
OBJECTIVE AND BACKGROUND Anemia and thyroid dysfunction are common and often co-occur. Current guidelines recommend the assessment of thyroid function in the work-up of anemia, although evidence on this association is scarce. PATIENTS AND METHODS In the "European Prospective Investigation of Cancer" (EPIC)-Norfolk population-based cohort, we aimed to examine the prevalence and type of anemia (defined as hemoglobin <13 g/dl for men and <12 g/dl for women) according to different thyroid function groups. RESULTS The mean age of the 8791 participants was 59.4 (SD 9.1) years, and 55.2% were women. Thyroid dysfunction was present in 437 (5.0%) and anemia in 517 (5.9%) participants. After excluding 121 participants with the three most common causes of anemia (chronic kidney disease, inflammation, iron deficiency), anemia was found in 4.7% of euthyroid participants. Compared with the euthyroid group, the prevalence of anemia was significantly higher in overt hyperthyroidism (14.6%, P < .01), higher with borderline significance in overt hypothyroidism (7.7%, P = .05), and not increased in subclinical thyroid dysfunction (5.0% in subclinical hypothyroidism, 3.3% in subclinical hyperthyroidism). Anemia associated with thyroid dysfunction was mainly normocytic (94.0%) and rarely macrocytic (6.0%). CONCLUSION The prevalence of anemia was higher in overt hyperthyroidism but not increased in subclinical thyroid dysfunction. Systematic measurement of thyroid-stimulating hormone in anemic patients is likely to be useful only after excluding common causes of anemia.
Abstract:
Hiatal hernia was diagnosed in three exotic felids: a lynx (Lynx lynx), a cougar (Puma concolor), and a lion (Panthera leo). All cats had a history of anorexia. Thoracic and abdominal radiographs showed evidence of a soft tissue mass within the caudal mediastinum suggestive of a hiatal hernia in all animals. A barium esophagram was performed in one case. All animals underwent thoracic or abdominal surgery for hernia reduction. Surgical procedures included intercostal thoracotomy with herniorrhaphy and esophagopexy (lynx and cougar) and incisional gastropexy (lion). Concurrent surgical procedures were gastrotomy for gastric foreign body removal and jejunostomy tube placement. Clinical signs related to the hiatal hernia resolved after surgery, and no recurrence of signs was reported during the follow-up period.
Abstract:
Understanding the regulation of T-cell responses during inflammation and autoimmunity is fundamental for designing efficient therapeutic strategies against immune diseases. In this regard, prostaglandin E2 (PGE2) is mostly considered a myeloid-derived immunosuppressive molecule. We describe for the first time that T cells secrete PGE2 during T-cell receptor stimulation. In addition, we show that autocrine PGE2 signaling through EP receptors is essential for optimal CD4(+) T-cell activation in vitro and in vivo, and for T helper 1 (Th1) and regulatory T-cell differentiation. PGE2 was found to provide additive co-stimulatory signaling through AKT activation. Intravital multiphoton microscopy showed that triggering EP receptors in T cells is also essential for the stability of T cell-dendritic cell (DC) interactions and Th-cell accumulation in draining lymph nodes (LNs) during inflammation. We further demonstrated that blocking EP receptors in T cells during the initial phase of collagen-induced arthritis in mice resulted in a reduction of clinical arthritis. This could be attributable to defective T-cell activation, accompanied by a decline in activated and interferon-γ-producing CD4(+) Th1 cells in draining LNs. In conclusion, we prove that T lymphocytes secrete picomolar concentrations of PGE2, which in turn provide additive co-stimulatory signaling, enabling T cells to attain a favorable activation threshold. PGE2 signaling in T cells is also required for maintaining long and stable interactions with DCs within LNs. Blockade of EP receptors in vivo impairs T-cell activation and the development of T cell-mediated inflammatory responses. This may have implications in various pathophysiological settings.
Abstract:
BACKGROUND The diagnostic performance of biochemical scores and artificial neural network models for portal hypertension and cirrhosis is not well established. AIMS To assess the diagnostic accuracy of six serum scores, artificial neural networks, and liver stiffness measured by transient elastography for diagnosing cirrhosis, clinically significant portal hypertension, and oesophageal varices. METHODS 202 consecutive compensated patients requiring liver biopsy and hepatic venous pressure gradient measurement were included. Several serum tests (alone and combined into scores) and liver stiffness were measured. Artificial neural networks with and without liver stiffness as an input variable were also created. RESULTS The best non-invasive method for diagnosing cirrhosis, portal hypertension and oesophageal varices was liver stiffness (C-statistics = 0.93, 0.94, and 0.90, respectively). Among the serum tests/scores, the best for diagnosing cirrhosis was Fibrosis-4, and the best for portal hypertension and oesophageal varices was the Lok score. Artificial neural networks including liver stiffness had high diagnostic performance for cirrhosis, portal hypertension and oesophageal varices (accuracy >80%) but were not statistically superior to liver stiffness alone. CONCLUSIONS Liver stiffness was the best non-invasive method to assess the presence of cirrhosis, portal hypertension and oesophageal varices. The use of artificial neural networks integrating different non-invasive tests did not increase the diagnostic accuracy of liver stiffness alone.
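For orientation, the C-statistic reported above is the probability that a randomly chosen patient with the condition has a higher test value (here, liver stiffness) than a randomly chosen patient without it. A minimal sketch with made-up stiffness values:

```python
# Illustrative sketch (toy data) of the C-statistic: concordant pairs over
# all diseased/non-diseased pairs, ties counting one half.

def c_statistic(values_pos, values_neg):
    pairs = concordant = 0.0
    for p in values_pos:
        for n in values_neg:
            pairs += 1
            if p > n:
                concordant += 1
            elif p == n:
                concordant += 0.5
    return concordant / pairs

stiff_cirrhosis = [22.1, 31.5, 18.0, 40.2]     # hypothetical kPa values
stiff_no_cirrhosis = [6.2, 9.8, 12.5, 7.4]
print(c_statistic(stiff_cirrhosis, stiff_no_cirrhosis))  # 1.0 for this toy data
```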
Abstract:
BACKGROUND Little information is yet available on zirconia-based prostheses supported by implants. PURPOSE To evaluate technical problems and failures of implant-supported zirconia-based prostheses with exclusive screw-retention. MATERIAL AND METHODS Consecutive patients received screw-retained zirconia-based prostheses supported by implants and were followed over a period of 5 years. The implant placement and prosthetic rehabilitation were performed in one clinical setting, and all patients participated in the maintenance program. The treatment comprised single crowns (SCs) and fixed dental prostheses (FDPs) of three to 12 units. Screw-retention of the CAD/CAM-fabricated SCs and FDPs was performed with direct connection at the implant level. The primary outcome was complete failure of the zirconia-based prostheses; outcome measures were fracture of the framework or extensive chipping resulting in the need for refabrication. A life table analysis was performed, the cumulative survival rate (CSR) was calculated, and a Kaplan-Meier curve was drawn. RESULTS Two hundred and ninety-four implants supported 156 zirconia-based prostheses in 95 patients (52 men, 43 women, average age 59.1 ± 11.7 years). Sixty-five SCs and 91 FDPs were identified, comprising a total of 441 units. Fractures of the zirconia framework and extensive chipping resulted in refabrication of nine prostheses. Nearly all the prostheses (94.2%) remained in situ during the observation period. The 5-year CSR was 90.5%, and 41 prostheses (14 SCs, 27 FDPs) comprising 113 units survived for an observation time of more than 5 years. Six SCs exhibited screw loosening, and polishing of minor chipping was required for five prostheses. CONCLUSIONS This study shows that zirconia-based implant-supported fixed prostheses exhibit satisfactory treatment outcomes and that screw-retention directly at the implant level is feasible.
Abstract:
CONTEXT The polyuria-polydipsia syndrome comprises primary polydipsia (PP) and central and nephrogenic diabetes insipidus (DI). Correctly discriminating these entities is mandatory, given that inadequate treatment causes serious complications. The diagnostic "gold standard" is the water deprivation test with assessment of arginine vasopressin (AVP) activity. However, test interpretation and AVP measurement are challenging. OBJECTIVE To evaluate the accuracy of copeptin, a stable peptide stoichiometrically co-secreted with AVP, in the differential diagnosis of the polyuria-polydipsia syndrome. DESIGN, SETTING, AND PATIENTS This was a prospective multicenter observational cohort study from four Swiss or German tertiary referral centers of adults >18 years old with a history of polyuria and polydipsia. MEASUREMENTS A standardized combined water deprivation/3% saline infusion test was performed and terminated when serum sodium exceeded 147 mmol/L. Circulating copeptin and AVP levels were measured regularly throughout the test. The final diagnosis was based on the water deprivation/saline infusion test results, clinical information, and the treatment response. RESULTS Fifty-five patients were enrolled (11 with complete central DI, 16 with partial central DI, 18 with PP, and 10 with nephrogenic DI). Without prior thirsting, a single baseline copeptin level >21.4 pmol/L differentiated nephrogenic DI from the other etiologies with 100% sensitivity and specificity, rendering water deprivation testing unnecessary in such cases. A stimulated copeptin >4.9 pmol/L (at sodium levels >147 mmol/L) differentiated between patients with PP and patients with partial central DI with 94.0% specificity and 94.4% sensitivity. A stimulated AVP >1.8 pg/mL differentiated between the same categories with 93.0% specificity and 83.0% sensitivity. LIMITATION This study was limited by incorporation bias from including AVP levels as a diagnostic criterion. CONCLUSION Copeptin is a promising new tool in the differential diagnosis of the polyuria-polydipsia syndrome and a valid surrogate marker for AVP. PRIMARY FUNDING SOURCES Swiss National Science Foundation; University of Basel.
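Purely as an illustration (not a clinical tool), the reported cutoffs can be read as a two-step rule. Note that the direction of the stimulated-copeptin criterion (higher values pointing to PP, reflecting intact AVP secretion) is inferred from the physiology rather than stated explicitly above:

```python
from typing import Optional

# Illustrative decision sketch using the cutoffs reported above
# (copeptin in pmol/L, serum sodium in mmol/L). Not a clinical tool;
# the PP-vs-partial-central-DI direction is an inference.

def classify_polyuria_polydipsia(baseline_copeptin: float,
                                 stimulated_copeptin: Optional[float] = None,
                                 sodium: Optional[float] = None) -> str:
    if baseline_copeptin > 21.4:
        # No prior thirsting needed; water deprivation testing unnecessary.
        return "nephrogenic diabetes insipidus"
    if stimulated_copeptin is not None and sodium is not None and sodium > 147:
        if stimulated_copeptin > 4.9:
            return "primary polydipsia (suggested)"
        return "partial central diabetes insipidus (suggested)"
    return "indeterminate: osmotic stimulation (Na >147 mmol/L) required"

print(classify_polyuria_polydipsia(25.0))  # nephrogenic DI
print(classify_polyuria_polydipsia(3.0, stimulated_copeptin=6.2, sodium=149))
```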
Abstract:
INTRODUCTION Late-onset hypogonadism (LOH) represents a common clinical entity in aging males, characterized by the presence of symptoms (most usually of a sexual nature, such as decreased libido, decreased spontaneous erections, and erectile dysfunction) and signs, in combination with low serum testosterone concentrations. Whether testosterone replacement therapy (TRT) should be offered to these individuals is still under extensive debate. AIMS The aim of this position statement is to provide and critically appraise evidence on TRT in the aging male, focusing on the pathophysiology and characteristics of LOH, indications for TRT, available therapeutic agents, monitoring, and treatment-associated risks. MATERIALS AND METHODS Literature review and consensus of expert opinion. RESULTS AND CONCLUSIONS Diagnosis and treatment of LOH is justified if a combination of symptoms of testosterone deficiency and low testosterone is present. Patients receiving TRT may benefit with regard to obesity, metabolic syndrome, type 2 diabetes mellitus, sexual function, and osteoporosis, and should undergo regularly scheduled testing for adverse events. Potential adverse effects of TRT on cardiovascular disease, prostate cancer, and sleep apnea are as yet unclear and remain to be investigated in large-scale prospective studies. Management of aging men with LOH should include individual evaluation of co-morbidities and careful risk-versus-benefit assessment.
Abstract:
We explored the feasibility of upfront unrelated donor haematopoietic stem cell transplant (HSCT) without prior immunosuppressive therapy (IST) in paediatric idiopathic severe aplastic anaemia (SAA). This cohort was compared to matched historical controls who had undergone first-line therapy with a matched sibling/family donor (MSD) HSCT (n = 87), first-line IST with horse antithymocyte globulin and ciclosporin (n = 58), or second-line unrelated donor HSCT after failed IST (n = 24). The 2-year overall survival in the upfront cohort was 96 ± 4%, compared to 91 ± 3% in the MSD controls (P = 0·30), 94 ± 3% in the IST controls (P = 0·68), and 74 ± 9% in the unrelated donor HSCT post-IST failure controls (P = 0·02). The 2-year event-free survival in the upfront cohort was 92 ± 5%, compared to 87 ± 4% in the MSD controls (P = 0·37), 40 ± 7% in the IST controls (P = 0·0001), and 74 ± 9% in the unrelated donor HSCT post-IST failure controls (P = 0·02). Outcomes for upfront unrelated donor HSCT in paediatric idiopathic SAA were thus similar to those of MSD HSCT and superior to those of IST and of unrelated donor HSCT after IST failure. Front-line therapy with matched unrelated donor HSCT is a novel treatment approach and could be considered as first-line therapy in selected paediatric patients who lack an MSD.