906 results for Thrombophilia Risk Evaluation
Abstract:
Different risk factors for venous thromboembolism (VTE) have been identified, including hereditary abnormalities in the mechanisms of coagulation and fibrinolysis. We investigated five genetic polymorphisms (FVL G1691A, FII G20210A, MTHFR C677T, TAFI A152G and TAFI T1053C) associated with VTE in individuals from the city of Belém in the Brazilian Amazon who had no history of VTE. No significant difference was found between the observed and expected genotype frequencies for the loci analyzed. We found high frequencies of MTHFR C677T (33.9%) and TAFI T1053C (74%) and low frequencies of FVL (1.6%), FII G20210A (0.8%) and TAFI A152G (0.8%). The FVL G1691A, FII G20210A and MTHFR C677T frequencies were similar to those for European populations and populations of European descent living in the city of Ribeirão Preto in the Brazilian state of São Paulo. The frequency of the two TAFI mutations in the Belém individuals was not significantly different from that described for individuals from Ribeirão Preto. We suggest that the risks for VTE in the population of Belém are of the same magnitude as those observed in European populations and in populations with a substantial European contribution.
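The observed-versus-expected genotype comparison above is a Hardy-Weinberg equilibrium check: allele frequencies are estimated from the sample, and expected genotype counts follow the proportions p², 2pq, q². A minimal sketch of the arithmetic, using hypothetical genotype counts rather than the Belém data:

```python
# Hardy-Weinberg check: expected genotype counts (p^2, 2pq, q^2 times n)
# derived from observed allele frequencies, plus a chi-square statistic.
# The counts below are hypothetical, not taken from the study's sample.

def hwe_expected(n_AA, n_Aa, n_aa):
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)      # frequency of the A allele
    q = 1 - p
    return (p * p * n, 2 * p * q * n, q * q * n)

obs = (120, 60, 20)                       # hypothetical AA, Aa, aa counts
exp = hwe_expected(*obs)
chi2 = sum((o - e) ** 2 / e for o, e in zip(obs, exp))
```

A chi-square value well above the 1-df critical value of about 3.84 indicates departure from equilibrium; the toy counts above deviate, whereas the abstract reports no significant deviation for any locus.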
Abstract:
OBJECTIVE: The purpose of this study was to evaluate the long-term clinical and ultrasonographic outcomes of thrombophilic patients with deep venous thrombosis (DVT). METHOD: A retrospective case-control cohort study with cross-sectional analysis. Thirty-nine thrombophilic patients and 25 non-thrombophilic patients were assessed 76.3 ± 45.8 months after diagnosis. Demographic and family data were collected, as well as data on clinical and therapeutic progress, and physical and ultrasound examinations of the limbs were performed. Groups were matched for age and gender and the variables studied were compared across groups. RESULTS: Deep venous thrombosis was more frequent in women. The most common thrombophilias were antiphospholipid syndrome and factor V Leiden mutation. There was no difference between groups in terms of the number of pregnancies or miscarriages, and the majority of women did not become pregnant after DVT. Non-spontaneous DVT prevailed. Proximal DVT and DVT of the left lower limb were more frequent, and the main risk factor was use of oral contraceptives. All patients were treated with anticoagulation. There was a higher frequency of pulmonary embolism in non-thrombophilic patients. Most patients considered themselves to have a normal life after DVT and reported wearing elastic stockings for at least 2 years. Seventy-one percent of patients had CEAP > 3, with no difference between groups. Deep venous reflux was more frequent in thrombophilic patients. CONCLUSION: There were no significant differences between groups with respect to most of the variables studied, except for a higher frequency of pulmonary embolism in non-thrombophilic patients and a greater frequency of deep venous reflux in thrombophilic patients.
Abstract:
OBJECTIVE: To analyze the nutritional status of pediatric patients after orthotopic liver transplantation and its relationship with short-term clinical outcome. METHOD: Anthropometric evaluations of 60 children and adolescents after orthotopic liver transplantation, during the first 24 hours in a tertiary pediatric intensive care unit. Nutritional status was determined from the Z score for the following indices: weight/age, height/age or length/age, weight/height or weight/length, body mass index/age, arm circumference/age and triceps skinfold/age. The severity of liver disease was evaluated using whichever of two models was appropriate for the patient's age: 1. Pediatric End-stage Liver Disease; 2. Model for End-Stage Liver Disease. RESULTS: We found 50.0% undernutrition by height/age; 27.3% by weight/age; 11.1% by weight/height or weight/length; 10.0% by body mass index/age; 61.6% by arm circumference/age and 51.0% by triceps skinfold/age. There was no correlation between nutritional status and Pediatric End-stage Liver Disease score or mortality. We found a negative correlation between arm circumference/age and length of hospitalization. CONCLUSION: Children with chronic liver diseases experience a significant degree of undernutrition, which makes nutritional support an important aspect of therapy. Despite the difficulties in assessment, anthropometric evaluation of the upper limbs is useful for evaluating the nutritional status of children before and after liver transplantation.
Abstract:
OBJECTIVE: Many changes in mucosal morphology are observed following ileal pouch construction, including colonic metaplasia and dysplasia. Additionally, one rare but potential complication is the development of adenocarcinoma of the reservoir. The aim of this study was to evaluate the most frequently observed histopathological changes in ileal pouches and to correlate these changes with potential risk factors for complications. METHODS: A total of 41 patients were enrolled in the study and divided into the following three groups: a non-pouchitis group (group 1) (n = 20; 8 males; mean age: 47.5 years) demonstrating optimal outcome; a pouchitis without antibiotics group (group 2) (n = 14; 4 males; mean age: 47 years), containing individuals with pouchitis who did not receive treatment with antibiotics; and a pouchitis plus antibiotics group (group 3) (n = 7; 3 males; mean age: 41 years), containing those patients with pouchitis who were administered antibiotics. Ileal pouch endoscopy was performed, and tissue biopsy samples were collected for histopathological analysis. RESULTS: Colonic metaplasia was found in 15 (36.6%) of the 41 patients evaluated; of these, five (25%) were from group 1, eight (57.1%) were from group 2, and two (28.6%) were from group 3. However, no correlation was established between the presence of metaplasia and pouchitis (p = 0.17), and no differences in mucosal atrophy or the degree of chronic or acute inflammation were observed between groups 1, 2, and 3 (p > 0.45). Moreover, no dysplasia or neoplastic changes were detected. However, the degree of mucosal atrophy correlated well with the time of postoperative follow-up (p = 0.05). CONCLUSIONS: The degree of mucosal atrophy, the presence of colonic metaplasia, and the degree of acute or chronic inflammation do not appear to constitute risk factors for the development of pouchitis.
Moreover, we observed that longer postoperative follow-up times were associated with greater degrees of mucosal atrophy.
Abstract:
The coronary artery calcium (CAC) score is a readily and widely available tool for the noninvasive diagnosis of atherosclerotic coronary artery disease (CAD). The aim of this study was to investigate the added value of the CAC score as an adjunct to gated SPECT for the assessment of CAD in an intermediate-risk population. METHODS: Seventy-seven prospectively recruited patients with intermediate risk (as determined by the Framingham Heart Study 10-y CAD risk score) and referred for coronary angiography because of suspected CAD underwent stress (99m)Tc-tetrofosmin SPECT myocardial perfusion imaging (MPI) and CT CAC scoring within 2 wk before coronary angiography. The sensitivity and specificity of SPECT alone and of the combination of the 2 methods (SPECT plus CAC score) in demonstrating significant CAD (≥50% stenosis on coronary angiography) were compared. RESULTS: Forty-two (55%) of the 77 patients had CAD on coronary angiography, and 35 (45%) had abnormal SPECT results. The CAC score was significantly higher in subjects with perfusion abnormalities than in those who had normal SPECT results (889 ± 836 [mean ± SD] vs. 286 ± 335; P < 0.0001). Similarly, with rising CAC scores, a larger percentage of patients had CAD. Receiver-operating-characteristic analysis showed that a CAC score of greater than or equal to 709 was the optimal cutoff for detecting CAD missed by SPECT. SPECT alone had a sensitivity and a specificity for the detection of significant CAD of 76% and 91%, respectively. Combining SPECT with the CAC score (at a cutoff of 709) improved the sensitivity of SPECT (from 76% to 86%) for the detection of CAD, in association with a nonsignificant decrease in specificity (from 91% to 86%). CONCLUSION: The CAC score may offer incremental diagnostic information over SPECT data for identifying patients with significant CAD and negative MPI results.
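The combination rule implied by this abstract — call the combined test positive when SPECT is abnormal or the CAC score reaches the ROC-derived cutoff of 709 — can be sketched as follows. The patient records are invented for illustration and are not the study's data:

```python
# Sketch of the "SPECT abnormal OR CAC >= cutoff" decision rule and of
# the sensitivity/specificity computation used to compare strategies.
# Patient records below are hypothetical, not the study's data.

CAC_CUTOFF = 709  # optimal cutoff reported in the abstract

def combined_positive(spect_abnormal, cac_score, cutoff=CAC_CUTOFF):
    # Positive if either component test flags the patient.
    return spect_abnormal or cac_score >= cutoff

def sens_spec(preds, truths):
    # Sensitivity = TP / (all diseased); specificity = TN / (all healthy).
    tp = sum(p and t for p, t in zip(preds, truths))
    tn = sum(not p and not t for p, t in zip(preds, truths))
    return tp / sum(truths), tn / (len(truths) - sum(truths))

# Hypothetical patients: (SPECT abnormal, CAC score, CAD on angiography)
patients = [(True, 900, True), (False, 800, True), (False, 120, True),
            (False, 300, False), (False, 750, False), (True, 200, False)]
preds = [combined_positive(s, c) for s, c, _ in patients]
truths = [t for _, _, t in patients]
```

Because the OR-rule can only flag additional patients relative to SPECT alone, sensitivity can only rise while specificity can only fall, which matches the trade-off the abstract reports (76%→86% sensitivity, 91%→86% specificity).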
Abstract:
Standard procedures for forecasting flood risk (Bulletin 17B) assume annual maximum flood (AMF) series are stationary, meaning the distribution of flood flows is not significantly affected by climatic trends/cycles, or anthropogenic activities within the watershed. Historical flood events are therefore considered representative of future flood occurrences, and the risk associated with a given flood magnitude is modeled as constant over time. However, in light of increasing evidence to the contrary, this assumption should be reconsidered, especially as the existence of nonstationarity in AMF series can have significant impacts on planning and management of water resources and relevant infrastructure. Research presented in this thesis quantifies the degree of nonstationarity evident in AMF series for unimpaired watersheds throughout the contiguous U.S., identifies meteorological, climatic, and anthropogenic causes of this nonstationarity, and proposes an extension of the Bulletin 17B methodology which yields forecasts of flood risk that reflect climatic influences on flood magnitude. To appropriately forecast flood risk, it is necessary to consider the driving causes of nonstationarity in AMF series. Herein, large-scale climate patterns—including El Niño-Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO), North Atlantic Oscillation (NAO), and Atlantic Multidecadal Oscillation (AMO)—are identified as influencing factors on flood magnitude at numerous stations across the U.S. Strong relationships between flood magnitude and associated precipitation series were also observed for the majority of sites analyzed in the Upper Midwest and Northeastern regions of the U.S. Although relationships between flood magnitude and associated temperature series are not apparent, results do indicate that temperature is highly correlated with the timing of flood peaks. 
Despite consideration of watersheds classified as unimpaired, analyses also suggest that identified change-points in AMF series are due to dam construction and other types of regulation and diversion. Although not explored herein, trends in AMF series are also likely to be partially explained by changes in land use and land cover over time. Results obtained herein suggest that improved forecasts of flood risk may be obtained using a simple modification of the Bulletin 17B framework, wherein the mean and standard deviation of the log-transformed flows are modeled as functions of climate indices associated with oceanic-atmospheric patterns (e.g., AMO, ENSO, NAO, and PDO) with lead times between 3 and 9 months. Herein, one-year-ahead forecasts of the mean and standard deviation, and subsequently flood risk, are obtained by applying site-specific multivariate regression models, which reflect the phase and intensity of a given climate pattern, as well as possible impacts of coupling of the climate cycles. These forecasts of flood risk are compared with forecasts derived using the existing Bulletin 17B model; large differences in the one-year-ahead forecasts are observed in some locations. The increased knowledge of the inherent structure of AMF series and the improved understanding of physical and/or climatic causes of nonstationarity gained from this research should serve as insight for the formulation of a physical-causal statistical model, incorporating both climatic variations and human impacts, for flood risk over the longer planning horizons (e.g., 10-, 50-, and 100-year) necessary for water resources design, planning, and management.
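The proposed extension can be sketched as follows: regress the log-space mean and standard deviation of annual maximum flows on lagged climate indices, then read the design quantile off the forecast distribution. This sketch simplifies Bulletin 17B's log-Pearson III distribution to a lognormal (zero log-skew), and all regression coefficients are illustrative, not fitted values from the thesis:

```python
# Nonstationary flood-quantile sketch: the mean and standard deviation
# of log-transformed annual maximum flows are linear functions of lagged
# climate indices (e.g. AMO, ENSO), and the flood magnitude for a given
# annual exceedance probability (AEP) comes from the forecast
# distribution. Lognormal stands in for log-Pearson III here, and the
# coefficients are hypothetical.
from statistics import NormalDist
import math

def forecast_flood(indices, beta_mu, beta_sigma, aep=0.01):
    mu = sum(b * x for b, x in zip(beta_mu, indices))        # mean of log Q
    sigma = sum(b * x for b, x in zip(beta_sigma, indices))  # sd of log Q
    z = NormalDist().inv_cdf(1 - aep)    # standard normal quantile
    return math.exp(mu + z * sigma)      # flow with this AEP

# Hypothetical one-year-ahead forecast: intercept, lagged AMO, lagged ENSO.
x = (1.0, 0.3, -0.5)
q100 = forecast_flood(x, beta_mu=(8.0, 0.4, -0.2),
                      beta_sigma=(0.5, 0.05, 0.0))
```

Under the stationary Bulletin 17B assumption the same quantile would use a fixed long-term mean and standard deviation; here both parameters shift with the climate state, so the 1%-AEP ("100-year") flood changes from year to year.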
Abstract:
The development of coronary vasculopathy is the main determinant of long-term survival in cardiac transplantation. The identification of risk factors therefore seems necessary in order to identify possible treatment strategies. Ninety-five of 397 patients undergoing orthotopic cardiac transplantation between 10/1985 and 10/1992 were evaluated retrospectively on the basis of perioperative and postoperative variables, including age, sex, diagnosis, previous operations, renal function, cholesterol levels, dosage of immunosuppressive drugs (cyclosporin A, azathioprine, steroids), incidence of rejection, and treatment with calcium channel blockers at 3, 6, 12, and 18 months postoperatively. Coronary vasculopathy was assessed by annual angiography at 1 and 2 years postoperatively. After univariate analysis, data were evaluated by stepwise multiple logistic regression analysis. Coronary vasculopathy was found in 15 patients (16%) at 1 year and in 23 patients (24%) at 2 years. On multivariate analysis, previous operations and the incidence of rejections were identified as significant risk factors (P < 0.05), whereas the underlying diagnosis had borderline significance (P = 0.058) for the development of graft coronary vasculopathy. In contrast, all other variables were not significant in the subset of patients investigated. We therefore conclude that the development of coronary vasculopathy in cardiac transplant patients mainly depends on the rejection process itself, aside from patient-dependent factors. Therapeutic measures, such as the administration of calcium channel blockers and regulation of lipid disorders, may therefore only reduce the progress of native atherosclerotic disease in the posttransplant setting.
Abstract:
Because of the considerable morbidity and mortality associated with osteoporosis, it is essential to detect subjects at risk by screening methods such as bone quantitative ultrasound (QUS). Several studies have shown that QUS can predict fractures. None, however, has prospectively compared different QUS devices, and few quality-control (QC) data have been published. The Swiss Evaluation of the Methods of Measurement of Osteoporotic Fracture Risk is a prospective multicenter study that compared three QUS devices for the assessment of hip fracture risk in a population of 7609 women aged ≥70 yr. Because the inclusion phase lasted 20 mo, and because 10 centers participated in this study, QC became a major issue. We therefore developed a QC procedure to assess the stability and precision of the devices, and for their cross-calibration. Our study focuses on the two heel QUS devices. The water bath system (Achilles+) had a higher precision than the dry system (Sahara). The QC results were highly dependent on temperature. QUS stability was acceptable, but the Sahara must be calibrated regularly. Sufficient homogeneity among all the Sahara devices could be demonstrated, whereas significant differences were found among the Achilles+ devices. For speed of sound, 52% of the differences among the Achilles+ devices were explained by the water's temperature. However, for broadband ultrasound attenuation, a maximal difference of 23% persisted after adjustment for temperature. Because such differences could influence measurements in vivo, it is crucial to develop standardized phantoms to be used in prospective multicenter studies.
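Device precision in QC programs of this kind is commonly summarized as a coefficient of variation (CV%) over repeated phantom measurements. A minimal sketch, with hypothetical speed-of-sound readings rather than the study's QC data:

```python
# Precision as coefficient of variation: CV% = 100 * sd / mean over
# repeated measurements of the same phantom. All readings below are
# hypothetical speed-of-sound (SOS) values in m/s.
from statistics import mean, stdev

def cv_percent(readings):
    return 100 * stdev(readings) / mean(readings)

achilles = [1540.2, 1541.0, 1539.5, 1540.8]   # hypothetical water-bath SOS
sahara = [1538.0, 1542.5, 1536.2, 1543.1]     # hypothetical dry-system SOS
```

A lower CV for the water-bath readings, as in this toy example, would mirror the abstract's finding that the Achilles+ was the more precise of the two heel devices.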
Abstract:
BACKGROUND Pets, often kept for companionship and psychological support in the therapy of nursing home residents, have been implicated as reservoirs for antibiotic-resistant bacteria. We investigated the importance of pets as reservoirs of multidrug-resistant (MDR) staphylococci in nursing homes. METHODS We assessed the carriage of MDR staphylococci in pets and in 2 groups of residents: those living in nursing homes with pets and those living without pet contact. We collected demographic, health status, and human-pet contact data by means of questionnaires. We assessed potential bacterial transmission pathways by investigating physical resident-to-pet contact. RESULTS The observed prevalence of MDR staphylococci carriage was 84/229 (37%) in residents living with pets and 99/216 (46%) in those not living with pets (adjusted odds ratio [aOR], 0.6; 95% confidence interval [CI], 0.4-0.9). Active pet contact was associated with lower carriage of MDR staphylococci (aOR, 0.5; 95% CI, 0.4-0.8). Antibiotic treatment during the previous 3 months was associated with a significantly increased risk for MDR carriage in residents (aOR, 3.1; 95% CI, 1.8-5.7). CONCLUSIONS We found no evidence that the previously reported benefits of pet contact are compromised by an increased risk of carriage of MDR staphylococci in residents who interact with these animals in nursing homes. Thus, contact with pets, always under good hygiene standards, should be encouraged in these settings.
Abstract:
PRINCIPLES To evaluate the validity and feasibility of a novel photography-based home assessment (PhoHA) protocol as a possible substitute for on-site home assessment (OsHA). METHODS A total of 20 patients aged ≥65 years who were hospitalised in a rehabilitation centre for musculoskeletal disorders affecting mobility participated in this prospective validation study. For PhoHA, occupational therapists rated photographs and measurements of patients' homes provided by patients' confidants. For OsHA, occupational therapists conducted a conventional home visit. RESULTS Information obtained by PhoHA was 79.1% complete (1,120 environmental factors identified by PhoHA vs 1,416 by OsHA). Of the 1,120 factors, 749 had dichotomous scores (potential hazards) and 371 had continuous scores (measurements with a tape measure). The validity of PhoHA for identifying potential hazards was good (sensitivity 78.9%, specificity 84.9%), except for two subdomains (pathways, slippery surfaces). Pearson's correlation coefficient for the validity of measurements was 0.87 (95% confidence interval [CI] 0.80-0.92, p <0.001). Agreement between methods was 0.52 (95% CI 0.34-0.67, p <0.001, Cohen's kappa coefficient) for dichotomous scores and 0.86 (95% CI 0.79-0.91, p <0.001, intraclass correlation coefficient) for continuous scores. Costs of PhoHA were 53.0% lower than those of OsHA (p <0.001). CONCLUSIONS PhoHA has good concurrent validity for environmental assessment if instructions for confidants are improved. PhoHA is potentially a cost-effective method for environmental assessment.
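The agreement statistic reported above for the dichotomous hazard scores is Cohen's kappa, which discounts the agreement two raters would reach by chance. A minimal sketch with hypothetical ratings, not the study's data:

```python
# Cohen's kappa for agreement between two dichotomous ratings
# (e.g. hazard present/absent scored by PhoHA vs OsHA).
# kappa = (observed agreement - chance agreement) / (1 - chance agreement)
# The rating vectors below are hypothetical.

def cohens_kappa(a, b):
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n        # observed agreement
    p_yes = (sum(a) / n) * (sum(b) / n)               # chance: both say 1
    p_no = (1 - sum(a) / n) * (1 - sum(b) / n)        # chance: both say 0
    pe = p_yes + p_no                                 # expected agreement
    return (po - pe) / (1 - pe)

pho = [1, 1, 0, 0, 1, 0, 1, 0]    # hypothetical PhoHA hazard ratings
os_ = [1, 0, 0, 0, 1, 1, 1, 0]    # hypothetical OsHA hazard ratings
kappa = cohens_kappa(pho, os_)
```

A kappa near 0.5, like the 0.52 reported, is conventionally read as moderate agreement beyond chance, noticeably weaker than the 0.86 intraclass correlation found for the continuous measurements.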
Abstract:
BACKGROUND Conventional factors do not fully explain the distribution of cardiovascular outcomes. Biomarkers are known to participate in well-established pathways associated with cardiovascular disease, and may therefore provide further information over and above conventional risk factors. This study sought to determine whether individual and/or combined assessment of 9 biomarkers improved discrimination, calibration and reclassification of cardiovascular mortality. METHODS 3267 patients (2283 men), aged 18-95 years, at intermediate-to-high-risk of cardiovascular disease were followed in this prospective cohort study. Conventional risk factors and biomarkers were included based on forward and backward Cox proportional stepwise selection models. RESULTS During 10-years of follow-up, 546 fatal cardiovascular events occurred. Four biomarkers (interleukin-6, neutrophils, von Willebrand factor, and 25-hydroxyvitamin D) were retained during stepwise selection procedures for subsequent analyses. Simultaneous inclusion of these biomarkers significantly improved discrimination as measured by the C-index (0.78, P = 0.0001), and integrated discrimination improvement (0.0219, P<0.0001). Collectively, these biomarkers improved net reclassification for cardiovascular death by 10.6% (P<0.0001) when added to the conventional risk model. CONCLUSIONS In terms of adverse cardiovascular prognosis, a biomarker panel consisting of interleukin-6, neutrophils, von Willebrand factor, and 25-hydroxyvitamin D offered significant incremental value beyond that conveyed by simple conventional risk factors.
Abstract:
BACKGROUND It is often assumed that horses with mild respiratory clinical signs, such as mucous nasal discharge and occasional coughing, have an increased risk of developing recurrent airway obstruction (RAO). HYPOTHESIS Compared to horses without any clinical signs of respiratory disease, those with occasional coughing, mucous nasal discharge, or both have an increased risk of developing signs of RAO (frequent coughing, increased breathing effort, exercise intolerance, or a combination of these) as characterized by the Horse Owner Assessed Respiratory Signs Index (HOARSI 1-4). ANIMALS Two half-sibling families descending from 2 RAO-affected stallions (n = 65 and n = 47) and an independent replication population of unrelated horses (n = 88). METHODS In a retrospective cohort study, standardized information on occurrence and frequency of coughing, mucous nasal discharge, poor performance, and abnormal breathing effort-and these factors combined in the HOARSI-as well as management factors were collected at intervals of 1.3-5 years. RESULTS Compared to horses without clinical signs of respiratory disease (half-siblings 7%; unrelated horses 3%), those with mild respiratory signs developed clinical signs of RAO more frequently: half-siblings with mucous nasal discharge 35% (P < .001, OR: 7.0, sensitivity: 62%, specificity: 81%), with mucous nasal discharge and occasional coughing 43% (P < .001, OR: 9.9, sensitivity: 55%, specificity: 89%); unrelated horses with occasional coughing: 25% (P = .006, OR = 9.7, sensitivity: 75%, specificity: 76%). CONCLUSIONS AND CLINICAL IMPORTANCE Occasional coughing and mucous nasal discharge might represent an increased risk of developing RAO.
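The odds ratios quoted above come from 2×2 tables of exposure (mild respiratory signs) against outcome (later RAO signs). A minimal sketch of the odds-ratio calculation with a Wald 95% confidence interval; the counts are hypothetical, not the HOARSI data:

```python
# Odds ratio from a 2x2 table with a Wald 95% CI on the log scale.
#   a: exposed with outcome      b: exposed without outcome
#   c: unexposed with outcome    d: unexposed without outcome
# The example counts are hypothetical.
import math

def odds_ratio(a, b, c, d):
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)       # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical: 10 of 30 horses with mild signs progressed vs 5 of 45 without.
result = odds_ratio(10, 20, 5, 40)
```

Wide intervals like the one this toy table yields are typical of the modest group sizes in the abstract, which is why the reported ORs (7.0-9.9) come with fairly broad uncertainty despite reaching significance.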
Abstract:
The comparison of radiotherapy techniques regarding secondary cancer risk has yielded contradictory results, possibly stemming from the many different approaches used to estimate risk. The purpose of this study was to make a comprehensive evaluation of the different available risk models applied to detailed whole-body dose distributions computed by Monte Carlo for various breast radiotherapy techniques, including conventional open tangents, 3D conformal wedged tangents and hybrid intensity modulated radiation therapy (IMRT). First, organ-specific linear risk models developed by the International Commission on Radiological Protection (ICRP) and the Biological Effects of Ionizing Radiation (BEIR) VII committee were applied to mean doses for remote organs only and for all solid organs. Then, different general non-linear risk models were applied to the whole-body dose distribution. Finally, organ-specific non-linear risk models for the lung and breast were used to assess the secondary cancer risk for these two specific organs. A total of 32 different calculated absolute risks resulted in a broad range of values (between 0.1% and 48.5%), underscoring the large uncertainties in absolute risk calculation. The ratio of risk between two techniques has often been proposed as a more robust assessment of risk than the absolute risk. We found that the ratio of risk between two techniques could also vary substantially depending on the approach to risk estimation. Sometimes the ratio of risk between two techniques would range between values smaller and larger than one, which then translates into inconsistent results on the potential higher risk of one technique compared to another. We found, however, that the hybrid IMRT technique resulted in a systematic reduction of risk compared to the other techniques investigated, even though the magnitude of this reduction varied substantially with the different approaches investigated.
Based on the epidemiological data available, a reasonable approach to risk estimation would be to use organ-specific non-linear risk models applied to the dose distributions of organs within or near the treatment fields (lungs and contralateral breast in the case of breast radiotherapy) as the majority of radiation-induced secondary cancers are found in the beam-bordering regions.
Abstract:
Genome-wide association studies (GWAS) have revealed genetic determinants of iron metabolism, but correlation of these with clinical phenotypes is pending. Homozygosity for HFE C282Y is the predominant genetic risk factor for hereditary hemochromatosis (HH) and may cause liver cirrhosis. However, this genotype has a low penetrance. Thus, detection of as yet unknown genetic markers that identify patients at risk of developing severe liver disease is necessary for better prevention. Genetic loci associated with iron metabolism (TF, TMPRSS6, PCSK7, TFR2 and Chr2p14) in recent GWAS and with liver fibrosis (PNPLA3) in a recent meta-analysis were analyzed for association with either liver cirrhosis or advanced fibrosis in 148 German HFE C282Y homozygotes. Replication of associations was sought in an additional 499 Austrian/Swiss and 112 Swedish HFE C282Y homozygotes. Only variant rs236918 in the PCSK7 gene (proprotein convertase subtilisin/kexin type 7) was associated with cirrhosis or advanced fibrosis (P = 1.02 × 10⁻⁵) in the German cohort, with genotypic odds ratios of 3.56 (95% CI 1.29-9.77) for CG heterozygotes and 5.38 (95% CI 2.39-12.10) for C allele carriers. The association between rs236918 and cirrhosis was confirmed in Austrian/Swiss HFE C282Y homozygotes (P = 0.014; allelic OR = 1.82, 95% CI 1.12-2.95) but not in Swedish patients. Post hoc combined analyses of German/Swiss/Austrian patients with available liver histology (N = 244, P = 0.00014, allelic OR = 2.84) and of males only (N = 431, P = 2.17 × 10⁻⁵, allelic OR = 2.54) were consistent with the premier finding. The association between rs236918 and cirrhosis was not confirmed in alcoholic cirrhotics, suggesting specificity of this genetic risk factor for HH. PCSK7 variant rs236918 is a risk factor for cirrhosis in HH patients homozygous for the HFE C282Y mutation.