950 results for value-at-risk (VaR)
Abstract:
This paper examines the interrelationship between law and lifestyle sports, viewed through the lens of parkour. We argue that the literature on legal approaches to lifestyle sport is currently underdeveloped, and we seek to partially fill this lacuna. Hitherto, we argue, the law has been viewed as a largely negative presence, seen particularly in the ways in which counter-cultural activities are policed and regulated, and in how such activities are framed as transgressive or undesirable. This is a somewhat unsophisticated take on how the law can operate, with law constructed as an outcome of constraints on behaviour (where the law authorises or prohibits), distinct from the legal contexts, environments and spaces in which these relationships occur. The distinctive settings in which lifestyle sports are practised need a more fine-grained analysis, as they are settings that bear, and bring to life, the laws and regulations that shape how space is to be experienced. We examine specifically the interrelationship between risk and benefit and how the law recognises issues of social utility or value, particularly within the context of lifestyle sport. We seek to move from user-centred constructions of law as an imposition to a more nuanced position that looks at parkour at the intersections of law, space and lifestyle sport, in order to reveal how law can be used to support and extend claims to space.
Abstract:
To compare time and risk to biochemical recurrence (BR) after radical prostatectomy in two chronologically different groups of patients using the standard and the modified Gleason system (MGS). Cohort 1 comprised biopsies of 197 patients graded according to the standard Gleason system (SGS) in the period 1997-2004, and cohort 2 comprised 176 biopsies graded according to the modified system in the period 2005-2011. Time to BR was analyzed with the Kaplan-Meier product-limit method, and prediction of shorter time to recurrence used univariate and multivariate Cox proportional hazards models. Patients in cohort 2 reflected time-related changes: a striking increase in clinical stage T1c, systematic use of extended biopsies, and a lower percentage of total cancer length (in millimeters) across all cores. The MGS used in cohort 2 yielded fewer biopsies with Gleason score ≤ 6 and more biopsies with the intermediate Gleason score 7. Kaplan-Meier curves for time to BR reached statistical significance with the MGS in cohort 2, but not with the SGS in cohort 1. Only the MGS predicted shorter time to BR on univariate analysis, and it remained an independent predictor on multivariate analysis. The results support the view that the 2005 International Society of Urological Pathology modified system is a refinement of Gleason grading and is valuable for contemporary clinical practice.
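The Kaplan-Meier product-limit method used in this abstract can be sketched in a few lines of plain Python; the times and event flags below are invented for illustration (time to biochemical recurrence in months, 1 = recurrence observed, 0 = censored), not the study's data:

```python
def kaplan_meier(times, events):
    """Product-limit estimator: return (time, survival) at each event time."""
    data = list(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        at_t = [e for tt, e in data if tt == t]   # subjects leaving at time t
        deaths = sum(at_t)                        # events (not censorings) at t
        if deaths:
            surv *= 1.0 - deaths / n_at_risk      # product-limit update
            curve.append((t, surv))
        n_at_risk -= len(at_t)                    # both events and censored leave
    return curve

# Hypothetical follow-up data:
times  = [6, 7, 10, 15, 19, 25]
events = [1, 0, 1,  1,  0,  1]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

Censored subjects (event = 0) still shrink the risk set but do not change the survival estimate, which is what distinguishes this estimator from a naive event-rate calculation.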
Abstract:
This study aimed at evaluating whether human papillomavirus (HPV) groups and E6/E7 mRNA of HPV 16, 18, 31, 33, and 45 are prognostic of cervical intraepithelial neoplasia (CIN) 2 outcome in women with a cervical smear showing a low-grade squamous intraepithelial lesion (LSIL). This cohort study included women with biopsy-confirmed CIN 2 who were followed up for 12 months, with cervical smear and colposcopy performed every three months. Women with a negative or low-risk HPV status showed 100% CIN 2 regression. The CIN 2 regression rates at the 12-month follow-up were 69.4% for women with alpha-9 HPV versus 91.7% for other HPV species or HPV-negative status (P < 0.05). For women with HPV 16, the CIN 2 regression rate at the 12-month follow-up was 61.4% versus 89.5% for other HPV types or HPV-negative status (P < 0.05). The CIN 2 regression rate was 68.3% for women who tested positive for HPV E6/E7 mRNA versus 82.0% for the negative results, but this difference was not statistically significant. The expectant management for women with biopsy-confirmed CIN 2 and previous cytological tests showing LSIL exhibited a very high rate of spontaneous regression. HPV 16 is associated with a higher CIN 2 progression rate than other HPV infections. HPV E6/E7 mRNA is not a prognostic marker of the CIN 2 clinical outcome, although this analysis cannot be considered conclusive. Given the small sample size, this study could be considered a pilot for future larger studies on the role of predictive markers of CIN 2 evolution.
Abstract:
To analyze associations between mammographic arterial mammary calcifications in menopausal women and risk factors for cardiovascular disease. This was a cross-sectional retrospective study in which we analyzed the mammograms and medical records of 197 patients treated between 2004 and 2005. Study variables were: breast arterial calcifications, stroke, acute coronary syndrome, age, obesity, diabetes mellitus, smoking, and hypertension. For statistical analysis, we used the Mann-Whitney, χ2 and Cochran-Armitage tests, and also evaluated the prevalence ratios between these variables and mammary artery calcifications. Data were analyzed with SAS version 9.1 software. In the group of 197 women, the prevalence of arterial calcifications on mammograms was 36.6%. Among the risk factors analyzed, the most frequent were hypertension (56.4%), obesity (31.9%), smoking (15.2%), and diabetes (14.7%). Acute coronary syndrome and stroke had prevalences of 5.6% and 2.0%, respectively. Among the mammograms of women with diabetes, the odds ratio of mammary artery calcifications was 2.1 (95%CI 1.0-4.1), with a p-value of 0.02. On the other hand, the mammograms of smokers showed a low occurrence of breast arterial calcification, with an odds ratio of 0.3 (95%CI 0.1-0.8). Hypertension, obesity, stroke and acute coronary syndrome were not significantly associated with breast arterial calcification. The occurrence of breast arterial calcification was positively associated with diabetes mellitus and negatively associated with smoking. The presence of calcification was independent of the other risk factors for cardiovascular disease analyzed.
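Odds ratios with 95% confidence intervals, of the kind quoted above, come from a 2x2 table with the standard Wald approximation on the log scale. A minimal sketch; the counts in the example call are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table.

    a, b = exposed with / without the outcome
    c, d = unexposed with / without the outcome
    Assumes all four cells are nonzero.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 10/5 calcified among exposed, 10/10 among unexposed.
or_, lo, hi = odds_ratio_ci(10, 5, 10, 10)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

The confidence interval is computed on the log-odds scale and exponentiated back, which is why it is asymmetric around the point estimate.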
Abstract:
The objectives of the study were to evaluate the performance of sentinel lymph node biopsy (SLNB) in detecting occult metastases in papillary thyroid carcinoma (PTC) and to correlate their presence to tumor and patient characteristics. Twenty-three clinically node-negative PTC patients (21 females, mean age 48.4 years) were prospectively enrolled. Patients were submitted to sentinel lymph node (SLN) lymphoscintigraphy prior to total thyroidectomy. Ultrasound-guided peritumoral injections of (99m)Tc-phytate (7.4 MBq) were performed. Cervical single-photon emission computed tomography and computed tomography (SPECT/CT) images were acquired 15 min after radiotracer injection and 2 h prior to surgery. Intra-operatively, SLNs were located with a gamma probe and removed along with non-SLNs located in the same neck compartment. Papillary thyroid carcinoma, SLNs and non-SLNs were submitted to histopathology analysis. Sentinel lymph nodes were located in levels: II in 34.7 % of patients; III in 26 %; IV in 30.4 %; V in 4.3 %; VI in 82.6 % and VII in 4.3 %. Metastases in the SLN were noted in seven patients (30.4 %), in non-SLN in three patients (13.1 %), and in the lateral compartments in 20 % of patients. There were significant associations between lymph node (LN) metastases and the presence of angio-lymphatic invasion (p = 0.04), extra-thyroid extension (p = 0.03) and tumor size (p = 0.003). No correlations were noted among LN metastases and patient age, gender, stimulated thyroglobulin levels, positive surgical margins, aggressive histology and multifocal lesions. Sentinel lymph node biopsy can detect occult metastases in PTC. The risk of a metastatic SLN was associated with extra-thyroid extension, larger tumors and angio-lymphatic invasion. This may help guide future neck dissection, patient surveillance and radioiodine therapy doses.
Abstract:
A retrospective cohort. To report the incidence rates of shoulder injuries diagnosed with magnetic resonance imaging (MRI) in tetraplegic athletes and sedentary tetraplegic individuals, and to evaluate whether sport practice increases the risk of shoulder injuries in tetraplegic individuals. Campinas, Sao Paulo, Brazil. Ten tetraplegic athletes with traumatic spinal cord injury were selected among quad rugby athletes and had both shoulders evaluated by MRI. They were compared with 10 sedentary tetraplegic individuals who were submitted to the same radiological protocol. All athletes were male with a mean age of 32.1 years (range 25-44 years, s.d.=6.44). Time since injury ranged from 6 to 17 years, with a mean value of 9.7 years and s.d. of 3.1 years. All sedentary individuals were male with a mean age of 35.9 years (range 22-47 years, s.d.=8.36). Statistical analysis showed a protective effect of sport in the development of shoulder injuries, with a weak correlation for infraspinatus and subscapularis tendinopathy (P=0.09 and P=0.08, respectively) and muscle atrophy (P=0.08). There was a strong correlation for acromioclavicular joint (ACJ) and labrum injuries (P=0.04), with sedentary individuals at a higher risk for these injuries. Tetraplegic athletes and sedentary individuals have a high incidence of supraspinatus tendinosis, bursitis and ACJ degeneration. Statistical analysis showed that there is a possible protective effect of sport in the development of shoulder injuries. Weak evidence was encountered for infraspinatus and subscapularis tendinopathy and muscle atrophy (P=0.09, P=0.08 and P=0.08, respectively). Strong evidence with P=0.04 suggests that sedentary tetraplegic individuals are at a greater risk for ACJ and labrum injuries. Spinal Cord advance online publication, 17 March 2015; doi:10.1038/sc.2014.248.
Abstract:
Background: Malaria is an important threat to travelers visiting endemic regions. The risk of acquiring malaria is complex, and a number of factors, including transmission intensity, duration of exposure, season of the year and use of chemoprophylaxis, have to be taken into account when estimating risk. Materials and methods: A mathematical model was developed to estimate the risk of a non-immune individual acquiring falciparum malaria when traveling to the Amazon region of Brazil. The risk of malaria infection to travelers was calculated as a function of duration of exposure and season of arrival. Results: The results suggest significant variation in risk for non-immune travelers depending on arrival season, duration of the visit and transmission intensity. The calculated risk for visitors staying longer than 4 months during peak transmission was 0.5% per visit. Conclusions: Risk estimates from mathematical models built on accurate data can be a valuable tool in assessing risks/benefits and costs/benefits when deciding on the value of interventions for travelers to malaria-endemic regions.
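A minimal sketch of the kind of model described, assuming a constant force of infection (hazard) h per month, so that cumulative risk over a stay of t months is 1 - exp(-h·t). The hazard calibration below is an assumption chosen to reproduce the 0.5%-per-visit figure quoted, not the paper's fitted parameter:

```python
import math

def infection_risk(hazard_per_month, months):
    """Cumulative infection probability under a constant hazard."""
    return 1.0 - math.exp(-hazard_per_month * months)

# Calibrate a hypothetical peak-season hazard so that a 4-month stay
# yields the 0.5%-per-visit risk cited in the abstract.
h_peak = -math.log(1 - 0.005) / 4

print(round(infection_risk(h_peak, 4), 4))  # → 0.005
print(round(infection_risk(h_peak, 1), 5))  # shorter stay, lower risk
```

A seasonal model would replace the constant h with a month-dependent hazard and sum the exposure over the months actually spent in the area, which is how arrival season enters the calculation.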
Abstract:
The TP53 tumor suppressor gene encodes a protein responsible for preventing cells with genetic damage from growing and dividing, by blocking cell growth or triggering apoptosis pathways. A common single nucleotide polymorphism (SNP) in TP53 codon 72 (Arg72Pro) induces a 15-fold decrease in apoptosis-inducing ability and has been associated with susceptibility to human cancers. Recently, another TP53 SNP at codon 47 (Pro47Ser) was reported to have a low apoptosis-inducing ability; however, there are no association studies between this SNP and cancer. Aiming to study the role of TP53 Pro47Ser and Arg72Pro in glioma susceptibility and oncologic prognosis, we investigated the genotype distribution of these SNPs in 94 gliomas (81 astrocytomas, 8 ependymomas and 5 oligodendrogliomas) and in 100 healthy subjects by the polymerase chain reaction-restriction fragment length polymorphism approach. Chi-square and Fisher exact test comparisons of genotype distributions and allele frequencies did not reveal any significant difference between patient and control groups. Overall and disease-free survivals were calculated by the Kaplan-Meier method, and the log-rank test was used for comparisons, but no significant statistical difference was observed between the two groups. Our data suggest that the TP53 Pro47Ser and Arg72Pro SNPs are not involved either in susceptibility to developing gliomas or in patient survival, at least in the Brazilian population.
Abstract:
This paper describes the modeling of a weed infestation risk inference system that implements a collaborative inference scheme based on rules extracted from two Bayesian network classifiers. The first Bayesian classifier infers a categorical variable value for weed-crop competitiveness using as input categorical variables for the total density of weeds and the corresponding proportions of narrow- and broad-leaved weeds. The inferred categorical values for weed-crop competitiveness, along with three other categorical variables extracted from estimated maps of weed seed production and weed coverage, are then used as input for a second Bayesian network classifier to infer categorical values for the risk of infestation. Weed biomass and yield loss data samples are used to learn, in a supervised fashion, the probability relationships among the nodes of the first and second Bayesian classifiers, respectively. For comparison purposes, two types of Bayesian network structures are considered, namely an expert-based Bayesian classifier and a naive Bayes classifier. The inference system focuses on knowledge interpretation by translating a Bayesian classifier into a set of classification rules. The results obtained for risk inference in a corn-crop field are presented and discussed. (C) 2009 Elsevier Ltd. All rights reserved.
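A toy version of the naive Bayes structure mentioned for comparison can be written with only the standard library. The categorical inputs (weed density, leaf type) and risk labels below are invented for illustration; this is a sketch of the technique, not the paper's trained model:

```python
from collections import Counter, defaultdict

def train_nb(rows, labels):
    """Count class priors and per-feature conditional frequencies."""
    prior = Counter(labels)
    cond = defaultdict(Counter)          # keyed by (feature_index, label)
    for row, y in zip(rows, labels):
        for i, v in enumerate(row):
            cond[(i, y)][v] += 1
    return prior, cond, len(labels)

def predict_nb(model, row):
    """Pick the label maximizing prior * product of conditionals."""
    prior, cond, n = model
    best, best_p = None, -1.0
    for y, cnt in prior.items():
        p = cnt / n
        for i, v in enumerate(row):
            # simple Laplace-style smoothing so unseen values don't zero out
            p *= (cond[(i, y)][v] + 1) / (cnt + 2)
        if p > best_p:
            best, best_p = y, p
    return best

rows = [("high", "broad"), ("high", "narrow"),
        ("low", "broad"),  ("low", "narrow")]
labels = ["high_risk", "high_risk", "low_risk", "low_risk"]
model = train_nb(rows, labels)
print(predict_nb(model, ("high", "broad")))  # → high_risk
```

The "collaborative" scheme in the paper chains two such classifiers, feeding the first classifier's inferred competitiveness value into the second as one of its input variables.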
Abstract:
A novel methodology to assess the risk of power transformer failures caused by external faults, such as short-circuits, taking the condition of the paper insulation into account, is presented. The risk index is obtained by contrasting the insulation paper condition with the probability that the transformer withstands the short-circuit current flowing along the winding during an external fault. In order to assess the risk, this probability and the value of the degree of polymerization of the insulating paper are used as inputs to a type-2 fuzzy logic system (T2-FLS), which computes the fuzzy risk level. A Monte Carlo simulation has been used to find the survival function of the currents flowing through the transformer winding during a single-phase or a three-phase short-circuit. The Roy Billinton Test System and a real power system have been used to test the results. (C) 2008 Elsevier B.V. All rights reserved.
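The Monte Carlo step can be sketched as follows: draw many fault-current magnitudes and estimate the survival function P(I > i) empirically. The lognormal current distribution and its parameters below are assumptions for illustration, not taken from the paper:

```python
import random

def survival_function(samples, threshold):
    """Empirical P(X > threshold) from Monte Carlo samples."""
    return sum(1 for x in samples if x > threshold) / len(samples)

random.seed(42)
# Hypothetical per-unit fault-current magnitudes during an external fault.
currents = [random.lognormvariate(1.0, 0.3) for _ in range(100_000)]

# Probability the winding sees a fault current above a given withstand level;
# in the paper this probability is one input to the type-2 fuzzy risk system.
p_exceed = survival_function(currents, 3.0)
print(round(p_exceed, 3))
```

With the degree of polymerization of the paper as the second input, the fuzzy system maps the pair (withstand probability, insulation condition) to a risk level.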
Abstract:
Although a new protocol of dobutamine stress echocardiography with the early injection of atropine (EA-DSE) has been demonstrated to be useful in reducing adverse effects and increasing the number of effective tests, and to have accuracy similar to conventional protocols for detecting coronary artery disease (CAD), no data exist regarding its ability to predict long-term events. The aim of this study was to determine the prognostic value of EA-DSE and the effects of the long-term use of beta blockers on it. A retrospective evaluation of 844 patients who underwent EA-DSE for known or suspected CAD was performed; 309 (37%) were receiving beta blockers. During a median follow-up period of 24 months, 102 events (12%) occurred. On univariate analysis, predictors of events were the ejection fraction (p <0.001), male gender (p <0.001), previous myocardial infarction (p <0.001), angiotensin-converting enzyme inhibitor therapy (p = 0.021), calcium channel blocker therapy (p = 0.034), and abnormal results on EA-DSE (p <0.001). On multivariate analysis, the independent predictors of events were male gender (relative risk [RR] 1.78, 95% confidence interval [CI] 1.13 to 2.81, p = 0.013) and abnormal results on EA-DSE (RR 4.45, 95% CI 2.84 to 7.01, p <0.0001). Normal results on EA-DSE with beta blockers were associated with a nonsignificant higher incidence of events than normal results on EA-DSE without beta blockers (RR 1.29, 95% CI 0.58 to 2.87, p = 0.54). Abnormal results on EA-DSE with beta blockers had an RR of 4.97 (95% CI 2.79 to 8.87, p <0.001) compared with normal results, while abnormal results on EA-DSE without beta blockers had an RR of 5.96 (95% CI 3.41 to 10.44, p <0.001) for events, with no difference between groups (p = 0.36). In conclusion, the detection of fixed or inducible wall motion abnormalities during EA-DSE was an independent predictor of long-term events in patients with known or suspected CAD.
The prognostic value of EA-DSE was not affected by the long-term use of beta blockers. (C) 2008 Elsevier Inc. All rights reserved. (Am J Cardiol 2008;102:1291-1295)
Abstract:
Background. Many resource-limited countries rely on clinical and immunological monitoring without routine virological monitoring for human immunodeficiency virus (HIV)-infected children receiving highly active antiretroviral therapy (HAART). We assessed whether HIV load had independent predictive value in the presence of immunological and clinical data for the occurrence of new World Health Organization (WHO) stage 3 or 4 events (hereafter, WHO events) among HIV-infected children receiving HAART in Latin America. Methods. The NISDI (Eunice Kennedy Shriver National Institute of Child Health and Human Development International Site Development Initiative) Pediatric Protocol is an observational cohort study designed to describe HIV-related outcomes among infected children. Eligibility criteria for this analysis included perinatal infection, age <15 years, and continuous HAART for >= 6 months. Cox proportional hazards modeling was used to assess time to new WHO events as a function of immunological status, viral load, hemoglobin level, and potential confounding variables; laboratory tests repeated during the study were treated as time-varying predictors. Results. The mean duration of follow-up was 2.5 years; new WHO events occurred in 92 (15.8%) of 584 children. In proportional hazards modeling, a most recent viral load >5000 copies/mL was associated with a nearly doubled risk of developing a WHO event (adjusted hazard ratio, 1.81; 95% confidence interval, 1.05-3.11; P = .033), even after adjustment for immunological status defined on the basis of CD4 T lymphocyte value, hemoglobin level, age, and body mass index. Conclusions. Routine virological monitoring using the WHO virological failure threshold of 5000 copies/mL adds independent predictive value to immunological and clinical assessments for identification of children receiving HAART who are at risk for significant HIV-related illness.
To provide optimal care, periodic virological monitoring should be considered for all settings that provide HAART to children.
Abstract:
Background We validated a strategy for diagnosis of coronary artery disease (CAD) and prediction of cardiac events in high-risk renal transplant candidates (at least one of the following: age >= 50 years, diabetes, cardiovascular disease). Methods A diagnosis and risk assessment strategy was used in 228 renal transplant candidates to validate an algorithm. Patients underwent dipyridamole myocardial stress testing and coronary angiography and were followed up until death, renal transplantation, or cardiac events. Results The prevalence of CAD was 47%. Stress testing did not detect significant CAD in 1/3 of patients. The sensitivity, specificity, and positive and negative predictive values of the stress test for detecting CAD were 70, 74, 69, and 71%, respectively. CAD, defined by angiography, was associated with an increased probability of cardiac events [log-rank P = 0.001; hazard ratio: 1.90, 95% confidence interval (CI): 1.29-2.92]. Diabetes (P=0.03; hazard ratio: 1.58, 95% CI: 1.06-2.45) and angiographically defined CAD (P=0.03; hazard ratio: 1.69, 95% CI: 1.08-2.78) were the independent predictors of events. Conclusion The results validate our observations in a smaller number of high-risk transplant candidates and indicate that stress testing is not appropriate for the diagnosis of CAD or prediction of cardiac events in this group of patients. Coronary angiography was correlated with events but, because less than 50% of patients had significant disease, it seems premature to recommend the test to all high-risk renal transplant candidates. The results suggest that angiography is necessary in many high-risk renal transplant candidates and that better noninvasive methods are still lacking to identify with precision the patients who will benefit from invasive procedures. Coron Artery Dis 21: 164-167 (C) 2010 Wolters Kluwer Health | Lippincott Williams & Wilkins.
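The four test-performance figures quoted (sensitivity, specificity, positive and negative predictive values) all derive from a single 2x2 confusion matrix. A minimal helper; the counts in the example call are illustrative, not reconstructed from the study:

```python
def diagnostic_metrics(tp, fn, fp, tn):
    """Standard diagnostic test metrics from confusion-matrix counts.

    tp/fn: diseased patients with positive/negative test results
    fp/tn: disease-free patients with positive/negative test results
    """
    return {
        "sensitivity": tp / (tp + fn),   # P(test+ | disease)
        "specificity": tn / (tn + fp),   # P(test- | no disease)
        "ppv":         tp / (tp + fp),   # P(disease | test+)
        "npv":         tn / (tn + fn),   # P(no disease | test-)
    }

# Illustrative counts for a cohort with roughly 47% disease prevalence:
m = diagnostic_metrics(tp=75, fn=32, fp=31, tn=90)
for name, value in m.items():
    print(name, round(100 * value, 1))
```

Unlike sensitivity and specificity, the predictive values depend on disease prevalence, which is why the abstract's 47% CAD prevalence matters when interpreting the 69% and 71% figures.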
Abstract:
Objective: To evaluate physicians' attitudes and adherence to the use of risk scores in the primary prevention of cardiovascular disease (CVD). Design and methods: A cross-sectional survey of 2056 physicians involved in the primary prevention of CVD. Participants included cardiologists (47%), general practitioners (42%), and endocrinologists (11%) from several geographical regions: Brazil (n=968), USA (n=381), Greece (n=275), Chile (n=157), Venezuela (n=128), Portugal (n=42), The Netherlands (n=41), and Central America (Costa Rica, Panama, El Salvador and Guatemala; n=64). Results: The main outcome measure was the percentage of responses on a multiple-choice questionnaire describing a hypothetical asymptomatic patient at intermediate risk for CVD according to the Framingham Risk Score. Only 48% of respondents reported regular use of CVD risk scores to tailor preventive treatment in the case scenario. Of non-users, nearly three-quarters indicated that "It takes up too much of my time" (52%) or "I don't believe they add value to the clinical evaluation" (21%). Only 56% of respondents indicated that they would prescribe lipid-lowering therapy for the hypothetical intermediate-risk patient. A significantly greater proportion of regular users than non-users of CVD risk scores identified the need for lipid-lowering therapy in the hypothetical patient (59 vs. 41%; p<0.0001).