26 results for Risk evaluation
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
BACKGROUND: Cardiovascular disease (CVD) occurs more frequently in individuals with a family history of premature CVD. Within families, the demographics of CVD are poorly described. DESIGN: We examined risk estimation based on the Systematic Coronary Risk Evaluation (SCORE) system and the Joint British Guidelines (JBG) for older unaffected siblings of patients with premature CVD (onset ≤55 years for men and ≤60 years for women). METHODS: Between August 1999 and November 2003, laboratory and demographic details were collected on probands with early-onset CVD and their older unaffected siblings. Siblings were screened for clinically overt CVD by a standard questionnaire and 12-lead electrocardiogram (ECG). RESULTS: A total of 790 siblings were identified and full demographic details were available for 645. The following siblings were excluded: 41 with known diabetes mellitus; seven with random plasma glucose of 11.1 mmol/l or greater; and eight with ischaemic ECG. Data were analysed for 589 siblings from 405 families. The mean age was 55.0 years, 43.1% were men and 28.7% were smokers. The mean total serum cholesterol was 5.8 mmol/l and hypertension was present in 49.4%. Using the SCORE system, when projected to age 60 years, 181 men (71.3%) and 67 women (20.0%) would be eligible for risk factor modification. Using JBG with a 10-year risk of 20% or greater, 42 men (16.5%) and four women (1.2%) would be targeted. CONCLUSIONS: Large numbers of these asymptomatic individuals meet both European and British guidelines for the primary prevention of CVD and should be targeted for risk factor modification. The prevalence of individuals defined as eligible for treatment is much higher when using the SCORE system. © 2007 European Society of Cardiology.
Abstract:
Ancient stone monuments (ASMs), such as standing stones and rock art panels, are powerful and iconic expressions of Britain's rich prehistoric past that have major economic and tourism value. However, ASMs are under pressure due to increasing anthropogenic exposure and changing climatic conditions, which accelerate their rates of disrepair. Although scientific data exist on the integrity of stone monuments, most apply to "built" systems; therefore, additional work specific to ASMs in the countryside is needed to develop better-informed safeguarding strategies. Here, we use Neolithic and Bronze Age rock art panels across Northern England as a case study for delineating ASM management actions required to enhance monument preservation. The state of the rock art is described first, including the factors that led to current conditions. Rock art management approaches are then described within the context of future environments, which models suggest will be more dynamic and locally variable. Finally, a Condition Assessment and Risk Evaluation (CARE) scheme is proposed to help prioritise interventions; an example is provided based on stone deterioration at Petra in Jordan. We conclude that more focused scientific and behavioural data, specific to deterioration mechanisms, are required for an ASM CARE scheme to be successful.
Abstract:
The environmental quality of land is often assessed by the calculation of threshold values which aim to differentiate between concentrations of elements based on whether the soils are in residential or industrial sites. In Europe, for example, soil guideline values exist for agricultural and grazing land. A threshold is often set to differentiate between concentrations of an element that occur naturally in the soil and concentrations that result from diffuse anthropogenic sources. Regional geochemistry and, in particular, single component geochemical maps are increasingly being used to determine these baseline environmental assessments. The key question raised in this paper is whether the geochemical map can provide an accurate interpretation on its own. Implicit is the assumption that single component geochemical maps represent absolute abundances. However, because of the compositional (closed) nature of the data, univariate geochemical maps cannot be compared directly with one another. As a result, any interpretation based on them is vulnerable to spurious correlation problems. What does this mean for soil geochemistry mapping, baseline quality documentation, soil resource assessment or risk evaluation? Despite the limitation of relative abundances, individual raw geochemical maps are deemed fundamental to several applications, including environmental assessments. However, element toxicity is related to bioavailable concentration, which is lowered if the element's source is mixed with another source. Elements also interact: under reducing conditions, for example, iron oxides dissolve and arsenic bound to them becomes soluble and mobile. Both of these matters may be more adequately dealt with if a single component map is not interpreted in isolation when determining baseline and threshold assessments.
A range of alternative compositionally compliant representations based on log-ratio and log-contrast approaches are explored to supplement the classical single component maps for environmental assessment. Case study examples are shown based on the Tellus soil geochemical dataset, covering Northern Ireland, and the results of in vitro oral bioaccessibility testing carried out on a sub-set of archived Tellus Survey shallow soils following the Unified BARGE Method (Bioaccessibility Research Group of Europe).
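The log-ratio idea mentioned above can be illustrated with a minimal sketch. The centered log-ratio (CLR) transform is one standard compositionally compliant representation: each component is expressed relative to the geometric mean of its row, removing the closure constraint that makes raw single-component maps misleading. The soil values below are hypothetical, purely for illustration; this is not the Tellus workflow itself.

```python
import numpy as np

def clr(compositions):
    """Centered log-ratio transform for compositional (closed) data.

    Each row is a composition (parts of a whole, e.g. element
    concentrations). CLR maps x to log(x_i / g(x)), where g(x) is the
    row's geometric mean, so components can be compared without the
    spurious correlations induced by the constant-sum constraint.
    """
    x = np.asarray(compositions, dtype=float)
    log_x = np.log(x)
    return log_x - log_x.mean(axis=1, keepdims=True)

# Hypothetical three-component soil compositions (fractions of total)
soil = np.array([
    [0.70, 0.20, 0.10],
    [0.50, 0.30, 0.20],
])
z = clr(soil)
# Each CLR row sums to zero by construction
print(np.allclose(z.sum(axis=1), 0.0))  # → True
```

Because CLR values are relative, a map of one CLR component shows enrichment or depletion against the other measured elements rather than an apparent absolute abundance.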
Abstract:
OBJECTIVE: Accelerated atherosclerosis and premature coronary heart disease (CHD) are recognized complications of systemic lupus erythematosus (SLE), but the exact etiology remains unclear and is likely to be multifactorial. We hypothesized that SLE patients with CHD have increased exposure to traditional risk factors as well as differing disease phenotype and therapy-related factors compared to SLE patients free of CHD. Our aim was to examine risk factors for development of clinical CHD in SLE in the clinical setting. METHODS: In a UK-wide multicenter retrospective case-control study we recruited 53 SLE patients with verified clinical CHD (myocardial infarction or angina pectoris) and 96 SLE patients without clinical CHD. Controls were recruited from the same center as the case and matched by disease duration. Charts were reviewed up to the time of event for cases, or the same "dummy-date" in controls. RESULTS: SLE patients with clinical CHD were older at the time of event [mean (SD) 53 (10) vs 42 (10) yrs; p
Abstract:
BACKGROUND: In this study we aimed to evaluate the role of a SNP in intron I of the ERCC4 gene (rs744154), previously reported to be associated with a reduced risk of breast cancer in the general population, as a breast cancer risk modifier in BRCA1 and BRCA2 mutation carriers.
Abstract:
Background: Serious case reviews and research studies have indicated weaknesses in risk assessments conducted by child protection social workers. Social workers are adept at gathering information but struggle with analysis and assessment of risk. The Department for Education wants to know if the use of a structured decision-making tool can improve child protection assessments of risk.
Methods/design: This multi-site, cluster-randomised trial will assess the effectiveness of the Safeguarding Children Assessment and Analysis Framework (SAAF). This structured decision-making tool aims to improve social workers' assessments of harm, of future risk and parents' capacity to change. The comparison is management as usual.
Inclusion criteria: Children's Services Departments (CSDs) in England willing to make relevant teams available to be randomised, and willing to meet the trial's training and data collection requirements.
Exclusion criteria: CSDs where there were concerns about performance; where a major organisational restructuring was planned or under way; or where other risk assessment tools were in use.
Six CSDs are participating in this study. Social workers in the experimental arm will receive two days of training in SAAF, together with a range of support materials and access to limited telephone consultation post-training. The primary outcome is child maltreatment. This will be assessed using data collected nationally on two key performance indicators: the first is the number of children in a year who have been subject to a second Child Protection Plan (CPP); the second is the number of re-referrals of children because of related concerns about maltreatment. Secondary outcomes are: i) the quality of assessments judged against a schedule of quality criteria and ii) the relationship between the three assessments required by the structured decision-making tool (level of harm, risk of (re)abuse and prospects for successful intervention).
Discussion: This is the first study to examine the effectiveness of SAAF. It will contribute to a very limited literature on the contribution that structured decision-making tools can make to improving risk assessment and case planning in child protection and on what is involved in their effective implementation.
Abstract:
AIMS/HYPOTHESIS: Recent studies suggest that oxidative stress should be monitored alongside HbA(1c) to identify subgroups of diabetic patients at high risk of initiation or progression of retinopathy. The acrolein-derived advanced lipoxidation end-product (ALE), Nε-(3-formyl-3,4-dehydropiperidino)lysine (FDP-lysine), is a useful biomarker that reflects the cumulative burden of oxidative stress over long periods of time. The purpose of the present study was to investigate whether serum and haemoglobin levels of FDP-lysine are associated with the severity of diabetic retinopathy in type 1 and type 2 diabetic patients.
METHODS: Serum and haemoglobin levels of FDP-lysine were measured by competitive ELISA in 59 type 1 and 76 type 2 diabetic patients with no retinopathy, non-proliferative retinopathy or proliferative retinopathy (mean age [+/-SEM] 54.3 +/- 1.3 years), and in 47 non-diabetic control individuals (mean age 51.9 +/- 2.1 years).
RESULTS: Serum and haemoglobin levels of FDP-lysine were significantly increased in diabetic patients compared with control individuals (p = 0.04 and p = 0.002, respectively). However, no significant association was found between levels of serum FDP-lysine and the severity of diabetic retinopathy (p = 0.97). In contrast, increased haemoglobin FDP-lysine levels were observed in patients with proliferative retinopathy compared with patients without retinopathy and with non-proliferative retinopathy (p = 0.04). The relationship of FDP-lysine with proliferative retinopathy was unaltered after adjustment for HbA(1c), or other clinical parameters.
CONCLUSIONS/INTERPRETATION: Our data suggest that haemoglobin FDP-lysine may provide a useful risk marker for the development of proliferative diabetic retinopathy independently of HbA(1c), and that elevated intracellular ALE formation may be involved in the pathogenesis of this sight-threatening complication of diabetes.
Abstract:
BACKGROUND: HIV microbicide trials have emphasized the need to evaluate the safety of topical microbicides and delivery platforms in an animal model prior to conducting clinical efficacy trials. An ideal delivery device should provide sustainable and sufficient concentrations of effective products to prevent HIV transmission while not increasing transmission risk by either local mucosal inflammation and/or disruption of the normal vaginal microflora.
METHODS: Safety analyses of macaque-sized elastomeric silicone and polyurethane intravaginal rings (IVRs) loaded with candidate antiretroviral (ARV) drugs were tested in four studies ranging in duration from 49 to 73 days with retention of the IVR being 28 days in each study. Macaques were assigned to 3 groups; blank IVR, ARV-loaded IVR, and naïve. In sequential studies, the same macaques were used but rotated into different groups. Mucosal and systemic levels of cytokines were measured from vaginal fluids and plasma, respectively, using multiplex technology. Changes in vaginal microflora were also monitored. Statistical analysis (Mann-Whitney test) was used to compare data between two groups of unpaired samples (with and without IVR, and IVR with and without ARV) for the groups collectively, and also for individual macaques.
RESULTS: There were few statistically significant differences in mucosal and systemic cytokine levels measured longitudinally when the ring was present or absent, with or without ARVs. Of the 8 proinflammatory cytokines assayed, a significant increase (p = 0.015) was observed only for IL-8 in plasma with the blank and ARV-loaded IVRs (median of 9.2 vs. 5.7 pg/ml in the absence of an IVR). There were no significant differences in the prevalence of H2O2-producing lactobacilli or viridans streptococci, or other microorganisms indicative of healthy vaginal microflora. However, there was an increase in the number of anaerobic gram-negative rods in the presence of the IVR (p < 0.0001).
CONCLUSIONS: IVRs, with or without ARVs, did not significantly induce the majority of potentially harmful proinflammatory cytokines locally or systemically, nor did they alter lactobacillus or G. vaginalis levels. The increase in anaerobic gram-negative rods alone suggests minimal disruption of normal vaginal microflora. The use of IVRs as a long-term sustained delivery device for ARVs is promising, and preclinical studies to demonstrate the prevention of transmission in the HIV/SHIV nonhuman primate model should continue.
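The unpaired two-group comparison described in the methods above can be sketched with SciPy's Mann-Whitney U test. The cytokine values here are hypothetical (pg/ml readings with and without a ring in place), purely to show the shape of the analysis, not the study's actual data.

```python
import numpy as np
from scipy import stats

# Hypothetical plasma cytokine measurements (pg/ml) for two unpaired
# groups of samples: without and with an intravaginal ring (IVR).
without_ivr = np.array([5.1, 5.7, 6.0, 4.8, 5.5, 6.2, 5.3, 5.9])
with_ivr = np.array([8.9, 9.2, 7.8, 10.1, 9.5, 8.4, 9.0, 9.7])

# Two-sided nonparametric test; a small p-value indicates the two
# groups' distributions differ.
u_stat, p_value = stats.mannwhitneyu(without_ivr, with_ivr,
                                     alternative="two-sided")
print(p_value < 0.05)  # → True for these clearly separated groups
```

A nonparametric test like this is a common choice for cytokine data, which are often skewed and measured in small samples where normality cannot be assumed.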
Evaluation of Five Interleukin Genes for Association with End-Stage Renal Disease in White Europeans
Abstract:
Background: Genetic variation within interleukin genes has been reported to be associated with end-stage renal disease (ESRD). These findings have not been consistently replicated. No study has yet reported a comprehensive investigation of the IL1A, IL1B, IL1RN, IL6 and IL10 genes. Methods: 664 kidney transplant recipients (cases) and 577 kidney donors (controls) were genotyped to establish whether common variants in interleukin genes are associated with ESRD. Single nucleotide polymorphism (SNP) genotype data for each gene were downloaded for a northern and western European population from the International HapMap Project. Haploview was used to visualize linkage disequilibrium and select tag SNPs. Thirty SNPs were genotyped using MassARRAY® iPLEX Gold technology and data were analyzed using the χ² test for trend. Independent replication was conducted in 1,269 individuals with similar phenotypic characteristics. Results: Investigating all common variants in the IL1A, IL1B, IL1RN, IL6 and IL10 genes revealed a statistically significant association (rs452204, empirical p = 0.02) between one IL1RN variant and ESRD. This IL1RN SNP tags three other variants, none of which has previously been reported to be associated with renal disease. Independent replication in a separate transplant population of comparable size did not confirm the original observation. Conclusions: Common variants in these five candidate interleukin genes are not major risk factors for ESRD in white Europeans. © 2010 S. Karger AG, Basel
Abstract:
We propose two simple evaluation methods for time varying density forecasts of continuous higher dimensional random variables. Both methods are based on the probability integral transformation for unidimensional forecasts. The first method tests multinormal densities and relies on the rotation of the coordinate system. The advantage of the second method is not only its applicability to any continuous distribution but also the evaluation of the forecast accuracy in specific regions of its domain as defined by the user’s interest. We show that the latter property is particularly useful for evaluating a multidimensional generalization of the Value at Risk. In simulations and in an empirical study, we examine the performance of both tests.
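The probability integral transformation that both methods above build on can be sketched in one dimension. If the forecast CDF F is correctly specified, then u_t = F(y_t) should be i.i.d. uniform on [0, 1], which can be checked directly. The standard-normal forecast and simulated outcomes below are illustrative assumptions, not the paper's tests.

```python
import numpy as np
from scipy import stats

# Probability integral transform (PIT) check for a univariate density
# forecast: outcomes drawn from the forecast distribution itself, so
# the PIT values should look uniform on [0, 1].
rng = np.random.default_rng(0)
y = rng.standard_normal(1000)   # realised outcomes
u = stats.norm.cdf(y)           # PIT under the (correct) forecast CDF

# Under correct specification the PIT values lie in [0, 1] with mean
# near 0.5; a formal uniformity test (e.g. Kolmogorov-Smirnov on u)
# would be applied in practice.
print(u.min() >= 0.0 and u.max() <= 1.0 and abs(u.mean() - 0.5) < 0.05)
```

A misspecified forecast (e.g. wrong variance) shows up as a U-shaped or hump-shaped PIT histogram; the paper's second method extends this idea to assess accuracy only in user-specified regions of the domain, such as the tail relevant for Value at Risk.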
Abstract:
Side-channel attacks (SCA) threaten electronic cryptographic devices and can be carried out by monitoring the physical characteristics of security circuits. Differential Power Analysis (DPA) is one of the most widely studied side-channel attacks. Numerous countermeasure techniques, such as Random Delay Insertion (RDI), have been proposed to reduce the risk of DPA attacks against cryptographic devices. The RDI technique was first proposed for microprocessors, but it was shown to be unsuccessful when implemented on smartcards, as it was vulnerable to a variant of the DPA attack known as the Sliding-Window DPA attack. Previous research by the authors investigated the use of the RDI countermeasure for Field Programmable Gate Array (FPGA) based cryptographic devices. A split-RDI technique was proposed to improve the security of the RDI countermeasure. A set of critical parameters was also proposed that could be utilized in the design stage to optimize a security algorithm design with RDI in terms of area, speed and power. The authors also showed that RDI is an efficient countermeasure technique on FPGA in comparison to other countermeasures. In this article, a new RDI logic design is proposed that can be used to cost-efficiently implement RDI on FPGA devices. Sliding-Window DPA and realignment attacks, which were shown to be effective against RDI implemented on smartcard devices, are performed on the improved RDI FPGA implementation. We demonstrate that these attacks are unsuccessful, and we also propose a realignment technique that can be used to demonstrate the weakness of RDI implementations.
Abstract:
OBJECTIVES: To determine effective and efficient monitoring criteria for ocular hypertension [raised intraocular pressure (IOP)] through (i) identification and validation of glaucoma risk prediction models; and (ii) development of models to determine optimal surveillance pathways.
DESIGN: A discrete event simulation economic modelling evaluation. Data from systematic reviews of risk prediction models and agreement between tonometers, secondary analyses of existing datasets (to validate identified risk models and determine optimal monitoring criteria) and public preferences were used to structure and populate the economic model.
SETTING: Primary and secondary care.
PARTICIPANTS: Adults with ocular hypertension (IOP > 21 mmHg) and the public (surveillance preferences).
INTERVENTIONS: We compared five pathways: two based on National Institute for Health and Clinical Excellence (NICE) guidelines with monitoring interval and treatment depending on initial risk stratification, 'NICE intensive' (4-monthly to annual monitoring) and 'NICE conservative' (6-monthly to biennial monitoring); two pathways, differing in location (hospital and community), with monitoring biennially and treatment initiated for a ≥ 6% 5-year glaucoma risk; and a 'treat all' pathway involving treatment with a prostaglandin analogue if IOP > 21 mmHg and IOP measured annually in the community.
MAIN OUTCOME MEASURES: Glaucoma cases detected; tonometer agreement; public preferences; costs; willingness to pay and quality-adjusted life-years (QALYs).
RESULTS: The best available glaucoma risk prediction model estimated the 5-year risk based on age and ocular predictors (IOP, central corneal thickness, optic nerve damage and index of visual field status). Taking the average of two IOP readings by tonometry, true change was detected at two years. Sizeable measurement variability was noted between tonometers. There was a general public preference for monitoring; good communication and understanding of the process predicted service value. 'Treat all' was the least costly and 'NICE intensive' the most costly pathway. Biennial monitoring reduced the number of cases of glaucoma conversion compared with a 'treat all' pathway and provided more QALYs, but the incremental cost-effectiveness ratio (ICER) was considerably more than £30,000. The 'NICE intensive' pathway also avoided glaucoma conversion, but NICE-based pathways were either dominated (more costly and less effective) by biennial hospital monitoring or had ICERs > £30,000. Results were not sensitive to the risk threshold for initiating surveillance but were sensitive to the risk threshold for initiating treatment, NHS costs and treatment adherence.
LIMITATIONS: Optimal monitoring intervals were based on IOP data. There were insufficient data to determine the optimal frequency of measurement of the visual field or optic nerve head for identification of glaucoma. The economic modelling took a 20-year time horizon which may be insufficient to capture long-term benefits. Sensitivity analyses may not fully capture the uncertainty surrounding parameter estimates.
CONCLUSIONS: For confirmed ocular hypertension, findings suggest that there is no clear benefit from intensive monitoring. Consideration of the patient experience is important. A cohort study is recommended to provide data to refine the glaucoma risk prediction model, determine the optimum type and frequency of serial glaucoma tests and estimate costs and patient preferences for monitoring and treatment.
FUNDING: The National Institute for Health Research Health Technology Assessment Programme.
Abstract:
Objectives: To assess whether open angle glaucoma (OAG) screening meets the UK National Screening Committee criteria, to compare screening strategies with case finding, to estimate test parameters, to model estimates of cost and cost-effectiveness, and to identify areas for future research. Data sources: Major electronic databases were searched up to December 2005. Review methods: Screening strategies were developed by wide consultation. Markov submodels were developed to represent screening strategies. Parameter estimates were determined by systematic reviews of epidemiology, economic evaluations of screening, and effectiveness (test accuracy, screening and treatment). Tailored, highly sensitive electronic searches were undertaken. Results: Most potential screening tests reviewed had an estimated specificity of 85% or higher. No test was clearly most accurate, with only a few, heterogeneous studies for each test. No randomised controlled trials (RCTs) of screening were identified. Based on two treatment RCTs, early treatment reduces the risk of progression. Extrapolating from this, and assuming accelerated progression with advancing disease severity, without treatment the mean time to blindness in at least one eye was approximately 23 years, compared to 35 years with treatment. Prevalence would have to be about 3-4% in 40-year-olds with a screening interval of 10 years to approach cost-effectiveness. It is predicted that screening might be cost-effective in a 50-year-old cohort at a prevalence of 4% with a 10-year screening interval. General population screening at any age thus appears not to be cost-effective. Selective screening of groups with higher prevalence (family history, black ethnicity) might be worthwhile, although this would cover only 6% of the population. Extension to include other at-risk cohorts (e.g. myopia and diabetes) would include 37% of the general population, but the prevalence is then too low for screening to be considered cost-effective.
Screening using a test with initial automated classification followed by assessment by a specialised optometrist, for test positives, was more cost-effective than initial specialised optometric assessment. The cost-effectiveness of the screening programme was highly sensitive to the perspective on costs (NHS or societal). In the base-case model, the NHS costs of visual impairment were estimated as £669. If annual societal costs were £8800, then screening might be considered cost-effective for a 40-year-old cohort with 1% OAG prevalence assuming a willingness to pay of £30,000 per quality-adjusted life-year. Of lesser importance were changes to estimates of attendance for sight tests, incidence of OAG, rate of progression and utility values for each stage of OAG severity. Cost-effectiveness was not particularly sensitive to the accuracy of screening tests within the ranges observed. However, a highly specific test is required to reduce large numbers of false-positive referrals. The findings that population screening is unlikely to be cost-effective are based on an economic model whose parameter estimates have considerable uncertainty, in particular, if rate of progression and/or costs of visual impairment are higher than estimated then screening could be cost-effective. Conclusions: While population screening is not cost-effective, the targeted screening of high-risk groups may be. Procedures for identifying those at risk, for quality assuring the programme, as well as adequate service provision for those screened positive would all be needed. Glaucoma detection can be improved by increasing attendance for eye examination, and improving the performance of current testing by either refining practice or adding in a technology-based first assessment, the latter being the more cost-effective option. This has implications for any future organisational changes in community eye-care services. 
Further research should aim to develop and provide quality data to populate the economic model, by conducting a feasibility study of interventions to improve detection, by obtaining further data on costs of blindness, risk of progression and health outcomes, and by conducting an RCT of interventions to improve the uptake of glaucoma testing. © Queen's Printer and Controller of HMSO 2007. All rights reserved.