768 results for Cloud Computing, Risk Assessment, Security, Framework
Abstract:
Over the past two decades, soil ecotoxicologists have made strides in utilizing the basic concepts and advancements in soil zoology and ecology. They have applied the existing tools, and developed new ones, to investigate how chemical contamination can affect soil ecosystems, including the degradation or destruction of soil quality and habitats and the diminishment of belowground biodiversity. Soil ecotoxicologists are applying a suite of standard protocols, originally developed as laboratory tests with single chemicals (e.g., pesticides), and further enhancing both the approaches and protocols for the assessment of contaminated lands. However, the ecological relevance of some approaches remains unresolved. The authors discuss the main challenges for a coherent ecotoxicological assessment of soil ecosystems in contaminated lands, and provide recommendations on how to integrate the effects of physical and chemical soil properties, the variations in the diversity of soil invertebrates, and the interactions among organisms of various trophic levels. The review examines new international approaches and test methods using examples from three continents (in particular research conducted in Brazil), and provides recommendations for improving the ecological relevance of ecotoxicological investigations of contaminated lands.
Abstract:
One of the characteristics of finite risk reinsurance is the existence of an experience fund, which is built up from the premiums charged by the reinsurer together with the financial returns on them, and whose purpose is to finance the claims to be paid to the insurer over the specified period. The objective of this work is to design a model that allows us to determine the reserve that the experience fund should hold in each annual period in order to guarantee its dynamic solvency, taking into account the claims experience of the reinsurer's portfolio and of each insurance company.
Abstract:
OBJECTIVE: To compare the predictive accuracy of the original and recalibrated Framingham risk functions against current coronary heart disease (CHD) morbidity and mortality data from the Swiss population. METHODS: Data from the CoLaus study, a cross-sectional, population-based study conducted between 2003 and 2006 on 5,773 participants aged 35-74 without CHD, were used to recalibrate the Framingham risk function. The predicted number of events from each risk function was compared with those derived from local MONICA incidence rates and official mortality data from Switzerland. RESULTS: With the original risk function, 57.3%, 21.2%, 16.4% and 5.1% of men and 94.9%, 3.8%, 1.2% and 0.1% of women were at very low (<6%), low (6-10%), intermediate (10-20%) and high (>20%) risk, respectively. With the recalibrated risk function, the corresponding values were 84.7%, 10.3%, 4.3% and 0.6% in men and 99.5%, 0.4%, 0.0% and 0.1% in women, respectively. The number of CHD events over 10 years predicted by the original Framingham risk function was 2-3 fold higher than predicted by mortality+case fatality or by MONICA incidence rates (men: 191 vs. 92 and 51 events, respectively). The recalibrated risk function provided more reasonable estimates, albeit slightly overestimated (92 events, 5th-95th percentile: 26-223 events); sensitivity analyses showed that the magnitude of the overestimation was between 0.4 and 2.2 in men, and 0.7 and 3.3 in women. CONCLUSION: The recalibrated Framingham risk function provides a reasonable alternative to assess CHD risk in men, but not in women.
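The four-category risk stratification used in this abstract (very low <6%, low 6-10%, intermediate 10-20%, high >20%) is easy to reproduce for any vector of predicted 10-year risks. The following is an illustrative sketch only (the function name and binning helper are not from the study itself):

```python
import numpy as np

def risk_categories(risks):
    """Percentage of subjects per 10-year CHD risk category.

    risks: iterable of predicted 10-year risks as fractions (0-1).
    Categories follow the abstract: very low (<6%), low (6-10%),
    intermediate (10-20%) and high (>20%).
    """
    r = np.asarray(risks, dtype=float)
    bins = [0.0, 0.06, 0.10, 0.20, 1.0]          # category edges
    labels = ['very low', 'low', 'intermediate', 'high']
    counts = np.histogram(r, bins=bins)[0]        # subjects per bin
    return dict(zip(labels, 100.0 * counts / r.size))
```

Applied to a recalibrated model's predictions, such a tabulation yields the percentage rows quoted in the RESULTS section.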
Abstract:
OBJECTIVES: The purpose of this study was to assess whether metabolic syndrome (MetSyn) predicts a higher risk for cardiovascular events in older adults. BACKGROUND: The importance of MetSyn as a risk factor has not previously been studied with a focus on older adults and deserves further study. METHODS: We studied the impact of MetSyn (38% prevalence) on outcomes in 3,035 participants in the Health, Aging, and Body Composition (Health ABC) study (51% women, 42% black, ages 70 to 79 years). RESULTS: During a 6-year follow-up, there were 434 deaths overall, 472 coronary events (CE), 213 myocardial infarctions (MI), and 231 heart failure (HF) hospital stays; 59% of the subjects had at least one hospital stay. Coronary events, MI, HF, and overall hospital stays occurred significantly more often in subjects with MetSyn (19.9% vs. 12.9% for CE, 9.1% vs. 5.7% for MI, 10.0% vs. 6.1% for HF, and 63.1% vs. 56.1% for overall hospital stay; all p < 0.001). No significant difference in overall mortality was seen; however, there was a trend toward higher cardiovascular mortality (5.1% vs. 3.8%, p = 0.067) and coronary mortality (4.5% vs. 3.2%, p = 0.051) in patients with MetSyn. After adjusting for baseline characteristics, patients with MetSyn were at a significantly higher risk for CE (hazard ratio [HR] 1.56, 95% confidence interval [CI] 1.28 to 1.91), MI (HR 1.51, 95% CI 1.12 to 2.05), and HF hospital stay (HR 1.49, 95% CI 1.10 to 2.00). Women and whites with MetSyn had a higher coronary mortality rate. The CE rate was higher among subjects with diabetes and with MetSyn; those with both had the highest risk. CONCLUSIONS: Overall, subjects over 70 years are at high risk for cardiovascular events; MetSyn in this group is associated with a significantly greater risk.
Abstract:
Markets, in the real world, are not efficient zero-sum games in which the hypotheses of the CAPM are fulfilled. It is therefore easy to conclude that the market portfolio is not located on Markowitz's efficient frontier, and that passive investments (and indexing) are not optimal but biased. In this paper, we define and analyze the biases suffered by passive investors: the sample, construction, efficiency and active biases, and tracking error are presented. We propose Minimum Risk Indices (MRI) as an alternative that deals with market index biases and provides investors with portfolios closer to the efficient frontier, that is, more optimal investment possibilities. MRI (using a parametric Value-at-Risk minimization approach) are calculated for three stock markets, achieving interesting results. Our indices are less risky and more profitable than current market indices in the Argentinean and Spanish markets, thereby challenging the Efficient Market Hypothesis. Two innovations must be outlined: an error dimension has been included in the backtesting, and Sharpe's ratio has been used to select the "best" MRI.
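The core idea of a parametric Value-at-Risk minimization can be sketched generically: under a Gaussian assumption, portfolio VaR is a linear function of the portfolio mean and standard deviation, and index weights can be chosen to minimize it. This is a minimal illustration under those assumptions, not the paper's actual construction (the confidence level, constraints and estimation inputs here are hypothetical):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def parametric_var(weights, mu, cov, alpha=0.95):
    """One-period Gaussian (variance-covariance) VaR of a portfolio.

    mu: expected returns; cov: return covariance matrix.
    Returned value is the loss quantile at confidence level alpha.
    """
    w = np.asarray(weights, dtype=float)
    port_mu = w @ mu
    port_sigma = np.sqrt(w @ cov @ w)
    return norm.ppf(alpha) * port_sigma - port_mu

def min_var_index(mu, cov, alpha=0.95):
    """Weights of a fully invested, long-only portfolio minimizing parametric VaR."""
    n = len(mu)
    cons = ({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},)  # weights sum to 1
    bounds = [(0.0, 1.0)] * n                                 # no short selling
    w0 = np.full(n, 1.0 / n)                                  # start at equal weight
    res = minimize(parametric_var, w0, args=(mu, cov, alpha),
                   bounds=bounds, constraints=cons)
    return res.x
```

Comparing the VaR of such a minimum-risk portfolio with that of a capitalization-weighted index is one way to quantify the index biases the abstract describes.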
Abstract:
OBJECTIVE: This study aims to assess the predictive value of residual venous obstruction (RVO) for recurrent venous thrombo-embolism (VTE) in a study using D-dimer to predict outcome. DESIGN: This is a multicentre randomised open-label study. METHODS: Patients with a first episode of idiopathic VTE were enrolled on the day of anticoagulation discontinuation when RVO was determined by compression ultrasonography in those with proximal deep vein thrombosis (DVT) of the lower limbs. D-dimer was measured after 1 month. Patients with normal D-dimer did not resume anticoagulation while patients with abnormal D-dimer were randomised to resume anticoagulation or not. The primary outcome measure was recurrent VTE over an 18-month follow-up. RESULTS: A total of 490 DVT patients were analysed (after excluding 19 for different reasons and 118 for isolated pulmonary embolism (PE)). Recurrent DVT occurred in 19% (19/99) of patients with abnormal D-dimer who did not resume anticoagulation and 10% (31/310) in subjects with normal D-dimer (adjusted hazard ratio: 2.1; p = 0.02). Recurrences were similar in subjects either with (11%, 17/151) or without RVO (13%, 32/246). Recurrent DVT rates were also similar for normal D-dimer, with or without RVO, and for abnormal D-dimer, with or without RVO. CONCLUSIONS: Elevated D-dimer at 1 month after anticoagulation withdrawal is a risk factor for recurrence, while RVO at the time of anticoagulation withdrawal is not.
Abstract:
This paper addresses primary care physicians, cardiologists, internists, angiologists and doctors who wish to improve vascular risk prediction in primary care. Many cardiovascular risk factors act aggressively on the arterial wall and result in atherosclerosis and atherothrombosis. Cardiovascular prognosis derived from ultrasound imaging is, however, excellent in subjects without formation of intimal thickening or atheromas. Since ultrasound visualises the arterial wall directly, the information derived from the arterial wall may add independent incremental information to the knowledge of risk derived from global risk assessment. This paper provides an overview of plaque imaging for vascular risk prediction in two parts. Part 1: Carotid IMT is frequently used as a surrogate marker for outcome in intervention studies addressing rather large cohorts of subjects. Carotid IMT as a risk prediction tool for the prevention of acute myocardial infarction and stroke has been extensively studied in many patients since 1987, and has yielded incremental hazard ratios for these cardiovascular events independently of established cardiovascular risk factors. However, carotid IMT measurements are not used uniformly and therefore still lack widely accepted standardisation. Hence, at an individual, practice-based level, carotid IMT is not recommended as a risk assessment tool. The total plaque area of the carotid arteries (TPA) is a measure of the global plaque burden within both carotid arteries. It was recently shown in a large Norwegian cohort involving over 6000 subjects that TPA is a very good predictor of future myocardial infarction in women, with an area under the receiver operating characteristic (ROC) curve (AUC) of 0.73 (in men: 0.63). Further, the AUC for risk prediction is high both for vascular death in a vascular prevention clinic group (AUC 0.77) and for fatal or nonfatal myocardial infarction in a true primary care group (AUC 0.79).
Since TPA has acceptable reproducibility, allows calculation of posttest risk and is easily obtained at low cost, this risk assessment tool may come in for more widespread use in the future and also serve as a tool for atherosclerosis tracking and guidance for intensity of preventive therapy. However, more studies with TPA are needed. Part 2: Carotid and femoral plaque formation as detected by ultrasound offers a global view of the extent of atherosclerosis. Several prospective cohort studies have shown that cardiovascular risk prediction is greater for plaques than for carotid IMT. The number of arterial beds affected by significant atheromas may simply be added numerically to derive additional information on the risk of vascular events. A new atherosclerosis burden score (ABS) simply calculates the sum of carotid and femoral plaques encountered during ultrasound scanning. ABS correlates well and independently with the presence of coronary atherosclerosis and stenosis as measured by invasive coronary angiogram. However, the prognostic power of ABS as an independent marker of risk still needs to be elucidated in prospective studies. In summary, the large number of ways to measure atherosclerosis and related changes in human arteries by ultrasound indicates that this technology is not yet sufficiently perfected and needs more standardisation and workup on clearly defined outcome studies before it can be recommended as a practice-based additional risk modifier.
Abstract:
The authors pooled data from 15 case-control studies of head and neck cancer (9,107 cases, 14,219 controls) to investigate the independent associations with consumption of beer, wine, and liquor. In particular, they calculated associations with different measures of beverage consumption separately for subjects who drank beer only (858 cases, 986 controls), for liquor-only drinkers (499 cases, 527 controls), and for wine-only drinkers (1,021 cases, 2,460 controls), with alcohol never drinkers (1,124 cases, 3,487 controls) used as a common reference group. The authors observed similar associations with ethanol-standardized consumption frequency for beer-only drinkers (odds ratios (ORs) = 1.6, 1.9, 2.2, and 5.4 for < or =5, 6-15, 16-30, and >30 drinks per week, respectively; P(trend) < 0.0001) and liquor-only drinkers (ORs = 1.6, 1.5, 2.3, and 3.6; P < 0.0001). Among wine-only drinkers, the odds ratios for moderate levels of consumption frequency approached the null, whereas those for higher consumption levels were comparable to those of drinkers of other beverage types (ORs = 1.1, 1.2, 1.9, and 6.3; P < 0.0001). Study findings suggest that the relative risks of head and neck cancer for beer and liquor are comparable. The authors observed weaker associations with moderate wine consumption, although they cannot rule out confounding from diet and other lifestyle factors as an explanation for this finding. Given the presence of heterogeneity in study-specific results, their findings should be interpreted with caution.
Abstract:
Using a large prospective cohort of over 12,000 women, we determined 2 thresholds (high risk and low risk of hip fracture) to use in a 10-yr hip fracture probability model that we had previously described, a model combining the heel stiffness index measured by quantitative ultrasound (QUS) and a set of easily determined clinical risk factors (CRFs). The model identified a higher percentage of women with fractures as high risk than a previously reported risk score that combined QUS and CRF. In addition, it categorized women in a way that was quite consistent with the categorization that occurred using dual X-ray absorptiometry (DXA) and the World Health Organization (WHO) classification system; the 2 methods identified similar percentages of women with and without fractures in each of their 3 categories, but the 2 identified only in part the same women. Nevertheless, combining our composite probability model with DXA in a case-finding strategy will likely further improve the detection of women at high risk of fragility hip fracture. We conclude that the currently proposed model may be of some use as an alternative to the WHO classification criteria for osteoporosis, at least when access to DXA is limited.
Abstract:
BACKGROUND: Only a few countries have cohorts enabling specific and up-to-date cardiovascular disease (CVD) risk estimation. Individual risk assessment based on study samples that differ too much from the target population could jeopardize the benefit of risk charts in general practice. Our aim was to provide up-to-date and valid CVD risk estimation for a Swiss population using a novel record linkage approach. METHODS: Anonymous record linkage was used to follow up (for mortality, until 2008) 9,853 men and women aged 25-74 years who participated in the Swiss MONICA (MONItoring of trends and determinants in CVD) study of 1983-92. The linkage success rate was 97.8%; loss to follow-up in 1990-2000 was 4.7%. Based on the ESC SCORE methodology (Weibull regression), we used age, sex, blood pressure, smoking, and cholesterol to generate three models. We compared 1) the original SCORE model with 2) a recalibrated model and 3) a new model, using the Brier score (BS) and cross-validation. RESULTS: Based on the cross-validated BS, the new model (BS = 14107×10⁻⁶) was somewhat more appropriate for risk estimation than the original (BS = 14190×10⁻⁶) and the recalibrated (BS = 14172×10⁻⁶) models. Particularly at younger ages, derived absolute risks were consistently lower than those from the original and the recalibrated models, mainly owing to a smaller impact of total cholesterol. CONCLUSION: Record linkage of observational and routine data is an efficient procedure for obtaining valid and up-to-date CVD risk estimates for a specific population.
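The Brier score used to compare the three models above is simply the mean squared difference between predicted risk and observed outcome (0/1). A minimal sketch, with a hypothetical comparison of two prediction vectors rather than the study's actual data:

```python
import numpy as np

def brier_score(y_true, p_pred):
    """Mean squared difference between predicted risks and observed 0/1 outcomes.

    Lower is better; a perfect predictor scores 0, and a constant
    prediction of 0.5 scores 0.25 regardless of the outcomes.
    """
    y = np.asarray(y_true, dtype=float)
    p = np.asarray(p_pred, dtype=float)
    return float(np.mean((p - y) ** 2))
```

In the study this quantity was computed under cross-validation, so that the new model was scored on data it was not fitted on; reporting it scaled by 10⁻⁶ (as in the abstract) is a presentation choice only.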
Abstract:
Details of an implementation of a small-scale cloud computing system at the corporate level.
Abstract:
The main objective of the study was to analyse the City of Imatra's group-level steering and oversight aimed at managing the risks related to its subsidiaries. To achieve this objective, the group steering and internal control of the subsidiaries and the risks related to the subsidiaries within the Imatra city group were examined. As sub-objectives, the concepts of a corporate group, group steering, internal control, risk and risk management were clarified both in general and in the municipal context. In addition, the COSO model, used as the theoretical framework of the study, was examined. The research material was divided into theoretical and empirical material. The theoretical part drew on literature and research related to the topic. The empirical material was collected through interviews and by examining documents produced by the organisation. The interviews were conducted as thematic interviews. In terms of research strategy, the study was a case study examining the group steering and oversight of the subsidiaries of the existing Imatra group. The study analysed detailed and intensive information from the perspective of the parent organisation. As a study describing real life, it can also be classified as qualitative research. Based on the study, I can state that the goals and objectives set by the city board are too general and remain at the group level, so their steering effect on the subsidiaries is not very strong. Owing to the weak formulation of the subsidiaries' objectives, the monitoring of the subsidiaries' appropriateness and efficiency is also limited. By contrast, the reliability of the subsidiaries' financial reports and their compliance with laws and regulations are monitored in connection with audits. Moreover, because of the scarcity of internal audit resources, the unit's evaluation of the subsidiaries' internal control and risk management is nonexistent.
From a risk-management perspective as well, objective-setting has been too broad-brush and detached from the subsidiaries' operations. In addition, according to the interviews, both the concept of risk and the concept of risk management are understood far too narrowly. For steering, oversight and risk management it would be important that, after the group's objectives are set, clear, measurable objectives linked to the city's strategies are also set for the subsidiaries. It should also be ensured that the group and its subsidiaries have a continuous risk-identification function. It is recommended that a comprehensive risk management process be launched in the Imatra group.
Abstract:
The aim was to examine the capacity of commonly used type 2 diabetes mellitus (T2DM) risk scores to predict overall mortality. The US-based NHANES III (n = 3138; 982 deaths) and the Swiss-based CoLaus study (n = 3946; 191 deaths) were used. The predictive value of eight T2DM risk scores regarding overall mortality was tested. The Griffin score, based on a few self-reported parameters, presented the best (NHANES III) and second-best (CoLaus) predictive capacity. Generally, the predictive capacity of scores based on clinical (anthropometrics, lifestyle, history) and biological (blood parameters) data was not better than that of scores based solely on clinical self-reported data. T2DM scores can be validly used to predict mortality risk in general populations without diabetes. Comparison with other scores could further show whether such scores are also suitable as a screening tool for quick overall health risk assessment.
Abstract:
Occupational exposure modeling is widely used in the context of the E.U. regulation on the registration, evaluation, authorization, and restriction of chemicals (REACH). First-tier tools, such as the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) targeted risk assessment (TRA) or Stoffenmanager, are used to screen a wide range of substances. Those of concern are investigated further using second-tier tools, e.g., the Advanced REACH Tool (ART). Local sensitivity analysis (SA) methods are used here to determine dominant factors for three models commonly used within the REACH framework: ECETOC TRA v3, Stoffenmanager 4.5, and ART 1.5. Based on the results of the SA, the robustness of the models is assessed. For ECETOC, the process category (PROC) is the most important factor. A failure to identify the correct PROC has severe consequences for the exposure estimate. Stoffenmanager is the most balanced model, and decision-making uncertainties in one modifying factor are less severe in Stoffenmanager. ART requires a careful evaluation of the decisions in the source compartment, since it constitutes ∼75% of the total exposure range, which corresponds to an exposure estimate spanning 20-22 orders of magnitude. Our results indicate that there is a trade-off between the accuracy and the precision of the models. Previous studies suggested that ART may lead to more accurate results in well-documented exposure situations. However, the choice of the adequate model should ultimately be determined by the quality of the available exposure data: if the practitioner is uncertain concerning two or more decisions in the entry parameters, Stoffenmanager may be more robust than ART.
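Tools like Stoffenmanager and ART are essentially multiplicative models, so a factor's contribution to the total exposure range can be gauged by the orders of magnitude its multiplier spans while the other factors are held fixed. The following is a simplified, hypothetical sketch of that one-at-a-time idea (the factor names and multiplier values are invented, not taken from any of the three tools):

```python
import numpy as np

def factor_ranges(factor_levels):
    """Orders of magnitude each factor contributes to a multiplicative model.

    factor_levels: dict mapping a factor name to the list of multiplier
    values a practitioner may choose from. In a purely multiplicative
    exposure model, a factor's contribution to the total (log-scale)
    exposure range is log10(max level) - log10(min level).
    """
    contrib = {}
    for name, levels in factor_levels.items():
        lv = np.asarray(levels, dtype=float)
        contrib[name] = float(np.log10(lv.max()) - np.log10(lv.min()))
    return contrib
```

A factor whose multipliers span four orders of magnitude dominates one spanning a single order, which is the sense in which the source compartment dominates the ART estimate in the abstract.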
Abstract:
Therapeutic nanoparticles (NPs) are used in nanomedicine as drug carriers or imaging agents, providing increased selectivity/specificity for diseased tissues. The first NPs in nanomedicine were developed for increasing the efficacy of known drugs displaying dose-limiting toxicity and poor bioavailability and for enhancing disease detection. Nanotechnologies have gained much interest owing to their huge potential for applications in industry and medicine. It is necessary to ensure and control the biocompatibility of the components of therapeutic NPs to guarantee that intrinsic toxicity does not overtake the benefits. In addition to monitoring their toxicity in vitro, in vivo and in silico, it is also necessary to understand their distribution in the human body, their biodegradation and excretion routes and dispersion in the environment. Therefore, a deep understanding of their interactions with living tissues and of their possible effects in the human (and animal) body is required for the safe use of nanoparticulate formulations. Obtaining this information was the main aim of the NanoTEST project, and the goals of the reports collected together in this special issue are to summarise the observations and results obtained by the participating research teams and to provide methodological tools for evaluating the biological impact of NPs.