961 results for Evaluate Risk


Relevance:

30.00%

Publisher:

Abstract:

Regular vine copulas are multivariate dependence models constructed from pair-copulas (bivariate copulas). In this paper, we allow the dependence parameters of the pair-copulas in a D-vine decomposition to be potentially time-varying, following a nonlinear restricted ARMA(1,m) process, in order to obtain a very flexible dependence model for applications to multivariate financial return data. We investigate the dependence among the broad stock market indexes from Germany (DAX), France (CAC 40), Britain (FTSE 100), the United States (S&P 500) and Brazil (IBOVESPA) both in a crisis and in a non-crisis period. We find evidence of stronger dependence among the indexes in bear markets. Surprisingly, though, the dynamic D-vine copula indicates the occurrence of a sharp decrease in dependence between the indexes FTSE and CAC in the beginning of 2011, and also between CAC and DAX during mid-2011 and in the beginning of 2008, suggesting the absence of contagion in these cases. We also evaluate the dynamic D-vine copula with respect to Value-at-Risk (VaR) forecasting accuracy in crisis periods. The dynamic D-vine outperforms the static D-vine in terms of predictive accuracy for our real data sets.
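The time-varying pair-copula parameters described above are typically driven by a Patton (2006)-style evolution equation. The snippet below is a minimal sketch of such a restricted ARMA(1,m) recursion for the correlation of a single Gaussian pair-copula; the function name, the choice of a Gaussian pair-copula and the parameter values are illustrative assumptions rather than the paper's actual specification, and the full D-vine estimation and VaR forecasting steps are omitted.

```python
import numpy as np
from scipy.stats import norm

def time_varying_rho(u, v, omega, alpha, beta, m=10):
    """Patton-style restricted ARMA(1,m) recursion for the correlation of a
    Gaussian pair-copula; u, v are copula data (probability integral
    transforms) in (0, 1)."""
    x, y = norm.ppf(u), norm.ppf(v)                  # standard-normal scores
    lam = lambda s: (1.0 - np.exp(-s)) / (1.0 + np.exp(-s))  # maps R to (-1, 1)
    rho = np.empty(len(u))
    rho[0] = lam(omega)
    for t in range(1, len(u)):
        forcing = np.mean(x[max(0, t - m):t] * y[max(0, t - m):t])
        rho[t] = lam(omega + beta * rho[t - 1] + alpha * forcing)
    return rho

# Illustrative call on simulated (independent) copula data:
rng = np.random.default_rng(0)
u, v = rng.uniform(size=500), rng.uniform(size=500)
print(time_varying_rho(u, v, omega=0.1, alpha=0.05, beta=1.5)[-5:])
```

In a dynamic D-vine, a recursion of this kind would typically be applied to each pair-copula parameter, with the fitted copulas then used to simulate portfolio returns and read off the VaR quantile.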

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance:

30.00%

Publisher:

Abstract:

To determine the occurrence of delirium in oncology inpatients and to identify and evaluate admission characteristics associated with the development of delirium during inpatient admission, a prospective observational study was conducted of 113 patients with a total of 145 admissions with a histological diagnosis of cancer admitted to the oncology unit over a period of ten weeks. At the point of inpatient admission, all patients were assessed for the presence of potential risk factors for the development of delirium. During the index admission patients were assessed daily for the presence of delirium using the Confusion Assessment Method. Delirium was confirmed by clinician assessment. Delirium developed in 26 of 145 admissions (18%), and 32 episodes of delirium were recorded, with 6 patients having 2 episodes of delirium during the index admission. Delirium occurred on average 3.3 days into the admission. The average duration of an episode of delirium was 2.1 days. Four patients with delirium (15%) died. All other cases of delirium were reversed. Factors significantly associated with the development of delirium on multivariate analysis were: advanced age, cognitive impairment, low albumin level, bone metastases, and the presence of hematological malignancy. Hospital inpatient admission was significantly longer in the delirium group (mean: 8.8 days vs 4.5 days in the non-delirium group, P

Relevance:

30.00%

Publisher:

Abstract:

We examine the newly developed international diversification instruments–iShares traded on the American Stock Exchange. Given the fact that iShares can be created and redeemed at will, the daily price of an iShare is expected to be equal to the daily portfolio value of the underlying assets in the home-country market. Therefore, theoretically, iShare pricing should be influenced by the risk from the iShare's home-country market and not the risk from the US market, per se. We evaluate the risk exposure of iShare prices to the US market (non-fundamental effect) as well as the home-country market (the fundamental effect). We find that most iShare returns are significantly influenced by and sensitive to the US market risk. Moreover, the US market appears to be the key permanent driving factor and the home-country market is a pronounced transitory driving force for iShare prices. These findings indicate the presence of limits of international arbitrage for iShares. As a result, the international diversification benefits of iShares become questionable.
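A minimal way to reproduce the kind of exposure decomposition described above is to regress daily iShare returns on the US market return and the home-country market return: the coefficient on the US return proxies the non-fundamental exposure and the coefficient on the home return the fundamental one. The sketch below assumes simple OLS on simulated data; the authors' actual methodology (including any permanent/transitory decomposition) is not reproduced here.

```python
import numpy as np

def ishare_exposures(r_ishare, r_us, r_home):
    """OLS of iShare returns on US-market and home-market returns: beta_us
    proxies the non-fundamental (US) exposure, beta_home the fundamental
    (home-country) exposure."""
    X = np.column_stack([np.ones_like(r_us), r_us, r_home])
    (alpha, beta_us, beta_home), *_ = np.linalg.lstsq(X, r_ishare, rcond=None)
    return alpha, beta_us, beta_home

# Illustrative call on simulated daily returns:
rng = np.random.default_rng(1)
r_us = rng.normal(0.0, 0.010, 250)
r_home = rng.normal(0.0, 0.012, 250)
r_ishare = 0.4 * r_us + 0.7 * r_home + rng.normal(0.0, 0.004, 250)
print(ishare_exposures(r_ishare, r_us, r_home))
```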

Relevance:

30.00%

Publisher:

Abstract:

Reliable, comparable information about the main causes of disease and injury in populations, and how these are changing, is a critical input for debates about priorities in the health sector. Traditional sources of information about the descriptive epidemiology of diseases, injuries and risk factors are generally incomplete, fragmented and of uncertain reliability and comparability. Lack of a standardized measurement framework to permit comparisons across diseases and injuries, as well as risk factors, and failure to systematically evaluate data quality have impeded comparative analyses of the true public health importance of various conditions and risk factors. As a consequence, the impact of major conditions and hazards on population health has been poorly appreciated, often leading to a lack of public health investment. Global disease and risk factor quantification improved dramatically in the early 1990s with the completion of the first Global Burden of Disease Study. For the first time, the comparative importance of over 100 diseases and injuries, and ten major risk factors, for global and regional health status could be assessed using a common metric (Disability-Adjusted Life Years) which simultaneously accounted for both premature mortality and the prevalence, duration and severity of the non-fatal consequences of disease and injury. As a consequence, mental health conditions and injuries, for which non-fatal outcomes are of particular significance, were identified as being among the leading causes of disease/injury burden worldwide, with clear implications for policy, particularly prevention. A major achievement of the Study was the complete global descriptive epidemiology, including incidence, prevalence and mortality, by age, sex and region, of over 100 diseases and injuries. National applications, further methodological research and an increase in data availability have led to improved national, regional and global estimates for 2000, but substantial uncertainty around the disease burden caused by major conditions, including HIV, remains. The rapid implementation of cost-effective data collection systems in developing countries is a key priority if global public policy to promote health is to be more effectively informed.
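The Disability-Adjusted Life Year metric referred to above combines years of life lost to premature mortality (YLL) with years lived with disability (YLD). The sketch below shows the simplest form of that arithmetic, ignoring the discounting and age weighting used in the original Global Burden of Disease Study; the argument names and example figures are illustrative.

```python
def dalys(deaths, years_lost_per_death, incident_cases,
          disability_weight, avg_duration_years):
    """DALY = YLL + YLD in its simplest form:
    YLL = deaths * expected years of life lost per death,
    YLD = incident cases * disability weight * average duration (years)."""
    yll = deaths * years_lost_per_death
    yld = incident_cases * disability_weight * avg_duration_years
    return yll + yld

# Hypothetical example: 100 deaths losing 30 years each, plus 1,000 cases
# with disability weight 0.2 lasting 5 years -> 3,000 + 1,000 = 4,000 DALYs.
print(dalys(100, 30, 1_000, 0.2, 5))
```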

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To assess the effect of using different risk calculation tools on how general practitioners and practice nurses evaluate the risk of coronary heart disease with clinical data routinely available in patients' records. DESIGN: Subjective estimates of the risk of coronary heart disease and results of four different methods of calculation of risk were compared with each other and a reference standard that had been calculated with the Framingham equation; calculations were based on a sample of patients' records, randomly selected from groups at risk of coronary heart disease. SETTING: General practices in central England. PARTICIPANTS: 18 general practitioners and 18 practice nurses. MAIN OUTCOME MEASURES: Agreement of results of risk estimation and risk calculation with the reference calculation; agreement of general practitioners with practice nurses; sensitivity and specificity of the different methods of risk calculation to detect patients at high or low risk of coronary heart disease. RESULTS: Only a minority of patients' records contained all of the risk factors required for the formal calculation of the risk of coronary heart disease (concentrations of high density lipoprotein (HDL) cholesterol were present in only 21%). Agreement of risk calculations with the reference standard was moderate (kappa = 0.33 to 0.65 for both practice nurses and general practitioners, depending on the calculation tool), showing a trend towards underestimation of risk. Moderate agreement was seen between the risks calculated by general practitioners and practice nurses for the same patients (kappa = 0.47 to 0.58). The British charts gave the most sensitive results for risk of coronary heart disease (practice nurses 79%, general practitioners 80%), and they also gave the most specific results for practice nurses (100%), whereas the Sheffield table was the most specific method for general practitioners (89%). CONCLUSIONS: Routine calculation of the risk of coronary heart disease in primary care is hampered by poor availability of data on risk factors. General practitioners and practice nurses are able to evaluate the risk of coronary heart disease with only moderate accuracy. Data about risk factors need to be collected systematically, to allow the use of the most appropriate calculation tools.
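The agreement statistics quoted above are kappa coefficients comparing each clinician's risk category with the Framingham-based reference. The abstract does not state which kappa variant was used; the sketch below computes plain (unweighted) Cohen's kappa for two raters as an illustration, with the example categories and ratings made up for demonstration.

```python
import numpy as np

def cohen_kappa(rater_a, rater_b, categories):
    """Unweighted Cohen's kappa for two raters assigning each patient to one
    of the given risk categories (e.g. 'high' vs 'low' CHD risk)."""
    idx = {c: i for i, c in enumerate(categories)}
    table = np.zeros((len(categories), len(categories)))
    for a, b in zip(rater_a, rater_b):
        table[idx[a], idx[b]] += 1
    n = table.sum()
    p_obs = np.trace(table) / n                                   # observed agreement
    p_exp = (table.sum(axis=1) * table.sum(axis=0)).sum() / n**2  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Illustrative call: a clinician's categories vs the Framingham reference.
print(cohen_kappa(["high", "low", "low", "high", "low"],
                  ["high", "low", "high", "high", "low"],
                  categories=["low", "high"]))
```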

Relevance:

30.00%

Publisher:

Abstract:

This research was conducted at the Space Research and Technology Centre of the European Space Agency at Noordwijk in the Netherlands. ESA is an international organisation that brings together a range of scientists, engineers and managers from 14 European member states. The motivation for the work was to enable decision-makers, in a culturally and technologically diverse organisation, to share information for the purpose of making decisions that are well informed about the risk-related aspects of the situations they seek to address. The research examined the use of decision support system (DSS) technology to facilitate decision-making of this type. This involved identifying the technology available and its application to risk management. Decision-making is a complex activity that does not lend itself to exact measurement or precise understanding at a detailed level. In view of this, a prototype DSS was developed through which to understand the practical issues to be accommodated and to evaluate alternative approaches to supporting decision-making of this type. The problem of measuring the effect upon the quality of decisions has been approached through expert evaluation of the software developed. The practical orientation of this work was informed by a review of the relevant literature in decision-making, risk management, decision support and information technology. Communication and information technology unite the major themes of this work. This allows correlation of the interests of the research with European public policy. The principles of communication were also considered in the topic of information visualisation - this emerging technology exploits flexible modes of human-computer interaction (HCI) to improve the cognition of complex data. Risk management is itself an area characterised by complexity, and risk visualisation is advocated for application in this field of endeavour. The thesis provides recommendations for future work in the fields of decision-making, DSS technology and risk management.

Relevance:

30.00%

Publisher:

Abstract:

We analyzed clinical and instrumental data of 403 consecutive newborns with gestational age from 24 to 32 weeks, admitted to the University Hospital of Parma between January 2000 and December 2007, to evaluate the possible relationship between neonatal mortality and the occurrence of neonatal seizures in very preterm newborns. Seventy-four subjects died during the hospital stay. Seizures were present in 35 neonates, in whom the mortality rate was 37.1%. Multivariate analysis revealed that birth-weight

Relevance:

30.00%

Publisher:

Abstract:

Childhood obesity is a major health issue with associated ill-health consequences during childhood and into later adolescence and adulthood. Given that eating behaviors are formed during early childhood, it is important to evaluate the relationships between early life feeding practices and later child adiposity. This review describes and evaluates recent literature exploring associations between child weight and the mode of milk feeding, the age of introducing solid foods and caregivers’ solid feeding practices. There are many inconsistencies in the literature linking early life feeding to later obesity risk and discrepancies may be related to inconsistent definitions, or a lack of control for confounding variables. This review summarizes the literature in this area and identifies the need for large scale longitudinal studies to effectively explore how early life feeding experiences may interact with each other and with nutritional provision during later childhood to predict obesity risk.

Relevance:

30.00%

Publisher:

Abstract:

Needleless connectors are being increasingly used for direct access to intravascular catheters. However, the potential for microbial contamination of these devices and subsequent infection risk is still widely debated. In this study the microbial contamination rate associated with three-way stopcock luers with standard caps attached was compared to those with Y-type extension set luers with Clearlink® needleless connectors attached. Fifty patients undergoing cardiothoracic surgery who required a central venous catheter (CVC) as part of their peri- and postoperative management were studied for microbial contamination of CVC luers following 72 hrs in situ. Each patient's CVC was randomly designated to have either the three-way stopcocks with caps (control patients) or Clearlink® Y-type extension sets (test patients). Prior to, and following each manipulation of the three-way stopcock luers or Clearlink® devices, a 70% (v/v) isopropyl alcohol swab was used for disinfection of the connections. The microbial contamination of 393 luers, 200 with standard caps and 193 with Clearlink® attached, was determined. The internal surfaces of 20 of 200 (10%) three-way stopcock luers with standard caps were contaminated with micro-organisms whereas only one of 193 (0.5%) luers with Clearlink® attached was contaminated (P < 0.0001). These results demonstrate that the use of the Clearlink® device with a dedicated disinfection regimen reduces the internal microbial contamination rate of CVC luers compared with standard caps. The use of such needle-free devices may therefore reduce the intraluminal risk of catheter-related bloodstream infection and thereby supplement current preventive guidelines. © 2006 The Hospital Infection Society.
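The headline comparison above, 20 of 200 contaminated luers with standard caps versus 1 of 193 with Clearlink®, can be checked with a test on a 2×2 table. The abstract does not name the test used; the sketch below applies Fisher's exact test as one reasonable choice, which should give a P-value in line with the reported P < 0.0001.

```python
from scipy.stats import fisher_exact

# 2x2 table: rows = connector type, columns = contaminated / not contaminated.
table = [[20, 180],   # standard caps: 20 of 200 luers contaminated
         [1, 192]]    # Clearlink:      1 of 193 luers contaminated
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.1f}, two-sided p = {p_value:.1e}")
```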

Relevance:

30.00%

Publisher:

Abstract:

Objective - During pregnancy, the human cervix undergoes angiogenic transformations. VEGF is expressed in cervical stroma and is proposed to play key roles in the process of cervical ripening and dilation. This study was conducted to evaluate whether cervical secretion of VEGF can be of clinical value in predicting impending PTB. Study Design - In an observational prospective cohort study, we analyzed cervical fluid samples from 103 pregnant women (GA: median [IQR]: 28 [25-31] wks) who presented for either a routine prenatal visit (n=61) or for evaluation of threatened preterm labor (n=42). Cervical secretions were collected under a standard protocol which was followed in all cases. Cervical length (CL) was assessed by transvaginal ultrasound using well-established criteria. Dilation was evaluated by digital exam performed only after collection of the biological samples. VEGF levels were immunoassayed by investigators unaware of the clinical outcome. Main exclusion criteria were ruptured membranes, active labor, vaginal bleeding, and vaginal exam or intercourse within 24 h. Results were analyzed with and without normalization for total protein. Results - 1) Clinical characteristics of the cohort are presented in the Table; 2) VEGF was detectable in all specimens, with no correlation between its levels and CL, twins or GA at collection; 3) There was an inverse correlation between VEGF and cervical dilation (R=-0.646, P=0.003); 4) Women with cervical dilation ≥1 cm had lower VEGF compared to those with a closed cervix (P=0.003); 5) Women who experienced PTB within 14 days (n=11) had lower VEGF (P=0.003); 6) A free VEGF level of ≤600 pg/mL had a sensitivity, specificity, +LR and -LR of 70%, 95%, 13.5 and 0.3, respectively, in predicting PTB within 14 days. Conclusions - Low VEGF levels in the cervicovaginal secretions of pregnant women are associated with an increased risk of PTB within 2 weeks of collection. Active engagement of VEGF in the process of cervical ripening and dilatation and/or increased affinity of extracellular matrix components for VEGF may provide an explanation for our findings.
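The sensitivity, specificity and likelihood ratios reported for the 600 pg/mL cut-off follow directly from a 2×2 classification table. The sketch below shows that arithmetic in general form; the study's underlying cell counts are not given in the abstract, so the arguments and the example call are hypothetical.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, +LR and -LR from 2x2 counts
    (tp/fp/fn/tn are placeholder arguments; the abstract does not
    report the underlying cell counts)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec) if spec < 1 else float("inf")
    lr_neg = (1 - sens) / spec if spec > 0 else float("inf")
    return sens, spec, lr_pos, lr_neg

# Hypothetical counts purely to show the arithmetic:
print(diagnostic_metrics(tp=8, fp=5, fn=3, tn=87))
```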

Relevance:

30.00%

Publisher:

Abstract:

Purpose: To evaluate the benefit of bilinear and linear fitting to characterize the retinal vessel dilation response to flicker light stimulation for the purpose of risk stratification in cardiovascular disease. Methods: Forty-five patients (15 with coronary artery disease (CAD), 15 with diabetes mellitus (DM) and 15 with both CAD and DM) all underwent contact tonometry, digital blood pressure measurement, fundus photography, retinal vessel oximetry, static retinal vessel analysis and continuous retinal diameter assessment using the retinal vessel analyser (and flicker light provocation). In addition, we measured glycated haemoglobin (HbA1c) and creatinine levels in DM patients. Results: With increased severity of cardiovascular disease, a more linear reaction profile of retinal arteriolar diameter to flicker light provocation can be observed. Conclusion: Absolute values of vessel dilation provide only limited information on the state of the retinal arteriolar dilatory response to flicker light. The approach of bilinear fitting takes into account the immediate response to flicker light provocation as well as the maintained dilatory capacity during prolonged stimulation. Individuals with cardiovascular disease, however, show a largely linear reaction profile, indicating an impairment of the initial rapid dilatory response usually observed in healthy individuals.
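A bilinear fit of the kind described can be implemented as a continuous two-segment (broken-stick) least-squares regression of vessel diameter on time, with the breakpoint chosen to minimise the residual sum of squares; comparing that fit with a single straight line (e.g. via an F-test or AIC) then quantifies how "linear" a patient's reaction profile is. The sketch below is one such implementation under these assumptions and is not the authors' actual fitting procedure.

```python
import numpy as np

def bilinear_fit(t, d):
    """Continuous two-segment least-squares fit of diameter d over time t.
    Scans interior breakpoints and keeps the one with the smallest residual
    sum of squares; returns (breakpoint, rss, coefficients)."""
    t, d = np.asarray(t, float), np.asarray(d, float)
    best = (None, np.inf, None)
    for k in range(2, len(t) - 2):
        # Hinge term makes the two segments join at the candidate breakpoint.
        X = np.column_stack([np.ones_like(t), t, np.maximum(t - t[k], 0.0)])
        coef, _, _, _ = np.linalg.lstsq(X, d, rcond=None)
        rss = float(np.sum((d - X @ coef) ** 2))
        if rss < best[1]:
            best = (t[k], rss, coef)
    return best

def linear_fit(t, d):
    """Single straight-line fit, for comparison with the bilinear model."""
    t, d = np.asarray(t, float), np.asarray(d, float)
    X = np.column_stack([np.ones_like(t), t])
    coef, _, _, _ = np.linalg.lstsq(X, d, rcond=None)
    return coef, float(np.sum((d - X @ coef) ** 2))
```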

Relevance:

30.00%

Publisher:

Abstract:

This thesis studies survival analysis techniques dealing with censoring to produce predictive tools that predict the risk of endovascular aortic aneurysm repair (EVAR) re-intervention. Censoring indicates that some patients do not continue follow-up, so their outcome class is unknown. Existing methods for dealing with censoring have drawbacks and cannot handle the high censoring of the two EVAR datasets collected. Therefore, this thesis presents a new solution to high censoring by modifying an approach that was incapable of differentiating between risk groups of aortic complications. Feature selection (FS) becomes complicated with censoring. Most survival FS methods depend on Cox's model; however, machine learning classifiers (MLCs) are preferred. Few methods have adopted MLCs to perform survival FS, but they cannot be used with high censoring. This thesis proposes two FS methods which use MLCs to evaluate features. The two FS methods use the new solution to deal with censoring. They combine factor analysis with a greedy stepwise FS search which allows eliminated features to re-enter the FS process. The first FS method searches for the best neural network configuration and subset of features. The second approach combines support vector machine, neural network, and K-nearest-neighbour classifiers using simple and weighted majority voting to construct a multiple classifier system (MCS) for improving the performance of the individual classifiers. It presents a new hybrid FS process by using the MCS as a wrapper method and merging it with an iterated feature ranking filter method to further reduce the features. The proposed techniques outperformed FS methods based on Cox's model, such as the Akaike and Bayesian information criteria and the least absolute shrinkage and selection operator, in the log-rank test's p-values, sensitivity, and concordance. This shows that the proposed techniques are more powerful in correctly predicting the risk of re-intervention, enabling doctors to set an appropriate future observation plan for each patient.
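The multiple classifier system described in the second FS approach can be assembled from off-the-shelf components. The sketch below shows only the voting ensemble of SVM, neural network and K-nearest-neighbour classifiers (simple majority vote, or weighted vote when weights are supplied); the thesis's censoring treatment, factor analysis and greedy stepwise search around it are not reproduced, and the scikit-learn defaults are illustrative.

```python
from sklearn.ensemble import VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

def build_mcs(weights=None):
    """Multiple classifier system combining an SVM, a neural network and a
    K-nearest-neighbour classifier.  With weights=None a simple majority
    (hard) vote is used; with weights a weighted (soft) vote is used."""
    members = [("svm", SVC(probability=True)),
               ("nn", MLPClassifier(max_iter=1000)),
               ("knn", KNeighborsClassifier())]
    return VotingClassifier(members, voting="soft" if weights else "hard",
                            weights=weights)

# mcs = build_mcs().fit(X_train, y_train)   # X_train/y_train: a candidate feature subset
```

Used as a wrapper, an ensemble like this would be refit and scored for each candidate feature subset proposed by the stepwise search.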

Relevance:

30.00%

Publisher:

Abstract:

In statistical terms, risk cannot be measured directly; it is a latent concept, just like economic development, organisation or intelligence. What do these have in common? Risk, too, is a complex concept: it encompasses several measurable factors, and although we measure many of them, we do not assume that we obtain an exact result. In this approach the analyst knows from the outset that his knowledge is incomplete. Following Bélyácz [2011], we may put it this way: "Statisticians know that there is something they do not know." / === / From a statistical point of view, risk, like economic development, is a latent concept. Typically there is no single number which can explicitly estimate or project risk. Variance is used as a proxy for risk in finance. Other professions use other concepts of risk. Underwriting is the most important step in the insurance business for analysing exposure. Actuaries evaluate the average claim size and the probability of a claim to calculate risk. Bayesian credibility can be used to calculate the insurance premium by combining claim frequencies with empirical knowledge used as a prior. Different types of risk can be classified in a risk matrix to separate out insurable risk. Only this category can be analysed by multivariate statistical methods, which are based on statistical data. Sample size and the frequency of events are relevant not only in insurance, but in pension and investment decisions as well.
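The Bayesian credibility premium mentioned in the summary is, in its simplest Bühlmann form, a weighted average of a risk's own claim experience and the collective (prior) mean, with credibility factor Z = n / (n + k). The sketch below shows that arithmetic; the argument names and example figures are illustrative.

```python
def credibility_premium(own_mean_claim, n_years, collective_mean, k):
    """Buhlmann-style credibility estimate: Z * own experience +
    (1 - Z) * collective (prior) mean, with Z = n / (n + k)."""
    z = n_years / (n_years + k)
    return z * own_mean_claim + (1 - z) * collective_mean

# Illustrative figures: 5 years of own experience averaging 1,200 against a
# collective mean of 1,000 with k = 3 gives Z = 0.625 and a premium of 1,125.
print(credibility_premium(1_200.0, 5, 1_000.0, 3.0))
```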

Relevance:

30.00%

Publisher:

Abstract:

Venture capitalists can be regarded as financiers of young, high-risk enterprises, seeking investments with high growth potential and offering professional support above and beyond their capital investment. The aim of this study is to analyse the occurrence of information asymmetry between venture capital investors and entrepreneurs, with special regard to the problem of adverse selection. In the course of my empirical research, I conducted in-depth interviews with 10 venture capital investors. The aim of the research was to elicit their opinions about the situation regarding information asymmetry, how they deal with problems arising from adverse selection, and what measures they take to manage these within the investment process. The interviews also touched upon how investors evaluate state intervention, and how much they believe company managers are influenced by state support.