87 results for "risk level"
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Since 2010, the client base of online-trading service providers has grown significantly. Such companies enable small investors to access the stock market at advantageous rates. Because small investors buy and sell stocks in moderate amounts, they should consider fixed transaction costs, integral transaction units, and dividends when selecting their portfolio. In this paper, we consider the small investor’s problem of investing capital in stocks in a way that maximizes the expected portfolio return and guarantees that the portfolio risk does not exceed a prescribed risk level. Portfolio-optimization models known from the literature are in general designed for institutional investors and do not consider the specific constraints of small investors. We therefore extend four well-known portfolio-optimization models to make them applicable for small investors. We consider one nonlinear model that uses variance as a risk measure and three linear models that use the mean absolute deviation from the portfolio return, the maximum loss, and the conditional value-at-risk as risk measures. We extend all models to consider piecewise-constant transaction costs, integral transaction units, and dividends. In an out-of-sample experiment based on Swiss stock-market data and the cost structure of the online-trading service provider Swissquote, we apply both the basic models and the extended models; the former represent the perspective of an institutional investor, and the latter the perspective of a small investor. The basic models compute portfolios that yield on average a slightly higher return than the portfolios computed with the extended models. However, all generated portfolios yield on average a higher return than the Swiss performance index. There are considerable differences between the four risk measures with respect to the mean realized portfolio return and the standard deviation of the realized portfolio return.
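As a worked illustration of the extended formulation described above (the notation is ours, not the paper's): let z_i ∈ Z≥0 denote the integral number of transaction units bought in stock i, p_i the unit price, μ_i the expected return, d_i the dividend yield, c(·) the piecewise-constant transaction-cost function, b the available capital, ρ(·) one of the four risk measures, and ρ̄ the prescribed risk level. A generic small-investor model then reads:

    \max_{z \in \mathbb{Z}_{\ge 0}^{n}} \; \sum_{i=1}^{n} (\mu_i + d_i)\, p_i z_i \;-\; c(z)
    \quad \text{s.t.} \quad \rho(z) \le \bar{\rho}, \qquad \sum_{i=1}^{n} p_i z_i + c(z) \le b.

With ρ taken as the variance this is a mixed-integer quadratic program; with the mean absolute deviation, maximum loss, or conditional value-at-risk it remains a mixed-integer linear program.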
Abstract:
Human risk taking is characterized by a large amount of individual heterogeneity. In this study, we applied resting-state electroencephalography, which captures stable individual differences in neural activity, before subjects performed a risk-taking task. Using a source-localization technique, we found that the baseline cortical activity in the right prefrontal cortex predicts individual risk-taking behavior. Individuals with higher baseline cortical activity in this brain area display more risk aversion than do other individuals. This finding demonstrates that neural characteristics that are stable over time can predict a highly complex behavior such as risk-taking behavior and furthermore suggests that hypoactivity in the right prefrontal cortex might serve as a dispositional indicator of lower regulatory abilities, which is expressed in greater risk-taking behavior.
Abstract:
Aims: The reported rate of stent thrombosis (ST) after drug-eluting stent (DES) implantation varies among registries. To investigate differences in baseline characteristics and clinical outcome in European and Japanese all-comers registries, we performed a pooled analysis of patient-level data. Methods and results: The j-Cypher registry (JC) is a multicentre observational study conducted in Japan, including 12,824 patients undergoing sirolimus-eluting stent (SES) implantation. From the Bern-Rotterdam registry (BR), which enrolled patients at two academic hospitals in Switzerland and the Netherlands, 3,823 patients with SES were included in the current analysis. Patients in BR were younger, more frequently smokers, and presented more frequently with ST-elevation myocardial infarction (MI). Conversely, JC patients more frequently had diabetes and hypertension. At five years, the definite ST rate was significantly lower in JC than in BR (JC 1.6% vs. BR 3.3%, p<0.001), while the unadjusted mortality tended to be lower in BR than in JC (BR 13.2% vs. JC 14.4%, log-rank p=0.052). After adjustment, the j-Cypher registry was associated with a significantly lower risk of all-cause mortality (HR 0.56, 95% CI: 0.49-0.64) as well as definite stent thrombosis (HR 0.46, 95% CI: 0.35-0.61). Conclusions: The baseline characteristics of the two large registries were different. After statistical adjustment, JC was associated with lower mortality and ST.
Abstract:
OBJECTIVES To assess the clinical profile and long-term mortality in SYNTAX score II based strata of patients who received percutaneous coronary interventions (PCI) in contemporary randomized trials. BACKGROUND The SYNTAX score II was developed in the randomized, all-comers SYNTAX trial population and is composed of 2 anatomical and 6 clinical variables. The interaction of these variables with the treatment provides individual long-term mortality predictions if a patient undergoes coronary artery bypass grafting (CABG) or PCI. METHODS Patient-level data (n=5433) from 7 contemporary coronary drug-eluting stent (DES) trials were pooled. The mortality for CABG or PCI was estimated for every patient. The difference in mortality estimates for these two revascularization strategies was used to divide the patients into three groups of theoretical treatment recommendations: PCI, CABG, or PCI/CABG (the latter meaning equipoise between CABG and PCI for long-term mortality). RESULTS The three groups had marked differences in their baseline characteristics. According to the predicted risk differences, 5115 patients could be treated either by PCI or CABG, 271 should be treated only by PCI, and CABG alone was rarely recommended (n=47). At 3-year follow-up, according to the SYNTAX score II recommendations, patients recommended for CABG had higher mortality compared with the PCI and PCI/CABG groups (17.4%, 6.1%, and 5.3%, respectively; P<0.01). CONCLUSIONS The SYNTAX score II demonstrated the capability to help stratify PCI procedures.
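A minimal Python sketch of the stratification rule described above, assuming the two SYNTAX score II mortality predictions have already been computed for each patient (all names and the equipoise margin are illustrative; the score's actual coefficients and decision rule are not reproduced here):

    def syntax_ii_recommendation(pred_mortality_pci, pred_mortality_cabg,
                                 equipoise_margin=0.0):
        """Theoretical treatment recommendation for one patient.

        Compares the SYNTAX score II long-term mortality predictions for
        PCI and CABG; equipoise_margin is an assumed tolerance within
        which the two strategies are treated as equivalent.
        """
        diff = pred_mortality_pci - pred_mortality_cabg
        if abs(diff) <= equipoise_margin:
            return "PCI/CABG"  # equipoise between the two strategies
        return "CABG" if diff > 0 else "PCI"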
Abstract:
BACKGROUND The safety and efficacy of new-generation drug-eluting stents (DES) in women with multiple atherothrombotic risk (ATR) factors is unclear. METHODS AND RESULTS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized based on the presence or absence of high ATR, which was defined as a history of diabetes mellitus, prior percutaneous or surgical coronary revascularization, or prior myocardial infarction. The primary end point was major adverse cardiovascular events, defined as a composite of all-cause mortality, myocardial infarction, or target lesion revascularization at 3 years of follow-up. Of the 10,449 women included in the pooled database, 5333 (51%) were at high ATR. Compared with women not at high ATR, those at high ATR had a significantly higher risk of major adverse cardiovascular events (15.8% versus 10.6%; adjusted hazard ratio: 1.53; 95% confidence interval: 1.34-1.75; P=0.006) and all-cause mortality. In high-ATR women, the use of new-generation DES was associated with a significantly lower risk of 3-year major adverse cardiovascular events (adjusted hazard ratio: 0.69; 95% confidence interval: 0.52-0.92) compared with early-generation DES. The benefit of new-generation DES on major adverse cardiovascular events was uniform between high-ATR and non-high-ATR women, without evidence of interaction (P for interaction=0.14). In landmark analysis of high-ATR women, stent thrombosis rates were comparable between DES generations in the first year, whereas between 1 and 3 years, stent thrombosis risk was lower with new-generation devices. CONCLUSIONS Use of new-generation DES even in women at high ATR is associated with a benefit consistent over 3 years of follow-up and a substantial improvement in very-late thrombotic safety.
Abstract:
Risk management today has moved from being the topic of top-level conferences and media discussions to being a permanent item on the board and top-management agenda. Several new directives and regulations in Switzerland, Germany, and the EU make it obligatory for firms to have a risk management strategy and to disclose the risk management process transparently to their stakeholders. Shareholders, insurance providers, banks, media, analysts, employees, suppliers, and other stakeholders expect board members to be proactive in knowing the critical risks facing their organization and to provide them with reasonable assurance regarding the management of those risks. In this environment, however, the lack of standards and training opportunities makes this task difficult for board members. This book, with the help of real-life examples, analysis of drivers, interpretation of the Swiss legal requirements, and information based on international benchmarks, reaches out to the forward-looking leaders of today's businesses. The authors have brought together their years of scientific and practical experience in risk management, Swiss law, and board membership to provide board members with practical solutions in risk management. The aim is that this book will dispel the fear of risk management from the minds of company leadership and help them make risk-savvy decisions in the quest to achieve their strategic objectives.
Abstract:
The objective of this survey was to determine herd-level risk factors for mortality, unwanted early slaughter, and metaphylactic application of antimicrobial group therapy in Swiss veal calves in 2013. A questionnaire regarding farm structure, farm management, mortality, and antimicrobial use was sent to all farmers registered in a Swiss label program setting requirements for improved animal welfare and sustainability. Risk factors were determined by multivariable logistic regression. A total of 619 veal producers returned a usable questionnaire (response rate=28.5%), of which 40.9% only fattened their own calves (group O), 56.9% fattened their own calves and additional purchased calves (group O&P), and 2.3% only purchased calves for fattening (group P). A total of 19,077 calves entered the fattening units in 2013, of which 21.7%, 66.7%, and 11.6% belonged to groups O, O&P, and P, respectively. Mortality was 0% in 322 herds (52.0%), between 0% and 3% in 47 herds (7.6%), and ≥3% in 250 herds (40.4%). Significant risk factors for mortality were purchasing calves, herd size, a higher incidence of bovine respiratory disease (BRD), and access to an outside pen. Metaphylaxis was used on 13.4% of the farms (7.9% only upon arrival, 4.4% only later in the fattening period, 1.1% upon arrival and later), in 3.2% of the herds of group O, 17.9% of those in group O&P, and 92.9% of those of group P. Application of metaphylaxis upon arrival was positively associated with purchase (OR=8.9) and herd size (OR=1.2 per 10 calves). Metaphylaxis later in the production cycle was positively associated with group size (OR=2.9) and risk of respiratory disease (OR=1.2 per 10% higher risk), and negatively with the use of individual antimicrobial treatment (OR=0.3). In many countries, purchase and a large herd size are inherently connected to veal production. The Swiss situation, with large commercial herds but also smaller herds with little or no purchase of calves, made it possible to investigate the effect of these factors on mortality and antimicrobial drug use. The results of this study show that a system in which small farms raise the calves from their own herds has substantial potential to improve animal health and reduce antimicrobial drug use.
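To make the per-unit odds ratios above concrete, here is a short Python sketch (the odds ratios are taken from the abstract; the baseline odds value is an assumed input used purely for illustration):

    def scaled_odds(baseline_odds, odds_ratio_per_unit, n_units):
        """Scale baseline odds by an odds ratio applied once per unit increase."""
        return baseline_odds * odds_ratio_per_unit ** n_units

    # OR=1.2 per 10 calves: a herd larger by 50 calves (5 increments of 10)
    # has its odds of metaphylaxis upon arrival multiplied by 1.2**5, i.e. ~2.49.
    print(scaled_odds(1.0, 1.2, 5))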
Abstract:
BACKGROUND: This study focused on the descriptive analysis of cattle movements and farm-level parameters derived from cattle movements, which are considered to be generically suitable for risk-based surveillance systems in Switzerland for diseases where animal movements constitute an important risk pathway. METHODS: A framework was developed to select farms for surveillance based on a risk score summarizing 5 parameters. The proposed framework was validated using data from the bovine viral diarrhoea (BVD) surveillance programme in 2013. RESULTS: A cumulative score was calculated per farm, including the following parameters: the maximum monthly ingoing contact chain (in 2012), the average number of animals per incoming movement, the use of mixed alpine pastures, and the number of weeks in 2012 in which a farm had movements registered. The final score for a farm depended on the distribution of the parameters. Different cut-offs (the 50th, 90th, 95th, and 99th percentiles) were explored. The final scores ranged between 0 and 5. Validation of the scores against results from the 2013 BVD surveillance programme gave promising results when the cut-off for each of the five selected farm-level criteria was set at the 50th percentile. Restricting testing to farms with a score ≥2 would have resulted in the same number of detected BVD-positive farms as testing all farms, i.e., the outcome of the 2013 surveillance programme could have been reached with a smaller survey. CONCLUSIONS: The seasonality and time dependency of the activity of single farms in the networks require a careful assessment of the time period used to determine farm-level criteria. However, the selection of farms for risk-based surveillance can be optimized with the proposed scoring system. The system was validated using data from the BVD eradication programme. The proposed method is a promising framework for selecting farms according to the risk of infection based on animal movements.
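A minimal Python sketch of the scoring system described above (the cut-off values are placeholders standing in for the study's percentile-based values, and the fifth parameter name is assumed, since the abstract names only four of the five criteria):

    # Placeholder cut-offs; the study derived one cut-off per parameter from
    # the distribution of that parameter across all farms.
    cutoffs = {
        "max_monthly_ingoing_contact_chain": 4,
        "avg_animals_per_incoming_movement": 3,
        "uses_mixed_alpine_pastures": 1,
        "weeks_with_registered_movements": 10,
        "n_incoming_movements": 5,  # assumed fifth criterion
    }

    def farm_score(farm):
        """Cumulative risk score in 0..5: one point per parameter at or above its cut-off."""
        return sum(farm[p] >= c for p, c in cutoffs.items())

    example_farm = {
        "max_monthly_ingoing_contact_chain": 7,
        "avg_animals_per_incoming_movement": 2,
        "uses_mixed_alpine_pastures": 1,
        "weeks_with_registered_movements": 14,
        "n_incoming_movements": 2,
    }
    # Farms with a score >= 2 enter the risk-based sample.
    print(farm_score(example_farm))  # -> 3, so this farm would be tested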
Abstract:
Symptom development during the prodromal phase of psychosis was explored retrospectively in first-episode psychosis patients, with special emphasis on the assumed time-related syndromic sequence of "unspecific symptoms (UN)-predictive basic symptoms (BS)-attenuated psychotic symptoms (APS)-(transient) psychotic symptoms (PS)." Onset of syndromes was defined by the first occurrence of any of their respective symptoms. Group means were inspected for time differences between syndromes and for the influence of sociodemographic and clinical characteristics on the recalled sequence. The sequence "UN-BS/APS-PS" was clearly supported, and both BS and, though slightly less so, APS were highly sensitive. However, the onsets of BS and APS did not show a significant time difference in the whole sample (N = 126; 90% schizophrenia), although when each symptom was considered independently, APS tended to occur later than the first predictive BS. At the descriptive level, about one-third each recalled an earlier, equal, or later onset of BS compared with APS. Level of education had the greatest impact on recall of the hypothesized sequence: those with a higher school-leaving certificate supported the assumed sequence, whereas those with a low educational background retrospectively dated APS before BS. These findings point to recognition and recall bias inherent in the retrospective design rather than to true group characteristics. Future long-term prospective studies will have to explore this conclusively. However, as regards the criteria, the results support the notion of BS as at least a complementary approach to the ultra-high-risk criteria, which may also allow for an earlier detection of psychosis.
Abstract:
Tajikistan is judged to be highly vulnerable to risk, including food-insecurity and climate-change risks. By some vulnerability measures it is the most vulnerable among all 28 countries in the World Bank's Europe and Central Asia Region (ECA) (World Bank 2009). The rural population, with its relatively high incidence of poverty, is particularly vulnerable. The Pilot Program for Climate Resilience (PPCR) in Tajikistan (2011) provided an opportunity to conduct a farm-level survey with the objective of assessing various dimensions of the rural population's vulnerability to risk and their perception of constraints to farming operations and livelihoods. The survey is accordingly referred to as the 2011 PPCR survey. The rural population in Tajikistan is highly agrarian, with about 50% of family income deriving from agriculture (see Figure 4.1; also LSMS 2007, own calculations). Tajikistan's agriculture basically consists of two groups of producers: small household plots, the successors of Soviet "private agriculture", and dehkan ("peasant") farms, new family farming structures that began to be created under relevant legislation passed after 1992 (Lerman and Sedik, 2008). The household plots manage 20% of arable land and produce 65% of gross agricultural output (GAO). Dehkan farms manage 65% of arable land and produce close to 30% of GAO. The remaining 15% of arable land is held by agricultural enterprises, the rapidly shrinking sector of corporate farms that succeeded the Soviet kolkhozes and sovkhozes and today produces less than 10% of GAO (TajStat 2011). The survey, conducted in May 2011, focused on dehkan farms, as budgetary constraints precluded the inclusion of household plots. A total of 142 dehkan farms were surveyed in face-to-face interviews. They were sampled from 17 districts across all four regions (Sughd, Khatlon, RRP, and GBAO). The districts were selected so as to represent different agro-climatic zones, different vulnerability zones (based on the World Bank (2011) vulnerability assessment), and different food-insecurity zones (based on WFP/IPC assessments). Within each district, 3-4 jamoats were chosen at random, and 2-3 farms were selected in each jamoat from lists provided by the jamoat administration so as to maximize variability in farm characteristics. The sample design by region/district is presented in Table A, which also shows the agro-climatic zone and the food-security phase for each district. The sample districts are superimposed on a map of food-security phases based on IPC April 2011.
Abstract:
Background: In Switzerland there are about 150,000 equestrians. Horse-related injuries, including head and spinal injuries, are frequently treated at our level I trauma centre. Objectives: To analyse injury patterns, protective factors, and risk factors related to horse riding, and to define groups of safer riders and those at greater risk. Methods: We present a retrospective survey and a case-control survey conducted at a tertiary trauma centre in Bern, Switzerland. Injured equestrians from July 2000 to June 2006 were retrospectively classified by injury pattern and neurological symptoms. Injured equestrians from July to December 2008 were prospectively recorded using a questionnaire with 17 variables. The same questionnaire was applied to non-injured controls. Multiple logistic regression was performed, and combined risk factors were calculated using inference trees. Results: In the retrospective survey, a total of 528 injuries occurred in 365 patients. The injury pattern was as follows: extremities (32%: upper 17%, lower 15%), head (24%), spine (14%), thorax (9%), face (9%), pelvis (7%), and abdomen (2%). Two injuries were fatal. One case resulted in quadriplegia, one in paraplegia. In the case-control survey, 61 patients and 102 controls were included (patients: 72% female, 28% male; controls: 63% female, 37% male). Falls were most frequent (65%), followed by horse kicks (19%) and horse bites (2%). Variables statistically significant for the controls were older age (p = 0.015), male gender (p = 0.04), and holding a diploma in horse riding (p = 0.004). Inference trees revealed typical groups more and less likely to suffer injury. Conclusions: Riding experience and having passed a diploma in horse riding appear to be protective factors. Educational levels and injury risk should be graded within an educational-level/injury-risk index.
Abstract:
INTRODUCTION: Guidelines for the treatment of patients with severe hypothermia, and particularly those in hypothermic cardiac arrest, recommend rewarming using extracorporeal circulation (ECC). However, guidelines for the further in-hospital diagnostic and therapeutic approach to these patients, who often suffer from additional injuries (especially avalanche casualties), are lacking. The lack of such algorithms may relevantly delay treatment and put patients at further risk. Together with a multidisciplinary team, the Emergency Department at the University Hospital in Bern, a level I trauma centre, created an algorithm for the in-hospital treatment of patients with hypothermic cardiac arrest. This algorithm primarily focuses on the decision-making process for the administration of ECC. THE BERNESE HYPOTHERMIA ALGORITHM: The major difference between the traditional approach, where all hypothermic patients are primarily admitted to the emergency centre, and our new algorithm is that hypothermic cardiac arrest patients without obvious signs of severe trauma are taken to the operating theatre without delay. Subsequently, the interdisciplinary team decides whether to rewarm the patient using ECC based on a standard clinical trauma assessment, serum potassium levels, core body temperature, sonographic examinations of the abdomen, pleural space, and pericardium, as well as a pelvic X-ray, if needed. During ECC, sonography is repeated, and haemodynamic function as well as haemoglobin levels are regularly monitored. Standard radiological investigations according to the local multiple-trauma protocol are performed only after ECC. Transfer to the intensive care unit, where mild therapeutic hypothermia is maintained for another 12 h, should not be delayed by additional X-rays for minor injuries. DISCUSSION: The presented algorithm is intended to facilitate in-hospital decision-making and shorten the door-to-reperfusion time for patients with hypothermic cardiac arrest. It is the result of intensive collaboration between different specialties and highlights the importance of high-quality teamwork for rare cases of severe accidental hypothermia. Information derived from the new International Hypothermia Registry will help to answer open questions and further optimize the algorithm.
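A schematic Python rendering of the decision flow, for illustration only (field names are placeholders, and the numeric cut-offs for serum potassium and core temperature are left as parameters because the abstract does not state them):

    def ecc_decision(assessment, k_max, t_max):
        """Sketch of the Bernese pathway for hypothermic cardiac arrest;
        not clinical guidance.

        assessment: dict of clinical findings; k_max and t_max are assumed
        cut-offs for serum potassium and core body temperature.
        """
        if assessment["obvious_severe_trauma"]:
            return "standard emergency-centre work-up first"
        # No obvious severe trauma: straight to the operating theatre,
        # where the interdisciplinary team decides on ECC.
        if (assessment["serum_potassium"] <= k_max
                and assessment["core_temperature"] <= t_max
                and assessment["sonography_and_pelvic_xray_clear"]):
            return "rewarm with ECC"
        return "interdisciplinary reassessment before ECC"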
Seropositivity and Risk Factors Associated with Toxoplasma gondii Infection in Wild Birds from Spain
Abstract:
Toxoplasma gondii is a zoonotic intracellular protozoan parasite of worldwide distribution that infects many species of warm-blooded animals, including birds. To date, there is scant information about the seropositivity of T. gondii and the risk factors associated with T. gondii infection in wild bird populations. In the present study, T. gondii infection was evaluated in sera obtained from 1079 wild birds belonging to 56 species, including Falconiformes (n = 610), Strigiformes (n = 260), Ciconiiformes (n = 156), Gruiformes (n = 21), and other orders (n = 32), from different areas of Spain. Antibodies to T. gondii (modified agglutination test, MAT titer ≥1:25) were found in 282 (26.1%, 95% CI: 23.5–28.7) of the 1079 birds. This study constitutes the first extensive survey of wild bird species in Spain and reports T. gondii antibodies for the first time in the griffon vulture (Gyps fulvus), short-toed snake-eagle (Circaetus gallicus), Bonelli's eagle (Aquila fasciata), golden eagle (Aquila chrysaetos), bearded vulture (Gypaetus barbatus), osprey (Pandion haliaetus), Montagu's harrier (Circus pygargus), Western marsh-harrier (Circus aeruginosus), peregrine falcon (Falco peregrinus), long-eared owl (Asio otus), common scops owl (Otus scops), Eurasian spoonbill (Platalea leucorodia), white stork (Ciconia ciconia), grey heron (Ardea cinerea), and common moorhen (Gallinula chloropus); in the International Union for Conservation of Nature (IUCN) "vulnerable" Spanish imperial eagle (Aquila adalberti), lesser kestrel (Falco naumanni), and great bustard (Otis tarda); and in the IUCN "near threatened" red kite (Milvus milvus). The highest seropositivity by species was observed in the Eurasian eagle owl (Bubo bubo) (68.1%, 98 of 144). The main risk factors associated with T. gondii seropositivity in wild birds were age and diet, with the highest exposure in older animals and in carnivorous wild birds. The results show that T. gondii infection is widespread and can reach high levels in many wild birds in Spain, most likely related to their feeding behaviour.
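The quoted seroprevalence and confidence interval can be reproduced with the standard normal approximation for a binomial proportion:

    \hat{p} = \frac{282}{1079} \approx 0.261, \qquad
    \hat{p} \pm 1.96\sqrt{\frac{\hat{p}(1-\hat{p})}{1079}} \approx 0.261 \pm 0.026,

which matches the reported 26.1% (95% CI: 23.5–28.7).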
Abstract:
Traditionally, the routine artificial digestion test is applied to assess the presence of Trichinella larvae in pigs. However, this diagnostic method has a low sensitivity compared with serological tests. The results of artificial digestion tests in Switzerland were evaluated over a period of 15 years to determine when freedom from infection could be confirmed on the basis of these data. Freedom was defined as a 95% probability that the prevalence of infection was below 0.0001%. Freedom was demonstrated after 12 years at the latest. A new risk-based surveillance approach based on serology was then developed. This risk-based surveillance was also assessed over 15 years, starting in 2010. It was shown that with this design the sample size could be reduced by at least a factor of 4 compared with the traditional testing regimen, without lowering the level of confidence in the Trichinella-free status of the pig population.
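For context, under a simple binomial model with perfect test sensitivity, the number n of negative tests required to demonstrate freedom at design prevalence p with confidence 1 − α satisfies (a standard back-of-the-envelope formula, not taken from the paper):

    (1-p)^{n} \le \alpha \quad \Longrightarrow \quad n \ge \frac{\ln \alpha}{\ln(1-p)} \approx \frac{-\ln \alpha}{p} \quad \text{for small } p,

so at p = 10^{-6} (i.e. 0.0001%) and α = 0.05, roughly three million negative results are needed, which is why evidence from routine testing had to accumulate over years. A risk-based serological design lowers this burden by using a more sensitive test and targeting higher-risk strata.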