913 results for Constant Relative Risk Aversion
Abstract:
Land development in the vicinity of airports often leads to land use that can attract birds that are hazardous to aviation operations. For this reason, certain forms of land use have traditionally been discouraged within prescribed distances of Canadian airports. However, this often leads to an unrealistic prohibition of land use in the vicinity of airports located in urban settings. Furthermore, it is often unclear whether the desired safety goals have been achieved. This paper describes a model that was created to assist in the development of zoning regulations for a future airport site in Canada. The framework links land use to bird-related safety risks and aircraft operations by categorizing the predictable relationships between: (i) different land uses found in urbanized and urbanizing settings near airports; (ii) bird species; and (iii) the different safety risks to aircraft during various phases of flight. The latter is assessed relative to the runway approach and departure paths. Bird species are ranked to reflect the potential severity of an impact with an aircraft (using bird weight, flocking characteristics, and flight behaviours). These criteria are then employed to chart bird-related safety risks relative to runway reference points. Each form of land use is categorized to reflect the degree to which it attracts hazardous bird species. From this information, hazard and risk matrices have been developed and applied to the future airport setting, thereby providing risk-based guidance on appropriate land uses that range from prohibited to acceptable. The framework has subsequently been applied to an existing Canadian airport, and is currently being adapted for national application. The framework provides a risk-based, science-based approach that offers municipalities and property owners flexibility in managing the aviation risks related to their land use.
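As a rough illustration of how such a framework can be operationalised, the sketch below combines an assumed species severity rank with an assumed land-use attractiveness rank and maps the product onto guidance categories; all species names, scales, and thresholds are illustrative placeholders, not the values developed in the paper.

```python
# Illustrative sketch only: severity and attractiveness scales, species,
# land uses, and risk thresholds are assumptions, not the paper's values.

# Severity rank reflects bird weight, flocking, and flight behaviour (1 = low, 4 = high).
SPECIES_SEVERITY = {"starling": 2, "gull": 3, "canada_goose": 4}

# Attractiveness rank reflects how strongly a land use draws each species (0-3).
LANDUSE_ATTRACTIVENESS = {
    "waste_facility": {"gull": 3, "starling": 2, "canada_goose": 1},
    "stormwater_pond": {"canada_goose": 3, "gull": 2, "starling": 1},
    "office_park": {"starling": 1, "gull": 1, "canada_goose": 0},
}

def landuse_risk_score(land_use: str) -> int:
    """Worst-case product of species severity and land-use attractiveness."""
    attractiveness = LANDUSE_ATTRACTIVENESS[land_use]
    return max(SPECIES_SEVERITY[s] * a for s, a in attractiveness.items())

def risk_category(score: int) -> str:
    """Map a score onto guidance categories from prohibited to acceptable."""
    if score >= 9:
        return "prohibited"
    if score >= 6:
        return "discouraged / mitigation required"
    return "acceptable"

for lu in LANDUSE_ATTRACTIVENESS:
    s = landuse_risk_score(lu)
    print(f"{lu}: score={s}, guidance={risk_category(s)}")
```

In the actual framework, the position of the parcel relative to the runway approach and departure paths would further modulate the resulting guidance.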
Abstract:
Background: Arboviral diseases are major global public health threats. Yet, our understanding of infection risk factors is, with a few exceptions, considerably limited. A crucial shortcoming is the widespread use of analytical methods generally not suited for observational data - particularly null hypothesis-testing (NHT) and step-wise regression (SWR). Using Mayaro virus (MAYV) as a case study, here we compare information theory-based multimodel inference (MMI) with conventional analyses for arboviral infection risk factor assessment. Methodology/Principal Findings: A cross-sectional survey of anti-MAYV antibodies revealed 44% prevalence (n = 270 subjects) in a central Amazon rural settlement. NHT suggested that residents of village-like household clusters and those using closed toilet/latrines were at higher risk, while living in non-village-like areas, using bednets, and owning fowl, pigs or dogs were protective. The "minimum adequate" SWR model retained only residence area and bednet use. Using MMI, we identified relevant covariates, quantified their relative importance, and estimated effect-sizes (beta +/- SE) on which to base inference. Residence area (beta(Village) = 2.93 +/- 0.41; beta(Upland) = -0.56 +/- 0.33, beta(Riverbanks) = -2.37 +/- 0.55) and bednet use (beta = -0.95 +/- 0.28) were the most important factors, followed by crop-plot ownership (beta = 0.39 +/- 0.22) and regular use of a closed toilet/latrine (beta = 0.19 +/- 0.13); domestic animals had insignificant protective effects and were relatively unimportant. The SWR model ranked fifth among the 128 models in the final MMI set. Conclusions/Significance: Our analyses illustrate how MMI can enhance inference on infection risk factors when compared with NHT or SWR. MMI indicates that forest crop-plot workers are likely exposed to typical MAYV cycles maintained by diurnal, forest dwelling vectors; however, MAYV might also be circulating in nocturnal, domestic-peridomestic cycles in village-like areas. This suggests either a vector shift (synanthropic mosquitoes vectoring MAYV) or a habitat/habits shift (classical MAYV vectors adapting to densely populated landscapes and nocturnal biting); any such ecological/adaptive novelty could increase the likelihood of MAYV emergence in Amazonia.
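A minimal sketch of the information-theoretic workflow described above (fitting a model set, converting AIC differences to Akaike weights, and summing weights per covariate as a measure of relative importance) is given below; the logistic-regression setup, covariate names, and use of statsmodels are assumptions made for illustration, not the paper's exact analysis.

```python
# Sketch of AIC-based multimodel inference: fit all covariate subsets,
# convert AIC differences to Akaike weights, and sum weights per covariate
# as a measure of relative importance. Data and covariate names are assumptions.
from itertools import combinations
import numpy as np
import pandas as pd
import statsmodels.api as sm

def multimodel_inference(df: pd.DataFrame, outcome: str, covariates: list):
    fits = []
    for k in range(len(covariates) + 1):
        for subset in combinations(covariates, k):
            X = sm.add_constant(df[list(subset)]) if subset else np.ones((len(df), 1))
            model = sm.Logit(df[outcome], X).fit(disp=0)
            fits.append((subset, model.aic))
    aics = np.array([a for _, a in fits])
    delta = aics - aics.min()                      # AIC differences
    weights = np.exp(-0.5 * delta)
    weights /= weights.sum()                       # Akaike weights
    importance = {c: weights[[i for i, (s, _) in enumerate(fits) if c in s]].sum()
                  for c in covariates}
    return sorted(importance.items(), key=lambda kv: -kv[1])

# Example call (hypothetical column names):
# ranking = multimodel_inference(df, "seropositive",
#                                ["village", "bednet", "crop_plot", "latrine"])
```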
Abstract:
In Performance-Based Earthquake Engineering (PBEE), evaluating the seismic performance (or seismic risk) of a structure at a designed site has gained major attention, especially in the past decade. One of the objectives in PBEE is to quantify the seismic reliability of a structure (due to future random earthquakes) at a site. For that purpose, Probabilistic Seismic Demand Analysis (PSDA) is utilized as a tool to estimate the Mean Annual Frequency (MAF) of exceeding a specified value of a structural Engineering Demand Parameter (EDP). This dissertation focuses mainly on applying an average of a number of spectral acceleration ordinates over an interval of periods, Sa,avg(T1,…,Tn), as a scalar ground motion Intensity Measure (IM) when assessing the seismic performance of inelastic structures. Since the interval of periods over which Sa,avg is computed reflects the greater or lesser influence of higher vibration modes on the inelastic response, it is appropriate to speak of improved IMs. The results using these improved IMs are compared with conventional elastic-based scalar IMs (e.g., pseudo-spectral acceleration, Sa(T1), or peak ground acceleration, PGA) and an advanced inelastic-based scalar IM (i.e., inelastic spectral displacement, Sdi). The advantages of applying improved IMs are: (i) "computability" of the seismic hazard according to traditional Probabilistic Seismic Hazard Analysis (PSHA), because ground motion prediction models are already available for Sa(Ti), and hence it is possible to employ existing models to assess hazard in terms of Sa,avg; and (ii) "efficiency", i.e., smaller variability of structural response, which was minimized in order to identify the optimal range over which to compute Sa,avg. More work is needed to also assess the desirable properties of "sufficiency" and "scaling robustness", which are disregarded in this dissertation. However, for ordinary records (i.e., with no pulse-like effects), using the improved IMs is found to be more accurate than using the elastic- and inelastic-based IMs. For structural demands that are dominated by the first mode of vibration, the benefit of using Sa,avg can be negligible relative to the conventionally used Sa(T1) and the advanced Sdi. For structural demands with significant higher-mode contribution, an improved scalar IM that incorporates higher modes needs to be utilized. In order to fully understand the influence of the IM on the seismic risk, a simplified closed-form expression for the probability of exceeding a limit-state capacity was chosen as a reliability measure under seismic excitations and implemented for Reinforced Concrete (RC) frame structures. This closed-form expression is particularly useful for the seismic assessment and design of structures, taking into account the uncertainty in the generic variables, structural "demand" and "capacity", as well as the uncertainty in seismic excitations. The adopted framework employs nonlinear Incremental Dynamic Analysis (IDA) procedures in order to estimate the variability in the response of the structure (demand) to seismic excitations, conditioned on the IM. The estimate of seismic risk obtained with the simplified closed-form expression is affected by the choice of IM: the final seismic risk is not constant across IMs, although it remains of the same order of magnitude. Possible reasons concern the assumed nonlinear model or the insufficiency of the selected IM. Since it is impossible to state what the "real" probability of exceeding a limit state is by looking at the total risk alone, the only way forward is to optimize the desirable properties of an IM.
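To make the improved IM concrete, the sketch below evaluates Sa,avg(T1,…,Tn) from a response spectrum; the geometric mean over n equally spaced periods is a common convention and is assumed here, since the abstract does not fix the averaging rule or the optimal period range.

```python
# Sketch: Sa,avg as the geometric mean of spectral acceleration ordinates
# over a period interval. The averaging rule (geometric mean) and the
# interpolation scheme are assumptions; the optimal period range is what the
# dissertation determines, not this example.
import numpy as np

def sa_avg(periods, sa, t_min, t_max, n=10):
    """Average spectral acceleration over [t_min, t_max] from a spectrum (periods, sa)."""
    t_grid = np.linspace(t_min, t_max, n)
    sa_grid = np.interp(t_grid, periods, sa)        # ordinates at the n periods
    return float(np.exp(np.mean(np.log(sa_grid))))  # geometric mean

# Hypothetical spectrum (in g) and a first-mode period T1 = 1.0 s:
periods = np.array([0.05, 0.1, 0.2, 0.5, 1.0, 2.0, 3.0])
sa      = np.array([0.40, 0.70, 0.95, 0.80, 0.45, 0.20, 0.12])
print(sa_avg(periods, sa, t_min=0.2, t_max=2.0))
```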
Abstract:
The research is structured in two sections. The first, after a historical introduction to suicide and a review of the relevant Italian statistical data, supplemented by an analysis of the main sociological theories and of the principal psychopathological and clinical-psychology aspects, examines the findings of numerous scientific studies on the complementary topic of equivocal deaths, with particular reference to the at-risk categories represented by the elderly, prisoners, aircraft pilots, individuals engaging in autoerotic asphyxiation or Russian roulette, individuals who provoke police into using lethal force ("suicide by cop"), and road suicides. The investigative and medico-legal aspects of suicides and equivocal deaths are then examined, with particular reference to the psychological autopsy technique, analysing its origins and evolution, its scope of application, and the related methodological aspects. In the second section of the work, the topic of suicides and equivocal deaths is explored in depth through the contribution of professionals from different disciplines with expertise in psychological autopsy and judicial investigations. Using the qualitative Delphi technique, these experts were presented with an initial draft of a psychological autopsy protocol, together with its application procedures, in order to revise it and adapt it to Italian operational needs on the basis of their specific professional and multidisciplinary experience. The data collected led to the formulation of a psychological autopsy protocol based on general, specific, and concluding open-ended questions, which can be put, according to the prescribed procedures, to the people who were emotionally significant to the victim in whose case this investigative tool is to be used.
Abstract:
This thesis is the result of a project aimed at the study of a crucial topic in finance: default risk, whose measurement and modelling have gained increasing relevance in recent years. We investigate the main issues related to the default phenomenon from both a methodological and an empirical perspective. The topics of default predictability and correlation are treated with constant attention to modelling solutions and with a critical review of the literature. From the methodological point of view, our analysis results in the proposal of a new class of models, called Poisson Autoregression with Exogenous Covariates (PARX). The PARX models, which include both autoregressive and exogenous components, are able to capture the dynamics of default count time series, characterized by persistence of shocks and slowly decaying autocorrelation. Application of different PARX models to the monthly default counts of US industrial firms in the period 1982-2011 provides empirical insight into default dynamics and supports the identification of the main default predictors at an aggregate level.
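A minimal simulation sketch of a first-order PARX-type recursion is given below; the exact specification (lag orders, link, covariate treatment) and all parameter values are assumptions made for illustration, not the estimates obtained in the thesis.

```python
# Sketch of a first-order PARX recursion:
#   y_t | past ~ Poisson(lambda_t),
#   lambda_t = omega + alpha * y_{t-1} + beta * lambda_{t-1} + gamma * x_t,
# where x_t is an exogenous covariate. Parameter values are illustrative.
import numpy as np

def simulate_parx(T, omega=0.5, alpha=0.3, beta=0.5, gamma=0.8, x=None, seed=0):
    rng = np.random.default_rng(seed)
    if x is None:
        x = np.zeros(T)
    y = np.zeros(T, dtype=int)
    lam = np.zeros(T)
    lam[0] = omega / (1 - alpha - beta)   # start near the stationary mean
    y[0] = rng.poisson(lam[0])
    for t in range(1, T):
        lam[t] = omega + alpha * y[t - 1] + beta * lam[t - 1] + gamma * x[t]
        y[t] = rng.poisson(lam[t])
    return y, lam

# e.g. monthly default counts with a hypothetical recession indicator as covariate:
# y, lam = simulate_parx(T=360, x=np.r_[np.zeros(300), np.ones(60)])
```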
Analysis and redesign of the ICT risk management process: an application case at Telecom Italia
Abstract:
This thesis work stems from IT security topics and is the result of eight months of work within the Technical Security function of Telecom Italia Information Technology. The primary task of this business unit is to reduce the IT risk of Telecom Italia's systems by carrying out the ICT Risk Management process, which involves the entire organisation and was redesigned during 2012. To extend this process to all IT systems, specifically to those affected by non-compliance, the Structured Security Programme was launched at the beginning of 2013: a particularly complex and articulated aggregate of four three-year projects. The planning of this Programme involved, among others, the team I was part of, which collaborated with Telecom Italia by performing some of the support functions typical of a Project Management Office (PMO).
Abstract:
The frequency of occurrence of random accidental events involving lines and equipment is generally estimated on the basis of information contained in specialised databases. The data in these databases refer to accidental events that occurred in various types of installations, ranging from chemical to petrochemical plants. Some of these databases are also rather dated, since they refer to accidents that occurred many years ago. As a result, the release frequencies provided by the databases tend to be very conservative. To overcome this limitation and take technical progress into account, the API Recommended Practice 581 guideline, published in 2000 and subsequently updated in 2008, introduced a criterion for determining release frequencies tailored to the specific plant, by means of corrective factors that account for the component failure mechanism, the safety management system, and the effectiveness of the inspection activity. The aim of this thesis is to highlight the evolution of the approach used to evaluate release frequencies from piping. It is organised as follows. Chapter 1 is introductory. Chapter 2 addresses the study of the release frequencies available in generic databases. Chapter 3 illustrates two approaches, one qualitative and one quantitative, for identifying the lines with the highest priority for inspection. Chapter 4 is devoted to the description of the API Recommended Practice 581 guideline. Chapter 5 presents the application of the line-selection criteria of Chapter 3 to a case study, together with the definition of the inspection activity according to the API Recommended Practice 581 guideline. Finally, Chapter 6 presents the concluding remarks of the study.
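As a schematic illustration of the corrective-factor approach described above, the sketch below follows the general form of API RP 581 (adjusted frequency = generic frequency × damage factor × management-systems factor); the numerical values are placeholders, and the factor definitions in the guideline are considerably more detailed.

```python
# Sketch of a corrective-factor approach to release frequency, in the spirit of
# API RP 581: adjusted frequency = generic frequency x damage factor x
# management-systems factor. All numbers below are placeholders.

def adjusted_release_frequency(generic_freq_per_year: float,
                               damage_factor: float,
                               management_factor: float) -> float:
    """Generic leak frequency specialised to the plant via corrective factors."""
    return generic_freq_per_year * damage_factor * management_factor

# Hypothetical 25 mm hole size on a piping segment:
generic = 3e-5          # generic frequency from a database [1/year], placeholder
df_thinning = 20.0      # damage factor reflecting the active damage mechanism
f_ms = 0.6              # management-systems factor (< 1 for an effective system)
print(adjusted_release_frequency(generic, df_thinning, f_ms))  # 3.6e-4 per year
```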
Abstract:
Since 2007, more than 250,000 American students have studied abroad annually for a semester or more. While there are obvious benefits associated with study abroad programs, personal risks (including interpersonal victimization such as sexual and physical assault) occurring during the experience have been anecdotally reported but not systematically assessed. This study is the first to investigate the possibility of increased risk for sexual assault in female undergraduates while abroad. Two hundred eighteen female undergraduates completed a modified version of the Sexual Experiences Survey (SES; Koss et al., 2007) about their sexual experiences abroad and on campus. Findings indicate increased risk for sexual assault while abroad relative to on-campus rates, particularly in non-English-speaking countries. Study abroad programs should consider educating students about this increased risk and developing response protocols for sexual assaults that occur abroad.
Abstract:
Learned irrelevance (LIrr) refers to a form of selective learning that develops as a result of prior noncorrelated exposures to the predicted and predictor stimuli. In learning situations that depend on the associative link between the predicted and predictor stimuli, LIrr is expressed as a retardation of learning. It represents a form of modulation of learning by selective attention. Given the relevance of selective attention impairment to both positive and cognitive schizophrenia symptoms, the question remains whether LIrr impairment represents a state marker (relating to symptom manifestation) or a trait marker (relating to schizophrenia endophenotypes) of human psychosis. We examined this by evaluating the expression of LIrr in an associative learning paradigm in (1) asymptomatic first-degree relatives of schizophrenia patients (SZ-relatives) and (2) individuals exhibiting prodromal signs of psychosis ("ultrahigh risk" [UHR] patients), in each case relative to demographically matched healthy control subjects. There was no evidence for aberrant LIrr in SZ-relatives, but LIrr as well as associative learning were attenuated in UHR patients. It is concluded that LIrr deficiency, in conjunction with a learning impairment, might be a useful state marker predictive of psychotic state, but it provides only a relatively weak link to a potential schizophrenia endophenotype.
Abstract:
The parasite Echinococcus multilocularis was first detected in The Netherlands in 1996 and repeated studies have shown that the parasite subsequently spread in the local population of foxes in the province of Limburg. It was not possible to quantify the human risk of alveolar echinococcosis because no relationship between the amount of parasite eggs in the environment and the probability of infection in humans was known. Here, we used the spread of the parasite in The Netherlands as a predictor, together with recently published historical records of the epidemiology of alveolar echinococcosis in Switzerland, to achieve a relative quantification of the risk. Based on these analyses, the human risk in Limburg was simulated and up to three human cases are predicted by 2018. We conclude that the epidemiology of alveolar echinococcosis in The Netherlands might have changed from a period of negligible risk in the past to a period of increasing risk in the forthcoming years.
Abstract:
Knowledge of the relative importance of alternative sources of human campylobacteriosis is important in order to implement effective disease prevention measures. The objective of this study was to assess the relative importance of three key exposure pathways (travelling abroad, poultry meat, pet contact) for different patient age groups in Switzerland. Using a stochastic exposure model, data on Campylobacter incidence for the years 2002-2007 were linked with data for the three exposure pathways and the results of a case-control study. Mean values of the population attributable fractions (PAF) over all age groups and years were 27% (95% CI 17-39) for poultry consumption, 27% (95% CI 22-32) for travelling abroad, 8% (95% CI 6-9) for pet contact and 39% (95% CI 25-50) for other risk factors. The model provided robust results when using data available for Switzerland, but the uncertainties remained high. The output of the model could be improved if more accurate input data were available to estimate the infection rate per exposure. In particular, the relatively high proportion of cases attributed to 'other risk factors' requires further attention.
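For orientation, the sketch below shows the textbook population attributable fraction computed from exposure prevalence and relative risk, wrapped in a small Monte Carlo loop to mimic how a stochastic model yields an interval rather than a point estimate; the formula, distributions, and numbers are illustrative assumptions, not the study's actual exposure model.

```python
# Textbook PAF: PAF = p_e * (RR - 1) / (1 + p_e * (RR - 1)),
# where p_e is the exposure prevalence and RR the relative risk.
# A simple Monte Carlo loop illustrates how propagating input uncertainty
# yields a credible interval. All inputs are placeholders.
import numpy as np

def paf(p_exposed, relative_risk):
    excess = p_exposed * (relative_risk - 1.0)
    return excess / (1.0 + excess)

rng = np.random.default_rng(1)
# Hypothetical uncertainty on exposure prevalence and relative risk:
p_samples = rng.beta(30, 70, size=10_000)                   # prevalence around 0.3
rr_samples = rng.lognormal(np.log(2.0), 0.2, size=10_000)   # RR around 2
paf_samples = paf(p_samples, rr_samples)
print(np.percentile(paf_samples, [2.5, 50, 97.5]))          # median PAF with 95% interval
```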
Abstract:
Published opinions regarding the outcomes and complications in older patients span a broad spectrum, and there is disagreement over whether surgery in older patients entails a higher risk. This study therefore examines the risk of surgery for lumbar spinal stenosis relative to age in the pooled data set of the Spine Tango registry.
Abstract:
Objectives: Neurofunctional alterations are correlates of vulnerability to psychosis, as well as of the disorder itself. How these abnormalities relate to different probabilities of later transition to psychosis is unclear. We investigated vulnerability-related versus disease-related versus resilience-related biomarkers of psychosis during working memory (WM) processing in individuals with an at-risk mental state (ARMS). Experimental design: Patients with "first-episode psychosis" (FEP, n = 21), short-term ARMS (ARMS-ST, n = 17), long-term ARMS (ARMS-LT, n = 16), and healthy controls (HC, n = 20) were investigated with an n-back WM task. We examined functional magnetic resonance imaging (fMRI) and structural magnetic resonance imaging (sMRI) data in conjunction using the biological parametric mapping (BPM) toolbox. Principal observations: There were no differences in accuracy, but the FEP and ARMS-ST groups had longer reaction times compared with the HC and ARMS-LT groups. With the 2-back > 0-back contrast, we found reduced functional activation in the ARMS-ST and FEP groups compared with the HC group in parietal and middle frontal regions. Relative to ARMS-LT individuals, FEP patients showed decreased activation in the bilateral inferior frontal gyrus and insula, and in the left prefrontal cortex. Compared with the ARMS-LT group, the ARMS-ST subjects showed reduced activation in the right inferior frontal gyrus and insula. Reduced insular and prefrontal activation was associated with gray matter volume reduction in the same area in the ARMS-LT group. Conclusions: These findings suggest that vulnerability to psychosis is associated with neurofunctional alterations in fronto-temporo-parietal networks during a WM task. Neurofunctional differences within the ARMS were related to the different duration of the prodromal state and to resilience factors.
Abstract:
Mr. Pechersky set out to examine a specific feature of the employer-employee relationship in Russian business organisations. He wanted to study to what extent the so-called "moral hazard" is being solved (if it is being solved at all), whether there is a relationship between pay and performance, and whether there is a correlation between economic theory and Russian reality. Finally, he set out to construct a model of the Russian economy that better reflects the way it actually functions than do certain other well-known models (for example, models of incentive compensation, the Shapiro-Stiglitz model, etc.). His report was presented to the RSS in the form of a series of manuscripts in English and Russian, and on disc, with many tables and graphs. He begins by pointing out the different examples of randomness that exist in the relationship between employee and employer. Firstly, results are frequently affected by circumstances outside the employee's control that have nothing to do with how intelligently, honestly, and diligently the employee has worked. When rewards are based on results, uncontrollable randomness in the employee's output induces randomness in his or her income. A second source of randomness involves outside events beyond the employee's control that may affect his or her ability to perform as contracted. A third source of randomness arises when the performance itself (rather than the result) is measured, and the performance evaluation procedures include random or subjective elements. Mr. Pechersky's study shows that in Russia the third source of randomness plays an important role. Moreover, he points out that employer-employee relationships in Russia are sometimes the opposite of those in the West. Drawing on game theory, he characterises the Western system as follows. The two players are the principal and the agent, who are usually representative individuals. The principal hires an agent to perform a task, and the agent acquires an information advantage concerning his actions or the outside world at some point in the game, i.e. it is assumed that the employee is better informed. In Russia, on the other hand, incentive contracts are typically negotiated in situations in which the employer has the information advantage concerning the outcome. Mr. Pechersky schematises it thus. Compensation (the wage) is W and consists of a base amount plus a portion that varies with the outcome, x. So W = a + bx, where b measures the intensity of the incentives provided to the employee. This means that one contract is said to provide stronger incentives than another if it specifies a higher value for b. This is the incentive contract as it operates in the West. The key feature distinguishing the Russian example is that x is observed by the employer but is not observed by the employee. So the employer promises to pay in accordance with an incentive scheme, but since the outcome is not observable by the employee the contract cannot be enforced, and the question arises: is there any incentive for the employer to fulfil his or her promises? Mr. Pechersky considers two simple models of employer-employee relationships displaying the above type of information asymmetry. In a static framework the result obtained is somewhat surprising: at the Nash equilibrium the employer pays nothing, even though his objective function contains a quadratic term reflecting negative consequences for the employer if the actual level of compensation deviates from the employee's expectations. Such deviations can lead, for example, to labour turnover or to the expenses resulting from a bad reputation. In a dynamic framework, the conclusion can be formulated as follows: the higher the discount factor, the higher the incentive for the employer to be honest in his or her relationships with the employee. If the discount factor is taken to be a parameter reflecting the degree of (un)certainty (the higher the degree of uncertainty, the lower the discount factor), we can conclude that the answer to the question posed above depends on the stability of the political, social and economic situation in a country. Mr. Pechersky believes that the strength of a market system with private property lies not just in its providing the information needed to compute an efficient allocation of resources. At least equally important is the manner in which it accepts individually self-interested behaviour, but then channels this behaviour in desired directions. People do not have to be cajoled, artificially induced, or forced to do their parts in a well-functioning market system. Instead, they are simply left to pursue their own objectives as they see fit. Under the right circumstances, people are led by Adam Smith's "invisible hand" of impersonal market forces to take the actions needed to achieve an efficient, co-ordinated pattern of choices. The problem, as Mr. Pechersky sees it, is that there is no reason to believe that the circumstances in Russia are right and that the invisible hand is doing its work properly. Political instability, social tension and other circumstances prevent it from doing so. Mr. Pechersky believes that the discount factor plays a crucial role in employer-employee relationships. Such relationships can be considered satisfactory from a normative point of view only in those cases where the discount factor is sufficiently large. Unfortunately, in modern Russia the evidence points to the typical discount factor being relatively small. This fact can be explained as a manifestation of the risk aversion of economic agents. Mr. Pechersky hopes that when political stabilisation occurs, the discount factors of economic agents will increase, and agents' behaviour will be explicable in terms of more traditional models.
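The compensation scheme W = a + bx and the role of the discount factor can be illustrated with a small numerical sketch; the payoff numbers and the trigger-style honesty condition below are assumptions made for illustration, not Mr. Pechersky's exact model.

```python
# Sketch: linear incentive pay W = a + b*x, where b measures incentive intensity,
# plus a stylised repeated-game check of when the employer prefers to honour the
# contract. The payoff numbers and the trigger-strategy logic are assumptions.

def wage(a: float, b: float, x: float) -> float:
    """Compensation: base amount a plus a share b of the observed outcome x."""
    return a + b * x

def employer_honours(gain_from_cheating: float,
                     per_period_loss_if_caught: float,
                     discount_factor: float) -> bool:
    """Honour the contract if the discounted future loss exceeds today's gain.

    Future loss is the present value of a perpetual per-period loss
    (e.g. labour turnover, bad reputation): delta / (1 - delta) * loss.
    """
    future_loss = discount_factor / (1.0 - discount_factor) * per_period_loss_if_caught
    return future_loss >= gain_from_cheating

print(wage(a=100.0, b=0.2, x=500.0))                    # 200.0
for delta in (0.3, 0.6, 0.9):
    print(delta, employer_honours(100.0, 40.0, delta))  # honest only for high delta
```

The loop reproduces the report's qualitative conclusion: only for a sufficiently high discount factor does the employer prefer to honour the incentive contract.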
Abstract:
Biomarkers are currently best used as mechanistic "signposts" rather than as "traffic lights" in the environmental risk assessment of endocrine-disrupting chemicals (EDCs). In field studies, biomarkers of exposure [e.g., vitellogenin (VTG) induction in male fish] are powerful tools for tracking single substances and mixtures of concern. Biomarkers also provide linkage between field and laboratory data, thereby playing an important role in directing the need for and design of fish chronic tests for EDCs. It is the adverse effect end points (e.g., altered development, growth, and/or reproduction) from such tests that are most valuable for calculating adverseNOEC (no observed effect concentration) or adverseEC10 (effective concentration for a 10% response) and subsequently deriving predicted no effect concentrations (PNECs). With current uncertainties, biomarkerNOEC or biomarkerEC10 data should not be used in isolation to derive PNECs. In the future, however, there may be scope to make greater use of biomarker data in environmental decision making, if plausible linkages can be made across levels of organization such that adverse outcomes might be envisaged relative to biomarker responses. For biomarkers to fulfil their potential, they should be mechanistically relevant and reproducible (as measured by interlaboratory comparisons of the same protocol). VTG is a good example of such a biomarker in that it provides insight into the mode of action (estrogenicity) that is vital to fish reproductive health. Interlaboratory reproducibility data for VTG are also encouraging; recent comparisons (using the same immunoassay protocol) have provided coefficients of variation (CVs) of 38-55% (comparable to published CVs of 19-58% for fish survival and growth end points used in regulatory test guidelines). While concern over environmental xenoestrogens has led to the evaluation of reproductive biomarkers in fish, it must be remembered that many substances act via diverse mechanisms of action, such that the environmental risk assessment of EDCs is a broad and complex issue. Also, biomarkers such as secondary sexual characteristics, gonadosomatic indices, plasma steroids, and gonadal histology have significant potential for guiding interspecies assessments of EDCs and designing fish chronic tests. To strengthen the utility of EDC biomarkers in fish, we need to establish a historical control database (also accounting for natural variability) to help differentiate between statistically detectable and biologically significant responses. In conclusion, as research continues to develop a range of useful EDC biomarkers, environmental decision making needs to move forward, and it is proposed that the "biomarkers as signposts" approach is a pragmatic way forward in the current risk assessment of EDCs.