894 results for risk need responsivity model
Abstract:
Climate controls fire regimes through its influence on the amount and types of fuel present and their dryness. CO2 concentration constrains primary production by limiting photosynthetic activity in plants. However, although fuel accumulation depends on biomass production, and hence on CO2 concentration, the quantitative relationship between atmospheric CO2 concentration and biomass burning is not well understood. Here a fire-enabled dynamic global vegetation model (the Land surface Processes and eXchanges model, LPX) is used to attribute glacial–interglacial changes in biomass burning to an increase in CO2, which would be expected to increase primary production and therefore fuel loads even in the absence of climate change, vs. climate change effects. Four general circulation models provided last glacial maximum (LGM) climate anomalies – that is, differences from the pre-industrial (PI) control climate – from the Palaeoclimate Modelling Intercomparison Project Phase 2, allowing the construction of four scenarios for LGM climate. Modelled carbon fluxes from biomass burning were corrected for the model's observed prediction biases in contemporary regional average values for biomes. With LGM climate and low CO2 (185 ppm) effects included, the modelled global flux at the LGM was in the range of 1.0–1.4 Pg C year⁻¹, about a third less than that modelled for PI time. LGM climate with pre-industrial CO2 (280 ppm) yielded unrealistic results, with global biomass burning fluxes similar to or even greater than in the pre-industrial climate. It is inferred that a substantial part of the increase in biomass burning after the LGM must be attributed to the effect of increasing CO2 concentration on primary production and fuel load. Today, by analogy, both rising CO2 and global warming must be considered as risk factors for increasing biomass burning. Both effects need to be included in models to project future fire risks.
Abstract:
This paper reviews the literature on the practice of using Online Analytical Processing (OLAP) systems to recall information originally stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to preserve the transactional context of the data captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in databases without the business rules that were used to capture them. This has necessitated a practice whereby separate sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture the data in the originating OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk semantic gaps between the information captured by OLTP systems and the information recalled through OLAP systems. Literature on modelling business transaction information as facts with context, as part of information systems modelling, was reviewed to identify design trends that contribute to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP system design depends critically on capturing facts with their associated context, encoding facts with context into data with business rules, storing and sourcing data together with those business rules, decoding data with business rules back into facts with context, and recalling facts with their associated context. The paper proposes UBIRQ, a design model to aid the co-design of data and business-rules storage for OLTP and OLAP purposes. The proposed model opens the way to multi-purpose databases and business-rules stores shared by OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with the executions of business rules, allowing both OLTP and OLAP systems to query data alongside the business rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the transactional context of the data captured by the respective OLTP system.
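As an illustration of the kind of co-storage argued for here, the sketch below models a transaction fact together with its context and the identifier and version of the business rule under which it was captured, so the same record can be interpreted consistently by OLTP and OLAP consumers. The field names and rule representation are hypothetical stand-ins; UBIRQ's actual structures are defined in the paper itself.

    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class BusinessRule:
        rule_id: str
        version: str
        expression: str          # the rule as executed at capture time

    @dataclass(frozen=True)
    class TransactionFact:
        fact_id: str
        measures: dict           # e.g. {"amount": 120.50, "quantity": 3}
        context: dict            # e.g. {"store": "S-12", "currency": "BRL"}
        captured_at: datetime
        captured_under: BusinessRule   # rule stored with the data, not apart from it

    # An OLAP query can then read the fact back together with the rule that
    # produced it, preserving the transaction's original semantics.
    fact = TransactionFact(
        fact_id="T-0001",
        measures={"amount": 120.50, "quantity": 3},
        context={"store": "S-12", "currency": "BRL"},
        captured_at=datetime.now(timezone.utc),
        captured_under=BusinessRule("discount-policy", "v3", "amount >= 100 -> 5% off"),
    )
    print(fact.captured_under.rule_id, fact.context["store"])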
Abstract:
Economic theory makes no predictions about social factors affecting decisions under risk. We examine situations in which a decision maker decides for herself and another person under conditions of payoff equality, and compare them to individual decisions. By estimating a structural model, we find that responsibility leaves utility curvature unaffected, but accentuates the subjective distortion of very small and very large probabilities for both gains and losses. We also find that responsibility reduces loss aversion, but that these results only obtain under some specific definitions of the latter. These results serve to generalize and reconcile some of the still largely contradictory findings in the literature. They also have implications for financial agency, which we discuss.
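The "distortion of very small and very large probabilities" is typically captured in such structural estimations by an inverse-S-shaped probability weighting function. One common one-parameter form, shown here only as an illustration and not necessarily the specification estimated in the paper, is Prelec's

    w(p) = \exp\left(-(-\ln p)^{\alpha}\right), \qquad 0 < \alpha < 1,

which overweights small probabilities and underweights large ones; accentuated distortion corresponds to a smaller \alpha.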
Abstract:
This paper presents a GIS-based multicriteria flood risk assessment and mapping approach applied to coastal drainage basins where hydrological data are not available. It addresses the risk posed by different types of processes: coastal inundation (storm surge) and river, estuarine and flash floods, in both urban and natural areas, as well as fords. Based on the causes of these processes, several environmental indicators were selected to build up the risk assessment. Geoindicators include geological-geomorphological properties of Quaternary sedimentary units, water table, drainage basin morphometry, coastal dynamics, beach morphodynamics and microclimatic characteristics. Bioindicators involve coastal plain and low-slope native vegetation categories and two alteration states. Anthropogenic indicators encompass land use category properties such as type, occupation density, urban structure type and degree of occupation consolidation. The selected indicators were stored within an expert Geoenvironmental Information System developed for the State of São Paulo Coastal Zone (SIIGAL), whose attributes were mathematically classified through deterministic approaches in order to estimate natural susceptibilities (Sn), human-induced susceptibilities (Sa), the return period of rain events (Ri), potential damages (Dp) and the risk classification (R), according to the equation R = (Sn · Sa · Ri) · Dp. Thematic maps were automatically processed within the SIIGAL, in which automata cells ("geoenvironmental management units") aggregating geological-geomorphological and land use/native vegetation categories were the units of classification. The method has been applied to 32 small drainage basins on the northern littoral of the State of São Paulo (Brazil), proving to be very useful for coastal zone public policies, civil defense programs and flood management.
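Written out, the risk classification applied within the SIIGAL combines the indicator layers multiplicatively:

    R = (S_n \cdot S_a \cdot R_i) \cdot D_p

where S_n is the natural susceptibility, S_a the human-induced susceptibility, R_i the return period of rain events and D_p the potential damage.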
Abstract:
Climate model projections show that climate change will further increase the risk of flooding in many regions of the world. There is a need for climate adaptation, but building new infrastructure or additional retention basins has its limits, especially in densely populated areas where open spaces are limited. Another solution is the more efficient use of the existing infrastructure. This research investigates a method for real-time flood control by means of existing gated weirs and retention basins. The method was tested for the specific study area of the Demer basin in Belgium but is generally applicable. Today, retention basins along the Demer River are controlled by means of adjustable gated weirs based on fixed logic rules. However, because of the high complexity of the system, only suboptimal results are achieved by these rules. By making use of precipitation forecasts and combined hydrological-hydraulic river models, the state of the river network can be predicted. To speed up the calculations, a conceptual river model was used. The conceptual model was combined with a Model Predictive Control (MPC) algorithm and a Genetic Algorithm (GA). The MPC algorithm predicts the state of the river network depending on the positions of the adjustable weirs in the basin. The GA generates these positions in a semi-random way. Cost functions based on water levels were introduced to evaluate the efficiency of each generation in terms of flood damage minimization. In the final phase of this research the influence of the most important MPC and GA parameters was investigated by means of a sensitivity study. The results show that the MPC-GA algorithm manages to reduce the total flood volume during the historical event of September 1998 by 46% in comparison with the current regulation. Based on the MPC-GA results, some recommendations could be formulated to improve the logic rules.
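To illustrate how such an MPC-GA coupling can work in principle, the sketch below evolves candidate weir settings over a prediction horizon with a genetic algorithm and scores each candidate by simulating a toy conceptual routing model and penalising water levels above a flood threshold. The single-storage dynamics, cost function and all parameter values are hypothetical stand-ins, not the Demer model used in the study.

    import random

    HORIZON = 12          # number of control steps in the prediction horizon
    FLOOD_LEVEL = 2.0     # water level (m) above which flood damage is counted
    POP, GENERATIONS = 40, 60

    def simulate(gate_settings, inflows, level0=1.0):
        """Toy conceptual model: one linear storage whose outflow is
        proportional to the gate opening (0..1). Returns simulated levels."""
        level, levels = level0, []
        for gate, inflow in zip(gate_settings, inflows):
            outflow = 1.5 * gate * level                  # hypothetical rating relation
            level = max(level + 0.1 * (inflow - outflow), 0.0)
            levels.append(level)
        return levels

    def cost(gate_settings, inflows):
        """Penalise exceedance of the flood level (proxy for flood damage)."""
        return sum(max(l - FLOOD_LEVEL, 0.0) ** 2 for l in simulate(gate_settings, inflows))

    def mpc_ga(inflow_forecast):
        """One MPC step: search gate trajectories with a simple GA and return
        the first gate setting of the best trajectory (receding horizon)."""
        pop = [[random.random() for _ in range(HORIZON)] for _ in range(POP)]
        for _ in range(GENERATIONS):
            pop.sort(key=lambda ind: cost(ind, inflow_forecast))
            parents = pop[: POP // 2]
            children = []
            while len(parents) + len(children) < POP:
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, HORIZON)
                child = a[:cut] + b[cut:]                 # one-point crossover
                i = random.randrange(HORIZON)
                child[i] = min(max(child[i] + random.gauss(0, 0.1), 0.0), 1.0)  # mutation
                children.append(child)
            pop = parents + children
        best = min(pop, key=lambda ind: cost(ind, inflow_forecast))
        return best[0]

    if __name__ == "__main__":
        forecast = [3.0] * 6 + [1.0] * 6   # hypothetical storm followed by recession
        print("first gate setting:", round(mpc_ga(forecast), 3))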
Abstract:
The Delaware River provides half of New York City's drinking water and is a habitat for wild trout, American shad and the federally endangered dwarf wedge mussel. It has suffered four 100-year floods in the last seven years. A drought during the 1960s stands as a warning of the potential vulnerability of the New York City area to severe water shortages if a similar drought were to recur. The water releases from three New York City dams on the Delaware River's headwaters affect not only the reliability of the city's water supply, but also the potential impact of floods and the quality of the aquatic habitat in the upper river. The goal of this work is to influence the Delaware River water release policies (FFMP/OST) to further benefit river habitat and fisheries without increasing New York City's drought risk or the flood risk to down-basin residents. The Delaware water release policies are constrained by the dictates of two US Supreme Court decrees (1931 and 1954) and the need for unanimity among four states – New York, New Jersey, Pennsylvania and Delaware – and New York City. Coordination of their activities and operation under the existing decrees is provided by the Delaware River Basin Commission (DRBC). Questions such as the probability of the system approaching a drought state under the current FFMP plan, and the severity of the 1960s drought, are addressed using long-record paleo-reconstructions of flows. For this study, we developed reconstructed total annual flows (water year) for the three reservoir inflows using regional tree rings, going back to 1754 (a total of 246 years). The reconstructed flows are used with a simple reservoir model to quantify droughts. We observe that the 1960s drought is by far the worst drought in the 246 years of simulations (since 1754).
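The kind of simple reservoir model referred to can be sketched as an annual mass balance that tracks storage against a fixed demand and flags deficit years. The capacity, demand and inflow numbers below are placeholders, not the actual Delaware system values or the tree-ring reconstructions.

    def simulate_reservoir(annual_inflows, capacity=100.0, demand=60.0, start=100.0):
        """Annual mass balance: storage is topped up by inflow, drawn down by a
        fixed demand, capped at capacity and floored at zero. Years in which the
        full demand cannot be met are flagged as drought years."""
        storage, drought_years = start, []
        for year, inflow in annual_inflows:
            available = min(storage + inflow, capacity)
            if available < demand:
                drought_years.append((year, demand - available))  # deficit volume
                storage = 0.0
            else:
                storage = available - demand
        return drought_years

    if __name__ == "__main__":
        # Hypothetical reconstructed water-year inflows, one dry year every 25 years.
        flows = [(1754 + i, 70.0 if i % 25 else 10.0) for i in range(246)]
        print(simulate_reservoir(flows)[:5])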
Abstract:
When the joint assumption of optimal risk sharing and coincidence of beliefs is added to the collective model of Browning and Chiappori (1998), income pooling and symmetry of the pseudo-Hicksian matrix are shown to be restored. Because these are also the features of the unitary model usually rejected in empirical studies, one may argue that these assumptions are at odds with the evidence. We argue that this need not be the case. The use of cross-section data to generate price and income variation is based on a definition of income pooling or symmetry suitable for testing the unitary model, but not the collective model with risk sharing. Also, by relaxing assumptions on beliefs, we show that symmetry and income pooling are lost. However, with the usual assumptions on the existence of assignable goods, we show that beliefs are identifiable. More importantly, if differences in beliefs are not too extreme, the risk sharing hypothesis is still testable.
Abstract:
The objective of this study is to propose the implementation of a statistical model for volatility estimation that is not widespread in the Brazilian literature, the local scale model (LSM), presenting its advantages and disadvantages relative to the models usually employed for risk measurement. The parameters are estimated from daily Ibovespa quotes from January 2009 to December 2014, and the empirical accuracy of the models is assessed with out-of-sample tests comparing the VaR estimates obtained for the period from January to December 2014. Explanatory variables were introduced in an attempt to improve the models, and the American counterpart of the Ibovespa, the Dow Jones index, was chosen because it displayed properties such as high correlation, Granger causality, and a significant log-likelihood ratio. One of the innovations of the local scale model is that it does not work directly with the variance but with its reciprocal, called the "precision" of the series, which follows a kind of multiplicative random walk. The LSM captured all the stylized facts of financial series, and the results favoured its use; the model is therefore an efficient and parsimonious specification for estimating and forecasting volatility, since it has only one parameter to be estimated, which represents a paradigm shift relative to conditional heteroskedasticity models.
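The out-of-sample comparison described here amounts to counting how often realized returns breach each model's one-day VaR. A minimal sketch of that violation count is shown below, with a placeholder constant-volatility forecast standing in for the LSM (or GARCH-type) filter and a Gaussian quantile assumed for the VaR; it assumes NumPy and SciPy are available.

    import numpy as np
    from scipy.stats import norm

    def var_violations(returns, sigma_forecasts, alpha=0.05):
        """Count days on which the realized return falls below the one-day
        parametric VaR implied by a volatility forecast (normal quantile)."""
        var = norm.ppf(alpha) * np.asarray(sigma_forecasts)   # negative threshold
        breaches = np.asarray(returns) < var
        return int(breaches.sum()), breaches.mean()

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        r = rng.normal(0, 0.015, 250)          # hypothetical daily returns
        sigma = np.full(250, 0.015)            # placeholder volatility forecasts
        n, rate = var_violations(r, sigma)
        print(f"{n} violations, empirical rate {rate:.3f} vs nominal 0.05")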
Abstract:
We develop an affine jump diffusion (AJD) model with the jump-risk premium being determined by both idiosyncratic and systematic sources of risk. While we maintain the classical affine setting of the model, we add a finite set of new state variables that affect the paths of the primitive, under both the actual and the risk-neutral measure, by being related to the primitive's jump process. Those new variables are assumed to be common to all the primitives. We present simulations to ensure that the model generates the volatility smile and compute the "discounted conditional characteristic function" transform that permits the pricing of a wide range of derivatives.
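For reference, in the affine setting this "discounted conditional characteristic function" is the standard exponential-affine transform of Duffie, Pan and Singleton (2000); written in generic notation (not the paper's specific state vector),

    \psi(u, X_t, t, T) = \mathbb{E}\left[ \exp\left(-\int_t^T r(X_s)\,ds\right) e^{u \cdot X_T} \,\middle|\, \mathcal{F}_t \right] = e^{\alpha(t) + \beta(t) \cdot X_t},

where \alpha(\cdot) and \beta(\cdot) solve Riccati-type ordinary differential equations determined by the affine drift, diffusion, jump-intensity and discount-rate coefficients.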
Abstract:
The aim of this article was to evaluate the use of fuzzy logic to estimate the possibility of neonatal death. A computational model was developed based on fuzzy set theory, with birth weight, gestational age, Apgar score and a history of stillbirth as input variables. The Mamdani inference method was employed, and the output variable was the risk of neonatal death. Twenty-four rules were created from the input variables, and the model was validated against a real database from a Brazilian city. Accuracy was estimated by the ROC curve, and the risks were compared with Student's t test. MATLAB 6.5 was used to build the model. Mean risks were lower for infants who survived (p < 0.001). The accuracy of the model was 0.90. The best accuracy was obtained with a risk possibility of 25% or less (sensitivity = 0.70, specificity = 0.98, negative predictive value = 0.99 and positive predictive value = 0.22). The model showed good accuracy and negative predictive value and can be used in general hospitals.
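A Mamdani system of this kind can be prototyped in Python with scikit-fuzzy (an assumption for illustration; the study itself used MATLAB 6.5). The two inputs, their membership functions and the two rules below are invented placeholders, not the 24 rules of the validated model.

    import numpy as np
    import skfuzzy as fuzz
    from skfuzzy import control as ctrl

    # Two of the four inputs, with hypothetical universes and membership functions.
    weight = ctrl.Antecedent(np.arange(500, 5001, 10), 'birth_weight_g')
    apgar = ctrl.Antecedent(np.arange(0, 11, 1), 'apgar')
    risk = ctrl.Consequent(np.arange(0, 101, 1), 'neonatal_death_risk_pct')

    weight['low'] = fuzz.trapmf(weight.universe, [500, 500, 1500, 2500])
    weight['normal'] = fuzz.trapmf(weight.universe, [2000, 2800, 5000, 5000])
    apgar['low'] = fuzz.trapmf(apgar.universe, [0, 0, 3, 6])
    apgar['high'] = fuzz.trapmf(apgar.universe, [5, 7, 10, 10])
    risk['low'] = fuzz.trimf(risk.universe, [0, 0, 30])
    risk['high'] = fuzz.trimf(risk.universe, [40, 100, 100])

    rules = [
        ctrl.Rule(weight['low'] & apgar['low'], risk['high']),
        ctrl.Rule(weight['normal'] & apgar['high'], risk['low']),
    ]

    system = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
    system.input['birth_weight_g'] = 1200
    system.input['apgar'] = 3
    system.compute()                       # Mamdani inference + centroid defuzzification
    print(round(system.output['neonatal_death_risk_pct'], 1))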
Abstract:
The occupational exposure limits of different risk factors for the development of low back disorders (LBDs) have not yet been established. One of the main problems in setting such guidelines is the limited understanding of how different risk factors for LBDs interact in causing injury, since the nature and mechanism of these disorders are relatively unknown phenomena. The industrial ergonomist's role is further complicated because the potential risk factors that may contribute to the onset of LBDs interact in a complex manner, which makes it difficult to discriminate in detail among the jobs that place workers at high or low risk of LBDs. The purpose of this paper was to compare predictions from the neural network-based model proposed by Zurada, Karwowski & Marras (1997) with those of a linear discriminant analysis model for classifying industrial jobs according to their potential risk of low back disorders due to workplace design. The results obtained by applying the discriminant analysis-based model showed that it is as effective as the neural network-based model. Moreover, the discriminant analysis-based model proved to be more advantageous regarding cost and time savings for future data gathering.
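The discriminant-analysis side of such a comparison is straightforward to reproduce with standard tooling. The sketch below fits scikit-learn's LinearDiscriminantAnalysis to synthetic job-exposure features (the feature names and class separation are invented stand-ins for the study's workplace risk factors) and reports held-out classification accuracy.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)

    # Synthetic "job" features: lift rate, load moment, twist velocity, sagittal flexion.
    n = 200
    low_risk = rng.normal([2.0, 10.0, 5.0, 15.0], 2.0, size=(n, 4))
    high_risk = rng.normal([6.0, 30.0, 12.0, 35.0], 2.0, size=(n, 4))
    X = np.vstack([low_risk, high_risk])
    y = np.array([0] * n + [1] * n)        # 0 = low-risk job, 1 = high-risk job

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    lda = LinearDiscriminantAnalysis().fit(X_train, y_train)
    print("held-out accuracy:", round(lda.score(X_test, y_test), 3))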
Abstract:
Objective: To analyze the association between maternal obesity and postnatal infectious complications in high-risk pregnancies. Methods: Prospective study from August 2009 through August 2010 with the following inclusion criteria: women up to the 5th postpartum day; age ≥ 18 years; high-risk pregnancy; singleton pregnancy with live fetus at labor onset; delivery at the institution; maternal weight measured on the day of delivery. Nutritional status in late pregnancy was assessed by the body mass index (BMI), with the application of the Atalah et al. curve. Patients were graded as underweight, adequate weight, overweight, or obese. Postpartum complications investigated during the hospital stay and 30 days post-discharge were: surgical wound infection and/or secretion, urinary infection, postpartum infection, fever, hospitalization, antibiotic use, and composite morbidity (at least one of the complications mentioned). Results: 374 puerperal women were included, graded according to the final BMI as: underweight (n = 54, 14.4%); adequate weight (n = 126, 33.7%); overweight (n = 105, 28.1%); and obese (n = 89, 23.8%). Maternal obesity showed a significant association with the following postpartum complications: surgical wound infection (16.7%, p = 0.042), urinary infection (9.0%, p = 0.004), antibiotic use (12.3%, p < 0.001), and composite morbidity (25.6%, p = 0.016). In the logistic regression model, obesity in late pregnancy remained an independent predictor of composite morbidity (OR: 2.09; 95% CI: 1.15-3.80, p = 0.015). Conclusion: Maternal obesity during late pregnancy in high-risk patients is independently associated with postpartum infectious complications, which demonstrates the need for closer follow-up of maternal weight gain in these pregnancies.
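Odds ratios of the kind reported here come from exponentiating logistic-regression coefficients. A minimal sketch with statsmodels on simulated data follows; the obesity indicator, outcome and effect size are synthetic, not the study's cohort.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 374
    obese = rng.binomial(1, 0.24, n)                 # exposure indicator
    # Simulate composite morbidity with roughly doubled odds for obese mothers.
    logit_p = -1.6 + np.log(2.0) * obese
    outcome = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    X = sm.add_constant(obese)
    fit = sm.Logit(outcome, X).fit(disp=0)
    odds_ratio = np.exp(fit.params[1])
    ci_low, ci_high = np.exp(fit.conf_int()[1])
    print(f"OR {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")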