913 results for risk-based modeling
Abstract:
In this contribution we aim at anchoring Agent-Based Modeling (ABM) simulations in actual models of human psychology. More specifically, we apply unidirectional ABM to social psychological models using low-level (i.e., intra-individual) agents to examine whether they generate better predictions of behavioral intentions and of the behavior itself than standard statistical approaches. Moreover, this contribution tests to what extent the predictive validity of models of attitude such as the Theory of Planned Behavior (TPB) or the Model of Goal-directed Behavior (MGB) depends on the assumption that people's decisions and actions are purely rational. Simulations were therefore run with agents deviating from rationality to different degrees, using a trembling-hand method. Two data sets, concerning soft-drink consumption and physical activity respectively, were used. Three key findings emerged from the simulations. First, compared to the standard statistical approach, the agent-based simulation generally improves the prediction of behavior from intention. Second, the improvement in prediction is inversely proportional to the complexity of the underlying theoretical model. Finally, introducing varying degrees of deviation from rationality in agents' behavior can improve the goodness of fit of the simulations. By demonstrating the potential of ABM as a complementary perspective for evaluating social psychological models, this contribution underlines the necessity of better defining agents in terms of psychological processes before examining higher levels such as interactions between individuals.
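The abstract does not give the exact perturbation rule, but a trembling-hand deviation from rationality is commonly implemented as a small probability that a random action overrides the intended one. A minimal sketch, assuming a binary behavior and a uniform tremble (the function name and the epsilon value are illustrative, not the paper's):

```python
import random

def trembling_hand_choice(intended_action, actions, epsilon):
    """With probability epsilon the agent 'trembles' and picks a
    uniformly random action instead of the intended one."""
    if random.random() < epsilon:
        return random.choice(actions)
    return intended_action

# Example: an agent intends to perform the behavior (1) but
# deviates from rationality 10% of the time.
actions = [0, 1]  # 0 = do not perform the behavior, 1 = perform it
choices = [trembling_hand_choice(1, actions, epsilon=0.10) for _ in range(10_000)]
print(sum(choices) / len(choices))  # ≈ 0.95, i.e. 1 - epsilon/2
```

Sweeping epsilon upwards from zero then traces how goodness of fit changes with the degree of deviation from rationality, the comparison reported in the abstract.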
Abstract:
Design summer years representing near-extreme hot summers have been used in the United Kingdom for the evaluation of thermal comfort and overheating risk. The years have been selected from measured weather data taken as representative of an assumed stationary climate. Recent developments have made available 'morphed' equivalents of these years, produced by shifting and stretching the measured variables using change factors from the UKCIP02 climate projections. The release of the latest, probabilistic climate projections of UKCP09, together with the availability of a weather generator that can produce plausible daily or hourly sequences of weather variables, has opened up the opportunity to generate new design summer years for use in risk-based decision-making. There are many possible methods for producing design summer years from UKCP09 output: in this article, the original concept of the design summer year is largely retained, but a number of alternative methodologies for generating the years are explored. An alternative, more robust measure of warmth (weighted cooling degree hours) is also employed. It is demonstrated that the UKCP09 weather generator is capable of producing years for the baseline period that are comparable with those in current use. Four methodologies for the generation of future years are described, and their output is related to the future (deterministic) years currently available. It is concluded that, in general, years produced from the UKCP09 projections are warmer than those generated previously. Practical applications: The methodologies described in this article will enable designers who have access to the output of the UKCP09 weather generator (WG) to generate Design Summer Year hourly files tailored to their needs. The files produced will differ according to the methodology selected, in addition to location, emissions scenario and timeslice.
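The abstract names weighted cooling degree hours as its warmth measure without giving the formula; one plausible form weights each hour's exceedance above a base temperature by raising it to a power, so hotter hours count disproportionately. A minimal sketch, in which both the base temperature and the exponent are assumptions rather than the article's values:

```python
def weighted_cooling_degree_hours(hourly_temps, base=22.0, weight_exp=2.0):
    """Sum of (T - base)**weight_exp over all hours with T above the
    base temperature, so hotter hours contribute disproportionately."""
    return sum((t - base) ** weight_exp for t in hourly_temps if t > base)

# Example: three hours at 21, 24 and 28 degrees C with a 22 degree base.
print(weighted_cooling_degree_hours([21.0, 24.0, 28.0]))  # 2**2 + 6**2 = 40.0
```

Ranking candidate years by such a weighted measure, rather than by plain degree hours, makes the selection less sensitive to many mildly warm hours and more sensitive to the hot spells that drive overheating risk.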
Abstract:
Glacier fluctuations due exclusively to internal variability in the climate system are simulated using downscaled integrations of the ECHAM4/OPYC coupled general circulation model (GCM). A process-based modeling approach is applied, combining a mass balance model of intermediate complexity with a dynamic ice flow model that considers simple shearing flow and sliding. Multimillennial records of glacier length fluctuations for Nigardsbreen (Norway) and Rhonegletscher (Switzerland) are simulated using autoregressive processes determined by statistically downscaled GCM experiments. Return periods and probabilities of specific glacier length changes are analyzed using GCM integrations that exclude external forcings such as solar irradiance changes, volcanic eruptions, or anthropogenic effects, and are compared to historical glacier length records. Preindustrial fluctuations of the glaciers, as far as they have been observed or reconstructed, including their advance during the "Little Ice Age," can be explained by internal variability in the climate system as represented by a GCM. However, fluctuations comparable to the present-day glacier retreat exceed any variation simulated by the GCM control experiments and must be caused by external forcing, with anthropogenic forcing being a likely candidate.
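The multimillennial forcing records are generated from autoregressive processes fitted to the downscaled GCM output. A minimal AR(1) sketch of such a series (the persistence and noise parameters below are hypothetical, not the paper's fitted values):

```python
import numpy as np

def simulate_ar1(n_years, phi, sigma, seed=0):
    """Simulate an AR(1) process x_t = phi * x_{t-1} + eps_t,
    eps_t ~ N(0, sigma^2), as a stand-in for year-to-year
    mass-balance (or length-anomaly) variability."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_years)
    for t in range(1, n_years):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    return x

# Hypothetical parameters: strong year-to-year persistence, unit noise.
anomalies = simulate_ar1(10_000, phi=0.9, sigma=1.0)
print(anomalies.std())  # ≈ sigma / sqrt(1 - phi**2) ≈ 2.29
```

Feeding many such realizations through the mass balance and ice flow models is what allows return periods of specific length changes to be estimated, rather than a single deterministic history.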
Abstract:
Four CO2 concentration inversions and the Global Fire Emissions Database (GFED) versions 2.1 and 3 are used to provide benchmarks for climate-driven modeling of the global land-atmosphere CO2 flux and the contribution of wildfire to this flux. The Land surface Processes and exchanges (LPX) model is introduced. LPX is based on the Lund-Potsdam-Jena Spread and Intensity of FIRE (LPJ-SPITFIRE) model with amended fire probability calculations. LPX omits human ignition sources yet simulates many aspects of global fire adequately. It captures the major features of the observed geographic pattern in burnt area and its seasonal timing, as well as the unimodal relationship of burnt area to precipitation. It simulates features of geographic variation in the sign of the interannual correlations of burnt area with antecedent dryness and precipitation. It simulates well the interannual variability of the global total land-atmosphere CO2 flux. There are differences among the global burnt area time series from GFED2.1, GFED3 and LPX, but some features are common to all. GFED3 fire CO2 fluxes account for only about one-third of the variation in total CO2 flux during 1997–2005. This relationship appears to be dominated by the strong climatic dependence of deforestation fires. The relationship of LPX-modeled fire CO2 fluxes to total CO2 fluxes is weak. Observed and modeled total CO2 fluxes track the El Niño–Southern Oscillation (ENSO) closely; GFED3 burnt area and global fire CO2 flux track ENSO much less closely. The GFED3 fire CO2 flux-ENSO connection is most prominent for the El Niño of 1997–1998, which produced exceptional burning conditions in several regions, especially equatorial Asia. The sign of the observed relationship between ENSO and fire varies regionally, and LPX captures the broad features of this variation. These complexities underscore the need for process-based modeling to assess the consequences of global change for fire and its implications for the carbon cycle.
Abstract:
This paper discusses EU policies relating to securities markets created in the wake of the financial crisis, and how ICT, and specifically e-Government, can be utilised within this context. The study uses the UK as the basis for the discussion. The recent financial crisis has caused a change of perspective in relation to government services and policies. The regulation of the financial sector has been heavily criticised and is therefore undergoing radical change in the UK and the rest of Europe. New regulatory bodies are being defined, with more focus on a risk-based, system-wide approach to regulating the financial sector. This approach aims to prevent financial institutions from becoming too big to fail and thus requiring massive government bail-outs. In addition, a new wave of EU regulation is on the horizon to update risk management practices and to further protect investors. This paper discusses the reasons for the financial crisis and the UK's past and future regulatory landscape. The current and future approaches and strategies adopted by the UK's financial regulators are reviewed, as is the lifecycle of EU Directives. The regulatory responses to the crisis are discussed and upcoming regulatory hotspots identified. Discussion of these issues provides the context for our evaluation of the role of e-Government and ICT in improving the regulatory system. We identify several processes that are fundamental to regulatory compliance and discuss how ICT is central to their implementation. The processes considered include those required for internal control and monitoring, risk management, record keeping and disclosure to regulatory bodies. We find that these processes offer an excellent opportunity to adopt an e-Government approach, improving services to both regulated businesses and individual investors through a more effective and efficient regulatory system.
Abstract:
In this article, we illustrate experimentally an important consequence of the stochastic component in choice behaviour that has not been acknowledged so far: its potential to produce 'regression to the mean' (RTM) effects. We employ a novel approach to individual choice under risk, based on repeated multiple-lottery choices (i.e. choices among many lotteries), to show how the high degree of stochastic variability present in individual decisions can crucially distort certain results through RTM effects. We demonstrate the point in the context of a social comparison experiment.
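A regression-to-the-mean effect of this kind can be reproduced in a few lines of simulation: select subjects on an extreme first noisy measurement and their second measurement drifts back toward the mean, even though the underlying trait is unchanged. A minimal sketch, with all distributions assumed standard normal for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
trait = rng.normal(0.0, 1.0, 100_000)    # stable individual risk attitude
noise1 = rng.normal(0.0, 1.0, 100_000)   # stochastic choice component, round 1
noise2 = rng.normal(0.0, 1.0, 100_000)   # independent noise, round 2
score1 = trait + noise1
score2 = trait + noise2

# Select the apparently most risk-seeking decile on round 1...
extreme = score1 > np.quantile(score1, 0.9)
# ...and watch their mean score fall on round 2, with no change in the trait.
print(score1[extreme].mean(), score2[extreme].mean())  # e.g. ≈ 2.48 vs ≈ 1.24
```

Any design that conditions a second measurement on an extreme first one, such as a social comparison treatment targeted at extreme choosers, is exposed to exactly this artifact.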
Abstract:
Understanding the performance of banks is of the utmost importance due to the impact the sector may have on economic growth and financial stability. Residential mortgage loans constitute a large proportion of the portfolio of many banks and are one of the key assets in the determination of their performance. Using a dynamic panel model, we analyse the impact of residential mortgage loans on bank profitability and risk, based on a sample of 555 banks in the European Union (EU-15) over the period from 1995 to 2008. We find that an increase in residential mortgage loans seems to improve banks' performance in terms of both profitability and credit risk in good, pre-financial-crisis market conditions. These findings may help explain why banks rush to lend on property during booms: such lending has a positive effect on performance. The results also show that credit risk and profitability are lower during the upturn in the residential property cycle.
Abstract:
The evidence for anthropogenic climate change continues to strengthen, and concerns about severe weather events are increasing. As a result, scientific interest is rapidly shifting from detection and attribution of global climate change to prediction of its impacts at the regional scale. However, nearly everything we have any confidence in when it comes to climate change is related to global patterns of surface temperature, which are primarily controlled by thermodynamics. In contrast, we have much less confidence in atmospheric circulation aspects of climate change, which are primarily controlled by dynamics and exert a strong control on regional climate. Model projections of circulation-related fields, including precipitation, show a wide range of possible outcomes, even on centennial timescales. Sources of uncertainty include low-frequency chaotic variability and the sensitivity to model error of the circulation response to climate forcing. As the circulation response to external forcing appears to project strongly onto existing patterns of variability, knowledge of errors in the dynamics of variability may provide some constraints on model projections. Nevertheless, higher scientific confidence in circulation-related aspects of climate change will be difficult to obtain. For effective decision-making, it is necessary to move to a more explicitly probabilistic, risk-based approach.
Abstract:
An efficient data-based modeling algorithm for nonlinear system identification is introduced for radial basis function (RBF) neural networks, with the aim of maximizing generalization capability based on the concept of leave-one-out (LOO) cross validation. Each of the RBF kernels has its own kernel width parameter, and the basic idea is to optimize the multiple pairs of regularization parameters and kernel widths, each pair associated with one kernel, one at a time within the orthogonal forward regression (OFR) procedure. Thus, each OFR step consists of one model term selection based on the LOO mean square error (LOOMSE), followed by the optimization of the associated kernel width and regularization parameter, also based on the LOOMSE. Since, as in our previous state-of-the-art local regularization assisted orthogonal least squares (LROLS) algorithm, the same LOOMSE is adopted for model selection, the proposed new OFR algorithm is also capable of producing a very sparse RBF model with excellent generalization performance. Unlike the previous LROLS algorithm, which requires an additional iterative loop to optimize the regularization parameters as well as an additional procedure to optimize the kernel width, the proposed new OFR algorithm optimizes both the kernel widths and the regularization parameters within a single OFR procedure, and consequently the required computational complexity is dramatically reduced. Nonlinear system identification examples are included to demonstrate the effectiveness of this new approach in comparison to the well-known support vector machine and least absolute shrinkage and selection operator approaches, as well as the LROLS algorithm.
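The paper's OFR algorithm interleaves LOOMSE-based term selection with per-kernel width and regularization tuning inside an orthogonalized regression. The sketch below keeps only the core idea: greedy forward selection of RBF kernels by leave-one-out MSE, computed cheaply through the ridge hat-matrix identity. It refits naively rather than orthogonally and holds a single width and regularization parameter fixed, so it is a simplified stand-in, not the paper's algorithm:

```python
import numpy as np

def gaussian_kernel(X, centers, width):
    """Design matrix of Gaussian RBF responses, one column per kernel."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def loo_mse(Phi, y, lam):
    """Leave-one-out MSE of ridge regression via the hat-matrix identity
    e_i^loo = e_i / (1 - h_ii), avoiding n separate refits."""
    n, m = Phi.shape
    A = Phi.T @ Phi + lam * np.eye(m)
    H = Phi @ np.linalg.solve(A, Phi.T)
    resid = y - H @ y
    return float(np.mean((resid / (1.0 - np.diag(H))) ** 2))

def forward_select(X, y, width=1.0, lam=1e-3, max_terms=10):
    """Greedily add the kernel center that most reduces the LOOMSE,
    stopping when no candidate improves it further."""
    Phi_full = gaussian_kernel(X, X, width)
    chosen, best = [], np.inf
    for _ in range(max_terms):
        scores = [(loo_mse(Phi_full[:, chosen + [j]], y, lam), j)
                  for j in range(len(X)) if j not in chosen]
        score, j = min(scores)
        if score >= best:
            break
        best, chosen = score, chosen + [j]
    return chosen, best

# Toy 1-D example: y = sin(x) with noise.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (60, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=60)
terms, score = forward_select(X, y)
print(len(terms), round(score, 4))
```

The paper's contribution is precisely what this sketch omits: optimizing a separate width and regularization parameter for each selected kernel inside the same forward pass, at low cost thanks to orthogonalization.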
Abstract:
The globalization of trade in fish has created many challenges for the developing world, specifically with regard to food safety and quality. International organisations have established a good basis for standards in international trade. Whilst these requirements are frequently embraced by the major importers (such as Japan, the EU and the USA), these importers often impose additional safety requirements and regularly identify batches which fail to meet their strict standards. Creating an effective national seafood control system which meets both internal national needs and the requirements of the export market can be challenging. Many countries adopt a dual system in which seafood products for the major export markets are subject to tight control whilst the majority of products (whether for the local market or for more regional trade) are less tightly controlled. With regional liberalization also occurring, deciding on appropriate controls is complex. In the Sultanate of Oman, fisheries production is one of the country's chief sources of economic revenue after oil production and a major source of the national food supply. In this paper the structure of the fish supply chain is analysed, highlighting the different routes operating for the different markets. Although much of the fish is consumed within Oman, there is a major export trade to the local regional markets. Much smaller quantities meet the more stringent standards imposed by the major importing countries, and exports to these are limited. The paper considers the development of the Omani fish control system, including the key legislative documents and the administrative structures that have been developed. Establishing modern controls which satisfy the demands of the major importers is possible but places additional costs on businesses. Enhanced controls such as HACCP and other management standards are required but can be difficult to justify when alternative markets do not specify them. These enhanced controls do, however, provide additional consumer protection and can bring benefits to local consumers. The Omani government is attempting to upgrade the system of controls and has made tremendous progress towards implementing HACCP and introducing enhanced management systems into its industrial sector. The existence of strengthened legislative and government support, including subsidies, has encouraged some businesses to implement HACCP. The current control systems are reviewed and a SWOT analysis approach is used to identify key factors for their future development. The study shows that seafood products in the supply chain are often exposed to a lengthy handling and distribution process before reaching consumers, a typical issue faced by many developing countries. As seafood products are often perishable, their safety is compromised if not adequately controlled. The enforcement of current food safety laws in the Sultanate of Oman is shared across various government agencies. Consequently, there is a need to harmonize all regulatory requirements, enhance domestic food protection, and continue to work towards a fully risk-based approach in order to compete successfully in the global market.
Abstract:
The extent to which a given extreme weather or climate event is attributable to anthropogenic climate change is a question of considerable public interest. From a scientific perspective, the question can be framed in various ways, and the answer depends very much on the framing. One such framing is a risk-based approach, which answers the question probabilistically, in terms of a change in likelihood of a class of event similar to the one in question, and natural variability is treated as noise. A rather different framing is a storyline approach, which examines the role of the various factors contributing to the event as it unfolded, including the anomalous aspects of natural variability, and answers the question deterministically. It is argued that these two apparently irreconcilable approaches can be viewed within a common framework, where the most useful level of conditioning will depend on the question being asked and the uncertainties involved.
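The risk-based framing described here is typically quantified by comparing the probability of the event class in the factual (forced) climate with its probability in a counterfactual climate without anthropogenic forcing. The abstract does not spell these formulas out; the following are the standard quantities of the event-attribution literature, the probability ratio and the fraction of attributable risk:

```latex
% p_1: probability of the event class in the factual (forced) climate
% p_0: probability in a counterfactual climate without anthropogenic forcing
\[
  \mathrm{PR} = \frac{p_1}{p_0}, \qquad
  \mathrm{FAR} = 1 - \frac{p_0}{p_1}
\]
```

The storyline approach, by contrast, conditions on the observed dynamical situation and asks how the forcing altered that particular event, which is why the two framings answer differently posed questions rather than contradicting one another.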
Abstract:
This study contributes a rigorous diagnostic assessment of state-of-the-art multiobjective evolutionary algorithms (MOEAs) and highlights key advances that the water resources field can exploit to better discover the critical tradeoffs constraining our systems. This study provides the most comprehensive diagnostic assessment of MOEAs for water resources to date, exploiting more than 100,000 MOEA runs and trillions of design evaluations. The diagnostic assessment measures the effectiveness, efficiency, reliability, and controllability of ten benchmark MOEAs for a representative suite of water resources applications addressing rainfall-runoff calibration, long-term groundwater monitoring (LTM), and risk-based water supply portfolio planning. The suite of problems encompasses a range of challenging problem properties including (1) many-objective formulations with 4 or more objectives, (2) multi-modality (or false optima), (3) nonlinearity, (4) discreteness, (5) severe constraints, (6) stochastic objectives, and (7) non-separability (also called epistasis). The applications are representative of the dominant problem classes that have shaped the history of MOEAs in water resources and that will be dominant foci in the future. Recommendations are provided for which modern MOEAs should serve as tools and benchmarks in the future water resources literature.
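The abstract does not define its effectiveness metric, but hypervolume, the volume of objective space dominated by an approximation set relative to a reference point, is a standard effectiveness measure in MOEA diagnostics. A minimal two-objective (minimization) sketch, with a hypothetical front and reference point:

```python
def hypervolume_2d(front, ref):
    """Area dominated by a 2-objective minimization front, bounded by
    a reference point; larger is better."""
    pts = sorted(front)                      # ascending in objective 1
    nd = []                                  # keep only non-dominated points
    for f1, f2 in pts:
        if not nd or f2 < nd[-1][1]:
            nd.append((f1, f2))
    area, prev_f2 = 0.0, ref[1]
    for f1, f2 in nd:                        # sum horizontal slabs
        area += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return area

# Hypothetical front and reference point.
print(hypervolume_2d([(1, 4), (2, 2), (4, 1)], ref=(5, 5)))  # 11.0
```

Diagnostic studies of this kind track how such a metric evolves over a run, and across random seeds and parameterizations, to score the efficiency, reliability and controllability dimensions the abstract lists.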
Abstract:
This paper discusses the possibility of creating an insurance mechanism for the liabilities of defined-benefit private pension plans in Brazil. The experience of several countries that have created public insurance mechanisms against this type of event is analysed. It was observed that these mechanisms do not charge an actuarially fair premium, exposing the insurer to moral hazard. A proposal is presented for the regulation of a private, voluntary insurance scheme aimed at defined-benefit plans in Brazil, based on a survey of pension fund managers and insurance companies.
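For reference, an actuarially fair premium is one that equals the expected insured loss; a sketch of the standard definition for a discrete set of loss scenarios (the notation is ours, not the paper's):

```latex
% Actuarially fair premium: expected insured loss over scenarios
% with losses L_i occurring with probabilities p_i.
\[
  \pi^{\mathrm{fair}} = \mathbb{E}[L] = \sum_i p_i \, L_i
\]
% Charging less than this subsidizes risk bearing and invites moral hazard.
```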
Abstract:
The implementation of the Plano Real in 1994 brought changes to the revenue structure of Brazilian banks. In times of high inflation, banks' results consisted substantially of income from financing the country's domestic debt and, to a lesser extent, of other revenues such as loans to the private sector. With the stabilization of the economy and the globalization of world financial markets, reflected in the entry of foreign banks into the Brazilian market, interest rates tended to fall, shifting the focus of banks, which are now concentrating on financial intermediation. This project presents the basic composition of a bank's result from financial intermediation and discusses the risks of banking activity. The focus is on credit risk, covering a description of the main analysis methodologies. Resolution CMN/BACEN No. 2682 is studied, which changed the accounting of income from debt renegotiation and established minimum parameters for the classification of credit operations, altering the criteria for constituting the allowance for loan losses. It is explained how a RAROC (Risk-Adjusted Return on Capital) model, originally developed by Bankers Trust, can be used to manage the credit portfolio of a typical retail bank. For illustration, and considering that statistical data on credit operations are scarce in the Brazilian market, in addition to the difficulties of obtaining data from a real credit portfolio due to banking secrecy and investment strategies, the RAROC model is applied to a fictitious retail bank credit portfolio created especially for this purpose. The study does not cover the resources required to implement the model, nor its customization for other types of banks, being restricted to an analysis of the use of the methodology. Finally, we present our conclusions regarding credit risk management based on the use of a RAROC model.
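The thesis's exact RAROC specification is not given in the abstract; the textbook form divides risk-adjusted net income (revenue net of funding costs, operating costs and expected loss) by the economic capital held against unexpected loss. A minimal sketch, with hypothetical portfolio figures:

```python
def raroc(revenue, funding_cost, operating_cost, expected_loss, economic_capital):
    """Risk-adjusted return on capital for a credit portfolio:
    risk-adjusted net income over the economic capital that backs
    unexpected losses."""
    risk_adjusted_income = revenue - funding_cost - operating_cost - expected_loss
    return risk_adjusted_income / economic_capital

# Hypothetical retail portfolio figures (in R$ million).
print(raroc(revenue=120.0, funding_cost=45.0, operating_cost=25.0,
            expected_loss=20.0, economic_capital=150.0))  # 0.20, i.e. 20%
```

Deducting expected loss from income, rather than treating it as capital, is the design choice that makes the ratio comparable across portfolios of different riskiness.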
Abstract:
This thesis consists of three essays. The first essay analyses the public information available on the risk of Brazilian banks' credit portfolios and is divided into two chapters. The first chapter analyses the limitations of the public information disclosed by banks and by the Central Bank when compared to the management information available internally to banks. It concludes that there is room for greater transparency in disclosure, something that has been happening gradually in Brazil through new rules related to Pillar 3 of Basel II and to the publication of more detailed information by the Central Bank (Bacen), such as the "Top50" report. The second part of the first essay shows the discrepancy between the accounting non-performing loan ratio (NPL) and the probability of default (PD), and also discusses the relationship between provisions and expected loss. Using migration matrices and a simulation based on overlapping vintages of the credit portfolios of large banks, it concludes that the NPL ratio underestimates the PD and that the provisions set aside by banks are lower than the expected loss of the National Financial System (SFN). The second essay relates risk management to price discrimination. A model was developed consisting of a Cournot duopoly in a retail credit market in which banks can practise third-degree price discrimination. In this model, potential borrowers can be of two types, low or high risk, with low-risk borrowers having more elastic demand. According to the model, if the cost of observing a client's type is high, the banks' strategy will be not to discriminate (a pooling equilibrium). But if this cost is sufficiently low, it will be optimal for banks to charge different rates to each group. It is argued that the Basel II Accord acted as an exogenous shock that shifted the equilibrium towards greater discrimination. The third essay is divided into two chapters. The first discusses the application of the concepts of subjective probability and Knightian uncertainty to VaR models, and the importance of assessing "model risk", which comprises estimation, specification and identification risks. The essay proposes that the "four elements" methodology of operational risk (internal data, external data, business environment and scenarios) be extended to the measurement of other risks (market risk and credit risk). The second part of this last essay deals with applying the scenario-analysis element to the measurement of conditional volatility on dates of relevant economic releases, specifically on the days of Copom meetings.
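A rating migration matrix yields a default probability as the mass absorbed into the default state over a horizon, which is how a forward-looking PD can be contrasted with a point-in-time NPL ratio. A minimal sketch with a hypothetical two-rating matrix (the transition probabilities are illustrative, not the thesis's estimates):

```python
import numpy as np

# Hypothetical one-year migration matrix over states (A, B, Default);
# Default is absorbing and rows sum to 1.
M = np.array([
    [0.90, 0.08, 0.02],   # A -> A, B, D
    [0.10, 0.75, 0.15],   # B -> A, B, D
    [0.00, 0.00, 1.00],   # D -> D
])

def cumulative_pd(M, start_state, horizon_years):
    """Cumulative probability of reaching the absorbing default state
    within the horizon, from a power of the migration matrix."""
    return np.linalg.matrix_power(M, horizon_years)[start_state, -1]

print(cumulative_pd(M, start_state=0, horizon_years=1))  # 0.02 one-year PD for A
print(cumulative_pd(M, start_state=0, horizon_years=5))  # multi-year PD compounds
```

Because the NPL ratio only counts loans currently past due, while the PD also captures migration toward default, the two can diverge substantially, which is the discrepancy the first essay documents.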