46 results for risk-based modeling
                                
Abstract:
This paper discusses EU policies relating to securities markets created in the wake of the financial crisis, and how ICT, and specifically e-Government, can be utilised in this context. The study uses the UK as the basis for the discussion. The recent financial crisis has caused a change of perspective in relation to government services and policies. The regulation of the financial sector has been heavily criticised and is therefore undergoing radical change in the UK and the rest of Europe. New regulatory bodies are being defined, with more focus on a risk-based, system-wide approach to regulating the financial sector. This approach aims to prevent financial institutions from becoming too big to fail and thus requiring massive government bailouts. In addition, a new wave of EU regulation is on the way to update risk management practices and to further protect investors. This paper discusses the reasons for the financial crisis and the UK’s past and future regulatory landscape. The current and future approaches and strategies adopted by the UK’s financial regulators are reviewed, as is the lifecycle of EU Directives. The regulatory responses to the crisis are discussed and upcoming regulatory hotspots identified. Discussion of these issues provides the context for our evaluation of the role of e-Government and ICT in improving the regulatory system. We identify several processes that are fundamental to regulatory compliance and discuss how ICT is central to their implementation. The processes considered include those required for internal control and monitoring, risk management, record keeping and disclosure to regulatory bodies. We find these processes offer an excellent opportunity to adopt an e-Government approach, improving services to both regulated businesses and individual investors through the benefits derived from a more effective and efficient regulatory system.
                                
Abstract:
In this article, we illustrate experimentally an important consequence of the stochastic component in choice behaviour which has so far gone unacknowledged: its potential to produce ‘regression to the mean’ (RTM) effects. We employ a novel approach to individual choice under risk, based on repeated multiple-lottery choices (i.e. choices among many lotteries), to show how the high degree of stochastic variability present in individual decisions can crucially distort certain results through RTM effects. We demonstrate the point in the context of a social comparison experiment.
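To make the RTM mechanism concrete, the following minimal sketch (not the authors' experiment; all parameter values are illustrative assumptions) simulates a stable individual trait measured twice with stochastic noise, and shows that a group selected for extreme scores in the first round drifts back toward the mean in the second round with no treatment at all:

```python
# Minimal RTM sketch with hypothetical parameters: a stable trait plus
# independent stochastic choice noise, observed in two rounds.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
trait = rng.normal(0.0, 1.0, n)           # stable individual component
round1 = trait + rng.normal(0.0, 1.0, n)  # observed score, round 1
round2 = trait + rng.normal(0.0, 1.0, n)  # observed score, round 2

# Select the apparently most extreme quartile in round 1 ...
top = round1 > np.quantile(round1, 0.75)
# ... and watch its mean fall back toward zero in round 2 (pure RTM,
# no treatment effect of any kind).
print(f"round 1 mean of selected group: {round1[top].mean():.2f}")
print(f"round 2 mean of selected group: {round2[top].mean():.2f}")
```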
                                
Abstract:
Understanding the performance of banks is of the utmost importance due to the impact the sector may have on economic growth and financial stability. Residential mortgage loans constitute a large proportion of the portfolio of many banks and are one of the key assets in the determination of their performance. Using a dynamic panel model, we analyse the impact of residential mortgage loans on bank profitability and risk, based on a sample of 555 banks in the European Union (EU-15) over the period from 1995 to 2008. We find that an increase in residential mortgage loans seems to improve banks’ performance in terms of both profitability and credit risk in good, pre-financial-crisis market conditions. These findings may help explain why banks rush to lend to property during booms: such lending has a positive effect on performance. The results also show that credit risk and profitability are lower during the upturn in the residential property cycle.
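As a hedged illustration of the estimation idea only (not the paper's exact estimator or data), the sketch below fits a toy dynamic panel, current profitability on its own lag and a mortgage-loan share, using a within (fixed-effects) transformation on synthetic data; names such as `mortgage_share` are illustrative assumptions:

```python
# Toy dynamic panel on synthetic data; not the paper's specification.
import numpy as np

rng = np.random.default_rng(1)
n_banks, n_years = 50, 14                      # e.g. a 1995-2008 window
bank_effect = rng.normal(0, 0.5, n_banks)      # unobserved heterogeneity
mortgage_share = rng.uniform(0.1, 0.6, (n_banks, n_years))

roa = np.zeros((n_banks, n_years))             # return on assets
for t in range(1, n_years):
    roa[:, t] = (0.4 * roa[:, t - 1] + 0.8 * mortgage_share[:, t]
                 + bank_effect + rng.normal(0, 0.1, n_banks))

# Within transformation removes bank fixed effects, then OLS on
# (lagged ROA, mortgage share). Note: this simple estimator is biased in
# short panels (Nickell bias); Arellano-Bond-style GMM is the usual fix.
y = roa[:, 1:]; x1 = roa[:, :-1]; x2 = mortgage_share[:, 1:]
demean = lambda a: (a - a.mean(axis=1, keepdims=True)).ravel()
X = np.column_stack([demean(x1), demean(x2)])
beta = np.linalg.lstsq(X, demean(y), rcond=None)[0]
print(f"lagged-ROA coef: {beta[0]:.2f}, mortgage-share coef: {beta[1]:.2f}")
```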
                                
Abstract:
The evidence for anthropogenic climate change continues to strengthen, and concerns about severe weather events are increasing. As a result, scientific interest is rapidly shifting from detection and attribution of global climate change to prediction of its impacts at the regional scale. However, nearly everything we have any confidence in when it comes to climate change is related to global patterns of surface temperature, which are primarily controlled by thermodynamics. In contrast, we have much less confidence in atmospheric circulation aspects of climate change, which are primarily controlled by dynamics and exert a strong control on regional climate. Model projections of circulation-related fields, including precipitation, show a wide range of possible outcomes, even on centennial timescales. Sources of uncertainty include low-frequency chaotic variability and the sensitivity to model error of the circulation response to climate forcing. As the circulation response to external forcing appears to project strongly onto existing patterns of variability, knowledge of errors in the dynamics of variability may provide some constraints on model projections. Nevertheless, higher scientific confidence in circulation-related aspects of climate change will be difficult to obtain. For effective decision-making, it is necessary to move to a more explicitly probabilistic, risk-based approach.
                                
Abstract:
An efficient data-based modeling algorithm for nonlinear system identification is introduced for radial basis function (RBF) neural networks, with the aim of maximizing generalization capability based on the concept of leave-one-out (LOO) cross validation. Each of the RBF kernels has its own kernel width parameter, and the basic idea is to optimize the multiple pairs of regularization parameters and kernel widths, each of which is associated with a kernel, one at a time within the orthogonal forward regression (OFR) procedure. Thus, each OFR step consists of one model term selection based on the LOO mean square error (LOOMSE), followed by the optimization of the associated kernel width and regularization parameter, also based on the LOOMSE. Since the same LOOMSE is adopted for model selection, as in our previous state-of-the-art local regularization assisted orthogonal least squares (LROLS) algorithm, the proposed OFR algorithm is likewise capable of producing a very sparse RBF model with excellent generalization performance. Unlike the LROLS algorithm, which requires an additional iterative loop to optimize the regularization parameters as well as an additional procedure to optimize the kernel width, the proposed OFR algorithm optimizes both the kernel widths and regularization parameters within a single OFR procedure, and consequently the required computational complexity is dramatically reduced. Nonlinear system identification examples are included to demonstrate the effectiveness of this new approach in comparison with the well-known support vector machine and least absolute shrinkage and selection operator approaches, as well as the LROLS algorithm.
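The following sketch illustrates the general flavour of LOO-scored forward selection for an RBF model; it is not the authors' OFR algorithm (the kernel width is fixed and no per-kernel regularization is optimized; the data are hypothetical). It relies on the standard closed form for leave-one-out residuals of a least-squares fit, e_i/(1 - H_ii), so each candidate term can be scored without refitting per left-out point:

```python
# Greedy forward selection of RBF centres scored by closed-form LOO error.
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(-3, 3, 120)[:, None]
y = np.sinc(x).ravel() + rng.normal(0.0, 0.05, 120)

width = 0.8                                        # assumed common width
K = np.exp(-((x - x.T) ** 2) / (2 * width ** 2))   # one candidate per datum

selected, best_loo = [], np.inf
while True:
    best_j = None
    for j in range(K.shape[1]):
        if j in selected:
            continue
        P = K[:, selected + [j]]
        H = P @ np.linalg.pinv(P)                  # least-squares hat matrix
        e = y - H @ y
        d = np.clip(1.0 - np.diag(H), 1e-12, None)
        loo = np.mean((e / d) ** 2)                # closed-form LOOMSE
        if loo < best_loo:
            best_loo, best_j = loo, j
    if best_j is None:                             # LOOMSE stopped improving
        break
    selected.append(best_j)

print(f"selected {len(selected)} centres, LOOMSE = {best_loo:.5f}")
```

Because the LOOMSE eventually rises as terms are added, the same criterion that ranks candidate terms also provides the stopping rule, which is what yields a sparse model.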
                                
Abstract:
The globalization of trade in fish has created many challenges for the developing world, specifically with regard to food safety and quality. International organisations have established a good basis for standards in international trade. Whilst these requirements are frequently embraced by the major importers (such as Japan, the EU and the USA), they often impose additional safety requirements and regularly identify batches which fail to meet their strict standards. Creating an effective national seafood control system which meets both internal national needs and the requirements of the export market can be challenging. Many countries adopt a dual system in which seafood products for the major export markets are subject to tight control whilst the majority of products (whether for the local market or for more regional trade) are less tightly controlled. With regional liberalization also occurring, deciding on appropriate controls is complex. In the Sultanate of Oman, fisheries production is one of the country’s chief sources of economic revenue after oil production and is a major source of the national food supply. In this paper the structure of the fish supply chain is analysed, highlighting the different routes operating for the different markets. Although much of the fish is consumed within Oman, there is a major export trade to the local regional markets. Much smaller quantities meet the more stringent standards imposed by the major importing countries, and exports to these are limited. The paper considers the development of the Omani fish control system, including the key legislative documents and the administrative structures that have been developed. Establishing modern controls which satisfy the demands of the major importers is possible but places additional costs on businesses. Enhanced controls such as HACCP and other management standards are required but can be difficult to justify when alternative markets do not specify them. These enhanced controls do, however, provide additional consumer protection and can bring benefits to local consumers. The Omani government is attempting to upgrade the system of controls and has made tremendous progress towards implementing HACCP and introducing enhanced management systems into its industrial sector. The existence of strengthened legislative and government support, including subsidies, has encouraged some businesses to implement HACCP. The current control systems have been reviewed and a SWOT analysis used to identify key factors for their future development. The study shows that seafood products in the supply chain are often exposed to lengthy handling and distribution processes before reaching consumers, a typical issue faced by many developing countries. As seafood products are often perishable, their safety is compromised if not adequately controlled. The enforcement of current food safety laws in the Sultanate of Oman is shared across various government agencies. Consequently, there is a need to harmonize all regulatory requirements, enhance domestic food protection, and continue to work towards a fully risk-based approach in order to compete successfully in the global market.
                                
Abstract:
The extent to which a given extreme weather or climate event is attributable to anthropogenic climate change is a question of considerable public interest. From a scientific perspective, the question can be framed in various ways, and the answer depends very much on the framing. One such framing is a risk-based approach, which answers the question probabilistically, in terms of a change in likelihood of a class of event similar to the one in question, and natural variability is treated as noise. A rather different framing is a storyline approach, which examines the role of the various factors contributing to the event as it unfolded, including the anomalous aspects of natural variability, and answers the question deterministically. It is argued that these two apparently irreconcilable approaches can be viewed within a common framework, where the most useful level of conditioning will depend on the question being asked and the uncertainties involved.
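In the risk-based framing, the change in likelihood is commonly summarised by the probability ratio PR = p1/p0 and the fraction of attributable risk FAR = 1 - p0/p1, where p0 and p1 are the probabilities of the event class without and with anthropogenic forcing. A minimal sketch with synthetic ensembles (the distributions and the event threshold are illustrative assumptions, not results from any climate model):

```python
# Risk-based event attribution in miniature: compare event frequencies in
# a counterfactual ("natural") and a factual ("forced") synthetic ensemble.
import numpy as np

rng = np.random.default_rng(3)
threshold = 2.0                                 # defines the event class
natural = rng.normal(0.0, 1.0, 100_000)         # counterfactual climate
forced = rng.normal(0.4, 1.0, 100_000)          # factual (shifted) climate

p0 = np.mean(natural > threshold)
p1 = np.mean(forced > threshold)
print(f"PR = {p1 / p0:.2f}, FAR = {1 - p0 / p1:.2f}")
```

Natural variability is folded into p0 and p1 as noise, which is exactly the contrast with the storyline approach described above, where the anomalous variability of the particular event is conditioned upon rather than averaged over.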
                                
Abstract:
Following a workshop exercise, two models, an individual-based landscape model (IBLM) and a non-spatial life-history model, were used to assess the impact of a fictitious insecticide on populations of skylarks in the UK. The chosen population endpoints were abundance, population growth rate, and the chances of population persistence. Both models used the same life-history descriptors and toxicity profiles as the basis for their parameter inputs. The models differed in that exposure was a pre-determined parameter in the life-history model but an emergent property of the IBLM, and the IBLM required a landscape structure as an input. The model outputs were qualitatively similar between the two models. Under conditions dominated by winter wheat, both models predicted a population decline that was worsened by the use of the insecticide. Under broader habitat conditions, population declines were only predicted for the scenarios where the insecticide was added. Inputs to the models are very different, with the IBLM requiring a large volume of data in order to achieve the flexibility of being able to integrate a range of environmental and behavioural factors. The life-history model has very few explicit data inputs, but some of these relied on extensive prior modelling needing additional data, as described in Roelofs et al. (2005, this volume). Both models have strengths and weaknesses; hence the ideal approach combines the use of both simple and comprehensive modeling tools.
                                
                                
Abstract:
It is known that germin, which is a marker of the onset of growth in germinating wheat, is an oxalate oxidase, and also that germins possess sequence similarity with legumin and vicilin seed storage proteins. These two pieces of information have been combined in order to generate a 3D model of germin based on the structure of vicilin and to examine the model with regard to a potential oxalate oxidase active site. A cluster of three histidine residues has been located within the conserved beta-barrel structure. While there is a relatively low level of overall sequence similarity between the model and the vicilin structures, the conservation of amino acids important in maintaining the scaffold of the beta-barrel lends confidence to the juxtaposition of the histidine residues. The cluster is similar structurally to those found in copper amine oxidase and other proteins, leading to the suggestion that it defines a metal-binding location within the oxalate oxidase active site. It is also proposed that the structural elements involved in intermolecular interactions in vicilins may play a role in oligomer formation in germin/oxalate oxidase.
                                
Abstract:
Smooth flow of production in construction is hampered by the disparity between individual trade teams’ goals and the goal of stable production flow for the project as a whole. This is exacerbated by the difficulty of visualizing the flow of work in a construction project. Building information modeling (BIM) addresses some of these issues by providing a powerful platform for visualizing work flow in control systems that also enable pull flow and deeper collaboration between teams on and off site. The requirements for implementation of a BIM-enabled pull flow construction management software system based on the Last Planner System™, called ‘KanBIM’, have been specified, and a set of functional mock-ups of the proposed system has been implemented and evaluated in a series of three focus group workshops. The requirements cover the areas of maintenance of work flow stability, enabling negotiation and commitment between teams, lean production planning with sophisticated pull flow control, and effective communication and visualization of flow. The evaluation results show that the system holds the potential to improve work flow and reduce waste by providing both process and product visualization at the work face.
                                
                                
Abstract:
We developed a stochastic simulation model incorporating most processes likely to be important in the spread of Phytophthora ramorum and similar diseases across the British landscape (covering Rhododendron ponticum in woodland and nurseries, and Vaccinium myrtillus in heathland). The simulation allows for movements of diseased plants within a realistically modelled trade network and for long-distance natural dispersal. A series of simulation experiments was run with the model, varying the epidemic pressure and the linkage between natural vegetation and the horticultural trade, with or without disease spread in commercial trade, and with or without inspections-with-eradication, giving a 2 x 2 x 2 x 2 factorial design started at 10 arbitrary locations spread across England. Fifty replicate simulations were made at each set of parameter values. Individual epidemics varied dramatically in size due to stochastic effects throughout the model. Across a range of epidemic pressures, the size of the epidemic was 5-13 times larger when commercial movement of plants was included. A key unknown factor in the system is the area of susceptible habitat outside the nursery system. Inspections, with a probability of detection and efficiency of infected-plant removal of 80% and made at 90-day intervals, reduced the size of epidemics by about 60% across the three sectors with a density of 1% susceptible plants in broadleaf woodland and heathland. Reducing this density to 0.1% largely isolated the trade network, so that inspections reduced the final epidemic size by over 90%, and most epidemics ended without escape into nature. Even in this case, however, major wild epidemics developed in a few percent of cases. Provided the number of new introductions remains low, the current inspection policy will control most epidemics. However, as the rate of introduction increases, it can overwhelm any reasonable inspection regime, largely due to spread prior to detection.
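A toy sketch of the core mechanism, far simpler than the model described above, is given below: stochastic spread along a random trade network with 90-day inspections that detect and remove infected nodes with 80% probability, mirroring the inspection regime in the abstract. The network density, transmission rate, and one-year horizon are all illustrative assumptions:

```python
# Minimal stochastic spread-with-inspections sketch; not the paper's model.
import numpy as np

rng = np.random.default_rng(4)
n = 500                                            # nurseries/habitat patches
trade = rng.random((n, n)) < 0.01                  # sparse random trade links
infected = np.zeros(n, bool)
infected[rng.choice(n, 10, replace=False)] = True  # 10 seed introductions

for day in range(1, 361):
    # Trade moves infection along links from infected patches with a small
    # per-link daily probability (here 0.005).
    pressure = trade[:, infected].sum(axis=1)
    infected |= rng.random(n) < 1 - 0.995 ** pressure
    if day % 90 == 0:                              # 90-day inspection cycle
        detected = infected & (rng.random(n) < 0.8)
        infected[detected] = False                 # eradicate on detection

print(f"infected patches after one year: {infected.sum()}")
```

Re-running with different seeds reproduces the qualitative point about stochasticity: identical parameters give epidemics of very different sizes.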
                                
Abstract:
This chapter introduces agent-based models (ABMs), their construction, and the pros and cons of their use. Although relatively new, ABMs have great potential for use in ecotoxicological research, their primary advantage being the realistic simulations that can be constructed, and particularly their explicit handling of space and time. Examples of their use in ecotoxicology are provided, drawn primarily from different implementations of the ALMaSS system. These examples demonstrate how multiple stressors, landscape structure, details regarding toxicology, animal behavior, and socioeconomic effects can and should be taken into account when constructing simulations for risk assessment. As in ecological systems, the system-level behavior of an ABM is not simply the mean of the component responses but the sum of the often nonlinear interactions between components in the system; hence this modeling approach opens the door to implementing and testing much more realistic and holistic ecotoxicological models than are currently used.
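As a minimal illustration of that emergence argument (not ALMaSS; every number below is an assumption), the following sketch lets agents forage across a partially sprayed landscape with a threshold dose-response, so population mortality arises from the distribution of individual exposures rather than from the mean-field dose:

```python
# Tiny ABM sketch: exposure emerges from individual movement histories.
import numpy as np

rng = np.random.default_rng(5)
grid = 50
sprayed = rng.random((grid, grid)) < 0.2           # 20% of cells treated
pos = rng.integers(0, grid, (1000, 2))             # 1000 agents on the grid
dose = np.zeros(1000)

for step in range(30):                             # 30 daily foraging moves
    pos = (pos + rng.integers(-1, 2, pos.shape)) % grid
    dose += sprayed[pos[:, 0], pos[:, 1]]          # +1 hit per sprayed cell

# Nonlinear (threshold) dose-response: mortality only above 10 hits.
dead = dose > 10
print(f"population mortality: {dead.mean():.1%} "
      f"(mean-field expectation is {sprayed.mean() * 30:.1f} hits, "
      f"below the 10-hit threshold)")
```

The mean-field dose sits below the lethality threshold, yet some agents die because their individual foraging paths concentrate exposure, which is precisely the system-level behavior that is not the mean of component responses.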
                                
Abstract:
Polycondensation of 2,6-dihydroxynaphthalene with 4,4'-bis(4"-fluorobenzoyl)biphenyl affords a novel, semicrystalline poly(ether ketone) with a melting point of 406 °C and a glass transition temperature (onset) of 168 °C. Molecular modeling and diffraction-simulation studies of this polymer, coupled with data from the single-crystal structure of an oligomer model, have enabled the crystal and molecular structure of the polymer to be determined from X-ray powder data. This structure, the first for any naphthalene-containing poly(ether ketone), is fully ordered, in monoclinic space group P2(1)/b, with two chains per unit cell. Rietveld refinement against the experimental powder data gave a final agreement factor (R-wp) of 6.7%.
 
                    