73 results for Context-aware computing and systems
Abstract:
In the reinsurance market, the risks natural catastrophes pose to portfolios of properties must be quantified, so that they can be priced, and insurance offered. The analysis of such risks at a portfolio level requires a simulation of up to 800 000 trials with an average of 1000 catastrophic events per trial. This is sufficient to capture risk for a global multi-peril reinsurance portfolio covering a range of perils including earthquake, hurricane, tornado, hail, severe thunderstorm, wind storm, storm surge and riverine flooding, and wildfire. Such simulations are both computation and data intensive, making the application of high-performance computing techniques desirable.
In this paper, we explore the design and implementation of portfolio risk analysis on both multi-core and many-core computing platforms. Given a portfolio of property catastrophe insurance treaties, key risk measures, such as probable maximum loss, are computed by taking both primary and secondary uncertainties into account. Primary uncertainty is associated with whether or not an event occurs in a simulated year, while secondary uncertainty captures the uncertainty in the level of loss due to the use of simplified physical models and limitations in the available data. A combination of fast lookup structures, multi-threading and careful hand tuning of numerical operations is required to achieve good performance. Experimental results are reported for multi-core processors and for systems using NVIDIA graphics processing units (GPUs) and Intel Xeon Phi many-core accelerators.
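The simulation this abstract describes is, at its core, a Monte Carlo estimate of the annual loss distribution, from which measures such as probable maximum loss (PML) are read off as quantiles. The sketch below is a drastically scaled-down illustration assuming invented event probabilities and a lognormal severity model; it is not the paper's implementation, and all parameters are made up.

```python
import numpy as np

rng = np.random.default_rng(42)

# Scaled-down illustration: the paper uses up to 800 000 trials with
# ~1000 events per trial; the figures below are purely illustrative.
NUM_TRIALS = 10_000
EVENTS_PER_TRIAL = 100

def simulate_year_losses(num_trials, events_per_trial):
    """Aggregate loss for each simulated year (one trial = one year)."""
    # Primary uncertainty: does each event occur in the simulated year?
    occurs = rng.random((num_trials, events_per_trial)) < 0.3
    # Secondary uncertainty: given occurrence, the loss level is itself
    # uncertain; a lognormal severity is a common illustrative choice.
    severity = rng.lognormal(mean=10.0, sigma=1.0,
                             size=(num_trials, events_per_trial))
    return (occurs * severity).sum(axis=1)

def probable_maximum_loss(year_losses, return_period=250):
    """PML at a return period: the loss exceeded once per `return_period` years."""
    return np.quantile(year_losses, 1.0 - 1.0 / return_period)

losses = simulate_year_losses(NUM_TRIALS, EVENTS_PER_TRIAL)
print(f"250-year PML: {probable_maximum_loss(losses):,.0f}")
```

Because every trial is independent, the outer loop parallelises naturally across cores or accelerator threads, which is what makes the GPU and Xeon Phi ports in the paper attractive.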
Entrepreneurial Learning: Researching the Interface between Learning and the Entrepreneurial Context
Abstract:
This paper investigates two-stage stepwise identification for a class of nonlinear dynamic systems that can be described by linear-in-the-parameters models, where the model has to be built from a very large pool of basis functions or model terms. The main objective is to improve the compactness of the model obtained by forward stepwise methods while retaining computational efficiency. The proposed algorithm first generates an initial model using a forward stepwise procedure. The significance of each selected term is then reviewed at the second stage, and all insignificant ones are replaced, resulting in an optimised compact model with significantly improved performance. The main contribution of this paper is that these two stages are performed within a well-defined regression context, leading to significantly reduced computational complexity. The efficiency of the algorithm is confirmed by the computational complexity analysis, and its effectiveness is demonstrated by the simulation results.
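The two-stage scheme can be sketched in miniature: a greedy forward pass selects terms from the candidate pool, and a review pass revisits each selected term and swaps it for any unused candidate that lowers the residual sum of squares. This is a simplified, brute-force illustration of the idea, not the paper's algorithm; the paper's contribution is performing the review stage efficiently within a well-defined regression context, which this sketch does not attempt.

```python
import numpy as np

def forward_select(X, y, n_terms):
    """Stage 1: greedily pick the candidate column most correlated with
    the current residual, refitting after each pick (simplified sketch)."""
    selected, residual = [], y.copy()
    for _ in range(n_terms):
        scores = np.abs(X.T @ residual)
        scores[selected] = -np.inf          # do not pick a term twice
        selected.append(int(np.argmax(scores)))
        theta, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
        residual = y - X[:, selected] @ theta
    return selected

def review_terms(X, y, selected):
    """Stage 2: revisit each selected term; replace it with any unused
    candidate that yields a lower residual sum of squares."""
    selected = list(selected)
    for i in range(len(selected)):
        best_sse, best_col = None, selected[i]
        for j in range(X.shape[1]):
            if j in selected and j != selected[i]:
                continue                     # already used elsewhere
            trial = selected[:i] + [j] + selected[i + 1:]
            theta, *_ = np.linalg.lstsq(X[:, trial], y, rcond=None)
            sse = float(((y - X[:, trial] @ theta) ** 2).sum())
            if best_sse is None or sse < best_sse:
                best_sse, best_col = sse, j
        selected[i] = best_col
    return selected
```

Because the current term is always among the candidates tried at its own position, the review pass can never increase the residual, only reduce it or leave it unchanged.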
Abstract:
In the USA today, the precipitous rise of new financial mechanisms for the capitalisation of firms, as well as the merger and acquisition of others, especially risk equity capital through venture capitalists and investment banking, has sparked growth and helped to bring the economy out of the 1990s recession into a robust, continuous growth pattern well positioned for the next century. The scenario is not new. For the venture capitalists of "Silicon Valley" in California, the experience is not new. They have seen new industries arise before, like a phoenix from the ashes of ruin, despair and even failure. Venture capital poured into high-tech start-up companies has been an enormous source of financial support for the entrepreneurs who head new and growing companies. The mid-1990s marked the most dramatic increase yet recorded. Indicators such as the NASDAQ documented the solid and continuous growth in high-tech industries. The paper discusses investment in US corporations within the context of governance and management of the company. Discussion of the various forms of finance is related to the organisation and management of the US corporation. Critical to any firm today is its ability to find innovative new products or services. A growing literature on the resource-based framework for analysis will be discussed as part of the firm's development of research for commercialisation. The results of a recent survey further shed light on the relationship between corporate financial management and the resources allocated to research and development as the "engine" for new product development, and therefore corporate market share and growth. The conclusion is that more financial mechanisms will be created and changed within US corporate systems to adjust, grow, and expand companies in the global economic arena, as the inevitable economic pattern leads to mergers, consolidations, and increasing cooperation and alliances among firms.
Abstract:
Treasure et al. (2004) recently proposed a new subspace-monitoring technique, based on the N4SID algorithm, within the multivariate statistical process control framework. This dynamic-monitoring method requires considerably fewer variables to be analysed when compared with dynamic principal component analysis (PCA). The contribution charts and variable reconstruction traditionally employed for static PCA are analysed in a dynamic context. Both may be affected by the ratio of the number of retained components to the total number of analysed variables. Particular problems arise if this ratio is large, and a new reconstruction chart is introduced to overcome them. The utility of such a dynamic contribution chart and variable reconstruction is shown in a simulation and by application to industrial data from a distillation unit.
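The static-PCA monitoring quantities the abstract refers to can be illustrated with a minimal sketch: fit a PCA model on normal operating data, then compute the squared prediction error (Q statistic) of a new sample along with its per-variable contributions, which is what a contribution chart plots. All data and parameters here are invented for illustration; the dynamic subspace method of Treasure et al. is not reproduced.

```python
import numpy as np

def fit_pca(X, n_components):
    """Fit a static PCA model on (already scaled) normal operating data."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_components].T          # mean and loading matrix P

def spe_contributions(x, mu, P):
    """Squared prediction error (Q statistic) of one sample, plus the
    per-variable contributions a contribution chart would display."""
    r = (x - mu) - P @ (P.T @ (x - mu))     # residual after projection
    contrib = r ** 2
    return contrib.sum(), contrib

# Illustrative data: five correlated variables driven by two latent factors.
rng = np.random.default_rng(7)
latent = rng.normal(size=(500, 2))
X = latent @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(500, 5))
mu, P = fit_pca(X, n_components=2)

faulty = X[0].copy()
faulty[2] += 8.0                             # inject a fault on variable 2
spe, contrib = spe_contributions(faulty, mu, P)
```

In a contribution chart, the bars of `contrib` point the operator toward the variables most responsible for an alarm; the faulty variable typically dominates unless it is almost entirely captured by the retained components, which is exactly the large-ratio problem the abstract raises.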
Abstract:
Modelling and control of nonlinear dynamical systems is a challenging problem, since the dynamics of such systems change over their parameter space. Conventional methodologies for designing nonlinear control laws, such as gain scheduling, are effective because the designer partitions the overall complex control task into a number of simpler sub-tasks. This paper describes a new genetic-algorithm-based method for the design of a modular neural network (MNN) control architecture that learns such partitions of an overall complex control task. Here a chromosome represents both the structure and the parameters of an individual neural network in the MNN controller, and a hierarchical fuzzy approach is used to select the chromosomes required to accomplish a given control task. This new strategy is applied to the end-point tracking of a single-link flexible manipulator modelled from experimental data. Results show that the MNN controller is simple to design and produces superior performance compared to a single neural network (SNN) controller which is theoretically capable of achieving the desired trajectory. (C) 2003 Elsevier Ltd. All rights reserved.
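As a toy illustration of the chromosome encoding described above (structure and parameters in a single genome), the sketch below evolves a genome whose first gene sets the hidden-layer width of a tiny network and whose remaining genes are its weights, using a plain mutation-plus-elitism loop. Every name, the fitness task and the GA settings are illustrative assumptions; the paper's hierarchical fuzzy selection and modular architecture are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
MAX_HIDDEN = 8  # illustrative upper bound on hidden-layer width

def decode(chrom):
    """First gene (rounded) = number of active hidden units; the rest
    are input->hidden and hidden->output weights."""
    n_hidden = int(np.clip(np.rint(chrom[0]), 1, MAX_HIDDEN))
    w1 = chrom[1:1 + MAX_HIDDEN][:n_hidden]
    w2 = chrom[1 + MAX_HIDDEN:1 + 2 * MAX_HIDDEN][:n_hidden]
    return n_hidden, w1, w2

def predict(chrom, x):
    """One-hidden-layer tanh network with scalar input and output."""
    _, w1, w2 = decode(chrom)
    return np.tanh(np.outer(x, w1)) @ w2

def fitness(chrom, x, y):
    """Negative tracking error on the reference signal y(x)."""
    return -float(((predict(chrom, x) - y) ** 2).mean())

def evolve(x, y, pop_size=40, generations=60):
    """Keep the best half each generation, refill with mutated copies."""
    genes = 1 + 2 * MAX_HIDDEN
    pop = rng.normal(size=(pop_size, genes))
    for _ in range(generations):
        scores = np.array([fitness(c, x, y) for c in pop])
        elite = pop[np.argsort(scores)[-pop_size // 2:]]
        children = elite + rng.normal(scale=0.1, size=elite.shape)
        pop = np.vstack([elite, children])
    scores = np.array([fitness(c, x, y) for c in pop])
    return pop[int(np.argmax(scores))]
```

Because elites survive unchanged, the best fitness in the population is monotone non-decreasing across generations, a standard property this toy loop shares with more elaborate GA designs.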
Abstract:
Functional and non-functional concerns require different programming effort, different techniques and different methodologies when attempting to program efficient parallel/distributed applications. In this work we present a "programmer-oriented" methodology based on formal tools that permits reasoning about parallel/distributed program development and refinement. The proposed methodology is semi-formal in that it does not require the exploitation of highly formal tools and techniques, while providing palatable and effective support to programmers developing parallel/distributed applications, in particular when handling non-functional concerns.
Abstract:
Despite the substantial organisational benefits of integrated IT, the implementation of such systems, and particularly Enterprise Resource Planning (ERP) systems, has tended to be problematic, stimulating an extensive body of research into ERP implementation. This research has remained largely separate from the main IT implementation literature. At the same time, studies of IT implementation have generally adopted either a factor or a process approach; both have major limitations. To address these limitations, factor and process perspectives are combined here in a unique model of IT implementation. We argue that
• the organisational factors which determine successful implementation differ for integrated and traditional, discrete IT;
• failure to manage these differences is a major source of integrated IT failure.
The factor/process model is used as a framework for proposing differences between discrete and integrated IT.
Abstract:
This article distinguishes three different conceptions of the relationship between religion and the public sphere. The reconciliation of these different aspects of freedom of religion can be seen to give rise to considerable difficulties in practice, and the legal and political systems of several Western European countries are struggling to cope. Four recurring issues that arise in this context are identified and considered: what is a 'religion' and what are 'religious' beliefs and practices for the purposes of the protection of 'freedom of religion', together with the closely related issue of who decides these questions; what justification there is for a provision guaranteeing freedom of religion at all; which manifestations of religious association are so unacceptable as to take the association outside the protection of freedom of religion altogether; and what weight should be given to freedom of religion when this freedom stands opposed to other values. It is argued that the scope and meaning of human rights in this context is anything but settled and that this gives an opportunity to those who support a role for religion in public life to intervene.