991 results for Default time


Relevance:

60.00%

Publisher:

Abstract:

In this work we study a new risk model for a firm that is sensitive to its credit quality, proposed by Yang (2003). We obtain recursive equations for the finite-time ruin probability and the distribution of the ruin time, as well as Volterra-type integral equation systems for the ultimate ruin probability, the severity of ruin, and the distribution of the surplus before and after ruin.
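
The abstract does not spell out Yang's (2003) model, so the sketch below only illustrates the quantity being computed: a Monte Carlo estimate of a finite-time ruin probability for a generic discrete-time surplus process with premium income and i.i.d. claims. All model choices (exponential claims, parameter values) are assumptions for illustration, not Yang's specification.

    import random

    def finite_time_ruin_prob(u0=10.0, premium=1.0, claim_mean=0.95,
                              horizon=50, n_paths=100_000, seed=0):
        """Monte Carlo estimate of P(ruin before `horizon`) for a toy
        discrete-time surplus process U_{t+1} = U_t + premium - claim_t.
        Illustrative only; not the credit-quality-sensitive model of Yang (2003)."""
        rng = random.Random(seed)
        ruined = 0
        for _ in range(n_paths):
            u = u0
            for _ in range(horizon):
                u += premium - rng.expovariate(1.0 / claim_mean)
                if u < 0:  # ruin: surplus falls below zero
                    ruined += 1
                    break
        return ruined / n_paths

    print(finite_time_ruin_prob())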

Relevance:

40.00%

Publisher:

Abstract:

We argue that it is possible to adapt the approach of imposing restrictions on available plans through finitely effective debt constraints, introduced by Levine and Zame (1996), to encompass models with default and collateral. Along this line, we introduce in the setting of Araujo, Páscoa and Torres-Martínez (2002) and Páscoa and Seghir (2008) the concept of almost finite-time solvency. We show that the conditions imposed in these two papers to rule out Ponzi schemes implicitly restrict actions to be almost finite-time solvent. We define the notion of equilibrium with almost finite-time solvency and look for sufficient conditions for its existence. Under a mild assumption on default penalties, namely that agents are myopic with respect to default penalties, we prove that existence is guaranteed (and Ponzi schemes are ruled out) when actions are restricted to be almost finite-time solvent. The proof is very simple and intuitive. In particular, the main existence results in Araujo et al. (2002) and Páscoa and Seghir (2008) are simple corollaries of our existence result.

Relevance:

30.00%

Publisher:

Abstract:

The time for conducting Preventive Maintenance (PM) on an asset is often determined using a predefined alarm limit based on trends of a hazard function. In this paper, the authors propose using both hazard and reliability functions to improve the accuracy of the prediction, particularly when the failure characteristics over the asset's whole life are modelled using different failure distributions for the different stages of its life. The proposed method is validated using simulations and case studies.
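
As a rough sketch of how the two functions can be combined into a PM trigger, the code below assumes a Weibull life distribution (the abstract does not name one) and returns the earliest time at which either the hazard exceeds an alarm limit or the reliability drops below a threshold. The limits, parameters and search grid are illustrative assumptions, not values from the paper.

    import math

    def weibull_hazard(t, shape, scale):
        """Hazard h(t) = (k/lam) * (t/lam)**(k-1) for Weibull(shape=k, scale=lam)."""
        return (shape / scale) * (t / scale) ** (shape - 1)

    def weibull_reliability(t, shape, scale):
        """Reliability R(t) = exp(-(t/lam)**k)."""
        return math.exp(-((t / scale) ** shape))

    def pm_time(shape=2.5, scale=1000.0, hazard_limit=0.004,
                reliability_limit=0.90, t_max=2000.0, step=1.0):
        """Earliest time at which either alarm criterion fires (illustrative rule)."""
        t = step
        while t <= t_max:
            if (weibull_hazard(t, shape, scale) >= hazard_limit or
                    weibull_reliability(t, shape, scale) <= reliability_limit):
                return t
            t += step
        return None  # no PM needed within the horizon

    print(pm_time())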

Relevance:

30.00%

Publisher:

Abstract:

In this thesis we are interested in financial risk and the instrument we want to use is Value-at-Risk (VaR). VaR is the maximum loss over a given period of time at a given confidence level. Many definitions of VaR exist and some will be introduced throughout this thesis. There are two main ways to measure risk and VaR: through volatility and through percentiles. Large volatility in financial returns implies a greater probability of large losses, but also a larger probability of large profits. Percentiles describe tail behaviour. The estimation of VaR is a complex task. It is important to know the main characteristics of financial data to choose the best model. The existing literature is very wide, at times controversial, but helpful in drawing a picture of the problem. It is commonly recognised that financial data are characterised by heavy tails, time-varying volatility, asymmetric response to bad and good news, and skewness. Ignoring any of these features can lead to underestimating VaR, with a possible ultimate consequence being the default of the protagonist (firm, bank or investor). In recent years, skewness has attracted special attention. An open problem is the detection and modelling of time-varying skewness. Is skewness constant, or is there significant variability which in turn can affect the estimation of VaR? This thesis aims to answer this question and to open the way to a new approach to modelling simultaneously time-varying volatility (conditional variance) and skewness. The new tools are modifications of the Generalised Lambda Distributions (GLDs). They are four-parameter distributions which allow the first four moments to be modelled nearly independently: in particular we are interested in what we will call para-moments, i.e., mean, variance, skewness and kurtosis. The GLDs will be used in two different ways. Firstly, semi-parametrically, we consider a moving window to estimate the parameters and calculate the percentiles of the GLDs. Secondly, parametrically, we attempt to extend the GLDs to include time-varying dependence in the parameters. We use local linear regression to estimate the conditional mean and conditional variance semi-parametrically. The method is not efficient enough to capture all the dependence structure in the three indices (ASX 200, S&P 500 and FT 30); however, it provides an idea of the DGP underlying the process and helps in choosing a good technique to model the data. We find that the GLDs suggest that moments up to the fourth order do not always exist; their existence appears to vary over time. This is a very important finding, considering that past papers (see for example Bali et al., 2008; Hashmi and Tay, 2007; Lanne and Pentti, 2007) modelled time-varying skewness while implicitly assuming the existence of the third moment. However, the GLDs suggest that the mean, variance, skewness and, in general, the conditional distribution vary over time, as already suggested by the existing literature. The GLDs give good results in estimating VaR on the three real indices (ASX 200, S&P 500 and FT 30), with results very similar to those provided by historical simulation.
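
The thesis benchmarks against historical simulation, i.e. VaR read off as an empirical percentile over a rolling window. A minimal sketch of that baseline (window length and confidence level are illustrative, not the thesis's settings) might look like this:

    import numpy as np

    def historical_var(returns, window=250, alpha=0.99):
        """One-day VaR_alpha at each date: the empirical (1-alpha) quantile of the
        previous `window` returns, reported as a positive loss."""
        returns = np.asarray(returns, dtype=float)
        var = np.full(returns.shape, np.nan)
        for t in range(window, len(returns)):
            var[t] = -np.quantile(returns[t - window:t], 1.0 - alpha)
        return var

    # toy usage with simulated heavy-tailed returns (illustrative data only)
    rng = np.random.default_rng(0)
    r = rng.standard_t(df=4, size=1000) * 0.01
    print(historical_var(r)[-1])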

Relevance:

30.00%

Publisher:

Abstract:

Travel time is an important network performance measure, and it quantifies congestion in a manner easily understood by all transport users. In urban networks, travel time estimation is challenging for a number of reasons, such as fluctuations in traffic flow due to traffic signals and significant flow to/from mid-link sinks/sources. The classical analytical procedure utilises cumulative plots at upstream and downstream locations to estimate travel time between the two locations. In this paper, we discuss the issues and challenges with the classical analytical procedure, such as its vulnerability to non-conservation of flow between the two locations. The complexity with respect to exit-movement-specific travel time is discussed. Recently, we developed a methodology utilising the classical procedure to estimate average travel time and its statistics on urban links (Bhaskar, Chung et al. 2010), in which detector, signal and probe vehicle data are fused. In this paper we extend the methodology to route travel time estimation and test its performance using simulation. The originality lies in defining cumulative plots for each exit turning movement utilising a historical database that is self-updated after each estimation. The performance is also compared with a method based solely on probe data (Probe-only). The performance of the proposed methodology is found to be insensitive to different route flows, with an average accuracy of more than 94% given one probe per estimation interval, which is an increase of more than 5% in accuracy with respect to the Probe-only method.
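
The classical cumulative-plot idea can be sketched as follows: under conservation of flow and FIFO, the travel time of the n-th vehicle is the horizontal gap between the upstream and downstream cumulative count curves at height n. The toy data and interpolation below are illustrative assumptions, not the fused detector/signal/probe procedure of Bhaskar, Chung et al. (2010).

    import numpy as np

    def travel_times(t_up, n_up, t_down, n_down, n_vehicles):
        """Travel time of each vehicle rank n as the horizontal distance between the
        upstream cumulative curve N_up(t) and the downstream curve N_down(t).
        Assumes FIFO and conservation of flow between the two detectors."""
        ranks = np.arange(1, n_vehicles + 1)
        t_enter = np.interp(ranks, n_up, t_up)     # time the n-th vehicle passes upstream
        t_exit = np.interp(ranks, n_down, t_down)  # time the n-th vehicle passes downstream
        return t_exit - t_enter

    # toy example: 100 vehicles over 600 s; every count appears 45 s later downstream
    t_up = np.linspace(0, 600, 61)
    n_up = np.linspace(0, 100, 61)
    t_down = t_up + 45.0
    n_down = n_up.copy()
    tt = travel_times(t_up, n_up, t_down, n_down, 100)
    print(tt.mean())   # average link travel time, 45 s in this toy case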

Relevance:

30.00%

Publisher:

Abstract:

Entrepreneurship research and practice place emphasis on company growth as a measure of entrepreneurial success. In many cases, there has been a tendency to give growth a very central role, with some researchers even seeing growth as the very essence of entrepreneurship (Cole, 1949; Sexton, 1997; Stevenson & Gumpert, 1991). A large number of empirical studies of the performance of young and/or small firms use growth as the dependent variable (see reviews by Ardishvili, Cardozo, Harmon, & Vadakath, 1998; Delmar, 1997; Wiklund, 1998). By contrast, the two most prominent views of strategic management – strategic positioning (Porter, 1980) and the resource-based view (Barney, 1991; Wernerfelt, 1984) – are both concerned with achieving competitive advantage and regard achieving economic rents and profitability relative to other competitors as the central measures of firm performance. Strategic entrepreneurship integrates these two perspectives and is simultaneously concerned with opportunity seeking and advantage seeking (Hitt, Ireland, Camp, & Sexton, 2002; Ireland, Hitt, & Sirmon, 2003). Consequently, company growth and relative profitability are both relevant measures of firm performance in the domain of strategic entrepreneurship.

Relevance:

30.00%

Publisher:

Abstract:

The paper describes the sensitivity of simulated precipitation to changes in the convective relaxation time scale (TAU) of the Zhang and McFarlane (ZM) cumulus parameterization in the NCAR Community Atmosphere Model version 3 (CAM3). In the default configuration of the model, the prescribed value of TAU, a characteristic time scale with which convective available potential energy (CAPE) is removed at an exponential rate by convection, is assumed to be 1 h. However, some recent observational findings suggest that it is larger by around one order of magnitude. In order to explore the sensitivity of the model simulation to TAU, two model frameworks have been used, namely aqua-planet and actual-planet configurations. Numerical integrations have been carried out using different values of TAU, and its effect on simulated precipitation has been analyzed. The aqua-planet simulations reveal that when TAU increases, the rate of deep convective precipitation (DCP) decreases, and this leads to an accumulation of convective instability in the atmosphere. Consequently, the moisture content in the lower- and mid-troposphere increases. On the other hand, the shallow convective precipitation (SCP) and large-scale precipitation (LSP) intensify, predominantly the SCP, thus capping the accumulation of convective instability in the atmosphere. The total precipitation (TP) remains approximately constant, but the proportion of the three components changes significantly, which in turn alters the vertical distribution of total precipitation production. The vertical structure of moist heating changes from a vertically extended profile to a bottom-heavy profile with the increase of TAU. The altitude of the maximum vertical velocity shifts from the upper troposphere to the lower troposphere. A similar response was seen in the actual-planet simulations. With an increase in TAU from 1 h to 8 h, there was a significant improvement in the simulation of the seasonal mean precipitation. The fraction of deep convective precipitation was in much better agreement with satellite observations.
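
A back-of-the-envelope way to see what TAU controls: the closure removes CAPE at an exponential rate with time scale TAU, so a larger TAU means slower consumption of instability by deep convection. The sketch below integrates dCAPE/dt = -CAPE/TAU for TAU = 1 h and 8 h with an arbitrary initial CAPE; the numbers are illustrative, not CAM3 output.

    import math

    def cape_remaining(cape0, tau_hours, t_hours):
        """CAPE left after t hours when consumed at exponential rate 1/TAU:
        CAPE(t) = CAPE0 * exp(-t / TAU)."""
        return cape0 * math.exp(-t_hours / tau_hours)

    cape0 = 1500.0  # J/kg, arbitrary initial value for illustration
    for tau in (1.0, 8.0):
        left = cape_remaining(cape0, tau, t_hours=3.0)
        print(f"TAU = {tau:>3.0f} h: {left:7.1f} J/kg of CAPE remains after 3 h")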

Relevance:

30.00%

Publisher:

Abstract:

Information and Communication Technology (ICT) is becoming increasingly central to many people’s lives, making it possible to be connected in any place at any time, be unceasingly and instantly informed, and benefit from greater economic and educational opportunities. With all the benefits afforded by these new-found capabilities, however, come potential drawbacks. A plethora of new PCs, laptops, tablets, smartphones, Bluetooth, the internet, Wi-Fi (the list goes on) expects us to know, or be able to guess, what, where and when to connect, click, double-click, tap, flick or scroll in order to realise these benefits, and to have the physical and cognitive capability to do all these things. One of the groups most affected by this increase in high-demand technology is older people. They do not understand and use technology in the same way that younger generations do, because they grew up in the simpler electro-mechanical era and embedded that particular model of the world in their minds. Any consequential difficulty in familiarising themselves with modern ICT and effectively applying it to their needs can also be exacerbated by age-related changes in vision, motor control and cognitive functioning. Such challenges lead to digital exclusion. Much has been written about this topic over the years, usually by academics from the area of inclusive product design. The issue is complex and it is fair to say that no one researcher has the whole picture. It is difficult to understand and adequately address the issue of digital exclusion among the older generation without looking across disciplines and at industry’s and government’s understanding, motivation and efforts toward resolving this important problem. To do otherwise is to risk misunderstanding the true impact that ICT has and could have on people’s lives across all generations. In this European Year of Active Ageing and Solidarity between Generations, and as the British government moves forward with its Digital by Default initiative as part of a wider objective to make ICT accessible to as many people as possible by 2015, the Engineering Design Centre (EDC) at the University of Cambridge collaborated with BT to produce a book of thought pieces to address, and where appropriate redress, these important and long-standing issues. “Ageing, Adaption and Accessibility: Time for the Inclusive Revolution!” brings together opinions and insights from twenty-one prominent thought leaders from government, industry and academia regarding the problems, opportunities and strategies for combating digital exclusion among senior citizens. The contributing experts were selected as individuals, rather than representatives of organisations, to provide the broadest possible range of perspectives. They are renowned in their respective fields and their opinions are formed not only from their own work, but also from the contributions of others in their area. Their views were elicited through conversations conducted by the editors of this book, who then drafted the thought pieces to be edited and approved by the experts. We hope that this unique collection of thought pieces will give you a broader perspective on ageing and people’s adaptation to the ever-changing world of technology, and insights into better ways of designing digital devices and services for the older population.

Relevance:

30.00%

Publisher:

Abstract:

This paper describes a methodology for deploying flexible dynamic configuration into embedded systems whilst preserving the reliability advantages of static systems. The methodology is based on the concept of decision points (DPs), which are strategically placed to achieve fine-grained distribution of self-management logic to meet application-specific requirements. DP logic can be changed easily, and independently of the host component, enabling self-management behavior to be deferred beyond the point of system deployment. A transparent Dynamic Wrapper mechanism (DW) automatically detects and handles problems arising from the evaluation of self-management logic within each DP, and ensures that the dynamic aspects of the system collapse down to statically defined default behavior, preserving safety and correctness despite failures. Dynamic context management contributes to flexibility and removes the need for design-time binding of context providers and consumers, thus facilitating run-time composition and incremental component upgrade.
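
A conceptual sketch (not the authors' implementation) of a decision point guarded by a dynamic-wrapper-style fallback: the replaceable self-management logic is evaluated at run time, and any failure in that logic collapses to a statically defined default.

    class DecisionPoint:
        """Holds replaceable self-management logic plus a static default.
        The dynamic logic can be swapped after deployment; evaluation errors
        fall back to the default so behaviour stays safe and predictable."""

        def __init__(self, default, logic=None):
            self._default = default  # statically defined behaviour
            self._logic = logic      # dynamically deployed behaviour (optional)

        def set_logic(self, logic):
            """Deploy or replace the dynamic decision logic independently of the host."""
            self._logic = logic

        def decide(self, context):
            if self._logic is None:
                return self._default(context)
            try:
                return self._logic(context)
            except Exception:
                # dynamic aspect collapses to the static default on any failure
                return self._default(context)

    # toy usage: choose a polling interval from run-time context (values are illustrative)
    dp = DecisionPoint(default=lambda ctx: 1000)                     # fixed 1000 ms
    dp.set_logic(lambda ctx: 200 if ctx["battery"] > 50 else 2000)
    print(dp.decide({"battery": 30}))   # dynamic logic applies -> 2000
    print(dp.decide({}))                # KeyError in logic -> falls back to 1000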

Relevance:

30.00%

Publisher:

Abstract:

Reviews key proposals of the draft Bill set out in the Law Commission's Command Paper, Termination of Tenancies for Tenant Default (Cm. 6946), aimed at replacing the existing law on forfeiture of tenancies. Summarises the main elements of the proposed termination action by landlords, the events justifying such an action, the time limits for serving default notices, the revised range of court orders available and the considerations influencing which type of order to make. Examines the position of qualifying interest holders and the circumstances in which summary termination notices are prohibited.