151 results for Credit method


Relevance:

20.00%

Publisher:

Abstract:

This paper studies the macroeconomic implications of firms' investment composition choices in the presence of credit constraints. Following a negative and persistent aggregate productivity shock, firms shift into short-term investments because they produce more pledgeable output and because they help alleviate future borrowing constraints. This produces a short-run dampening of the effects of the shock, at the expense of lower long-term investment and future output, relative to an economy with no credit market imperfections. The effects are exacerbated by a steepening of the term structure of interest rates that further encourages a shift towards short-term investments in the short run. Small temporary shocks to the severity of financing frictions generate large and long-lasting effects on output through their impact on the composition of investment. A positive financial shock produces much stronger effects than an identical negative shock, while the responses to positive and negative shocks to aggregate productivity are roughly symmetric. Finally, the paper introduces a novel explanation for the countercyclicality of financing constraints of firms.
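
A minimal numerical sketch of the mechanism described above, with invented parameters rather than the paper's model: a firm splits one unit of funds between short-term and long-term investment with diminishing returns, and pledgeable output carries a shadow value because it relaxes future borrowing limits. Lowering the pledgeability of long-term output shifts the optimal mix toward short-term investment.

```python
import numpy as np

# Hypothetical parameters: returns, pledgeability and the shadow value of
# relaxing the future borrowing limit are invented for illustration.
R_SHORT, R_LONG = 1.05, 1.25   # gross returns on each investment type
THETA_SHORT = 1.0              # pledgeable share of short-term output
SHADOW_VALUE = 0.6             # value of an extra unit of pledgeable output

def payoff(s, theta_long):
    """Output plus borrowing-capacity value when share s goes short-term."""
    out_s = R_SHORT * np.sqrt(s)          # diminishing returns on each type
    out_l = R_LONG * np.sqrt(1.0 - s)
    pledgeable = THETA_SHORT * out_s + theta_long * out_l
    return out_s + out_l + SHADOW_VALUE * pledgeable

grid = np.linspace(0.0, 1.0, 10_001)
for theta_long in (0.8, 0.2):  # lower value = long-term output less pledgeable
    s_star = grid[np.argmax(payoff(grid, theta_long))]
    print(f"theta_long={theta_long:.1f}: optimal short-term share = {s_star:.2f}")
# As long-term output becomes less pledgeable, the optimum shifts toward
# short-term investment, the composition effect the abstract describes.
```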

Relevance:

20.00%

Publisher:

Abstract:

The experiential sampling method (ESM) was used to collect data from 74 part-time students who described and assessed the risks involved in their current activities when interrupted at random moments by text messages. The major categories of perceived risk were short-term in nature and involved loss of time or materials related to work and physical damage (e.g., from transportation). Using techniques of multilevel analysis, we demonstrate effects of gender, emotional state, and types of risk on assessments of risk. Specifically, females do not differ from males in assessing the potential severity of risks but they see these as more likely to occur. Also, participants assessed risks to be lower when in more positive self-reported emotional states. We further demonstrate the potential of ESM by showing that risk assessments associated with current actions exceed those made retrospectively. We conclude by noting advantages and disadvantages of ESM for collecting data about risk perceptions.
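
The multilevel analysis mentioned above can be illustrated with a random-intercept model: repeated risk assessments nested within participants. The sketch below uses simulated data and hypothetical variable names (`female`, `positive_affect`, `likelihood`); it is not the study's dataset or exact specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data: 74 participants, 12 prompts each.
rng = np.random.default_rng(0)
n_subj, n_obs = 74, 12
subj = np.repeat(np.arange(n_subj), n_obs)
female = np.repeat(rng.integers(0, 2, n_subj), n_obs)
positive_affect = rng.normal(size=n_subj * n_obs)
subj_effect = rng.normal(scale=0.5, size=n_subj)[subj]   # person-level variation
likelihood = (0.4 * female - 0.3 * positive_affect       # invented effect sizes
              + subj_effect + rng.normal(size=n_subj * n_obs))
df = pd.DataFrame({"subj": subj, "female": female,
                   "positive_affect": positive_affect, "likelihood": likelihood})

# Random intercept per participant; fixed effects for gender and affect.
fit = smf.mixedlm("likelihood ~ female + positive_affect", df, groups="subj").fit()
print(fit.summary())
```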

Relevance:

20.00%

Publisher:

Abstract:

We address the problem of scheduling a multi-station multiclass queueing network (MQNET) with server changeover times to minimize steady-state mean job holding costs. We present new lower bounds on the best achievable cost that emerge as the values of mathematical programming problems (linear, semidefinite, and convex) over relaxed formulations of the system's achievable performance region. The constraints on achievable performance defining these formulations are obtained by formulating the system's equilibrium relations. Our contributions include: (1) a flow conservation interpretation and closed formulae for the constraints previously derived by the potential function method; (2) new work decomposition laws for MQNETs; (3) new constraints (linear, convex, and semidefinite) on the performance region of first and second moments of queue lengths for MQNETs; (4) a fast bound for an MQNET with N customer classes computed in N steps; (5) two heuristic scheduling policies: a priority-index policy, and a policy extracted from the solution of a linear programming relaxation.
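
A toy illustration of contribution (5): the sketch below simulates a single station with two job classes under a priority-index policy, here the classic c·mu index; the paper's indices additionally account for changeover times, which this toy omits. All rates and costs are invented.

```python
import random

random.seed(1)
PARAMS = {"A": (0.30, 1.0, 3.0), "B": (0.30, 2.0, 1.0)}  # (lambda, mu, cost)
HORIZON = 50_000.0

def simulate():
    t, cost_area = 0.0, 0.0
    queue = {k: 0 for k in PARAMS}
    next_arr = {k: random.expovariate(PARAMS[k][0]) for k in PARAMS}
    in_service, service_end = None, float("inf")
    while t < HORIZON:
        t_next = min(min(next_arr.values()), service_end)
        # Holding cost is charged to waiting jobs only, for simplicity.
        cost_area += (t_next - t) * sum(PARAMS[k][2] * queue[k] for k in PARAMS)
        t = t_next
        if t == service_end:                        # service completion
            in_service, service_end = None, float("inf")
        else:                                       # an arrival
            k = min(next_arr, key=next_arr.get)
            queue[k] += 1
            next_arr[k] = t + random.expovariate(PARAMS[k][0])
        if in_service is None:                      # pick next job by c*mu index
            waiting = [k for k in PARAMS if queue[k] > 0]
            if waiting:
                k = max(waiting, key=lambda c: PARAMS[c][2] * PARAMS[c][1])
                queue[k] -= 1
                in_service = k
                service_end = t + random.expovariate(PARAMS[k][1])
    return cost_area / HORIZON

print(f"mean holding cost under the c*mu index policy: {simulate():.3f}")
```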

Relevance:

20.00%

Publisher:

Abstract:

Credit derivatives are securities that offer protection against the credit or default risk of bonds or loans. The emerging credit derivatives market has grown rapidly, and credit derivatives are widely used. This paper describes the emerging credit derivatives market structure. Current market activity is analyzed through elementary pricing dynamics and the study of the term structure of default risk. Focusing on the performance of credit derivatives in stress situations, including legal and market risks, we discuss the potential consequences of a debt restructuring by a large emerging market borrower. The contribution of credit derivatives to risk sharing in emerging markets is also examined.
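
As a hedged illustration of the "elementary pricing dynamics" and term structure of default risk mentioned above: with a flat default intensity h and recovery rate R, survival probabilities are S(t) = exp(-h t), and the par CDS spread is roughly h(1 - R) (the so-called credit triangle). The numbers below are illustrative, not market data.

```python
import math

h = 0.04          # flat annual default intensity (hypothetical)
recovery = 0.40   # assumed recovery rate on default

# Term structure of default risk under a flat hazard: S(t) = exp(-h t).
for t in (1, 3, 5, 10):
    print(f"survival to year {t:>2}: {math.exp(-h * t):.3f}")

# Credit triangle: par CDS spread is approximately h * (1 - R).
print(f"approximate par CDS spread: {h * (1.0 - recovery) * 1e4:.0f} bp")
```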

Relevance:

20.00%

Publisher:

Abstract:

The financial crisis of 2007-08 has underscored the importance of adverse selection in financial markets. This friction has been mostly neglected by macroeconomic models of financial imperfections, however, which have focused almost exclusively on the effects of limited pledgeability. In this paper, we fill this gap by developing a standard growth model with adverse selection. Our main results are that, by fostering unproductive investment, adverse selection: (i) leads to an increase in the economy's equilibrium interest rate; and (ii) generates a negative wedge between the marginal return to investment and the equilibrium interest rate. Under financial integration, we show how this translates into excessive capital inflows and endogenous cycles. We also extend our model to the more general case in which adverse selection and limited pledgeability coexist. We conclude that both frictions complement one another and show that limited pledgeability exacerbates the effects of adverse selection.

Relevance:

20.00%

Publisher:

Abstract:

We continue the development of a method for the selection of a bandwidth or a number of design parameters in density estimation. We provide explicit non-asymptotic density-free inequalities that relate the $L_1$ error of the selected estimate with that of the best possible estimate, and study in particular the connection between the richness of the class of density estimates and the performance bound. For example, our method allows one to pick the bandwidth and kernel order in the kernel estimate simultaneously and still assure that for {\it all densities}, the $L_1$ error of the corresponding kernel estimate is not larger than about three times the error of the estimate with the optimal smoothing factor and kernel, plus a constant times $\sqrt{\log n/n}$, where $n$ is the sample size, and the constant only depends on the complexity of the family of kernels used in the estimate. Further applications include multivariate kernel estimates, transformed kernel estimates, and variable kernel estimates.
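
For flavour, a data-splitting bandwidth-selection sketch. Note the stand-in criterion: the method above comes with $L_1$ guarantees, whereas this toy simply scores candidate bandwidths by hold-out log-likelihood; the grid, Gaussian kernel and sample are all illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Simulated sample; split into a fitting part and a hold-out part.
rng = np.random.default_rng(0)
data = rng.normal(size=500)
train, valid = data[:350], data[350:]

best_bw, best_score = None, -np.inf
for bw in np.geomspace(0.05, 2.0, 30):        # candidate smoothing factors
    kde = gaussian_kde(train, bw_method=bw)   # scalar bw_method = kde.factor
    score = np.log(kde(valid) + 1e-300).sum() # hold-out log-likelihood proxy
    if score > best_score:
        best_bw, best_score = bw, score
print(f"selected smoothing factor: {best_bw:.3f}")
```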

Relevance:

20.00%

Publisher:

Abstract:

This paper describes an optimized model to support QoS by means of congestion minimization on LSPs (Label Switching Paths). To build this model, we start from a CFA (Capacity and Flow Allocation) model. As this model does not consider the buffer size when calculating the capacity cost, our model, named BCA (Buffer Capacity Allocation), takes this issue into account and improves on CFA's performance. To test our proposal, we ran several simulations; the results show that the BCA model minimizes LSP congestion and distributes flows uniformly over the network.
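
As context for the CFA starting point, the sketch below applies Kleinrock's classic square-root capacity assignment, which the BCA model extends by also accounting for buffer size (not reproduced here). Flow values and the capacity budget are illustrative.

```python
import math

# Illustrative flows (Mb/s) on three LSPs and a total capacity budget.
flows = {"LSP1": 20.0, "LSP2": 5.0, "LSP3": 10.0}
C_TOTAL = 60.0

# Square-root rule: spare capacity is shared in proportion to the square
# root of each LSP's flow, minimizing the average M/M/1 delay.
slack = C_TOTAL - sum(flows.values())
norm = sum(math.sqrt(f) for f in flows.values())
capacity = {k: f + slack * math.sqrt(f) / norm for k, f in flows.items()}

for k, c in capacity.items():
    delay = 1.0 / (c - flows[k])   # M/M/1 mean delay with unit packet size
    print(f"{k}: capacity {c:5.1f} Mb/s, mean delay {delay:.3f}")
```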

Relevance:

20.00%

Publisher:

Abstract:

Diffuse flow velocimetry (DFV) is introduced as a new, noninvasive, optical technique for measuring the velocity of diffuse hydrothermal flow. The technique uses images of a motionless, random medium (e.g., rocks) obtained through the lens of a moving refraction index anomaly (e.g., a hot upwelling). The method works in two stages. First, the changes in apparent background deformation are calculated using particle image velocimetry (PIV). The deformation vectors are determined by a cross correlation of pixel intensities across consecutive images. Second, the 2-D velocity field is calculated by cross correlating the deformation vectors between consecutive PIV calculations. The accuracy of the method is tested with laboratory and numerical experiments of a laminar, axisymmetric plume in fluids with both constant and temperature-dependent viscosity. Results show that average RMS errors are ∼5%–7%, and that the method is most accurate in regions of pervasive apparent background deformation, which is commonly encountered in regions of diffuse hydrothermal flow. The method is applied to a 25 s video sequence of diffuse flow from a small fracture captured during the Bathyluck’09 cruise to the Lucky Strike hydrothermal field (September 2009). The velocities of the ∼10°C–15°C effluent reach ∼5.5 cm/s, in strong agreement with previous measurements of diffuse flow. DFV is found to be most accurate for approximately 2-D flows where background objects have a small spatial scale, such as sand or gravel.
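
The core operation of both DFV stages is a window cross-correlation. A minimal sketch, assuming a synthetic random background and a simple integer shift standing in for the apparent deformation:

```python
import numpy as np
from scipy.signal import correlate

def window_shift(win_a, win_b):
    """Displacement of win_b relative to win_a from the cross-correlation peak."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = correlate(b, a, mode="full")
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return peak[0] - (win_a.shape[0] - 1), peak[1] - (win_a.shape[1] - 1)

rng = np.random.default_rng(0)
frame = rng.random((64, 64))                        # static random background
moved = np.roll(frame, shift=(3, -2), axis=(0, 1))  # apparent deformation
print(window_shift(frame, moved))                   # -> (3, -2)
```

In DFV terms, stage one applies this to pixel windows of consecutive frames; stage two applies the same correlation to consecutive deformation fields to recover the anomaly's velocity.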

Relevance:

20.00%

Publisher:

Abstract:

A new statistical parallax method using the Maximum Likelihood principle is presented, allowing the simultaneous determination of a luminosity calibration, kinematic characteristics and spatial distribution of a given sample. This method has been developed for the exploitation of the Hipparcos data and presents several improvements with respect to the previous ones: the effects of the selection of the sample, the observational errors, the galactic rotation and the interstellar absorption are taken into account as an intrinsic part of the formulation (as opposed to external corrections). Furthermore, the method is able to identify and characterize physically distinct groups in inhomogeneous samples, thus avoiding biases due to unidentified components. Moreover, the implementation used by the authors is based on the extensive use of numerical methods, so avoiding the need for simplification of the equations and thus the bias they could introduce. Several examples of application using simulated samples are presented, to be followed by applications to real samples in forthcoming articles.
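
A drastically simplified sketch of maximum-likelihood luminosity calibration on simulated data: it estimates only the mean absolute magnitude and its dispersion, and, unlike the method described above, ignores observational errors, sample selection, galactic rotation and interstellar absorption.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated sample: absolute magnitudes, distances, apparent magnitudes.
rng = np.random.default_rng(1)
n = 500
M_true = rng.normal(0.6, 0.3, n)            # true absolute magnitudes
dist_pc = rng.uniform(10.0, 100.0, n)       # distances in parsecs
m_app = M_true + 5.0 * np.log10(dist_pc) - 5.0
parallax = 1.0 / dist_pc                    # arcsec, error-free in this toy

# With exact parallaxes, M = m + 5 log10(parallax) + 5 for each star.
M_obs = m_app + 5.0 * np.log10(parallax) + 5.0

def neg_log_like(params):
    m0, log_sigma = params
    sigma = np.exp(log_sigma)               # keeps sigma positive
    z = (M_obs - m0) / sigma
    return np.sum(0.5 * z ** 2 + np.log(sigma))

fit = minimize(neg_log_like, x0=[0.0, 0.0])
print(f"M0 = {fit.x[0]:.3f}, sigma = {np.exp(fit.x[1]):.3f}")
```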

Relevance:

20.00%

Publisher:

Abstract:

Contamination of weather radar echoes by anomalous propagation (anaprop) mechanisms remains a serious issue in the quality control of radar precipitation estimates. Although significant progress has been made in identifying clutter due to anaprop, there is no unique method that solves the question of data reliability without removing genuine data. The work described here relates to the development of a software application that uses a numerical weather prediction (NWP) model to obtain the temperature, humidity and pressure fields from which the three-dimensional structure of the atmospheric refractive index is calculated, and from which a physically based prediction of the incidence of clutter can then be made. This technique can be used in conjunction with existing methods for clutter removal by modifying the parameters of detectors or filters according to the physical evidence for anomalous propagation conditions. The parabolic equation method (PEM) is a well-established technique for solving the equations of beam propagation in a non-uniformly stratified atmosphere, but although intrinsically very efficient, it is not sufficiently fast to be practicable for near real-time modelling of clutter over the entire area observed by a typical weather radar. We demonstrate a fast hybrid PEM technique that is capable of providing acceptable results in conjunction with a high-resolution terrain elevation model, using a standard desktop personal computer. We discuss the performance of the method and approaches for the improvement of the model profiles in the lowest levels of the troposphere.
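
A minimal split-step Fourier solver for the narrow-angle parabolic equation, the core of PEM-type propagation codes. This sketch omits terrain, absorbing boundaries, wide-angle corrections and the hybrid acceleration the work develops; the refractivity profile and grid values are illustrative.

```python
import numpy as np

c0, freq = 3.0e8, 5.6e9              # C-band radar frequency (illustrative)
k0 = 2.0 * np.pi * freq / c0

nz, dz, dx = 1024, 1.0, 50.0         # vertical grid (m) and range step (m)
z = np.arange(nz) * dz
kz = 2.0 * np.pi * np.fft.fftfreq(nz, d=dz)
n_minus_1 = -40e-9 * z               # ~ -40 N-units/km, a standard gradient

# Gaussian beam launched at 100 m height.
u = np.exp(-((z - 100.0) ** 2) / (2.0 * 20.0 ** 2)).astype(complex)

diffraction = np.exp(-1j * kz ** 2 * dx / (2.0 * k0))  # free-space half-step
refraction = np.exp(1j * k0 * n_minus_1 * dx)          # environment half-step

for _ in range(200):                 # march 10 km in range
    u = np.fft.ifft(diffraction * np.fft.fft(u)) * refraction

print("field maximum now at height", z[np.argmax(np.abs(u))], "m")
```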

Relevance:

20.00%

Publisher:

Abstract:

Monitoring thunderstorm activity is an essential part of operational weather surveillance given the potential hazards, including lightning, hail, heavy rainfall, strong winds or even tornadoes. This study has two main objectives: firstly, the description of a methodology, based on radar and total lightning data, to characterise thunderstorms in real time; secondly, the application of this methodology to 66 thunderstorms that affected Catalonia (NE Spain) in the summer of 2006. An object-oriented tracking procedure is employed, in which different observation data types generate four different types of objects (radar 1-km CAPPI reflectivity composites, radar reflectivity volumetric data, cloud-to-ground (CG) lightning data and intra-cloud (IC) lightning data). In the framework proposed, these objects are the building blocks of a higher-level object, the thunderstorm. The methodology is demonstrated on a dataset of thunderstorms whose main characteristics, along the complete life cycle of the convective structures (development, maturity and dissipation), are described statistically. The development and dissipation stages present similar durations in most cases examined. By contrast, the duration of the maturity phase is much more variable and related to thunderstorm intensity, defined here in terms of lightning flash rate. Most IC and CG flash activity is registered in the maturity stage. In the development stage few CG flashes are observed (2% to 5%), while in the dissipation phase slightly more CG flashes are observed (10% to 15%). Additionally, a selection of thunderstorms is used to examine general life cycle patterns, obtained from the analysis of thunderstorm parameters normalized with respect to total thunderstorm duration and the maximum value of the variables considered. Among other findings, the study indicates that the normalized duration of the three stages of the thunderstorm life cycle is similar in most thunderstorms, with the longest duration corresponding to the maturity stage (approximately 80% of the total time).
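
A sketch of the object hierarchy described above: four observation object types feed a higher-level thunderstorm object, from which normalized stage durations can be computed. Field names and numbers are illustrative, not the study's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Thunderstorm:
    cappi_cells: list = field(default_factory=list)    # radar 1-km CAPPI composites
    volume_cells: list = field(default_factory=list)   # radar volumetric reflectivity
    cg_flashes: list = field(default_factory=list)     # cloud-to-ground flash times (s)
    ic_flashes: list = field(default_factory=list)     # intra-cloud flash times (s)
    stage_bounds: tuple = (0.0, 1.0, 2.0, 3.0)         # t0, dev->mat, mat->dis, t_end

    def normalized_stage_durations(self):
        """Development, maturity and dissipation as fractions of the life cycle."""
        t0, t1, t2, t3 = self.stage_bounds
        total = t3 - t0
        return ((t1 - t0) / total, (t2 - t1) / total, (t3 - t2) / total)

storm = Thunderstorm(stage_bounds=(0.0, 600.0, 3600.0, 4200.0))  # invented times
print(storm.normalized_stage_durations())  # long maturity stage, as in the study
```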

Relevance:

20.00%

Publisher:

Abstract:

Spatial resolution is a key parameter of all remote sensing satellites and platforms. The nominal spatial resolution of satellites is a well-known characteristic because it is directly related to the area on the ground that a pixel in the detector represents. Nevertheless, in practice, the actual resolution of a specific image obtained from a satellite is difficult to know precisely because it depends on many other factors, such as atmospheric conditions. However, if one has two or more images of the same region, it is possible to compare their relative resolutions. In this paper, a wavelet-decomposition-based method for determining the relative resolution between two remotely sensed images of the same area is proposed. The method can be applied to panchromatic, multispectral, and mixed (one panchromatic and one multispectral) images. As an example, the method was applied to compute the relative resolution between SPOT-3, Landsat-5, and Landsat-7 panchromatic and multispectral images taken under similar as well as very different conditions. Furthermore, if the true absolute resolution of one of the images of the pair is known, the resolution of the other can be computed. Thus, in the last part of this paper, a spatial calibrator designed and constructed to help compute the absolute resolution of a single remotely sensed image is described, and an example of its use is presented.
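
To illustrate the idea (not the paper's exact estimator): compare the energy of the wavelet detail coefficients per decomposition level for two images of the same scene; the image that loses relatively more energy at the fine scales is the lower-resolution one. The sketch uses PyWavelets and a synthetic blur.

```python
import numpy as np
import pywt

def detail_energy(image, wavelet="db2", levels=4):
    """Energy of the detail coefficients at each level (coarsest first)."""
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    return [sum(float((c ** 2).sum()) for c in d) for d in coeffs[1:]]

rng = np.random.default_rng(0)
sharp = rng.random((256, 256))
# Simulate a lower-resolution acquisition with a 2x2 box average (hypothetical).
blurred = (sharp + np.roll(sharp, 1, 0) + np.roll(sharp, 1, 1)
           + np.roll(sharp, (1, 1), (0, 1))) / 4.0

ratios = [b / s for s, b in zip(detail_energy(sharp), detail_energy(blurred))]
for lvl, r in enumerate(ratios):
    print(f"detail level {lvl} (0 = coarsest): energy ratio = {r:.2f}")
# The energy deficit concentrated at the finest levels signals the lower
# relative resolution of the blurred image.
```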

Relevance:

20.00%

Publisher:

Abstract:

This article designs what it calls a Credit-Risk Balance Sheet (the risk being that of default by customers), a tool which, in principle, can contribute to revealing, controlling and managing the bad-debt risk arising from a company's commercial credit, whose amount can represent a significant proportion of both its current and total assets. To construct it, we start from the duality observed in any credit transaction of this nature, whose basic identity can be summed up as Credit = Risk. 'Credit' is granted by a company to its customer and can be ranked by quality (we suggest the credit scoring system), while 'risk' can either be assumed (interiorised) by the company itself or transferred to third parties (exteriorised). What allows us to speak with confidence of a true Credit-Risk Balance Sheet, with methodological robustness, is that the dual vision of the credit transaction is not, as we demonstrate, merely a classificatory duality (a double risk-credit classification of reality) but rather a true causal relationship, that is, a risk-credit causal duality. Once this Credit-Risk Balance Sheet (which bears a certain structural similarity to the classic net asset balance sheet) has been built and its methodological coherence demonstrated, its static and dynamic properties are studied. Analysis of the temporal evolution of the Credit-Risk Balance Sheet and of its applications will be the object of subsequent works.
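
A minimal sketch of the Credit = Risk identity as a data structure, with credit ranked by quality on one side and risk split into interiorised and exteriorised portions on the other. Categories and figures are illustrative, not the article's schema.

```python
from dataclasses import dataclass

@dataclass
class CreditRiskBalanceSheet:
    credit_by_quality: dict   # credit-scoring bands -> outstanding amount
    risk_retained: float      # risk assumed (interiorised) by the company
    risk_transferred: float   # risk passed (exteriorised) to third parties

    def check_identity(self, tol=1e-9):
        """Both sides must balance: total credit == total risk."""
        credit = sum(self.credit_by_quality.values())
        risk = self.risk_retained + self.risk_transferred
        assert abs(credit - risk) < tol, "Credit != Risk"
        return credit

sheet = CreditRiskBalanceSheet(
    credit_by_quality={"A": 600.0, "B": 300.0, "C": 100.0},
    risk_retained=700.0,
    risk_transferred=300.0,
)
print("balanced total:", sheet.check_identity())
```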

Relevance:

20.00%

Publisher:

Abstract:

This article has an immediate predecessor, upon which it is based and with which readers must necessarily be familiar: Towards a Theory of the Credit-Risk Balance Sheet (Vallverdú, Somoza and Moya, 2006). There, the Balance Sheet is conceptualised on the basis of the duality of a credit-based transaction; its theoretical foundations are set out, providing evidence of a causal credit-risk duality, that is, a true causal relationship; and its characteristics and properties, both static and dynamic, are analysed. The present article, a logical continuation of the previous one, studies the evolution of the structure of the Credit-Risk Balance Sheet as a consequence of a business's dynamics in the credit area. Given the Credit-Risk Balance Sheet of a company at any given time, it attempts to estimate, by means of sequential analysis, its structural evolution, showing its usefulness in the management and control of credit and risk. To do this, it draws, with the necessary adaptations, on the by-now classic works of Palomba and Cutolo. The establishment of the corresponding transformation matrices allows one to move from an initial balance sheet structure to a final, future one, to understand trends in the company's credit-risk situation, and to make possible its monitoring and control, basic elements in providing support for risk management.
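
A small sketch of the transformation-matrix idea, in the spirit of the Palomba/Cutolo approach the article adapts: the balance sheet structure is a composition vector over credit-quality bands, and a column-stochastic matrix moves it from one period to the next. The matrix below is invented for illustration.

```python
import numpy as np

T = np.array([            # columns: current band A, B, C; rows: next band
    [0.85, 0.10, 0.02],   # stays in / improves to quality A
    [0.10, 0.75, 0.18],   # quality B
    [0.05, 0.15, 0.80],   # quality C (closest to default)
])
assert np.allclose(T.sum(axis=0), 1.0)   # each column fully reallocates credit

structure = np.array([0.6, 0.3, 0.1])    # initial credit-quality composition
for period in range(1, 4):
    structure = T @ structure            # one period of structural evolution
    print(f"period {period}: " + ", ".join(f"{s:.3f}" for s in structure))
# Tracking the drift of the composition toward lower bands is the kind of
# monitoring-and-control signal the article has in view.
```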