907 results for Probability of default
Abstract:
The aim of this thesis is to examine which factors affect the yield spread between corporate and government bonds. According to structural credit risk pricing models, the factors affecting credit risk are the firm's leverage, its volatility, and the risk-free interest rate. The particular aim is to study how well these theoretical factors explain yield spreads and whether other important explanatory factors exist. Credit default swap quotes are used to measure the yield spreads. The explanatory variables consist of both firm-specific and market-wide variables. Credit default swap and firm-specific data were collected for a total of 50 companies from euro-area countries. The data consist of monthly observations covering 1 January 2003 to 31 December 2006. The empirical results show that the factors implied by structural models explain only a small part of the changes in the spread over time. On the other hand, these theoretical factors explain the cross-sectional variation of the spread considerably better. Factors other than the theoretical ones are able to explain a large part of the variation in the spread. The general risk premium in the bond market proved to be a particularly important explanatory factor. The results indicate that credit risk pricing models need to be developed further so that they account not only for firm-specific factors but also for market-wide factors.
Abstract:
There has been recent interest in the use of X-chromosomal loci for forensic and relatedness-testing casework, with many authors developing new X-linked short tandem repeat (STR) loci suitable for forensic use. Here we present formulae for two key quantities in paternity testing, the average probability of exclusion and the paternity index, which are suitable for X-chromosomal loci in the presence of population substructure.
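The flavour of such a substructure correction can be sketched as follows. These are not the paper's formulae; the sketch uses the standard Balding-Nichols theta-adjusted allele probability, and the allele frequency and theta value are hypothetical.

```python
# Illustrative sketch (not the paper's formulae).  For an X-STR locus in a
# father-daughter duo, the father transmits his single X allele with
# certainty, so a naive paternity index for a shared paternal allele a is
# PI = 1 / p_a.  Under population substructure, allele probabilities are
# commonly theta-adjusted with the Balding-Nichols sampling formula:
#   P(a | n copies of a seen among m sampled alleles)
#     = (n*theta + (1 - theta) * p_a) / (1 + (m - 1) * theta)

def bn_prob(p, theta, n_seen, m_sampled):
    """Balding-Nichols conditional allele probability."""
    return (n_seen * theta + (1 - theta) * p) / (1 + (m_sampled - 1) * theta)

p_a = 0.05      # population frequency of allele a (hypothetical)
theta = 0.03    # substructure coefficient (hypothetical)

pi_naive = 1 / p_a
# A random man's allele is evaluated having already observed two copies of a
# (the child's paternal allele and the tested man's allele).
pi_theta = 1 / bn_prob(p_a, theta, n_seen=2, m_sampled=2)

print(f"naive PI:          {pi_naive:.1f}")
print(f"theta-adjusted PI: {pi_theta:.1f}")
```

As expected, the theta adjustment inflates the conditional probability of re-sampling an already-seen allele and therefore shrinks the paternity index.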
Abstract:
This dissertation quantitatively analyses the costs of sovereign default for the economy in a model where banks with long positions in government debt play a central role in the financial intermediation of private-sector investment and face financial frictions that limit their ability to lever. The calibration tries to match some features of the Eurozone, where discussions about bailout schemes and default risk have been central issues. The results show that the model captures one important cost of default pointed out by the empirical and theoretical literature on debt crises, namely the fall in investment that follows haircut episodes, which can be explained by a worsening of banks' balance-sheet conditions that limits credit to the private sector and raises its funding costs. The cost in terms of lost output, however, is not significant enough to justify the existence of debt markets and the government's incentives for debt repayment. Assuming that the government is able to ease its constrained budget by imposing a restructuring of the debt-repayment profile that allows it to cut taxes, the model generates an important difference in the output path between lump-sum and distortionary taxes. For our calibration, the quantitative results show that, in terms of output and utility, the labour-supply response generated by tax cuts can dominate the drop in investment caused by the credit crunch in financial markets. We abstract, however, from default costs associated with the breaking of existing contracts, external sanctions and risk spillovers between countries, which may also be relevant in addition to the financial-disruption effects. In addition, there are considerable trade-offs between the short- and long-run paths of economic variables related to government and bank behaviour.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Background Smear-negative pulmonary tuberculosis (SNPTB) accounts for 30% of pulmonary tuberculosis (PTB) cases reported annually in developing nations. Polymerase chain reaction (PCR) may provide an alternative for the rapid detection of Mycobacterium tuberculosis (MTB); however, few data are available on the clinical utility of PCR in SNPTB in a setting with a high burden of TB/HIV co-infection. Methods To evaluate the performance of the PCR dot-blot in parallel with pretest probability (clinical suspicion) in patients suspected of having SNPTB, a prospective study of 213 individuals with clinical and radiological suspicion of SNPTB was carried out from May 2003 to May 2004 in a TB/HIV reference hospital. Respiratory specialists classified the pretest probability of active disease as high, intermediate, or low. Expectorated sputum was examined by direct microscopy (Ziehl-Neelsen staining), culture (Lowenstein-Jensen) and PCR dot-blot. The gold standard was culture positivity combined with the clinical definition of PTB. Results In smear-negative and HIV-positive subjects, active PTB was diagnosed in 28.4% (43/151) and 42.2% (19/45), respectively. In the high, intermediate and low pretest probability categories, active PTB was diagnosed in 67.4% (31/46), 24% (6/25) and 7.5% (6/80), respectively. PCR had a sensitivity of 65% (95% CI: 50%-78%) and a specificity of 83% (95% CI: 75%-89%). There was no difference in the sensitivity of PCR in relation to HIV status. Among patients not previously treated for TB and those treated in the past, PCR sensitivity was 69% and 43%, and specificity 85% and 80%, respectively. A high pretest probability, when used as a diagnostic test, had a sensitivity of 72% (95% CI: 57%-84%) and a specificity of 86% (95% CI: 78%-92%).
Using the PCR dot-blot in parallel with a high pretest probability as a diagnostic test, sensitivity, specificity, and positive and negative predictive values were 90%, 71%, 75% and 88%, respectively. Among subjects not previously treated for TB and among HIV-positive subjects, this approach had sensitivity, specificity, and positive and negative predictive values of 91%, 79%, 81% and 90%, and of 90%, 65%, 72% and 88%, respectively. Conclusion The PCR dot-blot combined with a high clinical suspicion may make an important contribution to the diagnosis of SNPTB, mainly in previously untreated patients attending a TB/HIV reference hospital.
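The "in parallel" combination reported above (the combined test is positive if either component is positive) follows the standard formulas for parallel testing, which reproduce the abstract's 90%/71% figures from the component tests:

```python
# Parallel combination of two diagnostic tests:
#   Se_parallel = 1 - (1 - Se1) * (1 - Se2)   (positive if either is positive)
#   Sp_parallel = Sp1 * Sp2                   (negative only if both are negative)

se_pcr, sp_pcr = 0.65, 0.83    # PCR dot-blot, from the abstract
se_clin, sp_clin = 0.72, 0.86  # high pretest probability, from the abstract

se_par = 1 - (1 - se_pcr) * (1 - se_clin)
sp_par = sp_pcr * sp_clin

print(f"parallel sensitivity: {se_par:.0%}")  # ~90%, as reported
print(f"parallel specificity: {sp_par:.0%}")  # ~71%, as reported
```

The agreement assumes the two tests err roughly independently, which the parallel-testing formulas require.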
Abstract:
OBJECTIVE: To estimate the pretest probability of Cushing's syndrome (CS) diagnosis by a Bayesian approach using intuitive clinical judgment. MATERIALS AND METHODS: Physicians were asked, at seven endocrinology meetings, to answer three questions: "Based on your personal expertise, after obtaining clinical history and physical examination, without using laboratory tests, what is your probability of diagnosing Cushing's syndrome?"; "For how long have you been practicing endocrinology?"; and "Where do you work?". A Bayesian beta regression, using the WinBUGS software, was employed. RESULTS: We obtained 294 questionnaires. The mean pretest probability of CS diagnosis was 51.6% (95% CI: 48.7-54.3). The probability was directly related to experience in endocrinology, but not to place of work. CONCLUSION: The pretest probability of CS diagnosis was estimated using a Bayesian methodology. Although the pretest likelihood can be context-dependent, experience based on years of practice may help the practitioner to diagnose CS. Arq Bras Endocrinol Metab. 2012;56(9):633-7
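The clinical use of such a pretest estimate can be sketched with Bayes' rule in odds form: a pretest probability plus the likelihood ratio of a confirmatory test yields a post-test probability. The test characteristics below are hypothetical, not from the paper.

```python
# Bayes' rule in odds form:
#   post-test odds = pretest odds * LR+,  where LR+ = Se / (1 - Sp)

def posttest_probability(pretest_p, sensitivity, specificity):
    """Post-test probability after a positive result, via odds * LR+."""
    lr_pos = sensitivity / (1 - specificity)
    pretest_odds = pretest_p / (1 - pretest_p)
    posttest_odds = pretest_odds * lr_pos
    return posttest_odds / (1 + posttest_odds)

pretest = 0.516  # mean pretest probability of CS from the survey
# Hypothetical confirmatory test: Se = 95%, Sp = 90%.
post = posttest_probability(pretest, sensitivity=0.95, specificity=0.90)
print(f"post-test probability: {post:.1%}")
```

With a roughly 50% pretest probability, even a good confirmatory test is what moves the diagnosis into near-certain territory, which is why the pretest estimate matters.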
Abstract:
A Swiss-specific FRAX model was developed. Patient profiles with a fracture probability above the currently accepted reimbursement thresholds for bone mineral density (BMD) measurement by dual X-ray absorptiometry (DXA) and for osteoporosis treatment were identified.
Abstract:
To deliver sample estimates with the probability foundation necessary to permit generalization from the sample data subset to the whole target population being sampled, probability sampling strategies are required to satisfy three necessary, but not sufficient, conditions: (i) all inclusion probabilities must be greater than zero in the target population to be sampled; if some sampling units have an inclusion probability of zero, then a map accuracy assessment does not represent the entire target region depicted in the map to be assessed; (ii) the inclusion probabilities must be (a) knowable for nonsampled units and (b) known for those units selected in the sample, since the inclusion probability determines the weight attached to each sampling unit in the accuracy estimation formulas; if the inclusion probabilities are unknown, so are the estimation weights. This original work presents a novel (to the best of these authors' knowledge, the first) probability sampling protocol for quality assessment and comparison of thematic maps generated from spaceborne/airborne Very High Resolution (VHR) images, where: (I) an original Categorical Variable Pair Similarity Index (CVPSI, proposed in two different formulations) is estimated as a fuzzy degree of match between a reference and a test semantic vocabulary, which may not coincide, and (II) both symbolic pixel-based thematic quality indicators (TQIs) and sub-symbolic object-based spatial quality indicators (SQIs) are estimated with a degree of uncertainty in measurement, in compliance with the well-known Quality Assurance Framework for Earth Observation (QA4EO) guidelines. Like a decision tree, any protocol (guidelines for best practice) comprises a set of rules, equivalent to structural knowledge, and an order of presentation of the rule set, known as procedural knowledge. The combination of these two levels of knowledge makes an original protocol worth more than the sum of its parts.
The several degrees of novelty of the proposed probability sampling protocol are highlighted in this paper, at the levels of understanding of both structural and procedural knowledge, in comparison with related multi-disciplinary works selected from the existing literature. In the experimental session the proposed protocol is tested for accuracy validation of preliminary classification maps automatically generated by the Satellite Image Automatic Mapper™ (SIAM™) software product from two WorldView-2 images and one QuickBird-2 image provided by DigitalGlobe for testing purposes. In these experiments, collected TQIs and SQIs are statistically valid, statistically significant, consistent across maps and in agreement with theoretical expectations, visual (qualitative) evidence and quantitative quality indexes of operativeness (OQIs) claimed for SIAM™ by related papers. As a subsidiary conclusion, the statistically consistent and statistically significant accuracy validation of the SIAM™ pre-classification maps proposed in this contribution, together with the OQIs claimed for SIAM™ by related works, make the operational (automatic, accurate, near real-time, robust, scalable) SIAM™ software product eligible for opening up new inter-disciplinary research and market opportunities in accordance with the visionary goal of the Global Earth Observation System of Systems (GEOSS) initiative and the QA4EO international guidelines.
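The role of known, nonzero inclusion probabilities in the accuracy-estimation weights can be sketched with the standard Horvitz-Thompson estimator, where each sampled unit is weighted by 1/πᵢ. This is a generic textbook illustration with hypothetical data, not the protocol's own estimator.

```python
# Horvitz-Thompson estimation of a map's overall accuracy: each sampled
# pixel is weighted by the inverse of its (known, nonzero) inclusion
# probability pi_i.  Data are hypothetical.

sample = [
    # (inclusion probability, 1 if the pixel's map label was correct else 0)
    (0.10, 1),
    (0.10, 1),
    (0.02, 0),
    (0.05, 1),
    (0.02, 1),
]

ht_correct = sum(correct / pi for pi, correct in sample)  # estimated correct pixels
ht_size = sum(1 / pi for pi, _ in sample)                 # estimated population size
accuracy = ht_correct / ht_size

print(f"estimated overall accuracy: {accuracy:.1%}")
```

If any πᵢ were zero the corresponding region could never enter the sample, and if any πᵢ were unknown the weights 1/πᵢ would be undefined, which is exactly what conditions (i) and (ii) above guard against.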
Abstract:
Transports of radioactive wastes in Spain are becoming an issue of renewed interest, owing to the increased mobility of these materials that can be expected after the building and operation of the planned central repository for the country in the near future. These residues will mainly belong to the medium- and high-activity classes, and they have raised concerns about the safety of the operations, the radiological protection of individuals, compliance with legal regulations, and their environmental consequences of all kinds. In this study, information relevant to the assessment of the radiological risk of road transport was taken into account, such as the sources and destinations of the radioactive transports, the amount of travel to be done, the preferred routes and the populations affected, and the characterization of the residues and containers and their corresponding testing. These data were supplied by different organizations directly involved in these activities, such as the nuclear power stations, the companies in charge of radioactive transports and the enterprises for inspection and control of the activities, as well as the government institutions responsible for selecting and locating the storage facility and for other decisions on the country's nuclear policies. We have thus developed a program for processing the data such that, by entering the radiation levels at one meter from the transport loads and choosing a particular journey, the application calculates the corresponding radiological effects: the global estimated impact, its relevance to the population in general or to people living and driving near the main road routes, the doses received by the most exposed individuals (e.g. the workers loading or driving the vehicle), and the probability of detrimental effects on human health.
The results of this work could help in better understanding and managing these activities and their related impacts; at the same time, the reports generated by the application are of particular interest as innovative, complementary information to the legal documentation currently required for transporting radioactive wastes in the country in accordance with international safety rules (such as the IAEA regulations and the ADR). Although the main studies are still in progress, as the definitive location of the Spanish storage facility has not yet been decided, preliminary results for the existing transports of medium-activity residues indicate that the radiological impact of conventional operations is very low. Nevertheless, the management of these transports is complex and laborious, making it worthwhile to progress further in the analysis and quantification of such events, which constitutes one of the main objectives of the present study of radioactive road transport in Spain.
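The kind of calculation described, from a dose-rate reading at one meter to an individual dose, can be sketched with the point-source inverse-square approximation. All numbers below are hypothetical; the actual application will use more detailed transport and shielding models.

```python
# Sketch: scale a dose-rate reading taken at 1 m from the load to another
# distance with the inverse-square approximation, then accumulate dose over
# the duration of the trip.  All inputs are hypothetical.

def dose_rate_at(rate_at_1m_usv_h, distance_m):
    """Point-source inverse-square approximation for the dose rate."""
    return rate_at_1m_usv_h / distance_m ** 2

rate_1m = 20.0         # uSv/h measured at 1 m from the transport load
driver_distance = 4.0  # m from load to the driver's cab (hypothetical)
trip_hours = 6.0       # duration of the journey

driver_dose = dose_rate_at(rate_1m, driver_distance) * trip_hours
print(f"estimated driver dose for the trip: {driver_dose:.1f} uSv")
```

The inverse-square step is only valid when the load can be treated as a point source and shielding is ignored, so it serves here purely to show the structure of the computation.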
Abstract:
Medial prefrontal cortex (MPFC) is among those brain regions having the highest baseline metabolic activity at rest and one that exhibits decreases from this baseline across a wide variety of goal-directed behaviors in functional imaging studies. This high metabolic rate and this behavior suggest the existence of an organized mode of default brain function, elements of which may be either attenuated or enhanced. Extant data suggest that these MPFC regions may contribute to the neural instantiation of aspects of the multifaceted “self.” We explore this important concept by targeting and manipulating elements of MPFC default state activity. In this functional magnetic resonance imaging (fMRI) study, subjects made two judgments, one self-referential, the other not, in response to affectively normed pictures: pleasant vs. unpleasant (an internally cued condition, ICC) and indoors vs. outdoors (an externally cued condition, ECC). The ICC was preferentially associated with activity increases along the dorsal MPFC. These increases were accompanied by decreases in both active task conditions in ventral MPFC. These results support the view that dorsal and ventral MPFC are differentially influenced by attention-demanding tasks and explicitly self-referential tasks. The presence of self-referential mental activity appears to be associated with increases from the baseline in dorsal MPFC. Reductions in ventral MPFC occurred, consistent with the fact that attention-demanding tasks attenuate emotional processing. We posit that both self-referential mental activity and emotional processing represent elements of the default state as represented by activity in MPFC. We suggest that a useful way to explore the neurobiology of the self is to explore the nature of default state activity.
Abstract:
We have studied enhancer function in transient and stable expression assays in mammalian cells by using systems that distinguish expressing from nonexpressing cells. When expression is studied in this way, enhancers are found to increase the probability of a construct being active but not the level of expression per template. In stably integrated constructs, large differences in expression level are observed but these are not related to the presence of an enhancer. Together with earlier studies, these results suggest that enhancers act to affect a binary (on/off) switch in transcriptional activity. Although this idea challenges the widely accepted model of enhancer activity, it is consistent with much, if not all, experimental evidence on this subject. We hypothesize that enhancers act to increase the probability of forming a stably active template. When randomly integrated into the genome, enhancers may affect a metastable state of repression/activity, permitting expression in regions that would not permit activity of an isolated promoter.
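The binary (on/off) model described above has a simple quantitative signature: bulk expression rises with an enhancer even though each active template expresses at the same level. A toy simulation with hypothetical parameters illustrates this distinction:

```python
import random

random.seed(0)

# Toy simulation of the binary model: an enhancer raises the probability
# that an integrated template is active, not the expression level of an
# active template.  All parameters are hypothetical.
LEVEL = 100.0  # expression per active template (identical with/without enhancer)

def population_mean(p_active, n=10_000):
    """Mean expression across n templates, each active with probability p_active."""
    return sum(LEVEL if random.random() < p_active else 0.0
               for _ in range(n)) / n

no_enhancer = population_mean(p_active=0.1)
with_enhancer = population_mean(p_active=0.6)

print(f"mean without enhancer: {no_enhancer:.1f}")
print(f"mean with enhancer:    {with_enhancer:.1f}")
```

A bulk assay sees the population means and cannot distinguish "more templates on" from "each template higher", which is why the single-cell expressing/nonexpressing assays described above are needed.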
Abstract:
Freeze events significantly influence landscape structure and community composition along subtropical coastlines. This is particularly true in south Florida, where such disturbances have historically contributed to patch diversity within the mangrove forest, and have played a part in limiting its inland transgression. With projected increases in mean global temperatures, such instances are likely to become much less frequent in the region, contributing to a reduction in heterogeneity within the mangrove forest itself. To understand the process more clearly, we explored the dynamics of a Dwarf mangrove forest following two chilling events that produced freeze-like symptoms, i.e., leaf browning, desiccation, and mortality, and interpreted the resulting changes within the context of current winter temperatures and projected future scenarios. Structural effects from a 1996 chilling event were dramatic, with mortality and tissue damage concentrated among individuals comprising the Dwarf forest's low canopy. This disturbance promoted understory plant development and provided an opportunity for Laguncularia racemosa to share dominance with Rhizophora mangle. Mortality due to the less severe 2001 event was greatest in the understory, probably because recovery of the protective canopy following the earlier freeze was still incomplete. Stand dynamics were static over the same period in nearby unimpacted sites. The probability of reaching temperatures as low as those recorded at a nearby meteorological station (≤3 °C) under several warming scenarios was simulated by applying 1° incremental temperature increases to a model developed from a 42-year temperature record. According to the model, the frequency of similar chilling events decreased from once every 1.9 years at present to once every 3.4 and 32.5 years with 1 and 4 °C warming, respectively. 
The large decrease in the frequency of these events would eliminate an important mechanism that maintains Dwarf forest structure, and promotes compositional diversity.
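The return-interval logic above can be sketched by modeling the annual minimum temperature as roughly normal, shifting its mean by a warming increment, and converting P(annual min ≤ 3 °C) into a return period of 1/p. The mean and standard deviation below are hypothetical, chosen only so the present-day return interval is of the same order as the reported ~1.9 years; they are not the study's fitted model.

```python
from statistics import NormalDist

# Hypothetical distribution of the annual minimum temperature (deg C).
mu, sigma = 2.85, 2.0

ps = []
for warming in (0.0, 1.0, 4.0):
    # Warming shifts the whole distribution of annual minima upward.
    p = NormalDist(mu + warming, sigma).cdf(3.0)
    ps.append(p)
    print(f"+{warming:.0f} C warming: P(min <= 3 C) = {p:.3f}, "
          f"return interval ~ {1 / p:.1f} yr")
```

Even a modest upward shift in the mean sharply stretches the return interval of the ≤3 °C chilling events, which is the mechanism behind the study's projected loss of freeze disturbance.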
Abstract:
Extensive data sets on water quality and seagrass distributions in Florida Bay have been assembled under complementary, but independent, monitoring programs. This paper presents the landscape-scale results from these monitoring programs and outlines a method for exploring the relationships between two such data sets. Seagrass species occurrence and abundance data were used to define eight benthic habitat classes from 677 sampling locations in Florida Bay. Water quality data from 28 monitoring stations spread across the Bay were used to construct a discriminant function model that assigned a probability of a given benthic habitat class occurring for a given combination of water quality variables. Mean salinity, salinity variability, the amount of light reaching the benthos, sediment depth, and mean nutrient concentrations were important predictor variables in the discriminant function model. Using a cross-validated classification scheme, this discriminant function identified the most likely benthic habitat type as the actual habitat type in most cases. The model predicted that the distribution of benthic habitat types in Florida Bay would likely change if water quality and water delivery were changed by human engineering of freshwater discharge from the Everglades. Specifically, an increase in the seasonal delivery of freshwater to Florida Bay should cause an expansion of seagrass beds dominated by Ruppia maritima and Halodule wrightii at the expense of the Thalassia testudinum-dominated community that now occurs in northeast Florida Bay. These statistical techniques should prove useful for predicting landscape-scale changes in community composition in diverse systems where communities are in quasi-equilibrium with environmental drivers.
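The structure of such a cross-validated classification model can be sketched with a minimal stand-in: a nearest-centroid rule assigning hypothetical stations to two benthic habitat classes from two water-quality predictors, scored by leave-one-out cross-validation. This is not the authors' discriminant function; the data and class names are illustrative only.

```python
# Minimal stand-in for a cross-validated discriminant classifier.
data = [  # (mean salinity, % light reaching benthos, habitat class) - hypothetical
    (35.0, 60.0, "Thalassia"), (34.0, 55.0, "Thalassia"),
    (36.0, 65.0, "Thalassia"), (33.5, 58.0, "Thalassia"),
    (20.0, 30.0, "Ruppia"),    (18.0, 35.0, "Ruppia"),
    (22.0, 28.0, "Ruppia"),    (19.5, 33.0, "Ruppia"),
]

def centroid(rows):
    xs = [r[0] for r in rows]
    ys = [r[1] for r in rows]
    return sum(xs) / len(xs), sum(ys) / len(ys)

def classify(x, y, train):
    """Assign (x, y) to the class with the nearest training centroid."""
    best, best_d = None, float("inf")
    for label in {r[2] for r in train}:
        cx, cy = centroid([r for r in train if r[2] == label])
        d = (x - cx) ** 2 + (y - cy) ** 2
        if d < best_d:
            best, best_d = label, d
    return best

# Leave-one-out cross-validation: hold each station out in turn.
hits = sum(classify(x, y, [r for r in data if r is not row]) == label
           for row in data for (x, y, label) in [row])
print(f"leave-one-out accuracy: {hits}/{len(data)}")
```

A real discriminant function additionally yields class membership probabilities for each predictor combination, which is what lets the authors forecast habitat shifts under altered water quality.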
Abstract:
Within the marl prairie grasslands of the Florida Everglades, USA, the combined effects of fire and flooding usually lead to very significant changes in tree island structure and composition. Depending on fire severity and post-fire hydroperiod, these effects vary spatially and temporally throughout the landscape, creating a patchy post-fire mosaic of tree islands with different successional states. Through the use of the Normalized Difference Vegetation Index (NDVI) and three predictor variables (marsh water table elevation at the time of fire, post-fire hydroperiod, and tree island size), along with logistic regression analysis, we examined the probability of tree island burning and recovering following the Mustang Corner Fire (May to June 2008) in Everglades National Park. Our data show that hydrologic conditions during and after fire, which are under varying degrees of management control, can lead to tree island contraction or loss. More specifically, the elevation of the marsh water table at the time of the fire appears to be the most important parameter determining the severity of fire in marl prairie tree islands. Furthermore, in the post-fire recovery phase, both tree island size and hydroperiod during the first year after the fire played important roles in determining the probability of tree island recovery, contraction, or loss.
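The logistic-regression step described above has the standard form p = 1/(1 + exp(−(β₀ + β₁x₁ + β₂x₂))). The sketch below applies it to the two recovery predictors named in the abstract; the coefficients are hypothetical, chosen only to reproduce the qualitative pattern (larger islands with shorter post-fire hydroperiods recover more often), not the fitted values.

```python
import math

def recovery_probability(hydroperiod_days, island_size_ha,
                         b0=-1.0, b_hydro=-0.02, b_size=1.5):
    """Logistic model: p = 1 / (1 + exp(-(b0 + b1*x1 + b2*x2))).

    Coefficients are hypothetical, for illustration only.
    """
    z = b0 + b_hydro * hydroperiod_days + b_size * island_size_ha
    return 1 / (1 + math.exp(-z))

p_small_wet = recovery_probability(hydroperiod_days=300, island_size_ha=0.2)
p_large_dry = recovery_probability(hydroperiod_days=120, island_size_ha=1.0)
print(f"small island, long post-fire hydroperiod:  {p_small_wet:.2f}")
print(f"large island, short post-fire hydroperiod: {p_large_dry:.2f}")
```

The signs of the hypothetical coefficients encode the abstract's finding: longer first-year hydroperiods lower, and larger island size raises, the odds of recovery.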
Abstract:
This paper analyzes the inner relations between classical probability and statistical probability, subjective probability and objective probability, prior probability and posterior probability, and transition probability and probability of utility. It further analyzes, from a mathematical perspective, the goal, the method, and the practical economic purpose represented by these various probabilities, so as to understand their connotations and their relation to economic decision making more deeply, thereby paving the way for scientific prediction and decision making.
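One of the notions the paper relates, transition probability, can be illustrated with a two-state Markov chain for an economy. The states and transition matrix below are hypothetical, chosen only to show how a prior state distribution is propagated forward:

```python
# Two-state Markov chain (hypothetical numbers): given today's state
# probabilities, tomorrow's follow from the transition matrix.
P = {  # P[state][next_state]
    "expansion": {"expansion": 0.9, "recession": 0.1},
    "recession": {"expansion": 0.4, "recession": 0.6},
}

def step(dist):
    """One transition: new_dist[s2] = sum over s1 of dist[s1] * P[s1][s2]."""
    return {s2: sum(dist[s1] * P[s1][s2] for s1 in P) for s2 in P["expansion"]}

dist = {"expansion": 1.0, "recession": 0.0}  # prior: certainly in expansion
for _ in range(3):
    dist = step(dist)
print({state: round(p, 3) for state, p in dist.items()})
```

Iterating the step is exactly the passage from a prior distribution over states to later distributions, the prior/posterior dynamic the paper connects to economic prediction.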