9 results for "Probabilidade de default"
at the Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
The idea of allowing imprecision in probabilities is old, beginning with the work of George Boole, who in 1854 sought to reconcile classical logic, which can model complete ignorance, with probability theory. In 1921, John Maynard Keynes, in his book, made explicit use of intervals to represent imprecision in probabilities. But it was only with Walley's work of 1991 that principles were established that a probability theory dealing with imprecision should respect. With the emergence of fuzzy set theory, introduced by Lotfi Zadeh in 1965, another way of dealing with uncertainty and imprecision of concepts became available. Several proposals soon appeared to bring Zadeh's ideas into probability theory, handling imprecision either in the events to which probabilities are assigned or in the probability values themselves. In particular, from 2003 onwards James Buckley developed a probability theory in which the values of the probabilities are fuzzy numbers. This fuzzy probability follows principles analogous to those of Walley's imprecise probabilities. On the other hand, the use of real numbers between 0 and 1 as truth degrees, as originally proposed by Zadeh, has the drawback of employing very precise values to deal with uncertainty (can one really distinguish an element that satisfies a property to degree 0.423 from one that satisfies it to degree 0.424?). This motivated the development of several extensions of fuzzy set theory that incorporate some kind of imprecision. This work considers the extension proposed by Krassimir Atanassov in 1983, which adds an extra degree to model the hesitation felt when assigning the membership degree: one value indicates the degree to which the object belongs to the set, while the other indicates the degree to which it does not belong. In Zadeh's fuzzy set theory, this non-membership degree is, by default, the complement of the membership degree.
In Atanassov's approach, by contrast, the non-membership degree is to some extent independent of the membership degree, and the difference between the non-membership degree and the complement of the membership degree reveals the hesitation at the moment of assigning a membership degree. This extension is today called Atanassov's intuitionistic fuzzy set theory. It is worth noting that the term "intuitionistic" here bears no relation to its meaning in the context of intuitionistic logic. In this work, two notions of interval probability are developed: the restricted interval probability and the unrestricted interval probability. Two notions of fuzzy probability are also introduced: the constrained fuzzy probability and the unconstrained fuzzy probability. Finally, two notions of intuitionistic fuzzy probability are introduced: the restricted intuitionistic fuzzy probability and the unrestricted intuitionistic fuzzy probability.
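The hesitation degree at the heart of Atanassov's extension can be stated in a few lines of code. The helper below is an illustrative sketch (names are not from the thesis): given a membership degree mu and an independent non-membership degree nu with mu + nu <= 1, the hesitation degree is pi = 1 - mu - nu, which vanishes in Zadeh's case nu = 1 - mu.

```python
def hesitation(mu: float, nu: float) -> float:
    """Return the hesitation degree pi = 1 - mu - nu of an
    Atanassov intuitionistic fuzzy membership assignment."""
    if not (0.0 <= mu <= 1.0 and 0.0 <= nu <= 1.0 and mu + nu <= 1.0):
        raise ValueError("require 0 <= mu, nu <= 1 and mu + nu <= 1")
    return 1.0 - mu - nu

# In Zadeh's fuzzy sets nu is fixed at 1 - mu, so pi is zero:
assert abs(hesitation(0.7, 0.3)) < 1e-12  # pi == 0 up to rounding
# An intuitionistic assignment may leave room for hesitation:
print(hesitation(0.6, 0.3))  # ~0.1 of hesitation
```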
Abstract:
Static and cyclic tests are commonly used to characterize materials in structures. Cyclic tests assess the fatigue behavior of the material, yielding the S-N curves that are used to construct constant-life diagrams. However, when built from a small number of S-N curves, these diagrams underestimate or overestimate the actual behavior of the composite, so more tests are needed to obtain accurate results. A way of reducing this cost is the statistical analysis of fatigue behavior. The aim of this research was to evaluate the probabilistic fatigue behavior of composite materials. The research was conducted in three parts. The first part associates the Weibull probability equation with the equations commonly used to model the S-N curve of composite materials, namely the exponential equation and the power law, together with their generalizations. In the second part, the results obtained with the equation that best represents the probabilistic S-N curves were used to train a modular network at the 5% failure level. In the third part, a comparative study was carried out between the results obtained with the piecewise nonlinear model (PNL) and those of a modular network architecture (MN) in the analysis of fatigue behavior. A database of ten materials taken from the literature was used to assess the generalization ability of the modular network as well as its robustness. The results showed that the generalized probabilistic power law best represents the probabilistic fatigue behavior of the composites and that, although the MN was not robust when trained at the 5% failure level, for mean values the MN gave more accurate results than the PNL model.
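The probabilistic S-N modelling described above can be sketched as follows. This is a hedged illustration, not the thesis's fitted model: a Weibull life distribution is attached to a power-law S-N relation, and the curve at a given failure probability (such as the 5% level used for training) follows from the Weibull quantile. The constants C, m and the shape k are made-up illustrative values.

```python
import math

def characteristic_life(stress: float, C: float = 1e12, m: float = 3.0) -> float:
    """Power-law S-N relation: characteristic life N0 = C * S**-m
    (illustrative constants, not fitted to any real composite)."""
    return C * stress ** (-m)

def life_at_failure_prob(stress: float, p: float, k: float = 2.0) -> float:
    """Life N_p with P(failure before N_p) = p, assuming the life at a
    given stress follows Weibull(shape k, scale N0(stress))."""
    n0 = characteristic_life(stress)
    return n0 * (-math.log(1.0 - p)) ** (1.0 / k)

# The 5%-failure curve lies below the characteristic (63.2%) curve
# at every stress level:
for s in (100.0, 200.0, 400.0):
    assert life_at_failure_prob(s, 0.05) < characteristic_life(s)
```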
Abstract:
Ayahuasca is a psychotropic beverage that has been used for ages by indigenous populations of South America, notably in the Amazon region, for religious and medicinal purposes. The tea is obtained by decoction of leaves of Psychotria viridis with the bark and stalks of a shrub, Banisteriopsis caapi. The first is rich in N,N-dimethyltryptamine (DMT), which has an important and well-known hallucinogenic effect due to its agonist action on serotonin receptors, specifically 5-HT2A. On the other hand, the β-carbolines present in B. caapi, particularly harmine and harmaline, are potent monoamine oxidase inhibitors (MAOi). In addition, tetrahydroharmine (THH), also present in B. caapi, acts as a mild selective serotonin reuptake inhibitor and a weak MAOi. This unique composition induces a number of affective, sensory, perceptual and cognitive changes in individuals under the effect of Ayahuasca. In parallel, there is growing interest in the Default Mode Network (DMN), which has been consistently observed in functional neuroimaging studies. The key components of this network include structures along the brain midline, such as the anterior medial frontal cortex, the ventral medial frontal cortex, the posterior cingulate cortex and the precuneus, as well as regions within the inferior parietal lobe and the middle temporal gyrus. It has been argued that the DMN participates in tasks involving self-judgment, autobiographical memory retrieval, mental simulation, perspective taking, meditative states, and others. In general, these tasks require an internal focus of attention, hence the conclusion that the DMN is associated with introspective mental activity. This study therefore aimed to evaluate, by functional magnetic resonance imaging (fMRI), changes in the DMN caused by the ingestion of Ayahuasca in 10 healthy subjects submitted to two fMRI protocols: a verbal fluency task and a resting-state acquisition.
In general, it was observed that Ayahuasca causes a reduction of the fMRI signal in central nodes of the DMN, such as the anterior cingulate cortex, the medial prefrontal cortex, the posterior cingulate cortex, the precuneus and the inferior parietal lobe. Furthermore, changes in the connectivity patterns of the DMN were observed, especially a decrease in the functional connectivity of the precuneus. Together, these findings indicate an association between the altered state of consciousness experienced by individuals under the effect of Ayahuasca and changes in the stream of spontaneous thoughts, leading to increased introspective mental activity.
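Functional connectivity of the kind reported here (for instance, of the precuneus) is commonly measured as the Pearson correlation between regional BOLD time series. The sketch below uses synthetic coupled signals and illustrative region names; it does not reproduce the study's preprocessing or statistics.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 60, 300)  # 300 volumes, arbitrary time units

# Two synthetic "regions" sharing a slow oscillation plus noise,
# standing in for preprocessed BOLD time series.
precuneus = np.sin(0.5 * t) + 0.3 * rng.standard_normal(t.size)
pcc = np.sin(0.5 * t) + 0.3 * rng.standard_normal(t.size)

# Functional connectivity as the Pearson correlation coefficient.
connectivity = np.corrcoef(precuneus, pcc)[0, 1]
print(f"functional connectivity (Pearson r): {connectivity:.2f}")
assert connectivity > 0.5  # strongly coupled synthetic signals
```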
Abstract:
In this work we study a new risk model for a firm that is sensitive to its credit quality, proposed by Yang (2003). Recursive equations are obtained for the finite-time ruin probability and for the distribution of the time of ruin, as well as Volterra-type integral equation systems for the ultimate ruin probability, the severity of ruin, and the distribution of the surplus before and after ruin.
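The finite-time ruin probability that the recursive equations target can also be approximated by simulation. The sketch below uses a plain discrete-time surplus process with exponential claims, not Yang's credit-quality-sensitive model; all parameter values are illustrative.

```python
import random

def ruin_probability(u0: float, premium: float, claim_mean: float,
                     horizon: int, n_paths: int = 20000, seed: int = 1) -> float:
    """Monte Carlo estimate of P(ruin before `horizon`) for the surplus
    process U_t = U_{t-1} + premium - X_t with i.i.d. exponential claims."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        u = u0
        for _ in range(horizon):
            u += premium - rng.expovariate(1.0 / claim_mean)
            if u < 0:
                ruined += 1
                break
    return ruined / n_paths

# More initial capital should mean less ruin:
p_low = ruin_probability(u0=1.0, premium=1.2, claim_mean=1.0, horizon=50)
p_high = ruin_probability(u0=10.0, premium=1.2, claim_mean=1.0, horizon=50)
assert p_high < p_low
print(p_low, p_high)
```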
Abstract:
Two-level factorial designs are widely used in industrial experimentation. However, a design with many factors requires a large number of runs, and many replications of the treatments may not be feasible given limitations of resources and time, making the experiment expensive. In such cases, unreplicated designs are used. But with only one replicate there is no internal estimate of the experimental error with which to judge the significance of the observed effects. One possible solution to this problem is to use normal plots or half-normal plots of the effects. Many experimenters use the normal plot while others prefer the half-normal plot, often, in both cases, without justification. The controversy around these two graphical techniques motivates this work, since there is no record of a formal procedure or statistical test indicating which one is best; the choice between the two plots seems to be a subjective issue. The central objective of this master's thesis is therefore to perform an experimental comparative study of the normal plot and the half-normal plot in the context of the analysis of unreplicated 2^k factorial experiments. The study involves the construction of simulated scenarios in which the performance of the plots in detecting significant effects and identifying outliers is evaluated, in order to answer the following questions: Can one plot be better than the other? In which situations? What information does one plot add to the analysis of the experiment that might complement that provided by the other? What are the restrictions on the use of these plots?
In this way, this work confronts the two techniques, examining them simultaneously in order to identify similarities, differences, or relationships that contribute to building a theoretical reference to justify, or aid, the experimenter's decision about which of the two graphical techniques to use and the reason for that use. The simulation results show that the half-normal plot is better for judging the effects, while the normal plot is recommended for detecting outliers in the data.
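The quantities behind a half-normal plot can be illustrated on a small synthetic example. The sketch below, with made-up responses for an unreplicated 2^3 design, computes the main effects and the half-normal quantiles against which the ordered absolute effects would be plotted; it is not the thesis's simulation setup.

```python
from itertools import product
from statistics import NormalDist

# Synthetic unreplicated 2^3 design: factors A, B, C coded -1/+1.
# product([-1, 1], repeat=3) varies the LAST coordinate fastest, so
# position 0 is factor A (slowest) and position 2 is factor C (fastest).
runs = list(product([-1, 1], repeat=3))
# Made-up responses with a large A effect and noise-sized B, C effects.
y = [10.1, 9.8, 10.3, 10.0, 14.9, 15.2, 15.1, 14.8]

def effect(col: int) -> float:
    """Main effect: mean response at +1 minus mean response at -1."""
    plus = [yi for r, yi in zip(runs, y) if r[col] == 1]
    minus = [yi for r, yi in zip(runs, y) if r[col] == -1]
    return sum(plus) / len(plus) - sum(minus) / len(minus)

effects = {name: effect(i) for i, name in enumerate("ABC")}
assert abs(effects["A"]) > 10 * max(abs(effects["B"]), abs(effects["C"]))

# Half-normal plotting positions: the i-th ordered |effect| is plotted
# against the quantile Phi^-1(0.5 + 0.5 * (i - 0.5) / m); a point far
# above the line through the small effects is judged significant.
abs_effects = sorted(abs(e) for e in effects.values())
m = len(abs_effects)
half_normal_q = [NormalDist().inv_cdf(0.5 + 0.5 * (i - 0.5) / m)
                 for i in range(1, m + 1)]
print(list(zip(half_normal_q, abs_effects)))
```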
Abstract:
In the work reported here we present theoretical and numerical results on a risk model with interest rate and proportional reinsurance, based on the article "Inequalities for the ruin probability in a controlled discrete-time risk process" by Rosário Romera and Maikol Diasparra (see [5]). Recursive and integral equations, as well as upper bounds for the ruin probability, are given under three different approaches, namely the classical Lundberg inequality, an inductive approach, and a martingale approach. Non-parametric density estimation techniques are used to derive upper bounds for the ruin probability, and the algorithms used in the simulation are presented.
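The classical Lundberg inequality mentioned above can be made concrete in a simple special case. The sketch below is an illustrative simplification, not the article's model (it ignores interest and reinsurance): for a discrete-time surplus process with premium c per period and i.i.d. exponential claims of mean mu, the adjustment coefficient R solves E[exp(R*(X - c))] = 1, and then psi(u) <= exp(-R*u).

```python
import math

def adjustment_coefficient(c: float, mu: float, tol: float = 1e-12) -> float:
    """Find R in (0, 1/mu) solving exp(-R*c) / (1 - mu*R) = 1 by bisection.
    For exponential claims E[exp(R*X)] = 1/(1 - mu*R), valid for R < 1/mu."""
    f = lambda r: math.exp(-r * c) / (1.0 - mu * r) - 1.0
    lo, hi = 1e-9, 1.0 / mu - 1e-9   # f(lo) < 0 < f(hi) under positive loading
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative parameters: premium 1.2 per period, mean claim 1.0.
R = adjustment_coefficient(c=1.2, mu=1.0)
lundberg_bound = lambda u: math.exp(-R * u)  # upper bound on psi(u)
print(f"R = {R:.4f}, bound at u=5: {lundberg_bound(5.0):.4f}")
```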
Abstract:
This thesis aims to support teachers and students in the teaching and learning of probability in high school, a subject that sharpens the perception and understanding of the random phenomena that surround us. It also aims to help those involved in this process understand the basic ideas of probability and, when necessary, apply them in the real world. We seek to strike a balance between intuition and rigor, and hope thereby to contribute to the teacher's work in the classroom and to the students' learning process, consolidating, deepening and expanding what they have learned in previous topics.
Abstract:
In this paper we propose a class for introducing the teaching of probability using the disc game, which is based on the concept of geometric probability and consists in determining the probability that a disc thrown at random does not intercept the lines of a gridded surface. The problem was posed to a 3rd-year class of the Federal Institute of Education, Science and Technology of Rio Grande do Norte - João Câmara. The students were asked to build a grid board for which the players' success percentage had been defined for them in advance. Once the grid board was built, the students were to check whether that theoretically predetermined percentage corresponded to the reality obtained through experimentation. The results and the attitude of the students in later classes suggested their greater involvement with the discipline, making the environment conducive to learning.
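The disc-game probability itself is easy to state and to check by simulation: on a grid of squares of side L, a disc of diameter d avoids every line exactly when its centre falls in an inner square of side L - d, so P(no hit) = ((L - d) / L)^2. The sketch below (illustrative values for L and d) compares the closed-form value with a Monte Carlo estimate.

```python
import random

def p_no_hit(L: float, d: float) -> float:
    """Exact geometric probability that the disc misses all grid lines."""
    return ((L - d) / L) ** 2

def simulate(L: float, d: float, n: int = 200000, seed: int = 7) -> float:
    """Throw n disc centres uniformly into one grid cell and count the
    fraction that keep the whole disc clear of the cell's edges."""
    rng = random.Random(seed)
    r = d / 2.0
    avoided = 0
    for _ in range(n):
        x, y = rng.uniform(0, L), rng.uniform(0, L)
        if r <= x <= L - r and r <= y <= L - r:
            avoided += 1
    return avoided / n

L, d = 10.0, 4.0
exact = p_no_hit(L, d)   # (6/10)^2 = 0.36
approx = simulate(L, d)
assert abs(exact - approx) < 0.01
print(exact, approx)
```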