939 results for Extreme Value Theory
Abstract:
According to Bandura (1997), efficacy beliefs are a primary determinant of motivation. Still, very little is known about the processes through which people integrate situational factors to form efficacy beliefs (Myers & Feltz, 2007). The aim of this study was to gain insight into the cognitive construction of subjective group-efficacy beliefs. Only with a sound understanding of those processes is there a sufficient basis for deriving psychological interventions aimed at group-efficacy beliefs. According to cognitive theories (e.g., Miller, Galanter, & Pribram, 1973), individual group-efficacy beliefs can be seen as the result of a comparison between the demands of a group task and the resources of the performing group. At the center of this comparison are internally represented structures of the group task and plans to perform it. The empirical plausibility of this notion was tested using functional measurement theory (Anderson, 1981). Twenty-three students (M = 23.30 years; SD = 3.39; 35% female) of the University of Bern repeatedly judged the efficacy of groups in different group tasks. The groups consisted of the subject and one or two fictitious additional group members, who were varied in their level (low, medium, high) of task-relevant abilities. Data obtained from multiple full factorial designs were structured with individuals as second-level units and analyzed using linear mixed models. The task-relevant abilities of group members, specified as fixed factors, all had highly significant effects on subjects' group-efficacy judgments. The effect sizes of the ability factors proved to depend on the respective abilities' importance in a given task. In additive tasks (Steiner, 1972), group resources were integrated in a linear fashion, whereas significant interactions between factors were obtained in interdependent tasks.
The results also showed that people take into account other group members' efficacy beliefs when forming their own group-efficacy beliefs. The results support the notion that personal group-efficacy beliefs are obtained by comparing the demands of a task with the performing group's resources. Psychological factors such as other team members' efficacy beliefs are thereby considered task-relevant resources and affect subjective group-efficacy beliefs. This latter finding underlines the adequacy of multidimensional measures. While the validity of collective efficacy measures is usually estimated by how well they predict performance, the results of this study allow for an internal validity criterion of sorts. It is concluded that Information Integration Theory holds potential to further the understanding of people's cognitive functioning in sport-relevant situations.
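The diagnostic property of additive integration in a full factorial design can be illustrated with a toy numerical sketch. The scale values and the additive rule below are illustrative assumptions, not the study's data; the sketch only shows the classic parallelism test from functional measurement, under which a purely additive model produces zero interaction contrasts.

```python
import numpy as np

# Hypothetical sketch of Anderson-style information integration: in an
# additive task, a subject's group-efficacy judgment is modeled as the
# sum of subjective scale values of the fictive members' abilities.
# The scale values below are assumed for illustration only.
ability_scale = {"low": 2.0, "medium": 5.0, "high": 8.0}
levels = ["low", "medium", "high"]

def additive_judgment(member1, member2, bias=1.0):
    """Additive integration: judgment = bias + s(m1) + s(m2)."""
    return bias + ability_scale[member1] + ability_scale[member2]

# Full 3x3 factorial design over the two fictive members' ability levels.
matrix = np.array([[additive_judgment(a, b) for b in levels] for a in levels])

# Parallelism test: under additive integration the rows of the factorial
# plot are parallel, i.e. every 2x2 interaction contrast is zero.
interaction = (matrix[1:, 1:] - matrix[1:, :-1]) - (matrix[:-1, 1:] - matrix[:-1, :-1])
print(np.allclose(interaction, 0.0))  # True: no interaction in an additive task
```

A significant interaction, as the abstract reports for interdependent tasks, corresponds to this contrast deviating from zero.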
Abstract:
The transverse broadening of an energetic jet passing through a non-Abelian plasma is believed to be described by the thermal expectation value of a light-cone Wilson loop. In this exploratory study, we measure the light-cone Wilson loop with classical lattice gauge theory simulations. We observe, as suggested by previous studies, that there are strong interactions already at short transverse distances, which may lead to more efficient jet quenching than in leading-order perturbation theory. We also verify that the asymptotics of the Wilson loop do not change qualitatively when crossing the light cone, which supports arguments in the literature that infrared contributions to jet quenching can be studied with dimensionally reduced simulations in the space-like domain. Finally we speculate on possibilities for full four-dimensional lattice studies of the same observable, perhaps by employing shifted boundary conditions in order to simulate ensembles boosted by an imaginary velocity.
Abstract:
Researchers suggest that personalization on the Semantic Web will eventually add up to a Web 3.0. In this Web, personalized agents, rather than humans, process and thus generate the biggest share of information. In the sense of emergent semantics, which supplements the traditional formal semantics of the Semantic Web, this is well conceivable. An emergent Semantic Web underpinned by fuzzy grassroots ontologies can be accomplished by inducing knowledge from users' common parlance in mutual Web 2.0 interactions [1]. These ontologies can also be matched against existing Semantic Web ontologies to create comprehensive top-level ontologies. On the Web, if augmented with information in the form of restrictions and associated reliability (Z-numbers) [2], this collection of fuzzy ontologies constitutes an important basis for an implementation of Zadeh's restriction-centered theory of reasoning and computation (RRC) [3]. By considering the real world's fuzziness, RRC differs from traditional approaches because it can handle restrictions described in natural language. A restriction is an answer to a question about the value of a variable, such as the duration of an appointment. In addition to mathematically well-defined answers, RRC can likewise deal with unprecisiated answers such as "about one hour." Inspired by mental functions, it constitutes an important basis for leveraging present-day Web efforts into a natural Web 3.0. Based on natural-language information, RRC may be accomplished with Z-number calculation to achieve personalized Web reasoning and computation. Finally, through Web agents' understanding of natural language, they can react to humans more intuitively and thus generate and process information.
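What a natural-language restriction looks like computationally can be sketched with a minimal example: "about one hour" represented as a triangular fuzzy number. The 45-75 minute support is an assumed spread for illustration, not something specified in the cited references.

```python
# Hypothetical sketch: the restriction "about one hour" as a triangular
# fuzzy number over minutes -- the building block on which Z-numbers
# (restriction plus reliability) are defined.
def triangular(a, m, b):
    """Return the membership function of a triangular fuzzy number (a, m, b)."""
    def mu(x):
        if x <= a or x >= b:
            return 0.0
        if x <= m:
            return (x - a) / (m - a)   # rising flank
        return (b - x) / (b - m)       # falling flank
    return mu

# Assumed spread of +/- 15 minutes around the prototype value 60.
about_one_hour = triangular(45, 60, 75)

print(about_one_hour(60))    # 1.0 -- fully "about one hour"
print(about_one_hour(52.5))  # 0.5 -- partially compatible
print(about_one_hour(90))    # 0.0 -- outside the restriction
```

A Z-number would pair such a restriction with a second fuzzy number expressing its reliability (e.g., "usually").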
Abstract:
Traditional methods do not measure people's risk attitudes naturally and precisely. Therefore, a fuzzy risk attitude classification method is developed. Since prospect theory is usually considered an effective model of decision making, the personalized parameters in prospect theory are first fuzzified to distinguish people with different risk attitudes, and a fuzzy classification database schema is then applied to calculate the exact values of risk value attitude and risk behavior attitude. Finally, by applying a two-level hierarchical classification model, a precise value of the synthetic risk attitude can be acquired.
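The personalized parameters the abstract proposes to fuzzify belong to the prospect-theory value function. A minimal sketch, using the Tversky-Kahneman (1992) point estimates purely for illustration (the method above would replace these crisp values with fuzzy ones):

```python
# Hedged sketch of the prospect-theory value function. The parameters
# alpha, beta (diminishing sensitivity) and lambda_ (loss aversion) are
# the classic Tversky-Kahneman estimates, used here only as examples of
# the crisp values the fuzzy classification method would generalize.
def pt_value(x, alpha=0.88, beta=0.88, lambda_=2.25):
    """Subjective value of an outcome x relative to the reference point 0."""
    if x >= 0:
        return x ** alpha                 # concave for gains -> risk averse
    return -lambda_ * ((-x) ** beta)      # convex and steeper for losses

# Loss aversion: a loss of 100 hurts more than a gain of 100 pleases.
print(pt_value(100) > abs(pt_value(-100)))  # False
```

Fuzzifying alpha, beta and lambda_ turns each respondent's parameter estimate into a membership grade over risk-attitude classes rather than a single crisp label.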
Abstract:
Theory: Interpersonal factors play a major role in causing and maintaining depression. It is unclear, however, to what degree significant others of the patient need to be involved to characterize the patient's interpersonal style. Our study therefore investigated how impact messages as perceived by the patients' significant others add to the prediction of psychotherapy process and outcome above and beyond routine assessments and therapist factors. Method: 143 outpatients with major depressive disorder were treated by 24 therapists with CBT or Exposure-Based Cognitive Therapy. Interpersonal style was measured pre- and post-therapy with the informant-based Impact Message Inventory (IMI), in addition to the self-report Inventory of Interpersonal Problems (IIP-32). Indicators of the patients' dominance and affiliation as well as interpersonal distress were calculated from these measures. Depressive and general symptomatology was assessed at pre-treatment, post-treatment, and three-month follow-up, and by process measures after every session. Results: Whereas significant others' reports did not add significantly to the prediction of the early therapeutic alliance, central mechanisms of change, or post-therapy outcome when therapist factors were included, the best predictor of outcome three months post-therapy was an increase in dominance as perceived by significant others. Conclusions: The patients' significant others seem to provide important additional information about the patients' interpersonal style and should therefore be included in the diagnostic process. Moreover, practitioners should specifically target interpersonal change as a potential mechanism of change in psychotherapy for depression.
Abstract:
In this article, we present a new microscopic theoretical approach to the description of spin crossover in molecular crystals. The spin-crossover crystals under consideration are composed of molecular fragments formed by the spin-crossover metal ion and its nearest ligand surroundings, exhibiting well-defined localized (molecular) vibrations. As distinguished from previous models of this phenomenon, the developed approach takes into account not only the interaction of spin-crossover ions with the phonons but also a strong coupling of the electronic shells with the molecular modes. This leads to an effective coupling of the local modes with phonons, which is shown to be responsible for the cooperative spin transition accompanied by structural reorganization. The transition is characterized by two order parameters representing the mean values of the products of the diagonal electronic matrices and the coordinates of the local modes for the high- and low-spin states of the spin-crossover complex. Finally, we demonstrate that the approach provides a reasonable explanation of the observed spin transition in the [Fe(ptz)6](BF4)2 crystal. The theory reproduces well the observed abrupt low-spin → high-spin transition, the temperature dependence of the high-spin fraction over a wide temperature range, and the pronounced hysteresis loop. At the same time, within the limiting approximations adopted in the developed model, the evaluated high-spin fraction vs. T shows that the cooperative spin-lattice transition proves to be incomplete, in the sense that the high-spin fraction does not reach its maximum value at high temperature.
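The qualitative behavior a cooperative spin transition must reproduce can be sketched with the standard mean-field two-level picture (a Slichter-Drickamer-type model). This is emphatically not the authors' microscopic vibronic theory, only the simplest textbook model of the same phenomenon; all parameter values (dH, dS, Gamma) are illustrative assumptions.

```python
import math

# Mean-field sketch of a cooperative spin transition: the high-spin (HS)
# fraction n solves n = 1 / (1 + exp((dH - T*dS + Gamma*(1 - 2n)) / (R*T))),
# where Gamma is the cooperative (elastic) interaction. Parameters are
# assumed, not fitted to [Fe(ptz)6](BF4)2.
R = 8.314                         # J mol^-1 K^-1
dH, dS, Gamma = 12000.0, 60.0, 4000.0   # J/mol, J/(mol K), J/mol (assumed)

def hs_fraction(T, n0=0.5, iters=200):
    """Solve the self-consistency equation for the HS fraction at temperature T."""
    n = n0
    for _ in range(iters):
        n = 1.0 / (1.0 + math.exp((dH - T * dS + Gamma * (1.0 - 2.0 * n)) / (R * T)))
    return n

# With these parameters the transition temperature is T1/2 = dH/dS = 200 K:
# the crystal is low-spin well below it and predominantly high-spin above it.
print(hs_fraction(150) < 0.1, hs_fraction(250) > 0.9)  # True True
```

Because Gamma exceeds 2*R*T1/2 here, the self-consistency equation has two stable branches near T1/2, which is the mean-field origin of the hysteresis loop the abstract mentions.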
Abstract:
We propose a way to incorporate non-tariff barriers (NTBs) for the four workhorse models of the modern trade literature in computable general equilibrium (CGE) models. CGE models feature intermediate linkages and thus allow us to study global value chains (GVCs). We show that the Ethier-Krugman monopolistic competition model, the Melitz firm heterogeneity model and the Eaton-Kortum model can be defined as an Armington model with generalized marginal costs, generalized trade costs and a demand externality. As is known from the literature, in both the Ethier-Krugman model and the Melitz model generalized marginal costs are a function of the amount of factor input bundles. In the Melitz model generalized marginal costs are also a function of the price of the factor input bundles: lower factor prices raise the number of firms that can enter the market profitably (the extensive margin), reducing the generalized marginal costs of a representative firm. For the same reason the Melitz model features a demand externality: in a larger market more firms can enter. We implement the different models in a CGE setting with multiple sectors, intermediate linkages, non-homothetic preferences and detailed data on trade costs. We find the largest welfare effects from trade cost reductions in the Melitz model. We also employ the Melitz model to mimic changes in NTBs with a fixed-cost character by analysing the effect of changes in fixed trade costs. While we work here with a model calibrated to the GTAP database, the methods developed can also be applied to CGE models based on the WIOD database.
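The Armington building block that the paper generalizes can be sketched in a few lines: CES expenditure shares over source countries with iceberg trade costs. The prices, trade costs and elasticity below are illustrative assumptions, not calibrated GTAP values.

```python
# Hedged sketch of the Armington demand system: with CES preferences and
# elasticity of substitution sigma, the expenditure share of source i is
# proportional to (p_i * tau_i)^(1 - sigma), where tau_i is an iceberg
# trade cost. All numbers below are made up for illustration.
def armington_shares(prices, trade_costs, sigma=4.0):
    """CES expenditure shares across sources, normalized to sum to one."""
    weights = [(p * t) ** (1.0 - sigma) for p, t in zip(prices, trade_costs)]
    total = sum(weights)
    return [w / total for w in weights]

home, foreign = armington_shares([1.0, 1.0], [1.0, 1.5])
# Reducing the foreign trade cost (e.g., an NTB cut) shifts spending abroad.
home2, foreign2 = armington_shares([1.0, 1.0], [1.0, 1.2])
print(foreign2 > foreign)  # True
```

The paper's point is that the richer models can be written in this same form once marginal costs and trade costs are replaced by their "generalized" counterparts and a demand externality is added.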
Abstract:
Redemption laws give mortgagors the right to redeem their property following default for a statutorily set period of time. This paper develops a theory that explains these laws as a means of protecting landowners against the loss of non-transferable values associated with their land. A longer redemption period reduces the risk that this value will be lost but also increases the likelihood of default. The optimal redemption period balances these effects. Empirical analysis of cross-state data from the early twentieth century suggests that these factors, in combination with political considerations, explain the existence and length of redemption laws.
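The trade-off described above can be sketched as a toy optimization; the functional forms and all parameters below are assumptions for illustration, not the paper's formal model.

```python
import math

# Illustrative sketch: the optimal redemption period T balances a default
# probability that rises with T against the risk of losing non-transferable
# land value, which falls with T. Linear default risk and exponential
# loss risk are assumed forms; all parameters are made up.
def expected_cost(T, c_default=1.0, k=0.02, V_n=2.0, r=0.5):
    default_prob = k * T              # longer redemption -> more default
    loss_prob = math.exp(-r * T)      # shorter redemption -> value more at risk
    return default_prob * c_default + loss_prob * V_n

# Grid search over redemption periods of 0 to 10 years.
grid = [t / 10 for t in range(0, 101)]
T_star = min(grid, key=expected_cost)
print(0 < T_star < 10)  # True: the optimum is interior, balancing both effects
```

With these numbers the minimizing period is interior rather than at either extreme, which is the qualitative prediction the empirical section then takes to cross-state data.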
Abstract:
This study aims to examine the international value distribution structure among major East Asian economies and the US. Mainstream trade theory explains the gains from trade; the global value chain (GVC) approach, however, emphasises the uneven benefits of globalization among trading partners. The present study is mainly based on this view, examining which economy gains the most and which the least from the East Asian production networks. Two key industries, electronics and automobiles, are our principal focus. The input-output method is employed to trace the creation and flows of value-added within the region. A striking fact is that some ASEAN economies see increasingly reduced shares of value-added, which are instead captured by developed countries, particularly Japan. Policy implications are discussed in the final section.
Abstract:
The fragmentation of production chains across borders is one of the most distinctive features of the last 30 years of globalization. Nonetheless, our understanding of its implications for trade theory and policy is only in its infancy. We suggest that trade in value added should follow theories of comparative advantage more closely than gross trade, as value-added flows capture where factors of production, e.g. skilled and unskilled labor, are used along the global value chain. We find empirical evidence that Heckscher-Ohlin theory does predict manufacturing trade in value added, and does so better than for gross shipment flows. While countries export across a broad range of sectors, they contribute more value added through techniques that use their abundant factor intensively.
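The input-output accounting behind value-added trade (used by both of the preceding abstracts) reduces to the Leontief inverse: value added embodied in final demand is v_hat (I - A)^-1 f. The two-sector coefficients and final demand below are made-up numbers chosen only to show the mechanics.

```python
import numpy as np

# Minimal sketch of value-added accounting in an input-output system.
# A: intermediate input coefficients (column j = inputs per unit of
# sector j's output); f: final demand. Numbers are illustrative.
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])
f = np.array([100.0, 50.0])

v = 1.0 - A.sum(axis=0)            # value-added per unit of gross output
L = np.linalg.inv(np.eye(2) - A)   # Leontief inverse
gross_output = L @ f               # output needed to meet final demand
value_added = v * gross_output     # value added generated in each sector

# Accounting identity: total value added equals total final demand.
print(np.isclose(value_added.sum(), f.sum()))  # True
```

Tracing which country's sectors contribute the `value_added` entries along a multi-country A matrix is exactly how value-added trade flows are separated from gross shipment flows.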
Abstract:
The authors are from UPM, have worked closely together, and have all been involved in different academic or real cases on the subject, at different times owing to their different ages. Building on the precedent of E. Torroja and A. Páez in Madrid, Spain, with probabilistic safety models for concrete around 1957 (a line now represented in the ICOSSAR conferences), author J.M. Antón, involved since autumn 1967 in European steel construction within CECM, produced a mathematical model for reductions under the superposition of independent loads, and from it a load-coefficient pattern for codes (Rome, Feb. 1969) that was practically adopted for European construction; at JCSS (Lisbon, Feb. 1974) he suggested its unification for concrete, steel and aluminium. That model represents each type of load by a Gumbel Type I law with a 50-year return period, reduced to 1 year so that it can be added to other independent loads, the sum then being set, following Gumbel theory, back to a 50-year return period; parallel models exist. A complete reliability system was produced, including nonlinear effects such as buckling, phenomena considered to some degree in the current Construction Eurocodes derived from the Model Codes. The system was also considered by the author in CEB in the presence of hydraulic effects from rivers, floods and the sea, with reference to actual practice. When drafting a road drainage norm for MOPU, Spain, the authors developed an optimization model giving a way to determine the return period, 10 to 50 years, for the hydraulic flows to be considered in road drainage. Satisfactory examples were a stream in the southeast of Spain modelled with a Gumbel Type I law, and a paper by Ven Te Chow on the Mississippi at Keokuk using a Gumbel Type II law; the model can be modernized with a wider variety of extreme-value laws. In the MOPU drainage norm, the drafting commission also acted as an expert body in setting a table of return periods for elements of road drainage, in effect a complex multi-criteria decision system.
These precedent ideas were used, e.g., in wide-ranging codes and presented in symposia or meetings, but not published in English-language journals, so a condensate of the authors' contributions is presented here. The authors are also involved in optimization for hydraulic and agricultural planning, and give modest hints of intended applications to agricultural and environmental planning, as a selection of the criteria and utility functions involved in Bayesian, multi-criteria or mixed decision systems. Modest consideration is given to climate change, to production and commercial systems, and to other aspects such as social and financial ones.
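The return-period logic behind both the load model and the drainage norm can be sketched directly: the T-year design value is the Gumbel quantile with exceedance probability 1/T. The location and scale parameters below are illustrative, not fitted to any of the streams mentioned.

```python
import math

# Sketch of the Gumbel Type I return-level computation the abstract
# describes. mu (location) and beta (scale) would in practice be fitted
# to annual-maximum flow data; here they are assumed for illustration.
def gumbel_return_level(T, mu=100.0, beta=25.0):
    """Gumbel quantile with exceedance probability 1/T (the T-year event)."""
    p = 1.0 - 1.0 / T
    return mu - beta * math.log(-math.log(p))

q10 = gumbel_return_level(10)   # 10-year design flow
q50 = gumbel_return_level(50)   # 50-year design flow
print(q50 > q10 > 100.0)  # True: longer return period -> larger design value
```

The 1-year-to-50-year reduction in the load model exploits the max-stability of the Gumbel family: the maximum over n independent years is again Gumbel, with the location shifted by beta*log(n).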
Abstract:
An asymptotic analysis of the Langmuir-probe problem in a quiescent, fully ionized plasma in a strong magnetic field is performed, for electron cyclotron radius and Debye length much smaller than the probe radius, with the probe radius not larger than either the ion cyclotron radius or the mean free path. It is found that the electric potential, which is not confined to a sheath, controls the diffusion far from the probe; inside the magnetic tube bounded by the probe cross section, the potential overshoots to a large value before decaying to its value in the body of the plasma. The electron current is independent of the shape of the body along the field and increases with ion temperature; due to the overshoot in the potential, (1) the current at negative voltages does not vary exponentially, (2) its magnitude is strongly reduced by the field, and (3) the usual sharp knee at space potential disappears. In the regions of the C-V diagram studied, the ion current is negligible or unaffected by the field. Some numerical results are presented. The theory, which fails beyond a certain positive voltage, yields useful results for weak fields, too.