812 results for Boolean-like laws. Fuzzy implications. Fuzzy rule-based systems. Fuzzy set theories


Relevance: 100.00%

Publisher:

Abstract:

In recent years, the fight against money laundering has emerged as a key issue of financial regulation. The Wolfsberg Group is an important multistakeholder agreement establishing corporate responsibility (CR) principles against money laundering in a domain where international coordination remains otherwise difficult. The fact that 10 out of the 25 top private banking institutions joined this initiative opens up an interesting puzzle concerning the conditions for the participation of key industry players in the Wolfsberg Group. The article presents a fuzzy-set analysis of seven hypotheses based on firm-level organizational factors, the macro-institutional context, and the regulatory framework. Results from the analysis of these 25 financial institutions show that public ownership of the bank and the existence of a code of conduct are necessary conditions for participation in the Wolfsberg Group, whereas factors related to the type of financial institution, combined with the existence of a black list, are sufficient for explaining participation.
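The necessity and sufficiency claims above come from fuzzy-set qualitative comparative analysis (fsQCA). A minimal sketch of the two standard consistency scores, computed on hypothetical membership values rather than the article's actual 25-bank sample:

```python
# Fuzzy-set QCA consistency scores, illustrated on hypothetical
# membership data -- not the article's actual bank sample.

def necessity_consistency(condition, outcome):
    """Consistency of 'condition is necessary for outcome':
    sum(min(X, Y)) / sum(Y)."""
    return sum(min(x, y) for x, y in zip(condition, outcome)) / sum(outcome)

def sufficiency_consistency(condition, outcome):
    """Consistency of 'condition is sufficient for outcome':
    sum(min(X, Y)) / sum(X)."""
    return sum(min(x, y) for x, y in zip(condition, outcome)) / sum(condition)

# Hypothetical fuzzy memberships for five banks.
code_of_conduct = [1.0, 0.8, 0.9, 0.6, 1.0]   # condition X
participation   = [0.9, 0.7, 0.8, 0.4, 1.0]   # outcome Y

print(necessity_consistency(code_of_conduct, participation))    # 1.0 here
print(sufficiency_consistency(code_of_conduct, participation))  # < 1.0 here
```

A necessity consistency near 1.0 supports reading the condition as necessary for the outcome; the sufficiency score divides the same overlap by the condition's total membership instead.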

Relevance: 100.00%

Publisher:

Abstract:

The paper deals with a bilateral accident situation in which victims have heterogeneous costs of care. With perfect information, efficient care by the injurer rises with the victim's cost. When the injurer cannot observe the victim's type at all, and this fact can be verified by Courts, first-best cannot be implemented with the use of a negligence rule based on the first-best levels of care. Second-best leads the injurer to intermediate care, and the two types of victims to choose the best response to it. This second-best solution can be easily implemented by a negligence rule with second-best as due care. We explore imperfect observation of the victim's type, characterizing the optimal solution and examining the different legal alternatives when Courts cannot verify the injurers' statements. Counterintuitively, we show that there is no difference at all between the use by Courts of a rule of complete trust and a rule of complete distrust towards the injurers' statements. We then relate the findings of the model to existing rules and doctrines in Common Law and Civil Law legal systems.

Relevance: 100.00%

Publisher:

Abstract:

We lay out a small open economy version of the Calvo sticky price model, and show how the equilibrium dynamics can be reduced to a simple representation in domestic inflation and the output gap. We use the resulting framework to analyze the macroeconomic implications of three alternative rule-based policy regimes for the small open economy: domestic inflation and CPI-based Taylor rules, and an exchange rate peg. We show that a key difference among these regimes lies in the relative amount of exchange rate volatility that they entail. We also discuss a special case for which domestic inflation targeting constitutes the optimal policy, and where a simple second order approximation to the utility of the representative consumer can be derived and used to evaluate the welfare losses associated with the suboptimal rules.
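The contrast between the rule-based regimes can be illustrated with a standard Taylor-rule sketch. The coefficients, import share, and numbers below are illustrative assumptions, not the paper's calibration; the point is only that a CPI-based rule also reacts to imported inflation:

```python
# Illustrative Taylor rules for a small open economy (assumed
# coefficients, not the paper's calibration). CPI inflation mixes
# domestic inflation with imported-goods inflation via the import share.

def taylor_rate(inflation, output_gap, r_natural=0.02,
                phi_pi=1.5, phi_y=0.5):
    """i_t = r + phi_pi * pi_t + phi_y * y_t (standard Taylor form)."""
    return r_natural + phi_pi * inflation + phi_y * output_gap

def cpi_inflation(domestic_inflation, imported_inflation, import_share):
    """CPI inflation as a weighted average of domestic and imported inflation."""
    return (1 - import_share) * domestic_inflation + import_share * imported_inflation

pi_home, pi_imported, gap = 0.02, 0.05, -0.01
rate_domestic = taylor_rate(pi_home, gap)
rate_cpi = taylor_rate(cpi_inflation(pi_home, pi_imported, 0.4), gap)
print(rate_domestic, rate_cpi)  # the CPI rule reacts to imported inflation too
```

With imported inflation above domestic inflation, the CPI-based rule prescribes a higher policy rate than the domestic-inflation rule for the same output gap.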

Relevance: 100.00%

Publisher:

Abstract:

Executive Summary: The unifying theme of this thesis is the pursuit of a satisfactory way to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we apply an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model addressing some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings.
The empirical investigation to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than realized returns from portfolio strategies optimal with respect to a single performance measure. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first-order stochastically dominate those resulting from optimization with respect to only a single measure such as the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e., the sequence of expected shortfalls for a range of quantiles.
Since the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were led to conclude that the algorithm we propose leads to a portfolio return distribution that second-order stochastically dominates those of virtually all performance measures considered. Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member-country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests the presence of an inherent weakness in the attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results on the bounds of the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
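The second-order stochastic dominance check via pointwise comparison of absolute Lorenz curves can be sketched as follows, on made-up return samples; the curve is built from running sums of returns sorted from worst to best, each partial sum proportional to an expected shortfall:

```python
# Pointwise comparison of absolute Lorenz curves (cumulative sums of
# sorted returns) as a check for second-order stochastic dominance.
# The sample returns are made up for illustration.
import itertools

def absolute_lorenz(returns):
    """Running sums of returns sorted from worst to best; each partial
    sum is proportional to an expected shortfall at that quantile."""
    return list(itertools.accumulate(sorted(returns)))

def ssd_dominates(a, b):
    """True if the Lorenz curve of `a` lies (weakly) above that of `b`
    at every quantile, the pointwise condition checked in the text."""
    return all(x >= y for x, y in zip(absolute_lorenz(a), absolute_lorenz(b)))

aggregated = [-0.01, 0.02, 0.03, 0.01]   # hypothetical aggregated-measure returns
single     = [-0.04, 0.05, 0.02, 0.01]   # hypothetical single-measure returns
print(ssd_dominates(aggregated, single))  # True
print(ssd_dominates(single, aggregated))  # False
```

Equal sample sizes are assumed here; with unequal sizes the curves would first be interpolated onto a common grid of quantiles.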

Relevance: 100.00%

Publisher:

Abstract:

This paper describes L. A. Zadeh's theory of fuzzy sets (background, characteristics, and implications) and the areas in which fuzziness has been applied in psychology and social psychology (developmental processes, stimulus processing, information perception, prototypes, and other applications). On this basis, we suggest how fuzziness could be useful in the study of social interaction, assuming the simultaneously vague and precise character of reality, and the use of concepts such as the notion of the self from a complex perspective that considers, from a pluralist standpoint, diverse theoretical and methodological positions.
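Since the surveyed framework is Zadeh's fuzzy set theory, a minimal illustration of graded membership and a fuzzy implication may help; the "tall" membership function and the choice of the Łukasiewicz implication are assumptions for illustration only:

```python
# Minimal fuzzy-set operations in Zadeh's sense, plus the Lukasiewicz
# implication often studied in work on fuzzy implications. The "tall"
# membership function is an illustrative assumption.

def tall(height_cm):
    """Graded membership in the fuzzy set 'tall': 0 below 160 cm,
    1 above 190 cm, linear in between."""
    return min(1.0, max(0.0, (height_cm - 160) / 30))

def lukasiewicz_implication(a, b):
    """I(a, b) = min(1, 1 - a + b)."""
    return min(1.0, 1.0 - a + b)

print(tall(175))                           # 0.5: a borderline case
print(lukasiewicz_implication(0.5, 0.25))  # 0.75
```

Unlike classical sets, membership here is a degree in [0, 1], which is exactly the "simultaneously vague and precise" character the abstract invokes.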

Relevance: 100.00%

Publisher:

Abstract:

Existing digital rights management (DRM) systems, initiatives like Creative Commons, and research works such as some digital rights ontologies provide limited support for modelling and managing content value chains. This is becoming a critical issue as content markets start to profit from the possibilities of digital networks and the World Wide Web. The objective is to support the whole copyrighted content value chain across enterprise or business niche boundaries. Our proposal provides a framework that accommodates copyright law and a rich creation model in order to cope with all the stages of the creation life cycle. The dynamic aspects of value chains are modelled using a hybrid approach that combines ontology-based and rule-based mechanisms. The ontology implementation is based on the Web Ontology Language with Description Logic (OWL-DL); reasoners are directly used for license checking. For the more complex aspects of the dynamics of content value chains, rule languages are the choice.
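The rule-based side of license checking can be sketched as follows. The vocabulary of actions, parties, and grants below is invented for illustration and does not reflect the actual OWL-DL ontology or rule language the abstract describes:

```python
# A toy rule-based license check over a content value chain. The
# vocabulary (identifiers, actions, roles) is invented for illustration,
# not taken from the ontology described in the abstract.

LICENSES = {
    "track-001": {"licensee": "radio-x", "actions": {"play", "communicate"}},
    "track-002": {"licensee": "radio-x", "actions": {"play"}},
}

def permitted(content_id, party, action):
    """Grant rule: the action is allowed if some license names this
    party as licensee and includes the action."""
    lic = LICENSES.get(content_id)
    return bool(lic and lic["licensee"] == party and action in lic["actions"])

print(permitted("track-001", "radio-x", "communicate"))  # True
print(permitted("track-002", "radio-x", "communicate"))  # False: not granted
```

In the framework described, such checks would be answered by a DL reasoner over the ontology for the simple cases, with a rule engine covering chained grants along the value chain.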

Relevance: 100.00%

Publisher:

Abstract:

This paper deals with a phenomenologically motivated magneto-viscoelastic coupled finite strain framework for simulating the curing process of polymers under the application of a coupled magneto-mechanical load. Magneto-sensitive polymers are prepared by mixing micron-sized ferromagnetic particles into uncured polymers. Application of a magnetic field during the curing process causes the particles to align and form chain-like structures, lending an overall anisotropy to the material. Polymer curing is a complex viscoelastic process in which a transformation from fluid to solid occurs over the course of time. During curing, volume shrinkage also occurs due to the packing of polymer chains by chemical reactions. Such reactions impart a continuous change of magneto-mechanical properties that can be modelled by an appropriate constitutive relation in which the temporal evolution of material parameters is considered. To model the shrinkage during curing, a magnetic-induction-dependent approach is proposed which is based on a multiplicative decomposition of the deformation gradient into a mechanical and a magnetic-induction-dependent volume shrinkage part. The proposed model obeys the relevant laws of thermodynamics. Numerical examples, based on a generalised Mooney-Rivlin energy function, are presented to demonstrate the model's capacity in the case of a magneto-viscoelastically coupled load.
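The multiplicative decomposition of the deformation gradient into mechanical and shrinkage parts can be sketched numerically. The matrices below are illustrative, and the check only verifies the kinematic fact that the volume change factors multiplicatively (det F = det F_mech * det F_shrink), not the paper's constitutive model:

```python
# Multiplicative decomposition of the deformation gradient,
# F = F_mech @ F_shrink, with an isotropic shrinkage part -- a sketch of
# the kinematic idea only; the matrices are illustrative.

def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def matmul3(x, y):
    """Product of two 3x3 matrices."""
    return [[sum(x[i][k] * y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Isotropic curing shrinkage to 90% of the original volume:
# F_s = s * I with s = 0.9 ** (1/3), so det(F_s) = 0.9.
s = 0.9 ** (1.0 / 3.0)
F_shrink = [[s, 0, 0], [0, s, 0], [0, 0, s]]
F_mech = [[1.1, 0.05, 0], [0, 1.0, 0], [0, 0, 0.95]]  # illustrative mechanical part

F = matmul3(F_mech, F_shrink)
# Volume change factors multiplicatively: J = J_mech * J_shrink.
print(abs(det3(F) - det3(F_mech) * det3(F_shrink)) < 1e-9)  # True
```

In the paper's setting the shrinkage part is additionally magnetic-induction-dependent; here it is held isotropic and constant purely to keep the decomposition visible.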

Relevance: 100.00%

Publisher:

Abstract:

Interactions between stimuli's acoustic features and experience-based internal models of the environment enable listeners to compensate for the disruptions in auditory streams that are regularly encountered in noisy environments. However, whether auditory gaps are filled in predictively or restored a posteriori remains unclear. The current lack of positive statistical evidence that internal models can actually shape brain activity as real sounds would precludes accepting predictive accounts of the filling-in phenomenon. We investigated the neurophysiological effects of internal models by testing whether single-trial electrophysiological responses to omitted sounds in a rule-based sequence of tones with varying pitch could be decoded from the responses to real sounds and by analyzing the ERPs to the omissions with data-driven electrical neuroimaging methods. The decoding of the brain responses to different expected, but omitted, tones in both passive and active listening conditions was above chance based on the responses to the real sound in active listening conditions. Topographic ERP analyses and electrical source estimations revealed that, in the absence of any stimulation, experience-based internal models elicit an electrophysiological activity different from noise and that the temporal dynamics of this activity depend on attention. We further found that the expected change in pitch direction of omitted tones modulated the activity of left posterior temporal areas 140-200 msec after the onset of omissions. Collectively, our results indicate that, even in the absence of any stimulation, internal models modulate brain activity as do real sounds, indicating that auditory filling-in can be accounted for by predictive activity.

Relevance: 100.00%

Publisher:

Abstract:

Behavior-based navigation of autonomous vehicles requires the recognition of navigable areas and potential obstacles. In this paper we describe a model-based object recognition system which is part of an image interpretation system intended to assist the navigation of autonomous vehicles that operate in industrial environments. The recognition system integrates color, shape and texture information together with the location of the vanishing point. The recognition process starts from some prior scene knowledge, that is, a generic model of the expected scene and the potential objects. The recognition system constitutes an approach where different low-level vision techniques extract a multitude of image descriptors which are then analyzed using a rule-based reasoning system to interpret the image content. This system has been implemented using a rule-based cooperative expert system.

Relevance: 100.00%

Publisher:

Abstract:

We describe a model-based object recognition system which is part of an image interpretation system intended to assist autonomous vehicle navigation. The system is intended to operate in man-made environments. Behavior-based navigation of autonomous vehicles involves the recognition of navigable areas and potential obstacles. The recognition system integrates color, shape and texture information together with the location of the vanishing point. The recognition process starts from some prior scene knowledge, that is, a generic model of the expected scene and the potential objects. The recognition system constitutes an approach where different low-level vision techniques extract a multitude of image descriptors which are then analyzed using a rule-based reasoning system to interpret the image content. This system has been implemented using CEES, the C++ embedded expert system shell developed in the Systems Engineering and Automatic Control Laboratory (University of Girona) as a specific rule-based problem-solving tool. It has been especially conceived for supporting cooperative expert systems, and uses the object-oriented programming paradigm.
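The rule-based interpretation stage described above takes low-level image descriptors in and produces object labels. A toy version of that idea follows; the descriptors and rules are invented for illustration, whereas the actual system uses the CEES expert-system shell:

```python
# Toy version of a rule-based image-interpretation stage: low-level
# region descriptors (color, texture, position) go in, object labels
# come out. Descriptors and rules are invented for illustration.

def classify_region(region):
    """Apply simple hand-written rules to a region-descriptor dict."""
    if (region["color"] == "gray" and region["texture"] == "smooth"
            and region["below_vanishing_point"]):
        return "navigable road"
    if region["color"] == "yellow" and region["texture"] == "smooth":
        return "lane marking"
    if region["height"] > 0.5:
        return "potential obstacle"
    return "unknown"

regions = [
    {"color": "gray", "texture": "smooth", "below_vanishing_point": True, "height": 0.0},
    {"color": "red", "texture": "rough", "below_vanishing_point": False, "height": 1.2},
]
print([classify_region(r) for r in regions])  # ['navigable road', 'potential obstacle']
```

An expert-system shell generalizes this if/then cascade: rules are declared separately from the inference engine, so cooperating rule sets can fire on the same descriptor facts.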

Relevance: 100.00%

Publisher:

Abstract:

Language acquisition is a complex process that requires the synergic involvement of different cognitive functions, which include extracting and storing the words of the language and their embedded rules for progressive acquisition of grammatical information. As has been shown in other fields that study learning processes, synchronization mechanisms between neuronal assemblies might have a key role during language learning. In particular, studying these dynamics may help uncover whether different oscillatory patterns sustain more item-based learning of words and rule-based learning from speech input. Therefore, we tracked the modulation of oscillatory neural activity during the initial exposure to an artificial language, which contained embedded rules. We analyzed both spectral power variations, as a measure of local neuronal ensemble synchronization, as well as phase coherence patterns, as an index of the long-range coordination of these local groups of neurons. Synchronized activity in the gamma band (20-40 Hz), previously reported to be related to the engagement of selective attention, showed a clear dissociation of local power and phase coherence between distant regions. In this frequency range, local synchrony characterized the subjects who were focused on word identification and was accompanied by increased coherence in the theta band (4-8 Hz). Only those subjects who were able to learn the embedded rules showed increased gamma band phase coherence between frontal, temporal, and parietal regions.
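Phase coherence between distant regions is typically quantified with an index such as the phase-locking value (PLV); the study's exact pipeline is not specified here, so the following is a generic sketch on synthetic phase data:

```python
# Phase-locking value (PLV) between two channels across trials: the
# magnitude of the average phase-difference phasor. Synthetic phases
# stand in for band-filtered EEG -- an illustration of the coherence
# index, not the study's actual pipeline.
import cmath
import math

def plv(phases_a, phases_b):
    """PLV = | mean over trials of exp(i * (phi_a - phi_b)) |."""
    total = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(total) / len(phases_a)

# Perfectly locked: a constant phase lag across trials -> PLV = 1.
locked_a = [0.1, 1.2, 2.5, 4.0]
locked_b = [p - 0.7 for p in locked_a]
# Unlocked: phase differences spread evenly around the circle -> PLV = 0.
random_a = [0.0, 0.0, 0.0, 0.0]
random_b = [0.0, math.pi / 2, math.pi, 3 * math.pi / 2]

print(round(plv(locked_a, locked_b), 3))  # 1.0
print(round(plv(random_a, random_b), 3))  # 0.0
```

A PLV near 1 for electrode pairs over frontal, temporal, and parietal sites is what "increased gamma band phase coherence" amounts to operationally.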

Relevance: 100.00%

Publisher:

Abstract:

Entering a new industry brings companies many kinds of challenges. The aim of this thesis is to build an easy-to-use performance measurement system for the management of SOL Palvelut Oy's new security screening services, both to map the current situation and to support decision making. The measurement system is built with Sake, an Excel-based software package developed at Lappeenranta University of Technology. In addition, the thesis offers, on the basis of the measurement results, theory, and the author's own experience, several development proposals for improving service delivery. That experience base comes from completing the basic security screener course and from working as a security screener. The theoretical sections cover service quality management, communication, motivation, and performance measurement; the topics were determined by the needs of building the measurement system, the measurement results, and the development areas identified. In addition to theory, the thesis examines the factors underlying the provision of security screening services, such as legislation, aviation regulations, and competition. The empirical part consists of the stages of building the measurement system and of processing and exploiting the actual measurement results. The goals set for the thesis were met well, and a performance measurement system on a sound footing was built for the company's management. Based on the current values, the company's present state could be mapped, and development proposals for making operations more efficient could be drawn up. Finance, training, and personnel emerged as particular development areas. On the basis of these results and the author's own work experience, the thesis discusses development proposals concerning, among other things, communication, rewarding, service supervisors, the profitability of training, and work shifts.

Relevance: 100.00%

Publisher:

Abstract:

Physical activity (PA) is an important field of healthcare research internationally and within Finland. As technology devices and services penetrate deeper into society, studying their usefulness for PA becomes vital. We started this research by reviewing literature consisting of two hundred research articles, all of which found technology to significantly improve an individual's motivation and ability to achieve officially recommended levels of physical activity, such as 10,000 steps a day tracked with the help of pedometers. Physical activity recommendations require sustained encouragement and consistent performance in order to achieve long-term benefits. We surveyed, within the city of Turku, how motivation levels and thirty-three other criteria encompassing technology awareness, adoption, and usage attitudes are impacted. Our aim was to identify the factors responsible for achieving consistent growth in activity levels within individuals and focus groups, as well as to determine the causes of failures and to collect user-experience feedback. The survey results were quite interesting and contain valuable information for this field. While the focus groups confirmed the theory established by past studies in our literature review, they also support our research proposition that ICT tools and services have provided, and can further add, higher benefits and value for individuals in tracking and maintaining their activity levels consistently over longer durations. This thesis includes two new models that describe technology and physical activity adoption patterns based on four easy-to-evaluate criteria, thereby helping healthcare providers to recommend improvements and address issues with a simple rule-based approach. This research work provides vital clues on technology-based healthcare objectives and the achievement of standard PA recommendations by people within Turku and nearby regions.

Relevance: 100.00%

Publisher:

Abstract:

In the field of molecular biology, scientists adopted for decades a reductionist perspective in their inquiries, being predominantly concerned with the intricate mechanistic details of subcellular regulatory systems. Integrative thinking had nevertheless been applied at a smaller scale in molecular biology, to understand the underlying processes of cellular behaviour, for at least half a century. It was not until the genomic revolution at the end of the previous century that model building was required to account for systemic properties of cellular activity. Our system-level understanding of cellular function is to this day hindered by drastic limitations in our capability to predict cellular behaviour in a way that reflects system dynamics and system structure. To this end, systems biology aims for a system-level understanding of functional intra- and inter-cellular activity. Modern biology brings about a high volume of data, whose comprehension cannot even be attempted in the absence of computational support. Computational modelling hence bridges modern biology to computer science, enabling a number of capabilities that prove invaluable in the analysis of complex biological systems, such as a rigorous characterization of the system structure, simulation techniques, and perturbation analysis. Computational biomodels have grown considerably in size in recent years, with major contributions being made towards the simulation and analysis of large-scale models, starting with signalling pathways and culminating with whole-cell models, tissue-level models, organ models and full-scale patient models. The simulation and analysis of models of such complexity very often requires, in fact, the integration of various sub-models, entwined at different levels of resolution and whose organization spans several levels of hierarchy. This thesis revolves around the concept of quantitative model refinement in relation to the process of model building in computational systems biology.
The thesis proposes a sound computational framework for the stepwise augmentation of a biomodel. One starts with an abstract, high-level representation of a biological phenomenon, which is materialised into an initial model that is validated against a set of existing data. Subsequently, the model is refined to include more details regarding its species and/or reactions. The framework is employed in the development of two models, one for the heat shock response in eukaryotes and the second for the ErbB signalling pathway. The thesis spans several inherently quantitative formalisms used in computational systems biology (reaction-network models, rule-based models and Petri net models), as well as a recent, intrinsically qualitative formalism: reaction systems. The choice of modelling formalism is, however, determined by the nature of the question the modeller aims to answer. Quantitative model refinement turns out to be not only essential in the model development cycle, but also beneficial for the compilation of large-scale models, whose development requires the integration of several sub-models across various levels of resolution and underlying formal representations.
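The idea of quantitative model refinement, adding detail while preserving the validated aggregate behaviour, can be sketched on a toy reaction-network model; the species names, rate constant, and Euler step are illustrative assumptions, not the thesis's actual models:

```python
# Quantitative model refinement sketch: a species A degrading at rate k
# is refined into two variants A1, A2 carrying the same rate constant,
# so the aggregate A1 + A2 reproduces the original model. One explicit
# Euler step of first-order mass-action decay; numbers are illustrative.

def step(concentrations, rate, dt):
    """One explicit Euler step of first-order decay for each species."""
    return {s: c - rate * c * dt for s, c in concentrations.items()}

k, dt = 0.3, 0.01
original = step({"A": 10.0}, k, dt)
refined = step({"A1": 6.0, "A2": 4.0}, k, dt)  # a refined split of A

# The refinement is quantitative: the aggregate matches the original.
print(abs((refined["A1"] + refined["A2"]) - original["A"]) < 1e-9)  # True
```

A genuine refinement step would then let the variants' rate constants diverge as new data demand, re-fitting only the added parameters while the aggregate fit is retained as a constraint.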

Relevance: 100.00%

Publisher:

Abstract:

In this thesis, I examined the relevance of dual-process theory to understanding forgiveness. Specifically, I argued that the internal conflict experienced by laypersons when forgiving (or finding themselves unable to forgive) and the discrepancies between existing definitions of forgiveness can currently be best understood through the lens of dual-process theory. Dual-process theory holds that individuals engage in two broad forms of mental processing corresponding to two systems, here referred to as System 1 and System 2. System 1 processing is automatic, unconscious, and operates through learned associations and heuristics. System 2 processing is effortful, conscious, and operates through rule-based and hypothetical thinking. Different definitions of forgiveness amongst both lay persons and scholars may reflect different processes within each system. Further, lay experiences with internal conflict concerning forgiveness may frequently result from processes within each system leading to different cognitive, affective, and behavioural responses. The study conducted for this thesis tested the hypotheses that processing within System 1 can directly affect one's likelihood to forgive, and that this effect is moderated by System 2 processing. I used subliminal conditioning to manipulate System 1 processing by creating positive or negative conditioned attitudes towards a hypothetical transgressor. I used working memory load (WML) to inhibit System 2 processing amongst half of the participants. The conditioning phase of the study failed and so no conclusions could be drawn regarding the roles of System 1 and System 2 in forgiveness. The implications of dual-process theory for forgiveness research and clinical practice, and directions for future research are discussed.