912 results for Compositional Rule of Inference
Abstract:
The PhD thesis CONTRIBUCIÓN AL ESTUDIO DE DOS CONCEPTOS BÁSICOS DE LA LÓGICA FUZZY contributes new results on two basic concepts of fuzzy logic: inference mechanisms and the representation of vague predicates. The memoir is divided into two parts corresponding to these two topics. Part I studies the basic concept of a "fuzzy logic state": a fixed point of the mapping generated by the rule of inference known as the generalized modus ponens. Moreover, a fuzzy preordering can be represented by the elementary preorderings generated by the set of its fuzzy logic states. Chapter 1 characterizes when two logic states give rise to the same elementary preordering and obtains a representative of the class of all logic states generating that preordering; the chapter closes with a characterization of the set of fuzzy logic states of an elementary preordering. In Chapter 2, a trapezoidal fuzzy subset is obtained as a class of an indistinguishability relation. Finally, Chapter 3 studies two types of classical logic states: the irreducible and the minimal ones. Chapter 4, which opens Part II, addresses the problem of obtaining the compatibility function of a vague predicate. When the use of the predicate is known through a set of rules and certain distinguished elements, a method is proposed that yields a general expression for the membership function of a fuzzy subset realizing the extension of the vague predicate; in certain cases the method also defines a set of multivalued connectives associated with the predicate. The last chapter studies the representation of antonyms and synonyms in fuzzy logic through automorphisms. The automorphisms of the unit interval [0,1] endowed with two operations, an Archimedean t-norm and an Archimedean t-conorm, are characterized: it is shown that the automorphism group reduces to the identity function.
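To make the fixed-point definition concrete, here is a minimal sketch (mine, not the thesis's) of the generalized modus ponens as a sup-min compositional rule of inference on a finite universe; the relation R and the candidate state mu are illustrative values, with R a reflexive, min-transitive fuzzy relation and mu one of its rows, hence a fixed point:

```python
import numpy as np

def generalized_modus_ponens(premise, relation):
    # Compositional rule of inference with sup-min composition:
    # conclusion(y) = max_x min(premise(x), relation(x, y))
    return np.max(np.minimum(premise[:, None], relation), axis=0)

# Toy 4-point universe; R is a reflexive, min-transitive fuzzy relation
# (a fuzzy preordering), so each of its rows is a fixed point.
R = np.array([[1.0, 0.7, 0.6, 0.5],
              [0.0, 1.0, 0.6, 0.5],
              [0.0, 0.0, 1.0, 0.5],
              [0.0, 0.0, 0.0, 1.0]])

mu = R[0].copy()                       # candidate fuzzy logic state
image = generalized_modus_ponens(mu, R)
print(image, np.allclose(image, mu))   # fixed point <=> fuzzy logic state
```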
Abstract:
This paper uses a structural approach based on the indirect inference principle to estimate a standard version of the New Keynesian monetary (NKM) model augmented with term structure, using both revised and real-time data. The estimation results show that the term spread and policy inertia are both important determinants of the estimated U.S. monetary policy rule, whereas the persistence of shocks plays a small but significant role when revised and real-time data on output and inflation are both considered. More importantly, the relative importance of the term spread and of persistent shocks in the policy rule, as well as the shock transmission mechanism, changes drastically once it is taken into account that real-time data are not well behaved.
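The indirect inference principle invoked here can be sketched generically: fit an auxiliary model to observed data, then search for structural parameters whose simulated data reproduce the auxiliary estimates. In the sketch below, `simulate` is a placeholder for a routine that solves and simulates the NKM model (not provided in the abstract), and a one-lag VAR serves as the auxiliary model:

```python
import numpy as np
from scipy.optimize import minimize

def auxiliary_stats(data):
    # Auxiliary model: OLS coefficients of a one-lag VAR on the
    # observables (e.g. output, inflation, interest rate); data is T x k.
    Y, X = data[1:], data[:-1]
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return beta.ravel()

def indirect_inference(observed, simulate, theta0, n_sim=10):
    # Choose structural parameters theta whose simulated series best
    # reproduce the auxiliary statistics estimated on observed data.
    target = auxiliary_stats(observed)
    def loss(theta):
        sims = [auxiliary_stats(simulate(theta, seed=s)) for s in range(n_sim)]
        return float(np.sum((np.mean(sims, axis=0) - target) ** 2))
    return minimize(loss, theta0, method="Nelder-Mead")
```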
Abstract:
A procedure is given for recognizing sets of inference rules that generate polynomial time decidable inference relations. The procedure can automatically recognize the tractability of the inference rules underlying congruence closure. The recognition of tractability for that particular rule set constitutes mechanical verification of a theorem originally proved independently by Kozen and Shostak. The procedure is algorithmic, rather than heuristic, and the class of automatically recognizable tractable rule sets can be precisely characterized. A series of examples of rule sets whose tractability is non-trivial, yet machine recognizable, is also given. The technical framework developed here is viewed as a first step toward a general theory of tractable inference relations.
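For a feel of what a polynomial-time decidable inference relation looks like operationally, here is a naive bottom-up closure over ground facts, with reflexivity and transitivity of equality as a toy rule set; the encoding is my own illustration, not the paper's formalism:

```python
def closure(facts, rules):
    # Forward chaining to a fixed point: for tractable ("local") rule
    # sets, derived facts only mention terms already present, so the
    # closure has polynomially many facts and is reached in polynomial time.
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for rule in rules:
            for fact in rule(facts):
                if fact not in facts:
                    facts.add(fact)
                    changed = True
    return facts

terms  = lambda fs: {t for (_, a, b) in fs for t in (a, b)}
reflex = lambda fs: {("=", t, t) for t in terms(fs)}
trans  = lambda fs: {("=", a, c)
                     for (_, a, b) in fs for (_, b2, c) in fs if b == b2}

print(closure({("=", "x", "y"), ("=", "y", "z")}, [reflex, trans]))
```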
Abstract:
Effective network overload alleviation is essential for maintaining security and integrity from the operational viewpoint of deregulated power systems. This paper develops a methodology to reschedule active power generation from the sources in order to manage network congestion under normal and contingency conditions. An effective method is proposed using a fuzzy rule-based inference system. The virtual flows concept, which provides the partial contributions and counter-flows in the network elements, is used as the basis of the proposed method to relieve network congestion to the extent possible. The proposed method is illustrated on a sample 6-bus test system and on a modified IEEE 39-bus system.
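The abstract does not spell out the rule base or membership functions; the following Mamdani-style sketch, with invented linguistic terms and output levels, only illustrates how a fuzzy rule-based inference system could map a line's loading and a generator's virtual-flow contribution to a rescheduling fraction:

```python
import numpy as np

def trimf(x, a, b, c):
    # Triangular membership function on [a, c] peaking at b.
    return float(np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0))

def reschedule_fraction(loading_pct, contribution_pct):
    # The more overloaded the line and the larger a generator's
    # virtual-flow contribution to it, the larger the suggested cut.
    overload = {"mild":   trimf(loading_pct, 80, 90, 100),
                "severe": trimf(loading_pct, 90, 110, 130)}
    contrib  = {"small":  trimf(contribution_pct, 0, 10, 60),
                "large":  trimf(contribution_pct, 30, 100, 170)}
    rules = [(min(overload["mild"],   contrib["small"]), 0.05),
             (min(overload["mild"],   contrib["large"]), 0.15),
             (min(overload["severe"], contrib["small"]), 0.20),
             (min(overload["severe"], contrib["large"]), 0.40)]
    w = sum(s for s, _ in rules)
    return sum(s * out for s, out in rules) / w if w else 0.0

print(reschedule_fraction(105, 70))  # fraction of output to reschedule
```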
Abstract:
Thin films of NiTi were deposited by DC magnetron sputtering from an equiatomic alloy target (Ni/Ti: 50/50 at.%). The films were deposited without intentional heating of the substrates, to a thickness of approximately 2 μm. The structure and morphology of NiTi films annealed at different temperatures were analyzed in order to understand the effect of annealing on the physical properties of the films. The composition of fresh and annealed films was also evaluated by energy-dispersive X-ray spectroscopy (EDS) and X-ray photoelectron spectroscopy (XPS). X-ray diffraction (XRD) studies showed that the as-deposited films were amorphous, whereas the annealed films were polycrystalline, with the austenite phase dominant. AFM investigations showed larger grain size and higher surface roughness in the annealed films: grain size increased from 10 to 85 nm and film roughness from 2 to 18 nm. The film composition measured by EDS was 52.5 at.% Ni and 47.5 at.% Ti. XPS investigations demonstrated the presence of Ni on the surface of fresh films, whereas annealed films did not show any nickel. From HR-XPS investigations it can be concluded that annealed NiTi films have a higher tendency than fresh films to form a metal oxide (titanium dioxide) layer on the surface.
Abstract:
Published as an article in Spanish Economic Review, 2008, vol. 10, issue 4, pp. 251-277.
Abstract:
This paper proposes an extended version of the basic New Keynesian monetary (NKM) model which contemplates revision processes of output and inflation data, in order to assess the importance of data revisions for the estimated monetary policy rule parameters and the transmission of policy shocks. Our empirical evidence, based on a structural econometric approach, suggests that although the initial announcements of output and inflation are not rational forecasts of the revised output and inflation data, ignoring the presence of not-well-behaved revision processes may not be a serious drawback in the analysis of monetary policy in this framework. However, the transmission of inflation-push shocks is largely affected by considering data revisions, especially when the nominal stickiness parameter is estimated taking data revision processes into account.
Abstract:
Humans have the arguably unique ability to understand the mental representations of others. For success in both competitive and cooperative interactions, however, this ability must be extended to include representations of others' belief about our intentions, their model about our belief about their intentions, and so on. We developed a "stag hunt" game in which human subjects interacted with a computerized agent using different degrees of sophistication (recursive inferences) and applied an ecologically valid computational model of dynamic belief inference. We show that rostral medial prefrontal (paracingulate) cortex, a brain region consistently identified in psychological tasks requiring mentalizing, has a specific role in encoding the uncertainty of inference about the other's strategy. In contrast, dorsolateral prefrontal cortex encodes the depth of recursion of the strategy being used, an index of executive sophistication. These findings reveal putative computational representations within prefrontal cortex regions, supporting the maintenance of cooperation in complex social decision making.
Abstract:
Much of the contemporary concert (i.e. “classical”) saxophone literature has connections to compositional styles found in other genres like jazz, rock, or pop. Although improvisation exists as a dominant compositional device in jazz, improvisation as a performance technique is not confined to a single genre. This study looks at twelve concert saxophone pieces that are grouped into three primary categories of compositional techniques: 1) those containing unmeasured phrases, 2) those containing limited relation to improvisation but a close relationship to jazz styles, and 3) those containing jazz improvisation. In concert saxophone music, specific crossover pieces use the compositional technique of jazz improvisation. Four examples of such jazz works were composed by Dexter Morrill, Phil Woods, Bill Dobbins, and Ramon Ricker, all of which provide a foundation for this study. In addition, pieces containing varying degrees of unmeasured phrases are highlighted. As this dissertation project is based in performance, the twelve pieces were divided into three recitals that summarize a pedagogical sequence. Any concert saxophonist interested in developing jazz improvisational skills can use the pieces in this study as a method to progress toward the performance of pieces that merge jazz improvisation with the concert format. The three compositional techniques examined here will provide the performer with the necessary material to develop this individualized approach to improvisation. Specific compositional and performance techniques vary depending on the stylistic content: this study examines improvisation in the context of concert saxophone repertoire.
Abstract:
Belief revision characterizes the process of revising an agent's beliefs when receiving new evidence. In the field of artificial intelligence, revision strategies have been extensively studied in the context of logic-based formalisms and probability kinematics. However, so far there is not much literature on this topic in evidence theory. The combination rules proposed so far in the theory of evidence, especially Dempster's rule, are symmetric: they rely on the basic assumption that the pieces of evidence being combined are on a par, i.e. play the same role. When one source of evidence is less reliable than another, it is possible to discount it, and then a symmetric combination operation is still used. In the case of revision, by contrast, the idea is to let the prior knowledge of an agent be altered by some input information; the change problem is thus intrinsically asymmetric. Assuming the input information is reliable, it should be retained, whilst the prior information should be changed minimally to that effect. To deal with this issue, this paper defines the notion of revision for the theory of evidence in a way that brings together the probabilistic and logical views. Several previously proposed revision rules are reviewed, and we advocate one of them as better corresponding to the idea of revision. It is extended to cope with inconsistency between prior and input information. It reduces to Dempster's rule of combination, just as revision in the sense of Alchourrón, Gärdenfors, and Makinson (AGM) reduces to expansion, when the input is strongly consistent with the prior belief function. Properties of this revision rule are also investigated, and it is shown to generalize Jeffrey's rule of updating, Dempster's rule of conditioning, and a form of AGM revision.
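For reference, the symmetric Dempster rule of combination that the paper contrasts with asymmetric revision can be written in a few lines; mass functions are dicts mapping frozenset focal elements to masses, and the two-element frame is an invented example:

```python
from itertools import product

def dempster_combine(m1, m2):
    # Conjunctive combination of two mass functions, renormalized by the
    # conflict mass that falls on the empty set.
    combined, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        C = A & B
        if C:
            combined[C] = combined.get(C, 0.0) + a * b
        else:
            conflict += a * b
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {C: v / (1.0 - conflict) for C, v in combined.items()}

frame   = frozenset({"rain", "sun"})
m_prior = {frozenset({"rain"}): 0.5, frame: 0.5}  # prior belief
m_input = {frozenset({"sun"}): 0.8, frame: 0.2}   # input evidence
print(dempster_combine(m_prior, m_input))
```

Note that the operation is symmetric in m_prior and m_input, which is exactly the property a revision rule must give up.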
Abstract:
The objective of this study is to provide an alternative modeling approach, an artificial neural network (ANN) model, to predict the compositional viscosity of binary mixtures of room-temperature ionic liquids (ILs) [Cn-mim][NTf2] with n = 4, 6, 8, 10 in methanol and ethanol, over the entire range of molar fraction and a broad range of temperatures, T = 293.0-328.0 K. The results show that the proposed ANN model predicts compositional viscosity successfully, with highly improved accuracy, and also shows its potential to be utilized extensively to predict compositional viscosity over a wide range of temperatures and for more complex compositions, i.e., with more complex intermolecular interactions between components, for which it would be hard or impossible to establish an analytical model.
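As a rough illustration of the approach (the architecture, features, and data below are placeholders, not the paper's), a small feed-forward network can be fitted to (temperature, mole fraction, alkyl chain length) inputs:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Features: temperature (K), IL mole fraction, chain length n;
# target: viscosity (mPa*s).  Values are illustrative only.
X = np.array([[293.0, 0.10, 4], [303.0, 0.25, 6],
              [313.0, 0.50, 8], [328.0, 0.75, 10]])
y = np.array([12.0, 18.5, 30.2, 41.7])

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,),
                                   max_iter=5000, random_state=0))
model.fit(X, y)
print(model.predict([[300.0, 0.30, 6]]))  # predicted viscosity
```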
Abstract:
As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent (essential zeros) or because it is below the detection limit (rounded zeros). Because the second kind of zero is usually understood as "a trace too small to measure", it seems reasonable to replace it by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts (and thus the metric properties) should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is "natural" in the sense that it recovers the "true" composition if the replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved. As a generalization of the multiplicative replacement, the same paper introduces a substitution method for missing values in compositional data sets.
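A compact sketch of the multiplicative replacement described above, assuming compositions closed to a constant sum and rounded zeros only; in practice the delta values would be derived from the detection limits:

```python
import numpy as np

def multiplicative_replacement(x, delta):
    # Each zero part receives its small value delta_j; non-zero parts are
    # scaled multiplicatively so the composition keeps its total sum.
    # Ratios among non-zero parts are untouched, so the covariance
    # structure of zero-free subcompositions is preserved.
    x = np.asarray(x, dtype=float)
    delta = np.broadcast_to(delta, x.shape)
    zeros = x == 0
    return np.where(zeros, delta, x * (1 - delta[zeros].sum() / x.sum()))

comp = np.array([0.60, 0.25, 0.15, 0.00])   # closed to 1, one rounded zero
print(multiplicative_replacement(comp, 0.005))
# -> [0.597, 0.24875, 0.14925, 0.005]; ratios 0.60 : 0.25 : 0.15 preserved
```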
Abstract:
Soil aggregation is an index of soil structure measured by mean weight diameter (MWD) or by scaling factors often interpreted as fragmentation fractal dimensions (D_f). However, the MWD provides a biased estimate of soil aggregation due to spurious correlations among aggregate-size fractions and to scale dependency. The scale-invariant D_f rests on weak assumptions needed to allow particle counts, is sensitive to the selection of the fractal domain, and may frequently exceed a value of 3, implying that D_f is also a biased estimate of aggregation. Aggregation indices based on mass can be computed without bias using compositional analysis techniques. Our objective was to elaborate compositional indices of soil aggregation and to compare them to MWD and D_f using a published dataset describing the effect of 7 cropping systems on aggregation. Six aggregate-size fractions were arranged into a sequence of D−1 balances of building blocks that portray the process of soil aggregation. Isometric log-ratios (ilrs) are scale-invariant, orthogonal log contrasts or balances that possess the Euclidean geometry necessary to compute a distance between any two aggregation states, known as the Aitchison distance (A(x,y)). Close correlations (r > 0.98) were observed between MWD, D_f, and the ilr contrasting large and small aggregate sizes. Several unbiased embedded ilrs can characterize the heterogeneous nature of soil aggregates and be related to soil properties or functions. Soil bulk density and penetration resistance were closely related to A(x,y) with reference to bare fallow. A(x,y) is easy to implement as an unbiased index of soil aggregation using standard sieving methods and may allow comparisons between studies.
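For illustration, one ilr balance and the Aitchison distance can be computed as follows; the two compositions and the grouping of size fractions are invented, not the study's data:

```python
import numpy as np

def ilr_balance(x, num, den):
    # One isometric log-ratio (balance) contrasting two groups of parts:
    # sqrt(r*s/(r+s)) * ln( gmean(x[num]) / gmean(x[den]) )
    gmean = lambda v: np.exp(np.mean(np.log(v)))
    r, s = len(num), len(den)
    return np.sqrt(r * s / (r + s)) * np.log(gmean(x[num]) / gmean(x[den]))

def aitchison_distance(x, y):
    # Euclidean distance between centred log-ratio (clr) transforms.
    clr = lambda v: np.log(v) - np.mean(np.log(v))
    return np.linalg.norm(clr(np.asarray(x, float)) - clr(np.asarray(y, float)))

# Six aggregate-size fractions (mass proportions), largest to smallest.
cropped = np.array([0.30, 0.25, 0.20, 0.12, 0.08, 0.05])
fallow  = np.array([0.10, 0.15, 0.20, 0.25, 0.18, 0.12])

print(ilr_balance(cropped, [0, 1, 2], [3, 4, 5]))  # large vs. small sizes
print(aitchison_distance(cropped, fallow))         # distance to bare fallow
```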
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Assessment of changes in surface ocean conditions, in particular sea-surface temperature (SST), is essential to understand long-term changes in climate, especially in regions where continental climate is strongly influenced by oceanographic processes. To evaluate changes in SST in the northeast Pacific, we have analyzed long-chain alkenones of prymnesiophyte origin at 38 depths in a piston core and associated trigger core collected beneath the contemporary core of the California Current System at 42°N, ~270 km off the coast of Oregon/California. The samples span 30,000 years of deposition at this location. Unsaturation patterns (UK′37) in the alkenone series display a statistically significant difference (p << 0.001) between interglacial (0.44 ± 0.02, n = 11) and glacial (0.29 ± 0.04, n = 20) intervals of the cores. Detailed examination of other compositional features of the C37, C38, and C39 alkenone series and of a related C36 alkenoate series measured downcore suggests that the published UK′37-temperature calibration (UK′37 = 0.034 × T + 0.039), defined for cultures of a strain of Emiliania huxleyi isolated from the subarctic Pacific, provides the best estimates of winter SST at our study site. This inference is purely statistical and does not imply, however, that the phytoplankton source of these biomarkers is most productive in winter or at the ocean surface. The UK′37 temperature record implies that (1) an ~4°C shift occurred in winter SST, from ~7.5 ± 1.1°C at the last glacial maximum to ~11.7 ± 0.7°C in the present interglacial period, and (2) this warming trend was confined to the interval 14-10 ka within the glacial-to-interglacial transition. These conclusions are corroborated entirely by results from an independent SST transformation of radiolarian species-assemblage data obtained from the same core materials.
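Inverting the quoted calibration reproduces the SST estimates given in the abstract (a quick arithmetic check, not part of the original):

```python
def sst_from_uk37(uk37, slope=0.034, intercept=0.039):
    # Solve UK'37 = slope * T + intercept for temperature T (deg C).
    return (uk37 - intercept) / slope

print(sst_from_uk37(0.44))  # interglacial mean -> ~11.8 C (abstract: 11.7 +/- 0.7)
print(sst_from_uk37(0.29))  # glacial mean      -> ~7.4 C  (abstract:  7.5 +/- 1.1)
```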