890 results for Theory of legitimate basis


Relevance:

100.00%

Publisher:

Abstract:

This work deals with the theory of Relativity and its diffusion in Italy in the first decades of the 20th century. Few scientists at Italian universities were actively engaged in understanding Relativity, but two of them, Max Abraham and Tullio Levi-Civita, left a deep mark. Max Abraham engaged in a substantial debate with Einstein between 1912 and 1914 about the electromagnetic and gravitational aspects of their theories. From 1915, Levi-Civita played a fundamental role in giving Einstein the correct mathematical instruments for the formulation of General Relativity. This work, which does not aim to be a mere historical chronicle of the events, highlights two particular perspectives: on the one hand, the importance of the Abraham-Einstein debate for clarifying the basis of Special Relativity, for observing the rigorous logical structure that emerged from a fragmentary sequence of reasoning, and for understanding Einstein's thinking; on the other hand, the originality of Levi-Civita's approach, quite different from Einstein's, characterized by applying a method typical of General Relativity to Special Relativity as well, and by the attempt to hide the two postulates of Einstein's Special Relativity.

Relevance:

100.00%

Publisher:

Abstract:

Researchers suggest that personalization on the Semantic Web will eventually add up to a Web 3.0. In this Web, personalized agents, rather than humans, process and thus generate the biggest share of information. In the sense of emergent semantics, which supplements the traditional formal semantics of the Semantic Web, this is well conceivable. An emergent Semantic Web underlying a fuzzy grassroots ontology can be accomplished by inducing knowledge from users' common parlance in mutual Web 2.0 interactions [1]. These ontologies can also be matched against existing Semantic Web ontologies to create comprehensive top-level ontologies. If augmented with information in the form of restrictions and associated reliability (Z-numbers) [2], this collection of fuzzy ontologies constitutes an important basis for an implementation of Zadeh's restriction-centered theory of reasoning and computation (RRC) [3]. By considering the real world's fuzziness, RRC differs from traditional approaches in that it can handle restrictions described in natural language. A restriction is an answer to a question about the value of a variable, such as the duration of an appointment. In addition to mathematically well-defined answers, RRC can likewise deal with unprecisiated answers such as "about one hour." Inspired by mental functions, it constitutes an important basis for leveraging present-day Web efforts into a natural Web 3.0. Based on natural-language information, RRC may be accomplished with Z-number calculation to achieve personalized Web reasoning and computation. Finally, by understanding natural language, Web agents can react to humans more intuitively and thus generate and process information.
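
As a minimal illustration of the Z-number concept invoked above (our own sketch, not code from the paper): Zadeh defines a Z-number as a pair (A, B), where A is a fuzzy restriction on the value of a variable and B is the reliability of that restriction. The triangular membership functions, the `ZNumber` class, and all parameter values below are hypothetical.

```python
# Illustrative sketch only: a minimal model of a Z-number (A, B), where A is
# a fuzzy restriction on a variable and B is the reliability of that
# restriction. Names and parameters are ours, not the paper's.
from dataclasses import dataclass

def triangular(a: float, b: float, c: float):
    """Return a triangular membership function with support [a, c] and peak b."""
    def mu(x: float) -> float:
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

@dataclass
class ZNumber:
    restriction: callable   # membership function of A, e.g. "about one hour"
    reliability: callable   # membership function of B, e.g. "likely"

# "The appointment lasts about one hour" (minutes), with reliability "likely".
about_one_hour = triangular(45, 60, 75)
likely = triangular(0.6, 0.8, 1.0)
z = ZNumber(about_one_hour, likely)

# Degree to which a 55-minute appointment satisfies the restriction, and the
# degree to which a confidence of 0.85 counts as "likely".
print(z.restriction(55))    # ~0.67
print(z.reliability(0.85))  # ~0.75
```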

Relevance:

100.00%

Publisher:

Abstract:

The classical theory of intermittency developed for return maps assumes a uniform density of points reinjected from the chaotic into the laminar region. Although it works well in some model systems, there exist a number of so-called pathological cases characterized by significant deviations of the main characteristics from the values predicted on the basis of the uniform distribution. Recently, we reported how the reinjection probability density (RPD) can be generalized. Here, we extend this methodology and apply it to different dynamical systems exhibiting anomalous type-II and type-III intermittencies. Estimation of the universal RPD is based on fitting a linear function to experimental data and requires no a priori knowledge of the underlying dynamical model. We provide a special fitting procedure that enables robust estimation of the RPD from relatively short data sets (dozens of points). The method is thus applicable to a wide variety of data sets, including numerical simulations and real-life experiments. The estimated RPD enables analytic evaluation of the length of the laminar phase of intermittent behavior. We show that the method copes well with dynamical systems exhibiting the significantly different statistics reported in the literature. We also derive and classify characteristic relations between the mean laminar length and the main controlling parameter, in perfect agreement with data provided by numerical simulations.
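
As a hedged sketch of the kind of fit the abstract describes (our own construction; the paper's exact procedure may differ): in the RPD methodology published by this group, one computes M(x), the mean of all reinjection points below x, fits a linear function to it, and recovers a power-law RPD phi(x) ~ (x - x_hat)^alpha with alpha = (2m - 1)/(1 - m), where m is the fitted slope.

```python
# Hedged sketch of an RPD estimate via the M(x) function from the
# intermittency literature. If M(x) is linear with slope m, the RPD is
# phi(x) ~ (x - x_hat)^alpha with alpha = (2m - 1)/(1 - m); the intercept
# relates to the lower reinjection bound x_hat. Names are ours.
import numpy as np

def estimate_rpd_exponent(reinjections: np.ndarray):
    x = np.sort(reinjections)
    M = np.cumsum(x) / np.arange(1, len(x) + 1)   # M(x_k): mean of points <= x_k
    m, intercept = np.polyfit(x, M, 1)            # fit a linear function to M(x)
    alpha = (2 * m - 1) / (1 - m)                 # exponent of the power-law RPD
    return m, alpha

# Synthetic test: points drawn from phi(x) ~ x^alpha on [0, 1] with alpha = 1,
# so the slope should come out near (alpha + 1)/(alpha + 2) = 2/3. Note that
# only ~50 points are used, matching the "short data sets" claim.
rng = np.random.default_rng(0)
sample = rng.uniform(0, 1, 50) ** (1 / 2)         # inverse-CDF sampling, alpha = 1
print(estimate_rpd_exponent(sample))
```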

Relevance:

100.00%

Publisher:

Abstract:

The diagrammatic strong-coupling perturbation theory (SCPT) for correlated electron systems is developed for intersite Coulomb interaction and for a nonorthogonal basis set. The construction is based on iterations of exact closed equations for many-electron Green functions (GFs) for Hubbard operators in terms of functional derivatives with respect to external sources. The graphs that do not contain contributions from fluctuations of the local population numbers of the ion states play a special role: a one-to-one correspondence is found between the subset of such graphs for the many-electron GFs and the complete set of Feynman graphs of weak-coupling perturbation theory (WCPT) for single-electron GFs. This fact is used to formulate the approximation of renormalized fermions (ARF), in which the many-electron quasiparticles behave analogously to normal fermions. Then, by analyzing (a) Sham's equation, which connects the self-energy and the exchange-correlation potential in density functional theory (DFT), and (b) the Galitskii-Migdal expressions for the total energy, written within WCPT and within ARF SCPT, we suggest a method to improve the description of systems with correlated electrons within the local density approximation (LDA) to DFT. The formulation in terms of renormalized fermions, RF LDA, is obtained by introducing the spectral weights of the many-electron GFs into the definitions of the charge density, the overlap matrices, and the effective mixing and hopping matrix elements in existing electronic-structure codes, whereas the weights themselves have to be found from an additional set of equations. Compared with the LDA+U and self-interaction correction (SIC) methods, RF LDA has the advantage of taking the transfer of spectral weights into account and, when formulated in terms of GFs, also allows for the consideration of excitations and nonzero temperature. Going beyond the ARF SCPT, as well as RF LDA, by taking into account the fluctuations of ion population numbers would require writing completely new codes for ab initio calculations. The application of RF LDA to ab initio band-structure calculations for rare-earth metals is presented in part II of this study (this issue). © 2005 Wiley Periodicals, Inc.
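
For reference, the Galitskii-Migdal expression mentioned above relates the total energy to the single-electron GF. In one common textbook convention (conventions for factors of 2 pi and spin vary, and this is not necessarily the notation used in the paper):

```latex
% Galitskii--Migdal total energy, standard textbook form (convention-dependent):
\begin{equation}
  E = \frac{1}{2} \sum_{\mathbf{k}\sigma} \int_{-\infty}^{\infty}
      \frac{d\omega}{2\pi}\, \bigl(\omega + \varepsilon_{\mathbf{k}}\bigr)\,
      A_{\sigma}(\mathbf{k},\omega)\, f(\omega),
\end{equation}
% where $A_{\sigma}(\mathbf{k},\omega) = -2\,\mathrm{Im}\,G^{R}_{\sigma}(\mathbf{k},\omega)$
% is the spectral function, $\varepsilon_{\mathbf{k}}$ the bare dispersion, and
% $f(\omega)$ the Fermi function.
```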

Relevance:

100.00%

Publisher:

Abstract:

Several attempts have been made recently to apply Darwinian evolutionary theory to the study of culture change and social history. The essential elements of such a theory are that variations occur in populations and that a process of selective retention operates during their replication and transmission. Locating such variable units in the semantic structure of cognition provides the individual psychological basis for an evolutionary theory of history. Selection operates both on the level of cognition and on its phenotypic expression in action, in relation to individuals' preferred sources of psychological satisfaction. Social power comprises the principal selective force, operating within the unintended consequences of action and through the struggle of individuals and groups in pursuit of opposing interests. The implications for historiography are methodological, in that an evolutionary theory of history sharpens the focus of explanatory situational analysis, and interpretive, in that it provides a paradigmatic metanarrative for the understanding of historical change.

Relevance:

100.00%

Publisher:

Abstract:

We present a mean-field theory of code-division multiple access (CDMA) systems with error-control coding. On the basis of the relation between the free energy and mutual information, we obtain an analytical expression for the maximum spectral efficiency of the coded CDMA system, from which a mean-field description of the coded CDMA system is provided in terms of a bank of scalar Gaussian channels whose variances in general vary across code-symbol positions. Regular low-density parity-check (LDPC)-coded CDMA systems are also discussed as an example of coded CDMA systems.
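
To make the mean-field picture concrete (an illustrative sketch of ours, not the paper's analysis): each code-symbol position t is modeled as a scalar Gaussian channel y_t = x_t + n_t with a position-dependent noise variance, and the per-position mutual information can be estimated by Monte Carlo. The BPSK input, the variance values, and all names below are assumptions made for illustration.

```python
# Illustrative sketch: the equivalent bank of scalar Gaussian channels
# described above, with BPSK inputs and position-dependent noise variance.
# Per-position mutual information is estimated by Monte Carlo.
import numpy as np

def bpsk_mutual_information(sigma2: float, n_samples: int = 200_000) -> float:
    """I(X;Y) in bits for y = x + n, x = +/-1 equiprobable, n ~ N(0, sigma2)."""
    rng = np.random.default_rng(1)
    # By input symmetry it suffices to condition on x = +1.
    y = 1.0 + rng.normal(0.0, np.sqrt(sigma2), n_samples)
    llr = 2.0 * y / sigma2                       # log-likelihood ratio of the channel
    return 1.0 - np.mean(np.log2(1.0 + np.exp(-llr)))

# Noise variances that vary across code-symbol positions, as in the
# mean-field description of the coded system.
for t, sigma2 in enumerate([0.5, 1.0, 2.0]):
    print(f"position {t}: I = {bpsk_mutual_information(sigma2):.3f} bits")
```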

Relevance:

100.00%

Publisher:

Abstract:

Distance teaching (DT) in technical education has a number of distinctive features: complex informative content; the need to develop simulation models and trainers for practical and laboratory classes; knowledge diagnostics based on mathematically grounded algorithms; and the organization of collective applied projects. To develop the teaching process for the fundamental control-systems discipline Theory of Automatic Control (TAC), a combined approach was chosen, optimally combining existing software tools for DT support with in-house developments. The DT system for TAC includes a distance course (DC) on TAC, a virtual laboratory practicum site LAB.TAC, and the remote knowledge-diagnostics system d-tester.

Relevance:

100.00%

Publisher:

Abstract:

In the specific area of software engineering (SE) for self-adaptive systems (SASs) there is growing research awareness of the synergy between SE and artificial intelligence (AI). However, only a few significant results have been published so far. In this paper, we propose a novel and formal Bayesian definition of surprise as the basis for a quantitative analysis measuring degrees of uncertainty and deviation of self-adaptive systems from normal behavior. Surprise measures how observed data affect the models or assumptions of the world during runtime. The key idea is that a "surprising" event can be defined as one that causes a large divergence between the belief distributions prior to and posterior to the event occurring. In such a case the system may decide either to adapt accordingly or to flag that an abnormal situation is happening. In this paper, we discuss possible applications of the Bayesian theory of surprise to the case of self-adaptive systems using Bayesian dynamic decision networks. Copyright © 2014 ACM.
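
A minimal sketch of the surprise measure described above, under the assumption of a Beta-Bernoulli belief model (the model choice and all names are ours; the paper works with Bayesian dynamic decision networks): surprise is computed as the Kullback-Leibler divergence of the posterior belief distribution from the prior.

```python
# Hedged sketch: Bayesian surprise as KL(posterior || prior) for a
# Beta-Bernoulli belief over a monitored binary signal. A run of unexpected
# failures yields a large surprise that an adaptation layer could threshold on.
from scipy.special import betaln, digamma

def kl_beta(a1, b1, a2, b2):
    """KL( Beta(a1,b1) || Beta(a2,b2) ) in nats, in closed form."""
    return (betaln(a2, b2) - betaln(a1, b1)
            + (a1 - a2) * digamma(a1)
            + (b1 - b2) * digamma(b1)
            + (a2 + b2 - a1 - b1) * digamma(a1 + b1))

def surprise(prior_a, prior_b, successes, failures):
    # Conjugate update, then divergence of posterior from prior.
    post_a, post_b = prior_a + successes, prior_b + failures
    return kl_beta(post_a, post_b, prior_a, prior_b)

# Prior belief: the component almost always succeeds (Beta(9, 1)).
print(surprise(9, 1, successes=10, failures=0))   # expected data: low surprise
print(surprise(9, 1, successes=0, failures=10))   # anomalous data: high surprise
```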

Relevance:

100.00%

Publisher:

Abstract:

There have been many functional imaging studies of the brain basis of theory-of-mind (ToM) skills, but the findings are heterogeneous and implicate anatomical regions as far apart as the orbitofrontal cortex and the inferior parietal lobe. The functional imaging studies are reviewed to determine whether the diverse findings are due to methodological factors. The studies are considered according to the paradigm employed (e.g., stories vs. cartoons and explicit vs. implicit ToM instructions), the mental state(s) investigated, and the language demands of the tasks. Methodological variability does not seem to account for the variation in findings, although this conclusion may partly reflect the relatively small number of studies. Alternatively, several distinct brain regions may be activated during ToM reasoning, forming an integrated functional "network." The imaging findings suggest that there are several "core" regions in the network, including parts of the prefrontal cortex and the superior temporal sulcus, while several more "peripheral" regions may contribute to ToM reasoning in a manner contingent on relatively minor aspects of the ToM task. © 2008 Wiley-Liss, Inc.

Relevance:

100.00%

Publisher:

Abstract:

How far can public economics currently provide a basis for political decisions on the supranational centralization of taxation policies? The short answer given here is that although the mainstream of public economics successfully analyses many relevant economic and political factors, it fails at present to provide satisfactory decision criteria for decision-makers. This is because a central role is played in it by a factor that is exogenous to the models and alien to economic theory: the premise concerning the degree of governments' benevolence. The study examines the fiscal federalist theory of tax competition and tries to draw conclusions, on a more general level, about the present state of the economic theory of the public sector and its future development. The way out of the theoretical blind alley could be to link the theory of government operation and decision-making with the theory of desirable economic-policy decisions. The first attempts to do so have been made, but a systematic and comprehensive analysis is still awaited.

Relevance:

100.00%

Publisher:

Abstract:

Audit risk is the risk that the auditor expresses an inappropriate audit opinion when the financial statements are materially misstated. This kind of risk indirectly appears in the operation of credit institutions and financial enterprises when the material misstatement is in the financed entity's audited statements, which serve as a basis for lending decisions, or when the decision to continue financing is made on the basis of credit covenants calculated from the misstated information in those statements. The risks of the audit process reflect the business risks of the auditee, so the assessment of risks, and the planning and performance of the audit based on it, is of key importance. The present study, connecting to No. 4, 2011 of Hitelintézeti Szemle, also discusses the topic of risk and uncertainty, or more precisely a practical aspect of it: the application of belief functions in the field of external audit, without aiming at completeness or textbook-like rigour in building up the theory. While the formalism is virtually unknown in Hungary, empirical studies on the international scene have already pointed out the possible advantages of the method compared with quantitative risk assessments based on the traditional theory of probability. Accordingly, belief functions represent auditors' perception of risk better than probabilities do because, in contrast to the traditional model, they deal with three rather than two states: the existence of supportive evidence, the existence of negative evidence, and the lack of evidence.
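
A minimal sketch of the three-state idea described above, using standard Dempster-Shafer belief functions over the frame {fair, misstated} (our own illustration, not code or notation from the study): mass left on the whole frame represents the lack of evidence, which a single probability number cannot separate from negative evidence.

```python
# Hedged sketch: Dempster-Shafer belief and plausibility over the audit frame
# {fair, misstated}. Mass assigned to the full frame encodes *lack* of
# evidence, distinct from negative evidence against a hypothesis.
FRAME = frozenset({"fair", "misstated"})

def belief(mass: dict, hypothesis: frozenset) -> float:
    """Bel(H) = sum of mass over all non-empty subsets of H."""
    return sum(m for subset, m in mass.items() if subset and subset <= hypothesis)

def plausibility(mass: dict, hypothesis: frozenset) -> float:
    """Pl(H) = sum of mass over all subsets intersecting H."""
    return sum(m for subset, m in mass.items() if subset & hypothesis)

# Supportive evidence 0.6, negative evidence 0.1, unassigned (ignorance) 0.3.
mass = {frozenset({"fair"}): 0.6,
        frozenset({"misstated"}): 0.1,
        FRAME: 0.3}

h = frozenset({"fair"})
print(belief(mass, h), plausibility(mass, h))  # 0.6 0.9: the gap is ignorance
```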

Relevance:

100.00%

Publisher:

Abstract:

Hayek's theory of socio-cultural evolution is a generalization of his theory of spontaneous market order. Hayek explains both the emergence of the market and the social institutions serving as a social basis for that order within the framework of a unified evolutionary logic. This logic interprets the emergence and survival of spontaneous order and of group-level rules of conduct as an unintended consequence of human action. In order to explain the emergence of social norms exclusively on the basis of methodological individualism, one would have to give up an exclusively evolutionary explanation of these norms. Since Hayek applies the invisible-hand explanation to the investigation of social norms, he combines the position of methodological individualism with functionalist-evolutionary arguments in his analysis. Hayek's theory of socio-cultural evolution thus represents a theory in which methodological individualism and functionalism do not crowd out but complement each other.

Relevance:

100.00%

Publisher:

Abstract:

Since the 1950s, the theory of deterministic and nondeterministic finite automata (DFAs and NFAs, respectively) has been a cornerstone of theoretical computer science. In this dissertation, our main object of study is minimal NFAs. In contrast with minimal DFAs, minimal NFAs are computationally challenging: first, there can be more than one minimal NFA recognizing a given language; second, the problem of converting an NFA to a minimal equivalent NFA is NP-hard, even for NFAs over a unary alphabet. Our study is based on the development of two main theories, inductive bases and partials, which in combination form the foundation for an incremental algorithm, ibas, to find minimal NFAs. An inductive basis is a collection of languages with the property that it can generate (through union) each of the left quotients of its elements. We prove a fundamental characterization theorem which says that a language can be recognized by an n-state NFA if and only if it can be generated by an n-element inductive basis. A partial is an incompletely-specified language. We say that an NFA recognizes a partial if its language extends the partial, meaning that the NFA’s behavior is unconstrained on unspecified strings; it follows that a minimal NFA for a partial is also minimal for its language. We therefore direct our attention to minimal NFAs recognizing a given partial. Combining inductive bases and partials, we generalize our characterization theorem, showing that a partial can be recognized by an n-state NFA if and only if it can be generated by an n-element partial inductive basis. We apply our theory to develop and implement ibas, an incremental algorithm that finds minimal partial inductive bases generating a given partial. In the case of unary languages, ibas can often find minimal NFAs of up to 10 states in about an hour of computing time; with brute-force search this would require many trillions of years.
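
A toy, finite-horizon illustration of the inductive-basis definition quoted above (our own sketch; this is not the ibas algorithm, and the truncation scheme and names are ours): languages are approximated by their words up to a fixed length, and we brute-force check whether each left quotient of each basis element is a union of basis elements.

```python
# Hedged sketch: check the inductive-basis property on a finite horizon.
# A collection of languages is an inductive basis if each left quotient of
# each element is a union of elements of the collection. Quotients of words
# up to length HORIZON only see words up to HORIZON - 1, so all comparisons
# are truncated accordingly; this is a toy check, not a minimization method.
from itertools import combinations

HORIZON = 6
SIGMA = "a"  # unary alphabet, as in the dissertation's experiments

def words(max_len):
    return {SIGMA * n for n in range(max_len + 1)}

def quotient(lang, max_len):
    """Left quotient by the single letter, truncated to max_len."""
    return {w[1:] for w in lang if w.startswith(SIGMA)} & words(max_len)

def is_inductive_basis(basis, horizon):
    # Truncate basis elements so unions are comparable with the quotients.
    truncated = [{w for w in L if len(w) <= horizon - 1} for L in basis]
    for L in basis:
        q = quotient(L, horizon - 1)
        # Brute force: is q a union of (truncated) basis elements?
        if not any(set().union(*combo) == q
                   for r in range(len(basis) + 1)
                   for combo in combinations(truncated, r)):
            return False
    return True

even = {w for w in words(HORIZON) if len(w) % 2 == 0}  # (aa)* up to the horizon
odd = {w for w in words(HORIZON) if len(w) % 2 == 1}
print(is_inductive_basis([even, odd], HORIZON))  # True: the quotients swap the two
```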
