936 results for Model Participation Rules
Abstract:
Dissertation for obtaining the degree of Doctor in Informatics Engineering
Abstract:
Ontologies formalized by means of Description Logics (DLs) and rules in the form of Logic Programs (LPs) are two prominent formalisms in the field of Knowledge Representation and Reasoning. While DLs adhere to the Open World Assumption and are suited for taxonomic reasoning, LPs implement reasoning under the Closed World Assumption, so that default knowledge can be expressed. However, for many applications it is useful to have a means that allows reasoning over an open domain and expressing rules with exceptions at the same time. Hybrid MKNF knowledge bases make such a means available by formalizing DLs and LPs in a common logic, the Logic of Minimal Knowledge and Negation as Failure (MKNF). Since rules and ontologies are used in open environments such as the Semantic Web, inconsistencies cannot always be avoided. This poses a problem due to the Principle of Explosion, which holds in classical logics. Paraconsistent Logics offer a solution to this issue by assigning meaningful models even to contradictory sets of formulas. Consequently, paraconsistent semantics for DLs and LPs have been investigated intensively. Our goal is to apply the paraconsistent approach to the combination of DLs and LPs in hybrid MKNF knowledge bases. In this thesis, a new six-valued semantics for hybrid MKNF knowledge bases is introduced, extending the three-valued approach by Knorr et al., which is based on the well-founded semantics for logic programs. Additionally, a procedural way of computing paraconsistent well-founded models for hybrid MKNF knowledge bases by means of an alternating fixpoint construction is presented and it is proven that the algorithm is sound and complete w.r.t. the model-theoretic characterization of the semantics. Moreover, it is shown that the new semantics is faithful w.r.t. well-studied paraconsistent semantics for DLs and LPs, respectively, and maintains the efficiency of the approach it extends.
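For readers unfamiliar with the alternating fixpoint construction mentioned above, the sketch below illustrates the classical version of that construction for the well-founded semantics of a plain propositional logic program. It covers only the rule (LP) component: the description logic component, the MKNF modal operators, and the six-valued paraconsistent extension contributed in the thesis are not modelled, and the rule encoding and atom names are purely illustrative.

# Minimal sketch of the classical alternating fixpoint for the well-founded
# semantics of a propositional normal logic program. Rules are triples
# (head, positive_body, negative_body) over plain strings; this illustrates
# the general technique, not the thesis's hybrid MKNF algorithm.

def least_model(definite_rules):
    """Least model of a definite (negation-free) program by fixpoint iteration."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos_body in definite_rules:
            if head not in model and pos_body <= model:
                model.add(head)
                changed = True
    return model

def gamma(program, interpretation):
    """Gamma operator: least model of the reduct of `program` w.r.t. `interpretation`."""
    reduct = [(head, pos) for head, pos, neg in program
              if not (neg & interpretation)]   # drop rules blocked by a true negated atom
    return least_model(reduct)

def well_founded(program):
    """Alternating fixpoint: true atoms = lfp(Gamma^2); false = atoms outside Gamma(true)."""
    atoms = {head for head, _, _ in program} | \
            {a for _, pos, neg in program for a in pos | neg}
    true = set()
    while True:
        upper = gamma(program, true)        # over-estimate of what may become true
        new_true = gamma(program, upper)    # under-estimate: provably true atoms
        if new_true == true:
            return true, atoms - upper      # (true, false); remaining atoms are undefined
        true = new_true

# p :- not q.   q :- not p.   r :- not r.   s.   u :- v.
P = [("p", set(), {"q"}), ("q", set(), {"p"}),
     ("r", set(), {"r"}), ("s", set(), set()), ("u", {"v"}, set())]
print(well_founded(P))   # ({'s'}, {'u', 'v'}); p, q and r remain undefined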
Abstract:
The particular characteristics and affordances of technologies play a significant role in human experience by defining the realm of possibilities available to individuals and societies. Some technological configurations, such as the Internet, facilitate peer-to-peer communication and participatory behaviors. Others, like television broadcasting, tend to encourage centralization of creative processes and unidirectional communication. In other instances still, the affordances of technologies can be further constrained by social practices. That is the case, for example, of radio, which, although technically allowing peer-to-peer communication, has effectively been converted into a broadcast medium through the legislation of the airwaves. How technologies acquire particular properties, meanings and uses, and who is involved in those decisions are the broader questions explored here. Although a long line of thought maintains that technologies evolve according to the logic of scientific rationality, recent studies have demonstrated that technologies are, in fact, primarily shaped by social forces in specific historical contexts. In this view, adopted here, there is no one best way to design a technological artifact or system; the selection between alternative designs—which determine the affordances of each technology—is made by social actors according to their particular values, assumptions and goals. Thus, the arrangement of technical elements in any technological artifact is configured to conform to the views and interests of those involved in its development. Understanding how technologies assume particular shapes, who is involved in these decisions and how, in turn, they give rise to particular behaviors and modes of organization but not others, requires understanding the contexts in which they are developed. It is argued here that, throughout the last century, two distinct approaches to the development and dissemination of technologies have coexisted. In each of these models, based on fundamentally different ethoi, technologies are developed through different processes and by different participants—and therefore tend to assume different shapes and offer different possibilities. In the first of these approaches, the dominant model in Western societies, technologies are typically developed by firms, manufactured in large factories, and subsequently disseminated to the rest of the population for consumption. In this centralized model, the role of users is limited to selecting from the alternatives presented by professional producers. Thus, according to this approach, the technologies that are now so deeply woven into human experience are primarily shaped by a relatively small number of producers. In recent years, however, three interconnected interest groups—the makers, hackerspaces, and open source hardware communities—have increasingly challenged this dominant model by enacting an alternative approach in which technologies are both individually transformed and collectively shaped. Through an in-depth analysis of these phenomena, their practices and ethos, it is argued here that the distributed approach practiced by these communities offers a practical path towards a democratization of the technosphere by: 1) demystifying technologies, 2) providing the public with the tools and knowledge necessary to understand and shape technologies, and 3) encouraging citizen participation in the development of technologies.
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Double Degree in Economics and International Business from the NOVA – School of Business and Economics and Insper Instituto de Ensino e Pesquisa
Abstract:
Doctoral Thesis in Information Systems and Technologies
Abstract:
Preprint submitted to International Journal of Solids and Structures. ISSN 0020-7683
Abstract:
Many democratic decision-making institutions involve quorum rules. Such rules are commonly motivated by concerns about the “legitimacy” or “representativeness” of decisions reached when only a subset of eligible voters participates. A prominent example of this can be found in the context of direct democracy mechanisms, such as referenda and initiatives. We conduct a laboratory experiment to investigate the consequences of the two most common types of quorum rules: a participation quorum and an approval quorum. We find that both types of quorum lead to lower participation rates, dramatically increasing the likelihood of full-fledged electoral boycotts on the part of those who endorse the status quo. This discouraging effect is significantly larger under a participation quorum than under an approval quorum.
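As a concrete illustration of the mechanism behind this result, the sketch below encodes the two decision rules and shows why abstention by status-quo supporters can block a proposal under a participation quorum but not under an approval quorum. The thresholds (50% turnout, 25% approval) are common real-world values assumed here for illustration; the abstract does not report the experimental parameters.

# Illustrative encoding of the two quorum rules compared in the experiment.
# Threshold values are assumptions, not the paper's experimental parameters.

def participation_quorum(yes, no, eligible, quorum=0.50):
    """Proposal passes iff turnout reaches the quorum and yes-votes outnumber no-votes."""
    return (yes + no) / eligible >= quorum and yes > no

def approval_quorum(yes, no, eligible, quorum=0.25):
    """Proposal passes iff yes-votes outnumber no-votes and, on their own,
    reach the required share of the whole electorate."""
    return yes > no and yes / eligible >= quorum

# 30 yes vs 20 no out of 100 eligible voters:
print(participation_quorum(yes=30, no=20, eligible=100))  # True  (turnout 50%, majority yes)
# If the 20 opponents boycott instead of voting no, turnout falls below the quorum:
print(participation_quorum(yes=30, no=0, eligible=100))   # False (turnout only 30%)
# Under an approval quorum the same boycott is ineffective (30% >= 25% approval):
print(approval_quorum(yes=30, no=0, eligible=100))        # True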
Abstract:
Doctoral Thesis in Educational Sciences (Specialisation in Literacies and Teaching of Portuguese)
Abstract:
The public perception of the EU in Spain varies greatly. The most positive aspects of Spanish membership are associated with the consolidation of democracy, economic growth, the introduction of the euro, the growth in employment and structural and cohesion funds, the increase in the female participation rate, and the equal opportunities policies. Analysts are in favour of common objectives in employment policy and of multi-level government. The less positive aspects of the EU are the risks of losing social protection and of employment losses in some sectors due to mergers of multinationals and the relocation of companies towards Eastern Europe. The continuous demands for reform of the welfare state, the toughening of the conditions of access to social benefits and the reform of the labour market are also seen as problematic issues, as are the risks of competitive cuts and social dumping.
Abstract:
Besley (1988) uses a scaling approach to model merit good arguments in commodity tax policy. In this paper, I question this approach on the grounds that it produces 'wrong' recommendations, namely taxation (subsidisation) of merit (demerit) goods, whenever the demand for the (de)merit good is inelastic. I propose an alternative approach that does not suffer from this deficiency, and derive the ensuing first- and second-best tax rules, as well as the marginal cost expressions needed to perform tax reform analysis.
Abstract:
This paper uses a unique individual-level administrative data set to analyse the participation of health professionals in the NHS after training. The data set contains information on over 1,000 dentists who received Dental Vocational Training in Scotland between 1995 and 2006. Using a dynamic nonlinear panel data model, we estimate the determinants of post-training participation. We find there is significant persistence in these data and are able to show that the persistence arises from state dependence and individual heterogeneity. This finding has implications for the structure of policies designed to increase participation rates. We apply this empirical framework to assess the accuracy of predictions for workforce forecasting, and to provide a preliminary estimate of the impact of one of the recruitment and retention policies available to dentists in Scotland.
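To make the modelling strategy concrete, the sketch below simulates panel data in which participation depends on last period's participation (state dependence) and on an unobserved individual effect (heterogeneity), and then fits a pooled probit with a lagged dependent variable. The data-generating process, variable names and the simple pooled estimator are assumptions for illustration; the paper's actual specification, in particular its treatment of the initial condition and the individual effect, is not described in the abstract.

# Sketch of a dynamic nonlinear participation model of the kind described:
# P(participate_it = 1) = Phi(rho * participate_i,t-1 + beta * x_it + c_i).
# Sample size, parameter values and the covariate are illustrative only.

import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)
n_dentists, n_years = 500, 10
rho_true, beta_true = 1.0, 0.5

rows = []
for i in range(n_dentists):
    c_i = rng.normal(scale=0.5)        # unobserved individual heterogeneity
    y_prev = int(rng.integers(0, 2))   # initial participation state
    for t in range(n_years):
        x = rng.normal()               # observed covariate, e.g. local demand conditions
        y = rng.binomial(1, norm.cdf(rho_true * y_prev + beta_true * x + c_i))
        rows.append((y, y_prev, x))
        y_prev = y

y, y_lag, x = (np.array(col, dtype=float) for col in zip(*rows))
X = sm.add_constant(np.column_stack([y_lag, x]))
pooled = sm.Probit(y, X).fit(disp=False)
print(pooled.params)
# The coefficient on y_lag measures persistence, but a pooled probit cannot
# separate true state dependence from the heterogeneity c_i; distinguishing
# the two is exactly what a dynamic panel estimator is needed for.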
Abstract:
While consumption habits have been utilised as a means of generating a hump-shaped output response to monetary policy shocks in sticky-price New Keynesian economies, there is relatively little analysis of the impact of habits (particularly, external habits) on optimal policy. In this paper we consider the implications of external habits for optimal monetary policy, when those habits either exist at the level of the aggregate basket of consumption goods (‘superficial’ habits) or at the level of individual goods (‘deep’ habits: see Ravn, Schmitt-Grohe, and Uribe (2006)). External habits generate an additional distortion in the economy, which implies that the flex-price equilibrium will no longer be efficient and that policy faces interesting new trade-offs and potential stabilisation biases. Furthermore, the endogenous mark-up behaviour, which emerges when habits are deep, can also significantly affect the optimal policy response to shocks, as well as dramatically affecting the stabilising properties of standard simple rules.
Abstract:
In this paper, we quantitatively assess the welfare implications of alternative public education spending rules. To this end, we employ a dynamic stochastic general equilibrium model in which human capital externalities and public education expenditures, financed by distorting taxes, enhance the productivity of private education choices. We allow public education spending, as a share of output, to respond to various aggregate indicators in an attempt to minimize the market imperfection due to human capital externalities. We also expose the economy to varying degrees of uncertainty via changes in the variance of total factor productivity shocks. Our results indicate that, in the face of increasing aggregate uncertainty, active policy can significantly outperform passive policy (i.e. maintaining a constant public education to output ratio), but only when the policy instrument is successful in smoothing the growth rate of human capital.
Abstract:
We introduce duration-dependent skill decay among the unemployed into a New-Keynesian model with hiring frictions developed by Blanchard/Gali (2008). If the central bank responds only to (current, lagged or expected future) inflation and quarterly skill decay is above a threshold level, determinacy requires a coefficient on inflation smaller than one. The threshold level is plausible with little steady-state hiring and firing ("Continental European calibration") but implausibly high in the opposite case ("American calibration"). Neither interest rate smoothing nor responding to the output gap helps to restore determinacy if skill decay exceeds the threshold level. However, a modest response to unemployment guarantees determinacy. Moreover, under indeterminacy, both an adverse sunspot shock and an adverse technology shock increase unemployment extremely persistently.
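The determinacy analysis referred to here boils down to counting unstable eigenvalues of the model's forward-looking representation. The sketch below runs that check for the textbook three-equation New Keynesian model, not the Blanchard/Gali hiring-frictions model with skill decay analysed in the paper, purely to show the mechanics; parameter values are standard illustrative choices.

# Generic Blanchard-Kahn style determinacy check, illustrated on the textbook
# three-equation New Keynesian model written as E_t z_{t+1} = A z_t with
# z = (output gap, inflation). With two non-predetermined variables,
# determinacy requires exactly two eigenvalues of A outside the unit circle.
# This is an illustration of the method, not the model analysed in the paper.

import numpy as np

beta, kappa, sigma = 0.99, 0.10, 1.0  # discount factor, Phillips-curve slope, risk aversion

def determinate(phi_pi):
    """Check determinacy under the simple Taylor rule i_t = phi_pi * pi_t."""
    A = np.array([
        [1 + kappa / (sigma * beta), (phi_pi * beta - 1) / (sigma * beta)],
        [-kappa / beta,              1 / beta],
    ])
    unstable = np.sum(np.abs(np.linalg.eigvals(A)) > 1)
    return unstable == 2   # need as many unstable roots as jump variables

print(determinate(1.5))    # True:  Taylor principle satisfied (phi_pi > 1)
print(determinate(0.8))    # False: indeterminacy, sunspot equilibria become possible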
Exclusive Nightclubs and Lonely Hearts Columns: Nonmonotone Participation in Optional Intermediation
Abstract:
In many decentralised markets, the traders who benefit most from an exchange do not employ intermediaries even though they could easily afford them. At the same time, employing intermediaries is not worthwhile for traders who benefit little from trade. Together, these decisions amount to non-monotone participation choices in intermediation: only traders of middle “type” employ intermediaries, while the rest, the high and the low types, prefer to search for a trading partner directly. We provide a theoretical foundation for this hitherto unexplained phenomenon. We build a dynamic matching model, where a trader’s equilibrium bargaining share is a convex increasing function of her type. We also show that this is indeed a necessary condition for the existence of non-monotone equilibria.