818 results for Logic of many
Abstract:
It is widely accepted that infants begin learning their native language not by learning words, but by discovering features of the speech signal: consonants, vowels, and combinations of these sounds. Learning to understand words, as opposed to just perceiving their sounds, is said to come later, between 9 and 15 mo of age, when infants develop a capacity for interpreting others' goals and intentions. Here, we demonstrate that this consensus about the developmental sequence of human language learning is flawed: in fact, infants already know the meanings of several common words from the age of 6 mo onward. We presented 6- to 9-mo-old infants with sets of pictures to view while their parent named a picture in each set. Over this entire age range, infants directed their gaze to the named pictures, indicating their understanding of spoken words. Because the words were not trained in the laboratory, the results show that even young infants learn ordinary words through daily experience with language. This surprising accomplishment indicates that, contrary to prevailing beliefs, either infants can already grasp the referential intentions of adults at 6 mo or infants can learn words before this ability emerges. The precocious discovery of word meanings suggests a perspective in which learning vocabulary and learning the sound structure of spoken language go hand in hand as language acquisition begins.
Abstract:
The relationship between professionalism, education and housing practice has become increasingly strained following the introduction of austerity measures and welfare reforms across a range of countries. Focusing on the development of UK housing practice, this article considers how notions of professionalism are being reshaped within the context of welfare retrenchment and how emerging tensions have both affected the identity of housing professionals and impacted on the delivery of training and education programmes. The article analyses the changing knowledges and skills valued in contemporary housing practice and considers how the sector has responded to the challenges of austerity. The central argument is that a dominant logic of competition has culminated in a crisis of identity for the sector. Although the focus of the article is on UK housing practice, the processes identified have a wider relevance for the analysis of housing and welfare delivery in developed economies.
Abstract:
Graduate Program in Philosophy - FFC
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Graduate Program in Philosophy - FFC
Abstract:
Much has been learned about vertebrate development by random mutagenesis followed by phenotypic screening and by targeted gene disruption followed by phenotypic analysis in model organisms. Because the timing of many developmental events is critical, it would be useful to have temporal control over modulation of gene function, a luxury frequently not possible with genetic mutants. Here, we demonstrate that small molecules capable of conditional gene product modulation can be identified through developmental screens in zebrafish. We have identified several small molecules that specifically modulate various aspects of vertebrate ontogeny, including development of the central nervous system, the cardiovascular system, the neural crest, and the ear. Several of the small molecules identified allowed us to dissect the logic of melanocyte and otolith development and to identify critical periods for these events. Small molecules identified in this way offer potential to dissect further these and other developmental processes and to identify novel genes involved in vertebrate development.
Abstract:
In t-norm based systems of many-valued logic, the valuations of propositions form an uncountable set: the interval [0,1]. In addition, we are given a set E of truth values p, subject to certain conditions; the valuation v is v = V(p), where V is a one-to-one mapping of E onto [0,1]. The general propositional algebra of t-norm based many-valued logic is then constructed from seven axioms. It contains classical (non-many-valued) logic as a special case. It is first applied to the case where E = [0,1] and V is the identity. The result is a t-norm based many-valued logic in which a contradiction can have a nonzero degree of truth but cannot be true; for this reason, this logic is called quasi-paraconsistent.
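The quasi-paraconsistent behaviour described above can be illustrated with a minimal sketch. The following is not the paper's seven-axiom system; it only assumes the standard minimum (Gödel) t-norm for conjunction and the involutive negation 1 - p, under which the degree of truth of the contradiction p AND NOT p is min(p, 1 - p): possibly nonzero, but never 1.

```python
# Illustrative sketch (not the paper's axiomatization): with the minimum
# t-norm for conjunction and 1 - p for negation, a contradiction can be
# partially true but never fully true.

def t_min(a, b):
    """Minimum (Goedel) t-norm."""
    return min(a, b)

def neg(a):
    """Standard involutive negation on [0, 1]."""
    return 1.0 - a

def contradiction_degree(p, t_norm=t_min):
    """Degree of truth of p AND NOT p under a given t-norm."""
    return t_norm(p, neg(p))

degrees = [contradiction_degree(p / 10) for p in range(11)]
# The contradiction can have a nonzero degree of truth (0.5 at p = 0.5)...
assert max(degrees) == 0.5
# ...but it can never be true (degree 1) for any p.
assert all(d < 1.0 for d in degrees)
```

With the Łukasiewicz t-norm max(0, a + b - 1) in place of `t_min`, the same contradiction would instead always evaluate to 0, which shows how the choice of t-norm shapes the resulting logic.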
Abstract:
Suszko’s Thesis is a philosophical claim regarding the nature of many-valuedness. Formulated by the Polish logician Roman Suszko in the mid-1970s, it asserts that there are “but two truth values”. The thesis is a reaction against the notion of many-valuedness conceived by Jan Łukasiewicz. Regarded as one of the modern founders of many-valued logics, Łukasiewicz considered a third, undetermined value in addition to the traditional Fregean values of Truth and Falsehood. For Łukasiewicz, this third value could be seen as a step beyond the Aristotelian dichotomy of Being and non-Being. According to Suszko, Łukasiewicz’s ideas rested on a confusion between algebraic values (what sentences describe/denote) and logical values (truth and falsity). Thus, Łukasiewicz’s third, undetermined value is no more than an algebraic value, a possible denotation for a sentence, but not a genuine logical value. Suszko’s Thesis is supported by a formal result known as Suszko’s Reduction, a theorem stating that every Tarskian logic may be characterized by a two-valued semantics. The present study is intended as a thorough investigation of Suszko’s Thesis and its implications. The first part is devoted to the historical roots of many-valuedness and introduces Suszko’s main motivations for formulating the double character of truth values by drawing the distinction between algebraic and logical values. The second part explores Suszko’s Reduction and presents the developments achieved from it; the properties of two-valued semantics are also explored and discussed in comparison with many-valued semantics. Last but not least, the third part investigates the notion of logical values in the context of non-Tarskian notions of entailment and discusses the meaning of Suszko’s Thesis within such frameworks. Moreover, the philosophical foundations of non-Tarskian notions of entailment are explored in the light of recent debates concerning logical pluralism.
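The core idea behind Suszko's Reduction can be sketched concretely. The following assumes Łukasiewicz's three-valued logic Ł3 (values 0, 1/2, 1, with only 1 designated) purely as an example: every algebraic valuation induces a classical bivaluation by asking only whether a sentence's value is designated, so the "undetermined" value 1/2 behaves as an algebraic value, not a third logical value.

```python
# Minimal illustration of the idea behind Suszko's Reduction, using
# Lukasiewicz's three-valued logic L3 as the example system.

from fractions import Fraction

HALF = Fraction(1, 2)
DESIGNATED = {Fraction(1)}   # in L3, only the value 1 is designated

def l3_neg(a):
    """L3 negation: 1 - a."""
    return 1 - a

def l3_impl(a, b):
    """Lukasiewicz implication: min(1, 1 - a + b)."""
    return min(Fraction(1), 1 - a + b)

def bivaluation(v):
    """Collapse a many-valued value into a two-valued one: True iff designated."""
    return v in DESIGNATED

# The 'undetermined' value 1/2 is a possible denotation, but under the
# induced bivaluation it is simply 'not true':
assert bivaluation(HALF) is False
assert bivaluation(l3_neg(HALF)) is False    # 1 - 1/2 = 1/2, still not true
assert bivaluation(Fraction(1)) is True
assert bivaluation(l3_impl(HALF, HALF)) is True   # 1/2 -> 1/2 has value 1
```

This collapse of all valuations into "designated or not" is, in miniature, what the reduction theorem performs for any Tarskian consequence relation.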
Abstract:
This paper addresses the problem of ensuring compliance of business processes, implemented within and across organisational boundaries, with the constraints stated in related business contracts. In order to deal with the complexity of this problem, we propose two solutions that allow for systematic and increasingly automated support for addressing two specific compliance issues. One solution provides a set of guidelines for progressively transforming contract conditions into business processes that are consistent with those conditions, thus avoiding violation of the contract's rules. The other solution compares rules in business contracts with rules in business processes to check for possible inconsistencies. Both approaches rely on a computer-interpretable representation of contract conditions that embodies contract semantics. This semantics is described in terms of a logic-based formalism allowing for the description of obligations, prohibitions, permissions and violation conditions in contracts, and was based on an analysis of typical building blocks of many commercial, financial and government contracts. The study showed that our contract formalism provides a good foundation for describing key types of conditions in contracts, and has also given several insights into valuable transformation techniques and formalisms needed to establish better alignment between these two traditionally separate areas of research and endeavour. The study also revealed a number of new areas of research, some of which we intend to address in the near future.
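The kind of rule-versus-process check described above can be sketched in a few lines. The rule format, function names and example actions below are hypothetical, standing in for the paper's actual logic-based formalism: each rule marks an action as an obligation or a prohibition, and a process trace is checked for violations.

```python
# Hypothetical sketch of checking a business-process trace against
# deontic contract rules (obligations and prohibitions). The rule format
# and action names are illustrative, not the paper's formalism.

OBLIGATION, PROHIBITION = "obligation", "prohibition"

def violations(rules, trace):
    """Return the rules violated by a sequence of performed actions."""
    performed = set(trace)
    violated = []
    for modality, action in rules:
        if modality == OBLIGATION and action not in performed:
            violated.append((modality, action))     # required but missing
        elif modality == PROHIBITION and action in performed:
            violated.append((modality, action))     # forbidden but done
    return violated

rules = [
    (OBLIGATION, "deliver_goods"),
    (OBLIGATION, "issue_invoice"),
    (PROHIBITION, "share_customer_data"),
]
trace = ["deliver_goods", "share_customer_data"]
assert violations(rules, trace) == [
    (OBLIGATION, "issue_invoice"),
    (PROHIBITION, "share_customer_data"),
]
```

A real formalism of the kind the paper describes would also track permissions and the conditions under which violations are triggered or repaired; this sketch only shows the basic comparison of contract rules against process behaviour.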
Abstract:
This paper is concerned with the use of scientific visualization methods for the analysis of feedforward neural networks (NNs). Inevitably, the kinds of data associated with the design and implementation of neural networks are of very high dimensionality, presenting a major challenge for visualization. A method is described using the well-known statistical technique of principal component analysis (PCA). This is found to be an effective and useful method of visualizing the learning trajectories of many learning algorithms such as back-propagation and can also be used to provide insight into the learning process and the nature of the error surface.
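The visualization idea above can be sketched briefly: record the network's weight vector at each training step, then project the trajectory onto its first two principal components. The synthetic random-walk data below stands in for a real back-propagation history, and the PCA is computed directly via SVD rather than any particular library routine.

```python
# Sketch of PCA projection of a high-dimensional learning trajectory
# (e.g. a network's weights over training) onto two dimensions.
# The trajectory here is synthetic, standing in for real training data.

import numpy as np

rng = np.random.default_rng(0)
steps, n_weights = 200, 50
# A random walk in weight space, as a stand-in for a learning trajectory.
trajectory = np.cumsum(rng.normal(size=(steps, n_weights)), axis=0)

def pca_project(X, k=2):
    """Project the rows of X onto the top-k principal components."""
    X_centered = X - X.mean(axis=0)
    # SVD of the centered data: rows of Vt are the principal directions.
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ Vt[:k].T

path2d = pca_project(trajectory)   # shape (steps, 2): the learning path
assert path2d.shape == (steps, 2)
```

Plotting `path2d` as a connected line then shows the "learning trajectory" through weight space that the paper uses to compare algorithms such as back-propagation and to probe the error surface.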
Abstract:
Work presented within the Master's programme in Informatics Engineering, as a partial requirement for the degree of Master in Informatics Engineering
Abstract:
Demand for law professionals in the conveyancing of property is decreasing because of market and institutional changes. On the market side, many transactions feature large, well-known parties and standardized terms, which make professionals less effective or less necessary for protecting the parties to private contracts. On the institutional side, public titling makes it possible to dispense with a broadening set of their former functions. Recording of deeds made professionals redundant as depositories of deeds and reduced demand for them to design title guarantees. Effective registration of rights increasingly substitutes for professionals in detecting title conflicts with third parties and in gathering their consent. Market changes undermine the information-asymmetry rationale for regulating conveyancing, while institutional changes facilitate liberalizing not only conduct regulations but also license regulations. These arguments are supported here by disentangling the logic of titling systems and presenting empirical evidence from the European and US markets.
Abstract:
The importance of university-company collaboration has increased during the last decades. The drivers are, on the one hand, changes in the business logic of companies and, on the other hand, decreased state funding of universities. Many companies emphasize joint research with universities as an enabling input to their development processes, which aim at creating new innovations, products and wealth. These factors have changed universities' operations, and universities have adopted several practices of dynamic business organizations, such as strategic planning and methods for monitoring and controlling internal processes. The objective of this thesis is to combine different characteristics of successful university-company partnership and its development. The development process starts with identifying potential partners in the university's interest group, which requires understanding the role of different partners in the innovation system. Next, in order to find a common development basis, the policy and strategy of the partners must be matched. The third phase is to combine the academic and industrial objectives of a joint project, which is a typical form of university-company collaboration. The optimum is a win-win situation in which both partners, universities and companies, gain added value. For companies, added value typically means access to new research results before their competitors. For universities, added value offers the possibility to carry out high-level scientific work, whose output, in the form of published scientific articles, is evaluated by the international science community. Because university-company partnerships are often executed through joint projects, the different forms of such projects are discussed in this study.
The most challenging form of collaboration is a semi-open project model, which is based not on bilateral activities between universities and companies but on a consortium of several universities, research institutes and companies. Universities and companies are core actors in the innovation system, so discussing their roles and their relations to public operators, such as publicly funded financiers, is important. In the Finnish innovation system, at least the following actors execute strategies and policies: the EU, the Academy of Finland and TEKES. In addition, the Strategic Centres for Science, Technology and Innovation, which are owned jointly by companies, universities and research organizations, play a very important role in their fields of business: they transfer research results into commercial actions to generate wealth. The thesis comprises two parts. The first part consists of an overview of the study, including the introduction, literature review, research design, synthesis of findings and conclusions. The second part introduces four original research publications.
Abstract:
This paper proposes a limitation to epistemological claims to theory building prevalent in critical realist research. While accepting the basic ontological and epistemological positions of the perspective as developed by Roy Bhaskar, it is argued that application in social science has relied on sociological concepts to explain the underlying generative mechanisms, and that in many cases this has been subject to the effects of an anthropocentric constraint. A novel contribution to critical realist research comes from the work and ideas of Gregory Bateson. This is in service of two central goals of critical realism, namely an abductive route to theory building and a commitment to interdisciplinarity. Five aspects of Bateson’s epistemology are introduced: (1) difference, (2) logical levels of abstraction, (3) recursive causal loops, (4) the logic of metaphor, and (5) Bateson’s theory of mind. The comparison between Bateson and Bhaskar’s ideas is seen as a form of double description, illustrative of the point being raised. The paper concludes with an appeal to critical realists to start exploring the writing and outlook of Bateson himself.
Abstract:
The commitments and working requirements of abstract economics, applied economics, and the art of economics are assessed through an analogy with the fields of inert matter and life. Abstract economics is the pure logic of the phenomenon. Applied positive economics presupposes many distinct abstract sciences. The art of economics presupposes applied economics together with direct knowledge of the specificities that characterize the time-space individuality of the phenomenon. This is an indetermination clearly formulated by Senior and Mill; its connection with institutionalism is discussed. The Ricardian Vice is the habit of ignoring this indetermination; its prevalence in mainstream economics is exemplified, and its causes are analyzed.