873 results for Formal Theories
Abstract:
In the present paper we investigate the life cycles of formalized theories that appear in decision-making instruments and in science. Briefly, mixed theories are built in the following steps. Initially, a small collection of facts forms the kernel of the theory. To express these facts we construct a special formalized language. When the collection grows, we add some inference rules, and thus some axioms, to compress the knowledge. The next step is to generalize these rules to all expressions of the formalized language, and for these rules we introduce a conclusion procedure. In this way we build small theories for restricted fields of knowledge. The most important procedure is the mixing of these partial knowledge systems: in that step we glue the theories together and eliminate the contradictions. This last operation is the most complicated one, and some simplifying procedures are proposed.
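The abstract describes the construction pipeline only in prose. The Python sketch below is offered purely as an illustration of the final gluing step (the fact/rule representation, the merge_theories function and the crude contradiction-removal policy are assumptions of this sketch, not the authors' formalism): two small theories are united, and any statement whose negation also becomes derivable is treated as a clash to be simplified away.

```python
# Illustrative sketch only: a toy way of "gluing" two small rule-based
# theories and removing direct contradictions ("p" vs "not p").
# The data structures and the resolution policy are assumptions made for
# this example, not the formalism proposed in the paper.

def closure(facts, rules):
    """Forward-chain the rules until no new statements are produced."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if all(p in derived for p in premises) and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

def negation(p):
    return p[4:] if p.startswith("not ") else "not " + p

def merge_theories(t1, t2):
    """Glue two theories together, then crudely drop facts and rules whose
    conclusions clash with something else that is derivable."""
    facts = t1["facts"] | t2["facts"]
    rules = t1["rules"] + t2["rules"]
    derived = closure(facts, rules)
    clashes = {p for p in derived if negation(p) in derived}
    clean_facts = {f for f in facts if f not in clashes}
    clean_rules = [r for r in rules if r[1] not in clashes]
    return {"facts": clean_facts, "rules": clean_rules}

small_theory_a = {"facts": {"bird(tweety)"},
                  "rules": [(("bird(tweety)",), "flies(tweety)")]}
small_theory_b = {"facts": {"penguin(tweety)"},
                  "rules": [(("penguin(tweety)",), "not flies(tweety)")]}
print(merge_theories(small_theory_a, small_theory_b))
```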
Abstract:
The study examines the personnel training and research activities carried out by the Organization and Methods Division of the Ministry of Finance and how they became part and parcel of the state administration in 1943-1971. The study is a combination of institutional and ideological historical research in the recent history of adult education, using a constructionist approach. Material salient to the study comes from the files of the Organization and Methods Division in the National Archives, parliamentary documents, committee reports, and magazines. The concentrated training and research activities arranged by the Organization and Methods Division became part and parcel of the state administration in the midst of controversial challenges and opportunities. They served to solve social problems which beset the state administration, contextual challenges besetting rationalization measures, and organizational challenges. The activities were also affected by a dependence on decision-makers, administrative units, and civil servants' organizations, by differing views on rationalization and the holistic nature of reforms, and by the formal theories that served as resources.
The Division chose long-term projects which extended onto the turf of the political decision-makers and the administrative units, and which were intended to reform the structures of the state administration and to rationalize the practices of the administrative units. The crucial questions emerged in opposite pairs (a constitutional state vs. the ideology of an administratively governed state, a system of national boards vs. a system of government through ministries, efficiency of work vs. pleasantness of work, centralized vs. decentralized rationalization activities); these were not solvable problems but impossible questions with no ultimate answers. The aim of the rationalization of the state administration (the reform of the central, provincial, and local governments) was to facilitate integrated management and to get a greater amount of work done by approaching management procedures scientifically and by clarifying administrative instances and their responsibilities in regard to each other. The means resorted to were organizational studies and committee work. In the rationalization of office work and financial control, the idea was to effect savings in administrative costs, to pare those costs down, and to rationalize and intensify those functions by developing the institution of work study practitioners in order to coordinate employer and employee relationships and benefits (the training of work study practitioners, work study, and a two-tier work study practitioner organization). A major part of the training meant teaching and implementing leadership skills in practice, which in turn meant that the learning environment was the genuine work community and the efforts to change it. In office rationalization, the solution adopted to regulate the relations between the employer and the employees was the co-existence of technical and biological rationalization and human resource administration, on the one hand, and the accounting and planning systems, on the other, at the turn of the 1960s and 1970s. The former were based on the schools of scientific management and human relations, the latter on systems thinking, which was a combination of the two.
In the rationalization of the state administration, efforts were made to find solutions for stabilizing management ideologies and for arranging the relationships of administrative systems in administrative science - among other things, in the Hoover Committee and in Simon's decision-making theory, and, in the 1960s, in systems thinking. Despite the development-related vocabulary, the practical work was advanced rationalization. It was said that the practical activities of both the state administration and the administrative units depended on professional managers who saw to production results and human relations. The pedagogical experts hired to develop training came up with a training system based on the training-technological model, in which training was made a function of its own. The State Training Center was established, and the training office of the Organization and Methods Division became the leader and coordinator of personnel training.
Abstract:
Three experiments are reported that examined the process by which trainees learn decision-making skills during a critical incident training program. Formal theories of category learning were used to identify two processes that may be responsible for the acquisition of decision-making skills: rule learning and exemplar learning. Experiments 1 and 2 used the process dissociation procedure (L. L. Jacoby, 1998) to evaluate the contribution of these processes to performance. The results suggest that trainees used a mixture of rule and exemplar learning, and that these learning processes were influenced by different aspects of training structure and design. The goal of Experiment 3 was to develop training techniques that enable trainees to use a rule adaptively. Trainees were tested on cases that represented exceptions to the rule. Unexpectedly, the results suggest that providing general instruction about the kinds of conditions in which a decision rule does not apply caused trainees to fixate on the specific conditions mentioned and impaired their ability to identify other conditions in which the rule might not apply. The theoretical, methodological, and practical implications of the results are discussed.
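The process dissociation procedure has a standard inclusion/exclusion algebra due to Jacoby, and the sketch below implements that textbook version to show how rule-based (controlled) and exemplar-based (automatic) contributions can be separated. The exact task variant used in these experiments may differ, so the function and its example figures are illustrative only.

```python
# Textbook process-dissociation estimates (Jacoby's inclusion/exclusion logic).
# Here "controlled" is read as rule-based responding and "automatic" as
# exemplar-based responding, as in the abstract; the experiments reported in
# the paper may have used a task-specific variant of these equations.

def process_dissociation(p_inclusion: float, p_exclusion: float):
    """Return (controlled, automatic) estimates from inclusion/exclusion rates.

    Assumes p_inclusion = C + A*(1 - C) and p_exclusion = A*(1 - C).
    """
    controlled = p_inclusion - p_exclusion
    automatic = p_exclusion / (1 - controlled) if controlled < 1 else float("nan")
    return controlled, automatic

# Hypothetical example: 80% rule-consistent responding under inclusion
# instructions, 30% under exclusion instructions.
rule_use, exemplar_use = process_dissociation(0.80, 0.30)
print(f"rule (controlled): {rule_use:.2f}, exemplar (automatic): {exemplar_use:.2f}")
```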
Abstract:
The purpose of this paper is to explore the possibility of applying existing formal theories of the description and design of distributed and concurrent systems to interaction protocols for real-time multi-agent systems. In particular, it is shown how the language PRALU, proposed for the description of parallel logical control algorithms and rooted in the Petri net formalism, can be used for modeling complex concurrent conversations between agents in a multi-agent system. The well-known example of an English auction is used to demonstrate how an agent interaction protocol can be specified by these means.
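PRALU itself is not reproduced in the abstract; as a rough illustration of the kind of concurrent conversation being modelled, the sketch below encodes a fragment of an English-auction interaction protocol as a small Petri-net-style marking/transition model in Python. The place names, transitions and firing policy are assumptions made for this example and are not the paper's PRALU specification.

```python
# A toy Petri-net-style encoding of an English auction conversation.
# Places, transitions and the sample run are invented for illustration;
# this is not PRALU syntax and not the protocol specification in the paper.

places = {"cfp_sent": 1, "awaiting_bids": 0, "bid_received": 0,
          "deadline_passed": 0, "winner_announced": 0}

# Each transition: (name, places consumed, places produced).
transitions = [
    ("open_round",  {"cfp_sent": 1},        {"awaiting_bids": 1}),
    ("receive_bid", {"awaiting_bids": 1},   {"bid_received": 1}),
    ("raise_price", {"bid_received": 1},    {"cfp_sent": 1}),   # new call for proposals
    ("timeout",     {"awaiting_bids": 1},   {"deadline_passed": 1}),
    ("announce",    {"deadline_passed": 1}, {"winner_announced": 1}),
]

def enabled(marking, consumed):
    return all(marking.get(p, 0) >= n for p, n in consumed.items())

def fire(marking, name):
    """Fire the named transition if it is enabled; return whether it fired."""
    for t_name, consumed, produced in transitions:
        if t_name == name and enabled(marking, consumed):
            for p, n in consumed.items():
                marking[p] -= n
            for p, n in produced.items():
                marking[p] = marking.get(p, 0) + n
            return True
    return False

# One possible run: open the round, take a bid, raise, then time out and announce.
for step in ["open_round", "receive_bid", "raise_price",
             "open_round", "timeout", "announce"]:
    print(step, fire(places, step), places)
```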
Abstract:
This study sought to explore ways of working with a group of young people through an arts-based approach to the teaching of literacy. Through the research, the author integrated her own reflexivity from applying arts methods over the past decade. The author's past experiences were strongly informed by theories such as caring theory and maternal pedagogy, which also informed the research design. The study incorporated qualitative data collection instruments comprising interviews, journals, sketches, artifacts, and teacher field notes. Data were collected from 3 student participants for the duration of the research. Study results provide educators with data on the impact of creating informal and alternative ways to teach literacy and of maintaining the engagement of resistant learners.
Abstract:
Formal mentoring programs are accepted as a valuable strategy for developing young and emerging artists. This thesis presents the results of an evaluation of the SPARK National Young Artists Mentoring Program (SPARK). SPARK was a ten-month formal mentoring program managed by Youth Arts Queensland (YAQ) on behalf of the Australia Council for the Arts from 2003 to 2009. The program aimed to assist young and emerging Australian artists between the ages of 18 and 26 to establish a professional career in the arts. It was a highly successful formal arts mentoring program that facilitated 58 mentorships between young and emerging artists and professional artists from across Australia in five program rounds over its seven-year lifespan. Interest from other cultural organisations looking to develop their own formal mentoring programs encouraged YAQ to commission this research to determine how the program works to achieve its effects. This study was conducted with young and emerging artists who participated in SPARK from 2003 to 2008. It took a theory-driven evaluation approach to examine SPARK as an example of what makes formal arts mentoring programs effective, focusing on understanding the program's theory, that is, how the program worked to achieve its desired outcomes. The program activities and the assumed responses to them were mapped out in a theories-of-change model, and this theoretical framework was then used to plan the points for data collection. Through the process of data collection, actual program developments were compared with the theoretical framework to see what occurred as expected and what did not, and the findings were then generalised for knowledge and wider application. The findings demonstrated that SPARK was a successful and effective program and an exemplary model of a formal mentoring program preparing young and emerging artists for professional careers in the arts. They also indicate several ways in which this already strong program could be further improved, including: looking at the way mentoring relationships are set up and how the mentoring process is managed; considering the balance between artistic and professional development; developing career development competencies and networking skills; taking into account the needs of young and emerging artists to develop their professional identity and build confidence; and giving more thought to the desired program outcomes, including the issue of timeliness and readiness for career transition. From these findings, together with principles outlined in the mentoring and career development literature, a number of necessary conditions are identified for developing effective mentoring programs for the career development of young and emerging artists.
Abstract:
This study examines, both theoretically and empirically, how well the theories of Norman Holland, David Bleich, Wolfgang Iser and Stanley Fish can explain readers' interpretations of literary texts. The theoretical analysis concentrates on their views on language from the point of view of Wittgenstein's Philosophical Investigations, and shows that many of the assumptions related to language in these theories are problematic. The empirical data show that readers often form very similar interpretations; thus the study challenges the common assumption that literary interpretations tend to be idiosyncratic. The empirical data consist of freely worded written answers to questions on three short stories, given by 27 Finnish university students. Some of the questions addressed issues that were discussed in large parts of the texts; some referred to issues that were mentioned only in passing or implied. The short stories were "The Witch à la Mode" by D. H. Lawrence, "Rain in the Heart" by Peter Taylor and "The Hitchhiking Game" by Milan Kundera.
According to Fish, readers create both the formal features of a text and their interpretation of it according to an interpretive strategy, and people who agree form an interpretive community. However, a typical answer usually contains ideas repeated by several readers as well as observations not mentioned by anyone else; it is therefore very difficult to determine which readers belong to the same interpretive community. Moreover, readers with opposing opinions often seem to pay attention to the same textual features and even acknowledge the possibility of an opposing interpretation, so they do not seem to create the formal features of the text in different ways.
Iser suggests that an interpretation emerges from the interaction between the text and the reader when the reader determines the implications of the text and in this way fills the "gaps" in the text. Iser believes that the text guides the reader, but since he also believes that meaning is on a level beyond words, he cannot explain how the text directs the reader. The similarity of the interpretations, and the fact that agreement is strongest on issues that are discussed broadly in the text, do however support his assumption that readers are guided by the text.
In Bleich's view, all interpretations have personal motives and each person has an idiosyncratic language system: the situation in which a person learns a word determines the most important meaning it has for that person. In order to uncover the personal etymologies of words, Bleich asks his readers to associate freely on the basis of a text and to note down all the personal memories and feelings that the reading experience evokes. Bleich's theory of the idiosyncratic language system seems to rely on a misconceived notion of the role that ostensive definitions have in language use. The readers' responses show that spontaneous associations to personal life seem to colour the readers' interpretations, but such instances are rather rare.
According to Holland, an interpretation reflects the reader's identity theme. Language use is regulated by shared rules, but everyone follows the rules in his or her own way, and words mean different things to different people. The problem with this view is that if there is any basis for language use, it seems to be the shared way of following linguistic rules.
Wittgenstein suggests that our understanding of words is related to shared ways of using words and to our understanding of human behaviour. This view seems to give better grounds for understanding the similarities and differences in literary interpretations than the theories of Holland, Bleich, Fish and Iser.
Abstract:
The intent of this study is to provide formal apparatus which facilitates the investigation of problems in the methodology of science. The introduction contains several examples of such problems and motivates the subsequent formalism.
A general definition of a formal language is presented, and this definition is used to characterize an individual’s view of the world around him. A notion of empirical observation is developed which is independent of language. The interplay of formal language and observation is taken as the central theme. The process of science is conceived as the finding of that formal language that best expresses the available experimental evidence.
To characterize the manner in which a formal language imposes structure on its universe of discourse, the fundamental concepts of elements and states of a formal language are introduced. Using these, the notion of a basis for a formal language is developed as a collection of minimal states distinguishable within the language. The relation of these concepts to those of model theory is discussed.
An a priori probability defined on sets of observations is postulated as a reflection of an individual’s ontology. This probability, in conjunction with a formal language and a basis for that language, induces a subjective probability describing an individual’s conceptual view of admissible configurations of the universe. As a function of this subjective probability, and consequently of language, a measure of the informativeness of empirical observations is introduced and is shown to be intuitively plausible – particularly in the case of scientific experimentation.
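The abstract does not reproduce the definitions themselves; purely as an illustration of the kind of construction being described (the particular formulas below are an assumption of this sketch, not the dissertation's definitions), a standard information-theoretic instantiation over a basis $\mathcal{B}$ of minimal states would be
\[
H(P) = -\sum_{s \in \mathcal{B}} P(s)\log P(s),
\qquad
\mathrm{Inf}(o) = H\bigl(P(\cdot)\bigr) - H\bigl(P(\cdot \mid o)\bigr),
\]
where $P(\cdot)$ is the subjective probability over basis states induced by the a priori probability and the language, and $P(\cdot \mid o)$ is that probability conditioned on the observation $o$. On such a reading, an observation is informative to the extent that it reduces uncertainty about which minimal state obtains, which matches the intuition that a decisive experiment is highly informative.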
The developed formalism is then systematically applied to the general problems presented in the introduction. The relationship of scientific theories to empirical observations is discussed, and certain tacit, unstatable knowledge is shown to be necessary for fully comprehending the meaning of realistic theories. The idea that many common concepts can be specified only by drawing on knowledge obtained from an infinite number of observations is presented, and the problems of reductionism are examined in this context.
A definition of when one formal language can be considered to be more expressive than another is presented, and the change in the informativeness of an observation as language changes is investigated. In this regard it is shown that the information inherent in an observation may decrease for a more expressive language.
The general problem of induction and its relation to the scientific method are discussed. Two hypotheses concerning an individual’s selection of an optimal language for a particular domain of discourse are presented and specific examples from the introduction are examined.
Abstract:
Generally speaking, the term temporal logic refers to any system of rules and symbolism for representing and reasoning about propositions qualified in terms of time. In computer science, particularly in the domain of Artificial Intelligence, there are mainly two known approaches to the representation of temporal information: modal logic approaches, including tense logic and hybrid temporal logic, and predicate logic approaches, including the temporal argument method and reified temporal logic. On the one hand, while tense logic, hybrid temporal logic and the temporal argument method enjoy formal theoretical foundations, their expressiveness has been criticised as not powerful enough for representing general temporal knowledge; on the other hand, although reified temporal logic provides greater expressive power, most of the current systems following the temporal reification approach lack complete and sound axiomatic theories. With these observations in mind, this paper introduces a new reified temporal logic with a clear syntax and semantics in terms of a sound and complete axiomatic formalism, which retains all the expressive power of temporal reification.
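For readers unfamiliar with the two predicate-logic styles named above, the sketch below contrasts the temporal argument method (time added as an extra argument of each predicate) with a reified representation (the atemporal proposition becomes a term related to a time by a meta-predicate such as Holds). The encoding and the holds_during helper are invented for illustration and do not reproduce the syntax or axioms of the logic introduced in the paper.

```python
# Schematic contrast between the two predicate-logic styles named in the
# abstract; this is illustrative data-as-logic, not the paper's formalism.

# Temporal argument method: time is simply an extra argument of the predicate.
temporal_argument_facts = [
    ("on", "block_a", "block_b", "t1"),          # on(block_a, block_b, t1)
]

# Reified representation: the atemporal proposition is itself a term, and a
# meta-predicate (Holds) relates it to a time or interval. This is what lets
# the logic quantify over, and axiomatize, the propositions themselves.
reified_facts = [
    ("Holds", ("on", "block_a", "block_b"), ("t1", "t2")),   # Holds(on(a,b), <t1,t2>)
]

def holds_during(facts, proposition, interval):
    """Query the reified store: is the proposition asserted over the interval?"""
    return ("Holds", proposition, interval) in facts

print(holds_during(reified_facts, ("on", "block_a", "block_b"), ("t1", "t2")))  # True
```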
Abstract:
This paper explores a novel perspective on patient safety improvements, which draws on contemporary social network and learning theories. A case study was conducted at a Portuguese acute university hospital. Data collection followed a staged approach, whereby 46 interviews were conducted involving 49 respondents from a broad array of departments and professional backgrounds. This case study highlights the importance of two major interlinked factors in contributing to patient safety improvements. The first of these is the crucial role of formal and informal, internal and external social networks. The second is the importance, and the possible advantage, of combining formal and informal learning. The analysis suggests that initiatives rooted in formal learning approaches alone do not necessarily lead to the creation of long-term grounded internal safety networks, and that patient safety improvements can crucially depend on bottom-up initiatives of communities of practice and informal learning. Traditional research on patient safety places a strong emphasis on top-down and managerialist approaches and is often based on the assumption that 'safety' learning is primarily formal and context-independent. This paper suggests that bottom-up initiatives and a combination of formal and informal learning can make a major contribution to patient safety improvements.
Abstract:
Doctoral thesis, Health Sciences and Technologies (Legal Medicine and Forensic Sciences), Universidade de Lisboa, Faculdade de Medicina, 2014
Abstract:
A classical argument of de Finetti holds that Rationality implies Subjective Expected Utility (SEU). In contrast, the Knightian distinction between Risk and Ambiguity suggests that a rational decision maker would obey the SEU paradigm when the information available is in some sense good, and would depart from it when the information available is not good. Unlike de Finetti's, however, this view does not rely on a formal argument. In this paper, we study the set of all information structures that might be available to a decision maker, and show that they are of two types: those compatible with SEU theory and those for which SEU theory must fail. We also show that the former correspond to "good" information, while the latter correspond to information that is not good. Thus, our results provide a formalization of the distinction between Risk and Ambiguity. As a consequence of our main theorem (Theorem 2, Section 8), behavior not conforming to SEU theory is bound to emerge in the presence of Ambiguity. We give two examples of situations of Ambiguity: one concerns the uncertainty about the class of measure-zero events, the other is a variation on Ellsberg's three-color urn experiment. We also briefly link our results to two other strands of literature: the study of ambiguous events and the problem of unforeseen contingencies. We conclude the paper by reconsidering de Finetti's argument in light of our findings.
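The three-colour urn mentioned at the end is a variation on Ellsberg's classic example, whose standard form already shows why no single subjective probability can rationalize the usual preference pattern. The short derivation below is the textbook version of that argument, not the paper's own variation. An urn contains 30 red balls and 60 balls that are black or yellow in unknown proportion; consider the bets $f_R$ (win on red), $f_B$ (win on black), $f_{RY}$ (win on red or yellow) and $f_{BY}$ (win on black or yellow). The typical pattern is $f_R \succ f_B$ together with $f_{BY} \succ f_{RY}$. Under SEU with any probability $p$,
\[
f_R \succ f_B \;\Rightarrow\; p(R) > p(B),
\qquad
f_{BY} \succ f_{RY} \;\Rightarrow\; p(B) + p(Y) > p(R) + p(Y) \;\Rightarrow\; p(B) > p(R),
\]
a contradiction: no assignment of subjective probabilities to the ambiguous colours can reproduce the pattern, so behavior of this kind necessarily falls outside SEU.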
Abstract:
This dissertation examines different aspects involved in the formation of psychologists' expert opinion in the Portuguese criminal justice system, more precisely as this opinion is reflected in assessment reports. The dissertation comprises three qualitative studies. The first sought to provide a general portrait of a sample of 106 forensic psychological reports as to their overall quality, measured in terms of relevance and coherence. The results show that the formal markers of quality are present in the sample analysed, but a certain number of weaknesses were observed, notably concerning the internal coherence of the reports as well as the relevance of the information reported on. The second study explored the opinions of 17 Portuguese judges and state prosecutors concerning the use they make of this type of forensic report. It appears that they consider these reports to be useful and very credible, especially so when they have been produced under the auspices of the National Institute of Legal Medicine and Forensic Sciences, which is the state forensic institution. Furthermore, it appears that judges and prosecutors were particularly interested in data that allowed for a personalised portrait of the assessee. The third study sought to better comprehend the conceptual bases on which psychologists construct their reports. To this end, an exploratory study was undertaken with a sample of key actors; the analysis of their interviews shows that they define their judicial mandate, as well as the basic concepts associated with this mandate, in different ways. A theoretical framework provided by an implicit theories model was used to help make sense of these results.