899 results for "Almost always propositional logic"
Abstract:
This study explores the varied means through which ports have become increasingly entangled in the planning logic of the neoliberal, innovation-driven economy. The research topic belongs to the academic disciplines of economics and human geography. The aim of the thesis is to analyse how the notion of innovation, adopted in a variety of supranational and national port policy documents, is deployed in the operational port environments of two ports of the Baltic Sea Region: the port of Stockholm, Sweden, and the port of Klaipeda, Lithuania. This novel innovation agenda is visible in several topics I examine in the study, namely port governance, environmental issues, and the seaport/port-city interface. The gathered primary source material on port policy documents, strategies, development planning documents and reports is analysed using the qualitative content analysis research method. Moreover, the empirical part of the case study, that is, tracing innovation practices in mundane port activities, is based on qualitative semi-structured interviews with port authorities in Klaipeda and Stockholm, researchers and other port experts. I examine the interview material by employing the theoretical reading research method. In my analysis, I have reframed port-related policy development by tracing and identifying the transformation of ports from "functional terminals" to "engines for growth". My results show that this novel innovation-oriented rhetoric, imprinted in the "engines for growth" narrative, is often contested in daily port practices. In other words, my analysis reveals that the attitudes of port authorities and other port actors towards innovation do not necessarily correspond to the new narrative of innovation and do not always "fit" within a framework of neoliberal economic thinking that glorifies the "culture of innovations".
I argue that the ability to develop innovative initiatives in the ports of Klaipeda and Stockholm is strongly predetermined by local conditions, the port's governance model, the way port actors perceive the importance of innovation per se, demand factors, and new regulations.
Abstract:
Intelligence from a human source that is falsely thought to be true is potentially more harmful than a total lack of it. The veracity assessment of gathered intelligence is one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of these methods' applicability is lacking. There are several problems related to the efficacy of lie detection and veracity assessment. According to a conventional belief, an almighty lie detection method exists that is almost 100% accurate and suitable for any social encounter. However, scientific studies have shown that this is not the case and that popular approaches are often oversimplified. The main research question of this study was: what is the applicability of veracity assessment methods that are reliable and based on scientific proof, in terms of the following criteria?
- Accuracy, i.e. the probability of detecting deception successfully
- Ease of use, i.e. how easily the method can be applied correctly
- Time required to apply the method reliably
- No need for special equipment
- Unobtrusiveness of the method
To answer the main research question, the following supporting research questions were answered first: what kinds of interviewing and interrogation techniques exist, and how could they be used in the intelligence interview context? What kinds of lie detection and veracity assessment methods exist that are reliable and based on scientific proof? What kinds of uncertainty and other limitations do these methods involve? Two major databases, Google Scholar and Science Direct, were used to search for and collect existing topic-related studies and other papers. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis.
A Multi-Criteria Analysis utilizing the Analytic Hierarchy Process was conducted to compare scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria. In addition, a field study was arranged to gain first-hand experience of the applicability of different lie detection and veracity assessment methods. The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability. They were assessed to be the easiest and fastest to apply, and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence and the Criteria-Based Content Analysis were also found to be useful, but with some limitations. The Discourse Analysis and the Polygraph were assessed to be the least applicable. Results from the field study support these findings. However, it was also discovered that even the most applicable methods are not entirely trouble-free. In addition, this study highlighted that three channels of information (Content, Discourse and Nonverbal Communication) can be subjected to veracity assessment methods that are scientifically defensible. There is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach. There are no quick gains if high accuracy and reliability are desired. Since most current lie detection studies are built around a scenario in which roughly half of the assessed people are totally truthful and the other half are liars presenting a well-prepared cover story, it is proposed that future studies test lie detection and veracity assessment methods against partially truthful human sources.
This kind of test setup would highlight new challenges and opportunities for the use of existing and widely studied lie detection methods, as well as for the modern ones that are still under development.
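The Analytic Hierarchy Process step mentioned above can be made concrete with a minimal sketch. The criteria names and pairwise judgment values below are illustrative assumptions, not figures taken from the study; the priority weights are approximated by normalized row geometric means, a standard shortcut for the principal eigenvector of a reciprocal comparison matrix.

```python
from math import prod

# Hypothetical criteria and judgments, for illustration only.
CRITERIA = ["Accuracy", "Ease of Use", "Time Required"]

# Reciprocal judgment matrix: entry [i][j] states how much more important
# criterion i is than criterion j on Saaty's 1-9 scale.
JUDGMENTS = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   3.0],
    [1 / 5, 1 / 3, 1.0],
]

def ahp_priorities(matrix):
    """Approximate AHP priority weights by normalized row geometric means."""
    n = len(matrix)
    geo_means = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

weights = ahp_priorities(JUDGMENTS)
ranking = sorted(zip(CRITERIA, weights), key=lambda cw: -cw[1])
```

With these hypothetical judgments, Accuracy receives the largest weight; in a full AHP the methods would then be scored against each weighted criterion and a consistency ratio checked.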
Abstract:
C. S. Peirce's classifications of signs began to be developed in 1865 and extend until at least 1909. I present the period that begins in 1865 and contains two moments of intense production: "On a New List of Categories" and "On the Algebra of Logic: a contribution to the philosophy of notation". I then present the ten classes of signs, a morphology that appears in the "Syllabus of Certain Topics of Logic" and was developed from 1903 onwards. My purpose here is to familiarize the reader with Peirce's intricate classifications of signs.
Abstract:
The mental models theory predicts that, while conjunctions are easier than disjunctions for individuals, when denied, conjunctions are harder than disjunctions. Khemlani, Orenes, and Johnson-Laird showed that this prediction is correct in their 2014 work. In this paper, I analyze their results in order to check whether or not they really affect the mental logic theory. My conclusion is that, although Khemlani et al.'s study provides important findings, such findings do not necessarily lead to questioning or rejecting the mental logic theory.
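The asymmetry tested by Khemlani, Orenes, and Johnson-Laird has a simple propositional basis that can be made concrete: a denied conjunction is consistent with three truth assignments, whereas a denied disjunction is consistent with only one, and the mental models theory ties difficulty to the number of possibilities that must be kept in mind. A minimal enumeration sketch (illustrative, not code from the paper):

```python
from itertools import product

def models(formula):
    """All truth assignments (A, B) that satisfy the given formula."""
    return [(a, b) for a, b in product([True, False], repeat=2) if formula(a, b)]

denied_conjunction = models(lambda a, b: not (a and b))  # "not (A and B)"
denied_disjunction = models(lambda a, b: not (a or b))   # "not (A or B)"
```

The denied conjunction yields three models and the denied disjunction only one, which is why the theory predicts that denied conjunctions are the harder of the two.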
Abstract:
Violence has always been a part of the human experience and, therefore, a popular topic for research. It is a controversial issue, mostly because the possible sources of violent behaviour are so varied, encompassing both biological and environmental factors. However, very little disagreement is found regarding the severity of this societal problem. Most researchers agree that the number and intensity of aggressive acts among adults and children is growing. Not surprisingly, many educational policies, programs, and curricula have been developed to address this concern. The research favours programs which address the root causes of violence and seek to prevent, rather than provide consequences for, the undesirable behaviour. But what makes a violence prevention program effective? How should educators choose among the many curricula on the market? After reviewing the literature surrounding violence prevention programs and their effectiveness, the Second Step Violence Prevention Curriculum surfaced as unique in many ways. It was designed to address the root causes of violence in an active, student-centred way. Empathy training, anger management, interpersonal cognitive problem solving, and behavioural social skills form the basis of this program. Published in 1992, the program has been the topic of limited research, almost entirely carried out using quantitative methodologies. The purpose of this study was to understand what happens when the Second Step Violence Prevention Curriculum is implemented with a group of students and teachers. I was not seeking a statistical correlation between the frequency of violence and program delivery, as in most prior research. Rather, I wished to gain a deeper understanding of the impact of the program through the eyes of the participants. The Second Step Program was taught to a small, primary-level, general learning disabilities class by a teacher and a student teacher.
Data were gathered through interviews with the teachers, personal observations, staff reports, and my own journal. Common themes across the four types of data emerged during the study, and these themes were isolated and explored for meaning. Findings indicate that the program does not offer a "quick fix" to this serious problem. However, several important discoveries were made. The teachers felt that the program was effective despite a lack of concrete evidence to support this claim. They used the Second Step strategies outside their actual instructional time and felt it made them better educators and disciplinarians. The students did not display a marked change in their behaviour during or after the program implementation, but they were better able to speak about their actions, the source of their aggression, and the alternatives that were available. Although they were not yet transferring their knowledge into positive action, a heightened awareness was evident. Finally, staff reports and my own journal led me to a deeper understanding of how perception frames reality. The perception that the program was working led everyone to feel more empowered when a violent incident occurred, and efforts were made to address the cause rather than merely to offer consequences. A general feeling that we were addressing the problem in a productive way was prevalent among the staff and students involved. The findings from this investigation have many implications for research and practice. Further study into the realm of violence prevention is greatly needed, using a balance of quantitative and qualitative methodologies. Such a serious problem can only be effectively addressed with a greater understanding of its complexities. This study also demonstrates the overall positive impact of the Second Step Violence Prevention Curriculum and, therefore, supports its continued use in our schools.
Abstract:
A number of frameworks have been suggested for online retailing, but there is still little consensus among researchers and practitioners regarding the amount of information critical and essential to improving customers' satisfaction and purchase intention. Against this backdrop, this study contributes to current practical and theoretical discussions about how information search and perceived risk theories can be applied to the management of online retailer website features. This paper examines the moderating role of website personalization in the relationship between the information content provided on the top US retailers' websites and customer satisfaction and purchase intention. The study also explores the role played by customer satisfaction and purchase intention in the relationship between information that is personalized to the needs of individual customers and online retailers' sales performance. Results indicate that the extent of information content features presented to online customers alone is not enough for companies looking to satisfy and motivate customers to purchase. However, information that is targeted to an individual customer influences customer satisfaction and purchase intention, and customer satisfaction in turn serves as a driver of the retailer's online sales performance.
Abstract:
RelAPS is an interactive system that assists in proving relation-algebraic theorems. The aim of the system is to provide an environment in which a user can perform a relation-algebraic proof much as with pencil and paper. The previous version of RelAPS accepts only Horn formulas. To extend the system to first-order logic, we have defined and implemented a new language based on the theory of allegories, as well as a new calculus. The language has two different kinds of terms, object terms and relational terms: object terms are built from object constant symbols and object variables, while relational terms are built from typed relational constant symbols, typed relational variables, typed operation symbols and the regular operations available in any allegory. The calculus is a mixture of natural deduction and the sequent calculus. It is formulated in a sequent style, but with exactly one formula on the right-hand side. We have shown soundness and completeness of this new logic, which verifies that the underlying proof system of RelAPS works correctly.
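The shape of the calculus described above, arbitrarily many antecedent formulas but exactly one succedent formula, can be sketched as a data structure. This is a hypothetical illustration of that single-succedent judgment form, not the actual RelAPS representation (which uses typed relational terms rather than plain strings):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Sequent:
    """A single-succedent sequent: antecedents |- succedent."""
    antecedents: tuple  # zero or more assumption formulas
    succedent: str      # exactly one goal formula, by construction

    def __str__(self):
        return ", ".join(self.antecedents) + " |- " + self.succedent

# Hypothetical goal: from R <= S and S <= T, prove R <= T (relation inclusion).
goal = Sequent(("R <= S", "S <= T"), "R <= T")
```

Encoding the right-hand side as a single field, rather than a list, makes the "exactly one formula on the right" restriction hold by construction.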
Abstract:
Dynamic logic is an extension of modal logic originally intended for reasoning about computer programs. The method of proving correctness properties of a computer program using the well-known Hoare logic can be implemented by utilizing the robustness of dynamic logic. For a very broad range of languages and applications in program verification, a theorem prover named KIV (Karlsruhe Interactive Verifier) has already been developed. However, its high degree of automation and its complexity make it difficult to use for educational purposes. My research work is motivated by the design and implementation of a similar interactive theorem prover with educational use as its main design criterion. As the key purpose of this system is to serve as an educational tool, it is a self-explanatory system that explains every step of creating a derivation, i.e., proving a theorem. This deductive system is implemented in the platform-independent programming language Java. In addition, a very popular combination of a lexical analyzer generator, JFlex, and the parser generator BYacc/J has been used for parsing formulas and programs.
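The connection between Hoare logic and dynamic logic mentioned above is that a Hoare triple {phi} p {psi} corresponds to the dynamic-logic formula phi -> [p]psi, where the box modality [p]psi says psi holds after every terminating run of p. A minimal semantic sketch (a hypothetical toy model, not the KIV representation), with programs as nondeterministic state transformers:

```python
def box(program, phi, state):
    """Dynamic-logic box [program]phi: phi holds in every successor state."""
    return all(phi(s) for s in program(state))

# Toy program: nondeterministically add 1 or 2 to variable x.
def incr(state):
    return [{**state, "x": state["x"] + 1}, {**state, "x": state["x"] + 2}]

# Hoare triple {x >= 0} incr {x > 0} rendered as x >= 0 -> [incr](x > 0).
def hoare_valid(state):
    return (not state["x"] >= 0) or box(incr, lambda s: s["x"] > 0, state)
```

On the state x = 0, both successor states satisfy x > 0, so the boxed postcondition, and hence the triple, holds there.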
Abstract:
This paper intends to develop a coherent methodological framework concerned with the appraisal of scientific theories in economics, based on a postulated aim of science. We first define the scope of a methodological inquiry (a precise definition of what is meant by the logic of appraisal of scientific theories) and review the work of Popper and Lakatos in the philosophy of science. We then use their results to develop a rational structure of scientific activity. We identify and analyse both a micro and a macro framework for the process of appraisal, and single out the importance of so-called 'fundamental assumptions' in creating externalities in the appraisal process, which forces us to adopt a multi-level analysis. Special attention is given to the role and significance of the abstraction process and to the use of assumptions in general. The proposed structure of scientific activity is illustrated with examples from economics.
Abstract:
In this thesis, we propose a study of the social representations of politics in the daily press. Our object of study is the meaning of the nation in Quebec during a historical period in which the ideological notion of nation is a frame of reference undergoing profound change in many societies. More specifically, we want to situate ourselves at the centre of the tensions surrounding the national social representation, taking as our vantage point the federalist ideological work of Quebecers who both present themselves as promoters of the nation and aim at integration into another national and legal space: Canada. The results of this qualitative study come from a semantic analysis of the editorial discourse of the newspaper La Presse. We examined the different categories of knowledge mobilized when the national space is evoked, as well as the way they are organized within the discourse during the two referendum periods, 1980 and 1995. It is therefore within the framework of a sociology of journalistic knowledge that we conduct this study. Social discourse, approached through the theories of social representations and the sociology of media content, can only be considered through the whole set of social relations of which it is the product. We set out here to define, on the one hand, the specific features of editorial discourse and, on the other, the different meaningful categories of knowledge used in our corpus. A diachronic description then reveals the evolution of the social representations relating to the Quebec national space between the two periods studied. After defining what is being talked about when the national space is at issue, we analyse the way this discourse is organized.
We thus highlight, on the one hand, the various discursive, rhetorical and argumentative forms deployed in order to persuade and to justify action (the rejection of the two referendums and adherence to the promises of a renewed federalism) and, on the other hand, the discursive logic mobilized, which consists in positioning the nation as a rational political object or not. Indeed, the editorial discourse allows us to bring to light a cognitive organization of knowledge which, with a few nuances, is structured in a Manichean way between the rational (the editorialist, federalism, the economy, universalism, common-sense reason) and the irrational (sovereignism, its leaders being mere dreamers and enthusiasts), thereby placing itself in a mode of political communication closer to propaganda than to the reflexive exemplarity that editorial discourse claims for itself.
Abstract:
Cross-contamination control measures are mainly concentrated in the dental operatory, while the items transferred between the clinic and the dental laboratory, as well as laboratory instruments, have received little attention. This study aims to document the application of asepsis measures to these items by dental professionals, as well as their perceptions surrounding these measures. A self-administered, anonymous questionnaire was sent to a random sample of the dentists, denturists and dental laboratory directors registered with their professional orders in June 2008 in the province of Quebec. Of the 1,100 questionnaires sent, 376 were returned completed. Almost three quarters (72.1%) of respondents report performing asepsis of laboratory instruments, and 74.9% report disinfecting transferred items, but with percentages that vary by item group (impressions, prostheses, etc.). Only 9.1% of professionals generally label the disinfected items before shipment. More than half of the professionals (51.4%) find that they do not have enough information on the asepsis of transferred items, and 62.4% find it difficult to apply. This study is the first conducted among the three groups of professionals and the first to study their perceptions surrounding the asepsis of transferred items and laboratory instrumentation. We have shown that the application of asepsis measures to these items by dental professionals does not always comply with the recommended standards and that there is a need to reinforce their application, especially with regard to transferred items.