997 results for Axioms of Huzita-Hatori
Abstract:
We explore the role of deeply held beliefs, known as social axioms, in the context of employee–organization relationships. Specifically, we examine how the beliefs identified as social cynicism and reward for application moderate the relationship between employees' work-related experiences, perceptions of CSR, attitudes, and behavioral intentions toward their firm. Using a sample of 130 retail employees, we find that CSR affects employees low on social cynicism more positively and reduces their distrust more than it does for cynical employees. Employees exhibiting strong reward for application are less positively affected by CSR, whereas their experiences of other work-related factors are more likely to reduce distrust. Our findings suggest the need for a differentiated view of CSR in the context of employee studies, and we offer suggestions for future research and management practice.
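As a reading aid only: moderation effects of this kind are conventionally tested with an interaction term in a regression model. A minimal sketch in Python, with invented variable names and synthetic data rather than the study's dataset:

```python
# Hypothetical sketch of a moderation analysis like the one described:
# does social cynicism moderate the effect of perceived CSR on distrust?
# Variable names and data are illustrative, not the study's dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 130  # sample size reported in the abstract
csr = rng.normal(size=n)        # perceived CSR
cynicism = rng.normal(size=n)   # social cynicism
# Assumed data-generating process: CSR lowers distrust,
# but less so for highly cynical employees.
distrust = 0.5 - 0.4 * csr + 0.3 * cynicism + 0.2 * csr * cynicism \
           + rng.normal(scale=0.5, size=n)
df = pd.DataFrame({"csr": csr, "cynicism": cynicism, "distrust": distrust})

# "csr * cynicism" expands to both main effects plus their interaction;
# a significant interaction coefficient indicates moderation.
model = smf.ols("distrust ~ csr * cynicism", data=df).fit()
print(model.summary())
```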
Abstract:
Policy hierarchies and automated policy refinement are powerful approaches to simplifying the administration of security services in complex network environments. A crucial issue for the practical use of these approaches is ensuring the validity of the policy hierarchy: since the policy sets for the lower levels are automatically derived from the abstract policies (defined by the modeller), we must be sure that the derived policies uphold the high-level ones. This paper builds upon previous work on Model-based Management, particularly on the Diagram of Abstract Subsystems approach, and goes further to propose a formal validation approach for the policy hierarchies yielded by the automated policy refinement process. We establish general validation conditions for a multi-layered policy model, i.e. necessary and sufficient conditions that a policy hierarchy must satisfy so that the lower-level policy sets are valid refinements of the higher-level policies according to the criteria of consistency and completeness. Relying on the validation conditions and on axioms about the model's representativeness, two theorems are proved to ensure compliance between the resulting system behaviour and the abstract policies that are modelled.
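Only as a schematic gloss (the paper states its own formal conditions over the multi-layered policy model), the two refinement criteria for a high-level policy set P_H and a derived low-level set P_L can be read as:

```latex
% Schematic gloss of the two validation criteria; the paper's actual
% necessary-and-sufficient conditions are stated over its policy model.
\text{(consistency)}\quad \nexists\, p \in P_L \;\text{such that}\; P_H \cup \{p\} \vdash \bot
\qquad
\text{(completeness)}\quad \forall\, q \in P_H :\; P_L \vdash q
```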
Abstract:
Distributed Software Development (DSD) is a development strategy that meets globalization's demands for increased productivity and reduced cost. However, temporal distance, geographical dispersion, and socio-cultural differences introduce new challenges and, especially, add new requirements for the communication, coordination, and control of projects. Among these new demands is the need for a software process that adequately supports distributed software development. This paper presents an integrated approach to software development and testing that accounts for the peculiarities of distributed teams. The purpose of the approach is to support DSD by providing better project visibility, improving communication between the development and test teams, and reducing the ambiguity and difficulty of understanding artifacts and activities. The integrated approach was conceived on four pillars: (i) identifying the DSD peculiarities that affect development and test processes; (ii) defining the elements needed to compose an integrated development-and-test approach that supports distributed teams; (iii) describing and specifying the workflows, artifacts, and roles of the approach; and (iv) representing the approach in a way that enables its effective communication and understanding.
Abstract:
The goal of the present research is to define a Semantic Web framework for precedent modelling, using knowledge extracted from text, metadata, and rules, while maintaining a strong text-to-knowledge morphism between legal text and legal concepts, in order to fill the gap between a legal document and its semantics. The framework is composed of four models that use standard languages from the Semantic Web stack of technologies: a document metadata structure, modelling the main parts of a judgement and creating a bridge between a text and its semantic annotations of legal concepts; a legal core ontology, modelling abstract legal concepts and institutions contained in a rule of law; a legal domain ontology, modelling the main legal concepts in a specific domain concerned by case-law; and an argumentation system, modelling the structure of argumentation. The input to the framework includes metadata associated with judicial concepts and an ontology library representing the structure of case-law. The research relies on the community's previous efforts in legal knowledge representation and rule interchange for applications in the legal domain, applying the theory to a set of real legal documents and stressing OWL axiom definitions as much as possible so that they provide a semantically powerful representation of the legal document and a solid ground for an argumentation system using a defeasible subset of predicate logic. It appears that some new features of OWL 2 unlock useful reasoning features for legal knowledge, especially if combined with defeasible rules and argumentation schemes. The main task is thus to formalize the legal concepts and argumentation patterns contained in a judgement, with the following requirement: to check, validate, and reuse the discourse of a judge, and the argumentation he produces, as expressed by the judicial text.
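One of the OWL 2 features alluded to, property chains, can be illustrated with a hypothetical legal axiom in description-logic notation; the role names are invented for illustration and are not taken from the framework:

```latex
% Hypothetical OWL 2 property-chain axiom: a judgement that cites a
% precedent which establishes a principle thereby applies that principle.
\mathit{cites} \circ \mathit{establishes} \sqsubseteq \mathit{applies}
```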
Abstract:
We define an applicative theory of truth TPT which proves totality exactly for the polynomial time computable functions. TPT has natural and simple axioms, since nearly all of its truth axioms are standard for truth theories over an applicative framework. The only exception is the axiom dealing with the word predicate: the truth predicate can only reflect elementhood in the words for terms whose length is smaller than a given word. This makes it possible to achieve the theory's very low proof-theoretic strength. Truth induction can be allowed without any constraints. For these reasons, the system TPT has the high expressive power one expects from truth theories; it allows embeddings of feasible systems of explicit mathematics and bounded arithmetic. The proof that the theory TPT is feasible is not easy, since a standard realisation approach cannot be applied. We therefore develop a new realisation approach whose realisation functions work on directed acyclic graphs, which lets us express and manipulate realisation information more efficiently.
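Schematically, in one common formulation of provable totality for applicative theories (writing W for the word predicate), the characterisation described above reads:

```latex
% Schematic statement of the characterisation; the paper's precise
% formulation of provable totality may differ in detail.
\mathrm{TPT} \vdash \forall x\, \bigl( W(x) \rightarrow W(f\,x) \bigr)
\quad\Longleftrightarrow\quad
f \ \text{is polynomial time computable}
```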
Abstract:
Large numbers of microorganisms colonise the skin and mucous membranes of animals, with their highest density in the lower gastrointestinal tract. The impact of these microbes on the host can be demonstrated by comparing animals (usually mice) housed under germ-free conditions, or colonised with different compositions of microbes. Inbreeding and embryo manipulation programs have generated a wide variety of mouse strains with a fixed germ-line (isogenic), and hygiene comparisons robustly show remarkably strong interactions between the microbiota and the host, which can be summarised in three axioms. (I) Live microbes are largely confined to their own spaces at body surfaces, provided the animal is not suffering from an infection. (II) There is promiscuous molecular exchange between the host and its microbiota in both directions [1]. (III) Every host organ system is profoundly shaped by the presence of body-surface microbes. It follows that one must draw a line between live microbial and host “spaces” (I) to understand the crosstalk (II and III) at this interesting interface of the host-microbial superorganism. Of course, since microbes can adapt to very different niches, there has to be more than one line. In this issue of EMBO Reports, Johansson and colleagues have studied mucus, which is the main physical frontier for most microbes in the intestinal tract: they report how different non-pathogenic microbiota compositions affect its permeability and the functional protection of the epithelial surface [2].
Abstract:
In this paper we define the notion of an axiom dependency hypergraph, which explicitly represents how axioms are included in a module by the algorithm for computing locality-based modules. A locality-based module of an ontology corresponds to a set of connected nodes in the hypergraph, and atoms of an ontology correspond to strongly connected components. Collapsing the strongly connected components into single nodes yields a condensed hypergraph that comprises a representation of the atomic decomposition of the ontology. To speed up the condensation of the hypergraph, we first reduce its size by collapsing the strongly connected components of its graph fragment using a linear-time graph algorithm. This approach significantly reduces the time needed for computing the atomic decomposition of an ontology. We provide an experimental evaluation for computing the atomic decomposition of large biomedical ontologies. We also demonstrate a significant improvement in the time needed to extract locality-based modules from an axiom dependency hypergraph and its condensed version.
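The pre-collapsing step on the graph fragment is standard strongly-connected-component condensation. A minimal, self-contained Python sketch of that step (not the authors' implementation, and omitting the hyperedge machinery):

```python
# Minimal sketch of strongly-connected-component condensation on an
# ordinary directed graph, as used to pre-collapse nodes before the
# hypergraph condensation; not the authors' implementation.
from collections import defaultdict

def tarjan_scc(graph):
    """Return the SCCs of a graph given as a dict mapping each node
    to a list of successors (every node present as a key)."""
    index, low, on_stack = {}, {}, set()
    stack, sccs, counter = [], [], [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, ()):
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:  # v is the root of an SCC
            scc = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                scc.append(w)
                if w == v:
                    break
            sccs.append(scc)

    for v in list(graph):
        if v not in index:
            strongconnect(v)
    return sccs

def condense(graph):
    """Collapse each SCC into a single node and return the condensed graph."""
    comp = {}
    for i, scc in enumerate(tarjan_scc(graph)):
        for v in scc:
            comp[v] = i
    condensed = defaultdict(set)
    for v, ws in graph.items():
        for w in ws:
            if comp[v] != comp[w]:
                condensed[comp[v]].add(comp[w])
    return dict(condensed)

# Axioms 0, 1, 2 depend on each other cyclically (one atom); 3 depends on the cycle.
g = {0: [1], 1: [2], 2: [0], 3: [0]}
print(tarjan_scc(g))  # e.g. [[2, 1, 0], [3]]
print(condense(g))    # the condensed dependency structure
```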
Abstract:
Residuated lattices, although originally considered in the realm of algebra, where they provide a general setting for studying ideals in ring theory, were later shown to form algebraic models for substructural logics. The latter are non-classical logics that include intuitionistic, relevance, many-valued, and linear logic, among others. Most of the important examples of substructural logics are obtained by adding structural rules to the basic logical calculus.
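For reference, the defining adjunction of a residuated lattice (standard in the literature, not specific to this text), with monoid operation · and left and right residuals \ and /:

```latex
% The residuation law characterising residuated lattices.
x \cdot y \le z
\quad\Longleftrightarrow\quad
y \le x \backslash z
\quad\Longleftrightarrow\quad
x \le z / y
```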
Abstract:
Construction of an international index of standards of living, incorporating social indicators and economic output, typically involves scaling and weighting procedures that lack welfare-economic foundations. Revealed preference axioms can be used to make quality-of-life comparisons if we can estimate the representative household's production technology for the social indicators. This method is applied to comparisons of gross domestic product (GDP) and life expectancy for 58 countries. Neither GDP rankings nor the rankings of the Human Development Index (HDI) are consistent with the partial ordering of revealed preference. A method of constructing a utility-consistent index incorporating both consumption and life expectancy is suggested.
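The revealed-preference partial order referred to here rests on a simple affordability test: country A is (weakly) revealed preferred to B if B's bundle would have been affordable at A's prices. A toy Python sketch with invented numbers, not the paper's 58-country data:

```python
# Toy illustration of a revealed-preference comparison between countries,
# each described by a bundle (consumption, life expectancy) and shadow
# prices for those goods.  Numbers are invented, not the paper's data.

def revealed_preferred(bundle_a, bundle_b, prices_a):
    """A is (weakly) revealed preferred to B if B's bundle was affordable
    at A's prices but A chose its own bundle instead."""
    cost = lambda bundle: sum(p * q for p, q in zip(prices_a, bundle))
    return cost(bundle_b) <= cost(bundle_a)

# (consumption per head, life expectancy) and hypothetical shadow prices
a = ((30_000, 78), (1.0, 400.0))   # country A: bundle, prices
b = ((25_000, 80), (1.0, 350.0))   # country B: bundle, prices

a_over_b = revealed_preferred(a[0], b[0], a[1])
b_over_a = revealed_preferred(b[0], a[0], b[1])
print(a_over_b, b_over_a)  # if both held, the two rankings would conflict

# GDP alone would rank A above B, but the revealed-preference partial
# order may leave the pair incomparable, which is the abstract's point.
```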
Abstract:
Nitrogen substitution in carbon materials is investigated theoretically using the density functional theory method. Our calculations show that nitrogen substitution decreases the hydrogen adsorption energy if hydrogen atoms are adsorbed on both the nitrogen atoms and the neighboring carbon atoms. On the contrary, the hydrogen adsorption energy can be increased if hydrogen atoms are adsorbed only on the neighboring carbon atoms. The reason can be explained by an electronic-structure analysis of N-substituted graphene sheets: nitrogen substitution reduces the π-electron conjugation and increases the HOMO energy of a graphene sheet, and the nitrogen atom is not stable due to its trivalent character. This raises an interesting research topic on optimizing the degree of N-substitution, which is important to many applications such as hydrogen storage and tokamak devices. The electronic-structure studies also explain why nitrogen substitution increases the capacitance but decreases the electronic conductivity of carbon electrodes, as observed in our supercapacitor experiments.
Abstract:
Boron substitution in carbon materials has been comprehensively investigated using the density functional theory method. It was found that there is a correlation between the stability of the graphene sheet, the distribution of π electrons, the electrostatic potential, and the capability for hydrogen-atom adsorption. Boron substitution destabilizes the graphene structure, increases the electron wave density around the substitutional boron atoms, and lowers the electrostatic potential, thus improving the hydrogen adsorption energy on carbon. However, this improvement is only ca. 10-20%, not a factor of 4 or 5. Our calculations also show that models with two substitutional boron atoms give consistent and reliable results, whereas a single substitutional boron atom leads to contradictory conclusions. This is a warning to other computational chemists working on boron substitution: conclusions drawn from a single substitutional boron atom might not be reliable.
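The hydrogen adsorption energies compared in the two abstracts above are conventionally computed as a difference of DFT total energies; one common convention (the papers' exact sign and reference choices may differ) is:

```latex
% One common definition of the per-atom hydrogen adsorption energy on a
% (substituted) graphene sheet; sign and reference conventions vary.
E_{\mathrm{ads}} = \frac{1}{n}\left[ E(\text{sheet} + n\mathrm{H}) - E(\text{sheet}) - \frac{n}{2}\,E(\mathrm{H_2}) \right]
```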
Abstract:
Research in the present thesis focuses on the norms, strategies, and approaches which translators employ when translating humour in Children's Literature from English into Greek. It is based on process-oriented descriptive translation studies, since the focus is on investigating the process of translation. Viewing translation as a cognitive process and a problem-solving activity, this thesis employs Think-Aloud Protocols (TAPs) in order to investigate translators' minds. As it is not possible to directly observe the human mind at work, an attempt is made to ask the translators themselves to reveal their mental processes in real time by verbalising their thoughts while carrying out a translation task involving humour. In this study, thirty participants at three different levels of expertise in translation competence, i.e. ten beginner, ten competent, and ten expert translators, were requested to translate two humorous extracts from the fictional diary novel The Secret Diary of Adrian Mole, Aged 13 ¾ by Sue Townsend (1982) from English into Greek. As they translated, they were asked to verbalise their thoughts and, whenever possible, give reasons for them, so that their strategies and approaches could be detected and, subsequently, the norms that govern these strategies and approaches could be revealed. The thesis consists of four parts: the introduction, the literature review, the study, and the conclusion, and is developed in eleven chapters. The introduction contextualises the study within Translation Studies (TS) and presents its rationale, research questions, aims, and significance. Chapters 1 to 7 present an extensive and inclusive literature review identifying the principal axioms that guide and inform the study. These seven chapters critically introduce the following areas: Children's Literature (Chapter 1), Children's Literature Translation (Chapter 2), Norms in Children's Literature (Chapter 3), Strategies in Children's Literature (Chapter 4), Humour in Children's Literature Translation (Chapter 5), Development of Translation Competence (Chapter 6), and Translation Process Research (Chapter 7). In Chapters 8-11 the fieldwork is described in detail. The pilot and the main study are described with reference to the environments and settings, the participants, the researcher-observer, the data and its analysis, and the limitations of the study. The findings of the study are presented and analysed in Chapter 9. Three models are then suggested for systematising translators' norms, strategies, and approaches, thus filling the existing gap in the field. Pedagogical norms (e.g. appropriateness/correctness, familiarity, simplicity, comprehensibility, and toning down), literary norms (e.g. sound of language and fluency), and source-text norms (e.g. equivalence) were revealed to be the most prominent general and specific norms governing the translators' strategies and approaches in the process of translating humour in Children's Literature. The data also revealed that monitoring and communication strategies (e.g. additions, omissions, and exoticism) were the prevalent strategies employed by translators. In Chapter 10 the main findings and potential beneficial outcomes are discussed on the basis of the research questions and aims of the study, and the implications of the study are tackled in Chapter 11. In the conclusion, suggestions for future directions are given and final remarks noted.