907 results for Triple Consistency Principle
Abstract:
BUILD is a tool for keeping modular systems in a consistent state by managing the construction tasks (e.g., compilation, linking) associated with such systems. It employs a user-supplied system model and a procedural description of the requested task in order to carry that task out. This differs from existing tools, which do not explicitly separate knowledge about systems from knowledge about how systems are manipulated. BUILD provides a static framework for modeling systems and handling construction requests that makes use of programming-environment-specific definitions. By altering the set of definitions, BUILD can be extended to work with new programming environments and to perform new tasks.
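To make the separation the abstract describes concrete, here is a minimal sketch in Python (not BUILD's own notation); the names SystemModel, perform, and the "compile" task are hypothetical illustrations, not BUILD's actual interface. The system model is purely declarative, while task definitions are registered separately and can be swapped to target a different programming environment.

```python
# Hypothetical sketch of separating a declarative system model from
# environment-specific task definitions; none of these names come from BUILD.
from typing import Callable, Dict, List, Sequence

class SystemModel:
    """Declarative description of modules and their dependencies."""
    def __init__(self) -> None:
        self.deps: Dict[str, List[str]] = {}

    def module(self, name: str, depends_on: Sequence[str] = ()) -> None:
        self.deps[name] = list(depends_on)

# Environment-specific task definitions, kept apart from the model.
TaskFn = Callable[[str], None]
TASKS: Dict[str, TaskFn] = {}

def task(name: str):
    def register(fn: TaskFn) -> TaskFn:
        TASKS[name] = fn
        return fn
    return register

@task("compile")
def compile_module(module: str) -> None:
    print(f"compiling {module}")          # stand-in for invoking a real compiler

def perform(model: SystemModel, task_name: str, target: str, done=None) -> None:
    """Walk the model's dependency graph and apply the requested task."""
    done = set() if done is None else done
    if target in done:
        return
    for dep in model.deps.get(target, []):
        perform(model, task_name, dep, done)
    TASKS[task_name](target)
    done.add(target)

if __name__ == "__main__":
    m = SystemModel()
    m.module("util")
    m.module("app", depends_on=["util"])
    perform(m, "compile", "app")          # compiles util, then app
```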
Abstract:
A European Perspective on the Precautionary Principle, Food Safety and the Free Trade Imperative of the WTO. European Law Review, Vol. 27, No. 2, April 2002, pp. 138-155. RAE2008
Abstract:
Olusanya, Olaoluwa, 'Rethinking cognition as the sole basis for determining Criminal Liability under the Manifest Illegality Principle', in: 'Rethinking International Criminal Law: The Substantive Part', Europa Law Publishing, pp. 67-87, 2007. RAE2008
Abstract:
Williams, Glenys, 'The Principle of Double Effect and Terminal Sedation', Medical Law Review, 9 (2001), pp. 41-53. RAE2008
Abstract:
BACKGROUND: In the current climate of high-throughput computational biology, the inference of a protein's function from related measurements, such as protein-protein interaction relations, has become a canonical task. Most existing technologies pursue this task as a classification problem, on a term-by-term basis, for each term in a database such as the Gene Ontology (GO) database, a popular rigorous vocabulary for biological functions. However, ontology structures are essentially hierarchies, with certain top-to-bottom annotation rules which protein function predictions should in principle follow. Currently, the most common approach to imposing these hierarchical constraints on network-based classifiers is to apply transitive closure to their predictions.

RESULTS: We propose a probabilistic framework to integrate information in relational data, in the form of a protein-protein interaction network, and a hierarchically structured database of terms, in the form of the GO database, for the purpose of protein function prediction. At the heart of our framework is a factorization of local neighborhood information in the protein-protein interaction network across successive ancestral terms in the GO hierarchy. We introduce a classifier within this framework, with a computationally efficient implementation, that produces GO-term predictions that naturally obey a hierarchical 'true-path' consistency from root to leaves, without the need for further post-processing.

CONCLUSION: A cross-validation study, using data from the yeast Saccharomyces cerevisiae, shows that our method offers substantial improvements over both standard 'guilt-by-association' (i.e., Nearest-Neighbor) and more refined Markov random field methods, whether in their original form or when post-processed to artificially impose 'true-path' consistency. Further analysis of the results indicates that these improvements are associated with increased predictive capabilities (i.e., increased positive predictive value), and that this increase holds uniformly across GO-term depth. Additional in silico validation on a collection of new annotations recently added to GO confirms the advantages suggested by the cross-validation study. Taken as a whole, our results show that a hierarchical approach to network-based protein function prediction, one that exploits the ontological structure of protein annotation databases in a principled manner, can offer substantial advantages over the successive application of 'flat' network-based methods.
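As an illustration of the 'true-path' rule the abstract refers to (not the authors' probabilistic classifier), the sketch below caps each GO term's predicted score by the scores of its ancestors, so that a prediction for a child term never exceeds the prediction for its parent; the term IDs, scores, and function names are made up.

```python
# Hypothetical illustration of enforcing 'true-path' consistency on per-term
# scores: a child term's score may not exceed any ancestor's score.
from typing import Dict, Optional

def enforce_true_path(scores: Dict[str, float],
                      parent: Dict[str, Optional[str]]) -> Dict[str, float]:
    """Cap each term's score by its ancestors' scores (root-to-leaf)."""
    def capped(term: str) -> float:
        p = parent.get(term)
        if p is None:                      # reached the ontology root
            return scores[term]
        return min(scores[term], capped(p))
    return {term: capped(term) for term in scores}

if __name__ == "__main__":
    parent = {"GO:root": None, "GO:a": "GO:root", "GO:a1": "GO:a"}
    raw = {"GO:root": 0.9, "GO:a": 0.4, "GO:a1": 0.7}   # violates the rule
    print(enforce_true_path(raw, parent))               # GO:a1 is capped at 0.4
```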
Abstract:
An iterative method for reconstructing a 3D polygonal mesh and color texture map from multiple views of an object is presented. In each iteration, the method first estimates a texture map given the current shape estimate. The texture map and its associated residual error image are obtained via maximum a posteriori estimation and reprojection of the multiple views into texture space. Next, the surface shape is adjusted to minimize the residual error in texture space. The surface is deformed towards a photometrically consistent solution via a series of 1D epipolar searches at randomly selected surface points. The texture-space formulation has improved computational complexity over standard image-based error approaches, and allows computation of the reprojection error and uncertainty for any point on the surface. Moreover, shape adjustments can be constrained such that the recovered model's silhouette matches those of the input images. Experiments with real-world imagery demonstrate the validity of the approach.
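The following is a rough sketch of the texture-space error idea under assumed inputs (per-view color samples and confidence weights for a single texel); it uses a weighted mean as a stand-in for the paper's maximum a posteriori estimate, and the residual it returns is the kind of quantity the shape adjustment would then try to reduce.

```python
# Hypothetical texel-level computation: estimate a texture value from the
# views that see the texel, and report the residual reprojection error.
from typing import Tuple
import numpy as np

def texel_estimate_and_residual(samples: np.ndarray,
                                weights: np.ndarray) -> Tuple[np.ndarray, float]:
    """samples: (n_views, 3) colors of one texel; weights: (n_views,) confidences."""
    w = weights / weights.sum()
    color = (w[:, None] * samples).sum(axis=0)      # weighted mean, a MAP stand-in
    residual = float((w * np.linalg.norm(samples - color, axis=1)).sum())
    return color, residual

if __name__ == "__main__":
    views = np.array([[200, 100, 90], [205, 98, 95], [180, 120, 80]], dtype=float)
    confidence = np.array([1.0, 1.0, 0.5])
    print(texel_estimate_and_residual(views, confidence))
```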
Abstract:
Space carving has emerged as a powerful method for multiview scene reconstruction. Although a wide variety of methods have been proposed, the quality of the reconstruction remains highly dependent on the photometric consistency measure and the threshold used to carve away voxels. In this paper, we present a novel photo-consistency measure that is motivated by a multiset variant of the chamfer distance. The new measure is robust to high amounts of within-view color variance and also takes into account the projection angles of back-projected pixels. Another critical issue in space carving is the selection of the photo-consistency threshold used to determine which surface voxels are kept or carved away. In this paper, a reliable threshold selection technique is proposed that examines the photo-consistency values at contour generator points. Contour generators are points that lie on both the surface of the object and the visual hull. To determine the threshold, a percentile ranking of the photo-consistency values of these generator points is used. This improved technique is applicable to a wide variety of photo-consistency measures, including the new measure presented in this paper. Also presented is a method to choose between photo-consistency measures and voxel-array resolutions prior to carving, using receiver operating characteristic (ROC) curves.
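A hedged sketch of the threshold-selection step described above: collect the photo-consistency values observed at contour generator points, which are known to lie on the true surface, and take a percentile of them as the carving threshold. The percentile, the cost convention (lower is more consistent), and the numbers are illustrative, not taken from the paper.

```python
# Hypothetical percentile-based threshold selection from contour generator points.
import numpy as np

def select_threshold(generator_consistency: np.ndarray,
                     percentile: float = 90.0) -> float:
    """Return the photo-consistency cost below which `percentile`% of
    contour-generator points fall; voxels costlier than this get carved."""
    return float(np.percentile(generator_consistency, percentile))

if __name__ == "__main__":
    values = np.array([0.8, 1.1, 0.9, 1.5, 0.7, 1.0])   # made-up consistency costs
    threshold = select_threshold(values, 90.0)
    voxel_cost = 1.6
    print("carve" if voxel_cost > threshold else "keep", round(threshold, 3))
```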
Abstract:
With web caching and cache-related services like CDNs and edge services playing an increasingly significant role in the modern internet, the weak consistency and coherence provisions in current web protocols are becoming an increasingly pressing problem and drawing the attention of the standards community [LCD01]. Toward this end, we present definitions of consistency and coherence for web-like environments, that is, distributed client-server information systems where the semantics of interactions with resources are more general than the read/write operations found in memory hierarchies and distributed file systems. We then present a brief review of proposed mechanisms which strengthen the consistency of caches in the web, focusing upon their conceptual contributions and their weaknesses in real-world practice. These insights motivate a new mechanism, which we call "Basis Token Consistency" or BTC; when implemented at the server, this mechanism allows any client (independent of the presence and conformity of any intermediaries) to maintain a self-consistent view of the server's state. This is accomplished by annotating responses with additional per-resource application information, which allows client caches to recognize the obsolescence of currently cached entities and to identify responses from other caches that are stale in light of what has already been seen. The mechanism requires no deviation from the existing client-server communication model, and does not require servers to maintain any additional per-client state. We discuss how our mechanism could be integrated into a fragment-assembling Content Management System (CMS), and present a simulation-driven performance comparison between the BTC algorithm and the use of the Time-To-Live (TTL) heuristic.
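The toy sketch below is one interpretation of that bookkeeping, not the BTC specification itself: the server annotates each response with version tokens for the basis resources it was built from, and the client cache flags a cached entity as obsolete once any later response reveals a newer token for one of those resources. All class and field names are hypothetical.

```python
# Hypothetical client-side token tracking in the spirit of the mechanism above.
from typing import Dict

class ClientCache:
    def __init__(self) -> None:
        self.newest_seen: Dict[str, int] = {}   # basis resource -> newest version seen

    def observe(self, basis_tokens: Dict[str, int]) -> None:
        """Record the tokens carried by any response passing through the cache."""
        for resource, version in basis_tokens.items():
            self.newest_seen[resource] = max(self.newest_seen.get(resource, version),
                                             version)

    def is_stale(self, cached_tokens: Dict[str, int]) -> bool:
        """A cached entity is stale if a newer version of any basis resource was seen."""
        return any(self.newest_seen.get(resource, version) > version
                   for resource, version in cached_tokens.items())

if __name__ == "__main__":
    cache = ClientCache()
    cached_page = {"/price-list": 3, "/banner": 7}   # tokens on a cached entity
    cache.observe(cached_page)
    cache.observe({"/price-list": 4})                # a later response reveals v4
    print(cache.is_stale(cached_page))               # True: the cached page is obsolete
```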
Abstract:
This position paper outlines a new network architecture, i.e., a style of construction that identifies the objects and how they relate. We do not specify particular protocol implementations or specific interfaces and policies. After all, it should be possible to change protocols in an architecture without changing the architecture. Rather we outline the repeating patterns and structures, and how the proposed model would cope with the challenges faced by today's Internet (and that of the future). Our new architecture is based on the following principle: Application processes communicate via a distributed inter-process communication (IPC) facility. The application processes that make up this facility provide a protocol that implements an IPC mechanism, and a protocol for managing distributed IPC (routing, security and other management tasks). Existing implementation strategies, algorithms, and protocols can be cast and used within our proposed new structure.
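A toy sketch of the stated principle, with hypothetical names: application processes talk only to a generic IPC facility, and each facility may itself be built on a lower-level facility, so the same structure repeats at every layer. This is an illustration of the architectural pattern, not a protocol specification.

```python
# Hypothetical recursive IPC facility: deliver locally, else delegate downward.
class IPCFacility:
    """Distributed IPC as seen by the application processes that use it."""
    def __init__(self, name, lower=None):
        self.name = name
        self.lower = lower       # the IPC facility this one is built on, if any
        self.registry = {}       # application name -> message callback

    def register(self, app_name, on_message):
        self.registry[app_name] = on_message

    def send(self, dest, payload):
        # Management tasks (routing, security, ...) would live here; this toy
        # version only delivers locally or delegates to the lower facility.
        if dest in self.registry:
            self.registry[dest](payload)
        elif self.lower is not None:
            self.lower.send(dest, payload)

if __name__ == "__main__":
    backbone = IPCFacility("backbone")
    edge = IPCFacility("edge", lower=backbone)
    backbone.register("printer", lambda msg: print("printer got:", msg))
    edge.send("printer", "hello")   # relayed through the lower IPC facility
```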
Abstract:
The original solution to the high failure rate of software development projects was the imposition of an engineering approach to software development, with processes aimed at providing a repeatable structure to maintain a consistency in the ‘production process’. Despite these attempts at addressing the crisis in software development, others have argued that the rigid processes of an engineering approach did not provide the solution. The Agile approach to software development strives to change how software is developed. It does this primarily by relying on empowered teams of developers who are trusted to manage the necessary tasks, and who accept that change is a necessary part of a development project. The use of, and interest in, Agile methods in software development projects has expanded greatly, yet this has been predominantly practitioner driven. There is a paucity of scientific research on Agile methods and how they are adopted and managed. This study aims at addressing this paucity by examining the adoption of Agile through a theoretical lens. The lens used in this research is that of double loop learning theory. The behaviours required in an Agile team are the same behaviours required in double loop learning; therefore, a transition to double loop learning is required for a successful Agile adoption. The theory of triple loop learning highlights that power factors (or power mechanisms in this research) can inhibit the attainment of double loop learning. This study identifies the negative behaviours - potential power mechanisms - that can inhibit the double loop learning inherent in an Agile adoption, to determine how the Agile processes and behaviours can create these power mechanisms, and how these power mechanisms impact on double loop learning and the Agile adoption. This is a critical realist study, which acknowledges that the real world is a complex one, hierarchically structured into layers. An a priori framework is created to represent these layers, which are categorised as: the Agile context, the power mechanisms, and double loop learning. The aim of the framework is to explain how the Agile processes and behaviours, through the teams of developers and project managers, can ultimately impact on the double loop learning behaviours required in an Agile adoption. Four case studies provide further refinement to the framework, with changes required due to observations which were often different to what existing literature would have predicted. The study concludes by explaining how the teams of developers, the individual developers, and the project managers, working with the Agile processes and required behaviours, can inhibit the double loop learning required in an Agile adoption. A solution is then proposed to mitigate these negative impacts. Additionally, two new research processes are introduced to add to the Information Systems research toolkit.
Abstract:
BACKGROUND: Disclosure of authors' financial interests has been proposed as a strategy for protecting the integrity of the biomedical literature. We examined whether authors' financial interests were disclosed consistently in articles on coronary stents published in 2006. METHODOLOGY/PRINCIPAL FINDINGS: We searched PubMed for English-language articles published in 2006 that provided evidence or guidance regarding the use of coronary artery stents. We recorded article characteristics, including information about authors' financial disclosures. The main outcome measures were the prevalence, nature, and consistency of financial disclosures. There were 746 articles, 2985 authors, and 135 journals in the database. Eighty-three percent of the articles did not contain disclosure statements for any author (including declarations of no interests). Only 6% of authors had an article with a disclosure statement. In comparisons between articles by the same author, the types of disagreement were as follows: no disclosure statements vs declarations of no interests (64%); specific disclosures vs no disclosure statements (34%); and specific disclosures vs declarations of no interests (2%). Among the 75 authors who disclosed at least 1 relationship with an organization, there were 2 cases (3%) in which the organization was disclosed in every article the author wrote. CONCLUSIONS/SIGNIFICANCE: In the rare instances when financial interests were disclosed, they were not disclosed consistently, suggesting that there are problems with transparency in an area of the literature that has important implications for patient care. Our findings suggest that the inconsistencies we observed are due to both the policies of journals and the behavior of some authors.
Abstract:
On September 12, 2001, 54 Duke students recorded their memory of first hearing about the terrorist attacks of September 11 and of a recent everyday event. They were tested again either 1, 6, or 32 weeks later. Consistency for the flashbulb and everyday memories did not differ, in both cases declining over time. However, ratings of vividness, recollection, and belief in the accuracy of memory declined only for everyday memories. Initial visceral emotion ratings correlated with later belief in accuracy, but not consistency, for flashbulb memories. Initial visceral emotion ratings predicted later posttraumatic stress disorder symptoms. Flashbulb memories are not special in their accuracy, as previously claimed, but only in their perceived accuracy.
Abstract:
As a psychological principle, the golden rule represents an ethic of universal empathic concern. It is, surprisingly, present in the sacred texts of virtually all religions, and in philosophical works across eras and continents. Building on the literature demonstrating a positive impact of prosocial behavior on well-being, the present study investigates the psychological function of universal empathic concern in Indian Hindus, Christians, Muslims and Sikhs.
I develop a measure of the centrality of the golden rule-based ethic, within an individual’s understanding of his or her religion, that is applicable to all theistic religions. I then explore the consistency of its relationships with psychological well-being and other variables across religious groups.
Results indicate that this construct, named Moral Concern Religious Focus, can be reliably measured in disparate religious groups, and consistently predicts well-being across them. With measures of Intrinsic, Extrinsic and Quest religious orientations in the model, only Moral Concern and religiosity predict well-being. Moral Concern alone mediates the relationship between religiosity and well-being, and explains more variance in well-being than religiosity alone. The relationship between Moral Concern and well-being is mediated by increased preference for prosocial values, more satisfying interpersonal relationships, and greater meaning in life. In addition, across religious groups Moral Concern is associated with better self-reported physical and mental health, and more compassionate attitudes toward oneself and others.
Two additional types of religious focus are identified: Personal Gain, representing the motive to use religion to improve one’s life, and Relationship with God. Personal Gain is found to predict reduced preference for prosocial values, less meaning in life, and lower quality of relationships. It is associated with greater interference of pain and physical or mental health problems with daily activities, and lower self-compassion. Relationship with God is found to be associated primarily with religious variables and greater meaning in life.
I conclude that individual differences in the centrality of the golden rule and its associated ethic of universal empathic concern may play an important role in explaining the variability in associations between religion, prosocial behavior and well-being noted in the literature.
Abstract:
Developed for use with triple GEM detectors, the GEM Electronic Board (GEB) forms a crucial part of the electronics readout system being developed as part of the CMS muon upgrade program. The objective of the GEB is threefold: to provide stable powering and ground for the VFAT3 front ends, to enable high-speed communication between 24 VFAT3 front ends and an optohybrid, and to shield the GEM detector from electromagnetic interference. The paper describes the concept and design of a large-size GEB in detail, highlighting the challenges in terms of design and feasibility of this deceptively difficult system component.