893 results for Structured programming
Abstract:
Possibilistic Defeasible Logic Programming (P-DeLP) is a logic programming language which combines features from argumentation theory and logic programming, incorporating the treatment of possibilistic uncertainty at the object-language level. Despite its expressive power, an important limitation of P-DeLP is that imprecise, fuzzy information cannot be expressed in the object language. One interesting alternative for overcoming this limitation is the use of PGL+, a possibilistic logic over Gödel logic extended with fuzzy constants. Fuzzy constants in PGL+ allow expressing disjunctive information about the unknown value of a variable, in the sense of a magnitude, modelled as a (unary) predicate. The aim of this article is twofold: first, we formalize DePGL+, a possibilistic defeasible logic programming language that extends P-DeLP through the use of PGL+ in order to incorporate fuzzy constants and a fuzzy unification mechanism for them; second, we propose a way to handle conflicting arguments in the context of the extended framework.
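As a rough illustration of what fuzzy constants buy you, one common way to measure how well two fuzzy constants over the same magnitude match is the sup-min overlap of their membership functions. The Python sketch below is illustrative only and does not reproduce the actual PGL+ unification semantics; the trapezoidal shapes and the constants around_20 and high_teens are invented for the example.

# Illustrative sketch, not the actual PGL+ semantics: fuzzy constants as
# trapezoidal membership functions over a numeric domain, with a sup-min
# overlap used as a crude unification degree.

def trapezoid(a, b, c, d):
    """Membership function of a trapezoidal fuzzy constant."""
    def mu(x):
        if x < a or x > d:
            return 0.0
        if b <= x <= c:
            return 1.0
        if x < b:
            return (x - a) / (b - a)
        return (d - x) / (d - c)
    return mu

def unification_degree(mu1, mu2, domain):
    """Sup-min overlap of two fuzzy constants over a sampled domain."""
    return max(min(mu1(x), mu2(x)) for x in domain)

# Hypothetical fuzzy constants for a temperature-like magnitude.
around_20 = trapezoid(18, 19, 21, 22)
high_teens = trapezoid(15, 16, 17, 18.5)
domain = [x / 10 for x in range(150, 251)]  # 15.0 .. 25.0 in 0.1 steps
print(unification_degree(around_20, high_teens, domain))  # ~0.2: partial match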
Abstract:
In the last decade, defeasible argumentation frameworks have evolved into a sound setting for formalizing commonsense, qualitative reasoning. The logic programming paradigm has proven particularly useful for developing different argument-based frameworks on the basis of different variants of logic programming which incorporate defeasible rules. Most such frameworks, however, are unable to deal with explicit uncertainty or with vague knowledge, as defeasibility is directly encoded in the object language. This paper presents Possibilistic Defeasible Logic Programming (P-DeLP), a new logic programming language which combines features from argumentation theory and logic programming, incorporating as well the treatment of possibilistic uncertainty. Such features are formalized on the basis of PGL, a possibilistic logic based on Gödel fuzzy logic. One of the applications of P-DeLP is providing an intelligent agent with non-monotonic, argumentative inference capabilities. In this paper we also provide a better understanding of such capabilities by defining two non-monotonic operators which model the expansion of a given program P by adding new weighted facts associated with argument conclusions and warranted literals, respectively. Different logical properties of the proposed operators are studied.
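One concrete ingredient worth keeping in mind: in possibilistic logic, the necessity degree that propagates along a derivation is the minimum of the weights of the clauses used. The following minimal Python sketch shows that propagation for an acyclic set of weighted rules; the Rule encoding, the toy program, and the strength function are invented for illustration and are not P-DeLP syntax.

# Minimal sketch: the strength of a derived literal is the minimum of the
# necessity weights used to derive it (generalized modus ponens in
# possibilistic logic). Assumes an acyclic program; no argument comparison
# or defeat is modelled here.
from typing import List, Tuple

# A rule is (head literal, body literals, necessity weight in (0, 1]).
Rule = Tuple[str, List[str], float]

program: List[Rule] = [
    ("flies(tweety)", ["bird(tweety)"], 0.7),  # defeasible: birds usually fly
    ("bird(tweety)", [], 0.9),                 # almost-certain fact
]

def strength(goal: str, rules: List[Rule]) -> float:
    """Best necessity degree derivable for goal (0.0 if underivable)."""
    best = 0.0
    for head, body, w in rules:
        if head == goal:
            degree = min([w] + [strength(b, rules) for b in body])
            best = max(best, degree)
    return best

print(strength("flies(tweety)", program))  # min(0.7, 0.9) = 0.7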
Abstract:
Random problem distributions have played a key role in the study and design of algorithms for constraint satisfaction and Boolean satisfiability, as well as in our understanding of problem hardness, beyond standard worst-case complexity. We consider random problem distributions from a highly structured problem domain that generalizes the Quasigroup Completion problem (QCP) and Quasigroup with Holes (QWH), a widely used domain that captures the structure underlying a range of real-world applications. Our problem domain is also a generalization of the well-known Sudoku puzzle: we consider Sudoku instances of arbitrary order, with the additional generalization that the block regions can have rectangular shape, in addition to the standard square shape. We evaluate the computational hardness of Generalized Sudoku instances for different parameter settings. Our experimental hardness results show that we can generate instances that are considerably harder than QCP/QWH instances of the same size. More interestingly, we show the impact of different balancing strategies on problem hardness. We also provide insights into backbone variables in Generalized Sudoku instances and how they correlate to problem hardness.
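To make the problem domain concrete, here is a small Python sketch of the constraint structure of Generalized Sudoku with rectangular r x c blocks (the checker and the 6 x 6 example are ours, not taken from the paper): the grid has n = r * c rows and columns, and every row, column, and block must be a permutation of 1..n; dropping the block constraint recovers the Latin-square structure underlying QCP/QWH.

def is_valid(grid, r, c):
    """Check a completed Generalized Sudoku grid with r x c blocks."""
    n = r * c
    target = set(range(1, n + 1))
    rows_ok = all(set(row) == target for row in grid)
    cols_ok = all({grid[i][j] for i in range(n)} == target for j in range(n))
    blocks_ok = all(
        {grid[br + i][bc + j] for i in range(r) for j in range(c)} == target
        for br in range(0, n, r)
        for bc in range(0, n, c)
    )
    return rows_ok and cols_ok and blocks_ok

# 2 x 3 blocks give a 6 x 6 instance, one of the rectangular shapes the
# paper generalizes over.
grid_6 = [
    [1, 2, 3, 4, 5, 6],
    [4, 5, 6, 1, 2, 3],
    [2, 3, 1, 5, 6, 4],
    [5, 6, 4, 2, 3, 1],
    [3, 1, 2, 6, 4, 5],
    [6, 4, 5, 3, 1, 2],
]
print(is_valid(grid_6, 2, 3))  # True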
Abstract:
We present experiments in which the laterally confined flow of a surfactant film driven by controlled surface tension gradients causes the subtended liquid layer to self-organize into an inner upstream microduct surrounded by the downstream flow. The anomalous interfacial flow profiles and the concomitant backflow are a result of the feedback between two-dimensional and three-dimensional microfluidics realized during flow in open microchannels. Bulk and surface particle image velocimetry data combined with an interfacial hydrodynamics model explain the dependence of the observed phenomena on channel geometry.
Abstract:
The main objective of this master's thesis is to study robot programming using simulation software, and how to embed the simulation software into a company's own robot-controlling software. A further goal is to study a new communication interface to the assembly line's components, more precisely how to connect the robot cell to this new communication system. Conveyor lines in which the conveyors use the new communication standard are already available; the robot cell, however, is not yet capable of communicating with other devices using the new communication protocols. The main problem among robot manufacturers is that they all have their own communication systems and programming languages. There was no common programming language for programming the robots of all the different manufacturers until the RRS (Realistic Robot Simulation) standards were developed. RRS-II makes it possible to create robot programs in the simulation software and provides a common user interface for the robots of different manufacturers. This thesis presents the RRS-II standard and the state of the robot manufacturers' support for it. The thesis also presents how the simulation software can be embedded into a company's own robot-controlling software and how the robot cell can be connected to the CAMX (Computer Aided Manufacturing using XML) communication system.
Abstract:
We propose a novel formulation to solve the problem of intra-voxel reconstruction of the fibre orientation distribution function (FOD) in each voxel of the white matter of the brain from diffusion MRI data. The majority of the state-of-the-art methods in the field perform the reconstruction on a voxel-by-voxel level, promoting sparsity of the orientation distribution. Recent methods have proposed a global denoising of the diffusion data using spatial information prior to reconstruction, while others promote spatial regularisation through an additional empirical prior on the diffusion image at each q-space point. Our approach reconciles voxelwise sparsity and spatial regularisation and defines a spatially structured FOD sparsity prior, where the structure originates from the spatial coherence of the fibre orientation between neighbour voxels. The method is shown, through both simulated and real data, to enable accurate FOD reconstruction from a much lower number of q-space samples than the state of the art, typically 15 samples, even for quite adverse noise conditions.
Abstract:
Many models proposed to study the evolution of collective action rely on a formalism that represents social interactions as n-player games between individuals adopting discrete actions such as cooperate and defect. Despite the importance of spatial structure in biological collective action, the analysis of n-player games in spatially structured populations has so far proved elusive. We address this problem by considering mixed strategies and by integrating discrete-action n-player games into the direct fitness approach of social evolution theory. This allows us to conveniently identify convergence stable strategies and to capture the effect of population structure by a single structure coefficient, namely, the pairwise (scaled) relatedness among interacting individuals. As an application, we use our mathematical framework to investigate collective action problems associated with the provision of three different kinds of collective goods, paradigmatic of a vast array of helping traits in nature: "public goods" (both providers and shirkers can use the good, e.g., alarm calls), "club goods" (only providers can use the good, e.g., participation in collective hunting), and "charity goods" (only shirkers can use the good, e.g., altruistic sacrifice). We show that relatedness promotes the evolution of collective action in different ways depending on the kind of collective good and its economies of scale. Our findings highlight the importance of explicitly accounting for relatedness, the kind of collective good, and the economies of scale in theoretical and empirical studies of the evolution of collective action.
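For readers unfamiliar with the direct fitness approach, the convergence stability condition it yields typically takes the following generic form (the notation is illustrative, not necessarily the paper's): with f(x, y) the fitness of a focal individual playing mixed strategy x among neighbours playing y on average, and \kappa the pairwise scaled relatedness acting as the structure coefficient,

S(z) \;=\; \left.\frac{\partial f}{\partial x}\right|_{x=y=z}
       \;+\; \kappa \left.\frac{\partial f}{\partial y}\right|_{x=y=z},
\qquad S(z^{*}) = 0, \quad S'(z^{*}) < 0,

so a convergence stable strategy z^{*} is a zero of the selection gradient S at which S is decreasing, and \kappa is the single number through which population structure enters.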
Abstract:
Demyelinating diseases are characterized by a loss of oligodendrocytes leading to axonal degeneration and impaired brain function. Current strategies used for the treatment of demyelinating disease such as multiple sclerosis largely rely on modulation of the immune system. Only limited treatment options are available for treating the later stages of the disease, and these treatments require regenerative therapies to ameliorate the consequences of oligodendrocyte loss and axonal impairment. Directed differentiation of adult hippocampal neural stem/progenitor cells (NSPCs) into oligodendrocytes may represent an endogenous source of glial cells for cell-replacement strategies aiming to treat demyelinating disease. Here, we show that Ascl1-mediated conversion of hippocampal NSPCs into mature oligodendrocytes enhances remyelination in a diphtheria-toxin (DT)-inducible, genetic model for demyelination. These findings highlight the potential of targeting hippocampal NSPCs for the treatment of demyelinated lesions in the adult brain.
Abstract:
PURPOSE OF REVIEW: To provide an overview of available evidence of the potential role of epigenetics in the pathogenesis of hypertension and vascular dysfunction. RECENT FINDINGS: Arterial hypertension is a highly heritable condition. Surprisingly, however, genetic variants only explain a tiny fraction of the phenotypic variation and the term 'missing heritability' has been coined to describe this phenomenon. Recent evidence suggests that phenotypic alteration that is unrelated to changes in DNA sequence (thereby escaping detection by classic genetic methodology) offers a potential explanation. Here, we present some basic information on epigenetics and review recent work consistent with the hypothesis of epigenetically induced arterial hypertension. SUMMARY: New technologies that enable the rigorous assessment of epigenetic changes and their phenotypic consequences may provide the basis for explaining the missing heritability of arterial hypertension and offer new possibilities for treatment and/or prevention.
Abstract:
We consider the problem of reconstructing multiple correlated sparse signals and propose a new implementation of structured sparsity through a reweighting scheme. We present a particular application to diffusion Magnetic Resonance Imaging data and show how this procedure can be used for fibre orientation reconstruction in the white matter of the brain. In that framework, our structured sparsity prior can be used to exploit the fundamental coherence between fibre directions in neighbour voxels. Our method approaches the ℓ0 minimisation through a reweighted ℓ1-minimisation scheme. The weights are defined here in such a way as to promote correlated sparsity between neighbour signals.
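As a sketch of the kind of loop this implies (our illustration; the paper's exact rule for coupling neighbour signals may differ), the classic reweighted ℓ1 recipe sets each weight inversely proportional to the current coefficient magnitude; here the magnitude is averaged over neighbour signals, so supports on which neighbours agree are penalised less.

# Generic reweighted l1 via weighted ISTA; A is the sensing/dictionary
# matrix, columns of B are neighbouring measured signals.
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def reweighted_l1(A, B, lam=0.1, outer=5, inner=200, eps=1e-2):
    n, k = A.shape[1], B.shape[1]
    X = np.zeros((n, k))
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # valid ISTA step size
    W = np.ones((n, 1))
    for _ in range(outer):
        for _ in range(inner):                  # weighted l1, weights fixed
            X = soft_threshold(X - step * (A.T @ (A @ X - B)), step * lam * W)
        # Reweight: small weight where neighbours agree on a large
        # coefficient, approximating l0 across correlated signals.
        W = 1.0 / (np.abs(X).mean(axis=1, keepdims=True) + eps)
    return X

# Toy example: two neighbour signals sharing the same sparse support.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
X_true = np.zeros((50, 2))
X_true[3], X_true[17] = [1.0, 0.8], [-0.9, -1.1]
X_hat = reweighted_l1(A, A @ X_true)
print(np.abs(X_hat[[3, 17]]).round(2))          # support rows carry the energy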
Abstract:
OBJECTIVE: The aim of this study is to review highly cited articles that focus on non-publication of studies, and to develop a consistent and comprehensive approach to defining (non-) dissemination of research findings. SETTING: We performed a scoping review of definitions of the term 'publication bias' in highly cited publications. PARTICIPANTS: Ideas and experiences of a core group of authors were collected in a draft document, which was complemented by the findings from our literature search. INTERVENTIONS: The draft document including findings from the literature search was circulated to an international group of experts and revised until no additional ideas emerged and consensus was reached. PRIMARY OUTCOMES: We propose a new approach to the comprehensive conceptualisation of (non-) dissemination of research. SECONDARY OUTCOMES: Our 'What, Who and Why?' approach includes issues that need to be considered when disseminating research findings (What?), the different players who should assume responsibility during the various stages of conducting a clinical trial and disseminating clinical trial documents (Who?), and motivations that might lead the various players to disseminate findings selectively, thereby introducing bias in the dissemination process (Why?). CONCLUSIONS: Our comprehensive framework of (non-) dissemination of research findings, based on the results of a scoping literature search and expert consensus, will facilitate the development of future policies and guidelines regarding the multifaceted issue of selective publication, historically referred to as 'publication bias'.
Abstract:
Virtual Laboratories are an indispensable space for developing practical activities in a Virtual Environment. In the field of Computer and Software Engineering, different types of practical activities have to be performed in order to obtain basic competences which are impossible to achieve by other means. This paper specifies an ontology for a general virtual laboratory. The proposed ontology provides a mechanism to select the best resources needed in a Virtual Laboratory once a specific practical activity has been defined and the main competences that students have to achieve in the learning process have been fixed. Furthermore, the proposed ontology can be used to develop an automatic wizard tool that creates a Moodle Classroom using the practical activity specification and the related competences.
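A hypothetical fragment of the kind of ontology described, sketched in Python with rdflib, may help fix ideas; all class and property names (PracticalActivity, developsCompetence, requiresResource, ...) are invented for illustration and are not the paper's vocabulary.

from rdflib import Graph, Literal, Namespace, RDF, RDFS

VLAB = Namespace("http://example.org/vlab#")
g = Graph()
g.bind("vlab", VLAB)

# Core classes: practical activities, competences, and lab resources.
for cls in ("PracticalActivity", "Competence", "Resource"):
    g.add((VLAB[cls], RDF.type, RDFS.Class))

# A practical activity develops competences and requires resources.
g.add((VLAB.socketLab, RDF.type, VLAB.PracticalActivity))
g.add((VLAB.socketLab, RDFS.label, Literal("TCP socket programming lab")))
g.add((VLAB.socketLab, VLAB.developsCompetence, VLAB.networkProgramming))
g.add((VLAB.socketLab, VLAB.requiresResource, VLAB.linuxVM))

# Selecting the resources needed for a defined activity is then a graph query.
for _, _, res in g.triples((VLAB.socketLab, VLAB.requiresResource, None)):
    print(res)  # http://example.org/vlab#linuxVM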
Abstract:
We study the lysis timing of a bacteriophage population by means of a continuous infection-age-structured population dynamics model. The features of the model are the infection process of bacteria, the death process, and the lysis process, which consists of the replication of bacteriophage viruses inside bacteria and their subsequent destruction. The time till lysis (or latent period) is assumed to have an arbitrary distribution. We have carried out an optimization procedure and found that the latent period corresponding to maximal fitness (i.e. maximal growth rate of the bacteriophage population) is of fixed length. We also study the dependence of the optimal latent period on the amount of susceptible bacteria and the number of virions released by a single infection. Finally, the evolutionarily stable strategy for the latent period is also determined as a fixed period, taking into account that super-infections are not considered.
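A deliberately simplified, back-of-the-envelope version of the optimization (ours, not the paper's age-structured model) already suggests why a fixed, interior latent period can maximize growth: if every infection releases b(L) virions after exactly L time units, the exponential growth rate solves b(L) e^{-rL} = 1, i.e. r(L) = ln b(L) / L, and with a burst size growing linearly after an eclipse phase e at a hypothetical assembly rate m, the optimum is interior.

import numpy as np

m, e = 5.0, 0.5               # hypothetical assembly rate and eclipse time

def growth_rate(L):
    b = m * (L - e)           # burst size for latent period L > e
    return np.log(b) / L if b > 1 else -np.inf

Ls = np.linspace(0.6, 10.0, 2000)
rates = np.array([growth_rate(L) for L in Ls])
print(f"optimal fixed latent period ~ {Ls[rates.argmax()]:.2f}")  # ~1.4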