844 results for Many-valued logic
Abstract:
Global analyzers traditionally read and analyze the entire program at once, in a non-incremental way. However, there are many situations which are not well suited to this simple model and which instead require reanalysis of certain parts of a program which has already been analyzed. In these cases, it appears inefficient to perform the analysis of the program again from scratch, as current systems must. We describe how the fixpoint algorithms in current generic analysis engines can be extended to support incremental analysis. The possible changes to a program are classified into three types: addition, deletion, and arbitrary change. For each of these, we provide one or more algorithms for identifying the parts of the analysis that must be recomputed and for performing the actual recomputation. The potential benefits and drawbacks of these algorithms are discussed. Finally, we present some experimental results obtained with an implementation of the algorithms in the PLAI generic abstract interpretation framework. The results show significant benefits when using the proposed incremental analysis algorithms.
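To make the recomputation idea concrete, here is a minimal sketch of an incremental worklist fixpoint, in Python rather than PLAI's actual (Prolog-based) implementation: after a change, only the affected nodes are re-queued, and results propagate along recorded dependencies instead of restarting the whole analysis. The dependency map, transfer function, and bottom value are assumptions for illustration, not the paper's API.

```python
# Minimal sketch of an incremental fixpoint (not PLAI's actual algorithm).
# `deps` maps each node to the nodes whose results depend on it; `transfer`
# recomputes one node's abstract value from the current analysis table.

def incremental_fixpoint(deps, analysis, transfer, bottom, changed):
    worklist = list(changed)                     # re-seed only the changed parts
    while worklist:
        node = worklist.pop()
        new = transfer(node, analysis)           # recompute from current results
        if new != analysis.get(node, bottom):    # value moved in the lattice
            analysis[node] = new
            worklist.extend(deps.get(node, ()))  # wake dependents, not everything
    return analysis

# For a deletion, affected entries would first be reset to `bottom` and their
# dependents seeded into `changed`; an addition seeds just the new nodes.
```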
Abstract:
In this paper, we examine the issue of memory management in the parallel execution of logic programs. We concentrate on non-deterministic and-parallel schemes which we believe present a relatively general set of problems to be solved, including most of those encountered in the memory management of or-parallel systems. We present a distributed stack memory management model which allows flexible scheduling of goals. Previously proposed models (based on the "Marker model") are lacking in that they impose restrictions on the selection of goals to be executed or they may consume a large amount of virtual memory. This paper first presents results which imply that the above-mentioned shortcomings can have significant performance impacts. An extension of the Marker Model is then proposed which allows flexible scheduling of goals while keeping (virtual) memory consumption down. Measurements are presented which show the advantage of this solution. Methods for handling forward and backward execution, cut, and rollback are discussed in the context of the proposed scheme. In addition, the paper shows how the same mechanism for flexible scheduling can be applied to allow the efficient handling of the very general form of suspension that can occur in systems which combine several types of and-parallelism and more sophisticated methods of executing logic programs. We believe that the results are applicable to many and- and or-parallel systems.
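As a rough illustration of the baseline idea that the paper extends, a marker delimits each goal's segment on a shared stack, so backtracking can reclaim exactly that segment. The toy sketch below is an assumption-laden simplification, not the paper's model; in particular, it only works when goals complete in LIFO order, which is precisely the scheduling restriction the proposed extension lifts.

```python
# Toy sketch of the basic marker idea (not the paper's extended model):
# each goal's frames live between its marker and the stack top.

class MarkedStack:
    def __init__(self):
        self.cells = []       # the shared stack of frames
        self.markers = {}     # goal id -> index where its segment starts

    def begin_goal(self, goal_id):
        self.markers[goal_id] = len(self.cells)   # marker delimits the segment

    def push(self, frame):
        self.cells.append(frame)

    def backtrack(self, goal_id):
        # Reclaim the goal's segment so the space is reused immediately,
        # keeping (virtual) memory consumption bounded. Correct only when
        # goal_id owns the topmost segment -- the LIFO restriction that the
        # paper's flexible-scheduling extension is designed to remove.
        del self.cells[self.markers.pop(goal_id):]
```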
Abstract:
Context: This paper addresses one of the major end-user development (EUD) challenges, namely, how to pack today's EUD support tools with composable elements. This would give end users better access to more components which they can use to build a solution tailored to their own needs. The success of later end-user software engineering (EUSE) activities largely depends on how many components each tool has and how adaptable components are to multiple problem domains. Objective: A system for automatically adapting heterogeneous components to a common development environment would offer a sizeable saving of time and resources within the EUD support tool construction process. This paper presents an automated adaptation system for transforming EUD components to a standard format. Method: This system is based on the use of description logic. Based on a generic UML2 data model, this description logic is able to check whether an end-user component can be transformed to this modeling language through subsumption or as an instance of the UML2 model. It also automatically finds a consistent, unambiguous and finite set of XSLT mappings to prepare data in order to leverage the component as part of a tool that conforms to the target UML2 component model. Results: The proposed system has been successfully applied to components from four prominent EUD tools. These components were automatically converted to a standard format. In order to validate the proposed system, rich internet applications (RIA) used as an operational support system for operators at a large services company were developed using automatically adapted standard format components. These RIAs would be impossible to develop using each EUD tool separately. Conclusion: The positive results of applying our system for automatically adapting components from current tool catalogues are indicative of the system's effectiveness. Use of this system could foster the growth of web EUD component catalogues, leveraging a vast ecosystem of user-centred SaaS to further current EUSE trends.
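To illustrate the subsumption step in miniature: if a concept is modeled as the set of features it requires, a component concept is subsumed by the generic target concept when it carries every feature the target requires. This is a deliberately naive structural check; the paper itself relies on a full description logic over a generic UML2 data model, and all names below are invented for the example.

```python
# Naive structural-subsumption check (illustrative only; the paper uses a
# description logic reasoner over a generic UML2 data model).

UML2_COMPONENT = {"has_interface", "has_property"}        # generic target concept
SPREADSHEET_WIDGET = {"has_interface", "has_property",    # a hypothetical
                      "has_event"}                        # EUD component

def subsumed_by(concept, general):
    # C is subsumed by D when C exhibits every feature D requires
    return general <= concept

if subsumed_by(SPREADSHEET_WIDGET, UML2_COMPONENT):
    print("component can be mapped onto the UML2 component model")
```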
Abstract:
Much has been learned about vertebrate development by random mutagenesis followed by phenotypic screening and by targeted gene disruption followed by phenotypic analysis in model organisms. Because the timing of many developmental events is critical, it would be useful to have temporal control over modulation of gene function, a luxury frequently not possible with genetic mutants. Here, we demonstrate that small molecules capable of conditional gene product modulation can be identified through developmental screens in zebrafish. We have identified several small molecules that specifically modulate various aspects of vertebrate ontogeny, including development of the central nervous system, the cardiovascular system, the neural crest, and the ear. Several of the small molecules identified allowed us to dissect the logic of melanocyte and otolith development and to identify critical periods for these events. Small molecules identified in this way offer potential to dissect further these and other developmental processes and to identify novel genes involved in vertebrate development.
Abstract:
Hardware/Software partitioning (HSP) is a key task in embedded system co-design. The main goal of this task is to decide which components of an application are to be executed on a general-purpose processor (software) and which ones on specific hardware, taking into account a set of restrictions expressed by metrics. In recent years, several approaches driven by metaheuristic algorithms have been proposed for solving the HSP problem. However, due to the diversity of models and metrics used, the choice of the best-suited algorithm remains an open problem. This article presents the results of applying a fuzzy approach to the HSP problem. This approach is more flexible than many others because it can accept reasonably good solutions and reject those that do not seem good. In this work we compare six metaheuristic algorithms: Random Search, Tabu Search, Simulated Annealing, Hill Climbing, Genetic Algorithm and Evolutionary Strategy. The presented model aims to simultaneously minimize the hardware area and the execution time. The obtained results show that Restart Hill Climbing is the best performing algorithm in most cases.
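As a hedged sketch of the winning strategy, restart hill climbing over a binary HW/SW assignment might look as follows. The weighted-sum cost over hardware area and execution time and the one-component-flip neighborhood are assumptions for illustration, not the paper's exact fuzzy model.

```python
import random

# Illustrative cost model (an assumption, not the paper's fuzzy formulation):
# weighted sum of total hardware area and total execution time.
def cost(assign, area, hw_time, sw_time, w=0.5):
    hw_area = sum(a for a, in_hw in zip(area, assign) if in_hw)
    exec_time = sum(h if in_hw else s
                    for h, s, in_hw in zip(hw_time, sw_time, assign))
    return w * hw_area + (1 - w) * exec_time

def restart_hill_climbing(n, area, hw_time, sw_time, restarts=20):
    best, best_cost = None, float("inf")
    for _ in range(restarts):                    # restarts escape local optima
        assign = [random.random() < 0.5 for _ in range(n)]
        current = cost(assign, area, hw_time, sw_time)
        improved = True
        while improved:
            improved = False
            for i in range(n):                   # neighborhood: move one component
                assign[i] = not assign[i]
                c = cost(assign, area, hw_time, sw_time)
                if c < current:
                    current, improved = c, True
                else:
                    assign[i] = not assign[i]    # undo a non-improving move
        if current < best_cost:
            best, best_cost = list(assign), current
    return best, best_cost
```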
Abstract:
Aristotle is commonly held to have been a Moderate Realist, in that he would maintain that a concept derives from an act of grasping a mind-independent universal object that somehow exists inside the many different things of which the concept is predicated. Insofar as a universal is independent of mind, it would stand for the proper object of a concept that subsumes a given number of things as its own instantiations. But we claim that Aristotle rejected such a view and instead understood universality as a feature of thought rather than as a feature of reality in its own right. As shown in the chapters of the Topics regarding the so-called logic of comparison (with the support of Albert the Great’s commentary), each predicate can be more or less consistent with the attribute of the subject of which it may be predicated. Both essential and accidental attributes assume a definite degree of being, related to the degree of belonging to substance. Unlike particular things, the universality of a concept is to be understood always in comparison with another concept, according to a hierarchy of predicates arranged by degree of universality through comparative terms such as ‘more’, ‘less’, and ‘likewise’. What is really mind-independent are the truth conditions which make a universal true when it exclusively refers to a set of things identically meant by the same predicate, whose universality is given by the place it occupies in the hierarchy of predicates.
Abstract:
From an economic and social standpoint, neoliberalism can be understood as the establishment in society of strictly mercantile relations, whereby the logic of maximising gain and return is extended to all fields, promoting economic rationality as the general form of rationality. The North American form of neoliberal governmentality, with its ambition of transmuting individuals into subject-microenterprises and human relations into relations of competition, causes individuals to be seen as “human capital”. Originally, the term “human capital” refers to a theory which, developed under the influence of the neoclassical economic paradigm and the leadership of Theodore Schultz, assimilated and transferred economic principles to a reality previously free of meanings of this nature, giving rise to a discourse that associates the human with capital and thereby transports it into a logic in which each person must manage himself or herself like a company. The enterprise is thus promoted to a model of subjectivation, each individual being a capital to be managed and valued according to the demands of the market. This is why the model of entrepreneurial conduct, stemming from the neoliberal-inspired discourse of human capital and from classic theories proposed by Werner Sombart and Joseph A. Schumpeter, takes hold of professionals in organisations based in capitalist countries. This is especially pronounced among young people seeking to enter the labour market, particularly in strategic positions valued within organisations, such as trainee positions. In Brazil, trainee programmes are considered a strategy for attracting young people with a distinctive profile, an answer found by many organisations since 1970 to gain an advantage in a highly competitive economic scenario. These professionals are seen as the organisation’s “talents” and are trained to occupy strategic positions within a short time. In order to clarify how the model of entrepreneurial conduct is present in trainee selection processes, we analysed the texts describing the competencies required in the selection of these young people, drawing on Fairclough’s Critical Discourse Analysis (CDA) (2001, 2003), the analytical categories of “modality” and “evaluation”, and reflections on neoliberal ideology. We conclude that the model of entrepreneurial conduct present in trainee selection processes is marked by the expression of passionate behaviour, which, in the field of management, is understood through the concept of “entrepreneurial passion”. This research is relevant to the field of Administration, both academically (since few studies take trainee selection as their research object and seek to understand it from a critical perspective using analysis of the human capital discourse) and for those working in organisations who face the difficulties and challenges of selecting young people for trainee programmes, as it raises important questions about the impact of these initiatives both on the young people and on the organisations that hire them.
Abstract:
This paper defines the 3D reconstruction problem as the process of reconstructing a 3D scene from numerous 2D visual images of that scene. It is well known that this problem is ill-posed, and numerous constraints and assumptions are used in 3D reconstruction algorithms in order to reduce the solution space. Unfortunately, most constraints only work in a certain range of situations, and often constraints are built into the most fundamental methods (e.g. Area Based Matching assumes that all the pixels in the window belong to the same object). This paper presents a novel formulation of the 3D reconstruction problem, using a voxel framework and first order logic equations, which does not contain any additional constraints or assumptions. Solving this formulation for a set of input images gives all the possible solutions for that set, rather than picking a solution that is deemed most likely. Using this formulation, this paper studies the problem of uniqueness in 3D reconstruction and how the solution space changes for different configurations of input images. It is found that it is not possible to guarantee a unique solution, no matter how many images are taken of the scene, what their orientation is, or even how much color variation is in the scene itself. Results of using the formulation to reconstruct a few small voxel spaces are also presented. They show that the number of solutions is extremely large even for very small voxel spaces (a 5 x 5 voxel space gives 10 to 10^7 solutions). This shows the need for constraints to reduce the solution space to a reasonable size. Finally, it is noted that because of the discrete nature of the formulation, the solution space size can be easily calculated, making the formulation a useful tool to numerically evaluate the usefulness of any constraints that are added.
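A toy analogue of the idea, under stated assumptions: on a tiny 2D grid with two orthogonal presence-only "views", we can enumerate every occupancy assignment consistent with the views and count them exactly, mirroring the paper's point that the formulation yields all solutions and that the solution space is easy to size. The projection model here is invented for the example, not the paper's.

```python
# Toy analogue of the formulation (invented projection model): enumerate
# every occupancy assignment of a tiny 2D voxel grid consistent with two
# orthogonal presence-only "views", then count them.
from itertools import product

def consistent_scenes(rows, cols, row_seen, col_seen):
    """row_seen[i] / col_seen[j]: does the view along that line see anything?"""
    solutions = []
    for bits in product([0, 1], repeat=rows * cols):
        grid = [bits[i * cols:(i + 1) * cols] for i in range(rows)]
        if (all(any(r) == s for r, s in zip(grid, row_seen)) and
                all(any(c) == s for c, s in zip(zip(*grid), col_seen))):
            solutions.append(grid)
    return solutions

# Even a 3 x 3 grid with two views admits several consistent scenes:
scenes = consistent_scenes(3, 3, [1, 0, 1], [1, 1, 0])
print(len(scenes))   # exact size of the solution space (7 here)
```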
Abstract:
The introduction of standard on-chip buses has eased integration and boosted the production of IP functional cores. However, once an IP is bus-specific, retargeting it to a different bus is time-consuming and tedious, and this reduces the reusability of the bus-specific IP. As new bus standards are introduced and different interconnection methods are proposed, this problem grows. Many solutions have been proposed; however, these solutions either limit the IP block performance or are restricted to a particular platform. A new concept is presented that can connect IP blocks to a wide variety of interface architectures with low overhead. This is achieved through the use of a special interface adaptor logic layer.
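In software terms, the adaptor-layer idea resembles the classic adapter pattern: the core keeps a single native interface, and a thin per-bus adaptor translates protocol cycles onto it. The sketch below is a conceptual analogy with invented names (the actual layer is hardware logic, not Python).

```python
class IPCore:
    """Bus-agnostic core: one native read/write interface, never retargeted."""
    def __init__(self):
        self.regs = {}
    def read(self, addr):
        return self.regs.get(addr, 0)
    def write(self, addr, data):
        self.regs[addr] = data

class WishboneStyleAdaptor:
    """Thin adaptor layer: maps Wishbone-like bus cycles (we = write enable)
    onto the core's native interface. A second adaptor class would do the
    same for another bus standard without touching the core."""
    def __init__(self, core):
        self.core = core
    def cycle(self, we, addr, dat_w=None):
        if we:
            self.core.write(addr, dat_w)
            return None
        return self.core.read(addr)
```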
Abstract:
Job satisfaction is a significant predictor of organisational innovation – especially where employees (including shop-floor workers) experience variety in their jobs and work in a single-status environment. The relationship between job satisfaction and performance has long intrigued work psychologists. The idea that "happy workers are productive workers" underpins many theories of performance, leadership, reward and job design. But contrary to popular belief, the relationship between job satisfaction and performance at individual level has been shown to be relatively weak. Research investigating the link between job satisfaction and creativity (the antecedent to innovation) shows that job dissatisfaction promotes creative outcomes. The logic is that those who are dissatisfied (and have decided to stay with the organisation) are determined to change things and have little to lose in doing so (see JM George & J Zhou, 2002). We were therefore surprised to find in the course of our own research into managerial practices and employee attitudes in manufacturing organisations that job satisfaction was a highly significant predictor of product and technological innovation. These results held even though the research was conducted longitudinally, over two years, while controlling for prior innovation. In other words, job satisfaction was a stronger predictor of innovation than any pre-existing orientation organisations had towards working innovatively. Using prior innovation as a control variable, as well as a longitudinal research design, strengthened our case against the argument that people are satisfied because they belong to a highly innovative organisation. We found that the relationship between job satisfaction and innovation was stronger still where organisations showed that they were committed to promoting job variety, especially at shop-floor level. We developed precise instruments to measure innovation, taking into account the magnitude of the innovation both in terms of the number of people involved in its implementation, and how new and different it was. Using this instrument, we are able to give each organisation in our sample a "score" from one to seven for innovation in areas ranging from administration to production technology. We found that much innovation is incremental, involving relatively minor improvements, rather than major change. To achieve sustained innovation, organisations have to draw on the skills and knowledge of employees at all levels. We also measured job satisfaction at organisational level, constructing a mean "job satisfaction" score for all organisations in our sample, and drawing only on those companies whose employees tended to respond in a similar manner to the questions they were asked. We argue that where most of the workforce experience job satisfaction, employees are more likely to collaborate, to share ideas and aim for high standards because people are keen to sustain their positive feelings. Job variety and single-status arrangements further strengthen the relationship between satisfaction and performance. This makes sense; where employees experience variety, they are exposed to new and different ideas and, provided they feel positive about their jobs, are likely to be willing to try to apply these ideas to improve their jobs. 
Similarly, staff working in single-status environments where hierarchical barriers are reduced are likely to feel trusted and valued by management, and there is evidence (see G Jones & J George, 1998) that people work collaboratively and constructively with those they trust. Our study suggests that there is a strong business case for promoting employee job satisfaction. Managers and HR practitioners need to ensure their strategies and practices support and sustain job satisfaction among their workforces to encourage constructive, collaborative and creative working. It is more important than ever for organisations to respond rapidly to the demands of the external environment. This study shows the positive association between organisational-level job satisfaction and innovation. So if a happy workforce is the key to unlocking innovation and organisations want to thrive in the global economy, it is vital that managers and HR practitioners pay close attention to employee perceptions of the work environment. In a world where the most innovative survive, it could make all the difference.
Abstract:
Integer-valued data envelopment analysis (DEA) with alternative returns-to-scale technologies has been introduced and developed recently by Kuosmanen and Kazemi Matin. The proportionality assumption of their "natural augmentability" axiom in constant and non-decreasing returns to scale technologies makes it possible to achieve feasible decision-making units (DMUs) of arbitrarily large size. In many real-world applications it is not possible to achieve such production plans, since some of the input and output variables are bounded above. In this paper, we extend the axiomatic foundation of integer-valued DEA models to include bounded output variables. Some model variants are obtained by introducing a new axiom of "boundedness" over the selected output variables. A mixed integer linear programming (MILP) formulation is also introduced for computing efficiency scores in the associated production set.
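For a flavour of what such a MILP looks like, here is a minimal input-oriented sketch in the spirit of integer-valued DEA, written with PuLP: the intermediate input target is forced to be integer, and the paper's bounded-output variants would additionally impose upper bounds on output targets. The data and the exact constraint set are illustrative assumptions, not the paper's model.

```python
# Minimal input-oriented sketch in the spirit of integer-valued DEA, using
# PuLP. The integer variable x_t is the integer input target; the paper's
# bounded-output variants would add upper bounds on output targets.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpStatus

X = [[2], [4], [6]]      # one integer-valued input per DMU (illustrative data)
Y = [[1], [2], [2]]      # one output per DMU
o = 2                    # index of the DMU under evaluation

prob = LpProblem("integer_DEA_sketch", LpMinimize)
theta = LpVariable("theta", lowBound=0)                      # efficiency score
lam = [LpVariable(f"lambda_{j}", lowBound=0) for j in range(len(X))]
x_t = [LpVariable(f"x_target_{i}", lowBound=0, cat="Integer")
       for i in range(len(X[0]))]

prob += theta                                                # objective: min theta
for i in range(len(X[0])):
    prob += lpSum(lam[j] * X[j][i] for j in range(len(X))) <= x_t[i]
    prob += x_t[i] <= theta * X[o][i]        # radial contraction, integer target
for r in range(len(Y[0])):
    prob += lpSum(lam[j] * Y[j][r] for j in range(len(Y))) >= Y[o][r]

prob.solve()
print(LpStatus[prob.status], theta.value())  # Optimal, ~0.667 for these data
```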
Abstract:
The relationship between professionalism, education and housing practice has become increasingly strained following the introduction of austerity measures and welfare reforms across a range of countries. Focusing on the development of UK housing practice, this article considers how notions of professionalism are being reshaped within the context of welfare retrenchment and how emerging tensions have both affected the identity of housing professionals and impacted on the delivery of training and education programmes. The article analyses the changing knowledges and skills valued in contemporary housing practice and considers how the sector has responded to the challenges of austerity. The central argument is that a dominant logic of competition has culminated in a crisis of identity for the sector. Although the focus of the article is on UK housing practice, the processes identified have a wider relevance for the analysis of housing and welfare delivery in developed economies.
Abstract:
Hybrid logic is a valuable tool for specifying relational structures: at the same time that it allows defining accessibility relations between states, it provides a way to name and refer to what happens at each specific state. However, given the many information sources available nowadays, we may need to deal with contradictory information. This is why we came up with the idea of Quasi-hybrid logic, a paraconsistent version of hybrid logic capable of dealing with inconsistencies in the information, written as hybrid formulas. In [5] we have already developed a semantics for this paraconsistent logic. In this paper we go a step further and study its proof-theoretical aspects. We present a complete tableau system for Quasi-hybrid logic, obtained by combining tableaux for Quasi-classical and Hybrid logics.
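For readers unfamiliar with tableau systems in general, the sketch below shows the basic expansion-and-closure loop on plain propositional formulas. It is emphatically not the paper's Quasi-hybrid calculus (which adds nominals, satisfaction operators, and paraconsistent closure conditions); it only illustrates the shared skeleton.

```python
# Generic tableau skeleton on plain propositional formulas (illustrative;
# the paper's calculus handles nominals and paraconsistency on top of this).
# Formulas: ("atom", p) | ("not", f) | ("and", f, g) | ("or", f, g)

def is_literal(f):
    return f[0] == "atom" or (f[0] == "not" and f[1][0] == "atom")

def closed(branch):
    # a branch closes when it contains some formula and its negation
    return any(f[0] == "not" and f[1] in branch for f in branch)

def satisfiable(branch):
    branch = set(branch)
    if closed(branch):
        return False
    f = next((g for g in branch if not is_literal(g)), None)
    if f is None:
        return True                          # saturated open branch: a model
    rest = branch - {f}
    if f[0] == "and":                        # conjunction: extend the branch
        return satisfiable(rest | {f[1], f[2]})
    if f[0] == "or":                         # disjunction: split the branch
        return satisfiable(rest | {f[1]}) or satisfiable(rest | {f[2]})
    g = f[1]                                 # f = ("not", g): push negation in
    if g[0] == "not":
        return satisfiable(rest | {g[1]})
    if g[0] == "and":
        return satisfiable(rest | {("or", ("not", g[1]), ("not", g[2]))})
    return satisfiable(rest | {("and", ("not", g[1]), ("not", g[2]))})

# A formula is valid iff the tableau for its negation closes:
p = ("atom", "p")
print(not satisfiable({("not", ("or", p, ("not", p)))}))   # True: p or not-p
```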
Abstract:
In this thesis we discuss in what ways computational logic (CL) and data science (DS) can jointly contribute to the management of knowledge within the scope of modern and future artificial intelligence (AI), and how technically sound software technologies can be realised along the path. An agent-oriented mindset permeates the whole discussion, stressing the pivotal role of autonomous agents in exploiting both means to reach higher degrees of intelligence. Accordingly, the goals of this thesis are manifold. First, we elicit the analogies and differences between CL and DS, looking for possible synergies and complementarities along four major knowledge-related dimensions, namely representation, acquisition (a.k.a. learning), inference (a.k.a. reasoning), and explanation. In this regard, we propose a conceptual framework through which bridges between these disciplines can be described and designed. We then survey the current state of the art of AI technologies with respect to their capability to support bridging CL and DS in practice. After detecting gaps and opportunities, we propose the notion of the logic ecosystem as a new conceptual, architectural, and technological solution supporting the incremental integration of symbolic and sub-symbolic AI. Finally, we discuss how our notion of the logic ecosystem can be reified into actual software technology and extended in many DS-related directions.
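One way to picture the four dimensions concretely is as the interface each node of such an ecosystem would expose to agents. The sketch below is an invented illustration of that framing, not the thesis's actual API.

```python
from abc import ABC, abstractmethod

class KnowledgeModule(ABC):
    """Hypothetical interface for one node of a logic ecosystem, covering the
    four knowledge-related dimensions; symbolic (CL) and sub-symbolic (DS)
    modules alike would implement it, letting agents mix the two."""

    @abstractmethod
    def represent(self, data):          # representation: encode knowledge
        ...

    @abstractmethod
    def learn(self, observations):      # acquisition: update from data
        ...

    @abstractmethod
    def infer(self, query):             # inference: derive new knowledge
        ...

    @abstractmethod
    def explain(self, conclusion):      # explanation: justify an answer
        ...
```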
Abstract:
In Prior Analytics 1.1–22, Aristotle develops his proof system of non-modal and modal propositions. This system is given in the language of propositions, and Aristotle is concerned with establishing some properties and relations that the expressions of this language enjoy. However, modern scholarship has found some of his results inconsistent with positions defended elsewhere. The set of rules of inference of this system has also caused perplexity: there does not seem to be a single interpretation that validates all the rules which Aristotle is explicitly committed to using in his proofs. Some commentators have argued that these and other problems cannot be successfully addressed from the viewpoint of the traditional, ‘first-order’ interpretation of Aristotle’s syllogistic, whereby propositions are taken to involve quantification over individuals only. On their view, this interpretation not only is inadequate for formal analysis but also stems from a misunderstanding of Aristotle’s ideas about quantification. In this study, by contrast, I set out to vindicate the adequacy and plausibility of the first-order interpretation. Together with some assumptions about the language of propositions and an appropriate regimentation, the first-order interpretation yields promising solutions to many of the problems raised by the modal syllogistic. Thus, I present a reconstruction of the language of propositions and a formal interpretation thereof which will prove respectful of and responsive to most of the views endorsed by Aristotle in the ‘modal’ chapters of the Analytics.