965 results for GOAL PROGRAMMING APPROACH
Abstract:
Dissertation submitted to obtain the degree of Doctor in Education Sciences
Abstract:
Dissertation submitted to obtain the degree of Master in Informatics Engineering
Abstract:
Hybrid knowledge bases are knowledge bases that combine ontologies with non-monotonic rules, joining the best of open-world ontologies and closed-world rules. Ontologies provide a good mechanism for sharing knowledge on the Web in a form that both humans and machines can understand; rules, on the other hand, can be used, e.g., to encode legal regulations or to map between sources of information. Given the dynamics present on the Web today, it is important for hybrid knowledge bases to capture these dynamics and adapt themselves accordingly. To achieve that, it is necessary to create mechanisms capable of monitoring the information flow on the Web. To date, there are no such mechanisms that allow for monitoring events and modifying hybrid knowledge bases autonomously. The goal of this thesis is therefore to create a system that combines hybrid knowledge bases with reactive rules, aiming to monitor events and perform actions over a knowledge base. To this end, a reactive system for the Semantic Web is developed in a logic-programming-based approach, accompanied by a language for heterogeneous rule base evolution based on the RIF Production Rule Dialect, a standard for exchanging rules over the Web.
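A minimal sketch of the event-condition-action style of reactivity described above, with the hybrid knowledge base abstracted as a plain fact set; all names and the encoding are hypothetical illustrations, not the thesis's actual RIF-PRD-based language:

```python
# Minimal sketch of an event-condition-action (ECA) reactive rule loop.
# The knowledge base and event encodings are hypothetical simplifications.

knowledge_base = {("employee", "alice")}  # hybrid KB abstracted as a fact set

def on_event(event):
    """React to an incoming Web event by updating the knowledge base."""
    kind, subject = event
    # Condition: only act when the fact is not already in the KB.
    if kind == "hired" and ("employee", subject) not in knowledge_base:
        # Action: assert a new fact, evolving the rule base.
        knowledge_base.add(("employee", subject))
    elif kind == "fired":
        knowledge_base.discard(("employee", subject))

for event in [("hired", "bob"), ("fired", "alice")]:
    on_event(event)

print(knowledge_base)  # {("employee", "bob")}
```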
Abstract:
This work studies the combination of safe and probabilistic reasoning through the hybridization of Monte Carlo integration techniques with continuous constraint programming. In continuous constraint programming, variables range over continuous domains (represented as intervals) together with constraints over them (relations between variables), and the goal is to find values for those variables that satisfy all the constraints (consistent scenarios). Constraint programming "branch-and-prune" algorithms produce safe enclosures of all consistent scenarios. Dedicated algorithms proposed for probabilistic constraint reasoning compute the probability of sets of consistent scenarios, which implies calculating an integral over these sets (quadrature). In this work we propose to extend the "branch-and-prune" algorithms with Monte Carlo integration techniques to compute such probabilities. This approach can be useful in robotics for localization problems. Traditional approaches are based on probabilistic techniques that search for the most likely scenario, which may not satisfy the model constraints. We show how to apply our approach in order to cope with this problem and provide functionality in real time.
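A toy sketch of the hybrid scheme under stated assumptions: interval evaluation prunes boxes that are provably inside or outside the constraint, and Monte Carlo quadrature estimates the probability mass of the remaining boundary boxes. The instance here is P(x² + y² ≤ 1) with x, y uniform on [0, 1], whose exact value is π/4; depths and sample counts are illustrative choices:

```python
import random

# Boxes are pruned when interval evaluation proves them fully inside or
# fully outside the constraint; Monte Carlo handles the boundary boxes.

def interval_sq_sum(box):
    (xl, xh), (yl, yh) = box
    lo = xl**2 + yl**2          # min of x^2+y^2 on a first-quadrant box
    hi = xh**2 + yh**2          # max on the same box
    return lo, hi

def probability(box, depth=0, max_depth=8, samples=200):
    lo, hi = interval_sq_sum(box)
    area = (box[0][1] - box[0][0]) * (box[1][1] - box[1][0])
    if hi <= 1.0:                # whole box is consistent
        return area
    if lo > 1.0:                 # whole box is inconsistent: prune
        return 0.0
    if depth == max_depth:       # boundary box: Monte Carlo quadrature
        hits = sum(random.uniform(*box[0])**2 + random.uniform(*box[1])**2 <= 1.0
                   for _ in range(samples))
        return area * hits / samples
    (xl, xh), (yl, yh) = box
    xm, ym = (xl + xh) / 2, (yl + yh) / 2   # branch: split into 4 sub-boxes
    return sum(probability(b, depth + 1, max_depth, samples)
               for b in [((xl, xm), (yl, ym)), ((xm, xh), (yl, ym)),
                         ((xl, xm), (ym, yh)), ((xm, xh), (ym, yh))])

print(probability(((0.0, 1.0), (0.0, 1.0))))   # ~ pi/4 = 0.785...
```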
Abstract:
This study analyses financial data using the result characterization of a self-organized neural network model. The goal was to prototype a tool that may help an economist or a market analyst to analyse stock market series. To reach this goal, the tool shows economic dependencies and statistical measures over stock market series. The SOM (self-organizing map) neural network model was used to extract behavioural patterns from the data analysed. Based on this model, an application was developed to analyse financial data. This application takes as input a portfolio of correlated or inverse-correlated markets. After the analysis with SOM, the result is represented by micro clusters organized by their behaviour tendency. During the study, the need arose for a better analysis of the SOM algorithm's results. This problem was solved with a clustering technique that groups the micro clusters obtained from SOM U-Matrix analyses. The study showed that correlated and inverse-correlated markets project multiple clusters of data. These clusters represent multiple trend states that may be useful for technical professionals.
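A minimal SOM sketch under stated assumptions: random data stands in for market returns, and the grid size, decay schedules, and all names are illustrative choices, not the study's configuration:

```python
import numpy as np

# A minimal self-organizing map: a 6x6 grid of weight vectors trained on
# (hypothetical) daily returns of correlated market series.

rng = np.random.default_rng(0)
data = rng.normal(size=(500, 4))            # stand-in for real market returns
grid_w, grid_h, dim = 6, 6, data.shape[1]
weights = rng.normal(size=(grid_w, grid_h, dim))
coords = np.array([[i, j] for i in range(grid_w) for j in range(grid_h)])

for t, x in enumerate(data):
    lr = 0.5 * (1 - t / len(data))                   # decaying learning rate
    radius = max(1.0, 3.0 * (1 - t / len(data)))     # shrinking neighbourhood
    dists = np.linalg.norm(weights - x, axis=2)
    bmu = np.unravel_index(dists.argmin(), dists.shape)  # best-matching unit
    grid_d = np.linalg.norm(coords - np.array(bmu), axis=1).reshape(grid_w, grid_h)
    h = np.exp(-(grid_d**2) / (2 * radius**2))           # Gaussian neighbourhood
    weights += lr * h[..., None] * (x - weights)

# Each input maps to its BMU; units with similar weights form micro clusters.
print(np.unravel_index(np.linalg.norm(weights - data[0], axis=2).argmin(),
                       (grid_w, grid_h)))
```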
Abstract:
The Intel® Xeon Phi™ is the first processor based on Intel's MIC (Many Integrated Cores) architecture. It is a co-processor specially tailored for data-parallel computations, whose basic architectural design is similar to that of GPUs (Graphics Processing Units), leveraging the use of many integrated simple cores to perform parallel computations. The main novelty of the MIC architecture, relative to GPUs, is its compatibility with the Intel x86 architecture. This enables the use of many of the tools commonly available for the parallel programming of x86-based architectures, which may lead to a gentler learning curve. However, programming the Xeon Phi still entails aspects intrinsic to accelerator-based computing in general, and to the MIC architecture in particular. In this thesis we advocate the use of algorithmic skeletons for programming the Xeon Phi. Algorithmic skeletons abstract the complexity inherent to parallel programming, hiding details such as resource management, parallel decomposition, and inter-execution-flow communication, thus removing these concerns from the programmer's mind. In this context, the goal of the thesis is to lay the foundations for the development of a simple but powerful and efficient skeleton framework for programming the Xeon Phi processor. For this purpose we build upon Marrow, an existing framework for the orchestration of OpenCL™ computations in multi-GPU and CPU environments. We extend Marrow to execute both OpenCL and C++ parallel computations on the Xeon Phi. To evaluate the newly developed framework, several well-known benchmarks, such as Saxpy and N-Body, are used to compare not only its performance against the existing framework when executing on the co-processor, but also the performance of the Xeon Phi versus a multi-GPU environment.
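A sketch of the skeleton idea under stated assumptions: Marrow itself is a C++/OpenCL framework, so this Python stand-in only illustrates how a map skeleton hides resource management and parallel decomposition behind a per-element kernel:

```python
from multiprocessing import Pool

# A map skeleton: the programmer supplies only the per-element kernel; the
# skeleton hides resource management and parallel decomposition, which is
# the role Marrow plays for OpenCL/C++ computations on the Xeon Phi.

def map_skeleton(kernel, data, workers=4):
    """Apply `kernel` to every element of `data` in parallel."""
    with Pool(processes=workers) as pool:      # resource management hidden
        return pool.map(kernel, data)          # decomposition hidden

def saxpy_kernel(pair, a=2.0):
    """One element of Saxpy, a*x + y (one of the benchmarks cited above)."""
    x, y = pair
    return a * x + y

if __name__ == "__main__":
    xs, ys = range(8), range(8)
    print(map_skeleton(saxpy_kernel, list(zip(xs, ys))))
```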
Abstract:
Companies are increasingly dependent on distributed web-based software systems to support their businesses. This increases the need to maintain and extend software systems with up-to-date new features. Thus, the development process to introduce new features usually needs to be swift and agile, and the supporting software evolution process needs to be safe, fast, and efficient. However, this is usually a difficult and challenging task for a developer, due to the lack of support offered by programming environments, frameworks, and database management systems. Changes needed at the code level, in the database model, and in the actual data contained in the database must be planned and developed together and executed in a synchronized way. Even under a careful development discipline, the impact of changing an application's data model is hard to predict. The lifetime of an application comprises changes and updates designed and tested using data that is usually far from the real production data. So, coding DDL and DML SQL scripts to update the database schema and data is the usual (and hard) approach taken by developers. Such a manual approach is error-prone and disconnected from the real data in production, because developers may not know the exact impact of their changes. This work aims to improve the maintenance process in the context of Agile Platform by Outsystems. Our goal is to design and implement new data-model evolution features that ensure safe support for change and a sound migration process. Our solution includes impact analysis mechanisms targeting the data model and the data itself. This provides developers with a safe, simple, and guided evolution process.
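A sketch of the kind of impact analysis the work targets: diff two versions of a data model and flag changes that may lose or corrupt production data. The model representation and classification rules are hypothetical simplifications, not the actual solution:

```python
# Diff an old and a new data model and classify the migration steps.
# Model encodings and the SAFE/RISKY/LOSSY rules are illustrative only.

old_model = {"Customer": {"id": "INT", "name": "TEXT", "age": "TEXT"}}
new_model = {"Customer": {"id": "INT", "name": "TEXT", "age": "INT"}}

def analyse(old, new):
    for table, cols in new.items():
        for col, new_type in cols.items():
            old_type = old.get(table, {}).get(col)
            if old_type is None:
                print(f"SAFE:  ALTER TABLE {table} ADD COLUMN {col} {new_type};")
            elif old_type != new_type:
                # Narrowing conversions depend on the actual production data.
                print(f"RISKY: ALTER TABLE {table} ALTER COLUMN {col} "
                      f"TYPE {new_type};  -- verify existing rows convert")
        for col in set(old.get(table, {})) - set(cols):
            print(f"LOSSY: ALTER TABLE {table} DROP COLUMN {col};")

analyse(old_model, new_model)
```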
Abstract:
Linear logic has long been heralded for its potential to provide a logical basis for concurrency. While many research attempts were made in this regard over the years, a Curry-Howard correspondence between linear logic and concurrent computation was only found recently, bridging the proof theory of linear logic and session-typed process calculus. Building upon this work, we have developed a theory of intuitionistic linear logic as a logical foundation for session-based concurrent computation. We explore several concurrency-related phenomena, such as value-dependent session types and polymorphic sessions, within our logical framework in an arguably clean and elegant way, establishing with relative ease strong typing guarantees due to the logical basis; these ensure the fundamental properties of type preservation and global progress, entailing the absence of deadlocks in communication. We develop a general-purpose concurrent programming language based on the logical interpretation, combining functional programming with a concurrent, session-based process layer through the form of a contextual monad, preserving our strong guarantees of type preservation and deadlock-freedom in the presence of general recursion and higher-order process communication. We introduce a notion of linear logical relations for session-typed concurrent processes, developing an arguably uniform technique for reasoning about sophisticated properties of session-based concurrent computation, such as termination or equivalence, based on our logical approach, further supporting our goal of establishing intuitionistic linear logic as a logical foundation for session-based concurrency.
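For reference, the Curry-Howard correspondence mentioned above reads (intuitionistic) linear logic propositions as session types; a standard fragment of the assignment is:

```latex
\begin{align*}
A \otimes B   &\;\mapsto\; \text{output a channel of type } A,\ \text{continue as } B\\
A \multimap B &\;\mapsto\; \text{input a channel of type } A,\ \text{continue as } B\\
A \oplus B    &\;\mapsto\; \text{internal choice between } A \text{ and } B\\
A \,\&\, B    &\;\mapsto\; \text{external choice between } A \text{ and } B\\
\mathbf{1}    &\;\mapsto\; \text{terminated session}\\
!A            &\;\mapsto\; \text{replicated (shared) session offering } A
\end{align*}
```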
Abstract:
Machine ethics is an interdisciplinary field of inquiry that emerges from the need to imbue autonomous agents with the capacity for moral decision-making. While some approaches provide implementations in Logic Programming (LP) systems, they have not exploited LP-based reasoning features that appear essential for moral reasoning. This PhD thesis aims at investigating further the appropriateness of LP, notably a combination of LP-based reasoning features, including techniques available in LP systems, to machine ethics. Moral facets, as studied in moral philosophy and psychology, that are amenable to computational modeling are identified and mapped to appropriate LP concepts for representing and reasoning about them. The main contributions of the thesis are twofold. First, novel approaches are proposed for employing tabling in contextual abduction and updating, individually and combined, plus an LP approach to counterfactual reasoning; the latter is implemented on top of the aforementioned combined abduction and updating technique with tabling. These are all important for modeling various issues of the aforementioned moral facets. Second, a variety of LP-based reasoning features are applied to model the identified moral facets, through moral examples taken off the shelf from the morality literature. These applications include: (1) modeling moral permissibility according to the Doctrines of Double Effect (DDE) and Triple Effect (DTE), demonstrating deontological and utilitarian judgments via integrity constraints (in abduction) and preferences over abductive scenarios; (2) modeling moral reasoning under uncertainty of actions, via abduction and probabilistic LP; (3) modeling moral updating (which allows other, possibly overriding, moral rules to be adopted by an agent, on top of those it currently follows) via the integration of tabling in contextual abduction and updating; and (4) modeling moral permissibility and its justification via counterfactuals, where counterfactuals are used for formulating DDE.
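A toy sketch, in ordinary code rather than LP, of how an integrity constraint over abductive scenarios can encode a DDE-style judgment; the trolley-case encoding and all names are hypothetical illustrations, not the thesis's LP formulation:

```python
# Abductive scenarios for classic trolley cases; an integrity constraint in
# the spirit of the Doctrine of Double Effect (DDE) rejects scenarios in
# which harm is used as a means to the good end. Hypothetical encoding.

scenarios = [
    {"name": "divert",  "saves": 5, "kills": 1, "harm_as_means": False},
    {"name": "push",    "saves": 5, "kills": 1, "harm_as_means": True},
    {"name": "abstain", "saves": 0, "kills": 5, "harm_as_means": False},
]

def dde_admissible(s):
    """Integrity constraint: harm may be a side effect, never a means."""
    return not s["harm_as_means"]

def preference(s):
    """Prefer admissible scenarios with the best lives saved/lost balance."""
    return s["saves"] - s["kills"]

admissible = [s for s in scenarios if dde_admissible(s)]
print(max(admissible, key=preference)["name"])   # -> "divert"
```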
Abstract:
Earthworks involve the levelling or shaping of a target area through the moving or processing of the ground surface. Most construction projects require earthworks, which are heavily dependent on mechanical equipment (e.g., excavators, trucks, and compactors). Often, earthworks are the most costly and time-consuming component of infrastructure constructions (e.g., roads, railways, and airports), and current pressure for higher productivity and safety highlights the need to optimize earthworks, which is a nontrivial task. Most previous attempts at tackling this problem focus on single-objective optimization of partial processes or aspects of earthworks, overlooking the advantages of multi-objective and global optimization. This work describes a novel optimization system based on an evolutionary multi-objective approach, capable of globally optimizing several objectives simultaneously and dynamically. The proposed system views an earthwork construction as a production line, where the goal is to optimize resources under two crucial criteria (cost and duration), focusing the evolutionary search (non-dominated sorting genetic algorithm II) on compaction allocation and using linear programming to distribute the remaining equipment (e.g., excavators). Several experiments were held using real-world data from a Portuguese construction site, showing that the proposed system is quite competitive when compared with current manual earthwork equipment allocation.
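The core of the evolutionary search cited above (NSGA-II) is the non-dominated sort, which ranks solutions into Pareto fronts over the two criteria, cost and duration, both minimized. A minimal sketch with hypothetical (cost, duration) pairs for candidate equipment allocations:

```python
def dominates(a, b):
    """a dominates b if it is no worse in both criteria and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(solutions):
    """Peel off successive Pareto fronts, best (rank 1) first."""
    fronts, remaining = [], list(solutions)
    while remaining:
        front = [s for s in remaining
                 if not any(dominates(o, s) for o in remaining if o is not s)]
        fronts.append(front)
        remaining = [s for s in remaining if s not in front]
    return fronts

# (cost, duration) pairs for hypothetical equipment allocations
allocations = [(100, 30), (120, 25), (90, 40), (130, 45), (95, 35)]
for rank, front in enumerate(non_dominated_sort(allocations), start=1):
    print(f"front {rank}: {front}")
```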
Abstract:
When a pregnant woman is directed to a hospital for obstetric care, many outcomes are possible, depending on her current condition. An improved understanding of these conditions could enable a more direct medical approach by categorizing the different types of patients, enabling a faster response to risk situations and therefore increasing the quality of services. In this case study, the characteristics of the patients admitted to the maternity care unit of Centro Hospitalar do Porto are examined, allowing the patients to be categorized through clustering techniques. The main goal is to predict the patients' route through maternity care, adapting the services according to their conditions and providing the best clinical decisions and a cost-effective treatment to patients. The models developed presented very interesting results, with the best clustering evaluation index being 0.65. The evaluation of the clustering algorithms proved the viability of using clustering-based data mining models to characterize pregnant patients, identifying which conditions can be used as an alert to prevent the occurrence of medical complications.
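A sketch of the kind of clustering pipeline the study describes, with scikit-learn and k-means standing in for whatever tooling and algorithms were actually used; the synthetic features and the use of the silhouette coefficient as the evaluation index are assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Hypothetical admission features (e.g., age, gestational week, blood pressure)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc, 0.5, size=(100, 3)) for loc in (0.0, 3.0, 6.0)])

# Pick the cluster count that maximizes the silhouette coefficient.
best = max(range(2, 8),
           key=lambda k: silhouette_score(
               X, KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)))
labels = KMeans(n_clusters=best, n_init=10, random_state=0).fit_predict(X)
print(best, round(silhouette_score(X, labels), 2))
```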
Abstract:
Renal failure means that one's kidneys have unexpectedly stopped functioning, i.e., once chronic disease is exposed, the presence or degree of kidney dysfunction and its progression must be assessed, and the underlying syndrome has to be diagnosed. Although the patient's history and physical examination may denote good practice, some key information has to be obtained from the valuation of the glomerular filtration rate and the analysis of serum biomarkers. Indeed, chronic kidney disease denotes anomalous kidney function and/or structure, i.e., there is evidence that treatment may avoid or delay its progression, either by reducing or preventing the development of some associated complications, namely hypertension, obesity, diabetes mellitus, and cardiovascular complications. Acute kidney injury appears abruptly, with a rapid deterioration of renal function, but is often reversible if it is recognized early and treated promptly. In both situations, i.e., acute kidney injury and chronic kidney disease, an early intervention can significantly improve the prognosis. The assessment of these pathologies is therefore mandatory, although it is hard to do with traditional methodologies and existing tools for problem solving. Hence, in this work, we will focus on the development of a hybrid decision support system, in terms of its knowledge representation and reasoning procedures based on Logic Programming, that will allow one to consider incomplete, unknown, and even contradictory information, complemented with an approach to computing centered on Artificial Neural Networks, in order to weigh the Degree-of-Confidence that one has in such a happening. The present study involved 558 patients with an average age of 51.7 years, and chronic kidney disease was observed in 175 cases. The dataset comprises twenty-four variables, grouped into five main categories. The proposed model showed a good performance in the diagnosis of chronic kidney disease, since the sensitivity and the specificity exhibited values ranging between 93.1–94.9% and 91.9–94.2%, respectively.
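For reference, the sensitivity and specificity figures quoted above are computed from the confusion matrix as follows; the counts below are illustrative splits consistent with the study's cohort sizes, not its actual results:

```python
# Sensitivity = TP / (TP + FN); Specificity = TN / (TN + FP).
# The counts below are hypothetical, not the study's actual confusion matrix.

tp, fn = 165, 10    # split of the 175 chronic kidney disease cases (assumed)
tn, fp = 360, 23    # split of the remaining 383 patients (assumed)

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity = {sensitivity:.1%}, specificity = {specificity:.1%}")
```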
Abstract:
Parchment is a multifaceted material made from animal skin, which has been used for centuries as a writing support or for bookbinding. Due to the historic value of objects made of parchment, understanding their degradation and their condition is of the utmost importance to archives, libraries, and museums, i.e., the assessment of parchment degradation is mandatory, although it is hard to do with traditional methodologies and tools for problem solving. Hence, in this work we focus on the development of a hybrid decision support system, in terms of its knowledge representation and reasoning procedures, under a formal framework based on Logic Programming, complemented with an approach to computing centered on Artificial Neural Networks, to evaluate parchment degradation and the respective Degree-of-Confidence that one has in such a happening.
Abstract:
Thrombotic disorders have severe consequences for patients and for society in general, being one of the main causes of death. These facts reveal that it is extremely important to be preventive, being aware of how probable it is to develop that kind of syndrome. Indeed, this work focuses on the development of a decision support system that caters for an individual risk evaluation with respect to the onset of thrombotic complaints. The Knowledge Representation and Reasoning procedures used are based on an extension to the Logic Programming language, allowing the handling of incomplete and/or default data. The computational framework in place is centered on Artificial Neural Networks.
Abstract:
About 90% of breast cancers do not cause death if detected at an early stage and treated properly. Indeed, no specific cause for the illness is yet known. There may be not a single origin but rather a set of associations that determines the onset of the disease. Undeniably, there are some factors that seem to be associated with a boosted risk of the malady. For the present study, different breast cancer risk assessment models were considered. It is our intention to develop a hybrid decision support system, under a formal framework based on Logic Programming for knowledge representation and reasoning, complemented with an approach to computing centered on Artificial Neural Networks, to evaluate the risk of developing breast cancer and the respective Degree-of-Confidence that one has in such a happening.