985 results for Software Reuse, Objects, Concurrency, Actors, Agents
Abstract:
Multimedia objects, especially images and figures, are essential for the visualization and interpretation of research findings. The distribution and reuse of these scientific objects improves significantly under open access conditions, for instance in Wikipedia articles, in the research literature, and in education and knowledge dissemination, where image licensing often represents a serious barrier. Whereas scientific publications are retrievable through library portals and other online search services thanks to standardized indices, there is as yet no targeted retrieval of, or access to, the accompanying images and figures. Consequently, there is great demand for standardized indexing methods for these open access multimedia objects in order to improve their accessibility. With our proposal, we hope to serve the broad audience that looks up a scientific or technical term in a web search portal first. Until now, this audience has had little chance of finding an openly accessible and reusable image closely matching its search term on the first try, frustratingly so even when such an image is in fact included in some open access article.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-07
Abstract:
This paper provides an agent-based software exploration of the well-known free-market efficiency/equality trade-off. Our study simulates the interaction of agents producing, trading, and consuming goods in the presence of different market structures, and looks at how efficient the producers/consumers mapping turns out to be, as well as at the resulting distribution of welfare among agents at the end of an arbitrarily large number of iterations. Two market mechanisms are compared: the competitive market (a double-auction market in which agents outbid each other in order to buy and sell products) and the random one (in which products are allocated randomly). Our results confirm that the superior efficiency of the competitive market (an effective, never-stalling producers/consumers mapping and superior aggregate welfare) comes at a very high price in terms of inequality (above all when severe budget constraints are in play).
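For readers who want to experiment, the contrast between the two mechanisms can be sketched in a few lines of code. The following is a minimal illustration under assumed parameters, not the authors' simulator: in each round, buyers and sellers hold private valuations for one unit of a good; the competitive mechanism matches the highest bids with the lowest asks, while the random mechanism pairs traders arbitrarily. All names (trade_round, bids, asks) and numbers are hypothetical.

```python
import random

def trade_round(buyers, sellers, mechanism="competitive"):
    """One round of single-unit trades; returns total realized surplus.

    buyers, sellers: lists of private valuations (floats).
    """
    if mechanism == "competitive":
        # Double-auction flavour: highest bids meet lowest asks first.
        buyers, sellers = sorted(buyers, reverse=True), sorted(sellers)
    else:
        # Random allocation: pairings ignore prices entirely.
        random.shuffle(buyers)
        random.shuffle(sellers)
    surplus = 0.0
    for bid, ask in zip(buyers, sellers):
        if bid >= ask:          # a trade happens only when it is gainful
            surplus += bid - ask
    return surplus

random.seed(1)
bids = [random.uniform(0, 10) for _ in range(100)]
asks = [random.uniform(0, 10) for _ in range(100)]
print("competitive:", trade_round(list(bids), list(asks)))
print("random:     ", trade_round(list(bids), list(asks), mechanism="random"))
```

Repeated runs show the efficiency gap the abstract reports: price-ordered matching realizes far more total surplus, while random pairing leaves many gainful trades unmade.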
Abstract:
This portfolio thesis describes work undertaken by the author under the Engineering Doctorate programme of the Institute for System Level Integration. It was carried out in conjunction with the sponsor company, Teledyne Defence Limited. A radar warning receiver is a device used to detect and identify the emissions of radars. Such receivers were originally developed during the Second World War and are found today on a variety of military platforms as part of the platform's defensive systems. Teledyne Defence has designed and built components and electronic subsystems for the defence industry since the 1970s. This thesis documents part of the work carried out to create Phobos, Teledyne Defence's first complete radar warning receiver. Phobos was designed to be the first low-cost radar warning receiver, made possible by the reuse of existing Teledyne Defence products, commercial off-the-shelf hardware, and advanced UK government algorithms. The challenges of this integration are described and discussed, with detail given of the software architecture and the development of the embedded application. The performance of the embedded system as a whole is described and qualified within the context of a low-cost system.
Abstract:
Hybridisation is a systematic process by which the characteristic features of hybrid logic, at both the syntactic and the semantic levels, are developed on top of an arbitrary logic framed as an institution. It also captures the construction of first-order encodings of such hybridised institutions into theories in first-order logic. The method was originally developed to build suitable logics for the specification of reconfigurable software systems on top of whatever logic is used to describe the local requirements of each system configuration. Hybridisation has, however, a broader scope, providing a fresh example of yet another development in combining and reusing logics driven by a problem from Computer Science. This paper offers an overview of the method, proposes some new extensions, namely the introduction of full quantification leading to the specification of dynamic modalities, and exemplifies its potential through a didactical application. It is discussed how hybridisation can be successfully used in a formal specification course in which students progress from equational to hybrid specifications in a uniform setting, integrating paradigms, combining data and behaviour, and dealing appropriately with system evolution and reconfiguration.
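As background, the two features that hybridisation transports to an arbitrary base institution are nominals (names for individual states) and satisfaction operators. A minimal reminder of their standard semantics, stated here for plain Kripke models rather than for an arbitrary institution (the general institutional formulation is the subject of the paper):

```latex
% Hybrid-logic satisfaction over a Kripke model M = (W, R, V),
% where each nominal i denotes a unique state V(i) in W.
\begin{align*}
  M, w &\models i            &&\iff w = V(i)
       && \text{(nominals name states)}\\
  M, w &\models @_i\,\varphi &&\iff M, V(i) \models \varphi
       && \text{(evaluate $\varphi$ at the state named $i$)}\\
  M, w &\models \Diamond\varphi &&\iff \exists v\,(w \mathrel{R} v \wedge M, v \models \varphi)
\end{align*}
```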
Abstract:
This article is a result of the research project "Design of a model to improve cost estimation processes for software development companies". It presents a review of the international literature aimed at identifying trends and methods for producing more accurate software cost estimates. Using the Delphi forecasting method, a panel of experts from the software sector of Barranquilla classified and rated five realistic estimation scenarios according to their probability of occurrence. A completely randomized experiment was designed, whose results pointed to two scenarios that were statistically similar in qualitative terms; from these, an analysis model was built around three agents: methodology, team capability, and technological products, each with three compliance levels for achieving more precise estimates.
Abstract:
The central motif of this work is prediction and optimization in the presence of multiple interacting intelligent agents. We use the phrase 'intelligent agents' to imply, in some sense, a 'bounded rationality', the exact meaning of which varies depending on the setting. Our agents may not be 'rational' in the classical game-theoretic sense, in that they do not always optimize a global objective. Rather, they rely on heuristics, as is natural for human agents or even software agents operating in the real world. Within this broad framework we study the problem of influence maximization in social networks, where the behavior of agents is myopic but complication stems from the structure of the interaction networks. In this setting, we generalize two well-known models and give new algorithms and hardness results for our models. We then move on to models where the agents reason strategically but face considerable uncertainty. For such games, we give a new solution concept and analyze a real-world game using our techniques. Finally, the richest model we consider is that of Network Cournot Competition, which deals with strategic resource allocation in hypergraphs, where agents reason strategically and their interaction is specified indirectly via the players' utility functions. For this model, we give the first equilibrium computability results. In all of the above problems, we assume that the payoffs for the agents are known. However, for real-world games, obtaining the payoffs can be quite challenging. To this end, we also study the inverse problem of inferring payoffs given game history. We propose and evaluate a data-analytic framework and show that it is fast and performs well.
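As background for the influence-maximization setting, here is a minimal sketch of the classic greedy approach under the independent cascade model (Kempe, Kleinberg, and Tardos), which generalized models of this kind typically build on. The graph, seed budget, activation probability, and function names are illustrative assumptions, not the thesis's actual models.

```python
import random

def cascade(graph, seeds, p=0.1):
    """One independent-cascade simulation; returns number of activated nodes."""
    active, frontier = set(seeds), list(seeds)
    while frontier:
        node = frontier.pop()
        for nbr in graph.get(node, []):
            # Each newly active node gets one chance to activate each neighbor.
            if nbr not in active and random.random() < p:
                active.add(nbr)
                frontier.append(nbr)
    return len(active)

def greedy_seeds(graph, k, runs=200):
    """Greedily pick k seeds maximizing the estimated expected spread."""
    seeds = set()
    for _ in range(k):
        def gain(v):
            return sum(cascade(graph, seeds | {v}) for _ in range(runs)) / runs
        best = max((v for v in graph if v not in seeds), key=gain)
        seeds.add(best)
    return seeds

# Tiny illustrative graph as adjacency lists.
g = {0: [1, 2], 1: [3], 2: [3, 4], 3: [5], 4: [5], 5: []}
print(greedy_seeds(g, k=2))
```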
Abstract:
Intelligent agents offer a new and exciting way of understanding the world of work. In this paper we apply agent-based modeling and simulation to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between human resource management practices and retail productivity. Although we are working within a relatively novel and complex domain, it is clear that intelligent agents offer potential for fostering sustainable organizational capabilities in the future. The project is still at an early stage. So far we have conducted a case study in a UK department store to collect data and capture impressions about operations and actors within departments. Based on this case study, we have built and tested the first version of a retail branch simulator, which we present in this paper.
Abstract:
In Niger, the maternal mortality rate is estimated at 535 deaths per 100,000 live births (INS, 2013), and the probability that a newborn will die before the age of one month is 33 per 1,000. Since 2006, Niger has implemented a free health care policy for pregnant women and for children aged 0 to 5, which has contributed to a significant improvement in attendance at health centres. In March 2012, a deliberative process was organized during a three-day conference to discuss the achievements, limits, and prospects of this new policy with 160 participants, including researchers, humanitarian workers, policy-makers, and field practitioners. The objective of this research is to understand the effects of this conference and to explore the activities of the road-map follow-up committee. The research was carried out over two months in the summer of 2014 in Niamey and N'guiguimi. It relied on the conceptual framework of Boyko et al. (2012), which describes the main characteristics and expected effects of deliberative dialogues and explains how such dialogues can contribute to evidence-based policy-making. We placed particular emphasis on the three forms of knowledge use presented by Dagenais et al. (2013): instrumental, conceptual, and persuasive. Semi-structured interviews were conducted with 22 actors involved in implementing the recommendations. The interviews were recorded, transcribed in full, and processed with the QDA Miner software. The analysis of the collected material reveals an instrumental use of the recommendations, more visible among humanitarian workers than among decision-makers and civil-society actors. The analysis also shows a conceptual and persuasive use of the recommendations, to a lesser degree, among all actors. The conference's road-map follow-up committee barely functioned at all; consequently, the process did not have the desired impact. The main reasons for this failure relate to the context in which the recommendations were implemented (the arrest of several Ministry of Public Health officials, key members of the follow-up committee, over the misappropriation of GAVI funds, and a lack of technical and political will) and/or to financial conditions (no allowances for committee members and no operating budget). The results shed light on the enormous challenges (contextual and financial in particular) that remain for knowledge transfer in the public health sector in Niger. As a follow-up to the conference, the revitalization of the follow-up committee should be accelerated by providing it with an operating fund and by creating an autonomous agency to manage the free health care policy, and political support around the Santé Solidarité Sahel Initiative should be strengthened.
Abstract:
Some authors have shown the need to understand the technological structuring process in contemporary firms. From this perspective, the software industry is a very important element because it provides products and services directly to many organizations in many fields. The Brazilian software industry, in particular, has peculiarities that distinguish it from the industries of developed countries, which makes understanding it even more relevant. There is evidence that local firms adopt different strategies and structural configurations to enter a market naturally dominated by large multinational firms. Therefore, this study aims to understand not only the structural configurations assumed by domestic firms but also the dynamics and the process that lead to these different configurations. To do so, this PhD dissertation investigates the institutional environment, its entities, and the isomorphic movements, by employing an exploratory, descriptive, and explanatory multiple-case study. Eight software development companies from Recife's information technology cluster were visited; a form was applied, and an interview was conducted with one of each firm's main professionals. Although the study is predominantly qualitative, part of the data was analyzed through charts and graphs, providing an overview of the companies and their environment that proved very useful for the analysis based on the interpretation of the interviews. As a result, it emerged that companies are structured around hybrid business models drawn from two ideal types of software development company: the software factory and the technology-based company. Regarding the development process, a balanced distribution was found between the traditional and agile development paradigms. Among the traditional methodologies, the Rational Unified Process (RUP) is predominant; Scrum is the most used methodology among the organizations based on the Agile Manifesto's principles. Regarding the structuring process, each institutional entity acts in a way that generates different isomorphic pressures. Emphasis was given to entities such as customers, research agencies, clusters, market-leading businesses, public universities, incubators, software industry organizations, technology vendors, development tool suppliers, and the managers' schooling and background, because they relate closely to the software firms. In this relationship, a dual and bilateral influence was found. Finally, the structuring level of the organizational field was identified as low, which gives organizational actors a chance to act independently.
Abstract:
Requirements specification has long been recognized as a critical activity in software development processes because of its impact on project risks when poorly performed. A large number of studies address theoretical aspects, propose techniques, and recommend practices for Requirements Engineering (RE). To be successful, RE has to ensure that the specified requirements are complete and correct, meaning that all intents of the stakeholders in a given business context are covered by the requirements and that no unnecessary requirement is introduced. However, accurately capturing the business intents of the stakeholders remains a challenge, and it is a major factor in software project failures. This master's dissertation presents a novel method, referred to as "Problem-Based SRS", aimed at improving the quality of the Software Requirements Specification (SRS) in the sense that the stated requirements provide suitable answers to the customer's real business issues. In this approach, knowledge about the software requirements is constructed from knowledge about the customer's problems. Problem-Based SRS organizes activities and outcome objects in a process with five main steps. It supports the requirements engineering team in systematically analyzing the business context and specifying the software requirements, taking into account a first glance and vision of the software. The quality of the specifications is evaluated using traceability techniques and axiomatic design principles. The case studies conducted and presented in this document indicate that the proposed method can contribute significantly to improving software requirements specifications.
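The completeness/correctness criterion stated above lends itself to a simple mechanical check once traceability links exist. The sketch below is hypothetical (not the dissertation's tooling): a requirement set is complete if every business problem is covered by some requirement, and introduces nothing unnecessary if every requirement traces back to some problem. All identifiers are invented for illustration.

```python
def check_traceability(problems, requirements, links):
    """links: set of (problem_id, requirement_id) trace pairs."""
    covered = {p for p, _ in links}
    justified = {r for _, r in links}
    missing = set(problems) - covered          # incompleteness
    unnecessary = set(requirements) - justified  # unjustified requirements
    return missing, unnecessary

problems = {"P1", "P2"}
requirements = {"R1", "R2", "R3"}
links = {("P1", "R1"), ("P2", "R2")}
print(check_traceability(problems, requirements, links))
# -> (set(), {'R3'}): every problem is covered, but R3 answers no problem.
```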
Abstract:
Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. The dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built with Petri nets from the user requirements and formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial-order models are automatically extracted from instrumented concurrent program executions, and potential atomicity-violation bugs are automatically verified against these partial-order models using model checking. Our formal specification and verification of Mondex contributes to the worldwide effort to develop a verified software repository. Our method for automatically mining Petri net models from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable, as it exploits the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools must balance precision against coverage. Building on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity-violation prediction: 1) a post-prediction analysis method that increases coverage while ensuring precision; 2) a follow-up replaying method that further increases coverage. Both methods are implemented in a completely automatic tool.
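As context for the kind of bug such tools target, here is a minimal, hypothetical atomicity violation (not an example from the dissertation): two threads perform an unprotected read-modify-write on one shared variable, matching the two-thread, single-variable pattern described above.

```python
import threading

counter = 0  # the single shared variable

def worker():
    global counter
    for _ in range(100_000):
        # Non-atomic read-modify-write: another thread can interleave
        # between the read of `counter` and the write back.
        counter += 1

threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Expected 200000; an interleaving that splits `counter += 1`
# can lose updates, so smaller values are possible.
print(counter)
```

Guarding the increment with a threading.Lock removes the violation, which is exactly the kind of fix a precise predictive report should point to.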
Abstract:
When offshore oil and gas supplies are exhausted, offshore platforms must be decommissioned and removed. The present thesis highlights the importance of evaluating the possibility of reusing decommissioned offshore jacket platforms for offshore wind energy. To convert the structure, the topside must be removed from the substructure and a wind turbine installed in its place. The feasibility of this project was investigated using the finite element analysis software Sesam. To study fatigue life in offshore structures, an exhaustive review of the background and state of the art was carried out. A finite element model was created by means of Sesam, and two different fatigue analysis approaches were applied and compared. Finally, an analysis methodology is suggested for the structural fatigue analysis of offshore wind turbine structures based on international standards, addressing the industry's need to account for the combined effect of wind and hydrodynamic loads on this type of structure.
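The abstract does not name the two fatigue approaches compared, but offshore fatigue assessments of this kind conventionally rest on S-N curves combined with the Palmgren-Miner linear damage rule. As a reminder of that standard formulation (not a statement of the thesis's specific method):

```latex
% Palmgren--Miner linear damage accumulation:
%   n_i = number of stress cycles experienced at stress range S_i
%   N_i = cycles to failure at S_i, read from the S--N curve
%         (commonly \log N = \log a - m \log S in design standards)
D \;=\; \sum_i \frac{n_i}{N_i} \;\le\; 1
```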
Abstract:
Induced cardiac arrest during heart surgery is a common procedure and allows the surgeon to operate in an environment free of blood and movement. Using an isolated rat heart model, the authors compare a new cardioplegic solution containing histidine-tryptophan-glutamate (group 2) with the histidine-tryptophan-alphacetoglutarate solution (group 1) routinely used by some cardiac surgeons. The aim was to assess caspase, IL-8, and Ki-67 in isolated rat hearts using immunohistochemistry. Twenty male Wistar rats were anesthetized and heparinized. The chest was opened, cardiectomy was performed, and 40 ml/kg of the appropriate cardioplegic solution was infused. The hearts were kept for 2 hours at 4 °C in the same solution and thereafter placed in the Langendorff apparatus for 30 minutes with Ringer-Locke solution. Immunohistochemical analyses of caspase, IL-8, and Ki-67 were performed. The concentration of caspase was lower in group 2 and that of Ki-67 higher in group 2, both P<0.05. There was no statistical difference in IL-8 values between the groups. The histidine-tryptophan-glutamate solution performed better than the histidine-tryptophan-alphacetoglutarate solution because it reduced caspase (apoptosis), increased Ki-67 (cell proliferation), and showed no difference in IL-8 levels compared with group 1. This suggests that the histidine-tryptophan-glutamate solution was more efficient than the histidine-tryptophan-alphacetoglutarate solution for the preservation of rat cardiomyocytes.
Abstract:
Perianal fistulizing Crohn's disease is one of the most severe phenotypes of inflammatory bowel disease. Combined therapy with seton placement and anti-TNF therapy is the most common strategy for this condition. The aim of this study was to analyze the rates of complete perianal remission after combined therapy for perianal fistulizing Crohn's disease. This was a retrospective observational study of perianal fistulizing Crohn's disease patients submitted to combined therapy at four inflammatory bowel disease referral centers. We analyzed the patients' demographic characteristics, Montreal classification, concomitant medication, fistula classification, occurrence of complete perianal remission, and recurrence after remission. Complete perianal remission was defined as the absence of drainage from the fistulae together with seton removal. A total of 78 patients were included, 44 (55.8%) of them female, with a mean age of 33.8 (±15) years. Most patients were treated with infliximab (66.2%); the remainder received adalimumab (33.8%). Complex fistulae were found in 52/78 patients (66.7%). After a mean follow-up of 48.2 months, 41/78 patients (52.6%) achieved complete perianal remission (95% CI: 43.5%-63.6%). Recurrence occurred in four patients (9.8%) (95% CI: 0.7%-18.8%) over an average period of 74.8 months. Combined therapy led to favorable and durable results in perianal fistulizing Crohn's disease.