812 results for multi-class queueing systems


Relevance: 30.00%

Abstract:

Aeromonas salmonicida AS03, a potential fish pathogen, was isolated from Atlantic salmon, Salmo salar, in 2003. This strain was found to be resistant to ≥1000 mM HgCl2 and ≥32 mM phenylmercuric acetate, as well as to multiple antimicrobials. Mercury (Hg) and antibiotic resistance genes are often located on the same mobile genetic elements, so the genetic determinants of both resistances and the possibility of horizontal gene transfer were examined. Specific PCR primers were used to amplify and sequence distinctive regions of the mer operon. A. salmonicida AS03 was found to have a pDU1358-like broad-spectrum mer operon, containing merB as well as merA, merD, merP, merR and merT, most similar to that of Klebsiella pneumoniae plasmid pRMH760. To our knowledge, the mer operon has never before been documented in Aeromonas spp. PCR and gene sequencing were used to identify class 1 integron-associated antibiotic resistance determinants and the tetA tetracycline resistance gene. The transposase and resolvase genes of Tn1696 were identified through PCR and sequencing with Tn21-specific PCR primers. We provide phenotypic and genotypic evidence that the mer operon, the aforementioned antibiotic resistances, and the Tn1696 transposition module are located on a single plasmid or conjugative transposon that can be transferred to E. coli DH5α through conjugation in the presence of low-level Hg and in the absence of any antibiotic selective pressure. Additionally, the presence of low-level Hg or chloramphenicol in the mating media was found to stimulate conjugation, significantly increasing the transfer frequency above that measured with mating media lacking both antibiotics and Hg. This research demonstrates that mercury indirectly selects for the dissemination of the antibiotic resistance genes of A. salmonicida AS03.


Relevance: 30.00%

Abstract:

For many years, drainage design was mainly about providing sufficient network capacity. This traditional approach has been successful with the aid of computer software and technical guidance. However, drainage design criteria have been evolving due to rapid population growth, urbanisation, climate change and increasing sustainability awareness. Sustainable drainage systems that bring benefits in addition to water management have been recommended as better alternatives to conventional pipes and storage. Although the concepts and good practice guidance have been communicated to decision makers and the public for years, network capacity still remains a key design focus in many circumstances, while the additional benefits are generally treated as secondary. Yet the picture is changing. The industry is beginning to realise that delivering multiple benefits should be given the top priority, while the drainage service itself can be considered a secondary benefit instead. The shift in focus means the industry has to adapt to new design challenges. New guidance and computer software are needed to assist decision makers. For this purpose, we developed a new decision support system. The system consists of two main components: a multi-criteria evaluation framework for drainage systems and a multi-objective optimisation tool. Users can systematically quantify the performance, life-cycle costs and benefits of different drainage systems using the evaluation framework. The optimisation tool can assist users in determining combinations of design parameters, such as the sizes, order and type of drainage components, that maximise multiple benefits. In this paper, we will focus on the optimisation component of the decision support framework. The optimisation problem formulation, parameters and general configuration will be discussed. We will also look at the sensitivity of individual variables and the benchmark results obtained using common multi-objective optimisation algorithms. The work described here is the output of an EngD project funded by EPSRC and XP Solutions.

Relevance: 30.00%

Abstract:

This study contributes a rigorous diagnostic assessment of state-of-the-art multiobjective evolutionary algorithms (MOEAs) and highlights key advances that the water resources field can exploit to better discover the critical tradeoffs constraining our systems. This study provides the most comprehensive diagnostic assessment of MOEAs for water resources to date, exploiting more than 100,000 MOEA runs and trillions of design evaluations. The diagnostic assessment measures the effectiveness, efficiency, reliability, and controllability of ten benchmark MOEAs for a representative suite of water resources applications addressing rainfall-runoff calibration, long-term groundwater monitoring (LTM), and risk-based water supply portfolio planning. The suite of problems encompasses a range of challenging problem properties including (1) many-objective formulations with 4 or more objectives, (2) multi-modality (or false optima), (3) nonlinearity, (4) discreteness, (5) severe constraints, (6) stochastic objectives, and (7) non-separability (also called epistasis). The applications are representative of the dominant problem classes that have shaped the history of MOEAs in water resources and that will be dominant foci in the future. Recommendations are provided for which modern MOEAs should serve as tools and benchmarks in the future water resources literature.
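
As an illustration of the kind of diagnostic measurement such an assessment relies on, the sketch below (not taken from the study) computes a simple reliability statistic: the fraction of random-seed runs whose final approximation set attains a target share of the best hypervolume found, using a basic two-objective hypervolume routine. Function names, the reference point and the threshold are hypothetical.

```python
def pareto_front(points):
    """Keep the non-dominated points of a 2-objective minimization set."""
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)]

def hypervolume_2d(points, ref):
    """Area dominated by a 2-objective (minimization) front w.r.t. a reference
    point; ref must be worse than every point in both objectives."""
    front = sorted(pareto_front(points))          # f1 ascending, f2 descending
    hv = 0.0
    for i, (f1, f2) in enumerate(front):
        nxt_f1 = front[i + 1][0] if i + 1 < len(front) else ref[0]
        hv += (nxt_f1 - f1) * (ref[1] - f2)
    return hv

def reliability(run_fronts, ref, target_fraction=0.9):
    """Fraction of random-seed runs whose final front attains a target share
    of the best hypervolume observed across all runs."""
    hvs = [hypervolume_2d(front, ref) for front in run_fronts]
    best = max(hvs)
    return sum(hv >= target_fraction * best for hv in hvs) / len(hvs)

# three hypothetical runs of one MOEA on a 2-objective problem
runs = [[(1.0, 3.0), (2.0, 1.0)], [(1.5, 3.5), (2.5, 2.0)], [(1.1, 2.9)]]
print(reliability(runs, ref=(4.0, 4.0)))          # -> 0.333...
```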

Relevance: 30.00%

Abstract:

Most water distribution systems (WDSs) need rehabilitation due to aging infrastructure, which leads to decreasing capacity, increasing leakage and, consequently, low performance of the WDS. However, an appropriate strategy, including the location and timing of pipeline rehabilitation within a limited budget, is the main challenge, and it has been addressed frequently by researchers and practitioners. On the other hand, the selection of appropriate rehabilitation techniques and material types is another main issue which has yet to be addressed properly. The latter can affect the environmental impacts of a rehabilitation strategy, which is relevant to mitigating global warming and the consequent climate change. This paper presents a multi-objective optimisation model for rehabilitation strategies in WDSs addressing the above criteria, focused mainly on greenhouse gas (GHG) emissions, whether arising directly from fossil fuel and electricity use or indirectly from the embodied energy of materials. Thus, the objective functions are to minimise: (1) the total cost of rehabilitation, including capital and operational costs; (2) the amount of leakage; (3) GHG emissions. The Pareto-optimal front containing optimal solutions is determined using the Non-dominated Sorting Genetic Algorithm (NSGA-II). Decision variables in this optimisation problem are classified into two groups: (1) the percentage proportion of each rehabilitation technique applied each year; (2) the material types of new pipes laid for rehabilitation each year. The rehabilitation techniques used here include replacement, rehabilitation and lining, cleaning, and pipe duplication. The developed model is demonstrated through its application to the Mahalat WDS, located in the central part of Iran. The rehabilitation strategy is analysed over a 40-year planning horizon. A number of conventional techniques for selecting pipes for rehabilitation are analysed in this study. The results show that the optimal rehabilitation strategy considering GHG emissions is able to reduce total expenditure and efficiently decrease leakage from the WDS whilst meeting environmental criteria.
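
To make the formulation concrete, the following sketch (not from the paper) shows how such a three-objective evaluation might be encoded: yearly decision variables give the proportion of each rehabilitation technique and the pipe material, and the evaluator returns total cost, cumulative leakage and GHG emissions for a minimising algorithm such as NSGA-II. All unit costs, leakage factors and emission factors are illustrative placeholders.

```python
# Hypothetical per-year decision encoding and objective evaluation.
MATERIALS = {"PVC": {"unit_cost": 80.0, "embodied_co2": 60.0},
             "ductile_iron": {"unit_cost": 120.0, "embodied_co2": 140.0}}

def evaluate(plan, horizon=40):
    """plan: list of yearly decisions, each a dict with
       'shares'   - fraction of the network treated per technique that year
                    (replacement, lining, cleaning, duplication)
       'material' - pipe material used for new pipes that year.
    Returns the three objectives (total cost, leakage, GHG) to be minimised."""
    cost = leakage = ghg = 0.0
    leak_level = 1.0                               # normalised initial leakage
    for year in plan[:horizon]:
        mat = MATERIALS[year["material"]]
        work = sum(year["shares"].values())        # total rehabilitation effort
        cost += work * mat["unit_cost"]            # capital cost (placeholder)
        ghg += work * mat["embodied_co2"]          # embodied energy of materials
        ghg += 5.0                                 # direct fuel/electricity (placeholder)
        leak_level *= (1.0 - 0.4 * year["shares"].get("replacement", 0.0)
                           - 0.2 * year["shares"].get("lining", 0.0))
        leakage += leak_level                      # leakage accumulated over the horizon
    return cost, leakage, ghg

# a flat strategy: replace 5% and line 5% of the network every year with PVC
plan = [{"shares": {"replacement": 0.05, "lining": 0.05}, "material": "PVC"}] * 40
print(evaluate(plan))
```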

Relevance: 30.00%

Abstract:

This paper describes the formulation of a Multi-objective Pipe Smoothing Genetic Algorithm (MOPSGA) and its application to the least-cost water distribution network design problem. Evolutionary algorithms have been widely utilised for the optimisation of both theoretical and real-world non-linear optimisation problems, including water system design and maintenance problems. In this work we present a pipe smoothing based approach to the creation and mutation of chromosomes which utilises engineering expertise with a view to increasing the performance of the algorithm whilst promoting engineering feasibility within the population of solutions. MOPSGA is based upon the standard Non-dominated Sorting Genetic Algorithm-II (NSGA-II) and incorporates a modified population initialiser and mutation operator which directly target elements of a network with the aim of increasing network smoothness (in terms of progression from one diameter to the next) using network element awareness and an elementary heuristic. The pipe smoothing heuristic used in this algorithm is based upon a fundamental principle employed by water system engineers when designing water distribution pipe networks: the diameter of any pipe is never greater than the sum of the diameters of the pipes directly upstream, resulting in a transition from large to small diameters from the source to the extremities of the network. MOPSGA is assessed on a number of water distribution network benchmarks from the literature, including some real-world, large-scale systems. The performance of MOPSGA is directly compared to that of NSGA-II with regard to solution quality, engineering feasibility (network smoothness) and computational efficiency. MOPSGA is shown to promote both engineering and hydraulic feasibility whilst attaining good infrastructure costs compared to NSGA-II.
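
The heuristic itself is easy to state in code. The sketch below (a simplified illustration, not the MOPSGA implementation) checks the smoothing rule that a pipe's diameter should not exceed the sum of the diameters of the pipes directly upstream, and repairs violations by clamping to the largest commercial size that satisfies the rule; the network encoding and the size list are hypothetical.

```python
def smoothness_violations(diameters, upstream):
    """diameters: {pipe_id: diameter_mm}; upstream: {pipe_id: [ids of pipes
    immediately upstream]}. Returns pipes that break the smoothing rule."""
    bad = []
    for pipe, d in diameters.items():
        ups = upstream.get(pipe, [])
        if ups and d > sum(diameters[u] for u in ups):
            bad.append(pipe)
    return bad

def smooth_repair(diameters, upstream, candidate_sizes):
    """Clamp each violating pipe to the largest available size that respects
    the rule (a simple mutation-style repair; fixing one pipe may expose
    further downstream violations, so a real operator would iterate)."""
    fixed = dict(diameters)
    for pipe in smoothness_violations(fixed, upstream):
        limit = sum(fixed[u] for u in upstream[pipe])
        allowed = [s for s in candidate_sizes if s <= limit]
        if allowed:
            fixed[pipe] = max(allowed)
    return fixed

# toy network: pipe p3 is fed by p1 and p2
diam = {"p1": 150, "p2": 100, "p3": 300}
ups = {"p3": ["p1", "p2"]}
print(smoothness_violations(diam, ups))                  # ['p3']
print(smooth_repair(diam, ups, [100, 150, 200, 250, 300]))
```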

Relevance: 30.00%

Abstract:

In this paper the architecture of an experimental multiparadigmatic programming environment is sketched, showing how its parts combine with application modules in order to integrate program modules written in different programming languages and paradigms. Adaptive automata are special self-modifying formal state machines used as a design and implementation tool in the representation of complex systems. Adaptive automata have been proven to have the same formal power as Turing machines; therefore, at least in theory, arbitrarily complex systems may be modeled with adaptive automata. The present work briefly introduces this formal tool and presents case studies showing how to use it in two very different situations: the first, in the name management module of a multi-paradigmatic and multi-language programming environment, and the second, in an application program implementing an adaptive automaton that accepts a context-sensitive language.
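
As a minimal illustration of the self-modification idea (not the formalism or the case study used in the paper), the sketch below implements an adaptive automaton whose transition table is rewritten by an adaptive action attached to the 'a' self-loop, so that it accepts the context-sensitive language a^n b^n c^n.

```python
def accepts_anbncn(word):
    """Adaptive-automaton sketch for the context-sensitive language a^n b^n c^n.
    The transition table is mutable; the adaptive action attached to the 'a'
    self-loop splices new states into the b/c chains and removes one edge,
    so the machine rewrites itself while consuming the input."""
    delta = {("A", "a"): "A"}            # mutable transition table
    accepting = {"A", "F"}               # "A" accepts the empty string
    env = {"n": 0, "junction": None}     # bookkeeping for the adaptive action

    def grow():                          # executed every time an 'a' is read
        env["n"] += 1
        k, j = env["n"], env["junction"]
        if j is None:                                  # first 'a': create the chains
            delta[("A", "b")] = f"B{k}"
            delta[(f"B{k}", "c")] = "F"
        else:                                          # later 'a': splice at the junction
            tail = delta.pop((j, "c"))                 # remove the old 'c' edge
            delta[(j, "b")] = f"B{k}"
            delta[(f"B{k}", "c")] = f"C{k}"
            delta[(f"C{k}", "c")] = tail
        env["junction"] = f"B{k}"

    actions = {("A", "a"): grow}

    state = "A"
    for sym in word:
        if (state, sym) not in delta:
            return False
        nxt = delta[(state, sym)]
        if (state, sym) in actions:
            actions[(state, sym)]()                    # self-modification step
        state = nxt
    return state in accepting

print([accepts_anbncn(w) for w in ["", "abc", "aabbcc", "aabc", "ab", "abcc"]])
# expected: [True, True, True, False, False, False]
```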

Relevance: 30.00%

Abstract:

Electronic applications are currently developed under the reuse-based paradigm. This design methodology presents several advantages for reducing design complexity, but it brings new challenges for the test of the final circuit. The access to embedded cores, the integration of several test methods, and the optimization of several cost factors are just a few of the problems that need to be tackled during test planning. Within this context, this thesis proposes two test planning approaches that aim at reducing the test costs of a core-based system by means of hardware reuse and integration of the test planning into the design flow. The first approach considers systems whose cores are connected directly or through a functional bus. The test planning method consists of a comprehensive model that includes the definition of a multi-mode access mechanism inside the chip and a search algorithm for the exploration of the design space. The access mechanism model considers the reuse of functional connections as well as partial test buses, core transparency, and other bypass modes. The test schedule is defined in conjunction with the access mechanism so that good trade-offs among the costs of pins, area, and test time can be sought. Furthermore, system power constraints are also considered. This expansion of concerns makes possible an efficient, yet fine-grained, search in the huge design space of a reuse-based environment. Experimental results clearly show the variety of trade-offs that can be explored using the proposed model, and its effectiveness in optimizing the system test plan. Networks-on-chip are likely to become the main communication platform of systems-on-chip. Thus, the second approach presented in this work proposes the reuse of the on-chip network for the test of the cores embedded in the systems that use this communication platform. A power-aware test scheduling algorithm aiming at exploiting the network characteristics to minimize the system test time is presented. The reuse strategy is evaluated considering a number of system configurations, such as different positions of the cores in the network, power consumption constraints and the number of interfaces with the tester. Experimental results show that the parallelization capability of the network can be exploited to reduce the system test time, whereas area and pin overhead are kept to a minimum. In this manuscript, the main problems of testing core-based systems are first identified and the current solutions are discussed. The problems tackled by this thesis are then listed and the test planning approaches are detailed. Both test planning techniques are validated on the recently released ITC’02 SoC Test Benchmarks, and further compared to other test planning methods from the literature. This comparison confirms the efficiency of the proposed methods.
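
The power-constrained scheduling idea can be illustrated with a deliberately simple greedy heuristic (this is not the thesis's access-mechanism or NoC-reuse scheduler): core tests are started in parallel, longest first, as long as the instantaneous power stays within the budget, and the resulting makespan approximates the system test time. Core names, test times and power figures are hypothetical.

```python
import heapq

def greedy_power_aware_schedule(cores, power_budget):
    """cores: list of (name, test_time, test_power). Runs as many core tests
    in parallel as the power budget allows and returns the makespan plus the
    per-core start times."""
    assert all(p <= power_budget for _, _, p in cores), "a core alone exceeds the budget"
    pending = sorted(cores, key=lambda c: c[1], reverse=True)   # longest tests first
    running = []                        # min-heap of (finish_time, power)
    now, used, starts = 0.0, 0.0, {}
    while pending or running:
        still_waiting = []
        for name, duration, power in pending:
            if used + power <= power_budget:            # start this test now
                heapq.heappush(running, (now + duration, power))
                starts[name] = now
                used += power
            else:
                still_waiting.append((name, duration, power))
        pending = still_waiting
        if running:
            now, freed = heapq.heappop(running)          # jump to the next completion
            used -= freed
    return now, starts

# three embedded cores tested through a shared access mechanism
cores = [("cpu", 120.0, 40.0), ("dsp", 80.0, 35.0), ("mem", 60.0, 30.0)]
print(greedy_power_aware_schedule(cores, power_budget=70.0))
```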

Relevance: 30.00%

Abstract:

This paper considers the general problem of Feasible Generalized Least Squares Instrumental Variables (FGLS IV) estimation using optimal instruments. First we summarize the sufficient conditions for the FGLS IV estimator to be asymptotically equivalent to an optimal GLS IV estimator. Then we specialize to stationary dynamic systems with stationary VAR errors, and use the sufficient conditions to derive new moment conditions for these models. These moment conditions produce useful IVs from the lagged endogenous variables, despite the correlation between errors and endogenous variables. This use of the information contained in the lagged endogenous variables expands the class of IV estimators under consideration and thereby potentially improves both asymptotic and small-sample efficiency of the optimal IV estimator in the class. Some Monte Carlo experiments compare the new methods with those of Hatanaka [1976]. For the DGP used in the Monte Carlo experiments, asymptotic efficiency is strictly improved by the new IVs, and experimental small-sample efficiency is improved as well.
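
For reference, a generic textbook form of the estimator under discussion (not necessarily the exact specification used in the paper) is the GLS-weighted IV estimator for y = Xβ + u with instruments Z and error covariance Ω; its feasible counterpart replaces Ω with a consistent estimate, here one built from the stationary VAR error structure:

\[
\hat{\beta}_{\mathrm{GLSIV}} \;=\; \bigl[X'Z\,(Z'\Omega Z)^{-1}Z'X\bigr]^{-1} X'Z\,(Z'\Omega Z)^{-1}Z'y,
\qquad
\hat{\beta}_{\mathrm{FGLSIV}} \;=\; \hat{\beta}_{\mathrm{GLSIV}}\Big|_{\Omega=\hat{\Omega}} .
\]

Whether the feasible estimator is asymptotically equivalent to the infeasible one is exactly the kind of sufficient-condition question the abstract refers to.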

Relevance: 30.00%

Abstract:

The focus of this thesis is the development and modeling of an interface architecture to be employed for interfacing analog signals in mixed-signal SoCs. We claim that the approach presented here achieves a wide frequency range and covers a large range of applications with constant performance, allied to digital configuration compatibility. Our primary assumptions are to use a fixed analog block and to promote application configurability in the digital domain, which leads to a mixed-signal interface. The use of a fixed analog block avoids the performance loss common to configurable analog blocks. The use of configurability in the digital domain makes it possible to use all existing tools for high-level design, simulation and synthesis to implement the target application, with very good performance prediction. The proposed approach utilizes the concept of frequency translation (mixing) of the input signal followed by its conversion to the ΣΔ domain, which makes possible the use of a fairly constant analog block and a uniform treatment of the input signal from DC to high frequencies. The programmability is performed in the ΣΔ digital domain, where performance can be closely matched to the application specification. Theoretical and simulation models of the interface performance are developed for design space exploration and for physical design support. Two prototypes are built and characterized to validate the proposed model and to implement some application examples. The use of this interface as a multi-band parametric ADC and as a two-channel analog multiplier and adder is shown. The multi-channel analog interface architecture is also presented. The characterization measurements support the main advantages of the proposed approach.
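
A minimal behavioural sketch of the signal chain described above (mixer followed by a first-order sigma-delta modulator) is given below; it is an illustration only, with hypothetical tone and sampling frequencies, and it omits the digital channel selection, filtering and decimation that would follow in the ΣΔ domain.

```python
import numpy as np

def mix_then_sigma_delta(x, f_lo, fs):
    """Frequency-translate the band of interest with a local oscillator, then
    encode the result with a first-order sigma-delta modulator (integrator,
    1-bit quantizer, feedback)."""
    n = np.arange(len(x))
    mixed = x * np.cos(2 * np.pi * f_lo * n / fs)     # mixer: f -> f +/- f_lo
    bits = np.empty_like(mixed)
    integrator, feedback = 0.0, 0.0
    for i, v in enumerate(mixed):
        integrator += v - feedback                    # accumulate quantization error
        bits[i] = 1.0 if integrator >= 0.0 else -1.0  # 1-bit quantizer
        feedback = bits[i]                            # 1-bit DAC in the loop
    return bits                                       # downstream processing is digital

# usage: a 200 kHz tone sampled at 10 MHz, translated down by a 190 kHz LO
fs = 10_000_000
t = np.arange(50_000) / fs
tone = 0.4 * np.sin(2 * np.pi * 200_000 * t)
bitstream = mix_then_sigma_delta(tone, f_lo=190_000, fs=fs)
```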

Relevance: 30.00%

Abstract:

The work described in this thesis aims to support the distributed design of integrated systems and considers specifically the need for collaborative interaction among designers. Particular emphasis was given to issues which were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 80's within the electronic design automation community and comprises a layered software environment which aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model to build extensible and reusable object-oriented software subsystems. In this work, we proposed to create an object-oriented framework which includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration. The implemented CAD Framework, named Cave2, followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the use of the object-oriented framework foundations allowed a series of improvements which were not available in previous approaches:
- object-oriented frameworks are extensible by design, so this is also true for the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and that such extensions and adaptations will still inherit the architectural and functional aspects implemented in the object-oriented framework foundation;
- the design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows for different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings;
- the control of the consistency between semantics and visualization (a particularly important issue in a design environment with multiple views of a single design) is also included in the foundations of the object-oriented framework. This mechanism is generic enough to be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics: the view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible; if so, it triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and included a layer between semantics and view to take into account the possibility of multi-view consistency;
- to optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views. The information about each interaction is encapsulated inside an event object, which may be propagated to the design semantics (and thus to other possible views) according to the consistency policy in use. Furthermore, the use of event pools allows for a late synchronization between view and semantics in case of unavailability of a network connection between them;
- the use of proxy objects significantly raised the level of abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. The connection to remote tools and services using a look-up protocol also completely abstracted the network location of such resources, allowing for resource addition and removal at runtime;
- the implemented CAD Framework is completely based on Java technology, so it relies on the Java Virtual Machine as the layer which provides independence between the CAD Framework and the operating system.
All these improvements contributed to a higher level of abstraction in the distribution of design automation resources and also introduced a new paradigm for the remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment. This can increase group awareness and allow a richer transfer of experiences among designers, significantly improving the collaboration potential when compared to previously proposed file-based or record-based approaches. Three different case studies were conducted to validate the proposed approach, each one focusing on a subset of the contributions of this thesis. The first one uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second one extends the foundations of the implemented object-oriented framework to support interface-based design; such extensions (design representation primitives and tool blocks) are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study regards the integration of multimedia metadata into the design data model; this possibility is explored in the context of an online educational and training platform.
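
The inversion of control between view and semantics, and the event pool used for late synchronization, can be pictured with a small sketch (a Python analogue for illustration, not the Java/Cave2 implementation; class and method names are invented):

```python
class SemanticModel:
    """Holds the design semantics; views never mutate the state directly."""
    def __init__(self):
        self.state, self.views = {}, []

    def attach(self, view):
        self.views.append(view)

    def handle(self, event):
        """Inversion of control: the view forwards the event here, and the
        model decides whether the state change is possible."""
        if event.get("value") is not None:            # placeholder validity rule
            self.state[event["key"]] = event["value"]
            for view in self.views:                   # propagate to every attached view
                view.refresh(event)
            return True
        return False


class View:
    """A design view; user input is wrapped in an event and sent upstream."""
    def __init__(self, name, model):
        self.name, self.model, self.pool = name, model, []
        model.attach(self)

    def on_user_input(self, key, value, online=True):
        event = {"source": self.name, "key": key, "value": value}
        if online:
            self.model.handle(event)
        else:
            self.pool.append(event)                   # event pool for late synchronization

    def synchronize(self):
        while self.pool:                              # flush once the connection is back
            self.model.handle(self.pool.pop(0))

    def refresh(self, event):
        print(f"{self.name} redraws {event['key']} = {event['value']}")


model = SemanticModel()
schematic, layout = View("schematic", model), View("layout", model)
schematic.on_user_input("net_width", 4)              # both views are refreshed
layout.on_user_input("cell_height", 12, online=False)
layout.synchronize()                                  # late propagation of pooled events
```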

Relevance: 30.00%

Abstract:

The research topic of this paper is how trade associations perceive lobbying in Brussels and in Brasília. The analysis is centered on business associations located in Brasília and Brussels, as the two core centers of decision-making and as focal points for lobbying practice. There are two underlying reasons for the comparison between Brussels and Brasília. First of all, the European Union and Brazil have maintained diplomatic relations since 1960. Through these relations they have built up close historical, cultural, economic and political ties. Their bilateral political relations culminated in 2007 with the establishment of a Strategic Partnership (EEAS website, n.d.). Over the years, Brazil has become a key interlocutor for the EU and it is the most important market for the EU in Latin America (European Commission, 2007). Given the relations between the EU and Brazil, this research could contribute to reciprocal knowledge about the perception of lobbying in the respective systems and the importance of non-market strategy when conducting business. Second, both the EU and the Brazilian systems have a multi-level governance structure: 28 Member States in the EU and 26 states in Brazil; in both systems there are three main institutions targeted by lobbying practice. The objective is to compare how differences in the institutional environments affect the perception and practice of lobbying, where institutions are defined as "regulative, normative, and cognitive structures and activities that provide stability and meaning to social behavior" (Peng et al., 2009). Brussels, the self-proclaimed "Capital of Europe", is the headquarters of the European Union and has one of the highest concentrations of political power in the world. Four of the seven institutions of the European Union are based in Brussels: the European Parliament, the European Council, the Council and the European Commission (EU website, n.d.). As the power of the EU institutions has grown, Brussels has become a magnet for lobbyists, with the latest estimates ranging between 15,000 and 30,000 professionals representing companies, industry sectors, farmers, civil society groups, unions, etc. (Burson Marsteller, 2013). Brasília is the capital of Brazil and the seat of the government of the Federal District and of the three branches of the Brazilian federal government: legislative, executive and judiciary. The city also hosts 124 foreign embassies. The presence of formal representations of companies and trade associations in Brasília is very limited, but governmental interests remain there and the professionals dealing with government affairs commute there. The European Union has established a Transparency Register covering interactions between the European institutions and citizens' associations, NGOs, businesses, trade and professional organizations, trade unions and think tanks. The register provides citizens with direct, single-point access to information about who is engaged in activities aimed at influencing the EU decision-making process, which interests are being pursued and what level of resources are invested in these activities (Celgene, n.d.); this is important for the quality of democracy and for its capacity to deliver adequate policies. It offers a single code of conduct, binding all organizations and self-employed individuals who accept to "play by the rules" in full respect of ethical principles (EC website, n.d.). A complaints and sanctions mechanism ensures the enforcement of the rules and addresses suspected breaches of the code. In Brazil, there is no specific legislation regulating lobbying. The National Congress is currently discussing dozens of bills that address the regulation of lobbying and the action of interest groups (De Aragão, 2012), but none of them has been enacted so far. This work focuses on class lobbying (Oliveira, 2004), which refers to the activity of federations of national labour or industrial unions, such as the CNI (National Industry Confederation) in Brazil and the European Banking Federation (EBF) in Brussels. Their activity aims to influence the Executive and Legislative branches in order to defend the interests of their affiliates. When representing unions and federations, class entities cover a wide range of different and, more often than not, conflicting interests. That is why they are limited to defending the consensual and majority interests of their affiliates (Oliveira, 2004). The basic assumption of this work is that institutions matter (Peng et al., 2009) and that trade associations and their affiliates have to take into account the institutional and regulatory framework of the places where they do business.

Relevance: 30.00%

Abstract:

Information and communication technologies (ICT) are present in the most diverse areas and everyday activities but, despite the actions of governments and private institutions, the computerization of healthcare is still an open challenge in Brazil. The current situation raises questions about the difficulties associated with the computerization of health practices, as well as about the effects these difficulties have had on Brazilian society. To discuss these issues, this thesis presents four papers on the health informatization process in Brazil. The first paper reviews the literature on ICT in healthcare and, based on two theoretical perspectives (European studies on Health Information Systems (HIS) in developing countries and studies on Health Information and Informatics within the Brazilian Health Reform Movement), formulates an integrated model that combines dimensions of analysis and contextual factors for understanding HIS in Brazil. The second paper presents the theoretical and methodological concepts of Actor-Network Theory (ANT), an approach to the study of controversies associated with scientific discoveries and technological innovations through the networks of actors involved in such actions. This approach has underpinned information systems studies since the 1990s and inspired the analyses of the two empirical papers of this thesis. The last two papers were written from the analysis of the implementation of an HIS in a public hospital in Brazil between 2010 and 2012. For the case analysis, the actors involved in the controversies that emerged during the HIS implementation were followed. The third paper focuses on the activities of the systems analysts and users involved in the HIS implementation. The changes observed during the implementation reveal that the success of the HIS was not achieved through the strict, technical execution of the activities initially planned. On the contrary, success was built collectively, through negotiation among the actors and through interessement devices introduced during the project. The fourth paper, based on the concept of Information Infrastructures, discusses how the CATMAT system was incorporated into E-Hosp. The analysis reveals how the installed base of CATMAT was a relevant condition for its choice during the E-Hosp implementation. In addition, the heterogeneous negotiations and operations that took place during the incorporation of CATMAT into E-Hosp are described. Thus, this thesis argues that implementing an HIS is a collective construction endeavour involving systems analysts, health professionals, politicians and technical artefacts. Furthermore, it shows how HIS inscribe definitions and agreements, influencing the preferences of actors in the health sector.

Relevance: 30.00%

Abstract:

In many creative and technical areas, professionals make use of paper sketches for developing and expressing concepts and models. Paper offers an almost constraint-free environment in which they have as much freedom to express themselves as they need. However, paper does have some disadvantages, such as its size and the inability to manipulate the content (other than removing or scratching it), which can be overcome by creating systems that offer the same freedom people have with paper but none of the disadvantages and limitations. Only in recent years has the technology that allows precisely that become widely available, with the development of touch-sensitive screens that can also interact with a stylus. In this project a prototype was created with the objective of finding a set of the most useful and usable interactions, composed of combinations of multi-touch and pen input. The project selected Computer Aided Software Engineering (CASE) tools as its application domain, because they address a solid and well-defined discipline that still has sufficient room for new developments. This choice resulted from the research conducted to find an application domain, which involved analyzing sketching tools from several possible areas and domains. User studies were conducted using Model Driven Inquiry (MDI) to gain a better understanding of human sketch creation activities and the concepts devised. The prototype was then implemented, making it possible to carry out user evaluations of the interaction concepts created. Results validated most interactions, although only limited testing was possible at the time. Users had more problems using the pen; however, handwriting and ink recognition were very effective, and users quickly learned the manipulations and gestures of the Natural User Interface (NUI).

Relevance: 30.00%

Abstract:

Nowadays, the development of intelligent agents aims to be more refined, using improved architectures and reasoning mechanisms. Revising the beliefs of an agent is also an important subject, due to the consistency that agents should maintain in their knowledge. In this work we propose deliberative and argumentative agents using Lego Mindstorms robots, called Argumentative NXT BDI-like Agents. These agents are built using the notions of the BDI model and are capable of reasoning using the DeLP formalism. They update their knowledge base with their perceptions and revise it when necessary. Two variations are presented: the Single Argumentative NXT BDI-like Agent and the MAS Argumentative NXT BDI-like Agent.
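
A rough sketch of the control loop such an agent runs is shown below (illustrative only; the actual agents query a DeLP reasoner for warranted conclusions, which is reduced here to a toy check, and all names are invented):

```python
class ArgumentativeAgentSketch:
    """Minimal BDI-like cycle: perceive, revise beliefs, deliberate, act."""
    def __init__(self, desires):
        self.beliefs = set()
        self.desires = desires              # e.g. {"reach_target"}
        self.intention = None

    def perceive(self, percepts):
        # drop beliefs contradicted by fresh percepts, then add the percepts
        self.beliefs -= {f"not_{p}" for p in percepts}
        self.beliefs |= set(percepts)

    def deliberate(self):
        # placeholder for argumentation: adopt the first desire that is warranted
        self.intention = next((d for d in self.desires if self.warranted(d)), None)

    def warranted(self, desire):
        # a real agent would ask the DeLP reasoner here
        return "obstacle_ahead" not in self.beliefs

    def act(self):
        return {"reach_target": "drive_forward"}.get(self.intention, "stop")

# one cycle of the loop
agent = ArgumentativeAgentSketch({"reach_target"})
agent.perceive(["obstacle_ahead"])
agent.deliberate()
print(agent.act())                           # -> "stop"
```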