597 results for wireless computing systems


Relevance: 80.00%

Publisher:

Abstract:

This work presents an ontology to describe the semantics of IMML (Interactive Message Modeling Language), an XML-based User Interface Description Language. The ontology describes all IMML elements, including a natural-language description and semantic rules and relationships. It is implemented in OWL-DL, a standard ontology description language recommended by the W3C. Our main goal is to describe the semantics using languages and tools that can be processed by computers. As a consequence, we developed tools for the validation of user interface specifications and for presenting the semantic description in different views.
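As a hedged illustration of such a machine-processable description, the sketch below uses Python's rdflib to declare an OWL class for a hypothetical IMML element, with a natural-language comment and an object property. The namespace and element names are assumptions, not the actual terms of the IMML ontology.

```python
# Illustrative only: the class and property names are invented, not IMML's.
from rdflib import Graph, Literal, Namespace, RDF, RDFS, OWL

IMML = Namespace("http://example.org/imml#")  # assumed namespace
g = Graph()
g.bind("imml", IMML)

# A hypothetical IMML element described as an OWL class with a comment.
g.add((IMML.CommandPanel, RDF.type, OWL.Class))
g.add((IMML.CommandPanel, RDFS.comment,
       Literal("A container grouping the commands of a task (illustrative).")))

# A semantic relationship between elements, modeled as an object property.
g.add((IMML.contains, RDF.type, OWL.ObjectProperty))
g.add((IMML.contains, RDFS.domain, IMML.CommandPanel))

print(g.serialize(format="turtle"))
```

A description in this form can be checked by standard OWL reasoners and rendered in different views, which is the point made in the abstract.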

Relevance: 80.00%

Publisher:

Abstract:

The main goal of this work is to investigate the suitability of applying cluster ensemble techniques (ensembles or committees) to gene expression data. More specifically, we develop experiments with three different cluster ensemble methods that have been widely used in the literature: the co-association matrix, relabeling and voting, and ensembles based on graph partitioning. The inputs for these methods are the partitions generated by three clustering algorithms representing different paradigms: k-means, Expectation-Maximization (EM), and hierarchical clustering with average linkage. These algorithms have been widely applied to gene expression data. In general, the results of our experiments indicate that the cluster ensemble methods perform better than the individual techniques, mainly for the heterogeneous ensembles, that is, ensembles built from base partitions generated with different clustering algorithms.
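A minimal sketch of the co-association strategy described above, assuming scikit-learn and SciPy are available: each base partition votes on whether two samples belong together, and the averaged votes are re-clustered. The function names and the assumption of a known k are our simplifications, not the dissertation's exact setup.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.mixture import GaussianMixture


def base_partitions(X, k):
    """The three base algorithms named in the abstract: k-means, EM, and
    average-linkage hierarchical clustering."""
    yield KMeans(n_clusters=k, n_init=10).fit_predict(X)
    yield GaussianMixture(n_components=k).fit(X).predict(X)
    yield AgglomerativeClustering(n_clusters=k, linkage="average").fit_predict(X)


def coassociation_consensus(X, k):
    n = X.shape[0]
    coassoc = np.zeros((n, n))
    parts = list(base_partitions(X, k))
    for labels in parts:
        # Pairwise indicator: 1 when two samples share a cluster.
        coassoc += (labels[:, None] == labels[None, :]).astype(float)
    coassoc /= len(parts)
    # Treat co-association as similarity, re-cluster its complement.
    dist = 1.0 - coassoc
    np.fill_diagonal(dist, 0.0)
    Z = linkage(squareform(dist, checks=False), method="average")
    return fcluster(Z, t=k, criterion="maxclust")
```

Because the three base partitions come from different algorithms, this is a heterogeneous ensemble in the abstract's sense.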

Relevance: 80.00%

Publisher:

Abstract:

A Software Product Line (SPL) is a software development paradigm whose main focus is to identify the common features and the variability among applications in a specific domain. An SPL is designed to meet all the product requirements of its product family. These requirements, and the SPL itself, may change over time due to several factors, such as the evolution of the product requirements, of the market, of the SPL process, and of the technologies used to develop the products. To handle these changes, the SPL must be modified and must evolve so as not to become obsolete and to adapt itself to the new requirements. Change impact analysis is the activity of understanding and identifying what consequences such changes have on the SPL. Impact analysis on an SPL may be supported by traceability relationships, which identify relationships between the artefacts created during all phases of software development. Despite the existing solutions for traceability-based change impact analysis in software, there is a lack of such solutions for SPLs, since the existing ones do not include estimates specific to SPL artefacts. Thus, this work proposes a change impact analysis process and a tool for assessing change impact through the traceability of artefacts in SPLs. To this end, we specified a change impact analysis process that considers the artefacts produced during SPL development. We also implemented a tool that estimates and identifies the SPL artefacts and products affected by changes in other products, in classes, in features, and between releases of the SPL, as well as artefacts related to changes in core assets and variability. Finally, the results were evaluated through metrics.
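At its core, traceability-based impact analysis is a reachability computation over the traceability links. Below is a minimal sketch under that reading, assuming the links are stored as a directed graph from each artefact to its dependents; the artefact names are illustrative, and the tool's SPL-specific estimates are not modeled.

```python
from collections import deque


def impacted_artefacts(trace, changed):
    """trace: {artefact: set of artefacts that depend on it};
    changed: iterable of changed artefacts.
    Returns the transitive set of impacted artefacts."""
    impacted, queue = set(), deque(changed)
    while queue:
        a = queue.popleft()
        for dep in trace.get(a, ()):
            if dep not in impacted:
                impacted.add(dep)
                queue.append(dep)
    return impacted


# Example: a change to a core-asset class propagates through a feature
# to the derived products (all names hypothetical).
trace = {"ClassA": {"FeaturePayment"},
         "FeaturePayment": {"ProductX", "ProductY"}}
print(impacted_artefacts(trace, ["ClassA"]))
# {'FeaturePayment', 'ProductX', 'ProductY'}
```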

Relevance: 80.00%

Publisher:

Abstract:

Brazil is going through the transition from analog to digital television transmission. Besides providing high-quality audio and video, this new technology also allows applications to run on the television. Equipment called a set-top box is needed to receive the new signal and to create the environment necessary to run applications. At first, the only way to interact with these applications is through the remote control. However, the remote control has serious usability problems when used to interact with some types of applications. This research proposes a software implementation that creates an environment allowing a smartphone to interact with such applications. Besides this implementation, a comparative study is performed between using the remote control and the smartphone to interact with digital television applications, taking into account usability-related parameters. From the analysis of the data collected in the comparative study, it is possible to identify which device provides the more engaging interactive experience for users.

Relevance: 80.00%

Publisher:

Abstract:

Product derivation tools are responsible for automating the development process of software product lines. The configuration knowledge, which is responsible for mapping the problem space to the solution space, plays a fundamental role in product derivation approaches. Each product derivation approach adopts different strategies and techniques to manage the variabilities existing in code assets, and there is a lack of empirical studies analyzing these different approaches. This dissertation aims to compare automatic product derivation approaches systematically through two empirical studies. The studies are analyzed from two perspectives: (i) a qualitative one, which analyzes the characteristics of the approaches using specific criteria; and (ii) a quantitative one, which quantifies specific properties of the product derivation artifacts produced by the different approaches. A set of criteria and metrics is also proposed to support the qualitative and quantitative analyses. Two software product lines, from the web and mobile application domains, are the targets of our study.

Relevance: 80.00%

Publisher:

Abstract:

This work presents the concept, design and implementation of an MP-SoC platform named STORM (MP-SoC DirecTory-Based PlatfORM). Currently the platform is composed of the following modules: a SPARC V8 processor, a GPOP processor, a cache module, a memory module, a directory module and two different models of Network-on-Chip, NoCX4 and Obese Tree. All modules were implemented in SystemC, then simulated and validated, individually and in groups, and their descriptions are presented in detail. To allow the platform to be programmed in C, a SPARC assembler fully compatible with gcc-generated assembly code was implemented. For parallel programming, a mutex management library was implemented with the assembler's support. A total of 10 simulations of increasing complexity are presented to validate the concepts introduced, including real parallel applications such as matrix multiplication, mergesort, KMP, motion estimation and 2D DCT.

Relevance: 80.00%

Publisher:

Abstract:

This dissertation presents a model-driven, integrated approach to the variability management, customization and execution of software processes. Our approach is founded on the principles and techniques of software product lines and model-driven engineering. Model-driven engineering supports the specification of software processes and their transformation into workflow specifications, while software product line techniques allow the automatic variability management of process elements and fragments. Additionally, workflow technologies enable process execution in workflow engines. To evaluate the feasibility of the approach, we implemented it using existing model-driven engineering technologies: software processes are specified using the Eclipse Process Framework (EPF); the automatic variability management of software processes is implemented as an extension of an existing product derivation tool; and the ATL and Acceleo transformation languages are adopted to transform EPF processes into jPDL workflow specifications, enabling the deployment and execution of software processes in the JBoss BPM workflow engine. The approach is evaluated through the modeling and modularization of the project management discipline of the Open Unified Process (OpenUP).
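To picture what the process-to-workflow transformation does, here is a toy sketch, in Python rather than ATL/Acceleo, of the kind of mapping involved: each task of a variability-resolved EPF process becomes a jPDL task node, and task order becomes transitions. The rule and all names are assumptions for illustration, not the dissertation's actual transformation.

```python
def epf_to_jpdl(process_name, tasks):
    """tasks: non-empty ordered list of EPF task names, after variability
    resolution. Emits a jPDL 3-style process definition (illustrative)."""
    nodes = [f'<start-state name="start"><transition to="{tasks[0]}"/></start-state>']
    for cur, nxt in zip(tasks, tasks[1:] + ["end"]):
        nodes.append(f'<task-node name="{cur}"><transition to="{nxt}"/></task-node>')
    nodes.append('<end-state name="end"/>')
    body = "\n  ".join(nodes)
    return f'<process-definition name="{process_name}">\n  {body}\n</process-definition>'


print(epf_to_jpdl("OpenUP-ProjectManagement", ["Plan Project", "Assess Results"]))
```

The real transformations work on Ecore metamodels rather than strings, but the shape of the mapping is the same: process elements to workflow nodes, ordering to transitions.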

Relevance: 80.00%

Publisher:

Abstract:

The adoption of the software product line (SPL) approach brings several benefits compared to conventional development processes based on creating a single software system at a time. Developing an SPL differs from traditional software construction in having two essential phases: domain engineering, when the common and variable elements of the SPL are defined and implemented; and application engineering, when one or more applications (specific products) are derived by reusing the artifacts created in domain engineering. The testing activity is also fundamental, aiming to detect defects in the artifacts produced during SPL development; however, the characteristics of an SPL bring new challenges to this activity that must be considered. Several approaches have recently been proposed for the product line testing process, but they have proved limited, providing only general guidelines. There is also a lack of tools to support the variability management and customization of automated test cases for SPLs. In this context, this dissertation proposes a systematic approach to software product line testing. The approach offers: (i) automated SPL test strategies to be applied in domain and application engineering; (ii) explicit guidelines to support the implementation and reuse of automated test cases at the unit, integration and system levels in domain and application engineering; and (iii) tool support for automating the variability management and customization of test cases. The approach is evaluated through its application to a software product line for web systems. The results show that the proposed approach can help developers deal with the challenges imposed by the characteristics of SPLs during the testing process.
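One way to picture the customization of test cases is sketched below under our own assumptions (the approach's actual mechanism is not reproduced here): each automated test case is tagged with the features it exercises, and product derivation keeps only the tests whose features are selected in the product configuration.

```python
# Hypothetical test suite: each test case maps to the features it needs.
TEST_SUITE = {
    "test_checkout_basic": {"checkout"},
    "test_checkout_discount": {"checkout", "discount"},
    "test_report_pdf": {"reporting", "pdf_export"},
}


def derive_tests(product_features):
    """Keep a test case only if every feature it exercises is selected
    in the product configuration."""
    return [t for t, needed in TEST_SUITE.items()
            if needed <= set(product_features)]


print(derive_tests({"checkout", "discount"}))
# ['test_checkout_basic', 'test_checkout_discount']
```

The same selection can be applied at the unit, integration and system levels, which is where the abstract's guidelines come in.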

Relevance: 80.00%

Publisher:

Abstract:

Vascular segmentation is important in diagnosing vascular diseases such as stroke, and it is hampered by image noise and by very thin vessels that can pass unnoticed. One way to accomplish the segmentation is to extract the centerline of the vessel with height ridges, using intensity as the feature for segmentation. This process can take from seconds to minutes, depending on the technology employed. In order to accelerate the segmentation method proposed by Aylward [Aylward & Bullitt 2002], we adapted it to run in parallel on the CUDA architecture. The performance of the segmentation method running on the GPU is compared both to the same method running on the CPU and to Aylward's original method, also running on the CPU. The improvement of the new method over the original one is twofold: first, the starting point for the segmentation process is not a single point in the blood vessel but a volume, making it easier for the user to segment a region of interest; second, the new method ran 873 times faster on the GPU and 150 times faster on the CPU than Aylward's original CPU implementation.
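The parallelization idea can be sketched as follows: instead of one seed point, every voxel of a seed volume is examined by its own GPU thread. The kernel below, written with numba.cuda, uses a deliberately crude one-dimensional ridge test; it is a hedged sketch of the strategy, not Aylward's actual height-ridge traversal, and the array shapes and threshold are assumptions.

```python
import numpy as np
from numba import cuda


@cuda.jit
def seed_kernel(vol, mask, thr):
    # One thread per voxel of the seed volume.
    x, y, z = cuda.grid(3)
    nx, ny, nz = vol.shape
    if 1 <= x < nx - 1 and 1 <= y < ny - 1 and 1 <= z < nz - 1:
        v = vol[x, y, z]
        # Crude 1D ridge test: brighter than both neighbours along some axis.
        ridge = ((v > vol[x - 1, y, z] and v > vol[x + 1, y, z]) or
                 (v > vol[x, y - 1, z] and v > vol[x, y + 1, z]) or
                 (v > vol[x, y, z - 1] and v > vol[x, y, z + 1]))
        mask[x, y, z] = 1 if (ridge and v > thr) else 0


vol = np.random.rand(64, 64, 64).astype(np.float32)  # stand-in volume
mask = np.zeros_like(vol, dtype=np.uint8)
threads = (8, 8, 8)
blocks = tuple((s + t - 1) // t for s, t in zip(vol.shape, threads))
seed_kernel[blocks, threads](vol, mask, 0.9)
```

Each surviving voxel in `mask` is a candidate seed for the centerline traversal, which is what lets the user hand over a whole region of interest instead of a single point.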

Relevance: 80.00%

Publisher:

Abstract:

In the development of synthetic agents for education, doubt still remains about what behavior can be considered truly plausible for this type of agent, what is effective in the transmission of knowledge by the agent, and what the role of emotions is in this process. This work is investigative in nature, attempting to discover which aspects are important for such consistent behavior, through the practical development of a chatterbot acting as a virtual tutor within the context of learning algorithms. In this study, we explain the basics of agents, intelligent tutoring systems, bots and chatterbots, and how these systems need to convey credibility through their behavior. Models of emotion, personality and humor for computational agents are also covered, as well as previous studies by other researchers in the area. After that, we detail the prototype, the research conducted, a summary of the results achieved, the architectural model of the system, its computational view, and a macro view of the features implemented.

Relevance: 80.00%

Publisher:

Abstract:

The use of increasingly complex software applications demands greater investment in the development of such systems to ensure applications of better quality. Therefore, new techniques are being used in software engineering to make the development process more effective. Among these approaches we highlight formal methods, which use formal languages that are strongly based on mathematics and have well-defined semantics and syntax. One of these languages is Circus, which can be used to model concurrent systems. It was developed from the union of concepts from two other specification languages: Z, which specifies systems with complex data, and CSP, which is normally used to model concurrent systems. Circus has an associated refinement calculus, which can be used to develop software in a precise and stepwise fashion. Each step is justified by the application of a refinement law (possibly with the discharge of proof obligations). Sometimes the same laws can be applied in the same manner in different developments, or even in different parts of a single development. A strategy to optimize this calculus is to formalise these applications as a refinement tactic, which can then be used as a single transformation rule. CRefine was developed to support the Circus refinement calculus; however, before the work presented here, it did not support refinement tactics. The aim of this work is to provide tool support for refinement tactics. For that, we developed a new module in CRefine which automates the process of defining and applying refinement tactics formalised in the tactic language ArcAngelC. Finally, we validate the extension by applying the new module in a case study that used refinement tactics in a refinement strategy for the verification of SPARK Ada implementations of control systems; in this work, we apply our module to the first two phases of this strategy.

Relevance: 80.00%

Publisher:

Abstract:

Multi-agent system designers need to determine the quality of systems in the earliest phases of the development process. The architectures of the agents are part of the design of these systems, and therefore also need to have their quality evaluated. Motivated by the important role that emotions play in our daily lives, researchers on embodied agents have aimed to create agents capable of affective and natural interaction with users that produces a beneficial or desirable result. To this end, several studies proposing agent architectures with emotions have arisen, without being accompanied by appropriate methods for assessing these architectures. The objective of this study is to propose a methodology for evaluating architectures of emotional agents, one that assesses the quality attributes of the architectural design and, through human-computer interaction evaluation, the effects on the subjective experience of users of applications that implement them. The methodology is based on a model of well-defined metrics. In assessing the quality of the architectural design, the attributes assessed are extensibility, modularity and complexity. In assessing the effects on the users' subjective experience, which involves implementing the architecture in an application (we suggest the domain of computer games), the metrics are enjoyment, felt support, warmth, caring, trust, cooperation, intelligence, interestingness, naturalness of emotional reactions, believability, reduction of frustration and likeability, as well as average time and average attempts. We experimented with this approach and evaluated five emotional agent architectures: BDIE, DETT, Camurra-Coglio, EBDI and Emotional-BDI. Two of the architectures, BDIE and EBDI, were implemented in a version of the game Minesweeper and evaluated with respect to human-computer interaction. In the results, DETT stood out with the best architectural design. Users who played the version of the game with emotional agents performed better than those who played without agents. In assessing the users' subjective experience, the differences between the architectures were insignificant.

Relevance: 80.00%

Publisher:

Abstract:

Traceability between the models produced by the requirements and architecture activities is a strategy that aims to prevent loss of information, reducing the gap between these two initial activities of the software life cycle. In the context of Software Product Lines (SPL), it is important to have this support, which allows the correspondence between these two activities together with variability management. To address this issue, this work presents a bidirectional mapping process, defining transformation rules between the elements of a goal-oriented requirements model (described in PL-AOVgraph) and the elements of an architectural description (defined in PL-AspectualACME). These mapping rules are evaluated through a case study: the GingaForAll SPL. To automate the transformation, we developed the MaRiPLA tool (Mapping Requirements to Product Line Architecture) using Model-Driven Development (MDD) techniques, including the Atlas Transformation Language (ATL) with Ecore metamodel specifications, Xtext, a DSL definition framework, and Acceleo, a code generation tool, in the Eclipse environment. Finally, the generated models are evaluated with respect to quality attributes such as variability, derivability, reusability, correctness, traceability, completeness, evolvability and maintainability, extracted from the CAFÉ Quality Model.

Relevance: 80.00%

Publisher:

Abstract:

This work approaches the Scheduling Workover Rigs Problem (SWRP) of maintaining the wells of an oil field, a problem that, although difficult to solve, is extremely important from the economical, technical and environmental standpoints. A mathematical formulation of the problem is presented and an algorithmic approach is developed. The problem consists of finding the best schedule of well servicing by the workover rigs, minimizing a composite objective that combines the costs of the workover rigs and the total oil loss suffered by the wells. This problem is similar to the Vehicle Routing Problem (VRP), which belongs to the NP-hard class. The goal of this research is to develop an algorithmic approach to solve the SWRP using the fundamentals of metaheuristics such as Memetic Algorithms and GRASP. Instances close to real data are generated to analyze the computational performance of the approaches mentioned above. Thereafter, the performance and the quality of the results obtained by each technique are compared.
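As a hedged illustration of how GRASP applies to this problem, the sketch below shows a constructive phase that assigns wells to rigs from a restricted candidate list (RCL) ranked by an oil-loss estimate. The data layout (`wells`, `travel`, loss rates) and the cost model are our assumptions for illustration, not the dissertation's actual formulation.

```python
import random


def grasp_construct(wells, rigs, travel, alpha=0.3):
    """wells: {well_id: (service_time, loss_rate)}; rigs: list of rig ids;
    travel[a][b]: travel time between locations a and b (a rig id doubles
    as its depot location). alpha in [0, 1] controls RCL greediness."""
    schedule = {r: [] for r in rigs}
    clock = {r: 0.0 for r in rigs}   # time at which each rig becomes free
    pos = {r: r for r in rigs}       # each rig starts at its own depot
    unserved = set(wells)
    total_loss = 0.0
    while unserved:
        rig = min(rigs, key=clock.get)  # earliest-available rig chooses next
        # Greedy measure: oil lost by each candidate until its service ends.
        costs = {w: (clock[rig] + travel[pos[rig]][w] + wells[w][0]) * wells[w][1]
                 for w in unserved}
        lo, hi = min(costs.values()), max(costs.values())
        rcl = [w for w, c in costs.items() if c <= lo + alpha * (hi - lo)]
        w = random.choice(rcl)  # randomized greedy choice, GRASP-style
        clock[rig] += travel[pos[rig]][w] + wells[w][0]
        pos[rig] = w
        schedule[rig].append(w)
        total_loss += clock[rig] * wells[w][1]
        unserved.remove(w)
    return schedule, total_loss
```

In a full GRASP, each constructed schedule would then be improved by local search, and in a memetic algorithm such local search is embedded in the evolutionary loop.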

Relevance: 80.00%

Publisher:

Abstract:

Formal methods and software testing are tools to obtain and control software quality. When used together, they provide mechanisms for software specification, verification and error detection. Even though formal methods allow software to be mathematically verified, they are not enough to assure that a system is free of faults; thus, software testing techniques are necessary to complement the verification and validation process. Model-based testing techniques allow tests to be generated from other software artifacts, such as specifications and abstract models. Using formal specifications as the basis for test creation, we can generate better quality tests, because these specifications are usually precise and free of ambiguity. Fernanda Souza (2009) proposed a method to define test cases from B Method specifications. This method used information from the machine's invariant and the operation's precondition to define positive and negative test cases for an operation, using techniques based on equivalence class partitioning and boundary value analysis. However, the method was not automated and had conceptual deficiencies; for instance, it did not fit into a well-defined coverage criteria classification. We started our work with a case study applying the method to an example of a B specification from industry, and from this case study we obtained insight into how to improve it. In our work we evolved the proposed method, rewriting it and adding characteristics to make it compatible with a test classification used by the community. We also improved the method to support specifications structured in different components, to use information from the operation's behavior in the test case generation process, and to use new coverage criteria. Besides, we implemented a tool to automate the method and applied it to more complex case studies.
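To make the test derivation idea concrete, here is a minimal sketch (our illustration, not the dissertation's tool) of producing positive and negative test values from an interval precondition such as `lo <= x <= hi`, combining equivalence class partitioning with boundary value analysis.

```python
def boundary_tests(lo, hi):
    """Return (positive, negative) integer test values for the single
    equivalence class lo <= x <= hi (illustrative assumption)."""
    positive = [lo, lo + 1, (lo + hi) // 2, hi - 1, hi]  # on and near bounds
    negative = [lo - 1, hi + 1]                          # just outside
    return positive, negative


# E.g., for an operation whose precondition requires 0 <= x <= 100:
pos, neg = boundary_tests(0, 100)
# pos -> [0, 1, 50, 99, 100]; neg -> [-1, 101]
```

The positive values exercise the operation inside its precondition while the negative ones probe behavior just outside it, which is the positive/negative split the method draws from the machine's invariant and the operation's precondition.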