918 results for Software testing. Problem-oriented programming. Teaching methodology


Relevance: 40.00%

Publisher:

Abstract:

In the context of Software Engineering, web accessibility is gaining ground, establishing itself as an important quality attribute. This is due to initiatives of institutions such as the W3C (World Wide Web Consortium) and the introduction of norms and laws such as Section 508, which underline the importance of developing accessible Web sites and applications. Despite these advances, the lack of web accessibility is still a persistent problem, and it may be related to the moment or phase in which this requirement is addressed within the development process, since web accessibility is generally regarded as a programming problem or is handled only after the application has been fully developed. Thus, considering accessibility as early as the analysis and requirements specification activities proves to be a strategy that facilitates project progress, avoiding rework in advanced phases of software development caused by errors or omissions in the elicitation. The objective of this research is to develop a method and a tool to support the elicitation of web accessibility requirements. The elicitation strategy of this method is grounded in the Goal-Oriented NFR Framework and in the use of NFR catalogs created from the guidelines of WCAG 2.0 (Web Content Accessibility Guidelines) proposed by the W3C.
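
No implementation accompanies the abstract, but a minimal sketch may help picture what an NFR catalog derived from WCAG 2.0 could look like during elicitation. The Softgoal structure and the sample guideline entries below are illustrative assumptions, not the authors' actual catalog format:

```python
from dataclasses import dataclass, field

@dataclass
class Softgoal:
    """A non-functional requirement (softgoal) in the NFR Framework sense."""
    name: str
    wcag_guideline: str                 # e.g. "1.1.1 Non-text Content"
    decompositions: list = field(default_factory=list)  # child softgoals

# A tiny excerpt of what a WCAG 2.0-based accessibility catalog might contain.
catalog = [
    Softgoal(
        name="Perceivable content",
        wcag_guideline="WCAG 2.0 Principle 1",
        decompositions=[
            Softgoal("Text alternatives for images", "1.1.1 Non-text Content"),
            Softgoal("Captions for audio/video", "1.2.2 Captions (Prerecorded)"),
        ],
    ),
]

# During elicitation, an analyst would walk the catalog and mark which
# softgoals apply to the system under specification.
for goal in catalog:
    print(goal.name, "->", [d.name for d in goal.decompositions])
```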

Relevance: 40.00%

Publisher:

Abstract:

The transmission network planning problem is a non-linear mixed-integer programming problem (NLIMP). Most of the algorithms used to solve it rely on a linear programming (LP) subroutine to solve the LPs produced by the planning algorithm, and the resolution of these LPs sometimes represents a major computational effort. A particularity of these LPs is that, at the optimal solution, only a few inequality constraints are binding. This work transforms the LP into an equivalent problem with only one equality constraint (the power flow equation) and many inequality constraints, and uses a dual simplex algorithm and a relaxation strategy to solve the LPs. The optimisation process starts with only the equality constraint and, at each step, the most violated constraint is added. The logic used is similar to a proposal for electric systems operation planning. The results show a higher performance of the algorithm when compared to primal simplex methods.
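
The relaxation strategy described above is easy to sketch. In the toy loop below, scipy's linprog (with random stand-in data) plays the role of the LP engine in place of the paper's dual simplex implementation: solve with the equality constraint only, then repeatedly add the most violated inequality until none remains:

```python
import numpy as np
from scipy.optimize import linprog

# Invented problem data, sized so that x = 0.5 * ones is always feasible.
rng = np.random.default_rng(0)
n = 10
c = rng.random(n)
A_eq = np.ones((1, n)); b_eq = np.array([5.0])   # stand-in "power flow" equality
A_ub = rng.standard_normal((40, n))
b_ub = 0.5 * np.abs(A_ub).sum(axis=1) + 1.0      # guarantees feasibility

active = []                       # indices of inequality rows added so far
while True:
    res = linprog(
        c,
        A_ub=A_ub[active] if active else None,
        b_ub=b_ub[active] if active else None,
        A_eq=A_eq, b_eq=b_eq,
        bounds=[(-5.0, 5.0)] * n,
    )
    violation = A_ub @ res.x - b_ub
    worst = int(np.argmax(violation))
    if violation[worst] <= 1e-9:  # every inequality already satisfied
        break
    active.append(worst)          # add only the most violated constraint

print(f"optimum {res.fun:.3f} found using {len(active)} of {len(b_ub)} inequalities")
```

The point of the strategy is visible in the final print: only a small subset of the inequalities ever enters the LP, which is exactly why it pays off when few constraints are binding at the optimum.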

Relevance: 40.00%

Publisher:

Abstract:

This paper presents an embedded network node, based on the IEEE 1451 standard, developed using structured programming to access the transducers in the WTIM. The NCAP was developed using a Nios II processor and uClinux, an embedded operating system aimed at hardware with restricted features. Both hardware and software have dynamic features and can be configured according to the characteristics of the application. On this basis, the NCAP was built with the minimum hardware and software components so that it can be deployed in a remote environment as a central point for data requests. Many NCAP implementations follow an object-oriented structure; differently from those, in this project the NCAP was developed using structured programming. The NCAP was tested over a ZigBee interface between the NCAP and the WTIM, and the system was shown to suit areas of difficult access over long periods of time due to the need for low power consumption. © 2012 IEEE.
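
To make the structured-programming choice concrete, here is a purely illustrative procedural poll loop, written in Python with pyserial standing in for the ZigBee link; the port name and the one-byte command frame are invented, and the actual NCAP runs on Nios II/uClinux with a protocol not reproduced here:

```python
import serial  # pyserial; the ZigBee radio is assumed to appear as a serial port

PORT = "/dev/ttyS0"              # hypothetical device node
READ_CMD = bytes([0x01])         # hypothetical "read transducer" opcode

def read_transducer(link, channel):
    """Request one sample from a WTIM transducer channel (invented framing)."""
    link.write(READ_CMD + bytes([channel]))
    return link.read(2)          # hypothetical 2-byte sample

def main():
    # Plain functions and a linear control flow: the structured (non-OOP)
    # style the paper adopts, as opposed to an object-per-transducer design.
    link = serial.Serial(PORT, 9600, timeout=1.0)
    try:
        for channel in range(4):
            sample = read_transducer(link, channel)
            print("channel", channel, "->", sample.hex())
    finally:
        link.close()

if __name__ == "__main__":
    main()
```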

Relevance: 40.00%

Publisher:

Abstract:

This paper presents a mixed-integer linear programming approach to the problem of the optimal type, size and allocation of distributed generators (DGs) in radial distribution systems. In the proposed formulation, (a) the steady-state operation of the radial distribution system, considering different load levels, is modeled through linear expressions; (b) different types of DGs are represented by their capability curves; (c) the short-circuit current capacity of the circuits is modeled through linear expressions; and (d) different topologies of the radial distribution system are considered. The objective function minimizes the annualized investment and operation costs. The use of a mixed-integer linear formulation guarantees convergence to optimality using existing optimization software. Results for one test system are presented in order to show the accuracy as well as the efficiency of the proposed solution technique. © 2012 Elsevier B.V. All rights reserved.
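
A toy model written with the pulp modeling library shows the flavor of such a formulation: binary variables decide the type and placement of DG units, and the objective mixes annualized investment and operation costs. All data, and the drastic simplification of the network constraints, are invented for illustration and are not the paper's model:

```python
import pulp

nodes = ["n1", "n2", "n3"]
dg_types = ["gas", "wind"]
invest_cost = {"gas": 100.0, "wind": 120.0}   # annualized cost per unit (assumed)
oper_cost = {"gas": 8.0, "wind": 1.0}         # cost per MW generated (assumed)
capacity = {"gas": 2.0, "wind": 1.5}          # MW per unit (assumed)
demand = 3.0                                  # MW, a single load level here

prob = pulp.LpProblem("dg_allocation", pulp.LpMinimize)
build = pulp.LpVariable.dicts("build", (nodes, dg_types), cat=pulp.LpBinary)
gen = pulp.LpVariable.dicts("gen", (nodes, dg_types), lowBound=0)

# Objective: annualized investment plus operation cost.
prob += pulp.lpSum(invest_cost[t] * build[n][t] + oper_cost[t] * gen[n][t]
                   for n in nodes for t in dg_types)
# Meet demand (the linearized power-flow constraints are omitted here).
prob += pulp.lpSum(gen[n][t] for n in nodes for t in dg_types) == demand
# A DG can only generate if built, up to its capacity (capability stand-in).
for n in nodes:
    for t in dg_types:
        prob += gen[n][t] <= capacity[t] * build[n][t]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for n in nodes:
    for t in dg_types:
        if build[n][t].value() == 1:
            print("install", t, "at", n, "generating", gen[n][t].value(), "MW")
```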

Relevance: 40.00%

Publisher:

Abstract:

In this study, a novel approach to the optimal location and contract pricing of distributed generation (DG) is presented. The approach is designed for a market environment in which the distribution company (DisCo) can buy energy either from the wholesale energy market or from the DG units within its network. The location and contract pricing of DG are determined by the interaction between the DisCo and the owner of the distributed generators: the DisCo intends to minimise the payments incurred in meeting the expected demand, whereas the owner of the DG intends to maximise the profit obtained from the energy sold to the DisCo. This two-agent relationship is modelled as a bilevel scheme, in which the upper-level optimisation determines the allocation and contract prices of the DG units, and the lower-level optimisation models the reaction of the DisCo. The bilevel programming problem is turned into an equivalent single-level mixed-integer linear optimisation problem using duality properties, which is then solved using commercially available software. Results show the robustness and efficiency of the proposed model compared with other existing models. As regards contract pricing, the proposed approach found better solutions than those reported in previous works. © The Institution of Engineering and Technology 2013.
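
The single-level reformulation the abstract mentions rests on LP duality. As a generic illustration (not the paper's exact model), a bilevel problem whose lower level is a linear program in the DisCo's decisions y, parameterized by the upper-level decisions x, can be collapsed by replacing the lower level with its primal feasibility, dual feasibility and strong-duality conditions:

```latex
% Bilevel problem:  min_x  c^T x + d^T y*   s.t.  x \in X,
% where  y* \in \arg\min_y \{ f^T y : A y \ge b - B x,\; y \ge 0 \}.
% Single-level equivalent:
\begin{align*}
\min_{x,\,y,\,\lambda}\quad & c^{\top}x + d^{\top}y \\
\text{s.t.}\quad & A y \ge b - Bx, \quad y \ge 0 && \text{(primal feasibility)}\\
& A^{\top}\lambda \le f, \quad \lambda \ge 0 && \text{(dual feasibility)}\\
& f^{\top}y = (b - Bx)^{\top}\lambda && \text{(strong duality)}\\
& x \in X.
\end{align*}
```

The strong-duality row couples x and the dual variables λ bilinearly; when the upper-level variables are binary, such products are typically linearized with big-M constraints, which is what makes the final model a mixed-integer linear program.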

Relevance: 40.00%

Publisher:

Abstract:

This work describes the implementation of speech recognition software for Brazilian Portuguese. Among its objectives is the construction of a large-vocabulary continuous speech recognition system suitable for real-time applications. The main concepts and characteristics of such systems are presented, along with all the steps required to build one. As part of this work, several resources were produced and made available: acoustic and language models and new speech and text corpora. The text corpus has been built through the automatic extraction and formatting of newspaper texts from the Internet. In addition, two speech corpora were produced, one based on audiobooks and another recorded specifically to simulate real-time tests. The work also proposes the use of speaker adaptation techniques to address acoustic mismatch between speech corpora. Finally, an application programming interface is presented that aims to simplify the use of the Julius decoder. Performance tests are presented, comparing the developed systems with a commercial software package.
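
The thesis's own API is not described in the abstract, but for context, the Julius decoder offers a "module mode" (started with its -module flag) in which it streams recognition events to a TCP client, commonly on port 10500. The client below is a naive sketch of that mechanism, with deliberately simplistic parsing; it is not the thesis's API:

```python
import socket

HOST, PORT = "localhost", 10500   # Julius module mode's customary port

with socket.create_connection((HOST, PORT)) as sock:
    buffer = ""
    while True:
        data = sock.recv(4096)
        if not data:
            break
        buffer += data.decode("utf-8", errors="replace")
        # Julius terminates each message with a line containing a single dot.
        while "\n.\n" in buffer:
            message, buffer = buffer.split("\n.\n", 1)
            # Crudely pull WORD="..." attributes out of hypothesis lines.
            words = [part.split('"')[1]
                     for part in message.split()
                     if part.startswith('WORD="')]
            if words:
                print("recognized:", " ".join(words))
```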

Relevance: 40.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 40.00%

Publisher:

Abstract:

This work aims to give greater visibility to the issue of software security, an issue frequently raised at security conferences, where it is noted that much of the IT (Information Technology) staff and, more specifically, the IS (Information Security) staff is unfamiliar with it; with the spread of mobile computing and cloud computing, this lack of deeper knowledge of the subject is increasingly worrisome. It also aims at having applications developed in a secure manner, prioritizing the security of the information they process. It attempts to demonstrate secure coding techniques, the principles of software security, the means to identify software vulnerabilities, cutting-edge software exploitation techniques and mitigation mechanisms. Nowadays, security professionals are responsible for most of the security tests in applications, audits and pentests, and it is undeniable that the so-called security experts most often come from the computer networking field, having little experience in software development and programming. Therefore, the development process does not consider the security issue, owing to the developers' lack of knowledge of the subject, and the security tests could be improved if security experts had greater know-how in application development. Given this problem, the goal here is to integrate information security with software development, disseminating the process of secure software development. To achieve this, a Linux distribution with proof-of-concept applicati...
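
As a small concrete example of the secure coding techniques such work covers, the sketch below contrasts a SQL-injection-prone query built by string concatenation with its parameterized mitigation; the schema and data are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"   # a typical injection attempt

# VULNERABLE: string concatenation lets the input rewrite the query,
# so the WHERE clause becomes a tautology and every row leaks.
query = "SELECT role FROM users WHERE name = '" + user_input + "'"
print("concatenated:", conn.execute(query).fetchall())

# SAFE: the placeholder keeps the input as data, never as SQL.
rows = conn.execute("SELECT role FROM users WHERE name = ?",
                    (user_input,)).fetchall()
print("parameterized:", rows)     # no match, as intended
conn.close()
```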

Relevance: 40.00%

Publisher:

Abstract:

Background: The evaluation of associations between genotypes and diseases in a case-control framework plays an important role in genetic epidemiology. This paper focuses on the evaluation of the homogeneity of both genotypic and allelic frequencies. The traditional test used to check allelic homogeneity is known to be valid only under Hardy-Weinberg equilibrium, a property that may not hold in practice. Results: We first describe the flaws of the traditional (chi-squared) tests for both allelic and genotypic homogeneity. Besides the known problem with the allelic procedure, we show that whenever these tests are used an incoherence may arise: sometimes the genotypic homogeneity hypothesis is not rejected while the allelic hypothesis is. As we argue, this is logically impossible, yet some recently proposed methods implicitly rely on the idea that it does not happen. In an attempt to correct this incoherence, we describe an alternative frequentist approach that is appropriate even when Hardy-Weinberg equilibrium does not hold. It is then shown that the problem remains and is intrinsic to frequentist procedures. Finally, we introduce the Full Bayesian Significance Test for both hypotheses and prove that the incoherence cannot happen with these new tests. To illustrate this, all five tests are applied to real and simulated datasets. Using power analysis, we show that the Bayesian method is comparable to the frequentist one and has the advantage of being coherent. Conclusions: Contrary to more traditional approaches, the Full Bayesian Significance Test for association studies provides a simple, coherent and powerful tool for detecting associations.
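
For reference, the traditional chi-squared tests criticized above can be reproduced in a few lines with scipy; the case-control counts here are invented, not the paper's data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: cases / controls; columns: genotype counts (AA, Aa, aa).
genotypes = np.array([[30, 50, 20],    # cases
                      [40, 45, 15]])   # controls

chi2, p_geno, _, _ = chi2_contingency(genotypes)
print(f"genotypic homogeneity: p = {p_geno:.3f}")

# Allelic counts: each AA carries two A alleles, each Aa carries one, etc.
alleles = np.column_stack([2 * genotypes[:, 0] + genotypes[:, 1],
                           2 * genotypes[:, 2] + genotypes[:, 1]])
chi2, p_alle, _, _ = chi2_contingency(alleles)
print(f"allelic homogeneity:   p = {p_alle:.3f}")
# The paper's point: the allelic test is only valid under Hardy-Weinberg
# equilibrium, and this pair of tests can return logically incoherent answers.
```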

Relevance: 40.00%

Publisher:

Abstract:

In the collective imagination a robot is a human-like machine, like the androids of science fiction. However, the robots you will encounter most frequently are machines that do work that is too dangerous, boring or onerous. Most of the robots in the world are of this type; they can be found in the automotive, medical, manufacturing and space industries. A robot, therefore, is a system that contains sensors, control systems, manipulators, power supplies and software, all working together to perform a task. The development and use of such systems is an active area of research, and one of the main problems is the development of interaction skills with the surrounding environment, which include the ability to grasp objects. To perform this task the robot needs to sense the environment and acquire information about the object, i.e. the physical attributes that may influence a grasp. Humans solve this grasping problem easily thanks to their past experience, which is why many researchers approach it from a machine learning perspective, deriving the grasp of an object from information about already known objects. But humans can select the best grasp from a vast repertoire, considering not only the physical attributes of the object but also the effect they want to obtain. This is why, in our case, the study of robot manipulation focuses on grasping and on integrating symbolic tasks with data gained through sensors. The learning model is based on a Bayesian network that encodes the statistical dependencies between the data collected by the sensors and the symbolic task. This data representation has several advantages: it takes the uncertainty of the real world into account, allowing sensor noise to be dealt with; it encodes a notion of causality; and it provides a unified network for learning. Since the network currently implemented is based on human expert knowledge, it is very interesting to implement an automated method to learn its structure, as more tasks and object features may be introduced in the future, and a complex network designed only from human expert knowledge can become unreliable. Since structure learning algorithms present some weaknesses, the goal of this thesis is to analyze the real data used in the network modeled by the human expert, implement a feasible structure learning approach, and compare the results with the network designed by the expert in order to possibly enhance it.
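
As an illustration of score-based structure learning of the kind the thesis evaluates, the sketch below uses the pgmpy library's hill-climbing search with a BIC score on an invented toy dataset; the actual sensor data, features and tooling of the thesis are not reproduced here:

```python
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore

# Toy discretized observations: object attributes, the symbolic task, and
# the grasp chosen. Values and column names are stand-ins for illustration.
data = pd.DataFrame({
    "object_size":  [0, 1, 1, 0, 2, 2, 1, 0],
    "object_shape": [0, 0, 1, 1, 2, 1, 0, 2],
    "task":         [0, 1, 1, 0, 1, 0, 1, 0],
    "grasp_type":   [0, 1, 1, 0, 2, 1, 1, 0],
})

# Hill climbing over candidate DAGs, scored by BIC; the learned edges can
# then be compared against the expert-designed network.
search = HillClimbSearch(data)
model = search.estimate(scoring_method=BicScore(data))
print("learned edges:", sorted(model.edges()))
```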