947 results for Factory of software


Relevance:

100.00%

Publisher:

Abstract:

A methodology for formally modeling and analyzing the software architecture of mobile agent systems provides a solid basis for developing high-quality mobile agent systems, and it is also helpful for studying other distributed and concurrent systems. However, providing such a methodology is challenging because of agent mobility in mobile agent systems. The methodology was defined from two essential parts of software architecture: a formalism to define the architectural models and an analysis method to formally verify system properties. The formalism is two-layer Predicate/Transition (PrT) nets extended with dynamic channels, and the analysis method is a hierarchical approach that verifies models on different levels. The two-layer modeling formalism smoothly transforms physical models of mobile agent systems into their architectural models. Dynamic channels facilitate synchronous communication between nets, and they naturally capture the dynamic architecture configuration and agent mobility of mobile agent systems. Component properties are verified on transformed individual components, system properties are checked in a simplified system model, and interaction properties are analyzed on models composed from the nets involved. Based on the formalism and the analysis method, the author formally modeled and analyzed a software architecture for mobile agent systems, and designed an architectural model of a medical information processing system based on mobile agents. The model checking tool SPIN was used to verify system properties such as reachability, concurrency and safety of the medical information processing system. From the successful modeling and analysis of the software architecture of mobile agent systems, the conclusion is that PrT nets extended with channels are a powerful tool for modeling mobile agent systems, and the hierarchical analysis method provides a rigorous foundation for the modeling tool. The hierarchical analysis method not only reduces the complexity of the analysis but also expands the application scope of model checking techniques. The results of formally modeling and analyzing the software architecture of the medical information processing system show that model checking is an effective and efficient way to verify software architecture. Moreover, the system demonstrates the high flexibility, efficiency and low cost of mobile agent technologies.
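
As a rough illustration of the reachability checking mentioned above, the sketch below performs an explicit-state search over a toy state graph in Python. This is not the thesis's PrT-net formalism or SPIN itself; the host names and channel topology are invented for the example.

```python
# Explicit-state reachability check: the kind of property the abstract says
# was verified with SPIN, here reduced to a breadth-first search over a
# hand-written state graph (illustrative only).
from collections import deque

def reachable(initial, successors, target):
    """Return True if `target` is reachable from `initial` in the state graph."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if state == target:
            return True
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# Toy mobile-agent scenario: an agent migrates between hosts over assumed channels;
# we ask whether it can eventually reach host "C".
topology = {"A": ["B"], "B": ["A", "C"], "C": []}
print(reachable("A", lambda host: topology[host], "C"))  # True
```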

Relevance:

100.00%

Publisher:

Abstract:

Maintaining and evolving software systems has become a highly critical task over recent years due to the diversity and high demand of features, devices and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite to avoid quality deterioration during their evolution. This thesis proposes an automated approach for analyzing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources – commits and issues – of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation – choosing the scenarios and preparing the target releases; (ii) dynamic analysis – determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis – processing and comparing the dynamic analysis results across different releases; and (iv) repository mining – identifying issues and commits associated with the detected performance variation. Empirical studies were carried out to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA – a web system for academic management; (ii) ArgoUML – a UML modeling tool; and (iii) Netty – a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In that study, 21 releases (seven from each system) were analyzed, totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a regression model for performance was developed to indicate which commit properties are most likely to cause performance degradation. Overall, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release was made available and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the Receiver Operating Characteristic (ROC) curve of the regression model is 60%, which means that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
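
As a minimal sketch of the variation-analysis phase described above, the snippet below compares a scenario's execution times measured on two releases and flags degradation or optimization. The use of a Mann-Whitney test, the significance threshold and the timing values are assumptions for illustration, not the thesis's exact criteria.

```python
# Compare one scenario's execution times (ms) across two releases and classify
# the variation. Statistical test and data are illustrative assumptions.
from statistics import median
from scipy.stats import mannwhitneyu

def performance_variation(times_old, times_new, alpha=0.05):
    """Return ('degraded' | 'optimized' | 'unchanged', relative change in median time)."""
    _, p_value = mannwhitneyu(times_old, times_new)
    change = (median(times_new) - median(times_old)) / median(times_old)
    if p_value >= alpha:
        return "unchanged", change
    return ("degraded" if change > 0 else "optimized"), change

# Hypothetical timings for the same scenario in two consecutive releases.
print(performance_variation([102, 99, 101, 100], [131, 128, 134, 130]))
```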

Relevance:

100.00%

Publisher:

Abstract:

Automated acceptance testing is higher-level testing of software that checks whether the system meets the requirements desired by the business clients, using scripts separate from the software itself. This project is a study of the feasibility of acceptance tests written according to Behavior Driven Development principles. The project includes an implementation part in which automated acceptance testing was written for the Touch-point web application developed by Dewire (a software consultancy) for Telia (a telecom company), from the requirements received from the customer (Telia). The automated acceptance testing uses the Cucumber-Selenium framework, which enforces Behavior Driven Development principles. The purpose of the implementation is to verify the practicability of this style of acceptance testing. From the completion of the implementation, it was concluded that all real-world requirements from the customer can be converted into executable specifications, and the process was not at all time-consuming or difficult for a low-experienced programmer such as the author. The project also includes a survey to measure the learnability and understandability of Gherkin, the language that Cucumber understands. The survey consists of some Gherkin examples followed by questions that involve making changes to those examples. The survey had three parts: the first easy, the second medium and the third most difficult. It also used a linear scale from 1 to 5 to rate the difficulty level of each part, where 1 stood for very easy and 5 for very difficult. The time when participants began the survey was also recorded in order to calculate the total time taken to learn and answer the questions. The survey was taken by 18 Dewire employees whose primary working role was programmer, tester or project manager. In the results, testers and project managers were grouped as non-programmers. The survey concluded that Gherkin is very easy and quick to learn, and the participants rated it as very easy.
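
The sketch below gives a flavor of the Gherkin/BDD style discussed above, using Python's behave library for the step definitions rather than the Java Cucumber-Selenium stack used in the project. The feature wording, scenario and step logic are invented examples; in a real project the feature text lives in a separate .feature file and the steps would drive the application through Selenium.

```python
# Gherkin feature (normally stored in a .feature file) and behave-style step
# definitions. Scenario and logic are illustrative only.
FEATURE = """
Feature: Touch-point login
  Scenario: Registered user signs in
    Given a registered user "alice"
    When she signs in with a valid password
    Then the dashboard page is shown
"""

from behave import given, when, then

@given('a registered user "{name}"')
def step_registered_user(context, name):
    context.user = name

@when("she signs in with a valid password")
def step_sign_in(context):
    # Placeholder for the real Selenium-driven interaction with the web app.
    context.page = "dashboard" if context.user else "login"

@then("the dashboard page is shown")
def step_dashboard(context):
    assert context.page == "dashboard"
```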

Relevance:

100.00%

Publisher:

Abstract:

Some authors have shown the need to understand the technological structuring process in contemporary firms. From this perspective, the software industry is a very important element because it provides products and services directly to organizations in many fields. The Brazilian software industry has some peculiarities that distinguish it from industries located in developed countries, which makes understanding it even more relevant. There is evidence that local firms adopt different strategies and structural configurations to enter a market naturally dominated by large multinational firms. Therefore, this study aims to understand not only the structural configurations assumed by domestic firms but also the dynamics and processes that lead to these different configurations. To do so, this PhD dissertation investigates the institutional environment, its entities and the isomorphic movements, employing an exploratory, descriptive and explanatory multiple case study. Eight software development companies from Recife's information technology cluster were visited; a form was applied and an interview was conducted with one of each firm's key professionals. Although the study is predominantly qualitative, part of the data was analyzed through charts and graphs, providing an overview of the companies and their environment that proved very useful to the analysis carried out through the interpretation of the interviews. As a result, it was found that the companies are structured around hybrid business models derived from two ideal types of software development company: the software factory and the technology-based company. Regarding the development process, there is a balanced distribution between the traditional and agile development paradigms. Among the traditional methodologies, the Rational Unified Process (RUP) is predominant, while Scrum is the most widely used methodology among the organizations based on the Agile Manifesto's principles. Regarding the structuring process, each institutional entity acts in ways that generate different isomorphic pressures. Emphasis was given to entities such as customers, research agencies, clusters, market-leading businesses, public universities, incubators, software industry organizations, technology vendors, development tool suppliers and the managers' schooling and background, because they relate closely to the software firms; in this relationship, a dual and bilateral influence was found. Finally, the structuring level of the organizational field was also identified as low, which gives organizational actors room to act independently.

Relevance:

100.00%

Publisher:

Abstract:

Modern software application testing, such as the testing of software driven by graphical user interfaces (GUIs) or leveraging event-driven architectures in general, requires paying careful attention to context. Model-based testing (MBT) approaches first acquire a model of an application, then use the model to construct test cases covering relevant contexts. A major shortcoming of state-of-the-art automated model-based testing is that many test cases proposed by the model are not actually executable. These infeasible test cases threaten the integrity of the entire model-based suite, and any coverage of contexts the suite aims to provide. In this research, I develop and evaluate a novel approach for classifying the feasibility of test cases. I identify a set of pertinent features for the classifier, and develop novel methods for extracting these features from the outputs of MBT tools. I use a supervised logistic regression approach to obtain a model of test case feasibility from a randomly selected training suite of test cases. I evaluate this approach with a set of experiments. The outcomes of this investigation are as follows: I confirm that infeasibility is prevalent in MBT, even for test suites designed to cover a relatively small number of unique contexts. I confirm that the frequency of infeasibility varies widely across applications. I develop and train a binary classifier for feasibility with average overall error, false positive, and false negative rates under 5%. I find that unique event IDs are key features of the feasibility classifier, while model-specific event types are not. I construct three types of features from the event IDs associated with test cases, and evaluate the relative effectiveness of each within the classifier. To support this study, I also develop a number of tools and infrastructure components for scalable execution of automated jobs, which use state-of-the-art container and continuous integration technologies to enable parallel test execution and the persistence of all experimental artifacts.
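
A minimal sketch of the kind of feasibility classifier described above: logistic regression over event-ID features of test cases. The bag-of-event-IDs feature construction and the toy data are assumptions for illustration, not the dissertation's exact feature sets.

```python
# Classify test-case feasibility from the GUI event IDs a test case exercises.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each test case is represented by its event IDs; label 1 = feasible, 0 = infeasible.
test_cases = [
    "open_menu click_save close_dialog",
    "click_save close_dialog",
    "open_menu close_dialog",
    "close_dialog click_save",
]
labels = [1, 0, 1, 0]

classifier = make_pipeline(CountVectorizer(), LogisticRegression())
classifier.fit(test_cases, labels)
print(classifier.predict(["open_menu click_save"]))  # predicted feasibility of a new test case
```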

Relevance:

90.00%

Publisher:

Abstract:

Support for interoperability and interchangeability of software components that are part of a fieldbus automation system relies on the definition of open architectures, most of them involving proprietary technologies. Concurrently, standard, open and non-proprietary technologies, such as XML, SOAP, Web Services and the like, have greatly evolved and been diffused in the computing area. This article presents a FOUNDATION fieldbus (TM) device description technology named Open-EDD, based on XML and other related technologies (XSLT, DOM using the Xerces implementation, OO, XML Schema), proposing an open and non-proprietary alternative to the EDD (Electronic Device Description). This initial proposal includes defining Open-EDDML as the programming language of the technology in the FOUNDATION fieldbus (TM) protocol, implementing a compiler and a parser, and finally, integrating and testing the new technology using field devices and a commercial fieldbus configurator. This study attests that this new technology is feasible and can be applied to other configurators or HMI applications used in fieldbus automation systems. (c) 2008 Elsevier B.V. All rights reserved.
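
As a small illustration of working with an XML-based device description in the spirit of Open-EDD, the sketch below parses a hand-written fragment with Python's standard library. The element and attribute names are invented placeholders; the real Open-EDDML schema is the one defined in the article.

```python
# Parse a hypothetical XML device description and list its block parameters.
import xml.etree.ElementTree as ET

DEVICE_DESCRIPTION = """
<device manufacturer="0x1234" deviceType="0x0007" revision="1">
  <block name="AI_BLOCK">
    <parameter name="PV" type="float" access="read"/>
    <parameter name="HI_LIMIT" type="float" access="read-write"/>
  </block>
</device>
"""

root = ET.fromstring(DEVICE_DESCRIPTION)
for block in root.findall("block"):
    for param in block.findall("parameter"):
        print(block.get("name"), param.get("name"), param.get("type"), param.get("access"))
```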

Relevance:

90.00%

Publisher:

Abstract:

This paper presents the lifecycle assessment (LCA) of fuel ethanol, as 100% of the vehicle fuel, from sugarcane in Brazil. The functional unit is 10,000 km run in an urban area by a car with a 1,600 cm³ engine running on hydrated fuel ethanol, and the resulting reference flow is 1,000 kg of ethanol. The product system includes agricultural and industrial activities, distribution, cogeneration of electricity and steam, ethanol use during car driving, and the recycling of industrial by-products to irrigate sugarcane fields. The use of sugarcane by the ethanol agribusiness is one of the foremost financial resources for the economy of the Brazilian rural area, which occupies extensive land and offers far-reaching potential for renewable fuel production. However, there are environmental impacts during the fuel ethanol lifecycle, which this paper intends to analyze, including identifying the main activities responsible for such impacts and suggesting ways to minimize them. This study is classified as applied quantitative research, and the technical procedure to achieve the exploratory goal is based on bibliographic revision, documental research, primary data collection, and case studies at sugarcane farms and fuel ethanol industries in the northeast of São Paulo State, Brazil. The methodological structure of this LCA study follows the International Organization for Standardization, and the method used is the Environmental Design of Industrial Products. The lifecycle impact assessment (LCIA) covers the following emission-related impact categories: global warming, ozone formation, acidification, nutrient enrichment, ecotoxicity, and human toxicity. The results of the fuel ethanol LCI demonstrate that, even though alcohol is considered a renewable fuel because it comes from biomass (sugarcane), it uses a high quantity and diversity of non-renewable resources over its lifecycle. The input of renewable resources is also high, mainly because of water consumption in the industrial phases due to the sugarcane washing process. During the lifecycle of alcohol, there is a surplus of electric energy due to the cogeneration activity. Another point of focus is the quantity of emissions to the atmosphere and the diversity of the substances emitted. Harvesting is the unit process that contributes most to global warming. For photochemical ozone formation, harvesting is also the activity with the strongest contributions, due to the burning during harvesting and the emissions from using diesel fuel. The acidification impact potential is mostly due to the NOx emitted by the combustion of ethanol during use, to the sulfuric acid used in the industrial process, and to the NOx emitted by the burning during harvesting. The main consequence of the intensive use of fertilizers in the field is the high nutrient enrichment impact potential associated with this activity. The main contributions to the ecotoxicity impact potential come from chemical applications during crop growth. The activity with the highest human toxicity (HT) impact potential via air and via soil is harvesting; via water, the HT potential is high in harvesting due to lubricant use on the machines. The normalization results indicate that nutrient enrichment, acidification, and human toxicity via air and via water are the most significant impact potentials for the lifecycle of fuel ethanol.
The fuel ethanol lifecycle contributes negatively to all the impact potentials analyzed: global warming, ozone formation, acidification, nutrient enrichment, ecotoxicity, and human toxicity. Concerning energy, the lifecycle consumes less energy than it produces, largely because of the electricity cogeneration system, but the process is highly dependent on water. The main causes of the biggest impact potentials indicated by the normalization are nutrient application, burning during harvesting, and the use of diesel fuel. The recommendations for the ethanol lifecycle are: harvesting the sugarcane without burning; more environmentally benign agricultural practices; renewable fuel rather than diesel; not washing the sugarcane and implementing water recycling systems during industrial processing; and improving the control of gaseous emissions, mainly NOx, during the use of ethanol in cars. Further studies on fuel ethanol from sugarcane may analyze in more detail the social aspects, biodiversity, and land use impacts.
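
The generic LCIA arithmetic behind results like these is simple: each emission is multiplied by a characterization factor for the impact category, and the category total is divided by a normalization reference. The sketch below shows that calculation with invented placeholder numbers, not the study's inventory data.

```python
# Characterization and normalization of a lifecycle inventory (illustrative numbers only).
inventory = {"CO2": 2500.0, "NOx": 12.0}  # kg emitted per functional unit

characterization = {  # category unit per kg of substance (placeholder factors)
    "global warming": {"CO2": 1.0, "NOx": 0.0},
    "acidification": {"CO2": 0.0, "NOx": 0.5},
}
normalization_ref = {"global warming": 8700.0, "acidification": 74.0}  # reference per person-year

for category, factors in characterization.items():
    impact = sum(inventory.get(substance, 0.0) * factor for substance, factor in factors.items())
    print(category, impact, impact / normalization_ref[category])
```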

Relevance:

90.00%

Publisher:

Abstract:

Objectives: To evaluate the influence of JPEG quality factors 100, 80 and 60 on the reproducibility of the identification of cephalometric points on images of lateral cephalograms, compared with the Digital Imaging and Communications in Medicine (DICOM) format. Methods: The sample was composed of 30 images of digital lateral cephalograms obtained from 30 individuals (15 males and 15 females) on a phosphor plate system in DICOM format. The images were converted to JPEG with quality factors 100, 80 and 60 with the aid of software, adding up to 90 images. The 120 images (DICOM and JPEG 100, 80 and 60) were blinded, and 12 cephalometric points were identified on each image by three calibrated orthodontists, using the x-y coordinate system in cephalometric software. Results: The identification of cephalometric points was highly reproducible, except for the point Orbitale (Or) on the x-axis. The different file formats did not present a statistically significant difference. Conclusions: JPEG images of lateral cephalograms with quality factors 100, 80 and 60 did not present alterations in the reproducibility of the identification of cephalometric points compared with the DICOM format. Good reproducibility was achieved for the 12 points, except for point Or on the x-axis. Dentomaxillofacial Radiology (2009) 38, 393-400. doi: 10.1259/dmfr/40996636

Relevance:

90.00%

Publisher:

Abstract:

Computer-assisted learning has an important role in the teaching of pharmacokinetics to health sciences students because it transfers the emphasis from the purely mathematical domain to an 'experiential' domain, in which graphical and symbolic representations of actions and their consequences form the major focus for learning. Basic pharmacokinetic concepts can be taught by experimenting with how dose and dosage interval interact with drug absorption (e.g. absorption rate, bioavailability), drug distribution (e.g. volume of distribution, protein binding) and drug elimination (e.g. clearance) to determine drug concentrations, using library ('canned') pharmacokinetic models. Such 'what if' approaches are found in calculator-simulators such as PharmaCalc, Practical Pharmacokinetics and PK Solutions. Others, such as SAAM II, ModelMaker and Stella, represent the 'systems dynamics' genre, which requires the user to conceptualise a problem and formulate the model on-screen using symbols, icons and directional arrows. The choice of software should be determined by the aims of the subject/course, the experience and background of the students in pharmacokinetics, and institutional factors including the price and networking capabilities of the package(s). Enhanced learning may result if the computer teaching of pharmacokinetics is supported by tutorials, especially ones in which the techniques are applied to solving problems where the link with healthcare practices is clearly established.
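
The 'what if' experimentation described above boils down to evaluating a standard pharmacokinetic model for different doses and dosage intervals. The sketch below uses the one-compartment model with first-order absorption and elimination; the parameter values are illustrative, not taken from any of the packages mentioned.

```python
# One-compartment oral dosing: C(t) = F*D*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t))
import math

def concentration(t, dose, F, ka, CL, V):
    """Plasma concentration (mg/L) at time t (h) after a single oral dose."""
    ke = CL / V  # elimination rate constant (1/h)
    return (F * dose * ka) / (V * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

def multiple_doses(t_since_last_dose, tau, n_doses, **pk):
    """Approximate concentration after n_doses given every tau hours (superposition)."""
    return sum(concentration(t_since_last_dose + i * tau, **pk) for i in range(n_doses))

pk = dict(dose=500, F=0.9, ka=1.2, CL=5.0, V=40.0)   # mg, -, 1/h, L/h, L (illustrative values)
print(concentration(2.0, **pk))                       # 2 h after a single dose
print(multiple_doses(2.0, tau=12, n_doses=10, **pk))  # 2 h after the 10th twice-daily dose
```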

Relevance:

90.00%

Publisher:

Abstract:

Many business-oriented software applications are subject to frequent changes in requirements. This paper shows that, ceteris paribus, increases in the volatility of system requirements decrease the reliability of software. Further, systems that exhibit high volatility during the development phase are likely to have lower reliability during their operational phase. In addition to the typically higher volatility of requirements, end-users who specify the requirements of business-oriented systems are usually less technically oriented than people who specify the requirements of compilers, radar tracking systems or medical equipment. Hence, the characteristics of software reliability problems for business-oriented systems are likely to differ significantly from those of more technically oriented systems.

Relevance:

90.00%

Publisher:

Abstract:

The appropriate use of ICT in mathematics teaching is nowadays considered by some to be justified and inevitable, in the expectation that it will improve the teaching and learning of mathematics. This investigation tested the Winplot software in the teaching and learning of the graph of the quadratic function with 10th-grade students of Secondary School No. 9099 (second cycle of secondary education), in order to verify whether it improves the teaching and learning of this topic. Two groups of 10th-grade students were selected to act as control group and experimental group. After both groups took two pre-tests, the experimental group carried out its learning in the computer laboratory with the aid of Winplot over 8 weeks during the second term of the 2009/2010 school year, while the control group learned the same content at the same time in a regular classroom without Winplot. Comparing the two groups with a t-test for independent samples shows that there are no statistically significant differences between them, since the significance levels are greater than p = 0.05. Thus the experimental group did not obtain better results than the control group, and Winplot did not produce the desired effect on the learning of 10th-grade students of Secondary School No. 9099, located in the municipality of Viana (Luanda, Angola).
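
The statistical comparison reported above is an independent-samples t-test at p = 0.05. A minimal sketch of that test in Python is shown below; the post-test scores are invented placeholders, not the study's data.

```python
# Independent-samples t-test on post-test scores of the two groups (illustrative data).
from scipy import stats

experimental = [12.5, 14.0, 11.0, 13.5, 12.0, 15.0, 10.5, 13.0]
control = [11.5, 13.0, 12.5, 12.0, 14.0, 11.0, 12.5, 13.5]

t_stat, p_value = stats.ttest_ind(experimental, control)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
if p_value >= 0.05:
    print("No statistically significant difference between the groups.")
```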

Relevance:

90.00%

Publisher:

Abstract:

Nowadays software has become a useful element in the lives of people and companies, and there is a growing need for quality applications that allow companies to differentiate themselves in the market. Software producers seek to increase the quality of their development processes in order to ensure the quality of the final product. The size and complexity of software increase the probability of non-conformities appearing in these products, hence the interest in software testing activities throughout the design, development and maintenance process. Many software development projects are delivered late because, on the planned completion date, they do not perform satisfactorily, are not reliable, or are difficult to maintain. Good planning of software production activities usually means greater efficiency of the whole production process, since it can reduce the number of defects and the cost of correcting them, increasing confidence in the software in use and making it easier to operate and maintain. This underlines the importance of adopting good practices in software development: a systematic and organized approach must be used in order to produce quality software. This thesis describes the main software development models, the importance of requirements engineering, testing processes and the main validations of software quality, and how some companies apply these principles in their day-to-day work in order to produce a more reliable final product. It also presents some examples that complement the context of the thesis.

Relevance:

90.00%

Publisher:

Abstract:

Final Master's project for obtaining the degree of Master in Mechanical Engineering in the area of Maintenance and Production.