938 results for Software Process Improvement
Resumo:
Nowadays, the lean methodology is increasingly present in every organization. Grounded in continuous improvement, it aims to respond to market needs and customer satisfaction, with the main purpose of creating value in the product and eliminating the waste inherent to its production processes. An essential element in managing any results-oriented organization is the use of performance indicators in decision-making. The main objective of this project was to identify and eliminate waste and to improve the assembly processes by studying the cycle times of the most critical work centres/products, performing an appropriate line balancing, and then simulating the results in the Arena software. The results were then analysed, along with the impact these changes had on the company, based on the implementation of improvement tools, namely lean tools. The changes had a positive impact on the final production of the chairs and baby carriers under study: shorter queues between workstations, reduced processing times, and increased output for some of the models.
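As a rough illustration of the balancing arithmetic such a study relies on (the shift length, demand, and cycle times below are invented, not taken from the project), takt time and line balance efficiency can be computed directly from demand and per-station cycle times:

```python
# Hypothetical line-balancing check: takt time vs. workstation cycle times.
# All numbers are illustrative, not from the study.

def takt_time(available_time_s: float, demand_units: int) -> float:
    """Takt time = available production time / customer demand."""
    return available_time_s / demand_units

def balance_efficiency(cycle_times_s: list[float]) -> float:
    """Line balance efficiency = sum of task times / (stations * bottleneck time)."""
    bottleneck = max(cycle_times_s)
    return sum(cycle_times_s) / (len(cycle_times_s) * bottleneck)

# Example: a 7.5 h shift (27000 s), demand of 450 chairs, four assembly stations.
takt = takt_time(27000, 450)            # seconds available per unit
cycles = [52.0, 58.0, 47.0, 55.0]       # observed cycle times per station (s)
eff = balance_efficiency(cycles)
overloaded = [i for i, c in enumerate(cycles) if c > takt]  # stations above takt
```

A station whose cycle time exceeds the takt time is a bottleneck candidate for rebalancing before the scenario is simulated.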
Resumo:
Software protection is an essential aspect of information security, needed to withstand malicious activities against software and to preserve software assets. However, software developers still lack a methodology for assessing the protections they deploy. To address this, we present a novel attack-simulation-based software protection assessment method to assess and compare various protection solutions. Our solution relies on Petri nets to specify and visualize attack models, and we developed a Monte Carlo based approach to simulate attack processes and to deal with uncertainty. Based on this simulation and estimation, a novel protection comparison model is proposed to compare different protection solutions. Lastly, our attack-simulation-based software protection assessment method is presented. We illustrate the method with a software protection assessment process, demonstrating that our approach can provide a suitable software protection assessment for developers and software companies.
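A minimal sketch of the Monte Carlo idea, with an invented two-step attack model rather than the paper's Petri-net formalism: each step has an assumed success probability and time cost, and repeated trials estimate how often, and how quickly, an attack succeeds under each protection.

```python
import random

# Minimal Monte Carlo sketch of the attack-simulation idea (not the paper's
# actual Petri-net model): each attack step has a success probability and a
# time cost; a protection is "stronger" if simulated attacks succeed less
# often or take longer. The step parameters below are invented.

def simulate_attack(steps, rng):
    """Run one attack attempt; return (succeeded, elapsed_time)."""
    elapsed = 0.0
    for p_success, hours in steps:
        elapsed += hours
        if rng.random() > p_success:
            return False, elapsed       # attacker gives up at this step
    return True, elapsed

def estimate(steps, trials=10_000, seed=0):
    """Estimate success probability and mean time of successful attacks."""
    rng = random.Random(seed)
    results = [simulate_attack(steps, rng) for _ in range(trials)]
    times = [t for ok, t in results if ok]
    p = len(times) / trials
    return p, (sum(times) / len(times) if times else float("inf"))

weak   = [(0.9, 2.0), (0.8, 4.0)]       # e.g. locate code, patch a check
strong = [(0.5, 8.0), (0.4, 12.0)]      # same steps under an added protection

p_weak, _ = estimate(weak)
p_strong, _ = estimate(strong)
```

Comparing `p_weak` and `p_strong` (and the corresponding mean attack times) gives the kind of quantitative protection comparison the abstract describes.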
Resumo:
Business Process Management (BPM) can organize and frame a company, focusing on improving or assuring performance in order to gain competitive advantage. Although BPM is believed to improve various aspects of organizational performance, empirical evidence for this has been lacking. The present study aims to develop a model showing the impact of business process management on organizational performance. To accomplish that, the theoretical basis needed to identify the elements that configure BPM, and the measures that can evaluate BPM's effect on organizational performance, is built through a systematic literature review (SLR). A research model is then proposed according to the SLR results. Empirical data will be collected from a survey of large and mid-sized industrial and service companies headquartered in Brazil, and a quantitative analysis using structural equation modeling (SEM) will test whether the direct effects of BPM on organizational performance are statistically significant. Finally, we discuss these results and their managerial and scientific implications. Keywords: Business process management (BPM); Organizational performance; Firm performance; Business models; Structural equation modeling; Systematic literature review.
Resumo:
It is now clear that the concept of an HPC compiler which automatically produces highly efficient parallel implementations is a pipe-dream. Another route is to recognise from the outset that user information is required, and to develop tools that embed user interaction in the transformation of code from scalar to parallel form, then use conventional compilers with a set of communication calls. This is the key idea underlying the development of the CAPTools software environment. The initial version of CAPTools focuses on single-block structured-mesh computational mechanics codes. The capability for unstructured-mesh codes is now under test, and block-structured meshes will be added next. The parallelisation process can be completed rapidly for modest codes, and the parallel performance approaches that delivered by hand parallelisation.
Resumo:
Wrongdoing in health care is harmful action that jeopardizes patient safety and can be targeted at patients or employees. It can range from illegal, unethical or unprofessional action to inappropriate behavior in the workplace. Whistleblowing can be considered a process in which wrongdoing in health care is suspected or observed by health care professionals and disclosed to a party that can influence the wrongful action. Whistleblowing causes severe harm to the whistleblower and to the object of the complaint, to their personal lives and to the working community. The aim of this study was to analyze the whistleblowing process in Finnish health care; the overall goal is to raise awareness of wrongdoing and whistleblowing in Finnish health care. In this cross-sectional descriptive study, data were collected (n = 397) by probability sampling from health care professionals who are members of Tehy, the Union of Health and Social Care Professionals in Finland. Data were collected with the questionnaire "Whistleblowing – väärinkäytösten paljastaminen terveydenhuollossa", developed for this study, using the Webropol survey software between 26 June and 17 July 2015, and were analyzed statistically. According to the results, health care professionals had suspected (67%) and observed (66%) wrongdoing in health care, most often more than once a month (30%). The most commonly suspected (37%) and observed (36%) wrongdoing was inadequate staffing, and the least common was violence toward patients (3%). Wrongdoing was whistle-blown (suspected 29%, observed 40%) primarily inside the organization, to the closest supervisor (76%), face-to-face (88%). The whistle was most often blown on nurses' wrongdoing (58%). The whistleblowing act did not end the wrongdoing (52%), and whistleblowing had negative consequences for the whistleblower, such as discrimination by the manager (35%).
Respondents with less than ten years of work experience (62%), in temporary positions (75%) or in management positions (88%) were more unwilling to blow the whistle. Respondents felt that whistleblowing should be conducted internally, to the closest manager, in writing and anonymously; that wrongdoing should be dealt with between the parties involved; and that a written warning should ensue from wrongdoing. According to the results of this study, whistleblowing on wrongdoing in health care has negative consequences for the whistleblower. In the future, attention in health care should be paid to preventing wrongdoing and to supporting whistleblowing, in order to decrease wrongdoing and lessen the consequences whistleblowers face after blowing the whistle.
Resumo:
The process of developing software is a complex undertaking involving multiple stakeholders. While the intentions of these parties may vary to some extent, the ultimate goal can be seen as a satisfactory product. Lean and agile software development practices strive toward this, placing customer contentment among the highest aims of the process. An important aspect of any development process is the act of innovation; without it, nothing progresses and the whole process is unnecessary. As a target domain expert, the customer is an important part of effective innovation. Problems arise, however, when the customer does not actively take part in the activities, and a lack of familiarity with software development can easily cause such issues. Unfortunately, little research has been conducted on product innovation, which makes it difficult to formulate a recommended approach for stimulating the customer and encouraging more active participation. Ultimately, a small set of high-level guidelines for inducing innovation was identified from the available literature. To conclude, this thesis presents the findings made during the development of a small web application and compares them to the aforementioned literature findings. While the guidelines seem to provide promising results, further empirical research is needed to reach more significant conclusions.
Resumo:
For nearly two centuries, gas hydrates have played an important role in process engineering because of their economic and environmental impact on industry. Every day, more companies and engineers take an interest in the subject, as new challenges reveal gas hydrates as a crucial factor and make their study a solution for the near future. Gas hydrates are ice-like structures composed of a host lattice of water molecules enclosing gaseous guest compounds. They occur naturally at high pressures and low temperatures, conditions typical of some chemical and petrochemical processes [1]. Based on the doctoral work of Windmeier [2] and the doctoral work of Rock [3], the thermodynamic description of the gas hydrate phases is implemented following the state of the art in science and technology. With the help of the Dortmund Data Bank (DDB) and the corresponding software package (DDBSP) [26], the performance of the method was improved and compared against a large amount of data published around the world. The applicability of gas hydrate prediction was also studied with a focus on process engineering, through a case study on the extraction, production and transport of natural gas. Gas hydrate prediction was found to be crucial in natural gas process design: no hydrate formation occurs in the gas treatment and liquid processing stages, while in the dehydration stage a minimum temperature of 290.15 K is critical, and for extraction and transport the use of inhibitors is essential. A mass composition of 40% ethylene glycol was found appropriate to prevent gas hydrate formation during extraction, and a mass composition of 20% methanol during transport.
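For a back-of-the-envelope view of inhibitor dosing of this kind (not the DDBSP-based method used in the thesis), the classic Hammerschmidt correlation estimates the hydrate-formation temperature depression caused by an inhibitor; the constant K ≈ 2335 is the commonly quoted value when the depression is expressed in degrees Fahrenheit:

```python
# Hedged sketch of the classic Hammerschmidt correlation for hydrate
# suppression by a thermodynamic inhibitor (not the thesis's method):
#   dT = K * W / (M * (100 - W))
# where W is the inhibitor mass percent in the aqueous phase, M its molar
# mass (g/mol), and K ~ 2335 gives dT in degrees Fahrenheit.

def hammerschmidt_dT_F(w_mass_pct: float, molar_mass: float,
                       K: float = 2335.0) -> float:
    """Hydrate-formation temperature depression, in F (Hammerschmidt)."""
    return K * w_mass_pct / (molar_mass * (100.0 - w_mass_pct))

# The abstract's two cases: 40 wt% ethylene glycol (M = 62.07 g/mol) for
# extraction, and 20 wt% methanol (M = 32.04 g/mol) for transport.
dT_meg  = hammerschmidt_dT_F(40.0, 62.07)
dT_meoh = hammerschmidt_dT_F(20.0, 32.04)
```

Such a correlation only gives a first estimate of the temperature margin the inhibitor buys; rigorous design relies on full thermodynamic prediction as in the thesis.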
Resumo:
Data mining, a much-discussed term, has been studied in various fields. Its potential for refining decision-making, revealing latent patterns and creating valuable knowledge has won the attention of scholars and practitioners. However, few studies attempt to combine data mining with libraries, where data generation occurs all the time; this thesis aims to fill that gap. At the same time, it explores opportunities created by data mining to enhance one of the most important elements of libraries: reference service. To demonstrate the feasibility and applicability of data mining, the literature is reviewed to establish a critical understanding of data mining in libraries and of the current status of library reference service. The review indicates that free online data resources, other than data generated on social media, are rarely applied in current library data mining initiatives, which motivates this study to utilize free online resources. Furthermore, a natural match between data mining and libraries is established. This match is explained by the data richness of libraries and by viewing data mining as a kind of knowledge, an easy choice for libraries, and a sensible method for overcoming reference service challenges. The match, especially the prospect that data mining could help library reference service, lays the main theoretical foundation for the empirical work in this study. Turku Main Library was selected as the case to answer the research question: is data mining feasible and applicable for reference service improvement? Daily visits to Turku Main Library from 2009 to 2015 are used as the resource for data mining, and the corresponding weather conditions are collected from Weather Underground, which is freely available online.
Before being analyzed, the collected dataset is cleansed and preprocessed to ensure the quality of the data mining. Multiple regression analysis is employed to mine the final dataset: hourly visits are the dependent variable, and weather conditions, the Discomfort Index, and the seven days of the week are the independent variables. In the end, four models, one per season, are established to predict visits in each season; patterns are identified for the different seasons and implications are drawn from them. In addition, library-climate points are generated by a clustering method, which simplifies the process for librarians of using weather data to forecast library visits. The data mining results are then interpreted from the perspective of improving reference service. After this data mining work, the results of the case study were presented to librarians in order to collect professional opinions on the possibility of employing data mining to improve reference services. The opinions collected were positive, which implies that it is feasible to utilize data mining as a tool to enhance library reference service.
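The regression setup described above can be sketched with synthetic data (the real study used Turku Main Library visit logs and Weather Underground observations; every number below is invented): hourly visits are regressed on weather features and day-of-week dummies via ordinary least squares.

```python
import numpy as np

# Toy version of the case study's regression: predict hourly library visits
# from weather features plus day-of-week dummies. All data is synthetic.

rng = np.random.default_rng(42)
n = 500
temp = rng.uniform(-20, 25, n)                 # temperature (C)
rain = rng.uniform(0, 10, n)                   # precipitation (mm)
day = rng.integers(0, 7, n)                    # day of week (0 = Monday)
day_dummies = np.eye(7)[day]                   # one-hot encoding

# Synthetic ground truth: milder weather -> more visitors; weekend effects.
day_effect = np.array([0, 2, 1, 3, 5, 20, -30])
visits = (120 + 1.5 * temp - 4.0 * rain
          + day_dummies @ day_effect + rng.normal(0, 5, n))

# Design matrix: intercept, weather, and 6 dummies (one day dropped).
X = np.column_stack([np.ones(n), temp, rain, day_dummies[:, 1:]])
beta, *_ = np.linalg.lstsq(X, visits, rcond=None)

pred = X @ beta
r2 = 1 - np.sum((visits - pred) ** 2) / np.sum((visits - visits.mean()) ** 2)
```

Fitting one such model per season, as the thesis does, is just a matter of splitting the rows by season before the least-squares fit.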
Resumo:
The use of ICT occupies an increasingly important place in our schools, marked above all by the evolution of technology and by the educational use of many Web 2.0 tools. This is particularly evident in the subject of Visual and Technological Education (EVT), which is eminently practical and allows a range of digital tools to be explored both for addressing the subject's content and for creating graphic and plastic products. With the emergence of Web 2.0 and the availability of thousands of new digital tools to Internet users, there is ever-growing interest in adopting methodologies and strategies that use these media to support more effective and motivating learning, articulating the traditional media of EVT with the new digital media. In this context, the present study is the result of action research carried out within the Doctoral Programme in Multimedia in Education at the University of Aveiro, in which Web, Web 2.0 and free software tools were integrated into the educational context of EVT. Both the traditional techniques most common in the subject and digital tools, supported by free software (and other freely usable tools), the Web and Web 2.0, could be used to support the teaching and learning of the subject's various contents and areas of exploration. This study, designed in three cycles, first involved setting up an extended community of practice of teachers: six training classes were created, bringing together a total of 112 teachers who wished to integrate digital tools into EVT. In addition to the search, analysis, selection and cataloguing of the 430 digital tools surveyed, 371 support manuals for their use were produced, and these resources were made available in the EVTdigital space.
In a second cycle, following the evaluation carried out, the EVTux distribution was created to simplify access to and use of the digital tools in the EVT context. Finally, the third cycle arose from the removal of EVT from the curriculum of the 2nd cycle of basic education and its replacement by two new subjects; a content analysis of the new curricular goals was carried out, and the application As ferramentas digitais do Mundo Visual was produced, designed to contextualize and index the digital tools selected for the new subject of Visual Education. The results of this study point clearly to the possibility of integrating digital tools into Visual and Technological Education (or, at present, Visual Education) to address its contents and areas of exploration; to the ease with which communities of practice can be formed (as was the case here) to collaborate in cataloguing these tools in the specific context of the subject; and to the need felt by teachers for information and training to keep them up to date on integrating ICT into the curriculum. The study's limitations are also presented; these relate above all to the negative impact that the subject's elimination had on teachers' motivation, and consequently on their participation in some phases of the work, and to the difficulty of managing such a large and diverse team of collaborating teachers. Suggestions for future studies are also presented.
Resumo:
This PhD project addresses the potential of concentrating solar power (CSP) plants as a viable alternative energy-producing system for Libya. Exergetic, energetic, economic and environmental analyses are carried out for a particular type of CSP plant. Although the study targets a particular configuration, a 50 MW parabolic-trough CSP plant, it is sufficiently general to be applied to other configurations. Its novelty, beyond modeling and analyzing the selected configuration, lies in combining a state-of-the-art exergetic analysis with Life Cycle Assessment (LCA). The modeling and simulation of the plant, presented in chapter three, is conducted in two parts: the power cycle and the solar field. The computer model developed for the analysis is based on algebraic equations describing the power cycle and the solar field. The model was implemented in the Engineering Equation Solver (EES) software; it determines the properties at each state point of the plant and then, sequentially, the energy, efficiency and irreversibility of each component. The model can be used in the preliminary design of CSP plants, in particular for configuring the solar field on the basis of existing commercial plants, and it can assess the energetic, economic and environmental feasibility of CSP plants in different regions of the world, which is illustrated here for Libya. The overall feasibility scenario is completed through an hourly analysis on an annual basis in chapter four. This analysis allows different systems to be compared and, eventually, a particular selection to be made; it covers both the economic and energetic components using the greenius software, and also examines the impact of project financing and incentives on the cost of energy.
The main technological finding of this analysis is a higher performance and a lower levelized cost of electricity (LCE) for Libya compared to Southern Europe (Spain). Libya therefore has the potential to become attractive for the establishment of CSP plants in its territory and, in this way, to facilitate the target of several European initiatives that aim to import electricity generated from renewable sources in North African and Middle Eastern countries. The analysis also presents a brief review of the current cost of energy and of the potential for reducing the cost of parabolic-trough CSP plants. Exergetic and environmental life cycle assessment analyses of the selected plant are conducted in chapter five, with three objectives: 1) to assess the environmental impact and cost, in terms of exergy, of the plant's life cycle; 2) to find the points of weakness in terms of process irreversibility; and 3) to verify whether solar power plants can reduce the environmental impact and the cost of electricity generation, by comparing them with fossil fuel plants, in particular a Natural Gas Combined Cycle (NGCC) plant and an oil-fired thermal power plant. The analysis also includes a thermoeconomic analysis using the specific exergy costing (SPECO) method to evaluate the cost caused by exergy destruction. The main technological findings are that the largest contribution to the impact lies with the solar field, at 79%, and that the materials with the highest impact are steel (47%), molten salt (25%) and synthetic oil (21%). The Human Health damage category presents the highest impact (69%), followed by the Resources damage category (24%). In addition, the highest exergy demand is linked to steel (47%), with considerable exergetic demand also related to molten salt and synthetic oil, at 25% and 19% respectively.
Finally, in the comparison with the fossil fuel power plants (NGCC and oil), the CSP plant presents the lowest environmental impact; the worst environmental performance is that of the oil power plant, followed by the NGCC plant. The solar field presents the largest cost rate, and the boiler has the highest cost rate among the power cycle components. Thermal storage allows CSP plants to overcome solar irradiation transients, to respond to electricity demand independently of weather conditions, and to extend electricity production beyond the availability of daylight. A numerical analysis of the thermal transient response of a thermocline storage tank is carried out for the charging phase. The system of equations describing the numerical model is solved using time-implicit and space-backward finite differences, encoded in the Matlab environment. The analysis yields the following findings: the predictions agree well with experiments for the time evolution of the thermocline region, particularly for the regions away from the top inlet; the deviations observed near the inlet are most likely due to the high level of turbulence there, resulting from localized mixing. A simple analytical model was developed to take this increased turbulence level into account; it leads to some improvement of the predictions, requires practically no additional computational effort, and relates the effective thermal diffusivity to the mean effective velocity of the fluid at each height of the system. Altogether, the study indicates that the selected parabolic-trough CSP plant has the edge over competing technologies in locations where DNI is high and land usage is not an issue, such as the Libyan shoreline.
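The time-implicit, space-backward (upwind) discretization described above can be sketched for a 1-D advection-diffusion model of the charging phase; all parameters below are illustrative rather than the thesis's calibrated values, and the outlet boundary condition is deliberately simplistic.

```python
import numpy as np

# Schematic 1-D thermocline charging model: advection-diffusion of
# temperature, discretized time-implicit and space-backward (upwind),
# solved as a linear system at each time step. Parameters are invented.

def charge(n=100, L=1.0, u=2e-4, alpha=1e-6, dt=5.0, steps=400,
           T_cold=290.0, T_hot=380.0):
    dx = L / (n - 1)
    T = np.full(n, T_cold)                 # tank initially cold
    a = u * dt / dx                        # Courant number (advection)
    d = alpha * dt / dx**2                 # diffusion number
    # Implicit scheme per interior node i:
    #   (1 + a + 2d) T_i - (a + d) T_{i-1} - d T_{i+1} = T_i_old
    A = np.zeros((n, n))
    for i in range(1, n - 1):
        A[i, i - 1] = -(a + d)
        A[i, i] = 1 + a + 2 * d
        A[i, i + 1] = -d
    A[0, 0] = 1.0                          # hot inlet at top (Dirichlet)
    A[-1, -1] = 1.0                        # outlet: hold previous value
    for _ in range(steps):
        b = T.copy()
        b[0] = T_hot
        T = np.linalg.solve(A, b)
    return T

T = charge()                               # axial temperature profile
```

The resulting profile shows the hot front advancing from the inlet with a smeared thermocline region between the hot and cold zones; the upwind treatment adds numerical diffusion, which is one reason schemes like this need validation against experiments, as the thesis describes.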
Resumo:
Bone marrow transplantation (BMT) is an important therapeutic procedure for patients with leukemia or lymphoma. As a consequence of this process, a reaction known as graft-versus-host disease (GVHD) can occur in susceptible patients owing to the presence of immunocompetent donor cells. However, there is no model that completely describes the immunological mechanism of GVHD from the initiating phase of the disease to the effector phase. The general objective of this study is to investigate the immune response, with respect to the HLA (human leukocyte antigen) system, in patients who developed GVHD following BMT. The National Cancer Institute (NCI) Pathway Interaction Database and Reactome were used as databases to study the expression of genes and pathways related to HLA classes I and II. The study considered expression changes in genes related to 17 immune system pathways with the potential to be expressed in patients who developed GVHD after BMT. Transcriptome data were obtained using the Affymetrix Human Genome U133 Plus (GPL570) platform. Relative activity was used to determine pathway alterations in GVHD samples compared with controls, and the analyses were performed with the ViaComplex and Bioconductor software. A significant increase was observed in the expression of genes related to the adaptive immune system pathways, HLA class I and II antigens, phosphorylation of CD3 and CD247, T-cell receptor signaling in naive CD4+ cells, and activation of NF-kappa B in B cells. Significant changes were also observed in the expression of genes associated with pathways related to the B7:CD28/CTLA-4 superfamily of molecules when compared with controls.
This may indicate the need to generate a second, co-stimulatory signal in GVHD, triggered by molecules of this superfamily. The increased expression of the CD69 gene in the experimental samples characterizes cellular activation and, therefore, the signaling of stimuli in GVHD. The findings of this study help to elucidate the immunopathogenic mechanism associated with GVHD.
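As a toy illustration of a relative-activity measure of the kind used above (the study itself used the ViaComplex and Bioconductor tools; the gene names and expression values below are invented), a pathway's activity in GVHD samples can be compared with controls via the ratio of mean expressions:

```python
# Illustrative "relative activity" of a pathway: the ratio of its mean
# gene expression in case samples to that in controls. All values are
# invented; the study used ViaComplex/Bioconductor on GPL570 data.

def relative_activity(pathway_genes, expr_case, expr_control):
    """Mean case expression over mean control expression for one pathway."""
    case = sum(expr_case[g] for g in pathway_genes) / len(pathway_genes)
    ctrl = sum(expr_control[g] for g in pathway_genes) / len(pathway_genes)
    return case / ctrl

pathway = ["CD3E", "CD247", "CD69"]        # hypothetical T-cell signaling set
gvhd    = {"CD3E": 9.1, "CD247": 8.4, "CD69": 10.2}
control = {"CD3E": 6.0, "CD247": 6.2, "CD69": 5.8}

ra = relative_activity(pathway, gvhd, control)   # > 1 suggests up-regulation
```

A ratio well above 1, as here, is the kind of signal that would flag a pathway as up-regulated in GVHD relative to control.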
Resumo:
With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The growing number of software configurations, input parameters, usage scenarios, supported platforms, external dependencies, and versions plays an important role in increasing the cost of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, this is no longer the case for software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing existing software test cases, which can be generated either automatically or manually. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Commonalities between these test case sequence covers are then extracted, processed, analyzed, and presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared by a number of test cases failing for the same reason resemble the faulty execution path; hence, the search space for the faulty execution path can be narrowed by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers.
Optimization techniques are devised to generate shorter and more logical sequence covers, and to select, among all possible common subsequences, those with a high likelihood of containing the root cause. A hybrid static/dynamic analysis approach is designed to trace the common subsequences back from the failure to the root cause. A debugging tool was created to let developers use the approach, integrated with an existing Integrated Development Environment and with its program editors, so that developers can benefit both from the tool's suggestions and from their source code counterparts. Finally, a comparison between the developed approach and state-of-the-art techniques shows that developers need to inspect only a small number of lines in order to find the root cause of a fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both running time and output subsequence length.
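The trace-commonality step can be sketched with a plain pairwise longest-common-subsequence reduction (the thesis's algorithm is more efficient and keeps a richer set of candidate subsequences; the traces below are hypothetical):

```python
from functools import reduce

# Sketch of the trace-commonality idea: reduce a set of failing-test
# execution traces to a common subsequence that is a candidate faulty
# path. Real implementations are more efficient and keep all maximal
# common subsequences; this uses plain pairwise LCS.

def lcs(a, b):
    """Classic dynamic-programming longest common subsequence."""
    m, n = len(a), len(b)
    dp = [[()] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            if a[i] == b[j]:
                dp[i + 1][j + 1] = dp[i][j] + (a[i],)
            else:
                dp[i + 1][j + 1] = max(dp[i][j + 1], dp[i + 1][j], key=len)
    return list(dp[m][n])

def common_path(traces):
    """Fold LCS over all failing traces."""
    return reduce(lcs, traces)

# Hypothetical statement-level traces of three failing tests:
t1 = ["init", "parse", "lookup", "div", "report"]
t2 = ["init", "lookup", "div", "log", "report"]
t3 = ["init", "parse", "lookup", "retry", "div", "report"]
suspect = common_path([t1, t2, t3])   # candidate faulty execution path
```

With more failing traces, the shared subsequence shrinks toward the statements genuinely implicated in the fault, which is exactly the search-space-narrowing hypothesis stated above.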
Resumo:
The aim of the project was to improve an existing testing machine produced by the company EVOLEO Technologies. New concepts for each part were devised in order to produce an innovative unit that combines the best segments of the old construction with new, improved ones. The machine is meant to test different kinds of devices that use specific elements such as buttons, knobs and monitors. The main purpose is to create various component concepts that could be exchanged in order to lower the cost or weight, or to simplify the operating process. Figure 1 shows the existing device under discussion.
Resumo:
Abstract – Background – The software effort estimation research area aims to improve the accuracy of estimation in software projects and activities. Aims – This study describes the development and use of a web application to collect data generated by the Planning Poker estimation process, and the analysis of the collected data to investigate the impact of revising previous estimates when conducting similar estimates in a Planning Poker context. Method – Software activities were estimated by Universidade Tecnológica Federal do Paraná (UTFPR) computer science students using Planning Poker, with and without revising previous similar activities, storing data about the decision-making process. The collected data was then used to investigate the impact that revising similar completed activities has on the accuracy of software effort estimates. Obtained Results – The UTFPR students were divided into 14 groups. Eight of them showed an accuracy increase in more than half of their estimates; three had roughly the same accuracy in more than half of their estimates; and only three lost accuracy in more than half of their estimates. Conclusion – Reviewing similar completed software activities when using Planning Poker led to more accurate software estimates in most cases and can therefore improve the software development process.
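A common way to quantify accuracy changes like those reported above is the magnitude of relative error (MRE); the sketch below, with invented effort values rather than the study's data, shows how estimates made with and without review of similar activities might be compared:

```python
# Comparing estimation accuracy via MRE / MMRE. The (actual, estimated)
# hour pairs below are invented, purely to illustrate the comparison.

def mre(actual: float, estimate: float) -> float:
    """Magnitude of relative error: |actual - estimate| / actual."""
    return abs(actual - estimate) / actual

def mmre(pairs) -> float:
    """Mean MRE over (actual, estimate) pairs."""
    return sum(mre(a, e) for a, e in pairs) / len(pairs)

without_review = [(8, 13), (5, 3), (12, 20)]   # (actual h, estimated h)
with_review    = [(8, 9), (5, 4), (12, 14)]

improved = mmre(with_review) < mmre(without_review)
```

A lower MMRE for the with-review condition is the kind of per-group evidence behind the "accuracy increase in more than half of their estimates" result.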