970 results for Software Process
Abstract:
Several authors have pointed to the need to understand the process of technological structuring in contemporary firms. From this perspective, the software industry is a particularly important element, since it provides products and services directly to organizations in many fields. The Brazilian software industry, in particular, has peculiarities that distinguish it from its counterparts in developed countries, which makes understanding it even more relevant. There is evidence that local firms adopt different strategies and structural configurations to enter a market naturally dominated by large multinational firms. This study therefore aims to understand not only the structural configurations assumed by domestic firms, but also the dynamics and processes that lead to these different configurations. To that end, this PhD dissertation investigates the institutional environment, its entities, and the isomorphic movements through an exploratory, descriptive, and explanatory multiple-case study. Eight software development companies from Recife's information technology cluster were visited; a questionnaire was administered, and an interview was conducted with one of each firm's key professionals. Although the study is predominantly qualitative, part of the data was analyzed through charts and graphs, providing an overview of the companies and their environment that supported the interpretation of the interviews. The results show that the companies are structured around hybrid business models derived from two ideal types of software development company: the software factory and the technology-based company. Regarding the development process, a balanced distribution was found between the traditional and agile development paradigms. Among traditional methodologies, the Rational Unified Process (RUP) predominates; Scrum is the most widely used methodology among organizations following the Agile Manifesto's principles.
Regarding the structuring process, each institutional entity acts in a way that generates distinct isomorphic pressures. Emphasis was given to entities such as customers, research agencies, clusters, market-leading businesses, public universities, incubators, software industry organizations, technology vendors, development tool suppliers, and the managers' education and background, because they interact closely with the software firms. This relationship was found to involve a dual, bilateral influence. Finally, the structuring level of the organizational field was found to be low, which gives organizational actors room to act independently.
Abstract:
This work provides a holistic investigation into feature modeling within software product lines. It identifies limitations and challenges in current feature modeling approaches, including, but not limited to, the lack of satisfactory cognitive presentation, poor usability in large-scale systems, inflexibility in accommodating change, the inability to predict model behavior, and the absence of probabilistic quantification of a model's implications and of decision support for reasoning under uncertainty. This thesis addresses these challenges with a series of solutions. The first is the construction of a Bayesian Belief Feature Model, a novel modeling approach capable of quantifying uncertainty in model parameters by combining probabilistic modeling with a conventional modeling approach. The Bayesian Belief Feature Model offers an enhanced feature modeling approach in terms of truth quantification and visual expressiveness. The second solution addresses the unclear support for reasoning under uncertainty and the challenging constraint satisfaction problem in software product lines, through the development of a mathematical reasoner designed to satisfy the model constraints by considering probability weights for all involved parameters and quantifying the actual implications of the problem constraints. The resulting Uncertain Constraint Satisfaction Problem approach has been tested and validated through a set of designated experiments. In summary, the main contributions of this thesis are the following:
• A framework for probabilistic graphical modeling used to build the proposed Bayesian Belief Feature Model.
• An extension of the model that enhances visual expressiveness through colour-degree variation, in which the colour varies with the predefined probabilistic weights.
• An enhancement of constraint satisfaction through uncertainty measurement of the parameters' truth assumptions.
• A validation of the developed approach against different experimental settings to determine its functionality and performance.
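As a rough illustration of the kind of quantification such a model enables (a sketch under assumed semantics, not the thesis' actual formulation), one can attach a conditional selection probability to each feature and propagate marginals down the feature tree:

```python
# Sketch of a Bayesian belief feature model. Feature names and
# probabilities are hypothetical. Each feature carries
# P(selected | parent selected); marginals follow by the chain rule.

from dataclasses import dataclass, field

@dataclass
class Feature:
    name: str
    p_given_parent: float          # P(selected | parent selected)
    children: list = field(default_factory=list)

def marginals(feature, p_parent=1.0, out=None):
    """Compute P(selected) for every feature in the tree."""
    if out is None:
        out = {}
    p = p_parent * feature.p_given_parent
    out[feature.name] = p
    for child in feature.children:
        marginals(child, p, out)
    return out

# Toy product line: a root feature with one mandatory and one optional child.
root = Feature("Editor", 1.0, [
    Feature("SyntaxHighlight", 1.0),   # mandatory
    Feature("SpellCheck", 0.6),        # optional, believed 60% likely
])

print(marginals(root))
# {'Editor': 1.0, 'SyntaxHighlight': 1.0, 'SpellCheck': 0.6}
```

A child's marginal here assumes it can only be selected when its parent is, which matches standard feature-tree semantics.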
Abstract:
Variability management is one of the major challenges in software product line adoption, since variability needs to be efficiently managed at the various levels of the software product line development process (e.g., requirements analysis, design, implementation). One of the main challenges within variability management is the handling and effective visualization of large-scale (industry-size) models, which in many projects can reach the order of thousands of variability points, along with the dependency relationships that exist among them. This has raised many concerns regarding the scalability of current variability management tools and techniques and their limited industrial adoption. To address the scalability issues, this work employed a combination of quantitative and qualitative research methods to identify the reasons behind the limited scalability of existing variability management tools and techniques. In addition to producing a comprehensive catalogue of existing tools, the outcome from this stage helped clarify the major limitations of existing tools. Based on the findings, a novel approach to managing variability was created that employs two main principles to support scalability. First, the separation-of-concerns principle is applied by creating multiple views of variability models to alleviate information overload. Second, hyperbolic trees are used to visualize models (compared with the Euclidean-space trees traditionally used). The result is an approach that can represent models encompassing hundreds of variability points and complex relationships. These concepts were demonstrated by implementing them in an existing variability management tool and using it to model a real-life product line with over a thousand variability points. Finally, in order to assess the work, an evaluation framework was designed based on established usability assessment best practices and standards.
The framework was then used with several case studies to benchmark the performance of this work against other existing tools.
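The appeal of hyperbolic over Euclidean layouts can be sketched in a few lines (an illustrative simplification, not the tool's actual algorithm): in the Poincaré disk model, every tree level maps to a ring of Euclidean radius below 1, so arbitrarily deep models fit inside a fixed viewport while levels keep equal hyperbolic spacing.

```python
# Minimal sketch of a hyperbolic (Poincaré disk) tree layout.
# A node at hyperbolic distance d from the centre lands at
# Euclidean radius tanh(d/2), which is always < 1.

import math

def poincare_radius(depth, step=1.0):
    """Euclidean radius in the unit disk for a node at a given tree depth."""
    return math.tanh(depth * step / 2)

def layout_level(depth, count, step=1.0):
    """Spread `count` nodes of one tree level evenly around their ring."""
    r = poincare_radius(depth, step)
    return [(r * math.cos(2 * math.pi * k / count),
             r * math.sin(2 * math.pi * k / count)) for k in range(count)]

# Levels 1, 5 and 50 all fit inside the unit disk:
for d in (1, 5, 50):
    print(d, poincare_radius(d))
```

Deeper levels asymptotically approach the disk boundary instead of needing ever more screen space, which is what makes models with thousands of variability points displayable at once.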
Abstract:
New technologies appear constantly, and their use can bring countless benefits both to those who use them directly and to society as a whole. The State can likewise use information and communication technologies (ICT) to improve the delivery of services to citizens, raise society's quality of life, and optimize public spending by focusing it on the most pressing needs. Accordingly, there is extensive research on Electronic Government (e-Gov) policies and their main effects on citizens and on society as a whole. This research studies the concept of Electronic Government and seeks to understand the process of implementing Free Software in the agencies of the Direct Administration of Rio Grande do Norte. It further analyzes whether this adoption reduces costs for the state treasury, and aims to identify the role of Free Software in the Administration and the foundations of the State's e-Government policy. Through qualitative interviews with IT coordinators and managers in three State Secretariats, it was possible to map the paths the Government has been following to endow the State with technological capacity. Rio Grande do Norte was found to be still immature with regard to e-Government practices and Free Software, with few agencies having concrete, viable initiatives in this area. The State still lacks a strategic definition of the role of IT, as well as greater investment in staff and equipment.
Advances were also observed, such as the creation of a regulatory body, CETIC (State Council of Information and Communication Technology); a Technology Master Plan that provides a needed diagnosis of the state of IT and proposes various goals for the area; a postgraduate course for IT managers; and BrOffice (OpenOffice) training for 1,120 public servants.
Abstract:
Software protection is an essential aspect of information security, needed to withstand malicious activities against software and to preserve software assets. However, software developers still lack a methodology for assessing the protections they deploy. To address this, we present a novel attack-simulation-based software protection assessment method to assess and compare various protection solutions. Our solution relies on Petri nets to specify and visualize attack models, and we developed a Monte Carlo based approach to simulate attack processes and deal with uncertainty. Based on this simulation and estimation, a novel protection comparison model is proposed to compare different protection solutions. Finally, our attack-simulation-based software protection assessment method is presented. We illustrate the method by means of a software protection assessment process, demonstrating that our approach can provide a suitable software protection assessment for developers and software companies.
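A minimal sketch of the Monte Carlo side of such an approach (step probabilities and costs are hypothetical; the thesis' actual Petri-net semantics are richer) estimates the probability that an attack path succeeds within a time budget:

```python
# Illustrative Monte Carlo estimate of attack success. An attack path is
# approximated as a sequence of steps, each with a success probability
# and a time cost; we estimate the chance the attacker completes all
# steps within a time budget.

import random

def simulate_attack(steps, budget, trials=100_000, seed=42):
    """steps: list of (success_probability, time_cost) tuples."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(trials):
        elapsed = 0.0
        for p, cost in steps:
            elapsed += cost
            if elapsed > budget or rng.random() > p:
                break          # out of time, or this step failed
        else:
            successes += 1     # every step succeeded in time
    return successes / trials

# Hypothetical three-step attack against a protected binary.
path = [(0.9, 2.0), (0.5, 5.0), (0.7, 3.0)]
print(simulate_attack(path, budget=12.0))  # ≈ 0.9 * 0.5 * 0.7 = 0.315
```

Comparing this success estimate across protection configurations (which change the step probabilities and costs) is the essence of a protection comparison model.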
Abstract:
It is now clear that the concept of an HPC compiler that automatically produces highly efficient parallel implementations is a pipe dream. Another route is to recognise from the outset that user information is required, and to develop tools that embed user interaction in the transformation of code from scalar to parallel form, then use conventional compilers with a set of communication calls. This is the key idea underlying the development of the CAPTools software environment. The initial version of CAPTools focuses on single-block structured mesh computational mechanics codes. The capability for unstructured mesh codes is now under test, and block-structured meshes will be included next. The parallelisation process can be completed rapidly for modest codes, and the parallel performance approaches that delivered by hand parallelisation.
Abstract:
Wrongdoing in health care is harmful action that jeopardizes patient safety and can be targeted at patients or employees. It can range from illegal, unethical, or unprofessional action to inappropriate behavior in the workplace. Whistleblowing can be considered a process in which wrongdoing in health care is suspected or observed by health care professionals and disclosed to a party that can influence the wrongful action. Whistleblowing causes severe harm to the whistleblower and to the object of the whistleblowing complaint, affecting both their personal lives and the working community. The aim of this study was to analyze the whistleblowing process in Finnish health care. The overall goal is to raise awareness of wrongdoing and whistleblowing in Finnish health care. In this cross-sectional descriptive study, data (n = 397) were collected by probability sampling from health care professionals who are members of Tehy, The Union of Health and Social Care Professionals in Finland. The data were collected with the questionnaire "Whistleblowing – väärinkäytösten paljastaminen terveydenhuollossa", developed for this study, using the Webropol survey software between 26 June and 17 July 2015, and were analyzed statistically. According to the results, health care professionals had suspected (67%) and observed (66%) wrongdoing in health care, most often more than once a month (30%). The most commonly suspected (37%) and observed (36%) form of wrongdoing was inadequate staffing, and the least common was violence toward patients (3%). Wrongdoing was whistle-blown (suspected 29%, observed 40%) primarily inside the organization, to the closest supervisor (76%) and face-to-face (88%). The whistle was most often blown on nurses' wrongdoing (58%). The whistleblowing act did not end the wrongdoing in most cases (52%), and whistleblowing had negative consequences for the whistleblower, such as discrimination by the manager (35%).
Respondents with less than ten years of work experience (62%), in temporary positions (75%), or in management positions (88%) were more unwilling to blow the whistle. Whistleblowing should be conducted internally, to the closest manager, in writing and anonymously. Wrongdoing should be dealt with between the parties involved, and a written warning should follow from wrongdoing. According to the results of this study, whistleblowing on wrongdoing in health care has negative consequences for the whistleblower. In future, attention in health care should be paid to preventing wrongdoing and facilitating whistleblowing, in order to decrease wrongdoing and lessen the consequences whistleblowers face after blowing the whistle.
Abstract:
Developing software is a complex undertaking involving multiple stakeholders. While the intentions of these parties may vary to some extent, the ultimate goal can be seen as a satisfactory product. Lean and agile software development practices strive toward this, placing customer satisfaction among the highest aims of the process. An important aspect of any development process is innovation; without it, nothing progresses and the whole process is unnecessary. As the target-domain expert, the customer is an important contributor to effective innovation. Problems arise, however, when the customer does not actively take part in these activities, and a lack of familiarity with software development can easily cause such issues. Unfortunately, little research has been conducted on product innovation, which makes it difficult to formulate a recommended approach to stimulating the customer and encouraging more active participation. Ultimately, a small set of high-level guidelines for inducing innovation was identified in the available literature. To conclude, this thesis presents the findings made during the development of a small web application and compares them to the aforementioned literature findings. While the guidelines seem to yield promising results, further empirical research is needed to reach more significant conclusions.
Abstract:
For nearly two centuries, gas hydrates have played an important role in process engineering because of their economic and environmental impact on industry. Every day, more companies and engineers take an interest in the topic, as new challenges reveal gas hydrates as a crucial factor and make their study a solution for the near future. Gas hydrates are ice-like structures composed of water molecules hosting gaseous guest compounds. They occur naturally at high pressures and low temperatures, conditions typical of some chemical and petrochemical processes [1]. Based on the doctoral work of Windmeier [2] and the doctoral work of Rock [3], the thermodynamic description of gas hydrate phases was implemented following the state of the art in science and technology. With the help of the Dortmund Data Bank (DDB) and the corresponding software package (DDBSP) [26], the performance of the method was improved and compared against a large body of data published worldwide. The applicability of gas hydrate prediction was also studied from a process engineering perspective, through a case study on the extraction, production, and transport of natural gas. Gas hydrate prediction was found to be crucial in natural gas process design: no hydrate formation occurs in the gas treatment and liquid processing stages, in the dehydration stage a minimum temperature of 290.15 K is critical, and for extraction and transport the use of inhibitors is essential. A mass fraction of 40% ethylene glycol was found appropriate to prevent gas hydrate formation during extraction, and 20% methanol during transport.
Abstract:
The use of ICT occupies an increasingly important place in our schools, marked above all by the evolution of technologies and by the use of many Web 2.0 tools in educational contexts. This is particularly evident in the school subject of Visual and Technological Education (EVT), which is eminently practical and allows the exploration of several digital tools to address the subject's contents and to create graphic and plastic products. With the emergence of Web 2.0 and the availability of thousands of new digital tools to Internet users, there is growing interest in adopting methodologies and strategies that draw on these media and support more effective and motivating learning for students, articulating the traditional media of EVT with the new digital media. In this context, the present study is the result of an action-research project carried out within the Doctoral Program in Multimedia in Education at the University of Aveiro, in which the integration of Web, Web 2.0, and Free Software tools was implemented in the educational context of EVT. Both the traditional techniques most common in the subject and digital tools, supported by free software (and other free-to-use tools), the Web, and Web 2.0, could be used to support the teaching and learning of the subject's various contents and areas of exploration. This study, designed in three cycles, first involved the constitution of an extended community of practice of teachers, with six training classes bringing together a total of 112 teachers who intended to integrate digital tools into EVT. In addition to the research, analysis, selection, and cataloguing of the 430 digital tools surveyed, 371 support manuals for their use were produced, and these resources were made available on the EVTdigital site.
In a second cycle, following the evaluation carried out, the EVTux distribution was created to simplify access to and use of digital tools in the EVT context. Finally, the third cycle stems from the elimination of EVT from the curriculum of the second cycle of basic education and its replacement by two new subjects; a content analysis of the new curricular goals was carried out, and the application As ferramentas digitais do Mundo Visual was produced, designed to contextualize and index the digital tools selected for the new subject of Visual Education. The results of this study clearly point to the possibility of integrating digital tools into Visual and Technological Education (or, at present, Visual Education) to address contents and areas of exploration; to the ease with which communities of practice can be formed (as was the case here) to collaborate in cataloguing these tools in the specific context of the subject; and to the need felt by teachers for information and training to keep them up to date on integrating ICT into the curriculum. The study's limitations are also presented, chiefly the negative impact that the elimination of the subject had on teachers' motivation and their consequent participation in some phases of the work, as well as the difficulty of managing such a large and diverse team of collaborating teachers. Suggestions for future studies are also presented.
Abstract:
With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The growing number of software configurations, input parameters, usage scenarios, supported platforms, external dependencies, and versions plays an important role in driving up the cost of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, software testing and verification have been substantially automated. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing existing software test cases, which can be generated either automatically or manually. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Commonalities between these test case sequence covers are then extracted, processed, analyzed, and presented to developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared by a number of test cases failing for the same reason resemble the faulty execution path; hence, the search space for the faulty execution path can be narrowed by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers.
Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with high likelihood of containing the root cause among the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace back the common subsequences from the end to the root cause. A debugging tool is created to enable developers to use the approach, and integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool suggestions, and their source code counterparts. Finally, a comparison between the developed approach and the state-of-the-art techniques shows that developers need only to inspect a small number of lines in order to find the root cause of the fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both the algorithm running time and the output subsequence length.
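The core idea of intersecting failing-trace covers can be illustrated with a toy sketch (contiguous n-grams stand in for the thesis' sequence covers; all trace names are hypothetical): keep subsequences shared by every failing trace but absent from all passing traces.

```python
# Toy fault-localization sketch: find code subsequences common to all
# failing test traces but present in no passing trace, as candidates
# for the faulty execution path.

def ngrams(trace, n):
    """All contiguous subsequences of length n in a trace."""
    return {tuple(trace[i:i + n]) for i in range(len(trace) - n + 1)}

def suspicious_subsequences(failing, passing, n=2):
    common = set.intersection(*(ngrams(t, n) for t in failing))
    seen_in_passing = set.union(*(ngrams(t, n) for t in passing))
    return common - seen_in_passing

failing = [["open", "read", "parse", "close"],
           ["open", "seek", "read", "parse"]]
passing = [["open", "read", "close"]]

print(suspicious_subsequences(failing, passing))
# {('read', 'parse')}
```

With more failing traces the intersection shrinks, which is exactly the narrowing of the search space that the hypothesis above describes.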
Abstract:
Background – The software effort estimation research area aims to improve the accuracy of effort estimates in software projects and activities. Aims – This study describes the development and use of a web application to collect data generated by the Planning Poker estimation process, and the analysis of the collected data to investigate the impact of revising previous estimates when making similar estimates in a Planning Poker context. Method – Software activities were estimated by computer science students at the Universidade Tecnológica Federal do Paraná (UTFPR) using Planning Poker, with and without revising previous similar activities, and data about the decision-making process was stored. The collected data was used to investigate the impact that reviewing similar completed activities has on the accuracy of software effort estimates. Obtained Results – The UTFPR students were divided into 14 groups. Eight of them showed an accuracy increase in more than half of their estimates, three showed roughly the same accuracy in more than half of their estimates, and only three lost accuracy in more than half of their estimates. Conclusion – Reviewing similar completed software activities when using Planning Poker led to more accurate software estimates in most cases and can therefore improve the software development process.
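One common way to quantify estimation accuracy in such studies (assumed here for illustration; the abstract does not name its metric) is the Magnitude of Relative Error:

```python
# Magnitude of Relative Error (MRE), a standard effort-estimation
# accuracy metric: lower is better.

def mre(actual, estimate):
    """Magnitude of Relative Error: |actual - estimate| / actual."""
    return abs(actual - estimate) / actual

# Hypothetical effort values in hours: actual effort, an estimate made
# without revision, and an estimate made after reviewing a similar
# completed activity.
actual, naive, revised = 10.0, 16.0, 11.0

print(mre(actual, naive))    # 0.6
print(mre(actual, revised))  # 0.1 -> revising improved accuracy
```

An "accuracy increase", as reported for eight of the fourteen groups, would correspond to the revised estimate having the lower MRE.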
Abstract:
Internship report presented to the Escola Superior de Educação of the Instituto Politécnico de Castelo Branco in fulfilment of the requirements for the degree of Master in Pre-School Education and Teaching of the 1st Cycle of Basic Education.