908 results for Security-critical software
Abstract:
This dissertation examined how United States illicit drug control policy, commonly referred to as the "war on drugs," contributes to the reproduction of gendered and racialized social relations. Specifically, it analyzed the identity-producing practices of United States illicit drug control policy as they relate to the construction of U.S. identities. Drawing on the theoretical contributions of feminist postpositivists, three cases of illicit drug policy practice were discussed. In the first case, discourse analysis was employed to examine recent debates (1986-2005) in U.S. Congressional hearings about the proper understanding of the illicit drug "threat." The analysis showed how competing policy positions are tied to differing understandings of proper masculinity and of the role of policymakers as protectors of the national interest. Utilizing critical visual methodologies, the second case examined a public service media campaign circulated by the Office of National Drug Control Policy that tied the "war on drugs" to another security concern in the U.S., the "war on terror." This case demonstrated how the media campaign uses messages about race, masculinity, and femininity to produce privileged notions of state identity and proper citizenship. The third case examined the gendered politics of drug interdiction at the U.S. border. Using qualitative research methodologies, including semi-structured interviews and participant observation, it examined how gender is produced through drug interdiction at border sites such as Miami International Airport. By paying attention to the discourse that circulates about women drug couriers, it showed how gender is normalized in a national security setting. This dissertation found that illicit drug control policy takes the form it does because of the politics of gender and racial identity and that, as a result, illicit drug policy is implicated in the reproduction of gender and racial inequities.
It concluded that a more socially conscious and successful illicit drug policy requires an awareness of the gendered and racialized assumptions that inform and shape policy practices.
Abstract:
Recent advances in electronic and computer technologies have led to the widespread deployment of wireless sensor networks (WSNs). WSNs have a wide range of applications, including military sensing and tracking, environment monitoring, and smart environments. Many WSNs carry out mission-critical tasks, as in military applications, so security issues in WSNs remain at the foreground of research. Compared with other wireless networks, such as ad hoc and cellular networks, security in WSNs is more complicated because of the constrained capabilities of sensor nodes and the properties of their deployment, such as large scale and hostile environments. Security issues mainly stem from attacks. In general, attacks in WSNs can be classified as external or internal. In an external attack, the attacking node is not an authorized participant of the sensor network. Cryptography and other security methods can prevent some external attacks. However, node compromise, the major and distinctive problem leading to internal attacks, undermines all efforts to prevent attacks. Knowing the probability of node compromise helps systems detect and defend against it. Although several approaches can detect and defend against node compromise, few of them can estimate its probability. Hence, we develop basic uniform, basic gradient, intelligent uniform, and intelligent gradient models of node compromise distribution, using probability theory, in order to adapt to different application environments. These models allow systems to estimate the probability of node compromise, and applying them in system security designs can improve security and decrease overhead in nearly every security area.
Moreover, based on these models, we design a novel secure routing algorithm that defends against routing threats from nodes that have already been compromised but have not yet been detected by the node-compromise detection mechanism. The routing paths in our algorithm detour around nodes that have been detected as compromised or that have a high probability of being compromised. Simulation results show that our algorithm effectively protects routing paths from node compromise, whether detected or not.
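The detour idea can be illustrated with a toy sketch (not the dissertation's actual algorithm): treat each node's estimated compromise probability as a routing cost, and pick the path that maximizes the probability that no node on it is compromised. The graph encoding and all names below are illustrative assumptions.

```python
import heapq
import math

def safest_path(graph, compromise_prob, src, dst):
    """Dijkstra variant: find the path maximizing the product of
    (1 - p_v) over its nodes, i.e. the probability that no node on
    the path is compromised. `graph` maps node -> neighbors;
    `compromise_prob` maps node -> estimated probability in [0, 1)."""
    # Cost of entering node v: -log(1 - p_v). A shortest path in this
    # metric is the path with the highest survival probability.
    cost = {v: -math.log(1.0 - compromise_prob[v]) for v in graph}
    dist = {src: cost[src]}
    prev = {}
    heap = [(dist[src], src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v in graph[u]:
            nd = d + cost[v]
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    if dst not in dist:
        return None
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1]
```

For instance, given two routes to a sink where one relay has a 90% estimated compromise probability and the other 10%, the function routes around the riskier relay.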
Abstract:
The tragic events of September 11th ushered in a new era of unprecedented challenges. Our nation must be protected from the alarming threats of adversaries, threats that exploit the nation's critical infrastructures and affect all sectors of the economy. There is a need for pervasive monitoring and decentralized control of the nation's critical infrastructures. The communication needs of monitoring and control of critical infrastructures were traditionally served by wired communication systems. These technologies ensured high reliability and bandwidth, but they are expensive and inflexible and do not support mobility or pervasive monitoring. Their Ethernet-based communication protocols use contention access, which results in frequent unsuccessful transmissions and high delay. An emerging class of wireless networks, embedded wireless sensor and actuator networks, has potential benefits for real-time monitoring and control of critical infrastructures. Using embedded wireless networks for monitoring and control of critical infrastructures requires secure, reliable, and timely exchange of information among controllers, distributed sensors, and actuators. This exchange takes place over shared wireless media, which are highly unpredictable due to path loss, shadow fading, and ambient noise, while monitoring and control applications have stringent requirements on reliability, delay, and security. The primary issue addressed in this dissertation is the impact of wireless media in harsh industrial environments on the reliable and timely delivery of critical data. In the first part of the dissertation, a combined networking and information-theoretic approach was adopted to determine the transmit power required to maintain a minimum wireless channel capacity for reliable data transmission. The second part described a channel-aware scheduling scheme that ensures efficient utilization of the wireless link and guarantees delay.
Various analytical evaluations and simulations are used to validate the feasibility of the methodologies and to demonstrate that the protocols achieve reliable, real-time data delivery in wireless industrial networks.
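The first part's goal, a transmit power that maintains a minimum channel capacity, can be sketched from the Shannon AWGN capacity formula C = B log2(1 + P g / N) by solving for P. The flat-gain channel model and the function name below are simplifying assumptions, not the dissertation's exact derivation.

```python
def required_transmit_power(capacity_bps, bandwidth_hz, noise_w, path_gain):
    """Minimum transmit power (watts) so that the AWGN Shannon capacity
    C = B * log2(1 + P * g / N) meets `capacity_bps`.
    `noise_w` is the total noise power N in the band; `path_gain` is the
    channel power gain g (folding in path loss and a shadowing margin)."""
    snr_needed = 2.0 ** (capacity_bps / bandwidth_hz) - 1.0
    return snr_needed * noise_w / path_gain
```

For example, sustaining 1 Mbit/s over 1 MHz requires an SNR of 1 (0 dB) at the receiver; with 1 nW of in-band noise and a path gain of 1e-6, the required transmit power works out to 1 mW.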
Abstract:
This dissertation established a software-hardware integrated design for a multisite data repository in pediatric epilepsy. A total of 16 institutions formed a consortium for this web-based application. This innovative, fully operational web application allows users to upload and retrieve information through a unique human-computer graphical interface that is remotely accessible to all users of the consortium. A solution based on a Linux platform with MySQL and PHP scripts was selected. Research was conducted to evaluate mechanisms to electronically transfer diverse datasets from different hospitals and to collect the clinical data in concert with the related functional magnetic resonance imaging (fMRI). What was unique in the approach is that all pertinent clinical information about patients is synthesized, with input from clinical experts, into four different forms: Clinical, fMRI scoring, Image information, and Neuropsychological data entry. A first contribution of this dissertation was an integrated processing platform, site- and scanner-independent, to uniformly process the varied fMRI datasets and generate comparative brain activation patterns. The data collection from the consortium complied with IRB requirements and provides all the safeguards for security and confidentiality. An fMRI-based software library was used to perform data processing and statistical analysis to obtain the brain activation maps. The Lateralization Index (LI) of healthy control (HC) subjects was evaluated in contrast to that of localization-related epilepsy (LRE) subjects.
Over 110 activation maps were generated, and their respective LIs were computed, yielding the following groups: (a) strong right lateralization (HC=0%, LRE=18%), (b) right lateralization (HC=2%, LRE=10%), (c) bilateral (HC=20%, LRE=15%), (d) left lateralization (HC=42%, LRE=26%), (e) strong left lateralization (HC=36%, LRE=31%). Moreover, nonlinear multidimensional decision functions were used to seek an optimal separation between typical and atypical brain activations on the basis of demographics as well as the extent and intensity of these activations. The intent was not to seek the highest output measures, given the inherent overlap of the data, but rather to assess which of the many dimensions were critical in the overall assessment of typical and atypical language activations, with the freedom to select any number of dimensions and impose any degree of complexity in the nonlinearity of the decision space.
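A Lateralization Index of this kind is conventionally computed as LI = (L - R) / (L + R) over activated voxel counts in the two hemispheres, ranging from +1 (fully left-lateralized) to -1 (fully right-lateralized). The sketch below uses that standard formula; the classification cutoffs (0.2 and 0.5) are illustrative assumptions, since the thresholds actually used in the study are not stated here.

```python
def lateralization_index(left_voxels, right_voxels):
    """LI = (L - R) / (L + R): +1 fully left, -1 fully right."""
    total = left_voxels + right_voxels
    if total == 0:
        raise ValueError("no activated voxels")
    return (left_voxels - right_voxels) / total

def classify(li, weak=0.2, strong=0.5):
    """Map an LI to one of the five groups in the abstract.
    The cutoffs are illustrative, not the study's actual values."""
    if li >= strong:
        return "strong left"
    if li >= weak:
        return "left"
    if li <= -strong:
        return "strong right"
    if li <= -weak:
        return "right"
    return "bilateral"
```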
Abstract:
Ensuring the correctness of software has been a major motivation in software research, constituting a Grand Challenge. Because of its impact on the final implementation, one critical aspect of software is its architectural design: by guaranteeing a correct architectural design, major and costly flaws can be caught early in the development cycle. Software architecture design has received much attention in recent years, with several methods, techniques, and tools developed. However, there is still more to be done, such as providing adequate formal analysis of software architectures. In this regard, a framework to ensure system dependability from design to implementation has been developed at Florida International University (FIU). This framework is based on SAM (Software Architecture Model), an architecture description language (ADL) that allows hierarchical compositions of components and connectors, defines an architectural modeling language for the behavior of components and connectors, and provides a specification language for behavioral properties. The behavioral model of a SAM model is expressed in the form of Petri nets, and the properties in first-order linear temporal logic. This dissertation presents a formal verification and testing approach to guarantee the correctness of software architectures, expressed in SAM. For the formal verification approach, the technique applied was model checking, and the model checker of choice was Spin. As part of the approach, a SAM model is formally translated into a model in the input language of Spin and verified for correctness with respect to temporal properties. In terms of testing, a testing approach for SAM architectures was defined that includes the evaluation of test cases, based on Petri net testing theory, to be used in the testing process at the design level.
Additionally, the information at the design level is used to derive test cases for the implementation level. Finally, a modeling and analysis tool (the SAM tool) was implemented to support the design and analysis of SAM models. The results show the applicability of the approach to the testing and verification of SAM models with the aid of the SAM tool.
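The essence of the model-checking step, exhaustively exploring a Petri net's state space for property violations, can be shown in miniature. The toy breadth-first exploration below enumerates a net's reachable markings and checks a safety predicate; it stands in for Spin only conceptually, and the encoding of transitions as pre/post token maps is an illustrative assumption.

```python
from collections import deque

def check_safety(places, transitions, initial, bad):
    """Exhaustively explore the reachable markings of a Petri net and
    report whether any marking satisfies the predicate `bad`.
    `transitions` is a list of (pre, post) dicts giving the tokens
    consumed and produced per place. Returns (safe, counterexample)."""
    idx = {p: i for i, p in enumerate(places)}
    start = tuple(initial.get(p, 0) for p in places)
    seen = {start}
    queue = deque([start])
    while queue:
        m = queue.popleft()
        if bad(dict(zip(places, m))):
            return False, dict(zip(places, m))
        for pre, post in transitions:
            # A transition fires only if every pre-place has enough tokens.
            if all(m[idx[p]] >= n for p, n in pre.items()):
                nm = list(m)
                for p, n in pre.items():
                    nm[idx[p]] -= n
                for p, n in post.items():
                    nm[idx[p]] += n
                nm = tuple(nm)
                if nm not in seen:
                    seen.add(nm)
                    queue.append(nm)
    return True, None
```

As a usage example, a two-process mutual-exclusion net with a single lock token can be verified safe: no reachable marking puts both processes in their critical sections.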
Abstract:
Small states that lack capacity and act on their own may fall victim to international and domestic terrorism, transnational organized crime, or criminal gangs. The critical issue is not whether small Caribbean states should cooperate in meeting security challenges, but rather in what manner, and by which mechanisms, they can overcome obstacles to cooperation. The remit of the Regional Security System (RSS) has expanded dramatically, but its capabilities have improved very slowly. The member governments of the RSS are reluctant to develop military capacity beyond current levels, since they see economic and social development and disaster relief as priorities requiring little investment in military hardware. The RSS depends on international donors such as the USA, Canada, Great Britain, and increasingly China to fund training programs, maintain equipment, and acquire materiel. In the view of most analysts, an expanded regional arrangement based on an RSS nucleus is not likely in the foreseeable future. Regional political consensus remains elusive, and the predominance of national interests over regional considerations continues to obstruct any CARICOM-wide regional defense mechanism. Countries in the Caribbean, including the members of the RSS, have to become more responsible for their own security from their own resources. While the larger CARICOM economies can do this, it would be difficult for most OECS members of the RSS to do the same. The CARICOM region, including the RSS member countries, has undertaken direct regional initiatives in security collaboration. Implementation of the recommendations of the Regional Task Force on Crime and Security (RTFCS), and the structure and mechanisms created for the staging of the Cricket World Cup (CWC 2007), resulted in unprecedented levels of cooperation and permanent legacy institutions for the regional security toolbox.
The most important tier of security relationships for the region is with the United States, and particularly USSOUTHCOM. The Caribbean Basin Security Initiative (CBSI), in which the countries of the RSS participate, is a useful U.S.-sponsored tool to strengthen the capabilities of the Caribbean countries and promote regional ownership of security initiatives. Future developments under discussion by policymakers in the Caribbean security environment include the granting of law enforcement authority to the military, the formation of a single OECS police force, and the creation of a single judicial and law enforcement space. The RSS must continue to work with its CARICOM partners, as well as with the traditional “Atlantic Powers,” particularly Canada, the United States, and the United Kingdom, to implement a general framework for regional security collaboration. Regional security cooperation should embrace wider traditional and non-traditional elements of security appropriate to the 21st century, and it must make maximum use of the best institutions, mechanisms, techniques, and procedures already available in the region. The objective should not be the creation of new agencies but rather the generation of new resources to take effective operations to higher cumulative levels. Security and non-security tools should be combined for both strategic and operational purposes. The regional, hemispheric, and global implications of tactical and operational actions must be understood and appreciated by the forces of the RSS member states. The structure and mechanisms created for the staging of Cricket World Cup 2007 should remain as legacy institutions and a toolbox for improving regional security cooperation in the Caribbean. RSS collaboration should build on operational-level synergies with traditional military partners.
In this context, the United States must be a true partner with shared interests, and with the ability to work unobtrusively in a nationalistic environment. Withdrawal of U.S. support for the RSS is not an option.
Abstract:
This study of China’s relations with Brazil and Argentina, and of their implications for U.S. concerns, examines two main questions: Why might China’s increasing influence on Brazil and Argentina be considered a cause for U.S. security concern? And, if so, how have China’s strategic alliances with the two countries affected U.S. leadership? In an effort to look at China’s influence from multiple angles, beyond its visible economic influence in these two countries, this paper argues that China’s interest in the Latin American region, with a focus on Brazil and Argentina, responds to a carefully crafted, pragmatic, and tailored vision with long-term strategic and political goals. The results of this study reveal that China, by avoiding intra-regional competition through a strategic diversification of sectors, has been able to secure critical resources for its population as well as promote enduring alliances in the region that could represent a plausible cause of concern for U.S. interests. In this regard, China’s avoidance of a direct challenge to traditional partners’ influence has responded to the gaps left by a gradual but steady decline in U.S. involvement.
Abstract:
As researchers and practitioners move toward a vision of software systems that configure, optimize, protect, and heal themselves, they must also consider the implications of such self-management activities for software reliability. Autonomic computing (AC) describes a new generation of software systems characterized by dynamically adaptive self-management features. During dynamic adaptation, autonomic systems modify their own structure and/or behavior in response to environmental changes. Adaptation can result in new system configurations and capabilities, which need to be validated at runtime to prevent costly system failures. However, although the pioneers of AC recognize that validating autonomic systems is critical to the success of the paradigm, the architectural blueprint for AC does not provide a workflow or supporting design models for runtime testing. This dissertation presents a novel approach for seamlessly integrating runtime testing into autonomic software. The approach introduces an implicit self-test feature into autonomic software by tailoring the existing self-management infrastructure to runtime testing. Autonomic self-testing facilitates activities such as test execution, code coverage analysis, timed test performance, and post-test evaluation. In addition, the approach is supported by automated testing tools and a detailed design methodology. A case study that incorporates self-testing into three autonomic applications is also presented. The findings of the study reveal that autonomic self-testing provides a flexible approach for building safe, reliable autonomic software while limiting development and performance overhead through software reuse.
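The implicit self-test idea, reusing the self-management loop to validate a new configuration at runtime, might be sketched as follows. The class and method names are hypothetical, not the dissertation's framework API; the point is only the validate-then-commit-or-roll-back shape of the adaptation step.

```python
class SelfTestingComponent:
    """Toy sketch of implicit runtime self-testing: after each dynamic
    adaptation, run the registered runtime tests against the new
    configuration and roll back if any test fails. All names here are
    illustrative assumptions, not an actual autonomic framework API."""

    def __init__(self, config):
        self.config = config
        self._tests = []

    def add_self_test(self, test):
        # `test` takes a config dict and returns True on pass.
        self._tests.append(test)

    def adapt(self, new_config):
        """Apply an adaptation, validate it at runtime, commit or revert."""
        old = self.config
        self.config = new_config
        if all(t(self.config) for t in self._tests):
            return True          # adaptation validated at runtime
        self.config = old        # validation failed: roll back
        return False
```

A component configured this way never ends up running an invalid configuration: a failing runtime test reverts the adaptation before it takes effect.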
Abstract:
Recent studies on the economic status of women in Miami-Dade County (MDC) reveal an alarming rate of economic insecurity and significant obstacles for women seeking to achieve economic security. Persistent barriers to women's economic security affect not only the health and wellbeing of women and their families but also the economic prospects of the community. A key study reveals that in Miami-Dade County, "Thirty-nine percent of single female-headed families with at least one child are living at or below the federal poverty level" and "over half of working women do not earn adequate income to cover their basic necessities" (Brion 2009, 1). Moreover, conventional measures of poverty do not adequately capture women's struggles to support themselves and their families, nor do they document the numbers of women seeking basic self-sufficiency. Beyond the critical problem of a lack of accurate data on women in the county, there is also a dearth of social science research on existing efforts to enhance women's economic security in Miami-Dade County. My research contributes to closing the information gap by examining the characteristics and strategies of women-led community development organizations (CDOs) in MDC working to address women's economic insecurity. The research is informed by a framework developed by Marilyn Gittell, who pioneered an approach to studying women-led CDOs in the United States. On the basis of research in nine U.S. cities, she concluded that women-led groups increased community participation and that "by creating community networks and civic action, they represent a model for community development efforts" (Gittell, et al. 2000, 123). My study documents the strategies and networks of women-led CDOs in MDC that prioritize women's economic security. Their strategies are especially important in times of economic recession and government reductions in funding for social services.
The focus of the research is women-led CDOs that work to improve social services access, economic opportunity, civic participation and capacity, and women's rights. Although many women-led CDOs prioritize building social infrastructures that promote change, inequalities in economic and political status for women without economic security remain a challenge (Young 2004). My research supports previous studies by Gittell, et al., finding that women-led CDOs in Miami-Dade County have key characteristics of a model of community development efforts that use networking and collaboration to strengthen their broad, integrated approach. The resulting community partnerships, coupled with participation by constituents in the development process, build a foundation to influence policy decisions for social change. In addition, my findings show that women-led CDOs in Miami-Dade County have a major focus on alleviating poverty and economic insecurity, particularly that of women. Finally, it was found that a majority of the five organizations network transnationally, using lessons learned to inform their work of expanding the agency of their constituents and placing the economic empowerment of women as central in the process of family and community development.
Abstract:
Japan is an important ally of the United States, the world's third-largest economy, and one of the regional great powers in Asia. Making sense of Japan's foreign and security policies is crucial for the future of peace and stability in Northeast Asia, where possible sources of conflict, such as territorial disputes and disputes over Japan's war-legacy issues, are observed. This dissertation explored Japan's foreign and security policies through Japan's identities and unconscious ideologies. It employed an analysis of selected Japanese films from the late 1940s to the late 1950s, as well as from the late 1990s to the mid-2000s. The analysis demonstrated that Japan's foreign and security policies can be understood in terms of a broader social narrative visible in Japanese popular cultural products, including film and literature. Narratives of Japanese families told from the patriarch's point of view, for example, have constantly shaped Japan's foreign and security policies. As a result, the world is ordered hierarchically in the eyes of the Japan Self. In the 1950s, Japan tenaciously constructed close but asymmetrical security relations with the U.S., in which Japan willingly subjugated itself to the U.S. In the 2000s, Japan again constructed close relations with the U.S. by doing its best to support American responses to the 9/11 terrorist attacks, mobilizing Japan's Self-Defense Forces in a way it never had before. The concepts of identity and unconscious ideology are helpful in understanding how Japan's understanding of itself, of others, and of the world has shaped its behavior. These concepts also enable Japan to reevaluate its own behavior reflexively, a departure from existing alternative approaches. This study provided a critical analytical explanation of the dynamics at work in Japan's sense of identity, particularly with regard to its foreign and security policies.
Abstract:
The future power grid will effectively utilize renewable energy resources and distributed generation to respond to energy demand while incorporating information technology and communication infrastructure for optimum operation. This dissertation contributes to the development of real-time techniques for wide-area monitoring and for secure real-time control and operation of hybrid power systems. To handle the increased level of real-time data exchange, this dissertation develops a supervisory control and data acquisition (SCADA) system equipped with a state estimation scheme driven by the real-time data. This system is verified on a specially developed laboratory-based test bed facility, a hardware and software platform that emulates the scenarios of a real hybrid power system with the highest achievable similarity to practical utility systems. It includes phasor measurements at hundreds of measurement points on the system, obtained from a specially developed laboratory-based Phasor Measurement Unit (PMU) used in conjunction with existing commercial PMUs on the interconnected system. The studies included a new technique for detecting partially islanded microgrids, as well as several real-time techniques for synchronization and parameter identification of hybrid systems. Moreover, given the numerous integrations of renewable energy resources through DC microgrids, this dissertation examines several practical cases for improving the interoperability of such systems.
Furthermore, the increasing number of small, dispersed generating stations and their need to connect quickly and properly to AC grids motivated this work to explore the challenges that arise in synchronizing generators to the grid, and to introduce a Dynamic Brake system that improves the process of connecting distributed generators to the power grid. Real-time operation and control require secure data communication. A research effort in this dissertation developed a Trusted Sensing Base (TSB) process for data communication security. The innovative TSB approach improves the security of the power grid as a cyber-physical system; it builds on available GPS synchronization technology and provides protection against confidentiality attacks in critical power system infrastructures.
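State estimation in a SCADA system is classically formulated as weighted least squares over a linear measurement model z = Hx + noise, with estimate x = (H'WH)^-1 H'Wz, where W holds the inverse measurement variances. The sketch below solves that formulation directly; the dissertation's actual estimator and measurement set are not specified here, so this is a generic illustration, not its implementation.

```python
def wls_estimate(H, z, w):
    """Weighted least-squares state estimate x = (H'WH)^-1 H'Wz for a
    linear measurement model z = Hx + noise (the classic SCADA
    state-estimation formulation). Pure Python, no external libraries;
    `w` holds the inverse variances of the measurements."""
    m, n = len(H), len(H[0])
    # Normal equations A x = b with A = H'WH and b = H'Wz.
    A = [[sum(w[k] * H[k][i] * H[k][j] for k in range(m))
          for j in range(n)] for i in range(n)]
    b = [sum(w[k] * H[k][i] * z[k] for k in range(m)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x
```

With redundant PMU measurements (more rows in H than states), the estimator reconciles noisy readings into a single consistent state vector, which is exactly why PMU density matters for wide-area monitoring.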
Resumo:
A manutenção e evolução de sistemas de software tornou-se uma tarefa bastante crítica ao longo dos últimos anos devido à diversidade e alta demanda de funcionalidades, dispositivos e usuários. Entender e analisar como novas mudanças impactam os atributos de qualidade da arquitetura de tais sistemas é um pré-requisito essencial para evitar a deterioração de sua qualidade durante sua evolução. Esta tese propõe uma abordagem automatizada para a análise de variação do atributo de qualidade de desempenho em termos de tempo de execução (tempo de resposta). Ela é implementada por um framework que adota técnicas de análise dinâmica e mineração de repositório de software para fornecer uma forma automatizada de revelar fontes potenciais – commits e issues – de variação de desempenho em cenários durante a evolução de sistemas de software. A abordagem define quatro fases: (i) preparação – escolher os cenários e preparar os releases alvos; (ii) análise dinâmica – determinar o desempenho de cenários e métodos calculando seus tempos de execução; (iii) análise de variação – processar e comparar os resultados da análise dinâmica para releases diferentes; e (iv) mineração de repositório – identificar issues e commits associados com a variação de desempenho detectada. Estudos empíricos foram realizados para avaliar a abordagem de diferentes perspectivas. Um estudo exploratório analisou a viabilidade de se aplicar a abordagem em sistemas de diferentes domínios para identificar automaticamente elementos de código fonte com variação de desempenho e as mudanças que afetaram tais elementos durante uma evolução. Esse estudo analisou três sistemas: (i) SIGAA – um sistema web para gerência acadêmica; (ii) ArgoUML – uma ferramenta de modelagem UML; e (iii) Netty – um framework para aplicações de rede. Outro estudo realizou uma análise evolucionária ao aplicar a abordagem em múltiplos releases do Netty, e dos frameworks web Wicket e Jetty. 
In that study, 21 releases were analyzed (seven from each system), totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket, and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a performance regression model was developed to indicate which properties of commits are most likely to cause performance degradation. Overall, 997 commits were mined: 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the Receiver Operating Characteristic (ROC) curve of the regression model is 60%, which means that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
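The ROC-area claim above can be read probabilistically: AUC is the probability that a randomly chosen degrading commit is ranked above a randomly chosen non-degrading one, so an AUC of 0.60 sits 0.10 above the 0.50 random baseline. A minimal rank-based computation, using fabricated toy scores rather than the thesis data, might look like this:

```python
# Rank-based ROC AUC: the fraction of (positive, negative) pairs that the
# model's scores rank correctly; ties count half.
def roc_auc(labels, scores):
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0, 0, 0]              # 1 = performance-degrading commit
scores = [0.9, 0.4, 0.3, 0.8, 0.2, 0.1, 0.5, 0.45]  # toy model scores
auc = roc_auc(labels, scores)
print(round(auc, 2), round(auc - 0.5, 2))  # AUC and improvement over random
```

With these toy scores the AUC happens to be 0.6, matching the magnitude reported in the abstract: a modest but measurable edge over guessing.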