942 results for Nonlinear static analysis


Relevance: 90.00%

Summary:

Information security devices must preserve security properties even in the presence of faults. This in turn requires a rigorous evaluation of the system behaviours resulting from component failures, especially how such failures affect information flow. We introduce a compositional method of static analysis for fail-secure behaviour. Our method uses reachability matrices to identify potentially undesirable information flows based on the fault modes of the system's components.
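The reachability-matrix idea above can be sketched with a transitive closure over a boolean direct-flow matrix. The component names, flows, and fault model below are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch: reachability-matrix check for undesirable
# information flows under a component fault mode.

def reachability(direct):
    """Transitive closure (Warshall's algorithm) of a direct-flow boolean matrix."""
    n = len(direct)
    reach = [row[:] for row in direct]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    return reach

# Components: 0 = secret store, 1 = filter, 2 = public output.
# Normally the filter blocks flow from the secret store to the output.
normal = [
    [False, True,  False],   # secret -> filter only
    [False, False, False],   # filter blocks onward flow
    [False, False, False],
]
# Fault mode: the filter fails open and passes data through.
faulty = [row[:] for row in normal]
faulty[1][2] = True

assert not reachability(normal)[0][2]   # no secret-to-public flow
assert reachability(faulty)[0][2]       # fault creates an undesirable flow
```

Composing such matrices per fault mode lets a tool enumerate which component failures open new source-to-sink paths.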

Relevance: 90.00%

Summary:

This research focuses on the design and verification of inter-organizational controls. Instead of looking at a documentary procedure, which is the flow of documents and data among the parties, the research examines the underlying deontic purpose of the procedure, the so-called deontic process, and identifies control requirements to secure this purpose. The vision of the research is a formal theory for streamlining bureaucracy in business and government procedures. Underpinning most inter-organizational procedures are deontic relations, which concern the rights and obligations of the parties. When all parties trust each other, they are willing to fulfill their obligations and honor the counterparties' rights; thus controls may not be needed. The challenge lies in cases where trust may not be assumed. In these cases, the parties need to rely on explicit controls to reduce their exposure to the risk of opportunism. However, at present there is no analytic approach or technique to determine which controls are needed for a given contracting or governance situation. The research proposes a formal method for deriving inter-organizational control requirements based on static analysis of deontic relations and dynamic analysis of deontic changes. The formal method takes a deontic process model of an inter-organizational transaction and certain domain knowledge as inputs to automatically generate control requirements that a documentary procedure needs to satisfy in order to limit fraud potentials. The deliverables of the research include a formal representation, Deontic Petri Nets, that combines multiple modal logics and Petri nets for modeling deontic processes; a set of control principles that represent an initial formal theory on the relationships between deontic processes and documentary procedures; and a working prototype that uses model checking to identify fraud potentials in a deontic process and generate control requirements to limit them.
Fourteen scenarios of two well-known international payment procedures (cash in advance and documentary credit) have been used to test the prototype. The results showed that all control requirements stipulated in these procedures could be derived automatically.
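The Petri-net side of a deontic process can be sketched minimally as marked places and transitions, with an obligation modeled as a token that firing a transition discharges. The places, transitions, and the "obligation" reading below are illustrative only; the thesis's Deontic Petri Nets additionally combine Petri nets with modal logics:

```python
# Minimal Petri-net sketch of one deontic process step (illustrative).

def enabled(marking, pre):
    """A transition is enabled when each input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume input tokens, produce output tokens."""
    assert enabled(marking, pre)
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Place "obliged_to_pay" marks the buyer's outstanding obligation;
# firing "pay" consumes it and produces "paid" (the right honored).
m0 = {"obliged_to_pay": 1}
pay_pre, pay_post = {"obliged_to_pay": 1}, {"paid": 1}

m1 = fire(m0, pay_pre, pay_post)
assert m1 == {"obliged_to_pay": 0, "paid": 1}
assert not enabled(m1, pay_pre)   # obligation cannot be discharged twice
```

A model checker exploring the reachable markings of such a net could then flag states where an obligation is never discharged, i.e., a fraud potential.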

Relevance: 90.00%

Summary:

The maintenance and evolution of software systems has become a highly critical task over recent years due to the diversity and high demand of features, devices, and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite for avoiding quality deterioration during their evolution. This thesis proposes an automated approach for analyzing variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during software system evolution. The approach defines four phases: (i) preparation, choosing the scenarios and preparing the target releases; (ii) dynamic analysis, determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis, processing and comparing the dynamic analysis results across releases; and (iv) repository mining, identifying issues and commits associated with the detected performance variation. Empirical studies were conducted to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modeling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty.
In that study, 21 releases (seven of each system) were analyzed, totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket, and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a regression model for performance was developed to indicate the properties of commits that are most likely to cause performance degradation. Overall, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the Receiver Operating Characteristic (ROC) curve of the regression model is 60%, which means that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
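The variation-analysis phase, comparing a scenario's execution times across two releases and flagging significant change, can be sketched as follows. The simple relative-change threshold below is an assumption for illustration; the thesis may use a different statistical criterion:

```python
# Hedged sketch of a variation-analysis step between two releases.
from statistics import mean

def performance_variation(times_old, times_new, threshold=0.10):
    """Relative change in mean execution time; flags change beyond threshold."""
    old, new = mean(times_old), mean(times_new)
    change = (new - old) / old
    return change, abs(change) > threshold

# Execution times (ms) of one scenario in releases r1 and r2 (made-up data).
r1 = [102, 98, 101, 99]
r2 = [131, 128, 133, 130]

change, significant = performance_variation(r1, r2)
assert significant and change > 0.25   # scenario degraded between releases
```

The repository-mining phase would then look up the commits and issues landed between the two releases that touched the degraded scenario's methods.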

Relevance: 90.00%

Summary:

Modern software applications are becoming more dependent on database management systems (DBMSs). DBMSs are usually used as black boxes by software developers. For example, Object-Relational Mapping (ORM) is one of the most popular database abstraction approaches that developers use nowadays. Using ORM, objects in Object-Oriented languages are mapped to records in the database, and object manipulations are automatically translated to SQL queries. As a result of such conceptual abstraction, developers do not need deep knowledge of databases; however, all too often this abstraction leads to inefficient and incorrect database access code. Thus, this thesis proposes a series of approaches to improve the performance of database-centric software applications that are implemented using ORM. Our approaches focus on troubleshooting and detecting inefficient database accesses (i.e., performance problems) in the source code, ranking the detected problems by severity. We first conduct an empirical study on the maintenance of ORM code in both open source and industrial applications. We find that ORM performance-related configurations are rarely tuned in practice, and there is a need for tools that can help improve/tune the performance of ORM-based applications. Thus, we propose approaches along two dimensions to help developers improve the performance of ORM-based applications: 1) helping developers write more performant ORM code; and 2) helping developers tune ORM configurations. To provide tooling support to developers, we first propose static analysis approaches to detect performance anti-patterns in the source code. We automatically rank the detected anti-pattern instances according to their performance impacts. Our study finds that by resolving the detected anti-patterns, application performance can be improved by 34% on average.
We then discuss our experience and lessons learned when integrating our anti-pattern detection tool into industrial practice. We hope our experience can help improve the industrial adoption of future research tools. However, as static analysis approaches are prone to false positives and lack runtime information, we also propose dynamic analysis approaches to further help developers improve the performance of their database access code. We propose automated approaches to detect redundant data access anti-patterns in the database access code, and our study finds that resolving such redundant data access anti-patterns can improve application performance by an average of 17%. Finally, we propose an automated approach to tune performance-related ORM configurations using both static and dynamic analysis. Our study shows that our approach can help improve application throughput by 27--138%. Through our case studies on real-world applications, we show that all of our proposed approaches can provide valuable support to developers and help improve application performance significantly.
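One redundant-data-access shape the dynamic side of such tooling can catch is the same parameterized query issued repeatedly within a single request (the classic N+1 pattern). The query log and detector below are illustrative assumptions, not the thesis's actual tooling:

```python
# Illustrative sketch: flag queries executed repeatedly in one request,
# where the result could have been cached or fetched in a batch.
from collections import Counter

def redundant_queries(query_log, min_repeats=2):
    """Return queries executed at least `min_repeats` times, with counts."""
    counts = Counter(query_log)
    return {q: n for q, n in counts.items() if n >= min_repeats}

# Query log of one request: an ORM loop re-fetches each user's group
# with the same parameterized statement (N+1 shape).
log = [
    "SELECT * FROM users",
    "SELECT * FROM groups WHERE id = ?",
    "SELECT * FROM groups WHERE id = ?",
    "SELECT * FROM groups WHERE id = ?",
]

flagged = redundant_queries(log)
assert flagged == {"SELECT * FROM groups WHERE id = ?": 3}
```

Ranking flagged patterns by how much request time the repeated queries consume would give the severity ordering the abstract describes.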

Relevance: 90.00%

Summary:

Android is becoming ubiquitous and currently has the largest share of the mobile OS market, with billions of application downloads from the official app market. It has also become the platform most targeted by mobile malware, which is becoming more sophisticated at evading state-of-the-art detection approaches. Many Android malware families employ obfuscation techniques in order to avoid detection, which may defeat static-analysis-based approaches. Dynamic analysis, on the other hand, may be used to overcome this limitation. Hence, in this paper we propose DynaLog, a dynamic-analysis-based framework for characterizing Android applications. The framework provides the capability to analyse the behaviour of applications based on an extensive number of dynamic features. It provides an automated platform for mass analysis and characterization of apps that is useful for quickly identifying and isolating malicious applications. The DynaLog framework leverages existing open source tools to extract and log high-level behaviours, API calls, and critical events that can be used to explore the characteristics of an application, thus providing an extensible dynamic analysis platform for detecting Android malware. DynaLog is evaluated using real malware samples and clean applications, demonstrating its capabilities for effective analysis and detection of malicious applications.

Relevance: 90.00%

Summary:

Thesis (Ph.D., Computing), Queen's University, 2016-09-30.

Relevance: 80.00%

Summary:

We consider a nonlinear system and show the surprising result that, even for high dissipation, the mean energy of a particle can attain higher values than when there is no dissipation in the system. We reconsider the time-dependent annular billiard in the presence of inelastic collisions with the boundaries. For some magnitudes of dissipation, we observe the phenomenon of boundary crisis, which drives the particles to an asymptotic attractive fixed point located at an energy that is higher than the mean energy of the nondissipative case, and much higher than the mean energy just before the crisis. We should emphasize that the unexpected results presented here reveal the importance of a nonlinear dynamics analysis in explaining the paradoxical strategy of introducing dissipation into the system in order to gain energy.

Relevance: 80.00%

Summary:

The main objective of this work is to present an alternative boundary element method (BEM) formulation for the static analysis of three-dimensional non-homogeneous isotropic solids. These problems can be solved using the classical boundary element formulation, analyzing each subregion separately and then joining them together by introducing equilibrium and displacements compatibility. Establishing relations between the displacement fundamental solutions of the different domains, the alternative technique proposed in this paper allows analyzing all the domains as one unique solid, not requiring equilibrium or compatibility equations. This formulation also leads to a smaller system of equations when compared to the usual subregion technique, and the results obtained are even more accurate. (C) 2008 Elsevier Ltd. All rights reserved.

Relevance: 80.00%

Summary:

Almeida E. S. de, Haddad E. A. and Hewings G. J. D. Transport-regional equity issue revisited, Regional Studies. The objective of this paper is to analyse the relationship between transport and regional equity in Minas Gerais, Brazil. Furthermore, the existence of a trade-off between economic performance and regional equity is investigated as well. To do so, the paper develops a spatial computable general equilibrium model based on Brocker and Schneider's 2002 approach to implement comparative static analysis, explicitly incorporating iceberg transportation costs. Four activities are modelled, namely production, final demand, transportation and exports. Two production factors are assumed: labour and other factors. The model has 12 domestic regions and three external regions. Four counterfactual experiments are developed based on decreases in transportation costs due to a 'distance shortening'. The main findings indicate that if the transport infrastructure improvement is focused only among poor regions, the promotion of regional equity is insignificant. If the transport infrastructure improvement links are concentrated among rich regions, there is an increase in regional income inequalities. However, if the improvements are targeted at the roads linking poor regions and rich ones, there is greater promotion of regional equity. The same result will occur when improvements are made to all road links of the state.

Relevance: 80.00%

Summary:

The conventional convection-dispersion model is widely used to interrelate hepatic availability (F) and clearance (Cl) with the morphology and physiology of the liver and to predict effects such as changes in liver blood flow on F and Cl. The extension of this model to include nonlinear kinetics and zonal heterogeneity of the liver is not straightforward and requires numerical solution of a partial differential equation, which is not available in standard nonlinear regression analysis software. In this paper, we describe an alternative compartmental model representation of hepatic disposition (including elimination). The model allows the use of standard software for data analysis and accurately describes the outflow concentration-time profile for a vascular marker after bolus injection into the liver. In an evaluation of a number of different compartmental models, the most accurate model required eight vascular compartments, two of them with back mixing. In addition, the model includes two adjacent secondary vascular compartments to describe the tail section of the concentration-time profile for a reference marker. The model has the added flexibility of being easy to modify to model various enzyme distributions and nonlinear elimination. Model predictions of F, MTT, CV2, and the concentration-time profile, as well as parameter estimates for experimental data of an eliminated solute (palmitate), are comparable to those for the extended convection-dispersion model.
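The basic building block, a chain of well-mixed vascular compartments through which a bolus transits with first-order transfer, can be sketched with simple Euler integration. The eight-compartment chain and rate constant below are illustrative assumptions; the paper's model additionally includes back mixing and two secondary compartments:

```python
# Minimal sketch of a catenary compartmental model for hepatic transit.

def simulate_chain(n=8, k=2.0, dt=0.001, t_end=10.0):
    """Forward-Euler simulation; returns (times, outflow rate from the last compartment)."""
    amounts = [1.0] + [0.0] * (n - 1)   # unit bolus into compartment 1
    times, outflow = [], []
    t = 0.0
    while t < t_end:
        flux = [k * a for a in amounts]          # first-order transfer out of each compartment
        for i in range(n):
            inflow = flux[i - 1] if i > 0 else 0.0
            amounts[i] += dt * (inflow - flux[i])
        times.append(t)
        outflow.append(flux[-1])
        t += dt
    return times, outflow

times, out = simulate_chain()
# Nearly all of the bolus should have left the system by t_end...
assert sum(o * 0.001 for o in out) > 0.99
# ...and the outflow profile should peak after a transit delay.
peak = max(range(len(out)), key=out.__getitem__)
assert times[peak] > 0.5
```

With n identical compartments the outflow is an Erlang-shaped curve with mean transit time n/k; fitting n and k to a vascular marker's outflow profile is what makes the representation usable in standard regression software.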

Relevance: 80.00%

Summary:

Given the lack of consensus regarding the existence of a trade-off between inflation and unemployment (the Phillips curve), this dissertation analyzes the evolution of this relationship in the Brazilian economy over the period 1980-2010 through two different analyses. The first is a static analysis, performed using a simple linear regression. The second is a dynamic analysis, using a regression with time-varying coefficients, estimated by applying the Kalman filter. The econometric results showed that the relationship between inflation and unemployment did in fact change over the period analyzed: the Phillips curve becomes horizontal after the Plano Real and turns slightly positive after the inflation targeting regime. This work is thus essentially divided into two parts. The first provides a theoretical contextualization of the relationship between inflation and unemployment and of the inflation targeting regime. The second part presents the econometric analysis, describing the evolution of the trade-off. Based on the results found, their possible causes are presented and a qualitative analysis of the current monetary policy practiced by the Central Bank of Brazil is carried out.
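A time-varying-coefficient regression estimated with the Kalman filter can be sketched as a scalar filter on a random-walk slope. The model form, noise variances, and data below are synthetic assumptions for illustration:

```python
# Hedged sketch: inflation_t = beta_t * unemployment_t + noise,
# with beta_t following a random walk, filtered step by step.

def kalman_tv_slope(y, x, q=0.01, r=0.25):
    """Filtered estimates of a random-walk slope beta_t (scalar Kalman filter)."""
    beta, p = 0.0, 1.0          # initial state mean and variance
    betas = []
    for yt, xt in zip(y, x):
        p += q                                 # predict: random-walk drift
        k = p * xt / (xt * xt * p + r)         # Kalman gain
        beta += k * (yt - xt * beta)           # update with the observation residual
        p *= (1.0 - k * xt)
        betas.append(beta)
    return betas

# Synthetic series whose true slope flattens halfway through,
# mimicking a Phillips curve going horizontal.
x = [5.0 + 0.1 * t for t in range(40)]
y = [-0.8 * xi for xi in x[:20]] + [0.0 for _ in x[20:]]

betas = kalman_tv_slope(y, x)
assert betas[19] < -0.5        # strong negative trade-off early on
assert abs(betas[-1]) < 0.2    # relationship close to flat at the end
```

Plotting the filtered betas over time is what reveals the regime shift that a single static regression averages away.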

Relevance: 80.00%

Summary:

Analytical methods based on evaluation models of interactive systems were proposed as an alternative to user testing in the late stages of software development, due to its cost. However, the use of isolated behavioral models of the system limits the results of the analytical methods; for example, they are unable to identify implementation issues that will impact usability. With the introduction of model-based testing, we are able to test whether the implemented software meets the specified model. This paper presents a model-based approach for test case generation from the static analysis of source code.

Relevance: 80.00%

Summary:

This paper reports on the development of specific slicing techniques for functional programs and their use for the identification of possible coherent components in monolithic code. An associated tool is also introduced. This piece of research is part of a broader project on program understanding and re-engineering of legacy code supported by formal methods.
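The core of backward slicing can be sketched on straight-line assignments: keep only the statements that the slicing criterion's variable transitively depends on. This toy data-dependence version is an illustrative assumption; slicing functional programs, as in the paper, is considerably richer:

```python
# Toy backward slice over straight-line code, represented as
# (assigned_variable, set_of_variables_used) pairs.

def backward_slice(stmts, criterion):
    """Return the statements that the criterion variable depends on, in order."""
    needed = {criterion}
    kept = []
    for target, uses in reversed(stmts):
        if target in needed:
            kept.append((target, uses))
            needed.discard(target)
            needed |= uses          # the statement's inputs are now needed too
    return list(reversed(kept))

program = [
    ("a", set()),      # a = 1
    ("b", {"a"}),      # b = a + 1
    ("c", set()),      # c = 42        (irrelevant to y)
    ("y", {"b"}),      # y = b * 2
]

assert backward_slice(program, "y") == [
    ("a", set()), ("b", {"a"}), ("y", {"b"})
]
```

Grouping statements that repeatedly appear together in slices is one way such techniques suggest candidate coherent components in monolithic code.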