942 results for Nonlinear static analysis


Relevance:

80.00%

Publisher:

Abstract:

Recent approaches to mobile code safety, such as proof-carrying code, involve associating safety information with programs. The code supplier provides a program and also includes with it a certificate (or proof) whose validity entails compliance with a predefined safety policy. The intended benefit is that the program consumer can locally validate the certificate w.r.t. the "untrusted" program by means of a certificate checker, a process which should be much simpler, more efficient, and more automatic than generating the original proof. We herein introduce a novel approach to mobile code safety which follows a similar scheme, but which is based throughout on the use of abstract interpretation techniques. In our framework the safety policy is specified by using an expressive assertion language defined over abstract domains. We identify a particular slice of the abstract interpretation-based static analysis results which is especially useful as a certificate. We propose an algorithm for checking the validity of the certificate on the consumer side which is itself in fact a very simplified and efficient specialized abstract interpreter. Our ideas are illustrated through an example implemented in the CiaoPP system. Though further experimentation is still required, we believe the proposed approach is of interest for bringing the automation and expressiveness which are inherent in abstract interpretation techniques to the area of mobile code safety.
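
As a rough illustration of the checking step described in this abstract, the sketch below validates a supplied abstract-interpretation fixpoint in a single pass over a toy interval domain. All names (check_certificate, the program/certificate encoding) are invented for the example; this is not the CiaoPP algorithm or its assertion language.

```python
# Illustrative sketch only: a toy "abstraction-carrying code" checker. The
# program maps program points to (abstract transfer function, predecessors);
# the certificate assigns each point an abstract value from a lattice given
# by `leq` (ordering) and `join` (least upper bound).

def check_certificate(program, certificate, entry_value, leq, join, policy):
    """One-pass validation: the certificate must be a post-fixpoint of the
    abstract semantics and must entail the safety policy at every point."""
    for point, (transfer, preds) in program.items():
        # Abstract input: join of certified values of predecessors (or entry).
        inputs = [certificate[p] for p in preds] or [entry_value]
        acc = inputs[0]
        for v in inputs[1:]:
            acc = join(acc, v)
        # One application of the transfer function must stay below the
        # certified value; otherwise the certificate is not a valid fixpoint.
        if not leq(transfer(acc), certificate[point]):
            return False
        # The certified value must imply the safety policy at this point.
        if not policy(point, certificate[point]):
            return False
    return True

# Tiny interval-domain example: abstract values are (lo, hi) pairs.
leq = lambda a, b: b[0] <= a[0] and a[1] <= b[1]        # interval containment
join = lambda a, b: (min(a[0], b[0]), max(a[1], b[1]))
program = {"p1": (lambda v: (v[0] + 1, v[1] + 1), [])}  # abstract effect of x := x + 1
certificate = {"p1": (1, 11)}                           # claimed range of x after p1
policy = lambda pt, v: 0 <= v[0] and v[1] <= 100        # safety: x stays in [0, 100]
print(check_certificate(program, certificate, (0, 10), leq, join, policy))  # True
```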

Relevance:

80.00%

Publisher:

Abstract:

This doctoral thesis is framed within membrane computing, a bio-inspired model of computation based on the cells of living organisms, in which multiple reactions take place simultaneously. From the structure and operation of cells, different formal models, called P systems, have been defined. These models do not attempt to reproduce the biological behaviour of a cell; rather, they abstract its basic principles in order to find new computational paradigms. P systems are non-deterministic, massively parallel models of computation, which explains the interest they have attracted in recent years for solving complex problems: in many cases they can, in theory, solve NP-complete problems in polynomial or linear time. Membrane computing has also been applied to research in many other fields, above all those related to biology. A large number of these computational models have by now been studied from a theoretical point of view; however, how they can be implemented remains an open research challenge. Several lines of work exist, based on distributed architectures or on dedicated hardware, that try to approach their non-deterministic and massively parallel character as closely as possible while remaining feasible and efficient. This thesis proposes a static analysis of the P system as a way to optimise its execution on such platforms: the information collected at analysis time is used to configure the target platform appropriately before the P system is executed, improving performance as a result. Transition P systems are taken as the reference model for this study. More specifically, the proposed static analysis aims to let each membrane determine its active rules efficiently at every evolution step, that is, the rules that satisfy the conditions required to be applied. Along this line, the thesis addresses the problem of the usefulness states of a given membrane, which at run time allow the membrane to know at every moment the membranes with which it can communicate, a question that determines which rules can be applied at each step. The static analysis also draws on other features of the P system, such as the membrane structure, rule antecedents, rule consequents and rule priorities. Once all this information has been obtained at analysis time, it is organised as a decision tree so that at run time each membrane can obtain its active rules as efficiently as possible. The thesis also surveys a significant number of hardware and software architectures that different authors have proposed for implementing P systems: essentially distributed architectures, dedicated hardware based on FPGA boards, and platforms based on PIC microcontrollers. The goal is to propose solutions that allow the results of the static analysis (usefulness states and decision trees for active rules) to be deployed on these architectures. In general, the conclusions are positive, in the sense that these optimisations integrate well into the architectures without significant penalties.
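
To make the notion of active rules concrete, the following sketch (not taken from the thesis, and ignoring its decision-tree encoding) checks which rules of a transition P system membrane can fire in one evolution step, given its multiset of objects, the membranes it can currently reach (its usefulness state), and a strong interpretation of rule priorities.

```python
# Minimal, hypothetical sketch of the "active rules" check for one evolution
# step of a transition P system; names and data layout are invented for the
# example and do not reproduce the thesis' decision-tree encoding.
from collections import Counter

def active_rules(contents, rules, reachable, priority):
    """contents: Counter of objects in the membrane.
    rules: dict name -> (antecedent Counter, set of target membranes).
    reachable: membranes this one can currently send to (usefulness state).
    priority: dict name -> int, higher value means higher priority."""
    applicable = []
    for name, (antecedent, targets) in rules.items():
        has_objects = all(contents[obj] >= n for obj, n in antecedent.items())
        targets_ok = targets <= reachable          # all destinations usable
        if has_objects and targets_ok:
            applicable.append(name)
    if not applicable:
        return []
    # Strong priority: only the highest-priority applicable rules stay active.
    top = max(priority.get(r, 0) for r in applicable)
    return [r for r in applicable if priority.get(r, 0) == top]

rules = {
    "r1": (Counter({"a": 2}), {"m2"}),
    "r2": (Counter({"a": 1, "b": 1}), set()),      # rewrites inside the membrane
}
print(active_rules(Counter({"a": 2, "b": 1}), rules, {"m2"}, {"r1": 2, "r2": 1}))
# -> ['r1']: r2 is applicable but blocked by the higher-priority r1
```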

Relevance:

80.00%

Publisher:

Abstract:

This project aims at the development and deployment of an economic optimisation system for Special Regime generation facilities within an electricity company that represents this type of facility in the various electricity markets. First, the participation options of each technology in the different markets are analysed, according to the legislation under which the Special Regime facilities may be covered, and depending on the size and type of the facilities, their characteristics, or the fuels used. Secondly, the relationship between these facilities and the market's regulatory bodies is studied: the Comisión Nacional de la Energía, the Ministerio de Industria y Turismo, Red Eléctrica de España, and the electricity market operator. Next, a static analysis model of the current situation of the electricity markets, their structure and operation, is built, obtaining for each type of facility a baseline optimisation scenario from which a dynamic model can be developed; this model makes it possible to know at any moment the best technical and economic option for each type of facility, thus optimising the budget of the facilities under study. Finally, the model is implemented in the electricity company's systems as a tool that ensures real-time optimisation for the facilities the company represents in the market, reducing the company's own costs through this automatic system and thereby maximising its revenue.
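
As a purely illustrative aside (invented figures, not from the project), the choice of the best participation option per facility can be pictured as a simple revenue comparison like the one below; the actual system works with the Spanish market rules and runs in real time.

```python
# Toy illustration only: pick, for each facility, the revenue-maximising
# participation option among hypothetical alternatives such as a regulated
# tariff or market price plus premium. All figures are invented.
facilities = {
    "wind_farm_A": {"energy_MWh": 120,
                    "options": {"regulated_tariff": 77.0,
                                "market_plus_premium": 52.0 + 30.0}},
    "cogen_B":     {"energy_MWh": 80,
                    "options": {"regulated_tariff": 95.0,
                                "market_plus_premium": 52.0 + 35.0}},
}

for name, data in facilities.items():
    best = max(data["options"], key=data["options"].get)   # EUR/MWh
    revenue = data["energy_MWh"] * data["options"][best]
    print(f"{name}: choose {best} -> {revenue:,.0f} EUR")
```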

Relevance:

80.00%

Publisher:

Abstract:

With the ever-growing adoption of smartphones and tablets, Android is becoming more popular every day. With more than one billion active users to date, Android is the leading technology in the smartphone arena. In addition, Android also runs on Android TV, Android smart watches and cars. Therefore, in recent years Android applications have become one of the major development sectors in the software industry. As of mid-2013, the number of published applications on Google Play had exceeded one million and the cumulative number of downloads was more than 50 billion. A 2013 survey also revealed that 71% of mobile application developers work on developing Android applications. Given this volume of applications, it is quite evident that people rely on them daily for tasks ranging from simple ones, like keeping track of the weather, to rather complex ones, like managing bank accounts. Hence, like every other kind of code, Android code also needs to be verified in order to work properly and achieve a certain confidence level. Because of the enormous number of applications, it becomes really hard to test Android applications manually, especially when they have to be verified for various versions of the OS and for various device configurations, such as different screen sizes and different hardware availability. Hence, there has recently been a lot of work in the computer science community on developing testing methods for Android applications. The Android model attracts researchers because of its open-source nature: research is more streamlined when the code for both the application and the platform is readily available to analyze. There has therefore been a great deal of research on testing and static analysis of Android applications, much of it focused on input test generation. As a result, several testing tools are now available that focus on automatic generation of test cases for Android applications. These tools differ from one another in the strategies and heuristics they use to generate test cases, but there is still very little work on comparing these tools and the strategies they use. Recently, some research work was carried out in this regard that compared the performance of various available tools with respect to their code coverage, fault detection, ability to work on multiple platforms, and ease of use. This was done by running the tools on a total of 60 real-world Android applications. The results showed that, although effective, the strategies used by the tools also face limitations and hence have room for improvement. The purpose of this thesis is to extend this research in a more specific and attribute-oriented way. Attributes refer to tasks that can be completed using the Android platform, ranging from a basic system call for receiving an SMS to more complex tasks like sending the user to another application from the current one. The idea is to develop a benchmark for Android testing tools based on their performance with respect to these attributes, allowing the tools to be compared attribute by attribute. For example, if an application plays an audio file, will the testing tool be able to generate a test input that triggers playback of this audio file? By using multiple applications covering different attributes, one can see which testing tool is more useful for which kinds of attributes. In this thesis, nine attributes covering basic kinds of tasks were targeted for the assessment of three testing tools; later this can be extended to many more attributes in order to compare even more testing tools. The aim of this work is to show that the approach is effective and can be used on a much larger scale. One of the flagship features of this work, which also differentiates it from previous work, is that the applications used are all specially made for this research. The reason is to analyze just the specific attribute each application focuses on, in isolation, and not allow the tool to get bottlenecked by something trivial that is not the main attribute under test. This means nine applications, each focused on one specific attribute. The main contributions of this thesis are: • a summary of the three existing testing tools and their respective techniques for automatic test input generation for Android applications; • a detailed study of the usage of these testing tools on the nine applications specially designed and developed for this study; • an analysis of the results of the study and a comparison of the performance of the selected tools.
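
A minimal harness of the kind this benchmark implies might look like the sketch below; the tool invocation (Monkey is used as a stand-in), the apk name and the log marker are placeholders, not the tools or applications actually evaluated in the thesis.

```python
# Hypothetical benchmark harness sketch: clear the device log, let an input
# generation tool exercise the attribute app, then check whether the attribute
# under test (here, audio playback) was actually executed.
import subprocess

def attribute_covered(apk, tool_cmd, marker, timeout=600):
    subprocess.run(["adb", "install", "-r", apk], check=True)   # install the attribute app
    subprocess.run(["adb", "logcat", "-c"], check=True)         # clear the device log
    subprocess.run(tool_cmd, timeout=timeout)                   # run the testing tool
    log = subprocess.run(["adb", "logcat", "-d"], check=True,
                         capture_output=True, text=True).stdout
    return marker in log                                        # did the app log the marker?

# Example with Monkey as the input-generation tool and an invented marker that
# the attribute app would log when its audio file actually starts playing.
covered = attribute_covered(
    apk="audio_attribute.apk",
    tool_cmd=["adb", "shell", "monkey", "-p", "com.example.audioattr", "5000"],
    marker="ATTR_AUDIO_PLAYED",
)
print("audio attribute exercised:", covered)
```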

Relevance:

80.00%

Publisher:

Abstract:

The finite element method is the most widely used numerical method for structural analysis. Over the last decades, numerous finite elements have been formulated for the analysis of shells and plates. Finite element formulations handle the displacement field well, but tests that validate the results obtained for the stress field are often lacking. This work analyses the T6-3i finite element, a six-node triangular element proposed within a geometrically exact formulation, with respect to its stress results, comparing them with analytical plate theories, with tables for computing bending moments in rectangular plates, and with ANSYS, a commercial structural analysis package, showing that the T6-3i can yield unsatisfactory results. In the second part of this work, the capabilities of the T6-3i are extended and a dynamic formulation for nonlinear shell analysis is proposed. An updated Lagrangian model is used and the weak form is obtained from the theorem of virtual work. Numerical simulations are carried out of the deformation of thin domes that exhibit several snap-throughs and snap-backs, including domes with curved creases, showing the robustness, simplicity and versatility of the element in its formulation and in the generation of the unstructured meshes required for the simulations.
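
For reference, the virtual-work statement from which a dynamic weak form of this kind is derived can be written, in standard continuum-mechanics notation (assumed here rather than quoted from the thesis), as:

```latex
% Generic virtual-work statement for the dynamic case: inertial plus internal
% virtual work balances the external virtual work for every admissible
% variation \delta\mathbf{u}.
\int_{\Omega} \rho\, \delta\mathbf{u}\cdot\ddot{\mathbf{u}}\, dV
+ \int_{\Omega} \delta\boldsymbol{\varepsilon} : \boldsymbol{\sigma}\, dV
= \int_{\Omega} \delta\mathbf{u}\cdot\mathbf{b}\, dV
+ \int_{\Gamma_t} \delta\mathbf{u}\cdot\bar{\mathbf{t}}\, dA
\qquad \forall\, \delta\mathbf{u}\ \text{admissible}
```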

Relevance:

80.00%

Publisher:

Abstract:

Research project carried out in 2014-2015 with the support of the Fonds de recherche du Québec – Société et culture.

Relevance:

80.00%

Publisher:

Abstract:

The eardrum separates the external ear from the middle ear and is responsible for converting acoustic energy into mechanical energy. It is divided into the pars tensa and the pars flaccida. The aim of this work is to analyze the susceptibility of the four quadrants of the pars tensa, under negative pressure, to different distributions of the lamina propria fibers. The development of associated ear pathology, in particular the formation of retraction pockets, is also evaluated. To analyze these effects, a computational biomechanical model of the tympano-ossicular chain was constructed using computerized tomography images and based on the finite element method. Three fiber distributions in the eardrum middle layer were compared: case 1 (eardrum with a circular band of fibers surrounding all quadrants equally), case 2 (eardrum with a circular band of fibers that decreases in thickness in the posterior quadrants), case 3 (eardrum without circular fibers in the posterior/superior quadrant). A static analysis was performed by applying approximately 3000 Pa to the eardrum. The pars tensa of the eardrum was divided into four quadrants and the displacement of a central point of each quadrant was analyzed. The largest displacements of the eardrum were obtained for the eardrum without circular fibers in the posterior/superior quadrant.

Relevance:

80.00%

Publisher:

Abstract:

The eardrum separates the external ear from the middle ear and is responsible for converting acoustic energy into mechanical energy. It is divided into the pars tensa and the pars flaccida. The aim of this work is to analyze the susceptibility of the four quadrants of the pars tensa, under negative pressure, to different distributions of the lamina propria fibers. The development of associated ear pathology, in particular the formation of retraction pockets, is also evaluated. To analyze these effects, a computational biomechanical model of the tympano-ossicular chain was constructed using computerized tomography images and based on the finite element method. Three fiber distributions in the eardrum middle layer were compared: case 1 (eardrum with a circular band of fibers surrounding all quadrants equally), case 2 (eardrum with a circular band of fibers that decreases in thickness in the posterior quadrants), case 3 (eardrum without circular fibers in the posterior/superior quadrant). A static analysis was performed by applying approximately 3000 Pa to the eardrum. The pars tensa of the eardrum was divided into four quadrants and the displacement of a central point of each quadrant was analyzed. The largest displacements of the eardrum were obtained for the eardrum without circular fibers in the posterior/superior quadrant.

Relevance:

80.00%

Publisher:

Abstract:

Research project carried out in 2014-2015 with the support of the Fonds de recherche du Québec – Société et culture.

Relevance:

80.00%

Publisher:

Abstract:

Summary form only given. The Java programming language supports concurrency. Concurrent programs are harder to verify than their sequential counterparts due to their inherent nondeterminism and a number of specific concurrency problems such as interference and deadlock. In previous work, we proposed a method for verifying concurrent Java components based on a mix of code inspection, static analysis tools, and the ConAn testing tool. The method was derived from an analysis of concurrency failures in Java components, but was not applied in practice. In this paper, we explore the method by applying it to an implementation of the well-known readers-writers problem and a number of mutants of that implementation. We only apply it to a single, well-known example, and so we do not attempt to draw any general conclusions about the applicability or effectiveness of the method. However, the exploration does point out several strengths and weaknesses in the method, which enable us to fine-tune the method before we carry out a more formal evaluation on other, more realistic components.
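
For readers unfamiliar with the example, the sketch below shows a minimal readers-writers lock in Python; it merely stands in for the Java component studied in the paper, and its commented invariants (no reader concurrent with a writer, at most one active writer) are the kind of properties that code inspection, static analysis and ConAn test sequences would try to check.

```python
# Illustrative only: a simple readers-preference readers-writers lock. The
# invariants to verify: readers never overlap a writer, and writers exclude
# both readers and other writers.
import threading

class ReadWriteLock:
    def __init__(self):
        self._cond = threading.Condition()
        self._readers = 0          # number of active readers
        self._writer = False       # is a writer currently active?

    def acquire_read(self):
        with self._cond:
            while self._writer:                    # readers wait for the writer
                self._cond.wait()
            self._readers += 1

    def release_read(self):
        with self._cond:
            self._readers -= 1
            if self._readers == 0:
                self._cond.notify_all()            # wake any waiting writer

    def acquire_write(self):
        with self._cond:
            while self._writer or self._readers:   # writers need exclusivity
                self._cond.wait()
            self._writer = True

    def release_write(self):
        with self._cond:
            self._writer = False
            self._cond.notify_all()
```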

Relevance:

80.00%

Publisher:

Abstract:

Linear typing schemes can be used to guarantee non-interference and hence the soundness of in-place update with respect to a functional semantics. But linear schemes are restrictive in practice, and more restrictive than necessary to guarantee soundness of in-place update. This limitation has prompted research into static analysis and more sophisticated typing disciplines to determine when in-place update may be safely used, or to combine linear and non-linear schemes. Here we contribute to this direction by defining a new typing scheme that better approximates the semantic property of soundness of in-place update for a functional semantics. We begin from the observation that some data are used only in a read-only context, after which they may be safely re-used before being destroyed. Formalising the in-place update interpretation in a machine model semantics allows us to refine this observation, motivating three usage aspects, apparent from the semantics, that are used to annotate function argument types. The aspects are (1) used destructively, (2) used read-only but shared with the result, and (3) used read-only and not shared with the result. The main novelty is aspect (2), which allows a linear value to be safely read and even aliased with a result of a function without being consumed. This novelty makes our type system more expressive than previous systems for functional languages in the literature. The system remains simple and intuitive, but it enjoys a strong soundness property whose proof is non-trivial. Moreover, our analysis features principal types and feasible type reconstruction, as shown in M. Konečný (in TYPES 2002 Workshop, Nijmegen, Proceedings, Springer-Verlag, 2003).
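
The three usage aspects can be mimicked, very informally, in ordinary Python; the annotations below are only comments, not the checked types of the paper, and the analogy is loose since Python has no linear typing.

```python
def sort_in_place(xs):          # aspect (1): argument used destructively;
    xs.sort()                   # the caller's value is consumed by the update
    return xs

def pick(xs, flag):             # aspect (2): read-only but shared with the
    return xs if flag else []   # result; the result may alias xs, so xs must
                                # not be destroyed while the result is live

def count_even(xs):             # aspect (3): read-only and not shared with the
    return sum(1 for x in xs if x % 2 == 0)
                                # result; xs can safely be updated in place
                                # immediately afterwards
```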

Relevance:

80.00%

Publisher:

Abstract:

This thesis introduces and develops a novel real-time predictive maintenance system to estimate the machine system parameters using the motion current signature. Recently, motion current signature analysis has been proposed as an alternative to the use of sensors for monitoring internal faults of a motor. A maintenance system based upon the analysis of the motion current signature avoids the need for the implementation and maintenance of expensive motion sensing technology. By developing nonlinear dynamical analysis for the motion current signature, the research described in this thesis implements a novel real-time predictive maintenance system for current and future manufacturing machine systems. A crucial concept underpinning this project is that the motion current signature contains information relating to the machine system parameters and that this information can be extracted using nonlinear mapping techniques, such as neural networks. Towards this end, a proof-of-concept procedure is performed, which substantiates this concept. A simulation model, TuneLearn, is developed to simulate the large amount of training data required by the neural network approach. Statistical validation and verification of the model is performed to ascertain confidence in the simulated motion current signature. The validation experiment concludes that, although the simulation model generates a good macro-dynamical mapping of the motion current signature, it fails to accurately map the micro-dynamical structure due to the lack of knowledge regarding the behaviour of higher-order and nonlinear factors, such as backlash and compliance. The failure of the simulation model to determine the micro-dynamical structure suggests the presence of nonlinearity in the motion current signature. This motivated us to perform surrogate data testing for nonlinearity in the motion current signature. The results confirm the presence of nonlinearity in the motion current signature, thereby motivating the use of nonlinear techniques for further analysis. Outcomes of the experiments show that nonlinear noise reduction combined with the linear reverse algorithm offers precise machine system parameter estimation using the motion current signature for the implementation of the real-time predictive maintenance system. Finally, a linear reverse algorithm, BJEST, is developed and applied to the motion current signature to estimate the machine system parameters.
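
As a conceptual sketch of the nonlinear mapping idea (synthetic data and invented feature extraction, not TuneLearn or BJEST), a small neural network can be trained to recover machine parameters from signature-like features:

```python
# Conceptual sketch only: map features of a simulated "motion current
# signature" to two machine parameters with a small neural network. All
# numbers and the feature construction are invented for illustration.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 500
params = rng.uniform([0.5, 0.01], [2.0, 0.2], size=(n, 2))  # e.g. inertia, friction

def fake_signature_features(p):
    # Stand-in for feature extraction from a motion current signature:
    # a few nonlinear combinations of the underlying parameters plus noise.
    inertia, friction = p
    return [inertia + 0.3 * friction,
            np.sin(inertia) * friction,
            inertia ** 2 - friction,
            friction / (1.0 + inertia)] + list(rng.normal(0, 0.01, 2))

X = np.array([fake_signature_features(p) for p in params])
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
model.fit(X[:400], params[:400])                 # train on simulated signatures
err = np.abs(model.predict(X[400:]) - params[400:]).mean(axis=0)
print("mean absolute error per parameter:", err)
```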

Relevance:

80.00%

Publisher:

Abstract:

The premise of this research is that a nation's competitiveness rests on the successful functioning of the whole set of public institutions supporting the division of labour in value creation. Our project investigates which values and motivations shape the institutional setting of the Hungarian economy. Rather than a static analysis of formal rules of conduct, we concentrate on the world of norms, conventions and innovations, the dynamic elements that shape the future of the institutional system. The analysis focuses on social and entrepreneurial values, the competitiveness narratives of economic policy makers, the competitiveness factors of local economies, private initiatives aimed at improving competitiveness, and the operation of the non-profit sector. Our main finding is that the cognitive factors shaping Hungary's future competitiveness, namely the motivations and norms of economic decision makers, provide a sound basis for business, civil-society and government initiatives that strengthen the competitiveness of the economy. The value systems of citizens and entrepreneurs are characterised by strong communal and moral expectations. Economic policy makers are open to institutional problems, and there is a broad consensus among Hungarian opinion leaders about the main competitiveness challenges. The frameworks of competitiveness alliances capable of compensating organisers for their efforts can be clearly identified. The institutions of local economic development are taking shape. Despite uncertainty in its operating conditions, the non-profit sector performs well where public purpose and competitiveness meet, such as atypical forms of employment. These results open perspectives for both scientific inquiry and practical action. Research on the link between self-interest and the motivation to improve society's value-creating capacity, together with the study of social innovation, can reveal the scope for improving competitiveness. Business, civil-society and government actors, in turn, can successfully reconcile consumer and community expectations with their strategic goals if they shape their institution-building strategies to fit the norms and conventions of economic and social stakeholders.

Relevance:

80.00%

Publisher:

Abstract:

Kernel-level malware is one of the most dangerous threats to the security of users on the Internet, so there is an urgent need for its detection. The most popular detection approach is misuse-based detection; however, it cannot keep up with today's advanced malware, which increasingly applies polymorphism and obfuscation. In this thesis, we present our integrity-based detection for kernel-level malware, which does not rely on specific features of the malware. We have developed an integrity analysis system that can derive and monitor integrity properties for commodity operating system kernels. In our system, we focus on two classes of integrity properties: data invariants and the integrity of Kernel Queue (KQ) requests. We adopt static analysis for data invariant detection and overcome several technical challenges: field sensitivity, array sensitivity, and pointer analysis. We identify data invariants that are critical to system runtime integrity from Linux kernel 2.4.32 and the Windows Research Kernel (WRK) with very low false positive and false negative rates. We then develop an Invariant Monitor to guard these data invariants against real-world malware. In our experiments, we are able to use Invariant Monitor to detect ten real-world Linux rootkits, nine real-world Windows malware samples, and one synthetic Windows malware sample. We leverage static and dynamic analysis of the kernel and device drivers to learn the legitimate KQ requests. Based on the learned KQ requests, we build KQguard to protect KQs; at runtime, KQguard rejects all unknown KQ requests that cannot be validated. We apply KQguard to WRK and the Linux kernel, and extensive experimental evaluation shows that KQguard is efficient (up to 5.6% overhead) and effective (capable of achieving zero false positives against representative benign workloads after appropriate training, and very low false negatives against 125 real-world malware samples and nine synthetic attacks). In our system, Invariant Monitor and KQguard cooperate to protect data invariants and KQs in the target kernel. By monitoring these integrity properties, we can detect malware through its violation of these properties during execution.
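
The flavour of invariant monitoring can be illustrated with the hypothetical sketch below; the data structures are plain Python stand-ins, not the Linux 2.4.32 or WRK structures analysed in the thesis, and the "hidden process" invariant is a classic cross-view example rather than one of the learned invariants.

```python
# Hypothetical illustration only: evaluate named invariants over a snapshot of
# kernel data (represented here as plain Python data) and report violations.
def check_invariants(snapshot, invariants):
    violations = []
    for name, predicate in invariants:
        if not predicate(snapshot):
            violations.append(name)
    return violations

invariants = [
    # Every runnable task must also be linked in the all-tasks list;
    # otherwise something is hiding a process.
    ("runqueue_subset_of_tasklist",
     lambda s: set(s["runqueue"]) <= set(s["all_tasks"])),
    # The system call table must match its trusted baseline.
    ("syscall_table_unmodified",
     lambda s: s["syscall_table"] == s["syscall_table_baseline"]),
]

snapshot = {
    "all_tasks": [1, 2, 3],
    "runqueue": [2, 3, 4],                 # pid 4 is hidden from the task list
    "syscall_table": [0x100, 0x200],
    "syscall_table_baseline": [0x100, 0x200],
}
print(check_invariants(snapshot, invariants))  # -> ['runqueue_subset_of_tasklist']
```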

Relevance:

80.00%

Publisher:

Abstract:

This thesis studies the static and seismic behavior of simple structures made with gabion box walls. The analysis considers a one-story building with standard dimensions in plan (6 m x 5 m) and a lightweight timber roof. The main focus of the investigation is to identify the principal aspects of the seismic behavior of a one-story building made with gabion box walls, in order to prevent failure under seismic actions and thereby help reduce the seismic risk in developing countries where earthquakes reach significant intensity. For the gabion box wall, calculations and analyses have been carried out in order to understand its static and dynamic behavior. From the static point of view, the normal stresses arriving at the base of the gabion wall have been verified against the corresponding bearing capacity of the ground. Regarding the seismic analysis, both the in-plane and the out-of-plane behavior have been studied. The out-of-plane behavior turned out to be the most critical aspect; for it, models based on the rigid, no-tension idealization of masonry have been developed, finding a kinematically admissible multiplier associated with a collapse mechanism of the structure. Furthermore, FEM and DEM models have been developed to find the maximum displacement at the center of the wall, the maximum tensile stresses needed to design the steel connectors joining consecutive gabions, and the dimensions (length of the wall and distance between orthogonal walls or buttresses) of a geometrical configuration for the standard module of the structure, in order to ensure an adequate safety margin for earthquakes with a PGA around 0.4-0.5 g. Using these results, some rules of thumb have been established that must be satisfied to ensure good behavior of these structures.
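
A minimal numeric sketch of the simplest out-of-plane mechanism (a monolithic, rigid, no-tension wall strip overturning about its base edge, with assumed dimensions) is given below; it is only an order-of-magnitude check against the 0.4-0.5 g target and does not replace the FEM/DEM models described in the abstract.

```python
# For a rigid wall strip of weight W, thickness t and height h rotating about
# its base edge, the kinematically admissible mass-proportional multiplier is
# alpha0 = (W*t/2) / (W*h/2) = t/h. Dimensions below are assumed, not thesis values.
def overturning_multiplier(thickness, height):
    """Horizontal load multiplier activating base overturning of the strip."""
    stabilizing_arm = thickness / 2.0      # lever arm of the self-weight
    overturning_arm = height / 2.0         # lever arm of the inertia force
    return stabilizing_arm / overturning_arm

t, h = 1.0, 2.5                            # gabion wall thickness and height, m
alpha0 = overturning_multiplier(t, h)      # expressed as a fraction of g
print(f"alpha0 = {alpha0:.2f} g vs. demand ~0.4-0.5 g -> "
      f"{'OK' if alpha0 >= 0.5 else 'mechanism may activate'}")
```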